Hierarchical federated learning with global differential privacy

Youqun Long, Jianhui Zhang, Gaoli Wang*, Jie Fu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

2 Scopus citations

Abstract

Federated learning (FL) is a framework used in distributed machine learning to obtain an optimal model from clients' local updates. As an efficient design for model convergence and data communication, cloud-edge-client hierarchical federated learning (HFL) attracts more attention than the typical cloud-client architecture. However, HFL still poses threats to clients' sensitive data, since the uploaded and downloaded parameters can be analyzed. In this paper, to address information leakage effectively, we propose a novel privacy-preserving scheme based on the concept of differential privacy (DP): Gaussian noise is added to the shared parameters both when they are uploaded to edge and cloud servers and when they are broadcast to clients. Our algorithm achieves global differential privacy with adjustable noise across the architecture. We evaluate the performance on image classification tasks. In our experiment on the Modified National Institute of Standards and Technology (MNIST) dataset, the model reaches 91% accuracy. Compared to the previous two-layer HFL-DP, our design is more secure while remaining comparably accurate.
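The paper's exact mechanism is not reproduced here, but the core Gaussian-mechanism step it describes, perturbing a shared parameter vector before it leaves a client or server, can be sketched as follows. This is a minimal illustration assuming standard per-update L2-norm clipping (as in DP-SGD-style analyses); the function name and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def privatize_update(update, clip_norm=1.0, sigma=1.0, seed=None):
    """Clip an update to L2 norm `clip_norm`, then add Gaussian noise.

    Clipping bounds the sensitivity of the shared parameters, so the
    added noise N(0, (sigma * clip_norm)^2) yields a Gaussian-mechanism
    DP guarantee for this single release (illustrative sketch only).
    """
    rng = np.random.default_rng(seed)
    update = np.asarray(update, dtype=float)
    norm = np.linalg.norm(update)
    # Scale down only if the norm exceeds the clipping bound.
    clipped = update / max(1.0, norm / clip_norm)
    noise = rng.normal(0.0, sigma * clip_norm, size=clipped.shape)
    return clipped + noise
```

In an HFL setting, a step like this would be applied at each upload (client to edge, edge to cloud) and broadcast, with the noise scale `sigma` tuned per layer to trade accuracy against the overall privacy budget.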

Original language: English
Pages (from-to): 3741-3758
Number of pages: 18
Journal: Electronic Research Archive
Volume: 31
Issue number: 7
DOIs
State: Published - 2023

Keywords

  • differential privacy
  • distributed network
  • federated learning
  • hierarchical architecture
  • privacy preservation
