
Hierarchical federated learning with global differential privacy

  • Youqun Long, Jianhui Zhang, Gaoli Wang*, Jie Fu
  • *Corresponding author for this work
  • East China Normal University

Research output: Contribution to journal › Article › peer-review

Abstract

Federated learning (FL) is a distributed machine learning framework that obtains an optimal model from clients’ local updates. As an efficient design for model convergence and data communication, cloud-edge-client hierarchical federated learning (HFL) has attracted more attention than the typical cloud-client architecture. However, HFL still exposes clients’ sensitive data to threats, since adversaries can analyze the uploaded and downloaded parameters. In this paper, to address information leakage effectively, we propose a novel privacy-preserving scheme based on differential privacy (DP): Gaussian noise is added to the shared parameters when they are uploaded to edge and cloud servers and when they are broadcast to clients. Our algorithm achieves global differential privacy with adjustable noise across the architecture. We evaluate the performance on image classification tasks. In our experiment on the Modified National Institute of Standards and Technology (MNIST) dataset, we achieve 91% model accuracy. Compared to the previous two-layer HFL-DP, our design is more secure while remaining comparably accurate.
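The core mechanism the abstract describes — perturbing shared parameters with Gaussian noise before they are uploaded or broadcast — can be sketched as the standard Gaussian mechanism from the DP literature. This is a minimal, generic illustration, not the paper's exact algorithm; the clipping bound `clip_norm` and noise multiplier `sigma` are hypothetical parameters for demonstration.

```python
import numpy as np

def gaussian_dp_update(update, clip_norm=1.0, sigma=1.0, rng=None):
    """Clip a model update to a bounded L2 norm, then add Gaussian noise.

    Clipping bounds the sensitivity of the update so that calibrated
    Gaussian noise yields a differential-privacy guarantee.
    """
    rng = rng if rng is not None else np.random.default_rng()
    norm = np.linalg.norm(update)
    clipped = update / max(1.0, norm / clip_norm)  # enforce L2 bound
    noise = rng.normal(0.0, sigma * clip_norm, size=update.shape)
    return clipped + noise

# In an HFL setting, a client would noise its update before uploading
# to its edge server; edge and cloud servers could add further noise
# before aggregating and broadcasting, giving adjustable global noise.
client_update = np.ones(4)
noised_update = gaussian_dp_update(client_update, clip_norm=1.0, sigma=0.5)
```

Larger `sigma` gives stronger privacy at the cost of model accuracy, which matches the accuracy/privacy trade-off the paper evaluates on MNIST.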

Original language: English
Pages (from-to): 3741-3758
Number of pages: 18
Journal: Electronic Research Archive
Volume: 31
Issue number: 7
DOI
Publication status: Published - 2023
