Privacy-Preserving Verifiable Asynchronous Federated Learning

Yuanyuan Gao, Lulu Wang, Lei Zhang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

5 Scopus citations

Abstract

Federated learning (FL) is a recently proposed technique that copes with growing data volumes and breaks down barriers among datasets by enabling nodes to train machine learning models without sharing their local datasets. However, concerns about data privacy and model performance in asynchronous federated learning hinder its deployment in practical applications, especially in dynamic scenarios. To address these problems, we propose a verifiable asynchronous federated learning scheme over a peer-to-peer network that improves model performance through local-dataset testing and cosine-value examination of model updates. We also design a privacy-preserving mechanism based on local differential privacy (LDP) to protect data privacy. We evaluate our scheme in terms of model accuracy and convergence performance. Numerical results show that the proposed scheme achieves high accuracy and efficiency while protecting privacy.
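The two mechanisms the abstract names, LDP-based perturbation of local updates and cosine-value examination of incoming updates, can be illustrated with a minimal sketch. This is not the authors' actual protocol: the clipped-Laplace perturbation, the function names, and the acceptance threshold are illustrative assumptions.

```python
import numpy as np

def ldp_perturb(update, epsilon, clip=1.0):
    """Sketch of LDP-style protection: clip the local update's norm,
    then add Laplace noise calibrated to the clipping bound.
    (Illustrative; the paper's exact LDP mechanism may differ.)"""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip / max(norm, 1e-12))
    noise = np.random.laplace(scale=clip / epsilon, size=update.shape)
    return clipped + noise

def cosine_check(update, reference, threshold=0.0):
    """Sketch of cosine-value examination: accept an incoming update
    only if its direction roughly agrees with a reference update
    (e.g., one computed from the verifier's local dataset)."""
    denom = np.linalg.norm(update) * np.linalg.norm(reference) + 1e-12
    cos = np.dot(update, reference) / denom
    return cos >= threshold
```

In a peer-to-peer setting, each node would perturb its update with `ldp_perturb` before sending it, and receiving peers would run `cosine_check` (plus a local-dataset accuracy test) before aggregating, discarding updates that fail either check.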

Original language: English
Title of host publication: ICSED 2021 - 2021 3rd International Conference on Software Engineering and Development
Publisher: Association for Computing Machinery
Pages: 29-35
Number of pages: 7
ISBN (Electronic): 9781450385213
DOIs
State: Published - 19 Nov 2021
Event: 3rd International Conference on Software Engineering and Development, ICSED 2021 - Virtual, Online, China
Duration: 19 Nov 2021 - 21 Nov 2021

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 3rd International Conference on Software Engineering and Development, ICSED 2021
Country/Territory: China
City: Virtual, Online
Period: 19/11/21 - 21/11/21

Keywords

  • Data privacy
  • Federated learning
  • Local differential privacy
  • Verifiable aggregation

