Integrating Staleness and Shapley Value Consistency for Efficient K-Asynchronous Federated Learning

  • Yuhui Jiang
  • Xingjian Lu*
  • Wei Mao
  • Ying Lin

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

3 Scopus citations

Abstract

In the big data era, Federated Learning (FL), which allows multiple participants to collaboratively train a global model without sharing their raw data, has emerged as a promising solution to the challenges of isolated data silos and privacy protection. Federated learning has two main communication strategies: synchronous and asynchronous. Synchronous FL ensures stable convergence but may suffer model quality degradation and server crash risks. Asynchronous FL avoids the straggler effect and supports more participants, but unstable convergence and non-IID data can degrade model performance. In this paper, inspired by real-world FL scenarios, we propose a highly efficient K-Asynchronous FL framework, KFLBSV, which addresses the limitations of both strategies to some extent, leading to improved model performance and convergence speed. The framework allows clients to upload updates multiple times within the same round instead of blocking after each upload, thereby enhancing training efficiency. To ensure the stability and performance of the global model, we introduce a novel aggregation method: by approximating the Shapley value to assess model consistency and balancing client contribution frequency against model staleness, we allocate weights more accurately to each participating client. We conducted extensive experiments on benchmark datasets using three distinct models, and the results show that KFLBSV outperforms existing algorithms in terms of both model performance and convergence speed.
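The abstract's aggregation idea — weighting each client update by a staleness decay combined with a consistency score — can be sketched as follows. This is a hypothetical illustration, not the paper's algorithm: the function name `aggregate`, the exponential decay with rate `alpha`, and the `consistency` term (standing in for the approximated Shapley value) are all assumptions for the sake of the example.

```python
import math

def aggregate(updates, current_round, alpha=0.5):
    """Weighted average of client updates.

    updates: list of (params, round_uploaded, consistency) tuples, where
    params is a list of floats, round_uploaded is the round the client
    last synced, and consistency is a score in (0, 1] (here a stand-in
    for an approximated Shapley-value-based consistency measure).
    """
    weights = []
    for params, round_uploaded, consistency in updates:
        staleness = current_round - round_uploaded
        decay = math.exp(-alpha * staleness)  # staler updates count less
        weights.append(decay * consistency)
    total = sum(weights)
    n_params = len(updates[0][0])
    # Weighted average of each parameter across all client updates.
    return [
        sum(w * u[0][i] for w, u in zip(weights, updates)) / total
        for i in range(n_params)
    ]
```

With equal staleness and equal consistency this reduces to a plain average; a staler update is pulled toward a smaller weight, which is the qualitative behavior the abstract describes.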

Original language: English
Title of host publication: Proceedings - 2023 IEEE International Conference on Big Data, BigData 2023
Editors: Jingrui He, Themis Palpanas, Xiaohua Hu, Alfredo Cuzzocrea, Dejing Dou, Dominik Slezak, Wei Wang, Aleksandra Gruca, Jerry Chun-Wei Lin, Rakesh Agrawal
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 680-689
Number of pages: 10
ISBN (Electronic): 9798350324457
DOIs
State: Published - 2023
Event: 2023 IEEE International Conference on Big Data, BigData 2023 - Sorrento, Italy
Duration: 15 Dec 2023 - 18 Dec 2023

Publication series

Name: Proceedings - 2023 IEEE International Conference on Big Data, BigData 2023

Conference

Conference: 2023 IEEE International Conference on Big Data, BigData 2023
Country/Territory: Italy
City: Sorrento
Period: 15/12/23 - 18/12/23

Keywords

  • Big data
  • federated learning
  • performance
  • shapley value
  • staleness
