DFDG: Data-Free Dual-Generator Adversarial Distillation for One-Shot Federated Learning

Kangyang Luo, Shuai Wang, Yexuan Fu, Renrong Shao, Xiang Li, Yunshi Lan, Ming Gao, Jinlong Shu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

Federated Learning (FL) is a distributed machine learning scheme in which clients jointly train a global model by sharing model information rather than their private datasets. In light of communication and privacy concerns, one-shot FL, which requires only a single communication round, has emerged as a promising solution. However, existing one-shot FL methods either require public datasets, focus on model-homogeneous settings, or distill only limited knowledge from local models, making it difficult or even impractical to train a robust global model. To address these limitations, we propose a new data-free dual-generator adversarial distillation method (namely DFDG) for one-shot FL, which can explore a broader training space of the local models by training dual generators. DFDG is executed in an adversarial manner and comprises two parts: dual-generator training and dual-model distillation. In dual-generator training, we examine each generator with respect to fidelity, transferability and diversity to ensure its utility, and additionally tailor a cross-divergence loss to lessen the overlap of the dual generators' output spaces. In dual-model distillation, the trained dual generators work together to provide the training data for updating the global model. Finally, our extensive experiments on various image classification tasks show that DFDG achieves significant accuracy gains over SOTA baselines. Our code is available at: https://anonymous.4open.science/r/DFDG-7BDB.
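The abstract describes the cross-divergence loss only at a high level: it penalizes overlap between the two generators' output spaces. As a rough conceptual illustration (not the paper's actual formulation — the function names and the choice of symmetric KL here are assumptions for exposition), one could score overlap between two generators' output distributions with a negated symmetric KL term, so that minimizing the loss pushes the distributions apart:

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete probability distributions.

    eps guards against log(0) when a bin has zero mass.
    """
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def cross_divergence_loss(p1, p2):
    """Hypothetical overlap penalty for two generators' output
    distributions: the negated symmetric KL divergence. Identical
    distributions (full overlap) give 0; well-separated distributions
    give a strongly negative loss, so a minimizer is rewarded for
    making the two generators cover different regions.
    """
    return -(kl_divergence(p1, p2) + kl_divergence(p2, p1))

# Identical output distributions: no separation, loss is ~0.
same = cross_divergence_loss([0.5, 0.5], [0.5, 0.5])
# Distinct output distributions: separation is rewarded (loss < 0).
diff = cross_divergence_loss([0.9, 0.1], [0.1, 0.9])
print(same, diff)
```

In the actual method the distributions would come from the dual generators' synthetic samples during one-shot distillation; this sketch only shows why minimizing such a term reduces output-space overlap.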

Original language: English
Title of host publication: Proceedings - 24th IEEE International Conference on Data Mining, ICDM 2024
Editors: Elena Baralis, Kun Zhang, Ernesto Damiani, Merouane Debbah, Panos Kalnis, Xindong Wu
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 281-290
Number of pages: 10
ISBN (Electronic): 9798331506681
DOIs
State: Published - 2024
Event: 24th IEEE International Conference on Data Mining, ICDM 2024 - Abu Dhabi, United Arab Emirates
Duration: 9 Dec 2024 – 12 Dec 2024

Publication series

Name: Proceedings - IEEE International Conference on Data Mining, ICDM
ISSN (Print): 1550-4786

Conference

Conference: 24th IEEE International Conference on Data Mining, ICDM 2024
Country/Territory: United Arab Emirates
City: Abu Dhabi
Period: 9/12/24 – 12/12/24

Keywords

  • Data heterogeneity
  • Data-free knowledge distillation
  • Model heterogeneity
  • One-shot Federated Learning
