TY - GEN
T1 - MMDFL
T2 - 62nd ACM/IEEE Design Automation Conference, DAC 2025
AU - Yan, Dengke
AU - Yang, Yanxin
AU - Hu, Ming
AU - Fu, Xin
AU - Chen, Mingsong
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Along with the prosperity of Artificial Intelligence (AI) techniques, more and more Artificial Intelligence of Things (AIoT) applications adopt Federated Learning (FL) to enable collaborative learning without compromising the privacy of devices. Since existing centralized FL methods suffer from the single-point-of-failure and communication-bottleneck problems caused by the parameter server, we are witnessing an increasing use of Decentralized Federated Learning (DFL), which is based on Peer-to-Peer (P2P) communication without using a global model. However, DFL still faces three major challenges, i.e., the limited computing power and network bandwidth of resource-constrained devices, non-Independent and Identically Distributed (non-IID) device data, and all-neighbor-dependent knowledge aggregation operations, all of which greatly suppress the learning potential of existing DFL methods. To address these problems, this paper presents an efficient DFL framework named MMDFL based on our proposed multi-model-based learning and knowledge aggregation mechanism. Specifically, MMDFL adopts multiple traveler models, which perform local training individually along their traversed devices, accelerating and maximizing knowledge learning and sharing among devices. Moreover, based on our proposed device selection strategy, MMDFL enables each traveler to adaptively explore its next best neighboring device to further enhance DFL training performance, taking into account data heterogeneity, limited resources, and the catastrophic forgetting phenomenon. Experimental results from simulation and a real testbed show that, compared with state-of-the-art DFL methods, MMDFL can not only significantly reduce the communication overhead but also achieve better overall classification performance in both IID and non-IID scenarios.
AB - Along with the prosperity of Artificial Intelligence (AI) techniques, more and more Artificial Intelligence of Things (AIoT) applications adopt Federated Learning (FL) to enable collaborative learning without compromising the privacy of devices. Since existing centralized FL methods suffer from the single-point-of-failure and communication-bottleneck problems caused by the parameter server, we are witnessing an increasing use of Decentralized Federated Learning (DFL), which is based on Peer-to-Peer (P2P) communication without using a global model. However, DFL still faces three major challenges, i.e., the limited computing power and network bandwidth of resource-constrained devices, non-Independent and Identically Distributed (non-IID) device data, and all-neighbor-dependent knowledge aggregation operations, all of which greatly suppress the learning potential of existing DFL methods. To address these problems, this paper presents an efficient DFL framework named MMDFL based on our proposed multi-model-based learning and knowledge aggregation mechanism. Specifically, MMDFL adopts multiple traveler models, which perform local training individually along their traversed devices, accelerating and maximizing knowledge learning and sharing among devices. Moreover, based on our proposed device selection strategy, MMDFL enables each traveler to adaptively explore its next best neighboring device to further enhance DFL training performance, taking into account data heterogeneity, limited resources, and the catastrophic forgetting phenomenon. Experimental results from simulation and a real testbed show that, compared with state-of-the-art DFL methods, MMDFL can not only significantly reduce the communication overhead but also achieve better overall classification performance in both IID and non-IID scenarios.
KW - AIoT
KW - decentralized federated learning
KW - multi-model learning
KW - resource-constrained
KW - stochastic gradient descent
UR - https://www.scopus.com/pages/publications/105017647964
U2 - 10.1109/DAC63849.2025.11133116
DO - 10.1109/DAC63849.2025.11133116
M3 - Conference contribution
AN - SCOPUS:105017647964
T3 - Proceedings - Design Automation Conference
BT - 2025 62nd ACM/IEEE Design Automation Conference, DAC 2025
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 22 June 2025 through 25 June 2025
ER -