TY - JOUR
T1 - Privacy-Preserving and Reliable Decentralized Federated Learning
AU - Gao, Yuanyuan
AU - Zhang, Lei
AU - Wang, Lulu
AU - Choo, Kim-Kwang Raymond
AU - Zhang, Rui
N1 - Publisher Copyright:
© 2008-2012 IEEE.
PY - 2023/7/1
Y1 - 2023/7/1
N2 - Conventional federated learning (FL) approaches generally rely on a centralized server, and there has been a trend of designing asynchronous FL approaches for distributed applications partly to mitigate limitations associated with conventional (synchronous) FL approaches (e.g., single point of failure / attack). In this paper, we first introduce two new tools, namely: a quality-based aggregation method and an extended dynamic contribution broadcast encryption (DConBE). Building on these two new tools and local differential privacy, we then propose a privacy-preserving and reliable decentralized FL scheme, designed to support batch joining/leaving of clients while incurring minimal delay and achieving high model accuracy. In other words, our scheme seeks to ensure an optimal trade-off between model accuracy and data privacy, which is also demonstrated in our simulation results. For example, the results show that our aggregation method can effectively avoid low-quality updates in the sense that the scheme guarantees high model accuracy even in the presence of bad clients who may submit low-quality updates. In addition, our scheme incurs a lower loss and the extended DConBE only slightly affects the efficiency of our scheme. With the extended dynamic contribution broadcast encryption, our scheme can efficiently support batch joining/leaving of clients.
KW - Broadcast encryption
KW - data privacy
KW - federated learning
KW - local differential privacy
UR - https://www.scopus.com/pages/publications/85149379087
U2 - 10.1109/TSC.2023.3250705
DO - 10.1109/TSC.2023.3250705
M3 - Article
AN - SCOPUS:85149379087
SN - 1939-1374
VL - 16
SP - 2879
EP - 2891
JO - IEEE Transactions on Services Computing
JF - IEEE Transactions on Services Computing
IS - 4
ER -