TY - JOUR
T1 - Dual-Server-Based Lightweight Privacy-Preserving Federated Learning
AU - Zhong, Liangyu
AU - Wang, Lulu
AU - Zhang, Lei
AU - Domingo-Ferrer, Josep
AU - Xu, Lin
AU - Wu, Changti
AU - Zhang, Rui
N1 - Publisher Copyright:
© 2004-2012 IEEE.
PY - 2024
Y1 - 2024
N2 - Federated learning (FL) allows multiple users to collaboratively train global machine learning models while keeping their data sets local. However, existing privacy-preserving FL schemes suffer from several limitations, e.g., loss of accuracy, high communication/computation cost, failure to support dynamic users, and insecurity against collusion attacks. To overcome these limitations, we propose a lightweight privacy-preserving FL scheme based on a dual-server architecture. Our scheme involves only lightweight cryptographic operations, i.e., hash and symmetric encryption operations, and it has low communication overhead. Thus, it is computationally lightweight and round-efficient. Further, it allows users to join/quit an FL task and it is accuracy-lossless. We formally prove that our scheme remains secure even in the case of collusion attacks: in particular, if an attacker colludes with one of the servers and all but two of the users participating in an FL task, the privacy of user gradients remains preserved. The reported experimental results demonstrate that our scheme incurs only a marginal increase in total communication overhead compared to an FL scheme without any privacy protection. In terms of computation overhead, the cost per user remains stable as the number of users grows, while the cost for the server is comparable to that of an FL scheme without privacy protection.
AB - Federated learning (FL) allows multiple users to collaboratively train global machine learning models while keeping their data sets local. However, existing privacy-preserving FL schemes suffer from several limitations, e.g., loss of accuracy, high communication/computation cost, failure to support dynamic users, and insecurity against collusion attacks. To overcome these limitations, we propose a lightweight privacy-preserving FL scheme based on a dual-server architecture. Our scheme involves only lightweight cryptographic operations, i.e., hash and symmetric encryption operations, and it has low communication overhead. Thus, it is computationally lightweight and round-efficient. Further, it allows users to join/quit an FL task and it is accuracy-lossless. We formally prove that our scheme remains secure even in the case of collusion attacks: in particular, if an attacker colludes with one of the servers and all but two of the users participating in an FL task, the privacy of user gradients remains preserved. The reported experimental results demonstrate that our scheme incurs only a marginal increase in total communication overhead compared to an FL scheme without any privacy protection. In terms of computation overhead, the cost per user remains stable as the number of users grows, while the cost for the server is comparable to that of an FL scheme without privacy protection.
KW - Privacy preservation
KW - federated learning
KW - lightweight cryptography
KW - secure aggregation
UR - https://www.scopus.com/pages/publications/85192779750
U2 - 10.1109/TNSM.2024.3399534
DO - 10.1109/TNSM.2024.3399534
M3 - Article
AN - SCOPUS:85192779750
SN - 1932-4537
VL - 21
SP - 4787
EP - 4800
JO - IEEE Transactions on Network and Service Management
JF - IEEE Transactions on Network and Service Management
IS - 4
ER -