TY - JOUR
T1 - Decentralized Over-the-Air Federated Learning by Second-Order Optimization Method
AU - Yang, Peng
AU - Jiang, Yuning
AU - Wen, Dingzhu
AU - Wang, Ting
AU - Jones, Colin N.
AU - Shi, Yuanming
N1 - Publisher Copyright:
© 2002-2012 IEEE.
PY - 2024/6/1
Y1 - 2024/6/1
N2 - Federated learning (FL) is an emerging technique that enables privacy-preserving distributed learning. Most related works focus on centralized FL, which leverages the coordination of a parameter server to implement local model aggregation. However, this scheme heavily relies on the parameter server, which could cause scalability, communication, and reliability issues. To tackle these problems, decentralized FL, where information is shared through gossip, has started to attract attention. Nevertheless, current research mainly relies on first-order optimization methods that have a relatively slow convergence rate, which leads to excessive communication rounds in wireless networks. To design communication-efficient decentralized FL, we propose a novel over-the-air decentralized second-order federated algorithm. Benefiting from the fast convergence rate of the second-order method, the total number of communication rounds is significantly reduced. Meanwhile, owing to the low-latency model aggregation enabled by over-the-air computation, the communication overhead in each round can also be greatly reduced. The convergence behavior of our approach is then analyzed. The result reveals an error term in each iteration that involves a cumulative noise effect. To mitigate the impact of this error term, we conduct system optimization from the perspectives of the cumulative term and the individual term, respectively. Numerical experiments demonstrate the superiority of our proposed approach and the effectiveness of the system optimization.
KW - Decentralized federated learning
KW - over-the-air computation
KW - second-order optimization method
UR - https://www.scopus.com/pages/publications/85181559013
U2 - 10.1109/TWC.2023.3327610
DO - 10.1109/TWC.2023.3327610
M3 - Article
AN - SCOPUS:85181559013
SN - 1536-1276
VL - 23
SP - 5632
EP - 5647
JO - IEEE Transactions on Wireless Communications
JF - IEEE Transactions on Wireless Communications
IS - 6
ER -