TY - JOUR
T1 - Green Federated Learning over Cloud-RAN with Limited Fronthaul Capacity and Quantized Neural Networks
AU - Wang, Jiali
AU - Mao, Yijie
AU - Wang, Ting
AU - Shi, Yuanming
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024/5/1
Y1 - 2024/5/1
AB - In this paper, we propose an energy-efficient federated learning (FL) framework for energy-constrained devices over a cloud radio access network (Cloud-RAN), where each device adopts quantized neural networks (QNNs) to train a local FL model and transmits the quantized model parameters to remote radio heads (RRHs). Each RRH receives the signals from the devices over the wireless link and forwards them to the server via the fronthaul link. We rigorously develop an energy consumption model for local training at the devices using QNNs, together with communication models over Cloud-RAN. Based on the proposed energy consumption model, we formulate an energy minimization problem that optimizes the fronthaul rate allocation, device transmit power allocation, and QNN precision levels while satisfying the limited fronthaul capacity constraint and ensuring the convergence of the proposed FL model to a target accuracy. To solve this problem, we analyze the convergence rate and propose efficient algorithms based on the alternating optimization technique. Simulation results show that the proposed FL framework can significantly reduce energy consumption compared with conventional approaches. We conclude that the proposed framework holds great potential for achieving sustainable and environmentally friendly FL in Cloud-RAN.
KW - Cloud radio access network (Cloud-RAN)
KW - federated learning (FL)
KW - quantized neural networks (QNNs)
UR - https://www.scopus.com/pages/publications/85173364783
U2 - 10.1109/TWC.2023.3317129
DO - 10.1109/TWC.2023.3317129
M3 - Article
AN - SCOPUS:85173364783
SN - 1536-1276
VL - 23
SP - 4300
EP - 4314
JO - IEEE Transactions on Wireless Communications
JF - IEEE Transactions on Wireless Communications
IS - 5
ER -