TY - GEN
T1 - Green Federated Learning over Cloud-RAN with Limited Fronthaul and Quantized Neural Networks
AU - Wang, Jiali
AU - Mao, Yijie
AU - Wang, Ting
AU - Shi, Yuanming
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - In this paper, we investigate a green federated learning (FL) framework over a cloud radio access network (Cloud-RAN) system that comprises a server, multiple devices, and remote radio heads (RRHs). Each device trains a quantized neural network (QNN) and sends its quantized model parameters to the server via the RRHs to reduce energy consumption. The server aggregates the received signals to update the global model parameters and broadcasts the updated parameters to the selected devices. In this context, we propose an energy consumption model for QNN training and a communication model over Cloud-RAN. Based on the proposed energy model, we formulate an energy minimization problem that jointly designs the fronthaul rate allocation, the device transmit power, and the QNN precision level, subject to a target learning accuracy, transmit power budgets, and limited fronthaul capacity. Guided by the convergence analysis, we adopt an alternating optimization method to solve the energy minimization problem. Simulation results demonstrate that the proposed FL framework can considerably reduce energy consumption compared with conventional methods, showing great potential for realizing sustainable and eco-friendly FL over Cloud-RAN.
AB - In this paper, we investigate a green federated learning (FL) framework over a cloud radio access network (Cloud-RAN) system that comprises a server, multiple devices, and remote radio heads (RRHs). Each device trains a quantized neural network (QNN) and sends its quantized model parameters to the server via the RRHs to reduce energy consumption. The server aggregates the received signals to update the global model parameters and broadcasts the updated parameters to the selected devices. In this context, we propose an energy consumption model for QNN training and a communication model over Cloud-RAN. Based on the proposed energy model, we formulate an energy minimization problem that jointly designs the fronthaul rate allocation, the device transmit power, and the QNN precision level, subject to a target learning accuracy, transmit power budgets, and limited fronthaul capacity. Guided by the convergence analysis, we adopt an alternating optimization method to solve the energy minimization problem. Simulation results demonstrate that the proposed FL framework can considerably reduce energy consumption compared with conventional methods, showing great potential for realizing sustainable and eco-friendly FL over Cloud-RAN.
KW - Cloud radio access network (Cloud-RAN)
KW - federated learning (FL)
KW - quantized neural network (QNN)
UR - https://www.scopus.com/pages/publications/85175147759
U2 - 10.1109/MeditCom58224.2023.10266623
DO - 10.1109/MeditCom58224.2023.10266623
M3 - Conference contribution
AN - SCOPUS:85175147759
T3 - 2023 IEEE International Mediterranean Conference on Communications and Networking, MeditCom 2023
SP - 139
EP - 144
BT - 2023 IEEE International Mediterranean Conference on Communications and Networking, MeditCom 2023
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 3rd IEEE International Mediterranean Conference on Communications and Networking, MeditCom 2023
Y2 - 4 September 2023 through 7 September 2023
ER -