Satellite Federated Fine-Tuning for Foundation Models in Space Computing Power Networks

Yan Zhu*, Jingyang Zhu, Ting Wang, Yuanming Shi, Chunxiao Jiang, Khaled B. Letaief

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Advances in artificial intelligence and low-Earth-orbit (LEO) satellites have promoted the application of large remote sensing foundation models (FMs) to various downstream tasks. However, directly downloading these models for fine-tuning on the ground is impeded by privacy concerns and limited bandwidth. Satellite federated learning (FL) offers a solution by fine-tuning models directly on-board satellites and aggregating model updates without downloading data. Nevertheless, for large FMs, the computational capacity of satellites is insufficient to support effective on-board fine-tuning in traditional satellite FL frameworks. To address this challenge, we propose a satellite-ground collaborative federated fine-tuning framework. The key to the framework lies in reasonably decomposing and allocating model components so as to alleviate insufficient on-board computation capability. During fine-tuning, satellites exchange intermediate results with ground stations or other satellites for forward propagation and backpropagation, which raises communication challenges stemming from the special topology of space transmission networks: intermittent satellite-ground connectivity, short satellite-ground communication windows, and unstable inter-orbit inter-satellite links. To reduce transmission delays, we further introduce tailored communication strategies that jointly exploit communication and computing resources. Specifically, we propose a parallel intra-orbit communication strategy, a topology-aware satellite-ground communication strategy, and a latency-minimizing inter-orbit communication strategy to reduce space communication costs. Simulation results demonstrate that the proposed framework reduces training time to 33% of that required by purely on-board training.
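The satellite-ground collaboration described in the abstract rests on split execution: the satellite runs the lower layers forward and transmits intermediate activations (not raw data), while the ground station runs the upper layers, computes the loss, and sends activation gradients back for backpropagation. The following is a minimal sketch of that exchange; the single split point, layer shapes, frozen on-board backbone, and MSE objective are illustrative assumptions, not the paper's actual model decomposition.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical split: the satellite holds a frozen lower layer (backbone),
# the ground station holds the trainable upper layer (fine-tuned head).
W_sat = rng.normal(size=(8, 4))         # on-board layer weights (frozen)
W_gnd = rng.normal(size=(4, 2)) * 0.1   # ground-side head (fine-tuned)

def satellite_forward(x):
    # Satellite computes intermediate activations and transmits them
    # over the satellite-ground link instead of the raw sensing data.
    return np.maximum(x @ W_sat, 0.0)   # ReLU activations

def ground_step(h, y, lr=0.01):
    # Ground station finishes the forward pass, computes the loss
    # gradient, updates its head, and returns the gradient w.r.t. the
    # received activations for on-board backpropagation (if any
    # satellite-side parameters were trainable).
    global W_gnd
    pred = h @ W_gnd
    err = pred - y                      # d(MSE)/d(pred), up to a scale
    grad_W = h.T @ err / len(h)
    grad_h = err @ W_gnd.T              # sent back over the downlink
    W_gnd -= lr * grad_W
    return grad_h, float(np.mean(err ** 2))

# One fine-tuning round on a synthetic batch.
x = rng.normal(size=(16, 8))
y = rng.normal(size=(16, 2))
losses = []
for _ in range(100):
    h = satellite_forward(x)            # "uplink": activations
    grad_h, loss = ground_step(h, y)    # "downlink": activation gradients
    losses.append(loss)
```

In a real deployment each exchange would be constrained by the communication windows and link stability the abstract highlights, which is what the proposed intra-orbit, satellite-ground, and inter-orbit strategies are designed to mitigate.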

Original language: English
Journal: IEEE Transactions on Wireless Communications
State: Accepted/In press - 2025

Keywords

  • edge learning
  • fine-tuning
  • foundation models
  • satellite communications
  • satellite federated learning
