Verifiable Private Federated Learning Achieving Low-Communication with CUR Decomposition

  • Changti Wu
  • Lulu Wang
  • Lei Zhang* (*Corresponding author for this work)

Research output: Contribution to journal › Article › peer-review

Abstract

Federated learning (FL) allows multiple clients to collaboratively train a shared machine learning model without sharing local data. Despite its advantages, FL faces serious security and privacy threats. Many existing solutions rely on cryptographic methods to protect data and ensure verifiability, but these approaches often enlarge the model or impose high communication costs. They also overlook FL's limited uplink and downlink bandwidth and rarely account for practical issues such as client dropouts. To address these gaps, we propose LC-VPFL, a federated learning framework that ensures data privacy, verifiability, low communication overhead, and dropout tolerance. Our approach leverages secret sharing and masking to protect data privacy, while homomorphic hashing detects malicious server behavior. To minimize communication costs, we apply quantization and CUR matrix decomposition, optimizing both uplink and downlink transmissions. We formally prove the security of LC-VPFL and provide a theoretical analysis demonstrating that, for a corruption threshold t, the communication complexity of participating clients remains O(t) and the protocol tolerates arbitrary client dropouts. Experimental results show that LC-VPFL reduces uplink communication costs by over 50% in most scenarios and downlink communication costs to less than 12.5% of those in FedAvg, with an accuracy loss within 3%.
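To give a feel for the communication-reduction step, the following is a minimal NumPy sketch of a CUR decomposition: a matrix is approximated from k sampled columns (C), k sampled rows (R), and a small k-by-k linking matrix (U). The squared-norm sampling and the pseudoinverse-based choice of U here are generic textbook heuristics assumed for illustration, not the specific construction used in LC-VPFL.

```python
import numpy as np

def cur_decompose(A, k, seed=None):
    """Approximate A ~= C @ U @ R from k sampled columns and k sampled rows.

    Columns and rows are drawn with probability proportional to their
    squared Euclidean norms (a common heuristic; the paper's actual
    sampling scheme may differ).
    """
    rng = np.random.default_rng(seed)
    col_p = np.sum(A**2, axis=0)
    row_p = np.sum(A**2, axis=1)
    cols = rng.choice(A.shape[1], size=k, replace=False, p=col_p / col_p.sum())
    rows = rng.choice(A.shape[0], size=k, replace=False, p=row_p / row_p.sum())
    C = A[:, cols]                                   # m x k sampled columns
    R = A[rows, :]                                   # k x n sampled rows
    U = np.linalg.pinv(C) @ A @ np.linalg.pinv(R)    # k x k linking matrix
    return C, U, R

# A rank-1 8x6 matrix: sending C, U, R costs m*k + k*k + k*n entries
# instead of m*n, and the low-rank structure is recovered almost exactly.
A = np.outer(np.arange(1.0, 9.0), np.arange(1.0, 7.0))
C, U, R = cur_decompose(A, k=2, seed=0)
err = np.linalg.norm(A - C @ U @ R) / np.linalg.norm(A)
```

For an 8x6 update with k = 2, the three factors hold 8*2 + 2*2 + 2*6 = 32 entries versus 48 for the full matrix; the savings grow with the matrix dimensions, which is what drives the uplink/downlink reductions reported in the abstract.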

Original language: English
Journal: IEEE Transactions on Dependable and Secure Computing
State: Accepted/In press - 2026

Keywords

  • dropout tolerance
  • federated learning
  • low communication
  • privacy preserving
  • verifiability

