Distributed Online Optimization under Dynamic Adaptive Quantization

  • Yingjie Zhou
  • Xinyu Wang
  • Tao Li*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

6 Scopus citations

Abstract

For distributed online optimization over networked nodes, we propose an algorithm based on one-step gradient descent and multi-step consensus under dynamic adaptive quantization. We design a dynamic difference encoding-decoding strategy with variable-center quantizers and online-generated quantization intervals, which adapts the quantization parameters to the optimizers' states. A fixed gradient descent step size is used to ensure the ability to track the optimal solutions of the dynamically changing objective functions, and an upper bound on the dynamic regret is derived. The effectiveness of the proposed algorithm is demonstrated by numerical simulation.
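The brief itself gives only this high-level recipe. Under strong simplifying assumptions (quadratic local losses, a single consensus step per round instead of the paper's multi-step consensus, and a plain uniform finite-level quantizer centered at each receiver's own state standing in for the paper's dynamic difference encoding-decoding strategy), a toy sketch of quantized consensus plus one-step gradient descent might look like:

```python
import numpy as np

def quantize(x, center, half_range=1.0, levels=64):
    # Uniform finite-level quantizer with a variable center (illustrative
    # stand-in for the paper's dynamic difference encoder-decoder; all
    # parameter choices here are assumptions, not taken from the brief).
    step = 2.0 * half_range / levels
    q = np.clip(np.round((x - center) / step), -levels // 2, levels // 2)
    return center + q * step

def distributed_online_step(x, A, targets, eta=0.1):
    # One round: each node i decodes quantized versions of all neighbor
    # states (using its own state as the quantizer center), averages them
    # with the doubly stochastic mixing matrix A, then takes a fixed-step
    # gradient descent step on the local loss f_{i,t}(x) = (x - a_{i,t})^2.
    n = len(x)
    xq = np.array([[quantize(x[j], x[i]) for j in range(n)] for i in range(n)])
    consensus = (A * xq).sum(axis=1)
    grads = 2.0 * (consensus - targets)      # gradient of the quadratic loss
    return consensus - eta * grads

# Usage sketch: 4 nodes on a ring with a doubly stochastic mixing matrix,
# tracking a slowly drifting common optimum a_{i,t} = sin(0.02 t).
n = 4
A = np.zeros((n, n))
for i in range(n):
    A[i, i] = 0.5
    A[i, (i - 1) % n] = 0.25
    A[i, (i + 1) % n] = 0.25

x = np.zeros(n)
for t in range(200):
    targets = np.full(n, np.sin(0.02 * t))   # time-varying optima
    x = distributed_online_step(x, A, targets)
```

With a fixed step size, the iterates do not converge to a single point but track the moving minimizers with a bounded lag, which is what the dynamic-regret bound in the brief quantifies.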

Original language: English
Pages (from-to): 3453-3457
Number of pages: 5
Journal: IEEE Transactions on Circuits and Systems II: Express Briefs
Volume: 71
Issue number: 7
State: Published - 2024

Keywords

  • Distributed online optimization
  • dynamic adaptive quantization
  • finite dynamic difference coding
