A Large-Scale Stochastic Gradient Descent Algorithm Over a Graphon

  • Yan Chen
  • Tao Li*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citations

Abstract

We study a large-scale stochastic gradient descent algorithm over a graphon with a continuum of nodes, regarded as the limit of distributed networked optimization as the number of nodes goes to infinity. Each node has a private local cost function. The global cost function, which all nodes cooperatively minimize, is the integral of the local cost functions over the node set. We propose a stochastic gradient descent algorithm evolving as a graphon particle system, in which each node interacts heterogeneously with the others through a coupled mean field term. It is proved that if the graphon is connected, then by properly choosing the algorithm gains, all nodes' states achieve consensus uniformly in mean square. Furthermore, if the local cost functions are strongly convex, then all nodes' states converge uniformly to the minimizer of the global cost function in mean square.
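The algorithm described in the abstract can be illustrated with a small numerical sketch. The snippet below is our own finite-node discretization, not the paper's construction: the graphon W(x, y), the quadratic local costs, the noise model, and the gain sequences are all illustrative assumptions. N nodes sampled from [0, 1] take noisy gradient steps on strongly convex local costs while being coupled through a graphon-weighted mean field term; with a symmetric, connected graphon and decaying gains, the states should cluster near the minimizer of the integrated global cost.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                       # number of sampled nodes (discretization of [0, 1])
u = np.linspace(0.0, 1.0, N)  # node labels in [0, 1]

# Illustrative connected, symmetric graphon: W(x, y) = 0.5 * (x + y).
W = 0.5 * (u[:, None] + u[None, :])

# Strongly convex local costs f_u(z) = 0.5 * (z - theta(u))^2; the integral
# of these over [0, 1] is minimized at the average of theta.
theta = np.sin(2.0 * np.pi * u)   # heterogeneous local minimizers
z = rng.normal(size=N)            # initial node states

for t in range(1, 5001):
    a_t = 1.0 / t          # gradient gain (assumed decay rate)
    b_t = 1.0 / t ** 0.6   # coupling gain, decaying slower than a_t (assumption)
    # Stochastic gradient of the local cost, with additive observation noise.
    noisy_grad = (z - theta) + 0.1 * rng.normal(size=N)
    # Graphon-weighted mean field coupling: (1/N) * sum_j W_ij * (z_j - z_i).
    coupling = (W * (z[None, :] - z[:, None])).mean(axis=1)
    z = z + b_t * coupling - a_t * noisy_grad

# The spread across nodes should be small (consensus), and the common value
# should sit near the global minimizer, i.e. the average of theta.
print("spread:", np.std(z), "mean error:", abs(z.mean() - theta.mean()))
```

Because this W is symmetric, the coupling term sums to zero across nodes, so the population average of the states is driven only by the gradients toward the average of theta; the two-time-scale gains (coupling decaying slower than the gradient gain) are one standard way to enforce consensus while still tracking the global minimizer.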

Original language: English
Title of host publication: 2023 62nd IEEE Conference on Decision and Control, CDC 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 4806-4811
Number of pages: 6
ISBN (Electronic): 9798350301243
DOIs
State: Published - 2023
Event: 62nd IEEE Conference on Decision and Control, CDC 2023 - Singapore, Singapore
Duration: 13 Dec 2023 – 15 Dec 2023

Publication series

Name: Proceedings of the IEEE Conference on Decision and Control
ISSN (Print): 0743-1546
ISSN (Electronic): 2576-2370

Conference

Conference: 62nd IEEE Conference on Decision and Control, CDC 2023
Country/Territory: Singapore
City: Singapore
Period: 13/12/23 – 15/12/23
