Accelerating Stochastic Newton Method via Chebyshev Polynomial Approximation

Fan Sha, Jianyu Pan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

To solve large-scale optimization problems arising in machine learning, stochastic Newton methods have been proposed to reduce the cost of computing the Hessian and its inverse while still maintaining fast convergence. Recently, a second-order method named LiSSA [12] was proposed that approximates the Hessian inverse with a Taylor expansion and achieves (almost) linear running time. The approach is simple yet effective, but it can be further accelerated. In this paper, we resort to Chebyshev polynomials and their variants to approximate the Hessian inverse. Although Chebyshev polynomial approximation is widely acknowledged as the optimal polynomial approximation in the deterministic setting, here we introduce it into the stochastic setting of Newton optimization. We provide a complete convergence analysis, and experiments on multiple benchmarks show that our proposed algorithms outperform LiSSA, which validates our theoretical insights.
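To make the underlying idea concrete, the following is a minimal Python sketch of the classical Chebyshev iteration applied to the Newton step: it approximates the product of the inverse Hessian with the gradient using only Hessian-vector products, which is the role played by the Taylor expansion in LiSSA. The function names, the spectrum bounds mu and L, and the iteration count are illustrative assumptions, not the authors' exact algorithm.

    import numpy as np

    def chebyshev_newton_direction(hvp, grad, mu, L, num_iters=50):
        """Approximate H^{-1} grad via Chebyshev iteration, given a
        (possibly stochastic) Hessian-vector product `hvp` and assumed
        bounds mu <= eig(H) <= L on the Hessian spectrum."""
        d = (L + mu) / 2.0          # center of the spectrum
        c = (L - mu) / 2.0          # half-width of the spectrum
        x = np.zeros_like(grad)     # current estimate of H^{-1} grad
        r = grad - hvp(x)           # residual: grad - H x
        p = np.zeros_like(grad)
        alpha = 0.0
        for k in range(num_iters):
            if k == 0:
                p = r.copy()
                alpha = 1.0 / d
            else:
                # second step uses (1/2)(c*alpha)^2, later steps (c*alpha/2)^2
                beta = 0.5 * (c * alpha) ** 2 if k == 1 else (c * alpha / 2.0) ** 2
                alpha = 1.0 / (d - beta / alpha)
                p = r + beta * p
            x = x + alpha * p
            r = r - alpha * hvp(p)  # update residual incrementally
        return x

    # Toy usage on a fixed quadratic: hvp is exact here; in the stochastic
    # setting it would be computed from a fresh mini-batch Hessian.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 50))
    H = A @ A.T / 50 + 0.1 * np.eye(50)   # symmetric positive definite Hessian
    g = rng.standard_normal(50)
    mu, L = np.linalg.eigvalsh(H)[[0, -1]]
    step = chebyshev_newton_direction(lambda v: H @ v, g, mu, L)
    print(np.linalg.norm(H @ step - g))    # residual shrinks with more iterations

The point of the comparison is the dependence on the condition number: the error of this recurrence decays at the Chebyshev rate governed by the square root of the condition number, whereas a truncated Taylor (Neumann) series of the kind used by LiSSA decays at a rate governed by the condition number itself.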

Original language: English
Title of host publication: Advances in Knowledge Discovery and Data Mining - 27th Pacific-Asia Conference on Knowledge Discovery and Data Mining, PAKDD 2023, Proceedings
Editors: Hisashi Kashima, Tsuyoshi Ide, Wen-Chih Peng
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 523-534
Number of pages: 12
ISBN (Print): 9783031333767
DOIs
State: Published - 2023
Event: 27th Pacific-Asia Conference on Knowledge Discovery and Data Mining, PAKDD 2023 - Osaka, Japan
Duration: 25 May 2023 - 28 May 2023

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 13936 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 27th Pacific-Asia Conference on Knowledge Discovery and Data Mining, PAKDD 2023
Country/Territory: Japan
City: Osaka
Period: 25/05/23 - 28/05/23

Keywords

  • Chebyshev Polynomial
  • Condition Number
  • Hessian inverse
  • Stochastic Newton Method
