Separating variables to accelerate non-convex regularized optimization

Research output: Contribution to journal › Article › peer-review

2 Scopus citations

Abstract

In this paper, a novel variable separation algorithm, stemming from the idea of orthogonalization EM, is proposed to minimize a general objective function with a non-convex regularizer. The main idea of the algorithm is to construct a surrogate function by adding a term that allows the minimization to be solved separately for each component. Several attractive theoretical properties of the new algorithm are established: it converges to a critical point under the condition that the objective function is coercive or that the generated sequence lies in a compact set, and its convergence rate is also derived. The Barzilai–Borwein (BB) rule and Nesterov's method are used to accelerate the algorithm, which can also be applied to minimize general functions with group-structured regularizers. Simulation and real-data results show that these techniques substantially accelerate the proposed method.
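The paper's exact surrogate construction is not reproduced on this page. As an illustrative sketch only, the snippet below implements the general idea the abstract describes: at each iterate, a quadratic term is added so the surrogate splits across components, and each component is then minimized in closed form via the regularizer's proximal map. The MCP penalty, the function names, and all parameter choices here are assumptions for illustration, not the authors' algorithm.

```python
import numpy as np

def mcp_prox(z, lam, t, gamma=3.0):
    """Componentwise closed-form minimizer of (1/(2t))(x - z)^2 + MCP(x; lam, gamma).

    MCP is a common non-convex regularizer; separability is what makes this
    per-component solve cheap. Requires gamma > t.
    """
    out = z.copy()
    small = np.abs(z) <= gamma * lam               # region where MCP is active
    soft = np.sign(z) * np.maximum(np.abs(z) - t * lam, 0.0)
    out[small] = soft[small] / (1.0 - t / gamma)   # rescaled soft-threshold
    return out                                     # |z| > gamma*lam: unpenalized

def separable_descent(grad_f, x0, lam, t=1.0, iters=100, gamma=3.0):
    """Minimize f(x) + sum_i MCP(x_i) via a separable quadratic surrogate.

    Adding (1/(2t))||x - (x_k - t*grad_f(x_k))||^2 to the regularizer makes
    the surrogate separate across components, so each update is a prox step.
    """
    x = x0.copy()
    for _ in range(iters):
        x = mcp_prox(x - t * grad_f(x), lam, t, gamma)
    return x

# Toy example: f(x) = 0.5 * ||x - b||^2, so grad_f(x) = x - b.
b = np.array([2.0, 0.05, -3.0])
x_hat = separable_descent(lambda x: x - b, np.zeros(3), lam=0.5)
# Large entries of b survive; the small entry 0.05 is thresholded to zero.
```

A BB step size or a Nesterov momentum term, as mentioned in the abstract, would replace the fixed `t` or extrapolate the iterates, respectively; the separable prox step itself is unchanged.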

Original language: English
Article number: 106943
Journal: Computational Statistics and Data Analysis
Volume: 147
DOIs
State: Published - Jul 2020

Keywords

  • Acceleration
  • Convergence
  • Non-convex regularization
  • Optimization
  • Variable separation algorithm

