Scaling up kernel SVM on limited resources: A low-rank linearization approach

  • Kai Zhang
  • Liang Lan
  • Zhuang Wang
  • Fabian Moerchen
Research output: Contribution to journal › Conference article › peer-review

78 Scopus citations

Abstract

Kernel Support Vector Machines deliver state-of-the-art results in non-linear classification, but the need to maintain a large number of support vectors poses a challenge in large-scale training and testing. In contrast, linear SVMs are much more scalable even on limited computing resources (e.g. everyday PCs), but the learned model cannot capture non-linear concepts. To scale up kernel SVMs on limited resources, we propose a low-rank linearization approach that transforms a non-linear SVM into a linear one via a novel, approximate empirical kernel map computed from an efficient low-rank approximation of the kernel matrix. We call it LLSVM (Low-rank Linearized SVM). We theoretically study the gap between the solutions under the optimal and the approximate kernel map, which in turn provides important guidance for the sampling-based kernel approximations. Our algorithm inherits the high efficiency of linear SVMs and the rich representability of kernel classifiers. Evaluation against large-scale linear and kernel SVMs on several truly large data sets shows that the proposed method achieves a better tradeoff between scalability and model representability.
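The core idea of the abstract, building an explicit low-rank feature map from a sampled kernel sub-matrix so that a linear SVM can stand in for the kernel SVM, can be sketched as follows. This is a minimal Nyström-style illustration in NumPy, not the authors' exact LLSVM algorithm; the RBF kernel, landmark count, and random data are assumptions for the example:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.05):
    # Pairwise RBF kernel matrix between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))          # toy data; real use: training set

# Sample m landmark points (uniform sampling; the paper's analysis of the
# approximation gap guides how such sampling should be done).
m = 50
landmarks = X[rng.choice(len(X), m, replace=False)]

# Approximate empirical kernel map:
#   phi(x) = W^{-1/2} k(x, landmarks),  where W = K(landmarks, landmarks).
W = rbf_kernel(landmarks, landmarks)
vals, vecs = np.linalg.eigh(W)
inv_sqrt = vecs @ np.diag(1.0 / np.sqrt(np.maximum(vals, 1e-12))) @ vecs.T
Phi = rbf_kernel(X, landmarks) @ inv_sqrt   # n x m explicit features

# Phi @ Phi.T is a rank-m approximation of the full kernel matrix, so a
# linear SVM trained on Phi approximates the corresponding kernel SVM.
K_full = rbf_kernel(X, X)
err = np.linalg.norm(K_full - Phi @ Phi.T) / np.linalg.norm(K_full)
print(Phi.shape, round(err, 3))
```

Any off-the-shelf linear SVM solver can then be trained on the `Phi` features, which is where the scalability of the approach comes from.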

Original language: English
Pages (from-to): 1425-1434
Number of pages: 10
Journal: Journal of Machine Learning Research
Volume: 22
State: Published - 2012
Externally published: Yes
Event: 15th International Conference on Artificial Intelligence and Statistics, AISTATS 2012 - La Palma, Spain
Duration: 21 Apr 2012 - 23 Apr 2012
