
Scaling Up Kernel SVM on Limited Resources: A Low-Rank Linearization Approach

  • Liang Lan
  • Zhuang Wang
  • Shandian Zhe
  • Wei Cheng
  • Jun Wang
  • Kai Zhang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Kernel support vector machines (SVMs) deliver state-of-the-art results on many real-world nonlinear classification problems, but maintaining a large number of support vectors makes their computational cost quite demanding. Linear SVM, on the other hand, is highly scalable to large data but suited only to linearly separable problems. In this paper, we propose a novel approach called low-rank linearized SVM to scale up kernel SVM on limited resources. Our approach transforms a nonlinear SVM into a linear one via an approximate empirical kernel map computed from efficient kernel low-rank decompositions. We theoretically analyze the gap between the solutions of the approximate and optimal rank-k kernel maps, which in turn provides guidance on the sampling scheme of the Nyström approximation. Furthermore, we extend the approach to a semisupervised metric learning scenario in which partially labeled samples can be exploited to further improve the quality of the low-rank embedding. Our approach inherits the rich representability of kernel SVM and the high efficiency of linear SVM. Experimental results demonstrate that our approach is more robust and achieves a better tradeoff between model representability and scalability than state-of-the-art algorithms for large-scale SVMs.
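The core construction described in the abstract, an approximate empirical kernel map from a Nyström low-rank decomposition, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the RBF kernel with `gamma=0.2`, and the uniform choice of 60 landmarks are all assumptions made for the example.

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    """Gaussian RBF kernel matrix between the rows of A and B."""
    d2 = (np.sum(A * A, axis=1)[:, None]
          + np.sum(B * B, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

def nystrom_embedding(X, landmarks, gamma, k):
    """Rank-k empirical kernel map via the Nystrom approximation.

    Returns Phi (n x k) with Phi @ Phi.T approximating K(X, X), so a
    linear SVM trained on Phi approximates the kernel SVM on X.
    """
    C = rbf_kernel(X, landmarks, gamma)          # n x m cross-kernel
    W = rbf_kernel(landmarks, landmarks, gamma)  # m x m landmark kernel
    vals, vecs = np.linalg.eigh(W)               # eigenvalues in ascending order
    vals, vecs = vals[-k:], vecs[:, -k:]         # keep the top-k eigenpairs
    vals = np.maximum(vals, 1e-12)               # guard against tiny eigenvalues
    return C @ (vecs / np.sqrt(vals))            # Phi = C V_k Lambda_k^{-1/2}

# Toy data: 200 points in 2-D, 60 uniformly sampled landmarks (illustrative).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
idx = rng.choice(200, size=60, replace=False)
Phi = nystrom_embedding(X, X[idx], gamma=0.2, k=40)

# How well the rank-40 map reproduces the full kernel matrix.
K = rbf_kernel(X, X, gamma=0.2)
rel_err = np.linalg.norm(K - Phi @ Phi.T) / np.linalg.norm(K)
```

In the paper's pipeline, `Phi` would then be handed to an off-the-shelf linear SVM solver; the theoretical gap analysis mentioned in the abstract concerns how the landmark sampling scheme affects the quality of this rank-k map.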

Original language: English
Article number: 8392379
Pages (from-to): 369-378
Number of pages: 10
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 30
Issue number: 2
DOI
Publication status: Published - Feb 2019
Externally published: Yes
