Block-quantized kernel matrix for fast spectral embedding

  • Kai Zhang*
  • James T. Kwok

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

8 Scopus citations

Abstract

Eigendecomposition of the kernel matrix is an indispensable procedure in many learning and vision tasks. However, its cubic complexity O(N³) is impractical for large problems, where N is the data size. In this paper, we propose an efficient approach to solving the eigendecomposition of the kernel matrix W. The idea is to approximate W with a matrix W̄ composed of m² constant blocks. The eigenvectors of W̄, which can be computed in O(m³) time, are then used to recover the eigenvectors of the original kernel matrix. The complexity of our method is only O(mN + m³), which scales more favorably than state-of-the-art low-rank approximation and sampling-based approaches (O(m²N + m³)), and the approximation quality can be controlled conveniently. Our method demonstrates encouraging scaling behavior in experiments on image segmentation (by spectral clustering) and kernel principal component analysis.
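The core idea in the abstract can be sketched as follows. This is a hypothetical NumPy implementation, not the authors' code: it assumes an RBF kernel, a plain k-means quantizer producing m clusters, and the standard algebraic fact that the block-constant surrogate W̄ = E·K_c·Eᵀ (E the N×m cluster-indicator matrix, K_c the m×m matrix of block values) has the same nonzero eigenvalues as the small matrix D^{1/2}·K_c·D^{1/2}, where D = diag(cluster sizes); its eigenvectors lift to eigenvectors of W̄ via E·D^{-1/2}. All function names here are illustrative.

```python
import numpy as np


def rbf(A, B, gamma=1.0):
    # Pairwise RBF kernel between the rows of A and the rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)


def kmeans(X, m, iters=20, seed=0):
    # Minimal Lloyd's k-means; returns centers and point-to-cluster labels.
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), m, replace=False)]
    for _ in range(iters):
        lab = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1).argmin(1)
        for j in range(m):
            if (lab == j).any():
                C[j] = X[lab == j].mean(0)
    return C, lab


def block_quantized_eig(X, m, k, gamma=1.0):
    """Top-k approximate eigenpairs of the kernel matrix of X via an
    m^2-block-constant surrogate Wbar, in O(mN + m^3) time."""
    C, lab = kmeans(X, m)
    n_j = np.bincount(lab, minlength=m).astype(float)  # cluster sizes (diag of D)
    assert n_j.min() > 0, "empty cluster; rerun quantizer"
    Kc = rbf(C, C, gamma)                              # m x m block values
    S = np.sqrt(n_j)                                   # D^{1/2}
    # Eigendecompose the small m x m matrix D^{1/2} Kc D^{1/2}.
    lam, V = np.linalg.eigh(S[:, None] * Kc * S[None, :])
    lam, V = lam[::-1][:k], V[:, ::-1][:, :k]          # keep top-k pairs
    # Lift back to length-N eigenvectors of Wbar: U = E D^{-1/2} V.
    U = (V / S[:, None])[lab]
    return lam, U
```

By construction the lifted columns of U are unit-norm eigenvectors of W̄ exactly; how well they approximate the eigenvectors of the original W depends on the quantization error, which shrinks as m grows.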

Original language: English
Title of host publication: ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning
Pages: 1097-1104
Number of pages: 8
State: Published - 2006
Externally published: Yes
Event: ICML 2006: 23rd International Conference on Machine Learning - Pittsburgh, PA, United States
Duration: 25 Jun 2006 - 29 Jun 2006

Publication series

Name: ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning
Volume: 2006

Conference

Conference: ICML 2006: 23rd International Conference on Machine Learning
Country/Territory: United States
City: Pittsburgh, PA
Period: 25/06/06 - 29/06/06
