R1-PCA: Rotational invariant L1-norm principal component analysis for robust subspace factorization

Chris Ding, Ding Zhou, Xiaofeng He, Hongyuan Zha

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

241 Scopus citations

Abstract

Principal component analysis (PCA) minimizes the sum of squared errors (L2-norm) and is sensitive to the presence of outliers. We propose a rotational invariant L1-norm PCA (R1-PCA). R1-PCA is similar to PCA in that (1) it has a unique global solution, (2) the solution consists of the principal eigenvectors of a robust covariance matrix (re-weighted to soften the effects of outliers), and (3) the solution is rotational invariant. These properties are not shared by the L1-norm PCA. A new subspace iteration algorithm is given to compute R1-PCA efficiently. Experiments on several real-life datasets show that R1-PCA can effectively handle outliers. We extend the R1-norm to K-means clustering and show that L1-norm K-means leads to poor results, while R1-K-means outperforms standard K-means.
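As a rough illustration of the subspace iteration the abstract describes, here is a minimal NumPy sketch. The function name r1_pca and the residual-based re-weighting w_i = 1 / max(||x_i - U U^T x_i||, eps) are illustrative assumptions, not the paper's exact scheme; the published algorithm's weighting and stopping rules may differ in detail.

import numpy as np

def r1_pca(X, k, n_iter=100, tol=1e-6, eps=1e-8):
    # X: (d, n) matrix of centered data points (one per column).
    # k: target subspace dimension.
    # Returns U, a (d, k) orthonormal basis of the robust subspace.
    U, _, _ = np.linalg.svd(X, full_matrices=False)  # ordinary PCA as initialization
    U = U[:, :k]
    for _ in range(n_iter):
        # Residual of each point with respect to the current subspace.
        R = X - U @ (U.T @ X)
        res = np.linalg.norm(R, axis=0)
        # Assumed re-weighting: down-weight points with large residuals.
        w = 1.0 / np.maximum(res, eps)
        # Re-weighted covariance; its top-k eigenvectors update the subspace.
        C = (X * w) @ X.T
        _, vecs = np.linalg.eigh(C)  # eigenvalues in ascending order
        U_new = vecs[:, -k:]
        # Stop when the projector onto the subspace stops changing.
        if np.linalg.norm(U_new @ U_new.T - U @ U.T) < tol:
            return U_new
        U = U_new
    return U

On data with a few gross outliers, this weighting shrinks the outliers' contribution to the covariance, which is the "soften the effects of outliers" behavior the abstract refers to.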

Original language: English
Title of host publication: ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning
Pages: 281-288
Number of pages: 8
State: Published - 2006
Externally published: Yes
Event: ICML 2006: 23rd International Conference on Machine Learning - Pittsburgh, PA, United States
Duration: 25 Jun 2006 - 29 Jun 2006

Publication series

Name: ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning
Volume: 2006

Conference

Conference: ICML 2006: 23rd International Conference on Machine Learning
Country/Territory: United States
City: Pittsburgh, PA
Period: 25/06/06 - 29/06/06
