
R1-PCA: Rotational invariant L1-norm principal component analysis for robust subspace factorization

  • Chris Ding*
  • Ding Zhou
  • Xiaofeng He
  • Hongyuan Zha

  *Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Principal component analysis (PCA) minimizes the sum of squared errors (L2-norm) and is sensitive to the presence of outliers. We propose a rotational invariant L1-norm PCA (R1-PCA). R1-PCA is similar to PCA in that (1) it has a unique global solution, (2) the solutions are the principal eigenvectors of a robust covariance matrix (re-weighted to soften the effects of outliers), and (3) the solution is rotational invariant. These properties are not shared by the L1-norm PCA. A new subspace iteration algorithm is given to compute R1-PCA efficiently. Experiments on several real-life datasets show that R1-PCA can effectively handle outliers. We extend the R1-norm to K-means clustering and show that L1-norm K-means leads to poor results, while R1-K-means outperforms standard K-means.
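The abstract describes the method as subspace iteration on a robust covariance matrix whose point-wise weights soften outlier influence. The sketch below is one plausible reading of that scheme, not the paper's exact algorithm: the weight `w_i = 1/||x_i - UU^T x_i||` (inverse residual norm, a common R1-style down-weighting), the initialization, and the iteration count are all assumptions made for illustration.

```python
import numpy as np

def r1_pca_sketch(X, k, n_iter=50, eps=1e-8):
    """Hedged sketch of rotational-invariant L1 (R1) PCA.

    X : (n, d) centered data matrix; k : target subspace dimension.
    Each iteration down-weights points with large residuals off the
    current subspace, then recomputes the principal eigenvectors of
    the re-weighted covariance (subspace iteration).
    """
    # initialize with ordinary PCA directions (an assumed choice)
    U = np.linalg.svd(X, full_matrices=False)[2][:k].T   # (d, k)
    for _ in range(n_iter):
        resid = X - (X @ U) @ U.T                # residuals off the subspace
        r = np.linalg.norm(resid, axis=1)        # rotational-invariant row norms
        w = 1.0 / np.maximum(r, eps)             # soften the effect of outliers
        C = (X * w[:, None]).T @ X               # re-weighted covariance (d, d)
        vals, vecs = np.linalg.eigh(C)           # eigendecomposition (symmetric C)
        U = vecs[:, np.argsort(vals)[::-1][:k]]  # top-k principal eigenvectors
    return U
```

Because the row residual norm is invariant under rotations of the data, the resulting subspace inherits the rotational invariance that plain L1-norm PCA lacks.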

Original language: English
Title of host publication: ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning
Pages: 281-288
Number of pages: 8
Publication status: Published - 2006
Externally published: Yes
Event: ICML 2006: 23rd International Conference on Machine Learning - Pittsburgh, PA, United States
Duration: 25 Jun 2006 → 29 Jun 2006

Publication series

Name: ICML 2006 - Proceedings of the 23rd International Conference on Machine Learning
2006

Conference

Conference: ICML 2006: 23rd International Conference on Machine Learning
Country/Territory: United States
City: Pittsburgh, PA
Period: 25/06/06 → 29/06/06
