
Algorithms  2013 

ℓ1 Major Component Detection and Analysis (ℓ1 MCDA): Foundations in Two Dimensions

DOI: 10.3390/a6010012

Keywords: heavy-tailed distribution, ℓ1, ℓ2, major component, multivariate statistics, outliers, principal component analysis, 2D



Principal Component Analysis (PCA) is widely used for identifying the major components of statistically distributed point clouds. Robust versions of PCA, often based in part on the ℓ1 norm (rather than the ℓ2 norm), are increasingly used, especially for point clouds with many outliers. Neither standard PCA nor robust PCAs can provide, without additional assumptions, reliable information for outlier-rich point clouds and for distributions with several main directions (spokes). We carry out a fundamental and complete reformulation of the PCA approach in a framework based exclusively on the ℓ1 norm and heavy-tailed distributions. The ℓ1 Major Component Detection and Analysis (ℓ1 MCDA) that we propose can determine the main directions and the radial extent of 2D data from single or multiple superimposed Gaussian or heavy-tailed distributions without and with patterned artificial outliers (clutter). In nearly all cases in the computational results, 2D ℓ1 MCDA has accuracy superior to that of standard PCA and of two robust PCAs, namely, the projection-pursuit method of Croux and Ruiz-Gazen and the ℓ1 factorization method of Ke and Kanade. (Standard PCA is, of course, superior to ℓ1 MCDA for Gaussian-distributed point clouds.) The computing time of ℓ1 MCDA is competitive with the computing times of the two robust PCAs.
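The sensitivity of standard PCA to patterned clutter that motivates the paper can be illustrated with a minimal sketch: estimating the major direction of a 2D point cloud via eigendecomposition of the covariance matrix, with and without an off-axis cluster of outliers. The point cloud, the outlier pattern, and the helper names below are illustrative assumptions for this sketch; this is standard PCA only, not the paper's ℓ1 MCDA method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Inliers: a 2D Gaussian cloud stretched along the 45-degree direction (one "spoke").
theta = np.pi / 4
d = np.array([np.cos(theta), np.sin(theta)])  # true major direction (unit vector)
inliers = rng.normal(size=(200, 1)) * 3.0 * d + rng.normal(size=(200, 2)) * 0.3

# Patterned clutter: a tight cluster of outliers roughly perpendicular to the spoke.
clutter = rng.normal(loc=[-8.0, 8.0], scale=0.5, size=(40, 2))
contaminated = np.vstack([inliers, clutter])

def pca_direction(points):
    """First principal direction: top eigenvector of the sample covariance."""
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    vals, vecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    return vecs[:, np.argmax(vals)]           # eigenvector of the largest eigenvalue

def angle_error_deg(v):
    """Angle (degrees) between an estimated direction and the true direction d."""
    return np.degrees(np.arccos(min(1.0, abs(float(v @ d)))))

clean_dir = pca_direction(inliers)
noisy_dir = pca_direction(contaminated)

print(f"PCA angle error, clean cloud:        {angle_error_deg(clean_dir):.2f} deg")
print(f"PCA angle error, with clutter:       {angle_error_deg(noisy_dir):.2f} deg")
```

Because the clutter is patterned rather than isotropic, it pulls the sample covariance toward the outlier cluster's direction, and the ℓ2-based principal direction rotates far away from the true spoke; robust, ℓ1-based methods such as those compared in the paper are designed to resist exactly this effect.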

