High Dimensional Principal Component Scores and Data Visualization

mag(2012)

Abstract
Principal component analysis is a useful dimension reduction and data visualization method. However, in high dimension, low sample size asymptotic contexts, where the sample size is fixed and the dimension goes to infinity, a paradox has arisen. In particular, despite the useful real data insights commonly obtained from principal component score visualization, these scores are not consistent even when the sample eigenvectors are consistent. This paradox is resolved by an asymptotic study of the ratio between the sample and population principal component scores. In particular, this ratio is seen to converge to a non-degenerate random variable. The realization is the same for each data point; that is, there is a common random rescaling, which appears within each eigen-direction. This gives inconsistent axis labels for the standard scores plot, yet the relative positions of the points (typically the main visual content) are consistent. The paradox disappears when the sample size goes to infinity.
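The common-rescaling phenomenon can be illustrated with a small simulation. The spiked-covariance model and every parameter choice below are illustrative assumptions, not taken from the paper: a single strong population direction plus unit noise, with the sample size fixed and the dimension large.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 25, 20_000            # fixed (small) sample size, very high dimension
spike = 50 * d               # assumed spiked eigenvalue, strong enough for eigenvector consistency

# Illustrative spiked model: one population direction v with variance `spike`, unit noise elsewhere.
v = np.zeros(d)
v[0] = 1.0
z = rng.standard_normal(n)                         # standardized population scores
X = np.sqrt(spike) * np.outer(z, v) + rng.standard_normal((n, d))

# Sample PCA via SVD; the sample PC1 scores are U[:, 0] * s[0].
U, s, _ = np.linalg.svd(X, full_matrices=False)
sample_scores = U[:, 0] * s[0]
pop_scores = np.sqrt(spike) * z                    # population PC1 scores, X_i . v

# Point-wise ratio of sample to population scores: roughly one common factor,
# so the relative positions of the points agree even if the axis scale does not.
ratio = sample_scores / pop_scores
print("score correlation:", np.corrcoef(sample_scores, pop_scores)[0, 1])
print("ratio spread (CV):", ratio.std() / abs(ratio.mean()))
```

The near-perfect correlation (up to the arbitrary sign of the sample eigenvector) is what makes the scores plot visually trustworthy; the nonzero ratio reflects the rescaling of the axis labels.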