Dimensionality Reduction Methods' Comparison Based on Statistical Dependencies

7th International Conference on Ambient Systems, Networks and Technologies (ANT 2016) / The 6th International Conference on Sustainable Energy Information Technology (SEIT-2016) / Affiliated Workshops (2016)

Abstract
The field of machine learning encompasses a wide variety of algorithms that can transform observed data into many forms, and dimensionality reduction (DR) is one such transformation. There are many high-quality papers that compare some of the DR approaches, and other experiments that apply them with success. However, few focus on the information lost, the increase of relevance, or the decrease of uncertainty during the transformation, which is hard to estimate and which only a few studies remark on briefly. This study aims to explain these inner features of four different DR algorithms. These algorithms were not chosen randomly but deliberately: a representative was selected from each of the major groups of DR methods. The comparison criteria are based on statistical dependencies, such as the correlation coefficient, Euclidean distance, mutual information, and Granger causality. The winning algorithm should transform the input dataset reasonably while keeping most of its inner dependencies. (C) 2016 The Authors. Published by Elsevier B.V.
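The evaluation idea described in the abstract, measuring a statistical dependency (e.g. mutual information) before and after a DR transformation to see how much of it survives, can be sketched as follows. This is a minimal illustration, not the paper's actual protocol: it assumes scikit-learn's PCA and `mutual_info_regression` as stand-ins for one DR method and one dependency criterion, and uses a synthetic dataset in which the second feature depends on the first.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)

# Synthetic data: feature 1 is a noisy linear function of feature 0,
# features 2-4 are independent noise.
x = rng.normal(size=(500, 1))
X = np.hstack([x, 2 * x + 0.1 * rng.normal(size=(500, 1)),
               rng.normal(size=(500, 3))])

# Treat the first feature as the reference signal whose dependencies
# we want the transformation to preserve.
y = X[:, 0]

# Dependency criterion before the transformation.
mi_before = mutual_info_regression(X, y, random_state=0)

# Apply one DR method (here PCA to 2 components) and re-measure.
Z = PCA(n_components=2).fit_transform(X)
mi_after = mutual_info_regression(Z, y, random_state=0)

# A DR method that keeps the inner dependencies should retain a large
# share of the mutual information in its components.
print("MI before:", mi_before.sum(), "MI after:", mi_after.sum())
```

The same loop could be repeated for each candidate algorithm (PCA, NMF, an autoencoder, NPE) and each criterion (correlation coefficient, Euclidean distance, mutual information, Granger causality), ranking the algorithms by how much of each dependency survives the transformation.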
Keywords
Principal Component Analysis, Non-negative Matrix Factorization, Autoencoder, Neighborhood Preserving Embedding, Granger Causality, Mutual Information