Generalised independent component analysis through unsupervised learning with emergent Bussgang properties

Girolami, M., Fyfe, C.

International Conference on Neural Networks, 1997

Citations 51 | Views 6
Abstract
We utilise an information theoretic criterion for exploratory projection pursuit (EPP) and show that maximisation, by natural gradient ascent, of the divergence of a multivariate distribution from normality, using negentropy as the distance measure, yields a generalised independent component analysis (ICA). By considering a Gram-Charlier approximation of the latent probability density functions (PDFs), we develop a generalised neuron nonlinearity which can be regarded as a conditional mean estimator of the underlying independent components. The unsupervised learning rule developed is shown to asymptotically exhibit the Bussgang property and, as such, produces output data with independent components, irrespective of whether the independent latent variables are sub-gaussian or super-gaussian. Improved convergence speeds are reported when momentum terms are introduced into the learning rule.
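The update described in the abstract can be illustrated with a short sketch. The following Python snippet is an illustrative natural-gradient ICA rule with a kurtosis-sign-adaptive nonlinearity (extended-Infomax style) and a momentum term on the weight update; it is not the paper's exact Gram-Charlier-derived conditional mean estimator, and the function name natural_gradient_ica and its parameters are assumptions made for the example.

```python
import numpy as np

def natural_gradient_ica(X, n_iter=2000, lr=0.01, momentum=0.5, seed=0):
    """Illustrative natural-gradient ICA sketch (not the paper's exact rule).

    X : array of shape (n_sources, n_samples), zero-mean (ideally whitened) mixtures.
    Returns an unmixing matrix W such that Y = W @ X has approximately
    independent components, whether the sources are sub- or super-gaussian.
    """
    rng = np.random.default_rng(seed)
    n, T = X.shape
    W = np.eye(n) + 0.1 * rng.standard_normal((n, n))
    dW_prev = np.zeros_like(W)

    for _ in range(n_iter):
        Y = W @ X
        # Choose the nonlinearity from the sign of each output's kurtosis:
        # +1 (super-gaussian) -> phi(y) = y + tanh(y)
        # -1 (sub-gaussian)   -> phi(y) = y - tanh(y)
        kurt = np.mean(Y**4, axis=1) - 3.0 * np.mean(Y**2, axis=1) ** 2
        k = np.sign(kurt)
        phi = Y + k[:, None] * np.tanh(Y)
        # Natural-gradient ascent step: dW = lr * (I - E[phi(y) y^T]) W,
        # with a momentum term added to the update.
        grad = (np.eye(n) - (phi @ Y.T) / T) @ W
        dW = lr * grad + momentum * dW_prev
        W += dW
        dW_prev = dW
    return W


# Illustrative usage: recover a sub-gaussian (uniform) and a super-gaussian
# (Laplacian) source from a random linear mixture.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    T = 5000
    S = np.vstack([rng.uniform(-1, 1, T), rng.laplace(0, 1, T)])
    S -= S.mean(axis=1, keepdims=True)
    A = rng.standard_normal((2, 2))
    X = A @ S

    # Whiten the mixtures (zero mean, identity covariance) before unmixing.
    X -= X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    V = (E / np.sqrt(d)) @ E.T
    W = natural_gradient_ica(V @ X)
    print("Global system W V A (ideally a scaled permutation):\n", W @ V @ A)
```

At the fixed point of this rule, E[phi(y) y^T] equals the identity, so each output satisfies a Bussgang-type condition E[phi(y_i) y_j] = 0 for i != j, which is the sense in which such rules drive the outputs toward independence.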
Keywords
entropy, neural nets, unsupervised learning, gram-charlier approximation, ica, pdf, convergence, distance measure, emergent bussgang properties, exploratory projection pursuit, generalised independent component analysis, generalised neuron nonlinearity, independent latent variables, information theory, latent probability density functions, multivariate distribution, natural gradient ascent, negentropy, normality, latent variable, information systems, independent component analysis, projection pursuit, probability density function, telephony, distributed computing, principal component analysis