Classification and ICA using maximum likelihood Hebbian learning

NNSP (2002)

Abstract
We investigate an extension of Hebbian learning in a principal component analysis network which has been derived to be optimal for a specific probability density function (PDF). We note that this PDF is one of a family of PDFs, and we investigate the learning rules formed so as to be optimal for several members of this family. We show that, whereas previous authors have viewed the single member of the family as an extension of PCA, it is more appropriate to view the whole family of learning rules as methods of performing exploratory projection pursuit (EPP). We explore the performance of our method first on an artificial data set and then on a real data set.
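The family of rules described above can be illustrated with a short sketch. The Python code below is a minimal, hedged illustration assuming the usual negative-feedback formulation, in which the reconstruction residual is modelled by an exponential-family PDF proportional to exp(-|e|^p); the function name ml_hebbian_learning, the parameter values, and the toy data are illustrative assumptions rather than the authors' implementation. Setting p = 2 recovers the standard PCA-like Hebbian rule, while other values of p yield the EPP-style members of the family.

import numpy as np

def ml_hebbian_learning(X, n_components=2, p=1.5, lr=0.01, n_epochs=100, seed=0):
    # Maximum likelihood Hebbian learning on a negative-feedback network (sketch).
    # Residuals are assumed to follow a PDF proportional to exp(-|e|^p).
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    W = rng.normal(scale=0.1, size=(n_components, n_features))
    for _ in range(n_epochs):
        for x in X[rng.permutation(n_samples)]:
            y = W @ x                   # feedforward activation
            e = x - W.T @ y             # negative-feedback residual
            # ML update: delta W proportional to y * f(e)^T, f(e) = sign(e)|e|^(p-1)
            f_e = np.sign(e) * np.abs(e) ** (p - 1)
            W += lr * np.outer(y, f_e)
    return W

# Toy usage: project 5-dimensional data onto 2 learned directions.
X = np.random.default_rng(1).normal(size=(500, 5))
W = ml_hebbian_learning(X, n_components=2, p=1.5)
projections = X @ W.T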
Keywords
hebbian learning,independent component analysis,probability,signal classification,ica,pdf,artificial data type,classification,learning rules,maximum likelihood hebbian learning,principal component analysis network,probability density function,artificial neural networks,hebbian theory,principal component analysis,data type,projection pursuit,negative feedback,nonlinear equations,maximum likelihood,computational intelligence