Exhaustive Learning.

Daniel B. Schwartz, Vijay K. Samalam, Sara A. Solla, John S. Denker

Neural Computation (1990)

Abstract
Exhaustive exploration of an ensemble of networks is used to model learning and generalization in layered neural networks. A simple Boolean learning problem involving networks with binary weights is numerically solved to obtain the entropy S_m and the average generalization ability G_m as a function of the size m of the training set. Learning curves G_m vs. m are shown to depend solely on the distribution of generalization abilities over the ensemble of networks. Such distribution is determined prior to learning, and provides a novel theoretical tool for the prediction of network performance on a specific task.
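The procedure sketched in the abstract can be illustrated numerically. The following is a minimal sketch, not the paper's exact construction: it assumes an ensemble of single-layer perceptrons with weights in {-1, +1}, picks one member as the target Boolean rule, exhaustively enumerates the ensemble, and estimates the entropy S_m (log of the number of networks consistent with the training set) and the average generalization ability G_m by averaging over random training sets of size m. The network family, input size N, and number of trials are illustrative assumptions.

```python
# Sketch of exhaustive learning over a binary-weight ensemble (assumptions:
# single-layer perceptron ensemble, N = 6 inputs, 200 random training sets).
import itertools
import math
import random

N = 6                                   # number of binary inputs
inputs = list(itertools.product([-1, 1], repeat=N))

def output(weights, x):
    """Perceptron output: sign of the weighted sum (ties broken to +1)."""
    s = sum(w * xi for w, xi in zip(weights, x))
    return 1 if s >= 0 else -1

# Exhaustively enumerate the ensemble: all 2^N binary weight vectors.
ensemble = list(itertools.product([-1, 1], repeat=N))

# One network serves as the target rule defining the Boolean learning problem.
target = ensemble[0]
labels = {x: output(target, x) for x in inputs}

# Generalization ability of each network: fraction of inputs on which it
# agrees with the target.  Its distribution over the ensemble is what the
# paper argues determines the entire learning curve.
g = {w: sum(output(w, x) == labels[x] for x in inputs) / len(inputs)
     for w in ensemble}

def learning_curve(max_m, trials=200):
    """Estimate S_m and G_m for the sub-ensemble of networks consistent
    with m randomly drawn training examples, averaged over training sets."""
    for m in range(max_m + 1):
        S_sum, G_sum = 0.0, 0.0
        for _ in range(trials):
            train = random.sample(inputs, m)
            consistent = [w for w in ensemble
                          if all(output(w, x) == labels[x] for x in train)]
            S_sum += math.log(len(consistent))     # target is always consistent
            G_sum += sum(g[w] for w in consistent) / len(consistent)
        print(f"m={m:2d}  S_m={S_sum / trials:6.3f}  G_m={G_sum / trials:.3f}")

learning_curve(max_m=10)
```

As m grows, the consistent sub-ensemble shrinks (S_m decreases) and its members are increasingly biased toward high generalization ability (G_m increases), which is the qualitative behaviour of the learning curves discussed in the paper.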