Solving The Local Minimum And Flat-Spot Problem By Modifying Wrong Outputs For Feed-Forward Neural Networks

IJCNN (2013)

Abstract
The backpropagation (BP) algorithm, which is very popular in supervised learning, is extensively applied in training feed-forward neural networks. Many modifications have been proposed to speed up the convergence of the standard BP algorithm, but they seldom address its global convergence capability. This paper proposes a new algorithm, Wrong Output Modification (WOM), to improve the global convergence capability of a fast learning algorithm. When a learning process is trapped in a local minimum or a flat-spot area, WOM looks for outputs that lie at the opposite extreme from their target outputs and modifies them systematically so that they move closer to their targets; the weights of the corresponding neurons are changed accordingly. These changes are intended to let the learning process escape from such local minima or flat-spot areas and then converge. The performance investigation shows that the proposed algorithm can be applied to different fast learning algorithms, whose global convergence capabilities are improved significantly compared with the original algorithms. Moreover, statistical data obtained from this algorithm can be used to gauge the difficulty of a learning problem.
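The abstract only sketches the WOM mechanism, so the following NumPy snippet is a minimal illustration of the idea under stated assumptions: a sigmoid output layer, a threshold (extreme) for deciding that an output sits at the opposite extreme from its target, and a step size (push) for moving it back toward the target. All three, and the minimum-norm weight update below, are illustrative choices, not the paper's formulas.

    import numpy as np

    def wom_step(W_out, hidden, outputs, targets, extreme=0.9, push=0.5):
        # Identify "wrong" outputs: those at the opposite extreme from
        # their (binary) targets, e.g. target 1 but output near 0.
        wrong = np.abs(outputs - targets) > extreme
        if not np.any(wrong):
            return W_out

        eps = 1e-6
        outputs = np.clip(outputs, eps, 1.0 - eps)
        # Move each wrong output part-way toward its target.
        desired = np.clip(outputs + push * (targets - outputs), eps, 1.0 - eps)

        # Pre-activation change needed to realise the desired outputs,
        # obtained by inverting the sigmoid (the logit function).
        def logit(p):
            return np.log(p / (1.0 - p))
        d_net = logit(desired) - logit(outputs)

        # Minimum-norm change to the output-layer weights that produces
        # d_net for the current hidden activations (rank-1 least squares).
        scale = d_net / (np.dot(hidden, hidden) + eps)
        W_out[wrong] += np.outer(scale[wrong], hidden)
        return W_out

In a training loop, a step like this would be invoked only when progress stalls (for example, when the training error has not improved for a fixed number of epochs), after which the ordinary gradient-based updates of the underlying fast learning algorithm resume.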
Keywords
supervised learning, statistical analysis, backpropagation algorithm, fast learning algorithm, global convergence capability, feed-forward neural networks, backpropagation, local minimum, feedforward neural nets, WOM, convergence, statistical data, standard BP algorithm, flat-spot area, wrong output modification