Global Learning of Neural Networks by Using Hybrid Optimization Algorithm

PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON INTELLIGENT SYSTEMS AND KNOWLEDGE ENGINEERING (ISKE 2007)

Abstract
This paper proposes a global learning method for neural networks based on a hybrid optimization algorithm. The hybrid algorithm combines stochastic approximation with gradient descent. Stochastic approximation is first applied to estimate an approximation point biased toward the global minimum, escaping local minima; the backpropagation (BP) algorithm is then applied as the gradient-descent stage for high-speed convergence. The proposed method has been applied to the 8-bit parity check and 6-bit symmetry check problems. The experimental results show that the proposed method achieves superior convergence performance compared with the conventional approach, i.e., the BP algorithm with randomized initial weight settings.
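The two-stage scheme described in the abstract can be illustrated with a minimal sketch. The paper does not give its algorithmic details here, so everything below is an assumption for illustration: a toy one-dimensional loss with a local and a global minimum stands in for the network's error surface, a simple best-of-random-samples search stands in for the stochastic-approximation stage, and plain gradient descent stands in for BP's role as the fast local stage.

```python
import random

# Toy 1-D "loss surface" standing in for a network's error function.
# It has a local minimum near w ~ +1.13 and a global minimum near w ~ -1.30.
# All names and constants here are illustrative, not from the paper.
def loss(w):
    return w**4 - 3 * w**2 + w

def grad(w):
    # Analytic derivative of the toy loss.
    return 4 * w**3 - 6 * w + 1

def gradient_descent(w, lr=0.01, steps=500):
    """Stage 2: plain gradient descent (stands in for BP's role)."""
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def stochastic_approximation(w, sigma=1.5, trials=200, seed=0):
    """Stage 1: random sampling around the current point, keeping the
    lowest-loss sample -- a crude stand-in for the paper's stochastic
    approximation phase that escapes the local-minimum basin."""
    rng = random.Random(seed)
    best = w
    for _ in range(trials):
        cand = w + rng.gauss(0.0, sigma)
        if loss(cand) < loss(best):
            best = cand
    return best

w0 = 1.5  # starts in the basin of the *local* minimum

plain = gradient_descent(w0)                             # stuck near w ~ +1.13
hybrid = gradient_descent(stochastic_approximation(w0))  # reaches w ~ -1.30

print(f"plain GD : w={plain:+.3f}, loss={loss(plain):.3f}")
print(f"hybrid   : w={hybrid:+.3f}, loss={loss(hybrid):.3f}")
```

Run alone, gradient descent converges to the nearby local minimum; prefixing it with the stochastic sampling stage lets the combined run reach the deeper basin, mirroring the global-escape-then-fast-convergence division of labor the abstract describes.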
Keywords
neural networks,global learning,stochastic approximation,gradient descent,backpropagation algorithm