Opposition-based differential evolution for beta basis function neural network

IEEE Congress on Evolutionary Computation (2010)

Abstract
Many methods for solving optimization problems, whether direct or indirect, rely upon gradient information and may therefore converge to a local optimum. Global optimization methods such as evolutionary algorithms overcome this problem, although they are computationally expensive owing to the slow nature of the evolutionary process. In this work, a new concept is investigated to accelerate differential evolution. Opposition-based differential evolution (ODE) uses the notion of the opposite number to create a new population during the learning process, improving both the convergence rate and the generalization performance of the beta basis function neural network (BBFNN). The proposed algorithm uses a dichotomous search to determine the target solution. A detailed performance comparison of ODE-BBFNN with other learning algorithms is carried out on benchmark problems drawn from regression and time-series prediction. The results show that ODE-BBFNN produces better generalization performance.
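The abstract describes the opposite-number mechanism only in prose. The Python sketch below is a hedged illustration of how opposition-based population generation is commonly realized in ODE; all names (opposite_population, opposition_based_init, the sphere fitness in the usage example) are hypothetical and not taken from the paper, and the BBFNN training itself is not reproduced here.

```python
import numpy as np

def opposite_population(pop, lower, upper):
    """Opposite of x in [a, b] is a + b - x, applied component-wise."""
    return lower + upper - pop

def opposition_based_init(pop_size, dim, lower, upper, fitness, rng=None):
    """Draw a random DE population, evaluate it together with its opposite
    population, and keep the fittest half (minimization assumed)."""
    rng = np.random.default_rng() if rng is None else rng
    pop = rng.uniform(lower, upper, size=(pop_size, dim))
    opp = opposite_population(pop, lower, upper)
    merged = np.vstack([pop, opp])
    scores = np.apply_along_axis(fitness, 1, merged)
    best = np.argsort(scores)[:pop_size]
    return merged[best]

if __name__ == "__main__":
    # Hypothetical usage: minimize the sphere function in 5 dimensions;
    # in the paper the fitness would instead be the BBFNN training error.
    dim = 5
    lower = np.full(dim, -10.0)
    upper = np.full(dim, 10.0)
    sphere = lambda x: float(np.sum(x ** 2))
    pop = opposition_based_init(pop_size=20, dim=dim,
                                lower=lower, upper=upper, fitness=sphere)
    print(pop.shape)  # (20, 5)
```

In full opposition-based DE, the same opposite-versus-current comparison is also applied after each generation with a small jumping probability; only the initialization step is sketched here.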
Keywords
evolutionary computation, generalisation (artificial intelligence), learning (artificial intelligence), neural nets, beta basis function neural network, dichotomous search, evolutionary algorithms, generalization performance, learning process, opposite number concept, opposition-based differential evolution