Improved machine learning technique for solving Hausdorff derivative diffusion equations

Fractals: Complex Geometry, Patterns and Scaling in Nature and Society (2020)

Abstract
This study aims to combine a machine learning technique with the Hausdorff derivative to solve one-dimensional Hausdorff derivative diffusion equations. In the proposed artificial neural network method, a multilayer feed-forward neural network is chosen and improved by applying the Hausdorff derivative to the activation function of the hidden layers. A trial solution, constructed by combining the boundary and initial condition terms with the network output, approximates the analytical solution. To transform the original Hausdorff derivative equation into a minimization problem, an error function is defined, and the network coefficients are updated by the gradient descent algorithm in the back-propagation process. Two numerical examples are given to illustrate the accuracy and robustness of the proposed method. The results show that the improved machine learning technique is efficient for computing Hausdorff derivative diffusion equations in terms of both computational accuracy and stability.
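The workflow described in the abstract (trial solution built from the initial/boundary conditions plus a network term, with gradient descent minimizing a residual error function) can be sketched as below. This is a minimal illustration under assumptions, not the paper's implementation: it assumes the equation form ∂u/∂t^α = k ∂²u/∂x² with u(x,0) = sin(πx) and zero Dirichlet boundaries, uses the chain-rule identity du/dt^α = (du/dt)/(α t^(α−1)) for the Hausdorff (fractal) time derivative, a plain tanh network rather than the paper's Hausdorff-modified activation, and numerical gradients with backtracking instead of back-propagation. All names (`net`, `trial`, `residual_loss`) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
ALPHA, K, H = 0.8, 1.0, 6          # Hausdorff order, diffusivity, hidden width (assumed)

# Collocation grid; t > 0 so t**(ALPHA - 1) is well defined.
xs = np.linspace(0.1, 0.9, 9)
ts = np.linspace(0.1, 0.9, 9)
X, T = np.meshgrid(xs, ts)

def net(x, t, w):
    """Tiny feed-forward net: 2 inputs -> H tanh units -> 1 output."""
    W1 = w[:2 * H].reshape(H, 2)
    b1 = w[2 * H:3 * H]
    W2 = w[3 * H:4 * H]
    b2 = w[4 * H]
    z = np.tanh(x[..., None] * W1[:, 0] + t[..., None] * W1[:, 1] + b1)
    return z @ W2 + b2

def trial(x, t, w):
    """Trial solution: satisfies u(x,0)=sin(pi x), u(0,t)=u(1,t)=0 by construction."""
    return np.sin(np.pi * x) + x * (1 - x) * t * net(x, t, w)

def residual_loss(w, h=1e-3):
    """Mean-squared residual of du/dt^alpha - K * d2u/dx2 on the grid."""
    u_t = (trial(X, T + h, w) - trial(X, T - h, w)) / (2 * h)
    hdt = u_t / (ALPHA * T ** (ALPHA - 1))        # Hausdorff time derivative
    u_xx = (trial(X + h, T, w) - 2 * trial(X, T, w) + trial(X - h, T, w)) / h ** 2
    return np.mean((hdt - K * u_xx) ** 2)

def num_grad(w, eps=1e-5):
    """Central-difference gradient of the error function (stand-in for backprop)."""
    g = np.zeros_like(w)
    for i in range(w.size):
        e = np.zeros_like(w)
        e[i] = eps
        g[i] = (residual_loss(w + e) - residual_loss(w - e)) / (2 * eps)
    return g

w = 0.1 * rng.standard_normal(4 * H + 1)
loss0 = residual_loss(w)
for _ in range(60):
    g = num_grad(w)
    cur, step = residual_loss(w), 0.05
    # Backtracking line search keeps each gradient step from increasing the loss.
    while step > 1e-8 and residual_loss(w - step * g) > cur:
        step *= 0.5
    w -= step * g
print(f"error function: {loss0:.4f} -> {residual_loss(w):.4f}")
```

The trial-solution form guarantees the initial and boundary conditions exactly, so the optimizer only has to drive the interior PDE residual toward zero, which is the minimization problem the abstract describes.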
Keywords
Hausdorff Derivative, Machine Learning Technique, Artificial Neural Network, Anomalous Diffusion, Diffusion Equation