An Accelerated Approach on Adaptive Gradient Neural Network for Solving Time-Dependent Linear Equations: A State-Triggered Perspective

IEEE Transactions on Neural Networks and Learning Systems (2024)

Abstract
To improve acceleration performance, a hybrid state-triggered discretization (HSTD) is proposed for the adaptive gradient neural network (AGNN) for solving time-dependent linear equations (TDLEs). Unlike existing approaches that rely on an activation function or a time-varying coefficient for acceleration, the proposed HSTD is designed from a control-theoretic perspective. It comprises two essential components: adaptive sampling interval state-triggered discretization (ASISTD) and adaptive coefficient state-triggered discretization (ACSTD). The former fills the gap in existing acceleration methods concerning variable sampling periods, while the latter exploits the underlying evolutionary dynamics of the Lyapunov function to select coefficients greedily. Finally, compared with commonly used discretization methods, the acceleration performance and computational advantages of the proposed HSTD are substantiated by numerical simulations and applications to robotics.
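To make the abstract's idea concrete, the following is a minimal sketch (not the authors' HSTD) of a discretized gradient neural network tracking a time-dependent linear equation A(t)x(t) = b(t), where the coefficient is chosen greedily from a candidate set so that a Lyapunov-like energy E = ||A x − b||² decreases most per step. The function names, candidate set, and test problem are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def agnn_step(A, b, x, h, gammas=(0.5, 1.0, 2.0, 4.0)):
    """One hypothetical discretized gradient-network step.

    The coefficient gamma is picked greedily: try each candidate and
    keep the update that most reduces E(x) = ||A x - b||^2, mimicking
    the abstract's "greedy coefficient" idea in the simplest way.
    """
    e = A @ x - b                # residual (error) vector
    grad = A.T @ e               # gradient of E(x) / 2
    best_x, best_E = x, e @ e    # fall back to no update if nothing helps
    for g in gammas:
        x_new = x - h * g * grad
        e_new = A @ x_new - b
        E_new = e_new @ e_new
        if E_new < best_E:
            best_x, best_E = x_new, E_new
    return best_x

def track(A_t, b_t, x0, h=1e-3, T=1.0):
    """Track the solution of A(t) x = b(t) over [0, T] with fixed step h."""
    x, t = x0.copy(), 0.0
    while t < T:
        x = agnn_step(A_t(t), b_t(t), x, h)
        t += h
    return x

# Illustrative problem: constant SPD A, slowly drifting b(t).
A = lambda t: np.array([[3.0, 1.0], [1.0, 2.0]])
b = lambda t: np.array([1.0 + 0.1 * t, 2.0])
x = track(A, b, np.zeros(2))
```

Note this sketch uses a fixed sampling interval; the paper's ASISTD additionally adapts the interval itself via a state-triggering condition, which this toy omits.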
Keywords
Mathematical models,Neural networks,Convergence,Lyapunov methods,Adaptive systems,Robots,Vectors,Acceleration convergence,adaptive gradient neural network (AGNN),state-triggered discretization,time-dependent linear equations (TDLEs)