Towards Subderivative-Based Zeroing Neural Networks

Communications in Computer and Information Science (2023)

Abstract
Zeroing Neural Networks (ZNNs) are dynamic systems suitable for studying and solving time-varying problems. The advantage of this particular type of recurrent neural network (RNN) is its global and exponential convergence, which can be accelerated to finite-time convergence. The dynamic flow of a ZNN requires an appropriate error (Zhang) function E(t), which can be in matrix, vector, or scalar form, together with its element-wise time derivative $$\dot{E}(t)$$ at each time instant t. A difficulty arises whenever the time derivative $$\dot{E}(t)$$ does not exist for some element of E(t) at some time instant $$t_0$$ in a predefined time interval [0, T]. In this research, we propose improvements to the ZNN formula for the case where the time derivative of the Zhang function does not exist at certain points. Such non-differentiability takes various forms and occurs frequently in practice. One possible solution, applicable in convex and non-differentiable settings, is to use a subderivative in place of the time derivative. Another solution, applicable in nonconvex cases and in situations with discontinuities, is based on shifting at singular points to avoid the division-by-zero (DBZ) problem that often arises when dividing by time-varying expressions.
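
To make the two ideas in the abstract concrete, here is a minimal Python sketch that is not taken from the paper: a scalar time-varying equation a(t)x(t) = b(t) with a non-smooth right-hand side b(t) = |sin(t)|, where the missing derivative at the kinks t = kπ is replaced by an element of the subdifferential, and the denominator of the induced ZNN dynamics is shifted by a small ε as a guard against DBZ. The example problem, the gain γ, the shift ε, and the particular subderivative element chosen at the kinks are all illustrative assumptions, not the paper's own construction.

```python
import numpy as np

# Hypothetical scalar example (not from the paper): solve a(t)*x(t) = b(t)
# with a(t) = 2 + cos(t) and b(t) = |sin(t)|.  b(t) is non-differentiable
# at t = k*pi, so db/dt is replaced there by a subderivative.

gamma = 10.0   # ZNN design gain (assumed value)
eps   = 1e-6   # shift used to sidestep division by zero (DBZ)

def a(t):  return 2.0 + np.cos(t)
def da(t): return -np.sin(t)
def b(t):  return abs(np.sin(t))

def db_sub(t):
    """Subderivative of b(t) = |sin(t)|: the ordinary derivative where it
    exists, and an element of the subdifferential [-1, 1] (here: 0) at
    the kink points t = k*pi."""
    s = np.sin(t)
    if abs(s) < 1e-12:            # kink: pick 0 from the subdifferential
        return 0.0
    return np.sign(s) * np.cos(t)

def znn_rhs(t, x):
    """ZNN dynamics from dot(E) = -gamma*E with E = a*x - b:
    a*dx + da*x - db = -gamma*(a*x - b)  =>  dx = (db - da*x - gamma*E)/a.
    The denominator is shifted by eps whenever a(t) approaches zero."""
    E = a(t) * x - b(t)
    denom = a(t)
    if abs(denom) < eps:          # shift at singular points to avoid DBZ
        denom = np.sign(denom or 1.0) * eps
    return (db_sub(t) - da(t) * x - gamma * E) / denom

# Simple forward-Euler integration of the neural dynamics.
t, x, h = 0.0, 0.0, 1e-3
for _ in range(int(5.0 / h)):
    x += h * znn_rhs(t, x)
    t += h

print(f"x(5) = {x:.6f}, target b/a = {b(t) / a(t):.6f}")
```

In this toy setup a(t) stays bounded away from zero, so the ε-shift branch acts only as a safeguard; its role becomes essential for time-varying denominators that do vanish on [0, T], which is the DBZ situation the abstract refers to.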
Keywords
neural networks, subderivative-based