Relaxed Stability Criteria for Neural Networks With Time-Varying Delay Using Extended Secondary Delay Partitioning and Equivalent Reciprocal Convex Combination Techniques

IEEE Transactions on Neural Networks and Learning Systems (2020)

Citations: 18
Abstract
This article investigates global asymptotic stability for neural networks (NNs) with a time-varying delay that is differentiable and uniformly bounded and whose derivative exists and is upper-bounded. First, we propose an extended secondary delay partitioning technique to construct a novel Lyapunov–Krasovskii functional in which both single-integral and double-integral state variables are considered, whereas traditional secondary delay partitioning handles only the single-integral ones. Second, a novel free-weight matrix equality (FWME) is presented to resolve the reciprocal convex combination problem equivalently and directly without the Schur complement; it eliminates the need for positive-definite matrices and is less conservative and restrictive than various improved reciprocal convex inequalities. Furthermore, combining the extended secondary delay partitioning, the equivalent reciprocal convex combination technique, and the Bessel–Legendre inequality, two relaxed sufficient conditions ensuring global asymptotic stability of NNs are obtained for time-varying delays with unknown and known lower bounds of the delay derivative, respectively. Finally, two examples illustrate the superiority and effectiveness of the presented method.
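The abstract's criteria are Lyapunov-based conditions checked as linear matrix inequalities (LMIs). The sketch below is a minimal, delay-free analogue of that idea, not the paper's criterion (which uses delay-partitioned Lyapunov–Krasovskii functionals): for the linear system x'(t) = A x(t), global asymptotic stability holds iff there exists P ≻ 0 with AᵀP + PA ≺ 0, which we test by solving the Lyapunov equation AᵀP + PA = −I via Kronecker vectorization. The matrix A and the function name are illustrative assumptions.

```python
import numpy as np

def lyapunov_stable(A, tol=1e-9):
    """Return (is_stable, P) for x' = A x: solve A^T P + P A = -I,
    then test whether P is positive definite (a delay-free stand-in
    for the paper's LMI feasibility check)."""
    n = A.shape[0]
    Q = np.eye(n)
    # vec(A^T P + P A) = (I (x) A^T + A^T (x) I) vec(P)
    M = np.kron(np.eye(n), A.T) + np.kron(A.T, np.eye(n))
    P = np.linalg.solve(M, -Q.flatten()).reshape(n, n)
    P = 0.5 * (P + P.T)  # symmetrize against round-off
    is_stable = bool(np.all(np.linalg.eigvalsh(P) > tol))
    return is_stable, P

# A Hurwitz example matrix (eigenvalues -2 and -3), hence stable.
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])
stable, P = lyapunov_stable(A)
print(stable)  # True
```

The full criteria in the paper play the same game with a richer functional: the delay-partitioned terms add decision matrices, and feasibility of the resulting LMIs certifies stability for the whole admissible delay set rather than a single delay-free system.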
Keywords
Delays, Asymptotic stability, Artificial neural networks, Linear matrix inequalities, Stability criteria, Automation