Puchen Liu, Ming Yang, Qiankun Li. Inequality methods and global exponential stability for higher-order neural networks with time-varying leakage delays and S-type distributed delays[J]. Chinese Journal of Electronics.

Inequality methods and global exponential stability for higher-order neural networks with time-varying leakage delays and S-type distributed delays

  • A higher-order neural network system with both time-varying leakage delays and S-type distributed transmission delays is introduced. By means of topological degree theory, properties of M-matrices, and inequality techniques, the existence of an equilibrium point for the system is established. Existing work has mainly employed Lyapunov functionals, multiple integral inequalities, or matrix inequalities, which usually require heavy computation; moreover, in treating the unboundedness of the distributed delays, most authors have relied on the intercept method or the generalized Halanay differential inequality. Instead of constructing complicated Lyapunov functionals, the principle of reductio ad absurdum together with a simple algebraic inequality strategy is employed to prove the global exponential stability of the system, which greatly reduces the computational complexity. Finally, two examples, their simulations, and a related remark demonstrate the effectiveness and generality of the theoretical results. The model is more general, and the sufficient conditions are easily verifiable and more broadly applicable.
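Although the paper's exact equations are not reproduced on this page, the class of systems it studies can be illustrated numerically. The sketch below simulates an assumed second-order (higher-order) Hopfield-type network with a leakage delay and a single constant transmission delay, the latter a deliberate simplification of the S-type distributed delay; the function `simulate` and all parameter values (leakage rates `a`, first-order weights `B`, second-order weights `C`, inputs `I`, delays) are illustrative assumptions, not taken from the paper. Trajectories started from different constant initial histories approach the same equilibrium, which is the behavior the stability theorem guarantees under its M-matrix-type conditions.

```python
import math

def simulate(x0, T=20.0, h=0.01):
    """Euler integration of an ASSUMED 2-neuron second-order Hopfield-type
    model with a leakage delay and a constant transmission delay:
        x_i'(t) = -a_i * x_i(t - delta)
                  + sum_j B[i][j] * f(x_j(t))
                  + sum_{j,k} C[i][j][k] * f(x_j(t - tau)) * f(x_k(t - tau))
                  + I_i
    All parameters below are illustrative, chosen so the leakage rates
    dominate the coupling strengths (a contraction-type condition)."""
    a = [2.0, 2.0]                                  # leakage rates
    B = [[0.10, -0.20], [0.15, 0.10]]               # first-order weights
    C = [[[0.0] * 2 for _ in range(2)] for _ in range(2)]
    C[0][0][1] = 0.05                               # second-order weights
    C[1][1][0] = -0.05
    I = [0.5, -0.3]                                 # constant external inputs
    f = math.tanh                                   # Lipschitz activation
    d_leak = int(round(0.1 / h))                    # leakage delay = 0.1
    d_tau = int(round(0.5 / h))                     # transmission delay = 0.5
    steps = int(round(T / h))
    # constant initial history on [-tau, 0]
    xs = [list(x0) for _ in range(d_tau + 1)]
    for t in range(d_tau, d_tau + steps):
        x, x_leak, x_del = xs[t], xs[t - d_leak], xs[t - d_tau]
        fx = [f(v) for v in x]
        fd = [f(v) for v in x_del]
        new = []
        for i in range(2):
            s1 = sum(B[i][j] * fx[j] for j in range(2))
            s2 = sum(C[i][j][k] * fd[j] * fd[k]
                     for j in range(2) for k in range(2))
            new.append(x[i] + h * (-a[i] * x_leak[i] + s1 + s2 + I[i]))
        xs.append(new)
    return xs[-1]

# Two very different initial histories end up at the same equilibrium.
x_final_1 = simulate([1.0, -1.0])
x_final_2 = simulate([-2.0, 3.0])
```

Running both trajectories and comparing `x_final_1` with `x_final_2` shows they agree to high precision, an empirical counterpart of global exponential stability for this toy instance.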