By deepening the LaSalle invariance principle, a criterion is established for determining the global convergence of general dynamical systems. Applying this criterion, the global convergence of a neural network for solving quadratic programming problems with bound constraints is studied in detail, and global convergence conditions are given for the case where the objective function belongs to a class of nonconvex functions. In particular, using the theory of ordinary differential equations, the network is proved to be globally convergent for an arbitrary convex objective function. The results obtained deepen and generalize the corresponding conclusions in the existing literature, and they demonstrate the effectiveness of this neural network for solving quadratic programming problems with bound constraints. The numerical simulations are consistent with the theoretical analysis.
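To make the setting concrete, the following is a minimal sketch of the kind of simulation the abstract alludes to, assuming a standard projection-type network of the form dx/dt = -x + P_Ω(x - (Qx + c)) for the box-constrained quadratic program min ½xᵀQx + cᵀx subject to l ≤ x ≤ u. This is not necessarily the exact model analyzed in the paper, and the matrix Q, vector c, and bounds are illustrative data only.

```python
import numpy as np

# Illustrative problem data (not from the paper):
#   min 0.5 * x^T Q x + c^T x   subject to  l <= x <= u
Q = np.array([[4.0, 1.0],
              [1.0, 3.0]])
c = np.array([-1.0, -2.0])
l = np.array([0.0, 0.0])
u = np.array([1.0, 1.0])

def project(x):
    """Componentwise projection onto the box [l, u]."""
    return np.clip(x, l, u)

def rhs(x):
    """Right-hand side of the assumed projection network dynamics:
    dx/dt = -x + P_Omega(x - (Q x + c))."""
    return -x + project(x - (Q @ x + c))

# Forward-Euler integration of the network from an arbitrary initial state.
x = np.array([5.0, -3.0])
dt = 0.01
for _ in range(10_000):
    x = x + dt * rhs(x)

# At an equilibrium, x = P_Omega(x - (Q x + c)), which is exactly the
# KKT condition of the box-constrained quadratic program.
print("approximate minimizer:", x)
```

For a positive definite Q as above, the trajectory settles at the unique minimizer; the global convergence results described in the abstract concern precisely such equilibria, including the harder cases where the objective is only convex or belongs to a class of nonconvex functions.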