Abstract
Exploiting the inherent structure of the problem, which is first reformulated as an equivalent equation, a new gradient-based neural network for solving convex quadratic programming problems is proposed. The model is rigorously proved to be Liapunov stable and to converge asymptotically to an exact optimal solution of the original problem, and its global exponential stability is also analyzed. The proposed network requires no choice of self-feedback or lateral connection weight matrices, and its size is smaller than that of the original problem. Simulation experiments show that the new model is both feasible and effective.
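Since the abstract describes the model only in general terms, the following is a minimal illustrative sketch rather than the authors' actual network: it simulates a generic gradient-flow neural network for an equality-constrained convex QP using a quadratic-penalty energy. The problem data Q, c, A, b and the values of the penalty weight rho, the Euler step size, and the iteration count are hypothetical choices made for the example.

import numpy as np

# A minimal sketch (not the paper's exact model): a generic gradient-flow
# "neural network" for the convex QP
#     minimize (1/2) x^T Q x + c^T x   subject to  A x = b,
# using the penalty energy E(x) = f(x) + (rho/2)||Ax - b||^2 and
# forward-Euler integration of the dynamics dx/dt = -grad E(x).

def gradient_network_qp(Q, c, A, b, rho=100.0, step=1e-3, iters=20000):
    """Integrate dx/dt = -(Qx + c + rho * A^T (Ax - b)) by forward Euler."""
    x = np.zeros(Q.shape[0])
    for _ in range(iters):
        grad = Q @ x + c + rho * A.T @ (A @ x - b)
        x = x - step * grad
    return x

if __name__ == "__main__":
    # Small illustrative problem: minimize x1^2 + x2^2 subject to x1 + x2 = 1.
    Q = np.array([[2.0, 0.0], [0.0, 2.0]])
    c = np.zeros(2)
    A = np.array([[1.0, 1.0]])
    b = np.array([1.0])
    x = gradient_network_qp(Q, c, A, b)
    print("approximate solution:", x)  # close to [0.5, 0.5]

With a finite penalty weight the flow converges only to an approximation of the constrained optimum; the paper's model, by contrast, is stated to converge to an exact optimal solution, so this sketch should be read as background on gradient-based networks rather than as the proposed method.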
Source
《陕西师范大学学报(自然科学版)》
Indexed in: CAS; CSCD; Peking University Core Journals (北大核心)
2004, No. 2, pp. 24-27 (4 pages)
Journal of Shaanxi Normal University: Natural Science Edition
Funding
Supported by the Key Scientific Research Foundation of Shaanxi Normal University (995091)
Keywords
quadratic programming
gradient
neural network
global exponential stability
convergence