Abstract
A new super-memory gradient method for solving unconstrained optimization problems is proposed. At each iteration, the algorithm exploits information from the current and previous iterates to generate a descent direction, and the step-size is determined by a curve search rule. Because no matrices need to be computed or stored at any iteration, the method is well suited to large-scale unconstrained optimization problems. Under mild conditions, the algorithm is shown to be globally convergent with a linear convergence rate. Numerical experiments demonstrate that the proposed algorithm is effective.
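The abstract does not give the paper's exact direction and step-size formulas, so the following is only an illustrative sketch of a generic memory gradient scheme: the search direction mixes the current negative gradient with a few previous directions, and a backtracking (Armijo) rule stands in for the paper's curve search. The memory depth `m`, mixing weight `beta`, and backtracking parameters are all assumptions, not values from the paper.

```python
import numpy as np

def memory_gradient(f, grad, x0, m=3, beta=0.3, rho=0.5, c=1e-4,
                    tol=1e-6, max_iter=1000):
    """Sketch of a super-memory gradient iteration (illustrative only).

    Direction: negative gradient plus geometrically weighted previous
    directions. Step-size: Armijo backtracking as a stand-in for the
    paper's curve search. No matrices are formed or stored.
    """
    x = np.asarray(x0, dtype=float)
    history = []  # up to m previous search directions
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # combine current gradient with scaled previous directions
        d = -g + sum(beta ** (i + 1) * h
                     for i, h in enumerate(reversed(history)))
        if g @ d >= 0:  # safeguard: fall back to steepest descent
            d = -g
        t = 1.0
        # Armijo backtracking: shrink t until sufficient decrease holds
        while f(x + t * d) > f(x) + c * t * (g @ d):
            t *= rho
        x = x + t * d
        history.append(d)
        if len(history) > m:
            history.pop(0)  # keep only the m most recent directions
    return x

# usage: minimize the convex quadratic f(x) = x'Ax/2 - b'x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = memory_gradient(f, grad, np.zeros(2))
```

Because only a short list of past direction vectors is kept, the per-iteration cost and storage stay linear in the problem dimension, which is the property the abstract highlights for large-scale problems.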
Source
Journal of Xinyang Normal University (Natural Science Edition) (《信阳师范学院学报(自然科学版)》)
Indexed: CAS; Peking University Core Journal (北大核心)
2013, No. 3, pp. 324-326 (3 pages)
Funding
Key Science and Technology Research Project of the Education Department of Henan Province (13A110767)
Keywords
unconstrained optimization
curve search
global convergence
linear convergence rate