Abstract
In this paper, a new class of memory gradient methods for unconstrained optimization problems is presented. The global convergence of the algorithms under the strong Wolfe line search is proved, and the linear convergence rate is analyzed when the objective function is uniformly convex. Numerical experiments show that the new algorithms are very efficient.
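The record does not include the algorithm itself. As a rough illustration of the general idea only, the following is a minimal sketch of a memory gradient iteration d_k = -g_k + β·d_{k-1} combined with a bisection search for a strong Wolfe step. The fixed damping factor `beta`, the restart safeguard, and the Wolfe parameters are illustrative assumptions, not the paper's specific update rule.

```python
import numpy as np

def strong_wolfe(f, grad, x, d, c1=1e-4, c2=0.4, max_bisect=60):
    """Bisection search for a step satisfying the strong Wolfe conditions:
    sufficient decrease plus |grad(x + a*d) @ d| <= -c2 * grad(x) @ d."""
    phi0, dphi0 = f(x), grad(x) @ d        # requires d to be a descent direction
    lo, hi, alpha = 0.0, float("inf"), 1.0
    for _ in range(max_bisect):
        if f(x + alpha * d) > phi0 + c1 * alpha * dphi0:
            hi = alpha                      # sufficient decrease fails: step too long
        else:
            dphi = grad(x + alpha * d) @ d
            if abs(dphi) <= -c2 * dphi0:
                return alpha                # both strong Wolfe conditions hold
            if dphi < 0:
                lo = alpha                  # still descending: step too short
            else:
                hi = alpha
        alpha = 2.0 * alpha if hi == float("inf") else 0.5 * (lo + hi)
    return alpha

def memory_gradient(f, grad, x0, beta=0.3, tol=1e-6, max_iter=1000):
    """Generic memory gradient iteration d_k = -g_k + beta * d_{k-1}
    (fixed beta here; the paper's own beta_k choice is not reproduced)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x = x + strong_wolfe(f, grad, x, d) * d
        g = grad(x)
        d = -g + beta * d                   # reuse the previous search direction
        # safeguard: restart with steepest descent if d is not a good descent direction
        if g @ d > -1e-4 * np.linalg.norm(g) * np.linalg.norm(d):
            d = -g
    return x

# Usage: a uniformly convex quadratic with minimizer at the origin
f = lambda x: x[0] ** 2 + 10.0 * x[1] ** 2
grad = lambda x: np.array([2.0 * x[0], 20.0 * x[1]])
x_star = memory_gradient(f, grad, np.array([3.0, -2.0]))
```

The restart safeguard enforces a uniform angle condition between d_k and -g_k, which is one standard way to obtain the kind of global convergence result the abstract refers to.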
Source
Advances in Mathematics (China) (《数学进展》)
CSCD
Peking University Core Journals (北大核心)
2007, No. 1, pp. 67-75 (9 pages)
Funding
Supported by the National Natural Science Foundation of China (No. 10171054).
Keywords
unconstrained optimization
memory gradient method
strong Wolfe line search
linear convergence rate