Abstract
In this paper, an assumption on the parameter in the line search direction of super-memory gradient algorithms for unconstrained optimization is given. This assumption determines a new range for the parameter that guarantees the search direction is a sufficient descent direction of the objective function, and on this basis a new class of memory gradient methods is presented. The global convergence of the new method under the generalized Armijo step size rule is discussed without assuming that the sequence of iterates is bounded. Modified forms of the FR, PR, and HS conjugate gradient methods combined with the memory gradient algorithm are also given. Numerical results show that the new algorithm is more stable and more efficient than the FR, PR, and HS conjugate gradient methods with the Armijo step size rule, and than the super-memory gradient method.
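The abstract above describes a memory gradient iteration whose direction combines the current negative gradient with previous directions, with coefficients bounded so the direction remains a sufficient descent direction, and an Armijo-type line search. A minimal sketch of this general idea is below; it is not the paper's exact algorithm, and the memory depth `m`, the coefficient bound `eta`, and the Armijo constants are illustrative assumptions:

```python
import numpy as np

def armijo_step(f, x, d, g, sigma=0.4, rho=0.5, max_backtracks=50):
    """Backtracking Armijo rule: find alpha with
    f(x + alpha*d) <= f(x) + sigma * alpha * g.dot(d)."""
    alpha = 1.0
    fx = f(x)
    gd = g.dot(d)
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= fx + sigma * alpha * gd:
            break
        alpha *= rho
    return alpha

def memory_gradient(f, grad, x0, m=3, eta=0.1, tol=1e-8, max_iter=500):
    """Memory gradient iteration: d_k = -g_k + sum_i beta_i * d_{k-i}.
    Each beta_i is bounded by eta*||g_k|| / (m * ||d_{k-i}||), which gives
    g_k.dot(d_k) <= -(1 - eta)*||g_k||^2, i.e. sufficient descent for eta < 1."""
    x = np.asarray(x0, dtype=float)
    history = []  # previous directions d_{k-1}, ..., d_{k-m}
    for _ in range(max_iter):
        g = grad(x)
        gn = np.linalg.norm(g)
        if gn < tol:
            break
        d = -g
        for dp in history:
            dpn = np.linalg.norm(dp)
            if dpn > 0:
                beta = eta * gn / (len(history) * dpn)
                d = d + beta * dp
        if g.dot(d) > -1e-12 * gn * gn:
            d = -g  # safeguard: fall back to steepest descent
        alpha = armijo_step(f, x, d, g)
        x = x + alpha * d
        history = ([d] + history)[:m]
    return x
```

For example, on the quadratic `f(x) = 0.5*(x[0]**2 + 10*x[1]**2)` the iteration drives the gradient norm below `tol` within a few dozen steps; the bound on `beta` is what ensures each direction satisfies the sufficient descent inequality regardless of what the stored directions look like.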
Source
《数学的实践与认识》(Mathematics in Practice and Theory)
Indexed in: CSCD; Peking University Core Journals (北大核心)
2012, No. 20, pp. 98-104 (7 pages)
Funding
Natural Science Foundation of Shanxi Province (2008011013)