Funding: supported by National Natural Science Foundation of China (Grant Nos. 81173633, 11401038 and 11331012); the Chinese Academy of Sciences Grant (Grant No. kjcx-yw-s7-03); National Natural Science Foundation of China for Distinguished Young Scientists (Grant No. 11125107); the Key Project of Chinese National Programs for Fundamental Research and Development (Grant No. 2015CB856000); and the Fundamental Research Funds for the Central Universities (Grant No. 2014RC0904)
Abstract: The linear conjugate gradient method is an optimal method for convex quadratic minimization, owing to the Krylov subspace minimization property. The advent of the limited-memory BFGS method and the Barzilai-Borwein gradient method, however, has heavily restricted the use of the conjugate gradient method for large-scale nonlinear optimization. This is, to a great extent, due to the requirement of a relatively exact line search at each iteration and the loss of conjugacy of the search directions on various occasions. In contrast, the limited-memory BFGS method and the Barzilai-Borwein gradient method share the so-called asymptotic one-stepsize-per-line-search property; namely, the trial stepsize in the method is asymptotically accepted by the line search once the iterates are close to the solution. This paper focuses on the analysis of the subspace minimization conjugate gradient method of Yuan and Stoer (1995). Specifically, by choosing the parameter in that method according to the Barzilai-Borwein idea, we obtain several efficient Barzilai-Borwein conjugate gradient (BBCG) methods. Initial numerical experiments show that one of the variants, BBCG3, is especially efficient among many others without line searches. This variant of the BBCG method might enjoy the asymptotic one-stepsize-per-line-search property and become a strong candidate for large-scale nonlinear optimization.
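The Barzilai-Borwein idea referenced in the abstract chooses each stepsize from the most recent change in iterate and gradient rather than from a line search. A minimal sketch of the classical BB1 gradient method on a convex quadratic (illustrative only; function and variable names are our own, and this does not reproduce the paper's BBCG variants):

```python
import numpy as np

def bb_gradient(A, b, x0, iters=50):
    """Gradient method with the Barzilai-Borwein (BB1) stepsize for the
    convex quadratic f(x) = 0.5 * x^T A x - b^T x, with A symmetric
    positive definite.

    The BB1 stepsize alpha = (s^T s) / (s^T y), with s = x_k - x_{k-1}
    and y = g_k - g_{k-1}, mimics a quasi-Newton secant condition while
    storing only vectors and requiring no line search.
    """
    x = x0.astype(float)
    g = A @ x - b                       # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(g)     # conservative first stepsize
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 0:                      # keep the stepsize well defined
            alpha = (s @ s) / sy        # BB1 stepsize
        x, g = x_new, g_new
    return x

# Usage: minimize a small SPD quadratic; the minimizer is A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = bb_gradient(A, b, np.zeros(2))
```

Note that BB iterations are typically nonmonotone in the objective, which is precisely why combining the BB idea with other frameworks, as in the abstract, is of interest.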
Abstract: The degree of numerical linear independence is proposed and discussed. Based on this linear independence theory, a modified limited-memory BFGS method is developed. As in the standard limited-memory method, the new method determines the update by applying the updating formula m times to an initial positive diagonal matrix, using m previous pairs of changes in the iterate and the gradient. Besides the most recent pair, which guarantees quadratic termination, the choice of the other (m-1) pairs in the new method depends on the degree of numerical linear independence of previous search directions. In addition, the numerical linear independence theory is further discussed and the computation of the degree of linear independence is simplified. Theoretical and numerical results show that the new modified method efficiently improves on the standard limited-memory method.
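The limited-memory update described above, applying the BFGS formula m times to an initial diagonal matrix using m stored pairs, is commonly implemented via the two-loop recursion. A sketch under the standard scaled-identity initialization (names are illustrative; the modified method's pair-selection rule based on numerical linear independence is not reproduced here):

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Two-loop recursion: apply the limited-memory BFGS inverse-Hessian
    approximation to the gradient g, using stored pairs
    s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i (most recent last).

    The initial matrix is the scaled identity gamma * I with
    gamma = (s^T y) / (y^T y) from the most recent pair, a common choice.
    All pairs are assumed to satisfy the curvature condition s^T y > 0.
    """
    q = g.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    s, y = s_list[-1], y_list[-1]
    gamma = (s @ y) / (y @ y)           # initial Hessian scaling
    r = gamma * q
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
        rho = 1.0 / (y @ s)
        beta = rho * (y @ r)
        r += (a - beta) * s
    return -r                           # quasi-Newton search direction

# Usage: the resulting inverse-Hessian approximation H satisfies the
# secant condition for the most recent pair, H y_last = s_last, so
# feeding g = y_last must return -s_last.
s_list = [np.array([1.0, 0.0, 0.5]), np.array([0.0, 1.0, 0.2])]
y_list = [np.array([2.0, 0.5, 1.0]), np.array([0.3, 1.5, 0.4])]
d = lbfgs_direction(y_list[-1], s_list, y_list)
```

The abstract's modification changes which (m-1) older pairs enter `s_list`/`y_list`, not the recursion itself.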
Funding: supported by the National Science Foundation under Grant No. 11371253 and by the Science Foundation of the Provincial Education Department of Hunan under Grant No. 11C0336
Abstract: This paper presents a new nonmonotone filter line search technique, combined with the MBFGS method, for solving unconstrained minimization. The filter method, which is traditionally used for constrained nonlinear programming (NLP), is extended to unconstrained NLP by converting the latter to an equality-constrained minimization. The nonmonotone idea is incorporated into the filter method so that the restoration phase, a common feature of most filter methods, is not needed. Global convergence and a fast local convergence rate of the proposed algorithm are established under reasonable conditions. The results of numerical experiments indicate that the proposed method is efficient.
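A filter, as used above, stores pairs of a constraint-violation measure h and an objective value f, and accepts a trial point only if it is not dominated by any stored pair. A minimal sketch of the standard acceptance and update rules with a sufficient-decrease margin (illustrative only; the paper's exact measures, margins, and nonmonotone modification are not reproduced here):

```python
def filter_acceptable(filt, h, f, gamma=1e-5):
    """A trial point (h, f) is acceptable to the filter if, relative to
    every stored pair, it sufficiently reduces either the constraint
    violation h or the objective f. gamma is a small margin enforcing
    sufficient decrease, so points arbitrarily close to filter entries
    are rejected.
    """
    return all(h <= (1.0 - gamma) * h_j or f <= f_j - gamma * h_j
               for h_j, f_j in filt)

def filter_add(filt, h, f):
    """Add (h, f) to the filter and discard entries it dominates
    (an entry is dominated if the new pair is no worse in both h and f)."""
    filt = [(h_j, f_j) for h_j, f_j in filt
            if not (h <= h_j and f <= f_j)]
    filt.append((h, f))
    return filt

# Usage: a point with much smaller violation is accepted even though
# its objective is worse; repeating an existing entry is rejected.
filt = [(1.0, 5.0), (0.5, 6.0)]
ok_new = filter_acceptable(filt, 0.2, 7.0)
ok_repeat = filter_acceptable(filt, 1.0, 5.0)
filt = filter_add(filt, 0.2, 4.0)
```

The nonmonotone idea in the abstract relaxes this acceptance test so that occasional increases are tolerated, which is what removes the need for a restoration phase.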