Funding: Supported by the Natural Science Foundation of Anhui Province under Grant No. 1708085MF159, the Natural Science Foundation of the Anhui Higher Education Institutions under Grant Nos. KJ2017A375 and KJ2019A0604, and the Program for Abroad Visiting of Excellent Young Talents in Universities of Anhui Province under Grant No. GXGWFX2019022.
Abstract: To save Jacobian evaluations, a multi-step Levenberg-Marquardt method for systems of nonlinear equations, named the Shamanskii-like LM method, was proposed by Fa. Its convergence properties have been proved using a trust region technique under the local error bound condition. However, the authors wonder whether similar convergence properties still hold with standard line searches, since the direction may not be a descent direction. For this purpose, the authors present a new nonmonotone m-th order Armijo-type line search to guarantee global convergence. Under the same condition as in the trust region case, the convergence rate is shown to be m+1 with this line search technique. Numerical experiments show that the new algorithm saves considerable running time on large-scale problems, so it is efficient and promising.
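The multi-step idea can be sketched in a few lines: one Jacobian evaluation is shared by m inner LM steps, each safeguarded by a backtracking Armijo-type search on the least-squares merit function. This is a minimal monotone sketch, not the paper's nonmonotone m-th order rule; the function names, constants, and the 2x2 test system are illustrative.

```python
import numpy as np

def shamanskii_lm_sketch(F, J, x0, m=2, mu=1e-3, sigma=1e-4, beta=0.5,
                         tol=1e-10, max_iter=100):
    """Multi-step LM sketch: one Jacobian evaluation is reused for m inner
    steps, each safeguarded by a monotone backtracking Armijo-type search
    on the merit function phi(x) = 0.5 * ||F(x)||^2."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        Jx = J(x)                                  # one Jacobian per outer step
        A = Jx.T @ Jx + mu * (Fx @ Fx) * np.eye(len(x))
        for _ in range(m):                         # m inner steps reuse Jx
            Fx = F(x)
            rhs = Jx.T @ Fx                        # stale gradient after step 1
            d = np.linalg.solve(A, -rhs)
            phi, slope, t = 0.5 * Fx @ Fx, rhs @ d, 1.0
            while True:                            # backtracking Armijo search
                Ft = F(x + t * d)
                if 0.5 * Ft @ Ft <= phi + sigma * t * slope or t < 1e-12:
                    break
                t *= beta
            x = x + t * d
    return x

# illustrative 2x2 system with root (1, 2)
F = lambda x: np.array([x[0]**2 - 1.0, x[1] - 2.0])
J = lambda x: np.array([[2.0 * x[0], 0.0], [0.0, 1.0]])
root = shamanskii_lm_sketch(F, J, np.array([2.0, 3.0]))
```

Because d solves (J^T J + lambda I) d = -J^T F, the slope J^T F . d is strictly negative, so the backtracking loop terminates for this monotone variant.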
Funding: Supported by the National Natural Science Foundation of China (Grant No. 10231060), the Specialized Research Fund for the Doctoral Program of Higher Education of China (No. 20040319003), and the Graduates' Creative Project of Jiangsu Province, China.
Abstract: In this paper, a nonmonotone method based on McCormick's second-order Armijo step-size rule [7] for unconstrained optimization problems is proposed. Every limit point of the sequence generated by this procedure is proved to be a stationary point satisfying the second-order optimality conditions. Numerical tests on a set of standard test problems show that the new algorithm is efficient and robust.
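As a concrete illustration of the nonmonotone acceptance idea (with a plain first-order rule rather than McCormick's second-order one), a backtracking step can be accepted against the maximum of the last M function values instead of the current value; all names and constants below are illustrative.

```python
from collections import deque
import numpy as np

def nonmonotone_armijo(f, g, x, d, history, sigma=1e-4, beta=0.5):
    """Backtracking step size accepted against the max of the stored recent
    function values (nonmonotone rule) rather than against f(x) alone."""
    fmax = max(history)
    slope = g @ d                  # negative for a descent direction
    t = 1.0
    while f(x + t * d) > fmax + sigma * t * slope:
        t *= beta
    return t

# usage: steepest descent on a simple quadratic with the nonmonotone rule
f = lambda x: 0.5 * x @ x
grad = lambda x: x
x = np.array([3.0, -4.0])
hist = deque([f(x)], maxlen=5)     # window of the last 5 function values
for _ in range(50):
    d = -grad(x)
    t = nonmonotone_armijo(f, grad(x), x, d, hist)
    x = x + t * d
    hist.append(f(x))
```

The nonmonotone rule allows occasional increases in f, which can help the iterates cross narrow valleys that force a monotone rule to take tiny steps.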
Funding: Supported by the National Natural Science Foundation of China (No. 10571106).
Abstract: In this paper, we consider the continuously differentiable optimization problem min{f(x) : x ∈ Ω}, where Ω ⊆ R^n is a nonempty closed convex set. The gradient projection method of Calamai and Moré (Math. Programming, Vol. 39, pp. 93-116, 1987) is modified by a memory gradient technique to improve its convergence rate. The convergence of the new method is analyzed without assuming that the iteration sequence {x^k} is bounded. Moreover, it is shown that when f(x) is a pseudo-convex (quasi-convex) function, the new method has strong convergence results. Numerical results show that the method in this paper is more effective than the gradient projection method.
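For reference, the baseline iteration that the paper accelerates is the gradient projection step x_{k+1} = P_Ω(x_k - t ∇f(x_k)). The sketch below uses a box constraint (whose projection is a componentwise clip) and a fixed step size in place of the paper's step-size rule; the quadratic objective is illustrative.

```python
import numpy as np

def projected_gradient(f, grad, project, x0, t=0.1, tol=1e-8, max_iter=1000):
    """Plain gradient projection: x_{k+1} = P_Omega(x_k - t * grad f(x_k))."""
    x = project(np.asarray(x0, dtype=float))
    for _ in range(max_iter):
        x_new = project(x - t * grad(x))
        if np.linalg.norm(x_new - x) < tol:    # fixed point reached
            return x_new
        x = x_new
    return x

# minimize ||x - c||^2 over the box [0, 1]^2, with c outside the box;
# the solution is the projection of c onto the box, i.e. (1, 0)
c = np.array([2.0, -0.5])
f = lambda x: np.sum((x - c)**2)
grad = lambda x: 2.0 * (x - c)
box = lambda x: np.clip(x, 0.0, 1.0)           # projection onto [0,1]^2
sol = projected_gradient(f, grad, box, np.zeros(2))
```

A fixed point of this map satisfies the first-order optimality condition for minimization over Ω, which is why the stopping test compares successive iterates.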
Abstract: In 2005, Yuan Yubo et al. used a polynomial function as the smoothing function and proposed a polynomial smooth support vector machine model, PSSVM (polynomial smooth support vector machine), which improved classification performance and efficiency to some extent. In 2007, Xiong Jinzhi et al. used interpolation to derive a recursive formula and obtained a new class of smoothing functions, settling the questions of whether smoothing functions with better performance exist and how to find them. However, questions remained as to whether other polynomial smooth models of the support vector machine exist and what the general form of polynomial smooth models is. To this end, a class of polynomial functions is taken as new smoothing functions and, using smoothing techniques, a general polynomial smooth support vector machine model, dPSSVM (dth-order polynomial smooth support vector machine), is proposed. The global convergence of this general model is proved by mathematical induction, and numerical experiments are carried out. The experimental results show that when the smoothing order equals 3, the classification performance and efficiency of the general model are best and superior to those of PSSVM; when the smoothing order is greater than 3, classification performance is essentially unchanged while efficiency decreases somewhat. The problem of the general form of the polynomial smooth support vector machine is thus successfully solved.
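The smoothing idea can be illustrated with the simplest member of this kind of family: smooth SVM formulations replace the plus function max(0, x) in the hinge loss with a polynomial approximation. The quadratic patch below is a standard low-order example, not the paper's dth-order construction; k controls the width 1/k of the smoothing interval, and the maximum approximation error, attained at x = 0, is 1/(4k).

```python
import numpy as np

def smooth_plus(x, k=10.0):
    """C^1 quadratic smoothing of the plus function max(0, x):
    exact outside [-1/k, 1/k], a quadratic patch inside.
    The patch (k/4)(x + 1/k)^2 matches value and slope at both ends."""
    return np.where(x > 1.0 / k, x,
           np.where(x < -1.0 / k, 0.0,
                    (k / 4.0) * (x + 1.0 / k) ** 2))

# worst-case approximation error on a small grid; the maximum 1/(4k)
# occurs at x = 0, so larger k gives a tighter smooth approximation
xs = np.linspace(-2.0, 2.0, 5)
err = np.max(np.abs(smooth_plus(xs, k=100.0) - np.maximum(xs, 0.0)))
```

Because the smoothed function is differentiable everywhere, the SVM objective built from it can be minimized with smooth unconstrained solvers such as Newton-type methods.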
Abstract: In this paper, we consider the convergence properties of a new class of three-term conjugate gradient methods with a generalized Armijo step-size rule for minimizing a continuously differentiable function f on R^n, without assuming that the sequence {x_k} of iterates is bounded. We prove that the limit infimum of ‖∇f(x_k)‖ is zero. Moreover, we prove that when f(x) is a pseudo-convex (quasi-convex) function, the new method has strong convergence results: either x_k → x* and x* is a minimizer (stationary point); or ‖x_k‖ → ∞, arg min{f(x) : x ∈ R^n} = ∅, and f(x_k) ↓ inf{f(x) : x ∈ R^n}. Combining the FR, PR, and HS methods with our new method, the FR, PR, and HS methods are modified to have the global convergence property. Numerical results show that the new algorithms are efficient in comparison with the FR, PR, and HS conjugate gradient methods with the Armijo step-size rule.
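The combination discussed above, conjugate gradient directions driven by an Armijo-type backtracking search, can be sketched as follows. This sketch uses the standard Fletcher-Reeves (FR) beta and a plain two-term direction with a restart safeguard; the paper's three-term direction and generalized Armijo rule are not reproduced here, and the test quadratic is illustrative.

```python
import numpy as np

def cg_armijo(f, grad, x0, sigma=1e-4, beta=0.5, tol=1e-8, max_iter=300):
    """Conjugate gradient with Fletcher-Reeves beta and Armijo backtracking.
    If the CG direction fails to be a descent direction, restart with -g."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0.0:                 # safeguard: restart on non-descent
            d = -g
        t = 1.0
        while f(x + t * d) > f(x) + sigma * t * (g @ d):
            t *= beta                    # backtracking Armijo search
        x = x + t * d
        g_new = grad(x)
        beta_fr = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves formula
        d = -g_new + beta_fr * d
        g = g_new
    return x

# illustrative convex quadratic with minimizer at the origin
f = lambda x: 0.5 * (x[0]**2 + 4.0 * x[1]**2)
grad = lambda x: np.array([x[0], 4.0 * x[1]])
sol = cg_armijo(f, grad, np.array([3.0, 1.0]))
```

The restart safeguard is what keeps an inexact line search from derailing the method: in the worst case the iteration degrades to steepest descent with Armijo steps, which still converges on this problem.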
Funding: Supported by the Key Project of the 2010 Chongqing Higher Education Teaching Reform Program (Grant No. 102104).
Abstract: In this paper, we present a new nonlinear modified spectral CD conjugate gradient method for solving large-scale unconstrained optimization problems. The direction generated by the method is a descent direction for the objective function, and this property depends neither on the line search rule nor on the convexity of the objective function. Moreover, the modified method reduces to the standard CD method if the line search is exact. Under some mild conditions, we prove that the modified method with line search is globally convergent even if the objective function is nonconvex. Preliminary numerical results show that the proposed method is very promising.
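The line-search-independent descent property can be seen in a small numerical check. One common construction in spectral CG methods scales the gradient term by a parameter theta_k chosen so that the cross term cancels, giving g_k^T d_k = -‖g_k‖² identically, for any step size. The beta below is the plain CD (conjugate descent) choice; the paper's modified formula may differ, so treat this as an illustration of the mechanism rather than the paper's method.

```python
import numpy as np

def spectral_cd_like_direction(g, g_prev, d_prev):
    """d_k = -theta_k * g_k + beta_k * d_{k-1}, with theta_k chosen so the
    beta_k * g_k^T d_{k-1} cross term cancels in g_k^T d_k, forcing
    g_k^T d_k = -||g_k||^2 regardless of how the step size was picked."""
    beta = (g @ g) / (-(d_prev @ g_prev))          # CD (conjugate descent) beta
    theta = 1.0 + beta * (g @ d_prev) / (g @ g)    # cancels the cross term
    return -theta * g + beta * d_prev

# verify the identity g @ d == -||g||^2 on random data,
# with d_prev taken as a descent direction so beta's denominator is positive
rng = np.random.default_rng(0)
g_prev = rng.normal(size=5)
d_prev = -g_prev                                   # d_prev @ g_prev < 0
g = rng.normal(size=5)
d = spectral_cd_like_direction(g, g_prev, d_prev)
lhs = g @ d
```

Since g^T d = -‖g‖² < 0 whenever g ≠ 0, the direction is a descent direction by construction, independent of the line search and of convexity, which is the property the abstract highlights.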