Abstract
A distributed parallel method with low complexity and fast convergence is proposed for large-scale optimization problems. Because computing the Hessian matrix and its inverse incurs enormous computational and storage overhead, interior-point and Newton methods are impractical at large scale; such problems are usually solved with gradient-based or decomposition-based methods. Since traditional methods still have high complexity, the author proposes a new primal-dual method with a faster convergence rate, in which each iteration requires only a simple gradient update, thereby reducing the complexity.
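The abstract's central idea, a primal-dual scheme whose per-iteration work is a plain gradient update rather than a Hessian solve, can be sketched as follows. This is an illustrative stand-in, not the paper's actual algorithm: the problem form (linear equality constraints), step size, and function names are all assumptions made for the example.

```python
import numpy as np

# Hedged sketch: a generic primal-dual gradient method for an
# equality-constrained convex problem
#     min f(x)  subject to  A x = b,
# where each iteration uses only simple gradient updates (no Hessian,
# no matrix inversion), in the spirit of the abstract.

def primal_dual_gradient(grad_f, A, b, x0, step=0.1, iters=5000):
    """Gradient descent on x and gradient ascent on the multiplier lam
    for the Lagrangian L(x, lam) = f(x) + lam^T (A x - b)."""
    x = x0.copy()
    lam = np.zeros(A.shape[0])
    for _ in range(iters):
        x = x - step * (grad_f(x) + A.T @ lam)  # primal descent step
        lam = lam + step * (A @ x - b)          # dual ascent step
    return x, lam

# Toy instance: f(x) = 0.5 * ||x - c||^2, whose constrained minimizer is
# the projection of c onto the affine set {x : A x = b}.
c = np.array([1.0, 2.0, 3.0])
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([3.0])
x_opt, lam_opt = primal_dual_gradient(lambda x: x - c, A, b, np.zeros(3))
```

Note that the loop body touches only gradients and matrix-vector products, so the per-iteration cost and memory stay linear in the problem data; this is exactly why such first-order primal-dual schemes scale to problems where forming or inverting a Hessian is infeasible.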
Author
YU Hong-lei (余红蕾), Department of Education, Chuzhou City Vocation College, Chuzhou 23900, China
Source
Journal of Xinyang Agriculture and Forestry University (《信阳农林学院学报》), 2019, No. 1, pp. 116-120 (5 pages)
Keywords
large-scale convex optimization
gradient method
convergence rate