Abstract
In this paper, a new coordinate gradient descent algorithm is proposed for training support vector machines. At each iteration, the algorithm first solves a special quadratic programming problem and decomposes its solution into a sum of vectors to obtain the working set; it then solves a reduced quadratic programming subproblem, restricted to the constraints in the working set, to obtain a descent direction. The algorithm requires no line search, which avoids the time and space overhead a line search incurs and greatly reduces the computational cost. Global convergence is proved under weak conditions, and numerical experiments demonstrate that the algorithm is feasible and effective.
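For context, the block below is a minimal sketch that assumes the standard SVM dual quadratic program as the underlying model; the notation (Q, e, y, C, K, the working set B and the direction d_B) is generic and not taken from the paper, and the paper's own "special quadratic programming problem" used to select the working set is not reproduced here. It only illustrates how fixing the variables outside a working set B reduces the full dual to a small subproblem whose solution serves as a descent direction.

\begin{align*}
\min_{\alpha \in \mathbb{R}^n} \;\; & f(\alpha) = \tfrac{1}{2}\,\alpha^{\top} Q\,\alpha - e^{\top}\alpha, \qquad Q_{ij} = y_i y_j K(x_i, x_j), \\
\text{s.t.} \;\; & y^{\top}\alpha = 0, \quad 0 \le \alpha_i \le C, \;\; i = 1, \dots, n.
\end{align*}

Given a working set B with complement N, the variables \alpha_N are held fixed and a reduced subproblem in the direction d_B is solved:

\begin{align*}
\min_{d_B} \;\; & \nabla f(\alpha)_B^{\top} d_B + \tfrac{1}{2}\, d_B^{\top} Q_{BB}\, d_B, \\
\text{s.t.} \;\; & y_B^{\top} d_B = 0, \quad 0 \le \alpha_B + d_B \le C,
\end{align*}

after which \alpha_B is updated to \alpha_B + d_B; according to the abstract, this update is taken without any line search.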
Authors
YU Jing (于静); HAN Luqing (韩鲁青) (School of Management, Tianjin University of Technology, Tianjin 300384; College of Management and Economics, Tianjin University, Tianjin 300072)
Source
Journal of Systems Science and Mathematical Sciences (《系统科学与数学》)
CSCD; Peking University Core Journals
2018, No. 5, pp. 583-590 (8 pages)
Funding
Supported by the Humanities and Social Sciences Youth Fund of the Ministry of Education (16YJC630159)
Keywords
Support vector machine
coordinate gradient descent
decomposition methods
working set
global convergence