
A multi-stage cutting plane method with inexact line search for solving non-convex truncated L1-SVMs
Abstract: The truncated hinge loss yields sparser sets of support vectors and therefore offers significant advantages in robustness, but it leads to a non-convex problem that is difficult to solve. MM (Majorization-Minimization) is a general framework for solving non-convex problems, and the multi-stage MM strategy, which exploits the structure of the objective function, achieves good sparsity on this non-convex problem but incurs high computational complexity. On the other hand, a cutting plane method with inexact line search can efficiently solve linear support vector machines, and theoretical analysis shows that it attains the same optimal convergence bound as the exact line search method at higher speed and lower cost. For the truncated L1-SVM (L1 Support Vector Machine), which is non-convex and non-smooth, we propose a multi-stage cutting plane method with inexact line search. It avoids solving the problem with a batch method at each stage, which overcomes the shortcoming of high computational complexity and yields a fast solution at each stage. The algorithm is suitable for solving large-scale problems and is provably convergent. Finally, comparison experiments with other multi-stage algorithms demonstrate that our method is effective.
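The multi-stage MM idea described above can be illustrated on the truncated hinge loss, which decomposes as a difference of two convex hinges, T(m) = H_1(m) - H_s(m) with H_a(m) = max(0, a - m) and s <= 0: each stage linearizes the concave part -H_s at the current iterate and solves the resulting convex surrogate. The sketch below is a generic illustration of that outer loop only; it is not the paper's algorithm — in particular, it uses plain subgradient descent for the inner convex subproblem in place of the paper's inexact-line-search cutting plane solver, and all function names and parameter values are illustrative assumptions.

```python
import numpy as np

def truncated_hinge(m, s=-1.0):
    """T(m) = H_1(m) - H_s(m), where H_a(m) = max(0, a - m).

    H_1 is the ordinary hinge loss; subtracting H_s caps the loss at 1 - s,
    so points with margin below s stop influencing the solution.
    """
    return np.maximum(0.0, 1.0 - m) - np.maximum(0.0, s - m)

def mm_truncated_svm(X, y, lam=0.1, s=-1.0, stages=5, inner_iters=500, lr=0.1):
    """Multi-stage MM (CCCP-style) for: min_w lam/2 ||w||^2 + mean_i T(y_i w.x_i)."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(stages):
        # Outer MM step: linearize the concave part -H_s at the current w.
        # Points with margin below s get delta_i = 1, whose linear term cancels
        # their hinge gradient -- such outliers are effectively removed.
        delta = (y * (X @ w) < s).astype(float)
        # Inner step: subgradient descent on the convex surrogate
        #   mean_i [H_1(m_i) + delta_i * m_i] + lam/2 ||w||^2,  m_i = y_i w.x_i.
        for t in range(inner_iters):
            m = y * (X @ w)
            # Per-sample coefficient: d(surrogate_i)/dm_i times dm_i/dw = y_i x_i.
            coef = (-(m < 1.0).astype(float) + delta) * y
            w -= lr / np.sqrt(t + 1.0) * (lam * w + X.T @ coef / n)
    return w
```

On a toy two-cluster dataset with one grossly mislabeled point, the first stage behaves like an ordinary hinge-loss SVM; once the outlier's margin drops below s, later stages exclude it, which is the sparsity/robustness effect the abstract attributes to the truncated loss.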
Authors: Yuan Youhong; Liu Xin; Bao Lei (Army Academy of Artillery and Air Defense of PLA, Hefei 230031, China)
Source: Journal of Nanjing University (Natural Science), 2020, No. 1, pp. 98-106 (indexed in CAS, CSCD, Peking University Core)
Funding: National Natural Science Foundation of China (61673394); Natural Science Foundation of Anhui Province (1908085MF193)
Keywords: truncated hinge loss; non-convex optimization; multi-stage strategy; inexact line search
