
Truncated Loss Smooth Support Vector Ordinal Regression
Abstract: Support vector ordinal regression (SVOR) has proven to be a promising algorithm for solving ordinal regression problems, but its performance is easily degraded by outliers in the training data. To remedy this drawback, a truncated loss smooth support vector ordinal regression (TLS-SVOR) algorithm is proposed. When learning the ordinal regression model, the loss s incurred by a misranked sample is bounded between 0 and the truncation coefficient u. TLS-SVOR first approximates s with a piecewise polynomial parameterized by u; it then borrows the idea of the smooth support vector machine for classification to convert the optimization objective into a twice continuously differentiable unconstrained problem, so that Newton's method directly yields the unique decision hyperplane. The optimal parameter combination of TLS-SVOR is determined by a two-stage uniform-design model selection procedure. Experimental results on benchmark datasets show that TLS-SVOR achieves higher accuracy than other ordinal regression algorithms on several datasets.
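The abstract describes two computational ingredients: a piecewise-polynomial smooth surrogate for the truncated loss min(max(s, 0), u), and Newton's method applied to the resulting twice continuously differentiable unconstrained objective. The sketch below is only a minimal illustration of that general idea, not the paper's formulation: the particular C^2 smoothing of the plus function, the smoothing width h, the toy one-dimensional objective, and the safeguarded Newton step with backtracking are all assumptions introduced here for illustration; the actual TLS-SVOR model additionally involves ordinal thresholds and the uniform-design parameter search.

import numpy as np

def smooth_plus(x, h):
    # C^2 piecewise-polynomial approximation of the plus function max(x, 0):
    # 0 for x <= -h, x for x >= h, and a cubic/quartic piece on (-h, h) whose
    # value, first and second derivatives match at both ends (illustrative choice).
    x = np.asarray(x, dtype=float)
    y = np.where(x >= h, x, 0.0)
    mid = (x > -h) & (x < h)
    t = x[mid] + h
    y[mid] = t**3 / (4.0 * h**2) - t**4 / (16.0 * h**3)
    return y

def d_smooth_plus(x, h):
    # First derivative of smooth_plus.
    x = np.asarray(x, dtype=float)
    g = np.where(x >= h, 1.0, 0.0)
    mid = (x > -h) & (x < h)
    t = x[mid] + h
    g[mid] = 3.0 * t**2 / (4.0 * h**2) - t**3 / (4.0 * h**3)
    return g

def d2_smooth_plus(x, h):
    # Second derivative of smooth_plus (zero outside the band (-h, h)).
    x = np.asarray(x, dtype=float)
    H = np.zeros_like(x)
    mid = (x > -h) & (x < h)
    t = x[mid] + h
    H[mid] = 3.0 * t / (2.0 * h**2) - 3.0 * t**2 / (4.0 * h**3)
    return H

def smooth_truncated_loss(s, u, h):
    # Smooth surrogate of the truncated loss min(max(s, 0), u), using the
    # identity min(max(s, 0), u) = max(s, 0) - max(s - u, 0).
    return smooth_plus(s, h) - smooth_plus(s - u, h)

def objective(w, x, y, C, u, h):
    # Toy regularized objective: J(w) = 0.5*w^2 + C * sum_i L(1 - y_i * w * x_i).
    s = 1.0 - y * w * x
    return 0.5 * w * w + C * np.sum(smooth_truncated_loss(s, u, h))

# Toy demonstration: fit a single weight w on 1-D data with a few injected
# outliers by safeguarded Newton steps on the smooth unconstrained objective.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = np.sign(x + 0.1 * rng.normal(size=200))
x[:5] *= 10.0
y[:5] *= -1.0                          # a handful of mislabeled outliers
C, u, h = 1.0, 2.0, 0.1
w = 0.0
for _ in range(50):
    s = 1.0 - y * w * x                # per-sample margin error
    dL = d_smooth_plus(s, h) - d_smooth_plus(s - u, h)
    d2L = d2_smooth_plus(s, h) - d2_smooth_plus(s - u, h)
    grad = w + C * np.sum(dL * (-y * x))
    hess = 1.0 + C * np.sum(d2L * (y * x) ** 2)
    d = -grad / max(hess, 1.0)         # guard: the truncated loss is non-convex
    t, J0 = 1.0, objective(w, x, y, C, u, h)
    while objective(w + t * d, x, y, C, u, h) > J0 and t > 1e-8:
        t *= 0.5                       # backtracking keeps each step a descent step
    w += t * d
print(f"fitted weight: {w:.3f}")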
Author: 何海江
Source: Journal of University of Electronic Science and Technology of China (《电子科技大学学报》), 2014, No. 1, pp. 131-136 (6 pages). Indexed in EI, CAS, CSCD, and the Peking University Chinese Core Journals list.
Funding: National Natural Science Foundation of China (61100139)
Keywords: ordinal regression; outlier; piecewise polynomial; support vector machine; truncated loss

References (18 in total; the 10 shown in this record are listed below)

1. CHENG Jian-lin, WANG Zheng, POLLASTRI G. A neural network approach to ordinal regression[C]//Proceedings of the International Joint Conference on Neural Networks. CA: IEEE Press, 2008: 1279-1284.
2. HERBRICH R, GRAEPEL T, OBERMAYER K. Large margin rank boundaries for ordinal regression[C]//Advances in Large Margin Classifiers. Cambridge, MA: MIT Press, 2000: 115-132.
3. LI Ling, LIN H T. Ordinal regression by extended binary classification[C]//Advances in NIPS 19. Cambridge, MA: MIT Press, 2007: 865-872.
4. EMILIO C, BELEN M B. Maximizing upgrading and downgrading margins for ordinal regression[J]. Mathematical Methods of Operations Research, 2011, 74(3): 381-407.
5. DOBRSKA M, WANG Hui, BLACKBURN W. Ordinal regression with continuous pairwise preferences[J]. International Journal of Machine Learning and Cybernetics, 2012, 3(1): 59-70.
6. SHASHUA A, LEVIN A. Ranking with large margin principle: Two approaches[C]//Advances in Neural Information Processing Systems. Cambridge, MA: MIT Press, 2002: 937-944.
7. CHU Wei, KEERTHI S S. Support vector ordinal regression[J]. Neural Computation, 2007, 19(3): 792-815.
8. SUN Bin-yu, ZHANG Xiao-ming, LI Wen-bo. An improved ordinal regression approach with sum-of-margin principle[C]//Proceedings of the Sixth ICNC. CA: IEEE Press, 2010: 853-857.
9. BELLE V V, PELCKMANS K, SUYKENS J, et al. Learning transformation models for ranking and survival analysis[J]. Journal of Machine Learning Research, 2011, 12(3): 819-862.
10. ZHAO Bin, WANG Fei, ZHANG Chang-shui. Block-quantized support vector ordinal regression[J]. IEEE Transactions on Neural Networks, 2009, 20(5): 882-890.
