
Study on the Online Algorithm of the Huber-Support Vector Regression Machine
Abstract: When the scale of the data grows and the data are continually updated, traditional one-shot modeling algorithms based on Support Vector Regression (SVR) must rebuild the model from scratch for every analysis; an online learning algorithm avoids this problem. Building on the ε-SVR online algorithm, this paper proposes a new Huber-SVR online algorithm. It trains on samples with a fixed-length rolling-window strategy: each time a new sample is added, the oldest sample is deleted, so the sample set stays current and the model learns online. Simulation results demonstrate the effectiveness of the proposed online algorithm: compared with the ε-SVR online algorithm, it achieves a lower prediction error rate in regression prediction and fits real data better.
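The abstract describes two ingredients: the Huber loss and a fixed-length rolling window that adds each new sample while deleting the oldest. The sketch below illustrates both in Python. It is illustrative only: the paper derives an incremental update of the Huber-SVR solution, whereas this naive version simply refits a robust regressor on the current window, and the class name `RollingWindowRegressor` and its parameters are hypothetical.

```python
from collections import deque

import numpy as np

def huber_loss(r, delta=1.0):
    """Huber loss: quadratic for |r| <= delta, linear beyond it."""
    r = np.abs(r)
    return np.where(r <= delta, 0.5 * r ** 2, delta * (r - 0.5 * delta))

class RollingWindowRegressor:
    """Fixed-length rolling-window learner (hypothetical helper).

    Each update appends the newest sample; the deque's maxlen drops the
    oldest automatically, then the wrapped model is refit on the window.
    (Naive sketch: the paper's algorithm updates the Huber-SVR solution
    incrementally instead of refitting from scratch.)
    """

    def __init__(self, model, window=50, min_fit=5):
        self.model = model
        self.min_fit = min_fit          # skip fitting until enough samples
        self.X = deque(maxlen=window)
        self.y = deque(maxlen=window)

    def update(self, x_new, y_new):
        self.X.append(x_new)            # maxlen evicts the oldest sample
        self.y.append(y_new)
        if len(self.y) >= self.min_fit:
            self.model.fit(np.array(self.X), np.array(self.y))
        return self.model

    def predict(self, X):
        return self.model.predict(np.array(X))
```

Any robust regressor can be plugged in; for example, scikit-learn's `HuberRegressor` (a linear model with the Huber loss, not an SVR) works as a stand-in when a Huber-SVR implementation is unavailable.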
Authors: Zhou Xiaojian, Xiao Dan, Fu Yu (School of Management, Nanjing University of Posts and Telecommunications, Nanjing 210023, China; School of Information, Xiamen University, Xiamen, Fujian 361005, China)
Source: Statistics & Decision (《统计与决策》), CSSCI / Peking University Core journal, 2021, No. 20, pp. 10-14 (5 pages)
Funding: National Natural Science Foundation of China (71872088, 71401080, 71904078); Natural Science Foundation of Jiangsu Province (BK20190793); Humanities and Social Sciences Research Fund of Nanjing University of Posts and Telecommunications (NYS216011)
Keywords: online algorithm; Huber-support vector regression machine; Huber loss function

