Abstract: In this paper, we consider regularized learning schemes based on the l1-regularizer and the pinball loss in a data-dependent hypothesis space. The target is the error analysis of quantile regression learning. No regularity condition is imposed on the kernel function except continuity and boundedness. The graph-based semi-supervised algorithm leads to an extra error term, called the manifold error. Some new error bounds and convergence rates are explicitly derived using techniques based on the l1-empirical covering number and a boundedness decomposition.
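For reference, a minimal sketch of the quantities involved (notation assumed here, not taken from the paper): with samples z = {(x_i, y_i)}_{i=1}^m, the pinball loss at quantile level τ ∈ (0, 1) and a coefficient-based l1-regularized scheme over the data-dependent hypothesis space H_{K,z} = { f = Σ_i α_i K(·, x_i) } are commonly written as
\[
\rho_\tau(u) = \begin{cases} \tau u, & u \ge 0, \\ (\tau - 1)\,u, & u < 0, \end{cases}
\qquad
f_z = \arg\min_{f \in \mathcal{H}_{K,z}} \ \frac{1}{m}\sum_{i=1}^{m} \rho_\tau\bigl(y_i - f(x_i)\bigr) + \lambda \sum_{i=1}^{m} |\alpha_i|.
\]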
Funding: Supported by the Special Fund of Basic Scientific Research of Central Colleges (CZQ13015) and the Teaching Research Fund of South-Central University for Nationalities (JYX13023).
Abstract: Existing support vector machines (SVM) for large-scale data classification are sensitive to noisy samples. To address this problem, a new soft kernel convex hull support vector machine for large scale noisy datasets (SCH-SVM) is proposed by defining the soft kernel convex hull and introducing the pinball loss function. SCH-SVM first defines the concept of the soft kernel convex hull, then selects the soft kernel convex hull vectors that represent the geometric contour of the samples in the kernel space, takes their corresponding original-space samples as training samples, and uses the pinball loss function to find the maximum quantile distance between the soft kernel convex hulls of the two classes. Theoretical analysis and experimental results demonstrate the effectiveness of the proposed classifier in terms of training time, noise resistance, and number of support vectors.
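As an illustration only (this is not the SCH-SVM algorithm itself, which first selects soft kernel convex hull vectors in the kernel space), the sketch below shows how a pinball-loss-based linear classifier can be trained by subgradient descent; the names pinball_loss and train_pin_svm and the parameters tau, lam, lr, epochs are assumptions for this example, not taken from the paper.

```python
import numpy as np

def pinball_loss(margins, tau):
    # Pinball-style loss on the classification margin u = 1 - y*f(x):
    # max(u, -tau*u) penalises misclassified points like the hinge loss
    # and, unlike the hinge, also penalises overly large margins with slope tau.
    u = 1.0 - margins
    return np.maximum(u, -tau * u)

def train_pin_svm(X, y, tau=0.5, lam=0.01, lr=0.05, epochs=300):
    # Minimise  mean pinball loss + (lam/2)*||w||^2  by subgradient descent.
    # X: (n, d) samples, y: (n,) labels in {-1, +1}.
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        u = 1.0 - margins
        # subgradient of the per-sample loss with respect to the margin
        g = np.where(u > 0, -1.0, tau)
        w -= lr * ((g * y) @ X / n + lam * w)
        b -= lr * np.mean(g * y)
    return w, b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # two overlapping Gaussian blobs as a toy noisy dataset
    X = np.vstack([rng.normal(-1, 1.2, (500, 2)), rng.normal(1, 1.2, (500, 2))])
    y = np.hstack([-np.ones(500), np.ones(500)])
    w, b = train_pin_svm(X, y, tau=0.5)
    acc = np.mean(np.sign(X @ w + b) == y)
    loss = pinball_loss(y * (X @ w + b), 0.5).mean()
    print(f"training accuracy: {acc:.3f}, mean pinball loss: {loss:.3f}")
```

Because the pinball loss keeps a nonzero slope on both sides of the margin, the resulting classifier depends on quantile-like statistics of the margins rather than only on the points closest to the boundary, which is the property the abstract exploits for noise resistance.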