Abstract
In high-dimensional space, the classification hyperplane tends to pass through the origin, so the bias term b is not needed. To study whether b is needed in ν-SVM classification, this paper proposes the dual optimization problem of ν-SVM without b and presents a method for solving it. The method uses an active-set strategy to reduce the dual problem to equality-constrained sub-problems, which are then converted into systems of linear equations by the Lagrange multiplier method. Experiments show that the presence of the bias b degrades the generalization ability of ν-SVM, and that ν-SVM with b can only obtain a sub-optimal solution of ν-SVM without b.
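For orientation, the following is a hedged sketch of the optimization problems the abstract refers to, reconstructed from the standard ν-SVC formulation (Schölkopf et al.); the symbols ℓ (number of training pairs), Q_{ij} = y_i y_j k(x_i, x_j), the free/bound index sets F and B, the all-ones vector e, and the multiplier λ are introduced here for illustration only, and the paper's exact formulation may differ. The standard ν-SVC dual is

\begin{aligned}
\min_{\alpha}\ & \tfrac{1}{2}\,\alpha^{\top} Q\,\alpha \\
\text{s.t.}\ & 0 \le \alpha_i \le \tfrac{1}{\ell},\qquad
\sum_{i=1}^{\ell} \alpha_i y_i = 0,\qquad
\sum_{i=1}^{\ell} \alpha_i \ge \nu .
\end{aligned}

The constraint \sum_i \alpha_i y_i = 0 is exactly the stationarity condition \partial L/\partial b = 0, so when the bias is dropped from the decision function f(x) = \sum_i \alpha_i y_i k(x_i, x) this constraint disappears and only the box constraint and \sum_i \alpha_i \ge \nu remain. In an active-set iteration, presumably the variables in B are fixed at 0 or 1/\ell, the remaining constraint is handled as the active equality \sum_i \alpha_i = \nu, and the Lagrange multiplier method reduces the sub-problem to a linear system of the form

\begin{pmatrix} Q_{FF} & -e_F \\ e_F^{\top} & 0 \end{pmatrix}
\begin{pmatrix} \alpha_F \\ \lambda \end{pmatrix}
=
\begin{pmatrix} -\,Q_{FB}\,\alpha_B \\ \nu - e_B^{\top}\alpha_B \end{pmatrix},

which corresponds to the "equality-constrained sub-problem turned into linear equations" step described above.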
Source
《电子与信息学报》
EI
CSCD
Peking University Core Journals (北大核心)
2011, No. 8, pp. 1998-2002 (5 pages)
Journal of Electronics & Information Technology
Funding
Supported by the National High Technology Research and Development Program of China (863 Program) (2008AA01Z136)
Keywords
ν-Support Vector Machine (SVM)
Bias
Generalization ability
Active set