Abstract
Existing Boosting-based weighted extreme learning machine (WELM) algorithms weight their component classifiers by the total classification error over all classes, which biases performance toward the majority class, and they do not account for the effect of noise and outliers in the data. This paper proposes a WELM algorithm based on AdaBoost. It adopts an error measure that accounts for the imbalanced distribution of the classes and passes that error through a sigmoid transformation, improving the recognition rate on both majority- and minority-class samples as well as the algorithm's robustness to noise. Experiments on 15 imbalanced UCI datasets show that the proposed algorithm achieves better classification performance than WELM and most existing imbalanced-data learning algorithms.
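The weighting scheme the abstract describes can be sketched roughly as follows. This is a generic reconstruction, not the paper's exact formulas: the class-balanced error (averaging per-class weighted error rates so minority mistakes count fully) and the sigmoid smoothing of that error before computing the classifier weight are the two ideas named in the abstract, while the sigmoid gain `k` and the precise update rule are assumptions for illustration. The base learner (WELM in the paper) is left abstract; only the boosting round is shown.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def class_balanced_error(y_true, y_pred, w):
    """Weighted error rate averaged per class, so the minority
    class contributes as much as the majority class."""
    errs = []
    for c in np.unique(y_true):
        m = y_true == c
        errs.append(np.sum(w[m] * (y_pred[m] != y_true[m])) / np.sum(w[m]))
    return float(np.mean(errs))

def boosting_round(y_true, y_pred, w, k=6.0):
    """One AdaBoost-style update using the class-balanced error,
    smoothed through a sigmoid (gain k is an assumed parameter)
    to damp rounds dominated by noisy or outlying samples."""
    eps = class_balanced_error(y_true, y_pred, w)
    # squash the raw error toward 0.5 before computing alpha,
    # so extreme errors caused by noise move the weights less
    eps_s = sigmoid(k * (eps - 0.5))
    alpha = 0.5 * np.log((1.0 - eps_s) / eps_s)
    # standard exponential re-weighting: up-weight mistakes
    w_new = w * np.exp(-alpha * np.where(y_pred == y_true, 1.0, -1.0))
    return alpha, w_new / np.sum(w_new)
```

Note how the balanced error makes one misclassified minority sample cost more than one misclassified majority sample, which is what removes the bias toward the large class.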
Authors
Tang Xiaofen; Chen Li (School of Information Science & Technology, Northwest University, Xi'an 710127, China; School of Information Engineering, Ningxia University, Yinchuan 750021, China)
Source
Application Research of Computers (《计算机应用研究》)
CSCD
Peking University Core Journal (北大核心)
2018, No. 10, pp. 2990-2993, 3002 (5 pages)
Funding
National Natural Science Foundation of China (61461043, 11561054, 61379010)