Abstract
This paper proposes a feature selection method that combines the Bhattacharyya distance with the K-L (Karhunen-Loeve) transform. Using the iterative algorithm for Bhattacharyya-distance feature selection [3,5], the minimum upper bound on the error rate can be obtained. When the feature dimensionality is high, the samples are first reduced to an intermediate dimensionality by a K-L transform in order to cut the computation time of Bhattacharyya-distance feature selection; Bhattacharyya-distance feature selection is then applied to reduce the features to the final dimensionality. Experiments on the MNIST handwritten digit database show that the proposed method requires far less computation time than Bhattacharyya-distance feature selection alone, and achieves a much lower error rate than feature selection by the principal component method (i.e., the K-L transform alone).
This paper presents a feature selection method that combines the merits of the K-L transform and the Bhattacharyya distance. First, the K-L transform is used to remove noise and features that do not play an important role in separating the classes. Then, taking advantage of the direct relationship between the Bhattacharyya distance and the upper bound on the Bayes error probability, a recursive algorithm is used to select the features that minimize this upper bound. The authors apply the method to MNIST. The results show that the method is not only workable but far more effective than the K-L transform alone.
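As a rough illustration of the two-stage procedure described in the abstract, the sketch below first applies a K-L (principal component) projection to an intermediate dimensionality and then greedily selects the features that maximize the Bhattacharyya distance between two Gaussian-modelled classes, the quantity that controls the Bayes error upper bound. The function names (kl_transform, bhattacharyya, greedy_select), the greedy forward-selection strategy, and the two-class restriction are illustrative assumptions; the paper's own iterative algorithm [3,5] is not reproduced here.

```python
import numpy as np

def kl_transform(X, m):
    """K-L (principal component) projection onto the m leading eigenvectors
    of the sample covariance; returns projected data and the basis."""
    Xc = X - X.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    W = eigvecs[:, np.argsort(eigvals)[::-1][:m]]  # top-m eigenvectors
    return Xc @ W, W

def bhattacharyya(x1, x2):
    """Bhattacharyya distance between two classes modelled as Gaussians;
    sqrt(P1*P2) * exp(-B) upper-bounds the Bayes error probability."""
    m1, m2 = x1.mean(axis=0), x2.mean(axis=0)
    c1 = np.atleast_2d(np.cov(x1, rowvar=False))
    c2 = np.atleast_2d(np.cov(x2, rowvar=False))
    c = (c1 + c2) / 2.0
    diff = m1 - m2
    mean_term = diff @ np.linalg.solve(c, diff) / 8.0
    _, ld = np.linalg.slogdet(c)
    _, ld1 = np.linalg.slogdet(c1)
    _, ld2 = np.linalg.slogdet(c2)
    cov_term = 0.5 * (ld - 0.5 * (ld1 + ld2))
    return mean_term + cov_term

def greedy_select(x1, x2, k):
    """Greedy forward selection of k features maximizing the Bhattacharyya
    distance (a simple stand-in for the paper's iterative selection)."""
    selected, remaining = [], list(range(x1.shape[1]))
    for _ in range(k):
        scores = [(bhattacharyya(x1[:, selected + [j]], x2[:, selected + [j]]), j)
                  for j in remaining]
        _, best = max(scores)
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy usage with random data standing in for 784-dim MNIST vectors:
# stage 1 reduces to an intermediate dimensionality, stage 2 selects features.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 784))
y = rng.integers(0, 2, size=400)
Z, W = kl_transform(X, 50)                        # K-L to intermediate dim
idx = greedy_select(Z[y == 0], Z[y == 1], 20)     # Bhattacharyya selection
```

In this sketch the dimensions (784 to 50 to 20) are arbitrary placeholders; the point is only that the expensive distance-based search runs in the reduced K-L space rather than in the original feature space.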
Source
《计算机工程与应用》
CSCD
Peking University Core Journal (北大核心)
2004, No. 36, pp. 90-92 (3 pages)
Computer Engineering and Applications