Abstract
This paper applies singular value decomposition (SVD) to the optimization of neural network structure. By performing SVD on the weight matrix of a converged network, a suitable number of hidden-layer neurons can be determined from the decomposition result. The network structure is thereby simplified and the number of connection weights is reduced, which lowers the computational cost and saves storage and training time. In the feature-extraction stage, the paper also proposes a method for locating characteristic points and uses the attributes of these points, together with their relative positions, as the input to the neural network. Experiments show that this set of features reflects the structural information of handwritten digits well.
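The sketch below illustrates the kind of SVD-based hidden-unit selection the abstract describes: decompose a weight matrix of the converged network and keep only as many singular directions as needed. It is a minimal illustration, not the paper's procedure; the function name, the choice of the hidden-to-output matrix, and the 95% energy threshold are assumptions introduced here for clarity.

import numpy as np

def suggest_hidden_units(W, energy=0.95):
    """Suggest a reduced hidden-layer size from the SVD of a weight matrix.

    W      : weight matrix of the converged network (e.g. hidden-to-output),
             shape (n_outputs, n_hidden).
    energy : fraction of total singular-value energy to retain
             (0.95 is an illustrative threshold, not the paper's criterion).
    """
    # Singular values, returned by NumPy in descending order.
    s = np.linalg.svd(W, compute_uv=False)
    cumulative = np.cumsum(s**2) / np.sum(s**2)
    # Smallest rank whose leading singular values capture the requested energy.
    return int(np.searchsorted(cumulative, energy) + 1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy stand-in for a converged network's weight matrix:
    # an (approximately) rank-6 matrix of size 10 x 40 plus small noise.
    W = rng.standard_normal((10, 6)) @ rng.standard_normal((6, 40))
    W += 0.01 * rng.standard_normal(W.shape)
    print("suggested hidden units:", suggest_hidden_units(W))

Run directly, this prints a suggested hidden-layer size close to the effective rank of the matrix (about 6 for the toy example), mirroring the idea that singular values near zero indicate redundant hidden neurons.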
Source
《云南大学学报(自然科学版)》
CAS
CSCD
1995, No. 1, pp. 69-73 (5 pages)
Journal of Yunnan University(Natural Sciences Edition)
Funding
Applied Basic Research Foundation of Yunnan Province
Keywords
singular value decomposition (SVD)
neural network
hidden-layer neuron
weight matrix
digit recognition