
Dynamic Learning for Forward Neural Networks (cited by 3)
Abstract  The vectors formed by the outputs of all neurons in the same hidden layer over different samples should be linearly independent. Building on this basic fact, this paper introduces, for each hidden layer, a correlation vector and a corresponding independence measure. Neurons are deleted from or added to a hidden layer according to its independence measure, and the network weights are adjusted appropriately at the same time. This approach avoids having to fix the number of hidden neurons in advance, and it also allows the learning process to escape local minima. Based on a detailed analysis of the error that deleting a neuron introduces into the network, an optimal rule for deleting hidden neurons is given. Numerical experiments illustrate the method's efficiency.
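The deletion criterion described in the abstract can be sketched numerically. A minimal sketch, assuming the independence measure is taken as the smallest singular value of the column-normalized hidden-output matrix; the function names and the tolerance `tol` are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def independence_measure(H):
    """Independence measure of a hidden layer: the smallest singular value
    of the column-normalized output matrix H (rows = samples, columns =
    neurons). It is near zero when some neuron's output vector is (nearly)
    a linear combination of the other neurons' output vectors."""
    Hn = H / np.linalg.norm(H, axis=0, keepdims=True)
    return float(np.linalg.svd(Hn, compute_uv=False)[-1])

def should_delete_neuron(H, tol=1e-6):
    """Flag the layer for neuron deletion when its output vectors are
    (nearly) linearly dependent, i.e. the measure falls below tol."""
    return independence_measure(H) < tol

# Example: four hidden neurons observed over 20 samples; the last
# neuron's output vector is an exact linear combination of the first two.
rng = np.random.default_rng(0)
H = rng.standard_normal((20, 4))
H[:, 3] = H[:, 0] + 2.0 * H[:, 1]
print(should_delete_neuron(H))  # → True
```

A dependent column makes the normalized output matrix rank-deficient, so the smallest singular value collapses toward zero and the deletion test fires; for generic (independent) outputs the measure stays well away from zero.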
Source  Acta Electronica Sinica (《电子学报》), indexed in EI / CAS / CSCD / PKU Core, 1998, No. 11, pp. 140-144 (5 pages)
Funding  Dean's Merit Foundation of the Graduate School, University of Science and Technology of China
Keywords  Hidden-unit deletion; Deletion rule; Forward network; Dynamic learning
