
ON THE TRAINING OF NEURAL NETWORK FOR ASSOCIATIVE MEMORY

Cited by: 3
Abstract: This paper proposes an optimized training scheme for neural networks used as associative memories. We show that the basins of attraction of the sample attractors can be controlled to some extent by a pitfall-depth parameter, so that the fault tolerance of the network can be made as good as possible. Numerical simulations show that with this scheme the network capacity can reach α ≈ 1 (α = M/N, where N is the number of neurons and M is the number of stored samples) while still retaining good fault tolerance, clearly outperforming popular schemes such as the outer-product, orthogonalized outer-product, and pseudo-inverse matrix schemes. The symmetry and convergence of trained networks are also discussed.
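For context, the outer-product (Hebbian) scheme that the abstract uses as a baseline stores patterns via W = (1/N) Σ_μ ξ^μ(ξ^μ)ᵀ with zero diagonal; a minimal Hopfield-style sketch of storage and noisy recall (sizes, seed, and variable names are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 100, 5                         # N neurons, M stored patterns; alpha = M/N = 0.05
patterns = rng.choice([-1, 1], size=(M, N))

# Outer-product (Hebbian) weight matrix: symmetric, zero diagonal.
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def recall(state, steps=20):
    # Synchronous sign-updates until a fixed point (or the step limit) is reached.
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1             # break ties deterministically
        if np.array_equal(new, state):
            break
        state = new
    return state

# Flip 10% of the bits of one stored pattern and let the dynamics clean it up.
noisy = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
noisy[flip] *= -1
restored = recall(noisy)
print(np.array_equal(restored, patterns[0]))
```

At this low loading (α well below the Hebbian limit of about 0.14) the stored pattern sits in a wide basin, so recall from moderate noise succeeds; the abstract's point is that plain outer-product storage loses this property long before α approaches 1.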
Authors: Zhang Chengfu, Zhao Gang
Affiliation: Department of Physics, Peking University
Source: Acta Automatica Sinica (《自动化学报》), 1995, No. 6, pp. 641-648 (8 pages); indexed in EI, CSCD, Peking University Core
Funding: State Climbing Program project on nonlinear science (国家非线性科学攀登项目)
Keywords: neural network, associative memory, fault tolerance, optimized training, attractor, basin of attraction