
A FAST LEARNING ALGORITHM FOR FEEDFORWARD NEURAL NETWORKS WITH HYBRID STRUCTURES

Cited by: 2
Abstract: This paper constructs a new hybrid-structure neural network model that combines a linear model with a multilayer feedforward network, and proposes a corresponding non-iterative fast learning algorithm. Given a required fitting accuracy, the algorithm uses linear least squares to determine the optimal network weights and the parameters of the linear part, and automatically determines the optimal number of hidden nodes. Comparison with BP networks shows that the proposed fast learning algorithm is significantly better than the BP algorithm in fitting accuracy, learning speed, generalization ability, and number of hidden nodes.
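The landing page does not give the paper's exact construction, but the abstract outlines the general idea: fit a linear part and a feedforward part jointly by linear least squares, and grow the hidden layer until a fitting tolerance is met. The sketch below is a minimal illustration of that idea in this spirit only; the function name `fit_hybrid_network`, the use of random hidden-layer weights, and the tanh activation are assumptions, not details from the paper.

```python
import numpy as np

def fit_hybrid_network(X, y, tol=1e-3, max_hidden=50, rng=None):
    """Illustrative sketch: least-squares fit of a hybrid model
    (linear part + single-hidden-layer feedforward part).
    Hidden nodes are added one at a time until the RMS fitting
    error falls below `tol` or `max_hidden` is reached.
    NOTE: hidden-layer weights are drawn at random here, which is
    an assumption not stated in the abstract.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    H = np.empty((n, 0))          # hidden-layer outputs, grown column by column
    W, b = [], []                 # hidden-node input weights and biases
    for k in range(max_hidden + 1):
        # Design matrix: bias column + linear part + current hidden nodes.
        A = np.hstack([np.ones((n, 1)), X, H])
        # Non-iterative step: solve for all output-side parameters at once.
        theta, *_ = np.linalg.lstsq(A, y, rcond=None)
        err = np.sqrt(np.mean((A @ theta - y) ** 2))
        if err <= tol or k == max_hidden:
            return theta, np.array(W), np.array(b), k, err
        # Accuracy not yet reached: add one hidden node with random weights.
        w_k, b_k = rng.standard_normal(d), rng.standard_normal()
        W.append(w_k)
        b.append(b_k)
        H = np.hstack([H, np.tanh(X @ w_k + b_k)[:, None]])
```

Because every output-side parameter (linear coefficients and hidden-to-output weights) enters the model linearly, each fit is a single least-squares solve rather than an iterative gradient descent, which is what makes this style of algorithm fast relative to BP training.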
Source: Pattern Recognition and Artificial Intelligence (EI, CSCD, Peking University core journal), 2003, No. 1, pp. 97-101 (5 pages)
Keywords: hybrid structure, feedforward neural networks, fast learning algorithm, linear least squares, generalization
