
Hybrid Neural Network Architecture for On-Line Learning

Abstract: Approaches to machine intelligence based on brain models use neural networks for generalization, but they do so as signal-processing black boxes. In reality, the brain consists of many modules that operate in parallel at different levels. In this paper we propose a more realistic, biologically inspired hybrid neural network architecture that uses two kinds of neural networks simultaneously to consider short-term and long-term characteristics of the signal. The first of these networks quickly adapts to new modes of operation, whereas the second provides more accurate learning within a specific mode. We call these networks the surfacing and deep learning agents and show that this hybrid architecture performs complementary functions that improve the overall learning. The performance of the hybrid architecture has been compared with that of back-propagation perceptrons and the CC and FC networks for chaotic time-series prediction, the CATS benchmark test, and smooth function approximation. It is shown that the proposed architecture provides superior performance based on the RMS error criterion.
Affiliation: Not specified
Source: Intelligent Information Management, 2010, No. 4, pp. 253-261 (9 pages)
Keywords: Neural Networks; Instantaneously Trained Networks; Back-Propagation; On-Line Learning
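
The abstract describes the two agents only at a conceptual level. Purely as an illustration of the idea, the following is a minimal sketch of an online predictor that pairs a small, fast-adapting network with a larger, slowly trained one and weights their outputs by recent prediction error. The class names (OnlineMLP, HybridPredictor), network sizes, learning rates, and the error-based mixing rule are assumptions made for this sketch; they are not the architecture, parameters, or update rules reported in the paper.

```python
import numpy as np

class OnlineMLP:
    """Single-hidden-layer perceptron trained online with plain SGD (illustrative)."""
    def __init__(self, n_in, n_hidden, lr, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.5, (n_hidden, n_in))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, n_hidden)
        self.b2 = 0.0
        self.lr = lr

    def forward(self, x):
        self.h = np.tanh(self.W1 @ x + self.b1)
        return self.W2 @ self.h + self.b2

    def update(self, x, target):
        # One online SGD step on the squared error for a single sample.
        y = self.forward(x)
        err = y - target
        dh = err * self.W2 * (1.0 - self.h ** 2)   # back-propagate through tanh layer
        self.W2 -= self.lr * err * self.h
        self.b2 -= self.lr * err
        self.W1 -= self.lr * np.outer(dh, x)
        self.b1 -= self.lr * dh
        return err

class HybridPredictor:
    """Blend a fast-adapting net and a slower, larger net; the mixing weight
    favours whichever agent has the lower recent error (assumed gating scheme)."""
    def __init__(self, n_in):
        self.fast = OnlineMLP(n_in, n_hidden=4, lr=0.2, seed=1)    # adapts quickly to new modes
        self.slow = OnlineMLP(n_in, n_hidden=16, lr=0.01, seed=2)  # refines within a mode
        self.err_fast = self.err_slow = 1.0   # running absolute errors
        self.decay = 0.9

    def predict(self, x):
        yf, ys = self.fast.forward(x), self.slow.forward(x)
        wf = self.err_slow / (self.err_fast + self.err_slow + 1e-12)
        return wf * yf + (1.0 - wf) * ys

    def learn(self, x, target):
        ef = self.fast.update(x, target)
        es = self.slow.update(x, target)
        self.err_fast = self.decay * self.err_fast + (1 - self.decay) * abs(ef)
        self.err_slow = self.decay * self.err_slow + (1 - self.decay) * abs(es)

# Usage: one-step-ahead prediction of a toy signal whose regime changes halfway.
if __name__ == "__main__":
    t = np.arange(400)
    signal = np.where(t < 200, np.sin(0.1 * t), np.sin(0.3 * t) + 0.5)
    window = 4
    model = HybridPredictor(window)
    sq_errs = []
    for i in range(window, len(signal)):
        x = signal[i - window:i]
        sq_errs.append((model.predict(x) - signal[i]) ** 2)
        model.learn(x, signal[i])
    print("RMS error:", np.sqrt(np.mean(sq_errs)))
```

The regime change halfway through the toy signal is meant to mimic the "new mode of operation" situation described in the abstract: the small network's running error drops first, so the blend leans on it until the larger network catches up within the new mode.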