
Optimal Design of ReLU Activation Function in Convolutional Neural Networks

Cited by: 32
Abstract: In a convolutional neural network, the activation function activates the features of neurons, then retains and maps them; this is the key to a neural network's ability to simulate the mechanisms of the human brain and solve nonlinear problems. ReLU stands out among activation functions, yet it has shortcomings of its own. This paper optimizes the design of the ReLU function from two aspects. First, it discusses the learning rate when the activation function is trained by gradient descent and proposes a feasible method for improving the learning rate. Second, it proposes a new rectified activation function, called the e-ln function; simulation experiments on the MNIST dataset show that in some cases its performance is better than that of ReLU.
Source: 《信息通信》 (Information & Communications), 2018, No. 1, pp. 42-43.
Keywords: convolutional neural network; activation function; ReLU; optimal design
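
The abstract names ReLU and a new "e-ln" rectified activation function but does not give the e-ln formula. The Python sketch below shows standard ReLU next to a hypothetical e-ln-style rectifier (an assumption pairing an exponential branch for negative inputs with a logarithmic branch for positive inputs; the paper's actual definition may differ), plus a simple step-decay schedule of the kind the paper's gradient-descent learning-rate discussion concerns (illustrative, not the paper's method).

```python
import numpy as np

def relu(x):
    """Standard ReLU: max(0, x). Negative inputs get zero output and
    zero gradient, which can leave neurons permanently inactive
    (the 'dying ReLU' shortcoming the abstract alludes to)."""
    return np.maximum(0.0, x)

def e_ln(x):
    """HYPOTHETICAL e-ln-style rectifier -- the abstract names the
    function but does not define it. This sketch assumes e^x - 1 for
    x < 0 and ln(1 + x) for x >= 0; both branches meet at 0 with
    slope 1, and the negative side keeps a nonzero gradient."""
    neg = np.minimum(x, 0.0)   # clamp so each branch sees valid inputs
    pos = np.maximum(x, 0.0)
    return np.where(x < 0.0, np.exp(neg) - 1.0, np.log1p(pos))

def step_decay_lr(base_lr, epoch, drop=0.5, epochs_per_drop=10):
    """One common learning-rate improvement for gradient descent:
    multiply the rate by `drop` every `epochs_per_drop` epochs."""
    return base_lr * drop ** (epoch // epochs_per_drop)

x = np.linspace(-3.0, 3.0, 7)
print(relu(x))                        # [0. 0. 0. 0. 1. 2. 3.]
print(e_ln(x))                        # negative side saturates at -1
print(step_decay_lr(0.1, epoch=25))   # 0.025 after two drops
```

Either activation can be dropped into a NumPy forward pass as written; in a framework such as PyTorch the same piecewise form would be expressed with torch.where.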
