
Research on Verification of Neural Network Based on Softplus Function by Reluplex Algorithm
Abstract: Formal verification is a method in computer science that uses mathematical logic to verify whether a system behaves as intended. Fully applying formal verification methods to neural networks enables better study of their characteristics and applications. Reluplex is a simplex-based algorithm for verifying deep neural networks that use ReLU as the activation function; however, ReLU neurons are fragile during training and may "die". Softplus is an activation function similar to ReLU but smoother. We improved the Reluplex algorithm so that it can verify deep neural networks that use the Softplus activation function, and then obtained experimental results by testing adversarial robustness under Softplus. Comparison with the test results under ReLU confirms that verification efficiency under Softplus is significantly higher than under ReLU; Softplus is also more balanced than ReLU, allowing the neural network to learn faster. This study extends the functionality of neural network verification algorithms and provides a corresponding comparative analysis, which will help to better verify and improve deep neural networks in the future and ensure their security.
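The two activation functions compared in the abstract have standard definitions: ReLU(x) = max(0, x), and Softplus(x) = ln(1 + e^x), a smooth approximation of ReLU. A minimal sketch of this relationship (the standard formulas, not code from the paper):

```python
import math

def relu(x: float) -> float:
    # ReLU: max(0, x). The gradient is exactly 0 for x < 0,
    # which is why ReLU neurons can "die" during training.
    return max(0.0, x)

def softplus(x: float) -> float:
    # Softplus: ln(1 + e^x). Smooth everywhere, with a strictly
    # positive gradient, so neurons never fully switch off.
    return math.log1p(math.exp(x))

# Softplus approaches ReLU for large |x| but stays smooth near 0.
for x in (-5.0, 0.0, 5.0):
    print(f"x={x:+.1f}  relu={relu(x):.4f}  softplus={softplus(x):.4f}")
```

For large positive x the two functions nearly coincide (softplus(5) ≈ 5.007), while at x = 0 Softplus takes the value ln 2 ≈ 0.693 instead of ReLU's hard corner at 0.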
Authors: Lu Mingyuan; Hou Chunyan; Wang Jinsong (School of Computer Science and Engineering, Tianjin University of Technology, Tianjin 300384)
Source: Journal of Information Security Research, 2022, No. 9, pp. 917-924 (8 pages)
Funding: National Natural Science Foundation of China, General Program (6217070940).
Keywords: neural network; formal verification; activation function; ReLU; Softplus