
Long-series missing data interpolation model for dam monitoring based on deep learning framework
Abstract: Unfavorable factors such as system failure and sensor aging often lead to missing monitoring data. From a time-series perspective, a bidirectional CNN-BiLSTM-Attention interpolation model based on a deep learning framework was constructed for intermediate missing data in dam monitoring series. The model combines the algorithmic strengths of the convolutional neural network (CNN) and the long short-term memory (LSTM) neural network to extract temporal features, introduces an attention mechanism to optimize the interpolation process, and fuses the forward and backward interpolation results with weights that decrease with the time step. A concrete gravity dam was selected as a case study, and the model was applied to interpolate long-series missing data of dam monitoring quantities. The results indicate that bidirectional fusion interpolation can effectively eliminate the cumulative error over interpolation time steps for long-series missing data, and that this deep learning model achieves higher interpolation accuracy than other interpolation models.
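The fusion step described in the abstract can be sketched as follows. The abstract states only that the forward and backward interpolation results are fused with weights decreasing by time step; the linear weight schedule and the function name below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def fuse_bidirectional(forward, backward):
    """Fuse forward and backward imputations over a gap of length T.

    The weight on the forward estimate decreases with the time step
    (here: linearly, an assumed schedule), so each direction dominates
    near the boundary it was anchored to, limiting cumulative error
    far from that boundary.
    """
    forward = np.asarray(forward, dtype=float)
    backward = np.asarray(backward, dtype=float)
    T = len(forward)
    if T == 1:
        w = np.array([0.5])  # single missing point: average the two estimates
    else:
        # forward weight: 1 at the start of the gap, 0 at the end
        w = np.linspace(1.0, 0.0, T)
    return w * forward + (1.0 - w) * backward

# Example: a 3-step gap where the forward model predicts 1.0 throughout
# and the backward model predicts 3.0 throughout.
fused = fuse_bidirectional([1.0, 1.0, 1.0], [3.0, 3.0, 3.0])
print(fused)  # → [1. 2. 3.]
```

At the gap boundaries the fused series matches the direction anchored there (forward at the start, backward at the end), which is how the scheme suppresses the error that accumulates as either single-direction prediction extends further into the gap.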
Authors: LEI Wei; WANG Jian; JI Tongyuan; LI Pengfei (College of Water Conservancy and Hydropower Engineering, Hohai University, Nanjing 210098, China; China Design Group Co., Ltd., Nanjing 210014, China)
Source: Advances in Science and Technology of Water Resources (《水利水电科技进展》; CSCD, Peking University Core Journal), 2023, No. 6, pp. 82-88
Funding: National Natural Science Foundation of China (Grant No. 52279099)
Keywords: dam monitoring; missing data; time series; deep learning; attention mechanism