
Monthly precipitation prediction based on WD-COA-LSTM model (Cited by: 4)
Abstract: To further improve the accuracy of monthly precipitation prediction, a precipitation prediction model (WD-COA-LSTM) is proposed that combines wavelet decomposition (WD), the coyote optimization algorithm (COA), and a long short-term memory (LSTM) neural network. First, the time series is preprocessed with wavelet decomposition to remove its non-stationarity, yielding one low-frequency sequence and three high-frequency sequences. Then, the parameters of the LSTM model are optimized by the coyote optimization algorithm. Finally, the monthly precipitation prediction is obtained by superimposing the predicted values of each subsequence. The proposed model was applied to monthly precipitation prediction at two rain-gauge stations, Baitu Town in Luanchuan County and Guxian Town in Luoning County, Luoyang City, and the results were compared with those of the LSTM, COA-LSTM, and WD-LSTM models. The results show that the proposed WD-COA-LSTM model achieves the highest prediction accuracy, indicating that wavelet decomposition and the coyote optimization algorithm can effectively improve the prediction accuracy and generalization ability of the LSTM model and provide a new approach to monthly precipitation prediction.
Authors: WANG Wenchuan; YANG Jingxin; ZANG Hongfei (College of Water Resources, North China University of Water Resources and Electric Power, Zhengzhou 450046, China)
Source: Journal of Water Resources and Water Engineering (CSCD; Peking University Core Journal), 2022, No. 4, pp. 8-13, 23 (7 pages)
Funding: Key R&D and Promotion Projects of Henan Province (202102310259, 202102310588); University Science and Technology Innovation Team of Henan Province (18IRTSTHN009)
Keywords: monthly precipitation prediction; wavelet decomposition (WD); coyote optimization algorithm (COA); long short-term memory (LSTM) neural network
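
The abstract describes a three-stage pipeline: wavelet decomposition of the rainfall series into one low-frequency and three high-frequency subsequences, LSTM prediction of each subsequence with hyperparameters tuned by the coyote optimization algorithm, and summation of the subsequence forecasts. The paper itself gives no code; the Python sketch below only illustrates the general idea under stated assumptions (a db4 wavelet at level 3, synthetic monthly data, and a small grid over LSTM unit counts standing in for the COA search), so names such as `make_windows` and `candidate_units` are illustrative and not taken from the paper.

```python
# Illustrative sketch of a WD-LSTM pipeline (not the authors' code).
# Assumptions: db4 wavelet, 3-level decomposition, and a small grid search
# used as a stand-in for the coyote optimization algorithm (COA).
import numpy as np
import pywt
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

def decompose(series, wavelet="db4", level=3):
    """Split a series into 1 low-frequency + `level` high-frequency components,
    each reconstructed back to the original length."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    components = []
    for i in range(len(coeffs)):
        keep = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        components.append(pywt.waverec(keep, wavelet)[: len(series)])
    return components  # [approximation, detail_level, ..., detail_1]

def make_windows(series, lookback=12):
    """Turn a 1-D series into (samples, lookback, 1) inputs and next-step targets."""
    X, y = [], []
    for t in range(len(series) - lookback):
        X.append(series[t : t + lookback])
        y.append(series[t + lookback])
    return np.array(X)[..., None], np.array(y)

def fit_lstm(X, y, units):
    model = Sequential([LSTM(units, input_shape=X.shape[1:]), Dense(1)])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=30, batch_size=16, verbose=0)
    return model

# Synthetic monthly rainfall stand-in (the paper uses two gauging stations).
rng = np.random.default_rng(0)
months = np.arange(240)
rain = 60 + 40 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 10, months.size)

lookback, split = 12, 192
candidate_units = [16, 32, 64]  # stand-in for the COA hyperparameter search
total_forecast = np.zeros(months.size - split)

for comp in decompose(rain):
    X, y = make_windows(comp, lookback)
    X_tr, y_tr = X[: split - lookback], y[: split - lookback]
    X_te = X[split - lookback :]
    # Pick the unit count with the lowest training loss; COA would search
    # this space (and typically other hyperparameters) more thoroughly.
    best = min((fit_lstm(X_tr, y_tr, u) for u in candidate_units),
               key=lambda m: m.evaluate(X_tr, y_tr, verbose=0))
    total_forecast += best.predict(X_te, verbose=0).ravel()

print("Reconstructed monthly precipitation forecast:", total_forecast[:5])
```

Summing the per-component forecasts mirrors the superposition step in the abstract: because the wavelet components reconstruct the original series additively, their individual predictions can be added to recover a forecast of the full precipitation signal.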