Abstract: Accurate wind power forecasting improves the safety and reliability of power-grid operation. To further improve short-term wind power forecasting accuracy, and to address the difficulty of any single model achieving optimal predictions, a CNN-LSTM&GRU multi-model combined short-term wind power forecasting method is proposed. First, a convolutional neural network (CNN) extracts local features from the data and is combined with a long short-term memory (LSTM) network to form a CNN-LSTM structure with a local-feature pre-extraction module. This structure is then placed in parallel with a gated recurrent unit (GRU) network, and an adaptive weight-learning module selects the best weights for the outputs of the CNN-LSTM and GRU branches, yielding the combined CNN-LSTM&GRU short-term forecasting model. Finally, the method is applied to forecasting the output of a wind farm in northwest China. The results show that, compared with single models and other combined models, the proposed model achieves smaller error metrics and higher forecasting accuracy.
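The adaptive-weight step described in this abstract can be sketched in isolation. The snippet below is a minimal illustration, not the paper's implementation: it learns softmax-normalized weights for blending two sub-model predictions by gradient descent on validation-set squared error, using synthetic data in place of real CNN-LSTM and GRU outputs.

```python
import numpy as np

# Synthetic stand-ins for validation-set predictions from the two branches;
# branch 1 (CNN-LSTM) is deliberately made more accurate than branch 2 (GRU).
rng = np.random.default_rng(0)
y_true = rng.normal(size=200)                              # validation targets
pred_cnn_lstm = y_true + rng.normal(scale=0.3, size=200)   # branch 1 output
pred_gru = y_true + rng.normal(scale=0.6, size=200)        # branch 2 output

preds = np.stack([pred_cnn_lstm, pred_gru])                # shape (2, N)
theta = np.zeros(2)                                        # weight logits

for _ in range(500):                       # gradient descent on the logits
    w = np.exp(theta) / np.exp(theta).sum()                # softmax weights
    err = w @ preds - y_true                               # blended residual
    grad_w = 2 * (preds @ err) / len(y_true)               # d(MSE)/d(w)
    grad_theta = w * (grad_w - w @ grad_w)                 # softmax Jacobian
    theta -= 0.5 * grad_theta

w = np.exp(theta) / np.exp(theta).sum()
print(w)  # the more accurate branch should receive the larger weight
```

In the paper the weights are learned jointly with the networks; here the sub-model outputs are fixed, which reduces the module to a convex blending problem.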
Abstract: Traffic flow forecasting is an important component of urban intelligent transportation systems. With the continued development of artificial intelligence and machine learning, deep learning has been widely applied in traffic engineering. This work takes the gated recurrent unit (GRU) neural network as its subject, uses cross-validation to find the optimal number of gated recurrent units in the GRU model, and compares it against three other forecasting models, including support vector regression, across several evaluation metrics. The results show that the GRU model delivers better forecasting performance than the other three models.
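The model-selection loop this abstract describes, k-fold cross-validation over candidate hidden-unit counts, can be sketched generically. Training a real GRU is out of scope here, so `fit_and_score` below is a hypothetical stand-in to be replaced by actual GRU training and evaluation; the fold-splitting and selection logic is the point.

```python
import numpy as np

def k_fold_indices(n_samples, k):
    """Yield (train_idx, val_idx) index pairs for k contiguous folds."""
    folds = np.array_split(np.arange(n_samples), k)
    for i in range(k):
        val = folds[i]
        train = np.concatenate([f for j, f in enumerate(folds) if j != i])
        yield train, val

def select_units(X, y, candidates, fit_and_score, k=5):
    """Return the unit count with the best mean CV score, plus all scores."""
    mean_scores = {}
    for units in candidates:
        scores = [fit_and_score(X[tr], y[tr], X[va], y[va], units)
                  for tr, va in k_fold_indices(len(X), k)]
        mean_scores[units] = float(np.mean(scores))
    return max(mean_scores, key=mean_scores.get), mean_scores

# Toy usage: a stand-in scorer whose mean peaks at 64 units.
best, scores = select_units(
    np.zeros((100, 3)), np.zeros(100), [16, 32, 64, 128],
    lambda Xtr, ytr, Xva, yva, u: -(u - 64) ** 2)
```

For time-series data such as traffic flow, a walk-forward split (training folds strictly preceding validation folds) is usually preferable to shuffled folds, since it avoids leaking future observations into training.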
基金supported by National Natural Science Foundation of China(Nos.61422203 and 61333014)National Key Basic Research Program of China(No.2014CB340501)
Abstract: Recurrent neural networks (RNN) have been very successful in handling sequence data. However, understanding RNNs and finding best practices for RNN learning is a difficult task, partly because there are many competing and complex hidden units, such as the long short-term memory (LSTM) and the gated recurrent unit (GRU). We propose a gated unit for RNNs, named the minimal gated unit (MGU), since it contains only one gate, which is a minimal design among all gated hidden units. The design of MGU benefits from evaluation results on LSTM and GRU in the literature. Experiments on various sequence data show that MGU has comparable accuracy with GRU, but has a simpler structure, fewer parameters, and faster training. Hence, MGU is suitable for RNN applications. Its simple architecture also means that it is easier to evaluate and tune, and in principle it is easier to study MGU's properties theoretically and empirically.
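The one-gate design described above can be written out concretely. The sketch below implements the MGU recurrence with random placeholder weights (not trained parameters) and compares parameter counts against GRU and LSTM, whose hidden units use three and four weight sets respectively to MGU's two.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mgu_step(x, h, Wf, Uf, bf, Wh, Uh, bh):
    """One MGU step: a single forget gate drives both reset and update."""
    f = sigmoid(Wf @ x + Uf @ h + bf)               # the only gate
    h_tilde = np.tanh(Wh @ x + Uh @ (f * h) + bh)   # candidate state
    return (1.0 - f) * h + f * h_tilde              # gated update

rng = np.random.default_rng(1)
n_in, n_hid = 8, 16
params = [rng.normal(scale=0.1, size=s) for s in
          [(n_hid, n_in), (n_hid, n_hid), (n_hid,),    # gate weights
           (n_hid, n_in), (n_hid, n_hid), (n_hid,)]]   # candidate weights

h = np.zeros(n_hid)
for t in range(5):                                  # run a short sequence
    h = mgu_step(rng.normal(size=n_in), h, *params)

# Parameter-count comparison: each gated weight set costs one input matrix,
# one recurrent matrix, and one bias. LSTM uses 4 sets, GRU 3, MGU 2.
count = lambda gates: gates * (n_hid * n_in + n_hid * n_hid + n_hid)
print(count(4), count(3), count(2))                 # LSTM > GRU > MGU
```

The update equation mirrors GRU's interpolation between the old state and a candidate, but the same gate plays both the reset and update roles.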
Abstract: With the explosion of online communication and publication, texts have become obtainable via forums, chat messages, blogs, book reviews, and movie reviews. These texts are usually very short and noisy, lacking the statistical signal and information needed for good semantic analysis. Traditional natural language processing methods, such as Bag-of-Words (BOW) based probabilistic latent semantic models, fail to achieve high performance in this short-text setting. Recent research has focused on the correlations between words, i.e., term dependencies, which can help mine the latent semantics hidden in short texts and make them easier to understand. A long short-term memory (LSTM) network can capture term dependencies and retain information over long periods. LSTM has been widely used and has obtained promising results on a variety of problems involving the latent semantics of texts. At the same time, by analyzing the texts, we find that a number of keywords contribute greatly to their semantics. In this paper, we establish a keyword vocabulary and propose an LSTM-based model that is sensitive to the words in the vocabulary; hence, the keywords leverage the semantics of the full document. The proposed model is evaluated on a short-text sentiment analysis task with two datasets, IMDB and SemEval-2016. Experimental results demonstrate that our model outperforms the baseline LSTM by 1% to 2% in accuracy and significantly outperforms several non-recurrent neural-network latent semantic models, especially on short texts. We also incorporate the idea into a variant of LSTM, the gated recurrent unit (GRU) model, and achieve good performance, which shows that our method is general enough to improve different deep learning models.
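The keyword-vocabulary idea can be illustrated outside any recurrent model. The paper wires keyword sensitivity into the LSTM itself; the hypothetical sketch below shows only the weighting principle: tokens found in a keyword vocabulary are up-weighted before pooling, so keywords dominate the document representation. The vocabulary, boost factor, and toy embedding function are all illustrative assumptions.

```python
import numpy as np

keywords = {"excellent", "terrible"}   # assumed sentiment keyword vocabulary
boost = 3.0                            # assumed up-weighting factor

def embed(token, dim=4, seed_base=1000):
    """Toy deterministic per-token embedding (stand-in for a learned table)."""
    rng = np.random.default_rng(seed_base + hash(token) % 10_000)
    return rng.normal(size=dim)

def doc_vector(tokens):
    """Weighted mean of token embeddings, with keywords boosted."""
    weights = np.array([boost if t in keywords else 1.0 for t in tokens])
    vecs = np.stack([embed(t) for t in tokens])
    return (weights[:, None] * vecs).sum(0) / weights.sum()

v = doc_vector(["the", "movie", "was", "excellent"])
```

Because "excellent" carries three times the weight of each other token, the pooled vector is pulled toward the keyword's embedding, which is the effect the keyword vocabulary is meant to achieve inside the full model.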
Abstract: To reduce the false-alarm rate of voiceprint recognition algorithms in low signal-to-noise-ratio (SNR) environments, a transformer mechanical-fault voiceprint recognition method based on fast incremental support vector data description (FISVDD) and the gated recurrent unit (GRU) is proposed. Using a power transformer as the experimental subject, sound signals are collected under three conditions (normal operation, loose core, and loose winding), and features are extracted with Mel-frequency cepstral coefficients. FISVDD serves as the first-stage algorithm, separating unknown classes while learning new samples incrementally. The GRU serves as the second-stage classifier, recognizing the samples passed by the first stage. The experimental results show that FISVDD requires less training time than conventional closed-set recognition algorithms; that, compared with traditional machine-learning algorithms, the GRU achieves higher recognition accuracy and stronger noise robustness on transformer audio; and that the proposed method is more effective than a single-stage algorithm, with a recall drop of no more than 1% on samples with SNR above 10 dB and a false-alarm rate of no more than 10% on samples with SNR at or below 0 dB.
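The two-stage cascade described above can be sketched schematically. In this simplified stand-in, a centroid-distance test takes the place of FISVDD's hypersphere boundary for stage-1 novelty rejection, and a nearest-centroid rule takes the place of the GRU classifier in stage 2; the features would be MFCC vectors in the paper, but synthetic 2-D points are used here.

```python
import numpy as np

# Class centroids standing in for models trained on the three known states.
centroids = {"normal": np.array([0.0, 0.0]),
             "core_loose": np.array([5.0, 0.0]),
             "winding_loose": np.array([0.0, 5.0])}
radius = 2.0  # stage-1 acceptance boundary (FISVDD's hypersphere analogue)

def classify(x):
    """Return a known class label, or None if stage 1 rejects x as novel."""
    dists = {c: np.linalg.norm(x - mu) for c, mu in centroids.items()}
    best = min(dists, key=dists.get)
    if dists[best] > radius:        # stage 1: outside every class boundary
        return None                 # treated as an unknown ("stranger") class
    return best                     # stage 2: classify the accepted sample

known = classify(np.array([4.8, 0.3]))   # near the "core_loose" centroid
novel = classify(np.array([8.0, 8.0]))   # far from all centroids
```

The cascade's benefit, as the abstract reports, is that low-SNR or unseen signals are filtered out in stage 1 instead of being forced into one of the known fault classes, which is what drives the false-alarm rate down.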