GRU (Gated Recurrent Unit) is a gating mechanism for recurrent neural networks (RNNs). Like other gating mechanisms, it aims to alleviate the vanishing/exploding gradient problem of standard RNNs and to retain long-term information in a sequence. GRU performs comparably to LSTM on many sequence tasks such as speech recognition, but has fewer parameters: it contains only a reset gate and an update gate.
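For illustration, the following is a minimal NumPy sketch of a single GRU time step, showing how the reset and update gates combine the previous hidden state with a candidate state. The weight names, shapes, and gate convention here are assumptions for the example and are not taken from the distributed script.

# Minimal GRU step sketch (illustrative only; not the original script's code).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, W_z, U_z, b_z, W_r, U_r, b_r, W_h, U_h, b_h):
    """One GRU time step: x_t is the current input, h_prev the previous hidden state."""
    z = sigmoid(W_z @ x_t + U_z @ h_prev + b_z)               # update gate
    r = sigmoid(W_r @ x_t + U_r @ h_prev + b_r)               # reset gate
    h_tilde = np.tanh(W_h @ x_t + U_h @ (r * h_prev) + b_h)   # candidate state
    return (1.0 - z) * h_prev + z * h_tilde                   # interpolate old state and candidate

# Example with random weights: input size 3, hidden size 4.
rng = np.random.default_rng(0)
n_in, n_h = 3, 4
params = [rng.standard_normal(s) for s in
          [(n_h, n_in), (n_h, n_h), (n_h,)] * 3]              # W, U, b for z, r, h
h = gru_step(rng.standard_normal(n_in), np.zeros(n_h), *params)
print(h.shape)  # (4,)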
Installation: Python
Operation mode: open the script file in PyCharm and run it
Input variable: time-series data
Output variable: predicted data / prediction accuracy
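As a usage illustration of the input and output variables above, the sketch below feeds a batch of time series to a GRU and predicts the next value. The framework (PyTorch), layer sizes, and class name are assumptions for the example, since the architecture of the original script is not described here.

# Hedged sketch of GRU-based time-series prediction (PyTorch is assumed).
import torch
import torch.nn as nn

class GRUForecaster(nn.Module):
    def __init__(self, n_features=1, hidden_size=32):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)   # predict the next value

    def forward(self, x):                       # x: (batch, time, features)
        out, _ = self.gru(x)
        return self.head(out[:, -1, :])         # use the last hidden state

# Example: predict the next point of a batch of length-20 univariate series.
model = GRUForecaster()
x = torch.randn(8, 20, 1)                       # batch=8, time steps=20, 1 feature
y_hat = model(x)                                # shape: (8, 1)
print(y_hat.shape)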