GRU (gated recurrent unit) is a gating mechanism for recurrent neural networks (RNNs). Like other gating mechanisms, it aims to mitigate the vanishing/exploding gradient problem of standard RNNs and to retain long-term information across a sequence. GRU performs comparably to LSTM on many sequence tasks, such as speech recognition, but has fewer parameters: it contains only a reset gate and an update gate.
Input variable: time-series data
Output variable: predicted values / prediction accuracy
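To make the two-gate structure concrete, the following is a minimal NumPy sketch of a single GRU step (standard Cho et al. formulation): the update gate z decides how much of the previous hidden state to keep, and the reset gate r decides how much of it to use when forming the candidate state. All weight names (W_z, U_z, etc.) are illustrative, not from the original text.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h_prev, params):
    """One GRU time step. x: input vector, h_prev: previous hidden state."""
    W_z, U_z, b_z, W_r, U_r, b_r, W_h, U_h, b_h = params
    z = sigmoid(W_z @ x + U_z @ h_prev + b_z)              # update gate
    r = sigmoid(W_r @ x + U_r @ h_prev + b_r)              # reset gate
    h_tilde = np.tanh(W_h @ x + U_h @ (r * h_prev) + b_h)  # candidate state
    h = (1 - z) * h_prev + z * h_tilde                     # blend old and new
    return h

# Example: run a short random time series through the cell.
rng = np.random.default_rng(0)
n_in, n_h = 3, 4
params = [rng.standard_normal((n_h, n_in)) * 0.1,  # W_z
          rng.standard_normal((n_h, n_h)) * 0.1,   # U_z
          np.zeros(n_h),                           # b_z
          rng.standard_normal((n_h, n_in)) * 0.1,  # W_r
          rng.standard_normal((n_h, n_h)) * 0.1,   # U_r
          np.zeros(n_h),                           # b_r
          rng.standard_normal((n_h, n_in)) * 0.1,  # W_h
          rng.standard_normal((n_h, n_h)) * 0.1,   # U_h
          np.zeros(n_h)]                           # b_h

h = np.zeros(n_h)
for t in range(5):
    h = gru_cell(rng.standard_normal(n_in), h, params)
print(h.shape)  # (4,)
```

Because the new state is a convex combination of the previous state and a tanh candidate, each hidden unit stays in (-1, 1) when initialized at zero, which helps keep gradients stable over long sequences.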