Welcome

Through incremental integration and independent research and development, we are building a method library for big data quality control, automatic modeling and analysis, data mining, and interactive visualization, and forming a tool library with high reliability, scalability, efficiency and fault tolerance. The goal is to realize the integration and sharing of collaborative analysis methods for multi-source, heterogeneous, multi-granularity, multi-phase, long time-series big data in the three-pole environment, as well as efficient, online big data analysis and processing.

  • Long Short-Term Memory (LSTM)

    LSTM is a type of recurrent neural network (RNN) designed mainly to address the vanishing- and exploding-gradient problems that arise when training on long sequences. In short, compared with an ordinary RNN, an LSTM performs better on longer sequences.


    Installation: Python

    Operation mode:

    Input variable: time series data

    Output variable: predicted data

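    Example: a minimal Python sketch of one-step-ahead prediction with an LSTM, assuming the TensorFlow/Keras library; the sine toy series, window length and layer sizes are illustrative choices, not part of this entry.

      import numpy as np
      from tensorflow import keras

      # Toy univariate series and sliding windows (window length is an assumption).
      series = np.sin(np.linspace(0, 20 * np.pi, 2000)).astype("float32")
      window = 10
      X = np.stack([series[i:i + window] for i in range(len(series) - window)])[..., None]
      y = series[window:]

      model = keras.Sequential([
          keras.Input(shape=(window, 1)),   # 10 time steps, 1 feature
          keras.layers.LSTM(32),            # gated memory cells mitigate vanishing gradients
          keras.layers.Dense(1),            # one-step-ahead forecast
      ])
      model.compile(optimizer="adam", loss="mse")
      model.fit(X, y, epochs=5, batch_size=64, verbose=0)

      print(model.predict(X[-1:]).ravel())  # predicted data for the latest window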

  • Gaussian Process Regression (GPR)

    Gaussian process regression (GPR) is a nonparametric model that places a Gaussian process (GP) prior on the regression function. Its model assumptions cover both the regression residuals and the GP prior, and its solution process is essentially Bayesian inference. If the form of the GP prior is not restricted, GPR is in theory a universal approximator of any objective function. In addition, GPR provides a posterior distribution over its predictions, and this posterior has an analytical form when the regression residuals are independent and identically distributed normal variables. GPR is therefore a probabilistic model that is both general and analytically tractable.

    Installation: Python

    Operation mode:

    Input variable: time series data

    Output variable: predicted time series data

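    Example: a minimal Python sketch of GPR on a noisy one-dimensional series, assuming the scikit-learn implementation; the RBF-plus-noise kernel and the toy data are illustrative choices.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(0)
      X = np.linspace(0, 10, 100)[:, None]                     # time index
      y = np.sin(X).ravel() + 0.1 * rng.standard_normal(100)   # toy noisy observations

      # GP prior with an RBF kernel; the white-noise term models i.i.d. normal residuals.
      kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
      gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
      gpr.fit(X, y)

      X_new = np.linspace(10, 12, 20)[:, None]
      mean, std = gpr.predict(X_new, return_std=True)          # posterior mean and uncertainty
      print(mean[:3], std[:3])

    The returned standard deviation is the analytical posterior uncertainty mentioned in the description above.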

  • Deep Belief Networks (DBN)

    A DBN is a probabilistic generative model. In contrast to traditional discriminative neural networks, a DBN models the joint distribution of the observed data and the labels. By training the weights between its neurons, the whole network learns to generate the training data with maximum probability.


    Installation: MATLAB

    Operation mode:

    Input variable: image

    Output variable: recognized image

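    Example: although this entry lists MATLAB, here is a minimal Python sketch of a DBN-style stack, assuming scikit-learn: two Bernoulli RBMs are pre-trained greedily, layer by layer, and a logistic-regression layer maps the learned features to labels (generative fine-tuning is omitted). The digits dataset and layer sizes are illustrative choices.

      from sklearn.datasets import load_digits
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import BernoulliRBM
      from sklearn.pipeline import Pipeline

      X, y = load_digits(return_X_y=True)
      X = X / 16.0                                     # scale pixels to [0, 1] for Bernoulli units
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

      dbn = Pipeline([
          ("rbm1", BernoulliRBM(n_components=128, learning_rate=0.06, n_iter=15, random_state=0)),
          ("rbm2", BernoulliRBM(n_components=64, learning_rate=0.06, n_iter=15, random_state=0)),
          ("clf", LogisticRegression(max_iter=1000)),  # supervised layer on top of the stack
      ])
      dbn.fit(X_tr, y_tr)                              # RBMs are fitted greedily, layer by layer
      print("test accuracy:", dbn.score(X_te, y_te))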

  • Extreme Learning Machines (ELM)

    ELM is an artificial neural network model from machine learning: a learning algorithm for single-hidden-layer feedforward neural networks. Traditional feedforward networks (such as the BP neural network) require many training parameters to be set manually, whereas ELM only requires the network structure to be specified, with no other parameters to tune, so it is easy to use. The weights from the input layer to the hidden layer are determined randomly once and are not adjusted while the algorithm runs, and the weights from the hidden layer to the output layer are obtained by solving a single system of linear equations, which makes the computation fast.

    Installation: MATLAB

    Operation mode:

    Input variable: time series data

    Output variable: predicted data

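    Example: although this entry lists MATLAB, the algorithm is short enough to sketch directly in Python with NumPy: the input-to-hidden weights are drawn randomly once, and the hidden-to-output weights come from a single least-squares solve. Window length, hidden size and the toy series are illustrative choices.

      import numpy as np

      rng = np.random.default_rng(0)
      series = np.sin(np.linspace(0, 20 * np.pi, 1000))   # toy time series

      window, hidden = 10, 100
      X = np.stack([series[i:i + window] for i in range(len(series) - window)])
      y = series[window:]

      W = rng.standard_normal((window, hidden))           # random input weights, fixed once
      b = rng.standard_normal(hidden)                     # random hidden biases, fixed once
      H = np.tanh(X @ W + b)                              # hidden-layer activations

      beta, *_ = np.linalg.lstsq(H, y, rcond=None)        # output weights: one linear solve

      H_last = np.tanh(series[-window:] @ W + b)
      print("next-step prediction:", H_last @ beta)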

  • Gate Recurrent Unit (GRU)

    The GRU is a gating mechanism for recurrent neural networks (RNNs). Like other gating mechanisms, it aims to solve the vanishing/exploding-gradient problem of the standard RNN and to retain long-term information in a sequence. A GRU matches LSTM performance on many sequence tasks such as speech recognition, but it has fewer parameters, containing only a reset gate and an update gate.


    Installation: Python

    Operation mode:

    Input variable: time series data

    Output variable: predicted data / accuracy

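    Example: a minimal Python sketch, assuming TensorFlow/Keras, that compares the parameter counts of a GRU layer and an LSTM layer of the same width and then trains the GRU model; the layer width and toy series are illustrative choices.

      import numpy as np
      from tensorflow import keras

      def recurrent_model(layer):
          # Small sequence regressor: 10 time steps, 1 feature, one recurrent layer.
          return keras.Sequential([keras.Input(shape=(10, 1)), layer, keras.layers.Dense(1)])

      gru_model = recurrent_model(keras.layers.GRU(32))    # reset gate + update gate only
      lstm_model = recurrent_model(keras.layers.LSTM(32))  # three gates plus a cell state
      print("GRU params:", gru_model.count_params(), "| LSTM params:", lstm_model.count_params())

      # Training works the same way as in the LSTM entry above.
      series = np.sin(np.linspace(0, 20 * np.pi, 500)).astype("float32")
      X = np.stack([series[i:i + 10] for i in range(len(series) - 10)])[..., None]
      y = series[10:]
      gru_model.compile(optimizer="adam", loss="mse")
      gru_model.fit(X, y, epochs=3, batch_size=64, verbose=0)
      print(gru_model.predict(X[-1:]).ravel())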

  • Recurrent Neural Network (RNN)

    An RNN has memory, shares parameters across time steps, and is Turing complete, which gives it advantages in learning the nonlinear characteristics of sequences. RNNs are applied in natural language processing (NLP), for example in speech recognition, language modeling and machine translation, and are also used in various kinds of time-series prediction.


    Installation: MATLAB/Python

    Operation mode:

    Input variable: time series data

    Output variable: training data

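    Example: a minimal Python sketch, assuming TensorFlow/Keras, of a plain (SimpleRNN) recurrent layer used for one-step-ahead prediction; unlike the LSTM and GRU entries above, it has no gates, so its memory of long sequences is limited. The toy series and sizes are illustrative choices.

      import numpy as np
      from tensorflow import keras

      series = np.sin(np.linspace(0, 20 * np.pi, 1000)).astype("float32")
      window = 10
      X = np.stack([series[i:i + window] for i in range(len(series) - window)])[..., None]
      y = series[window:]

      model = keras.Sequential([
          keras.Input(shape=(window, 1)),
          keras.layers.SimpleRNN(32),   # plain recurrence: the hidden state is reused each step
          keras.layers.Dense(1),
      ])
      model.compile(optimizer="adam", loss="mse")
      model.fit(X, y, epochs=5, batch_size=64, verbose=0)
      print(model.predict(X[-1:]).ravel())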