
What is a neural network for processing time series data?


A Recurrent Neural Network (RNN) is a class of recursive neural network that takes sequence data as input, performs recursion along the direction in which the sequence evolves, and has all of its nodes (recurrent units) connected in a chain.
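
As a minimal sketch of this chained recursion (written in NumPy; the layer sizes and function names are illustrative, not from the article), the same weight matrices are reused at every time step while the hidden state carries information forward:

```python
import numpy as np

def rnn_forward(xs, W_xh, W_hh, b_h):
    """Run a vanilla tanh RNN over a sequence.

    xs:   (T, input_dim) input sequence
    W_xh: (input_dim, hidden_dim) input-to-hidden weights
    W_hh: (hidden_dim, hidden_dim) recurrent weights
    b_h:  (hidden_dim,) hidden bias
    Returns the hidden state at every time step.
    """
    T = xs.shape[0]
    h = np.zeros(W_hh.shape[0])       # initial hidden state ("memory")
    hs = []
    for t in range(T):                # recurse along the sequence direction
        # the same parameters are reused at every step (parameter sharing)
        h = np.tanh(xs[t] @ W_xh + h @ W_hh + b_h)
        hs.append(h)
    return np.stack(hs)

# toy usage: a sequence of 5 steps with 3 input features and 4 hidden units
rng = np.random.default_rng(0)
xs = rng.normal(size=(5, 3))
W_xh = rng.normal(scale=0.1, size=(3, 4))
W_hh = rng.normal(scale=0.1, size=(4, 4))
b_h = np.zeros(4)
print(rnn_forward(xs, W_xh, W_hh, b_h).shape)  # (5, 4)
```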


Research on recurrent neural networks began in the 1980s and 1990s, and in the early 21st century they developed into one of the deep learning algorithms. Bidirectional RNNs (Bi-RNN) and Long Short-Term Memory networks (LSTM) are common recurrent neural networks.

Recurrent neural networks have memory, parameter sharing and Turing completeness, and therefore have certain advantages in learning the nonlinear characteristics of a sequence.

Recurrent neural networks are applied in natural language processing (NLP) fields such as speech recognition, language modeling and machine translation, and are also used in various kinds of time series forecasting.

A recurrent neural network constructed by incorporating a convolutional neural network (Convolutional Neural Network, CNN) can handle computer vision problems involving sequence input.
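
One common way to combine the two, sketched below with PyTorch as an assumed framework and purely illustrative layer sizes, is to run a small CNN over each frame of a clip and feed the resulting per-frame feature vectors into an RNN:

```python
import torch
import torch.nn as nn

class CNNRNN(nn.Module):
    """Per-frame CNN features fed into a GRU (illustrative sizes)."""
    def __init__(self, hidden_dim=64, num_classes=10):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),           # -> (batch*T, 16, 1, 1)
        )
        self.rnn = nn.GRU(input_size=16, hidden_size=hidden_dim,
                          batch_first=True)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, video):                  # video: (batch, T, 3, H, W)
        b, t = video.shape[:2]
        frames = video.flatten(0, 1)           # merge batch and time
        feats = self.cnn(frames).flatten(1)    # (batch*T, 16)
        feats = feats.reshape(b, t, -1)        # back to (batch, T, 16)
        out, _ = self.rnn(feats)               # process the frame sequence
        return self.head(out[:, -1])           # classify from the last step

x = torch.randn(2, 8, 3, 32, 32)               # 2 clips of 8 frames each
print(CNNRNN()(x).shape)                       # torch.Size([2, 10])
```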

During the same period in which the Simple Recurrent Network (SRN) appeared, the learning theory of recurrent neural networks also developed. After research on the backpropagation algorithm attracted attention [18], the academic community began to try to train recurrent neural networks within the BP framework.

In 1989, Ronald Williams and David Zipser proposed Real-Time Recurrent Learning (RTRL) for recurrent neural networks. Shortly afterwards, in 1990, Paul Werbos proposed backpropagation through time (BPTT) for recurrent neural networks. RTRL and BPTT are still in use today and remain the main methods for training recurrent neural networks.
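
The following is a minimal NumPy sketch of the BPTT idea rather than Werbos's original formulation: the network is unrolled over the sequence in the forward pass, and the gradient of a loss on the final hidden state is accumulated backwards step by step (the squared-error loss and all names are illustrative choices):

```python
import numpy as np

def bptt_grad_W_hh(xs, y, W_xh, W_hh):
    """BPTT for a tiny tanh RNN.

    Loss = 0.5 * ||h_T - y||^2 on the final hidden state;
    returns dLoss/dW_hh accumulated over all unrolled time steps.
    """
    T = xs.shape[0]
    hs, pre = [np.zeros(W_hh.shape[0])], []
    for t in range(T):                      # forward pass: unroll in time
        a = xs[t] @ W_xh + hs[-1] @ W_hh
        pre.append(a)
        hs.append(np.tanh(a))
    grad = np.zeros_like(W_hh)
    dh = hs[-1] - y                         # dLoss/dh_T
    for t in reversed(range(T)):            # backward pass through time
        da = dh * (1.0 - np.tanh(pre[t]) ** 2)
        grad += np.outer(hs[t], da)         # contribution of step t
        dh = W_hh @ da                      # propagate to the previous step
    return grad

rng = np.random.default_rng(0)
xs = rng.normal(size=(6, 3))
W_xh = rng.normal(scale=0.1, size=(3, 4))
W_hh = rng.normal(scale=0.1, size=(4, 4))
print(bptt_grad_W_hh(xs, np.zeros(4), W_xh, W_hh).shape)  # (4, 4)
```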

In 1991, Sepp Hochreiter identified the long-term dependencies problem of recurrent neural networks: when learning sequences, recurrent neural networks suffer from vanishing gradients and exploding gradients, making it impossible to capture nonlinear relationships over long time spans.
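
The effect can be seen numerically with a toy one-unit linear recurrence (the numbers below are purely illustrative): the gradient of the final state with respect to the initial state contains the factor w**T, which shrinks toward zero when |w| < 1 and blows up when |w| > 1.

```python
# Toy illustration: in a one-unit linear RNN h_t = w * h_{t-1},
# the gradient dh_T/dh_0 is simply w**T, so it vanishes or explodes
# as the time span T grows.
for w in (0.5, 1.5):
    for T in (10, 50, 100):
        print(f"w={w}, T={T:3d}: dh_T/dh_0 = {w ** T:.3e}")
```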

To solve the long-term dependency problem, a large number of optimization techniques have been introduced and many improved algorithms have been derived, including the Neural History Compressor (NHC), Long Short-Term Memory networks (LSTM), Gated Recurrent Unit networks (GRU), echo state networks, Independent RNN (IndRNN), and others.
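
As a rough sketch of how gating helps (a minimal NumPy version of one common GRU formulation; the parameter names and sizes are illustrative), the update gate lets the cell interpolate between its previous state and a candidate state, so information and gradients can pass through many steps relatively unchanged:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h_prev, params):
    """One GRU step (one common formulation; biases omitted for brevity)."""
    W_xz, W_hz, W_xr, W_hr, W_xn, W_hn = params
    z = sigmoid(x @ W_xz + h_prev @ W_hz)          # update gate
    r = sigmoid(x @ W_xr + h_prev @ W_hr)          # reset gate
    n = np.tanh(x @ W_xn + (r * h_prev) @ W_hn)    # candidate state
    return (1.0 - z) * h_prev + z * n              # gated interpolation

rng = np.random.default_rng(0)
din, dh = 3, 4
params = [rng.normal(scale=0.1, size=s)
          for s in [(din, dh), (dh, dh)] * 3]
h = np.zeros(dh)
for x in rng.normal(size=(5, din)):                # run over a short sequence
    h = gru_cell(x, h, params)
print(h.shape)                                     # (4,)
```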
