Recurrent Neural Networks (RNN)


A recurrent neural network (RNN) is a class of artificial neural networks where connections between nodes form a directed graph along a temporal sequence. This allows it to exhibit temporal dynamic behavior. Derived from feedforward neural networks, RNNs can use their internal state (memory) to process variable length sequences of inputs.
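As a rough illustration of that internal state, the sketch below unrolls a single tanh RNN cell over a short sequence with plain NumPy; the weight names (W_xh, W_hh, b_h) and sizes are illustrative and not taken from any particular library.

```python
import numpy as np

# Minimal sketch of a tanh RNN cell unrolled over a sequence.
# Names and shapes are illustrative only.
def rnn_forward(inputs, W_xh, W_hh, b_h):
    """inputs: list of per-time-step input vectors."""
    h = np.zeros(W_hh.shape[0])            # initial hidden state (the "memory")
    states = []
    for x_t in inputs:                      # the same weights are reused at every step
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
        states.append(h)
    return states                           # one hidden state per time step

# Toy usage: 5 random 3-dimensional inputs, hidden size 4.
rng = np.random.default_rng(0)
W_xh, W_hh, b_h = rng.normal(size=(4, 3)), rng.normal(size=(4, 4)), np.zeros(4)
sequence = [rng.normal(size=3) for _ in range(5)]
hidden_states = rnn_forward(sequence, W_xh, W_hh, b_h)
```

Because the same weights are applied at every step, the loop handles sequences of any length, which is what lets RNNs process variable-length inputs.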

Overview

Understanding LSTM Networks
A closer look at the inner workings of LSTM networks.
Tags: recurrent-neural-networks, lstm, tutorial, article
A Visual Guide to Recurrent Layers in Keras
Understand how to use recurrent layers like RNN, GRU, and LSTM in Keras with diagrams (a minimal usage sketch follows this list).
Tags: recurrent-neural-networks, lstm, keras, tensorflow
The Sorcerer’s Apprentice Guide to Training LSTMs
Tricks of the trade for training Long Short-Term Memory networks.
Tags: recurrent-neural-networks, lstm, tips, article
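The Keras entry above maps directly onto code. Here is a minimal sketch, assuming TensorFlow's bundled tf.keras, of the three recurrent layer types applied to the same toy input shape; the sizes are arbitrary.

```python
import tensorflow as tf

# Batches of sequences with 10 time steps and 8 features per step.
inputs = tf.keras.Input(shape=(10, 8))

simple = tf.keras.layers.SimpleRNN(16)(inputs)                      # plain recurrent layer
gru = tf.keras.layers.GRU(16)(inputs)                               # gated recurrent unit
lstm = tf.keras.layers.LSTM(16, return_sequences=True)(inputs)      # keep per-step outputs

model = tf.keras.Model(inputs, [simple, gru, lstm])
model.summary()
```

Setting return_sequences=True keeps the output at every time step rather than only the last one, which is the usual choice when stacking recurrent layers or feeding an attention mechanism.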

Tutorials

Lecture 10 | Recurrent Neural Networks
Discusses the use of recurrent neural networks for modeling sequence data.
Tags: recurrent-neural-networks, gated-recurrent-units, lstm, language-modeling

Libraries

General
Visualizing Memorization in RNNs
Inspecting gradient magnitudes in context can be a powerful tool to see when recurrent units use short-term or long-term contextual understanding.
Tags: interpretability, visualization, recurrent-neural-networks, memorization
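As a rough sketch of the idea behind that entry (not the article's actual implementation), one can measure how strongly a model's output depends on each input time step by taking gradients with respect to the input; the model and shapes below are made up for illustration.

```python
import tensorflow as tf

# Toy model: sequences of 20 steps with 4 features, one scalar output.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(20, 4)),
    tf.keras.layers.LSTM(16),
    tf.keras.layers.Dense(1),
])

x = tf.random.normal((1, 20, 4))            # one example sequence
with tf.GradientTape() as tape:
    tape.watch(x)                           # x is a plain tensor, so watch it explicitly
    y = model(x)
grads = tape.gradient(y, x)                 # same shape as x: (1, 20, 4)

# Per-time-step gradient magnitude: large values at early steps suggest the
# output depends on long-term context; values concentrated near the last
# steps suggest mostly short-term context is being used.
per_step_magnitude = tf.norm(grads[0], axis=-1)
print(per_step_magnitude.numpy())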