In Lecture 10 we discuss the use of recurrent neural networks for modeling sequence data. We show how recurrent neural networks can be used for language modeling and image captioning, and how soft spatial attention can be incorporated into image captioning models. We discuss different architectures for recurrent neural networks, including Long Short-Term Memory (LSTM) and Gated Recurrent Units (GRU).
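To make the gating in these architectures concrete, here is a minimal NumPy sketch of a single LSTM step and a single GRU step. The parameter names (`Wx`, `Wh`, `b`) and the concatenated-gate weight layout are illustrative assumptions, not the lecture's notation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, Wx, Wh, b):
    """One LSTM step. x: (D,), h, c: (H,), Wx: (D, 4H), Wh: (H, 4H), b: (4H,)."""
    H = h.shape[0]
    a = x @ Wx + h @ Wh + b              # pre-activations for all four gates
    i = sigmoid(a[0:H])                  # input gate
    f = sigmoid(a[H:2*H])                # forget gate
    o = sigmoid(a[2*H:3*H])              # output gate
    g = np.tanh(a[3*H:4*H])              # candidate cell update
    c_next = f * c + i * g               # new cell state
    h_next = o * np.tanh(c_next)         # new hidden state
    return h_next, c_next

def gru_step(x, h, Wx, Wh, b):
    """One GRU step. x: (D,), h: (H,), Wx: (D, 3H), Wh: (H, 3H), b: (3H,)."""
    H = h.shape[0]
    z = sigmoid(x @ Wx[:, :H] + h @ Wh[:, :H] + b[:H])              # update gate
    r = sigmoid(x @ Wx[:, H:2*H] + h @ Wh[:, H:2*H] + b[H:2*H])     # reset gate
    h_tilde = np.tanh(x @ Wx[:, 2*H:] + (r * h) @ Wh[:, 2*H:] + b[2*H:])
    return (1 - z) * h + z * h_tilde     # interpolate between old and candidate state
```

A full recurrent layer would simply loop one of these step functions over the time dimension of a sequence, carrying `h` (and `c` for the LSTM) from one step to the next.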


Similar projects
Recurrent Neural Networks: building GRU cells VS LSTM cells
What are the advantages of RNNs over transformers? When should you use GRUs rather than LSTMs? What do the equations of a GRU really mean? How to build a GRU cell in ...
From GRU to Transformer
How recurrent units and self-attention are related to each other.
Character-level language model RNN
This is a commentary on the min-char language model by [@karpathy](https://twitter.com/karpathy).
Attention? Attention!
In this post, we look at how attention was invented, along with various attention mechanisms and models, such as the Transformer and SNAIL.