Character-level language model RNN
This is a commentary on the min-char language model by [@karpathy](https://twitter.com/karpathy).
Tags: recurrent-neural-networks, article, tutorial

@karpathy has a gift for building intuitive models. His famous blog post on recurrent neural networks, The Unreasonable Effectiveness of Recurrent Neural Networks, is one of the best there is. While explaining RNNs, he points readers to his well-known gist implementing a character-level language model. I have gone a little further and added commentary to the code; deriving the losses step by step provides further intuition into what the code is doing.
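The core of that gist is a plain-NumPy forward pass: each character is one-hot encoded, a tanh hidden state is updated, and a softmax cross-entropy loss is accumulated against the next-character targets. The sketch below illustrates that shape of computation only; the toy string, hidden size, and random weights are my own assumptions, and this is not Karpathy's exact code.

```python
import numpy as np

# Minimal sketch of one forward pass through a character-level RNN,
# in the spirit of min-char-rnn. All sizes and data here are illustrative.
np.random.seed(0)

data = "hello world"
chars = sorted(set(data))
vocab_size = len(chars)
char_to_ix = {c: i for i, c in enumerate(chars)}

hidden_size = 16
Wxh = np.random.randn(hidden_size, vocab_size) * 0.01   # input -> hidden
Whh = np.random.randn(hidden_size, hidden_size) * 0.01  # hidden -> hidden
Why = np.random.randn(vocab_size, hidden_size) * 0.01   # hidden -> output
bh = np.zeros((hidden_size, 1))
by = np.zeros((vocab_size, 1))

inputs = [char_to_ix[c] for c in data[:-1]]   # characters fed in
targets = [char_to_ix[c] for c in data[1:]]   # next character at each step

h = np.zeros((hidden_size, 1))
loss = 0.0
for t in range(len(inputs)):
    x = np.zeros((vocab_size, 1))
    x[inputs[t]] = 1                            # one-hot encode current char
    h = np.tanh(Wxh @ x + Whh @ h + bh)         # recurrent hidden-state update
    y = Why @ h + by                            # unnormalised next-char scores
    p = np.exp(y) / np.sum(np.exp(y))           # softmax probabilities
    loss += -np.log(p[targets[t], 0])           # cross-entropy for this step

print(f"total loss over {len(inputs)} steps: {loss:.4f}")
```

In min-char-rnn, a loop like this also caches the intermediate states so the gradient of the loss can be backpropagated through time, which is exactly where the loss derivations mentioned above come in.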


Author's original post
Similar projects
- Visualizing Memorization in RNNs: Inspecting gradient magnitudes in context can be a powerful tool to see when recurrent units use short-term or long-term contextual understanding.
- The Unreasonable Effectiveness of Recurrent Neural Networks: A close look at how RNNs are able to perform so well.
- Named Entity Recognition Tagging: An example from Natural Language Processing showing how to load text data and perform NER tagging for each token.
- Screenshot to Code: Turning design mockups into code with deep learning.