Last year, I took a course at the Johannes Kepler University in Linz, Austria on the topic of Recurrent Neural Networks and Long Short-Term Memory Networks. There, Sepp Hochreiter shared some of the “magic tricks” he and his team employ for training LSTMs. This blog post is a compilation of some of my notes from that course.
For this post, I assume you are already familiar with LSTMs. If not, I suggest you begin with Chris Olah’s Understanding LSTM Networks and then go on to read the original LSTM paper.