Christopher Olah

I want to understand things clearly and explain them well. @openai, formerly @brain-research.

Top projects

Understanding LSTM Networks
A closer look at the inner workings of LSTM networks.
recurrent-neural-networks lstm tutorial article
Feature Visualization
How neural networks build up their understanding of images.
features interpretability feature-visualization distill-pub
The Building Blocks of Interpretability
We explore the powerful interfaces that arise when interpretability techniques are combined, and the rich structure of this combinatorial space.
interpretability neural-networks deep-learning distill-pub
Exploring Neural Networks with Activation Atlases
An explorable activation atlas of features the network has learned, revealing how the network typically represents some concepts.
interactive activations convolutional-neural-networks article

Top collections