Objectives & Highlights

• JAX = XLA + Autograd
• SymJAX = JAX + symbolic programming + deep learning
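The "JAX = XLA + Autograd" summary can be illustrated in a few lines: `jax.grad` supplies the Autograd half (transforming a function into its derivative), and `jax.jit` supplies the XLA half (compiling the function to fused accelerator code). A minimal sketch, assuming JAX is installed:

```python
import jax
import jax.numpy as jnp

# Autograd half: grad transforms a Python function into its gradient function.
def f(x):
    return jnp.sum(x ** 2)

df = jax.grad(f)  # df(x) computes 2*x elementwise

# XLA half: jit compiles the differentiated function through XLA.
fast_df = jax.jit(df)

x = jnp.array([1.0, 2.0, 3.0])
print(fast_df(x))  # gradient of sum(x^2) is 2*x
```

Both transformations compose freely, which is the core design point of JAX: `jit(grad(f))` is as valid as `grad(jit(f))`.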


Similar projects
Getting started with JAX (MLPs, CNNs & RNNs)
Learn the building blocks of JAX and use them to build some standard deep learning architectures (MLP, CNN, RNN, etc.).
Convoluted Stuff
Optimising compilers, and how thousand-year-old math shaped deep learning.
Finetuning Transformers with JAX + Haiku
Walking through a port of the RoBERTa pre-trained model to JAX + Haiku, then fine-tuning the model to solve a downstream task.
From PyTorch to JAX
Towards neural net frameworks that purify stateful code.