You don't know JAX
In this tutorial, we'll cover each of these transformations in turn by demonstrating their use on one of the core problems of AGI: learning the Exclusive OR (XOR) function.
Tags: jax, autograd, xla
Objectives & Highlights

• Compute the derivative of a function via a successor to autograd.
• Just-in-time compile a function to run efficiently on an accelerator via XLA.
• Automagically vectorize a function, so that e.g. you can process a "batch" of data in parallel.
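A minimal sketch of the three transformations on a toy function (the function here is illustrative, not taken from the tutorial itself):

```python
import jax
import jax.numpy as jnp

def f(x):
    """A simple scalar function: f(x) = x^2 + 3x."""
    return x ** 2 + 3.0 * x

# grad: differentiate f; analytically, f'(x) = 2x + 3, so f'(2) = 7.
df = jax.grad(f)

# jit: compile f with XLA so repeated calls run efficiently.
fast_f = jax.jit(f)

# vmap: vectorize f over a leading batch axis without an explicit loop.
batched_f = jax.vmap(f)

print(df(2.0))                      # 7.0
print(fast_f(2.0))                  # 10.0
print(batched_f(jnp.arange(3.0)))   # [0., 4., 10.]
```

The three transformations compose freely, e.g. `jax.jit(jax.vmap(jax.grad(f)))` compiles a batched derivative.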


Similar projects

• Getting started with JAX (MLPs, CNNs & RNNs) — Learn the building blocks of JAX and use them to build some standard Deep Learning architectures (MLP, CNN, RNN, etc.).
• From PyTorch to JAX — Towards neural net frameworks that purify stateful code.
• Using JAX to Improve Separable Image Filters — Optimizing separable filters to improve filtered images for computer vision tasks.
• Flax: Google's Open Source Approach To Flexibility In ML — A gentle introduction to Flax: a neural network library for JAX that is designed for flexibility.