Madison May

Machine Learning Architect at @IndicoDataSolutions

Top projects

Finetuning Transformers with JAX + Haiku
Walking through a port of the RoBERTa pre-trained model to JAX + Haiku, then fine-tuning the model to solve a downstream task.
jax haiku roberta transformers
Finetune: Scikit-learn Style Model Finetuning for NLP
Finetune is a library that allows users to leverage state-of-the-art pretrained NLP models for a wide variety of downstream tasks.
natural-language-processing finetuning pretraining transformers
A Survey of Methods for Model Compression in NLP
A look at model compression techniques applied on top of model pre-training to reduce the computational cost of prediction.
model-compression pruning knowledge-distillation precision-reduction

Top collections