KD Lib
A PyTorch library to easily facilitate knowledge distillation for custom deep learning models.
knowledge-distillation model-compression pytorch code
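Knowledge distillation, the technique these libraries implement, trains a small student model to match a large teacher's softened output distribution. As a rough illustration (not KD Lib's actual API), the standard Hinton-style loss can be sketched in plain PyTorch; the function name, temperature `T`, and weighting `alpha` are illustrative choices:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft-target term: KL divergence between temperature-softened
    # student and teacher distributions, scaled by T^2 to keep
    # gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

The student is then trained on this combined loss while the teacher's weights stay frozen.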
A Survey of Methods for Model Compression in NLP
A look at model compression techniques applied to base model pre-training to reduce the computational cost of prediction.
model-compression pruning knowledge-distillation precision-reduction
How to make your network smaller and faster using the fastai library.
model-compression fastai knowledge-distillation sparsifying
AquVitae: The Easiest Knowledge Distillation Library
AquVitae is a Python library that makes knowledge distillation easy to perform through a very simple API. It supports both TensorFlow and PyTorch.
tensorflow pytorch light-weight deep-learning
Distilling Inductive Biases
The power of knowledge distillation for transferring the effect of inductive biases from one model to another.
inductive-bias knowledge-distillation model-compression research
Knowledge Transfer in Self Supervised Learning
A general framework to transfer knowledge from deep self-supervised models to shallow task-specific models.
self-supervised-learning knowledge-distillation model-compression article
Compression of Deep Learning Models for Text: A Survey
In this survey, we discuss six different types of methods for compressing large NLP models to enable their deployment in real industry NLP projects.
pruning quantization knowledge-distillation parameter-sharing
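Of the compression families this survey covers, pruning is the simplest to illustrate. A minimal sketch of unstructured magnitude pruning (the function and its `sparsity` parameter are illustrative, not from the survey): zero out the smallest-magnitude weights so the tensor can be stored or computed sparsely.

```python
import torch

def magnitude_prune(weight, sparsity=0.5):
    # Zero out the smallest-magnitude entries of a weight tensor.
    # `sparsity` is the fraction of weights to remove.
    k = int(weight.numel() * sparsity)
    if k == 0:
        return weight.clone()
    # k-th smallest absolute value serves as the pruning threshold.
    threshold = weight.abs().flatten().kthvalue(k).values
    mask = weight.abs() > threshold
    return weight * mask
```

In practice this is applied iteratively with fine-tuning between pruning rounds, since a single aggressive pruning pass usually hurts accuracy.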
All The Ways You Can Compress BERT
In this post, I’ll list and briefly taxonomize all the papers I’ve seen that compress BERT.
bert pruning weight-factorization compression
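One of the tagged techniques, weight factorization, replaces a large weight matrix with the product of two thin matrices. A hedged sketch using truncated SVD (the helper name and `rank` parameter are illustrative, not from the post):

```python
import torch

def factorize_linear(weight, rank):
    # Approximate a (out x in) weight matrix W as A @ B, where
    # A is (out x rank) and B is (rank x in), via truncated SVD.
    U, S, Vh = torch.linalg.svd(weight, full_matrices=False)
    A = U[:, :rank] * S[:rank]  # absorb singular values into A
    B = Vh[:rank, :]
    return A, B
```

For rank r much smaller than the matrix dimensions, storing A and B needs r * (out + in) parameters instead of out * in, which is where the compression comes from.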