Collection of QA projects.
Transfer Learning with T5: the Text-To-Text Transfer Transformer
In this paper, we demonstrate how to achieve state-of-the-art results on multiple NLP tasks using a text-to-text transformer pre-trained on a large text ...
Differentiable Reasoning over Text
We consider the task of answering complex multi-hop questions using a corpus as a virtual knowledge base (KB).
BART version of closed-book QA
A BART-based sequence-to-sequence model for open-domain question answering in a closed-book setting, built with PyTorch and Hugging Face's Transformers.
Zero-shot Neural Retrieval via Domain-targeted Synthetic Queries
Zero-shot learning for ad-hoc retrieval models that relies on synthetic query generation.