Differentiable Reasoning over Text
We consider the task of answering complex multi-hop questions using a corpus as a virtual knowledge base (KB).
Tags: question-answering, multi-hop reasoning, entity-linking, natural-language-processing, tutorial, research, article, code, paper, arxiv:2002.10640

In particular, we describe a neural module, DrKIT, that traverses textual data like a KB, softly following paths of relations between mentions of entities in the corpus. At each step the module uses a combination of sparse-matrix TFIDF indices and a maximum inner product search (MIPS) over a special index of contextual representations of the mentions. The module is differentiable, so the full system can be trained end-to-end using gradient-based methods, starting from natural language inputs.
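To make the traversal concrete, here is a minimal NumPy/SciPy sketch of a single hop. It is not the released implementation: the names (`drkit_hop`, `ent2men_tfidf`, `men2ent`, `query_vec`) are illustrative assumptions, the relation vector is assumed to come from some question encoder, and the MIPS step is done with exact inner products rather than an approximate search index.

```python
import numpy as np
from scipy.sparse import csr_matrix  # expected type of the sparse entity/mention maps


def drkit_hop(entity_probs, ent2men_tfidf, mention_embs, men2ent, query_vec, top_k=10):
    """One soft 'hop' (illustrative sketch, not the authors' code).

    entity_probs  : (E,)   soft weights over the current set of entities
    ent2men_tfidf : (E, M) sparse TFIDF entity-to-mention co-occurrence matrix
    mention_embs  : (M, d) dense contextual mention embeddings
    men2ent       : (M, E) sparse mention-to-entity (coreference) map
    query_vec     : (d,)   relation representation derived from the question
    """
    # Sparse expansion: TFIDF weight of each mention given the current entities.
    mention_tfidf = ent2men_tfidf.T @ entity_probs          # (M,)

    # Relevance of each mention to the queried relation; exact inner products
    # here, whereas a scalable system would use an approximate MIPS index.
    mention_rel = mention_embs @ query_vec                  # (M,)

    # Combine the two signals and keep only the top-k mentions for this hop.
    scores = mention_tfidf * np.maximum(mention_rel, 0.0)
    k = min(top_k, scores.size)
    keep = np.argsort(-scores)[:k]
    pruned = np.zeros_like(scores)
    pruned[keep] = scores[keep]

    # Fold mention scores back into entity space for the next hop.
    next_probs = men2ent.T @ pruned                          # (E,)
    total = next_probs.sum()
    return next_probs / total if total > 0 else next_probs
```

Stacking this hop several times, with a fresh query vector per step, gives the multi-hop behaviour described above; in the full system the analogous operations are differentiable (sparse) matrix products, so the answer loss can be backpropagated into the question encoder, while the top-k pruning only serves to keep each hop sparse.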


Authors
@bdhingra, PhD student at the Language Technologies Institute, Carnegie Mellon University