

Poster

Building Dynamic Knowledge Graphs from Text using Machine Reading Comprehension

Rajarshi Das · Tsendsuren Munkhdalai · Eric Yuan · Adam Trischler · Andrew McCallum

Great Hall BC #40

Keywords: [ machine reading comprehension ] [ entity state tracking ] [ dynamic knowledge base construction ] [ recurrent graph networks ]


Abstract:

We propose a neural machine-reading model that constructs dynamic knowledge graphs from procedural text. It builds these graphs recurrently for each step of the described procedure and uses them to track the evolving states of participant entities. We harness and extend a recently proposed machine reading comprehension (MRC) model to query for entity states, since these states are generally communicated in spans of text and MRC models perform well at extracting entity-centric spans. The explicit, structured, and evolving knowledge graph representations that our model constructs can be used in downstream question answering tasks to improve machine comprehension of text, as we demonstrate empirically. On two comprehension tasks from the recently proposed ProPara dataset, our model achieves state-of-the-art results. We further show that our model is competitive on the Recipes dataset, suggesting it may be generally applicable.
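To make the high-level recipe in the abstract concrete, the sketch below illustrates (in plain Python) the recurrent read-and-query loop it describes: after reading each step of a procedure, an MRC-style reader is queried for each participant entity's state span, and the answers are recorded as a per-step knowledge-graph snapshot. This is not the authors' implementation; `answer_span`, the question template, and the dictionary-based graph are hypothetical stand-ins for the trained reader and recurrent graph module.

```python
# Minimal sketch (assumed, not the paper's code) of building dynamic knowledge
# graphs from procedural text by querying an MRC-style reader at each step.

from typing import Dict, List


def answer_span(context: str, question: str) -> str:
    """Hypothetical MRC reader: return the text span answering `question`.

    A real system would use a trained span-extraction model; this placeholder
    only keeps the control flow runnable.
    """
    return "unknown"


def build_dynamic_kg(steps: List[str], entities: List[str]) -> List[Dict[str, str]]:
    """Return one knowledge-graph snapshot per procedure step.

    Each snapshot maps an entity to the span describing its state/location
    after reading the procedure up to and including that step.
    """
    graphs: List[Dict[str, str]] = []
    context = ""
    state: Dict[str, str] = {entity: "unknown" for entity in entities}
    for step in steps:
        context = (context + " " + step).strip()  # text read so far
        for entity in entities:
            # Query the reader for the entity's current state, mirroring the
            # paper's use of MRC to extract entity-centric spans.
            state[entity] = answer_span(context, f"Where is {entity}?")
        graphs.append(dict(state))  # snapshot the evolving graph at this step
    return graphs


if __name__ == "__main__":
    steps = ["Roots absorb water from the soil.", "Water travels up the stem."]
    print(build_dynamic_kg(steps, entities=["water"]))
```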
