
Contextualized Word Embeddings Enhanced Event Temporal Relation Extraction for Story Understanding

2019-04-26
Rujun Han, Mengyue Liang, Bashar Alhafni, Nanyun Peng

Abstract

Learning causal and temporal relationships between events is an important step towards deeper story and commonsense understanding. Though there are abundant datasets annotated with event relations for story comprehension, many have no empirical results associated with them. In this work, we establish strong baselines for event temporal relation extraction on two under-explored story narrative datasets: Richer Event Description (RED) and Causal and Temporal Relation Scheme (CaTeRS). To the best of our knowledge, these are the first results reported on these two datasets. We demonstrate that neural network-based models can outperform some strong traditional linguistic feature-based models. We also conduct comparative studies to show the contribution of adopting contextualized word embeddings (BERT) for event temporal relation extraction from stories. Detailed analyses are offered to better understand the results.
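The abstract describes classifying the temporal relation between a pair of events using contextualized word embeddings. As a hedged illustration only (not the authors' implementation), a pairwise system typically encodes each event token in its sentence context and feeds the concatenated pair of vectors to a classifier. The sketch below uses a random stub encoder in place of BERT, and the function names (`encode_context`, `classify_pair`) and the label set are assumptions, not taken from the paper.

```python
import numpy as np

LABELS = ["BEFORE", "AFTER", "OVERLAP", "VAGUE"]  # hypothetical relation label set
DIM = 16  # toy embedding dimension

def encode_context(tokens, rng):
    # Stub standing in for a contextual encoder such as BERT: returns one
    # vector per token. A real system would run the whole sentence through
    # the encoder so each event's vector depends on its context.
    return rng.normal(size=(len(tokens), DIM))

def classify_pair(token_vecs, i, j, W, b):
    # Concatenate the two event vectors, apply a linear layer, and take
    # the argmax over relation labels (an untrained toy classifier here).
    x = np.concatenate([token_vecs[i], token_vecs[j]])
    logits = W @ x + b
    return LABELS[int(np.argmax(logits))]

rng = np.random.default_rng(0)
tokens = "He fell after the dog barked".split()
vecs = encode_context(tokens, rng)
W = rng.normal(size=(len(LABELS), 2 * DIM))
b = np.zeros(len(LABELS))
# Relation between the events "fell" (index 1) and "barked" (index 5):
print(classify_pair(vecs, 1, 5, W, b))
```

With trained weights and a real encoder, the same interface would output a temporal relation for each candidate event pair in a story.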

URL

http://arxiv.org/abs/1904.11942

PDF

http://arxiv.org/pdf/1904.11942
