
Improving generation quality of pointer networks via guided attention

2019-01-20
Kushal Chawla, Kundan Krishna, Balaji Vasan Srinivasan

Abstract

Pointer-generator networks have been used successfully for abstractive summarization. Along with the capability to generate novel words, the framework also allows the model to copy from the input text to handle out-of-vocabulary words. In this paper, we point out two key shortcomings of the summaries generated with this framework via manual inspection, statistical analysis and human evaluation. The first shortcoming is the extractive nature of the generated summaries, since the network eventually learns to copy from the input article most of the time, affecting the abstractive nature of the generated summaries. The second shortcoming is the presence of factual inaccuracies in the generated text despite grammatical correctness. Our analysis indicates that this arises due to incorrect attention transition between different parts of the article. We propose an initial attempt towards addressing both these shortcomings by externally appending traditional linguistic information parsed from the input text, thereby teaching the network about the structure of the underlying text. Results indicate the feasibility and potential of such additional cues for improved generation.
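
For readers unfamiliar with the framework, the decoder in a pointer-generator network mixes a generation distribution over a fixed vocabulary with a copy distribution given by the attention weights over the source article; this mixing is what lets it emit out-of-vocabulary words and is also why it can drift toward purely extractive behavior. The following is a minimal PyTorch sketch of that mixing step only, not the paper's implementation; all function and variable names are illustrative assumptions.

    # Sketch of the pointer-generator mixing step (See et al., 2017 style).
    # Names, shapes, and sizes are assumptions for illustration only.
    import torch
    import torch.nn.functional as F

    def final_distribution(vocab_logits, attention, p_gen, src_ids, extended_vocab_size):
        """
        vocab_logits:        (batch, vocab_size)  decoder scores over the fixed vocabulary
        attention:           (batch, src_len)     attention weights over source tokens
        p_gen:               (batch, 1)           soft switch between generating and copying
        src_ids:             (batch, src_len)     source token ids in the extended vocabulary
        extended_vocab_size: fixed vocabulary plus per-article OOV words from the source
        """
        vocab_dist = p_gen * F.softmax(vocab_logits, dim=-1)
        copy_dist = (1.0 - p_gen) * attention

        # Pad the generation distribution so article-specific OOV words get slots too.
        batch, vocab_size = vocab_dist.shape
        extra = torch.zeros(batch, extended_vocab_size - vocab_size)
        final = torch.cat([vocab_dist, extra], dim=-1)

        # Scatter-add copy probabilities onto the positions of the source tokens;
        # this is the copy mechanism that handles out-of-vocabulary words.
        return final.scatter_add(1, src_ids, copy_dist)

    # Toy usage with hypothetical sizes: vocabulary of 10, 4 source tokens, 2 article OOVs.
    vocab_logits = torch.randn(1, 10)
    attention = F.softmax(torch.randn(1, 4), dim=-1)
    p_gen = torch.sigmoid(torch.randn(1, 1))
    src_ids = torch.tensor([[3, 10, 11, 3]])  # ids 10 and 11 are article-specific OOV slots
    dist = final_distribution(vocab_logits, attention, p_gen, src_ids, extended_vocab_size=12)
    print(dist.sum())  # ~1.0: a valid distribution over the vocabulary plus copied OOV words

When p_gen stays close to zero, the output mass concentrates on the copy term, which matches the extractive behavior the paper analyzes.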

URL

http://arxiv.org/abs/1901.11492

PDF

http://arxiv.org/pdf/1901.11492

