
Exact Hard Monotonic Attention for Character-Level Transduction

2019-05-15
Shijie Wu, Ryan Cotterell

Abstract

Many common character-level, string-to-string transduction tasks, e.g., grapheme-to-phoneme conversion and morphological inflection, consist almost exclusively of monotonic transduction. Neural sequence-to-sequence models with soft attention, which are non-monotonic, often outperform popular monotonic models. In this work, we ask the following question: Is monotonicity really a helpful inductive bias in these tasks? We develop a hard attention sequence-to-sequence model that enforces strict monotonicity and learns the alignment jointly. With the help of dynamic programming, we are able to compute the exact marginalization over all monotonic alignments. Our models achieve state-of-the-art performance on morphological inflection. Furthermore, we find strong performance on two other character-level transduction tasks.
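
The abstract's key computational idea is that, once hard alignments are restricted to be monotonic, the sum over exponentially many alignments can be computed exactly with a forward-algorithm dynamic program. The sketch below illustrates that marginalization only; it is not the authors' implementation, and the names `log_init`, `log_trans`, `log_emit` and the toy scores are assumptions for illustration.

```python
import numpy as np
from scipy.special import logsumexp

def log_marginal_monotonic(log_init, log_trans, log_emit):
    """Forward-algorithm marginalization over hard monotonic alignments (sketch).

    log_init : (T_x,)      log p(a_1 = i)                 initial alignment
    log_trans: (T_x, T_x)  log p(a_j = i | a_{j-1} = i'), entries with i < i'
                           set to -inf so the alignment never moves left
    log_emit : (T_y, T_x)  log p(y_j | y_<j, x, a_j = i)  per-step emission

    Returns log p(y | x), summed over all monotonic alignments.
    """
    T_y, T_x = log_emit.shape
    # alpha[i]: log prob of the first j target symbols with y_j aligned to x_i
    alpha = log_init + log_emit[0]
    for j in range(1, T_y):
        # sum over previous positions i' <= i (monotonicity lives in log_trans)
        alpha = logsumexp(alpha[:, None] + log_trans, axis=0) + log_emit[j]
    return logsumexp(alpha)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T_x, T_y = 5, 4
    # toy (assumed) emission and initial-alignment distributions
    log_emit = np.log(rng.dirichlet(np.ones(T_x), size=T_y))
    log_init = np.log(rng.dirichlet(np.ones(T_x)))
    # monotonic transition matrix: only moves with i >= i' are allowed
    trans = np.triu(rng.random((T_x, T_x)))
    trans /= trans.sum(axis=1, keepdims=True)
    log_trans = np.where(trans > 0, np.log(np.maximum(trans, 1e-300)), -np.inf)
    print("log p(y|x) =", log_marginal_monotonic(log_init, log_trans, log_emit))
```

The recursion costs O(T_y * T_x^2) time, which is what makes the exact marginalization tractable compared with enumerating alignments explicitly.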

URL

http://arxiv.org/abs/1905.06319

PDF

http://arxiv.org/pdf/1905.06319

