
Towards Two-Dimensional Sequence to Sequence Model in Neural Machine Translation

2018-10-09
Parnia Bahar, Christopher Brix, Hermann Ney

Abstract

This work investigates an alternative model for neural machine translation (NMT) and proposes a novel architecture that employs a multi-dimensional long short-term memory (MDLSTM) for translation modeling. In state-of-the-art methods, source and target sentences are treated as one-dimensional sequences over time, whereas we view translation as a two-dimensional (2D) mapping and use an MDLSTM layer to define the correspondence between source and target words. We extend the current sequence-to-sequence backbone of NMT models to a 2D structure in which the source and target sentences are aligned with each other in a 2D grid. The proposed topology shows consistent improvements over the attention-based sequence-to-sequence baseline on two WMT 2017 tasks, German↔English.
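To make the 2D idea concrete, below is a minimal sketch (not the authors' code) of an MDLSTM cell scanned over a source×target grid, where the state at position (i, j) depends on its horizontal and vertical predecessors. The two-forget-gate formulation, the input construction (concatenated source and target embeddings), and all names are illustrative assumptions based only on the abstract.

```python
# Hypothetical sketch of a 2D-LSTM scan over a source x target grid,
# loosely following the idea described in the abstract.
import torch
import torch.nn as nn


class MDLSTM2DCell(nn.Module):
    """One 2D-LSTM cell: state at (i, j) depends on (i-1, j) and (i, j-1)."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # Gates: input, one forget gate per axis, output, candidate -> 5 blocks.
        self.linear = nn.Linear(input_size + 2 * hidden_size, 5 * hidden_size)
        self.hidden_size = hidden_size

    def forward(self, x, h_left, c_left, h_up, c_up):
        z = self.linear(torch.cat([x, h_left, h_up], dim=-1))
        i, f_left, f_up, o, g = z.chunk(5, dim=-1)
        # Two forget gates merge the horizontal and vertical cell states
        # (one common MDLSTM variant; the paper may use a different merge).
        c = (torch.sigmoid(f_left) * c_left
             + torch.sigmoid(f_up) * c_up
             + torch.sigmoid(i) * torch.tanh(g))
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c


def scan_grid(cell, src_emb, tgt_emb):
    """Run the cell over every (target i, source j) grid position.

    src_emb: (J, E) source embeddings; tgt_emb: (I, E) target embeddings.
    Returns the (I, J, H) grid of hidden states.
    """
    I, J, H = tgt_emb.size(0), src_emb.size(0), cell.hidden_size
    # Row/column 0 holds the zero-initialized boundary states.
    h = src_emb.new_zeros(I + 1, J + 1, H)
    c = src_emb.new_zeros(I + 1, J + 1, H)
    for i in range(1, I + 1):
        for j in range(1, J + 1):
            # Input at (i, j): target word i paired with source word j.
            x = torch.cat([tgt_emb[i - 1], src_emb[j - 1]], dim=-1)
            h[i, j], c[i, j] = cell(x, h[i, j - 1], c[i, j - 1],
                                    h[i - 1, j], c[i - 1, j])
    return h[1:, 1:]


# Example: a 6-word source sentence against a 5-word target prefix.
cell = MDLSTM2DCell(input_size=2 * 32, hidden_size=64)
grid = scan_grid(cell, torch.randn(6, 32), torch.randn(5, 32))
print(grid.shape)  # torch.Size([5, 6, 64])
```

Unlike an attention layer, which computes a soft alignment on top of two 1D encoders, this scan lets every grid state summarize the full source prefix and target prefix jointly; a decoder could then read, say, the last column of the grid at each target step.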

URL

https://arxiv.org/abs/1810.03975

PDF

https://arxiv.org/pdf/1810.03975

