Multilingual NMT with a language-independent attention bridge

Authors: Juan Raul Vazquez Carrillo, Alessandro Raganato, Jörg Tiedemann, Mathias Creutz
Deposited: 2019-08-15

Citation: Vazquez Carrillo, J R, Raganato, A, Tiedemann, J & Creutz, M 2019, 'Multilingual NMT with a language-independent attention bridge', in I Augenstein, S Gella, S Ruder, K Kann, B Can, J Welbl, A Conneau, X Ren & M Rei (eds), The 4th Workshop on Representation Learning for NLP (RepL4NLP-2019): Proceedings of the Workshop, The Association for Computational Linguistics, Stroudsburg, pp. 33-39. Workshop on Representation Learning for NLP, Florence, Italy, 02/08/2019.

Type: conference contribution
ORCID: /0000-0003-3065-7989/work/60613525
ORCID: /0000-0003-1862-4172/work/60613552
ORCID: /0009-0004-6394-4225/work/160339455
Handle: http://hdl.handle.net/10138/304660

Abstract: In this paper, we propose a multilingual encoder-decoder architecture that obtains multilingual sentence representations by incorporating an intermediate "attention bridge" shared across all languages. That is, we train the model with language-specific encoders and decoders that are connected, via self-attention, through a shared layer that we call the attention bridge. This layer exploits the semantics of each language to perform translation and develops into a language-independent meaning representation that can be used efficiently for transfer learning. We present a new framework for the efficient development of multilingual NMT using this model and scheduled training, and we have tested the approach systematically on a multi-parallel data set.
We show that the model achieves substantial improvements over strong bilingual models and that it also works well for zero-shot translation, which demonstrates its capacity for abstraction and transfer learning.

Language: English
License: CC BY (open access)
Keywords: Languages; Computer and information sciences; Natural language processing; Multilingual machine translation
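The abstract describes a shared layer that attends over the states of any language-specific encoder and pools them into a fixed-size, language-independent representation. As a rough illustration of that idea, the following is a minimal numpy sketch of such a pooling layer: a fixed set of learned queries attends over a variable-length sequence of encoder states and always returns the same number of output vectors. The class name `AttentionBridge`, the parameter shapes, and the single-head tanh-scoring form are assumptions for illustration, not the paper's exact architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class AttentionBridge:
    """Illustrative shared attention layer (names/shapes are assumptions).

    Pools a variable-length matrix of encoder states into a fixed number
    of "bridge" vectors, so decoders for any language see the same shape.
    """
    def __init__(self, hidden_dim, n_bridge_vectors, seed=0):
        rng = np.random.default_rng(seed)
        # W projects encoder states; U holds one query per bridge vector
        self.W = rng.standard_normal((hidden_dim, hidden_dim)) * 0.02
        self.U = rng.standard_normal((n_bridge_vectors, hidden_dim)) * 0.02

    def __call__(self, H):
        # H: (seq_len, hidden_dim) encoder states for one sentence
        scores = self.U @ np.tanh(H @ self.W).T   # (n_bridge_vectors, seq_len)
        A = softmax(scores, axis=-1)              # attention over positions
        return A @ H                              # (n_bridge_vectors, hidden_dim)

bridge = AttentionBridge(hidden_dim=4, n_bridge_vectors=3)
rng = np.random.default_rng(1)
short = bridge(rng.standard_normal((7, 4)))    # 7-token sentence
long = bridge(rng.standard_normal((11, 4)))    # 11-token sentence
# Both outputs have shape (3, 4): the bridge gives a fixed-size interface.
```

Because the output shape is independent of the input length and of which encoder produced the states, the same layer can be shared across all language pairs, which is what allows the representation to become language-independent.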