NTT’s Neural Machine Translation Systems for WMT 2018

Makoto Morishita, Jun Suzuki, Masaaki Nagata

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

8 Citations (Scopus)

Abstract

This paper describes NTT’s neural machine translation systems submitted to the WMT 2018 English-German and German-English news translation tasks. Our submission has three main components: the Transformer model, corpus cleaning, and right-to-left n-best re-ranking techniques. Through our experiments, we identified two keys to improving accuracy: filtering noisy training sentences and right-to-left re-ranking. We also found that the Transformer model requires more training data than the RNN-based model, and that the RNN-based model sometimes achieves better accuracy than the Transformer model when the corpus is small.
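The right-to-left n-best re-ranking mentioned in the abstract can be sketched as follows: a reverse-direction (right-to-left) model rescores the n-best hypotheses produced by the left-to-right decoder, and the hypothesis with the best combined score is selected. The scoring functions and the interpolation weight below are illustrative assumptions, not the exact formulation used in the paper.

```python
# Hedged sketch of right-to-left n-best re-ranking.
# l2r_score / r2l_score stand in for log-probabilities from a
# left-to-right decoder and a right-to-left rescoring model;
# the 50/50 interpolation weight is an assumption for illustration.

def rerank_nbest(hypotheses, l2r_score, r2l_score, weight=0.5):
    """Return the hypothesis maximizing a weighted sum of the
    left-to-right decoder score and the right-to-left model score."""
    def combined(hyp):
        return (1 - weight) * l2r_score(hyp) + weight * r2l_score(hyp)
    return max(hypotheses, key=combined)

# Toy usage: stand-in scorers mapping hypotheses to log-probabilities.
scores_l2r = {"guten tag": -1.0, "gute tag": -0.8}
scores_r2l = {"guten tag": -0.5, "gute tag": -2.0}
best = rerank_nbest(
    list(scores_l2r),
    l2r_score=scores_l2r.get,
    r2l_score=scores_r2l.get,
)
# The right-to-left model strongly prefers "guten tag", which wins
# despite its slightly lower left-to-right score.
```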

Original language: English
Title of host publication: Shared Task Papers
Publisher: Association for Computational Linguistics (ACL)
Pages: 461-466
Number of pages: 6
ISBN (Electronic): 9781948087810
DOIs
Publication status: Published - 2018
Externally published: Yes
Event: 3rd Conference on Machine Translation, WMT 2018 at the Conference on Empirical Methods in Natural Language Processing, EMNLP 2018 - Brussels, Belgium
Duration: 2018 Oct 31 – 2018 Nov 1

Publication series

Name: WMT 2018 - 3rd Conference on Machine Translation, Proceedings of the Conference
Volume: 2

Conference

Conference: 3rd Conference on Machine Translation, WMT 2018 at the Conference on Empirical Methods in Natural Language Processing, EMNLP 2018
Country/Territory: Belgium
City: Brussels
Period: 18/10/31 – 18/11/1

ASJC Scopus subject areas

  • Computer Science Applications
  • Information Systems
  • Computational Theory and Mathematics
