Robust Neural Machine Translation

Google AI Research · July 29, 2019
Posted by Yong Cheng, Software Engineer, Google Research

In recent years, neural machine translation (NMT) using Transformer models has experienced tremendous success. Based on deep neural networks, NMT models are usually trained end-to-end on very large parallel corpora (input/output text pairs) in an entirely data-driven fashion, without the need to impose explicit rules of language. Despite this success, NMT models can be sensitive to minor perturbations of the input, which can manifest…