Publications
Proyag Pal and Kenneth Heafield.
To be published at NAACL 2022.
[PDF]
This paper describes a method to quantify the amount of information H(t|s) added by the target sentence t that is not present in the source s in a neural machine translation system. We do this by providing the model with the target sentence in a highly compressed form (a "cheat code") and exploring the effect of the size of the cheat code. We find that the model is able to capture extra information from just a single float representation of the target, and nearly reproduces the target with two 32-bit floats per target token.
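The core mechanism is easy to picture: compress a representation of the reference target down to a handful of floats, then let the decoder condition on that compressed signal alongside the source. Below is a minimal PyTorch sketch of this idea, not the paper's implementation; the module name, the sentence-level mean pooling, and all dimensions are illustrative assumptions (the paper also varies the code size per token).

    # Illustrative sketch of a "cheat code" bottleneck (not the authors' code):
    # compress the target into k floats, then expand the code back into model
    # space so the decoder can attend to it alongside the source.
    import torch
    import torch.nn as nn

    class CheatCodeBottleneck(nn.Module):
        def __init__(self, d_model: int = 512, code_size: int = 2):
            super().__init__()
            self.compress = nn.Linear(d_model, code_size)  # target repr -> k floats
            self.expand = nn.Linear(code_size, d_model)    # k floats -> model space

        def forward(self, target_states: torch.Tensor) -> torch.Tensor:
            # target_states: (batch, tgt_len, d_model) from a target-side encoder.
            pooled = target_states.mean(dim=1)             # one vector per sentence
            code = self.compress(pooled)                   # the "cheat code"
            return self.expand(code).unsqueeze(1)          # (batch, 1, d_model)

    # The expanded code can be concatenated to the source encoder output, so the
    # decoder sees the source plus a k-float summary of the reference target:
    bottleneck = CheatCodeBottleneck(d_model=512, code_size=2)
    src = torch.randn(8, 20, 512)                          # mock source encodings
    tgt = torch.randn(8, 17, 512)                          # mock target encodings
    augmented_src = torch.cat([bottleneck(tgt), src], dim=1)
    print(augmented_src.shape)                             # torch.Size([8, 21, 512])

Shrinking or growing code_size then directly controls how much target-side information the model receives, which is what makes the effect of the cheat code's size measurable.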
Proyag Pal, Alham Fikri Aji, Pinzhen Chen, and Sukanta Sen.
Published at WMT21, co-located with EMNLP 2021.
[PDF] [ACL Anthology] [Poster] [BibTeX]
We describe the University of Edinburgh’s Bengali↔Hindi constrained systems submitted to the WMT21 News Translation task. We submitted ensembles of Transformer models built with large-scale back-translation and fine-tuned on subsets of training data retrieved based on similarity to the target domain. For both translation directions, our submissions are among the best-performing constrained systems according to human evaluation.
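To make the retrieval step concrete: a common way to select fine-tuning subsets by similarity to a target domain is to rank training sentences by TF-IDF cosine similarity against an in-domain sample. The snippet below illustrates that general technique only; the TF-IDF choice, the function name, and the toy data are assumptions, not the submission's actual retrieval method.

    # Hedged sketch of similarity-based data selection for fine-tuning:
    # score each training sentence against the target domain and keep
    # the top-ranked subset.
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    def select_finetuning_subset(train_sents, domain_sents, top_n=2):
        vectorizer = TfidfVectorizer().fit(train_sents + domain_sents)
        train_vecs = vectorizer.transform(train_sents)
        # Represent the target domain by its mean TF-IDF vector.
        domain_vec = np.asarray(vectorizer.transform(domain_sents).mean(axis=0))
        scores = cosine_similarity(train_vecs, domain_vec).ravel()
        ranked = np.argsort(scores)[::-1][:top_n]          # most similar first
        return [train_sents[i] for i in ranked]

    train = ["stock markets fell sharply", "the cat sat on the mat",
             "parliament passed the budget bill"]
    domain = ["markets rallied after the budget vote"]     # tiny mock news domain
    print(select_finetuning_subset(train, domain))

The selected subset can then be used to continue training the converged general-domain models before ensembling, which is the usual shape of a fine-tuned WMT submission pipeline.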