Title: WeChat Neural Machine Translation Systems for WMT20

Abstract: We participate in the WMT 2020 shared news translation task on Chinese to English. Our system is based on the Transformer (Vaswani et al., 2017a) with effective variants and the DTMT (Meng and Zhang, 2019) architecture. In our experiments, we employ data selection, several synthetic data generation approaches (i.e., back-translation, knowledge distillation, and iterative in-domain knowledge transfer), advanced fine-tuning approaches, and self-BLEU based model ensemble. Our constrained Chinese to English system achieves a case-sensitive BLEU score of 36.9, which is the highest among all submissions.
Comments: Accepted at WMT 2020. Our Chinese to English system achieved the highest case-sensitive BLEU score among all submissions
Subjects: Computation and Language (cs.CL); Artificial Intelligence (cs.AI); Machine Learning (cs.LG)
Cite as: arXiv:2010.00247 [cs.CL]
  (or arXiv:2010.00247v2 [cs.CL] for this version)
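
The abstract mentions a "self-BLEU based model ensemble". The paper itself gives the actual procedure; purely as a rough illustration, the Python sketch below shows one plausible reading: measure how much each candidate model's dev-set outputs overlap (in BLEU) with the other candidates' outputs, and prefer low-overlap, i.e. more diverse, checkpoints when choosing which models to ensemble. The self-BLEU definition, the selection heuristic, and all names here (self_bleu, ckpt_a, ...) are assumptions for illustration, not the authors' exact method, and the BLEU implementation is a simplified, smoothed version.

from collections import Counter
import math

def _ngrams(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def _bleu(hyp, ref, max_n=4):
    # Simplified sentence-level BLEU with add-one smoothing (not sacreBLEU).
    hyp_toks, ref_toks = hyp.split(), ref.split()
    if not hyp_toks or not ref_toks:
        return 0.0
    log_prec = 0.0
    for n in range(1, max_n + 1):
        h, r = _ngrams(hyp_toks, n), _ngrams(ref_toks, n)
        overlap = sum(min(c, r[g]) for g, c in h.items())
        log_prec += math.log((overlap + 1) / (max(sum(h.values()), 1) + 1))
    brevity = min(1.0, math.exp(1 - len(ref_toks) / len(hyp_toks)))
    return brevity * math.exp(log_prec / max_n)

def self_bleu(outputs):
    # outputs: {model_name: list of dev-set translations, aligned by sentence}.
    # For each model, score its translations against every other model's
    # translations; a high average means the model adds little diversity.
    scores = {}
    for name, hyps in outputs.items():
        other_systems = [o for m, o in outputs.items() if m != name]
        per_sentence = [
            max(_bleu(hyp, other[i]) for other in other_systems)
            for i, hyp in enumerate(hyps)
        ]
        scores[name] = sum(per_sentence) / len(per_sentence)
    return scores

if __name__ == "__main__":
    # Toy dev-set outputs from three hypothetical checkpoints.
    outputs = {
        "ckpt_a": ["the cat sat on the mat", "he went home"],
        "ckpt_b": ["the cat sat on the mat", "he walked home"],
        "ckpt_c": ["a cat is sitting on the mat", "he returned home"],
    }
    # Lower self-BLEU first: the more diverse checkpoints, which would be
    # preferred when assembling the final ensemble.
    for name, score in sorted(self_bleu(outputs).items(), key=lambda kv: kv[1]):
        print(f"{name}: self-BLEU = {score:.3f}")

In a full system, the checkpoints chosen this way would then be combined at decoding time (for example by averaging output distributions); the step sketched above only decides which candidates enter the ensemble.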

Submission history

From: Fandong Meng
[v1] Thu, 1 Oct 2020 08:15:09 GMT (1040kb,D)
[v2] Mon, 5 Oct 2020 16:01:01 GMT (70kb,D)