
Title: Low-Resource Cross-Lingual Adaptive Training for Nigerian Pidgin

Abstract: Developing effective spoken language processing systems for low-resource languages poses several challenges due to the lack of parallel data and limited resources for fine-tuning models. In this work, we aim to improve both text classification and translation for Nigerian Pidgin (Naija) by collecting a large-scale parallel English-Pidgin corpus and by proposing a cross-lingual adaptive training framework that combines continual adaptive training with task adaptive training to adapt a base pre-trained model to a low-resource language. Our studies show that English pre-trained language models serve as a stronger prior than multilingual language models on English-Pidgin tasks, yielding improvements of up to 2.38 BLEU; we further demonstrate that augmenting orthographic data and applying task adaptive training with back-translation significantly improve model performance.
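The back-translation augmentation mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: `reverse_translate` is a hypothetical stand-in for a trained Pidgin-to-English model, and the corpus format (a list of English-Pidgin sentence pairs) is an assumption for clarity.

```python
# Sketch of back-translation data augmentation for low-resource MT.
# Monolingual target-side (Pidgin) sentences are translated back into
# English by a reverse model; the resulting synthetic (English, Pidgin)
# pairs are added to the parallel training corpus.

def reverse_translate(pidgin_sentence: str) -> str:
    """Stub for a Pidgin->English translation model (hypothetical).

    A real system would run inference with a trained seq2seq model here.
    """
    return "<synthetic English for: " + pidgin_sentence + ">"

def back_translate(monolingual_pidgin, parallel_corpus):
    """Augment a parallel English-Pidgin corpus with synthetic pairs.

    Each monolingual Pidgin sentence is back-translated into English,
    and the (synthetic English, original Pidgin) pair is appended to
    the gold training data.
    """
    synthetic = [(reverse_translate(pgn), pgn) for pgn in monolingual_pidgin]
    return parallel_corpus + synthetic

# Usage: one gold pair plus one monolingual Pidgin sentence yields
# a training set of two pairs.
gold = [("How are you?", "How you dey?")]
mono = ["Wetin dey happen?"]
augmented = back_translate(mono, gold)
```

In practice the quality of the synthetic pairs depends on the reverse model, so back-translated data is typically mixed with (not substituted for) the gold parallel corpus, as the sketch does.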
Comments: To appear in INTERSPEECH 2023
Subjects: Computation and Language (cs.CL)
Cite as: arXiv:2307.00382 [cs.CL]
  (or arXiv:2307.00382v1 [cs.CL] for this version)

Submission history

From: Pin-Jie Lin [view email]
[v1] Sat, 1 Jul 2023 16:47:36 GMT (192kb,D)
