
Title: BERTwich: Extending BERT's Capabilities to Model Dialectal and Noisy Text

Abstract: Real-world NLP applications often deal with nonstandard text (e.g., dialectal, informal, or misspelled text). However, language models like BERT deteriorate in the face of dialect variation or noise. How do we push BERT's modeling capabilities to encompass nonstandard text? Fine-tuning helps, but it is designed for specializing a model to a task and does not seem to bring about the deeper, more pervasive changes needed to adapt a model to nonstandard language. In this paper, we introduce the novel idea of sandwiching BERT's encoder stack between additional encoder layers trained to perform masked language modeling on noisy text. We find that our approach, paired with recent work on including character-level noise in fine-tuning data, can promote zero-shot transfer to dialectal text, as well as reduce the distance in the embedding space between words and their noisy counterparts.
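The sandwich architecture described in the abstract can be pictured with a short sketch. The snippet below is a minimal illustration, assuming the HuggingFace transformers library; the choice of one added encoder layer on each side of the stack, the bert-base-uncased checkpoint, the 5% character-noise rate, and the simple delete/duplicate noise operations are illustrative assumptions, not the authors' exact configuration or training recipe.

```python
import random
import torch.nn as nn
from transformers import BertModel
from transformers.models.bert.modeling_bert import BertLayer


def add_char_noise(text: str, p: float = 0.05) -> str:
    """Illustrative character-level noiser: with probability p/2 delete a
    character, with probability p/2 duplicate it (the paper's exact noise
    model may differ)."""
    out = []
    for ch in text:
        r = random.random()
        if r < p / 2:
            continue            # delete this character
        out.append(ch)
        if r < p:
            out.append(ch)      # duplicate this character
    return "".join(out)


class SandwichedBert(nn.Module):
    """Sketch of the 'sandwich': one new, randomly initialized encoder layer
    below and one above the pretrained stack, intended to be trained with
    masked language modeling on noised text."""

    def __init__(self, model_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        cfg = self.bert.config
        self.pre_layer = BertLayer(cfg)    # sits under BERT's encoder stack
        self.post_layer = BertLayer(cfg)   # sits on top of it

    def forward(self, input_ids):
        # Padding-mask handling and the MLM head are omitted for brevity.
        hidden = self.bert.embeddings(input_ids=input_ids)
        hidden = self.pre_layer(hidden)[0]
        hidden = self.bert.encoder(hidden).last_hidden_state
        hidden = self.post_layer(hidden)[0]
        return hidden
```

The new outer layers, together with character-level noise applied to the fine-tuning data, are what the abstract credits with adapting the model to nonstandard text; the MLM objective and training loop are not shown here.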
Comments: Accepted for publication in Findings of the ACL: EMNLP 2023
Subjects: Computation and Language (cs.CL)
Cite as: arXiv:2311.00116 [cs.CL]
  (or arXiv:2311.00116v1 [cs.CL] for this version)

Submission history

From: Aarohi Srivastava
[v1] Tue, 31 Oct 2023 19:44:50 GMT (69 KB)
