
Title: PreCog: Exploring the Relation between Memorization and Performance in Pre-trained Language Models

Abstract: Pre-trained Language Models such as BERT are impressive machines that can memorize, and possibly generalize, learning examples. We present here a small, focused contribution to the analysis of the interplay between memorization and performance of BERT on downstream tasks. We propose PreCog, a measure for evaluating memorization from pre-training, and we analyze its correlation with BERT's performance. Our experiments show that highly memorized examples are better classified, suggesting that memorization is an essential key to BERT's success.
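
The abstract does not define PreCog concretely, so the following Python sketch only illustrates the kind of analysis it describes: relating a per-example memorization score to downstream classification correctness. The score precog_score, the synthetic data, and the binning scheme are all assumptions made for illustration, not the paper's actual method.

    # Hypothetical sketch: correlate a per-example memorization score
    # (a stand-in for PreCog, whose definition is not given here) with
    # downstream classification correctness.
    import numpy as np
    from scipy.stats import pointbiserialr

    rng = np.random.default_rng(0)
    n = 1000

    # Placeholder memorization scores in [0, 1); in the paper these would
    # be PreCog values derived from pre-training, not reproduced here.
    precog_score = rng.random(n)

    # Toy correctness labels, generated so accuracy rises with the score,
    # mimicking the reported finding that memorized examples classify better.
    correct = (rng.random(n) < 0.4 + 0.5 * precog_score).astype(int)

    # Point-biserial correlation between a binary outcome and a continuous score.
    r, p = pointbiserialr(correct, precog_score)
    print(f"point-biserial r = {r:.3f}, p = {p:.3g}")

    # Accuracy per memorization-score bin: accuracy should rise with the bin.
    bins = np.linspace(0.0, 1.0, 6)
    which = np.digitize(precog_score, bins[1:-1])
    for b in range(5):
        mask = which == b
        print(f"score in [{bins[b]:.1f}, {bins[b+1]:.1f}): "
              f"accuracy = {correct[mask].mean():.3f} (n = {mask.sum()})")

Point-biserial correlation is one standard way to relate a binary outcome (correct vs. incorrect) to a continuous score, and the per-bin accuracies make the claimed trend, that highly memorized examples are better classified, directly visible.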
Subjects: Computation and Language (cs.CL); Artificial Intelligence (cs.AI)
Journal reference: 2023.ranlp-1.103
DOI: 10.26615/978-954-452-092-2_103
Cite as: arXiv:2305.04673 [cs.CL]
  (or arXiv:2305.04673v2 [cs.CL] for this version)

Submission history

From: Leonardo Ranaldi
[v1] Mon, 8 May 2023 12:51:00 GMT (7989kb,D)
[v2] Tue, 9 May 2023 05:37:53 GMT (7986kb,D)
