Computer Science > Computation and Language
Title: PreCog: Exploring the Relation between Memorization and Performance in Pre-trained Language Models
(Submitted on 8 May 2023 (v1), last revised 9 May 2023 (this version, v2))
Abstract: Pre-trained Language Models such as BERT are impressive machines with the ability to memorize, and possibly generalize, learning examples. We present here a small, focused contribution to the analysis of the interplay between memorization and the performance of BERT in downstream tasks. We propose PreCog, a measure for evaluating memorization from pre-training, and we analyze its correlation with BERT's performance. Our experiments show that highly memorized examples are better classified, suggesting that memorization is an essential key to success for BERT.
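The abstract does not spell out how PreCog is computed. Purely as an illustrative sketch, and not the paper's actual method, the snippet below scores each example by BERT's masked-token pseudo-log-likelihood as a rough memorization proxy and then correlates that score with downstream correctness; the scoring recipe, the example data, and the labels are all assumptions for illustration.

```python
# Hypothetical sketch only: a masked-LM pseudo-likelihood score used as a
# stand-in for a PreCog-style memorization measure. The paper's actual
# definition of PreCog is not given in the abstract.
import torch
from scipy.stats import pearsonr
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def memorization_score(sentence: str) -> float:
    """Average log-probability BERT assigns to each token when that token
    is masked out in turn -- a crude proxy for how 'familiar' the example
    is from pre-training."""
    ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
    log_probs = []
    for pos in range(1, len(ids) - 1):  # skip [CLS] and [SEP]
        masked = ids.clone()
        masked[pos] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(masked.unsqueeze(0)).logits[0, pos]
        log_probs.append(torch.log_softmax(logits, dim=-1)[ids[pos]].item())
    return sum(log_probs) / len(log_probs)

# Correlate the proxy score with per-example downstream correctness
# (1 = classified correctly by a fine-tuned BERT). Placeholder data; a
# real analysis would use the full test set of a downstream task.
examples = [
    "the movie was great",
    "a thoroughly enjoyable film",
    "colorless green ideas sleep furiously",
    "the acting was dull and lifeless",
]
correct = [1, 1, 0, 1]  # hypothetical correctness labels
scores = [memorization_score(s) for s in examples]
r, p = pearsonr(scores, correct)
print(f"Pearson r = {r:.3f} (p = {p:.3f})")
```

A positive correlation under such a setup would mirror the abstract's finding that highly memorized examples tend to be better classified.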
Submission history
From: Leonardo Ranaldi
[v1] Mon, 8 May 2023 12:51:00 GMT (7989kb,D)
[v2] Tue, 9 May 2023 05:37:53 GMT (7986kb,D)