
Title: What does self-attention learn from Masked Language Modelling?

Abstract: Transformers are neural networks which revolutionised natural language processing and machine learning. They process sequences of inputs, like words, using a mechanism called self-attention, which is trained via masked language modelling (MLM). In MLM, a word is randomly masked in an input sequence, and the network is trained to predict the missing word. Despite the practical success of transformers, it remains unclear what type of data distribution self-attention can learn efficiently. Here, we show analytically that if one decouples the treatment of word positions and embeddings, a single layer of self-attention learns the conditionals of a generalised Potts model with interactions between sites and Potts colours. Moreover, we show that training this neural network is exactly equivalent to solving the inverse Potts problem by the so-called pseudo-likelihood method, well known in statistical physics. Using this mapping, we compute the generalisation error of self-attention in a model scenario analytically using the replica method.
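To make the objects in the abstract concrete, here is a minimal numpy sketch (not taken from the paper) of the conditionals of a generalised Potts model and of the pseudo-likelihood objective for the inverse Potts problem. The zero-field simplification, the coupling scale, and names such as potts_conditional are illustrative assumptions.

    import numpy as np

    # Toy setup: L sites, q Potts colours, couplings J[i, j] are (q, q) matrices.
    rng = np.random.default_rng(0)
    L, q = 8, 5
    J = rng.normal(scale=0.1, size=(L, L, q, q))
    J = (J + J.transpose(1, 0, 3, 2)) / 2      # symmetrise: J[i,j,a,b] = J[j,i,b,a]
    for i in range(L):
        J[i, i] = 0.0                          # no self-coupling

    def potts_conditional(x, i, J):
        """Conditional p(x_i = a | x_{-i}) of a generalised Potts model,
        p(x) proportional to exp(sum_{i<j} J_ij(x_i, x_j)) (zero fields for simplicity)."""
        field = sum(J[i, j, :, x[j]] for j in range(len(x)) if j != i)  # shape (q,)
        field -= field.max()                   # numerical stability
        p = np.exp(field)
        return p / p.sum()

    def pseudo_log_likelihood(x, J):
        """Sum over sites of log p(x_i | x_{-i}); maximising this over J is the
        pseudo-likelihood estimator for the inverse Potts problem."""
        return sum(np.log(potts_conditional(x, i, J)[x[i]]) for i in range(len(x)))

    x = rng.integers(0, q, size=L)             # one sample sequence of colours
    print(pseudo_log_likelihood(x, J))

According to the abstract, training a single self-attention layer with decoupled positions and embeddings on the MLM objective is exactly equivalent to maximising such a pseudo-likelihood over the couplings; the sketch above only illustrates the statistical-physics side of that mapping.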
Comments: 4 pages, 3 figures
Subjects: Disordered Systems and Neural Networks (cond-mat.dis-nn); Statistical Mechanics (cond-mat.stat-mech); Computation and Language (cs.CL); Machine Learning (stat.ML)
Cite as: arXiv:2304.07235 [cond-mat.dis-nn]
  (or arXiv:2304.07235v2 [cond-mat.dis-nn] for this version)

Submission history

From: Riccardo Rende
[v1] Fri, 14 Apr 2023 16:32:56 GMT (563kb,D)
[v2] Thu, 14 Dec 2023 12:08:44 GMT (569kb,D)
[v3] Wed, 7 Feb 2024 09:48:07 GMT (568kb,D)
[v4] Thu, 4 Apr 2024 13:24:36 GMT (569kb,D)
