Title: Finite Sample Analysis and Bounds of Generalization Error of Gradient Descent in In-Context Linear Regression

Abstract: Recent studies show that transformer-based architectures emulate gradient descent during a forward pass, contributing to in-context learning capabilities: the ability of a model to adapt to new tasks from a sequence of prompt examples without being explicitly trained or fine-tuned to do so. This work investigates the generalization properties of a single step of gradient descent in the context of linear regression with well-specified models. A random design setting is considered, and analytical expressions are derived for the statistical properties and bounds of the generalization error in a non-asymptotic (finite sample) setting. These expressions are notable for avoiding arbitrary constants, and thus offer robust quantitative information and scaling relationships. The results are contrasted with those from classical least squares regression (for which analogous finite sample bounds are also derived), shedding light on systematic and noise components, as well as optimal step sizes. Additionally, identities involving high-order products of Gaussian random matrices are presented as a byproduct of the analysis.
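The setting the abstract describes can be illustrated with a minimal numerical sketch. The snippet below is not from the paper; it simulates a well-specified linear model under a Gaussian random design, takes a single gradient descent step on the least-squares loss from a zero initialization, and compares its generalization error to that of the ordinary least squares estimator. All parameter values (sample size, dimension, noise level, step size) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (values are assumptions, not from the paper):
# well-specified linear model y = X w* + noise, isotropic Gaussian design.
n, d, sigma = 50, 5, 0.1
w_star = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ w_star + sigma * rng.normal(size=n)

# One step of gradient descent on the empirical least-squares loss
# L(w) = (1/(2n)) ||X w - y||^2, starting from w0 = 0.
eta = 0.5  # illustrative step size
w0 = np.zeros(d)
grad = X.T @ (X @ w0 - y) / n
w_gd = w0 - eta * grad  # reduces to (eta/n) X^T y when w0 = 0

# Classical least squares estimator for comparison.
w_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

# Under an isotropic Gaussian test point x, the excess risk
# E_x[(x^T w - x^T w*)^2] equals ||w - w*||^2.
err_gd = float(np.sum((w_gd - w_star) ** 2))
err_ls = float(np.sum((w_ls - w_star) ** 2))
print("one-step GD excess risk:", err_gd)
print("least squares excess risk:", err_ls)
```

With these values, a single gradient step from zero only moves part of the way toward w*, so its excess risk is dominated by the systematic component, while least squares is limited mainly by the noise component; the paper's analysis quantifies both contributions without arbitrary constants.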
Subjects: Statistics Theory (math.ST); Numerical Analysis (math.NA); Probability (math.PR)
MSC classes: 62J05, 68T10
Cite as: arXiv:2405.02462 [math.ST]
  (or arXiv:2405.02462v2 [math.ST] for this version)

Submission history

From: Karthik Duraisamy [view email]
[v1] Fri, 3 May 2024 19:52:07 GMT (3708kb,D)
[v2] Thu, 9 May 2024 21:29:03 GMT (3707kb,D)