High Energy Physics - Phenomenology

Title: Folded context condensation in Path Integral formalism for infinite context transformers

Abstract: This short note is a rapid communication on long-context training and shares an idea for how to train with low memory usage. In the note, we generalize the attention algorithm and neural network of Generative Pre-Trained Transformers and reinterpret them in the Path Integral formalism. First, the role of the transformer is understood as the time evolution of the token state, and second, it is suggested that all key-token states at the same time as the query token can attend to the query-token states. As a result of the repetitive time evolution, it is discussed that token states from the past sequence meet token states in the present sequence, so that attention between separated sequences becomes possible; infinite contextual information can thus be maintained using only the low memory needed for a limited-size sequence. For the experiment, an input token window size of $12$ was used, and the pre-training was run on one GPU with $24$ GB of memory. It was confirmed that a context longer than $150$ tokens is preserved. The sampled training results, the code, and further details will be included in a later revision of this note.
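As a rough illustration of the mechanism the abstract describes, the sketch below walks a long sequence segment by segment, letting each segment attend over its own token states plus the evolved states carried from earlier segments, so memory stays bounded by the window size while contextual information propagates forward. This is a minimal NumPy reading of the idea, not the note's actual implementation: the function names, the recurrent carry of the attended outputs, and all shapes are assumptions for illustration.

    import numpy as np

    def softmax(x, axis=-1):
        x = x - x.max(axis=axis, keepdims=True)
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def folded_step(x, past, window=12):
        """One hypothetical 'time evolution' step: the current segment
        attends over its own token states plus the condensed states
        carried from earlier segments.
        x: (window, d) current token states; past: carried states or None."""
        ctx = x if past is None else np.concatenate([past, x], axis=0)
        scores = x @ ctx.T / np.sqrt(x.shape[-1])   # (window, <= 2*window)
        out = softmax(scores) @ ctx                 # evolved token states
        # Carry the evolved states forward: memory stays fixed at `window`
        # states per step, while information from every earlier segment
        # keeps propagating through the repeated evolution.
        return out, out[-window:]

    # Walk a long sequence in 12-token segments with bounded memory.
    rng = np.random.default_rng(0)
    d, window, past = 16, 12, None
    for segment in range(20):                       # 240 tokens in total
        x = rng.normal(size=(window, d))            # stand-in embeddings
        out, past = folded_step(x, past, window)

Under these assumptions, the per-step cost is that of attending over at most two windows, yet after 20 segments the carried states have been influenced by all 240 tokens, which is the sense in which a 12-token window could preserve a context longer than 150 tokens.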
Comments: 7 pages, 2 figures
Subjects: High Energy Physics - Phenomenology (hep-ph); Artificial Intelligence (cs.AI); Computation and Language (cs.CL); Machine Learning (cs.LG); Neural and Evolutionary Computing (cs.NE)
Cite as: arXiv:2405.04620 [hep-ph]
  (or arXiv:2405.04620v2 [hep-ph] for this version)

Submission history

From: Won-Gi Paeng
[v1] Tue, 7 May 2024 19:05:26 GMT (120kb)
[v2] Fri, 10 May 2024 02:18:27 GMT (126kb)
