
Title: Structured Packing in LLM Training Improves Long Context Utilization

Abstract: Recent developments in long-context large language models have attracted considerable attention, yet their real-world applications are often hindered by ineffective use of context information. This work shows that structuring training data to increase semantic interdependence is an effective strategy for improving context utilization. To this end, we introduce Structured Packing for Long Context (SPLiCe), a method for creating training examples that uses information retrieval to collate mutually relevant documents into a single training context. We empirically validate SPLiCe on large 3B and 7B models, showing perplexity improvements and better long-context utilization on downstream tasks. Remarkably, even relatively short fine-tuning with SPLiCe is enough to attain these benefits. Additionally, a comprehensive study of SPLiCe reveals intriguing transfer effects, such as training on code data leading to perplexity improvements on natural-language text.
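
Since this page carries only the abstract, the following is a minimal, hypothetical Python sketch of the idea it describes: grouping mutually relevant documents into one training context via retrieval. The names (splice_pack, lexical_similarity), the greedy seed-and-append strategy, and the word-overlap similarity (a cheap stand-in for a real retriever such as BM25) are all illustrative assumptions, not the paper's actual algorithm.

```python
from collections import Counter

def lexical_similarity(a: Counter, b: Counter) -> float:
    # Cheap stand-in for a real retriever such as BM25: cosine over word counts.
    shared = set(a) & set(b)
    num = sum(a[w] * b[w] for w in shared)
    if num == 0:
        return 0.0
    norm_a = sum(v * v for v in a.values()) ** 0.5
    norm_b = sum(v * v for v in b.values()) ** 0.5
    return num / (norm_a * norm_b)

def splice_pack(docs: list[str], context_len: int) -> list[str]:
    # Greedily build training contexts: seed each context with an unused
    # document, then keep appending the most relevant remaining document
    # until the (word-count) budget is spent or nothing relevant is left.
    bags = [Counter(d.split()) for d in docs]
    lengths = [len(d.split()) for d in docs]
    unused = set(range(len(docs)))
    contexts = []
    while unused:
        seed = unused.pop()
        packed, used = [docs[seed]], lengths[seed]
        while unused and used < context_len:
            best = max(unused, key=lambda i: lexical_similarity(bags[seed], bags[i]))
            if lexical_similarity(bags[seed], bags[best]) == 0.0:
                break  # nothing relevant remains; avoid packing unrelated noise
            unused.remove(best)
            packed.append(docs[best])
            used += lengths[best]
        contexts.append("\n".join(packed))
    return contexts

# Toy usage: the two list-sorting documents land in the same context,
# while the unrelated recipe gets its own.
docs = [
    "how to sort a list quickly",
    "unit tests for the list sort helper",
    "recipe for sourdough bread",
]
for ctx in splice_pack(docs, context_len=12):
    print("---")
    print(ctx)
```

A practical implementation would measure the budget in tokenizer tokens rather than words and use a proper retrieval index; the sketch only conveys the structure of the packing step.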
Subjects: Computation and Language (cs.CL)
Cite as: arXiv:2312.17296 [cs.CL]
  (or arXiv:2312.17296v6 [cs.CL] for this version)

Submission history

From: Piotr Miłoś
[v1] Thu, 28 Dec 2023 16:25:52 GMT (2309kb,D)
[v2] Tue, 2 Jan 2024 14:48:56 GMT (2309kb,D)
[v3] Fri, 2 Feb 2024 20:33:28 GMT (2769kb,D)
[v4] Wed, 3 Apr 2024 17:35:11 GMT (2769kb,D)
[v5] Fri, 26 Apr 2024 08:23:29 GMT (2769kb,D)
[v6] Mon, 29 Apr 2024 09:34:27 GMT (2769kb,D)
