Computer Science > Computation and Language
Title: Structured Packing in LLM Training Improves Long Context Utilization
(Submitted on 28 Dec 2023 (v1), last revised 29 Apr 2024 (this version, v6))
Abstract: Recent developments in long-context large language models have attracted considerable attention, yet their real-world applications are often hindered by ineffective use of context information. This work shows that structuring training data to increase semantic interdependence is an effective strategy for improving context utilization. To this end, we introduce Structured Packing for Long Context (SPLiCe), a method that creates training examples by using information retrieval to collate mutually relevant documents into a single training context. We empirically validate SPLiCe on large $3$B and $7$B models, showing perplexity improvements and better long-context utilization on downstream tasks. Remarkably, even relatively brief fine-tuning with SPLiCe is sufficient to attain these benefits. Additionally, a comprehensive study of SPLiCe reveals intriguing transfer effects, such as training on code data yielding perplexity improvements on text data.
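The core idea of collating mutually relevant documents into one training context can be sketched as follows. This is a minimal illustrative toy, not the paper's implementation: the function names, the bag-of-words cosine retriever, and the greedy grouping policy are all assumptions made here for clarity (the actual method's retriever and packing details are described in the paper).

```python
# Hypothetical sketch of SPLiCe-style structured packing (illustrative only;
# the paper's actual retrieval method and packing pipeline differ).
from collections import Counter
import math


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words term-count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def pack_related_docs(docs: list[str], docs_per_context: int = 3) -> list[list[str]]:
    """Greedily group mutually relevant documents into training contexts.

    Repeatedly takes an unused seed document and pulls in its most similar
    remaining documents, so each packed context is semantically coherent
    rather than a random concatenation.
    """
    vecs = [Counter(d.lower().split()) for d in docs]
    unused = set(range(len(docs)))
    contexts = []
    while unused:
        seed = unused.pop()
        group = [seed]
        # "Retrieve" the nearest remaining documents for this seed.
        neighbors = sorted(unused, key=lambda j: cosine(vecs[seed], vecs[j]),
                           reverse=True)
        for j in neighbors[: docs_per_context - 1]:
            group.append(j)
            unused.remove(j)
        contexts.append([docs[i] for i in group])
    return contexts
```

In a real pipeline each packed group would then be concatenated and tokenized into a single long training example; a stronger retriever (e.g. BM25 or dense embeddings) could replace the toy cosine scorer.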
Submission history
From: Piotr Miłoś
[v1] Thu, 28 Dec 2023 16:25:52 GMT (2309kb,D)
[v2] Tue, 2 Jan 2024 14:48:56 GMT (2309kb,D)
[v3] Fri, 2 Feb 2024 20:33:28 GMT (2769kb,D)
[v4] Wed, 3 Apr 2024 17:35:11 GMT (2769kb,D)
[v5] Fri, 26 Apr 2024 08:23:29 GMT (2769kb,D)
[v6] Mon, 29 Apr 2024 09:34:27 GMT (2769kb,D)