
Title: BLoad: Enhancing Neural Network Training with Efficient Sequential Data Handling

Abstract: The increasing complexity of modern deep neural network models and the expanding sizes of datasets necessitate optimized and scalable training methods. In this white paper, we address the challenge of efficiently training neural network models on sequences of varying lengths. We propose a novel training scheme that enables efficient distributed data-parallel training on sequences of different sizes with minimal overhead. Using this scheme, we reduced the amount of padding by more than 100x without deleting a single frame, improving both training time and Recall in our experiments.
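The abstract describes packing variable-length sequences so that less padding is needed without dropping frames. The paper's actual BLoad algorithm is not given here; the sketch below is only a generic greedy sequence-packing illustration of the underlying idea, and all function names (`pack_sequences`, `padding_frames`) are hypothetical:

```python
from typing import List, Tuple

def pack_sequences(lengths: List[int], block_size: int) -> List[List[int]]:
    """Greedily pack sequence lengths into blocks of capacity `block_size`.
    Each block is later padded only up to its residual capacity, instead of
    padding every sequence individually to `block_size`."""
    blocks: List[List[int]] = []
    residual: List[int] = []  # remaining capacity of each block
    for length in sorted(lengths, reverse=True):  # first-fit decreasing
        if length > block_size:
            raise ValueError("sequence longer than block_size")
        for i, cap in enumerate(residual):
            if length <= cap:
                blocks[i].append(length)
                residual[i] -= length
                break
        else:  # no existing block fits: open a new one
            blocks.append([length])
            residual.append(block_size - length)
    return blocks

def padding_frames(lengths: List[int], block_size: int) -> Tuple[int, int]:
    """Padding frames needed when each sequence is padded to `block_size`
    (the naive scheme) versus when sequences are packed first."""
    naive = sum(block_size - l for l in lengths)
    packed = sum(block_size - sum(b)
                 for b in pack_sequences(lengths, block_size))
    return naive, packed

# Example: six sequences of lengths 7, 3, 5, 5, 2, 8 with block size 10
# pack perfectly into three full blocks, so packing needs zero padding
# while the naive scheme needs 30 padded frames.
naive, packed = padding_frames([7, 3, 5, 5, 2, 8], block_size=10)
```

In a distributed data-parallel setting, each rank would draw equally sized blocks, keeping per-step work balanced across workers with minimal padding overhead.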
Subjects: Machine Learning (cs.LG); Distributed, Parallel, and Cluster Computing (cs.DC)
Cite as: arXiv:2310.10879 [cs.LG]
  (or arXiv:2310.10879v2 [cs.LG] for this version)

Submission history

From: Raphael Ruschel Dos Santos [view email]
[v1] Mon, 16 Oct 2023 23:14:56 GMT (1645kb,D)
[v2] Thu, 25 Apr 2024 18:06:46 GMT (1593kb,D)
