Computer Science > Machine Learning

Title: Practical Dataset Distillation Based on Deep Support Vectors

Abstract: Conventional dataset distillation requires significant computational resources and assumes access to the entire dataset, an assumption that is impractical because it presumes all data resides on a central server. In this paper, we focus on dataset distillation in practical scenarios where only a fraction of the entire dataset is accessible. We introduce a novel distillation method that augments the conventional process by incorporating general model knowledge through the addition of a Deep KKT (DKKT) loss. In practical settings, our approach showed improved performance over the baseline distribution matching distillation method on the CIFAR-10 dataset. Additionally, we present experimental evidence that Deep Support Vectors (DSVs) contribute information distinct from that captured by the original distillation, and that their integration yields enhanced performance.
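The abstract describes combining a distribution matching objective with a DKKT loss, but gives no code. Below is a minimal, hypothetical PyTorch sketch of what such a combined objective might look like: dm_loss stands in for the distribution matching baseline (mean feature alignment), dkkt_loss is an illustrative stand-in for a KKT-style margin term (the paper's exact formulation is not reproduced here), and feat_fn, lam, and target_margin are assumed names and hyperparameters, not taken from the source.

import torch
import torch.nn.functional as F

def dm_loss(feat_real, feat_syn):
    # Distribution matching: align mean feature embeddings of the real
    # and synthetic batches (the baseline objective named in the abstract).
    return F.mse_loss(feat_syn.mean(dim=0), feat_real.mean(dim=0))

def dkkt_loss(model, x_syn, y_syn, target_margin=1.0):
    # Illustrative KKT-style term (hypothetical): drive the classification
    # margin of synthetic points under a pretrained model toward a fixed
    # target, mimicking support-vector-like behavior.
    logits = model(x_syn)
    true_logit = logits.gather(1, y_syn.unsqueeze(1)).squeeze(1)
    other_best = logits.scatter(1, y_syn.unsqueeze(1), float('-inf')).amax(dim=1)
    return ((true_logit - other_best) - target_margin).pow(2).mean()

def distill_step(model, feat_fn, x_real, x_syn, y_syn, opt, lam=0.1):
    # One optimization step on the synthetic images: DM loss plus the
    # weighted DKKT-style regularizer (lam is an assumed weighting).
    opt.zero_grad()
    loss = dm_loss(feat_fn(x_real), feat_fn(x_syn)) + lam * dkkt_loss(model, x_syn, y_syn)
    loss.backward()
    opt.step()
    return loss.item()

In use, opt would optimize the synthetic images x_syn themselves (e.g. torch.optim.SGD([x_syn], lr=...) with x_syn.requires_grad_()), while model and feat_fn stay frozen; only a fraction of the real data is ever drawn for x_real, matching the paper's practical setting.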
Subjects: Machine Learning (cs.LG)
Cite as: arXiv:2405.00348 [cs.LG]
  (or arXiv:2405.00348v1 [cs.LG] for this version)

Submission history

From: Hyunho Lee [view email]
[v1] Wed, 1 May 2024 06:41:27 GMT (556kb,D)
