Computer Science > Machine Learning

Title: The Simpler The Better: An Entropy-Based Importance Metric To Reduce Neural Networks' Depth

Abstract: While deep neural networks are highly effective at solving complex tasks, large pre-trained models are commonly employed even for considerably simpler downstream tasks that do not require a large model's capacity. Motivated by the ever-growing environmental impact of AI, we propose an efficiency strategy that leverages the prior knowledge transferred by large models. Simple yet effective, our method relies on an Entropy-bASed Importance mEtRic (EASIER) to reduce the depth of over-parametrized deep neural networks, alleviating their computational burden. We assess the effectiveness of our method on traditional image classification setups. The source code will be publicly released upon acceptance of the article.
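The abstract names an entropy-based importance metric for deciding which layers to remove but does not spell out the computation. Below is a minimal illustrative sketch, assuming (this is an assumption, not the paper's verified algorithm) that each layer is scored by the average binary entropy of its neurons' activation states (active vs. inactive after ReLU), and the lowest-entropy layer is flagged as the best removal candidate. All function names are hypothetical.

```python
# Hypothetical sketch of an entropy-based layer-importance score.
# Assumption: a layer whose neurons are almost always active (or almost
# always inactive) carries little state information, so it is a
# candidate for removal. This is NOT the paper's exact EASIER metric.
import numpy as np


def layer_entropy(activations: np.ndarray) -> float:
    """Mean binary entropy over a layer's neurons.

    activations: (num_samples, num_neurons) post-ReLU outputs.
    """
    p = (activations > 0).mean(axis=0)      # firing rate per neuron
    p = np.clip(p, 1e-12, 1 - 1e-12)        # avoid log(0)
    h = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
    return float(h.mean())


def least_important_layer(per_layer_acts: list) -> int:
    """Index of the layer with the lowest mean activation entropy."""
    return int(np.argmin([layer_entropy(a) for a in per_layer_acts]))


# Toy usage: a layer with mixed activity vs. one that always fires.
rng = np.random.default_rng(0)
acts = [
    rng.standard_normal((256, 64)).clip(min=0),  # ~50% firing: high entropy
    np.abs(rng.standard_normal((256, 64))),      # always firing: low entropy
]
print(least_important_layer(acts))  # prints 1 (the always-active layer)
```

In practice the activations would be captured with forward hooks over a calibration set, the flagged layer removed, and the shallower network fine-tuned before repeating; those steps are likewise assumptions about the general iterative-pruning recipe, not details confirmed by the abstract.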
Comments: arXiv admin note: text overlap with arXiv:2404.16890
Subjects: Machine Learning (cs.LG)
Cite as: arXiv:2404.18949 [cs.LG]
  (or arXiv:2404.18949v1 [cs.LG] for this version)

Submission history

From: Victor Quétu [view email]
[v1] Sat, 27 Apr 2024 08:28:25 GMT (463kb,D)