Condensed Matter > Disordered Systems and Neural Networks
Title: Entropic alternatives to initialization
(Submitted on 16 Jul 2021 (v1), last revised 28 Jul 2021 (this version, v2))
Abstract: Local entropic loss functions provide a versatile framework for defining architecture-aware regularization procedures. Besides allowing anisotropy in synaptic space, the local entropic smoothing of the loss function can vary during training, yielding a tunable model complexity. A scoping protocol, in which the regularization is strong in the early stages of training and then fades progressively away, constitutes an alternative to standard initialization procedures for deep convolutional neural networks; it nonetheless has wider applicability. We analyze anisotropic, local entropic smoothings in the language of statistical physics and information theory, providing insight into both their interpretation and their workings. We comment on aspects related to the physics of renormalization and the spacetime structure of convolutional networks.
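The ideas in the abstract can be illustrated with a short sketch. This is not the paper's implementation: the Monte Carlo estimator, function names, and the geometric scoping schedule below are illustrative assumptions, written for the standard isotropic local entropic loss F(w) = -(1/β) log E_{w' ~ N(w, γ⁻¹)}[exp(-β L(w'))], where larger γ means narrower smoothing. "Scoping" then corresponds to growing γ during training so the smoothed loss progressively approaches the bare loss.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_entropic_loss(loss_fn, w, gamma, n_samples=64, beta=1.0):
    """Monte Carlo estimate of the (isotropic) local entropic loss
        F(w) = -(1/beta) * log E_{w' ~ N(w, 1/gamma)}[exp(-beta * L(w'))].
    Larger gamma -> narrower Gaussian -> F(w) approaches the bare loss L(w).
    An anisotropic variant would use a per-direction covariance instead of
    the single scalar 1/gamma (illustrative simplification here).
    """
    sigma = 1.0 / np.sqrt(gamma)
    samples = w + sigma * rng.normal(size=(n_samples, *np.shape(w)))
    losses = np.array([loss_fn(s) for s in samples])
    # Shifted log-sum-exp for numerical stability.
    m = losses.min()
    return m - np.log(np.mean(np.exp(-beta * (losses - m)))) / beta

def scoped_gamma(step, total_steps, gamma0=0.1, gamma1=100.0):
    """Geometric scoping schedule (an assumed form): start with small gamma
    (wide smoothing, strong regularization) and let it grow, so the
    regularization fades away as training proceeds."""
    return gamma0 * (gamma1 / gamma0) ** (step / total_steps)
```

As a sanity check, for a convex quadratic loss the smoothed value at the minimum is larger for wide smoothing (small γ) than for narrow smoothing (large γ), and the schedule interpolates from γ₀ to γ₁ over training.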
Submission history
From: Daniele Musso
[v1] Fri, 16 Jul 2021 08:17:32 GMT (1170kb,D)
[v2] Wed, 28 Jul 2021 06:21:32 GMT (1169kb,D)