Computer Science > Neural and Evolutionary Computing
Title: Gaining the Sparse Rewards by Exploring Binary Lottery Tickets in Spiking Neural Network
(Submitted on 23 Sep 2023 (this version), latest version 28 Mar 2024 (v3))
Abstract: The Spiking Neural Network (SNN), a brain-inspired strategy, has attracted considerable attention for the high sparsity and low power consumption that arise from its inherent spike-based information representation. To further improve SNN efficiency, several works claim that the Lottery Ticket (LT) Hypothesis, which states that an Artificial Neural Network (ANN) contains sparse subnetworks matching the performance of the original network, also holds for SNNs. However, the spiking information handled by SNNs has a natural similarity and affinity with binarization in the context of sparsification. Therefore, to further explore SNN efficiency, this paper focuses on (1) whether LTs exist in binary SNNs, and (2) whether the spiking mechanism is a superior strategy for handling binary information compared with simple model binarization. To verify these assumptions, a sparse training method is proposed to find Binary Weights Spiking Lottery Tickets (BinW-SLT) under different network structures. Through comprehensive evaluations, we show that BinW-SLT attains up to +5.86% and +3.17% improvement on CIFAR-10 and CIFAR-100 over binary LTs, and achieves 1.86x and 8.92x energy savings compared with full-precision SNN and ANN, respectively.
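The lottery-ticket search the abstract alludes to is typically built on magnitude pruning: keep the largest-magnitude weights as the ticket's mask, then (for a binary ticket) constrain the surviving weights to {-1, +1}. The sketch below is a minimal, hedged illustration of that idea in NumPy; it is not the paper's BinW-SLT training procedure, and the function names (`magnitude_prune_mask`, `binarize`) are our own for illustration.

```python
import numpy as np

def magnitude_prune_mask(weights, sparsity):
    """Return a boolean mask keeping the largest-magnitude entries.

    A fraction `sparsity` of the entries (the smallest in magnitude)
    is pruned, as in standard lottery-ticket magnitude pruning.
    """
    flat = np.abs(weights).ravel()
    k = int(round(len(flat) * sparsity))
    if k == 0:
        return np.ones(weights.shape, dtype=bool)
    # k-th smallest magnitude acts as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    return np.abs(weights) > threshold

def binarize(weights, mask):
    """Binarize surviving weights to {-1, +1} by sign; pruned entries are 0."""
    return np.sign(weights) * mask

# Toy example: prune half of a random 4x4 weight matrix, then binarize.
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
mask = magnitude_prune_mask(w, sparsity=0.5)
bw = binarize(w, mask)
```

In an actual lottery-ticket pipeline this mask would be found after training, the surviving weights rewound to their initialization, and the subnetwork retrained; the binary variant additionally replaces full-precision values with their signs.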
Submission history
From: Hao Cheng [view email]
[v1] Sat, 23 Sep 2023 08:24:36 GMT (3491kb,D)
[v2] Thu, 28 Sep 2023 15:20:37 GMT (3491kb,D)
[v3] Thu, 28 Mar 2024 02:24:38 GMT (4628kb,D)