Quantitative Biology > Neurons and Cognition
Title: Passive nonlinear dendritic interactions as a general computational resource in functional spiking neural networks
(Submitted on 26 Apr 2019 (v1), last revised 14 Aug 2020 (this version, v2))
Abstract: Nonlinear interactions in the dendritic tree play a key role in neural computation. Nevertheless, modeling frameworks aimed at the construction of large-scale, functional spiking neural networks, such as the Neural Engineering Framework, tend to assume a linear superposition of post-synaptic currents. In this paper, we present a series of extensions to the Neural Engineering Framework that facilitate the construction of networks incorporating Dale's principle and nonlinear conductance-based synapses. We apply these extensions to a two-compartment LIF neuron that can be seen as a simple model of passive dendritic computation. We show that it is possible to incorporate neuron models with input-dependent nonlinearities into the Neural Engineering Framework without compromising high-level function and that nonlinear post-synaptic currents can be systematically exploited to compute a wide variety of multivariate, bandlimited functions, including the Euclidean norm, controlled shunting, and non-negative multiplication. By avoiding an additional source of spike noise, the function-approximation accuracy of a single layer of two-compartment LIF neurons is on a par with or even surpasses that of two-layer spiking neural networks up to a certain target function bandwidth.
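The "controlled shunting" mentioned in the abstract arises because conductance-based synapses scale, rather than merely add, currents in the dendritic compartment. The following minimal sketch illustrates this effect with a two-compartment LIF neuron in normalised units; the function name, parameters, and reversal potentials are illustrative choices, not taken from the paper:

```python
def two_comp_lif_rate(g_e, g_i, t_sim=1.0, dt=1e-4):
    """Firing rate of a toy two-compartment LIF neuron whose passive
    dendritic compartment receives excitatory and inhibitory
    conductance-based input (g_e, g_i). Euler integration, normalised
    units; all parameters are illustrative."""
    e_e, e_i, e_l = 4.0, -1.0, 0.0   # excitatory/inhibitory/leak reversals
    g_l, g_c = 1.0, 1.0              # leak and inter-compartment coupling
    tau, v_th, v_reset = 20e-3, 1.0, 0.0
    tau_ref = 2e-3                   # refractory period
    v_d = v_s = e_l                  # dendritic and somatic potentials
    refractory, n_spikes = 0.0, 0
    for _ in range(int(t_sim / dt)):
        # Passive dendrite: leak, conductance-based synapses, coupling.
        # Note g_e and g_i multiply the membrane potential, so inhibition
        # divisively "shunts" the excitatory drive rather than subtracting
        # a fixed current.
        dv_d = (g_l * (e_l - v_d) + g_e * (e_e - v_d)
                + g_i * (e_i - v_d) + g_c * (v_s - v_d)) / tau
        # Somatic LIF compartment driven by the coupling current.
        dv_s = (g_l * (e_l - v_s) + g_c * (v_d - v_s)) / tau
        v_d += dt * dv_d
        if refractory > 0.0:
            refractory -= dt
        else:
            v_s += dt * dv_s
            if v_s >= v_th:          # threshold crossing: emit a spike
                v_s = v_reset
                refractory = tau_ref
                n_spikes += 1
    return n_spikes / t_sim
```

With these (arbitrary) parameters, adding inhibitory conductance lowers the somatic equilibrium potential below threshold and silences the cell, even though the excitatory drive is unchanged; this input-dependent nonlinearity is what a purely additive current-based synapse model cannot capture.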
Submission history
From: Andreas Stöckel
[v1] Fri, 26 Apr 2019 08:32:29 GMT (1981kb)
[v2] Fri, 14 Aug 2020 01:38:19 GMT (2207kb)