I’m working my way through the excellent textbook “Neuronal Dynamics” by Gerstner et al. to arrive at a closed-form solution for how the rates of LIF networks depend on the stimulus amplitude (given a bunch of other parameters).

Chapter 12 does a good job of building towards an answer to this question and arrives at something called “the Siegert formula”:

$$\nu = \left[\Delta^{\mathrm{abs}} + \tau_m \sqrt{\pi} \int_{(u_r-\mu)/\sigma}^{(\vartheta-\mu)/\sigma} e^{x^2}\,\bigl[1+\operatorname{erf}(x)\bigr]\,\mathrm{d}x\right]^{-1}$$

(eq. 8.54 / eq. 12.26 in the book)

Currently I’m implementing this in Python and comparing the prediction to the results of Brian simulations. I’m using scipy’s integrate.quad function to compute the integral… but I’m running into numerical issues for small noise values.
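Here’s a minimal sketch of the kind of thing I mean (the parameter names and values are made up for illustration, dimensionless voltages — the failure mode is the same with my real parameters):

```python
import numpy as np
from math import erf
from scipy.integrate import quad

# Made-up LIF parameters for illustration (dimensionless voltage scale):
tau_m, t_ref = 0.02, 0.002     # membrane time constant, refractory period (s)
V_th, V_reset = 1.0, 0.0       # threshold and reset
mu, sigma = 2.0, 0.01          # mean drive and (small!) noise amplitude

def siegert_integrand(s):
    # Direct transcription of the integrand in eq. 12.26:
    # exp(s**2) overflows float64 for |s| > ~26.6, while (1 + erf(s))
    # underflows to exactly 0.0 for s < ~-6, so the product turns into
    # inf * 0.0 = nan long before quad can do anything sensible.
    return np.exp(s ** 2) * (1.0 + erf(s))

lower = (V_reset - mu) / sigma      # -200 for these values
upper = (V_th - mu) / sigma         # -100
print(siegert_integrand(upper))     # nan  (inf * 0.0)
val, err = quad(siegert_integrand, lower, upper)  # also comes out as nan
```

So small sigma pushes the integration limits far out, and the integrand is evaluated exactly where float64 gives up.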

I think the issue arises from integrating an \exp(x^2) term that explodes beyond numerical precision for large |x| (or, in the 1/\exp(x^2) form, underflows to zero), but I’m not sure.

It looks like there’s an alternate, approximate formulation of this relationship proposed here: “Noisy Softplus: an activation function that enables SNNs to be trained as ANNs”

and I just found some discussion in “A.1. Numerical Evaluation of the Siegert Formula” of “Integration of Continuous-Time Dynamics in a Spiking Neural Network Simulator”, where the authors mention:

> We here describe how to numerically calculate Equation (27), frequently called Siegert formula in the literature (for a recent textbook see Gerstner et al., 2014), which is not straightforward due to numerical instabilities in the integral.

Have others in the Brian community tried this before? Is there an existing (preferably Python) implementation of this we could use “off the shelf”?

If I can replicate the Diesmann et al. solution, would it be useful to have as:

- a short python code snippet hosted on github and referenced in Brian documentation?
- a short (jupyter) notebook demonstrating how to predict firing rate?
- a miscellaneous utility function added to the Brian codebase?