# Description of problem

Hello everyone, I’m trying to implement a model with additive noise: a zero-mean, unit-variance normal random variable multiplied by a model parameter, sigma. I’m having trouble implementing a noise source whose variance is independent of the integration step.

# Minimal code to reproduce problem

```python
import brian2 as br
from scipy.stats import norm

dt = 0.1
T = 10
sigma = 1

br.start_scope()
br.defaultclock.dt = dt * br.ms
duration = T * br.ms

eqs = '''
dphi/dt = sigma_int * xi : 1
sigma_int : hertz ** 0.5
'''
neurons = br.NeuronGroup(100, eqs, method='heun')
neurons.sigma_int = sigma * br.Hz ** 0.5
noise = br.StateMonitor(neurons, 'phi', record=True)
br.run(duration)
```


# What you have already tried

I first tried to record the values of xi with a Brian monitor in order to check its variance and see whether it depended on dt, but I could not get the monitor to work without errors. I then resorted to computing the noise variance with the following code (though I’m unsure of its correctness):

```python
import numpy as np  # needed for the flatten below

noise = (noise.phi[:, 1:] - noise.phi[:, :-1]) / dt
mu, std = norm.fit(np.ndarray.flatten(noise))
```


# Expected output

I was expecting std to be 1 independently of dt, since I kept sigma fixed at 1 for now.

# Actual output

std varied from 0.2 to 1 as dt changed (with dt values between 0.01 and 5).

Hi @pietro. Your implementation using xi is correct, and looking at the difference between time steps is one way of checking that it works correctly (even though I don’t think you can flatten it like this, since that introduces a big jump between the last value of one trajectory and the first value of the next). But your calculation is slightly wrong, I think: you seem to expect that the standard deviation of each step is the same after normalizing by dt, i.e. if one step has a standard deviation of 0.1 for a time step of 0.1 ms, a step of twice that size should have a standard deviation of 0.2. But the sum of two independent steps with a standard deviation of 0.1 each has a total standard deviation of about 0.141 (= 0.1·√2), not 0.2 – it is the variances that add up, not the standard deviations.
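This additivity of variances is easy to check numerically; a minimal sketch in plain NumPy (no Brian involved):

```python
import numpy as np

rng = np.random.default_rng(0)
# Two independent steps, each with standard deviation 0.1
steps = rng.normal(0.0, 0.1, size=(2, 1_000_000))
total = steps.sum(axis=0)
# Variances add: 0.1**2 + 0.1**2 = 0.02, so std = sqrt(0.02)
print(total.std())  # ≈ 0.141, not 0.2
```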
An alternative way to reason about noise (which I find more intuitive) is that we are modeling a Wiener process: after a certain time, we should obtain the same probability distribution of values. That is, if we simulate a trajectory multiple times and look at the values at time t, their distribution should be independent of the dt used for the simulation. This is the case for the model above, as shown by the plot produced by the code below (the right column shows, in black, the histogram of the final values of the trajectories and, in red, a fit of a normal distribution).

Printing the values confirms that everything works as expected:

```
dt=100. us, std dev: 0.315 (expected: 0.316)
dt=0.5 ms, std dev: 0.317 (expected: 0.316)
dt=1. ms, std dev: 0.316 (expected: 0.316)
dt=5. ms, std dev: 0.317 (expected: 0.316)
```
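The expected value follows from the Wiener-process scaling std(phi(T)) = sigma·√T; a quick sanity check of the number, assuming sigma = 1 Hz^0.5 and T = 100 ms as in the code below:

```python
import numpy as np

sigma = 1.0  # in Hz ** 0.5, so sigma**2 has units of 1/s
T = 0.1      # 100 ms, expressed in seconds
print(np.sqrt(sigma**2 * T))  # ≈ 0.316, matching the printed values
```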


For the full code, see below. Hope this helped!

Code
```python
import brian2 as br
import matplotlib.pyplot as plt
import numpy as np
from scipy.stats import norm

T = 100 * br.ms
sigma = 1
sigma_int = sigma * br.Hz ** 0.5
dt_values = [0.1, 0.5, 1.0, 5.0] * br.ms

results = []
for dt in dt_values:
    br.defaultclock.dt = dt
    eqs = '''
    dphi/dt = sigma_int * xi : 1
    '''
    neurons = br.NeuronGroup(10000, eqs, method='heun')
    noise = br.StateMonitor(neurons, 'phi', record=True)
    br.run(T)
    # A detail, but the monitor records at the start of the time step,
    # so we need to record an extra step to actually have a final value
    # at the same time point, not at t-dt
    noise.record_single_timestep()
    results.append((noise.t, noise.phi[:]))

fig, axs = plt.subplots(4, 2,
                        sharex="col", sharey=True,
                        gridspec_kw={"width_ratios": [3, 1]},
                        layout="constrained")
for idx, (dt, (t, result)) in enumerate(zip(dt_values, results)):
    axs[idx, 0].set_title(f'dt = {dt}')
    # Plot a subset of the trajectories
    axs[idx, 0].plot(t/br.ms, result[::100].T, alpha=0.1, color='black')
    axs[idx, 1].hist(result[:, -1], bins=100, orientation="horizontal",
                     histtype="step", color="black", density=True)
    # Fit a normal distribution to the final values
    mu, std = norm.fit(result[:, -1])
    print(f"dt={dt}, std dev: {std:.3f} (expected: {np.sqrt(sigma*float(T)):.3f})")
    axs[idx, 1].plot(norm.pdf(np.linspace(-1, 1, 100), mu, std),
                     np.linspace(-1, 1, 100), color="darkred", alpha=0.5)
    axs[idx, 1].axis("off")
axs[-1, 0].set_xlabel("Time (ms)")
plt.show()
```



Thanks! I now see where my attempt at analysing the random variable went wrong; thanks for the in-depth explanation. I also understand that xi behaves independently of dt, as I was expecting.
Still, I need some clarification: does my code correctly implement the unit-variance additive noise I need? My end goal is to control the noise amplitude in the model with the parameter sigma, and all my doubts about xi arose when my simulations of the full model seemed to display different noise than expected (just by eye inspection, I admit).

Running the following:

```python
import brian2 as br
import numpy as np
from scipy.stats import norm

dt = 0.1
T = 10
sigma = 1

br.start_scope()
br.defaultclock.dt = dt * br.ms
duration = T * br.ms

eqs = '''
dphi/dt = sigma_int * xi : 1
sigma_int : hertz ** 0.5
'''
neurons = br.NeuronGroup(100, eqs, method='heun')
neurons.sigma_int = sigma * br.Hz ** 0.5
noise = br.StateMonitor(neurons, 'phi', record=True)
br.run(duration)

noise = (noise.phi[:, 1:] - noise.phi[:, :-1]) / dt
mu, std = norm.fit(np.ndarray.flatten(noise))  # flattening should be ok as this is just a hist fit
var = std**2
```


with dt = 0.001, 0.01, 0.1, and 1, this yielded var = 1, 0.1, 0.01, and 0.001 respectively.
I think I understand that, in order to obtain the correct normal distribution of the trajectories’ end points, the random steps need to be scaled in size according to dt; please correct me if I’m wrong.

I can also post the full model and examples showing my problems, but maybe knowing the answer to the above question I can figure the rest out by myself.
Thanks again,

Pietro

I think your model does what you want, but “unit-variance noise” is not well defined in this context. What is normally meant by Gaussian noise in a stochastic differential equation is indeed \xi (= “xi”) in the way that Brian implements it. See also “Models and neuron groups” in the Brian 2 2.6.0 documentation.

There are other ways to simply add noise to a variable at every time step (e.g., in your model above: `neurons.run_regularly("phi += randn()*sigma")`), but this is usually a bad model (even though quite common), since it does not scale with dt. Then again, if all you are doing is simulating with a single dt, it is of course indistinguishable from a “formally correct” approach for suitable values of sigma.
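The difference between the two update rules is easy to see outside Brian; a sketch in plain NumPy (the variable names are illustrative, not Brian API):

```python
import numpy as np

rng = np.random.default_rng(0)
T, sigma = 0.1, 1.0  # total time 100 ms (in seconds), sigma in Hz ** 0.5
for dt in [1e-3, 1e-2]:
    n = int(round(T / dt))
    # "Naive" rule: add sigma * randn() every step -> total variance
    # is sigma**2 * T/dt, i.e. it depends on the time step
    naive = (sigma * rng.normal(size=(10_000, n))).sum(axis=1)
    # xi-style rule: add sigma * sqrt(dt) * randn() -> total variance
    # is sigma**2 * T, independent of the time step
    scaled = (sigma * np.sqrt(dt) * rng.normal(size=(10_000, n))).sum(axis=1)
    print(f"dt={dt}: naive std={naive.std():.2f}, scaled std={scaled.std():.3f}")
```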