# Description of problem

I was trying to reproduce the mentioned paper. Since the model is mostly based on the Brunel & Wang 2001 article, which has already been reproduced (Example: Brunel_Wang_2001 — Brian 2 2.5.1 documentation), I built my code upon that code. But I cannot obtain the figures from the Wang 2002 paper. Basically, after the stimulus disappears (in the delay period) I should obtain elevated persistent activity, but I do not. Also, due to the competition, only one of the selective populations' activity should reach the threshold while the other one dies out, but this is also not happening.

Here I will explain the code step-by-step and compare it with article, so that you may help me more easily.

# Expected output

You can see that on the left there is persistent activity during the delay period (after the second vertical line) for one of the selective populations, while on the right the other selective population has "lost", and there is not even any ramping activity. (Ramping activity = the activity of the left population during the stimulus period, i.e. between the vertical lines.)

# Step-by-step comparison

1- Equations

Since the model is built upon another article, which has already been reproduced in Brian2, I first compared the Wang 2002 article with the Brunel & Wang 2001 article. I thought I could use the reproduced code if the equations of the models are the same.

Flow:
First, equations from Brunel & Wang 2001, then, Wang 2002.

As far as I noticed, there is only one difference: there is no summation term for s_ext,AMPA in Wang 2002. What does this mean? Is it just a typo?

2- Parameters

Flow:
First I will quote the parameters from the paper (in bold), which are mostly from the last part of the article, "Experimental Procedures".

Then I will quote the corresponding parameters from the code (in code blocks).

**τ_AMPA = 2 ms**

```python
tau_AMPA = 2. * ms
```

**In the case of external AMPA currents, the spikes are emitted according to a Poisson process with rate ν_ext = 2.4 kHz independently from cell to cell.**

The rate and the number of external synapses are not explicitly stated in the Wang 2002 article, so I took these parameters from Brunel 2001.

```python
rate = 3 * Hz
C_ext = 800
```

**τ_NMDA,decay = 100 ms, α = 0.5 ms⁻¹, and τ_NMDA,rise = 2 ms.**

```python
tau_NMDA_rise = 2. * ms
tau_NMDA_decay = 100. * ms
alpha = 0.5 / ms
```

**τ_GABA = 5 ms**

```python
tau_GABA = 5. * ms
```

**All synapses have a latency of 0.5 ms.**

```python
C_E_E = Synapses(P_E, P_E, model=eqs_glut, on_pre=eqs_pre_glut, method='euler', delay=0.5*ms)
C_E_I = Synapses(P_E, P_I, model=eqs_glut, on_pre=eqs_pre_glut, method='euler', delay=0.5*ms)
C_I_I = Synapses(P_I, P_I, on_pre=eqs_pre_gaba, method='euler', delay=0.5*ms)
C_I_E = Synapses(P_I, P_E, on_pre=eqs_pre_gaba, method='euler', delay=0.5*ms)
```

**V_e = 0 mV, V_i = -70 mV.**

```python
V_E = 0. * mV
V_I = -70. * mV
```

**[Mg2+] = 1 mM.**

```python
Mg2 = 1.
```
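For context, this `Mg2` value enters the NMDA current through its voltage-dependent magnesium block. Here is a quick numpy sketch of that factor, assuming the standard Jahr–Stevens form used in the Brian2 Brunel & Wang 2001 example, 1 / (1 + Mg2·exp(-0.062·V/mV) / 3.57); check it against your own equations:

```python
import numpy as np

Mg2 = 1.0  # [Mg2+] in mM, as above

def nmda_block(v_mv):
    """Voltage-dependent Mg2+ block factor of the NMDA current
    (Jahr-Stevens form, as in the Brian2 Brunel & Wang 2001 example)."""
    return 1.0 / (1.0 + Mg2 * np.exp(-0.062 * v_mv) / 3.57)

print(nmda_block(-70.0))  # near rest: mostly blocked (~0.04)
print(nmda_block(0.0))    # depolarized: largely unblocked (~0.78)
```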

**for pyramidal cells, g_ext,AMPA = 2.1, g_rec,AMPA = 0.05, g_NMDA = 0.165, and g_GABA = 1.3; for interneurons, g_ext,AMPA = 1.62, g_rec,AMPA = 0.04, g_NMDA = 0.13, and g_GABA = 1.0.**

```python
g_AMPA_ext_E = 2.1 * nS
g_AMPA_rec_E = 0.05 * nS
g_NMDA_E = 0.165 * nS
g_GABA_E = 1.3 * nS

g_AMPA_ext_I = 1.62 * nS
g_AMPA_rec_I = 0.04 * nS
g_NMDA_I = 0.13 * nS
g_GABA_I = 1 * nS
```

**Inside a selective population, w = w+. Unless specified otherwise, I used w+ = 1.7. Between two different selective populations, and from the nonselective population to selective ones, w = w- [w- = 1 - f(w+ - 1)/(1 - f)]. Other connections have w = 1.**
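To make the w- formula concrete, here is a quick check using w+ = 1.7 from the quote above and assuming f = 0.15 (the value used elsewhere in this thread):

```python
f = 0.15      # fraction of excitatory neurons per selective population (assumed)
w_plus = 1.7  # potentiated weight, as quoted above

# w- = 1 - f*(w+ - 1)/(1 - f)
w_minus = 1 - f * (w_plus - 1) / (1 - f)
print(w_minus)  # ~0.876
```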

External Input:

From Wang 2002:

I wanted first to try to reproduce the model in the simpler case, therefore I considered this statement: "Therefore, external stimuli cannot be the main source of randomness in the decision-making process of this model. To confirm this conclusion, I simulated the model without stochastic fluctuations in the stimuli (std = 0, thus s_A(t) = s_B(t) = const.). In this case, the network's behavior is essentially unchanged." And therefore I made the external inputs constant.

First second: just background.

Second and third seconds: background + external stimulus.

Fourth second: just background.

```python
stimuli1 = TimedArray(np.r_[np.zeros(1), np.ones(2)*45, np.zeros(1)]*Hz, dt=1000*ms)
stimuli2 = TimedArray(np.r_[np.zeros(1), np.ones(2)*35, np.zeros(1)]*Hz, dt=1000*ms)
```
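As a Brian-free sanity check of this schedule, here is a plain-numpy sketch of how a lookup like `stimuli1(t)` maps times to rates with dt = 1 s (`rate_at` is a hypothetical helper that mimics the TimedArray indexing; it is not part of Brian):

```python
import numpy as np

# Stimulus time course (in Hz), one entry per second:
# background, stimulus, stimulus, background
rates_A = np.r_[np.zeros(1), np.ones(2) * 45, np.zeros(1)]

def rate_at(t_seconds, rates, dt=1.0):
    # Index by floor(t/dt), clamped to the last entry
    idx = min(int(t_seconds // dt), len(rates) - 1)
    return rates[idx]

print(rate_at(0.5, rates_A))  # 0.0  -> background only
print(rate_at(1.5, rates_A))  # 45.0 -> stimulus period
print(rate_at(3.5, rates_A))  # 0.0  -> delay period
```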

```python
C_selection = int(f * C_ext)
P1 = PoissonGroup(C_selection, rates='stimuli1(t)')
P2 = PoissonGroup(C_selection, rates='stimuli2(t)')
```

(The `C_selection` line was not stated in the Wang 2002 article, and I also couldn't find this information in the Brunel 2001 article, though I should say that I did not spend much time on Brunel 2001. So I just took this line of code from the Brunel 2001 reproduction in the documentation.)

Background Input and I to I connections:

In the Wang 2002 article, the figure shows that noise (background input) is given only to the excitatory neuron groups, and that the interneuron population has no recurrent connectivity:

But this is not the case in Brunel 2001 article:

Both articles state that they only used AMPA for external inputs for most of the simulations.

I always tried the configuration in which both interneurons and excitatory neurons receive background input.

```python
C_P_E = PoissonInput(P_E, 's_AMPA_ext', C_ext, rate, '1')
C_P_I = PoissonInput(P_I, 's_AMPA_ext', C_ext, rate, '1')
```

Although the figure in the Wang 2002 article doesn't show recurrent interneuron connectivity, he states it in plain text: "The network is endowed with pyramid-to-pyramid, pyramid-to-interneuron, interneuron-to-pyramid, and interneuron-to-interneuron connections."

Thoughts:

Since the previously reproduced paper is most probably correct, and I most probably copied the parameters correctly, I think the problem with this code lies either in the background or the external input. Or maybe the aforementioned missing summation term is not just a typo. I have uploaded the .py file too.

Thank you very much in advance!

Wang2002.py (6.0 KB)

Nice write-up! We should probably have a dedicated (sub)category for replication attempts.
Just a little comment on formatting: you can enclose Python code in triple backticks like this:

```python
# Python code here
```

It then gets nicely formatted, and characters like * won't mess with the markdown formatting.
Coming back to the science: I agree (also judging from your result) that there is something about the external input that is not quite correct. The code in your Python file uses

```python
P1 = PoissonGroup(C_ext, rates='stimuli1(t)')
P2 = PoissonGroup(C_ext, rates='stimuli2(t)')
```

which would be an input of 800·40Hz = 32000Hz, which is a bit extreme. In the code posted above, you use instead

```python
C_selection = int(f * C_ext)
P1 = PoissonGroup(C_selection, rates='stimuli1(t)')
P2 = PoissonGroup(C_selection, rates='stimuli2(t)')
```


which would mean 800·0.15·40Hz = 4800Hz, which is still a lot. I'm not actually sure whether the C_selection thing was mentioned in the Brunel 2001 paper, or whether it was an inference to get meaningful firing rates. In the Wang 2002 paper, I'd say that he simply adds a single ~40Hz input to each neuron receiving external input, i.e. you should rather use:

```python
P1 = PoissonGroup(N_sub, rates='stimuli1(t)')
P2 = PoissonGroup(N_sub, rates='stimuli2(t)')
# ...
S1.connect(j='i')
S2.connect(j='i')
```


The `j='i'` means to use one-to-one connections, so that each neuron gets its own independent realization of a Poisson process. By making this change, the results look rather reasonable (although they do not show much persistent activity), I'd say:

I don't think this is a typo. It is just that you can phrase the external input as 800 cells firing at 3Hz (as in Brunel & Wang 2001) and then sum these inputs, or you can directly state the input as one spike train at 2400Hz as done here. Mathematically both are equivalent, but in a time-step-based simulation there is a small difference: in the first formulation there can be more than one spike per time step.

Thank you very much, both for the formatting suggestion and for the advice on the external input.

Adding a single ~40Hz to each neuron makes sense. I have done some simulations with this correction. Here are the results:

Background input as single 2400Hz train:

```python
C_P_E = PoissonInput(P_E, 's_AMPA_ext', 1, 2400*Hz, '1')
C_P_I = PoissonInput(P_I, 's_AMPA_ext', 1, 2400*Hz, '1')
```


In this case both the elevated persistent activity and the competing behavior emerge. (The green one is selective population 2.)

But when I use 800 spike trains with ~3 Hz:

```python
C_P_E = PoissonInput(P_E, 's_AMPA_ext', C_ext, rate, '1')
C_P_I = PoissonInput(P_I, 's_AMPA_ext', C_ext, rate, '1')
```


Note: I tried dt = 0.02 ms and dt = 0.1 ms, no difference.

So I couldn't understand why having more than one input spike per time step prevents the elevated persistent activity, thus changing the behavior of the system. What do you think?

I'm surprised that it makes such a difference; the system is very sensitive to its input, apparently… But then, the Wang 2002 paper never uses the "800 neurons firing at 3Hz" formulation, right? So if the 2400Hz formulation reproduces the results, then I guess that's fine.

In both cases, we will get on average 2400 spikes per second, but in the 800×3Hz formulation there will be a slightly higher fluctuation around that average. This depends on the time step, since the variance of the binomial distribution is given by n·p·(1−p), where p is the probability that a Poisson "neuron" spikes during a time step, i.e. p = r·dt. This leads to a dependence of the variance on the time step; for very small time steps the two formulations become similar:

The time step also makes a difference for how many spikes arrive at exactly the same time: in the 1×2400Hz formulation it is never more than one (if dt gets close to 1/2400Hz or even bigger, then everything breaks down and too few spikes are delivered in total), but in the 800×3Hz formulation there can be more than one. With a very small time step it is very unlikely that more than one spike arrives in a single time step, so the two results get closer in that regard as well.
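This variance argument can also be checked directly with numpy, without running Brian at all (a quick sketch; the 0.1 ms time step and drawing per-step spike counts from a binomial distribution are the only assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 1e-4          # 0.1 ms time step, in seconds
steps = 1_000_000  # number of simulated time steps

# Spikes per time step under the two formulations:
counts_1x2400 = rng.binomial(1, 2400 * dt, size=steps)  # one train at 2400 Hz
counts_800x3 = rng.binomial(800, 3 * dt, size=steps)    # 800 trains at 3 Hz

# Both deliver ~2400 spikes per second on average...
print(counts_1x2400.mean() / dt, counts_800x3.mean() / dt)
# ...but the per-step variance n*p*(1-p) differs: 0.24*0.76 vs 800*3e-4*(1 - 3e-4)
print(counts_1x2400.var(), counts_800x3.var())
# And only the 800x3Hz formulation can deliver more than one spike per step:
print(counts_1x2400.max(), counts_800x3.max())
```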

See this very simple example:

```python
from brian2 import *
import numpy as np
import matplotlib.pyplot as plt

group = NeuronGroup(10000, '''x : 1
                              y : 1''')
inp_1 = PoissonInput(group, 'x', 1, 2400*Hz, '1')
inp_2 = PoissonInput(group, 'y', 800, 3*Hz, '1')
mon = StateMonitor(group, ['x', 'y'], record=True)
run(1000*ms)

fig, axs = plt.subplot_mosaic('''AB
                                 AC''')
axs['A'].hist(group.x, bins=np.arange(2250, 2551, 5), histtype='step', label='1·2400Hz')
axs['A'].hist(group.y, bins=np.arange(2250, 2551, 5), histtype='step', label='800·3Hz')
axs['A'].legend()
axs['B'].plot(mon.t[:-1]/ms, np.diff(mon.x[0]))
axs['B'].set(xlim=(500, 525), ylim=(0, 4))
axs['C'].plot(mon.t[:-1]/ms, np.diff(mon.y[0]), color='C1')
axs['C'].set(xlim=(500, 525), ylim=(0, 4))
plt.show()
```


for the default time step of 0.1ms (right plots are zoomed in for 25ms):

and for 0.01ms:

[Note that in the legend of the figures there's a typo: "3·800Hz" should be "800·3Hz"]
This reply was longer than intended, but I needed to clarify things for myself.

Thank you very much for the detailed response! I won't be able to look at it thoroughly for a few days, but I will.

Just wanted to write: after I obtained the results in my previous reply, I noticed something about the Brunel 2001 paper and its reproduction.

Reproduction:

Paper:

In the paper it is stated that 800 × 3 Hz is used, and the reproduction imitates this, but there is no elevated persistent activity in the reproduction. When I changed the input to 2400 × 1 Hz, I obtained the persistent activity, and thus results more similar to the paper:

When I used the RK2 method and dt = 0.1 ms as the paper says (though they mention a modified version of RK2), with the 800×3Hz version, I still couldn't observe persistent activity:

So I couldn't understand how they obtained these results with the 800×3Hz and dt = 0.1 ms configuration. Perhaps I should also try the mentioned modified RK2 method. (And I need to read the article carefully one more time.) Do you have any ideas?

And I also think it would be better to use the two "distractor" inputs; as far as I understood, the reproduction doesn't do that currently.

Unfortunately, there's no "official" code available for the Brunel & Wang 2001 paper, so we can only guess what they did. It is possible that they actually used something equivalent to the 2400×1Hz implementation (which would make sense given that this is apparently what is used in Wang 2002). I am not an expert on the example model in the Brian documentation; it was contributed by a user (zifeo, not on this forum as far as I can see). Maybe consider writing to Nicolas Brunel? It's a paper from a long time ago, but he might still have his original C file (or whatever he used) lying around. And of course we'd be happy to receive corrections/additions for the model in the documentation.