Description of problem
I was trying to create some synapses and store them in a list. I was able to create each synapse one at a time, but when I created them in a for loop, the results were different.
My network is simple:
Layer G has two neurons, 0 and 1; neuron #1 is not connected to any other neuron.
Layer H has three neurons: 0, 1, and 2.
Synapses #1: G_0 is connected to H_0 and H_1 with weights of 0 and 0.2, respectively.
Synapses #2: G_0 is connected to H_2 with a weight of 0.01.
G_0 fires periodically. If everything works as expected, H_0 never fires, and H_1's voltage increases faster than H_2's.
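To spell out that expectation, here is a tiny standalone sketch (plain NumPy, my own illustration rather than part of the Brian2 script; the 10 ms inter-spike interval is just an assumed round number): each G_0 spike adds w to v_post, and between spikes v decays exponentially since I = 0 in layer H.

import numpy as np

# Hypothetical illustration: voltage of a leaky neuron (I = 0) receiving
# periodic input spikes, each adding weight w; between spikes the voltage
# decays as v(t) = v0 * exp(-t / tau).
def v_after_n_spikes(w, tau_ms, isi_ms, n):
    v = 0.0
    for _ in range(n):
        v = v * np.exp(-isi_ms / tau_ms) + w  # decay over one ISI, then jump
    return v

print(v_after_n_spikes(0.2,  100, 10, 5))   # H_1-like: large jumps
print(v_after_n_spikes(0.01, 100, 10, 5))   # H_2-like: 20x smaller jumps
print(v_after_n_spikes(0.0,  10,  10, 5))   # H_0-like: stays at 0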
Minimal code to reproduce problem
from brian2 import *

eqs = '''
dv/dt = (I-v)/tau : 1
I : 1
tau : second
'''

G = NeuronGroup(2, eqs, threshold='v>1', reset='v = 0', method='exact')
G.I = [2, 2]
G.tau = [10, 10]*ms

H = NeuronGroup(3, eqs, threshold='v>1', reset='v = 0', method='exact')
H.I = [0, 0, 0]
H.tau = [10, 100, 100]*ms

J = [[0, 1], [2]]       # connection list
W = [[0, 0.2], [0.01]]  # weight list
S = [0]*2               # list to store synapses

# The for loop below was supposed to create the synapses, but it never worked out properly
# for syn in range(2):
#     S[syn] = Synapses(G, H, model='w : 1', on_pre='v_post += w')
#     S[syn].connect(i=0, j=J[syn])
#     S[syn].w = W[syn]

# Creating the synapses one by one worked:
S1 = Synapses(G, H, model='w : 1', on_pre='v_post += w')
S1.connect(i=0, j=J[0])
S1.w = W[0]

S2 = Synapses(G, H, model='w : 1', on_pre='v_post += w')
S2.connect(i=0, j=J[1])
S2.w = W[1]

M = StateMonitor(H, 'v', record=True)
run(50*ms)

plot(M.t/ms, M.v[0], label='Neuron 0')
plot(M.t/ms, M.v[1], label='Neuron 1')
plot(M.t/ms, M.v[2], label='Neuron 2')
xlabel('Time (ms)')
ylabel('v')
legend()
show()
What you have already tried
If I don't use indexing to store the synapses and instead append them to a list, only the synapse created in the last iteration functions properly.
S = []
for syn in range(2):
    S_tmp = Synapses(G, H, model='w : 1', on_pre='v_post += w')
    S_tmp.connect(i=0, j=J[syn])
    S_tmp.w = W[syn]
    S.append(S_tmp)
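For reference, a sanity check that could be dropped in after the loop (my addition, not part of the original script) to confirm the connections and weights are stored as intended:

# Hypothetical diagnostic: print each stored Synapses object's
# presynaptic indices, postsynaptic indices, and weights.
for idx, syn in enumerate(S):
    print(idx, syn.i[:], syn.j[:], syn.w[:])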
Expected output (if relevant)
Actual output (if relevant)
None of the neurons in layer H fired; all three voltage traces stayed flat at 0 for the entire run.