Hi everyone,
General description
I’m playing with a minimal triplet-STDP learning rule.
For each presynaptic and postsynaptic neuron, there are low-pass filters of the spike trains, which drive the change of the synaptic weight.
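For reference, here is my reconstruction of the equations (following the minimal triplet rule of Pfister & Gerstner, 2006; the time constants and amplitudes are named to match the code below):

$$
\begin{aligned}
\frac{dr_1}{dt} &= -\frac{r_1}{\tau_{1+}} + S_{pre}(t), \qquad
\frac{do_1}{dt} = -\frac{o_1}{\tau_{1-}} + S_{post}(t), \qquad
\frac{do_2}{dt} = -\frac{o_2}{\tau_{2+}} + S_{post}(t),\\[4pt]
\frac{dw}{dt} &= -A_m\, o_1(t)\, S_{pre}(t) + A_p\, r_1(t)\, o_2(t-\epsilon)\, S_{post}(t),
\end{aligned}
$$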
where S_{pre} and S_{post} are the presynaptic and postsynaptic spike trains.
The small constant \epsilon indicates that the synaptic weight w is updated before o_2 is incremented.
1. What looks like the most efficient implementation
The most efficient way seems to be to implement the r_1, o_1, and o_2 equations on the neuron side:
equ = """
dv/dt = ....
dr1/dt = -r1/tau1p : 1
do1/dt = -o1/tau1m : 1
do2/dt = -o2/tau2p : 1
"""
npop = NeuronGroup(..., equ, reset='v = vres; r1 += 1; o1 += 1; o2 += 1', ....)
and then use these variables in the synaptic equations:
s = Synapses(npop, npop, 'w : 1',
             on_pre='w = w - o1_post*Am',
             on_post='w = w + r1_pre*o2_post*Ap', ....)
However, it was not clear to me whether the neuron reset (which increments the traces) runs before or after the synaptic updates within a time step.
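For what it’s worth, my current understanding from the Brian 2 scheduling documentation (please correct me if this is wrong) is that the default network schedule runs the 'synapses' slot before the 'resets' slot, which would make implementation 1 safe:

```python
# Brian 2's default network schedule, as listed in the scheduling docs
# (written out by hand here, not imported from brian2):
default_schedule = ['start', 'groups', 'thresholds', 'synapses', 'resets', 'end']

# on_pre/on_post pathways run in the 'synapses' slot and resets in the
# 'resets' slot, so the reset-triggered trace increments should come
# after the synaptic weight updates within each time step.
print(default_schedule.index('synapses') < default_schedule.index('resets'))  # → True
```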
2. A very slow implementation, but with consistent results
I fixed this by moving the equations for r_1, o_1, and o_2 to the synaptic side and updating all trace variables after the synaptic weight update.
s = Synapses(npop, npop,
             """
             w : 1
             dr1/dt = -r1/tau1p : 1 (event-driven)
             do1/dt = -o1/tau1m : 1 (event-driven)
             do2/dt = -o2/tau2p : 1 (event-driven)
             """,
             on_pre="""
             w = w - o1*Am
             r1 += 1""",
             on_post="""
             w = w + r1*o2*Ap
             o1 += 1
             o2 += 1""", ....)
(Note that the traces are now synaptic variables, so they are referenced without the _pre/_post suffixes.)
However, this implementation is VERY slow, because instead of a few thousand differential equations I now have a few million (one set of traces per synapse instead of per neuron).
Final question
Is there any way to ask Brian to, within each time step:
- update all neurons,
- trigger all events,
- update all synapses,
- reset all neurons?
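To make the intended order concrete, here is a toy plain-Python sketch of the four steps above for a single synapse, independent of Brian (all parameters and spike times are made up for illustration):

```python
# Toy sketch of the desired per-step update order for one synapse.
dt = 1e-3                          # 1 ms time step
tau1p = tau1m = 20e-3              # trace time constants (made-up values)
tau2p = 40e-3
Am, Ap = 0.01, 0.005               # depression / potentiation amplitudes

pre_steps = {10, 30}               # steps at which the pre neuron spikes
post_steps = {20, 31}              # steps at which the post neuron spikes

r1 = o1 = o2 = 0.0
w = 0.5
for step in range(50):
    # 1. update all neurons: decay the traces
    r1 -= dt * r1 / tau1p
    o1 -= dt * o1 / tau1m
    o2 -= dt * o2 / tau2p
    # 2./3. trigger events and update synapses:
    #       weight updates see the trace values *before* the increments,
    #       i.e. o2 here plays the role of o2(t - epsilon)
    if step in pre_steps:
        w -= o1 * Am               # depression on a pre spike
    if step in post_steps:
        w += r1 * o2 * Ap          # triplet potentiation on a post spike
    # 4. reset all neurons: increment traces after the weight updates
    if step in pre_steps:
        r1 += 1.0
    if step in post_steps:
        o1 += 1.0
        o2 += 1.0

print(w)
```

With this particular spike pattern the depression term outweighs the triplet potentiation, so the weight ends up slightly below its initial value.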