Question about store() and restore()

Description of problem

I currently train a network by exposing it to a sample for a set time period, followed by a rest period. I think that store() and restore() could potentially be used to eliminate the need for the rest period, but the problem I am facing is this: if I use restore() to reset the state variables, won’t the weights also be reset, thus nullifying the training that has happened?

Minimal code to reproduce problem

I was thinking of using store and restore in the following way:

Initialize the network
...
store('initial state')
for i in train_items:
    ...train
    restore('initial state')

Alternatively, if I have the input for all samples in a single stimulus array, I would need to reset the state variables once every time_sample period. Is there a way to do this?


Hi @mallard1707. Indeed, using restore is like “going back in time”, so your training results would be gone, too. Resetting the state variables instead would be one option; you can use run_regularly for that, e.g.

neurons.run_regularly("v = E_L; I_syn = 0*nA", dt=100*ms)

would reset the membrane potential and the synaptic current every 100ms – obviously, the details depend on your model.

The other alternative is to use store/restore, but to manually handle the weights:

Initialize the network
...
store('initial state')
for i in train_items:
    ...train
    weights = neurons.w[:]  # save the weights
    restore('initial state')
    neurons.w = weights  # set the weights

Thank you, Marcel. I have a question: can I use run_regularly together with run?

net.run(duration)
net.run_regularly("v = E_L; I_syn = 0*nA", dt=duration/100)

Yes, definitely – in fact, run_regularly without a run call will not do anything. With a run_regularly, you are setting up a part of your model, in the same way that you define equations, etc. Note that run_regularly is a method of a NeuronGroup (or Synapses), not of the network. If you want to have it executed during the run, you need to define it before the call to run. Here’s a trivial example:

from brian2 import *
import matplotlib.pyplot as plt

G = NeuronGroup(1, "dv/dt = -v/(10*ms) : 1")
# Set the membrane potential to a random value every 20ms
G.run_regularly("v = rand()", dt=20*ms)
mon = StateMonitor(G, "v", record=0)
run(100*ms)
plt.plot(mon.t/ms, mon.v[0])
plt.show()

[Figure 1: membrane potential trace, jumping to a random value every 20 ms and decaying in between]

If you want to do something completely arbitrary during a simulation, i.e. something that you can express in Python code but not with the equation syntax, then you can use a network_operation.
