Dendrify Synaptic Weights Update

Hi everyone,

I am building a network using Dendrify and I have a question about how to update the weight of the synapse after adding it to a specific compartment.

Description of problem

My problem (or rather, observation) is that setting the weight parameter using the ‘parameters’ dictionary of the compartment does not seem to change the weight itself when I later print the parameter. See an example below.

Minimal code to reproduce problem

```python
import brian2 as b
from brian2.units import *
from dendrify import Soma, Dendrite, NeuronModel

b.prefs.codegen.target = 'numpy'

soma = Soma('soma', length=20*um, diameter=20*um,
            cm=1*uF/(cm**2), gl=40*uS/(cm**2),
            r_axial=150*ohm*cm, v_rest=-70*mV)

dend = Dendrite('dend', length=20*um, diameter=20*um,
                cm=1*uF/(cm**2), gl=40*uS/(cm**2),
                r_axial=150*ohm*cm, v_rest=-70*mV)
dend.synapse('AMPA', tag='L23E', g=1*nS, t_decay=2*ms)

print('Before modification:', dend)

# Trying to update the value of the synaptic weight parameter: w_AMPA_L23E_dend
dend.parameters['w_AMPA_L23E_dend'] = 20

print('After modification:', dend)
```

The print statements will show that the value of the parameter ‘w_AMPA_L23E_dend’ is still 1.0 after the attempt to change it.

To work around this, I tried changing the model itself after defining it with the two compartments (soma, dend). Then the parameter does change in the dictionary. So I added this code after the snippet above:

```python
from dendrify import NeuronModel

graph = [(soma, dend, 12*nS)]
model = NeuronModel(graph, scale_factor=5, spine_factor=1)

print('After changing the model itself:', model)
```

Actual output (if relevant)

The last snippet will print this information:

```
After changing the model itself:
OBJECT
------
<class 'dendrify.neuronmodel.NeuronModel'>

EQUATIONS
---------
dV_soma/dt = (gL_soma * (EL_soma-V_soma) + I_soma) / C_soma  :volt
I_soma = I_ext_soma + I_dend_soma   :amp
I_ext_soma  :amp
I_dend_soma = (V_dend-V_soma) * g_dend_soma  :amp

dV_dend/dt = (gL_dend * (EL_dend-V_dend) + I_dend) / C_dend  :volt
I_dend = I_ext_dend + I_soma_dend  + I_AMPA_L23E_dend  :amp
I_ext_dend  :amp
I_AMPA_L23E_dend = g_AMPA_L23E_dend * (E_AMPA-V_dend) * s_AMPA_L23E_dend * w_AMPA_L23E_dend  :amp
ds_AMPA_L23E_dend/dt = -s_AMPA_L23E_dend / t_AMPA_decay_L23E_dend  :1
I_soma_dend = (V_soma-V_dend) * g_soma_dend  :amp

PARAMETERS
----------
{'Alpha_NMDA': 0.062,
'Beta_NMDA': 3.57,
'EL_dend': -70. * mvolt,
'EL_soma': -70. * mvolt,
'E_AMPA': 0. * volt,
'E_Ca': 136. * mvolt,
'E_GABA': -80. * mvolt,
'E_K': -89. * mvolt,
'E_NMDA': 0. * volt,
'E_Na': 70. * mvolt,
'Gamma_NMDA': 0,
'Mg_con': 1.0,
'gL_dend': 2.51327412 * nsiemens,
'gL_soma': 2.51327412 * nsiemens,
'g_AMPA_L23E_dend': 1. * nsiemens,
'g_dend_soma': 12. * nsiemens,
'g_soma_dend': 12. * nsiemens,
't_AMPA_decay_L23E_dend': 2. * msecond,
'w_AMPA_L23E_dend': 20}

EVENTS
------
[]

EVENT CONDITIONS
----------------
{}
```

This seems to have changed the parameter, but I was wondering if this is the way that Dendrify is intended to be used when changing synaptic weights? Will this way of changing the weights work as expected in a more complex script, keeping the value that I set myself during the simulations? Would this mean that every time I need to change the synaptic weight parameter, I would need to update the model and recreate the neuron group using it? Or upon updating the model, will the neuron group using that model automatically update that value in the simulation?

Or perhaps there is a different way to do it at the moment when the synapse is defined for a compartment (i.e. on the line ‘dend.synapse’). I am asking this question because I am working on a model with many different populations which connect to each other, hence it would be very useful to understand the optimal method to update the synaptic weights.

Best,
Rares


Dear Rares,

Hi again!

When I first wrote Dendrify, one of my main goals was to encapsulate as many functionalities of Brian 2 as possible. However, I noticed that while it streamlined the model code significantly, it also limited Brian’s flexibility. Since Dendrify primarily targets Brian users, I decided to allow some things to be implemented through pure Brian code.

In the notebook below, you’ll see how to:

1. Combine Dendrify and Brian objects to implement a synaptic plasticity rule (STDP).
2. Use dendritic activity (instead of somatic activity) to update synaptic weights.

Dendritic STDP notebook
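For readers who want the gist before opening the notebook, here is a minimal pure-Python sketch of a pair-based STDP window of the kind the notebook combines with a Dendrify model. The time constants and amplitudes below are illustrative assumptions, not values from the notebook:

```python
import math

def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for a single pre/post spike pair.

    dt_ms = t_post - t_pre (in ms): a positive interval (pre before post)
    yields potentiation, a negative one yields depression, both decaying
    exponentially with the pair's temporal distance.
    """
    if dt_ms >= 0:
        return a_plus * math.exp(-dt_ms / tau_plus)   # potentiation branch
    return -a_minus * math.exp(dt_ms / tau_minus)     # depression branch
```

In Brian 2 the same rule is usually expressed with exponential trace variables updated in `on_pre`/`on_post`, which is what the notebook does; this function only shows the resulting weight-change window.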

If you have any further questions, please feel free to let me know!

Best regards,
Michalis

P.S. I’m still investigating your other question. You’ll hear from me ASAP (I’m still on holiday!)


Dear @mpagkalos ,

Thank you so much for your reply and the example code, always helpful! This is a very good reference for implementing synaptic plasticity in the model and I will definitely use it when I’m at that stage.

Right now, the phenomenon I am trying to model, and for which I need to find an optimal synaptic weight, is a specific post-synaptic potential (PSP) induced in the receiving cell by a spike arriving through a synapse (e.g. AMPA, NMDA) while the receiving cell is at rest. For a case with static weights during the simulation (no synaptic plasticity), it therefore seems to me that the following line, if written before defining a neuron group that uses the respective Dendrify model, should be good to use (but please let me know if not):

```python
model.add_params({'w_AMPA_L23E_dend': optimal_value})
```

This will mean that the ‘s_AMPA_L23E_dend’ will be multiplied by ‘optimal_value’ in the equation:

```
dV_dend/dt = (gL_dend * (EL_dend-V_dend) + I_dend) / C_dend  :volt
I_dend = I_ext_dend + I_AMPA_L23E_dend  :amp
I_ext_dend  :amp
I_AMPA_L23E_dend = g_AMPA_L23E_dend * (E_AMPA-V_dend) * s_AMPA_L23E_dend * w_AMPA_L23E_dend  :amp
ds_AMPA_L23E_dend/dt = -s_AMPA_L23E_dend / t_AMPA_decay_L23E_dend  :1
```
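To see why a static `w` behaves as expected here, note that it enters the current equation only as a multiplicative constant. The following pure-Python sketch (not Dendrify code; all parameter values are illustrative assumptions) integrates these equations with forward Euler and shows that the EPSP grows with `w`:

```python
def peak_epsp(w, g=1e-9, E=0.0, EL=-70e-3, gL=2.5e-9, C=50e-12,
              tau=2e-3, dt=1e-5, steps=2000):
    """Peak EPSP (V - EL) of a one-compartment neuron receiving a single
    exponential AMPA event at t=0 (the gating variable s jumps to 1)."""
    V, s = EL, 1.0
    peak = 0.0
    for _ in range(steps):
        I_ampa = g * (E - V) * s * w               # I = g*(E-V)*s*w
        V += dt * (gL * (EL - V) + I_ampa) / C     # membrane equation
        s += dt * (-s / tau)                       # ds/dt = -s/tau
        peak = max(peak, V - EL)
    return peak
```

A larger `w` yields a proportionally larger synaptic current, so the EPSP amplitude increases with `w` (slightly sub-linearly, because the driving force `E - V` shrinks as `V` rises toward `E`).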

Then, when implementing plasticity, I will adapt the code for that use-case, which explicitly uses custom equations for the Synapse object as you pointed out in the notebook.

Many thanks again.

Best,
Rares

Hi Rares,

I think I now understand what you are trying to accomplish and the solution is actually quite simple.

To calibrate the static ‘optimal’ synaptic weights, all you have to do is adjust the conductance of the synapse, as in the following example:

```python
dend = Dendrite('dend', cm_abs=50*pF, gl_abs=2.5*nS)
dend.synapse('AMPA', tag='x', g=6*nS,  t_decay=5*ms)
# dend.synapse('AMPA', tag='x', g=3*nS,  t_decay=5*ms)  -> smaller EPSPs
```

By adjusting the conductance (g) inside dend.synapse() you can calibrate the EPSP amplitude. Dendrify will automatically add this value to the parameters dictionary. By the way, adding rise kinetics will also impact the EPSP amplitude since the net synaptic charge will increase as well. See more here.
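The calibration itself can be automated: since the EPSP peak grows monotonically with `g`, a simple bisection search finds the conductance that produces a target amplitude. The sketch below is plain Python (not Dendrify), with illustrative single-compartment parameters loosely matching the example above; in practice you would measure the EPSP with a Brian `StateMonitor` instead:

```python
def epsp_peak(g, EL=-70e-3, E=0.0, gL=2.5e-9, C=50e-12,
              tau=5e-3, dt=1e-5, steps=4000):
    """Peak EPSP for one exponential-synapse event (forward Euler)."""
    V, s = EL, 1.0
    peak = 0.0
    for _ in range(steps):
        V += dt * (gL * (EL - V) + g * (E - V) * s) / C
        s -= dt * s / tau
        peak = max(peak, V - EL)
    return peak

def calibrate_g(target_mv, lo=0.0, hi=50e-9, iters=60):
    """Bisect on g until the simulated EPSP peak hits the target (in mV)."""
    target = target_mv * 1e-3
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if epsp_peak(mid) < target:
            lo = mid       # EPSP too small -> need more conductance
        else:
            hi = mid       # EPSP too large -> need less conductance
    return 0.5 * (lo + hi)
```

Bisection works here precisely because the peak is a monotone function of `g`; the same loop applies unchanged if the peak is measured from a full Dendrify/Brian simulation.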

Now let’s talk about plastic weights. Currently, Dendrify’s equations look something like this:

```
I_AMPA = g_AMPA * (E_AMPA - V) * s_AMPA * w_AMPA

on presynaptic spike:  s → s + a
```
For simplicity we initialize a = 1. To scale the synaptic current in response to some plasticity rule, you have two equivalent options:

1. Scale a (this is exactly what I do in the colab I shared above).
2. Scale w_AMPA

I would recommend going with option 1, since it is much easier to work with and integrates better with other Brian 2 objects. Option 2 would currently be a lot harder to implement and maintain in practice. In the current Dendrify version (2.1.4), w_AMPA is just a scale factor, not a state variable like I or V (which are stored as arrays); I could even remove it in future updates and nothing would change. As I mentioned, implementing plasticity rules directly in Dendrify was a feature I experimented with, but I did not find a good reason to justify the effort over simply using Brian’s Synapses.
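The equivalence of the two options follows directly from linearity: `s` scales linearly with the jump size `a`, and the current is `g*(E-V)*s*w`, so scaling `a` by some factor produces exactly the same current as scaling `w` by that factor. A small pure-Python sketch (with the driving force held fixed for clarity, and all numbers illustrative):

```python
def current_trace(a, w, g=1.0, drive=1.0, tau=2.0, dt=0.01, steps=500):
    """Exponential synapse: s jumps by `a` at t=0, then decays with tau.
    Returns the sampled current I = g * drive * s * w."""
    s, out = a, []
    for _ in range(steps):
        out.append(g * drive * s * w)   # current sample at this step
        s -= dt * s / tau               # ds/dt = -s/tau (forward Euler)
    return out

# Scaling a with w fixed gives the same current as scaling w with a fixed:
# current_trace(a=2, w=1) matches current_trace(a=1, w=2) sample by sample.
```

This is why option 1 loses nothing in expressiveness: any weight update written against `w` can be rewritten as an update of the jump size `a` in the `on_pre` pathway of a Brian `Synapses` object.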

Best,
Michalis


Hi Michalis,

That makes sense, thank you! I’ll just work with conductances for now as it seems much easier to manipulate and define during the definition of synapses.

Regarding plasticity, I’ll definitely go with the first option as you suggested.

Best wishes,
Rares
