 # Description of problem

Hello, I need to implement custom synaptic weight matrices in my network. I have found a way of doing this using a for loop that goes through all the source and target neurons in the network and sets the synaptic weight directly for each synapse. However, the loop takes a very long time to complete for a network of size 10^3. Is there a faster or more correct way of setting a custom synaptic weight matrix throughout the network? Is there a direct method to specify a custom synaptic weight matrix in the Synapses group?

# Minimal code to reproduce problem

```python
from brian2 import *

N = 4000  # number of neurons in network

eqs = """
dv/dt  = (ge+gi-(v-El))/taum : volt (unless refractory)
dge/dt = -ge/taue : volt
dgi/dt = -gi/taui : volt
"""

P = NeuronGroup(N, eqs, threshold='v>Vt', reset='v = Vr', refractory=5*ms,
                method='exact')

W = mtx   # N x N custom connectivity matrix

Ce = Synapses(P, P, 'w_e : volt', on_pre='ge += w_e')
Ce.connect(p=1.0)
for i in range(N):
    for j in range(N):
        msg = 'connecting neurons %s and %s' % (i, j)
        print(msg, end='\r')
        Ce.w_e[i, j] = W[i, j]*mV
```

Thank you for any advice you might have.

Prajay Shah

Hi Prajay,

I’m a Brian amateur, so I don’t know in advance whether or not this will be faster, but could you try the following (from the documentation on synapses):

```python
sources, targets = W.nonzero()
Ce_synapses = Synapses(P, P, 'w_e : volt', on_pre='ge += w_e')
Ce_synapses.connect(i=sources, j=targets)
```

At the very least, this will mean only the nonzero elements of W are actually connected (instead of some synapses being created with weight 0).

If the nonzero elements of W have different values from each other, you’ll need some way of setting those weight values individually, perhaps making creative use of the string specification for w (see the distance-dependent weight example in the documentation). But if they’re all the same, you could do something like

```python
w_val = 1.0  # or whatever the value is
Ce_synapses.w_e = w_val * mV
```
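If the nonzero weights do differ, one option (a sketch, not from this thread, assuming Brian2 creates synapses in the same order as the index arrays passed to `connect(i=..., j=...)`, which holds for explicit array arguments) is to pull the corresponding values out of W with the same index arrays:

```python
import numpy as np

# Example sparse weight matrix (hypothetical values)
W = np.array([[0.0, 0.5, 0.0],
              [1.2, 0.0, 0.0],
              [0.0, 0.0, 0.3]])

# The same index arrays used for Ce_synapses.connect(i=sources, j=targets)
sources, targets = W.nonzero()

# The weight of each created synapse, in creation order
weights = W[sources, targets]
print(weights)  # [0.5 1.2 0.3]

# In Brian2 (assuming synapse creation order follows the index arrays):
# Ce_synapses.w_e = weights * mV
```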

As another practical note, those print statements are likely to add a lot of time! Simply removing them (or having the option to turn them off when not debugging) might be sufficient to solve your problem.

I have looked into the .connect documentation, and the issue is that it only allows for binary 1 or 0 connections between sources and targets.

I have a weight matrix with a continuous distribution of synaptic weight values. My key constraint is that I have a very specific method of building the synaptic connectivity weight matrix (I take the outer product of two vectors), so I can’t think of any string specification for w that would allow me to build the matrix I need.
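For context, the outer-product construction described above might look like this (a minimal sketch with hypothetical vectors, not the poster's actual code):

```python
import numpy as np

N = 4
u = np.array([1.0, 0.5, 0.0, 2.0])  # hypothetical presynaptic factor
v = np.array([0.0, 1.0, 0.5, 1.0])  # hypothetical postsynaptic factor

# W[i, j] = u[i] * v[j]: a full N x N matrix of continuous weights
W = np.outer(u, v)
assert W.shape == (N, N)
assert W[1, 2] == u[1] * v[2]
```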

I wonder… would

```python
Ce_synapses.w_e = 'W[i,j]'
```

work? (you may need to multiply W by mV in advance to get the units right)

Hi @pshah95. Unfortunately we only mention the other direction (getting a weight matrix out of Brian’s internal structure) in the documentation, but if you have a fully connected network, then using a weight matrix to set the weights is actually quite simple. After connecting things you can use:

```python
Ce.w_e[:] = W.flatten()*mV
```

This works because the internal ordering of synapses is exactly the same as for a flattened matrix.

Hope that helps, best
Marcel

PS: It does exactly the same thing, but the more “canonical” way of asking for all-to-all connectivity would be `Ce.connect(True)` or `Ce.connect()` instead of `Ce.connect(p=1.0)`.


Thanks @mstimberg ! This is working great.