DimensionMismatchError in runtime mode but not in standalone mode

I encountered a DimensionMismatchError and any help is appreciated. The following is a minimal example.
The first script saves the synapses to a file:

from brian2 import *
import numpy as np 
eqs = '''
dv/dt = (I-v)/tau : 1
I : 1
tau : second
'''
G = NeuronGroup(2, eqs, threshold='v>1', reset='v = 0', method='exact')
G.I = [2, 0]
G.tau = [10, 100]*ms

# Comment these two lines out to see what happens without Synapses
S = Synapses(G, G, on_pre='v_post += 0.2')
S.connect(i=0, j=1)
S.delay = 10*ms
np.savez('test', i=S.i[:], j=S.j[:], delay=S.delay[:])

The second script loads the saved synapses and builds the connections:

from brian2 import *
import numpy as np 
#set_device('cpp_standalone', build_on_run=False)

eqs = '''
dv/dt = (I-v)/tau : 1
I : 1
tau : second
'''

G = NeuronGroup(2, eqs, threshold='v>1', reset='v = 0', method='exact')
G.I = [2, 0]
G.tau = [10, 100]*ms

# Comment these two lines out to see what happens without Synapses
S = Synapses(G, G, on_pre='v_post += 0.2')

trained_S = np.load('test.npz')
S.connect(i=trained_S['i'], j=trained_S['j'])
S.delay[:] = trained_S['delay']
                
M = StateMonitor(G, 'v', record=True)

run(100*ms)
#device.build(directory='test', compile=True, run=True, debug=False)

Running the second script works fine in standalone mode, but in runtime mode I get the following error.

   ---------------------------------------------------------------------------
    DimensionMismatchError                    Traceback (most recent call last)
    <ipython-input-2-0d678e5714b0> in <module>
         18 trained_S = np.load('test.npz')
         19 S.connect(i=trained_S['i'], j=trained_S['j'])
    ---> 20 S.delay[:] = trained_S['delay']
         21
         22 M = StateMonitor(G, 'v', record=True)

    ~/anaconda3/lib/python3.8/site-packages/brian2/core/variables.py in __setitem__(self, item, value)
        912
        913     def __setitem__(self, item, value):
    --> 914         self.set_item(item, value, level=1)
        915
        916     @device_override('variableview_set_with_expression')

    ~/anaconda3/lib/python3.8/site-packages/brian2/core/variables.py in set_item(self, item, value, level, namespace)
        896             if item == 'True':
        897                 # We do not want to go through code generation for runtime
    --> 898                     self.set_with_index_array(slice(None), value,
        899                                               check_units=check_units)
        900             else:

    ~/anaconda3/lib/python3.8/site-packages/brian2/core/base.py in device_override_decorated_function(*args, **kwds)
        276                 return getattr(curdev, name)(*args, **kwds)
        277             else:
    --> 278                 return func(*args, **kwds)
        279
        280         device_override_decorated_function.__doc__ = func.__doc__

    ~/anaconda3/lib/python3.8/site-packages/brian2/core/variables.py in set_with_index_array(self, item, value, check_units)
       1164         variable = self.variable
       1165         if check_units:
    -> 1166             fail_for_dimension_mismatch(variable.dim, value,
       1167                                         'Incorrect unit for setting variable %s' % self.name)
       1168         if variable.scalar:

    ~/anaconda3/lib/python3.8/site-packages/brian2/units/fundamentalunits.py in fail_for_dimension_mismatch(obj1, obj2, error_message, **error_quantities)
        185             raise DimensionMismatchError(error_message, dim1)
        186         else:
    --> 187             raise DimensionMismatchError(error_message, dim1, dim2)
        188     else:
        189         return dim1, dim2

    DimensionMismatchError: Incorrect unit for setting variable delay (units are s and 1).

To avoid this error, I need to manually add the unit as follows:

  S.delay[:] = trained_S['delay']*second

Hi. I’m surprised it works in standalone mode; this should actually be considered a bug :slight_smile:

np.savez does not know anything about Brian quantities, so it just stores a standard numpy array without units to disk. When you load it, you therefore get an array without units, which you cannot assign to the delay variable that expects a value in seconds. Multiplying the value by second is one solution; another would be to use Quantity(trained_S['delay'], dim=second.dim, copy=False). Admittedly this is a bit complicated, but it has the advantage that it can give you a “view” on an existing array without copying it in memory. This can be useful for very large arrays – not sure that this is very relevant here, though.
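
For illustration, here is a minimal sketch of both variants, continuing the second script above (so trained_S is the loaded npz file and S the Synapses object):

# Variant 1: re-attach the unit by multiplying with second (this copies the array)
S.delay[:] = trained_S['delay'] * second

# Variant 2: wrap the stored array as a quantity in seconds, without copying it
S.delay[:] = Quantity(trained_S['delay'], dim=second.dim, copy=False)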

Finally, the simplest option would be to ask Brian to directly use a delay variable without units by appending an underscore, i.e. use

S.delay_[:] = trained_S['delay']

You could also use S.delay_[:] in the np.savez call, but it will throw away the unit information in either case.
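
For example, something along these lines should work on both sides (using the variables from the two scripts above):

# In the first script: store the delays as plain numbers (internally they are in seconds)
np.savez('test', i=S.i[:], j=S.j[:], delay=S.delay_[:])

# In the second script: assign them back without unit checking
S.delay_[:] = trained_S['delay']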

Thanks Marcel. Does that mean the code running in standalone mode could have produced wrong results?

No, if you didn’t convert the units manually in any way, then everything will work correctly (since this is the way things are represented internally anyway).

The reason why I think it should be a bug is that when assigning values to a variable, we check whether the units of the values match, and this should happen regardless of runtime or standalone mode. In your use case you could have stored the values in a different scale (e.g. by doing np.savez(..., delay=S.delay[:]/ms)) and then S.delay[:] = trained_S['delay'] would have interpreted these values in seconds without raising an error, which should not happen (see the sketch below). Of course by using delay_ etc. you can still make the same mistake, but at least you’d be explicit about the fact that you are not using Brian’s unit checking mechanism.
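
To make the pitfall concrete, a hypothetical variation of the two scripts:

# First script: store the delays scaled to milliseconds, i.e. as plain numbers
np.savez('test', i=S.i[:], j=S.j[:], delay=S.delay[:]/ms)

# Second script: with delay_ (or in standalone mode, where the check is currently
# skipped), a stored value of 10.0 that was meant as 10 ms silently becomes a 10 s delay
S.delay_[:] = trained_S['delay']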
