I don’t know much about pickle files. Your examples only show how to get data from monitors into a pickle file, but I want to get the data into a .csv or .txt file. I tried, but I was unable to get all the data: only a small amount of data appears in the file, abbreviated with dots. Please suggest how I can get data from monitors into a CSV file.

Hi and welcome to the forum.

If you are only interested in a single recorded variable, you will get it as a standard numpy array from a monitor (or as a `Quantity` if it has physical units, but this is a subclass of the standard numpy array; see the documentation on units for more details). You can then use a function like `numpy.savetxt` to store it in a text file. Monitors (and groups of neurons, etc.) also provide the `get_states` method to get the values of several variables at once. By default it will return a dictionary, but it can also return a pandas DataFrame, which is convenient for storing as CSV. Note that the pandas format has some limitations, though: it does not support physical units and it can only represent one-dimensional data. The latter restriction means that you cannot use it for variables recorded by a `StateMonitor`, since each variable is stored as a 2-D array (values per time step and per neuron). But you can use it, for example, with a `SpikeMonitor`. E.g. to store the spike times and indices to a CSV file, something like the following should work:

```
spike_mon = SpikeMonitor(...)
# ... run simulation
# Get spike times/indices as a data frame:
data = spike_mon.get_states(['t', 'i'], units=False, format='pandas')
# Write data frame to disk as a CSV file:
data.to_csv('spikes.csv', index=False)
```
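To check the file or continue analysing it later, the CSV can be read back with pandas. A minimal sketch (the toy spike data below just stands in for the two columns, `t` and `i`, that the snippet above writes):

```python
import pandas as pd

# Toy spike data in the same two-column layout the snippet above writes
pd.DataFrame({'t': [0.001, 0.005, 0.012],
              'i': [2, 0, 1]}).to_csv('spikes.csv', index=False)

# Read the file back and compute simple statistics
spikes = pd.read_csv('spikes.csv')
n_spikes = len(spikes)                    # total number of spikes
per_neuron = spikes['i'].value_counts()   # spike count per neuron index
```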

Hope that makes things clearer.

Sorry Sir,

It’s not working for me. Actually, I have 100 cells with connection probability 0.1, and I want to track all variables (e.g. voltage, conductance, etc., and synaptic variables). I used the `get_states` function as in your example above, but it is not working for me. I want to track all variables at every `dt` time step. Say I run the program for 3 seconds with dt = 0.1 ms, then I want 30000 data points for each variable so that I can analyse them statistically. If this is not possible, please tell me another option. I have spent a lot of time understanding Brian2 and I can’t leave it; my work is pending due to this problem. I hope you understand my problem.

I’m afraid we cannot help you much more with this, given that this seems to be more about general data processing in Python and not so much about Brian. As I said earlier, when you record a variable for each time step for several neurons with a `StateMonitor`, the `StateMonitor` will give you either a 2-D `numpy.ndarray` or a 2-D `Quantity` (which is derived from `numpy.ndarray`). If you want to store this data to a file, you can use, for example, numpy’s `savetxt` function:

```
# .... some model
state_mon = StateMonitor(neurons, 'v', record=True) # recording variable 'v' from all neurons
run(...)
np.savetxt('v.txt', state_mon.v_.T)
```

I use `v_` instead of `v` above to get the values without units, but using `v` would also work since numpy would simply discard the units. An alternative would be to use something like `state_mon.v/mV` to get the values in mV. Note that I transposed the matrix with `.T` to get the more common format of one row for each time step and one column for each neuron; otherwise it would be the opposite.
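As a small sketch of that unit conversion and transpose (using a plain numpy array in place of `state_mon.v_`, since it has the same neurons × time steps shape):

```python
import numpy as np

# Stand-in for state_mon.v_: 2 neurons recorded for 4 time steps, in volts
v_ = np.array([[-0.070, -0.069, -0.068, -0.067],
               [-0.070, -0.071, -0.072, -0.073]])

# Convert to millivolts and transpose:
# one row per time step, one column per neuron
v_mV = v_.T * 1000
np.savetxt('v_mV.txt', v_mV, fmt='%.3f',
           header='membrane potential in mV, one column per neuron')
```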

If you want to store the values from several variables in a file, then you’ll have to manually put them all into a single 2-D array (with functions like `np.concatenate`), or create a pandas DataFrame with one column per variable+neuron. You could also write a little function that does this somewhat automatically for you. Of course, you don’t need to save anything to a CSV file to do statistics on it; you can do this with numpy/scipy/pandas directly. See e.g. https://scipy-lectures.org/ for teaching material related to this. Just as a trivial example:
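Such a helper function could look like the following sketch. The name `monitor_to_dataframe` and the toy arrays `v` and `g` are hypothetical; the arrays just have the same (neurons × time steps) shape that a `StateMonitor` returns:

```python
import numpy as np
import pandas as pd

def monitor_to_dataframe(recorded, times):
    """Combine several 2-D recordings (one row per neuron, one column
    per time step, as a StateMonitor returns them) into a DataFrame
    with one column per variable+neuron and one row per time step."""
    columns = {'t': times}
    for name, values in recorded.items():
        for i, row in enumerate(values):
            columns[f'{name}_{i}'] = row
    return pd.DataFrame(columns)

# Toy data standing in for recorded variables (2 neurons, 4 time steps)
times = np.array([0.0, 0.1, 0.2, 0.3])
v = np.array([[-70.0, -69.0, -68.0, -67.0],
              [-70.0, -71.0, -72.0, -73.0]])
g = np.zeros((2, 4))

df = monitor_to_dataframe({'v': v, 'g': g}, times)
df.to_csv('states.csv', index=False)
```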

```
print(np.mean(state_mon.v, axis=1))
```

This would give you the average membrane potential of each neuron (i.e. averaged over time).