Simulating spiking neural networks with
Brian 2

CNS*2021 showcase

Marcel Stimberg

(Institut de la Vision/Sorbonne Université)

July 3rd 2021

Brian’s history

Dan Goodman
(Imperial College London)

Romain Brette
(Institut de la Vision)

  • Started in 2007 at ENS Paris by Romain and Dan

  • Widely used for research and teaching

“A simulator should not only save the time of processors, but also the time of scientists”

  • No built-in model library; instead, general tools to describe "any" model

  • Big rewrite in 2014 (code generation)

  • Free-and-open-source since the start

Brian’s philosophy

  • Use the same language to describe models that
    we use in scientific publications: equations
  • Built-in system for physical units
    Dimensional quantities are used everywhere,
    consistency is checked/enforced
    >>> Cm = 200*pF; Rm = 100*Mohm
    >>> tau = Cm * Rm
    >>> print(tau)
    20. ms
    
  • Written in Python and making the best use of it
    (overloaded operators for the unit system,
    indexing, helpful error messages, ...)
  • Friendly community, extensive documentation
    and helpful forums 😺
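The unit arithmetic above can be reproduced with plain floats to see where the 20 ms comes from (a Brian-free sketch; the SI prefixes are spelled out by hand, where Brian tracks the dimensions automatically):

```python
# Plain-float version of the unit arithmetic shown above (no Brian needed)
pF, Mohm, ms = 1e-12, 1e6, 1e-3   # SI prefixes written out by hand
Cm = 200 * pF                      # membrane capacitance, in farad
Rm = 100 * Mohm                    # membrane resistance, in ohm
tau = Cm * Rm                      # farad * ohm = second
print(round(tau / ms, 9))          # membrane time constant in ms -> 20.0
```

Brian performs the same bookkeeping on every quantity and raises an error when dimensions do not match.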
Paper on the left:
Peron et al. Recurrent interactions in local cortical circuits. Nature (2020) 10.1038/s41586-020-2062-x

Brian’s technology:
code generation

Brian code

group = NeuronGroup(1, 'dv/dt = -v / tau : 1')

"Abstract code" (internal pseudo-code representation)

_v = v*exp(-dt/tau)
v = _v
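The abstract update is the exact solution of dv/dt = -v/tau over one time step; a quick plain-Python check (with arbitrary example values) compares it against a finely subdivided Euler integration:

```python
import math

# Check that the generated update v = v*exp(-dt/tau) solves dv/dt = -v/tau
tau, dt, v = 10e-3, 1e-4, 1.0      # arbitrary example values, in seconds

# One exact step, as in the abstract code above
v_exact = v * math.exp(-dt / tau)

# Forward Euler with many sub-steps converges to the same value
v_euler, n = v, 1000
for _ in range(n):
    v_euler += (-v_euler / tau) * (dt / n)

print(abs(v_exact - v_euler))      # difference shrinks as n grows
```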

Brian’s technology:
code generation

C++ code snippet (once-per-group code)

const double _lio_1 = exp((-dt)/tau);

C++ code snippet (once-per-neuron code)

double v = _ptr_array_neurongroup_v[_idx];
const double _v = _lio_1*v;
v = _v;
_ptr_array_neurongroup_v[_idx] = v;

Brian’s technology:
code generation

Final C++ code after insertion in template

const double _lio_1 = exp((-dt)/tau);
for(int _idx=0; _idx<_N; _idx++)
{
    double v = _ptr_array_neurongroup_v[_idx];
    const double _v = _lio_1*v;
    v = _v;
    _ptr_array_neurongroup_v[_idx] = v;
}

Brian’s technology:
execution modes

  • Runtime mode
    Simulation loop runs in Python, calls compiled blocks to do the work
    👍 Very flexible, can be combined with arbitrary Python code
    👎 Overhead from running Python, especially for small networks

  • Standalone mode
    Everything (model equations, connection definitions, initializations)
    written out to a standalone C++ project
    Compiled binary executes full run and stores results to disk
    👍 Fast, no Python overhead
    👍 Can be tailored to other platforms
    👎 Less flexible, no Python interaction during run
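In code, the switch between the two modes is a single line before the network is built; a minimal sketch (the output directory name is an arbitrary choice):

```python
from brian2 import *

# Runtime mode is the default. For standalone mode, request the C++
# device before constructing the network ('my_sim' is an arbitrary path):
set_device('cpp_standalone', directory='my_sim')

# ... define NeuronGroup / Synapses / monitors as usual ...
run(10*ms)  # writes out, compiles, and executes the full C++ project
```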

Brian’s domain

  • integrate-and-fire models

    $C_m \, dv/dt = g_L(E_L - v) + I$

    If $v > v_{th}$: emit spike and set $v \leftarrow v_{reset}$
  • non-linear channel dynamics (e.g. Hodgkin-Huxley)

    $C_m \, dv/dt = g_L(E_L - v) + m^3 h \, g_{Na}(E_{Na} - v) + n^4 g_K(E_K - v)$
    $dm/dt = \dots$

  • multi-compartmental models
    (complex morphologies not yet very convenient to use)
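To make the threshold/reset mechanics concrete, the integrate-and-fire equations above can be written as a hand-rolled Euler loop in plain Python (a sketch, not Brian's generated code; all parameter values are arbitrary):

```python
# Hand-rolled LIF simulation of C_m dv/dt = g_L (E_L - v) + I (SI units)
C_m, g_L, E_L = 200e-12, 10e-9, -70e-3      # capacitance, leak, rest
v_th, v_reset, I = -50e-3, -70e-3, 0.3e-9   # threshold, reset, input
dt, n_steps = 0.1e-3, 5000                  # 0.1 ms steps, 500 ms total

v, n_spikes = E_L, 0
for _ in range(n_steps):
    v += dt * (g_L * (E_L - v) + I) / C_m   # Euler step of the equation
    if v > v_th:                            # threshold condition
        v = v_reset                         # reset rule
        n_spikes += 1
print(n_spikes)                             # fires regularly, 22 spikes here
```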

Brian’s speed

“CUBA” network (sparsely connected LIF neurons; synapses = exponential currents)


Example 1: neuron models

Integrate-and-fire model with noisy adaptive threshold

g_L = 10*nS; E_L = -70*mV; C_m = 200*pF
vt_0 = -50*mV; vt_adapt = 10*mV; tau_vt = 50*ms; sigma_vt = 5*mV
neurons = NeuronGroup(4, model="""
                     dv/dt = (g_L*(E_L - v) + I_inj)/C_m: volt
                     dvt/dt = (vt_0 - vt)/tau_vt + sigma_vt*tau_vt**-0.5*xi : volt
                     I_inj : amp (constant)""",
                     threshold='v > vt', reset='''v = E_L
                                                  vt += vt_adapt''')

Example 2: synapse models

short-term synaptic plasticity

(Tsodyks, Pawelzik, Markram, Neural Computation, 1998)

U_SE = 0.6    # utilization of synaptic efficacy
w_max = 2*nS  # maximal synaptic conductance
synapses_eqs = '''dx/dt = z / tau_rec          : 1 (event-driven) # "recovered"
                  dy/dt = -y / tau_in          : 1 (event-driven) # "effective"
                  dz/dt = y/tau_in - z/tau_rec : 1 (event-driven) # "inactive"
                  tau_rec : second (constant)
                  tau_in : second (constant)'''
synapses_action = '''y += U_SE * x
                     x -= U_SE * x
                     g_post += w_max * x'''
syn = Synapses(spike_source, target_neurons, model=synapses_eqs,
               on_pre=synapses_action)
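The three-state dynamics can be checked in plain Python: between spikes the states relax (a simple Euler step here, where Brian integrates them exactly and event-driven), each presynaptic spike applies the on_pre statements, and x + y + z stays at 1 throughout (spike times and time constants are arbitrary choices):

```python
# Plain-Python sketch of the Tsodyks-Markram three-state dynamics above
U_SE, tau_rec, tau_in = 0.6, 800e-3, 3e-3   # arbitrary example values
dt = 0.1e-3
spike_steps = {500, 1000, 1500, 2000}       # one spike every 50 ms

x, y, z = 1.0, 0.0, 0.0                     # recovered / effective / inactive
peaks = []
for step in range(2500):
    # continuous dynamics (Euler; Brian solves these exactly, event-driven)
    x, y, z = (x + dt * (z / tau_rec),
               y + dt * (-y / tau_in),
               z + dt * (y / tau_in - z / tau_rec))
    if step in spike_steps:                 # the on_pre statements
        y += U_SE * x
        x -= U_SE * x
        peaks.append(y)
    assert abs(x + y + z - 1.0) < 1e-9      # resources are conserved

print(peaks)  # successive peaks shrink: short-term depression
```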

Example 3: synaptic connections

  • Expressive connection syntax
  • Arbitrary labels/properties in models allow
    descriptive connection declarations
neuron = NeuronGroup(100, model="""# ... actual neuron model
                     neuron_type : integer (constant)
                     x : meter (constant)
                     y : meter (constant)""")
neuron.neuron_type = 'int(rand() * 2)'  # two types
neuron.x = '(i % 10) * 50*um'  # Neurons arranged in a grid
neuron.y = '(i // 10) * 50*um'

synapses = Synapses(neuron, neuron)  # do-nothing synapse just for illustration
synapses.connect('neuron_type_pre == neuron_type_post',  # only if same type
                 p='1.5*exp(-sqrt(((x_pre - x_post)**2 +'
                   '               (y_pre - y_post)**2))/(100*um))')
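What the condition and probability strings evaluate to can be written out from scratch in NumPy (a sketch, not Brian's connect machinery; the grid layout mirrors the code above, and capping the probability at 1 is an assumption of this sketch):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100
neuron_type = rng.integers(0, 2, n)                 # two types
x = (np.arange(n) % 10) * 50e-6                     # 10 x 10 grid, 50 um apart
y = (np.arange(n) // 10) * 50e-6

pre, post = np.meshgrid(np.arange(n), np.arange(n), indexing='ij')
same_type = neuron_type[pre] == neuron_type[post]   # the condition string
dist = np.sqrt((x[pre] - x[post])**2 + (y[pre] - y[post])**2)
p = np.minimum(1.5 * np.exp(-dist / 100e-6), 1.0)   # probability string, capped
connected = same_type & (rng.random((n, n)) < p)    # Bernoulli draw per pair

print(connected.sum(), "synapses")
```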

Example 4: stimulus protocols

  • Flexible description of, e.g., stimuli
  • Parameters can change over time
g_L = 10*nS; E_L = -70*mV; C_m = 200*pF
neuron = NeuronGroup(3, model="""
                     dv/dt = (g_L*(E_L - v) + I_inj)/C_m : volt
                     I_inj = amplitude * (0.5 + 0.5 * sin(2*pi*freq*t)) : amp
                     amplitude : amp
                     freq : Hz""",
                     threshold='v > -40*mV', reset='v = E_L')
neuron.run_regularly("""amplitude = rand() * 2*nA
                        freq = 50*Hz + 100*Hz*rand()""",
                     dt=100*ms)  # change injected current every 100ms
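The waveform that run_regularly produces can be sketched in NumPy: amplitude and frequency are redrawn every 100 ms and held constant within each window (a plain re-implementation for illustration, not Brian code; SI units throughout):

```python
import numpy as np

rng = np.random.default_rng(0)
dt, t_update = 0.1e-3, 100e-3
t = np.arange(0, 1.0, dt)                       # 1 s of simulated time

# redraw amplitude and frequency at the start of each 100 ms window
n_windows = 10
amplitude = rng.random(n_windows) * 2e-9        # rand() * 2*nA
freq = 50.0 + 100.0 * rng.random(n_windows)     # 50*Hz + 100*Hz*rand()
win = np.minimum((t // t_update).astype(int), n_windows - 1)

# the modulated injected current from the model equations above
I_inj = amplitude[win] * (0.5 + 0.5 * np.sin(2 * np.pi * freq[win] * t))
print(I_inj.min(), I_inj.max())                 # always within [0, 2 nA)
```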

The world of Brian


Brian2GeNN (code generation for GeNN)
brian2genn.readthedocs.io
With: Thomas Nowotny

Brian2CUDA (code generation for CUDA)
github.com/brian-team/brian2cuda/ 🚧
Denis Alevi, Moritz Augustin


brian2tools
(visualization, NeuroML import/export)
brian2tools.readthedocs.io
With: Vigneswaran C, Kapil Kumar, Dominik Krzemiński

brian2modelfitting
(fitting models to experimental data)
brian2modelfitting.readthedocs.io
With: Romain Brette, Aleksandra Teska, Ante Lojić Kapetanović (current GSoC student)

GPU drawing by Misha Petrishchev from the Noun Project
Toolbox drawing by MCruz (WMF), CC BY-SA 4.0, via Wikimedia Commons

Brian2GeNN

Interface to the GeNN simulator (see previous showcase)

from brian2 import *
# ↓ ↓ ↓
import brian2genn
set_device('genn')
# ↑ ↑ ↑ 
N = 10000
tau = 10*ms
Iin = 0.11/ms 
eqs = 'dV/dt = -V/tau + Iin : 1'
G = NeuronGroup(N, eqs, threshold='V>1', reset='V=0', refractory=5 * ms)

run(10*ms)
Thomas Nowotny, James Knight
github.com/brian-team/brian2genn

Brian2CUDA

Direct code generation for CUDA 🚧 (release soon!)

from brian2 import *
# ↓ ↓ ↓
import brian2cuda
set_device('cuda')
# ↑ ↑ ↑ 
N = 10000
tau = 10*ms
Iin = 0.11/ms 
eqs = 'dV/dt = -V/tau + Iin : 1'
G = NeuronGroup(N, eqs, threshold='V>1', reset='V=0', refractory=5 * ms)

run(10*ms)
Denis Alevi, Moritz Augustin
github.com/brian-team/brian2cuda

Brian2tools

Simple plotting tools

from brian2 import ...
# Setup network

from brian2tools import brian_plot
brian_plot(state_monitor)
brian_plot(synapses)

Brian2tools

import/export framework

n = 10; v0 = 11*mV; tau = 10*ms
eqs = 'dv/dt = (v0 - v) / tau : volt (unless refractory)'
group = NeuronGroup(n, eqs, threshold='v > 10*mV', reset='v = 0*mV',
                    refractory=5*ms, method='exact', name='excitatory')

NeuroML

<?xml version="1.0" ?>
<Lems xmlns="http://www.neuroml.org/lems/0.7.3" 
      …><Regime initial="true" name="integrating">
        <TimeDerivative value="(v0 - v) / tau" variable="v"/>
        <OnCondition test="v .gt. (10 * mV)">



Text-based descriptions

Neuron population:

  • Group excitatory, consisting of 10 neurons.

    Model dynamics:

    $\frac{d}{dt} v = \frac{-v + v_0}{\tau}$, except during the refractory period.

(Included in next release 🚧)

Vigneswaran C, Kapil Kumar, Dominik Krzemiński brian2tools.readthedocs.io

Brian2modelfitting

Easy-to-use fitting framework for Brian

  • Model definitions in Brian syntax
  • global gradient-free methods (via Nevergrad)
  • local gradient-based methods (via lmfit/scipy)
  • flexible and modular (e.g. replaceable error metrics)

Simulation-based inference

  • estimate posterior distribution of parameters
  • Wrapper around sbi package
  • Work-in-progress (GSoC) 🚧

Poster: P131 (Monday 3pm EDT / 9pm CEST)

Romain Brette, Aleksandra Teska, Ante Lojić Kapetanović
brian2modelfitting.readthedocs.io

Brian user survey

For Brian users and everyone who might become one – please fill out this (very short) survey:

tinyurl.com/brian-survey-2021

Thanks 💛!

Where to learn more about Brian

Website: briansimulator.org

Documentation: brian2.readthedocs.io

Discussion forum: brian.discourse.group

Development repository: github.com/brian-team/brian2

Twitter account: @briansimulator

Stimberg, M., Brette, R., & Goodman, D. F. (2019). Brian 2, an intuitive and efficient neural simulator. eLife, 8, e47314. 10.7554/eLife.47314

Stimberg, M., Goodman, D. F. M., Brette, R., & Pittà, M. D. (2019). Modeling Neuron–Glia Interactions with the Brian 2 Simulator. In M. De Pittà & H. Berry (Eds.), Computational Glioscience (pp. 471–505). Springer International Publishing. 10.1007/978-3-030-00817-8_18

Stimberg, M., Goodman, D. F. M., Benichoux, V., & Brette, R. (2014). Equation-oriented specification of neural models for simulations. Frontiers in Neuroinformatics, 8. 10.3389/fninf.2014.00006