thomas's changes
neworderofjamie committed Apr 3, 2024
1 parent acaf3a9 commit 83620b5
Showing 41 changed files with 761 additions and 626 deletions.
2 changes: 1 addition & 1 deletion documentation/5/.buildinfo
@@ -1,4 +1,4 @@
# Sphinx build info version 1
# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
config: 00b64a1bd6622c25a3d981e53c278f23
config: 50012eff81b34f8c34d47502b5f3252f
tags: 645f666f9bcd5a90fca523b33c5a78b7
6 changes: 3 additions & 3 deletions documentation/5/_sources/bibliography.rst.txt
@@ -3,12 +3,12 @@
============
Bibliography
============
.. [izhikevich2003simple] Izhikevich, E. M. (2003). Simple model of spiking neurons. IEEE Transactions on neural networks, 14(6), 1569-1572. https://doi.org/10.1109/TNN.2003.820440
.. [Izhikevich2003] Izhikevich, E. M. (2003). Simple model of spiking neurons. IEEE Transactions on neural networks, 14(6), 1569-1572. https://doi.org/10.1109/TNN.2003.820440
.. [Knight2018] Knight, J. C., & Nowotny, T. (2018). GPUs Outperform Current HPC and Neuromorphic Solutions in Terms of Speed and Energy When Simulating a Highly-Connected Cortical Model. Frontiers in Neuroscience, 12(December), 1–19. https://doi.org/10.3389/fnins.2018.00941
.. [Morrison2008] Morrison, A., Diesmann, M., & Gerstner, W. (2008). Phenomenological models of synaptic plasticity based on spike timing. Biological Cybernetics, 98, 459–478. https://doi.org/10.1007/s00422-008-0233-1
.. [nowotny2005self] Nowotny, T., Huerta, R., Abarbanel, H. D., & Rabinovich, M. I. (2005). Self-organization in the olfactory system: one shot odor recognition in insects. Biological cybernetics, 93, 436-446. https://doi.org/10.1007/s00422-005-0019-7
.. [Nowotny2005] Nowotny, T., Huerta, R., Abarbanel, H. D., & Rabinovich, M. I. (2005). Self-organization in the olfactory system: one shot odor recognition in insects. Biological cybernetics, 93, 436-446. https://doi.org/10.1007/s00422-005-0019-7
.. [Potjans2014] Potjans, T. C., & Diesmann, M. (2014). The Cell-Type Specific Cortical Microcircuit: Relating Structure and Activity in a Full-Scale Spiking Network Model. Cerebral Cortex, 24(3), 785–806. https://doi.org/10.1093/cercor/bhs358
.. [Rulkov2002] Rulkov, N. F. (2002). Modeling of spiking-bursting neural behavior using two-dimensional map. Physical Review E, 65(4), 041922. https://doi.org/10.1103/PhysRevE.65.041922
.. [Traub1991] Traub, R. D., & Miles, R. (1991). Neuronal networks of the hippocampus (Vol. 777). Cambridge University Press.
.. [Turner2022] Turner, J. P., Knight, J. C., Subramanian, A., & Nowotny, T. (2022). mlGeNN: accelerating SNN inference using GPU-enabled neural networks. Neuromorphic Computing and Engineering, 2(2), 024002. https://doi.org/10.1088/2634-4386/ac5ac5
.. [Zenke2018] Zenke, F., & Ganguli, S. (2018). SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation, 30(6), 1514–1541. https://doi.org/10.1162/neco_a_01086
.. [Zenke2018] Zenke, F., & Ganguli, S. (2018). SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks. Neural Computation, 30(6), 1514–1541. https://doi.org/10.1162/neco_a_01086
62 changes: 41 additions & 21 deletions documentation/5/_sources/building_networks.rst.txt
@@ -1,5 +1,7 @@
.. py:currentmodule:: pygenn
.. _`section-building-networks`:

=================
Building networks
=================
@@ -31,7 +33,7 @@ Batching can be enabled on a GeNN model with:
model.batch_size = 512
Parameters and sparse connectivity are shared across all batches.
Whether state variables are duplicated or shared is controlled by the :class:`.VarAccess` or :class:`.CustomUpdateVarAccess`
Whether state variables are duplicated or shared is controlled by the :class:`.VarAccess` or :class:`.CustomUpdateVarAccess` enumeration
associated with each variable. Please see TODO for more details.
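
For example, batching might be enabled on a freshly-created model like this (a minimal sketch; the model name is illustrative):

.. code-block:: python

    from pygenn import GeNNModel

    model = GeNNModel("float", "batched_model")
    # Simulate 512 independent copies of the network in parallel,
    # e.g. to present multiple stimuli at once
    model.batch_size = 512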

Additionally, any preferences exposed by the backend can be configured here.
@@ -47,47 +49,59 @@ Populations
-----------
Populations formalise the concept of groups of neurons or synapses that are functionally related or form a practical grouping, e.g. a brain region in a neuroscience model or a layer in a machine learning context.

.. _`section-parameters`:

Parameters
----------
Parameters are initialised to constant numeric values which are homogeneous across an entire population:

.. code-block:: python
ini = {"m": 0.0529324, ...}
para = {"m": 0.0529324, ...}
They are very efficient to access from models as their values are either hard-coded into the backend code
or, on the GPU, delivered via high-performance constant cache.
However, they can only be used if literally no changes are needed for the entire simulation and all members of
the population have the exact same parameter value.
However, they can only be used if all members of the population have the exact same parameter value.
Parameters are always read-only but can be made *dynamic* so they can be changed from the host
during the simulation by calling :meth:`pygenn.NeuronGroup.set_param_dynamic` on a :class:`.NeuronGroup`, i.e.

.. code-block:: python
pop.set_param_dynamic("tau")
where ``pop`` is a neuron group returned by :meth:`.GeNNModel.add_neuron_population` or synapse group returned by :meth:`.GeNNModel.add_synapse_population` and "tau" is one of the population's declared parameters.

.. warning::
Derived parameters will not change if the parameters they rely on are made dynamic and changed at runtime.
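
Once the model is built and loaded, the value of a dynamic parameter can be changed from the host. A minimal sketch, assuming ``set_dynamic_param_value`` as the runtime setter and a parameter called ``tau``:

.. code-block:: python

    pop.set_param_dynamic("tau")
    model.build()
    model.load()

    # Partway through a run, change the time constant on the fly
    pop.set_dynamic_param_value("tau", 20.0)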

Extra global parameters
Extra Global Parameters
-----------------------
When building more complex models, it is sometimes useful to be able to access arbitrarily
sized arrays. In GeNN, these are called Extra Global Parameters (EGPs) and they need
to be manually allocated and initialised before simulating the model. For example, the built
in :func:`.neuron_models.SpikeSourceArray` model has a ``spikeTimes`` EGP
which is used to provide an array of spike times for the spike source to emit. Given two
two numpy arrays: ``spike_ids`` containing the ids of which neurons spike and
to be manually allocated and initialised before simulating the model. For example, the built-in :func:`.neuron_models.SpikeSourceArray` model has a ``spikeTimes`` EGP
which is used to provide an array of spike times for the spike source to emit. Given two numpy arrays: ``spike_ids`` containing the ids of the neurons that spike and
``spike_times`` containing the time at which each spike occurs, a :func:`.neuron_models.SpikeSourceArray`
model can be configured as follows:

.. code-block:: python
# Calculate start and end index of each neurons spikes in sorted array
# Calculate start and end index of each neuron's spikes in sorted array
end_spike = np.cumsum(np.bincount(spike_ids, minlength=100))
start_spike = np.concatenate(([0], end_spike[0:-1]))
# Sort events first by neuron id and then
# by time and use to order spike times
spike_times = poisson_times[np.lexsort((spike_times, spike_ids))]
spike_times = spike_times[np.lexsort((spike_times, spike_ids))]
model = GeNNModel("float", "spike_source_array")
model = GeNNModel("float", "spike_source_array_example")
ssa = model.add_neuron_population("SSA", 100, "SpikeSourceArray", {},
{"startSpike": start_spike, "endSpike": end_spike})
ssa.extra_global_params["spikeTimes"].set_init_values(spike_times)
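
The ``spike_ids`` and ``spike_times`` arrays used above could, for instance, be drawn from a Poisson process (a hypothetical sketch; rate, duration and population size are illustrative):

.. code-block:: python

    import numpy as np

    rate = 10.0           # spikes per second
    duration_ms = 1000.0
    num_neurons = 100

    # Draw a spike count for each neuron, then random times for each spike
    counts = np.random.poisson(rate * (duration_ms / 1000.0), num_neurons)
    spike_ids = np.repeat(np.arange(num_neurons), counts)
    spike_times = np.random.uniform(0.0, duration_ms, len(spike_ids))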
.. _`section-variables`:

Variables
----------
Variables contain values that are individual to the members of a population and can change over time. They can be initialised in many ways. The initialisation is configured through a Python dictionary that is then passed to :meth:`.GeNNModel.add_neuron_population` or :meth:`.GeNNModel.add_synapse_population` which create the populations.
@@ -115,6 +129,8 @@ The resulting initialisation snippet can then be used in the dictionary in the usual way:
ini = {"m": init, ...}
.. _`section-variables-references`:

Variables references
--------------------
As well as variables and parameters, various types of models have variable references which are used to reference variables belonging to other populations.
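
For example, a reference to a neuron population's membrane voltage might be created like this (a minimal sketch, assuming ``pop`` is a :class:`.NeuronGroup` with a state variable ``V``):

.. code-block:: python

    from pygenn import create_var_ref

    var_refs = {"V": create_var_ref(pop, "V")}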
@@ -147,10 +163,13 @@ These 'weight update variable references' also have the additional feature that
wu_transpose_var_ref = {"R": create_wu_var_ref(sg, "g", back_sg, "g")}
where ``back_sg`` is another :class:`.SynapseGroup` with tranposed dimensions to sg i.e. its _postsynaptic_ population has the same number of neurons as sg's _presynaptic_ population and vice-versa.
where ``back_sg`` is another :class:`.SynapseGroup` with transposed dimensions to sg, i.e. its *postsynaptic* population has the same number of neurons as sg's *presynaptic* population and vice-versa.

After the update has run, any updates made to the 'forward' variable will also be applied to the transpose variable
[#]_ Tranposing is currently only possible on variables belonging to synapse groups with :attr:`.SynapseMatrixType.DENSE` connectivity [#]_

.. note::

Transposing is currently only possible on variables belonging to synapse groups with :attr:`.SynapseMatrixType.DENSE` connectivity

Variable locations
------------------
@@ -165,14 +184,15 @@ However, the following alternative 'variable locations' are available:

'Zero copy' memory is only supported on newer embedded systems such as
the Jetson TX1 where there is no physical separation between GPU and host memory and
thus the same physical of memory can be shared between them.

thus the same physical memory can be shared between them.

.. _`section-extra-global-parameter-references`:

Extra global parameter references
---------------------------------
When building models with complex `Custom updates`_ and `Custom Connectivity updates`_,
it is often useful to share data stored in extra global parameters between different groups.
Similarly to variable references, such links are made using extra global parameter references.
Similar to variable references, such links are made using extra global parameter references.
These can be created using:

.. autofunction:: pygenn.create_egp_ref
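
For example (a minimal sketch, assuming ``ssa`` is the spike source array population created earlier and that ``create_egp_ref`` takes a group and an extra global parameter name):

.. code-block:: python

    from pygenn import create_egp_ref

    egp_refs = {"times": create_egp_ref(ssa, "spikeTimes")}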
@@ -203,7 +223,7 @@ Weight update models are typically initialised using:
:noindex:

Postsynaptic models define how synaptic input translates into an input current
(or other input term for models that are not current based) and are typically initialise using:
(or other type of input for models that are not current based) and are typically initialised using:

.. autofunction:: pygenn.init_postsynaptic
:noindex:
@@ -221,7 +241,7 @@ and :attr:`pygenn.SynapseMatrixType.PROCEDURAL` synaptic connectivity can be initialised using:
.. autofunction:: pygenn.init_sparse_connectivity
:noindex:

and :attr:`pygenn.SynapseMatrixType.TOEPLITZ` can be initialised using:
:attr:`pygenn.SynapseMatrixType.TOEPLITZ` can be initialised using:

.. autofunction:: pygenn.init_toeplitz_connectivity
:noindex:
@@ -233,7 +253,7 @@ Finally, with these components in place, a synapse population can be added to the model:
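
A minimal sketch, assuming existing neuron groups ``pre`` and ``post`` and the built-in ``StaticPulse``, ``ExpCurr`` and ``FixedProbability`` models:

.. code-block:: python

    from pygenn import (init_postsynaptic, init_sparse_connectivity,
                        init_weight_update)

    syn = model.add_synapse_population(
        "Syn", "SPARSE", pre, post,
        init_weight_update("StaticPulse", {}, {"g": 1.0}),
        init_postsynaptic("ExpCurr", {"tau": 5.0}),
        init_sparse_connectivity("FixedProbability", {"prob": 0.1}))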

Current sources
---------------
Current sources
Current sources are added to a model using:

.. automethod:: .GeNNModel.add_current_source
:noindex:
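
For example, a constant current could be injected into an existing population (a minimal sketch, assuming ``pop`` and the built-in ``DC`` model with an ``amp`` parameter):

.. code-block:: python

    cs = model.add_current_source("Stim", "DC", pop, {"amp": 1.0}, {})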
@@ -242,7 +262,7 @@ Custom updates
--------------
The neuron groups, synapse groups and current sources described in previous sections are all updated automatically every timestep.
However, in many types of model, there are also processes that would benefit from GPU acceleration but only need to be triggered occasionally.
For example, such updates could be used in a classifier to to reset the state of neurons after a stimuli has been presented or in a model
For example, such updates could be used in a classifier to reset the state of neurons after a stimulus has been presented or in a model
which uses gradient-based learning to optimize network weights based on gradients accumulated over several timesteps.

Custom updates allow such updates to be described as models, similar to the neuron and synapse models described in the preceding sections.
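
For example, a custom update that resets a population's membrane voltage might look like this (a hypothetical sketch; the model, group name and reset value are illustrative):

.. code-block:: python

    from pygenn import create_custom_update_model, create_var_ref

    reset_model = create_custom_update_model(
        "reset", var_refs=[("V", "scalar")],
        update_code="V = -60.0;")

    model.add_custom_update("reset_v", "Reset", reset_model,
                            {}, {}, {"V": create_var_ref(pop, "V")})

After the model is built and loaded, this update group could then be triggered with ``model.custom_update("Reset")``.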
28 changes: 14 additions & 14 deletions documentation/5/_sources/custom_models.rst.txt
@@ -3,8 +3,8 @@
=============
Custom models
=============
One of the main things that makes GeNN different than other SNN simulators is that all the
models and snippets used to describe the behaviour of your model (see `Building networks`_)
One of the main things that makes GeNN different from other SNN simulators is that all the
models and snippets used to describe the behaviour of your model (see :ref:`section-building-networks`)
can be easily customised by the user using strings containing a C-like language called GeNNCode.

--------
@@ -18,15 +18,15 @@ This is essentially C99 (https://en.cppreference.com/w/c/language) with the following differences:
- Functions, typedefs and structures cannot be defined in user code
- Structures are not supported at all
- Some esoteric C99 language features like octal integer and hexadecimal floating point literals aren't supported
- The address of (&) operator isn't supported. On the GPU hardware GeNN targets, local variables are assumed to be stored in registers and not addressable. The only time this is limiting is when dealing with extra global parameter arrays as you can no longer do stuff like ``const int *egpSubset = &egp[offset];`` and instead have to do ``const int *egpSubset = egp + offset;``.
- The address of (&) operator isn't supported. On the GPU hardware GeNN targets, local variables are assumed to be stored in registers and not addressable. The only time this is limiting is when dealing with extra global parameter arrays as you can no longer do something like ``const int *egpSubset = &egp[offset];`` and instead have to do ``const int *egpSubset = egp + offset;``.
- Like C++ (but not C99) function overloading is supported so ``sin(30.0f)`` will resolve to the floating point rather than double-precision version.
- Floating point literals like ``30.0`` without a suffix will be treated as ``scalar``, ``30.0f`` will always be treated as float and ``30.0d`` will always be treated as double.
- A LP64 data model is used on all platforms where ``int`` is 32-bit and ``long`` is 64-bit.
- Only the following standard library functions are supported: ``cos``, ``sin`, ``tan``, ``acos``, ``asin``, ``atan``, ``atan2``, ``cosh``, ``sinh``, ``tanh``, ``acosh``, ``asinh``, ``atanh``, ``exp``, ``expm1``, ``exp2``, ``pow``, ``scalbn``, ``log``, ``log1p``, ``log2``, ``log10``, ``ldexp``, ``ilogb``, ``sqrt``, ``cbrt``, ``hypot``, ``ceil``, ``floor``, ``fmod``, ``round``, ``rint``, ``trunc``, ``nearbyint``, ``nextafter``, ``remainder``, ``fabs``, ``fdim``, ``fmax``, ``fmin``, ``erf``, ``erfc``, ``tgamma``, ``lgamma``, ``copysign``, ``fma``, ``min``, ``max``, ``abs``,``printf``
- Floating point literals like ``30.0`` without a suffix will be treated as ``scalar`` (i.e. the floating point type declared as the precision of the overall model), ``30.0f`` will always be treated as float and ``30.0d`` will always be treated as double.
- An LP64 data model is used on all platforms where ``int`` is 32-bit and ``long`` is 64-bit.
- Only the following standard library functions are supported: ``cos``, ``sin``, ``tan``, ``acos``, ``asin``, ``atan``, ``atan2``, ``cosh``, ``sinh``, ``tanh``, ``acosh``, ``asinh``, ``atanh``, ``exp``, ``expm1``, ``exp2``, ``pow``, ``scalbn``, ``log``, ``log1p``, ``log2``, ``log10``, ``ldexp``, ``ilogb``, ``sqrt``, ``cbrt``, ``hypot``, ``ceil``, ``floor``, ``fmod``, ``round``, ``rint``, ``trunc``, ``nearbyint``, ``nextafter``, ``remainder``, ``fabs``, ``fdim``, ``fmax``, ``fmin``, ``erf``, ``erfc``, ``tgamma``, ``lgamma``, ``copysign``, ``fma``, ``min``, ``max``, ``abs``, ``printf``

Random number generation
------------------------
Random numbers are useful in many forms of custom model. For example as a source of noise or a probabilistic spiking mechanism.
Random numbers are useful in many forms of custom model, for example as a source of noise or a probabilistic spiking mechanism.
In GeNN this can be implemented by using the following functions within GeNNCode:

- ``gennrand()`` returns a random 32-bit unsigned integer
@@ -40,14 +40,14 @@ In GeNN this can be implemented by using the following functions within GeNNCode:
-----------------------
Initialisation snippets
-----------------------
Initialisation snippets are use GeNNCode to initialise various parts of a GeNN model.
Initialisation snippets use GeNNCode to initialise various parts of a GeNN model.
They are configurable by the user with parameters, derived parameters and extra global parameters.
Parameters have a homogeneous numeric value across the population being initialised.
'Derived parameters' are a mechanism for enhanced efficiency when running neuron models.
They allow constants used within the GeNNCode implementation of a model to be computed
from more 'user friendly' parameters provided by the user. For example, a decay to apply
each timestep could be computed from a time constant provided in a parameter called ``tau``
by passing the following keyword arguments to one of the snippet or model creation functions described bwlo:
by passing the following keyword arguments to one of the snippet or model creation functions described below:

.. code-block:: python
@@ -76,15 +76,15 @@ Toeplitz connectivity initialisation
------------------------------------
Toeplitz connectivity initialisation snippets are used to generate convolution-like connectivity
on the fly when using :attr:`SynapseMatrixType.TOEPLITZ` connectivity.
New toeplitz connectivity initialisation snippets can be defined by calling:
New Toeplitz connectivity initialisation snippets can be defined by calling:

.. autofunction:: pygenn.create_toeplitz_connect_init_snippet
:noindex:

------
Models
------
Models extend the snippets describe above by adding state.
Models extend the snippets described above by adding state.
They are used to define the behaviour of neurons, synapses and custom updates.

Variable access
@@ -109,7 +109,7 @@ different circumstances, their variables can have the following modes:
.. autoclass:: pygenn.CustomUpdateVarAccess
:noindex:


.. _section-neuron-models:
Neuron models
-------------
Neuron models define the dynamics and spiking behaviour of populations of neurons.
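
A minimal sketch of a custom leaky integrator, assuming the usual ``sim_code``/``threshold_condition_code``/``reset_code`` structure and the built-in ``Isyn`` and ``dt`` symbols:

.. code-block:: python

    from pygenn import create_neuron_model

    leaky = create_neuron_model(
        "leaky_integrator",
        params=["tau"],
        vars=[("V", "scalar")],
        sim_code="V += (-V + Isyn) * (dt / tau);",
        threshold_condition_code="V >= 1.0",
        reset_code="V = 0.0;")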
@@ -128,7 +128,7 @@ New weight update models are defined by calling:

Postsynaptic models
-------------------
The postsynaptic models defines how synaptic input translates into an input current (or other input term for models that are not current based).
The postsynaptic model defines how synaptic input translates into an input current (or other input term for models that are not current based).
They can contain equations defining dynamics that are applied to the (summed) synaptic activation, e.g. an exponential decay over time.
New postsynaptic models are defined by calling:

@@ -157,4 +157,4 @@ Custom update models define operations on model connectivity that can be triggered
New custom connectivity update models are defined by calling:

.. autofunction:: pygenn.create_custom_connectivity_update_model
:noindex:
:noindex:
