Synapses

The synapse is a key building block for connecting/wiring together the various component cells used to characterize a biomimetic neural system. These objects perform, per simulated time step, a specific transformation using their underlying synaptic parameters. Most often, a synaptic cable is represented by a set of matrices that are used to project an input signal (a value presented to its pre-synaptic/input compartment) into an output signal (a value that appears within its post-synaptic compartment). Notably, a synapse component is typically associated with a local plasticity rule, e.g., a Hebbian-type update, that is either triggered online (at some or all simulation time steps) or computed by integrating a differential equation, e.g., via eligibility traces.
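
As a rough, purely illustrative sketch (not the library's internal code), the per-step transform carried out by such a cable amounts to a matrix projection of the pre-synaptic compartment value into the post-synaptic compartment; the dimensions below are hypothetical:

import numpy as np

W = np.random.uniform(-0.1, 0.1, size=(64, 10))  # hypothetical 64-input, 10-output cable
b = np.zeros((1, 10))                            # optional bias vector
x = np.random.rand(1, 64)                        # value in the pre-synaptic/input compartment
post = x @ W + b                                 # value placed in the post-synaptic compartment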

Simple Factor Learning Synapse Types

Hebbian rules operate in a local manner – they generally use information that is immediately available to synapses in both space and time – and come in a wide variety of flavors. One general way to categorize variants of Hebbian learning is to clarify what (neural) statistics they operate on, e.g., whether they work with real-valued information or discrete spikes, and how many factors (or distinct terms) are involved in calculating the rule. (Note that, in principle, all forms of plasticity in ngc-learn are technically local, factor-based rules.)

(Two-Factor) Hebbian Synapse

This synapse performs a linear transform of its input signals and evolves according to a simple, strictly two-factor update rule. In other words, the underlying synaptic efficacy matrix is changed according to a product between pre-synaptic compartment values (pre) and post-synaptic compartment values (post), which can contain any type of vector/matrix statistics.
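
The following is a minimal sketch, in plain NumPy, of what such a two-factor adjustment looks like; the function name and the exact treatment of the learning rate, weight decay, and sign factor are illustrative assumptions and do not reproduce the component's internal code:

import numpy as np

def hebbian_update(W, pre, post, eta=1e-3, w_decay=0., signVal=1.):
    # Two-factor term: product of pre-synaptic and post-synaptic statistics.
    dW = pre.T @ post
    # Optional (L2) weight decay applied to the computed adjustment.
    dW = dW - w_decay * W
    # signVal = -1 (or a negative eta) yields a descent-style update.
    return W + signVal * eta * dW

W = np.random.uniform(-0.1, 0.1, size=(64, 10))
pre = np.random.rand(1, 64)    # pre-synaptic compartment values
post = np.random.rand(1, 10)   # post-synaptic compartment values
W = hebbian_update(W, pre, post)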

class ngclearn.components.HebbianSynapse(*args: Any, **kwargs: Any)[source]

A synaptic cable that adjusts its efficacies via a two-factor Hebbian adjustment rule.

Parameters:
  • name – the string name of this cell

  • shape – tuple specifying shape of this synaptic cable (usually a 2-tuple with number of inputs by number of outputs)

  • eta – global learning rate

  • wInit – a kernel to drive initialization of this synaptic cable’s values; typically a tuple whose first element is a string naming the initialization scheme to use, e.g., (“uniform”, -0.1, 0.1) samples U(-0.1, 0.1) for each dimension/value of this cable’s underlying value matrix

  • bInit – a kernel to drive initialization of biases for this synaptic cable (Default: None, which turns off/disables biases)

  • w_bound – maximum weight to softly bound this cable’s value matrix to; if set to 0, then no synaptic value bounding will be applied

  • is_nonnegative – enforce that synaptic efficacies are always non-negative after each synaptic update (if False, no constraint will be applied)

  • w_decay – degree to which (L2) synaptic weight decay is applied to the computed Hebbian adjustment (Default: 0); note that decay is not applied to any configured biases

  • signVal – multiplicative factor to apply to final synaptic update before it is applied to synapses; this is useful if gradient descent style optimization is required (as Hebbian rules typically yield adjustments for ascent)

  • optim_type

    optimization scheme to physically alter synaptic values once an update is computed (Default: “sgd”); supported schemes include “sgd” and “adam”

    Note:

    technically, if “sgd” or “adam” is used with signVal = 1, then the ascent form of each rule is employed; setting signVal = -1 (or using a negative learning rate) means the descent form of the chosen optimization scheme is employed

  • pre_wght – pre-synaptic weighting factor (Default: 1.)

  • post_wght – post-synaptic weighting factor (Default: 1.)

  • Rscale – a fixed scaling factor to apply to the synaptic transform (Default: 1.), i.e., yields: out = ((W * Rscale) * in) + b

  • key – PRNG key to control determinism of any underlying random values associated with this synaptic cable

  • useVerboseDict – triggers slower, verbose dictionary mode (Default: False)

  • directory – string indicating directory on disk to save synaptic parameter values to (i.e., the synaptic weight matrix and any configured bias values)

advance_state(**kwargs)[source]
verify_connections()[source]
reset(**kwargs)[source]
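
A hypothetical construction call is shown below; it assumes the parameters documented above are accepted as keyword arguments, and it omits the surrounding model/context setup needed to wire the component's pre/post compartments and actually run a simulation:

from jax import random
from ngclearn.components import HebbianSynapse

key = random.PRNGKey(1234)
W1 = HebbianSynapse(name="W1", shape=(64, 10), eta=1e-3,
                    wInit=("uniform", -0.1, 0.1), w_bound=1.,
                    is_nonnegative=False, signVal=-1.,
                    optim_type="sgd", key=key)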

Spike-Timing-Dependent Plasticity (STDP) Synapse Types

Synapses that evolve according to a spike-timing-dependent plasticity (STDP) process operate, at a high level, much like multi-factor Hebbian rules (given that STDP is a generalization of Hebbian adjustment to spike trains) and share many of their properties. Nevertheless, a distinguishing feature of STDP-based synapses is that they must involve action potentials (spikes) in their calculations and typically compute synaptic change according to the relative timing of spikes. In principle, any of the synapses in this grouping of components adapt their efficacies according to rules that involve at least four factors, i.e., a pre-synaptic spike (an “event”), a pre-synaptic delta timing (which can come in the form of a trace), a post-synaptic spike (or event), and a post-synaptic delta timing (which can also be a trace). In addition, STDP rules in ngc-learn typically enforce soft/hard synaptic strength bounding, i.e., there is a maximum magnitude allowed for any single synaptic efficacy, and, by default, enforce that synaptic strengths are non-negative.

Trace-based STDP

This is a four-factor STDP rule that adjusts the underlying synaptic strength matrix via a weighted combination of long-term depression (LTD) and long-term potentiation (LTP). For the LTP portion of the update, a pre-synaptic trace and a post-synaptic event/spike-trigger are used, and for the LTD portion of the update, a pre-synaptic event/spike-trigger and a post-synaptic trace are utilized. Note that this specific rule can be configured to use different forms of soft weight bounding, including a scheme that recovers a power-scaling form of STDP (via the hyper-parameter mu).
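
The following is a minimal NumPy sketch of the four-factor, trace-based update described above (not the component's internal implementation); the particular soft-bounding form and the helper names are illustrative assumptions:

import numpy as np

def trace_stdp_update(W, pre_spk, pre_tr, post_spk, post_tr,
                      Aplus=1., Aminus=1., eta=1e-3, w_bound=1.):
    # LTP: pre-synaptic trace paired with post-synaptic spike/event.
    ltp = Aplus * (pre_tr.T @ post_spk)
    # LTD: pre-synaptic spike/event paired with post-synaptic trace.
    ltd = Aminus * (pre_spk.T @ post_tr)
    # Soft bounding keeps efficacies within [0, w_bound].
    dW = (w_bound - W) * ltp - W * ltd
    return np.clip(W + eta * dW, 0., w_bound)

W = np.random.uniform(0., 0.3, size=(64, 10))
pre_spk = (np.random.rand(1, 64) < 0.05).astype(float)   # binary spike events
post_spk = (np.random.rand(1, 10) < 0.05).astype(float)
pre_tr = np.random.rand(1, 64) * 0.2                      # stand-ins for trace values
post_tr = np.random.rand(1, 10) * 0.2
W = trace_stdp_update(W, pre_spk, pre_tr, post_spk, post_tr)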

class ngclearn.components.TraceSTDPSynapse(*args: Any, **kwargs: Any)[source]

A synaptic cable that adjusts its efficacies via a trace-based form of spike-timing-dependent plasticity (STDP), including an optional power-scale dependence that can be equipped to the Hebbian adjustment (the strength of which is controlled by a scalar factor).

References:
Morrison, Abigail, Ad Aertsen, and Markus Diesmann. “Spike-timing-dependent plasticity in balanced random networks.” Neural Computation 19.6 (2007): 1437-1467.

Bi, Guo-qiang, and Mu-ming Poo. “Synaptic modification by correlated activity: Hebb’s postulate revisited.” Annual Review of Neuroscience 24.1 (2001): 139-166.

Parameters:
  • name – the string name of this cell

  • shape – tuple specifying shape of this synaptic cable (usually a 2-tuple with number of inputs by number of outputs)

  • eta – global learning rate

  • Aplus – strength of long-term potentiation (LTP)

  • Aminus – strength of long-term depression (LTD)

  • mu – controls the power scale of the Hebbian shift

  • preTrace_target – controls degree of pre-synaptic disconnect, i.e., amount of decay (higher -> lower synaptic values)

  • wInit – a kernel to drive initialization of this synaptic cable’s values; typically a tuple whose first element is a string naming the initialization scheme to use, e.g., (“uniform”, -0.1, 0.1) samples U(-0.1, 0.1) for each dimension/value of this cable’s underlying value matrix

  • w_norm – if not None, applies an L1 norm constraint to synapses

  • norm_T – clocked time at which to apply L1 synaptic norm constraint

  • key – PRNG key to control determinism of any underlying random values associated with this synaptic cable

  • useVerboseDict – triggers slower, verbose dictionary mode (Default: False)

  • directory – string indicating directory on disk to save synaptic parameter values to (i.e., the synaptic weight matrix and any configured bias values)

advance_state(dt, t, **kwargs)[source]
verify_connections()[source]
reset(**kwargs)[source]

Exponential STDP

This is a four-factor STDP rule that directly incorporates a controllable exponential synaptic strength dependency into its dynamics. This synapse’s LTP and LTD use traces and spike events in a manner similar to the trace-based STDP described above.
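
As a hedged illustration of what an exponential weight dependence can look like (the precise functional form used by ExpSTDPSynapse may differ), the sketch below scales the LTP term by exp(-exp_beta * W), so potentiation weakens as an efficacy approaches its bound:

import numpy as np

def exp_stdp_update(W, pre_spk, pre_tr, post_spk, post_tr,
                    exp_beta=1., Aplus=1., Aminus=1., eta=1e-3, w_bound=1.):
    # Exponential weight dependence on the LTP term (illustrative form).
    ltp = Aplus * np.exp(-exp_beta * W) * (pre_tr.T @ post_spk)
    # LTD: pre-synaptic spike/event paired with post-synaptic trace.
    ltd = Aminus * (pre_spk.T @ post_tr)
    return np.clip(W + eta * (ltp - ltd), 0., w_bound)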

class ngclearn.components.ExpSTDPSynapse(*args: Any, **kwargs: Any)[source]

A synaptic cable that adjusts its efficacies via a trace-based form of spike-timing-dependent plasticity (STDP) based on an exponential weight dependence (the strength of which is controlled by a scalar factor).

References:
Nessler, Bernhard, et al. “Bayesian computation emerges in generic cortical microcircuits through spike-timing-dependent plasticity.” PLoS Computational Biology 9.4 (2013): e1003037.

Bi, Guo-qiang, and Mu-ming Poo. “Synaptic modification by correlated activity: Hebb’s postulate revisited.” Annual Review of Neuroscience 24.1 (2001): 139-166.

Parameters:
  • name – the string name of this cell

  • shape – tuple specifying shape of this synaptic cable (usually a 2-tuple with number of inputs by number of outputs)

  • eta – global learning rate

  • exp_beta – controls effect of exponential Hebbian shift/dependency

  • Aplus – strength of long-term potentiation (LTP)

  • Aminus – strength of long-term depression (LTD)

  • preTrace_target – controls degree of pre-synaptic disconnect, i.e., amount of decay (higher -> lower synaptic values)

  • wInit – a kernel to drive initialization of this synaptic cable’s values; typically a tuple whose first element is a string naming the initialization scheme to use, e.g., (“uniform”, -0.1, 0.1) samples U(-0.1, 0.1) for each dimension/value of this cable’s underlying value matrix

  • key – PRNG key to control determinism of any underlying random values associated with this synaptic cable

  • useVerboseDict – triggers slower, verbose dictionary mode (Default: False)

  • directory – string indicating directory on disk to save synaptic parameter values to (i.e., the synaptic weight matrix and any configured bias values)

advance_state(dt, t, **kwargs)[source]
verify_connections()[source]
reset(**kwargs)[source]