SNN-BA (Samadi et al., 2017)

This circuit implements the spiking neural model of Samadi et al., 2017 [1]. Specifically, this model is supervised and can be used to process sensory pattern (row) vector(s) x to predict target (row) vector(s) y. Beyond settling and update routines, this class offers a prediction function that carries out ancestral projection to efficiently produce label distributions or regression vector outputs. Note that “SNN” denotes “spiking neural network” and “BA” stands for “broadcast alignment”. Unlike other models, this class does not feature a separate calc_updates() method, since its settle() routine adjusts synaptic efficacies dynamically (if configured to do so).

The SNN-BA architecture is depicted in the following diagram:

[Figure: SNN-BA architecture diagram (../_images/snn_ba.png)]
class ngclearn.museum.snn_ba.SNN_BA(args)[source]

A spiking neural network (SNN) classifier that adapts its synaptic cables via broadcast alignment. Specifically, this model is a generalization of the one proposed in:

Samadi, Arash, Timothy P. Lillicrap, and Douglas B. Tweed. “Deep learning with dynamic spiking neurons and fixed feedback weights.” Neural computation 29.3 (2017): 578-602.

This model encodes its real-valued inputs as Poisson spike trains, with spikes emitted at a rate of approximately 63.75 Hz. The internal and output nodes follow the leaky integrate-and-fire spike response model with a relative refractory period of 1.0 ms. The integration time constant for this model is set to 0.25 ms.
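The rate-coded Poisson encoding described above can be sketched as follows. This is a minimal NumPy illustration, not the library's internal routine; the function name, the assumption that `x` is normalized to [0, 1], and the treatment of the 0.25 ms step as the sampling interval are all assumptions made for this example:

```python
import numpy as np

def poisson_encode(x, T, max_rate=63.75, dt=0.25, seed=69):
    """Encode a [0,1]-normalized row vector x as a (T, dim) spike train.

    Each dt-millisecond step emits a spike with probability
    x * max_rate * (dt / 1000), i.e., the per-step Bernoulli probability
    implied by a Poisson rate of (x * max_rate) Hz.
    """
    rng = np.random.default_rng(seed)
    p = np.clip(x * max_rate * (dt / 1000.0), 0.0, 1.0)  # per-step spike prob
    return (rng.random((T, x.shape[-1])) < p).astype(np.float32)

# Three input channels at 0%, 50%, and 100% of the maximum firing rate
spikes = poisson_encode(np.array([0.0, 0.5, 1.0]), T=100)
```

A channel with input 0.0 never fires, while a channel with input 1.0 fires at roughly 63.75 Hz on average over a long simulation window.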

Node Name Structure:
z2 -(z2-mu1)-> mu1 ; z1 -(z1-mu0)-> mu0 ; e0 ; z0
e0 -> d1 and z1 -> d1, where d1 is a teaching signal for z1
Note that z2 = x and z0 = y, yielding a classifier
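The broadcast-alignment credit assignment implied by the wiring above (e0 and z1 feeding the teaching signal d1) can be sketched in NumPy. This is an illustrative simplification, not the exact update rule used by this class; the learning rate, the use of raw spike vectors in place of filtered traces, and all variable names are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
x_dim, z_dim, y_dim = 4, 8, 3

B = rng.standard_normal((y_dim, z_dim)) * 0.1   # fixed random feedback weights
W1 = rng.standard_normal((x_dim, z_dim)) * 0.1  # z2 -> z1 forward cable

# Output error at e0: target spikes/labels minus emitted output spikes
y_target = np.array([[0.0, 1.0, 0.0]])
y_spikes = np.array([[0.0, 0.0, 1.0]])
e0 = y_target - y_spikes

# Broadcast alignment: project e0 through the FIXED matrix B (never learned)
d1 = e0 @ B  # teaching signal for layer z1

# Hebbian-style weight change for the z2 -> z1 cable, gated by the input spikes
z2_spikes = np.array([[1.0, 0.0, 1.0, 0.0]])
eta = 0.01
dW1 = eta * z2_spikes.T @ d1
W1 += dW1
```

The key design point is that B is random and held fixed, sidestepping the weight-transport problem of exact backpropagation.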
Parameters

args – a Config dictionary containing necessary meta-parameters for the SNN-BA

DEFINITION NOTE:
args should contain values for the following:
* batch_size - the fixed batch-size to be fed into this model
* z_dim - # of latent variables in layer z1
* x_dim - # of latent variables in layer z2 or sensory x
* y_dim - # of variables in layer z0 or target y
* seed - number to control determinism of weight initialization
* wght_sd - standard deviation of Gaussian initialization of weights (optional)
* T - # of time steps to take when conducting iterative settling (if not online)
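As an illustration, the required meta-parameters could be collected as follows. This is a hypothetical dict-style sketch with made-up values (sized for an MNIST-like task); the actual Config object in ngc-learn is typically constructed from a configuration file, so treat the keys below as the required names rather than a literal constructor call:

```python
# Hypothetical meta-parameter set for the SNN-BA (illustrative values only)
args = {
    "batch_size": 200,  # fixed mini-batch size fed into the model
    "z_dim": 1000,      # number of latent variables in layer z1
    "x_dim": 784,       # dimensionality of sensory input x (layer z2)
    "y_dim": 10,        # dimensionality of target y (layer z0)
    "seed": 69,         # controls determinism of weight initialization
    "wght_sd": 0.055,   # std-dev of Gaussian weight init (optional)
    "T": 100,           # number of settling steps (if not online)
}
```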
predict(x)[source]

Predicts the target for a given x. Specifically, this function will return spike counts, one per class in y – taking the argmax of these counts will yield the model’s predicted label.

Parameters

x – the input sample to project through the NGC graph

Returns

y_sample (spike counts from the underlying predictive model)
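As noted above, the predicted label is obtained by taking the argmax over the returned per-class spike counts. A minimal NumPy sketch (the count matrix shown is made-up data of shape (batch, y_dim)):

```python
import numpy as np

# Per-class spike counts for a mini-batch of 2 samples over 3 classes
y_count = np.array([[12.0, 3.0, 1.0],
                    [ 2.0, 9.0, 8.0]])

labels = np.argmax(y_count, axis=1)  # predicted class index per sample
```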

settle(x, y=None, calc_update=True)[source]

Runs an iterative settling process to find latent states given clamped input and output variables, specifically simulating the dynamics of the spiking neurons internal to this SNN model. Note that this function returns two outputs: the first is a count matrix in which each row corresponds to a sample in the mini-batch and each column to the spike count for one class in y; the second is an approximate probability distribution, computed as a softmax over the electrical currents averaged across the steps of the simulation.

Parameters
  • x – sensory input to clamp top-most layer (z2) to

  • y – target output activity, i.e., label target

  • calc_update – if True, computes synaptic updates @ end of settling process (Default = True)

Returns

y_count (spike counts per class in y), y_hat (approximate probability distribution for y)
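The second return value can be sketched as a softmax over the time-averaged electrical currents, as described above. This is a minimal NumPy illustration of that computation, not the class's internal code; the `(T, batch, y_dim)` shape of the recorded currents is an assumption:

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# currents: electrical currents at the output layer, recorded over T steps
T, batch, y_dim = 50, 2, 3
currents = np.ones((T, batch, y_dim))

y_hat = softmax(currents.mean(axis=0))  # average over time, then softmax
```

Each row of `y_hat` sums to 1 and can be read as an approximate class distribution for the corresponding sample.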

clear()[source]

Clears the states/values of the stateful nodes in this NGC system

References:
[1] Samadi, Arash, Timothy P. Lillicrap, and Douglas B. Tweed. “Deep learning with dynamic spiking neurons and fixed feedback weights.” Neural computation 29.3 (2017): 578-602.