GNCN-t1-FFM (Whittington & Bogacz, 2017)

This circuit implements the model proposed in (Whittington & Bogacz, 2017) [1]. Specifically, this model is supervised and can be used to process sensory pattern (row) vector(s) x to predict target (row) vector(s) y. Beyond its settling and update routines, this class offers a prediction function that carries out ancestral projection to efficiently produce label-distribution or regression-vector outputs. Note that “FFM” denotes “feedforward mapping”.

The GNCN-t1-FFM is depicted by the following diagram:

[Figure: GNCN-t1-FFM architecture diagram (gncn_t1_ffm.png)]
class ngclearn.museum.gncn_t1_ffm.GNCN_t1_FFM(args)[source]

Structure for constructing the model proposed in:

Whittington, James C. R., and Rafal Bogacz. “An approximation of the error backpropagation algorithm in a predictive coding network with local Hebbian synaptic plasticity.” Neural Computation 29.5 (2017): 1229-1262.

This model, under the NGC computational framework, is referred to as the GNCN-t1-FFM, a slightly modified form of the naming convention in (Ororbia & Kifer 2022, Supplementary Material). “FFM” denotes feedforward mapping.

Node Name Structure:
z3 -(z3-mu2)-> mu2 ;e2; z2 -(z2-mu1)-> mu1 ;e1; z1 -(z1-mu0)-> mu0 ;e0; z0
Note that z3 = x and z0 = y, yielding a classifier or regressor
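To make the wiring concrete, below is a rough sketch of the layer-wise computations implied by this structure (the exact placement of the activation functions is an assumption based on the generic NGC design in Ororbia & Kifer 2022; the library's internals may differ in detail):

    mu2 = phi(z3) * W(z3-mu2)             e2 = z2 - mu2
    mu1 = phi(z2) * W(z2-mu1)             e1 = z1 - mu1
    mu0 = out_fx(phi(z1) * W(z1-mu0))     e0 = z0 - mu0

Here phi is act_fx (taken to be the identity for the clamped input z3), each e is a local error node, and the Hebbian-like adjustment for each synaptic matrix is proportional to phi(z_l)^T * e_(l-1), i.e., a product of purely local pre- and post-synaptic signals.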
Parameters

args – a Config dictionary containing necessary meta-parameters for the GNCN-t1-FFM

DEFINITION NOTE:
args should contain values for the following:
* batch_size - the fixed batch-size to be fed into this model
* x_dim - # of latent variables in layer z3 or sensory input x
* z_dim - # of latent variables in layers z1 and z2
* y_dim - # of latent variables in layer z0 or output target y
* seed - number to control determinism of weight initialization
* wght_sd - standard deviation of Gaussian initialization of weights
* beta - latent state update factor
* leak - strength of the leak variable in the latent states
* lmbda - strength of the Laplacian prior applied over latent state activities
* K - # of steps to take when conducting iterative inference/settling
* act_fx - activation function for layers z1, z2
* out_fx - activation function for layer mu0 (prediction of z0 or y) (Default: identity)
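As a minimal construction sketch (the values below are illustrative, not tuned; it is assumed here that a Config can be instantiated empty and populated via setArg, whereas in practice these values are typically read from a .cfg file):

    from ngclearn.utils.config import Config
    from ngclearn.museum.gncn_t1_ffm import GNCN_t1_FFM

    args = Config()                  # assumption: empty Config filled in manually
    args.setArg("batch_size", 128)
    args.setArg("x_dim", 784)        # e.g., flattened 28x28 sensory patterns
    args.setArg("z_dim", 360)
    args.setArg("y_dim", 10)         # e.g., a 10-class label space
    args.setArg("seed", 69)
    args.setArg("wght_sd", 0.025)
    args.setArg("beta", 0.1)
    args.setArg("leak", 0.0)
    args.setArg("lmbda", 0.0)
    args.setArg("K", 50)
    args.setArg("act_fx", "relu")
    args.setArg("out_fx", "softmax")

    model = GNCN_t1_FFM(args)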
predict(x)[source]

Predicts the target (either a probability distribution over labels, i.e., p(y|x), or a vector of regression targets) for a given x

Parameters

x – the input sample to project through the NGC graph

Returns

y_sample (sample(s) of the underlying predictive model)
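A hedged usage sketch (ngc-learn is built on TensorFlow 2, so x is assumed to be a rank-2 tensor of shape (batch_size, x_dim)):

    import tensorflow as tf

    x = tf.random.uniform([128, 784])  # illustrative stand-in for real sensory data
    y_hat = model.predict(x)           # ancestral projection through the NGC graph
    print(y_hat.shape)                 # expected: (128, 10), i.e., (batch_size, y_dim)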

settle(x, y, calc_update=True)[source]

Run an iterative settling process to find latent states given clamped input and output variables

Parameters
  • x – sensory input to clamp top-most layer (z3) to

  • y – target output activity, i.e., label or regression target

  • calc_update – if True, computes synaptic updates at the end of the settling process (Default = True)

Returns

y_hat (predicted y)
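For example, with the shapes used in the construction sketch above (y is assumed to be a batch of one-hot label vectors for classification):

    y = tf.one_hot([3] * 128, depth=10)  # illustrative targets, shape (128, 10)
    y_hat = model.settle(x, y)           # K settling steps with z3 = x, z0 = y clamped
    # With calc_update=True, synaptic adjustments are staged during settling
    # and can then be retrieved via calc_updates().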

calc_updates(avg_update=True, decay_rate=-1.0)[source]

Calculate adjustments to this model's parameters given its current internal state values

Returns

delta, a list of synaptic matrix updates (that follow order of .theta)
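The returned delta list is meant to be applied to the model's parameter list .theta via an external optimizer. A sketch under assumptions (that delta is oriented for direct use with apply_gradients, as in the library's walkthroughs, and that the synapses are exposed as model.theta; verify both against your ngc-learn version):

    import tensorflow as tf

    opt = tf.keras.optimizers.SGD(learning_rate=0.01)
    delta = model.calc_updates()
    opt.apply_gradients(zip(delta, model.theta))  # pairs follow the order of .theta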

clear()[source]

Clears the states/values of the stateful nodes in this NGC system
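Putting the routines together, a single per-batch training iteration might look like the following sketch (dataset is a hypothetical iterable of (x, y) batches; opt is the optimizer from the sketch above); clear() is called once per batch so that stale node states do not carry over into the next settling run:

    for x_batch, y_batch in dataset:
        y_hat = model.settle(x_batch, y_batch)        # iterative inference
        delta = model.calc_updates()                  # staged synaptic adjustments
        opt.apply_gradients(zip(delta, model.theta))  # assumption: .theta holds weights
        model.clear()                                 # reset stateful nodes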

References:
[1] Whittington, James C. R., and Rafal Bogacz. “An approximation of the error backpropagation algorithm in a predictive coding network with local Hebbian synaptic plasticity.” Neural Computation 29.5 (2017): 1229-1262.