ngclearn.utils.optim package
Submodules
ngclearn.utils.optim.adam module
- class ngclearn.utils.optim.adam.Adam(learning_rate=0.001, beta1=0.9, beta2=0.999, epsilon=1e-08)[source]
Bases:
Opt
Implements the adaptive moment estimation (Adam) algorithm as a decoupled update rule given adjustments produced by a credit assignment algorithm/process.
- Parameters:
learning_rate – step size coefficient for Adam update
beta1 – 1st moment control factor
beta2 – 2nd moment control factor
epsilon – numerical stability coefficient (for calculating final update)
- update(theta, updates)[source]
Apply an update tensor to the current “theta” (parameter) tensor according to an internally specified optimization/change rule.
- Parameters:
theta – parameter value tensor to change
updates – externally produced updates to apply to “theta” (note that updates should be same shape as “theta” to ensure expected behavior)
- ngclearn.utils.optim.adam.step_update(param, update, g1, g2, lr, beta1, beta2, time, eps)
Runs one step of Adam over a set of parameters given updates. The dynamics for any set of parameters are as follows:
g1 = beta1 * g1 + (1 - beta1) * update
g2 = beta2 * g2 + (1 - beta2) * (update)^2
g1_unbiased = g1 / (1 - beta1**time)
g2_unbiased = g2 / (1 - beta2**time)
param = param - lr * g1_unbiased / (sqrt(g2_unbiased) + epsilon)
- Parameters:
param – parameter tensor to change/adjust
update – update tensor to be applied to parameter tensor (must be same shape as “param”)
g1 – first moment factor/correction factor to use in parameter update (must be same shape as “update”)
g2 – second moment factor/correction factor to use in parameter update (must be same shape as “update”)
lr – global step size value to be applied to updates to parameters
beta1 – 1st moment control factor
beta2 – 2nd moment control factor
time – current time t or iteration step/call to this Adam update
eps – numerical stability coefficient (for calculating final update)
- Returns:
adjusted parameter tensor (same shape as “param”)
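The dynamics above can be checked with a small self-contained sketch. This is an illustrative NumPy re-implementation of the documented update equations, not the library routine itself (ngclearn's own `step_update` may operate on a different array backend and return its moment tensors differently):

```python
import numpy as np

def adam_step(param, update, g1, g2, lr, beta1, beta2, time, eps):
    # Follows the documented dynamics line by line.
    g1 = beta1 * g1 + (1. - beta1) * update          # 1st moment estimate
    g2 = beta2 * g2 + (1. - beta2) * np.square(update)  # 2nd moment estimate
    g1_unbiased = g1 / (1. - beta1 ** time)          # bias correction
    g2_unbiased = g2 / (1. - beta2 ** time)
    param = param - lr * g1_unbiased / (np.sqrt(g2_unbiased) + eps)
    return param, g1, g2

param = np.array([1.0, -2.0])
g1 = np.zeros(2)  # moment tensors start at zero, same shape as "update"
g2 = np.zeros(2)
update = np.array([0.5, -0.5])
param, g1, g2 = adam_step(param, update, g1, g2,
                          lr=0.001, beta1=0.9, beta2=0.999, time=1, eps=1e-8)
```

Note that at `time=1` the bias correction makes each step roughly `lr * sign(update)`, which is why Adam's initial steps have magnitude close to the learning rate regardless of the raw update scale.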
ngclearn.utils.optim.opt module
- class ngclearn.utils.optim.opt.Opt(name)[source]
Bases:
object
A generic base-class for an optimizer.
- Parameters:
name – string name of optimizer
- update(theta, updates)[source]
Apply an update tensor to the current “theta” (parameter) tensor according to an internally specified optimization/change rule.
- Parameters:
theta – parameter value tensor to change
updates – externally produced updates to apply to “theta” (note that updates should be same shape as “theta” to ensure expected behavior)
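A custom optimizer would subclass `Opt` and override `update`. The sketch below is hypothetical: it assumes the base class only stores its name and expects subclasses to supply the update rule (the real `Opt` may carry additional internal state), and the `ScaledSGD` subclass and its returned-list convention are invented for illustration:

```python
class Opt:
    """Minimal stand-in for ngclearn.utils.optim.opt.Opt (sketch only)."""
    def __init__(self, name):
        self.name = name  # string name of the optimizer

    def update(self, theta, updates):
        # Subclasses implement the actual optimization/change rule.
        raise NotImplementedError

class ScaledSGD(Opt):
    # Hypothetical subclass: applies updates scaled by a fixed step size.
    def __init__(self, learning_rate=0.01):
        super().__init__(name="scaled_sgd")
        self.learning_rate = learning_rate

    def update(self, theta, updates):
        # "updates" must match the shape of "theta", per the docs above.
        return [t - self.learning_rate * u for t, u in zip(theta, updates)]

opt = ScaledSGD(learning_rate=0.5)
new_theta = opt.update([1.0, 2.0], [2.0, -2.0])
```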
ngclearn.utils.optim.sgd module
- class ngclearn.utils.optim.sgd.SGD(learning_rate=0.001)[source]
Bases:
Opt
Implements stochastic gradient descent (SGD) as a decoupled update rule given adjustments produced by a credit assignment algorithm/process.
- Parameters:
learning_rate – step size coefficient for SGD update
- update(theta, updates)[source]
Apply an update tensor to the current “theta” (parameter) tensor according to an internally specified optimization/change rule.
- Parameters:
theta – parameter value tensor to change
updates – externally produced updates to apply to “theta” (note that updates should be same shape as “theta” to ensure expected behavior)
- ngclearn.utils.optim.sgd.step_update(param, update, lr)
Runs one step of SGD over a set of parameters given updates.
- Parameters:
param – parameter tensor to change/adjust
update – update tensor to be applied to parameter tensor (must be same shape as “param”)
lr – global step size to apply when adjusting parameters
- Returns:
adjusted parameter tensor (same shape as “param”)
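The SGD step reduces to a single subtractive rule. The sketch below assumes the same subtractive convention as the Adam dynamics above (`param - lr * update`) and is a plain NumPy illustration, not the library routine:

```python
import numpy as np

def sgd_step(param, update, lr):
    # One SGD step: shift parameters against the update direction,
    # scaled by the global learning rate.
    return param - lr * update

theta = np.array([1.0, 2.0])
delta = np.array([10.0, -10.0])  # externally produced update, same shape as theta
theta = sgd_step(theta, delta, lr=0.1)
# theta is now [0.0, 3.0]
```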