The default learning rate schedule.
An epoch-decay learning rate schedule: the learning rate is decayed by a function argument applied to the number of epochs run.
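A minimal sketch of that idea in Python; the function name and the semantics of the decay function are illustrative assumptions, not the library's API:

    def epoch_decay_lr(base_lr, epoch, decay_fn):
        # decay_fn maps the number of epochs run to a decay factor.
        return base_lr * decay_fn(epoch)

    # Example: halve the learning rate every 10 epochs.
    lr = epoch_decay_lr(0.1, epoch=25, decay_fn=lambda e: 0.5 ** (e // 10))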
A learning rate schedule based on warm-up iterations.
EpochSchedule is a learning rate schedule that configures the learning rate according to a pre-defined Regime (a sketch appears after the Regime entry below).
EpochStep is a learning rate schedule that rescales the learning rate by gamma every stepSize epochs.
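For example, with gamma = 0.1 and stepSize = 30 the rate drops tenfold every 30 epochs. A hedged sketch (parameter names assumed):

    def epoch_step_lr(base_lr, epoch, gamma=0.1, step_size=30):
        # Multiply by gamma once per completed step_size epochs.
        return base_lr * gamma ** (epoch // step_size)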
Exponential is a learning rate schedule that rescales the learning rate as lr_{n + 1} = lr * decayRate ^ (iter / decayStep).
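A direct transcription of the formula above as a Python sketch (function name and default values are illustrative):

    def exponential_lr(base_lr, iteration, decay_rate=0.96, decay_step=100):
        # lr_{n + 1} = lr * decayRate ^ (iter / decayStep)
        return base_lr * decay_rate ** (iteration / decay_step)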
Hyperparameter schedule for SGD.
Similar to Step, but it allows non-uniform steps defined by stepSizes.
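A sketch of the non-uniform variant, assuming stepSizes act as iteration milestones at which the rate is multiplied by gamma (that interpretation is an assumption):

    import bisect

    def multistep_lr(base_lr, iteration, step_sizes=(300, 700, 900), gamma=0.1):
        # bisect counts how many milestones this iteration has passed.
        return base_lr * gamma ** bisect.bisect_right(step_sizes, iteration)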
NaturalExp is a learning rate schedule that rescales the learning rate by exp(-decay_rate * iter / decay_step), following TensorFlow's natural_exp_decay.
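The same formula transcribed into a Python sketch (names and defaults assumed):

    import math

    def natural_exp_lr(base_lr, iteration, decay_rate=0.1, decay_step=1):
        # exp(-decay_rate * iter / decay_step), as in TensorFlow's
        # natural_exp_decay.
        return base_lr * math.exp(-decay_rate * iteration / decay_step)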
Plateau is a learning rate schedule that reduces the learning rate when a metric has stopped improving.
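A hedged sketch of that logic, assuming a loss-like metric (lower is better), a multiplicative reduction factor, and a patience counter; none of these names come from the summary above:

    class PlateauSketch:
        def __init__(self, lr, factor=0.1, patience=10, min_lr=0.0):
            self.lr = lr
            self.factor = factor        # multiplicative reduction
            self.patience = patience    # checks to wait before reducing
            self.min_lr = min_lr
            self.best = float("inf")
            self.bad_checks = 0

        def step(self, metric):
            # Reduce lr after `patience` checks without improvement.
            if metric < self.best:
                self.best, self.bad_checks = metric, 0
            else:
                self.bad_checks += 1
                if self.bad_checks >= self.patience:
                    self.lr = max(self.lr * self.factor, self.min_lr)
                    self.bad_checks = 0
            return self.lr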
A learning rate decay policy where the effective learning rate follows a polynomial decay, reaching zero at max_iteration.
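Assuming the usual polynomial form base_lr * (1 - iter / max_iteration) ^ power (the power parameter is an assumption, not stated above), a sketch:

    def poly_lr(base_lr, iteration, max_iteration, power=1.0):
        # Decays polynomially and reaches zero at max_iteration.
        if iteration >= max_iteration:
            return 0.0
        return base_lr * (1.0 - iteration / max_iteration) ** power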
A structure to specify hyperparameters by start epoch and end epoch.
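A sketch of how such a Regime-like structure could drive the EpochSchedule lookup described earlier (field and function names are assumptions):

    from dataclasses import dataclass

    @dataclass
    class RegimeSketch:
        start_epoch: int
        end_epoch: int
        config: dict  # hyperparameters active in [start_epoch, end_epoch]

    def config_for_epoch(regimes, epoch):
        # Return the hyperparameters of the regime covering this epoch.
        for r in regimes:
            if r.start_epoch <= epoch <= r.end_epoch:
                return r.config
        return {}

    regimes = [RegimeSketch(1, 10, {"learning_rate": 1e-2}),
               RegimeSketch(11, 20, {"learning_rate": 1e-3})]
    print(config_for_epoch(regimes, 12))  # {'learning_rate': 0.001}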
Stack several learning rate schedulers.
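One plausible way to stack schedules, assuming each sub-schedule runs for a fixed number of iterations before handing off to the next (the hand-off behavior is an assumption):

    def sequential_lr(base_lr, iteration, segments):
        # segments: list of (duration, schedule_fn) pairs, where
        # schedule_fn(base_lr, local_iter) returns the learning rate.
        for duration, fn in segments:
            if iteration < duration:
                return fn(base_lr, iteration)
            iteration -= duration
        duration, fn = segments[-1]      # stay on the final schedule
        return fn(base_lr, iteration + duration)

    # Example: 100 warm-up iterations, then exponential-style decay.
    lr = sequential_lr(0.1, 150, [
        (100, lambda lr, i: lr * (i + 1) / 100),
        (10**9, lambda lr, i: lr * 0.96 ** (i / 100)),
    ])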
A learning rate decay policy, where the effective learning rate is calculated as base_lr * gamma ^ (floor(iter / stepSize)).
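The formula above as a runnable sketch (parameter names and defaults assumed):

    def step_lr(base_lr, iteration, gamma=0.1, step_size=1000):
        # base_lr * gamma ^ (floor(iter / stepSize))
        return base_lr * gamma ** (iteration // step_size)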
A gradual learning rate increase policy, where the effective learning rate increases by delta after each iteration.
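A sketch of the additive warm-up, assuming the increase stops after a given number of warm-up iterations (that cap is an assumption, not stated above):

    def warmup_lr(base_lr, iteration, delta=1e-4, warmup_iters=200):
        # Grow by delta per iteration, capped at the end of warm-up.
        return base_lr + delta * min(iteration, warmup_iters)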