the model to be trained
the data set used to train a model
the criterion used to evaluate the loss of the model given an input
Trigger the optimization process
the model to be trained
the criterion used to evaluate the loss of the model given an input
the data set used to train a model
Disable gradient clipping
Get the directory where checkpoints are saved
the model to be trained
Enable overwriting when saving checkpoints
Set a checkpoint saved at path, triggered by trigger
the directory in which to save checkpoints
how often to save the checkpoint
the optimizer
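For illustration, the checkpoint settings described above might be wired up as in the sketch below. It assumes BigDL's Scala API; the setCheckpoint and overWriteCheckpoint method names, the Trigger.everyEpoch factory, and the Optimizer[Float, MiniBatch[Float]] type are inferred from these descriptions and common BigDL usage rather than confirmed by this page.

    import com.intel.analytics.bigdl.dataset.MiniBatch
    import com.intel.analytics.bigdl.optim.{Optimizer, Trigger}

    // Hypothetical helper: save a checkpoint under checkpointDir at the end of every epoch,
    // letting newer checkpoints overwrite older ones with the same name.
    def configureCheckpoints(optimizer: Optimizer[Float, MiniBatch[Float]],
                             checkpointDir: String): Unit = {
      optimizer
        .setCheckpoint(checkpointDir, Trigger.everyEpoch)
        .overWriteCheckpoint()
    }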
Set constant gradient clipping
the minimum value to clip by
the maximum value to clip by
Set a new criterion for the optimizer
new criterion
Set the optimizer to drop a certain percentage (dropPercentage) of models during distributed training to accelerate training, because some cached models may take too long.
drop percentage
max drop percentage
batch size
how many iterations to warm up
this optimizer
When to stop, passed in a Trigger
when to end
the optimizer
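A minimal sketch of supplying the stopping condition, assuming a setEndWhen method and the Trigger factories (maxEpoch, maxIteration) in BigDL's optim package; both names are inferred from the descriptions above.

    import com.intel.analytics.bigdl.dataset.MiniBatch
    import com.intel.analytics.bigdl.optim.{Optimizer, Trigger}

    // Hypothetical helper: stop training after ten epochs.
    // Trigger.maxIteration(n) would stop after n iterations instead, if preferred.
    def stopAfterTenEpochs(optimizer: Optimizer[Float, MiniBatch[Float]]): Unit = {
      optimizer.setEndWhen(Trigger.maxEpoch(10))
    }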
Clip gradient to a maximum L2-norm
gradient L2-norm threshold
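The constant clipping and L2-norm clipping options described above can be sketched together as below. The method names (setConstantGradientClipping, setGradientClippingByl2Norm, disableGradientClipping) and the Double arguments are assumptions based on these descriptions and typical BigDL usage, so the exact signatures should be checked against the installed version.

    import com.intel.analytics.bigdl.dataset.MiniBatch
    import com.intel.analytics.bigdl.optim.Optimizer

    // Hypothetical helper showing the clipping options side by side.
    def configureClipping(optimizer: Optimizer[Float, MiniBatch[Float]]): Unit = {
      // Clamp every gradient element into [-1.0, 1.0] ...
      optimizer.setConstantGradientClipping(-1.0, 1.0)
      // ... or rescale gradients so that their overall L2 norm does not exceed 2.0.
      optimizer.setGradientClippingByl2Norm(2.0)
      // Clipping can also be switched off again.
      optimizer.disableGradientClipping()
    }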
Set a new model for the optimizer
new model
Set an optimization method
optimization method
Set a state (learning rate, epochs, ...) for the optimizer
the state to be saved
Set a new training dataset.
training Samples
mini batch size
feature padding strategy, see com.intel.analytics.bigdl.dataset.PaddingParam for details.
label padding strategy, see com.intel.analytics.bigdl.dataset.PaddingParam for details.
the optimizer
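A rough sketch of replacing the training data with padding, assuming a setTrainData(sampleRDD, batchSize, featurePaddingParam, labelPaddingParam) signature and a default PaddingParam constructor; both are inferred from the parameter list above.

    import com.intel.analytics.bigdl.dataset.{MiniBatch, PaddingParam, Sample}
    import com.intel.analytics.bigdl.optim.Optimizer
    import org.apache.spark.rdd.RDD

    // Hypothetical helper: swap in new training samples, padding features and labels
    // with the default PaddingParam strategy inside each mini-batch of 128 samples.
    def refreshTrainingData(optimizer: Optimizer[Float, MiniBatch[Float]],
                            newSamples: RDD[Sample[Float]]): Unit = {
      optimizer.setTrainData(newSamples, 128, PaddingParam[Float](), PaddingParam[Float]())
    }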
Set a new training dataset. Users can supply a customized implementation of the MiniBatch trait to define how data is organized and retrieved in a mini-batch.
training Samples
mini batch size
A user-defined MiniBatch implementation.
the Optimizer
Enable train summary.
Set validation evaluation
how often to evaluate the validation set
validation data set, an RDD of Sample
a set of validation methods (ValidationMethod)
batch size
construct MiniBatch with a specified miniBatch type
Set validation evaluation
how often to evaluate the validation set
validation data set, an RDD of Sample
a set of validation methods (ValidationMethod)
batch size
this optimizer
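A sketch of configuring validation on an RDD of Sample, assuming a setValidation(trigger, sampleRDD, vMethods, batchSize) overload, Trigger.everyEpoch, and the Top1Accuracy validation method; these names come from common BigDL usage rather than from this page.

    import com.intel.analytics.bigdl.dataset.{MiniBatch, Sample}
    import com.intel.analytics.bigdl.optim.{Optimizer, Top1Accuracy, Trigger}
    import org.apache.spark.rdd.RDD

    // Hypothetical helper: evaluate top-1 accuracy on the validation samples
    // at the end of every epoch, using a batch size of 128.
    def configureValidation(optimizer: Optimizer[Float, MiniBatch[Float]],
                            valSamples: RDD[Sample[Float]]): Unit = {
      optimizer.setValidation(Trigger.everyEpoch, valSamples,
        Array(new Top1Accuracy[Float]()), 128)
    }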
Set validation evaluation
how often to evaluate the validation set
validation data set, an RDD of Sample
a set of validation methods (ValidationMethod)
batch size
feature padding strategy, see com.intel.analytics.bigdl.dataset.PaddingParam for details.
label padding strategy, see com.intel.analytics.bigdl.dataset.PaddingParam for details.
this optimizer
Set validation evaluation
how often to evaluate the validation set
validation data set, a DataSet of MiniBatch
a set of validation methods (ValidationMethod)
this optimizer
Enable validation summary.
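Enabling summaries is usually paired with a log directory so training and validation curves can be inspected later (for example with TensorBoard). A hedged sketch, assuming setTrainSummary/setValidationSummary methods and the TrainSummary/ValidationSummary classes from com.intel.analytics.bigdl.visualization; the names are not confirmed by this page.

    import com.intel.analytics.bigdl.dataset.MiniBatch
    import com.intel.analytics.bigdl.optim.Optimizer
    import com.intel.analytics.bigdl.visualization.{TrainSummary, ValidationSummary}

    // Hypothetical helper: write training and validation summaries under logDir/appName.
    def configureSummaries(optimizer: Optimizer[Float, MiniBatch[Float]],
                           logDir: String, appName: String): Unit = {
      optimizer.setTrainSummary(TrainSummary(logDir, appName))
      optimizer.setValidationSummary(ValidationSummary(logDir, appName))
    }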
Make the optimizer not check the singleton model on a node
(Since version 0.1.0) Use bigdl.check.singleton instead
Optimizer is an abstract class used to train a model automatically with certain optimization algorithms.
numeric type, which can be Float or Double
the type of elements in DataSet, such as MiniBatch
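Putting the pieces above together, an end-to-end run might look roughly like the sketch below. It assumes BigDL's Scala Optimizer factory, the ClassNLLCriterion loss, the Adam optimization method, and a toy Sequential model; every name here is taken from common BigDL usage rather than confirmed by this page.

    import com.intel.analytics.bigdl.dataset.Sample
    import com.intel.analytics.bigdl.nn.{ClassNLLCriterion, Linear, LogSoftMax, Sequential}
    import com.intel.analytics.bigdl.optim.{Adam, Optimizer, Top1Accuracy, Trigger}
    import org.apache.spark.rdd.RDD

    // trainSamples and valSamples are assumed to be prepared elsewhere as RDD[Sample[Float]].
    def train(trainSamples: RDD[Sample[Float]], valSamples: RDD[Sample[Float]]) = {
      // A toy model: 10 features in, 2 classes out.
      val model = Sequential[Float]()
        .add(Linear[Float](10, 2))
        .add(LogSoftMax[Float]())

      val optimizer = Optimizer(
        model = model,
        sampleRDD = trainSamples,
        criterion = ClassNLLCriterion[Float](),
        batchSize = 128)

      optimizer
        .setOptimMethod(new Adam[Float]())
        .setEndWhen(Trigger.maxEpoch(10))
        .setValidation(Trigger.everyEpoch, valSamples, Array(new Top1Accuracy[Float]()), 128)
        .optimize() // blocks until the end trigger fires and returns the trained model
    }

Each setter documented above returns the optimizer itself, which is why the calls in this sketch can be chained.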