Represent an accuracy result.
Adadelta implementation for SGD: http://arxiv.
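The Adadelta rule keeps decayed running averages of squared gradients and squared updates, and scales each step by the ratio of their RMS values, so no global learning rate is needed. A minimal, framework-agnostic Python sketch of that update (the helper name and `state` dictionary are illustrative, not the library's API):

```python
def adadelta_step(x, grad, state, rho=0.9, eps=1e-6):
    # Decayed average of squared gradients: E[g^2].
    state["eg2"] = rho * state["eg2"] + (1 - rho) * grad * grad
    # Step scaled by RMS(previous updates) / RMS(gradients).
    dx = -(((state["edx2"] + eps) ** 0.5) / ((state["eg2"] + eps) ** 0.5)) * grad
    # Decayed average of squared updates: E[dx^2].
    state["edx2"] = rho * state["edx2"] + (1 - rho) * dx * dx
    return x + dx, state

# Toy run: minimize f(x) = x^2 (gradient 2x) starting from x = 1.0.
x, state = 1.0, {"eg2": 0.0, "edx2": 0.0}
for _ in range(500):
    x, state = adadelta_step(x, 2 * x, state)
```

The `rho` and `eps` defaults match common conventions for this method; the library's own hyperparameter names may differ.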
An implementation of Adagrad.
An implementation of Adam http://arxiv.
An implementation of Adamax http://arxiv.
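Adam (and its Adamax variant, described in the same paper) maintains bias-corrected estimates of the gradient's first and second moments. A hedged Python sketch of one Adam step, with `t` counting iterations from 1 (names and defaults are illustrative, not the library's signature):

```python
def adam_step(x, grad, m, v, t, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    # Biased first- and second-moment estimates.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    # Bias correction (t starts at 1).
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    return x - lr * m_hat / (v_hat ** 0.5 + eps), m, v

# Toy run: minimize f(x) = x^2 (gradient 2x) starting from x = 1.0.
x, m, v = 1.0, 0.0, 0.0
for t in range(1, 201):
    x, m, v = adam_step(x, 2 * x, m, v, t)
```

Note how the bias correction makes the very first step roughly equal to `lr` regardless of gradient scale.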
The optimizer that runs on a distributed cluster.
Validate a model on a distributed cluster.
Model evaluator.
Apply both L1 and L2 regularization
Apply L1 regularization
Apply L2 regularization
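The three regularizers above add a penalty on the weights to the training objective. A small sketch of the gradient contribution of a combined L1/L2 penalty, assuming the penalty `l1 * |w| + (l2 / 2) * w^2` (the function name and exact penalty form are assumptions for illustration, not the library's definition):

```python
def l1l2_grad(w, l1=0.0, l2=0.0):
    # Gradient of l1 * |w| + (l2 / 2) * w^2 with respect to w.
    # The L1 subgradient at w = 0 is taken as 0.
    sign = (w > 0) - (w < 0)
    return l1 * sign + l2 * w
```

Setting `l2 = 0` gives pure L1 regularization and `l1 = 0` gives pure L2, mirroring how the two single-penalty regularizers relate to the combined one.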
This implementation of L-BFGS relies on a user-provided line search function.
Line search strategy.
Optimize a model on a single machine
Validate a model on a single machine
Use the given dataset with certain validation methods, such as Top1Accuracy, passed as arguments to its test method.
This evaluation method calculates the loss of the output with respect to the target.
Use loss as a validation result
This evaluation method calculates the mean absolute error of the output with respect to the target.
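As a concrete illustration of an MAE validation result, here is a minimal sketch that averages the absolute error over a dataset (the function name is hypothetical; the library computes this over mini-batches internally):

```python
def mean_absolute_error(outputs, targets):
    # Average |output - target| over all samples in the dataset.
    assert len(outputs) == len(targets)
    return sum(abs(o - t) for o, t in zip(outputs, targets)) / len(outputs)
```

The Loss validation method is analogous, substituting the model's criterion for the absolute difference.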
Performance metrics for the training process.
Similar to a Torch optim method; it is used to update the parameters.
Optimizer is an abstract class which is used to train a model automatically with certain optimization algorithms.
An implementation of RMSprop
A trait for all regularizers.
A plain implementation of SGD
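A plain SGD step with the two most common extensions, classical momentum and L2 weight decay, can be sketched as follows (parameter names and the exact momentum formulation are assumptions; implementations differ in where the learning rate is applied):

```python
def sgd_step(w, grad, velocity, lr=0.01, momentum=0.0, weight_decay=0.0):
    # Fold L2 weight decay into the gradient, then apply classical momentum.
    g = grad + weight_decay * w
    velocity = momentum * velocity + g
    return w - lr * velocity, velocity
```

With `momentum = 0` and `weight_decay = 0` this reduces to the vanilla update `w - lr * grad`.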
Calculate the percentage of samples whose output's max probability index equals the target.
Calculate the percentage of samples whose target is among the output's top-5 probability indexes.
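Both accuracy metrics are instances of top-k accuracy: a sample counts as correct if its target index is among the k highest-scoring outputs. A self-contained sketch (the function name is illustrative):

```python
def topk_accuracy(outputs, targets, k=1):
    # Fraction of samples whose target index appears among the
    # k highest scores; k=1 gives Top1Accuracy, k=5 gives Top5Accuracy.
    hits = 0
    for scores, target in zip(outputs, targets):
        topk = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]
        hits += target in topk
    return hits / len(targets)
```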
This is a metric to measure the accuracy of Tree Neural Network/Recursive Neural Network
A trigger specifies one or several time points during training; a corresponding action is taken when such a point is reached.
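The trigger idea can be sketched as a predicate over the training state that a loop consults to decide when to stop, checkpoint, or validate. The class names below are hypothetical illustrations, not the library's API:

```python
class MaxIteration:
    # Hypothetical trigger: fires once the iteration counter reaches max_it.
    def __init__(self, max_it):
        self.max_it = max_it

    def __call__(self, state):
        return state["iteration"] >= self.max_it


class EveryN:
    # Hypothetical trigger: fires every n iterations.
    def __init__(self, n):
        self.n = n

    def __call__(self, state):
        return state["iteration"] % self.n == 0


# A training loop might use one trigger as its stop condition
# and another to schedule periodic validation.
stop = MaxIteration(100)
validate = EveryN(10)
```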
A method defined to evaluate the model.
A result that calculates the numeric value of a validation method.
Validator is an abstract class which is used to test a model automatically with certain validation methods, such as Top1Accuracy, passed as arguments to its test method.
(Since version 0.2.0) Validator(model, dataset) is deprecated. Please use model.evaluate instead.