Clear the history information in the OptimMethod state
Get the learning rate
Load OptimMethod parameters from a Table
Optimize the model parameter
a function that takes a single input (X), the point of evaluation, and returns f(X) and df/dX
the initial point
the new x vector and the function list, evaluated before the update
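The evaluation-function contract described above can be sketched as follows. This is a minimal illustration only, not the library's actual API; the names `feval` and `sgd_step` are hypothetical, and plain gradient descent stands in for whatever update rule a concrete OptimMethod implements.

```python
import numpy as np

def feval(x):
    """Evaluate f(X) = sum(X**2) and its gradient df/dX = 2*X at the point x."""
    return np.sum(x ** 2), 2.0 * x

def sgd_step(feval, x, lr=0.1):
    """One optimization step: evaluate f and df/dX at x, then update x.

    Returns the new x vector and the function value list, where f was
    evaluated BEFORE the update, matching the contract above."""
    fx, dfdx = feval(x)
    x_new = x - lr * dfdx
    return x_new, [fx]

x = np.array([1.0, -2.0])          # the initial point
x_new, fs = sgd_step(feval, x)     # fs[0] is f at the old x
```

Note that `fs` holds the objective at the pre-update point, so a caller can log the loss for the step that produced `x_new`.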
Clone the OptimMethod
Get hyper parameter from config table.
Save the OptimMethod
the save path
whether to overwrite
Update the hyper parameter. The hyper parameter is already updated in the method optimize(), but in DistriOptimizer, optimize() is only called on the executor side, so the driver's hyper parameter is left unchanged. This method is used to update the hyper parameter on the driver side.
A string.
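One way to picture this driver/executor split is below. This is a hedged sketch, not BigDL code; the class `PolySchedule` and its fields are hypothetical. The key idea is that the hyper parameter is derived from state both sides hold, so the driver can recompute it without ever running optimize().

```python
class PolySchedule:
    """Hypothetical optim method whose learning rate decays with the step count."""
    def __init__(self, base_lr=0.1, decay=0.01):
        self.base_lr = base_lr
        self.decay = decay
        self.state = {"evalCounter": 0}
        self.current_lr = base_lr

    def update_hyper_parameter(self):
        # Recompute the learning rate from the shared state. On the driver,
        # this is called on its own to keep the hyper parameter in sync.
        n = self.state["evalCounter"]
        self.current_lr = self.base_lr / (1.0 + self.decay * n)

    def optimize_step(self):
        # Executor side: advance the counter and refresh the hyper parameter.
        self.state["evalCounter"] += 1
        self.update_hyper_parameter()

executor = PolySchedule()
for _ in range(100):
    executor.optimize_step()

# Driver side: given the same state, calling update_hyper_parameter()
# alone reproduces the executor's learning rate.
driver = PolySchedule()
driver.state["evalCounter"] = executor.state["evalCounter"]
driver.update_hyper_parameter()
```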
Clear the history information in the state
(Since version 0.2.0) Please use clearHistory() instead
Get hyper parameter from config table.
a table containing the hyper parameters
(Since version 0.2.0) Please use getHyperParameter() instead
Optimize the model parameter
a function that takes a single input (X), the point of evaluation, and returns f(X) and df/dX
the initial point
a table with configuration parameters for the optimizer
a table describing the state of the optimizer; after each call the state is modified
the new x vector and the function list, evaluated before the update
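The deprecated table-driven form can be illustrated like this. It is a sketch only, mimicking the Torch-style calling convention the description implies rather than the real API; the dict keys and the helper name `sgd` are hypothetical. Hyper parameters come from the config table, while mutable history lives in the state table and is modified on each call.

```python
import numpy as np

def feval(x):
    # f(X) = sum(X**2), gradient df/dX = 2*X
    return np.sum(x ** 2), 2.0 * x

def sgd(feval, x, config, state):
    """Torch-style SGD: hyper parameters are read from `config`;
    history (here, the momentum velocity) persists in `state`."""
    lr = config.get("learningRate", 0.01)
    momentum = config.get("momentum", 0.0)
    fx, dfdx = feval(x)
    v = state.get("v", np.zeros_like(x))
    v = momentum * v - lr * dfdx
    state["v"] = v                    # the state is modified after each call
    return x + v, [fx]                # f evaluated before the update

config = {"learningRate": 0.1, "momentum": 0.9}
state = {}
x = np.array([1.0, -2.0])             # the initial point
x, fs = sgd(feval, x, config, state)
```

Separating immutable config from mutable state is what lets the same config table drive many independent optimization runs, each with its own state table.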
(Since version 0.2.0) Please initialize the OptimMethod with parameters when creating it instead of importing a table
Update the hyper parameter. The hyper parameter is already updated in the method optimize(), but in DistriOptimizer, optimize() is only called on the executor side, so the driver's hyper parameter is left unchanged. This method is used to update the hyper parameter on the driver side.
the config table
the state table
A string.
(Since version 0.2.0) Please use updateHyperParameter() instead
Similar to the Torch optim package; it is used to update the model parameters