Maximum number of iterations allowed
Maximum number of function evaluations
Termination tolerance on the first-order optimality
Termination tolerance on progress in terms of function/parameter changes
A line search function
If no line search provided, then a fixed step size is used
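The way these termination settings typically interact can be sketched with a generic fixed-step loop. This is a hypothetical illustration, not the library's actual implementation; all names below (minimize, feval, the parameter spellings) are assumptions.

```python
# Hypothetical sketch of how maxIter, maxEval, tolFun and tolX typically
# interact in an optimization loop; not the library's actual code.

def minimize(feval, x, max_iter=20, max_eval=25, tol_fun=1e-5, tol_x=1e-9, lr=0.1):
    fx, g = feval(x)
    n_eval = 1
    for _ in range(max_iter):                 # maxIter: cap on iterations
        if abs(g) < tol_fun:                  # tolFun: first-order optimality
            break
        x_new = x - lr * g                    # fixed-size step (no line search)
        if abs(x_new - x) < tol_x:            # tolX: progress in the parameters
            break
        x = x_new
        if n_eval >= max_eval:                # maxEval: evaluation budget
            break
        fx, g = feval(x)
        n_eval += 1
    return x, fx

# Minimize f(x) = (x - 2)^2 from x = 0:
x, fx = minimize(lambda x: ((x - 2) ** 2, 2 * (x - 2)), x=0.0)
```

Whichever criterion fires first ends the run, so loose tolerances with a tight evaluation budget behave like a pure iteration cap.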
Clear the history information in the OptimMethod state
Clone the OptimMethod
Get hyper parameter from config table.
Get the learning rate
A line search function
If no line search provided, then a fixed step size is used
Load OptimMethod parameters from a Table
Maximum number of function evaluations
Maximum number of iterations allowed
Optimize the model parameter
a function that takes a single input (X), the point of evaluation, and returns f(X) and df/dX
the initial point
the new x vector and the function value list, evaluated before the update:
x : the new x vector, at the optimal point
f : a table of all function values: f[1] is the value of the function before any optimization and f[#f] is the final fully optimized value, at x*
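As an illustrative sketch of the contract above (not the library's implementation; the names feval and fixed_step_optimize are hypothetical), a fixed-step gradient descent can return the updated x together with the function values recorded before each update:

```python
# Illustrative sketch of the optimize(feval, x) contract described above.
# All names here (feval, fixed_step_optimize) are hypothetical.

def fixed_step_optimize(feval, x, lr=0.1, max_iter=50):
    """feval(x) -> (f(x), df/dx); returns (new_x, [f values])."""
    fs = []
    for _ in range(max_iter):
        fx, dfdx = feval(x)
        fs.append(fx)          # value recorded before this update
        x = x - lr * dfdx      # fixed-size step
    return x, fs

# Minimize f(x) = (x - 3)^2 from x = 0; fs[0] is the value before any
# optimization, matching the f[1] convention above.
x_star, fs = fixed_step_optimize(lambda x: ((x - 3) ** 2, 2 * (x - 3)), x=0.0)
```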
Save the OptimMethod
Termination tolerance on the first-order optimality
Termination tolerance on progress in terms of function/parameter changes
Update hyper parameters.
Hyper parameters are updated in the optimize() method, but in DistriOptimizer, optimize() is only called on the executor side, leaving the driver's hyper parameters unchanged. This method is used to update the hyper parameters on the driver side.
A string.
Clear the history information in the state
(Since version 0.2.0) Please use clearHistory() instead
Get hyper parameter from config table.
a table containing the hyper parameters.
(Since version 0.2.0) Please use getHyperParameter() instead
Optimize the model parameter
a function that takes a single input (X), the point of evaluation, and returns f(X) and df/dX
the initial point
a table with configuration parameters for the optimizer
a table describing the state of the optimizer; after each call the state is modified
the new x vector and the function value list, evaluated before the update
(Since version 0.2.0) Please initialize OptimMethod with parameters when creating it, instead of importing a table
Update hyper parameters.
Hyper parameters are updated in the optimize() method, but in DistriOptimizer, optimize() is only called on the executor side, leaving the driver's hyper parameters unchanged. This method is used to update the hyper parameters on the driver side.
config table.
state Table.
A string.
(Since version 0.2.0) Please use updateHyperParameter() instead
This implementation of L-BFGS relies on a user-provided line search function (state.lineSearch). If no such function is provided, a simple learningRate is used to produce fixed-size steps. Fixed-size steps are much less costly than line searches, and can be useful for stochastic problems.
The learning rate is used even when a line search is provided. This is also useful for large-scale stochastic problems, where opfunc is a noisy approximation of f(x). In that case, the learning rate allows a reduction of confidence in the step size.
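The interaction between the two mechanisms can be sketched as follows. This is a hypothetical illustration, not the library's actual code; step_size and double_step are invented names: the step actually taken is learningRate times the line search's proposal when one is provided, and a fixed learningRate-sized step otherwise.

```python
# Hypothetical sketch (not the library's actual code): the step taken is
# learning_rate * t, where t comes from the line search when one is
# provided, or t = 1 (a fixed-size step) when it is not.

def step_size(learning_rate, line_search=None, feval=None, x=None, direction=None):
    if line_search is None:
        return learning_rate               # fixed-size step
    t = line_search(feval, x, direction)   # step length proposed by the line search
    return learning_rate * t               # learning rate still damps the step

# A toy "line search" that always proposes a step length of 2.0:
double_step = lambda feval, x, d: 2.0

print(step_size(0.5))               # no line search: 0.5
print(step_size(0.5, double_step))  # with line search: 0.5 * 2.0 = 1.0
```

Keeping the learning rate as an outer scaling factor is what lets a caller shrink steps when the line search itself is fooled by a noisy opfunc.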