Class

com.intel.analytics.bigdl.optim

Ftrl

class Ftrl[T] extends OptimMethod[T]

An implementation of FTRL (Follow-the-Regularized-Leader), described in https://www.eecs.tufts.edu/~dsculley/papers/ad-click-prediction.pdf. Supports L1 penalty, L2 penalty, and shrinkage-type L2 penalty.

Linear Supertypes
OptimMethod[T], Serializable, Serializable, AnyRef, Any

Instance Constructors

  1. new Ftrl(learningRate: Double = 1e-3, learningRatePower: Double = -0.5, initialAccumulatorValue: Double = 0.1, l1RegularizationStrength: Double = 0.0, l2RegularizationStrength: Double = 0.0, l2ShrinkageRegularizationStrength: Double = 0.0)(implicit arg0: ClassTag[T], ev: TensorNumeric[T])

    learningRate

    learning rate. Default is 1e-3.

    learningRatePower

    Double; must be less than or equal to zero. Default is -0.5.

    initialAccumulatorValue

    Double; the starting value for the accumulators. Must be zero or positive. Default is 0.1.

    l1RegularizationStrength

    Double; must be greater than or equal to zero. Default is zero.

    l2RegularizationStrength

    Double; must be greater than or equal to zero. Default is zero.

    l2ShrinkageRegularizationStrength

    Double; must be greater than or equal to zero. Default is zero. This differs from l2RegularizationStrength above: the L2 above is a stabilization penalty, whereas this one is a magnitude penalty.
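A minimal construction sketch, assuming BigDL is on the classpath (the `NumericFloat` import supplies the implicit `TensorNumeric[Float]`); all parameters are spelled out here for illustration, but in practice only the ones you want to change need to be passed:

```scala
import com.intel.analytics.bigdl.optim.Ftrl
import com.intel.analytics.bigdl.tensor.TensorNumericMath.TensorNumeric.NumericFloat

// Construct Ftrl with the documented defaults made explicit.
val optimMethod = new Ftrl[Float](
  learningRate = 1e-3,
  learningRatePower = -0.5,
  initialAccumulatorValue = 0.1,
  l1RegularizationStrength = 0.0,
  l2RegularizationStrength = 0.0,
  l2ShrinkageRegularizationStrength = 0.0
)
```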

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. var accumNew: Tensor[T]

  5. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  6. var buffer: Tensor[T]

  7. def checkParam(learningRate: Double, learningRatePower: Double, initialAccumulatorValue: Double, l1RegularizationStrength: Double, l2RegularizationStrength: Double, l2ShrinkageRegularizationStrength: Double): Unit

    Attributes
    protected
  8. def clearHistory(): Unit

    Clear the history information in the OptimMethod state.

    Definition Classes
    Ftrl → OptimMethod
  9. def clone(): OptimMethod[T]

    Clone this OptimMethod.

    Definition Classes
    OptimMethod → AnyRef
  10. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  11. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  12. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  13. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  14. def getHyperParameter(): String

    Get hyper parameter from config table.

    Definition Classes
    OptimMethod
  15. def getLearningRate(): Double

    Get the learning rate.

    Definition Classes
    Ftrl → OptimMethod
  16. var gradWithStrinkage: Tensor[T]

  17. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  18. var initialAccumulatorValue: Double

    Double; the starting value for the accumulators. Must be zero or positive. Default is 0.1.

  19. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  20. var l1RegularizationStrength: Double

    Double; must be greater than or equal to zero. Default is zero.

  21. var l2RegularizationStrength: Double

    Double; must be greater than or equal to zero. Default is zero.

  22. var l2ShrinkageRegularizationStrength: Double

    Double; must be greater than or equal to zero. Default is zero. This differs from l2RegularizationStrength above: the L2 above is a stabilization penalty, whereas this one is a magnitude penalty.

  23. var learningRate: Double

    learning rate. Default is 1e-3.

  24. var learningRatePower: Double

    Double; must be less than or equal to zero. Default is -0.5.

  25. def loadFromTable(config: Table): Ftrl.this.type

    Load optimMethod parameters from a Table.

    Definition Classes
    Ftrl → OptimMethod
  26. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  27. final def notify(): Unit

    Definition Classes
    AnyRef
  28. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  29. def optimize(feval: (Tensor[T]) ⇒ (T, Tensor[T]), parameter: Tensor[T]): (Tensor[T], Array[T])

    Optimize the model parameter.

    feval

    a function that takes a single input (X), the point of evaluation, and returns f(X) and df/dX

    parameter

    the initial point

    returns

    the new x vector and the function list, evaluated before the update

    Definition Classes
    Ftrl → OptimMethod
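As a sketch of how optimize is typically called (assuming BigDL on the classpath; the quadratic objective is purely illustrative, not part of the API):

```scala
import com.intel.analytics.bigdl.optim.Ftrl
import com.intel.analytics.bigdl.tensor.Tensor
import com.intel.analytics.bigdl.tensor.TensorNumericMath.TensorNumeric.NumericFloat

val optimMethod = new Ftrl[Float](learningRate = 0.1)

// feval returns (loss, gradient) at the current point x.
// Here f(x) = sum(x^2), so df/dx = 2 * x.
def feval(x: Tensor[Float]): (Float, Tensor[Float]) = {
  val loss = x.clone().pow(2f).sum()
  val grad = x.clone().mul(2f)
  (loss, grad)
}

val parameter = Tensor[Float](4).fill(1.0f)
// Each call updates `parameter` in place and returns it
// together with the loss evaluated before the update.
val (updated, losses) = optimMethod.optimize(feval, parameter)
```

In a training job you would normally not call optimize directly; instead the method is passed to an Optimizer, which invokes it once per iteration with the model's flattened parameters and gradients.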
  30. var quadratic: Tensor[T]
  31. def save(path: String, overWrite: Boolean = false): Ftrl.this.type

    Save the OptimMethod.

    path

    the path to save to

    overWrite

    whether to overwrite an existing file

    Definition Classes
    OptimMethod
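A short persistence sketch (the path is hypothetical, and `OptimMethod.load` is the companion-object loader assumed here to restore the saved state):

```scala
import com.intel.analytics.bigdl.optim.{Ftrl, OptimMethod}
import com.intel.analytics.bigdl.tensor.TensorNumericMath.TensorNumeric.NumericFloat

val optimMethod = new Ftrl[Float](learningRate = 1e-3)

// Persist the optimizer state; overWrite = true replaces any existing file.
optimMethod.save("/tmp/ftrl.obj", overWrite = true)

// Restore it later (assumed companion-object loader).
val restored = OptimMethod.load[Float]("/tmp/ftrl.obj")
```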
  32. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  33. def toString(): String

    Definition Classes
    AnyRef → Any
  34. def updateHyperParameter(): Unit

    Update hyper parameter. Hyper parameters are updated in optimize(), but in DistriOptimizer, optimize() is only called on the executor side, so the driver's hyper parameters are left unchanged. This method is used to update the hyper parameters on the driver side.

    Definition Classes
    OptimMethod
  35. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  36. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  37. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def clearHistory(state: Table): Table

    Clear the history information in the state.

    Definition Classes
    OptimMethod
    Annotations
    @deprecated
    Deprecated

    (Since version 0.2.0) Please use clearHistory() instead

  2. def getHyperParameter(config: Table): String

    Get hyper parameter from config table.

    config

    a table that contains the hyper parameter.

    Definition Classes
    OptimMethod
    Annotations
    @deprecated
    Deprecated

    (Since version 0.2.0) Please use getHyperParameter() instead

  3. def optimize(feval: (Tensor[T]) ⇒ (T, Tensor[T]), parameter: Tensor[T], config: Table, state: Table = null): (Tensor[T], Array[T])

    Optimize the model parameter.

    feval

    a function that takes a single input (X), the point of evaluation, and returns f(X) and df/dX

    parameter

    the initial point

    config

    a table with configuration parameters for the optimizer

    state

    a table describing the state of the optimizer; after each call the state is modified

    returns

    the new x vector and the function list, evaluated before the update

    Definition Classes
    OptimMethod
    Annotations
    @deprecated
    Deprecated

    (Since version 0.2.0) Please initialize OptimMethod with parameters when creating it instead of importing a table

  4. def updateHyperParameter(config: Table, state: Table): Unit

    Update hyper parameter. Hyper parameters are updated in optimize(), but in DistriOptimizer, optimize() is only called on the executor side, so the driver's hyper parameters are left unchanged. This method is used to update the hyper parameters on the driver side.

    config

    the config table.

    state

    the state Table.

    Definition Classes
    OptimMethod
    Annotations
    @deprecated
    Deprecated

    (Since version 0.2.0) Please use updateHyperParameter() instead
