Class/Object

com.intel.analytics.bigdl.optim

LarsSGD

Related Docs: object LarsSGD | package optim


class LarsSGD[T] extends SGD[T]

An implementation of LARS (Layer-wise Adaptive Rate Scaling, https://arxiv.org/abs/1708.03888). Lars.createOptimForModule is the recommended way to create LARS optim methods for multiple layers.

Linear Supertypes
SGD[T], OptimMethod[T], Serializable, Serializable, AnyRef, Any

Instance Constructors

  1. new LarsSGD(lrScheduleOwner: Boolean, trust: Double = 1.0, _learningRate: Double = 1e-3, _learningRateDecay: Double = 0.01, _weightDecay: Double = 0.0005, _momentum: Double = 0.5, _learningRateSchedule: LearningRateSchedule = Default())(implicit arg0: ClassTag[T], ev: TensorNumeric[T])

    lrScheduleOwner

    whether this optim method owns the learning rate scheduler (a scheduler may be shared by multiple LARS optim methods)

    trust

    the trust on the learning rate scaling; should be between 0 and 1

    _learningRate

    learning rate

    _learningRateDecay

    learning rate decay

    _weightDecay

    weight decay

    _momentum

    momentum

    _learningRateSchedule

    the learning rate scheduler
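
A minimal construction sketch using the documented defaults. It assumes the Float TensorNumeric implicit is in scope (the import path below follows common BigDL usage and may need adjusting for your version):

```scala
import com.intel.analytics.bigdl.optim.LarsSGD
import com.intel.analytics.bigdl.tensor.TensorNumericMath.TensorNumeric.NumericFloat

// Construct a LARS optim method with the documented default
// hyperparameters. lrScheduleOwner = true makes this instance own its
// learning rate scheduler instead of sharing one with other layers.
val lars = new LarsSGD[Float](
  lrScheduleOwner = true,
  trust = 1.0,
  _learningRate = 1e-3,
  _learningRateDecay = 0.01,
  _weightDecay = 0.0005,
  _momentum = 0.5
)
```

For per-layer LARS across a whole model, the companion object's Lars.createOptimForModule mentioned above is the recommended entry point.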

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clearHistory(): Unit

    Clear the history information in the OptimMethod state

    Definition Classes
    SGD → OptimMethod
  6. def clone(): OptimMethod[T]

    Clone this OptimMethod

    Definition Classes
    OptimMethod → AnyRef
  7. var dampening: Double

    dampening for momentum

    Definition Classes
    SGD
  8. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  12. def getHyperParameter(config: Table): String

    Return a string of the current hyperparameters.

    config

    a table containing the hyperparameters

    Definition Classes
    LarsSGD → SGD → OptimMethod
  13. def getHyperParameter(): String

    Return a string of the current hyperparameters.

    Definition Classes
    LarsSGD → SGD → OptimMethod
  14. def getLearningRate(): Double

    Get the current learning rate

    Definition Classes
    LarsSGD → SGD → OptimMethod
  15. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  16. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  17. var learningRate: Double

    learning rate

    Definition Classes
    SGD
  18. var learningRateDecay: Double

    learning rate decay

    Definition Classes
    SGD
  19. var learningRateSchedule: LearningRateSchedule

    Definition Classes
    SGD
  20. var learningRates: Tensor[T]

    1D tensor of individual learning rates

    Definition Classes
    SGD
  21. def loadFromTable(config: Table): LarsSGD.this.type

    Load OptimMethod parameters from a Table

    Definition Classes
    SGD → OptimMethod
  22. var momentum: Double

    momentum

    Definition Classes
    SGD
  23. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  24. var nesterov: Boolean

    enables Nesterov momentum

    Definition Classes
    SGD
  25. final def notify(): Unit

    Definition Classes
    AnyRef
  26. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  27. def optimize(feval: (Tensor[T]) ⇒ (T, Tensor[T]), parameter: Tensor[T]): (Tensor[T], Array[T])

    feval

    a function that takes a single input (X), the point of an evaluation, and returns f(X) and df/dX

    parameter

    the initial point

    returns

    the new x vector and the function list {fx}, evaluated before the update

    Definition Classes
    LarsSGD → SGD → OptimMethod
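
As a sketch, optimize can be driven with a hand-written feval closure. The quadratic objective below (f(x) = x · x, gradient 2x) is a hypothetical stand-in for a real model's loss/gradient closure; the import paths follow common BigDL usage:

```scala
import com.intel.analytics.bigdl.optim.LarsSGD
import com.intel.analytics.bigdl.tensor.Tensor
import com.intel.analytics.bigdl.tensor.TensorNumericMath.TensorNumeric.NumericFloat

// Hypothetical evaluation function: returns (f(x), df/dx) at the point x.
val feval = (x: Tensor[Float]) => (x.dot(x), x.clone().mul(2f))

val lars = new LarsSGD[Float](lrScheduleOwner = true)
val x = Tensor[Float](2).fill(1.0f)

// xNew is the updated parameter; fx holds the loss evaluated before the update.
val (xNew, fx) = lars.optimize(feval, x)
```

In training, the Optimizer normally calls optimize internally with the model's own loss and gradients; calling it directly like this is mainly useful for testing an optim method in isolation.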
  28. def save(path: String, overWrite: Boolean = false): LarsSGD.this.type

    Save the OptimMethod

    path

    the path to save to

    overWrite

    whether to overwrite an existing file

    Definition Classes
    OptimMethod
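
A brief usage sketch; the path below is purely illustrative:

```scala
import com.intel.analytics.bigdl.optim.LarsSGD
import com.intel.analytics.bigdl.tensor.TensorNumericMath.TensorNumeric.NumericFloat

val lars = new LarsSGD[Float](lrScheduleOwner = true)

// Persist the optim method (including its state) so training can resume
// later; overWrite = true replaces any existing file at that path.
lars.save("/tmp/lars.optim", overWrite = true)
```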
  29. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  30. def toString(): String

    Definition Classes
    AnyRef → Any
  31. def updateHyperParameter(config: Table, state: Table): Unit

    Update the hyperparameters.

    The hyperparameters have already been updated in optimize(). But in DistriOptimizer, optimize() is only called on the executor side, so the driver's hyperparameters are unchanged. This method updates the hyperparameters on the driver side.

    config

    config table

    state

    state table

    Definition Classes
    LarsSGD → SGD → OptimMethod
  32. def updateHyperParameter(): Unit

    Update the hyperparameters.

    The hyperparameters have already been updated in optimize(). But in DistriOptimizer, optimize() is only called on the executor side, so the driver's hyperparameters are unchanged. This method updates the hyperparameters on the driver side.

    Definition Classes
    SGD → OptimMethod
  33. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  34. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  35. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  36. var weightDecay: Double

    weight decay

    Definition Classes
    SGD
  37. var weightDecays: Tensor[T]

    1D tensor of individual weight decays

    Definition Classes
    SGD

Deprecated Value Members

  1. def clearHistory(state: Table): Table

    Clear the history information in the state

    Definition Classes
    OptimMethod
    Annotations
    @deprecated
    Deprecated

    (Since version 0.2.0) Please use clearHistory() instead

  2. def optimize(feval: (Tensor[T]) ⇒ (T, Tensor[T]), parameter: Tensor[T], config: Table, state: Table = null): (Tensor[T], Array[T])

    Optimize the model parameter

    feval

    a function that takes a single input (X), the point of an evaluation, and returns f(X) and df/dX

    parameter

    the initial point

    config

    a table with configuration parameters for the optimizer

    state

    a table describing the state of the optimizer; after each call the state is modified

    returns

    the new x vector and the function list, evaluated before the update

    Definition Classes
    OptimMethod
    Annotations
    @deprecated
    Deprecated

    (Since version 0.2.0) Please initialize OptimMethod with parameters when creating it instead of importing table
