RMSProp

public class RMSProp<Model: Differentiable>: Optimizer
where
  Model.TangentVector: VectorProtocol & PointwiseMultiplicative
    & ElementaryFunctions & KeyPathIterable,
  Model.TangentVector.VectorSpaceScalar == Float

An RMSProp optimizer.

Implements the RMSProp optimization algorithm. RMSProp is a form of stochastic gradient descent where the gradients are divided by a running average of their recent magnitude. RMSProp keeps a moving average of the squared gradient for each weight.
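
Concretely, the optimizer keeps an accumulator alpha per weight and divides each gradient by the square root of that accumulator. The scalar sketch below is illustrative only, not the library source: the names mirror the properties documented below, and the real implementation operates on Model.TangentVector and also decays the learning rate each step (learningRate / (1 + decay * step)).

// Illustrative scalar sketch of one RMSProp step (not the library code).
// alpha is the running average of squared gradients; rho is its decay
// factor; epsilon guards against division by zero.
func rmspropStep(
  weight: inout Float,
  gradient g: Float,
  alpha: inout Float,
  learningRate: Float = 1e-3,
  rho: Float = 0.9,
  epsilon: Float = 1e-8
) {
  alpha = rho * alpha + (1 - rho) * g * g
  weight -= learningRate * g / (alpha.squareRoot() + epsilon)
}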

Reference: "rmsprop: Divide the gradient by a running average of its recent magnitude" (Tieleman & Hinton, 2012)

  • Model

    Declaration

    public typealias Model = Model
  • learningRate

    The learning rate.

    Declaration

    public var learningRate: Float
  • rho

    The gradient moving average decay factor.

    Declaration

    public var rho: Float
  • epsilon

    A small scalar added to the denominator to improve numerical stability.

    Declaration

    public var epsilon: Float
  • decay

    The learning rate decay.

    Declaration

    public var decay: Float
  • step

    The step count.

    Declaration

    public var step: Float
  • alpha

    The running averages of squared gradients (alpha) for all model differentiable variables.

    Declaration

    public var alpha: Model.TangentVector
  • init(for:learningRate:rho:epsilon:decay:)

    Creates an instance for model; a usage sketch follows this list.

    Declaration

    public init(
      for model: __shared Model,
      learningRate: Float = 1e-3,
      rho: Float = 0.9,
      epsilon: Float = 1e-8,
      decay: Float = 0
    )

    Parameters

    learningRate

    The learning rate. The default value is 1e-3.

    rho

    The gradient moving average decay factor. The default value is 0.9.

    epsilon

    A small scalar added to the denominator to improve numerical stability. The default value is 1e-8.

    decay

    The learning rate decay. The default value is 0.

  • update(_:along:)

    Updates the given model along the given direction.

    Declaration

    public func update(_ model: inout Model, along direction: Model.TangentVector)
  • init(copying:to:)

    Creates a copy of another optimizer, with its state placed on the specified device.

    Declaration

    public required init(copying other: RMSProp, to device: Device)
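
The end-to-end sketch below shows the optimizer in a training loop. It is a minimal illustration, not library code: the MLP model and the synthetic data are hypothetical, and it assumes the Swift for TensorFlow TensorFlow module, where Layer, Dense, meanSquaredError(predicted:expected:), and gradient(at:) are defined.

import TensorFlow

// Hypothetical two-layer model, defined only for this sketch.
struct MLP: Layer {
  var dense1 = Dense<Float>(inputSize: 4, outputSize: 8, activation: relu)
  var dense2 = Dense<Float>(inputSize: 8, outputSize: 1)

  @differentiable
  func callAsFunction(_ input: Tensor<Float>) -> Tensor<Float> {
    input.sequenced(through: dense1, dense2)
  }
}

var model = MLP()
let optimizer = RMSProp(for: model, learningRate: 1e-3, rho: 0.9)

// Synthetic data, purely for illustration.
let x = Tensor<Float>(randomNormal: [32, 4])
let y = Tensor<Float>(randomNormal: [32, 1])

for _ in 0..<100 {
  // Differentiate the loss with respect to the model parameters.
  let grads = gradient(at: model) { model -> Tensor<Float> in
    meanSquaredError(predicted: model(x), expected: y)
  }
  // One RMSProp step: each gradient is divided by the running average
  // of its recent magnitude before being applied.
  optimizer.update(&model, along: grads)
}

To place the optimizer state on a particular device, init(copying:to:) can be used, for example RMSProp(copying: optimizer, to: device) for some Device value matching the model's device.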