subGradientMethod(M, F, ∂F, x)

perform a subgradient method $x_{k+1} = \mathrm{retr}(x_k, -s_k∂F(x_k))$,

where $\mathrm{retr}$ is a retraction and $s_k$ is a stepsize, which can be specified as a function but is usually set to a constant value. While the subgradient $\partial F$ might be set-valued, the argument ∂F should always return a single element from that set.
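The iteration above can be sketched as follows. This is a minimal illustrative Python sketch, not the library's implementation; the function names and the Euclidean example (where the retraction is simply addition) are assumptions for demonstration.

```python
import numpy as np

def subgradient_method(retr, dF, x, stepsize=0.05, maxiter=200):
    """Sketch of the iteration x_{k+1} = retr(x_k, -s_k * dF(x_k))
    with a constant stepsize s_k. All names here are illustrative."""
    for k in range(maxiter):
        g = dF(x)                    # one element of the subgradient
        x = retr(x, -stepsize * g)   # step along the negative subgradient
    return x

# Example: minimize F(x) = ||x||_1 on R^2, viewed as a manifold with
# the trivial retraction retr(x, v) = x + v.
retr = lambda x, v: x + v
dF = lambda x: np.sign(x)            # a valid element of the 1-norm subgradient
x_opt = subgradient_method(retr, dF, np.array([2.0, -1.5]))
```

With a constant stepsize the iterates do not converge exactly but oscillate in a band of width proportional to the stepsize around the minimizer, which is why diminishing stepsize rules are common for subgradient methods.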

Input

• M – a manifold $\mathcal M$
• F – a cost function $F\colon\mathcal M\to\mathbb R$ to minimize
• ∂F – the (sub)gradient $\partial F\colon\mathcal M\to T\mathcal M$ of F, restricted to returning a single element of the subgradient
• x – an initial value $x\in\mathcal M$
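To make the roles of these inputs concrete, here is a hedged Python sketch of a nonsmooth cost on the unit sphere together with a ∂F that returns a single subgradient element in the tangent space. The helper names `retr` and `proj` are assumptions for illustration, not library code.

```python
import numpy as np

# Hypothetical helpers for the unit sphere (assumed names):
# retraction by normalization, and projection onto the tangent space.
def retr(x, v):
    y = x + v
    return y / np.linalg.norm(y)

def proj(x, v):
    return v - np.dot(x, v) * x  # remove the component normal to the sphere

# Nonsmooth cost F(x) = |x_1| on the sphere; at x_1 == 0 the subgradient
# is a whole set, and dF returns just one element of it, as required.
def F(x):
    return abs(x[0])

def dF(x):
    e1 = np.zeros_like(x)
    e1[0] = np.sign(x[0]) if x[0] != 0.0 else 1.0
    return proj(x, e1)  # one subgradient element, lying in the tangent space

# One run of the iteration with a constant stepsize:
x = np.array([0.5, 0.5, np.sqrt(0.5)])
for k in range(300):
    x = retr(x, -0.01 * dF(x))
```

Note that `dF` must return a tangent vector, hence the projection of the Euclidean subgradient element onto the tangent space at x.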

Optional

Further keyword arguments are passed on to decorateOptions for decorators.

Output

• xOpt – the resulting (approximately critical) point of subGradientMethod
• record – if activated (using the record key, see RecordOptions), an array containing the recorded values.
source

# Options

SubGradientMethodOptions <: Options

stores option values for a subGradientMethod solver

Fields

• retraction – the retraction to use within the algorithm
• stepsize – a Stepsize
• stop – a StoppingCriterion
• x – (initial or current) value the algorithm is at
• optimalX – the best point visited so far; since the subgradient method is not a descent method, the iterate with the smallest cost is stored separately from the current iterate x
source
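The bookkeeping behind optimalX can be sketched as follows. This is an illustrative Python snippet (the function name `track_best` is an assumption), showing why the best-so-far point is tracked separately: the cost may increase between subgradient iterates.

```python
# Keep the point with the smallest cost among all iterates, since the
# current iterate of a subgradient method need not be the best one seen.
def track_best(F, iterates):
    best = iterates[0]
    for x in iterates:
        if F(x) < F(best):
            best = x
    return best
```

In the solver, this update happens once per iteration rather than over a stored list, but the effect is the same.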

For DebugActions and RecordActions to record the (sub)gradient, its norm, and the stepsizes, see the steepest descent actions.