- Minimize $F(x) = f(x) + h(x)$, $f(x) = \frac{1}{n}\sum_{i=1}^{n} f_i(x)$, where each $f_i$ is a convex, continuously differentiable (smooth) function,
- $h$ is a convex, non-differentiable (non-smooth) function.
- Choose a set of indices $S_k \subseteq \{1, \dots, n\}$ without replacement, $|S_k| = m$, where $m$ is the batch size.
- Compute the gradient $g_k = \frac{1}{m}\sum_{i \in S_k} \nabla f_i(x_k)$, where $x_k$ is the current iterate.
- Convergence check: stop if $\|v_k\|_d \le \varepsilon$, where $v_k$ is an algorithm-specific vector (argument or gradient) and $d$ is an algorithm-specific power of the Lebesgue space (the $\ell_d$ norm).
- Compute $x_{k+1}$ using the algorithm-specific transformation $T$ that updates the function's argument: $x_{k+1} = T(x_k, g_k, \theta_k)$.
- Update $\theta_{k+1} = U(\theta_k, g_k)$, where $U$ is an algorithm-specific update of the set of intrinsic parameters $\theta_k$.
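The loop above can be sketched in code. This is a minimal illustration, not any particular algorithm from the family: it assumes plain SGD as the argument transformation $T$, no intrinsic-parameter update $U$, the mini-batch gradient itself as the convergence vector $v_k$, and a small least-squares problem as the objective; all names (`stochastic_gradient_loop`, `grad_i`, etc.) are illustrative.

```python
import random


def stochastic_gradient_loop(grad_i, n, x0, step=0.1, batch=2,
                             eps=1e-8, d=2, max_iter=10_000, seed=0):
    """Generic stochastic loop: sample a batch S_k without replacement,
    average the per-example gradients, stop when the ell_d norm of the
    mini-batch gradient is <= eps, otherwise take a plain SGD step
    (a stand-in for the algorithm-specific transformation T)."""
    rng = random.Random(seed)
    x = list(x0)
    for _ in range(max_iter):
        S = rng.sample(range(n), batch)                 # indices, no replacement
        g = [0.0] * len(x)
        for i in S:                                     # g_k = (1/m) sum grad f_i
            for j, gij in enumerate(grad_i(i, x)):
                g[j] += gij / batch
        if sum(abs(c) ** d for c in g) ** (1.0 / d) <= eps:  # ||g_k||_d check
            break
        x = [xj - step * gj for xj, gj in zip(x, g)]    # T: x_{k+1} = x_k - step*g_k
    return x


# Toy consistent least-squares problem: f_i(x) = 0.5 * (a_i . x - b_i)^2,
# whose exact minimizer is x* = (1, 2).
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [1.0, -1.0]]
b = [1.0, 2.0, 3.0, -1.0]


def grad_i(i, x):
    """Gradient of f_i: (a_i . x - b_i) * a_i."""
    r = sum(aij * xj for aij, xj in zip(A[i], x)) - b[i]
    return [r * aij for aij in A[i]]


x_hat = stochastic_gradient_loop(grad_i, n=4, x0=[0.0, 0.0])
```

Because the toy system is consistent, all per-example gradients vanish at the minimizer, so the mini-batch gradient norm genuinely goes to zero and the stopping rule fires without step-size decay.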