Optimizers
This chapter describes optimizers implemented in oneDAL.
The Newton-CG optimizer iteratively minimizes a convex function using its gradient and a Hessian-product operator.
Mathematical Formulation
Computing
The Newton-CG optimizer, also known as the Hessian-free optimizer, minimizes convex functions without calculating the Hessian matrix. Instead, it uses a Hessian-product operator. In the Newton method, the descent direction is calculated using the formula $d_k = -H_k^{-1} g_k$, where $g_k$ and $H_k$ are the gradient and the Hessian matrix of the loss function at the $k$-th iteration. The Newton-CG method uses the Conjugate Gradients solver to find an approximate solution to the equation $H_k d_k = -g_k$. This solver can find solutions to a system of linear equations $Ax = b$ taking the vector $b$ and a functor that computes $Au$ as input.
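The sketch below illustrates the idea in plain C++. It is not the oneDAL API: the function name newton_cg_direction, the tolerance and iteration limits, and the toy quadratic loss are illustrative assumptions. It shows how the descent direction can be obtained from a Hessian-vector product functor alone, without ever forming the Hessian matrix.

```cpp
#include <cmath>
#include <cstddef>
#include <functional>
#include <iostream>
#include <vector>

// Hypothetical illustration (not the oneDAL API): solve H*d = -g with the
// Conjugate Gradients method, using only a Hessian-vector product functor.
std::vector<double> newton_cg_direction(
        const std::function<std::vector<double>(const std::vector<double>&)>& hess_prod,
        const std::vector<double>& grad,
        double tol = 1e-8,
        std::size_t max_iter = 100) {
    const std::size_t n = grad.size();
    std::vector<double> d(n, 0.0);           // current solution (descent direction)
    std::vector<double> r(n), p(n);
    for (std::size_t i = 0; i < n; ++i) {    // residual r = -g - H*d = -g since d = 0
        r[i] = -grad[i];
        p[i] = r[i];
    }
    auto dot = [](const std::vector<double>& a, const std::vector<double>& b) {
        double s = 0.0;
        for (std::size_t i = 0; i < a.size(); ++i) s += a[i] * b[i];
        return s;
    };
    double rr = dot(r, r);
    for (std::size_t it = 0; it < max_iter && std::sqrt(rr) > tol; ++it) {
        const std::vector<double> hp = hess_prod(p); // only H*p is needed, never H itself
        const double alpha = rr / dot(p, hp);
        for (std::size_t i = 0; i < n; ++i) {
            d[i] += alpha * p[i];
            r[i] -= alpha * hp[i];
        }
        const double rr_new = dot(r, r);
        const double beta = rr_new / rr;
        for (std::size_t i = 0; i < n; ++i) p[i] = r[i] + beta * p[i];
        rr = rr_new;
    }
    return d;
}

int main() {
    // Toy quadratic loss f(x) = 0.5 * x^T H x with H = [[4, 1], [1, 3]];
    // its Hessian-vector product is simply H*v.
    auto hess_prod = [](const std::vector<double>& v) {
        return std::vector<double>{ 4.0 * v[0] + 1.0 * v[1], 1.0 * v[0] + 3.0 * v[1] };
    };
    const std::vector<double> grad = { 1.0, 2.0 };  // gradient at the current iterate
    const std::vector<double> d = newton_cg_direction(hess_prod, grad);
    std::cout << "descent direction: " << d[0] << ", " << d[1] << "\n";
    return 0;
}
```

Because only products of the form $Hp$ are evaluated, the memory cost stays linear in the number of parameters even when the full Hessian would be too large to store.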
Computation Method: dense
The method defines the Newton-CG optimizer used by other algorithms for convex optimization. There is no separate computation mode to minimize a function manually.
Programming Interface
Refer to API Reference: Newton-CG optimizer.