Block cubic Newton with greedy selection
A second-order block coordinate descent method is proposed for the unconstrained minimization of an objective function with Lipschitz continuous Hessian. At each iteration, a block of variables is selected by means of a greedy (Gauss-Southwell) rule based on the amount of first-order stationarity violation, and the block update is then computed as an approximate minimizer of a cubic model. In the proposed scheme, blocks are not required to have a predefined structure, and their size may change across iterations. For non-convex objective functions, global convergence to stationary points is proved and a worst-case iteration complexity analysis is provided. In particular, given a tolerance $\epsilon$, we show that at most ${\cal O}(\epsilon^{-3/2})$ iterations are needed to drive the stationarity violation with respect to the selected block of variables below $\epsilon$, while at most ${\cal O}(\epsilon^{-2})$ iterations are needed to drive the stationarity violation with respect to all variables below $\epsilon$. Finally, numerical results are provided.
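To illustrate the kind of iteration described above, the following Python sketch combines a Gauss-Southwell block selection with a cubic-regularized step on the selected block. It is not the authors' implementation: it assumes a fixed partition of the variables into blocks, exact block gradients and Hessians, and a known regularization constant M, and it uses a generic quasi-Newton routine as a stand-in for a dedicated cubic-subproblem solver. All names (greedy_block_cubic_newton, grad, hess, blocks, M) are illustrative placeholders.

```python
# Minimal sketch (assumptions above); not the scheme analyzed in the paper.
import numpy as np
from scipy.optimize import minimize


def greedy_block_cubic_newton(grad, hess, x0, blocks, M, tol=1e-6, max_iter=200):
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        g = grad(x)
        # Greedy (Gauss-Southwell) rule: select the block with the largest
        # first-order stationarity violation, measured by its gradient norm.
        i_star = max(range(len(blocks)), key=lambda i: np.linalg.norm(g[blocks[i]]))
        I = np.asarray(blocks[i_star])
        gI = g[I]
        if np.linalg.norm(gI) <= tol:
            break  # stationarity violation of the selected block is below tol
        HII = hess(x)[np.ix_(I, I)]

        # Cubic model restricted to the selected block:
        #   m(s) = gI' s + (1/2) s' HII s + (M/6) ||s||^3
        def model(s):
            return gI @ s + 0.5 * s @ HII @ s + (M / 6.0) * np.linalg.norm(s) ** 3

        def model_grad(s):
            return gI + HII @ s + (M / 2.0) * np.linalg.norm(s) * s

        # Approximate minimizer of the cubic model; a generic solver stands in
        # for a dedicated cubic-subproblem solver here.
        step = minimize(model, np.zeros(len(I)), jac=model_grad, method="L-BFGS-B").x
        x[I] = x[I] + step
    return x


if __name__ == "__main__":
    # Toy example: a convex quadratic split into two blocks of two variables.
    A = np.diag([1.0, 2.0, 3.0, 4.0])
    sol = greedy_block_cubic_newton(
        grad=lambda x: A @ x,
        hess=lambda x: A,
        x0=np.ones(4),
        blocks=[[0, 1], [2, 3]],
        M=1.0,
    )
    print(sol)  # expected to approach the origin
```

Note that the stopping test in this sketch checks the gradient norm of the selected block only, which mirrors the first complexity result stated above (stationarity violation with respect to the selected block); a test on the full gradient would correspond to the second result.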