A second-order block coordinate descent method is proposed for the unconstrained minimization of an objective function with a Lipschitz continuous Hessian.
At each iteration, a block of variables is selected by means of a greedy (Gauss-Southwell) rule, based on the amount of first-order stationarity violation,
and then an approximate minimizer of a cubic model is computed to update the block.
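As an illustrative sketch only (the notation $m_k$, $B$ and $\sigma_k$ is introduced here, and the precise model used in the paper may differ), such a cubic model restricted to a block $B$ of variables typically takes the adaptive-regularization form
\[
m_k(s) \;=\; f(x_k) + \nabla_B f(x_k)^\top s + \tfrac{1}{2}\, s^\top \nabla^2_{BB} f(x_k)\, s + \tfrac{\sigma_k}{3}\,\|s\|^3,
\]
where $\sigma_k > 0$ is a regularization weight and an approximate minimizer $s$ updates only the variables in $B$.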
In the proposed scheme, blocks are not required to have a predetermined structure and their size may change during the iterations.
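The following Python sketch illustrates one possible instantiation of such an iteration under simplifying assumptions: the block size and the regularization weight \texttt{sigma} are kept fixed, the cubic subproblem is solved by plain gradient descent, and the handles \texttt{f}, \texttt{grad}, \texttt{hess} are placeholders; none of these choices is taken from the paper.
\begin{verbatim}
# Illustrative sketch only (not the paper's algorithm): greedy
# (Gauss-Southwell) block selection plus an approximate minimization
# of a cubically regularized block model.
import numpy as np


def greedy_block_cubic_step(x, grad, hess, block_size=2, sigma=5.0,
                            inner_iters=100):
    """One outer iteration: select the coordinates with the largest
    gradient entries (first-order stationarity violation) and
    approximately minimize a cubic model restricted to that block."""
    g = grad(x)
    # Greedy (Gauss-Southwell) rule: coordinates with largest |g_i|.
    block = np.argsort(-np.abs(g))[:block_size]
    gB = g[block]
    HB = hess(x)[np.ix_(block, block)]
    HB_norm = np.linalg.norm(HB, 2)

    # Approximately minimize the block cubic model
    #   m(s) = gB^T s + 0.5 s^T HB s + (sigma/3) ||s||^3
    # by gradient descent on s (a crude stand-in for a real subsolver).
    s = np.zeros(block_size)
    for _ in range(inner_iters):
        model_grad = gB + HB @ s + sigma * np.linalg.norm(s) * s
        lr = 1.0 / (HB_norm + 2.0 * sigma * np.linalg.norm(s) + 1.0)
        s -= lr * model_grad

    x_new = x.copy()
    x_new[block] += s
    return x_new


if __name__ == "__main__":
    # Toy non-convex test problem (chosen only for illustration).
    grad = lambda x: 4.0 * x**3 - 4.0 * x
    hess = lambda x: np.diag(12.0 * x**2 - 4.0)

    x = np.array([1.5, -0.3, 0.8, 2.0])
    for _ in range(100):
        x = greedy_block_cubic_step(x, grad, hess)
    print("final point:", x)
    print("final gradient norm:", np.linalg.norm(grad(x)))
\end{verbatim}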
For non-convex objective functions, global convergence to stationary points is proved and a worst-case iteration complexity analysis is provided.
In particular, given a tolerance $\epsilon$,
we show that at most ${\cal O}(\epsilon^{-3/2})$ iterations are needed to drive the stationarity violation with respect to a selected block of variables below $\epsilon$,
while at most ${\cal O}(\epsilon^{-2})$ iterations are needed to drive the stationarity violation with respect to all variables below $\epsilon$.
Finally, numerical results are presented, comparing the proposed approach with other second-order methods and block selection rules.