Fast Unconstrained Optimization via Hessian Averaging and Adaptive Gradient Sampling Methods

We consider minimizing finite-sum and expectation objective functions via Hessian-averaging based subsampled Newton methods. These methods allow for gradient inexactness and have fixed per-iteration Hessian approximation costs. The recent work (Na et al. 2023) demonstrated that Hessian averaging can be utilized to achieve fast \(\mathcal{O}\left(\sqrt{\frac{\log k}{k}}\right)\) local superlinear convergence for strongly convex functions in … Read more
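
As a rough illustration of the setup above (not the algorithm analyzed in the paper), here is a minimal Python sketch of a subsampled Newton iteration that keeps a running average of subsampled Hessians while using an inexact, subsampled gradient. The oracle names grad_sample and hess_sample, the batch sizes, the uniform averaging weights, and the unit step length are assumptions made for the sketch.

```python
import numpy as np

def hessian_averaged_newton(grad_sample, hess_sample, x0, n, max_iter=50,
                            grad_batch=64, hess_batch=16, reg=1e-8, seed=0):
    """Sketch of a Hessian-averaging subsampled Newton iteration.

    grad_sample(x, idx) -> averaged gradient over samples idx, shape (d,)
    hess_sample(x, idx) -> averaged Hessian over samples idx, shape (d, d)
    The uniform running average below is one simple choice of weights; the
    paper's weighting and sampling schedule may differ.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    d = x.size
    H_avg = np.zeros((d, d))
    for k in range(1, max_iter + 1):
        gi = rng.choice(n, size=min(grad_batch, n), replace=False)
        hi = rng.choice(n, size=min(hess_batch, n), replace=False)
        g = grad_sample(x, gi)              # inexact (subsampled) gradient
        H_k = hess_sample(x, hi)            # fresh subsampled Hessian
        H_avg += (H_k - H_avg) / k          # running average of Hessians
        x = x - np.linalg.solve(H_avg + reg * np.eye(d), g)  # unit Newton step
    return x
```

Averaging keeps the per-iteration Hessian cost fixed at one small subsample while the accumulated estimate becomes progressively less noisy, which is the mechanism behind the fast local rate quoted above.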

Superlinear convergence of an interior point algorithm on linear semi-definite feasibility problems

In the literature, besides the assumption of strict complementarity, superlinear convergence of implementable polynomial-time interior point algorithms that use known search directions (namely, the HKM direction, its dual, or the NT direction) to solve semi-definite programs (SDPs) is shown by (i) assuming that the given SDP is nondegenerate and modifying these algorithms [10], or … Read more

A successive centralized circumcentered-reflection method for the convex feasibility problem

In this paper, we present a successive centralization process for the circumcentered-reflection scheme with several control sequences for solving the convex feasibility problem in Euclidean space. Assuming that a standard error bound holds, we prove the linear convergence of the method with the most violated constraint control sequence. Moreover, under additional smoothness assumptions on the … Read more
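
The update that the successive centralization builds on is the circumcentered-reflection step: reflect the current point through the sets, then move to the circumcenter of the point and its reflections. Below is a minimal Python sketch of that plain two-set step on a toy feasibility problem (a halfspace and a ball); the centralization process and control sequences of the paper are not reproduced, and the toy sets are hypothetical.

```python
import numpy as np

def circumcenter(points):
    """Circumcenter of a list of points: the point in their affine hull that is
    equidistant to all of them, obtained from a small Gram system."""
    p0 = points[0]
    V = np.stack([p - p0 for p in points[1:]], axis=1)   # basis of the affine hull
    G = V.T @ V
    rhs = 0.5 * np.array([v @ v for v in V.T])
    alpha = np.linalg.lstsq(G, rhs, rcond=None)[0]        # lstsq tolerates degeneracy
    return p0 + V @ alpha

def crm_two_sets(proj_A, proj_B, x, iters=50):
    """Plain (non-centralized) circumcentered-reflection iteration for two sets."""
    for _ in range(iters):
        rA = 2.0 * proj_A(x) - x                          # reflection through A
        rBA = 2.0 * proj_B(rA) - rA                       # reflection of rA through B
        x = circumcenter([x, rA, rBA])
    return x

# Toy instance: a halfspace {x : a.x <= b} and a Euclidean ball.
a, b = np.array([1.0, 2.0]), 1.0
proj_half = lambda x: x - max(a @ x - b, 0.0) / (a @ a) * a
center, r = np.array([3.0, 0.0]), 2.5
proj_ball = lambda x: center + r * (x - center) / max(np.linalg.norm(x - center), r)
print(crm_two_sets(proj_half, proj_ball, np.array([-5.0, 4.0])))
```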

A globally convergent trust-region LP-Newton method for nonsmooth functions under Hölder metric subregularity

We describe and analyse a globally convergent algorithm to find a possibly nonisolated zero of a piecewise smooth mapping over a polyhedral set; this formulation includes Karush-Kuhn-Tucker (KKT) systems, variational inequality problems, and generalized Nash equilibrium problems. Our algorithm is based on a modification of the fast locally convergent Linear Programming (LP)-Newton method with a … Read more
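
For context, the basic (non-globalized) LP-Newton subproblem that such methods modify can be written, for a box-constrained system F(x) = 0 and the inf-norm, as a small linear program in the step d and the scalar gamma. The sketch below uses scipy.optimize.linprog; the box constraint stands in for a general polyhedral set, and the function and argument names are illustrative rather than taken from the paper.

```python
import numpy as np
from scipy.optimize import linprog

def lp_newton_step(Fx, Gx, x, lb, ub):
    """One LP-Newton subproblem for F(x) = 0 over the box [lb, ub].

    Fx : F(x), shape (m,);  Gx : a (generalized) Jacobian at x, shape (m, n).
    With the inf-norm, the subproblem
        min gamma
        s.t. ||Fx + Gx d||_inf <= gamma * ||Fx||_inf**2
             ||d||_inf         <= gamma * ||Fx||_inf
             lb <= x + d <= ub
    is a linear program in (d, gamma).  Assumes Fx != 0.
    """
    m, n = Gx.shape
    fn = np.linalg.norm(Fx, ord=np.inf)
    c = np.zeros(n + 1)
    c[-1] = 1.0                                    # minimize gamma
    # |Fx + Gx d| <= gamma * fn**2, componentwise (two one-sided inequalities each)
    A1 = np.hstack([Gx, -(fn ** 2) * np.ones((m, 1))])
    A2 = np.hstack([-Gx, -(fn ** 2) * np.ones((m, 1))])
    # |d| <= gamma * fn, componentwise
    A3 = np.hstack([np.eye(n), -fn * np.ones((n, 1))])
    A4 = np.hstack([-np.eye(n), -fn * np.ones((n, 1))])
    A_ub = np.vstack([A1, A2, A3, A4])
    b_ub = np.concatenate([-Fx, Fx, np.zeros(2 * n)])
    bounds = [(lb[i] - x[i], ub[i] - x[i]) for i in range(n)] + [(0.0, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    d, gamma = res.x[:n], res.x[-1]
    return x + d, gamma
```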

Variants of the A-HPE and large-step A-HPE algorithms for strongly convex problems with applications to accelerated high-order tensor methods

For solving strongly convex optimization problems, we propose and study the global convergence of variants of the A-HPE and large-step A-HPE algorithms of Monteiro and Svaiter. We prove \emph{linear} and \emph{superlinear} $\mathcal{O}\left(k^{\,-k\left(\frac{p-1}{p+1}\right)}\right)$ global rates for the proposed variants of the A-HPE and large-step A-HPE methods, respectively. The parameter $p\geq 2$ appears in the (high-order) … Read more
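
To make the superlinear rate concrete, the exponent $-k\left(\frac{p-1}{p+1}\right)$ evaluates, for the two smallest admissible orders, to
\[
\mathcal{O}\!\left(k^{-k/3}\right)\ \text{for } p = 2,
\qquad
\mathcal{O}\!\left(k^{-k/2}\right)\ \text{for } p = 3 .
\]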

Accelerating Inexact Successive Quadratic Approximation for Regularized Optimization Through Manifold Identification

For regularized optimization that minimizes the sum of a smooth term and a regularizer that promotes structured solutions, inexact proximal-Newton-type methods, or successive quadratic approximation (SQA) methods, are widely used for their superlinear convergence in terms of iterations. However, unlike their counterparts in smooth optimization, they suffer from lengthy running time in solving regularized … Read more
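
A minimal sketch of one such inexact SQA (proximal-Newton) step for an $\ell_1$-regularized objective is given below, with the subproblem solved approximately by a few proximal-gradient iterations; the inner solver, the iteration counts, and the comment on manifold identification are illustrative assumptions, not the paper's acceleration scheme.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def inexact_sqa_step(grad, hess, x, lam, inner_iters=20):
    """One successive-quadratic-approximation (proximal-Newton) step for
    min f(x) + lam * ||x||_1, with the subproblem
        min_d  grad(x).d + 0.5 d'H d + lam * ||x + d||_1
    solved inexactly by a few proximal-gradient iterations."""
    g, H = grad(x), hess(x)
    L = max(np.linalg.norm(H, 2), 1e-12)   # Lipschitz constant of the quadratic model
    d = np.zeros_like(x)
    for _ in range(inner_iters):
        model_grad = g + H @ d
        d = soft_threshold(x + d - model_grad / L, lam / L) - x
    # Manifold identification idea: once the zero pattern of x + d stabilizes,
    # later subproblems can be restricted to the nonzero coordinates,
    # shrinking the quadratic model and cutting per-iteration cost.
    return x + d
```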

Limited-Memory BFGS with Displacement Aggregation

A displacement aggregation strategy is proposed for the curvature pairs stored in a limited-memory BFGS (a.k.a. L-BFGS) method such that the resulting (inverse) Hessian approximations are equal to those that would be derived from a full-memory BFGS method. This means that, if a sufficiently large number of pairs are stored, then an optimization algorithm employing … Read more
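
For reference, the standard L-BFGS two-loop recursion over the stored curvature pairs $(s_i, y_i)$ looks as follows in Python; the displacement-aggregation strategy of the paper changes which pairs are stored and how they are combined, which is not reproduced in this sketch.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion: apply the implicit inverse-Hessian
    approximation built from pairs (s_i, y_i) = (x_{i+1}-x_i, g_{i+1}-g_i)
    to the current gradient, returning the direction -H*grad."""
    q = grad.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):   # newest pair first
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if s_list:                                             # initial scaling H0 = gamma*I
        gamma = (s_list[-1] @ y_list[-1]) / (y_list[-1] @ y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):  # oldest pair first
        rho = 1.0 / (y @ s)
        beta = rho * (y @ r)
        r += (a - beta) * s
    return -r
```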

Subsampled Inexact Newton methods for minimizing large sums of convex functions

This paper deals with the minimization of a large sum of convex functions by Inexact Newton (IN) methods employing subsampled Hessian approximations. The Conjugate Gradient method is used to compute the inexact Newton step and global convergence is enforced by a nonmonotone line search procedure. The aim is to obtain methods with affordable costs and fast … Read more
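
A minimal Python sketch of the ingredients named above (a subsampled Hessian accessed through Hessian-vector products, a capped CG solve for the inexact Newton step, and a nonmonotone Armijo line search) is given below; the oracle names, batch size, memory length M, and fallback rules are assumptions for the sketch, not the paper's exact method.

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

def subsampled_inexact_newton(f, grad, hess_vec_sample, x0, n,
                              batch=32, max_iter=50, M=5, c1=1e-4, seed=0):
    """hess_vec_sample(x, v, idx) -> subsampled Hessian-vector product at x."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    hist = [f(x)]
    for k in range(max_iter):
        g = grad(x)
        idx = rng.choice(n, size=min(batch, n), replace=False)
        Hop = LinearOperator((x.size, x.size),
                             matvec=lambda v: hess_vec_sample(x, v, idx))
        d, _ = cg(Hop, -g, maxiter=50)      # capped CG gives an inexact Newton step
        if g @ d >= 0:                      # fall back to steepest descent
            d = -g
        t, f_ref = 1.0, max(hist[-M:])      # nonmonotone reference value
        while f(x + t * d) > f_ref + c1 * t * (g @ d):
            t *= 0.5
            if t < 1e-12:
                break
        x = x + t * d
        hist.append(f(x))
    return x
```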

Multipoint secant and interpolation methods with nonmonotone line search for solving systems of nonlinear equations

Multipoint secant and interpolation methods are effective tools for solving systems of nonlinear equations. They use quasi-Newton updates for approximating the Jacobian matrix. Owing to their ability to more completely utilize the information about the Jacobian matrix gathered at the previous iterations, these methods are especially efficient in the case of expensive functions. They are … Read more
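
As a baseline for comparison, the single-point secant (Broyden) update that multipoint secant and interpolation methods generalize can be sketched as follows; the initialization and stopping rule are illustrative.

```python
import numpy as np

def broyden_solve(F, x0, max_iter=100, tol=1e-10):
    """Baseline single-point secant (Broyden) method for F(x) = 0: the Jacobian
    approximation B is corrected by a rank-one update so that it satisfies the
    secant equation B_new @ s = y for the latest step s and residual change y.
    Multipoint secant/interpolation methods enforce such conditions over
    several past points instead of only the latest one."""
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    B = np.eye(x.size)                          # initial Jacobian approximation
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        s = np.linalg.solve(B, -Fx)             # quasi-Newton step
        x_new = x + s
        F_new = F(x_new)
        y = F_new - Fx
        B += np.outer(y - B @ s, s) / (s @ s)   # Broyden rank-one update
        x, Fx = x_new, F_new
    return x
```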

Superlinearly convergent smoothing Newton continuation algorithms for variational inequalities over definable sets

In this paper, we use the concept of barrier-based smoothing approximations introduced by Chua and Li to extend various smoothing Newton continuation algorithms to variational inequalities over general closed convex sets X. We prove that when the underlying barrier has a gradient map that is definable in some o-minimal structure, the iterates generated converge superlinearly … Read more