On Common-Random-Numbers and the Complexity of Adaptive Sampling Trust-Region Methods

In the context of simulation optimization (SO), Common Random Numbers (CRN) is the practice of querying the simulation-based oracle with the same random number stream at each point visited by an SO algorithm. This practice is widely believed to facilitate SO algorithm efficiency by preserving structure inherent to the objective function and gradient sample paths. …
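To make the CRN idea concrete, here is a minimal Python sketch with a toy stochastic oracle (the quadratic objective, noise model, and seeds are illustrative assumptions, not taken from the paper): replaying the same stream at two nearby points correlates the noise and shrinks the variance of the difference estimator.

```python
import numpy as np

def noisy_f(x, rng):
    # Toy simulation oracle: smooth objective plus noise driven by the
    # supplied random stream; the noise scale depends mildly on x, so a
    # common seed correlates, but does not perfectly cancel, the noise.
    return float(np.sum(x**2) + rng.normal(scale=0.1 * (1 + np.linalg.norm(x))))

x, y = np.array([1.0, 2.0]), np.array([1.1, 2.0])
seeds = range(1000)

# Independent streams at the two points: the noise does not cancel.
indep = [noisy_f(x, np.random.default_rng(2 * s)) -
         noisy_f(y, np.random.default_rng(2 * s + 1)) for s in seeds]

# CRN: the same stream is replayed at both points.
crn = [noisy_f(x, np.random.default_rng(s)) -
       noisy_f(y, np.random.default_rng(s)) for s in seeds]

print(np.var(indep), np.var(crn))  # the CRN variance is markedly smaller
```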

Regularized methods via cubic subspace minimization for nonconvex optimization

The main computational cost per iteration of adaptive cubic regularization methods for solving large-scale nonconvex problems is the computation of the step \(s_k\), which requires an approximate minimizer of the cubic model. We propose a new approach in which this minimizer is sought in a low-dimensional subspace that, in contrast to classical approaches, …
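As a rough illustration of the underlying subproblem (not the paper's new subspace construction), the sketch below minimizes the cubic model \(m(s) = g^\top s + \tfrac{1}{2} s^\top H s + \tfrac{\sigma}{3}\|s\|^3\) over a two-dimensional Krylov-type subspace; the basis choice and the inner solver are assumptions made for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def cubic_subspace_step(g, H, sigma, V):
    # Minimize m(s) = g's + 1/2 s'Hs + sigma/3 ||s||^3 over s = V @ z,
    # i.e., restricted to the subspace spanned by the columns of V.
    def m(z):
        s = V @ z
        return g @ s + 0.5 * s @ (H @ s) + (sigma / 3.0) * np.linalg.norm(s) ** 3
    z = minimize(m, np.zeros(V.shape[1]), method="BFGS").x
    return V @ z

# Illustrative low-dimensional subspace: gradient plus one Krylov direction.
n = 50
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n)); H = 0.5 * (A + A.T)  # symmetric, possibly indefinite
g = rng.standard_normal(n)
V, _ = np.linalg.qr(np.column_stack([g, H @ g]))      # orthonormal basis
s = cubic_subspace_step(g, H, sigma=1.0, V=V)
```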

Non-asymptotic superlinear convergence of Nesterov accelerated BFGS

This paper studies the convergence of a Nesterov accelerated variant of the Broyden-Fletcher-Goldfarb-Shanno (NA-BFGS) quasi-Newton method in the setting where the objective function is strongly convex, its gradient is Lipschitz continuous, and its Hessian is Lipschitz continuous at the optimal point. We demonstrate that, similar to the classic BFGS method, the Nesterov accelerated BFGS method …
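For orientation, here is a schematic Python sketch of one plausible way to combine BFGS inverse-Hessian updates with Nesterov-style extrapolation; the fixed step size and the exact placement of the extrapolation are simplifying assumptions and not necessarily the paper's scheme.

```python
import numpy as np

def na_bfgs(grad, x0, iters=100, alpha=0.5, beta=0.9):
    n = x0.size
    Hinv = np.eye(n)                              # inverse-Hessian approximation
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(iters):
        y_ext = x + beta * (x - x_prev)           # Nesterov extrapolation point
        g = grad(y_ext)
        x_new = y_ext - alpha * Hinv @ g          # quasi-Newton step (fixed step size)
        s, yk = x_new - y_ext, grad(x_new) - g
        sy = s @ yk
        if sy > 1e-12:                            # curvature condition; else skip update
            rho, I = 1.0 / sy, np.eye(n)
            Hinv = (I - rho * np.outer(s, yk)) @ Hinv @ (I - rho * np.outer(yk, s)) \
                   + rho * np.outer(s, s)
        x_prev, x = x, x_new
    return x
```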

An Explicit Three-Term Polak-Ribière-Polyak Conjugate Gradient Method for Bicriteria Optimization

In this paper we propose a Polak-Ribière-Polyak conjugate gradient-type method for solving bicriteria optimization problems without resorting to scalarization techniques. Two particular advantages of this contribution are worth noting. First, the suggested descent direction common to both criteria can be computed directly by an explicit formula without solving any intermediate subproblem. Second, the descent …
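The excerpt does not reproduce the paper's common-direction formula. As background for the single-objective building block, the classical three-term PRP direction (in the Zhang-Zhou-Li form) can be coded as follows; it satisfies \(g_k^\top d_k = -\|g_k\|^2\) regardless of the line search.

```python
import numpy as np

def three_term_prp_direction(g, g_prev, d_prev):
    # d = -g + beta * d_prev - theta * y, with y = g - g_prev.
    # By construction g @ d = -||g||^2, a descent property that holds
    # independently of the line search.
    y = g - g_prev
    denom = g_prev @ g_prev
    beta = (g @ y) / denom
    theta = (g @ d_prev) / denom
    return -g + beta * d_prev - theta * y
```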

Yet another fast variant of Newton’s method for nonconvex optimization

A second-order algorithm is proposed for minimizing smooth nonconvex functions that alternates between regularized Newton and negative curvature steps. In most cases, the Hessian matrix is regularized with the square root of the current gradient norm and an additional term taking moderate negative curvature into account, a negative curvature step being taken only exceptionally. As …
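A minimal sketch of such a step, assuming dense linear algebra and treating the curvature threshold `curv_tol` as an illustrative parameter (the paper's precise regularization and switching rule are not reproduced here):

```python
import numpy as np

def regularized_newton_step(g, H, curv_tol=1.0):
    # Leftmost eigenpair of the (symmetric) Hessian.
    vals, vecs = np.linalg.eigh(H)
    lam_min, w = vals[0], vecs[:, 0]
    if lam_min < -curv_tol:
        # Exceptional case: strong negative curvature; move along it downhill.
        return w if g @ w <= 0 else -w
    # Regularize with the square root of the gradient norm plus a shift
    # absorbing moderate negative curvature, then solve the Newton system.
    reg = np.sqrt(np.linalg.norm(g)) + max(0.0, -lam_min)
    return np.linalg.solve(H + reg * np.eye(len(g)), -g)
```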

Inexact reduced gradient methods in nonconvex optimization

This paper proposes and develops new linesearch methods with inexact gradient information for finding stationary points of nonconvex continuously differentiable functions on finite-dimensional spaces. Some abstract convergence results for a broad class of linesearch methods are established. A general scheme for inexact reduced gradient (IRG) methods is proposed, where the errors in the gradient approximation …
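One IRG-flavored iteration might look like the hedged sketch below: a backtracking Armijo-type linesearch stated with an approximate gradient. The oracle `grad_approx` and the constants are assumptions, and the paper's actual conditions on the gradient errors are not reproduced.

```python
import numpy as np

def irg_step(f, grad_approx, x, c=1e-4, tau=0.5, t0=1.0, max_backtracks=50):
    g = grad_approx(x)   # inexact gradient, e.g. finite differences or a noisy oracle
    d = -g
    t, fx = t0, f(x)
    for _ in range(max_backtracks):
        # Armijo-type decrease tested with the approximate gradient.
        if f(x + t * d) <= fx - c * t * (g @ g):
            return x + t * d
        t *= tau
    return x
```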

New subspace method for unconstrained derivative-free optimization

This paper defines an efficient subspace method, called SSDFO, for unconstrained derivative-free optimization problems where the gradients of the objective function are Lipschitz continuous but only exact function values are available. SSDFO employs line searches along directions constructed on the basis of quadratic models. These models approximate the objective function in a subspace spanned by some …
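As a generic illustration of a derivative-free subspace model step (not SSDFO itself), one can interpolate function values along a small orthonormal basis and take the separable quadratic model's minimizer as a search direction:

```python
import numpy as np

def subspace_quadratic_direction(f, x, V, h=1e-4):
    # Build a quadratic interpolation model of f restricted to the subspace
    # spanned by V's columns, using only function values: central differences
    # give the reduced gradient and the diagonal reduced curvature.
    k = V.shape[1]
    g, Hdiag = np.zeros(k), np.zeros(k)
    f0 = f(x)
    for i in range(k):
        fp, fm = f(x + h * V[:, i]), f(x - h * V[:, i])
        g[i] = (fp - fm) / (2 * h)               # reduced gradient estimate
        Hdiag[i] = (fp - 2 * f0 + fm) / h**2     # reduced curvature estimate
    z = -g / np.maximum(Hdiag, 1e-8)             # minimizer of the separable model
    return V @ z                                 # direction for a subsequent line search
```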

Iteration Complexity of Fixed-Step Methods by Nesterov and Polyak for Convex Quadratic Functions

This note considers the momentum method by Polyak and the accelerated gradient method by Nesterov, both without line search but with fixed step length, applied to strictly convex quadratic functions, assuming that exact gradients are used and appropriate upper and lower bounds for the extreme eigenvalues of the Hessian matrix are known. Simple two-dimensional examples show …
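Both fixed-step schemes are short to state for \(f(x) = \tfrac{1}{2} x^\top A x - b^\top x\) with eigenvalue bounds \(0 < \mu \le \lambda(A) \le L\); the parameter choices below are the classical ones, shown as a sketch:

```python
import numpy as np

def heavy_ball(A, b, x0, L, mu, iters=200):
    # Polyak's momentum with the classical fixed parameters.
    sL, smu = np.sqrt(L), np.sqrt(mu)
    alpha = 4.0 / (sL + smu) ** 2
    beta = ((sL - smu) / (sL + smu)) ** 2
    x_prev = x = x0.copy()
    for _ in range(iters):
        g = A @ x - b
        x, x_prev = x - alpha * g + beta * (x - x_prev), x
    return x

def nesterov(A, b, x0, L, mu, iters=200):
    # Nesterov's accelerated gradient with fixed step 1/L and fixed momentum.
    beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))
    x_prev = x = x0.copy()
    for _ in range(iters):
        y = x + beta * (x - x_prev)
        x, x_prev = y - (A @ y - b) / L, x
    return x
```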

An Explicit Spectral Fletcher-Reeves Conjugate Gradient Method for Bi-criteria Optimization

In this paper we propose a spectral Fletcher-Reeves conjugate gradient-like method (SFRCG) for solving unconstrained bi-criteria minimisation problems without using any scalarization technique. We suggest an explicit formula for computing a descent direction common to both criteria. This direction furthermore satisfies a sufficient descent property which depends neither on the line search nor …
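The bicriteria formula is not given in the excerpt; as a single-objective illustration of the spectral idea, one can pair the Fletcher-Reeves coefficient with a spectral parameter \(\theta_k\) chosen so that \(g_k^\top d_k = -\|g_k\|^2\) holds independently of the line search (an illustrative choice, not necessarily the paper's):

```python
import numpy as np

def spectral_fr_direction(g, g_prev, d_prev):
    # d = -theta * g + beta * d_prev with the Fletcher-Reeves coefficient;
    # theta is picked so that g @ d = -||g||^2 by construction.
    gg = g @ g
    beta = gg / (g_prev @ g_prev)           # Fletcher-Reeves coefficient
    theta = 1.0 + beta * (g @ d_prev) / gg  # spectral parameter enforcing descent
    return -theta * g + beta * d_prev
```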

Shape-Changing Trust-Region Methods Using Multipoint Symmetric Secant Matrices

In this work, we consider methods for large-scale and nonconvex unconstrained optimization. We propose a new trust-region method whose subproblem is defined using a so-called “shape-changing” norm together with densely-initialized multipoint symmetric secant (MSS) matrices to approximate the Hessian. Shape-changing norms and dense initializations have been successfully used in the context of traditional quasi-Newton …
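For context, the sketch below shows only the generic trust-region outer loop with a pluggable Hessian approximation and a Euclidean-ball Cauchy step standing in for the paper's shape-changing subproblem; the MSS update itself is not reproduced.

```python
import numpy as np

def cauchy_point(g, B, delta):
    # Minimizer of the quadratic model along -g inside a Euclidean ball.
    gnorm = np.linalg.norm(g)
    t = delta / gnorm
    gBg = g @ (B @ g)
    if gBg > 0:
        t = min(t, gnorm**2 / gBg)
    return -t * g

def trust_region(f, grad, hess_approx, x, delta=1.0, eta=0.1, iters=100):
    for _ in range(iters):
        g, B = grad(x), hess_approx(x)
        s = cauchy_point(g, B, delta)
        pred = -(g @ s + 0.5 * s @ (B @ s))         # predicted model decrease
        rho = (f(x) - f(x + s)) / max(pred, 1e-16)  # actual/predicted ratio
        if rho >= eta:
            x = x + s                               # accept the step
        delta = 2.0 * delta if rho > 0.75 else (0.5 * delta if rho < 0.25 else delta)
    return x
```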