The Euclidean distance degree of an algebraic variety

The nearest point map of a real algebraic variety with respect to Euclidean distance is an algebraic function. For instance, for varieties of low rank matrices, the Eckart-Young Theorem states that this map is given by the singular value decomposition. This article develops a theory of such nearest point maps from the perspective of computational …
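As a concrete instance of such a nearest point map, the Eckart-Young truncation can be sketched in a few lines of numpy (an illustration of the classical theorem, not code from the article):

```python
import numpy as np

def nearest_rank_k(A, k):
    # Eckart-Young: the nearest matrix of rank at most k (in Frobenius or
    # spectral norm) is obtained by truncating the SVD to the k largest
    # singular values.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k]

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 5))
Ak = nearest_rank_k(A, 2)
# The Frobenius-norm distance equals the norm of the discarded singular values.
s = np.linalg.svd(A, compute_uv=False)
err = np.linalg.norm(A - Ak)
```

The optimal approximation error is exactly $\sqrt{\sigma_{k+1}^2 + \cdots + \sigma_r^2}$, which the snippet verifies numerically.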

Algebraic rules for quadratic regularization of Newton’s method

In this work we propose a class of quasi-Newton methods to minimize a twice differentiable function with Lipschitz continuous Hessian. These methods are based on the quadratic regularization of Newton’s method, with algebraic explicit rules for computing the regularizing parameter. The convergence properties of this class of methods are analysed. We show that if the …
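The basic iteration can be sketched as follows. The explicit rule $\sigma = \sigma_0\sqrt{\|g\|}$ used here is an illustrative stand-in for the paper's algebraic rules (it is not taken from the paper); it vanishes with the gradient, so full Newton steps are recovered near a solution:

```python
import numpy as np

def reg_newton(grad, hess, x, sigma0=1.0, tol=1e-8, max_iter=100):
    # Quadratically regularized Newton step: solve (H + sigma*I) d = -g.
    # The rule sigma = sigma0 * sqrt(||g||) is a hypothetical example of an
    # explicit regularization rule, not the one from the paper.
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        sigma = sigma0 * np.sqrt(gnorm)
        d = np.linalg.solve(hess(x) + sigma * np.eye(len(x)), -g)
        x = x + d
    return x

# Smooth convex test problem: f(x) = sum(x_i^4 + x_i^2), minimized at 0.
grad = lambda x: 4 * x**3 + 2 * x
hess = lambda x: np.diag(12 * x**2 + 2)
x_star = reg_newton(grad, hess, np.array([1.0, 2.0]))
```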

Efficient tridiagonal preconditioner for the matrix-free truncated Newton method

In this report, we study an efficient tridiagonal preconditioner, based on numerical differentiation, applied to the matrix-free truncated Newton method for unconstrained optimization. It is proved that this preconditioner is positive definite for many practical problems. The efficiency of the resulting matrix-free truncated Newton method is demonstrated by results of extensive numerical experiments. Citation: Technical …
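The two ingredients can be sketched as follows: a tridiagonal Hessian approximation built by numerical differentiation of the gradient, and an O(n) tridiagonal solve for applying the preconditioner. This is only an illustration of the general idea; the report's actual construction additionally guarantees positive definiteness, which this naive version does not:

```python
import numpy as np

def tridiag_of_hessian(grad, x, h=1e-6):
    # Approximate the diagonal and first off-diagonal of the Hessian by
    # forward differences of the gradient along coordinate directions.
    n = len(x)
    g0 = grad(x)
    diag = np.empty(n)
    off = np.empty(n - 1)
    for i in range(n):
        e = np.zeros(n); e[i] = h
        col = (grad(x + e) - g0) / h  # approximates Hessian column i
        diag[i] = col[i]
        if i < n - 1:
            off[i] = col[i + 1]
    return diag, off

def thomas_solve(diag, off, b):
    # Solve the symmetric tridiagonal system T y = b in O(n) (Thomas
    # algorithm), as needed when applying the preconditioner inside CG.
    n = len(diag)
    d = diag.astype(float)
    r = b.astype(float)
    for i in range(1, n):
        w = off[i - 1] / d[i - 1]
        d[i] -= w * off[i - 1]
        r[i] -= w * r[i - 1]
    y = np.empty(n)
    y[-1] = r[-1] / d[-1]
    for i in range(n - 2, -1, -1):
        y[i] = (r[i] - off[i] * y[i + 1]) / d[i]
    return y

# For a quadratic with tridiagonal Hessian both steps are (nearly) exact.
A = 2 * np.eye(5) - np.eye(5, k=1) - np.eye(5, k=-1)
d, e = tridiag_of_hessian(lambda x: A @ x, np.zeros(5))
b = np.arange(1.0, 6.0)
y = thomas_solve(d, e, b)
```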

An inexact and nonmonotone proximal method for smooth unconstrained minimization

An implementable proximal point algorithm is established for the smooth nonconvex unconstrained minimization problem. At each iteration, the algorithm approximately minimizes a general quadratic model using a truncated strategy with step-length control. The main contributions are: (i) a framework for updating the proximal parameter; (ii) inexact criteria for approximately solving the subproblems; (iii) a nonmonotone …
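The shape of such a method can be sketched as follows. The relative-error stopping rule for the subproblem is one common inexactness criterion, used here as a simplified stand-in for the paper's criteria (the nonmonotone acceptance test is omitted), and `L` is an assumed gradient Lipschitz bound:

```python
import numpy as np

def inexact_proximal_point(grad, x, lam=1.0, L=10.0, outer=60):
    # Proximal point: x_{k+1} ~ argmin_y f(y) + (lam/2) * ||y - x_k||^2.
    # Each subproblem is solved inexactly by gradient steps, stopping once
    # its gradient is a fixed fraction of lam * ||y - x_k||.
    for _ in range(outer):
        y = x.copy()
        for _ in range(200):
            gy = grad(y) + lam * (y - x)  # gradient of the prox subproblem
            if np.linalg.norm(gy) <= 0.1 * lam * np.linalg.norm(y - x) + 1e-12:
                break
            y = y - gy / (L + lam)  # safe step for a (L+lam)-smooth subproblem
        x = y
    return x

# Sanity check on a strongly convex quadratic f(x) = 0.5 * ||x||^2.
x_star = inexact_proximal_point(lambda x: x, np.array([3.0, -2.0]))
```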

An example of slow convergence for Newton’s method on a function with globally Lipschitz continuous Hessian

An example is presented where Newton’s method for unconstrained minimization is applied to find an $\epsilon$-approximate first-order critical point of a smooth function and takes a multiple of $\epsilon^{-2}$ iterations and function evaluations to terminate, which is as many as the steepest-descent method in its worst case. The novel feature of the proposed example is that …

A note on Legendre-Fenchel conjugate of the product of two positive-definite quadratic forms

The Legendre-Fenchel conjugate of the product of two positive-definite quadratic forms was posed as an open question in the field of nonlinear analysis and optimization by Hiriart-Urruty ['Question 11' in {\it SIAM Review} 49, 255-273, (2007)]. Under a convex assumption on the function, it was answered by Zhao [SIAM J. Matrix Analysis $\&$ Applications, 31(4), …
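For reference, and assuming the standard definition of the conjugate (the normalization of $f$ varies across sources), the object in question is

\[
f^{*}(y) \;=\; \sup_{x \in \mathbb{R}^n} \bigl\{ \langle x, y\rangle - f(x) \bigr\},
\qquad
f(x) \;=\; \tfrac{1}{2}\,(x^{\mathsf T} A x)\,(x^{\mathsf T} B x),
\quad A \succ 0,\; B \succ 0.
\]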

Worst-case evaluation complexity of non-monotone gradient-related algorithms for unconstrained optimization

The worst-case evaluation complexity of finding an approximate first-order critical point using gradient-related non-monotone methods for smooth nonconvex and unconstrained problems is investigated. The analysis covers a practical linesearch implementation of these popular methods, allowing for an unknown number of evaluations of the objective function (and its gradient) per iteration. It is shown that this …
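A representative member of this class is steepest descent with a Grippo-Lampariello-Lucidi-style nonmonotone Armijo linesearch, where sufficient decrease is measured against the maximum of the last $M$ objective values rather than $f(x_k)$ alone. A minimal sketch (parameter values are illustrative, not from the paper):

```python
import numpy as np
from collections import deque

def nonmonotone_armijo(f, x, d, g, hist, c=1e-4, alpha=1.0, shrink=0.5, max_back=60):
    # Nonmonotone Armijo test: compare against the max of recent f-values.
    fmax = max(hist)
    slope = g @ d  # directional derivative; negative for a descent direction
    for _ in range(max_back):
        if f(x + alpha * d) <= fmax + c * alpha * slope:
            break
        alpha *= shrink
    return alpha

def nonmonotone_gradient_method(f, grad, x, M=5, iters=300, tol=1e-10):
    hist = deque([f(x)], maxlen=M)  # memory of the last M objective values
    for _ in range(iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        a = nonmonotone_armijo(f, x, -g, g, hist)
        x = x - a * g
        hist.append(f(x))
    return x

# Sanity check on a mildly ill-conditioned convex quadratic.
Aq = np.diag([1.0, 3.0])
x_star = nonmonotone_gradient_method(lambda x: 0.5 * x @ Aq @ x,
                                     lambda x: Aq @ x,
                                     np.array([2.0, 1.0]))
```

Note that each backtracking loop may cost several function evaluations, which is exactly the per-iteration evaluation count the abstract's complexity analysis accounts for.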

A class of derivative-free nonmonotone optimization algorithms employing coordinate rotations and gradient approximations

In this paper we study a class of derivative-free unconstrained minimization algorithms employing nonmonotone inexact linesearch techniques along a set of suitable search directions. In particular, we define globally convergent nonmonotone versions of some well-known derivative-free methods and we propose a new algorithm combining coordinate rotations with approximate simplex gradients. Through extensive numerical experimentation, we …
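To illustrate the two ingredients, a forward simplex gradient along rotated coordinate directions can be computed as follows (a generic sketch of the construction, not the paper's specific algorithm):

```python
import numpy as np

def rotated_simplex_gradient(f, x, Q, h=1e-5):
    # Forward simplex gradient along the rotated directions Q e_1, ..., Q e_n.
    # For an orthogonal Q the least-squares linear model gives
    # g = Q @ (delta / h), an O(h)-accurate gradient estimate using only
    # function values.
    f0 = f(x)
    delta = np.array([f(x + h * Q[:, i]) - f0 for i in range(len(x))])
    return Q @ (delta / h)

# Sanity check: rotate the coordinates by a random orthogonal matrix.
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
x = rng.standard_normal(4)
g_approx = rotated_simplex_gradient(lambda v: np.sum(v**2), x, Q)
```

For $f(v) = \|v\|^2$ the true gradient is $2x$, which the estimate matches to within $O(h)$.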

On the connection between the conjugate gradient method and quasi-Newton methods on quadratic problems

It is well known that the conjugate gradient method and a quasi-Newton method, using any well-defined update matrix from the one-parameter Broyden family of updates, produce identical iterates on a quadratic problem with positive-definite Hessian. This equivalence does not hold for an arbitrary quasi-Newton method. We define precisely the conditions on the update matrix in the …
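The classical equivalence is easy to verify numerically: linear CG and BFGS (a member of the Broyden family) with $H_0 = I$ and exact line searches produce the same iterates on a strictly convex quadratic. A self-contained check, assuming a randomly generated well-conditioned test problem:

```python
import numpy as np

def cg_iterates(A, b, x0, k):
    # Linear conjugate gradient on f(x) = 0.5 x^T A x - b^T x.
    x, g = x0.copy(), A @ x0 - b
    d = -g
    xs = [x.copy()]
    for _ in range(k):
        alpha = (g @ g) / (d @ A @ d)  # exact line search
        x = x + alpha * d
        g_new = A @ x - b
        d = -g_new + ((g_new @ g_new) / (g @ g)) * d
        g = g_new
        xs.append(x.copy())
    return xs

def bfgs_iterates(A, b, x0, k):
    # BFGS (inverse-Hessian form, H0 = I) with exact line search on the
    # same quadratic; on such problems its iterates coincide with CG's.
    n = len(x0)
    x, H, g = x0.copy(), np.eye(n), A @ x0 - b
    xs = [x.copy()]
    for _ in range(k):
        d = -H @ g
        alpha = -(g @ d) / (d @ A @ d)  # exact minimizer along d
        s = alpha * d
        x = x + s
        g_new = A @ x - b
        y = g_new - g
        rho = 1.0 / (y @ s)
        V = np.eye(n) - rho * np.outer(s, y)
        H = V @ H @ V.T + rho * np.outer(s, s)
        g = g_new
        xs.append(x.copy())
    return xs

rng = np.random.default_rng(2)
M = rng.standard_normal((6, 6))
A = M @ M.T + 6 * np.eye(6)  # well-conditioned SPD test matrix
b = rng.standard_normal(6)
cg_xs = cg_iterates(A, b, np.zeros(6), 4)
qn_xs = bfgs_iterates(A, b, np.zeros(6), 4)
```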

The Generalized Trust Region Subproblem

The \emph{interval bounded generalized trust region subproblem} (GTRS) consists in minimizing a general quadratic objective, $q_0(x) \rightarrow \min$, subject to an upper and lower bounded general quadratic constraint, $\ell \leq q_1(x) \leq u$. This means that there are no definiteness assumptions on either quadratic function. We first study characterizations of optimality for this \emph{implicitly} convex …