A New Probabilistic Algorithm for Solving Nonlinear Equations Systems

In this paper, we consider a class of optimization problems with the following characteristic: there exists a fixed number k, independent of the problem size n, such that randomly changing the values of k variables gives a chance of finding a new solution that is better than … Read more
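
As a rough sketch of the kind of randomized move the abstract describes (this is not the authors' algorithm; the objective f, the sampling routine sample_value and all other names below are illustrative assumptions), a local search that re-samples k randomly chosen variables and keeps only improving changes could look like this:

    import random

    def k_change_search(f, x, k, sample_value, iters=1000):
        """Hypothetical sketch: re-sample k randomly chosen variables and
        keep the change whenever the objective f improves."""
        best = f(x)
        for _ in range(iters):
            idx = random.sample(range(len(x)), k)   # k variables, chosen at random
            trial = list(x)
            for i in idx:
                trial[i] = sample_value(i)          # draw a new value for variable i
            val = f(trial)
            if val < best:                          # accept only improving moves
                x, best = trial, val
        return x, best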

Global Search Strategies for Solving Multilinear Least-squares Problems

The multilinear least-squares (MLLS) problem is an extension of the linear least-squares problem. The difference is that a multilinear operator is used in place of a matrix-vector product. The MLLS problem is typically large-scale and characterized by a large number of local minimizers. It originates, for instance, from the design of filter networks. We present … Read more
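
For orientation (the notation here is assumed, not taken from the abstract), the linear and multilinear problems can be contrasted as

    \min_x \|Ax - b\|^2
    \qquad\text{versus}\qquad
    \min_{x^{(1)},\dots,x^{(d)}} \big\| \mathcal{A}\big(x^{(1)},\dots,x^{(d)}\big) - b \big\|^2 ,

where \mathcal{A} is linear in each argument x^{(j)} when the others are held fixed; it is this blockwise structure that makes the objective nonconvex and rich in local minimizers.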

Approximate spectral factorization for design of efficient sub-filter sequences

A well-known approach to the design of computationally efficient filters is to use spectral factorization, i.e. a decomposition of a filter into a sequence of sub-filters. Due to the sparsity of the sub-filters, the typical processing speedup factor is in the range 1-10 in 2D and 10-100 in 3D. The design of such … Read more
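
The decomposition idea can be illustrated with a toy sketch (the kernels below are arbitrary illustrative numbers, not a designed filter): applying short sub-filters in sequence is equivalent to applying their convolution once.

    import numpy as np

    # Two short sub-filter kernels; their cascade equals one longer filter.
    sub_filters = [np.array([1.0, 2.0, 1.0]) / 4.0,   # small smoothing kernel
                   np.array([1.0, 0.0, -1.0]) / 2.0]  # small derivative kernel

    equivalent_filter = sub_filters[0]
    for h in sub_filters[1:]:
        equivalent_filter = np.convolve(equivalent_filter, h)

    signal = np.random.randn(256)

    # Filter with the sub-filters one after another ...
    out_seq = signal
    for h in sub_filters:
        out_seq = np.convolve(out_seq, h)

    # ... and compare with filtering once by the composed kernel.
    out_full = np.convolve(signal, equivalent_filter)
    assert np.allclose(out_seq, out_full)

The computational saving comes from the sub-filters being short and sparse, so the cascade is cheaper to apply than the equivalent dense filter.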

Convergence analysis of a proximal Gauss-Newton method

An extension of the Gauss-Newton algorithm is proposed to find local minimizers of penalized nonlinear least-squares problems, under generalized Lipschitz assumptions. Local convergence results are obtained, together with an estimate of the radius of the convergence ball. Some applications to solving constrained nonlinear equations are discussed and the numerical performance of … Read more
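
For a penalized problem of the generic form \min_x \tfrac12\|F(x)\|^2 + h(x) (notation assumed here, not taken from the abstract), a proximal Gauss-Newton step typically keeps the penalty h exact and linearizes only F:

    x_{k+1} \in \arg\min_x \; \tfrac12 \big\| F(x_k) + J(x_k)(x - x_k) \big\|^2 + h(x),

where J(x_k) denotes the Jacobian of F at x_k.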

On the convergence of an inexact Gauss-Newton trust-region method for nonlinear least-squares problems with simple bounds

We introduce an inexact Gauss-Newton trust-region method for solving bound-constrained nonlinear least-squares problems where, at each iteration, a trust-region subproblem is approximately solved by the Conjugate Gradient method. Provided suitable control is exercised over the accuracy to which the subproblems are solved, we prove that the method has global and fast asymptotic convergence properties. … Read more
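
Leaving aside the treatment of the bounds (which is part of the method described above), the Gauss-Newton trust-region subproblem solved inexactly at each iteration has the generic form

    \min_p \; \tfrac12 \big\| F(x_k) + J(x_k)\,p \big\|^2 \quad \text{subject to} \quad \|p\| \le \Delta_k ,

which truncated Conjugate Gradient iterations can solve approximately without factorizing J(x_k)^\top J(x_k).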

Complexity bounds for second-order optimality in unconstrained optimization

This paper examines worst-case evaluation bounds for finding weak minimizers in unconstrained optimization. For the cubic regularization algorithm, Nesterov and Polyak (2006) and Cartis, Gould and Toint (2010) show that at most O(epsilon^{-3}) iterations may be needed to find an iterate that is within epsilon of satisfying second-order optimality conditions. We first show … Read more
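
For context, the epsilon-approximate second-order conditions such bounds usually refer to are of the form (a standard formulation, not quoted from the abstract; some authors use separate tolerances for the two conditions)

    \|\nabla f(x)\| \le \epsilon \qquad \text{and} \qquad \lambda_{\min}\big(\nabla^2 f(x)\big) \ge -\epsilon .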

Preconditioning and Globalizing Conjugate Gradients in Dual Space for Quadratically Penalized Nonlinear Least-Squares Problems

When solving nonlinear least-squares problems, it is often useful to regularize the problem using a quadratic term, a practice which is especially common in applications arising in inverse calculations. A solution method derived from a trust-region Gauss-Newton algorithm is analyzed for such applications, where, unlike in the standard algorithm, the least-squares subproblem solved at each … Read more
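
To indicate why working in the dual space can pay off, consider the generic regularized Gauss-Newton subproblem (notation assumed here, not taken from the abstract) with Jacobian J_k and residual r_k of dimension m:

    \min_s \; \tfrac12 \|J_k s - r_k\|^2 + \tfrac{\mu}{2}\|s\|^2 ,
    \qquad
    s = (J_k^\top J_k + \mu I_n)^{-1} J_k^\top r_k = J_k^\top (J_k J_k^\top + \mu I_m)^{-1} r_k ,

so the Conjugate Gradient iteration can equivalently be run on an m-by-m system in the space of the residuals, which is attractive when m is much smaller than the number n of variables.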

Solving structured nonlinear least-squares and nonlinear feasibility problems with expensive functions

We present an algorithm for nonlinear least-squares and nonlinear feasibility problems, i.e. for systems of nonlinear equations and nonlinear inequalities that depend on the outcome of expensive functions whose derivatives are assumed to be unavailable. Our algorithm combines derivative-free techniques with filter trust-region methods to keep the number of expensive function evaluations low and … Read more
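
One standard way to cast such a feasibility problem in least-squares form (a common reformulation, not necessarily the one used by the algorithm above) is

    \min_x \; \tfrac12 \|c_E(x)\|^2 + \tfrac12 \big\| \max\big(0,\, c_I(x)\big) \big\|^2 ,

where c_E collects the equations c_E(x)=0, c_I the inequalities c_I(x) \le 0, and the maximum is taken componentwise; the objective vanishes exactly at the feasible points.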

On the oracle complexity of first-order and derivative-free algorithms for smooth nonconvex minimization

The (optimal) worst-case function/gradient evaluation complexity analysis available for the Adaptive Regularization algorithm with Cubics (ARC) for nonconvex smooth unconstrained optimization is extended to finite-difference versions of this algorithm, yielding complexity bounds for first-order and derivative-free methods applied to the same problem class. A comparison with the results obtained for derivative-free methods by Vicente … Read more
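
For reference, the model approximately minimized at each ARC iteration is the cubic-regularized quadratic (standard ARC notation; the precise finite-difference scheme analyzed is described in the paper)

    m_k(s) = f(x_k) + g_k^\top s + \tfrac12\, s^\top B_k s + \tfrac{\sigma_k}{3}\|s\|^3 ,

where in the finite-difference variants the gradient g_k (and possibly the Hessian approximation B_k) is replaced by finite-difference estimates such as (g_k)_i \approx \big( f(x_k + h e_i) - f(x_k) \big)/h.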

Efficient preconditioner updates for shifted linear systems

We present a new technique for building effective and low-cost preconditioners for sequences of shifted linear systems (A+aI)x=b, where A is symmetric positive definite and a>0. This technique updates a preconditioner for A, available in the form of an LDL^T factorization, by modifying only the nonzero entries of the L factor in such a … Read more
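
A much simpler update, given only to fix ideas (it shifts the D factor rather than modifying the entries of L, so it is not the technique described above; all names are illustrative), is to reuse the factor L and precondition A+aI by L(D+aI)L^T:

    import numpy as np
    from scipy.linalg import cholesky, solve_triangular

    def ldl_from_cholesky(A):
        """Unit lower-triangular L and diagonal d with A = L diag(d) L^T (A assumed SPD)."""
        C = cholesky(A, lower=True)   # A = C C^T
        d = np.diag(C) ** 2
        L = C / np.diag(C)            # rescale columns so diag(L) = 1
        return L, d

    def apply_shifted_preconditioner(L, d, a, r):
        """Return M^{-1} r for M = L diag(d + a) L^T, a cheap guess at (A + a I)^{-1} r."""
        y = solve_triangular(L, r, lower=True)
        y = y / (d + a)
        return solve_triangular(L.T, y, lower=False)

The appeal of such updates is that the factor L is reused across all shifts a, so each new system in the sequence costs only a diagonal update and two triangular solves per preconditioner application.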