Solving nonlinear systems of equations via spectral residual methods: stepsize selection and applications

Spectral residual methods are derivative-free procedures with a low cost per iteration for solving nonlinear systems of equations. They are generally coupled with a nonmonotone linesearch strategy and compare well with Newton-based methods on large nonlinear systems and on sequences of nonlinear systems. The residual vector is used as the search direction, and choosing the steplength has a …
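
As a concrete illustration, here is a minimal DF-SANE-style sketch in Python, not the exact algorithm analysed in the paper: the residual itself serves as the search direction, a Barzilai-Borwein-type spectral steplength scales it, and a nonmonotone linesearch safeguards the step. Parameter names and defaults are illustrative assumptions.

```python
import numpy as np

def spectral_residual(F, x, tol=1e-8, max_iter=500, M=10, gamma=1e-4):
    """Minimal DF-SANE-style sketch: the residual F(x) is the search
    direction, scaled by a spectral (Barzilai-Borwein-type) steplength
    and safeguarded by a nonmonotone backtracking linesearch."""
    Fx = F(x)
    sigma = 1.0
    hist = [Fx @ Fx]                        # merit values ||F(x)||^2
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        d = -sigma * Fx                     # spectral residual direction
        f_ref = max(hist[-M:])              # nonmonotone reference value
        alpha = 1.0
        for _ in range(30):                 # safeguarded backtracking
            x_new = x + alpha * d
            F_new = F(x_new)
            f_new = F_new @ F_new
            if f_new <= f_ref - gamma * alpha**2 * hist[-1]:
                break
            alpha *= 0.5
        s, y = x_new - x, F_new - Fx
        sigma = (s @ s) / max(abs(s @ y), 1e-16)   # spectral steplength
        x, Fx = x_new, F_new
        hist.append(f_new)
    return x
```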

A robust method based on LOVO functions for solving least squares problems

The robust adjustment of nonlinear models to data is considered in this paper. When data come from real experiments, measurement errors can produce discrepant values (outliers), which should be ignored when fitting models to the data. This work presents a Low Order-Value Optimization (LOVO) version of the Levenberg-Marquardt algorithm, which is …
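
The LOVO idea can be sketched as follows, under the illustrative assumption (not taken from the paper's exact statement) that only the p smallest squared residuals enter the objective at each iterate; a single damped Levenberg-Marquardt step on that subset then looks like:

```python
import numpy as np

def lovo_lm_step(residual, jacobian, x, p, lam):
    """One hypothetical LOVO Levenberg-Marquardt step: keep only the
    p smallest squared residuals (discarding likely outliers) and
    solve the damped normal equations on that subset."""
    r = residual(x)                    # full residual vector, length m
    J = jacobian(x)                    # m x n Jacobian
    keep = np.argsort(r**2)[:p]        # indices of the p smallest squares
    r_p, J_p = r[keep], J[keep, :]
    A = J_p.T @ J_p + lam * np.eye(x.size)   # LM-damped Gauss-Newton matrix
    return x - np.linalg.solve(A, J_p.T @ r_p)
```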

Continuous selections of solutions for locally Lipschitzian equations

This paper answers in the affirmative a long-standing question in nonlinear analysis concerning the existence of a continuous single-valued local selection of the right inverse of a locally Lipschitzian mapping. Moreover, we establish a much more general result, providing conditions for the existence of a continuous single-valued selection not only locally, but on any given …

A relaxed interior point method for low-rank semidefinite programming problems with applications to matrix completion

A new relaxed variant of the interior point method for low-rank semidefinite programming problems is proposed in this paper. The method is a step outside the usual interior point framework. In anticipation of converging to a low-rank primal solution, a special nearly low-rank form is imposed on all primal iterates. To accommodate such a (restrictive) …
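
To make the "nearly low-rank" structure concrete, the following hypothetical helper restores an iterate to the form (small multiple of the identity) + (rank-r part); the paper's exact parametrization may differ, so treat this purely as an illustration.

```python
import numpy as np

def nearly_low_rank(X, r, mu):
    """Hypothetical illustration only: return a 'nearly low-rank'
    matrix mu*I + U U^T, where the rank-r part is recovered from the
    top eigenpairs of the symmetric iterate X."""
    w, V = np.linalg.eigh(X)                         # eigenvalues ascending
    U = V[:, -r:] * np.sqrt(np.maximum(w[-r:] - mu, 0.0))
    return mu * np.eye(X.shape[0]) + U @ U.T
```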

Hybrid methods for nonlinear least squares problems

This contribution contains a description and analysis of effective methods for minimizing the nonlinear least squares function $F(x) = (1/2) f^T(x) f(x)$, where $x \in R^n$ and $f \in R^m$, together with extensive computational tests and comparisons of the introduced methods. All hybrid methods are described in detail, and their global convergence is proved …
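
A hybrid method of this flavour can be sketched as a switch between a Gauss-Newton step and a quasi-Newton fallback; the angle test and constant c below are assumptions of this sketch, not the paper's specific switching rules.

```python
import numpy as np

def hybrid_gn_step(f, J, x, B, c=0.1):
    """Illustrative hybrid step for F(x) = (1/2) f(x)^T f(x): take the
    Gauss-Newton step when it is a sufficiently good descent direction
    (angle test with constant c, an assumption of this sketch),
    otherwise fall back on a quasi-Newton step with Hessian
    approximation B."""
    fx, Jx = f(x), J(x)
    g = Jx.T @ fx                                    # gradient of F
    s = np.linalg.lstsq(Jx, -fx, rcond=None)[0]      # Gauss-Newton step
    if -(g @ s) >= c * np.linalg.norm(g) * np.linalg.norm(s):
        return x + s                                 # accept GN step
    return x - np.linalg.solve(B, g)                 # quasi-Newton fallback
```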

Convergence and evaluation-complexity analysis of a regularized tensor-Newton method for solving nonlinear least-squares problems subject to convex constraints

Given a twice-continuously differentiable vector-valued function $r(x)$, a local minimizer of $\|r(x)\|_2$ within a convex set is sought. We propose and analyse tensor-Newton methods, in which $r(x)$ is replaced locally by its second-order Taylor approximation. Convergence is controlled by regularization of various orders. We establish global convergence to a constrained first-order critical point of $\|r(x)\|_2$, …
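
The regularized tensor-Newton subproblem can be sketched as follows for the unconstrained case (the convex-constrained version would additionally restrict $s$): the residual is replaced by its second-order Taylor model and a $p$-th order regularizer is added. The inner solver choice here (BFGS via SciPy) is an illustrative assumption.

```python
import numpy as np
from scipy.optimize import minimize

def tensor_newton_step(r, J, H, x, sigma=1.0, p=3):
    """Sketch of one regularized tensor-Newton subproblem: r(x+s) is
    replaced by its second-order Taylor model T(x,s) and the merit
    0.5*||T(x,s)||^2 + (sigma/p)*||s||^p is minimized over s.
    H(x) returns the m x n x n tensor of residual Hessians."""
    rx, Jx, Hx = r(x), J(x), H(x)

    def model(s):
        T = rx + Jx @ s + 0.5 * np.einsum('ijk,j,k->i', Hx, s, s)
        return 0.5 * (T @ T) + (sigma / p) * np.linalg.norm(s)**p

    s0 = np.zeros(x.size)
    return x + minimize(model, s0, method='BFGS').x
```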

A note on solving nonlinear optimization problems in variable precision

This short note considers an efficient variant of the trust-region algorithm with dynamic accuracy proposed by Carter (1993) and by Conn, Gould and Toint (2000) as a tool for very high-performance computing, an area where allowing multi-precision computations is critical for keeping energy dissipation under control. Numerical experiments are presented indicating that the use …
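
A toy sketch of the dynamic-accuracy idea: evaluate the objective in single precision when the outer algorithm can tolerate a large evaluation error, and in double precision otherwise. The switching threshold below is an assumption for illustration, not the note's actual accuracy management.

```python
import numpy as np

def eval_with_precision(f, x, accuracy):
    """Toy illustration of dynamic accuracy: compute the objective in
    single precision when a loose accuracy suffices (cheaper and less
    energy-hungry on suitable hardware), else in double precision.
    The threshold 1e-5 is an illustrative assumption."""
    if accuracy > 1e-5:                                  # loose accuracy is enough
        return float(np.float32(f(np.asarray(x, dtype=np.float32))))
    return float(f(np.asarray(x, dtype=np.float64)))
```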

On the complexity of solving feasibility problems

We consider feasibility problems defined by a set of constraints that exhibit gradient Hölder continuity, plus additional constraints defined by the affordability of obtaining approximate minimizers of quadratic models over the associated feasible set. Each iteration of the method introduced in this paper involves the approximate minimization of a two-norm regularized quadratic subject to the …
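
For the unconstrained core of such an iteration, the exact minimizer of the two-norm regularized quadratic model $\|c(x) + J(x)s\|^2 + \sigma \|s\|^2$ solves damped normal equations; a minimal sketch (omitting the projection onto the additional feasible set) is:

```python
import numpy as np

def regularized_feasibility_step(c, J, x, sigma):
    """Illustrative inner step: minimize the two-norm regularized
    quadratic model ||c(x) + J(x)s||^2 + sigma*||s||^2, whose exact
    unconstrained minimizer solves the damped normal equations below.
    Restriction to the additional feasible set is omitted here."""
    cx, Jx = c(x), J(x)
    A = Jx.T @ Jx + sigma * np.eye(x.size)
    s = -np.linalg.solve(A, Jx.T @ cx)
    return x + s
```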

A class of derivative-free CG projection methods for nonsmooth equations with an application to the LASSO problem

In this paper, based on a modified Gram-Schmidt (MGS) process, we propose a class of derivative-free conjugate gradient (CG) projection methods for nonsmooth equations with convex constraints. Two attractive features of the new class of methods are: (i) the generated direction contains a free vector, which can be set to any vector such that the …
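
A generic derivative-free CG projection iteration for $F(x) = 0$ over a convex set $C$ (with a Fletcher-Reeves-type parameter standing in for the paper's MGS-based direction and its free vector) might look like:

```python
import numpy as np

def cg_projection(F, proj_C, x, rho=0.5, sigma=1e-4, tol=1e-8, max_iter=1000):
    """Sketch of a derivative-free CG projection method for F(x) = 0
    over a convex set C (proj_C is the Euclidean projection onto C).
    The FR-type beta is a generic stand-in for the paper's direction."""
    Fx = F(x)
    d = -Fx.copy()
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        t = 1.0
        for _ in range(30):                   # derivative-free linesearch
            if -F(x + t * d) @ d >= sigma * t * (d @ d):
                break
            t *= rho
        z = x + t * d                         # trial point
        Fz = F(z)
        # hyperplane projection step, then projection onto C
        x = proj_C(x - (Fz @ (x - z)) / (Fz @ Fz) * Fz)
        F_new = F(x)
        d = -F_new + (F_new @ F_new) / (Fx @ Fx) * d
        Fx = F_new
    return x
```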

A Unified Framework for Sparse Relaxed Regularized Regression: SR3

Regularized regression problems are ubiquitous in statistical modeling, signal processing, and machine learning. Sparse regression in particular has been instrumental in scientific model discovery, including compressed sensing applications, variable selection, and high-dimensional analysis. We propose a broad framework for sparse relaxed regularized regression, called SR3. The key idea is to solve a relaxation of …
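
For the $\ell_1$-regularized case, the SR3 relaxation $\min_{x,w} \frac{1}{2}\|Ax-b\|^2 + \lambda \|w\|_1 + \frac{\kappa}{2}\|x-w\|^2$ can be solved by alternating an x-update (a ridge-like linear solve) with a w-update (soft thresholding). A minimal sketch, with illustrative parameter values:

```python
import numpy as np

def sr3_lasso(A, b, lam=0.1, kappa=1.0, n_iter=200):
    """Minimal SR3 sketch for the l1-regularized case: alternate an
    x-update (ridge-like least squares) with a w-update (soft
    thresholding) on the relaxation
    min_{x,w} 0.5*||Ax-b||^2 + lam*||w||_1 + (kappa/2)*||x-w||^2."""
    n = A.shape[1]
    w = np.zeros(n)
    H = A.T @ A + kappa * np.eye(n)          # fixed x-update system matrix
    Atb = A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(H, Atb + kappa * w)                    # x-update
        w = np.sign(x) * np.maximum(np.abs(x) - lam / kappa, 0.0)  # prox step
    return x, w
```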