On the convergence of an inexact Gauss-Newton trust-region method for nonlinear least-squares problems with simple bounds

We introduce an inexact Gauss-Newton trust-region method for solving bound-constrained nonlinear least-squares problems where, at each iteration, a trust-region subproblem is approximately solved by the Conjugate Gradient method. Provided the accuracy to which the subproblems are solved is suitably controlled, we prove that the method is globally convergent and has fast asymptotic convergence. … Read more
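
For orientation, a generic form of the subproblem (not necessarily the authors' exact formulation, which also accounts for the bound constraints) is
\[
\min_{d \in \mathbb{R}^n} \ \tfrac{1}{2}\,\| F(x_k) + J(x_k)\, d \|_2^2 \quad \text{s.t.} \quad \| d \| \le \Delta_k,
\]
approximately solved by the Conjugate Gradient method, with the iteration stopped once the residual of the associated normal equations satisfies a condition such as $\| J_k^\top (J_k d + F_k) \| \le \eta_k \| J_k^\top F_k \|$; driving the forcing sequence $\eta_k$ to zero is what typically yields the fast asymptotic convergence.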

Sensitivity analysis and calibration of the covariance matrix for stable portfolio selection

We recommend an implementation of the Markowitz problem that generates portfolios which are stable with respect to perturbations of the problem parameters. Stability is obtained by proposing novel calibrations of the covariance matrix of the returns, which can be cast as convex or quasiconvex optimization problems. A statistical study as well as a sensitivity analysis of the … Read more
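
As a point of reference (one standard formulation, not necessarily the exact variant used in the paper), the Markowitz problem reads
\[
\min_{x \in \mathbb{R}^n} \ x^{\top} \Sigma\, x \quad \text{s.t.} \quad \mu^{\top} x \ge r, \quad \mathbf{1}^{\top} x = 1, \quad x \ge 0,
\]
where $\Sigma$ is the covariance matrix of the asset returns, $\mu$ the vector of expected returns and $r$ a target return; the calibrations proposed in the paper replace the plain sample estimate of $\Sigma$.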

Bound reduction using pairs of linear inequalities

We describe a procedure to reduce variable bounds in Mixed Integer Nonlinear Programming (MINLP) as well as Mixed Integer Linear Programming (MILP) problems. The procedure works by combining pairs of inequalities of a linear programming (LP) relaxation of the problem. This bound reduction technique extends the implied bounds procedure used in MINLP and MILP and … Read more
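
A toy illustration of the flavor of such reductions (not the paper's procedure): suppose an LP relaxation contains $x + 2y \le 8$ and $x - y \le 1$ with $y \in [0,3]$. Individually these only imply $x \le 8$ and $x \le 4$, but the nonnegative combination of the first inequality with twice the second eliminates $y$ and gives $3x \le 10$, i.e. the tighter bound $x \le 10/3$.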

Some regularity results for the pseudospectral abscissa and pseudospectral radius of a matrix

The $\epsilon$-pseudospectral abscissa $\alpha_\epsilon$ and radius $\rho_\epsilon$ of an $n \times n$ matrix are respectively the maximal real part and the maximal modulus of points in its $\epsilon$-pseudospectrum, defined using the spectral norm. It was proved in [A.S. Lewis and C.H.J. Pang. Variational analysis of pseudospectra. SIAM Journal on Optimization, 19:1048-1072, 2008] that for fixed … Read more
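
For reference, with the spectral norm the standard definitions are
\[
\Lambda_\epsilon(A) = \{\, z \in \mathbb{C} : \sigma_{\min}(zI - A) \le \epsilon \,\}, \qquad
\alpha_\epsilon(A) = \max_{z \in \Lambda_\epsilon(A)} \operatorname{Re} z, \qquad
\rho_\epsilon(A) = \max_{z \in \Lambda_\epsilon(A)} |z|,
\]
so that $\alpha_0$ and $\rho_0$ reduce to the usual spectral abscissa and spectral radius, respectively.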

On Nesterov’s Nonsmooth Chebyshev-Rosenbrock Functions

We discuss two nonsmooth functions on $\mathbb{R}^n$ introduced by Nesterov. We show that the first variant is partly smooth in the sense of [A.S. Lewis. Active sets, nonsmoothness and sensitivity. SIAM Journal on Optimization, 13:702–725, 2003] and that its only stationary point is the global minimizer. In contrast, we show that the second variant has … Read more
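
For orientation only (the precise definitions should be taken from the paper), the two variants are commonly written as
\[
f_1(x) = \tfrac{1}{4}\,|x_1 - 1| + \sum_{i=1}^{n-1} \bigl| x_{i+1} - 2x_i^2 + 1 \bigr|, \qquad
f_2(x) = \tfrac{1}{4}\,|x_1 - 1| + \sum_{i=1}^{n-1} \bigl| x_{i+1} - 2|x_i| + 1 \bigr|,
\]
the second replacing the smooth inner term $2x_i^2$ by the nonsmooth $2|x_i|$.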

Derivative-free Optimization of Expensive Functions with Computational Error Using Weighted Regression

We propose a derivative-free algorithm for optimizing computationally expensive functions with computational error. The algorithm is based on the trust-region regression method of Conn, Scheinberg, and Vicente [4], but uses weighted regression to obtain more accurate model functions at each trust-region iteration. A heuristic weighting scheme is proposed that simultaneously handles i) differing … Read more
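
The core mechanic, fitting a local quadratic model by weighted least squares over a set of sample points, can be sketched as follows. This is a minimal illustration in Python (assuming numpy); the weights are placeholders, not the heuristic scheme proposed in the paper.

import numpy as np

def fit_weighted_quadratic(Y, fvals, w):
    # Y: (m, n) sample points, fvals: (m,) noisy function values, w: (m,) weights.
    m, n = Y.shape
    # Monomial basis of a full quadratic model: 1, x_i, x_i * x_j (i <= j).
    cols = [np.ones(m)]
    cols += [Y[:, i] for i in range(n)]
    cols += [Y[:, i] * Y[:, j] for i in range(n) for j in range(i, n)]
    Phi = np.column_stack(cols)
    # Weighted least squares: minimize sum_k w_k * ((Phi a - fvals)_k)^2.
    sw = np.sqrt(w)
    coeffs, *_ = np.linalg.lstsq(Phi * sw[:, None], fvals * sw, rcond=None)
    return coeffs  # coefficients of the quadratic model in the monomial basis

With uniform weights this reduces to ordinary regression models in the spirit of [4]; the paper's heuristic instead chooses the weights to account, for instance, for the computational error in each sample.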

A compact variant of the QCR method for quadratically constrained quadratic 0-1 programs

Quadratic Convex Reformulation (QCR) is a technique that was originally proposed for quadratic 0-1 programs, and then extended to various other problems. It is used to convert non-convex instances into convex ones, in such a way that the bound obtained by solving the continuous relaxation of the reformulated instance is as strong as possible. In … Read more
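
For a 0-1 quadratic objective $q(x) = x^{\top} Q x + c^{\top} x$, the basic QCR idea (sketched here in its simplest, unconstrained form; the compact variant of the paper also exploits the quadratic constraints) is to add terms that vanish at every binary point,
\[
q_\alpha(x) = x^{\top} Q x + c^{\top} x + \sum_{i=1}^{n} \alpha_i \,(x_i^2 - x_i),
\]
and to choose $\alpha$, typically from the dual of a semidefinite relaxation, so that $q_\alpha$ is convex and the continuous-relaxation bound is as large as possible.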

On the evaluation complexity of composite function minimization with applications to nonconvex nonlinear programming

We estimate the worst-case complexity of minimizing an unconstrained, nonconvex composite objective with a structured nonsmooth term by means of some first-order methods. We find that this complexity is unaffected by the nonsmoothness of the objective, in the sense that a first-order trust-region or quadratic regularization method applied to it takes at most O($\epsilon^{-2}$) function evaluations to reduce the … Read more
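
A generic instance of the setting (stated here for orientation; the precise assumptions are in the paper) is
\[
\min_{x \in \mathbb{R}^n} \ f(x) = g(x) + h(c(x)),
\]
with $g$ and $c$ smooth but possibly nonconvex and $h$ convex but possibly nonsmooth; the O($\epsilon^{-2}$) bound counts the function evaluations needed to drive a suitable first-order criticality measure below $\epsilon$.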

A Note on Superlinear Convergence of a Primal-dual Interior Point Method for Nonlinear Semi-definite Programming

We replace one of the assumptions (the nondegeneracy assumption) in [9] and show that the main results of [9] still hold. We also provide a simple example in which the new assumption is satisfied while the original assumption is not, with the other assumptions holding. This example shows that the new assumption does not … Read more

Approximation Theory of Matrix Rank Minimization and Its Application to Quadratic Equations

Matrix rank minimization problems have recently attracted considerable attention in both the mathematical and engineering communities. This class of problems, arising in a wide range of cross-disciplinary applications, is known to be NP-hard in general. In this paper, we aim to provide an approximation theory for the rank minimization problem, and we prove that a rank minimization … Read more
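
The prototype problem, stated here only to fix ideas, is
\[
\min_{X} \ \operatorname{rank}(X) \quad \text{s.t.} \quad X \in \mathcal{C},
\]
for instance $\mathcal{C} = \{ X : \mathcal{A}(X) = b \}$ with a linear map $\mathcal{A}$; an approximation theory replaces $\operatorname{rank}(X)$ by a tractable surrogate, such as the nuclear norm $\|X\|_*$, and quantifies how well minimizers of the surrogate problem approximate rank minimizers. The specific approximation functions used in the paper and the application to quadratic equations are described in the full text.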