Levenberg-Marquardt methods based on probabilistic gradient models and inexact subproblem solution, with application to data assimilation

The Levenberg-Marquardt algorithm is one of the most popular algorithms for the solution of nonlinear least squares problems. Motivated by the problem structure in data assimilation, we consider in this paper the extension of the classical Levenberg-Marquardt algorithm to scenarios in which the linearized least squares subproblems are solved inexactly and/or the gradient model is … Read more
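
For orientation, here is a minimal sketch of a Levenberg-Marquardt iteration in which the damped normal equations are solved inexactly by truncated conjugate gradients; the acceptance test and damping updates are generic textbook choices, not the specific rules analysed in the paper.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def lm_inexact(r, J, x0, lam=1.0, tol=1e-8, max_iter=50):
    """Levenberg-Marquardt with the linearized subproblem
    (J^T J + lam*I) p = -J^T r solved inexactly by truncated CG.
    r(x) returns the residual vector, J(x) its Jacobian."""
    x = x0.copy()
    for _ in range(max_iter):
        rx, Jx = r(x), J(x)
        g = Jx.T @ rx
        if np.linalg.norm(g) < tol:
            break
        n = x.size
        A = LinearOperator((n, n), matvec=lambda p: Jx.T @ (Jx @ p) + lam * p)
        p, _ = cg(A, -g, maxiter=10)   # truncated CG: an inexact solve
        pred = -(g @ p) - 0.5 * (p @ (Jx.T @ (Jx @ p))) - 0.5 * lam * (p @ p)
        ared = 0.5 * (rx @ rx - r(x + p) @ r(x + p))
        rho = ared / pred if pred > 0 else -1.0
        if rho > 1e-3:                 # sufficient decrease: accept, relax damping
            x, lam = x + p, max(lam / 3.0, 1e-12)
        else:                          # reject the step and increase damping
            lam *= 3.0
    return x
```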

Algebraic rules for quadratic regularization of Newton’s method

In this work we propose a class of quasi-Newton methods to minimize a twice differentiable function with Lipschitz continuous Hessian. These methods are based on the quadratic regularization of Newton’s method, with explicit algebraic rules for computing the regularization parameter. The convergence properties of this class of methods are analysed. We show that if the … Read more
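
A minimal sketch of the underlying scheme, assuming a generic explicit rule sigma = sqrt(L * ||g||) for the regularization parameter (an illustrative choice, not necessarily one of the rules derived in the paper):

```python
import numpy as np

def regularized_newton(grad, hess, x0, L=1.0, tol=1e-8, max_iter=100):
    """Quadratically regularized Newton step x+ = x - (H + sigma*I)^{-1} g,
    where sigma is computed from an explicit algebraic rule. Here we use
    sigma = sqrt(L * ||g||) with L a Lipschitz estimate for the Hessian;
    this rule is illustrative, not the paper's."""
    x = x0.copy()
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        sigma = np.sqrt(L * np.linalg.norm(g))   # explicit regularization rule
        x = x - np.linalg.solve(hess(x) + sigma * np.eye(x.size), g)
    return x
```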

A Primal-Dual Regularized Interior-Point Method for Semidefinite Programming

Interior-point methods in semidefinite programming (SDP) require the solution of a sequence of linear systems which are used to derive the search directions. Safeguards are typically required in order to handle rank-deficient Jacobians and free variables. We generalize the primal-dual regularization of Friedlander and Orban (2012) to SDP and show that it is possible to recover an optimal … Read more
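
To convey the structure being generalized, here is the primal-dual regularization of Friedlander and Orban (2012) in its original quadratic-programming setting (a sketch only; the SDP version in the paper is more involved). The regularized step solves

\[
\begin{bmatrix} H + \rho I & A^{T} \\ A & -\delta I \end{bmatrix}
\begin{bmatrix} \Delta x \\ -\Delta y \end{bmatrix}
= -\begin{bmatrix} r_{d} \\ r_{p} \end{bmatrix},
\]

where $r_{d}$ and $r_{p}$ are the dual and primal residuals. With $\rho, \delta > 0$ and $H \succeq 0$, the coefficient matrix is symmetric quasi-definite, so it admits a stable factorization even when $A$ is rank-deficient or free variables are present.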

Iterative Hard Thresholding Methods for $l_0$ Regularized Convex Cone Programming

In this paper we consider $l_0$ regularized convex cone programming problems. In particular, we first propose an iterative hard thresholding (IHT) method and its variant for solving $l_0$ regularized box constrained convex programming. We show that the sequence generated by these methods converges to a local minimizer. Also, we establish the iteration complexity of the … Read more
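
To illustrate the basic operator, here is a sketch of iterative hard thresholding for the unconstrained special case $\min \; 0.5\|Ax-b\|^2 + \lambda \|x\|_0$ (the paper treats general box-constrained convex programs, which would add a projection step):

```python
import numpy as np

def iht(A, b, lam, max_iter=500):
    """IHT for min 0.5*||Ax - b||^2 + lam*||x||_0: a gradient step with
    step size 1/L followed by hard thresholding, the proximal mapping of
    the scaled l0 penalty (keep z_i only if |z_i| > sqrt(2*lam/L))."""
    L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of the gradient
    thresh = np.sqrt(2.0 * lam / L)
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        z = x - A.T @ (A @ x - b) / L             # gradient step on the smooth part
        x = np.where(np.abs(z) > thresh, z, 0.0)  # hard thresholding
    return x
```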

Bounds on Eigenvalues of Matrices Arising from Interior-Point Methods

Interior-point methods feature prominently among numerical methods for inequality-constrained optimization problems, and require the solution of a sequence of linear systems that typically become increasingly ill-conditioned as the iterations proceed. To solve these systems, whose original form has a nonsymmetric 3×3 block structure, it is common practice to perform block Gaussian elimination and either solve … Read more
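
As one common instance of the structure referred to above (an illustrative formulation, for the problem $\min f(x)$ subject to $c(x) = 0$, $x \ge 0$), the Newton step on the perturbed KKT conditions gives the nonsymmetric $3\times 3$ system

\[
\begin{bmatrix} H & -J^{T} & -I \\ J & 0 & 0 \\ Z & 0 & X \end{bmatrix}
\begin{bmatrix} \Delta x \\ \Delta y \\ \Delta z \end{bmatrix}
= -\begin{bmatrix} \nabla f(x) - J^{T} y - z \\ c(x) \\ XZe - \mu e \end{bmatrix},
\]

where $X = \mathrm{diag}(x)$, $Z = \mathrm{diag}(z)$ and $J$ is the Jacobian of $c$. Eliminating $\Delta z$ yields a symmetric $2\times 2$ block system with $(1,1)$ block $H + X^{-1}Z$; as $\mu \to 0$, entries of $X^{-1}Z$ diverge or vanish depending on the active set, which is the source of the growing ill-conditioning.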

A variable smoothing algorithm for solving convex optimization problems

In this article we propose a method for solving unconstrained optimization problems with convex and Lipschitz continuous objective functions. Making use of the Moreau envelopes of the functions occurring in the objective, we smooth the latter into a convex and differentiable function with Lipschitz continuous gradient, using both variable and constant smoothing parameters. … Read more
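
As a concrete instance of the smoothing device (an illustrative special case, not the paper's general setting), one can replace the $l_1$ term in $\min \; 0.5\|Ax-b\|^2 + \lambda\|x\|_1$ by its Moreau envelope with a shrinking parameter $\mu_k$:

```python
import numpy as np

def prox_l1(x, t):
    """Proximal mapping of t*||.||_1 (soft thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def variable_smoothing(A, b, lam, iters=500):
    """Gradient descent with lam*||x||_1 replaced by its Moreau envelope;
    the envelope's gradient is (x - prox_{mu*lam*|.|_1}(x))/mu, which is
    Lipschitz with constant 1/mu, so the step shrinks as mu_k -> 0."""
    x = np.zeros(A.shape[1])
    LA = np.linalg.norm(A, 2) ** 2
    for k in range(1, iters + 1):
        mu = 1.0 / k                                       # variable smoothing parameter
        g = A.T @ (A @ x - b) + (x - prox_l1(x, mu * lam)) / mu
        x = x - g / (LA + 1.0 / mu)                        # step = 1 / (L_A + 1/mu)
    return x
```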

A regularized simplex method

For a special class of problems, the simplex method can be implemented as a cutting-plane method that approximates a certain convex polyhedral objective function. In this paper we consider a regularized version of this cutting-plane method, and interpret the resulting procedure as a regularized simplex method. (Regularization is performed in the dual space and … Read more
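
For orientation, a generic regularized (proximal) cutting-plane master problem looks as follows (a textbook form, not necessarily the paper's exact construction): given cuts $a_i^{T} x + b_i$, $i \in I_k$, underestimating the polyhedral objective,

\[
x_{k+1} = \operatorname*{arg\,min}_{x} \; \max_{i \in I_k}\,\bigl(a_i^{T} x + b_i\bigr) + \frac{1}{2 t_k}\,\|x - x_k\|^{2},
\]

where the proximal term with parameter $t_k > 0$ keeps the next iterate near the current stability center $x_k$ and damps the oscillation typical of pure cutting-plane steps.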

Addressing rank degeneracy in constraint-reduced interior-point methods for linear optimization

In earlier work (Tits et al., SIAM J. Optim., 17(1):119–146, 2006; Winternitz et al., COAP, doi=10.1007/s10589-010-9389-4, 2011), the present authors and their collaborators proposed primal-dual interior-point (PDIP) algorithms for linear optimization that, at each iteration, use only a subset of the (dual) inequality constraints in constructing the search direction. For problems with many more constraints … Read more
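
A sketch of the core idea in numpy, for an LP in dual form $\max b^{T}y$ s.t. $A^{T}y \le c$ with slacks $s$ and multipliers $z$ (the selection rule below, keeping the $q$ smallest slacks, is one simple choice; the cited papers analyse specific rules and their convergence):

```python
import numpy as np

def reduced_normal_matrix(A, s, z, q):
    """Constraint-reduced normal-equations matrix: instead of
    M = sum_i (z_i/s_i) * a_i a_i^T over all n inequality constraints,
    sum over only the q constraints that appear most nearly active
    (smallest slack), giving an m x m matrix built from q << n terms."""
    Q = np.argsort(s)[:q]   # indices of the q most nearly active constraints
    d = z[Q] / s[Q]         # interior-point scaling for the kept constraints
    return (A[:, Q] * d) @ A[:, Q].T
```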

Simultaneous Pursuit of Out-of-Sample Performance and Sparsity in Index Tracking Portfolios

Index tracking is a passive investment strategy in which an investor purchases a set of assets to mimic a market index. The tracking error, the difference between the performances of the index and the portfolio, may be minimized by buying all the assets contained in the index. However, this strategy results in a considerable amount … Read more
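
A standard way to write the sparse index-tracking problem (one plausible formalization; the paper's exact model may differ): with $R$ the $T \times n$ matrix of asset returns and $r^{\mathrm{idx}}$ the index returns,

\[
\min_{w \in \mathbb{R}^{n}} \; \frac{1}{T}\,\bigl\|R w - r^{\mathrm{idx}}\bigr\|_{2}^{2}
\quad \text{s.t.} \quad \mathbf{1}^{T} w = 1,\;\; w \ge 0,\;\; \|w\|_{0} \le q,
\]

where the cardinality bound $q$ caps the number of assets held, trading tracking error against sparsity.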

Manifold Identification in Dual Averaging for Regularized Stochastic Online Learning

Iterative methods that calculate their steps from approximate subgradient directions have proved to be useful for stochastic learning problems over large and streaming data sets. When the objective consists of a loss function plus a nonsmooth regularization term, the solution often lies on a low-dimensional manifold of parameter space along which the regularizer is smooth. … Read more
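
A minimal sketch of $l_1$-regularized dual averaging in the style of Xiao (2010); the oracle stream_grad and the schedule $\beta_t = \gamma\sqrt{t}$ are assumptions for illustration. Because the update soft-thresholds the averaged gradient, coordinates are set exactly to zero, which is the mechanism behind identifying the low-dimensional manifold:

```python
import numpy as np

def rda_l1(stream_grad, dim, lam, gamma=1.0, T=1000):
    """Regularized dual averaging for min E[loss(x)] + lam*||x||_1.
    Maintains the running average g_bar of stochastic subgradients; the
    proximal step has the closed form below and produces exact zeros."""
    x = np.zeros(dim)
    g_bar = np.zeros(dim)
    for t in range(1, T + 1):
        g_bar += (stream_grad(x, t) - g_bar) / t   # running average of subgradients
        beta = gamma * np.sqrt(t)                  # RDA proximal schedule
        shrunk = np.sign(g_bar) * np.maximum(np.abs(g_bar) - lam, 0.0)
        x = -(t / beta) * shrunk                   # soft-thresholded update
    return x
```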