Split Levenberg-Marquardt Method for Large-Scale Sparse Problems

We consider large-scale nonlinear least squares problems with sparse residuals, each of which depends on a small number of variables. A decoupling procedure that splits the original problem into a sequence of independent problems of smaller size is proposed and analysed. The smaller problems are modified in a way that …
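
As background, the classical building block of any Levenberg-Marquardt scheme is the damped Gauss-Newton step. The sketch below shows one such step for a sparse Jacobian; it illustrates only the standard method, not the splitting procedure of the paper, and all names are ours.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def lm_step(jac, res, mu):
    """One Levenberg-Marquardt step: solve (J^T J + mu I) p = -J^T r."""
    n = jac.shape[1]
    normal = (jac.T @ jac + mu * sp.identity(n)).tocsc()
    return spla.spsolve(normal, -(jac.T @ res))

# Toy usage on a linear residual r(x) = A x - b with a sparse A.
rng = np.random.default_rng(0)
A = sp.random(50, 10, density=0.2, random_state=0, format="csr")
b = rng.standard_normal(50)
x = np.zeros(10)
for _ in range(20):
    x = x + lm_step(A, A @ x - b, mu=1e-3)
```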

A Hessian inversion-free exact second order method for distributed consensus optimization

We consider a standard distributed consensus optimization problem in which a set of agents connected over an undirected network minimize the sum of their individual (local) strongly convex costs. The Alternating Direction Method of Multipliers (ADMM) and the Proximal Method of Multipliers (PMM) have proved to be effective frameworks for the design of exact distributed second-order methods …
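
For reference, the consensus problem and the constrained reformulation on which ADMM/PMM-type methods operate can be written as follows (notation is ours, not necessarily the paper's):

```latex
\min_{x \in \mathbb{R}^d} \; \sum_{i=1}^{n} f_i(x)
\quad \Longleftrightarrow \quad
\min_{x_1, \dots, x_n \in \mathbb{R}^d} \; \sum_{i=1}^{n} f_i(x_i)
\quad \text{s.t.} \quad x_i = x_j \ \text{ for all edges } (i,j).
```

The reformulation is exact whenever the network is connected; the edge constraints are what multiplier methods such as ADMM and PMM dualize, and, as the title suggests, the point of the proposed method is to carry out the resulting second-order updates without explicitly inverting any Hessian.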

Spectral Projected Subgradient Method for Nonsmooth Convex Optimization Problems

We consider constrained optimization problems with a nonsmooth objective function given in the form of a mathematical expectation. Sample Average Approximation (SAA) is used to estimate the objective function, and a variable sample size strategy is employed. The proposed algorithm combines an SAA subgradient with a spectral coefficient in order to provide a suitable direction which improves …
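
The flavor of such an iteration can be sketched as follows; this is a minimal illustration under our own assumptions (a box feasible set, a clipped spectral coefficient), not the paper's algorithm.

```python
import numpy as np

def spectral_projected_subgradient(subgrad_saa, project, x0, iters=100):
    """Illustrative spectral projected subgradient loop.

    subgrad_saa(x, k) -- SAA subgradient at x from the sample used at iteration k
    project(x)        -- Euclidean projection onto the feasible set
    """
    x = x0.copy()
    g = subgrad_saa(x, 0)
    gamma = 1.0                                   # initial spectral coefficient
    for k in range(1, iters + 1):
        x_new = project(x - gamma * g)            # projected subgradient step
        g_new = subgrad_saa(x_new, k)
        s, y = x_new - x, g_new - g
        sty = s @ y
        # Barzilai-Borwein-like spectral coefficient with safeguards
        gamma = np.clip((s @ s) / sty, 1e-8, 1e8) if sty > 0 else 1.0
        x, g = x_new, g_new
    return x

# Toy usage: minimize E[ ||x - xi||_1 ] over the box [-1, 1]^5,
# with a sample size that grows with the iteration counter.
rng = np.random.default_rng(1)
xi = rng.standard_normal((2000, 5))
sub = lambda x, k: np.mean(np.sign(x - xi[: 10 * (k + 1)]), axis=0)
proj = lambda x: np.clip(x, -1.0, 1.0)
x_star = spectral_projected_subgradient(sub, proj, np.ones(5))
```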

A harmonic framework for stepsize selection in gradient methods

We study the use of inverse harmonic Rayleigh quotients with a target for stepsize selection in gradient methods for nonlinear unconstrained optimization problems. This provides not only an elegant and flexible framework for parametrizing and reinterpreting existing stepsize schemes, but also inspiration for new flexible and tunable families of steplengths. In particular, we analyze …
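
As background (in our notation, with s_k = x_k - x_{k-1} and y_k = g_k - g_{k-1}): on a quadratic with Hessian A, so that y_k = A s_k, the two classical Barzilai-Borwein steplengths are the inverses of a Rayleigh quotient and of a harmonic Rayleigh quotient of A at s_k, respectively:

```latex
\alpha_k^{\mathrm{BB1}}
  = \frac{s_k^{\top} s_k}{s_k^{\top} y_k}
  = \left( \frac{s_k^{\top} A\, s_k}{s_k^{\top} s_k} \right)^{-1},
\qquad
\alpha_k^{\mathrm{BB2}}
  = \frac{s_k^{\top} y_k}{y_k^{\top} y_k}
  = \left( \frac{s_k^{\top} A^{2} s_k}{s_k^{\top} A\, s_k} \right)^{-1}.
```

Introducing a target shifts the harmonic quotient and, as the abstract indicates, yields a tunable family of steplengths containing these two as special cases.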

A stochastic first-order trust-region method with inexact restoration for finite-sum minimization

We propose a stochastic first-order trust-region method with inexact function and gradient evaluations for solving finite-sum minimization problems. At each iteration, the function and the gradient are approximated by sampling. The sample size in the gradient approximation is smaller than the sample size in the function approximation, and the latter is determined using a deterministic rule inspired …
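
A toy version of such an iteration is sketched below; the sampling rule, model, and acceptance test are simplified stand-ins of our own, not the rule the paper derives.

```python
import numpy as np

def stochastic_tr(f_batch, g_batch, x0, n_data, iters=100, delta=1.0):
    """Toy subsampled first-order trust-region loop (illustrative only).

    f_batch(x, idx), g_batch(x, idx) -- loss / gradient averaged over indices idx.
    The gradient sample is kept smaller than the function sample, as in the abstract.
    """
    rng = np.random.default_rng(0)
    x = x0.copy()
    for k in range(iters):
        nf = min(n_data, 20 * (k + 1))           # function sample: larger, growing
        ng = max(1, nf // 4)                     # gradient sample: smaller
        g = g_batch(x, rng.choice(n_data, ng, replace=False))
        step = -delta * g / max(np.linalg.norm(g), 1e-12)   # Cauchy-like step
        pred = delta * np.linalg.norm(g)         # decrease predicted by linear model
        idx_f = rng.choice(n_data, nf, replace=False)
        ared = f_batch(x, idx_f) - f_batch(x + step, idx_f)
        if ared >= 0.1 * pred:                   # accept the step, enlarge radius
            x, delta = x + step, min(2.0 * delta, 10.0)
        else:                                    # reject the step, shrink radius
            delta *= 0.5
    return x
```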

LSOS: Line-Search Second-Order Stochastic Optimization Methods for Nonconvex Finite Sums

We develop a line-search second-order algorithmic framework for minimizing finite sums. We make no convexity assumptions, but require the terms of the sum to be continuously differentiable and to have Lipschitz-continuous gradients. The methods fitting into this framework combine line searches and suitably decaying step lengths. A key issue is a two-step sampling at …
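
The skeleton of one such iteration might look as follows; the interface, the regularization, and the step cap are our own illustrative choices and omit the framework's actual sampling and safeguard rules.

```python
import numpy as np

def lsos_sketch(f_s, g_s, hess_s, x0, iters=50):
    """Hypothetical line-search second-order stochastic iteration.

    f_s(x, k), g_s(x, k), hess_s(x, k) evaluate a sampled objective,
    gradient, and Hessian; the sampling itself is left to the caller.
    """
    x = x0.copy()
    for k in range(1, iters + 1):
        g = g_s(x, k)
        # regularized Newton-type direction on the sampled problem
        d = np.linalg.solve(hess_s(x, k) + 1e-6 * np.eye(x.size), -g)
        t, f0 = 1.0, f_s(x, k)
        for _ in range(30):                      # bounded Armijo backtracking
            if f_s(x + t * d, k) <= f0 + 1e-4 * t * (g @ d):
                break
            t *= 0.5
        x = x + min(t, 1.0 / k) * d              # suitably decaying step length
    return x
```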

An inexact restoration-nonsmooth algorithm with variable accuracy for stochastic nonsmooth convex optimization problems in machine learning and stochastic linear complementarity problems

We study unconstrained optimization problems with a nonsmooth and convex objective function given in the form of a mathematical expectation. The proposed method approximates the expected objective function with a sample average function whose sample size is adapted from iteration to iteration by an Inexact Restoration scheme. The algorithm uses line search …
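
The variable-accuracy idea can be caricatured as follows; the test, constants, and names are hypothetical, serving only to show when a sample would grow.

```python
def next_sample_size(decrease, eps_N, N, N_max, c=1.0):
    """Schematic Inexact-Restoration-style sample update (hypothetical test):
    enlarge the sample, i.e. tighten the accuracy eps_N of the sample-average
    objective, only when the achieved decrease no longer dominates that error."""
    if decrease < c * eps_N and N < N_max:
        return min(2 * N, N_max)      # progress stalled: demand higher accuracy
    return N                          # current accuracy is still sufficient
```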

EFIX: Exact Fixed Point Methods for Distributed Optimization

We consider strongly convex distributed consensus optimization over connected networks. EFIX, the proposed method, is derived using a quadratic penalty approach. In more detail, we use the standard reformulation, transforming the original problem into a constrained problem in a higher-dimensional space, to define a sequence of suitable quadratic penalty subproblems with increasing penalty …
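
One standard way to write the resulting penalty subproblems (our notation; the paper's weighting may differ) uses the Laplacian L of the communication graph:

```latex
\min_{x = (x_1, \dots, x_n)} \; \Phi_{\beta}(x)
  \;=\; \sum_{i=1}^{n} f_i(x_i)
  \;+\; \frac{\beta}{2}\, x^{\top} \left( L \otimes I_d \right) x .
```

On a connected network the penalty term vanishes exactly when all local copies x_i agree, so driving the penalty parameter beta upward along the sequence of subproblems enforces consensus.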

Economic inexact restoration for derivative-free expensive function minimization and applications

The Inexact Restoration approach has proved to be an adequate tool for handling the problem of minimizing an expensive function within an arbitrary feasible set by using different degrees of precision in the objective function. The Inexact Restoration framework allows one to obtain suitable convergence and complexity results for an approach that rationally combines low- …
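
A caricature of the low-to-high precision management is sketched below; the interface f_at_precision, the decrease test, and the schedule are all hypothetical.

```python
def economic_minimize(f_at_precision, step, x0, precisions=(1e-1, 1e-2, 1e-4)):
    """Schematic low-to-high precision loop (hypothetical interface):
    f_at_precision(x, eps) returns the expensive objective within accuracy eps,
    step(x, eps) is any inner solver move computed at that accuracy."""
    x = x0
    for eps in precisions:                        # coarse -> fine evaluations
        for _ in range(100):                      # budget per precision level
            x_new = step(x, eps)
            # accept only decrease that is meaningful at the current accuracy
            if f_at_precision(x_new, eps) < f_at_precision(x, eps) - 2 * eps:
                x = x_new
            else:
                break                             # switch to a tighter precision
    return x
```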