Accelerated first-order methods for large-scale convex minimization

This paper discusses several (sub)gradient methods attaining the optimal complexity for smooth problems with Lipschitz continuous gradients, nonsmooth problems with bounded variation of subgradients, and weakly smooth problems with H\"older continuous gradients. The proposed schemes are optimal for smooth strongly convex problems with Lipschitz continuous gradients and optimal up to a logarithmic factor for nonsmooth problems … Read more
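As a point of reference (not one of the paper's schemes), the following minimal sketch of Nesterov-style acceleration illustrates the optimal $O(\eps^{-1/2})$ iteration bound for smooth problems with Lipschitz continuous gradients; the least-squares instance and the fixed step $1/L$ are illustrative assumptions.

```python
import numpy as np

def accelerated_gradient(grad, x0, L, iters):
    """Nesterov-style accelerated gradient for smooth convex f with
    L-Lipschitz gradient; attains the optimal O(1/k^2) rate, i.e.
    O(eps^{-1/2}) iterations to reach accuracy eps."""
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(iters):
        x_next = y - grad(y) / L                          # gradient step at the extrapolated point
        t_next = (1 + np.sqrt(1 + 4 * t**2)) / 2
        y = x_next + ((t - 1) / t_next) * (x_next - x)    # momentum extrapolation
        x, t = x_next, t_next
    return x

# Illustrative instance: f(x) = 0.5*||A x - b||^2 with L = ||A||_2^2
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
x_opt = accelerated_gradient(lambda x: A.T @ (A @ x - b),
                             np.zeros(2), np.linalg.norm(A, 2) ** 2, 200)
```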

Solving nonsmooth convex optimization with complexity $O(\eps^{-1/2})$

This paper describes an algorithm for solving structured nonsmooth convex optimization problems using OSGA, a first-order method with complexity $O(\eps^{-2})$ for Lipschitz continuous nonsmooth problems and $O(\eps^{-1/2})$ for smooth problems with Lipschitz continuous gradient. If the nonsmoothness of the problem is manifested in a structured way, we reformulate the problem in a form that … Read more
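The reformulation itself is truncated above; as one standard way structured nonsmoothness can be exploited (an assumption here, not necessarily the paper's approach), the sketch below replaces an $\ell_1$ data term by its Huber smoothing, whose gradient is Lipschitz with constant $1/\mu$, so that a smooth $O(\eps^{-1/2})$ method becomes applicable.

```python
import numpy as np

def huber_grad(r, mu):
    """Gradient of the Huber smoothing of |.|: quadratic for |r| <= mu,
    linear outside; the smoothed function has a (1/mu)-Lipschitz gradient."""
    return np.where(np.abs(r) <= mu, r / mu, np.sign(r))

def smoothed_l1_grad(A, b, mu):
    """Gradient of the smooth surrogate of f(x) = ||A x - b||_1 obtained by
    componentwise Huber smoothing (illustrative model problem)."""
    return lambda x: A.T @ huber_grad(A @ x - b, mu)
```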

An optimal subgradient algorithm with subspace search for costly convex optimization problems

This paper presents an acceleration of the optimal subgradient algorithm OSGA \cite{NeuO} for solving convex optimization problems, where the objective function involves costly affine and cheap nonlinear terms. We combine OSGA with a multidimensional subspace search technique, which leads to a low-dimensional problem that can be solved efficiently. Numerical results concerning some applications are reported. A … Read more
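A minimal sketch of the subspace-search idea, under assumptions: the model problem $f(x) = \tfrac12\|Ax-b\|^2 + \lambda\|x\|_1$ and the reduced solver (Nelder-Mead) are illustrative choices, not the paper's. The point is that the costly products with $A$ are formed once per subspace step, so the reduced problem in the subspace coefficients is low-dimensional and cheap.

```python
import numpy as np
from scipy.optimize import minimize

def subspace_step(A, b, lam, x, D):
    """One multidimensional subspace search for the hypothetical model
    f(x) = 0.5*||A x - b||^2 + lam*||x||_1. Columns of D span the search
    subspace; A @ D is computed once, so evaluating the reduced objective
    requires no further costly products with A."""
    AD = A @ D            # costly affine products, done once per subspace step
    Ax = A @ x
    def reduced(alpha):   # cheap low-dimensional objective in alpha
        z = x + D @ alpha
        return 0.5 * np.sum((Ax + AD @ alpha - b) ** 2) + lam * np.sum(np.abs(z))
    res = minimize(reduced, np.zeros(D.shape[1]), method="Nelder-Mead")
    return x + D @ res.x
```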

An optimal subgradient algorithm for large-scale bound-constrained convex optimization

This paper shows that the OSGA algorithm — which uses first-order information to solve convex optimization problems with optimal complexity — can be used to efficiently solve arbitrary bound-constrained convex optimization problems. This is done by constructing an explicit method as well as an inexact scheme for solving the bound-constrained rational subproblem required by OSGA. … Read more
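For orientation, the basic operation behind bound-constrained first-order methods is the closed-form projection onto a box. The projected-gradient loop below is a generic baseline, not OSGA or its rational-subproblem solver; the fixed step size is an illustrative assumption.

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi] (componentwise clipping)."""
    return np.minimum(np.maximum(x, lo), hi)

def projected_gradient(grad, x0, lo, hi, step, iters):
    """A minimal projected-gradient sketch for min f(x) s.t. lo <= x <= hi
    (generic baseline; OSGA's bound-constrained scheme differs)."""
    x = project_box(np.asarray(x0, dtype=float), lo, hi)
    for _ in range(iters):
        x = project_box(x - step * grad(x), lo, hi)
    return x
```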

An optimal subgradient algorithm for large-scale convex optimization in simple domains

This paper shows that the optimal subgradient algorithm, OSGA, proposed in \cite{NeuO} can be used for solving structured large-scale convex constrained optimization problems. Only first-order information is required, and the optimal complexity bounds for both smooth and nonsmooth problems are attained. More specifically, we consider two classes of problems: (i) a convex objective with a … Read more
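The two problem classes are truncated above; "simple domains" in this literature typically means sets with cheap exact projections. As hedged, generic examples (not taken from the paper), the sketch below gives closed-form projections onto a Euclidean ball and onto the probability simplex.

```python
import numpy as np

def project_ball(x, radius=1.0):
    """Euclidean projection onto {x : ||x|| <= radius}."""
    nrm = np.linalg.norm(x)
    return x if nrm <= radius else (radius / nrm) * x

def project_simplex(v):
    """Euclidean projection onto {x >= 0, sum(x) = 1} via the standard
    sorting-based algorithm."""
    u = np.sort(v)[::-1]                 # sort descending
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1))[0][-1]
    theta = (css[rho] - 1) / (rho + 1)
    return np.maximum(v - theta, 0.0)
```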

On efficiency of nonmonotone Armijo-type line searches

Monotonicity and nonmonotonicity play a key role in the global convergence and computational efficiency of iterative schemes in nonlinear optimization. This paper addresses some features of descent schemes and the motivation behind nonmonotone strategies, and investigates the efficiency of an Armijo-type … Read more
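A minimal sketch of a classical nonmonotone Armijo test in the style of Grippo, Lampariello, and Lucidi: the trial step is accepted against the maximum of the last M objective values rather than the current one (M = 1 recovers the monotone rule). The particular rule studied in the paper may differ; parameter values below are conventional defaults.

```python
import numpy as np

def nonmonotone_armijo(f, grad_fx, x, d, f_hist, sigma=1e-4, beta=0.5, M=10):
    """Backtracking with a nonmonotone Armijo condition: accept step t if
    f(x + t d) <= max(last M values of f) + sigma * t * grad(x)^T d.
    d must be a descent direction (grad_fx @ d < 0); f_hist holds previous
    objective values, most recent last."""
    ref = max(f_hist[-M:])           # nonmonotone reference value
    slope = float(grad_fx @ d)       # directional derivative, negative by assumption
    t = 1.0
    while f(x + t * d) > ref + sigma * t * slope:
        t *= beta                    # backtrack
    return t
```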

Optimal subgradient algorithms with application to large-scale linear inverse problems

This study addresses algorithms for solving structured unconstrained convex optimization problems using only first-order information, where the underlying function involves high-dimensional data. The primary aim is to develop an implementable algorithmic framework for solving problems with multi-term composite objective functions involving linear mappings using the optimal subgradient algorithm, OSGA, proposed by {\sc Neumaier} in \cite{NeuO}. … Read more
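A common instance of a composite objective with a linear mapping in linear inverse problems is $\tfrac12\|Ax-b\|^2 + \lambda\|x\|_1$. The proximal-gradient (ISTA) sketch below is a generic baseline for this instance, included only to make the problem class concrete; it is not OSGA's framework.

```python
import numpy as np

def ista(A, b, lam, iters=500):
    """Proximal-gradient (ISTA) sketch for the linear inverse problem
    min 0.5*||A x - b||^2 + lam*||x||_1, a typical multi-term composite
    objective with a linear mapping (generic baseline, not OSGA)."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = A.T @ (A @ x - b)            # gradient of the smooth term
        z = x - g / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-thresholding
    return x
```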