On proximal point-type algorithms for weakly convex functions and their connection to the backward Euler method

In this article we study the connection between proximal point methods for nonconvex optimization and the backward Euler method from the numerical solution of ordinary differential equations (ODEs). We establish conditions under which these methods are equivalent. In the case of weakly convex functions, for small enough parameters, the implicit steps can be solved using a strongly convex … Read more
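As a quick illustration of the connection (a sketch for a differentiable objective $f$; the paper treats a more general setting), the proximal point step is
$$ x_{k+1} = \operatorname{prox}_{\lambda f}(x_k) := \arg\min_{y}\Big\{ f(y) + \tfrac{1}{2\lambda}\|y - x_k\|^2 \Big\}, $$
whose first-order optimality condition reads $x_{k+1} = x_k - \lambda \nabla f(x_{k+1})$, i.e., exactly one backward Euler step of size $\lambda$ for the gradient flow $\dot{x} = -\nabla f(x)$. If $f$ is $\rho$-weakly convex (meaning $f + \tfrac{\rho}{2}\|\cdot\|^2$ is convex), the subproblem above is strongly convex whenever $\lambda < 1/\rho$, so the implicit step is uniquely solvable.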

Gradient Sampling Methods for Nonsmooth Optimization

This paper reviews the gradient sampling methodology for solving nonsmooth, nonconvex optimization problems. An intuitively straightforward gradient sampling algorithm is stated and its convergence properties are summarized. Throughout this discussion, we emphasize the simplicity of gradient sampling as an extension of the steepest descent method for minimizing smooth objectives. We then provide overviews of various … Read more
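As a rough sketch of a single gradient-sampling step, a minimal Python illustration is given below; the objective `f` and gradient oracle `grad_f`, the sampling radius `eps`, the sample count `m`, and the SLSQP-based minimum-norm solve are all illustrative choices, not the implementation reviewed in the paper.

```python
import numpy as np
from scipy.optimize import minimize

def gradient_sampling_step(f, grad_f, x, eps=1e-1, m=20, rng=None):
    """One simplified gradient-sampling step from the current point x."""
    rng = np.random.default_rng() if rng is None else rng
    n = x.size

    # Gradients at x and at m points sampled uniformly from the eps-ball around x.
    def sample_in_ball():
        u = rng.standard_normal(n)
        u /= np.linalg.norm(u)
        return x + eps * rng.random() ** (1.0 / n) * u
    pts = [x] + [sample_in_ball() for _ in range(m)]
    G = np.array([grad_f(p) for p in pts])           # shape (m + 1, n)

    # Minimum-norm element of conv{sampled gradients}:
    #   min ||G^T lam||^2   s.t.   lam >= 0,  sum(lam) = 1.
    def sq_norm(lam):
        v = G.T @ lam
        return v @ v
    lam0 = np.full(len(pts), 1.0 / len(pts))
    res = minimize(sq_norm, lam0, method="SLSQP",
                   bounds=[(0.0, 1.0)] * len(pts),
                   constraints={"type": "eq", "fun": lambda lam: lam.sum() - 1.0})
    g = G.T @ res.x                                   # -g approximates the steepest-descent direction

    # Armijo-type backtracking line search along -g.
    t = 1.0
    while f(x - t * g) > f(x) - 1e-4 * t * (g @ g) and t > 1e-12:
        t *= 0.5
    return x - t * g, np.linalg.norm(g)
```

The full methodology additionally shrinks the sampling radius when the minimum-norm gradient is small and handles points where $f$ is not differentiable; both ingredients are omitted in this sketch.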

Golden Ratio Algorithms for Variational Inequalities

The paper presents a fully explicit algorithm for monotone variational inequalities. The method uses variable stepsizes that are computed from two previous iterates as an approximation of the local Lipschitz constant, without running a line search. Thus, each iteration of the method requires only one evaluation of the monotone operator $F$ and one proximal step with respect to $g$. … Read more
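For orientation, the fixed-stepsize version of the golden ratio scheme can be sketched as follows, with $\varphi = (1+\sqrt{5})/2$ (the adaptive choice of $\lambda_k$ from two previous iterates is the paper's contribution and is not reproduced here):
$$ \bar{x}_k = \frac{(\varphi - 1)\,x_k + \bar{x}_{k-1}}{\varphi}, \qquad x_{k+1} = \operatorname{prox}_{\lambda g}\big(\bar{x}_k - \lambda F(x_k)\big), $$
which makes explicit that each iteration uses a single evaluation of $F$ and a single proximal step.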

Lectures on Parametric Optimization: An Introduction

The report aims to provide an overview of results from Parametric Optimization that could be called classical results on the subject. Parametric Optimization considers optimization problems that depend on a parameter and describes how the feasible set, the value function, and the local or global minimizers of the program depend on changes in the parameter. After … Read more
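To fix notation, a generic parametric program and the objects tracked as the parameter $t$ varies can be written as (a standard template, not the report's specific setting)
$$ P(t):\ \min_{x}\ f(x,t)\ \ \text{s.t.}\ \ x \in M(t), \qquad v(t) = \inf_{x \in M(t)} f(x,t), \qquad S(t) = \{x \in M(t) : f(x,t) = v(t)\}, $$
with $M(t)$ the feasible set, $v$ the value function, and $S(t)$ the set of global minimizers.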

An algorithm to compute the Hoffman constant of a system of linear constraints

We propose a combinatorial algorithm to compute the Hoffman constant of a system of linear equations and inequalities. The algorithm is based on a characterization of the Hoffman constant as the largest of a finite canonical collection of easy-to-compute Hoffman constants. Our algorithm and characterization extend to the more general context where some of the … Read more
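For reference, in the inequality-only case the Hoffman constant $H(A)$ is the smallest constant such that (a standard statement; the paper also covers equations and a more general context)
$$ \operatorname{dist}\big(x,\{u : Au \le b\}\big) \le H(A)\,\big\|(Ax - b)_+\big\| \quad \text{for all } x \in \mathbb{R}^n \text{ and all } b \text{ with } \{u : Au \le b\} \ne \emptyset, $$
where $(\cdot)_+$ denotes the componentwise positive part.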

Fast Multilevel Algorithms for Compressive Principal Component Pursuit

Recovering a low-rank matrix from highly corrupted measurements arises in compressed sensing of structured high-dimensional signals (e.g., videos and hyperspectral images). Robust principal component analysis (RPCA), solved via principal component pursuit (PCP), recovers a low-rank matrix from sparse corruptions of unknown value and support by decomposing the observation matrix into two … Read more
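For context, the standard PCP formulation decomposes the observed matrix $M$ into a low-rank part $L$ and a sparse part $S$ by solving
$$ \min_{L,S}\ \|L\|_* + \lambda\,\|S\|_1 \quad \text{s.t.}\quad L + S = M, $$
where $\|\cdot\|_*$ is the nuclear norm, $\|\cdot\|_1$ the entrywise $\ell_1$ norm, and $\lambda > 0$ a balancing parameter; in the compressive variant the equality constraint is replaced by agreement with (possibly subsampled) linear measurements of $M$ (a paraphrase of the setting, not the paper's exact formulation).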

Primal-Dual Interior-Point Methods for Domain-Driven Formulations: Algorithms

We study infeasible-start primal-dual interior-point methods for convex optimization problems given in a typically natural form that we denote as the Domain-Driven formulation. Our algorithms extend many advantages of primal-dual interior-point techniques available for conic formulations, such as the current best complexity bounds and more robust certificates of approximate optimality, unboundedness, and infeasibility, to Domain-Driven formulations. The … Read more
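Roughly, a Domain-Driven problem can be written as (a paraphrase to fix ideas; see the paper for the precise assumptions on $D$)
$$ \inf_{x}\ \langle c, x\rangle \quad \text{s.t.}\quad Ax \in D, $$
where $D$ is a closed convex set accessed through a self-concordant barrier, which covers conic formulations as a special case.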

Complexity of gradient descent for multiobjective optimization

A number of first-order methods have been proposed for smooth multiobjective optimization, for which some form of convergence to first-order criticality has been proved. Such convergence is global in the sense of being independent of the starting point. In this paper we analyze the rate of convergence of gradient descent for smooth unconstrained multiobjective … Read more
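One standard way to define the gradient-descent (steepest-descent) direction for objectives $f_1,\dots,f_m$ at a point $x$, shown here only to fix ideas, is
$$ d(x) = \arg\min_{d}\ \max_{1\le i\le m}\ \nabla f_i(x)^{\top} d + \tfrac{1}{2}\|d\|^2, $$
and $x$ is first-order (Pareto) critical exactly when $d(x) = 0$; rate results are then stated in terms of how fast a criticality measure based on $d(x_k)$ decays along the iterates.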

New inertial factors of a splitting method for monotone inclusions

In this article, we consider monotone inclusions of two operators in real Hilbert spaces, one of which is further assumed to be Lipschitz continuous, and we suggest adding an inertial term to a splitting method at each iteration. The associated weak convergence is analyzed under standard assumptions. The steplength is chosen self-adaptively via … Read more
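As a generic illustration of what the inertial term looks like, consider the inclusion $0 \in A(x) + B(x)$ with $A$ maximally monotone and $B$ monotone and Lipschitz continuous, and the forward-backward-type template (a sketch only; the paper's actual splitting step and its admissible inertial factors $\alpha_k$ differ from this template)
$$ y_k = x_k + \alpha_k (x_k - x_{k-1}), \qquad x_{k+1} = J_{\lambda_k A}\big(y_k - \lambda_k B(y_k)\big), $$
where $J_{\lambda A} = (\mathrm{Id} + \lambda A)^{-1}$ is the resolvent of $A$ and $\alpha_k \ge 0$ is the inertial factor.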

Block Coordinate Proximal Gradient Method for Nonconvex Optimization Problems: Convergence Analysis

We propose a block coordinate proximal gradient method for a composite minimization problem whose objective has two nonconvex components, only one of which is assumed to be differentiable. Under per-block Lipschitz-like conditions based on the Bregman distance, but without global Lipschitz continuity of the gradient of the differentiable function, we prove … Read more
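A generic per-block update of this type, written here only as an illustration (the paper's exact scheme, block order, and stepsize rules may differ), reads
$$ x_i^{k+1} \in \arg\min_{x_i}\ \big\langle \nabla_{x_i} f(x^k),\, x_i - x_i^k \big\rangle + g_i(x_i) + \frac{1}{\lambda_i}\, D_{h_i}\!\big(x_i, x_i^k\big), $$
where $f$ is the differentiable component, $g_i$ the nonsmooth component associated with block $i$, and $D_{h_i}(u,v) = h_i(u) - h_i(v) - \langle \nabla h_i(v), u - v\rangle$ is the Bregman distance generated by a reference function $h_i$; the per-block Lipschitz-like condition is a descent-type inequality stated relative to $D_{h_i}$ rather than the squared Euclidean norm.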