Convex-Concave Backtracking for Inertial Bregman Proximal Gradient Algorithms in Non-Convex Optimization

Backtracking line-search is an old yet powerful strategy for finding a better step size to be used in proximal gradient algorithms. The main principle is to locally find a simple convex upper bound of the objective function, which in turn controls the step size that is used. In the case of inertial proximal gradient algorithms, the situation …
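
As a rough illustration of the principle (a classical non-inertial sketch, not the paper's Bregman scheme; all function names and the l1-regularized setup below are our own toy assumptions): halve a trial step until the local quadratic upper bound holds, then take the proximal step.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (componentwise shrinkage).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_grad_backtracking(f, grad_f, x0, lam=0.1, step0=1.0, shrink=0.5,
                           max_iter=100, tol=1e-8):
    # Proximal gradient for min f(x) + lam*||x||_1: the step is halved
    # until the local quadratic (convex) upper bound of f holds at the
    # candidate point, which controls the step size that is used.
    x = x0.copy()
    step = step0
    for _ in range(max_iter):
        g = grad_f(x)
        while True:
            x_new = soft_threshold(x - step * g, step * lam)
            d = x_new - x
            # Upper-bound check: f(x_new) <= f(x) + <g, d> + ||d||^2/(2*step)
            if f(x_new) <= f(x) + g @ d + (d @ d) / (2 * step):
                break
            step *= shrink
        if np.linalg.norm(d) < tol:
            return x_new
        x = x_new
    return x

# Toy lasso-type instance: min 0.5*||A x - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 10)), rng.standard_normal(20)
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_f = lambda x: A.T @ (A @ x - b)
x_hat = prox_grad_backtracking(f, grad_f, np.zeros(10))
```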

Noisy Euclidean Distance Matrix Completion with a Single Missing Node

We present several solution techniques for the noisy single source localization problem, i.e., the Euclidean distance matrix completion problem with a single missing node to locate under noisy data. For the case where the sensor locations are fixed, we show that this problem is implicitly convex, and we provide a purification algorithm along with the SDP …
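
For intuition about the underlying problem, here is a minimal sketch of noisy single source localization as a plain nonlinear least-squares fit (not the paper's convex SDP-based approach); the anchor positions and noise level below are made-up illustrative data.

```python
import numpy as np
from scipy.optimize import least_squares

# Known sensor (anchor) positions, one per row, and noisy distances
# from each sensor to the single unknown node.
anchors = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0], [4.0, 3.0]])
true_node = np.array([1.5, 2.0])
rng = np.random.default_rng(1)
noisy_d = np.linalg.norm(anchors - true_node, axis=1) + 0.05 * rng.standard_normal(4)

def residuals(x):
    # Residuals of: minimize sum_i (||x - a_i|| - d_i)^2 over the position x.
    return np.linalg.norm(anchors - x, axis=1) - noisy_d

est = least_squares(residuals, x0=anchors.mean(axis=0)).x
print(est)  # close to true_node when the noise is small
```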

A Method for Convex Black-Box Integer Global Optimization

We study the problem of minimizing a convex function on the integer lattice when the function cannot be evaluated at noninteger points. We propose a new underestimator that does not require access to (sub)gradients of the objective but, rather, uses secant linear functions that interpolate the objective function at previously evaluated points. These linear mappings …
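
A minimal one-dimensional illustration of the secant idea (our toy version, not the authors' method): for a convex f, the secant through two evaluated points underestimates f outside the interval between them, which yields usable lower bounds at unevaluated integers.

```python
from itertools import combinations

def secant_lower_bound(evals, z):
    # Lower bound on a convex f at integer z from secants through
    # previously evaluated points; a secant through (x1, f1), (x2, f2)
    # is a valid underestimator only when z lies outside (x1, x2).
    lb = float("-inf")
    for (x1, f1), (x2, f2) in combinations(sorted(evals), 2):
        if not (x1 < z < x2):
            slope = (f2 - f1) / (x2 - x1)
            lb = max(lb, f1 + slope * (z - x1))
    return lb

f = lambda x: (x - 2.6) ** 2                # convex test function
evals = [(x, f(x)) for x in (-2, 0, 5)]
print(secant_lower_bound(evals, 8), f(8))   # bound <= true value
```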

An Augmented Lagrangian algorithm for nonlinear semidefinite programming applied to the covering problem

In this work we present an Augmented Lagrangian algorithm for nonlinear semidefinite programming problems (NLSDPs), which is a natural extension of its well-established counterpart in nonlinear programming. This method works with two levels of constraints: one that is penalized and another that is kept within the subproblems. This is done in order to allow exploiting the …
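
As a sketch of the general augmented Lagrangian mechanism being extended here (a scalar equality-constrained toy problem, not the NLSDP method itself; every name below is illustrative):

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem: min f(x) s.t. c(x) = 0, with the constraint penalized
# in the subproblem. The NLSDP version replaces c(x) = 0 by a
# semidefinite (matrix) constraint and may keep some constraints
# inside the subproblem instead of penalizing them.
f = lambda x: x[0] ** 2 + x[1] ** 2
c = lambda x: x[0] + x[1] - 1.0

x, lam, rho = np.zeros(2), 0.0, 10.0
for _ in range(20):
    # Subproblem: minimize the augmented Lagrangian in x.
    L = lambda x: f(x) + lam * c(x) + 0.5 * rho * c(x) ** 2
    x = minimize(L, x).x
    lam += rho * c(x)            # first-order multiplier update
    if abs(c(x)) < 1e-8:
        break
print(x)  # approaches (0.5, 0.5)
```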

Potential-based analyses of first-order methods for constrained and composite optimization

We propose potential-based analyses for first-order algorithms applied to constrained and composite minimization problems. We first propose “idealized” frameworks for algorithms in the strongly and non-strongly convex cases and argue, based on a potential function, that methods following the framework achieve the best possible rate. Then we show that the geometric descent (GD) algorithm by Bubeck …
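
A common shape of such a potential argument, in our notation rather than necessarily the paper's: a nonincreasing potential that weights the objective gap by a growing coefficient immediately yields a convergence rate,
\[
\Phi_k = A_k\bigl(f(x_k) - f_\star\bigr) + \tfrac{1}{2}\|z_k - x_\star\|^2,
\qquad
\Phi_{k+1} \le \Phi_k \;\Longrightarrow\; f(x_k) - f_\star \le \frac{\Phi_0}{A_k},
\]
so a method admitting such a potential with, e.g., \(A_k \sim k^2\) achieves an \(O(1/k^2)\) rate.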

On Electricity Market Equilibria with Storage: Modeling, Uniqueness, and a Distributed ADMM

We consider spot-market trading of electricity including storage operators as additional agents besides producers and consumers. Storage facilities allow produced electricity to be shifted from one time period to a later one. As a consequence, multiple market equilibria may occur even if classical uniqueness assumptions for the case without storage are satisfied. For models containing storage operators, …
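
For reference, the splitting mechanism behind a distributed ADMM, in generic scaled form on a toy problem (this is the textbook template, not the paper's market model; the quadratic-plus-box instance below is an assumption for illustration):

```python
import numpy as np

# Scaled-form ADMM for min f(x) + g(z) s.t. x = z, with
# f(x) = 0.5*||A x - b||^2 and g the indicator of the box [0, 1]^n.
rng = np.random.default_rng(0)
n = 10
A, b = rng.standard_normal((30, n)), rng.standard_normal(30)
rho = 1.0
x = z = u = np.zeros(n)
AtA_rho = A.T @ A + rho * np.eye(n)
Atb = A.T @ b
for _ in range(200):
    x = np.linalg.solve(AtA_rho, Atb + rho * (z - u))  # x-update (ridge solve)
    z = np.clip(x + u, 0.0, 1.0)                       # z-update (projection)
    u = u + x - z                                      # dual update (residual)
print(x)
```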

Are we there yet? Manifold identification of gradient-related proximal methods

In machine learning, models that generalize better often generate outputs that lie on a low-dimensional manifold. Recently, several works have separately shown finite-time manifold identification by some proximal methods. In this work we provide a unified view by giving a simple condition under which any proximal method using a constant step size can achieve finite-iteration …
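
A small experiment conveys the phenomenon in the l1 case, where the manifold is the sign/support pattern of the iterates (our toy setup, not the paper's general condition): under constant-step proximal gradient, the support changes only finitely often and then freezes.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

# Constant-step proximal gradient on a lasso problem; identification
# means the nonzero pattern stops changing after finitely many steps.
rng = np.random.default_rng(2)
A, b, lam = rng.standard_normal((40, 15)), rng.standard_normal(40), 0.5
step = 1.0 / np.linalg.norm(A, 2) ** 2      # constant step 1/L
x = np.zeros(15)
prev_support = None
for k in range(500):
    x = soft_threshold(x - step * (A.T @ (A @ x - b)), step * lam)
    support = tuple(np.nonzero(x)[0])
    if support != prev_support:
        print(f"iter {k}: support changed to {support}")
        prev_support = support
```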

On First and Second Order Optimality Conditions for Abs-Normal NLP

Structured nonsmoothness is widely present in practical optimization. A particularly attractive class of nonsmooth problems, from both a theoretical and an algorithmic perspective, is the class of optimization problems in so-called abs-normal form as developed by Griewank and Walther. Here we generalize their theory for the unconstrained case to nonsmooth NLPs with equality and inequality constraints in …
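
For orientation (our summary of the standard definition, not a formula from this abstract), the abs-normal form writes a piecewise smooth function using a switching vector \(z\) that collects the arguments of all absolute values,
\[
z = F(x, |z|), \qquad y = \varphi(x, |z|),
\]
where \(F\) and \(\varphi\) are smooth and \(\partial F / \partial |z|\) is strictly lower triangular, so \(z\) is obtained by forward substitution. For example, \(f(x) = \max(x_1, x_2)\) becomes \(z = x_1 - x_2\), \(y = \tfrac{1}{2}(x_1 + x_2 + |z|)\).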

Generalized conditional subgradient and generalized mirror descent: duality, convergence, and symmetry

We provide new insight into a generalized conditional subgradient algorithm and a generalized mirror descent algorithm for the convex minimization problem \[\min_x \; \{f(Ax) + h(x)\}.\] As Bach showed in [SIAM J. Optim., 25 (2015), pp. 115–129], applying either of these two algorithms to this problem is equivalent to applying the other one to its …
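
For context, the Fenchel dual pair behind this symmetry can be written (up to sign conventions, and under a standard constraint qualification) as
\[
\min_x \; \{f(Ax) + h(x)\} \;=\; -\min_y \; \{f^*(y) + h^*(-A^{\mathsf T} y)\},
\]
where \(f^*\) and \(h^*\) denote convex conjugates; the dual swaps the roles of \(f\) and \(h\), which is the source of the primal-dual symmetry between the two algorithms.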

Status Determination by Interior-Point Methods for Convex Optimization Problems in Domain-Driven Form

We study the geometry of convex optimization problems given in a Domain-Driven form and categorize possible statuses of these problems using duality theory. Our duality theory for the Domain-Driven form, which accepts both conic and non-conic constraints, lets us determine and certify statuses of a problem as rigorously as the best approaches for conic formulations …
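
Concretely, a problem in Domain-Driven form is given (in the spirit of Karimi and Tunçel's setup, up to sign conventions) as
\[
\inf_x \; \{\langle c, x \rangle : Ax \in D\},
\]
where \(D\) is a closed convex set equipped with an efficiently computable self-concordant barrier; conic constraints are recovered when \(D\) is an affine shift of a closed convex cone, while non-conic sets such as epigraphs of convex functions are handled directly.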