An Adaptive Proximal ADMM for Nonconvex Linearly-Constrained Composite Programs

This paper develops an adaptive Proximal Alternating Direction Method of Multipliers (P-ADMM) for solving linearly-constrained, weakly convex, composite optimization problems. This method is adaptive to all problem parameters, including smoothness and weak convexity constants. It is assumed that the smooth component of the objective is weakly convex and possibly nonseparable, while the non-smooth component is …
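A minimal sketch of the problem class, in notation of my own choosing (the paper's exact formulation may differ):

\[
\min_{x \in \mathbb{R}^n} \; f(x) + h(x) \quad \text{subject to} \quad Ax = b,
\]

where \(f\) is smooth, possibly nonseparable, and \(\rho\)-weakly convex, meaning that \(f + \tfrac{\rho}{2}\|\cdot\|^2\) is convex, and \(h\) is the nonsmooth part. "Adaptive" here means the method does not need constants such as \(\rho\) or the gradient Lipschitz constant of \(f\) as inputs.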

Black-box Optimization Algorithms for Regularized Least-squares Problems

We consider the problem of optimizing the sum of a smooth, nonconvex function for which derivatives are unavailable, and a convex, nonsmooth function with an easy-to-evaluate proximal operator. Of particular focus is the case where the smooth part has a nonlinear least-squares structure. We adapt two existing approaches for derivative-free optimization of nonsmooth compositions of smooth …
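As an illustrative sketch (the notation and the \(\ell_1\) example are mine, not the paper's):

\[
\min_{x \in \mathbb{R}^n} \; f(x) + h(x), \qquad f(x) = \tfrac{1}{2}\,\|r(x)\|_2^2,
\]

where the residual map \(r:\mathbb{R}^n \to \mathbb{R}^m\) is smooth but its derivatives cannot be evaluated, and \(h\) is convex and nonsmooth with a cheap proximal operator, for instance \(h(x)=\lambda\|x\|_1\), whose prox is the soft-thresholding map \(\operatorname{prox}_{\lambda\|\cdot\|_1}(x)_i = \operatorname{sign}(x_i)\max(|x_i|-\lambda,0)\).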

On the strength of Burer’s lifted convex relaxation to quadratic programming with ball constraints

We study quadratic programs with m ball constraints and the strength of a lifted convex relaxation for them recently proposed by Burer (2024). Burer shows this relaxation is exact when m=2. For general m, Burer (2024) provides numerical evidence that this lifted relaxation is tighter than the Kronecker-product-based Reformulation-Linearization Technique (RLT) inequalities …
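A hedged sketch of the problem class (my notation):

\[
\min_{x \in \mathbb{R}^n} \; x^\top Q x + c^\top x \quad \text{subject to} \quad \|x - a_i\|_2 \le r_i, \quad i = 1, \dots, m,
\]

with \(Q\) not necessarily positive semidefinite. For \(m = 1\) this is a (shifted) trust-region subproblem, which is known to admit an exact convex relaxation; per the abstract, Burer (2024) shows his lifted relaxation remains exact for \(m = 2\).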

A combinatorial approach to Ramana’s exact dual for semidefinite programming

Thirty years ago, in a seminal paper, Ramana derived an exact dual for Semidefinite Programming (SDP). Ramana’s dual has the following remarkable features: i) it assumes feasibility of the primal, but it does not make any regularity assumptions, such as strict feasibility; ii) its optimal value is the same as the optimal value of the …
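For context, a sketch of the standard SDP primal-dual pair that Ramana’s construction sidesteps (standard material, not specific to this paper):

\[
\text{(P)}\;\; \inf_{X} \{\, \langle C, X\rangle : \langle A_i, X\rangle = b_i \ (i=1,\dots,m),\ X \succeq 0 \,\}, \qquad
\text{(D)}\;\; \sup_{y} \{\, b^\top y : C - \textstyle\sum_{i} y_i A_i \succeq 0 \,\}.
\]

Without a regularity condition such as strict feasibility, (P) and (D) can exhibit a positive duality gap or unattained optimal values; Ramana’s dual, by contrast, assumes only feasibility of (P) and has the same optimal value as (P) without any regularity condition.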

Concrete convergence rates for common fixed point problems under Karamata regularity

We introduce the notion of Karamata regular operators, a notion of regularity suitable for obtaining concrete convergence rates for common fixed point problems. This provides a broad framework that includes, but goes beyond, Hölderian error bounds and Hölder regular operators. By concrete, we mean that the rates we obtain are …
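For orientation, one common form of the classical notion that the abstract says is being generalized (this is not the paper's new Karamata notion): an operator \(T\) with fixed-point set \(\operatorname{Fix} T\) is Hölder regular on a bounded set \(B\) if there exist \(\kappa > 0\) and \(\gamma \in (0,1]\) such that

\[
\operatorname{dist}(x, \operatorname{Fix} T) \le \kappa\, \|x - Tx\|^{\gamma} \quad \text{for all } x \in B.
\]

Judging from the abstract and the name (Karamata theory concerns regularly varying functions), Karamata regularity replaces the power \(t \mapsto t^{\gamma}\) by a more general residual function while still permitting explicit rates; the precise definition is given in the paper.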

A progressive decoupling algorithm for minimizing the difference of convex and weakly convex functions over a linear subspace

Commonly, decomposition and splitting techniques for optimization problems strongly depend on convexity. Implementable splitting methods for nonconvex and nonsmooth optimization problems are scarce and often lack convergence guarantees. Among the few exceptions is the Progressive Decoupling Algorithm (PDA), which converges locally provided convexity is elicitable. In this work, we furnish PDA with a descent …
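A minimal sketch of the problem template (my notation; the paper's precise assumptions may differ):

\[
\min_{x \in \mathcal{S}} \; g(x) - h(x),
\]

where \(\mathcal{S}\) is a linear subspace, \(g\) is convex, and \(h\) is weakly convex. Subspace constraints of this kind typically encode linkage or consensus conditions that couple otherwise separable subproblems, which is the structure progressive decoupling is designed to exploit.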

A Facial Reduction Algorithm for Standard Spectrahedra

Facial reduction is a pre-processing method aimed at reformulating a problem to ensure strict feasibility. The importance of constructing a robust model is widely recognized in the literature, and facial reduction has emerged as an attractive approach for achieving robustness. In this note, we outline a facial reduction algorithm for a standard spectrahedron, the intersection of …
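As a hedged sketch of the generic facial reduction step, stated here for a generic equality-constrained spectrahedron \(\{X \succeq 0 : \langle A_i, X\rangle = b_i,\ i=1,\dots,m\}\) (the precise "standard spectrahedra" of the note are defined there): if some \(y\) satisfies

\[
D := \sum_{i=1}^{m} y_i A_i \succeq 0, \qquad b^\top y = 0, \qquad D \neq 0,
\]

then every feasible \(X\) obeys \(\langle D, X\rangle = b^\top y = 0\), hence lies in the face \(\{X \succeq 0 : \operatorname{range}(X) \subseteq \operatorname{null}(D)\}\); restricting the problem to that face and repeating until no such certificate exists yields a reformulation that is strictly feasible relative to the minimal face.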

A four-operator splitting algorithm for nonconvex and nonsmooth optimization

In this work, we address a class of nonconvex nonsmooth optimization problems where the objective function is the sum of two smooth functions (one of which is proximable) and two nonsmooth functions (one proper, closed and proximable, and the other continuous and weakly concave). We introduce a new splitting algorithm that extends the Davis-Yin …
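A hedged sketch of the four-term template (my notation):

\[
\min_{x} \; f_1(x) + f_2(x) + g(x) + h(x),
\]

with \(f_1, f_2\) smooth and \(f_2\) additionally proximable, \(g\) proper, closed, and proximable, and \(h\) continuous and weakly concave. For comparison, Davis-Yin splitting covers three operators: two proximable terms plus one smooth term handled through its gradient; per the abstract, the algorithm here extends that template by one more term.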

A Subgradient Projection Method with Outer Approximation for Solving Semidefinite Programming Problems

We explore the combination of subgradient projection with outer approximation to solve semidefinite programming problems. We compare several ways to construct outer approximations using the problem structure. The resulting approach enjoys the strengths of both subgradient projection and outer approximation methods. Preliminary computational results on the semidefinite programming relaxations of graph partitioning and max-cut show …
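For intuition, one standard way to build polyhedral outer approximations of the positive semidefinite constraint (whether these are exactly the cuts used in the paper is not stated in this excerpt): since \(X \succeq 0\) if and only if \(v^\top X v \ge 0\) for all \(v\), any finite set \(V\) of vectors yields the relaxation

\[
\{\, X : v^\top X v \ge 0 \ \ \forall v \in V \,\} \;\supseteq\; \{\, X : X \succeq 0 \,\},
\]

and new cuts can be generated from eigenvectors of the current iterate associated with its negative eigenvalues.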

Projection onto hyperbolicity cones and beyond: a dual Frank-Wolfe approach

We discuss the problem of projecting a point onto an arbitrary hyperbolicity cone from both theoretical and numerical perspectives. While hyperbolicity cones are furnished with a generalization of the notion of eigenvalues, obtaining closed-form expressions for the projection operator, as in the case of semidefinite matrices, is an elusive endeavour. To address this, we …
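A minimal sketch of the projection problem (my notation): given a polynomial \(p\) hyperbolic with respect to a direction \(e\) and its hyperbolicity cone \(\Lambda_+(p,e)\), projecting a point \(x_0\) means solving

\[
\min_{x \in \Lambda_+(p,e)} \; \tfrac{1}{2}\,\|x - x_0\|_2^2 .
\]

For the special case \(p = \det\) and \(e = I\), the cone is the positive semidefinite cone and the projection simply zeroes out the negative eigenvalues of \(x_0\); for general hyperbolicity cones no comparable closed form is known, which motivates the dual Frank-Wolfe approach of the title.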