Bad semidefinite programs with short proofs, and the closedness of the linear image of the semidefinite cone

Semidefinite programs (SDPs) — some of the most useful and pervasive optimization problems of the last few decades — often behave pathologically: the optimal values of the primal and dual problems may differ and may not be attained. Such SDPs are theoretically interesting and often impossible to solve. Yet, the pathological SDPs in the literature … Read more
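The simplest such pathologies are easy to exhibit; the following standard textbook example (chosen here for illustration, not necessarily one of the paper's) has an optimal value that is not attained:

$$ \inf_{x,\,y} \; x \quad \text{subject to} \quad \begin{pmatrix} x & 1 \\ 1 & y \end{pmatrix} \succeq 0. $$

Positive semidefiniteness forces $x \ge 0$, $y \ge 0$, and $xy \ge 1$, hence $x > 0$ at every feasible point. Taking $y = 1/x$ and letting $x \to 0^+$ drives the objective to $0$, so the infimum is $0$, yet no feasible point attains it.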

An Investigation of Newton-Sketch and Subsampled Newton Methods

Sketching, a dimensionality reduction technique, has received much attention in the statistics community. In this paper, we study sketching in the context of Newton’s method for solving finite-sum optimization problems in which both the number of variables and the number of data points are large. We study two forms of sketching that perform dimensionality reduction in data space: … Read more
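As an illustration of the subsampling idea (a generic sketch, not the paper's algorithms; the toy problem, sample size, and iteration count are all illustrative choices), a subsampled Newton step keeps the exact gradient but estimates the Hessian from a random subset of the data rows:

```python
import numpy as np

# Subsampled Newton on a regularized least-squares toy problem: the gradient
# is exact, but the Hessian is built from a random subsample of the data.
rng = np.random.default_rng(0)
n, d, mu = 200, 5, 0.1
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)

def full_grad(x):
    # gradient of f(x) = (1/2n)*||Ax - b||^2 + (mu/2)*||x||^2
    return A.T @ (A @ x - b) / n + mu * x

def subsampled_hessian(S):
    # Hessian estimate built only from the rows indexed by the subsample S
    # (for this quadratic the exact Hessian is constant in x)
    return A[S].T @ A[S] / len(S) + mu * np.eye(d)

x = np.zeros(d)
for _ in range(50):
    S = rng.choice(n, size=50, replace=False)   # fresh subsample each step
    x = x - np.linalg.solve(subsampled_hessian(S), full_grad(x))

# exact solution of the regularized least-squares problem, for comparison
x_star = np.linalg.solve(A.T @ A / n + mu * np.eye(d), A.T @ b / n)
err = float(np.linalg.norm(x - x_star))
```

Because the gradient is exact, the fixed point is the true minimizer; the inexact Hessian only affects the convergence rate, which is the trade-off such methods study.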

The Many Faces of Degeneracy in Conic Optimization

Slater’s condition — existence of a “strictly feasible solution” — is a common assumption in conic optimization. Without strict feasibility, first-order optimality conditions may be meaningless, the dual problem may yield little information about the primal, and small changes in the data may render the problem infeasible. Hence, failure of strict feasibility can negatively impact … Read more

On the pointwise iteration-complexity of a dynamic regularized ADMM with over-relaxation stepsize

In this paper, we extend the improved pointwise iteration-complexity result of a dynamic regularized alternating direction method of multipliers (ADMM) to a new stepsize domain. In this complexity analysis, the stepsize parameter can even be chosen in the interval $(0,2)$ instead of the interval $(0,(1+\sqrt{5})/2)$. As usual, our analysis is established by interpreting this ADMM variant … Read more
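For context, the stepsize in question multiplies the dual update of ADMM. The sketch below runs the classical (not the paper's dynamically regularized) ADMM with a dual stepsize inside the classical Fortin–Glowinski range on a toy splitting; the problem, penalty, and stepsize values are illustrative choices:

```python
# Classical ADMM with a dual stepsize theta on the toy splitting
#   minimize 0.5*(x - 3)**2 + |z|   subject to   x = z,
# whose solution is x* = z* = soft-threshold(3, 1) = 2.
rho = 1.0      # penalty parameter
theta = 1.5    # dual stepsize, inside the classical range (0, (1 + 5**0.5)/2)

def soft(v, t):
    # proximal map of t*|.| (soft-thresholding)
    return max(v - t, 0.0) if v >= 0 else min(v + t, 0.0)

x = z = u = 0.0   # u is the scaled dual variable
for _ in range(1000):
    # x-update: argmin_x 0.5*(x - 3)^2 + (rho/2)*(x - z + u)^2
    x = (3.0 + rho * (z - u)) / (1.0 + rho)
    # z-update: argmin_z |z| + (rho/2)*(x - z + u)^2
    z = soft(x + u, 1.0 / rho)
    # dual update scaled by the stepsize theta
    u = u + theta * (x - z)
```

The paper's contribution is precisely that, for its regularized variant, `theta` may range over all of $(0,2)$ rather than stopping at $(1+\sqrt{5})/2 \approx 1.618$.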

ADMM for monotone operators: convergence analysis and rates

We propose in this paper a unifying scheme for several algorithms from the literature for solving monotone inclusion problems involving compositions with linear continuous operators in infinite-dimensional Hilbert spaces. We show that a number of primal-dual algorithms for monotone inclusions and also the classical ADMM numerical scheme for convex optimization problems, … Read more

Local Linear Convergence Analysis of Primal–Dual Splitting Methods

In this paper, we study the local linear convergence properties of a versatile class of Primal–Dual splitting methods for solving composite non-smooth convex optimization problems. Under the assumption that the non-smooth components of the problem are partly smooth relative to smooth manifolds, we present a unified local convergence analysis framework for these Primal–Dual splitting methods. … Read more

A Note on the Forward-Douglas–Rachford Splitting for Monotone Inclusion and Convex Optimization

We shed light on the structure of the “three-operator” version of the forward-Douglas–Rachford splitting algorithm for finding a zero of a sum of maximally monotone operators $A + B + C$, where $B$ is cocoercive. The method requires only evaluations of $B$ and of the resolvents of $A$ and $C$, separately. We show that it … Read more
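The operator roles described above can be made concrete on a toy problem. The sketch below uses the Davis–Yin form of a three-operator splitting (an illustrative stand-in, not necessarily the note's exact scheme; the problem and stepsize are assumptions for the demo):

```python
# Three-operator splitting on: minimize |x| + indicator_{x >= 0}(x) + 0.5*(x - 2)**2
#   A = subdifferential of |.|        (resolvent: soft-thresholding)
#   C = normal cone of [0, inf)       (resolvent: projection onto [0, inf))
#   B = gradient of 0.5*(x - 2)^2     (cocoercive, used only in a forward step)
# The minimizer is x* = 1.
gamma = 1.0   # stepsize; must stay below twice the cocoercivity constant (2 here)

def soft(v, t):
    # proximal map of t*|.| (soft-thresholding)
    return max(v - t, 0.0) if v >= 0 else min(v + t, 0.0)

z = 0.0
for _ in range(500):
    x_a = soft(z, gamma)                          # resolvent of A
    grad = x_a - 2.0                              # forward evaluation of B
    x_c = max(2.0 * x_a - z - gamma * grad, 0.0)  # resolvent of C
    z = z + x_c - x_a                             # governing-sequence update
```

Note that $B$ enters only through its evaluation and $A$, $C$ only through their resolvents, exactly the access pattern the abstract describes.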

Facially dual complete (nice) cones and lexicographic tangents

We study the boundary structure of closed convex cones, with a focus on facially dual complete (nice) cones. These cones form a proper subset of facially exposed convex cones, and they behave well in the context of duality theory for convex optimization. Using the well-known concept of tangent cones in nonlinear … Read more

Several variants of the primal-dual hybrid gradient algorithm with applications

Revisiting the primal-dual hybrid gradient algorithm (PDHG) proposed by He, You and Yuan (SIAM J. Imaging Sci. 2014;7(4):2526-2537), we introduce four improved schemes for solving a class of generalized saddle-point problems. By making use of the variational inequality, weaker conditions are presented to ensure the global convergence of the proposed algorithms, where … Read more
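For readers unfamiliar with the baseline, the basic PDHG iteration (in Chambolle–Pock form, a generic sketch rather than any of the paper's four improved schemes; the toy problem and stepsizes are illustrative) looks like this:

```python
# Toy saddle-point instance of PDHG: minimize |x| + 0.5*(K*x - 3)^2,
# written as min_x g(x) + f(K x) with g = |.|, f = 0.5*(. - 3)^2, K = 1.
# Closed-form solution: x* = soft-threshold(3, 1) = 2, with dual y* = -1.
b, lam, K = 3.0, 1.0, 1.0
tau = sigma = 0.9                  # convergence requires tau * sigma * K**2 < 1

def soft(v, t):
    # proximal map of t*|.| (soft-thresholding)
    return max(v - t, 0.0) if v >= 0 else min(v + t, 0.0)

x, y, x_bar = 0.0, 0.0, 0.0
for _ in range(5000):
    # dual step: prox of sigma * f*, where f*(y) = 0.5*y^2 + b*y
    y = (y + sigma * K * x_bar - sigma * b) / (1.0 + sigma)
    # primal step: prox of tau * g with g = lam * |.|
    x_new = soft(x - tau * K * y, tau * lam)
    # extrapolation of the primal iterate
    x_bar = 2.0 * x_new - x
    x = x_new
```

The improved schemes in the paper modify this template; the variational-inequality framing mentioned above is what allows their convergence conditions to be weakened.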

Linear Convergence of Proximal Incremental Aggregated Gradient Methods under Quadratic Growth Condition

Under the strong convexity assumption, several recent works studied the global linear convergence rate of the proximal incremental aggregated gradient (PIAG) method for minimizing the sum of a large number of smooth component functions and a non-smooth convex function. In this paper, under the quadratic growth condition (a strictly weaker condition than strong convexity), … Read more
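The PIAG iteration itself is simple to state: keep a table of component gradients, refresh one entry per iteration, and take a proximal step using the (partly stale) aggregate. A minimal sketch on a toy sum (the problem, stepsize, and cyclic update order are illustrative assumptions):

```python
import numpy as np

# PIAG on: minimize sum_i 0.5*(x - b_i)**2 + |x| with b = (1, 2, 3).
# Optimality: 3*x - 6 + s = 0 with s in the subdifferential of |x|, so x* = 5/3.
b = np.array([1.0, 2.0, 3.0])
gamma = 0.04   # small stepsize to tolerate the staleness of stored gradients

def soft(v, t):
    # proximal map of t*|.| (soft-thresholding)
    return max(v - t, 0.0) if v >= 0 else min(v + t, 0.0)

x = 0.0
grad_table = x - b          # stored (possibly stale) component gradients
for k in range(5000):
    i = k % len(b)          # refresh one component gradient per iteration
    grad_table[i] = x - b[i]
    # proximal step on |.| using the aggregated, partly stale gradient
    x = soft(x - gamma * grad_table.sum(), gamma)
```

The analyses discussed in the abstract bound how much staleness (here at most one pass over the components) the stepsize can absorb while retaining linear convergence.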