A simple preprocessing algorithm for semidefinite programming

We propose a very simple preprocessing algorithm for semidefinite programming. Our algorithm inspects the constraints of the problem, deletes redundant rows and columns in the constraints, and reduces the size of the variable matrix. It often detects infeasibility. Our algorithm does not rely on any optimization solver: the only subroutine it needs is Cholesky factorization, …
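
A minimal sketch of two elementary reductions in this spirit, for an SDP in the form find $X \succeq 0$ with $\langle A_i, X\rangle = b_i$ (the routine below is only illustrative and is not the algorithm proposed in the paper; positive semidefiniteness is certified with a shifted Cholesky attempt, the only factorization used):

```python
import numpy as np

def is_psd(A, tol=1e-10):
    """Attempt a Cholesky factorization of A + tol*I.

    Success certifies that the smallest eigenvalue of A is at least -tol,
    which we treat as 'numerically positive semidefinite'.
    """
    try:
        np.linalg.cholesky(A + tol * np.eye(A.shape[0]))
        return True
    except np.linalg.LinAlgError:
        return False

def preprocess_sdp(A_list, b, tol=1e-10):
    """Two elementary reductions for: find X >= 0 with <A_i, X> = b_i.

    Returns (status, keep), where `keep` lists the indices of rows/columns
    of X that survive the reduction.  Illustrative only, not the paper's
    actual procedure.
    """
    n = A_list[0].shape[0]
    drop = set()
    for A_i, b_i in zip(A_list, b):
        # Reduction 1: if A_i is PSD and b_i < 0, the constraint is impossible,
        # because <A_i, X> >= 0 whenever both A_i and X are PSD.
        if b_i < -tol and is_psd(A_i, tol):
            return "infeasible", []
        # Reduction 2 (diagonal case): if A_i is diagonal with nonnegative
        # entries and b_i = 0, then sum_j (A_i)_{jj} X_{jj} = 0 and X_{jj} >= 0
        # force X_{jj} = 0 wherever (A_i)_{jj} > 0.  A PSD matrix with a zero
        # diagonal entry has that whole row and column equal to zero, so those
        # rows/columns of X can be deleted from the problem.
        d = np.diag(A_i)
        if abs(b_i) <= tol and np.allclose(A_i, np.diag(d)) and np.all(d >= 0):
            drop.update(np.nonzero(d > tol)[0].tolist())
    keep = [j for j in range(n) if j not in drop]
    return "reduced", keep

# Tiny example: the second constraint forces the last diagonal entry of X
# to be zero, so the last row/column can be deleted.
A1 = np.eye(3)
A2 = np.diag([0.0, 0.0, 1.0])
print(preprocess_sdp([A1, A2], [1.0, 0.0]))   # ('reduced', [0, 1])
print(preprocess_sdp([A1], [-1.0]))           # ('infeasible', [])
```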

Constructing New Weighted l1-Algorithms for the Sparsest Points of Polyhedral Sets

The l0-minimization problem that seeks the sparsest point of a polyhedral set is a longstanding and challenging problem in the fields of signal and image processing, numerical linear algebra, and mathematical optimization. The weighted l1-method is one of the most plausible methods for solving this problem. In this paper, we develop a new weighted l1-method through …
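
As background, the generic iteratively reweighted $\ell_1$ scheme that such methods refine can be sketched as follows, here for the nonnegative polyhedral set $\{x : Ax = b,\ x \ge 0\}$ (a textbook heuristic, not the new method developed in the paper):

```python
import numpy as np
from scipy.optimize import linprog

def reweighted_l1(A, b, iters=10, eps=1e-3):
    """Iteratively reweighted l1 heuristic for a sparse point of
    {x : Ax = b, x >= 0}.

    Each round solves a weighted-l1 LP and then re-weights with
    w_i = 1 / (x_i + eps), so small components are penalized more
    heavily in the next round.
    """
    m, n = A.shape
    w = np.ones(n)
    x = None
    for _ in range(iters):
        # min  w^T x   s.t.  A x = b,  x >= 0  (nonnegativity is linprog's default)
        res = linprog(c=w, A_eq=A, b_eq=b, method="highs")
        if not res.success:
            raise RuntimeError("weighted-l1 subproblem failed: " + res.message)
        x = res.x
        w = 1.0 / (x + eps)
    return x

# Tiny example: a 2x4 system whose sparsest nonnegative solution is 1-sparse.
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 4))
x_true = np.array([0.0, 0.0, 3.0, 0.0])
b = A @ x_true
print(np.round(reweighted_l1(A, b), 4))
```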

Exploiting Problem Structure in Optimization under Uncertainty via Online Convex Optimization

In this paper, we consider two paradigms developed to account for uncertainty in optimization models: robust optimization (RO) and joint estimation-optimization (JEO). We examine recent developments on efficient and scalable iterative first-order methods for these problems, and show that these iterative methods can be viewed through the lens of online convex optimization (OCO). …
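
The OCO workhorse alluded to here is online projected gradient descent; a minimal sketch on a box-constrained decision set (illustrative only, not tied to the specific RO or JEO models in the paper):

```python
import numpy as np

def online_gradient_descent(grad_t, x0, lo, hi, T, eta=0.1):
    """Online projected gradient descent over the box [lo, hi]^n.

    At round t we play x_t, the environment reveals a convex loss f_t,
    and we take a projected gradient step using grad_t(t, x_t).
    With step sizes proportional to 1/sqrt(t), this achieves O(sqrt(T))
    regret against the best fixed point in hindsight (standard OCO result).
    """
    x = np.array(x0, dtype=float)
    iterates = [x.copy()]
    for t in range(T):
        g = grad_t(t, x)                                      # gradient of f_t at x_t
        x = np.clip(x - eta / np.sqrt(t + 1) * g, lo, hi)     # gradient step + projection
        iterates.append(x.copy())
    return iterates

# Toy stream of losses f_t(x) = 0.5 * ||x - c_t||^2 with a drifting target c_t.
def grad_t(t, x):
    c_t = np.array([np.sin(0.1 * t), np.cos(0.1 * t)])
    return x - c_t

traj = online_gradient_descent(grad_t, x0=[0.0, 0.0], lo=-1.0, hi=1.0, T=50)
print(np.round(traj[-1], 3))
```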

Linearized Alternating Direction Method of Multipliers via Positive-Indefinite Proximal Regularization for Convex Programming

The alternating direction method of multipliers (ADMM) is widely used for various convex minimization models with separable structures arising in a variety of areas. In the literature, the proximal version of ADMM, which allows ADMM’s subproblems to be proximally regularized, has been well studied. In particular, the linearized version of ADMM can be obtained when the …
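
For context, the textbook way to see how linearized ADMM arises from proximal regularization, written in our own notation for $\min\{f(x)+g(y) : Ax+By=b\}$ with penalty $\beta>0$ and proximal matrix $P$ (not necessarily the paper's exact setting): the regularized $x$-subproblem is

\[
x^{k+1} = \arg\min_x \; f(x) + \frac{\beta}{2}\Big\|Ax + By^k - b + \frac{\lambda^k}{\beta}\Big\|^2 + \frac{1}{2}\|x - x^k\|_P^2 .
\]

Choosing $P = \tau I - \beta A^{\top}A$ cancels the $\tfrac{\beta}{2}\,x^{\top}A^{\top}Ax$ term and reduces the subproblem to a single proximal step on $f$,

\[
x^{k+1} = \arg\min_x \; f(x) + \frac{\tau}{2}\Big\|x - x^k + \frac{\beta}{\tau}A^{\top}\Big(Ax^k + By^k - b + \frac{\lambda^k}{\beta}\Big)\Big\|^2 ,
\]

which is the linearized version. The classical analysis of proximal ADMM requires $P \succeq 0$, i.e. $\tau \ge \beta\|A^{\top}A\|$; as the title suggests, the point here is that this positive semidefiniteness requirement can be relaxed.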

Convex Relaxations for Quadratic On/Off Constraints and Applications to Optimal Transmission Switching

This paper studies mixed-integer nonlinear programs featuring disjunctive constraints and trigonometric functions. We first characterize the convex hull of univariate quadratic on/off constraints in the space of original variables using perspective functions. We then introduce new tight quadratic relaxations for trigonometric functions featuring variables with asymmetrical bounds. These results are used to further tighten recent …
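
For intuition, the textbook univariate perspective construction underlying such results (shown for a single on/off quadratic constraint; the paper's new relaxations are not reproduced here): take an indicator $z\in\{0,1\}$ with $x=0$, $y\ge 0$ when $z=0$, and $q\,x^2 \le y$ (with $q>0$) when $z=1$. The closed convex hull of this set is

\[
\Big\{(x,y,z) \;:\; 0 \le z \le 1,\;\; y \ge 0,\;\; q\,x^2 \le y\,z \Big\},
\]

where $q\,x^2 \le y\,z$ is the perspective of $q\,x^2 \le y$; it is second-order-cone representable, since for $y,z \ge 0$ it is equivalent to $\big\|(2\sqrt{q}\,x,\; y-z)\big\|_2 \le y+z$.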

Random permutations fix a worst case for cyclic coordinate descent

Variants of the coordinate descent approach for minimizing a nonlinear function are distinguished in part by the order in which coordinates are considered for relaxation. Three common orderings are cyclic (CCD), in which we cycle through the components of $x$ in order; randomized (RCD), in which the component to update is selected randomly and independently …
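
A minimal sketch contrasting the two orderings on a convex quadratic, where each coordinate step is an exact one-dimensional minimization (illustrative code; the worst-case instance analyzed in the paper is not reproduced here):

```python
import numpy as np

def coord_descent(A, b, n_epochs=50, order="cyclic", seed=0):
    """Exact coordinate descent for f(x) = 0.5 x^T A x - b^T x, A symmetric PD.

    order="cyclic" : sweep the coordinates in order (CCD)
    order="random" : pick each coordinate uniformly at random (RCD)
    The coordinate-wise minimizer is x_i = (b_i - sum_{j != i} A_ij x_j) / A_ii.
    """
    n = len(b)
    x = np.zeros(n)
    rng = np.random.default_rng(seed)
    for _ in range(n_epochs):
        idx = range(n) if order == "cyclic" else rng.integers(0, n, size=n)
        for i in idx:
            x[i] = (b[i] - A[i] @ x + A[i, i] * x[i]) / A[i, i]
    return x

# Example: compare the two orderings on a small quadratic.
n = 20
A = np.ones((n, n)) + n * np.eye(n)        # symmetric positive definite
b = np.ones(n)
x_star = np.linalg.solve(A, b)
for order in ("cyclic", "random"):
    x = coord_descent(A, b, n_epochs=20, order=order)
    print(order, np.linalg.norm(x - x_star))
```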

Step lengths in BFGS method for monotone gradients

In this paper, we consider how to apply the BFGS method directly to finding a zero of any given monotone gradient, and we accordingly suggest new conditions to determine the corresponding step lengths. The suggested conditions involve a curvature condition and use only gradient evaluations. Furthermore, they can guarantee convergence without any other restrictions. Finally, preliminary …
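
For reference, the classical curvature (second Wolfe) condition that such gradient-only line searches build on, with $g$ the gradient, $d_k$ the search direction, and $c_2\in(0,1)$ (the paper's new conditions involve a curvature condition of this type but are not reproduced here):

\[
g(x_k + \alpha_k d_k)^{\top} d_k \;\ge\; c_2\, g(x_k)^{\top} d_k .
\]

With $s_k = \alpha_k d_k$ and $y_k = g(x_k + \alpha_k d_k) - g(x_k)$, this gives

\[
y_k^{\top} s_k \;\ge\; (c_2 - 1)\,\alpha_k\, g(x_k)^{\top} d_k \;>\; 0
\]

whenever $d_k$ is a descent direction; the inequality $y_k^{\top}s_k>0$ is precisely what keeps the BFGS Hessian approximation positive definite, and only gradient values are needed to check the condition.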

A highly efficient semismooth Newton augmented Lagrangian method for solving Lasso problems

We develop a fast and robust algorithm for solving large-scale convex composite optimization models, with an emphasis on the $\ell_1$-regularized least squares regression (Lasso) problems. Although a large number of solvers exist in the literature for Lasso problems, we found that no solver can efficiently handle difficult large-scale …
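
For reference, the Lasso model referred to here and its standard dual, in our notation (the SSNAL algorithm itself is described in the paper):

\[
\min_{x\in\mathbb{R}^n}\; \tfrac12\|Ax-b\|_2^2 + \lambda\|x\|_1,
\qquad\qquad
\min_{y\in\mathbb{R}^m}\; \tfrac12\|y\|_2^2 + b^{\top}y \;\;\text{s.t.}\;\; \|A^{\top}y\|_{\infty}\le \lambda .
\]

As the title indicates, the method combines an augmented Lagrangian scheme with subproblems solved by a semismooth Newton method; the nonsmoothness the Newton method must handle typically comes from the proximal mapping of the $\ell_1$-norm (soft thresholding), which is piecewise linear and hence semismooth.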

Perturbation Analysis of Singular Semidefinite Program and Its Application to a Control Problem

We consider the sensitivity of semidefinite programs (SDPs) under perturbations. It is well known that the optimal value changes continuously under perturbations of the right-hand side when the Slater condition holds for the primal problem. In this manuscript, by investigating a concrete SDP, we observe that the optimal value can be …
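
A well-known toy instance exhibiting such a jump (for illustration only; not necessarily the SDP investigated in this manuscript): for a symmetric $3\times 3$ variable $X$, consider

\[
v(\varepsilon) \;=\; \min_{X \succeq 0} \; X_{11}
\quad\text{s.t.}\quad X_{22} = \varepsilon, \qquad X_{11} + 2X_{23} = 1 .
\]

For $\varepsilon = 0$, positive semidefiniteness forces the entire second row and column of $X$ to vanish, so $X_{23}=0$ and $v(0)=1$. For any $\varepsilon > 0$, taking $X_{11}=0$, $X_{23}=\tfrac12$, $X_{33}=\tfrac{1}{4\varepsilon}$ and all other off-diagonal entries zero is feasible, so $v(\varepsilon)=0$. The optimal value thus jumps under an arbitrarily small right-hand-side perturbation, which cannot happen when the Slater condition holds.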

Faster Alternating Direction Method of Multipliers with a Worst-case O(1/n^2) Convergence Rate

The alternating direction method of multipliers (ADMM) is widely used for various convex programming models with separable structures arising in many scientific computing areas. The ADMM’s worst-case O(1/n) convergence rate, measured by the iteration complexity, has been established in the literature when its penalty parameter is a constant, where n is the iteration …
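
For reference, the classical ADMM iteration with a constant penalty parameter $\beta>0$ for $\min\{f(x)+g(y) : Ax+By=b\}$, written in our notation (the faster variant with the improved rate is developed in the paper):

\[
\begin{aligned}
x^{k+1} &= \arg\min_x\; f(x) + \tfrac{\beta}{2}\big\|Ax + By^k - b + \lambda^k/\beta\big\|^2,\\
y^{k+1} &= \arg\min_y\; g(y) + \tfrac{\beta}{2}\big\|Ax^{k+1} + By - b + \lambda^k/\beta\big\|^2,\\
\lambda^{k+1} &= \lambda^k + \beta\,\big(Ax^{k+1} + By^{k+1} - b\big),
\end{aligned}
\]

with the O(1/n) and O(1/n^2) rates measured in the number n of such iterations.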