Positive and Z-operators on closed convex cones

Let K be a closed convex cone with dual cone K^* in a finite-dimensional real Hilbert space V. A positive operator on K is a linear operator L on V such that L(K) is a subset of K. Positive operators generalize the nonnegative matrices and are essential to Perron-Frobenius theory. We say that L is …
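
To make the definition concrete, here is a minimal sketch, assuming K is the nonnegative orthant in R^n; in that special case a linear operator is positive exactly when its matrix is entrywise nonnegative, so the containment L(K) ⊆ K can be checked directly. The function name is ours, not the paper's.

    import numpy as np

    def is_positive_on_orthant(L, tol=1e-12):
        """Check L(K) subset of K for K = nonnegative orthant: L must be entrywise >= 0."""
        return bool(np.all(np.asarray(L) >= -tol))

    L = np.array([[2.0, 1.0],
                  [0.0, 3.0]])
    print(is_positive_on_orthant(L))   # True: L maps the orthant into itself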

Max-Norm Optimization for Robust Matrix Recovery

This paper studies the matrix completion problem under arbitrary sampling schemes. We propose a new estimator that incorporates both max-norm and nuclear-norm regularization and allows efficient low-rank matrix recovery from a random subset of entries observed with additive noise, under general non-uniform and unknown sampling distributions. This method significantly relaxes the uniform …
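
As a rough illustration only (the paper's exact estimator is not reproduced here), the following sketch evaluates a hybrid objective of the kind the abstract describes: a squared loss on the observed entries plus nuclear-norm and max-norm penalties, with the max norm bounded through a factorization M = U V^T. All names and the weighting scheme are assumptions.

    import numpy as np

    def hybrid_objective(U, V, Y, mask, lam_nuc, lam_max):
        """Squared loss on observed entries + nuclear- and max-norm penalties."""
        M = U @ V.T
        fit = 0.5 * np.sum(mask * (M - Y) ** 2)     # only observed entries count
        nuc = np.linalg.norm(M, ord="nuc")          # nuclear norm: sum of singular values
        # max(row norm of U) * max(row norm of V) upper-bounds the max norm of M
        mx = np.sqrt(np.max(np.sum(U**2, axis=1)) * np.max(np.sum(V**2, axis=1)))
        return fit + lam_nuc * nuc + lam_max * mx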

A recursive semi-smooth Newton method for linear complementarity problems

A primal feasible active set method is presented for finding the unique solution of a Linear Complementarity Problem (LCP) with a P-matrix. It extends the globally convergent active set method for strictly convex quadratic problems with simple bounds proposed in [P. Hungerlaender and F. Rendl. A feasible active set method for strictly convex problems with …
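
The paper's recursive scheme is not reproduced here, but a plain semi-smooth Newton iteration on the reformulation F(x) = min(x, Mx + q) = 0, the standard starting point for such methods, looks roughly as follows; for a P-matrix every candidate generalized Jacobian below is nonsingular, since its determinant is a principal minor of M.

    import numpy as np

    def semismooth_newton_lcp(M, q, max_iter=50, tol=1e-10):
        """Semi-smooth Newton on F(x) = min(x, M x + q) = 0 for LCP(q, M)."""
        n = len(q)
        x = np.zeros(n)
        for _ in range(max_iter):
            w = M @ x + q
            F = np.minimum(x, w)
            if np.linalg.norm(F, np.inf) < tol:
                break
            # Generalized Jacobian: row i of I where x_i <= w_i, row i of M where w_i < x_i
            J = np.eye(n)
            rows = w < x
            J[rows] = M[rows]
            x = x - np.linalg.solve(J, F)
        return x

    M = np.array([[2.0, 1.0], [1.0, 2.0]])   # a P-matrix
    print(semismooth_newton_lcp(M, np.array([-1.0, -1.0])))   # -> [1/3, 1/3]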

An Infeasible Active Set Method with Combinatorial Line Search for Convex Quadratic Problems with Bound Constraints

The minimization of a convex quadratic function under bound constraints is a fundamental building block for more complicated optimization problems. The active-set method introduced in [M. Bergounioux, K. Ito, and K. Kunisch. Primal-Dual Strategy for Constrained Optimal Control Problems. SIAM Journal on Control and Optimization, 37:1176–1194, 1999.] and [M. Bergounioux, M. Haddou, M. Hintermüller, and …
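
For orientation, here is a minimal primal-dual active set sketch in the spirit of the cited method, written for the simplest case min 0.5 x^T Q x + c^T x subject to x >= 0; the paper's general bounds and its combinatorial line search are not reproduced, and all names are ours.

    import numpy as np

    def pdas_nonneg(Q, c, beta=1.0, max_iter=100):
        """Primal-dual active set for min 0.5 x'Qx + c'x s.t. x >= 0."""
        n = len(c)
        x, lam = np.zeros(n), np.zeros(n)
        for _ in range(max_iter):
            active = lam - beta * x > 0            # bounds predicted active: x_i = 0
            inactive = ~active
            x_new, lam_new = np.zeros(n), np.zeros(n)
            if inactive.any():                     # equality-constrained subproblem
                I = np.ix_(inactive, inactive)
                x_new[inactive] = np.linalg.solve(Q[I], -c[inactive])
            lam_new[active] = (Q @ x_new + c)[active]   # multipliers from stationarity
            if np.array_equal(active, lam_new - beta * x_new > 0):
                return x_new, lam_new              # active set settled: KKT point found
            x, lam = x_new, lam_new
        return x, lam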

A SMART Stochastic Algorithm for Nonconvex Optimization with Applications to Robust Machine Learning

Machine learning theory typically assumes that training data is unbiased and not adversarially generated. When real training data deviates from these assumptions, trained models make erroneous predictions, sometimes with disastrous effects. Robust losses, such as the Huber loss, are designed to mitigate the effects of such contaminated data, but they are limited to the regression …
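
For reference, the Huber loss the abstract alludes to is quadratic near zero and linear in the tails, which is what caps the influence of outliers; a minimal sketch:

    import numpy as np

    def huber(r, delta=1.0):
        """Huber loss: 0.5*r^2 for |r| <= delta, delta*(|r| - 0.5*delta) otherwise."""
        a = np.abs(r)
        return np.where(a <= delta, 0.5 * r**2, delta * (a - 0.5 * delta))

    print(huber(np.array([0.1, 5.0])))   # small residual: quadratic; large: linear growth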

A universal and structured way to derive dual optimization problem formulations

The dual problem of a convex optimization problem can be obtained in a relatively simple and structured way by using a well-known result in convex analysis, namely Fenchel's duality theorem. This alternative way of forming a strong dual problem is the subject of this paper. We recall some standard results from convex analysis and then …
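
In one common form (notation ours, and a constraint qualification is needed for the equality to hold), Fenchel duality states

\[
\inf_{x}\; f(x) + g(Ax) \;=\; \sup_{y}\; -f^{*}(A^{\top}y) - g^{*}(-y),
\]

where f^* and g^* are the convex conjugates of f and g. Reading the right-hand side off this identity, for a particular choice of f, g, and A, is the kind of structured dual derivation the abstract refers to.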

On deterministic reformulations of distributionally robust joint chance constrained optimization problems

A joint chance constrained optimization problem involves multiple uncertain constraints, i.e., constraints with stochastic parameters, that are jointly required to be satisfied with probability exceeding a prespecified threshold. In a distributionally robust joint chance constrained optimization problem (DRCCP), the joint chance constraint is required to hold for all probability distributions of the stochastic parameters from …
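
In a generic form (notation ours; the paper's ambiguity set and constraint functions may differ), a DRCCP reads

\[
\min_{x \in X} \; c^{\top}x
\quad \text{s.t.} \quad
\inf_{\mathbb{P} \in \mathcal{P}} \, \mathbb{P}\big( g_{i}(x,\xi) \le 0,\; i = 1,\dots,m \big) \;\ge\; 1-\epsilon,
\]

where \mathcal{P} is the ambiguity set of distributions of the stochastic parameter \xi and 1-\epsilon is the prescribed probability threshold; a deterministic reformulation restates this semi-infinite condition without the inner infimum over distributions.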

On the convergence of a regularized Jacobi algorithm for convex optimization

In this paper we consider the regularized version of the Jacobi algorithm, a block coordinate descent method for convex optimization with a differentiable objective function and block-separable constraints that has recently been proposed in the literature. Under certain regularity assumptions on the objective function, this algorithm has been shown to satisfy the so-called sufficient decrease condition, …
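
A minimal sketch of the regularized Jacobi iteration, assuming scalar blocks and an unconstrained convex quadratic objective 0.5 x^T Q x + c^T x so that the per-block subproblems have closed form; every block is updated in parallel from the previous iterate, with a proximal term (rho/2)(x_i - x_i^k)^2 supplying the sufficient decrease:

    import numpy as np

    def regularized_jacobi(Q, c, x0, rho=1.0, iters=200):
        """Parallel (Jacobi) coordinate updates with proximal regularization."""
        x = np.asarray(x0, dtype=float).copy()
        d = np.diag(Q)
        for _ in range(iters):
            off = Q @ x - d * x                 # off-diagonal coupling, frozen at x^k
            # per-block argmin of 0.5*Q_ii*t^2 + (off_i + c_i)*t + 0.5*rho*(t - x_i)^2
            x = (rho * x - off - c) / (d + rho)
        return x

    Q = np.array([[3.0, 1.0], [1.0, 2.0]])
    print(regularized_jacobi(Q, np.array([-1.0, -1.0]), np.zeros(2)))   # -> [0.2, 0.4]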

Accelerated gradient sliding for structured convex optimization

Our main goal in this paper is to show that one can skip gradient computations for gradient descent-type methods applied to certain structured convex programming (CP) problems. To this end, we first present an accelerated gradient sliding (AGS) method for minimizing the sum of two smooth convex functions with different Lipschitz constants. We show …
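
Concretely, the structured problem in question is of the form (our notation)

\[
\min_{x \in X} \; \Psi(x) := f(x) + h(x),
\]

where both components are convex with Lipschitz-continuous gradients of constants L_f and L_h. In the form of this result usually quoted, when say L_f is much larger than L_h, the sliding idea is to evaluate \nabla h far less often than \nabla f: roughly O(\sqrt{L_f/\epsilon}) evaluations of \nabla f but only O(\sqrt{L_h/\epsilon}) evaluations of \nabla h suffice for an \epsilon-solution, rather than charging every iteration with both gradients.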

Positive-Indefinite Proximal Augmented Lagrangian Method and its Application to Full Jacobian Splitting for Multi-block Separable Convex Minimization Problems

The augmented Lagrangian method (ALM) is fundamental for solving convex programming problems with linear constraints. The proximal version of ALM, which regularizes ALM's subproblem over the primal variable at each iteration with an additional positive-definite quadratic proximal term, has been well studied in the literature. In this paper, we show that it is not necessary …
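
In a standard form (notation ours), for min f(x) subject to Ax = b with augmented Lagrangian \mathcal{L}_{\beta}(x,\lambda) = f(x) - \lambda^{\top}(Ax-b) + (\beta/2)\|Ax-b\|^{2}, the proximal ALM iterates

\[
x^{k+1} = \arg\min_{x} \; \mathcal{L}_{\beta}(x,\lambda^{k}) + \tfrac{1}{2}\|x-x^{k}\|_{D}^{2},
\qquad
\lambda^{k+1} = \lambda^{k} - \beta\,(Ax^{k+1}-b),
\]

where \|x\|_{D}^{2} = x^{\top}Dx. The classical analyses take the proximal matrix D positive definite; the point of the abstract is that convergence can be retained with a suitably chosen indefinite D.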