Iteration-Complexity of a Linearized Proximal Multiblock ADMM Class for Linearly Constrained Nonconvex Optimization Problems

This paper analyzes the iteration-complexity of a class of linearized proximal multiblock alternating direction method of multipliers (ADMM) for solving linearly constrained nonconvex optimization problems. The subproblems of the linearized ADMM are obtained by partially or fully linearizing the augmented Lagrangian with respect to the corresponding minimizing block variable. The derived complexity bounds do not …
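For intuition, here is a minimal two-block sketch in Python of a fully linearized update, where each subproblem is replaced by a single gradient step on the augmented Lagrangian; the toy objective, data, and parameters below are our own assumptions, not the paper's setting.

    import numpy as np

    # Toy instance (our assumption): minimize 0.5||x1||^2 + 0.5||x2||^2
    # subject to A1 @ x1 + A2 @ x2 = b.
    rng = np.random.default_rng(0)
    n = 5
    A1, A2 = np.eye(n), np.eye(n)
    b = rng.standard_normal(n)
    grad_f1 = lambda x: x          # gradient of f1(x) = 0.5||x||^2
    grad_f2 = lambda x: x          # gradient of f2(x) = 0.5||x||^2

    beta, tau = 1.0, 4.0           # penalty parameter and proximal step size
    x1, x2, lam = np.zeros(n), np.zeros(n), np.zeros(n)
    for _ in range(500):
        r = A1 @ x1 + A2 @ x2 - b
        # fully linearized x1-update: one gradient step on L_beta w.r.t. x1
        x1 = x1 - (grad_f1(x1) + A1.T @ (lam + beta * r)) / tau
        r = A1 @ x1 + A2 @ x2 - b
        # same pattern for the second (and any further) block
        x2 = x2 - (grad_f2(x2) + A2.T @ (lam + beta * r)) / tau
        lam = lam + beta * (A1 @ x1 + A2 @ x2 - b)
    print(np.linalg.norm(A1 @ x1 + A2 @ x2 - b))  # feasibility residual

A partially linearized variant would instead linearize only part of the augmented Lagrangian and solve the remaining proximal subproblem exactly.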

Generalized Self-Concordant Functions: A Recipe for Newton-type Methods

We study the smooth structure of convex functions by generalizing the powerful concept of \textit{self-concordance}, introduced by Nesterov and Nemirovskii in the early 1990s, to a broader class of convex functions, which we call \textit{generalized self-concordant functions}. This notion allows us to develop a unified framework for designing Newton-type methods to solve convex optimization problems. …
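As a sketch of the kind of method this framework supports, the snippet below takes damped Newton steps with the classical self-concordant step size $1/(1+\lambda_k)$, where $\lambda_k$ is the Newton decrement, on a logistic-regression loss (a standard example of a generalized self-concordant function); the data and this particular step-size rule are our assumptions, not the paper's results.

    import numpy as np

    # Toy data (our assumption, not from the paper).
    rng = np.random.default_rng(1)
    A = rng.standard_normal((50, 5))
    y = rng.choice([-1.0, 1.0], size=50)

    def grad_hess(x):
        # gradient and Hessian of the average logistic loss
        s = 1.0 / (1.0 + np.exp(y * (A @ x)))   # sigmoid(-y_i * a_i^T x)
        g = -(A * (y * s)[:, None]).mean(axis=0)
        H = (A.T * (s * (1.0 - s))) @ A / len(y) + 1e-8 * np.eye(A.shape[1])
        return g, H

    x = np.zeros(5)
    for _ in range(25):
        g, H = grad_hess(x)
        d = np.linalg.solve(H, g)
        lam = np.sqrt(g @ d)        # Newton decrement
        x = x - d / (1.0 + lam)     # damped Newton step
    g, _ = grad_hess(x)
    print(np.linalg.norm(g))        # gradient norm at the final iterate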

On generalized-convex constrained multi-objective optimization

In this paper, we consider multi-objective optimization problems involving not necessarily convex constraints and componentwise generalized-convex (e.g., semi-strictly quasi-convex, quasi-convex, or explicitly quasi-convex) vector-valued objective functions acting between a real linear topological pre-image space and a finite-dimensional image space. For these multi-objective optimization problems, we show that the set of (strictly, weakly) …
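For reference, the textbook definitions behind these generalized convexity notions (our recap, not a quotation from the paper): a function $f$ is quasi-convex if
\[ f(\lambda x + (1-\lambda) y) \le \max\{f(x), f(y)\} \quad \text{for all } x, y \text{ and all } \lambda \in (0,1), \]
and semi-strictly quasi-convex if this inequality holds strictly whenever $f(x) \neq f(y)$.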

Linear Convergence of Proximal Incremental Aggregated Gradient Methods under Quadratic Growth Condition

Under a strong convexity assumption, several recent works have studied the global linear convergence rate of the proximal incremental aggregated gradient (PIAG) method for minimizing the sum of a large number of smooth component functions and a non-smooth convex function. In this paper, under the quadratic growth condition, a strictly weaker condition than strong convexity, …
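A minimal sketch of the PIAG pattern under our own toy assumptions (least-squares components plus an $\ell_1$ term): the method keeps a table of possibly stale component gradients, refreshes one entry per iteration, and applies the prox of the non-smooth term to a step along the aggregated gradient.

    import numpy as np

    # Toy instance (our assumption):
    # minimize (1/m) sum_i 0.5*(a_i^T x - b_i)^2 + mu*||x||_1.
    rng = np.random.default_rng(2)
    m, n = 20, 5
    A = rng.standard_normal((m, n))
    b = rng.standard_normal(m)
    mu, step = 0.01, 0.05

    grads = np.zeros((m, n))   # table of (possibly stale) component gradients
    x = np.zeros(n)
    for k in range(2000):
        i = k % m                               # cyclic component selection
        grads[i] = A[i] * (A[i] @ x - b[i])     # refresh one component gradient
        z = x - step * grads.mean(axis=0)       # step along aggregated gradient
        x = np.sign(z) * np.maximum(np.abs(z) - step * mu, 0.0)  # l1 prox
    print(x)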

Radial Subgradient Descent

We present a subgradient method for minimizing non-smooth, non-Lipschitz convex functions. The only structure assumed is that a strictly feasible point is known. We extend the work of Renegar [1] by taking a different perspective, leading to an algorithm which is conceptually more natural, has notably improved convergence rates, and for which the analysis …
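For background contrast only, the snippet below runs the classical subgradient iteration $x_{k+1} = x_k - \alpha_k g_k$ on an $\ell_1$ example; the paper's radial algorithm builds on Renegar's radial transformation and is not reproduced here.

    import numpy as np

    # Toy target c (our assumption); minimize f(x) = ||x - c||_1.
    rng = np.random.default_rng(3)
    c = rng.standard_normal(5)
    x = np.zeros(5)
    for k in range(1, 2001):
        g = np.sign(x - c)          # a subgradient of ||x - c||_1 at x
        x = x - g / np.sqrt(k)      # diminishing step size alpha_k = 1/sqrt(k)
    print(np.abs(x - c).sum())      # final objective value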

On Relaxation of Some Customized Proximal Point Algorithms for Convex Minimization: From Variational Inequality Perspective

The proximal point algorithm (PPA) is a fundamental method for convex programming. When PPA is applied to solve linearly constrained convex problems, we may prefer to choose an appropriate metric matrix to define the proximal regularization, so that the computational burden of the resulting PPA can be reduced and, in most cases, it even admits closed-form …
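In notation common to this literature (our paraphrase, not the paper's statement), a customized PPA step and its relaxed variant take the form
\[ \tilde{w}^k = \arg\min_w \left\{ \theta(w) + \tfrac{1}{2}\|w - w^k\|_M^2 \right\}, \qquad w^{k+1} = w^k - \gamma\,(w^k - \tilde{w}^k), \quad \gamma \in (0, 2), \]
where the metric matrix $M$ is the design choice that makes the subproblem cheap, and $\gamma$ is the relaxation factor studied here.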

Symmetric ADMM with Positive-Indefinite Proximal Regularization for Linearly Constrained Convex Optimization

The proximal ADMM, which adds proximal regularization terms to ADMM's subproblems, is a popular and useful method for linearly constrained separable convex problems, especially in its linearized case. A well-known requirement in the literature for guaranteeing convergence of the method is that the proximal regularization must be positive semidefinite. Recently it was shown by He et …
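Concretely, in a generic rendering (ours, not the paper's exact scheme), the proximally regularized $x$-subproblem reads
\[ x^{k+1} = \arg\min_x \left\{ \mathcal{L}_\beta(x, y^k, \lambda^k) + \tfrac{1}{2}\|x - x^k\|_D^2 \right\}, \]
where $\mathcal{L}_\beta$ is the augmented Lagrangian; classical analyses require $D \succeq 0$, while the line of work discussed here shows that a suitably controlled indefinite $D$ can still guarantee convergence.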

Direct Search Methods on Reductive Homogeneous Spaces

Direct search methods are mainly designed for use in problems with no equality constraints. However, there are many instances where the feasible set is of measure zero in the ambient space and no mesh point lies within it. There are methods for working with feasible sets that are (Riemannian) manifolds, but not all manifolds are …
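A toy illustration of the underlying difficulty and remedy (ours, not the paper's construction): when the feasible set is the unit circle, a measure-zero subset of $\mathbb{R}^2$, polling in the ambient plane produces no feasible trial points, whereas polling in a local parameter (the angle) keeps every trial point feasible.

    import numpy as np

    # Toy objective on the ambient space R^2 (our assumption).
    def f(p):
        return (p[0] - 2.0) ** 2 + (p[1] - 1.0) ** 2

    embed = lambda t: np.array([np.cos(t), np.sin(t)])  # parametrize the circle

    t, h = 0.0, 1.0                 # parameter value and mesh size
    while h > 1e-8:
        for cand in (t + h, t - h): # poll step in the parameter space
            if f(embed(cand)) < f(embed(t)):
                t = cand
                break
        else:
            h /= 2.0                # no improvement: refine the mesh
    print(embed(t))                 # feasible (near-)minimizer on the circle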

A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications

For a symmetric positive semidefinite linear system of equations $\mathcal{Q} {\bf x} = {\bf b}$, where ${\bf x} = (x_1,\ldots,x_s)$ is partitioned into $s$ blocks, with $s \geq 2$, we show that each cycle of the classical block symmetric Gauss-Seidel (block sGS) method exactly solves the associated quadratic programming (QP) problem but added with an …
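In the scalar special case ($s$ blocks of size one), one sGS cycle is simply a forward Gauss-Seidel sweep followed by a backward sweep; the toy sketch below (our illustration, not the paper's block construction) shows the mechanics on a small positive definite system.

    import numpy as np

    # Toy SPD system (our data, not from the paper).
    rng = np.random.default_rng(4)
    M = rng.standard_normal((5, 5))
    Q = M @ M.T + 5 * np.eye(5)     # symmetric positive definite
    b = rng.standard_normal(5)

    x = np.zeros(5)
    for _ in range(100):            # sGS cycles
        for i in range(5):          # forward sweep
            x[i] = (b[i] - Q[i] @ x + Q[i, i] * x[i]) / Q[i, i]
        for i in reversed(range(5)):  # backward sweep
            x[i] = (b[i] - Q[i] @ x + Q[i, i] * x[i]) / Q[i, i]
    print(np.linalg.norm(Q @ x - b))  # residual of Q x = b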

BFGS convergence to nonsmooth minimizers of convex functions

Under reasonable conditions, the popular BFGS quasi-Newton minimization algorithm converges globally on smooth convex functions. This result was proved by Powell in 1976; we consider its implications for functions that are not smooth. In particular, an analogous convergence result holds for functions, like the Euclidean norm, that are nonsmooth at the minimizer.

Citation: Manuscript, School of …
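A quick empirical illustration of this phenomenon (our experiment, not taken from the paper): running off-the-shelf BFGS on the Euclidean norm, which is nonsmooth at its minimizer, typically still drives the iterates very close to the origin.

    import numpy as np
    from scipy.optimize import minimize

    # BFGS with finite-difference gradients on f(x) = ||x||_2;
    # the minimizer x = 0 is a point of nonsmoothness.
    res = minimize(lambda x: np.linalg.norm(x), x0=np.ones(5), method="BFGS")
    print(res.x, np.linalg.norm(res.x))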