How to Reach his Desires: Variational Rationality and the Equilibrium Problem on Hadamard Manifolds

In this paper, we present a sufficient condition for the existence of a solution to an equilibrium problem on a Hadamard manifold, and, under suitable assumptions on the sectional curvature, we propose a framework for the convergence analysis of a proximal point algorithm for solving this equilibrium problem. Finally, we offer an application to the …
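
For orientation, the equilibrium problem on a Hadamard manifold M and the standard proximal point subproblem take roughly the following form (a schematic statement in our own notation, not taken from the paper): given a closed geodesically convex set K in M and a bifunction f defined on K x K,

    \[
      \text{EP}(f,K):\quad \text{find } x^{*}\in K \ \text{ such that }\
      f(x^{*},y)\ \ge\ 0 \quad \forall\, y\in K,
    \]

with a proximal point step generating the next iterate as a solution of the regularized problem

    \[
      f(x_{k+1},y) \;-\; \frac{1}{\lambda_k}\,
      \big\langle \exp_{x_{k+1}}^{-1}x_k,\ \exp_{x_{k+1}}^{-1}y \big\rangle \ \ge\ 0
      \quad \forall\, y\in K,
    \]

where exp^{-1} denotes the inverse exponential map and lambda_k > 0 is a regularization parameter; in the Euclidean case this reduces to the classical proximal scheme for equilibrium problems.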

Borwein–Preiss Variational Principle Revisited

In this article, we refine and slightly strengthen the metric-space version of the Borwein–Preiss variational principle due to Li and Shi [J. Math. Anal. Appl. 246, 308–319 (2000)], clarify the assumptions and conclusions of their Theorem 1 as well as of Theorem 2.5.2 in Borwein and Zhu [Techniques of Variational Analysis, Springer (2005)], and streamline the proofs. …
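
Roughly, the metric-space principle says: if (X, d) is complete, f is proper, lower semicontinuous, and bounded below, and a point attains the infimum of f up to epsilon, then for suitable weights and a suitable sequence of points the perturbed function

    \[
      \varphi(x) \;=\; f(x) \;+\; \frac{\varepsilon}{\lambda^{p}}
      \sum_{n=0}^{\infty} \mu_n\, d(x,x_n)^{p},
      \qquad \mu_n \ge 0,\quad \sum_{n=0}^{\infty}\mu_n = 1,
    \]

attains a strict global minimum at some point v near the starting point, with f(v) no larger than the starting value. This is only a paraphrase for orientation; the precise hypotheses and conclusions are exactly what the article refines.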

Linear Rate Convergence of the Alternating Direction Method of Multipliers for Convex Composite Quadratic and Semi-Definite Programming

In this paper, we aim to provide a comprehensive analysis of the linear rate of convergence of the alternating direction method of multipliers (ADMM) for solving linearly constrained convex composite optimization problems. Under a certain error bound condition, we establish the global linear rate of convergence for a more general semi-proximal ADMM with the dual steplength …
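
For context, a generic semi-proximal ADMM for a problem of the form min f(x) + g(y) subject to Ax + By = c reads as follows (a standard scheme from the literature; the notation is ours):

    \[
      \begin{aligned}
        x^{k+1} &= \operatorname*{arg\,min}_{x}\;
          \mathcal{L}_{\sigma}(x, y^{k}; z^{k}) + \tfrac{1}{2}\|x - x^{k}\|_{S}^{2},\\
        y^{k+1} &= \operatorname*{arg\,min}_{y}\;
          \mathcal{L}_{\sigma}(x^{k+1}, y; z^{k}) + \tfrac{1}{2}\|y - y^{k}\|_{T}^{2},\\
        z^{k+1} &= z^{k} + \tau\sigma\,(A x^{k+1} + B y^{k+1} - c),
      \end{aligned}
    \]

where L_sigma is the augmented Lagrangian, S and T are self-adjoint positive semidefinite proximal terms, and the dual steplength tau is typically restricted to the interval (0, (1+sqrt(5))/2).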

Low-rank spectral optimization

Various applications in signal processing and machine learning give rise to highly structured spectral optimization problems characterized by low-rank solutions. Two important examples that motivate this work are optimization problems from phase retrieval and from blind deconvolution, which are designed to yield rank-1 solutions. An algorithm is described based on solving a certain constrained eigenvalue optimization …
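
As one concrete instance, phase retrieval is often lifted to a spectral problem of PhaseLift type (a standard formulation given here for reference, not necessarily the exact model of the paper):

    \[
      \min_{X \succeq 0}\ \operatorname{tr}(X)
      \quad\text{subject to}\quad
      \langle a_i a_i^{*},\, X\rangle = b_i,\qquad i = 1,\dots,m,
    \]

whose solution is designed to be rank-1, X = x x^{*}, so that the factor x recovers the signal up to a global phase.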

On the Convergence of Multi-Block Alternating Direction Method of Multipliers and Block Coordinate Descent Method

The paper answers several open questions about the alternating direction method of multipliers (ADMM) and the block coordinate descent (BCD) method, which are now widely used to solve large-scale convex optimization problems in many fields. For ADMM, there is still a lack of theoretical understanding of the algorithm when the objective function is not separable …
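
For reference, the cyclic BCD update for minimizing F(x_1, …, x_s) over s blocks takes the generic form (notation ours):

    \[
      x_i^{k+1} \in \operatorname*{arg\,min}_{x_i}\;
      F\big(x_1^{k+1}, \dots, x_{i-1}^{k+1},\, x_i,\, x_{i+1}^{k}, \dots, x_s^{k}\big),
      \qquad i = 1, \dots, s,
    \]

and the nonseparable coupling between blocks is precisely what complicates the convergence analysis.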

Semidefinite approximations of the polynomial abscissa

Given a univariate polynomial, its abscissa is the maximum real part of its roots. The abscissa arises naturally when controlling linear differential equations. As a function of the polynomial coefficients, the abscissa is Hölder continuous, and not locally Lipschitz in general, which is a source of numerical difficulties for designing and optimizing control laws. In …
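
As a minimal numerical illustration of the abscissa itself (this only evaluates the function pointwise via root-finding; it is not the paper's semidefinite approximation):

    import numpy as np

    def abscissa(coeffs):
        """Maximum real part of the roots of a univariate polynomial.

        coeffs lists the coefficients in descending powers, as expected
        by numpy.roots.
        """
        return max(np.roots(coeffs).real)

    # p(s) = s^2 + 3s + 2 = (s + 1)(s + 2) has roots -1 and -2,
    # so its abscissa is -1 (and the associated linear ODE is stable).
    print(abscissa([1, 3, 2]))  # -1.0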

A survey on operator splitting and decomposition of convex programs

Many structured convex minimization problems can be modeled as the search for a zero of the sum of two monotone operators. Operator splitting methods have been designed to simultaneously decompose and regularize such models. Here we review these models and the classical splitting methods. We focus on the numerical sensitivity …
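
For orientation, the underlying model and one classical splitting iteration (forward–backward, stated under the standard assumption that B is single-valued and cocoercive; notation ours) are:

    \[
      \text{find } x \ \text{ such that }\ 0 \in A(x) + B(x),
      \qquad
      x^{k+1} = J_{\gamma A}\big(x^{k} - \gamma\, B(x^{k})\big),
    \]

where J_{\gamma A} = (I + \gamma A)^{-1} is the resolvent of A and gamma > 0 is a stepsize; the resolvent regularizes the A-part while the forward step handles B explicitly.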

A derivative-free approach to constrained multiobjective nonsmooth optimization

In this work, we consider multiobjective optimization problems with both bound constraints on the variables and general nonlinear constraints, where objective and constraint function values can only be obtained by querying a black box. We define a linesearch-based solution method, and we show that it converges to a set of Pareto stationary points. To this …
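
The Pareto dominance relation underlying such methods is easy to illustrate. The helper below is purely illustrative and is not the paper's linesearch method (the function name is ours):

    import numpy as np

    def nondominated(F):
        """Indices of Pareto-nondominated rows of F.

        F: (n_points, n_objectives) array; all objectives are minimized.
        A row fi is dominated if some other row fj is <= componentwise
        and strictly < in at least one component.
        """
        F = np.asarray(F)
        keep = []
        for i, fi in enumerate(F):
            dominated = any(
                np.all(fj <= fi) and np.any(fj < fi)
                for j, fj in enumerate(F) if j != i
            )
            if not dominated:
                keep.append(i)
        return keep

    # (1, 1) dominates (2, 2); (0, 3) and (3, 0) are incomparable with it.
    print(nondominated([[1, 1], [2, 2], [0, 3], [3, 0]]))  # [0, 2, 3]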

Bounded perturbation resilience of projected scaled gradient methods

We investigate projected scaled gradient (PSG) methods for convex minimization problems. These methods perform a descent step along a diagonally scaled gradient direction followed by a feasibility regaining step via orthogonal projection onto the constraint set. This constitutes a generalized algorithmic structure that encompasses as special cases the gradient projection method, the projected Newton method, …
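
Schematically, the PSG iteration and its perturbed variant can be written as follows (a generic form consistent with the abstract, using the usual superiorization notion of bounded perturbations; notation ours):

    \[
      x^{k+1} = P_{C}\big(x^{k} - \gamma_k D_k \nabla f(x^{k})\big),
      \qquad
      x^{k+1} = P_{C}\big((x^{k} + \beta_k v^{k})
        - \gamma_k D_k \nabla f(x^{k} + \beta_k v^{k})\big),
    \]

where P_C is the orthogonal projection onto the constraint set C, D_k is a positive definite diagonal scaling, the perturbation directions v^k are bounded, and the coefficients beta_k are summable; bounded perturbation resilience asks that convergence survive the passage from the first iteration to the second.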

Variational analysis of spectral functions simplified

Spectral functions of symmetric matrices — those depending on matrices only through their eigenvalues — appear often in optimization. A cornerstone variational analytic tool for studying such functions is a formula relating their subdifferentials to the subdifferentials of their diagonal restrictions. This paper presents a new, short, and revealing derivation of this result. We then …
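
The formula in question can be stated roughly as follows (the standard form from the literature, with lambda(X) the vector of eigenvalues of X in nonincreasing order):

    \[
      \partial (f \circ \lambda)(X) \;=\;
      \big\{\, U \operatorname{Diag}(\mu)\, U^{\top} \;:\;
      \mu \in \partial f\big(\lambda(X)\big),\ \ U \in \mathcal{O}(n),\ \
      X = U \operatorname{Diag}\big(\lambda(X)\big) U^{\top} \,\big\},
    \]

valid for f symmetric (invariant under coordinate permutations), so that the variational analysis of the spectral function f(lambda(X)) reduces to that of its diagonal restriction f.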