A lower bound on the optimal self-concordance parameter of convex cones

Let $K \subset \mathbb R^n$ be a regular convex cone, let $e_1,\dots,e_n \in \partial K$ be linearly independent points on the boundary of a compact affine section of the cone, and let $x^* \in K^o$ be a point in the relative interior of this section. For $k = 1,\dots,n$, let $l_k$ be the line through … Read more
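
For context, the quantity in the title is defined through the standard notion of a barrier: a $C^3$ convex function $F : K^o \to \mathbb R$ that blows up at $\partial K$ is a $\nu$-self-concordant barrier for $K$ if, for all $x \in K^o$ and $h \in \mathbb R^n$,
$$ |D^3 F(x)[h,h,h]| \le 2\,\big(D^2 F(x)[h,h]\big)^{3/2}, \qquad \big(D F(x)[h]\big)^2 \le \nu\, D^2 F(x)[h,h], $$
and the optimal self-concordance parameter of $K$ is the infimum of $\nu$ over all such barriers.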

On Penalty and Gap Function Methods for Bilevel Equilibrium Problems

We consider bilevel pseudomonotone equilibrium problems. We use a penalty function to convert the bilevel problem into one-level problems. We generalize the notion of pseudo $\nabla$-monotonicity from $\nabla$-monotonicity and prove that, under the pseudo $\nabla$-monotonicity property, any stationary point of a regularized gap function is a solution of the penalized equilibrium problem. As an application, we discuss … Read more
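
For orientation, one common form of the regularized gap function for an equilibrium problem given by a bifunction $f$ and a closed convex set $C$ (the precise variant used in the paper may differ) is
$$ g_\alpha(x) = \max_{y \in C} \Big\{ -f(x,y) - \tfrac{\alpha}{2}\,\|y - x\|^2 \Big\}, \qquad \alpha > 0, $$
which is nonnegative on $C$ and, when $f(x,\cdot)$ is convex with $f(x,x)=0$, vanishes exactly at the solutions of the problem of finding $x \in C$ with $f(x,y) \ge 0$ for all $y \in C$.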

Sufficient Conditions for Low-rank Matrix Recovery, Translated from Sparse Signal Recovery

Low-rank matrix recovery (LMR) is a rank minimization problem subject to linear equality constraints; it arises in many fields, such as signal and image processing, statistics, computer vision, system identification, and control. This class of optimization problems is $NP$-hard, and a popular approach replaces the rank function with the nuclear norm of the … Read more
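
In symbols, the standard formulation alluded to here reads
$$ \min_{X} \ \operatorname{rank}(X) \quad \text{subject to} \quad \mathcal A(X) = b, $$
for a linear map $\mathcal A$, and the convex relaxation replaces $\operatorname{rank}(X)$ by the nuclear norm $\|X\|_* = \sum_i \sigma_i(X)$, the sum of the singular values of $X$.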

An Optimal Algorithm for Constrained Differentiable Convex Optimization

We describe three algorithms for solving differentiable convex optimization problems constrained to simple sets in $\mathbb R^n$, i.e., sets onto which it is easy to project an arbitrary point. The first two algorithms are optimal in the sense that they achieve an absolute precision of $\varepsilon$ in relation to the optimal value … Read more
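
As an illustration of what “simple” means here (generic examples, not necessarily the sets treated in the paper): for a box $[l,u]$ and a Euclidean ball $B(c,r)$ the projections have closed forms,
$$ \big(P_{[l,u]}(x)\big)_i = \min\{\max\{x_i,\, l_i\},\, u_i\}, \qquad P_{B(c,r)}(x) = c + r\,\frac{x-c}{\|x-c\|} \ \ \text{for } x \notin B(c,r), $$
so a single projection costs only $O(n)$ operations.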

Generalized Bundle Methods for Sum-Functions with “Easy” Components: Applications to Multicommodity Network Design

We propose a modification to the (generalized) bundle scheme for minimization of a convex nondifferentiable sum-function in the case where some of the components are “easy”, that is, they are Lagrangian functions of explicitly known convex programs with “few” variables and constraints. This happens in many practical cases, particularly within applications to combinatorial optimization. In … Read more
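
A schematic picture of the setting (with illustrative notation not taken from the paper): the objective is $f(\lambda) = \sum_k f_k(\lambda)$, and an “easy” component is one of the form
$$ f_k(\lambda) = \max_{x_k} \big\{ (c_k - A_k^{\top}\lambda)^{\top} x_k \ : \ B_k x_k \le d_k \big\}, $$
i.e., the Lagrangian function of an explicitly known convex program with few variables and constraints, which can therefore be represented exactly in the master problem rather than only through subgradient information.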

Integration formulas via the Legendre-Fenchel Subdifferential of nonconvex functions

Starting from explicit expressions for the subdifferential of the conjugate function, we establish, in the Banach space setting, some integration results for the so-called epi-pointed functions. These results use the $\varepsilon$-subdifferential and the Legendre-Fenchel subdifferential of an appropriate weak lower semicontinuous (lsc) envelope of the initial function. We apply these integration results to the construction … Read more
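
To fix notation, the standard objects involved are, for $f : X \to \mathbb R \cup \{+\infty\}$ on a Banach space $X$,
$$ f^*(x^*) = \sup_{x \in X} \{ \langle x^*, x\rangle - f(x) \}, \qquad \partial_\varepsilon f(x) = \{ x^* \in X^* : f(y) \ge f(x) + \langle x^*, y - x\rangle - \varepsilon \ \ \forall y \in X \}, $$
with $\partial f = \partial_0 f$; an integration result in this sense recovers $f$, up to an additive constant or a suitable envelope, from knowledge of such subdifferentials.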

Positive polynomials on unbounded equality-constrained domains

Certificates of non-negativity are fundamental tools in optimization. A “certificate” is generally understood as an expression that makes the non-negativity of the function in question evident. Classical certificates of non-negativity include Farkas' Lemma and the S-lemma. The lift-and-project procedure can be seen as a certificate of non-negativity for affine functions over the union of … Read more
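
For concreteness, here is one of the classical certificates mentioned, stated in its usual form: if $f, g$ are quadratic functions on $\mathbb R^n$ and $g(\bar x) > 0$ for some $\bar x$, then the S-lemma asserts that
$$ f(x) \ge 0 \ \text{ whenever } g(x) \ge 0 \qquad \Longleftrightarrow \qquad \exists\, \lambda \ge 0 \ \text{ such that } \ f(x) - \lambda g(x) \ge 0 \ \ \forall x \in \mathbb R^n, $$
the right-hand side being the “evident” expression that certifies non-negativity of $f$ on $\{x : g(x) \ge 0\}$.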

Dependence of bilevel programming on irrelevant data

In 1997, Macal and Hurter found that adding a constraint to the lower-level problem that is not active at the computed global optimal solution can destroy global optimality. In this paper, this property is reconsidered, and it is shown that the solution remains locally optimal under inner semicontinuity of the original solution set … Read more
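
For reference, the continuity notion invoked here is the standard one for set-valued maps: a mapping $M$ is inner semicontinuous at $(\bar x, \bar y)$ with $\bar y \in M(\bar x)$ if, for every sequence $x_k \to \bar x$, there exist $y_k \in M(x_k)$ (for all $k$ sufficiently large) with $y_k \to \bar y$.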

Implementation of a block-decomposition algorithm for solving large-scale conic semidefinite programming problems

In this paper, we consider block-decomposition first-order methods for solving large-scale conic semidefinite programming problems. Several ingredients are introduced to speed up the method in its pure form, such as: an aggressive choice of stepsize for performing the extragradient step; the use of scaled inner products in the primal and dual spaces; a dynamic update of the scaled … Read more
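
For context, the extragradient step whose stepsize is chosen aggressively has, in its generic form for a monotone operator $F$ over a feasible set $X$ (up to the scalings used in the paper), the familiar two-stage structure
$$ \tilde x = P_X\big(x - \lambda F(x)\big), \qquad x^+ = P_X\big(x - \lambda F(\tilde x)\big), $$
so each iteration costs two operator evaluations and two projections.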

An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and its Implications to Second-Order Methods

This paper presents an accelerated variant of the hybrid proximal extragradient (HPE) method for convex optimization, referred to as the accelerated HPE (A-HPE) method. Iteration-complexity results are established for the A-HPE method, as well as a special version of it, where a large stepsize condition is imposed. Two specific implementations of the A-HPE method are … Read more
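
As background (stated here in a common generic form; the accelerated variant developed in the paper is more elaborate): an HPE iteration for finding a zero of a maximal monotone operator $T$ chooses $\lambda > 0$, computes a triple $(y, v, \varepsilon)$ with $v \in T^{\varepsilon}(y)$ satisfying the relative error condition
$$ \|\lambda v + y - x\|^2 + 2\lambda\varepsilon \le \sigma^2\, \|y - x\|^2, \qquad \sigma \in [0,1), $$
and then performs the extragradient-type update $x^+ = x - \lambda v$.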