New Algorithms for Maximizing the Difference of Convex Functions

Maximizing the difference of two convex functions over a convex feasible set (the so-called DC problem) is a hard problem. A large number of publications address this problem, many of them variations of the widely used DCA algorithm [20]. The success of this algorithm in reaching a good approximation of a global …
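For orientation, here is a minimal sketch of the classical DCA iteration that such algorithms build on, written for the equivalent minimization form of a DC program (the toy objective and helper names below are assumptions for illustration, not the paper's method): to handle f = g − h with g, h convex, DCA linearizes h at the current iterate and solves the resulting convex subproblem.

```python
import numpy as np

def dca(x0, grad_h, argmin_linearized, tol=1e-8, max_iter=100):
    """Classical DCA for min f(x) = g(x) - h(x) with g, h convex.

    grad_h(x): a (sub)gradient of h at x.
    argmin_linearized(y): solves the convex subproblem argmin_x g(x) - <y, x>.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        y = grad_h(x)                  # linearize the concave part -h at x
        x_new = argmin_linearized(y)   # solve the convex subproblem
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Assumed toy DC program: g(x) = ||x||^2, h(x) = ||x||_1, so that
# f(x) = ||x||^2 - ||x||_1 has local minimizers with entries +-1/2.
grad_h = lambda x: np.sign(x)    # subgradient of the l1 norm
argmin_lin = lambda y: y / 2.0   # closed form for argmin ||x||^2 - <y, x>
print(dca([0.3, -0.7], grad_h, argmin_lin))   # -> [ 0.5 -0.5]
```

Each DCA iterate decreases f, but convergence is only to a critical point, which is why reaching a good approximation of a global solution is the delicate part.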

The Least Singular Value Function in Variational Analysis

Metric regularity is among the central concepts of nonlinear and variational analysis, constrained optimization, and their numerous applications. However, metric regularity can be elusive for some important ill-posed classes of problems including polynomial equations, parametric variational systems, smooth reformulations of complementarity systems with degenerate solutions, etc. The study of stability issues for such …
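As a concrete point of contact between the two concepts in the title (a classical fact, used here only for illustration): for a smooth map F, metric regularity at a point is equivalent to surjectivity of the Jacobian there, with regularity modulus equal to the reciprocal of its least singular value, so a vanishing least singular value is exactly the degeneracy at issue. A minimal numpy check on an assumed polynomial system:

```python
import numpy as np

# Assumed example: F(x1, x2) = (x1^2 - x2, x2^2), with the degenerate
# root (0, 0) of F(x) = 0.
def jacobian(x):
    return np.array([[2.0 * x[0], -1.0],
                     [0.0, 2.0 * x[1]]])

for point in ([1.0, 1.0], [0.0, 0.0]):
    sigma_min = np.linalg.svd(jacobian(point), compute_uv=False)[-1]
    # Metric regularity holds iff sigma_min > 0; the modulus is 1/sigma_min.
    print(point, "-> least singular value:", sigma_min)
```

At (0, 0) the least singular value is 0, so metric regularity fails there, which is the kind of ill-posed situation the abstract refers to.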

New Sufficient and Necessary Conditions for Constrained and Unconstrained Lipschitzian Error Bounds

Local error bounds play a fundamental role in mathematical programming and variational analysis. They are used, e.g., as constraint qualifications in optimization and in developing calculus rules for generalized derivatives in nonsmooth and set-valued analysis, and they serve as a key ingredient in the design and convergence analysis of Newton-type methods for solving systems of possibly …
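For readers outside the area, the prototypical unconstrained Lipschitzian error bound can be written as follows (a standard textbook formulation, not the paper's most general setting; F and the solution set S are generic notation assumed here):

```latex
% Local Lipschitzian error bound for S = F^{-1}(0) at a solution \bar{x}:
% the distance to the solution set is controlled by the residual norm.
\exists\, c > 0,\ \exists\, \varepsilon > 0 :\qquad
\operatorname{dist}(x, S) \le c\,\|F(x)\| \quad
\text{for all } x \text{ with } \|x - \bar{x}\| \le \varepsilon .
```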

On Sum-Rules for Second-Order Contingent Derivatives

We are concerned with contingent derivatives of set-valued mappings and their second-order counterparts (introduced by Ngai et al.). Special attention is given to the development of new sum-rules for second-order contingent derivatives. To be precise, we want to find conditions under which the second-order contingent derivative of the sum of a smooth and a set-valued …
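As background for the sum-rules (restated from the standard literature, not from the paper; the second-order construction of Ngai et al. refines the first-order notion below), the contingent derivative of a set-valued mapping F is defined through the contingent tangent cone to its graph:

```latex
% Contingent (Bouligand) tangent cone to a set C at z \in C:
T_C(z) = \{\, w : \exists\, t_k \downarrow 0,\ \exists\, w_k \to w
         \ \text{with}\ z + t_k w_k \in C \,\},
% contingent derivative of F at (\bar{x}, \bar{y}) \in \operatorname{gph} F:
DF(\bar{x}, \bar{y})(u) = \{\, v : (u, v) \in T_{\operatorname{gph} F}(\bar{x}, \bar{y}) \,\}.
```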

Constructing QCQP Instances Equivalent to Their SDP Relaxations

General quadratically constrained quadratic programs (QCQPs) are challenging to solve as they are known to be NP-hard. A popular approach to approximating QCQP solutions is to use semidefinite programming (SDP) relaxations. It is well-known that the optimal value η of the SDP relaxation problem bounds the optimal value ζ of the QCQP from below, i.e., …
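A minimal sketch of the standard (Shor) SDP relaxation behind η (assumed toy data; this is the generic lifting Y ≈ (1, x')'(1, x') with the rank-one condition dropped, not the paper's particular instances):

```python
import cvxpy as cp
import numpy as np

# Assumed toy QCQP: minimize x'Qx + q'x subject to x'x <= 1.
n = 2
Q = np.array([[0.0, 1.0], [1.0, 0.0]])   # indefinite, so the QCQP is nonconvex
q = np.array([1.0, -1.0])

# Embed objective and constraint into (n+1)x(n+1) matrices acting on
# the lifted variable Y ~ (1, x')'(1, x').
Qbar = np.block([[np.zeros((1, 1)), q[None, :] / 2],
                 [q[:, None] / 2, Q]])
Abar = np.block([[np.zeros((1, 1)), np.zeros((1, n))],
                 [np.zeros((n, 1)), np.eye(n)]])      # encodes x'x

Y = cp.Variable((n + 1, n + 1), symmetric=True)
prob = cp.Problem(cp.Minimize(cp.trace(Qbar @ Y)),
                  [Y >> 0, Y[0, 0] == 1, cp.trace(Abar @ Y) <= 1])
prob.solve()
print("SDP lower bound eta:", prob.value)   # eta <= zeta (QCQP optimum)
```

When the relaxation is exact, in the sense studied in the paper, η = ζ and an optimal Y of rank one recovers a QCQP solution.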

Variable Metric Proximal Stochastic Gradient Methods with Additional Sampling

Regularized empirical risk minimization problems arise in a variety of applications, including machine learning, signal processing, and image processing. Proximal stochastic gradient algorithms are a standard approach to solving these problems due to their low computational cost per iteration and relatively simple implementation. This paper introduces a class of proximal stochastic gradient methods built …
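For background, a generic fixed-metric sketch (not the paper's variable metric scheme with additional sampling): proximal stochastic gradient descent for ℓ1-regularized least squares alternates a minibatch gradient step with a soft-thresholding proximal step. Data and parameters below are assumed for illustration.

```python
import numpy as np

def prox_l1(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_sgd(A, b, lam, step=0.01, batch=8, iters=2000, seed=0):
    """Proximal SGD for min_x (1/2m)||Ax - b||^2 + lam * ||x||_1."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        idx = rng.choice(m, size=batch, replace=False)   # minibatch sample
        g = A[idx].T @ (A[idx] @ x - b[idx]) / batch     # stochastic gradient
        x = prox_l1(x - step * g, step * lam)            # forward-backward step
    return x

# Assumed synthetic data for illustration.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 20))
x_true = np.zeros(20); x_true[:3] = [2.0, -1.0, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(200)
print(prox_sgd(A, b, lam=0.1)[:5])   # first entries close to x_true
```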

Extended Triangle Inequalities for Nonconvex Box-Constrained Quadratic Programming

Let Box_n = {x ∈ R^n : 0 ≤ x ≤ e}, and let QPB_n denote the convex hull of {(1, x')'(1, x') : x ∈ Box_n}. The quadratic programming problem min{x'Qx + q'x : x ∈ Box_n}, where Q is not positive semidefinite (PSD), is equivalent to a linear optimization problem over QPB_n …
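For context, the classical triangle inequalities that the title's extended versions generalize can be stated as follows (recalled from the standard literature on QPB_n, with indices i < j < k and X the matrix part of a point of QPB_n; see the paper for the extended family):

```latex
% Classical triangle inequalities, valid for QPB_n (i < j < k):
X_{ij} + X_{ik} \le x_i + X_{jk}, \qquad
x_i + x_j + x_k \le 1 + X_{ij} + X_{ik} + X_{jk}.
% Validity on extreme points X = xx', x \in [0,1]^n, follows from
% 1 - (x_i + x_j + x_k) + x_i x_j + x_i x_k + x_j x_k
%   = (1 - x_i)(1 - x_j)(1 - x_k) + x_i x_j x_k \ \ge\ 0 .
```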

Sparse Polynomial Matrix Optimization

A polynomial matrix inequality is a statement that a symmetric polynomial matrix is positive semidefinite over a given constraint set. Polynomial matrix optimization concerns minimizing the smallest eigenvalue of a symmetric polynomial matrix subject to a tuple of polynomial matrix inequalities. This work explores the use of sparsity methods in reducing the complexity of sum-of-squares …
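In symbols, the problem just described can be written as follows (generic notation assumed here, with P and G_1, …, G_m symmetric polynomial matrices):

```latex
% Polynomial matrix optimization: minimize the smallest eigenvalue of P(x)
% over the set K defined by polynomial matrix inequalities:
\min_{x \in K} \ \lambda_{\min}\big(P(x)\big), \qquad
K = \{\, x \in \mathbb{R}^n : G_j(x) \succeq 0,\ j = 1, \dots, m \,\}.
% Equivalent formulation targeted by sum-of-squares relaxations:
\max \ \{\, \lambda \in \mathbb{R} : P(x) - \lambda I \succeq 0
      \ \text{for all } x \in K \,\}.
```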

Spectral Stochastic Gradient Method with Additional Sampling for Finite and Infinite Sums

In this paper, we propose a new stochastic gradient method for the numerical minimization of finite sums. We also propose a modified version of this method applicable to more general problems, referred to as infinite sum problems, where the objective function is in the form of a mathematical expectation. The method is based on a strategy to …
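For orientation, a minimal sketch of a spectral (Barzilai–Borwein) stepsize inside a plain stochastic gradient loop for a finite sum (the safeguard interval, toy data, and function names are assumptions; the paper's additional-sampling strategy is not reproduced here):

```python
import numpy as np

def spectral_sgd(grad_batch, x0, m, batch=8, iters=500, seed=0,
                 step0=0.1, step_min=1e-4, step_max=1.0):
    """SGD with a Barzilai-Borwein (spectral) stepsize.

    grad_batch(x, idx): stochastic gradient of (1/|idx|) sum_{i in idx} f_i(x).
    m: number of terms in the finite sum.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    x_prev, g_prev, step = None, None, step0
    for _ in range(iters):
        idx = rng.choice(m, size=batch, replace=False)
        g = grad_batch(x, idx)
        if x_prev is not None:
            s, y = x - x_prev, g - g_prev
            if abs(s @ y) > 1e-12:
                # BB1 stepsize, clipped to a safeguard interval.
                step = np.clip((s @ s) / (s @ y), step_min, step_max)
        x_prev, g_prev = x, g
        x = x - step * g
    return x

# Assumed toy finite sum: f_i(x) = 0.5 * (a_i'x - b_i)^2.
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 5)); b = A @ np.ones(5)
grad = lambda x, idx: A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)
print(spectral_sgd(grad, np.zeros(5), m=100))   # approaches the all-ones vector
```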

Global Optimization Algorithm through High-Resolution Sampling

We present an optimization algorithm that can identify a global minimum of a potentially nonconvex smooth function with high probability, assuming the Gibbs measure of the potential satisfies a logarithmic Sobolev inequality. Our contribution is twofold: on the one hand, we propose a global optimization method, which is built on an oracle sampling algorithm producing …
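A minimal sketch of the sampling idea (unadjusted Langevin dynamics targeting the Gibbs measure proportional to exp(−f/ε); the toy function and parameters are assumptions, and the paper's high-resolution oracle sampler and its guarantees are not reproduced here):

```python
import numpy as np

def langevin_sampler(grad_f, x0, eps=0.2, step=1e-3, iters=20000, seed=0):
    """Unadjusted Langevin dynamics: samples approximately from the Gibbs
    measure ~ exp(-f(x)/eps); for small eps the samples concentrate near
    global minimizers of f."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        noise = rng.standard_normal(x.shape)
        x = x - step * grad_f(x) + np.sqrt(2.0 * eps * step) * noise
    return x

# Assumed nonconvex test function: f(x) = (x^2 - 1)^2 + 0.3 x, whose global
# minimizer is near x = -1 (the linear term breaks the symmetry of the wells).
grad_f = lambda x: 4.0 * x * (x**2 - 1.0) + 0.3
print(langevin_sampler(grad_f, x0=[0.0]))   # typically lands near -1.0
```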