Copositivity detection by difference-of-convex decomposition and omega-subdivision

We present three new copositivity tests based upon difference-of-convex (d.c.) decompositions, and combine them into a branch-and-bound algorithm of $\omega$-subdivision type. The tests employ LP or convex QP techniques, but can also be used heuristically with appropriate test points. We also discuss the selection of efficient d.c.~decompositions and propose some preprocessing ideas based on the … Read more
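For orientation (notation ours, not quoted from the abstract): a symmetric matrix $Q$ is copositive if $x^\top Q x \ge 0$ for all $x \ge 0$, and a d.c. decomposition writes the quadratic form as a difference of convex quadratics,
\[
Q = P - N, \qquad P \succeq 0, \ N \succeq 0,
\]
a simple (though not necessarily efficient) choice being $P = Q + \rho I$ and $N = \rho I$ with $\rho \ge \max\{0, -\lambda_{\min}(Q)\}$; the selection of better decompositions is one of the topics of the paper.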

$L^p$-norms, log-barriers and Cramer transform in optimization

We show that the Laplace approximation of a supremum by $L^p$-norms has interesting consequences in optimization. For instance, the logarithmic barrier functions (LBF) of a primal convex problem $P$ and its dual $P^*$ appear naturally when using this simple approximation technique for the value function $g$ of $P$ or its Legendre-Fenchel conjugate $g^*$. In addition, … Read more
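For orientation (an illustrative identity in our notation, not quoted from the paper): the approximation of a supremum by $L^p$-norms referred to here is of the type
\[
\sup_{x \in K} f(x) \;=\; \lim_{p\to\infty} \Big( \int_K f(x)^p \, dx \Big)^{1/p},
\]
valid for continuous $f \ge 0$ on a compact set $K$ of positive finite Lebesgue measure; the log-exponential (Laplace) variant $\tfrac1p \log \int_K e^{p f(x)}\,dx \to \sup_K f$ removes the sign restriction and connects to the Cramer transform.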

A joint+marginal approach to parametric polynomial optimization

Given a compact parameter set $Y\subset R^p$, we consider polynomial optimization problems $(P_y)$ on $R^n$ whose description depends on the parameter $y\in Y$. We assume that one can compute all moments of some probability measure $\varphi$ on $Y$, absolutely continuous with respect to the Lebesgue measure (e.g. $Y$ is a box or a simplex and … Read more
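For context (our notation; the truncated abstract does not display it): the central object in such parametric problems is the optimal-value function
\[
J(y) \;=\; \inf_{x \in R^n} \{\, f(x,y) \;:\; x \in K(y) \,\}, \qquad y \in Y,
\]
and the moments of $\varphi$ serve to build computable polynomial approximations of $J$ on $Y$, e.g. in the $L^1(\varphi)$ sense.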

A Robust Implementation of a Sequential Quadratic Programming Algorithm with Successive Error Restoration

We consider sequential quadratic programming (SQP) methods for solving constrained nonlinear programming problems. It is generally believed that SQP methods are sensitive to the accuracy with which partial derivatives are provided. One reason is that differences of gradients of the Lagrangian function are used for updating a quasi-Newton matrix, e.g., by the BFGS formula. The … Read more
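To make the sensitivity issue concrete (a standard BFGS formula in our notation, not quoted from the abstract): with Lagrangian $L(x,u)=f(x)+u^\top g(x)$, the quasi-Newton matrix $B_k$ is updated from the differences
\[
s_k = x_{k+1}-x_k, \qquad y_k = \nabla_x L(x_{k+1},u_{k+1}) - \nabla_x L(x_k,u_{k+1}),
\]
\[
B_{k+1} = B_k - \frac{B_k s_k s_k^\top B_k}{s_k^\top B_k s_k} + \frac{y_k y_k^\top}{s_k^\top y_k},
\]
so gradient errors enter $y_k$ directly and can spoil the curvature condition $s_k^\top y_k > 0$.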

A Fast Algorithm for Total Variation Image Reconstruction from Random Projections

Total variation (TV) regularization is popular in image restoration and reconstruction due to its ability to preserve image edges. To date, most research activity on TV models has concentrated on image restoration from blurry and noisy observations, while image reconstruction from random projections has received comparatively little attention. In this paper, we propose, analyze, and test … Read more
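For reference (our notation; the truncated abstract stops before the model is stated): a typical TV reconstruction model from projections has the form
\[
\min_{u}\ \mathrm{TV}(u) + \frac{\mu}{2}\,\|Au - b\|_2^2,
\]
with $A$ the random projection matrix, $b$ the observed measurements, $\mathrm{TV}(u)=\sum_i \|D_i u\|$ the discrete total variation, and $\mu>0$ a fidelity weight.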

Stability of error bounds for convex constraint systems in Banach spaces

This paper studies the stability of error bounds for convex constraint systems in Banach spaces. We show that certain known sufficient conditions for local and global error bounds actually ensure error bounds for whole families of functions that are, in a sense, small perturbations of the given one. A single inequality as well as semi-infinite constraint systems … Read more
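To fix terminology (standard definition in our notation, not quoted from the paper): for a convex inequality $f(x)\le 0$ with solution set $S\ne\emptyset$, a global error bound with modulus $c>0$ is the estimate
\[
d(x,S) \;\le\; c\,[f(x)]_+ \qquad \text{for all } x,
\]
where $[t]_+=\max\{t,0\}$ (the local version requires it only near a reference point of $S$); stability asks that such an estimate survive small perturbations of $f$.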

A Feasible Directions Method for Nonsmooth Convex Optimization

We propose a new technique for the minimization of convex functions that are not necessarily smooth. Our approach employs an equivalent constrained optimization problem and approximate linear programs obtained with cutting planes. At each iteration a search direction and a step length are computed. If the step length is considered “non-serious”, a cutting plane is added and … Read more
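As a sketch of the construction mentioned (our rendering of a standard reformulation, not necessarily the authors' exact model): minimizing a convex $f$ is equivalent to the constrained problem
\[
\min_{(x,z)} \ z \qquad \text{s.t.} \qquad f(x) \le z,
\]
whose constraint is approximated by cutting planes $f(x_k) + s_k^\top(x - x_k) \le z$, $s_k \in \partial f(x_k)$, yielding the linear programs from which search directions can be computed.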

Experimental analysis of heuristics for the bottleneck traveling salesman problem

In this paper we develop efficient heuristic algorithms to solve the bottleneck traveling salesman problem (BTSP). Results of extensive computational experiments are reported. Our heuristics produced optimal solutions for all the test problems considered from TSPLIB, JM-instances, National TSP instances, and VLSI TSP instances in very reasonable running time. We also conducted experiments with specially … Read more
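For readers unfamiliar with the problem (definition only, not from the abstract): given edge costs $c_{ij}$, the BTSP asks for a Hamiltonian tour whose largest edge cost is smallest,
\[
\min_{H \in \mathcal{H}} \ \max_{(i,j)\in H} c_{ij},
\]
where $\mathcal{H}$ denotes the set of Hamiltonian tours; a common algorithmic device is to threshold the edge costs and test the resulting subgraphs for Hamiltonian cycles.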

Fenchel Decomposition for Stochastic Mixed-Integer Programming

This paper introduces a new cutting plane method for two-stage stochastic mixed-integer programming (SMIP) called Fenchel decomposition (FD). FD uses a class of valid inequalities, termed FD cuts, which are derived from Fenchel cutting planes in integer programming. First, we derive FD cuts based on both the first- and second-stage variables, and devise an … Read more
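To recall the flavor of the underlying Fenchel cutting planes (a standard separation problem in our notation, not quoted from the paper): given a point $\bar x$ to be separated from a mixed-integer set $X$, one solves
\[
\max_{\pi \in \Pi} \ \Big\{ \pi^\top \bar x - \max_{x \in X} \pi^\top x \Big\}
\]
over a suitable normalization set $\Pi$; a positive optimal value certifies the valid inequality $\pi^\top x \le \max_{x\in X}\pi^\top x$ violated by $\bar x$.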

Convergence to the optimal value for barrier methods combined with Hessian Riemannian gradient flows and generalized proximal algorithms

We consider the problem $\min_{x\in\R^n}\{f(x)\mid Ax=b, \ x\in\overline{C},\ g_j(x)\le0,\ j=1,\ldots,s\}$, where $b\in\R^m$, $A\in\R^{m\times n}$ is a full-rank matrix, $\overline{C}$ is the closure of a nonempty, open and convex subset $C$ of $\R^n$, and $g_j(\cdot)$, $j=1,\ldots,s$, are nonlinear convex functions. Our strategy consists, first, in introducing a barrier-type penalty for the constraints $g_j(x)\le0$, … Read more
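To indicate the type of scheme meant (a schematic rendering in our notation, not the authors' precise algorithm): the constraints $g_j(x)\le 0$ are absorbed into a penalty of barrier type,
\[
\min_{x}\ f(x) + r\sum_{j=1}^{s}\beta\big(g_j(x)\big) \qquad \text{s.t.}\qquad Ax=b,\ x\in \overline{C},
\]
with a barrier kernel $\beta$ (e.g. $\beta(t)=-\log(-t)$ for $t<0$) and parameter $r\downarrow 0$, the resulting subproblems being handled by Hessian Riemannian gradient flows or generalized proximal iterations adapted to $C$.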