Properties of a Cutting Plane Method for Semidefinite Programming

We analyze the properties of an interior point cutting plane algorithm that is based on a semi-infinite linear formulation of the dual semidefinite program. The cutting plane algorithm approximately solves a linear relaxation of the dual semidefinite program in every iteration and relies on a separation oracle that returns linear cutting planes. We show that … Read more
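As a rough sketch of the loop described above (not the authors' implementation), the code below treats a dual SDP of the assumed form $\max\, b^T y$ s.t. $C - \sum_i y_i A_i \succeq 0$ through its semi-infinite LP formulation, re-solving an LP relaxation and adding one eigenvector cut per iteration; the box bounds, tolerance, and example data are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

def cutting_plane_dual_sdp(C, A, b, y_box=10.0, tol=1e-6, max_iters=200):
    """Approximately solve  max b^T y  s.t.  C - sum_i y_i A_i is PSD
    via its semi-infinite LP form  v^T (C - sum_i y_i A_i) v >= 0 for all unit v,
    adding one eigenvector cut per iteration."""
    m = len(A)
    cuts_A, cuts_b = [], []          # accumulated linear cuts: cuts_A @ y <= cuts_b
    bounds = [(-y_box, y_box)] * m   # box keeps the LP relaxation bounded
    y = np.zeros(m)
    for _ in range(max_iters):
        res = linprog(c=-b,
                      A_ub=np.array(cuts_A) if cuts_A else None,
                      b_ub=np.array(cuts_b) if cuts_b else None,
                      bounds=bounds, method="highs")
        y = res.x
        S = C - sum(y[i] * A[i] for i in range(m))
        w, V = np.linalg.eigh(S)
        if w[0] >= -tol:             # LP optimum is (nearly) dual-SDP feasible
            return y
        v = V[:, 0]                  # eigenvector of the most negative eigenvalue
        # violated cut: v^T S(y) v >= 0  <=>  sum_i (v^T A_i v) y_i <= v^T C v
        cuts_A.append([float(v @ A[i] @ v) for i in range(m)])
        cuts_b.append(float(v @ C @ v))
    return y

# tiny example: max y  s.t.  [[1, y], [y, 1]] is PSD  (optimal y = 1)
C = np.eye(2)
A = [np.array([[0.0, -1.0], [-1.0, 0.0]])]   # C - y*A = [[1, y], [y, 1]]
print(cutting_plane_dual_sdp(C, A, np.array([1.0])))
```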

Graph Implementations for Nonsmooth Convex Programs

We describe graph implementations, a generic method for representing a convex function via its epigraph, described in a disciplined convex programming framework. This simple and natural idea allows a very wide variety of smooth and nonsmooth convex programs to be easily specified and efficiently solved, using interior-point methods for smooth or cone convex programs. Citation: To … Read more
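To make the epigraph idea concrete, the sketch below (hand-written, not the disciplined convex programming software itself) recasts the nonsmooth problem $\min \|Ax-b\|_1$ as an LP by introducing one epigraph variable per absolute value; the data and solver choice are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

def l1_residual_min(A, b):
    """Minimize ||A x - b||_1 by modeling each |(Ax - b)_i| through its epigraph:
    introduce t with  -t <= A x - b <= t  and minimize sum(t)."""
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(m)])   # objective: sum of t
    A_ub = np.block([[ A, -np.eye(m)],              #  A x - t <=  b
                     [-A, -np.eye(m)]])             # -A x - t <= -b
    b_ub = np.concatenate([b, -b])
    bounds = [(None, None)] * n + [(0, None)] * m
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:n]

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
b = rng.standard_normal(20)
print(l1_residual_min(A, b))
```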

Local convergence for alternating and averaged nonconvex projections

The idea of a finite collection of closed sets having “strongly regular intersection” at a given point is crucial in variational analysis. We show that this central theoretical tool also has striking algorithmic consequences. Specifically, we consider the case of two sets, one of which we assume to be suitably “regular” (special cases being convex … Read more
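A minimal sketch of the two projection schemes on an illustrative pair of sets, one nonconvex (the unit sphere) and one affine; the sets, starting point, and iteration counts are assumptions made for the example, not taken from the paper.

```python
import numpy as np

def alternating_projections(proj_A, proj_B, x0, iters=100):
    """Alternate x <- P_A(P_B(x)); near a strongly regular intersection this
    is the local behavior the paper analyzes."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = proj_A(proj_B(x))
    return x

def averaged_projections(proj_A, proj_B, x0, iters=200):
    """Averaged variant: x <- (P_A(x) + P_B(x)) / 2."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = 0.5 * (proj_A(x) + proj_B(x))
    return x

# example sets (illustrative): the unit sphere (nonconvex) and the line x2 = 0.5
proj_sphere = lambda x: x / np.linalg.norm(x)
def proj_line(x):
    y = x.copy(); y[1] = 0.5
    return y

print(alternating_projections(proj_sphere, proj_line, np.array([0.9, 0.1])))
print(averaged_projections(proj_sphere, proj_line, np.array([0.9, 0.1])))
```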

A secant method for nonsmooth optimization

The notion of a secant for locally Lipschitz continuous functions is introduced and a new algorithm to locally minimize nonsmooth, nonconvex functions based on secants is developed. We demonstrate that the secants can be used to design an algorithm to find descent directions of locally Lipschitz continuous functions. This algorithm is applied to design a … Read more
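The paper's secant construction is not reproduced here; as a generic illustration of searching for a descent direction of a locally Lipschitz function, the sketch below compares difference quotients (secant slopes) along randomly sampled directions and steps along the best one. The test function, step lengths, and sampling scheme are all illustrative assumptions.

```python
import numpy as np

def secant_slope(f, x, d, h):
    """Difference quotient of f along direction d: (f(x + h d) - f(x)) / h."""
    return (f(x + h * d) - f(x)) / h

def descent_step(f, x, h=1e-3, n_dirs=50, rng=np.random.default_rng(0)):
    """Pick the sampled unit direction with the most negative secant slope
    and take a step only if it actually decreases f (simple backtracking)."""
    dirs = rng.standard_normal((n_dirs, x.size))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    slopes = np.array([secant_slope(f, x, d, h) for d in dirs])
    d = dirs[np.argmin(slopes)]
    t = 1.0
    while t > 1e-10 and f(x + t * d) >= f(x):
        t *= 0.5
    return x + t * d if f(x + t * d) < f(x) else x

# nonsmooth, nonconvex test function (illustrative)
f = lambda x: np.abs(x[0] - 1.0) + 0.5 * np.abs(x[1]) + 0.1 * np.sin(x[0])
x = np.array([3.0, -2.0])
for _ in range(200):
    x = descent_step(f, x)
print(x, f(x))
```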

Another Face of DIRECT

It is shown that, contrary to a claim of [D. E. Finkel and C. T. Kelley, Additive scaling and the DIRECT algorithm, J. Glob. Optim. 36 (2006) 597-608], it is possible to divide the smallest hypercube which contains the lowest function value by considering hyperrectangles whose points are located on the diagonal of the center … Read more
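For context only, the sketch below shows the standard DIRECT-style trisection of a hyperrectangle along a longest side; it is not the alternative division along the diagonal discussed in the note.

```python
import numpy as np

def trisect_longest(lo, hi):
    """Split the box [lo, hi] into three equal boxes along its longest side,
    as in the standard DIRECT division step (shown for context only)."""
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    i = int(np.argmax(hi - lo))                 # longest coordinate direction
    w = (hi[i] - lo[i]) / 3.0
    boxes = []
    for k in range(3):
        l, h = lo.copy(), hi.copy()
        l[i] = lo[i] + k * w
        h[i] = lo[i] + (k + 1) * w
        boxes.append((l, h))
    return boxes

for l, h in trisect_longest([0.0, 0.0], [1.0, 0.5]):
    print(l, h, "center:", (l + h) / 2)
```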

A Fixed-Point Continuation Method for l_1-Regularized Minimization with Applications to Compressed Sensing

We consider solving minimization problems with $\ell_1$-regularization: $$\min \|x\|_1 + \mu f(x),$$ particularly for $f(x) = \frac{1}{2}\|Ax-b\|_M^2$ where $A \in \mathbb{R}^{m \times n}$ with $m < n$. Our goal is to construct efficient and robust algorithms for solving large-scale problems with dense data, and our approach is based on two powerful algorithmic ideas: operator-splitting and … Read more
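The operator-splitting (forward-backward) fixed point for this objective is a soft-thresholded gradient step; the sketch below pairs it with a simple geometric continuation on $\mu$, taking $M = I$. The step size rule, continuation schedule, and problem sizes are illustrative assumptions, not the authors' tuned code.

```python
import numpy as np

def shrink(z, t):
    """Soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def fpc_l1(A, b, mu_target, n_stages=6, inner_iters=300):
    """Sketch of fixed-point (forward-backward) iterations
        x <- shrink(x - tau * mu * A^T (A x - b), tau)
    for  min ||x||_1 + mu * 0.5 * ||A x - b||^2,
    with geometric continuation on mu (illustrative schedule)."""
    m, n = A.shape
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of A^T A
    x = np.zeros(n)
    mus = mu_target * np.logspace(-(n_stages - 1), 0, n_stages, base=2.0)
    for mu in mus:                             # warm-start each stage
        tau = 1.0 / (mu * L)                   # safe step size for this mu
        for _ in range(inner_iters):
            x = shrink(x - tau * mu * A.T @ (A @ x - b), tau)
    return x

# compressed-sensing style example (illustrative sizes)
rng = np.random.default_rng(1)
A = rng.standard_normal((64, 256)) / np.sqrt(64)
x_true = np.zeros(256)
x_true[rng.choice(256, 8, replace=False)] = rng.standard_normal(8)
b = A @ x_true
x_hat = fpc_l1(A, b, mu_target=1000.0)
print(np.linalg.norm(x_hat - x_true))
```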

Max-min separability: incremental approach and application to supervised data classification

A new algorithm for the computation of a piecewise linear function separating two finite point sets in $n$-dimensional space is developed and applied to solve supervised data classification problems. The algorithm computes hyperplanes incrementally, finding as many hyperplanes as necessary to separate the two sets with respect to some tolerance. An … Read more
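For reference, a max-min piecewise linear separator classifies a point by the sign of $\max_j \min_i (w_{ji} \cdot x - \gamma_{ji})$; the sketch below only evaluates and applies such a function for given groups of hyperplanes and does not reproduce the incremental algorithm that computes them. Names and data are illustrative.

```python
import numpy as np

def maxmin_value(x, W, G):
    """Evaluate the max-min piecewise linear function
        f(x) = max_j min_i ( W[j][i] . x - G[j][i] )
    defined by groups of hyperplanes (W[j], G[j])."""
    return max(min(w @ x - g for w, g in zip(Wj, Gj)) for Wj, Gj in zip(W, G))

def classify(points, W, G):
    """Assign class A if f(x) < 0, class B otherwise (sign convention is a choice)."""
    return ["A" if maxmin_value(p, W, G) < 0 else "B" for p in points]

# two illustrative groups of hyperplanes in the plane
W = [[np.array([1.0, 0.0])], [np.array([0.0, 1.0]), np.array([-1.0, 0.0])]]
G = [[2.0], [1.0, -3.0]]
print(classify([np.array([0.5, 0.5]), np.array([2.5, 3.0])], W, G))
```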

Necessary optimality condition for Nonsmooth Switching Control problem

This paper is concerned with a class of nonsmooth optimal switching control problems. Both the switching instants and the control function are to be chosen such that the cost functional is minimized. The necessary optimality conditions are derived by means of the normal cone and the Dubovitskii-Milyutin theory. Citation: Technical report 2007-3. Submitted to the journal … Read more

The Speed of Shor’s R-Algorithm

Shor’s r-algorithm is an iterative method for unconstrained optimization, designed for minimizing nonsmooth functions, and its reported success has been considerable. Although some limited convergence results are known, nothing seems to be known about the algorithm’s rate of convergence, even in the smooth case. We study how the method behaves on convex quadratics, proving … Read more
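One common presentation of Shor's r-algorithm steps along $-B_k B_k^T g_k$ and then dilates the metric $B_k$ along the normalized difference of two successive gradients; the sketch below runs that update on an ill-conditioned convex quadratic with exact line search, matching the setting studied here. The dilation parameter, initialization, and bookkeeping details are assumptions that vary between presentations.

```python
import numpy as np

def shor_r_quadratic(A_mat, x0, beta=0.5, iters=40):
    """Shor-style r-algorithm on f(x) = 0.5 x^T A x with exact line search:
    move along -B B^T g, then dilate B along the normalized difference of
    successive gradients."""
    x = np.asarray(x0, float)
    n = x.size
    B = np.eye(n)
    g = A_mat @ x                          # gradient of the quadratic
    history = [0.5 * x @ A_mat @ x]
    for _ in range(iters):
        d = -B @ (B.T @ g)                 # search direction in original space
        denom = d @ A_mat @ d
        if denom <= 1e-16:
            break
        t = -(g @ d) / denom               # exact line search on the quadratic
        x = x + t * d
        g_new = A_mat @ x
        r = B.T @ (g_new - g)
        nr = np.linalg.norm(r)
        if nr <= 1e-16:
            break
        xi = r / nr
        B = B @ (np.eye(n) + (beta - 1.0) * np.outer(xi, xi))   # space dilation
        g = g_new
        history.append(0.5 * x @ A_mat @ x)
    return x, history

A_mat = np.diag([1.0, 10.0, 100.0])        # ill-conditioned convex quadratic
x, hist = shor_r_quadratic(A_mat, x0=np.ones(3))
print(hist[-1])                            # function value after the run
```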

Approximate Primal Solutions and Rate Analysis in Dual Subgradient Methods

We study primal solutions obtained as a by-product of subgradient methods when solving the Lagrangian dual of a primal convex constrained optimization problem (possibly nonsmooth). The existing literature on the use of subgradient methods for generating primal optimal solutions is limited to methods that produce such solutions only asymptotically (i.e., in the limit as the … Read more
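A minimal sketch of the idea on a toy problem, $\min x^2$ s.t. $x \ge 1$: a projected dual subgradient step is taken at each iteration, and an approximate primal solution is recovered by averaging the Lagrangian minimizers along the way. The toy problem and step size are illustrative.

```python
# toy convex problem (illustrative):  min x^2  s.t.  g(x) = 1 - x <= 0
# Lagrangian L(x, lam) = x^2 + lam*(1 - x); its minimizer in x is x(lam) = lam/2.

def dual_subgradient_with_averaging(iters=2000, step=0.05):
    lam = 0.0
    x_avg = 0.0
    for k in range(1, iters + 1):
        x = lam / 2.0                          # primal minimizer of the Lagrangian
        x_avg += (x - x_avg) / k               # running average of primal iterates
        subgrad = 1.0 - x                      # subgradient of the dual at lam
        lam = max(0.0, lam + step * subgrad)   # projected dual subgradient ascent
    return x_avg, lam

x_avg, lam = dual_subgradient_with_averaging()
print(x_avg, lam)   # approaches the primal optimum x* = 1 and multiplier lam* = 2
```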