Convergence of a hybrid projection-proximal point algorithm coupled with approximation methods in convex optimization

In order to minimize a closed convex function that is approximated by a sequence of better-behaved functions, we investigate the global convergence of a generic diagonal hybrid algorithm, which consists of an inexact relaxed proximal point step followed by a suitable orthogonal projection onto a hyperplane. The latter permits us to consider a fixed relative …
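
As a rough illustration of the step structure the abstract describes, here is a minimal Python sketch of one (here exact, rather than inexact) proximal point step followed by an orthogonal projection onto a hyperplane defined by the proximal residual. The quadratic toy objective, step length, and stopping rule are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def hybrid_prox_projection(prox_f, x0, lam=1.0, tol=1e-8, max_iter=500):
    """Sketch of a hybrid proximal-point / hyperplane-projection iteration.

    y is a proximal step from x (a stand-in for the inexact step); the next
    iterate is the orthogonal projection of x onto the hyperplane
    {z : <v, z - y> = 0}, where v = (x - y)/lam lies in the
    subdifferential of f at y.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        y = prox_f(x, lam)            # proximal point step
        v = (x - y) / lam             # subgradient of f at y
        if np.linalg.norm(v) < tol:   # y is (nearly) a minimizer
            return y
        # orthogonal projection of x onto {z : <v, z - y> = 0}
        x = x - (v @ (x - y)) / (v @ v) * v
    return x

# toy example: f(x) = 0.5*||x - c||^2 has a closed-form prox
c = np.array([1.0, -2.0, 3.0])
prox = lambda x, lam: (x + lam * c) / (1.0 + lam)
print(hybrid_prox_projection(prox, np.zeros(3)))  # converges to c
```

With an exact prox the projection lands exactly on y; the projection step matters when the proximal subproblem is only solved approximately.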

Sums of Squares and Semidefinite Programming Relaxations for Polynomial Optimization Problems with Structured Sparsity

Unconstrained and inequality-constrained sparse polynomial optimization problems (POPs) are considered. A correlative sparsity pattern graph is defined to find a certain sparse structure in the objective and constraint polynomials of a POP. Based on this graph, sets of supports for sums of squares (SOS) polynomials that lead to efficient SOS and semidefinite programming (SDP) …
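
To make the "correlative sparsity pattern graph" concrete, the hypothetical sketch below links two variables whenever they appear together in a monomial of the objective or in the same constraint; the cliques of (a chordal extension of) this graph then index the smaller SOS blocks. Representing polynomials by their monomial supports is an assumption made for brevity.

```python
import itertools
from collections import defaultdict

def csp_graph(objective_monomials, constraint_variable_sets):
    """Build the correlative sparsity pattern graph as an adjacency dict.

    objective_monomials: iterable of tuples of variable indices that
        appear together in one monomial of the objective.
    constraint_variable_sets: iterable of sets, one per constraint,
        holding all variables that constraint involves.
    """
    adj = defaultdict(set)
    groups = list(objective_monomials) + [tuple(s) for s in constraint_variable_sets]
    for group in groups:
        for u, v in itertools.combinations(sorted(set(group)), 2):
            adj[u].add(v)
            adj[v].add(u)
    return dict(adj)

# toy POP: minimize x0*x1 + x1*x2 + x3**2 subject to a constraint in {x2, x3}
g = csp_graph(objective_monomials=[(0, 1), (1, 2), (3,)],
              constraint_variable_sets=[{2, 3}])
print(g)  # edges x0-x1, x1-x2, x2-x3: a chordal path with small cliques
```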

Jordan-algebraic aspects of nonconvex optimization over symmetric cones

We illustrate the usefulness of Jordan-algebraic techniques for nonconvex optimization by considering a potential-reduction algorithm for a nonconvex quadratic function over the domain obtained as the intersection of a symmetric cone with an affine subspace. Citation: Preprint, September 2004.
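
For readers unfamiliar with the machinery, the sketch below shows the Jordan product underlying the second-order cone, one of the standard symmetric cones; this is textbook Euclidean Jordan algebra offered as background, not code from the paper.

```python
import numpy as np

def jordan_product(x, y):
    """Jordan product on R^n associated with the second-order cone:
    for x = (x0, xbar) and y = (y0, ybar),
        x o y = (x^T y, x0*ybar + y0*xbar).
    The product is commutative but not associative, which is why
    Jordan-algebraic rather than ordinary matrix tools are used.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.concatenate(([x @ y], x[0] * y[1:] + y[0] * x[1:]))

x = np.array([2.0, 1.0, 0.0])
e = np.array([1.0, 0.0, 0.0])   # identity element of this algebra
assert np.allclose(jordan_product(x, e), x)
print(jordan_product(x, x))     # the "square" x o x = (||x||^2, 2*x0*xbar)
```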

Optimal distance separating halfspace

One recently proposed criterion to separate two datasets in discriminant analysis is to use a hyperplane which minimises the sum of distances to it from all the misclassified data points. Here all distances are supposed to be measured by way of some fixed norm, while misclassification means lying on the wrong side of the hyperplane, or …
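
As a concrete reading of the criterion, here is a small sketch that evaluates and numerically minimises the sum of Euclidean distances from misclassified points to the hyperplane {x : w.x = b}. The Euclidean norm, the synthetic data, and the use of a general-purpose optimiser are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import minimize

def misclassified_distance_sum(params, X, y):
    """Sum of Euclidean distances to the hyperplane w.x = b over the
    misclassified points, i.e. those with y_i * (w.x_i - b) < 0."""
    w, b = params[:-1], params[-1]
    margins = y * (X @ w - b)
    # a misclassified point's distance to the hyperplane is |margin|/||w||
    return np.maximum(-margins, 0.0).sum() / np.linalg.norm(w)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.repeat([-1.0, 1.0], 50)

res = minimize(misclassified_distance_sum, x0=np.array([1.0, 0.0, 0.0]),
               args=(X, y), method="Nelder-Mead")
print(res.x)  # (w, b) of a locally optimal separating hyperplane
```

Note the objective is invariant to rescaling of (w, b), so only the hyperplane's orientation and offset matter.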

Optimal expected-distance separating halfspace

One recently proposed criterion to separate two datasets in discriminant analysis is to use a hyperplane which minimises the sum of distances to it from all the misclassified data points. Here all distances are supposed to be measured by way of some fixed norm, while misclassification means lying on the wrong side of the hyperplane, …
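
For the expected-distance variant of the criterion, where the data point is random, the objective can be approximated by sampling. The sketch below estimates the expected misclassification distance for one Gaussian class by Monte Carlo; the distribution and sample size are illustrative assumptions.

```python
import numpy as np

def expected_misclassification_distance(w, b, label, mean, cov,
                                        n_samples=100_000, seed=0):
    """Monte Carlo estimate of E[ dist(X, hyperplane) * 1{X misclassified} ]
    for X ~ N(mean, cov) carrying the given +/-1 label."""
    rng = np.random.default_rng(seed)
    X = rng.multivariate_normal(mean, cov, size=n_samples)
    margins = label * (X @ w - b)
    return np.maximum(-margins, 0.0).mean() / np.linalg.norm(w)

w, b = np.array([1.0, 1.0]), 0.0
print(expected_misclassification_distance(w, b, +1.0,
                                          mean=[1.0, 1.0], cov=np.eye(2)))
```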

A Piecewise Linearization Framework for Retail Shelf Space Management Models

Managing shelf space is critical for retailers to attract customers and to optimize profit. This paper develops a shelf space allocation optimization model that explicitly incorporates essential in-store costs and considers space- and cross-elasticities. The resultant model maximizes a signomial objective function over linear and bilinear constraints in mixed-integer variables. We propose a piecewise linearization …
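
The abstract's key device, piecewise linearization, can be illustrated on a single nonlinear term: the sketch below approximates a signomial-style space-elasticity term s**beta by linear segments between breakpoints, the kind of substitution that renders such a model mixed-integer linear. The breakpoints, exponent, and term are illustrative assumptions.

```python
import numpy as np

def piecewise_linear(breakpoints, values, s):
    """Evaluate the piecewise-linear interpolant through the points
    (breakpoints[i], values[i]) at s (breakpoints assumed sorted)."""
    return np.interp(s, breakpoints, values)

# approximate the space-elasticity term f(s) = s**0.7 on [1, 10]
beta = 0.7
bp = np.linspace(1.0, 10.0, 6)   # 5 linear segments
vals = bp ** beta
for s in (2.5, 7.3):
    print(s, s ** beta, piecewise_linear(bp, vals, s))  # exact vs. PWL value
```

In a solver model, the same interpolation is expressed with SOS2 or binary variables so the segment choice becomes part of the optimization.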

A Branch-Reduce-Cut Algorithm for the Global Optimization of Probabilistically Constrained Linear Programs

We consider probabilistically constrained linear programs with general distributions for the uncertain parameters. These problems generally involve non-convex feasible sets. We develop a branch and bound algorithm that searches for a global solution to this problem by successively partitioning the non-convex feasible region and by using bounds on the objective function to fathom inferior partitions. …
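
The partition/bound/fathom loop the abstract describes can be summarized by a generic branch-and-bound skeleton. The sketch below minimizes a one-dimensional nonconvex function by interval bisection with a Lipschitz lower bound; the bounding rule and the toy function are assumptions standing in for the paper's probabilistic-constraint machinery.

```python
import heapq
import math

def branch_and_bound(f, lo, hi, lipschitz, tol=1e-4):
    """Minimize f on [lo, hi]. An interval's lower bound
    f(mid) - L * width / 2 lets us fathom regions that cannot
    beat the incumbent; surviving intervals are bisected."""
    best_x, best_val = lo, f(lo)
    mid0 = (lo + hi) / 2
    heap = [(f(mid0) - lipschitz * (hi - lo) / 2, lo, hi)]
    while heap:
        bound, a, b = heapq.heappop(heap)
        if bound > best_val - tol:       # fathom: cannot improve incumbent
            continue
        mid = (a + b) / 2
        if f(mid) < best_val:            # update incumbent
            best_x, best_val = mid, f(mid)
        for aa, bb in ((a, mid), (mid, b)):   # branch into two children
            m = (aa + bb) / 2
            heapq.heappush(heap, (f(m) - lipschitz * (bb - aa) / 2, aa, bb))
    return best_x, best_val

f = lambda x: math.sin(3 * x) + 0.1 * x * x
print(branch_and_bound(f, -3.0, 3.0, lipschitz=4.0))
```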

Constrained Global Optimization with Radial Basis Functions

Response surface methods show promising results for global optimization of costly nonconvex objective functions, i.e., the problem of finding the global minimum when there are several local minima and each function value takes considerable CPU time to compute. Such problems often arise in industrial and financial applications, where a function value could be a …
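
A minimal version of the response-surface idea: fit a radial basis function interpolant to a handful of expensive evaluations, then optimise the cheap surrogate to propose the next sample point. The sketch uses SciPy's RBFInterpolator and a random candidate search over the surrogate; the kernel choice and the balance between exploration and exploitation are simplified away.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def expensive_f(x):                      # stand-in for a costly simulation
    return np.sin(3 * x[..., 0]) * np.cos(2 * x[..., 1]) + 0.1 * (x**2).sum(-1)

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(15, 2))     # initial design points
y = expensive_f(X)

for _ in range(10):                      # surrogate-driven sampling loop
    surrogate = RBFInterpolator(X, y, kernel="thin_plate_spline")
    cand = rng.uniform(-2, 2, size=(4000, 2))   # cheap candidate set
    x_next = cand[np.argmin(surrogate(cand))]   # optimise the surrogate
    X = np.vstack([X, x_next])
    y = np.append(y, expensive_f(x_next))

print(X[np.argmin(y)], y.min())          # best point found so far
```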

Convergence Analysis of the DIRECT Algorithm

The DIRECT algorithm is a deterministic sampling method for bound constrained Lipschitz continuous optimization. We prove a subsequential convergence result for the DIRECT algorithm that quantifies some of the convergence observations in the literature. Our results apply to several variations on the original method, including one that will handle general constraints. We use techniques from …
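
For context, the original DIRECT method of Jones, Perttunen, and Stuckman is available in SciPy, so its sampling behaviour on a bound-constrained problem is easy to observe; the toy objective and bounds below are illustrative.

```python
import numpy as np
from scipy.optimize import direct, Bounds

def styblinski_tang(x):       # a standard multimodal test function
    x = np.asarray(x)
    return 0.5 * np.sum(x**4 - 16 * x**2 + 5 * x)

res = direct(styblinski_tang, Bounds([-5.0, -5.0], [5.0, 5.0]), maxfun=2000)
print(res.x, res.fun)         # near (-2.9035, -2.9035), the global minimum
```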

A Trust-Region Algorithm for Global Optimization

We consider the global minimization of a bound-constrained function with a so-called funnel structure. We develop a two-phase procedure that uses sampling, local optimization, and Gaussian smoothing to construct a smooth model of the underlying funnel. The procedure is embedded in a trust-region framework that avoids the pitfalls of a fixed sampling radius. We present …
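
The Gaussian-smoothing ingredient can be sketched independently of the full two-phase procedure: replacing f by its convolution with a Gaussian, f_sigma(x) = E[f(x + sigma*u)] with u ~ N(0, I), washes out small local minima while preserving the overall funnel. The Monte Carlo estimator and the wiggly test function below are illustrative assumptions.

```python
import numpy as np

def gaussian_smoothed(f, x, sigma, n_samples=20_000, seed=0):
    """Monte Carlo estimate of f_sigma(x) = E_u[ f(x + sigma * u) ],
    u ~ N(0, I). Larger sigma smooths away more local structure."""
    rng = np.random.default_rng(seed)
    u = rng.standard_normal((n_samples, np.size(x)))
    return f(np.asarray(x) + sigma * u).mean()

# a funnel: quadratic trend plus high-frequency wiggles (local minima)
f = lambda X: (X**2).sum(axis=-1) + 0.5 * np.sin(20 * X).sum(axis=-1)

for sigma in (0.0, 0.1, 0.5):
    vals = [gaussian_smoothed(f, np.array([x, 0.0]), sigma)
            for x in (-1.0, 0.0, 1.0)]
    print(sigma, np.round(vals, 3))   # the wiggles fade as sigma grows
```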