Max-min separability: incremental approach and application to supervised data classification

A new algorithm for computing a piecewise linear function that separates two finite point sets in $n$-dimensional space is developed and applied to supervised data classification problems. The algorithm computes hyperplanes incrementally, adding as many hyperplanes as are needed to separate the two sets to within a given tolerance. An …
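As background, the separating functions in question are maxima of minima of affine functions. The sketch below is an illustration only, not the authors' code: it evaluates such a function $f(x)=\max_i \min_j (\langle a_{ij},x\rangle - b_{ij})$ and classifies a point by its sign, with the hyperplane data `A`, `b` assumed to come from an incremental procedure like the one described above.

```python
import numpy as np

def maxmin_separator(x, A, b):
    """Evaluate f(x) = max_i min_j (<a_ij, x> - b_ij).

    A: list of 2-D arrays, one per max-group; A[i][j] is the normal of the
       j-th hyperplane in group i.  b: list of matching offset vectors.
    (Hypothetical data layout; a training procedure would supply these.)"""
    group_values = [np.min(Ai @ x - bi) for Ai, bi in zip(A, b)]
    return max(group_values)

def classify(x, A, b):
    """Assign x to set 1 if the separating function is positive, else set 2."""
    return 1 if maxmin_separator(x, A, b) > 0 else 2

# Example with two groups of hyperplanes in the plane (hypothetical coefficients).
A = [np.array([[1.0, 0.0], [0.0, 1.0]]), np.array([[-1.0, -1.0]])]
b = [np.array([0.5, 0.5]), np.array([-2.0])]
print(classify(np.array([1.0, 1.0]), A, b))
```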

Necessary optimality condition for Nonsmooth Switching Control problem

This paper is concerned with a class of nonsmooth optimal switching control problems. Both the switching instants and the control function are to be chosen such that the cost functional is minimized. The necessary optimality conditions are derived by means of the normal cone and Dubovitskii-Milyutin theory. Citation: Technical report 2007-3. Submitted to the …
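For orientation only, a generic switching control problem of roughly this type (a schematic formulation, not taken from the report) can be written as:

```latex
% Schematic switching optimal control problem: choose switching instants
% 0 = t_0 < t_1 < ... < t_m < t_{m+1} = T and the control u(.) to minimize
% a (possibly nonsmooth) cost functional.
\begin{align*}
  \min_{t_1,\dots,t_m,\ u(\cdot)}\quad
    & J = \varphi\bigl(x(T)\bigr)
      + \sum_{i=0}^{m} \int_{t_i}^{t_{i+1}} f_i^{0}\bigl(x(t),u(t),t\bigr)\,dt \\
  \text{subject to}\quad
    & \dot{x}(t) = f_i\bigl(x(t),u(t),t\bigr), \qquad t \in [t_i, t_{i+1}],\ i = 0,\dots,m, \\
    & x(0) = x_0 .
\end{align*}
```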

The Speed of Shor’s R-Algorithm

Shor’s r-algorithm is an iterative method for unconstrained optimization, designed for minimizing nonsmooth functions, for which considerable success has been reported. Although some limited convergence results are known, nothing seems to be known about the algorithm’s rate of convergence, even in the smooth case. We study how the method behaves on convex quadratics, proving …
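For readers unfamiliar with the method, the sketch below gives a bare-bones rendering of the r-algorithm's main loop: subgradient-type steps in a transformed space, with space dilation along the difference of two successive subgradients. The dilation coefficient, fixed step size, and stopping test are simplified assumptions, not the tuned rules analyzed in the paper.

```python
import numpy as np

def shor_r_algorithm(subgrad, x0, beta=1.0/3.0, h=1.0, max_iter=200):
    """Illustrative sketch of Shor's r-algorithm with space dilation along
    the difference of successive subgradients (simplified parameters)."""
    x = np.asarray(x0, dtype=float)
    n = len(x)
    B = np.eye(n)                      # accumulated space-transformation matrix
    g_prev = None
    for _ in range(max_iter):
        g = subgrad(x)
        if g_prev is not None:
            r = B.T @ (g - g_prev)     # dilation direction in the transformed space
            nr = np.linalg.norm(r)
            if nr > 1e-12:
                xi = r / nr
                B = B @ (np.eye(n) + (beta - 1.0) * np.outer(xi, xi))
        p = B.T @ g                    # subgradient in the transformed space
        norm_p = np.linalg.norm(p)
        if norm_p < 1e-10:
            break
        x = x - h * (B @ p) / norm_p   # step taken back in the original space
        g_prev = g
    return x

# Example: the nonsmooth function f(x) = |x_1| + 2 |x_2|.
subgrad = lambda x: np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])
print(shor_r_algorithm(subgrad, [3.0, -4.0]))
```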

Approximate Primal Solutions and Rate Analysis in Dual Subgradient Methods

We study primal solutions obtained as a by-product of subgradient methods when solving the Lagrangian dual of a primal convex constrained optimization problem (possibly nonsmooth). The existing literature on the use of subgradient methods for generating primal optimal solutions is limited to methods producing such solutions only asymptotically (i.e., in the limit as the …
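A minimal sketch of the idea, on a toy problem chosen for illustration: the dual is maximized by projected subgradient steps, and the primal iterates that minimize the Lagrangian are averaged along the way (the paper's step-size rules and rate analysis are more refined than this).

```python
import numpy as np

# Toy problem:  minimize  x1^2 + x2^2   subject to   x1 + x2 >= 1.
# Dual function: q(mu) = min_x { x1^2 + x2^2 + mu (1 - x1 - x2) },
# whose inner minimizer is x(mu) = (mu/2, mu/2); the optimum is x* = (0.5, 0.5).

def lagrangian_minimizer(mu):
    return np.array([mu / 2.0, mu / 2.0])

def constraint_violation(x):
    return 1.0 - x[0] - x[1]            # g(x) <= 0 form; also a supergradient of q

mu, step = 0.0, 0.5
x_avg = np.zeros(2)
for k in range(1, 201):
    x_k = lagrangian_minimizer(mu)
    mu = max(0.0, mu + step * constraint_violation(x_k))   # projected dual subgradient step
    x_avg += (x_k - x_avg) / k                              # running average of primal iterates

print("averaged primal iterate:", x_avg)   # approaches x* = (0.5, 0.5)
```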

One Class Nonsmooth Discrete Step Control Problem

In this paper a survey and refinement of recent results in discrete optimal control theory are presented. A step control problem depending on a parameter is investigated. No smoothness of the cost function is assumed, and new versions of the discrete maximum principle for the step control problem are derived. Citation: Submitted to …

Selective Gram-Schmidt orthonormalization for conic cutting surface algorithms

It is not straightforward to find a new feasible solution when several conic constraints are added to a conic optimization problem. Examples of conic constraints include semidefinite constraints and second-order cone constraints. In this paper, a method to slightly modify the constraints is proposed. Because of this modification, a simple procedure to generate strictly …
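As background, classical Gram-Schmidt orthonormalization builds an orthonormal basis one column at a time; the sketch below shows only this standard procedure, not the selective variant tailored to conic cutting surfaces that the paper develops.

```python
import numpy as np

def gram_schmidt(V, tol=1e-12):
    """Classical Gram-Schmidt orthonormalization of the columns of V.
    Near-dependent columns are simply dropped here."""
    Q = []
    for v in V.T:
        w = v.astype(float)
        for q in Q:
            w -= (q @ v) * q          # remove component along existing basis vector
        norm = np.linalg.norm(w)
        if norm > tol:                # keep only numerically independent directions
            Q.append(w / norm)
    return np.column_stack(Q) if Q else np.empty((V.shape[0], 0))

V = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
Q = gram_schmidt(V)
print(np.round(Q.T @ Q, 10))          # identity: the columns are orthonormal
```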

Dini Derivative and a Characterization for Lipschitz and Convex Functions on Riemannian Manifolds

The Dini derivative in the Riemannian manifold setting is studied in this paper. In addition, a characterization of Lipschitz and convex functions defined on Riemannian manifolds and sufficient optimality conditions for constrained optimization problems in terms of the Dini derivative are given. …
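For reference, the upper Dini directional derivative in this setting is commonly defined along geodesics via the exponential map; a standard formulation (paraphrased, not quoted from the paper) is:

```latex
% Upper Dini directional derivative of f at p in the direction v in T_p M,
% taken along the geodesic t -> exp_p(t v):
\[
  f^{+}(p; v) \;=\; \limsup_{t \downarrow 0}
    \frac{f\bigl(\exp_{p}(t v)\bigr) - f(p)}{t}.
\]
```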

Large Scale Portfolio Optimization with Piecewise Linear Transaction Costs

We consider the fundamental problem of computing an optimal portfolio based on a quadratic mean-variance model of the objective function and a given polyhedral representation of the constraints. The main departure from the classical quadratic programming formulation is the inclusion in the objective function of piecewise linear, separable functions representing the transaction costs. We handle …
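A minimal sketch of such an objective, with an assumed breakpoint/slope representation of the separable transaction costs (the paper's model and solution method are not reproduced here):

```python
import numpy as np

def piecewise_linear_cost(trade, breakpoints, slopes):
    """Separable piecewise linear transaction cost of a single trade.
    slopes has one more entry than breakpoints: slopes[i] is the marginal
    rate on the i-th interval between consecutive breakpoints (starting at 0).
    This layout is an assumption made for illustration."""
    t = abs(trade)
    edges = [0.0] + list(breakpoints) + [np.inf]
    cost = 0.0
    for i, s in enumerate(slopes):
        lo, hi = edges[i], edges[i + 1]
        if t <= lo:
            break
        cost += s * (min(t, hi) - lo)
    return cost

def objective(x, x0, mu, Sigma, lam, breakpoints, slopes):
    """Mean-variance objective with separable piecewise linear transaction
    costs on the trades x - x0: risk minus lam * expected return plus costs."""
    trades = x - x0
    tc = sum(piecewise_linear_cost(t, breakpoints, slopes) for t in trades)
    return x @ Sigma @ x - lam * (mu @ x) + tc

Sigma = np.array([[0.04, 0.01],
                  [0.01, 0.09]])
mu, x0, lam = np.array([0.06, 0.10]), np.array([0.5, 0.5]), 1.0
print(objective(np.array([0.6, 0.4]), x0, mu, Sigma, lam,
                breakpoints=[0.05, 0.2], slopes=[0.001, 0.003, 0.01]))
```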

Discrete gradient method: a derivative-free method for nonsmooth optimization

In this paper a new derivative-free method is developed for solving unconstrained nonsmooth optimization problems. This method is based on the notion of a discrete gradient. It is demonstrated that the discrete gradients can be used to approximate subgradients of a broad class of nonsmooth functions. It is also shown that the discrete gradients can …
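To convey the flavor of a derivative-free descent step, the sketch below substitutes a plain coordinate-wise finite difference for the paper's discrete gradient construction, which, unlike this stand-in, is designed specifically to approximate subgradients of nonsmooth functions.

```python
import numpy as np

def finite_difference_gradient(f, x, h=1e-3):
    """Coordinate-wise forward-difference approximation of a (sub)gradient.
    A crude stand-in for the discrete gradient construction in the paper."""
    g = np.zeros_like(x, dtype=float)
    fx = f(x)
    for i in range(len(x)):
        e = np.zeros_like(g)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def descent(f, x0, step=0.1, iters=300):
    """Simple derivative-free descent loop driven by the approximation above."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = finite_difference_gradient(f, x)
        norm_g = np.linalg.norm(g)
        if norm_g < 1e-8:
            break
        x = x - step * g / norm_g
    return x

# Example: a simple nonsmooth function with minimizer near (1, -2).
print(descent(lambda x: abs(x[0] - 1) + (x[1] + 2) ** 2, [5.0, 5.0]))
```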

Benchmark of Some Nonsmooth Optimization Solvers for Computing Nonconvex Proximal Points

The major focus of this work is to compare several methods for computing the proximal point of a nonconvex function via numerical testing. To do this, we introduce two techniques for randomly generating challenging nonconvex test functions, as well as two very specific test functions which should be of future interest to Nonconvex Optimization Benchmarking. …
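For concreteness, the proximal point of $f$ at $\bar{x}$ with parameter $\lambda$ is a minimizer of $f(y) + \frac{1}{2\lambda}\|y-\bar{x}\|^2$; the sketch below approximates it by crude multi-start local optimization, which is only an illustration and not one of the solvers benchmarked in the paper.

```python
import numpy as np
from scipy.optimize import minimize

def proximal_point(f, x_bar, lam=1.0, n_starts=20, seed=0):
    """Approximate prox_{lam f}(x_bar) = argmin_y f(y) + ||y - x_bar||^2 / (2 lam).
    For nonconvex f the subproblem may have several local minima, so this
    sketch restarts a local solver from random points near x_bar."""
    x_bar = np.asarray(x_bar, dtype=float)
    moreau = lambda y: f(y) + np.sum((y - x_bar) ** 2) / (2.0 * lam)
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_starts):
        y0 = x_bar + rng.normal(scale=np.sqrt(lam), size=x_bar.shape)
        res = minimize(moreau, y0, method="Nelder-Mead")
        if best is None or res.fun < best.fun:
            best = res
    return best.x

# Example with a simple nonconvex function.
f = lambda y: np.sin(3.0 * y[0]) + 0.1 * y[0] ** 2
print(proximal_point(f, [1.0], lam=0.5))
```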