A continuous gradient-like dynamical approach to Pareto-optimization in Hilbert spaces

In a Hilbert space setting, we consider new continuous gradient-like dynamical systems for constrained multiobjective optimization. This type of dynamics was first investigated by Cl. Henry and B. Cornet as a model of the allocation of resources in economics. Based on the Yosida regularization of the discontinuous part of the vector field which governs the system, …
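
As a rough sketch of the kind of dynamics involved (not the paper's specific system), the gradient flow of a smooth function Φ constrained to a closed convex set C can be written as a differential inclusion whose set-valued normal-cone term is smoothed by the Yosida regularization:

```latex
% Gradient-like differential inclusion with a normal-cone term:
\dot{x}(t) + \nabla\Phi(x(t)) + N_C(x(t)) \ni 0 .
% Yosida regularization of a maximal monotone operator A, \lambda > 0:
A_\lambda = \tfrac{1}{\lambda}\bigl( I - (I + \lambda A)^{-1} \bigr),
% which is single-valued and Lipschitz, giving the regularized system
\dot{x}(t) + \nabla\Phi(x(t)) + (N_C)_\lambda(x(t)) = 0 .
```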

Worst case complexity of direct search under convexity

In this paper we prove that the broad class of direct-search methods of directional type, based on imposing sufficient decrease to accept new iterates, exhibits the same global rate or worst-case complexity bound as the gradient method for the unconstrained minimization of a convex and smooth function. More precisely, it will be shown that …
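
For orientation, here is a minimal sketch of a directional direct-search method with a sufficient-decrease acceptance test, the broad class the abstract refers to; the name `direct_search` and the forcing function ρ(α) = c·α² are illustrative choices, not the paper's notation.

```python
import numpy as np

def direct_search(f, x0, alpha0=1.0, c=1e-4, max_iter=200, tol=1e-8):
    """Coordinate-direction direct search with sufficient decrease.

    A trial point x + alpha*d is accepted only if it decreases f by at
    least rho(alpha) = c * alpha**2, the kind of forcing function used
    in worst-case complexity analyses of this class of methods.
    """
    x, alpha = np.asarray(x0, dtype=float), alpha0
    n = x.size
    dirs = np.vstack([np.eye(n), -np.eye(n)])  # positive spanning set
    for _ in range(max_iter):
        if alpha < tol:
            break
        fx = f(x)
        for d in dirs:
            if f(x + alpha * d) <= fx - c * alpha**2:  # sufficient decrease
                x = x + alpha * d
                break
        else:
            alpha *= 0.5  # unsuccessful poll: shrink the step size
    return x
```

The sufficient-decrease test (rather than simple decrease) is what makes global rate arguments of the type the abstract describes possible.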

A merit function approach for direct search

In this paper it is proposed to equip direct-search methods with a general procedure for minimizing an objective function, possibly non-smooth, without using derivatives, subject to constraints on the variables. The aim is to handle constraints, possibly nonlinear or non-smooth, for which the derivatives of the corresponding functions are also unavailable. The novelty of …
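
As an illustrative device (hypothetical, not necessarily the paper's construction), a merit function can fold constraint violation into the objective so that a derivative-free method only ever compares scalar values:

```latex
% Hypothetical merit function for  min f(x)  s.t.  g_i(x) <= 0, i = 1,...,m:
M_\mu(x) \;=\; f(x) \;+\; \mu \sum_{i=1}^{m} \max\{0,\, g_i(x)\},
\qquad \mu > 0,
% which requires only zeroth-order evaluations of f and the g_i.
```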

Faster, but Weaker, Relaxations for Quadratically Constrained Quadratic Programs

We introduce a new relaxation framework for nonconvex quadratically constrained quadratic programs (QCQPs). In contrast to existing relaxations based on semidefinite programming (SDP), our relaxations incorporate features of both SDP and second-order cone programming (SOCP) and, as a result, solve more quickly than SDP. A downside is that the calculated bounds are weaker than …
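
For reference, the standard lifting behind such relaxations: the QCQP is rewritten with a matrix variable X standing in for xx^T, and the intractable equality is relaxed. The SDP relaxation keeps the full semidefinite constraint, while SOCP-type relaxations enforce only cheaper implied conditions (this display is a generic sketch, not the paper's specific construction):

```latex
\min_{x,\,X}\;\; Q_0 \bullet X + c_0^\top x
\quad \text{s.t.} \quad
Q_i \bullet X + c_i^\top x \le b_i, \;\; i = 1,\dots,m,
\qquad X = x x^\top .
% SDP relaxation: replace X = xx^T by the convex constraint
\begin{pmatrix} 1 & x^\top \\ x & X \end{pmatrix} \succeq 0,
% while SOCP-type relaxations impose only selected implied
% second-order cone constraints, e.g. X_{ii} \ge x_i^2.
```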

Reduction of two-stage probabilistic optimization problems with discrete distribution of random data to mixed integer programming problems

We consider models of two-stage stochastic programming with a quantile second-stage criterion, and optimization models with a chance constraint on the second-stage objective function values. Such models make it possible to formalize reliability and safety requirements for the system under consideration and to optimize the system under extreme conditions. We suggest a method of …
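
For a discrete distribution with scenarios ω_1,…,ω_S of probabilities p_1,…,p_S, the standard big-M device illustrates how a chance constraint on the second-stage objective Φ becomes mixed-integer (a generic sketch; the paper's reduction may differ in detail):

```latex
% Chance constraint  P( \Phi(x,\omega) \le t ) \ge 1 - \alpha
% with binary scenario indicators z_s:
\Phi(x,\omega_s) \;\le\; t + M z_s, \qquad s = 1,\dots,S,
\qquad \sum_{s=1}^{S} p_s z_s \;\le\; \alpha, \qquad z_s \in \{0,1\},
% where z_s = 1 marks the scenarios allowed to violate the target t.
```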

Tail bounds for stochastic approximation

Stochastic-approximation gradient methods are attractive for large-scale convex optimization because they offer inexpensive iterations. They are especially popular in data-fitting and machine-learning applications where the data arrive in a continuous stream or where it is necessary to minimize large sums of functions. It is known that by appropriately decreasing the variance of the error at each …
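
As a minimal sketch of the setting, mini-batch averaging is one standard way to "decrease the variance of the error at each iteration"; the linear batch-growth schedule below is illustrative, not taken from the paper.

```python
import numpy as np

def sgd_growing_batch(grad_sample, x0, step=0.1, n_iter=100):
    """Stochastic-approximation gradient method with growing batches.

    grad_sample(x) returns one noisy, unbiased sample of the gradient;
    averaging b_k independent samples divides the error variance by b_k.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(1, n_iter + 1):
        b_k = k  # batch size grows, so the gradient-error variance shrinks
        g = np.mean([grad_sample(x) for _ in range(b_k)], axis=0)
        x = x - step * g
    return x
```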

Universal gradient methods for convex optimization problems

In this paper, we present new methods for black-box convex minimization. They do not need to know in advance the actual level of smoothness of the objective function. Their only essential input parameter is the required accuracy of the solution. At the same time, for each particular problem class they automatically ensure the best possible …
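
A hedged sketch of the adaptive mechanism behind such universal methods: a backtracking search on the local smoothness estimate L whose acceptance test folds in the target accuracy ε, so the method never needs Hölder constants explicitly. The test follows the pattern of Nesterov's universal gradient method; treat this as a schematic, not the paper's exact algorithm.

```python
import numpy as np

def universal_gradient_step(f, grad, x, L, eps):
    """One backtracking step of a universal gradient method (schematic).

    The acceptance test relaxes the quadratic upper model by eps/2,
    which lets the line search adapt to any Holder level of smoothness
    of the objective without knowing its constants.
    """
    x = np.asarray(x, dtype=float)
    g = grad(x)
    while True:
        y = x - g / L
        d = y - x
        # eps-relaxed descent test on the quadratic upper model
        if f(y) <= f(x) + g @ d + 0.5 * L * (d @ d) + 0.5 * eps:
            return y, max(L / 2.0, 1e-12)  # try a smaller estimate next step
        L *= 2.0  # model violated: increase the smoothness estimate
```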

An inexact proximal bundle method with applications to convex conic programming

We present an inexact bundle method for minimizing an unconstrained convex sup-function with an open domain. Under some mild assumptions, we reformulate a convex conic programming problem as such a problem in terms of the support function. This method is a first-order method and hence requires much less computational cost per iteration than second-order approaches …
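
For orientation, the proximal bundle subproblem in its generic form, built from possibly inexact linearizations of the sup-function f (a standard display, not the paper's specific method):

```latex
% Cutting-plane model from oracle points y_j, with errors allowed:
\check{f}_k(y) \;=\; \max_{j \in J_k}\,\bigl\{ f_j + \langle g_j,\, y - y_j \rangle \bigr\},
\qquad f_j \approx f(y_j), \;\; g_j \in \partial_{\varepsilon_j} f(y_j).
% Proximal subproblem determining the next candidate point:
y_{k+1} \;=\; \arg\min_{y}\; \check{f}_k(y) + \frac{1}{2 t_k}\,\| y - \hat{x}_k \|^2 .
```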

A doubly stabilized bundle method for nonsmooth convex optimization

We propose a bundle method for minimizing nonsmooth convex functions that combines both the level and the proximal stabilizations. Most bundle algorithms use a cutting-plane model of the objective function to formulate a subproblem whose solution gives the next iterate. Proximal bundle methods employ the model in the objective function of the subproblem, while level …
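
The "doubly stabilized" idea can be sketched by a subproblem carrying both devices at once: a proximal term around the stability center and a level constraint on the cutting-plane model (a schematic display consistent with the abstract, not necessarily the paper's exact formulation):

```latex
y_{k+1} \;=\; \arg\min_{y}\,\Bigl\{\, \check{f}_k(y) + \frac{1}{2 t_k}\,\| y - \hat{x}_k \|^2
\;:\; \check{f}_k(y) \le \ell_k \,\Bigr\}.
% Proximal bundle: penalty term only.  Level bundle: constraint only.
% The doubly stabilized subproblem keeps both.
```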

Orthogonal invariance and identifiability

Orthogonally invariant functions of symmetric matrices often inherit properties from their diagonal restrictions: von Neumann’s theorem on matrix norms is an early example. We discuss the example of “identifiability”, a common property of nonsmooth functions associated with the existence of a smooth manifold of approximate critical points. Identifiability (or its synonym, “partial smoothness”) is the …
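
A concrete instance of the setup, given here for orientation: an orthogonally invariant function of a symmetric matrix is determined by its diagonal restriction, that is, by a symmetric function of the eigenvalues (the standard spectral-function correspondence):

```latex
F(U^\top X U) = F(X) \;\; \forall\, U \in O(n)
\quad\Longleftrightarrow\quad
F(X) = f(\lambda(X)),
% with f : R^n -> R symmetric (permutation-invariant) and \lambda(X)
% the eigenvalue vector of X; matrix norms built from eigenvalues,
% as in von Neumann's theorem, are classical examples.
```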