Reformulation of a model for hierarchical divisive graph modularity maximization

Finding clusters, or communities, in a graph or network is an important problem that arises in many domains. Several models have been proposed for its solution. One of the most studied and exploited is the maximization of the so-called modularity, which represents the sum over all communities of the fraction of edges within these … Read more
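
For clarity, a minimal sketch of the modularity objective itself (not the authors' reformulation): for an undirected graph with m edges, the modularity of a partition is commonly written Q = sum_c [ e_c/m - (d_c/(2m))^2 ], where e_c is the number of edges inside community c and d_c is the total degree of its nodes. A small Python illustration, assuming an edge list and a node-to-community map (names are ours):

    # Modularity of a given partition of an undirected graph.
    # `edges` is a list of (u, v) pairs, `community` maps each node to a label.
    # This is a generic sketch, not the hierarchical divisive scheme of the paper.
    from collections import defaultdict

    def modularity(edges, community):
        m = len(edges)
        intra = defaultdict(int)        # edges with both endpoints in the community
        degree = defaultdict(int)       # total degree of the community's nodes
        for u, v in edges:
            degree[community[u]] += 1
            degree[community[v]] += 1
            if community[u] == community[v]:
                intra[community[u]] += 1
        return sum(intra[c] / m - (degree[c] / (2 * m)) ** 2 for c in degree)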

Subgradient methods for huge-scale optimization problems

We consider a new class of huge-scale problems: problems with sparse subgradients. The most important functions of this type are piecewise linear. For optimization problems with uniform sparsity of the corresponding linear operators, we suggest a very efficient implementation of subgradient iterations whose total cost depends logarithmically on the dimension. This technique is … Read more
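
As a hedged illustration of the function class involved: for a piecewise-linear objective f(x) = max_i (a_i^T x + b_i), a subgradient at x is the row a_i attaining the maximum, so when the rows are sparse a subgradient step only touches the nonzeros of that row. The Python sketch below (using SciPy's CSR format, names ours) shows that sparsity; it does not reproduce the paper's logarithmic-cost update machinery.

    # One subgradient step for f(x) = max_i (a_i @ x + b_i) with sparse rows a_i.
    # A is expected to be a scipy.sparse.csr_matrix, x and b NumPy arrays.
    import numpy as np
    from scipy.sparse import csr_matrix

    def subgradient_step(A, b, x, step):
        values = A @ x + b                  # evaluate all affine pieces
        i = int(np.argmax(values))          # index of an active piece
        g = A.getrow(i)                     # its (sparse) subgradient
        x = x.copy()
        x[g.indices] -= step * g.data       # update only the nonzero coordinates
        return x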

A Probability-Driven Search Algorithm for Solving Multi-Objective Optimization Problems

This paper proposes a new probabilistic algorithm for solving multi-objective optimization problems, the Probability-Driven Search Algorithm. The algorithm uses probabilities to control the search for Pareto optimal solutions. In particular, we use an absorbing Markov chain to argue the convergence of the algorithm. We test this approach by implementing the algorithm on some benchmark … Read more
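
Since the truncated abstract leaves the algorithmic details out, the sketch below only shows the Pareto-dominance test and non-dominated archive update common to most multi-objective search schemes, as a point of reference; it is not the Probability-Driven Search Algorithm itself.

    # Pareto dominance (minimization) and a non-dominated archive update.
    def dominates(f, g):
        """True if objective vector f dominates g."""
        return all(x <= y for x, y in zip(f, g)) and any(x < y for x, y in zip(f, g))

    def update_archive(archive, candidate):
        """Keep only mutually non-dominated objective vectors."""
        if any(dominates(a, candidate) for a in archive):
            return archive                                  # candidate is dominated
        return [a for a in archive if not dominates(candidate, a)] + [candidate]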

Smoothing and Worst Case Complexity for Direct-Search Methods in Non-Smooth Optimization

For smooth objective functions it has been shown that the worst-case cost of direct-search methods is of the same order as that of steepest descent, when measured in the number of iterations needed to reach a certain threshold of stationarity. Motivated by the lack of such a result in the non-smooth case, we propose, analyze, … Read more
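
For readers unfamiliar with the method class, here is a minimal sketch of one iteration of directional direct search with a sufficient-decrease test (polling the coordinate directions); it illustrates the methods being analyzed, not the smoothing technique proposed in the paper.

    # One poll step of coordinate-direction direct search (minimization).
    import numpy as np

    def direct_search_step(f, x, alpha, c=1e-4):
        fx = f(x)
        for d in np.vstack([np.eye(len(x)), -np.eye(len(x))]):
            trial = x + alpha * d
            if f(trial) < fx - c * alpha ** 2:   # sufficient decrease
                return trial, alpha              # successful poll: keep the step size
        return x, alpha / 2                      # unsuccessful poll: shrink the step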

Globally Convergent Evolution Strategies and CMA-ES

In this paper we show how to modify a large class of evolution strategies (ES) to rigorously achieve a form of global convergence, meaning convergence to stationary points independently of the starting point. The ES under consideration recombine the parents by means of a weighted sum, around which the offspring are computed by … Read more
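
A minimal sketch of the ES class described (weighted recombination of the parents, offspring sampled around the recombined point); the convergence-forcing modifications introduced in the paper are not reproduced here, and all names are ours.

    # One generation of a (mu/mu_w, lambda) evolution strategy.
    import numpy as np

    def es_generation(f, parents, weights, sigma, lam, rng):
        mean = weights @ parents                       # weighted sum of the mu parents
        offspring = mean + sigma * rng.standard_normal((lam, parents.shape[1]))
        order = np.argsort([f(y) for y in offspring])  # rank offspring by objective value
        return offspring[order[:len(weights)]]         # best mu become the new parents

Here `parents` is a (mu, n) array, `weights` a length-mu vector summing to one, and `rng` a NumPy generator such as np.random.default_rng().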

Pessimistic Bi-Level Optimisation

Bi-level problems are optimisation problems in which some of the decision variables must optimise a subordinate (lower-level) problem. In general, the lower-level problem can possess multiple optimal solutions. One therefore distinguishes between optimistic formulations, which assume that the most favourable lower-level solution is implemented, and pessimistic formulations, in which the most adverse lower-level solution is … Read more
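
In symbols (standard notation, ours rather than necessarily the paper's), with upper-level objective F, lower-level objective f and lower-level solution set S(x) = \arg\min_{y \in Y} f(x, y), the two formulations read

    \min_{x \in X} \ \min_{y \in S(x)} F(x, y)   \qquad \text{(optimistic)}
    \min_{x \in X} \ \max_{y \in S(x)} F(x, y)   \qquad \text{(pessimistic)}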

Interior Point Methods for Optimal Experimental Designs

In this paper, we propose a primal interior-point (IP) method for solving the optimal experimental design problem with a large class of smooth convex optimality criteria, including the A-, D-, and p-th mean criteria, and establish its global convergence. We also show that the Newton direction can be computed efficiently when the size of the moment … Read more
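
As a hedged illustration of the criterion class (not the interior-point method itself): for design weights w on support points x_1, ..., x_n with moment matrix M(w) = sum_i w_i x_i x_i^T, the D-criterion maximizes log det M(w). A small Python sketch, names ours:

    # D-optimality criterion log det M(w) for a design with weights w.
    import numpy as np

    def d_criterion(X, w):
        """X: (n, k) array of regression vectors; w: nonnegative weights summing to 1."""
        M = X.T @ (w[:, None] * X)              # moment matrix M(w)
        sign, logdet = np.linalg.slogdet(M)     # numerically stable log-determinant
        return logdet if sign > 0 else -np.inf  # singular designs get -inf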

Slopes of multifunctions and extensions of metric regularity

This article aims to demonstrate how the definitions of slopes can be extended to multi-valued mappings between metric spaces and applied to characterize metric regularity. Several kinds of local and nonlocal slopes are defined, and several metric regularity properties of set-valued mappings between metric spaces are investigated. Citation: Published in Vietnam Journal of Mathematics 40:2&3 (2012) … Read more
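
For reference, the local slope that these extensions generalize is the standard quantity

    |\nabla f|(x) \;=\; \limsup_{u \to x,\; u \neq x} \frac{\bigl(f(x) - f(u)\bigr)^{+}}{d(x, u)},

where (t)^+ = \max\{t, 0\} and d is the metric; the multi-valued versions studied in the article replace f by a set-valued mapping (notation here is ours).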

Solving multi-stage stochastic mixed integer linear programs by the dual dynamic programming approach

We consider a model of medium-term commodity contract management. Randomness enters only through the prices at which the commodities are exchanged, whilst the state variable is multi-dimensional and the decision variable is integer-valued. In our previous article, we proposed an algorithm based on the quantization of the random process and a dual dynamic programming type approach to … Read more

On feasibility based bounds tightening

Mathematical programming problems involving nonconvexities are usually solved to optimality using a (spatial) Branch-and-Bound algorithm. Algorithmic efficiency depends on many factors, among which the width of the bounding box for the problem variables at each Branch-and-Bound node naturally plays a critical role. The practically fastest box-tightening algorithm is known as FBBT (Feasibility-Based Bounds Tightening): an … Read more
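
A hedged sketch of a single FBBT pass over one linear constraint sum_j a_j x_j <= b, tightening finite bounds [lo_j, up_j] by interval arithmetic; this shows the basic propagation step discussed, not the paper's analysis or implementation, and all names are ours.

    # One FBBT pass over a linear constraint  sum_j a[j]*x[j] <= b.
    def fbbt_linear(a, b, lo, up):
        lo, up = list(lo), list(up)
        mins = [min(aj * lj, aj * uj) for aj, lj, uj in zip(a, lo, up)]
        for i, ai in enumerate(a):
            if ai == 0:
                continue
            rest = sum(mins) - mins[i]        # smallest possible contribution of the other terms
            bound = (b - rest) / ai
            if ai > 0:
                up[i] = min(up[i], bound)     # from a_i * x_i <= b - rest
            else:
                lo[i] = max(lo[i], bound)     # dividing by a_i < 0 flips the inequality
        return lo, up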