A DERIVATIVE-FREE APPROACH TO CONSTRAINED MULTIOBJECTIVE NONSMOOTH OPTIMIZATION

In this work, we consider multiobjective optimization problems with both bound constraints on the variables and general nonlinear constraints, where objective and constraint function values can only be obtained by querying a black box. We define a linesearch-based solution method, and we show that it converges to a set of Pareto stationary points. To this …
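
For context (this is standard background, not a quotation from the paper), Pareto stationarity in the smooth case is usually stated as follows: a feasible point x* is Pareto stationary when no direction is simultaneously a descent direction for all objectives,

    \[
      \max_{i=1,\dots,m} \nabla f_i(x^\ast)^{\top} d \;\ge\; 0
      \qquad \text{for every feasible direction } d \text{ at } x^\ast;
    \]

in the nonsmooth, derivative-free setting considered here, the gradients are replaced by suitable generalized (Clarke-type) directional derivatives.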

A Frank-Wolfe Based Branch-and-Bound Algorithm for Mean-Risk Optimization

We present an exact algorithm for mean-risk optimization subject to a budget constraint, where decision variables may be continuous or integer. The risk is measured by the covariance matrix and weighted by an arbitrary monotone function, which allows risk aversion to be modeled in a highly individual way. We address this class of convex mixed-integer minimization problems …
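
For readers unfamiliar with the Frank-Wolfe step behind the algorithm's name, the following is a minimal sketch on a continuous relaxation with an assumed objective -mu^T x + g(x^T Sigma x) over {x >= 0, sum(x) <= budget}; the names mu, Sigma, g, g_prime and the model itself are illustrative assumptions, not the paper's exact formulation or its branch-and-bound scheme.

    import numpy as np

    def frank_wolfe_mean_risk(mu, Sigma, g, g_prime, budget, iters=200):
        """Generic Frank-Wolfe sketch for  min  -mu^T x + g(x^T Sigma x)
        over  {x >= 0, sum(x) <= budget}.  g is a monotone risk-weighting
        function and g_prime its derivative.  Illustrative only: this is
        neither the paper's exact model nor its branch-and-bound algorithm."""
        n = len(mu)
        x = np.zeros(n)                        # feasible starting point
        for k in range(iters):
            grad = -mu + 2.0 * g_prime(x @ Sigma @ x) * (Sigma @ x)
            # Linear minimization oracle over {s >= 0, sum(s) <= budget}:
            # put the whole budget on the most negative gradient entry, if any.
            s = np.zeros(n)
            i = int(np.argmin(grad))
            if grad[i] < 0.0:
                s[i] = budget
            gamma = 2.0 / (k + 2.0)            # standard diminishing step size
            x = x + gamma * (s - x)
        return x, -mu @ x + g(x @ Sigma @ x)

    # Example with a linear risk weight g(t) = rho * t (a classical
    # mean-variance objective):
    #   x, val = frank_wolfe_mean_risk(mu, Sigma, lambda t: 3.0 * t,
    #                                  lambda t: 3.0, budget=1.0)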

A Feasible Active Set Method with Reoptimization for Convex Quadratic Mixed-Integer Programming

We propose a feasible active set method for convex quadratic programming problems with non-negativity constraints. This method is specifically designed to be embedded into a branch-and-bound algorithm for convex quadratic mixed-integer programming problems. The branch-and-bound algorithm generalizes the approach for unconstrained convex quadratic integer programming proposed by Buchheim, Caprara and Lodi to the presence …
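
As a rough picture of what a primal feasible active-set loop for this problem class looks like, here is a minimal sketch for min 0.5 x^T Q x + c^T x subject to x >= 0; it is a textbook-style illustration and does not include the reoptimization and branch-and-bound embedding the paper is about.

    import numpy as np

    def active_set_qp_nonneg(Q, c, x0, tol=1e-9, max_iter=200):
        """Primal feasible active-set sketch for  min 0.5 x^T Q x + c^T x,  x >= 0.
        Assumes Q symmetric positive definite and x0 >= 0.  Textbook-style
        illustration only; the paper's reoptimization scheme is not shown."""
        n = len(c)
        x = np.asarray(x0, dtype=float).copy()
        active = x <= tol                      # working set: variables fixed at 0
        for _ in range(max_iter):
            free = ~active
            # Minimize the quadratic with the working-set variables held at 0.
            x_sub = np.zeros(n)
            if free.any():
                x_sub[free] = np.linalg.solve(Q[np.ix_(free, free)], -c[free])
            if (x_sub[free] >= -tol).all():
                grad = Q @ x_sub + c           # multipliers of x_i = 0: grad[active]
                if (grad[active] >= -tol).all():
                    return x_sub               # KKT point reached
                # Release the bound with the most negative multiplier.
                idx = np.where(active)[0][np.argmin(grad[active])]
                active[idx] = False
                x = x_sub
            else:
                # Step toward x_sub until the first free variable hits zero,
                # then add that variable to the working set.
                d = x_sub - x
                blocking = free & (d < -tol)
                steps = -x[blocking] / d[blocking]
                idx = np.where(blocking)[0][np.argmin(steps)]
                x = x + steps.min() * d
                active[idx] = True
        return x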

A Fast Active Set Block Coordinate Descent Algorithm for l1-regularized least squares

The problem of finding sparse solutions to underdetermined systems of linear equations arises in several real-world problems (e.g., signal and image processing, compressive sensing, statistical inference). A standard tool for dealing with sparse recovery is the l1-regularized least-squares approach, which has recently attracted the attention of many researchers. In this paper, we describe an …
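
As background on the formulation these methods target, the sketch below solves min_x 0.5*||Ax - b||^2 + lam*||x||_1 with plain ISTA; it only illustrates the soft-thresholding building block common to such solvers, not the active-set block coordinate descent algorithm described in the paper.

    import numpy as np

    def soft_threshold(v, t):
        """Proximal operator of t * ||.||_1 (componentwise soft thresholding)."""
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def ista(A, b, lam, iters=500):
        """Basic ISTA sketch for  min_x 0.5*||Ax - b||^2 + lam*||x||_1.
        Illustrative baseline only, not the paper's active-set BCD method."""
        L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            grad = A.T @ (A @ x - b)
            x = soft_threshold(x - grad / L, lam / L)
        return x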

Derivative-free Methods for Mixed-Integer Constrained Optimization Problems

Methods that do not use any derivative information are becoming popular among researchers, since they make it possible to solve many real-world engineering problems. Such problems are frequently characterized by the presence of discrete variables, which can further complicate the optimization process. In this paper, we propose derivative-free algorithms for solving continuously differentiable Mixed Integer NonLinear Programming …
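
To make the setting concrete, here is a toy derivative-free coordinate search that probes continuous variables with a shrinking step and integer variables with unit steps; it is only meant to illustrate how discrete variables enter a derivative-free loop and is not one of the algorithms proposed in the paper.

    import numpy as np

    def df_mixed_search(f, x0, is_integer, alpha0=1.0, alpha_tol=1e-6, max_iter=1000):
        """Toy derivative-free coordinate search for an unconstrained problem with
        both continuous and integer variables: continuous coordinates are probed
        with a shrinking step alpha, integer coordinates with unit steps.
        Illustrative only; not one of the algorithms proposed in the paper."""
        x = np.array(x0, dtype=float)
        fx = f(x)
        alpha = alpha0
        for _ in range(max_iter):
            improved = False
            for i in range(len(x)):
                step = 1.0 if is_integer[i] else alpha
                for s in (step, -step):
                    trial = x.copy()
                    trial[i] += s
                    f_trial = f(trial)
                    if f_trial < fx:
                        x, fx, improved = trial, f_trial, True
                        break
            if not improved:
                if alpha <= alpha_tol:
                    break                      # no progress at the finest step
                alpha *= 0.5                   # refine the continuous step size
        return x, fx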

Nonmonotone GRASP

A Greedy Randomized Adaptive Search Procedure (GRASP) is an iterative multistart metaheuristic for difficult combinatorial optimization problems. Each GRASP iteration consists of two phases: a construction phase, in which a feasible solution is produced, and a local search phase, in which a local optimum in the neighborhood of the constructed solution is sought. Repeated applications …
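
For reference, a generic GRASP loop (standard construction plus local search, with placeholder callbacks greedy_cost, local_search, and evaluate) might look like the sketch below; the nonmonotone acceptance strategy that is the subject of the paper is not reproduced here.

    import random

    def grasp(candidates, greedy_cost, local_search, evaluate,
              alpha=0.3, iterations=100, seed=0):
        """Generic GRASP sketch: a randomized greedy construction phase followed
        by a local search phase, restarted from scratch at every iteration.
        greedy_cost, local_search and evaluate are problem-specific placeholder
        callbacks; the nonmonotone acceptance rule studied in the paper is not
        reproduced here."""
        rng = random.Random(seed)
        best, best_val = None, float("inf")
        for _ in range(iterations):
            # Construction phase: add one element at a time, chosen at random
            # from a restricted candidate list (RCL) of near-greedy choices.
            solution, remaining = [], list(candidates)
            while remaining:
                costs = {e: greedy_cost(solution, e) for e in remaining}
                c_min, c_max = min(costs.values()), max(costs.values())
                rcl = [e for e in remaining
                       if costs[e] <= c_min + alpha * (c_max - c_min)]
                choice = rng.choice(rcl)
                solution.append(choice)
                remaining.remove(choice)
            # Local search phase: improve the constructed solution.
            solution = local_search(solution)
            val = evaluate(solution)
            if val < best_val:
                best, best_val = solution, val
        return best, best_val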

A Derivative-Free Algorithm for Constrained Global Optimization based on Exact Penalty Functions

Constrained global optimization problems can be tackled by using exact penalty approaches. In a preceding paper, we proposed an exact penalty algorithm for constrained problems which combines an unconstrained global minimization technique, used to minimize a non-differentiable exact penalty function for given values of the penalty parameter, with an automatic updating of the penalty parameter …
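
As a sketch of the kind of construction involved (the paper's precise penalty function and parameter-update rule may differ), a max-type non-differentiable exact penalty and a simple outer update loop can be written as follows; global_minimize stands for any unconstrained global minimization routine and is a placeholder.

    def exact_penalty(f, g_list, eps):
        """Max-type non-differentiable exact penalty for  min f(x) s.t. g_i(x) <= 0:
        P_eps(x) = f(x) + (1/eps) * max(0, g_1(x), ..., g_m(x)).
        A common construction; the paper's penalty function may differ."""
        def P(x):
            violation = max(0.0, max(g(x) for g in g_list))
            return f(x) + violation / eps
        return P

    def penalty_loop(f, g_list, global_minimize, eps=1.0, feas_tol=1e-6, rounds=10):
        """Sketch of the outer scheme: globally minimize P_eps for a fixed eps and
        decrease eps whenever the returned point is still infeasible.
        global_minimize is a placeholder for any unconstrained global solver."""
        x = None
        for _ in range(rounds):
            x = global_minimize(exact_penalty(f, g_list, eps))
            if max(g(x) for g in g_list) <= feas_tol:
                break
            eps *= 0.1                         # penalize violation more strongly
        return x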

Derivative-free Robust Optimization for Circuit Design

In this paper, we introduce a framework for derivative-free robust optimization based on an efficient derivative-free optimization routine for mixed-integer nonlinear problems. The proposed framework is employed to find a robust optimal design of a particular integrated circuit (namely, a DC-DC converter commonly used in portable electronic devices). The proposed robust …

A variable fixing version of the two-block nonlinear constrained Gauss-Seidel algorithm for ℓ1-regularized least-squares

The problem of finding sparse solutions to underdetermined systems of linear equations is very common in many fields, e.g., signal/image processing and statistics. A standard tool for dealing with sparse recovery is the ℓ1-regularized least-squares approach that has recently attracted the attention of many researchers. In this paper, we describe a new version …

A Linesearch-based Derivative-free Approach for Nonsmooth Optimization

In this paper, we propose new linesearch-based methods for nonsmooth optimization problems when first-order information on the problem functions is not available. In the first part, we describe a general framework for bound-constrained problems and analyze its convergence towards stationary points, using the Clarke-Jahn directional derivative. In the second part, we consider inequality constrained optimization …
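
For reference, the Clarke-Jahn generalized directional derivative mentioned here is commonly defined, for a locally Lipschitz function f and a point x of a convex feasible set X, as

    \[
      f^{\circ}(x; d) \;=\;
      \limsup_{\substack{y \to x,\ y \in X \\ t \downarrow 0,\ y + t d \in X}}
      \frac{f(y + t d) - f(y)}{t},
    \]

and stationarity of a point is then expressed by requiring this quantity to be nonnegative along every feasible direction (standard background, not a restatement of the paper's results).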