A Generalized Inexact Proximal Point Method for Nonsmooth Functions that Satisfy the Kurdyka-Łojasiewicz Inequality

In this paper, following the ideas presented in Attouch et al. (Math. Program. Ser. A, 137: 91-129, 2013), we present an inexact version of the proximal point method for nonsmooth functions, whose regularization is given by a generalized perturbation term. More precisely, the new perturbation term is defined as a “curved enough” function of the … Read more
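
Since only the truncated abstract is shown above, the following is a minimal sketch of what an inexact proximal point iteration with a generalized (non-quadratic) perturbation term can look like. The test objective f, the perturbation phi, and the use of SciPy's Nelder-Mead as an approximate subproblem solver are illustrative assumptions, not the scheme analyzed in the paper.

```python
# Sketch of an inexact proximal point iteration with a generalized
# perturbation term; f, phi, and the Nelder-Mead subsolver are
# illustrative assumptions only.
import numpy as np
from scipy.optimize import minimize

def f(x):
    # nonsmooth test objective (illustrative)
    return np.abs(x[0] - 1.0) + 0.5 * x[1] ** 2

def phi(t):
    # "curved enough" perturbation of the distance (illustrative choice)
    return t ** 2

def inexact_proximal_point(x0, lam=1.0, iters=20):
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        def subproblem(y):
            return f(y) + lam * phi(np.linalg.norm(y - x))
        # Nelder-Mead only solves the subproblem approximately -> "inexact" step
        res = minimize(subproblem, x, method="Nelder-Mead",
                       options={"xatol": 1e-6, "fatol": 1e-6})
        x = res.x
    return x

print(inexact_proximal_point([3.0, 2.0]))  # iterates approach the minimizer (1, 0)
```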

Parallel Algorithms for Big Data Optimization

We propose a decomposition framework for the parallel optimization of the sum of a differentiable function and a (block) separable nonsmooth, convex one. The latter term is usually employed to enforce structure in the solution, typically sparsity. Our framework is very flexible and includes both fully parallel Jacobi schemes and Gauss-Seidel (i.e., sequential) ones, as … Read more
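
To make the Jacobi versus Gauss-Seidel distinction concrete, the sketch below applies block proximal-gradient updates to an l1-regularized least-squares problem. The specific objective, blocking, and step size are illustrative assumptions, and the sketch omits the inexactness and convergence machinery of the proposed framework.

```python
# Jacobi (parallel) vs. Gauss-Seidel (sequential) block updates for
#   min_x 0.5*||A x - b||^2 + tau*||x||_1
# The block update rule and step size below are illustrative assumptions.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def block_update(x, idx, A, b, tau, step):
    grad = A.T @ (A @ x - b)              # gradient of the smooth part
    return soft_threshold(x[idx] - step * grad[idx], step * tau)

def jacobi_pass(x, blocks, A, b, tau, step):
    # every block is updated from the same iterate -> fully parallelizable
    new = x.copy()
    for idx in blocks:
        new[idx] = block_update(x, idx, A, b, tau, step)
    return new

def gauss_seidel_pass(x, blocks, A, b, tau, step):
    # each block sees the blocks already updated in this sweep
    x = x.copy()
    for idx in blocks:
        x[idx] = block_update(x, idx, A, b, tau, step)
    return x

rng = np.random.default_rng(0)
A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
blocks = [np.arange(0, 5), np.arange(5, 10)]
step = 1.0 / np.linalg.norm(A, 2) ** 2    # 1/L with L the gradient Lipschitz constant
x = np.zeros(10)
for _ in range(200):
    x = jacobi_pass(x, blocks, A, b, 0.1, step)
```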

Dynamic scaling in the Mesh Adaptive Direct Search algorithm for blackbox optimization

Blackbox optimization deals with situations in which the objective function and constraints are typically computed by launching a time-consuming computer simulation. The subject of this work is the Mesh Adaptive Direct Search (MADS) class of algorithms for blackbox optimization. We propose a way to dynamically scale the mesh, which is the discrete spatial structure … Read more
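
As a rough illustration of why mesh scaling matters, the toy coordinate search below keeps a per-variable mesh size and expands or refines it after each poll. It is not MADS, and the scaling and update rules are illustrative assumptions only.

```python
# Toy coordinate-search poll with a per-variable mesh size; not MADS,
# the scaling and update rules are illustrative assumptions.
import numpy as np

def poll(f, x, mesh):
    # evaluate f at x +/- mesh[i]*e_i and accept the first improvement
    fx = f(x)
    for i in range(len(x)):
        for s in (+1.0, -1.0):
            y = x.copy()
            y[i] += s * mesh[i]
            if f(y) < fx:
                return y, True
    return x, False

def coordinate_search(f, x0, iters=100):
    x = np.asarray(x0, dtype=float)
    mesh = 0.1 * np.maximum(np.abs(x), 1.0)      # scale mesh to variable magnitudes
    for _ in range(iters):
        x, improved = poll(f, x, mesh)
        mesh = mesh * (2.0 if improved else 0.5)  # expand on success, refine on failure
    return x

# badly scaled test problem: the two variables live on very different scales
print(coordinate_search(lambda z: (z[0] - 3.0) ** 2 + (100.0 * z[1] - 1.0) ** 2,
                        [10.0, 10.0]))
```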

Generalized Inexact Proximal Algorithms: Habit’s/Routine’s Formation with Resistance to Change, following Worthwhile Changes

This paper shows how, in a quasi metric space, an inexact proximal algorithm with a generalized perturbation term appears to be a useful tool for the Behavioral Sciences (Psychology, Economics, Management, Game theory, …). More precisely, the new perturbation term represents an index of resistance to change, defined as a “curved enough” function of the quasi distance … Read more

Fixed points and variational principles with applications to capability theory of wellbeing via variational rationality

In this paper we first develop two new results of variational analysis. One is a fixed point theorem for parametric dynamic systems in quasimetric spaces, which can also be interpreted as an existence theorem of minimal points with respect to reflexive and transitive preferences for sets in product spaces. The other one is a variational … Read more

A Scalarization Proximal Point Method for Quasiconvex Multiobjective Minimization

In this paper we propose a scalarization proximal point method to solve multiobjective unconstrained minimization problems with locally Lipschitz and quasiconvex vector functions. We prove, under natural assumptions, that the sequence generated by the method is well defined and converges globally to a Pareto-Clarke critical point. Our method may be seen as an extension, for … Read more

A Family of Subgradient-Based Methods for Convex Optimization Problems in a Unifying Framework

We propose a new family of subgradient- and gradient-based methods which converges with optimal complexity for convex optimization problems whose feasible region is simple enough. This includes cases where the objective function is non-smooth, smooth, has composite/saddle structure, or is given by an inexact oracle model. We unified the way of constructing the subproblems which … Read more
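
A basic member of this class is the projected subgradient method. The sketch below uses an l1-type objective, a Euclidean-ball feasible set, and a diminishing step size as illustrative assumptions; it does not reproduce the unified subproblem construction of the paper.

```python
# Projected subgradient sketch for a convex nonsmooth problem over a simple
# feasible set (Euclidean ball); objective, step size rule, and projection
# are illustrative assumptions.
import numpy as np

def f(x):
    return np.sum(np.abs(x - 1.0))            # nonsmooth convex objective

def subgrad(x):
    return np.sign(x - 1.0)                   # a subgradient of f at x

def project_ball(x, radius=1.0):
    n = np.linalg.norm(x)
    return x if n <= radius else radius * x / n

x = np.zeros(5)
best = f(x)
for k in range(1, 1001):
    x = project_ball(x - (1.0 / np.sqrt(k)) * subgrad(x))  # diminishing step
    best = min(best, f(x))                                 # track best value
print(best)
```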

Forward-backward truncated Newton methods for convex composite optimization

This paper proposes two proximal Newton-CG methods for convex nonsmooth optimization problems in composite form. The algorithms are based on a reformulation of the original nonsmooth problem as the unconstrained minimization of a continuously differentiable function, namely the forward-backward envelope (FBE). The first algorithm is based on a standard line search strategy, whereas the … Read more
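
For orientation, the forward-backward envelope of F = f + g, with f smooth and stepsize gamma, can be written as FBE_gamma(x) = f(x) + <grad f(x), z - x> + ||z - x||^2 / (2*gamma) + g(z), where z = prox_{gamma g}(x - gamma * grad f(x)) is the forward-backward step. The sketch below evaluates this quantity for an l1-regularized least-squares instance; the concrete f, g, and gamma are illustrative assumptions, and it does not include the Newton-CG directions or line searches of the proposed methods.

```python
# Forward-backward step and forward-backward envelope (FBE) for
#   min_x 0.5*||A x - b||^2 + tau*||x||_1
# The concrete f, g, and gamma are illustrative assumptions.
import numpy as np

def grad_f(x, A, b):
    return A.T @ (A @ x - b)                        # gradient of f(x) = 0.5*||Ax - b||^2

def prox_l1(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fbe(x, A, b, tau, gamma):
    gfx = grad_f(x, A, b)
    z = prox_l1(x - gamma * gfx, gamma * tau)       # forward-backward step
    f_x = 0.5 * np.linalg.norm(A @ x - b) ** 2
    # FBE_gamma(x) = f(x) + <grad f(x), z - x> + ||z - x||^2/(2*gamma) + g(z)
    val = (f_x + gfx @ (z - x)
           + np.linalg.norm(z - x) ** 2 / (2 * gamma)
           + tau * np.sum(np.abs(z)))
    return val, z

rng = np.random.default_rng(1)
A, b = rng.standard_normal((20, 8)), rng.standard_normal(20)
gamma = 0.9 / np.linalg.norm(A, 2) ** 2             # gamma < 1/L
x = np.zeros(8)
for _ in range(100):
    val, x = fbe(x, A, b, tau=0.1, gamma=gamma)     # iterate the FB step, report FBE
print(val)
```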

A Trust Region Method for the Solution of the Surrogate Dual in Integer Programming

We propose an algorithm for solving the surrogate dual of a mixed integer program. The algorithm uses a trust region method based on a piecewise affine model of the dual surrogate value function. A new and much more flexible way of updating bounds on the surrogate dual’s value is proposed, which numerical experiments prove to … Read more

Problem Formulations for Simulation-based Design Optimization using Statistical Surrogates and Direct Search

Typical challenges of simulation-based design optimization include unavailable gradients and unreliable approximations thereof, expensive function evaluations, numerical noise, multiple local optima and the failure of the analysis to return a value to the optimizer. One possible remedy to alleviate these issues is to use surrogate models in lieu of the computational models or simulations and … Read more