A Reformulation Technique to Solve Polynomial Optimization Problems with Separable Objective Functions of Bounded Integer Variables

Real-world problems are often nonconvex and involve integer variables, posing vexing challenges even for state-of-the-art solvers. We introduce a mathematical identity-based reformulation of a class of polynomial integer nonlinear optimization (PINLO) problems using a technique that linearizes polynomial functions of separable and bounded integer variables of any degree. We also introduce an alternative …
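To give a flavor of such a linearization, here is a minimal sketch (our own illustration, not necessarily the identity used in the paper): a bounded integer variable x in {lo, ..., hi} can be encoded by binary indicators z_k with sum_k z_k = 1, after which any polynomial f(x), of any degree, becomes the linear expression sum_k f(k) z_k.

```python
# Illustrative sketch (not necessarily the identity used in the paper):
# a polynomial of one bounded integer variable x in {lo, ..., hi} becomes a
# linear expression once x is encoded by binary indicators z_k, sum_k z_k = 1.
def linearize(f, lo, hi):
    """Return the coefficients f(k) of the linear form sum_k f(k) * z_k."""
    return {k: f(k) for k in range(lo, hi + 1)}

f = lambda x: x**3 - 2 * x                # any polynomial of x
coeffs = linearize(f, 0, 5)               # x integer, 0 <= x <= 5

# Check the identity: for every admissible x, f(x) equals the linear form
# evaluated at the corresponding 0/1 indicator vector.
for x in range(0, 6):
    z = {k: int(k == x) for k in range(0, 6)}
    assert f(x) == sum(coeffs[k] * z[k] for k in coeffs)
print("linearization verified for x in {0, ..., 5}")
```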

Minimization over the l1-ball using an active-set non-monotone projected gradient

The l1-ball is a nicely structured feasible set that is widely used in many fields (e.g., machine learning, statistics, and signal analysis) to enforce sparsity in model solutions. In this paper, we devise an active-set strategy for efficiently dealing with minimization problems over the l1-ball and embed it into a tailored algorithmic scheme …
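As background for the projected-gradient setting (an independent sketch, not the paper's active-set scheme), Euclidean projection onto the l1-ball of radius tau can be computed by soft-thresholding with a threshold found by sorting, along the lines of the classical O(n log n) scheme; the plain projected-gradient loop below is likewise only illustrative.

```python
import numpy as np

def project_l1_ball(v, tau=1.0):
    """Euclidean projection of v onto {x : ||x||_1 <= tau} via sorting
    (a standard O(n log n) scheme; illustrative, not the paper's method)."""
    if np.abs(v).sum() <= tau:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]                  # sorted magnitudes, descending
    css = np.cumsum(u)
    rho = np.nonzero(u - (css - tau) / np.arange(1, len(u) + 1) > 0)[0][-1]
    theta = (css[rho] - tau) / (rho + 1.0)        # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

# A bare-bones projected-gradient iteration for min f(x) s.t. ||x||_1 <= tau.
def projected_gradient(grad_f, x0, tau, step=1e-2, iters=500):
    x = x0.copy()
    for _ in range(iters):
        x = project_l1_ball(x - step * grad_f(x), tau)
    return x
```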

A New Multipoint Symmetric Secant Method with a Dense Initial Matrix

In large-scale optimization, when either forming or storing Hessian matrices is prohibitively expensive, quasi-Newton methods are often used in lieu of Newton's method because they only require first-order information to approximate the true Hessian. Multipoint symmetric secant (MSS) methods can be thought of as generalizations of quasi-Newton methods in that they attempt to impose additional requirements on their approximation of the …
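For context, a sketch of the underlying conditions in standard quasi-Newton notation (our own summary, not the paper's exact statement): a classical update asks the new Hessian approximation to satisfy a single secant equation, whereas multipoint symmetric secant methods collect several previous steps and ask for a symmetric approximation that agrees with all of them, at least approximately.

```latex
% Sketch in standard quasi-Newton notation (not the paper's exact statement).
% Classical secant condition on the Hessian approximation B_{k+1}, with
% s_k = x_{k+1} - x_k and y_k = \nabla f(x_{k+1}) - \nabla f(x_k):
\[
  B_{k+1} s_k = y_k .
\]
% Multipoint symmetric secant methods instead collect several previous pairs,
% S_k = [\, s_{k-m+1}, \dots, s_k \,], \quad Y_k = [\, y_{k-m+1}, \dots, y_k \,],
% and seek a symmetric B_{k+1} that (approximately) satisfies all of them:
\[
  B_{k+1} S_k \approx Y_k, \qquad B_{k+1} = B_{k+1}^{\top}.
\]
```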

Full-low evaluation methods for derivative-free optimization

We propose a new class of rigorous methods for derivative-free optimization with the aim of delivering efficient and robust numerical performance for functions of all types, from smooth to non-smooth, and under different noise regimes. To this end, we develop Full-Low Evaluation methods, organized around two main types of iterations. The first iteration type …
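A toy sketch of the two-iteration-type idea (our own illustration; the paper's actual Full-Eval and Low-Eval steps, acceptance tests, and switching rule are more elaborate): alternate between a finite-difference gradient step, which spends many evaluations but makes fast progress on smooth functions, and a coordinate direct-search poll, which uses few evaluations and is robust to noise and non-smoothness.

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Forward-difference gradient: n extra evaluations per call."""
    fx = f(x)
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x); e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def full_low_sketch(f, x0, iters=100, step=1e-2, poll=0.5):
    """Toy alternation between a 'full evaluation' gradient-type step and a
    'low evaluation' coordinate poll (illustrative only)."""
    x, fx = x0.copy(), f(x0)
    for k in range(iters):
        if k % 2 == 0:                        # Full-Eval: finite-difference step
            cand = x - step * fd_gradient(f, x)
        else:                                 # Low-Eval: poll +/- coordinate dirs
            cand, best = x, fx
            for i in range(len(x)):
                for s in (+poll, -poll):
                    y = x.copy(); y[i] += s
                    if f(y) < best:
                        cand, best = y, f(y)
            if np.allclose(cand, x):
                poll *= 0.5                   # shrink the poll size on failure
        if f(cand) < fx:
            x, fx = cand, f(cand)
    return x, fx
```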

Global optimization using random embeddings

We propose a random-subspace algorithmic framework, X-REGO, for global optimization of Lipschitz-continuous objectives, and analyse its convergence using novel tools from conic integral geometry. X-REGO randomly projects, in a sequential or simultaneous manner, the high-dimensional original problem into low-dimensional subproblems that can then be solved with any global, or even local, optimization solver. We estimate …
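A minimal sketch of a single random embedding (illustrative only; X-REGO's sequential and simultaneous schemes and their analysis are in the paper): draw a random matrix A of size n-by-d with d much smaller than n, and hand the reduced problem min_y f(x0 + A y) to any available solver. The test function and the use of scipy's local `minimize` are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize

def random_embedding_step(f, x0, d=2, rng=None):
    """Solve the reduced problem min_y f(x0 + A y) over a random d-dim subspace.
    Illustrative sketch; any global or local solver could replace `minimize`."""
    rng = np.random.default_rng(rng)
    n = len(x0)
    A = rng.standard_normal((n, d)) / np.sqrt(d)   # random embedding matrix
    res = minimize(lambda y: f(x0 + A @ y), np.zeros(d))
    return x0 + A @ res.x, res.fun

# Toy usage: a 100-dimensional function with low effective dimension.
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2 + 1e-3 * np.sum(x[2:] ** 2)
x, fx = random_embedding_step(f, np.zeros(100), d=5, rng=0)
```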

A spectral PALM algorithm for matrix and tensor-train based Dictionary Learning

Dictionary Learning (DL) is one of the leading sparsity-promoting techniques in the context of image classification, where the “dictionary” matrix D of images and the sparse matrix X are determined so as to represent a redundant image dataset. The resulting constrained optimization problem is nonconvex and non-smooth, posing several computational challenges for its solution. …
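For orientation, here is a plain PALM-style iteration for the matrix case (our own sketch, without the spectral step sizes proposed in the paper; the l1 penalty, the unit-norm column constraint on D, and the soft-thresholding prox are illustrative assumptions): alternate proximal gradient steps on X and D for min over D, X of 0.5*||Y - D X||_F^2 + lam*||X||_1.

```python
import numpy as np

def soft_threshold(A, t):
    return np.sign(A) * np.maximum(np.abs(A) - t, 0.0)

def palm_dictionary_learning(Y, n_atoms, lam=0.1, iters=100, rng=0):
    """PALM-style alternating proximal steps for
    0.5*||Y - D X||_F^2 + lam*||X||_1, with unit-norm dictionary columns.
    Plain Lipschitz step sizes; the paper's spectral variant chooses them
    differently."""
    rng = np.random.default_rng(rng)
    m, n = Y.shape
    D = rng.standard_normal((m, n_atoms))
    D /= np.linalg.norm(D, axis=0, keepdims=True)
    X = np.zeros((n_atoms, n))
    for _ in range(iters):
        # Proximal gradient step in X (step from the partial-gradient Lipschitz constant).
        Lx = np.linalg.norm(D, 2) ** 2 + 1e-12
        X = soft_threshold(X - (D.T @ (D @ X - Y)) / Lx, lam / Lx)
        # Gradient step in D, then project columns back onto the unit ball.
        Ld = np.linalg.norm(X, 2) ** 2 + 1e-12
        D = D - ((D @ X - Y) @ X.T) / Ld
        D /= np.maximum(np.linalg.norm(D, axis=0, keepdims=True), 1.0)
    return D, X
```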

Bishop-Phelps cones given by an equation in Banach spaces

In this work, we study Bishop-Phelps cones (briefly, BP cones) given by an equation in Banach spaces. Due to their special form, these cones enjoy interesting properties. We show that nontrivial BP cones given by an equation form a “large family”, in some sense, in any Banach space, and that they can be used to characterize …
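For reference, the classical Bishop-Phelps cone is usually written with an inequality, as below; the specific "given by an equation" subclass studied in the paper is not reproduced here.

```latex
% Background (standard inequality form; the paper's "given by an equation"
% variant is not reproduced here). For a Banach space $X$, a functional
% $f \in X^{*}$ and a scalar $\alpha > 0$, the Bishop-Phelps cone is
\[
  C(f,\alpha) \;=\; \{\, x \in X \;:\; \alpha \|x\| \le f(x) \,\}.
\]
```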

A novel approach for bilevel programs based on Wolfe duality

This paper considers a bilevel program, which has many applications in practice. To develop effective numerical algorithms, it is generally necessary to transform the bilevel program into a single-level optimization problem. The most popular approach is to replace the lower-level program by its KKT conditions, and then the bilevel program can be transformed into a …
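To make the cited transformation concrete (a sketch in generic notation, not the paper's development): when the lower-level problem is convex in its variable and a constraint qualification holds, replacing it by its KKT conditions turns the bilevel program into a single-level mathematical program with complementarity constraints (MPCC).

```latex
% Sketch of the standard KKT-based single-level reformulation (generic notation).
% Bilevel program:
\[
  \min_{x,\,y} \; F(x,y)
  \quad \text{s.t.} \quad G(x,y) \le 0, \qquad
  y \in \arg\min_{y'} \{\, g(x,y') \;:\; h(x,y') \le 0 \,\}.
\]
% Replacing the lower-level problem by its KKT conditions (valid under convexity
% and a constraint qualification in y) gives the single-level MPCC:
\[
  \min_{x,\,y,\,\lambda} \; F(x,y)
  \quad \text{s.t.} \quad G(x,y) \le 0, \quad
  \nabla_y g(x,y) + \nabla_y h(x,y)^{\top} \lambda = 0, \quad
  0 \le \lambda \perp -h(x,y) \ge 0 .
\]
```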

Generating Cutting Inequalities Successively for Quadratic Optimization Problems in Binary Variables

We propose a successive generation of cutting inequalities for binary quadratic optimization problems. Multiple cutting inequalities are successively generated for the convex hull of the set of optimal solutions $\subset \{0, 1\}^n$, while the standard cutting inequalities are used for the convex hull of the feasible region. An arbitrary linear inequality with integer coefficients …
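As background on valid inequalities for the feasible-region convex hull (classical McCormick-type relations, given only for orientation; these are not the successively generated inequalities proposed in the paper, and they need not coincide with the "standard" cuts it refers to): for binary variables one has x_i^2 = x_i, and the product y_{ij} = x_i x_j is captured exactly on {0,1}^2 by the inequalities below.

```latex
% Classical valid inequalities for products of binary variables
% (background only; not the paper's successively generated cuts).
\[
  y_{ij} \le x_i, \qquad y_{ij} \le x_j, \qquad
  y_{ij} \ge x_i + x_j - 1, \qquad y_{ij} \ge 0,
\]
% together with $x_i^2 = x_i$ for $x_i \in \{0,1\}$.
```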

Derivative-free separable quadratic modeling and cubic regularization for unconstrained optimization

We present a derivative-free separable quadratic modeling and cubic regularization technique for solving smooth unconstrained minimization problems. The derivative-free approach is mainly concerned with building a quadratic model that could be generated by numerical interpolation or using a minimum Frobenius norm approach, when the number of available points does not allow building a complete …
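A rough sketch of how a separable quadratic model combines with cubic regularization (our own illustration; the diagonal model entries g_i and h_i would come from the interpolation step described above, and the coordinatewise |s_i|^3 regularizer is an assumption made so that the step computation decouples into one-dimensional problems):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def cubic_reg_step(g, h, sigma):
    """Minimize the separable model  sum_i [ g_i*s_i + 0.5*h_i*s_i^2 ]
    plus a coordinatewise cubic regularizer (sigma/3)*|s_i|^3.
    Because everything separates, each coordinate is a 1-D problem.
    Illustrative sketch; g, h would come from derivative-free interpolation."""
    s = np.zeros_like(g)
    for i in range(len(g)):
        phi = lambda t: g[i] * t + 0.5 * h[i] * t**2 + (sigma / 3.0) * abs(t)**3
        s[i] = minimize_scalar(phi).x
    return s

# Toy usage with made-up model coefficients.
g = np.array([1.0, -2.0, 0.5])     # model gradient (from interpolation)
h = np.array([-1.0, 4.0, 0.0])     # diagonal model curvature (possibly negative)
step = cubic_reg_step(g, h, sigma=2.0)
```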