Hedge algorithm and Dual Averaging schemes

We show that the Hedge algorithm, a method that is widely used in Machine Learning, can be interpreted as a particular instance of Dual Averaging schemes, which have recently been introduced by Nesterov for regret minimization. Based on this interpretation, we establish three alternative versions of the Hedge algorithm: one in the form of the … Read more
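For orientation, here is a minimal sketch of the standard Hedge (multiplicative weights) update on a toy loss sequence; the learning rate, the loss model, and all variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def hedge(loss_rounds, eta=0.5):
    """Run the standard Hedge update over a (rounds x experts) loss matrix."""
    n_experts = loss_rounds.shape[1]
    w = np.ones(n_experts)
    total_loss = 0.0
    for losses in loss_rounds:            # per-round losses in [0, 1]
        p = w / w.sum()                   # current distribution over experts
        total_loss += p @ losses          # expected loss suffered this round
        w *= np.exp(-eta * losses)        # exponential (Hedge) reweighting
    return p, total_loss

rng = np.random.default_rng(0)
p, cum_loss = hedge(rng.random((100, 5)))
```

Note that the played distribution depends on the losses only through their running sum, which is the dual-averaging viewpoint: each iterate is an entropy prox-mapping of the accumulated dual information.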

A randomized Mirror-Prox method for solving structured large-scale matrix saddle-point problems

In this paper, we derive a randomized version of the Mirror-Prox method for solving some structured matrix saddle-point problems, such as the maximal eigenvalue minimization problem. Deterministic first-order schemes, such as Nesterov’s Smoothing Techniques or standard Mirror-Prox methods, require the exact computation of a matrix exponential at every iteration, limiting the size of the problems … Read more
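To make the bottleneck concrete, here is a sketch of the matrix-entropy prox-mapping that deterministic smoothing and Mirror-Prox schemes evaluate at every iteration for eigenvalue problems; the matrix size and symmetrization below are assumptions chosen for illustration, and the paper's randomized method is aimed precisely at avoiding this dense computation.

```python
import numpy as np
from scipy.linalg import expm

def entropy_prox(G):
    """Return exp(G) normalized to unit trace (shifted by the top eigenvalue for stability)."""
    shift = np.max(np.linalg.eigvalsh(G))
    E = expm(G - shift * np.eye(G.shape[0]))   # dense O(n^3) matrix exponential
    return E / np.trace(E)

rng = np.random.default_rng(1)
G = rng.standard_normal((200, 200))
G = (G + G.T) / 2                              # a symmetric dual iterate
X = entropy_prox(G)                            # the costly per-iteration step
```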

A Family of Newton Methods for Nonsmooth Constrained Systems with Nonisolated Solutions

We propose a new family of Newton-type methods for the solution of constrained systems of equations. Under suitable conditions, which require neither differentiability nor local uniqueness of solutions, local quadratic convergence to a solution of the system of equations can be established. We show that as particular instances of the method we obtain inexact … Read more
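As a point of reference only, here is a baseline smooth, unconstrained Newton iteration on a small system of equations; the paper's family additionally handles constraints, nonsmoothness, and nonisolated solutions, none of which this sketch covers. The test system is an assumption.

```python
import numpy as np

def newton_system(F, J, x, tol=1e-10, max_iter=50):
    """Plain Newton iteration for F(x) = 0 with Jacobian J."""
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        x = x - np.linalg.solve(J(x), Fx)   # Newton step: solve J(x) d = -F(x)
    return x

F = lambda x: np.array([x[0]**2 + x[1] - 2.0, x[0] + x[1]**2 - 2.0])
J = lambda x: np.array([[2*x[0], 1.0], [1.0, 2*x[1]]])
sol = newton_system(F, J, np.array([2.0, 0.5]))   # converges to (1, 1)
```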

A branch and bound algorithm for the global optimization of Hessian Lipschitz continuous functions

We present a branch and bound algorithm for the global optimization of a twice differentiable nonconvex objective function with a Lipschitz continuous Hessian over a compact, convex set. The algorithm is based on applying cubic regularisation techniques to the objective function within an overlapping branch and bound algorithm for convex constrained global optimization. Unlike other … Read more
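A minimal sketch of the kind of bound a Lipschitz continuous Hessian makes available: if L bounds the Lipschitz constant of the Hessian, then the cubic model f(x0) + g0'(x - x0) + 0.5 (x - x0)'H0(x - x0) - (L/6)||x - x0||^3 underestimates f. The constant L and the toy objective below are assumptions; the paper embeds bounds of this type in an overlapping branch and bound scheme.

```python
import numpy as np

def cubic_lower_bound(x, x0, f0, g0, H0, L):
    """Cubic underestimator of f at x, built from data at x0 and Hessian Lipschitz constant L."""
    s = x - x0
    return f0 + g0 @ s + 0.5 * s @ H0 @ s - (L / 6.0) * np.linalg.norm(s) ** 3

f  = lambda x: np.cos(x[0]) + np.sin(x[1])          # toy objective (assumed); L = 1 is valid here
x0 = np.array([0.3, 0.7])
g0 = np.array([-np.sin(x0[0]), np.cos(x0[1])])
H0 = np.diag([-np.cos(x0[0]), -np.sin(x0[1])])
lb = cubic_lower_bound(np.array([0.5, 0.4]), x0, f(x0), g0, H0, L=1.0)
```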

A Security Framework for Smart Metering with Multiple Data Consumers

The increasing diffusion of Automatic Meter Reading (AMR) has raised many concerns about the protection of personal data related to energy, water or gas consumption, from which details about the habits of the users can be inferred. On the other hand, aggregated measurements about consumption are crucial for several goals, including resource provisioning, forecasting, and … Read more

A new family of high order directions for unconstrained optimization inspired by Chebyshev and Shamanskii methods

The Newton-Raphson method, dating back to 1669-1670, is still used to solve systems of equations and unconstrained optimization problems. Since then, several other algorithms inspired by Newton's method have been proposed: in 1839 Chebyshev developed an algorithm with cubic convergence, and in 1967 Shamanskii proposed an acceleration of Newton's method. By considering Newton-type methods as displacement directions, … Read more
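For illustration, here is a sketch of the Shamanskii idea mentioned above: reuse one Jacobian for several inner steps before refreshing it, plain Newton being the special case m = 1. The test system and all parameters are assumptions.

```python
import numpy as np

def shamanskii(F, J, x, m=3, outer=20, tol=1e-12):
    """Shamanskii-type acceleration: evaluate the Jacobian once per outer cycle, reuse it m times."""
    for _ in range(outer):
        Jx = J(x)                           # Jacobian evaluated once per cycle
        for _ in range(m):                  # ...then reused for m Newton-like steps
            Fx = F(x)
            if np.linalg.norm(Fx) < tol:
                return x
            x = x - np.linalg.solve(Jx, Fx)
    return x

F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])
J = lambda x: np.array([[2*x[0], 2*x[1]], [1.0, -1.0]])
root = shamanskii(F, J, np.array([1.0, 0.5]))   # tends to (1/sqrt(2), 1/sqrt(2))
```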

On the Difficulty of Deciding Asymptotic Stability of Cubic Homogeneous Vector Fields

It is well known that asymptotic stability (AS) of homogeneous polynomial vector fields of degree one (i.e., linear systems) can be decided in polynomial time, e.g., by searching for a quadratic Lyapunov function. Since homogeneous vector fields of even degree can never be AS, the next interesting degree to consider is three. In this … Read more
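To illustrate the degree-one case the abstract refers to, here is a sketch of the standard quadratic-Lyapunov test for a linear system x' = Ax: solve A'P + PA = -Q for some Q > 0 and check that P is positive definite. The matrix A below is an assumption; deciding AS for cubic homogeneous vector fields is the much harder question the paper addresses.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[0.0, 1.0], [-2.0, -3.0]])      # example linear system (assumed)
Q = np.eye(2)
P = solve_continuous_lyapunov(A.T, -Q)        # solves A'P + PA = -Q
is_AS = np.all(np.linalg.eigvalsh(P) > 0)     # P positive definite implies asymptotic stability
```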

Subdifferentials of nonconvex supremum functions and their applications to semi-infinite and infinite programs with Lipschitzian data

The paper is devoted to the subdifferential study and applications of the supremum of uniformly Lipschitzian functions over arbitrary index sets with no topology. Based on advanced techniques of variational analysis, we evaluate major subdifferentials of the supremum functions in the general framework of Asplund (in particular, reflexive) spaces with no convexity or relaxation assumptions. … Read more
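For orientation, the classical convex formula that results of this type refine: under standard compactness and upper semicontinuity assumptions (which are not required in the paper's Lipschitzian setting with arbitrary, non-topological index sets), the subdifferential of a supremum function is the closed convex hull of the subdifferentials of the active functions.

```latex
\partial f(\bar x) \;=\; \overline{\operatorname{co}}
\Bigl\{\, \bigcup_{t \in T(\bar x)} \partial f_t(\bar x) \Bigr\},
\qquad
f(x) = \sup_{t \in T} f_t(x), \quad
T(\bar x) = \{\, t \in T : f_t(\bar x) = f(\bar x) \,\}.
```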

Improved Column Generation for Highly Degenerate Master Problems

Column generation for solving linear programs with a huge number of variables alternates between solving a master problem and a pricing subproblem to add variables to the master problem as needed. The method is known to suffer from degeneracy of the master problem, giving rise to the so-called tailing-off effect. Inspired by recent advances in … Read more
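A minimal sketch of the plain column generation loop described above, on a toy cutting-stock instance; all data, tolerances, and the brute-force pricing routine are assumptions, and the paper's contribution concerns improving this basic scheme when the master problem is highly degenerate.

```python
import numpy as np
from itertools import product
from scipy.optimize import linprog

# Toy cutting-stock instance (assumed): roll length, piece sizes, and demands.
roll_len = 10
sizes    = np.array([3, 4, 5])
demand   = np.array([30, 20, 10])

# Start with one single-size pattern per piece type (initial columns).
patterns = [np.eye(len(sizes))[i] * (roll_len // s) for i, s in enumerate(sizes)]

for _ in range(20):
    A = np.column_stack(patterns)
    # Restricted master LP: minimize rolls used subject to A x >= demand, x >= 0.
    master = linprog(np.ones(A.shape[1]), A_ub=-A, b_ub=-demand, method="highs")
    duals = -master.ineqlin.marginals            # dual prices of the demand rows
    # Pricing subproblem: most valuable feasible pattern (brute-force knapsack).
    best_val, best_pat = 1.0, None
    for counts in product(*(range(roll_len // s + 1) for s in sizes)):
        a = np.array(counts, dtype=float)
        if a @ sizes <= roll_len and a @ duals > best_val + 1e-9:
            best_val, best_pat = a @ duals, a
    if best_pat is None:                         # no column with negative reduced cost
        break
    patterns.append(best_pat)                    # add the new column to the master

print(f"LP value: {master.fun:.3f} rolls with {len(patterns)} patterns")
```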