Secant acceleration of sequential residual methods for solving large-scale nonlinear systems of equations

Sequential Residual Methods try to solve nonlinear systems of equations $F(x)=0$ by iteratively updating the current approximate solution along a residual-related direction. As a result, memory requirements are minimal, which makes these methods attractive for solving large-scale nonlinear systems. However, the convergence of these algorithms may be slow in critical cases, so acceleration procedures are welcome. … Read more
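
The acceleration scheme itself is not detailed in this excerpt; for orientation only, a generic sequential residual iteration of the DF-SANE type, with a secant-based (spectral) steplength, can be written as

\[
x_{k+1} = x_k - \sigma_k F(x_k), \qquad
\sigma_k = \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} y_{k-1}}, \qquad
s_{k-1} = x_k - x_{k-1}, \quad y_{k-1} = F(x_k) - F(x_{k-1}),
\]

where $\sigma_k$ is one common choice of spectral coefficient obtained from a secant condition on the previous step; line-search safeguards are omitted. This sketches the class of methods being accelerated, not the paper's acceleration procedure.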

A rolling-horizon approach for multi-period optimization

Mathematical optimization problems that include a time dimension abound. For example, logistics, process optimization, and production planning tasks must often be optimized over a range of time periods. These problems with time structure are usually very large and cannot be solved to global optimality by modern solvers within a reasonable amount of time. Therefore, the so-called … Read more
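
The rolling-horizon idea admits a compact sketch. The Python fragment below is a schematic illustration under our own assumptions, not the paper's algorithm; in particular, solve_window and all data structures are hypothetical placeholders for a solver call over a limited time window.

def rolling_horizon(data, horizon_length, window, overlap, solve_window):
    """Schematic rolling-horizon loop: solve a short window of periods,
    freeze the decisions of its leading part, then roll the window forward.

    solve_window(data, start, end, fixed) stands for any solver that
    optimizes periods [start, end) given already-fixed decisions and
    returns a dict mapping each period to its decisions."""
    fixed = {}
    start = 0
    step = window - overlap  # periods frozen per iteration (requires overlap < window)
    while start < horizon_length:
        end = min(start + window, horizon_length)
        solution = solve_window(data, start, end, fixed)
        for t in range(start, min(start + step, end)):
            fixed[t] = solution[t]
        start += step
    return fixed

Solving overlapping windows and freezing only their early periods trades global optimality for tractability, which is the essence of the rolling-horizon heuristic.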

Compact Representations of Structured BFGS Matrices

For general large-scale optimization problems, compact representations exist in which recursive quasi-Newton update formulas are expressed as compact matrix factorizations. For problems in which the objective function contains additional structure, so-called structured quasi-Newton methods exploit available second-derivative information and approximate unavailable second derivatives. This article develops the compact representations of two structured Broyden-Fletcher-Goldfarb-Shanno (BFGS) update formulas. … Read more
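
For context, the classical compact representation of the (unstructured) BFGS matrix due to Byrd, Nocedal, and Schnabel is

\[
B_k = B_0 -
\begin{bmatrix} B_0 S_k & Y_k \end{bmatrix}
\begin{bmatrix} S_k^{\top} B_0 S_k & L_k \\ L_k^{\top} & -D_k \end{bmatrix}^{-1}
\begin{bmatrix} S_k^{\top} B_0 \\ Y_k^{\top} \end{bmatrix},
\]

where $S_k = [\,s_0, \ldots, s_{k-1}\,]$ and $Y_k = [\,y_0, \ldots, y_{k-1}\,]$ collect the iterate and gradient differences, $L_k$ is the strictly lower triangular part of $S_k^{\top} Y_k$, and $D_k$ is its diagonal. The structured variants developed in the article modify this representation to incorporate the known part of the Hessian; they are not reproduced here.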

Solving Large Scale Cubic Regularization by a Generalized Eigenvalue Problem

Cubic Regularization methods have several favorable properties. In particular, under mild assumptions they are globally convergent towards critical points at which second-order necessary conditions are satisfied. Their adoption among practitioners, however, does not yet match the strong theoretical results. One of the reasons for this discrepancy may be the additional implementation complexity needed to solve the occurring … Read more
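
Cubic regularization methods require, at each iteration, the (approximate) solution of a cubic-regularized model subproblem; in its standard, paper-independent form it reads

\[
\min_{s \in \mathbb{R}^n} \; g^{\top} s + \tfrac{1}{2}\, s^{\top} H s + \tfrac{\sigma}{3}\, \|s\|^{3},
\]

where $g$ and $H$ are the current gradient and (approximate) Hessian and $\sigma > 0$ is the regularization parameter; the title indicates that the article addresses the large-scale solution of such subproblems via a generalized eigenvalue problem.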

Random projections for quadratic programs

Random projections map a set of points in a high-dimensional space to a lower-dimensional one while approximately preserving all pairwise Euclidean distances. While random projections are usually applied to numerical data, we show they can be successfully applied to quadratic programming formulations over a set of linear inequality constraints. Instead of solving … Read more
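
As a rough illustration of the sketching idea, the numpy fragment below applies a Gaussian (Johnson-Lindenstrauss style) random projection to the data of a constraint system $Ax \le b$; the dimensions and scaling are our own choices and the snippet is not the paper's construction.

import numpy as np

rng = np.random.default_rng(0)
m, n, k = 10000, 200, 400          # many constraints, moderate number of variables, sketch size k << m

A = rng.standard_normal((m, n))    # data of the constraint system A x <= b
b = rng.standard_normal(m)

# Gaussian random projection; the 1/sqrt(k) scaling approximately preserves
# Euclidean norms and pairwise distances with high probability
T = rng.standard_normal((k, m)) / np.sqrt(k)

A_proj, b_proj = T @ A, T @ b      # projected data with only k rows

How the projected data enters a reduced quadratic program, and which approximation guarantees carry over to its solution, is the subject of the paper and is not reproduced here.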

An Inexact Primal-Dual Smoothing Framework for Large-Scale Non-Bilinear Saddle Point Problems

We develop an inexact primal-dual first-order smoothing framework to solve a class of non-bilinear saddle point problems with primal strong convexity. Compared with existing methods, our framework significantly improves the primal oracle complexity while achieving competitive dual oracle complexity. In addition, we consider the situation where the primal-dual coupling term has … Read more
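
For orientation, a generic problem of this type can be written (in our notation, which need not match the authors') as

\[
\min_{x \in X} \; \max_{y \in Y} \; f(x) + \Phi(x, y) - g(y),
\]

where $f$ is strongly convex, $g$ is convex, and the coupling term $\Phi(x, y)$, convex in $x$ and concave in $y$, need not be bilinear, in contrast to the classical case $\Phi(x, y) = \langle A x, y \rangle$.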

On Electricity Market Equilibria with Storage: Modeling, Uniqueness, and a Distributed ADMM

We consider spot-market trading of electricity including storage operators as additional agents besides producers and consumers. Storages allow for shifting produced electricity from one time period to a later one. Due to this, multiple market equilibria may occur even if classical uniqueness assumptions for the case without storages are satisfied. For models containing storage operators, … Read more
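
The distributed ADMM mentioned in the title is not described in this excerpt; as background only, the standard scaled ADMM iteration for $\min_{x, z} f(x) + g(z)$ subject to $A x + B z = c$ reads

\[
\begin{aligned}
x^{k+1} &= \arg\min_{x} \; f(x) + \tfrac{\rho}{2} \| A x + B z^{k} - c + u^{k} \|^{2}, \\
z^{k+1} &= \arg\min_{z} \; g(z) + \tfrac{\rho}{2} \| A x^{k+1} + B z - c + u^{k} \|^{2}, \\
u^{k+1} &= u^{k} + A x^{k+1} + B z^{k+1} - c,
\end{aligned}
\]

with penalty parameter $\rho > 0$ and scaled dual variable $u$; how the equilibrium model with storage operators is split across agents is the contribution of the paper.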

Machine learning approach to chance-constrained problems: An algorithm based on the stochastic gradient descent

We consider chance-constrained problems with a discrete random distribution and target problems with a large number of scenarios. We propose a novel method based on stochastic gradient descent that updates the decision variable while looking at only a few scenarios at a time. We modify it to handle the non-separable objective. A complexity … Read more
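
The scenario-sampling idea can be sketched in a few lines; the smooth surrogate, the data, and all parameters below are our own illustrative assumptions, not the authors' formulation.

import numpy as np

rng = np.random.default_rng(0)
n, num_scenarios = 50, 100_000
A = rng.standard_normal((num_scenarios, n))   # scenario i asks for a_i^T x <= b_i
b = rng.standard_normal(num_scenarios) + 3.0
c = rng.standard_normal(n)                    # hypothetical linear objective c^T x
penalty, ridge, step, batch = 10.0, 0.1, 1e-2, 64

def sigmoid(z):
    # numerically stable logistic function, the derivative of softplus
    return 0.5 * (1.0 + np.tanh(0.5 * z))

x = np.zeros(n)
for _ in range(2000):
    idx = rng.integers(0, num_scenarios, size=batch)   # look at only a few scenarios
    viol = A[idx] @ x - b[idx]
    # stochastic gradient of c^T x + (ridge/2)||x||^2 + penalty * E[softplus(a_i^T x - b_i)]
    grad = c + ridge * x + penalty * (sigmoid(viol) @ A[idx]) / batch
    x -= step * grad

Replacing the indicator of constraint violation by a smooth surrogate and averaging over a small random batch of scenarios is what keeps each update cheap even when the number of scenarios is very large.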

User Manual for BBCPOP: A Sparse Doubly Nonnegative Relaxation of Polynomial Optimization Problems with Binary, Box and Complementarity Constraints

BBCPOP, proposed in [4], is a MATLAB implementation of a hierarchy of sparse doubly nonnegative (DNN) relaxations of a class of polynomial optimization (minimization) problems (POPs) with binary, box, and complementarity constraints. Given a POP in the class and a relaxation order (or a hierarchy level), BBCPOP constructs a simple conic optimization problem (COP), which … Read more
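
Schematically, and in simplified notation of our own, the class of POPs handled can be written as

\[
\min_{x \in \mathbb{R}^{n}} \; f_{0}(x)
\quad \text{s.t.} \quad
x \in [0, 1]^{n}, \qquad
x_{i} \in \{0, 1\} \;\; (i \in I_{\mathrm{bin}}), \qquad
x_{j} x_{k} = 0 \;\; ((j, k) \in C),
\]

where $f_{0}$ is a polynomial, $I_{\mathrm{bin}}$ indexes the binary variables, and $C$ collects the complementarity pairs; BBCPOP replaces such a POP by a sparse DNN relaxation whose tightness is governed by the chosen relaxation order.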

“Relative-Continuity” for Non-Lipschitz Non-Smooth Convex Optimization using Stochastic (or Deterministic) Mirror Descent

The usual approach to developing and analyzing first-order methods for non-smooth (stochastic or deterministic) convex optimization assumes that the objective function is uniformly Lipschitz continuous with parameter $M_f$. However, in many settings the non-differentiable convex function $f(\cdot)$ is not uniformly Lipschitz continuous; for example, (i) the classical support vector machine (SVM) problem, (ii) the … Read more
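
For context, the (stochastic) mirror descent update on which such analyses are built is, in its standard form,

\[
x_{k+1} = \arg\min_{x \in X} \; \Big\{ \alpha_{k} \langle g_{k}, x \rangle + D_{h}(x, x_{k}) \Big\},
\qquad
D_{h}(x, y) = h(x) - h(y) - \langle \nabla h(y), x - y \rangle,
\]

where $g_{k}$ is a (stochastic) subgradient of $f$ at $x_{k}$, $h$ is the reference (prox) function, and $D_{h}$ the associated Bregman divergence. Loosely speaking, relative continuity replaces the uniform Lipschitz bound $M_f$ by a bound on the growth of $f$ expressed through $D_{h}$, which is what allows non-Lipschitz examples such as those mentioned above to be handled.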