On the abs-polynomial expansion of piecewise smooth functions

Tom Streubel has observed that for functions in abs-normal form, generalized Taylor expansions of arbitrary order $\bar d - 1$ can be generated by algorithmic piecewise differentiation. Abs-normal form means that the real or vector valued function is defined by an evaluation procedure that involves the absolute value function $|\cdot|$ apart from arithmetic operations and $\bar d$ … Read more

New convergence results for the inexact variable metric forward-backward method

Forward–backward methods are effective tools for solving a variety of optimization problems in which the objective function is the sum of a smooth, possibly nonconvex term and a convex, possibly nonsmooth function. The corresponding iteration is built on two main ingredients: the computation of the gradient of the smooth part and the evaluation of the proximity … Read more
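The two ingredients mentioned in the abstract can be sketched in minimal form. The example below assumes, purely for illustration, a least-squares smooth term and an L1 nonsmooth term with its well-known proximity operator (soft-thresholding); the names, data, and fixed step size are illustrative choices, not taken from the paper:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximity operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, lam, step, n_iter=500):
    """Forward-backward iteration for 0.5*||A x - b||^2 + lam*||x||_1."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                          # forward (gradient) step
        x = soft_threshold(x - step * grad, step * lam)   # backward (proximal) step
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([1.0, 0.0, -2.0, 0.0, 0.5])
# A constant step below 1/L, with L the largest eigenvalue of A^T A,
# guarantees convergence for this convex instance.
L = np.linalg.eigvalsh(A.T @ A).max()
x = forward_backward(A, b, lam=0.1, step=1.0 / L)
```

Variable-metric and inexact variants, as studied in the paper, replace the fixed step with a scaling matrix and allow the proximal step to be computed approximately.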

A general branch-and-bound framework for continuous global multiobjective optimization

Current generalizations of the central ideas of single-objective branch-and-bound to the multiobjective setting do not seem to follow their train of thought all the way. The present paper complements the various suggestions for generalizations of partial lower bounds and of overall upper bounds by general constructions for overall lower bounds from partial lower bounds, and … Read more

Sequential Quadratic Optimization for Nonlinear Equality Constrained Stochastic Optimization

Sequential quadratic optimization algorithms are proposed for solving smooth nonlinear optimization problems with equality constraints. The main focus is an algorithm proposed for the case when the constraint functions are deterministic, and constraint function and derivative values can be computed explicitly, but the objective function is stochastic. It is assumed in this setting that it … Read more
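The deterministic backbone of such an algorithm is the classical SQP step for equality-constrained problems, obtained by solving a KKT system. The sketch below illustrates this on a hypothetical toy problem (quadratic objective, one linear constraint); the problem data and the use of exact derivatives are illustrative assumptions, not the stochastic setting of the paper:

```python
import numpy as np

# Hypothetical problem: min f(x) = x1^2 + x2^2  s.t.  c(x) = x1 + x2 - 1 = 0.
def grad_f(x):  return 2.0 * x
def c(x):       return np.array([x[0] + x[1] - 1.0])
def jac_c(x):   return np.array([[1.0, 1.0]])

def sqp_step(x, H):
    """One SQP step: solve the KKT system
        [H  J^T] [d]   [-grad f(x)]
        [J   0 ] [y] = [   -c(x)  ]
    for the primal step d (y is a multiplier estimate)."""
    J = jac_c(x)
    m = J.shape[0]
    K = np.block([[H, J.T], [J, np.zeros((m, m))]])
    rhs = np.concatenate([-grad_f(x), -c(x)])
    sol = np.linalg.solve(K, rhs)
    return x + sol[:x.size]

x = np.array([2.0, -3.0])
H = 2.0 * np.eye(2)   # exact Hessian of f for this toy objective
for _ in range(5):
    x = sqp_step(x, H)
```

Because the objective is quadratic and the constraint linear, the iteration reaches the solution (0.5, 0.5) in one step here; the stochastic setting of the paper replaces grad_f with a noisy estimate.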

Exact Penalty Function for L21 Norm Minimization over the Stiefel Manifold

L21 norm minimization with orthogonality constraints, whose feasible region is the Stiefel manifold, has wide applications in statistics and data science. State-of-the-art approaches apply proximal gradient techniques on either the Stiefel manifold or its tangent spaces. The resulting subproblem has no closed-form solution and hence requires an iterative procedure to solve, which is … Read more
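For reference, the two objects named above can be sketched directly: the L21 norm of a matrix (the sum of the Euclidean norms of its rows) and the Stiefel feasibility condition $X^\top X = I$. The polar retraction via a thin SVD shown below is one standard way to map a point back onto the manifold; it is an illustrative choice, not necessarily the operation used in the paper:

```python
import numpy as np

def l21_norm(X):
    """L21 norm: sum of the Euclidean norms of the rows of X."""
    return float(np.sum(np.linalg.norm(X, axis=1)))

def polar_retraction(Y):
    """Map a full-rank matrix onto the Stiefel manifold {X : X^T X = I}
    via its polar factor, computed from a thin SVD."""
    U, _, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ Vt

rng = np.random.default_rng(1)
Y = rng.standard_normal((6, 3))
X = polar_retraction(Y)
# X satisfies the orthogonality constraint up to rounding error
orth_err = np.linalg.norm(X.T @ X - np.eye(3))
```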

A Discussion on Electricity Prices, or the Two Sides of the Coin

We examine how different pricing frameworks deal with nonconvex features typical of day-ahead energy prices when the power system is hydro-dominated, as in Brazil. For the system operator, minimum-generation requirements translate into feasibility issues that are fundamental to carrying the generated power through the network. When utilities are remunerated at a price depending … Read more

Necessary and sufficient conditions for rank-one generated cones

A closed convex conic subset $\mathcal{S}$ of the positive semidefinite (PSD) cone is rank-one generated (ROG) if all of its extreme rays are generated by rank-one matrices. The ROG property of $\mathcal{S}$ is closely related to the exactness of SDP relaxations of nonconvex quadratically constrained quadratic programs (QCQPs) related to $\mathcal{S}$. We consider the case … Read more
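A standard fact underlying the ROG definition is that the full PSD cone is itself ROG: its extreme rays are the rank-one matrices $xx^\top$, and every PSD matrix is a nonnegative combination of such matrices via its spectral decomposition. A quick numerical check of this decomposition (with arbitrary random data, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4))
S = B @ B.T                        # a random PSD matrix
w, V = np.linalg.eigh(S)           # eigenvalues w >= 0, orthonormal eigenvectors V
# S as a nonnegative combination of rank-one PSD matrices v v^T
S_rebuilt = sum(w[i] * np.outer(V[:, i], V[:, i]) for i in range(4))
```

The interesting question addressed by the paper is when this property survives after intersecting the PSD cone with additional linear constraints.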

Characterization of an Anomalous Behavior of a Practical Smoothing Technique

A practical smoothing method was analyzed and tested against state-of-the-art solvers on some non-smooth optimization problems in [BSS20a; BSS20b]. This method can be used to smooth the value functions and solution mappings of fully parameterized convex problems under mild conditions. In general, the smoothed value function lies above the true value function … Read more

Mathematical Programming formulations for the Alternating Current Optimal Power Flow problem

Power flow refers to the injection of power on the lines of an electrical grid, so that all the injections at the nodes form a consistent flow within the network. Optimality, in this setting, is usually intended as the minimization of the cost of generating power. Current can either be direct or alternating: while the … Read more

Convex Maximization via Adjustable Robust Optimization

Maximizing a convex function over convex constraints is an NP-hard problem in general. We prove that such a problem can be reformulated as an adjustable robust optimization (ARO) problem where each adjustable variable corresponds to a unique constraint of the original problem. We use ARO techniques to obtain approximate solutions to the convex maximization problem. … Read more
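The source of hardness here is worth illustrating: a convex function attains its maximum over a polytope at a vertex, and polytopes can have exponentially many vertices. The toy example below (maximizing the squared Euclidean norm over a box, brute-forcing all $2^n$ vertices) is an illustration of that combinatorial structure only, not of the ARO reformulation developed in the paper:

```python
import numpy as np
from itertools import product

def f(x):
    """A convex objective: squared Euclidean norm."""
    return float(np.dot(x, x))

# Maximize f over the box [-1, 1]^n. The maximum of a convex function over a
# polytope is attained at a vertex, so enumerating the 2^n vertices suffices
# for this tiny case -- and shows why the general problem is hard.
n = 3
best = max(f(np.array(v)) for v in product([-1.0, 1.0], repeat=n))
# best equals n, attained at every vertex of the box
```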