Deterministic upper bounds in global minimization with nonlinear equality constraints

We address the problem of deterministically determining upper bounds in continuous non-convex global minimization of box-constrained problems with equality constraints. These upper bounds are important for the termination of spatial branch-and-bound algorithms. Our method is based on Miranda's theorem, which helps to ensure the existence of feasible points in certain boxes. Then, the … Read more
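For reference, Miranda's theorem can be stated as follows (our notation, not the paper's):

\[
\begin{aligned}
&\text{Let } F=(f_1,\dots,f_n)\colon B\to\mathbb{R}^n \text{ be continuous on the box } B=[a_1,b_1]\times\cdots\times[a_n,b_n].\\
&\text{If, for every } i,\ f_i(x)\le 0 \text{ whenever } x_i=a_i \text{ and } f_i(x)\ge 0 \text{ whenever } x_i=b_i,\\
&\text{then } F(x^\ast)=0 \text{ for some } x^\ast\in B.
\end{aligned}
\]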

Understanding the Acceleration Phenomenon via High-Resolution Differential Equations

Gradient-based optimization algorithms can be studied from the perspective of limiting ordinary differential equations (ODEs). Motivated by the fact that existing ODEs do not distinguish between two fundamentally different algorithms—Nesterov’s accelerated gradient method for strongly convex functions (NAG-SC) and Polyak’s heavy-ball method—we study an alternative limiting process that yields high-resolution ODEs. We … Read more
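As a point of reference, and as we recall from this line of work (our paraphrase, not quoted from the abstract), for an $L$-smooth, $\mu$-strongly convex $f$ with step size $s$ the low-resolution ODE shared by both methods and the high-resolution ODE of NAG-SC take roughly the form

\[
\ddot{X} + 2\sqrt{\mu}\,\dot{X} + \nabla f(X) = 0
\qquad\text{(low resolution, NAG-SC and heavy-ball alike)},
\]
\[
\ddot{X} + 2\sqrt{\mu}\,\dot{X} + \sqrt{s}\,\nabla^2 f(X)\,\dot{X} + \bigl(1+\sqrt{\mu s}\bigr)\nabla f(X) = 0
\qquad\text{(high resolution, NAG-SC)},
\]

the gradient-correction term $\sqrt{s}\,\nabla^2 f(X)\,\dot{X}$ being absent from the heavy-ball counterpart, which is what lets the high-resolution ODEs tell the two algorithms apart.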

Endogenous Price Zones and Investment Incentives in Electricity Markets: An Application of Multilevel Optimization with Graph Partitioning

In the course of the energy transition, load and supply centers are growing apart in electricity markets worldwide, rendering regional price signals even more important to provide adequate locational investment incentives. This paper focuses on electricity markets that operate under a zonal pricing market design. For a fixed number of zones, we endogenously derive the … Read more
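As a toy illustration of the graph-partitioning layer (our notation; the paper's multilevel model is considerably richer), assigning network nodes $n\in N$ to a fixed number $K$ of zones can be encoded with binary variables $z_{nk}$:

\[
\sum_{k=1}^{K} z_{nk} = 1 \quad \forall\, n\in N,
\qquad z_{nk}\in\{0,1\},
\]

typically complemented by constraints forcing each zone to induce a connected subgraph of the transmission network.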

Subset selection in sparse matrices

In subset selection we search for the best linear predictor that involves a small subset of variables. From a computational complexity viewpoint, subset selection is NP-hard, and only a few problem classes are known to be solvable in polynomial time. Using mainly tools from discrete geometry, we show that some sparsity conditions on the original data matrix allow … Read more
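For concreteness, the brute-force baseline that the complexity discussion refers to can be sketched as follows (illustrative enumeration only, not the paper's approach; all names here are ours):

```python
from itertools import combinations
import numpy as np

def best_subset(A, b, k):
    """Exhaustively search all size-k column subsets of A for the best
    least-squares fit to b; returns (residual, columns, coefficients)."""
    best = (np.inf, None, None)
    for S in combinations(range(A.shape[1]), k):
        cols = list(S)
        x, *_ = np.linalg.lstsq(A[:, cols], b, rcond=None)
        res = np.linalg.norm(A[:, cols] @ x - b) ** 2
        if res < best[0]:
            best = (res, cols, x)
    return best

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 6))
b = rng.standard_normal(20)
print(best_subset(A, b, 2)[1])   # indices of the selected columns
```

The number of subsets grows combinatorially, which is exactly why structural conditions on the data matrix that admit polynomial-time algorithms are of interest.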

A Branch-and-Cut Algorithm for Solving Mixed-integer Semidefinite Optimization Problems

This paper is concerned with a cutting-plane algorithm for solving mixed-integer semidefinite optimization (MISDO) problems. In this algorithm, the positive semidefinite constraint is relaxed, and the resultant mixed-integer linear optimization problem is repeatedly solved with valid inequalities for the relaxed constraint. We prove convergence properties of the algorithm. Moreover, to speed up the computation, we … Read more
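A standard way to generate valid inequalities for a relaxed positive semidefinite constraint is the eigenvector cut sketched below (a common choice in the MISDO literature; whether it matches the paper's particular inequalities is an assumption on our part):

```python
import numpy as np

def eigenvector_cut(X, tol=1e-8):
    """If the current matrix X is not PSD, return a vector v defining the
    valid linear cut <v v^T, X> >= 0, which X violates; else return None."""
    X = 0.5 * (X + X.T)            # symmetrize for numerical safety
    w, V = np.linalg.eigh(X)       # eigenvalues in ascending order
    if w[0] >= -tol:
        return None                # numerically PSD: no cut needed
    return V[:, 0]                 # eigenvector of the most negative eigenvalue

X = np.array([[1.0, 2.0],
              [2.0, 1.0]])         # indefinite example matrix
print(eigenvector_cut(X))
```

Adding such cuts to the mixed-integer linear relaxation and re-solving is the basic loop of this kind of cutting-plane scheme.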

Douglas-Rachford method for the feasibility problem involving a circle and a disc

The Douglas-Rachford algorithm is a classical and successful method for solving feasibility problems. Here, we provide a region for global convergence of the algorithm for the feasibility problem involving a disc and a circle in the Euclidean space of dimension two. Citation: 1. Borwein, J.M., Sims, B.: The Douglas-Rachford algorithm in the absence … Read more
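For illustration, a minimal Douglas-Rachford iteration on a concrete circle/disc instance might look as follows (our own toy instance, not the configuration analysed in the paper; note that the circle is nonconvex, so convergence is not guaranteed in general, which is precisely what makes the analysis interesting):

```python
import numpy as np

def proj_circle(x, c, r):
    d = x - c
    n = np.linalg.norm(d)
    return c + r * d / n if n > 0 else c + np.array([r, 0.0])

def proj_disc(x, c, r):
    d = x - c
    n = np.linalg.norm(d)
    return x if n <= r else c + r * d / n

def douglas_rachford(z, iters=100):
    cA, rA = np.zeros(2), 1.0            # set A: unit circle at the origin
    cB, rB = np.array([1.5, 0.0]), 1.0   # set B: disc intersecting A
    for _ in range(iters):
        pA = proj_circle(z, cA, rA)
        z = z - pA + proj_disc(2 * pA - z, cB, rB)   # DR operator
    return proj_circle(z, cA, rA)        # shadow point on the circle

print(douglas_rachford(np.array([0.3, 2.0])))
```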

Conditional Extragradient Algorithms for Solving Constrained Variational Inequalities

In this paper, we generalize the classical extragradient algorithm for solving variational inequality problems by utilizing non-null normal vectors of the feasible set. In particular, conceptual algorithms are proposed with two different linesearches. We then establish convergence results for these algorithms under mild assumptions. Our study suggests that non-null normal vectors may significantly improve convergence … Read more
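For orientation, the classical extragradient baseline that the paper generalizes is sketched below (the conditional variants add non-null normal vectors and the two linesearches, which are not shown here; the test operator is our own example):

```python
import numpy as np

def proj_box(x, lo, hi):
    return np.minimum(np.maximum(x, lo), hi)

def extragradient(F, x, lo, hi, alpha=0.1, iters=200):
    for _ in range(iters):
        y = proj_box(x - alpha * F(x), lo, hi)   # prediction step
        x = proj_box(x - alpha * F(y), lo, hi)   # correction step
    return x

# Affine monotone operator F(x) = M x + q on the box [0, 1]^2
M = np.array([[2.0, 1.0], [-1.0, 2.0]])
q = np.array([-1.0, -1.0])
print(extragradient(lambda x: M @ x + q, np.zeros(2), 0.0, 1.0))
```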

Efficient global unconstrained black box optimization

For the unconstrained optimization of black box functions, this paper introduces a new randomized algorithm called VRBBO. In practice, VRBBO matches the quality of other state-of-the-art algorithms for finding, in small and large dimensions, a local minimizer with reasonable accuracy. Although our theory guarantees only local minimizers, our heuristic techniques turn VRBBO into an efficient … Read more
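To fix ideas, a bare-bones random-direction line search for black-box minimization looks like this (a generic sketch only; VRBBO's actual step-size control and subspace techniques are more elaborate, and all names below are ours):

```python
import numpy as np

def random_direction_search(f, x, iters=500, step=1.0, shrink=0.5, grow=2.0):
    fx = f(x)
    for _ in range(iters):
        d = np.random.randn(x.size)
        d /= np.linalg.norm(d)          # random unit direction
        for s in (step, -step):         # try both orientations
            trial = x + s * d
            ft = f(trial)
            if ft < fx:
                x, fx = trial, ft
                step *= grow            # expand the step after a success
                break
        else:
            step *= shrink              # shrink the step after a failure
    return x, fx

f = lambda x: float(np.sum((x - 1.0) ** 2))   # smooth test function
print(random_direction_search(f, np.zeros(10)))
```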

A fundamental proof to convergence analysis of alternating direction method of multipliers for weakly convex optimization

The convergence analysis of the alternating direction method of multipliers (ADMM) for convex/nonconvex combinational optimization has been well established in the literature. Considering the extensive applications of weakly convex functions in signal processing and machine learning (e.g., \textit{Special issue: DC-Theory, Algorithms and Applications, Mathematical Programming, 169(1):1-336, 2018}), in this paper we investigate the convergence analysis of ADMM … Read more
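For orientation, the ADMM scheme whose convergence is analysed has the standard form (written in our notation for a generic splitting $\min_{x,z}\, f(x)+g(z)$ s.t. $Ax+Bz=c$, with penalty $\rho>0$ and scaled dual variable $u$):

\[
\begin{aligned}
x^{k+1} &\in \arg\min_{x}\; f(x) + \tfrac{\rho}{2}\,\|Ax + Bz^{k} - c + u^{k}\|^2,\\
z^{k+1} &\in \arg\min_{z}\; g(z) + \tfrac{\rho}{2}\,\|Ax^{k+1} + Bz - c + u^{k}\|^2,\\
u^{k+1} &= u^{k} + Ax^{k+1} + Bz^{k+1} - c.
\end{aligned}
\]

The weakly convex setting relaxes the convexity of $f$ or $g$ to convexity after adding a quadratic, which is what the analysis must accommodate.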

A new dual for quadratic programming and its applications

The main outcomes of the paper are divided into two parts. First, we present a new dual for quadratic programs in which the dual variables are affine functions, and we prove strong duality. Since the new dual is intractable, we consider a modified version by restricting the feasible set. This leads to a new bound … Read more
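For contrast, recall the classical Lagrangian dual of a quadratic program, in which the multipliers $\lambda$ are constant vectors (our notation); per the abstract, the new dual instead lets the dual variables be affine functions:

\[
\min_{x}\; \tfrac12 x^{\top} Q x + c^{\top} x \ \ \text{s.t.}\ \ Ax \le b
\qquad\Longrightarrow\qquad
\max_{\lambda \ge 0}\; \min_{x}\; \tfrac12 x^{\top} Q x + c^{\top} x + \lambda^{\top}(Ax - b).
\]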