Mixed-Integer Programming Techniques for the Minimum Sum-of-Squares Clustering Problem

The minimum sum-of-squares clustering problem is an important problem in data mining and machine learning, with many applications in, e.g., medicine or the social sciences. However, it is known to be NP-hard in all relevant cases and is notoriously hard to solve to global optimality in practice. In this paper, we develop …
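
For orientation (this is the standard textbook MINLP model of minimum sum-of-squares clustering, not necessarily the formulation developed in the paper), the problem assigns each of the $n$ data points $x_i \in \mathbb{R}^d$ to one of $k$ clusters via binaries $z_{ij}$ and minimizes the squared distances to the cluster centroids $c_j$:

\begin{align*}
\min_{z,\,c}\ & \sum_{i=1}^{n} \sum_{j=1}^{k} z_{ij}\, \lVert x_i - c_j \rVert_2^2 \\
\text{s.t.}\ & \sum_{j=1}^{k} z_{ij} = 1, \qquad i = 1, \dots, n, \\
& z_{ij} \in \{0,1\}, \qquad c_j \in \mathbb{R}^d .
\end{align*}

The nonconvexity comes from the bilinear coupling of the assignment variables and the centroids, which is what makes certifying global optimality so hard.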

Limits of eventual families of sets with application to algorithms for the common fixed point problem

We present an abstract framework for the asymptotic analysis of convergence based on the notion of eventual families of sets, which we define. A family of subsets of a given set is called here an “eventual family” if it is upper hereditary with respect to inclusion. We define accumulation points of eventual families in a Hausdorff …
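
Read off the truncated abstract, a minimal sketch of the central definition (the paper's precise statement may carry additional conditions) is:

\[
\mathcal{E} \subseteq 2^{X} \ \text{is an eventual family} \quad :\Longleftrightarrow \quad \big( A \in \mathcal{E} \ \text{and} \ A \subseteq B \subseteq X \big) \ \Rightarrow \ B \in \mathcal{E},
\]

that is, $\mathcal{E}$ is upper hereditary (closed under taking supersets) with respect to inclusion.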

Shortest Path Network Interdiction with Asymmetric Uncertainty

This paper considers an extension of the shortest path network interdiction problem that incorporates robustness to account for parameter uncertainty. The shortest path interdiction problem is a game between two players with conflicting agendas and capabilities: an evader, who traverses the arcs of a network from a source node to a sink node using the …
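
For context, the nominal (non-robust) shortest path interdiction problem underlying this extension is usually written as the max–min bilevel program below; how the asymmetric uncertainty enters is the subject of the paper and is not shown here.

\[
\max_{\substack{y \in \{0,1\}^{|A|} \\ r^{\top} y \le R}} \ \ \min_{P \in \mathcal{P}_{st}} \ \sum_{a \in P} \big( c_a + d_a\, y_a \big),
\]

where $y_a = 1$ means arc $a$ is interdicted, $r$ and $R$ are the interdiction costs and budget, $c_a$ is the nominal arc length, $d_a$ the delay added by interdiction, and $\mathcal{P}_{st}$ the set of source–sink paths.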

OFFO minimization algorithms for second-order optimality and their complexity

An Adagrad-inspired class of algorithms for smooth unconstrained optimization is presented in which the objective function is never evaluated and yet the gradient norms decrease at least as fast as $\mathcal{O}(1/\sqrt{k+1})$, while second-order optimality measures converge to zero at least as fast as $\mathcal{O}(1/(k+1)^{1/3})$. This latter rate of convergence is shown to be essentially sharp …
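
Spelled out with the usual first- and second-order optimality measures (the paper's exact measures may differ; this is only a hedged reading of the stated rates), the claim is that after $k+1$ iterations

\[
\min_{0 \le j \le k} \lVert \nabla f(x_j) \rVert = \mathcal{O}\!\left( \frac{1}{\sqrt{k+1}} \right),
\qquad
\min_{0 \le j \le k} \max\!\big( 0,\, -\lambda_{\min}\!\big( \nabla^2 f(x_j) \big) \big) = \mathcal{O}\!\left( \frac{1}{(k+1)^{1/3}} \right),
\]

even though no value of $f$ is ever computed along the way.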

Branch-and-Bound Performance Estimation Programming: A Unified Methodology for Constructing Optimal Optimization Methods

We present the Branch-and-Bound Performance Estimation Programming (BnB-PEP), a unified methodology for constructing optimal first-order methods for convex and nonconvex optimization. BnB-PEP poses the problem of finding the optimal optimization method as a nonconvex but practically tractable quadratically constrained quadratic optimization problem and solves it to certifiable global optimality using a customized branch-and-bound algorithm. By …
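
At a high level (a sketch under the usual performance-estimation conventions, not the paper's exact formulation), the design problem behind BnB-PEP is a min–max over the method's stepsizes $h$ and the problem instances:

\[
\min_{h} \ \max_{f \in \mathcal{F},\, x_0} \ \frac{f(x_N) - f(x_\star)}{\lVert x_0 - x_\star \rVert^2}
\qquad \text{s.t.} \qquad x_{i+1} = x_i - \sum_{j=0}^{i} h_{i,j}\, \nabla f(x_j),
\]

where the inner maximization over the function class $\mathcal{F}$ is replaced by finitely many interpolation conditions, so that the whole problem becomes a (nonconvex) quadratically constrained quadratic program in the stepsizes and a Gram matrix of the iterates and gradients.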

Integer programming column generation: Accelerating branch-and-price using a novel pricing scheme for finding high-quality solutions in set covering, packing, and partitioning problems

Large-neighbourhood search (LNS) heuristics are important mathematical programming techniques that search for primal feasible solutions by solving an auxiliary problem with a restricted feasible region. Extending such powerful generic LNS heuristics to the branch-and-price context is inherently challenging. The most prominent challenges arise from the fact that in branch-and-price algorithms, columns are generated with …
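
As a baseline for the set covering case (the textbook restricted master problem and pricing step, not the paper's novel pricing scheme), branch-and-price works with

\[
\min \ \sum_{p \in \mathcal{P}'} c_p\, \lambda_p
\quad \text{s.t.} \quad \sum_{p \in \mathcal{P}'} a_{ip}\, \lambda_p \ge 1 \ \ (i \in I), \qquad \lambda_p \in \{0,1\},
\]

where $\mathcal{P}' \subseteq \mathcal{P}$ is the current pool of columns and new columns are generated by the pricing problem $\min_{p \in \mathcal{P}} \big( c_p - \sum_{i \in I} \pi_i\, a_{ip} \big)$ for the dual prices $\pi$ of the covering constraints; the difficulty alluded to above is that LNS-style restrictions interact with exactly this column generation step.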

First-Order Objective-Function-Free Optimization Algorithms and Their Complexity

A class of algorithms for unconstrained nonconvex optimization is considered where the value of the objective function is never computed. The class contains a deterministic version of the first-order Adagrad method typically used for the minimization of noisy functions, but it also allows the use of second-order information when available. The rate of convergence of methods in …
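
As an illustration of the objective-function-free idea (a minimal sketch of a deterministic Adagrad-style iteration, not the authors' algorithm; the stepsize alpha and constant eps are placeholders), note that only the gradient oracle is ever queried:

import numpy as np

def offo_adagrad(grad, x0, alpha=1.0, eps=1e-8, iters=1000):
    """Deterministic Adagrad-style iteration: only gradients are used;
    the objective value f(x) is never computed."""
    x = np.asarray(x0, dtype=float)
    g_sq_sum = np.zeros_like(x)                       # running sum of squared gradients
    for _ in range(iters):
        g = grad(x)
        g_sq_sum += g * g
        x = x - alpha * g / np.sqrt(g_sq_sum + eps)   # per-coordinate scaling
    return x

# Usage: minimize a simple ill-conditioned quadratic using only its gradient.
A = np.diag([1.0, 10.0])
x_final = offo_adagrad(lambda x: A @ x, x0=[3.0, -2.0])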

Parametric complexity analysis for a class of first-order Adagrad-like algorithms

A class of algorithms for optimization in the presence of noise is presented that does not require the evaluation of the objective function. This class generalizes the well-known Adagrad method. The complexity of this class is then analyzed as a function of its parameters, and it is shown that some methods of the class enjoy …
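
To make "generalizes the well-known Adagrad method" concrete, one natural parametrization of such a class (shown purely as an illustration; the paper's parametrization and admissible parameter ranges may differ) replaces Adagrad's square root by a general exponent:

\[
x_{k+1} = x_k - \alpha \, \frac{g_k}{\big( \varsigma + \sum_{i=0}^{k} g_i^{2} \big)^{\theta}},
\qquad \varsigma > 0, \ \ \theta \in (0,1),
\]

applied componentwise, with $\theta = 1/2$ recovering the standard (deterministic) Adagrad scaling; the complexity is then studied as a function of parameters such as $\theta$ and $\varsigma$.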

An SDP Relaxation for the Sparse Integer Least Square Problem

In this paper, we study the polynomial approximability or solvability of the sparse integer least squares problem (SILS), an NP-hard variant of the least squares problem in which we only consider sparse {0, ±1}-vectors. We propose an l1-based SDP relaxation of SILS and introduce a randomized algorithm for SILS based on the SDP relaxation. In …
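
As background (this is the generic Shor-type lifting of integer least squares over sparse {0, ±1}-vectors, not necessarily the paper's l1-based relaxation), one obtains an SDP by lifting $x$ to a matrix variable $X \approx x x^{\top}$:

\begin{align*}
\min_{x,\,X}\ & \operatorname{tr}\!\big( A^{\top} A X \big) - 2\, b^{\top} A x + b^{\top} b \\
\text{s.t.}\ & \begin{pmatrix} 1 & x^{\top} \\ x & X \end{pmatrix} \succeq 0, \qquad X_{ii} \le 1, \qquad \operatorname{tr}(X) \le \sigma, \qquad \lVert x \rVert_1 \le \sigma,
\end{align*}

where $\sigma$ bounds the sparsity: for $x \in \{0, \pm 1\}^n$ one has $x_i^2 \in \{0,1\}$, so $\operatorname{tr}(x x^{\top}) = \lVert x \rVert_0 = \lVert x \rVert_1$, which is what makes an l1 constraint a faithful surrogate on the integer points.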

Revisiting Degeneracy, Strict Feasibility, Stability, in Linear Programming

Currently, the simplex method and the interior point method are indisputably the most popular algorithms for solving linear programs (LPs). Unlike general conic programs, LPs with a finite optimal value do not require strict feasibility in order to establish strong duality. Hence strict feasibility is seldom a concern, even though strict feasibility is equivalent to …
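
To fix terminology (using one common standard-form convention; the paper may work with a different format), consider the primal-dual pair

\[
\text{(P)} \quad \min_{x} \ c^{\top} x \ \ \text{s.t.} \ A x = b, \ x \ge 0,
\qquad
\text{(D)} \quad \max_{y} \ b^{\top} y \ \ \text{s.t.} \ A^{\top} y \le c.
\]

Strict feasibility of (P) means that some $\hat{x}$ satisfies $A \hat{x} = b$ and $\hat{x} > 0$ (the LP analogue of Slater's condition); as noted above, LP strong duality already holds whenever the optimal value is finite, which is why this condition is rarely checked in practice.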