Extended Formulations and Branch-and-Cut Algorithms for the Black-and-White Traveling Salesman Problem

In this paper we study integer linear programming models and develop branch-and-cut algorithms to solve the Black-and-White Traveling Salesman Problem (BWTSP) (Bourgeois et al., 2003), a variant of the well-known Traveling Salesman Problem (TSP). Two strategies to model the BWTSP have been used in the literature. The problem is either modeled on …

New analysis of linear convergence of gradient-type methods via unifying error bound conditions

Linear convergence of gradient-type methods for non-strongly convex optimization has been widely studied by introducing several notions as sufficient conditions. Influential examples include the error bound property, the restricted strong convexity property, the quadratic growth property, and the Kurdyka-Łojasiewicz property. In this paper, we first define a group of error bound conditions …
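For orientation, two of the named conditions can be written in their classical forms as follows (a hedged sketch in generic notation, not necessarily the exact definitions used in the paper), where f is smooth with optimal value f^* and solution set X^*:

\begin{align*}
  \text{Error bound:}\quad & \operatorname{dist}(x, X^*) \;\le\; \tau\,\|\nabla f(x)\| ,\\
  \text{Quadratic growth:}\quad & f(x) - f^* \;\ge\; \tfrac{\mu}{2}\,\operatorname{dist}(x, X^*)^2 .
\end{align*}

The unifying error bound conditions introduced in the paper presumably relate such inequalities to one another; the forms above are only the standard special cases.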

A Sequential Algorithm for Solving Nonlinear Optimization Problems with Chance Constraints

An algorithm is presented for solving nonlinear optimization problems with chance constraints, i.e., those in which a constraint involving an uncertain parameter must be satisfied with at least a minimum probability. In particular, the algorithm is designed to solve cardinality-constrained nonlinear optimization problems that arise in sample average approximations of chance-constrained problems, as well as …
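To fix notation (a hedged sketch with generic symbols, not the paper's exact formulation), a chance constraint and the cardinality-constrained problem produced by its sample average approximation over samples \xi_1,\dots,\xi_N take the form:

\begin{align*}
  &\min_{x}\; f(x) \quad \text{s.t.} \quad \mathbb{P}\bigl[c(x,\xi) \le 0\bigr] \;\ge\; 1 - \alpha ,\\[2pt]
  &\min_{x}\; f(x) \quad \text{s.t.} \quad \sum_{i=1}^{N} \mathbf{1}\bigl[c(x,\xi_i) > 0\bigr] \;\le\; \lfloor \alpha N \rfloor .
\end{align*}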

Positioning and construction algorithms for a specific absolute positioning magnetic ruler system

Absolute positioning magnetic rulers are rulers that compute the position of the reading head from just one reading of a magnetic signal. This paper considers a new absolute positioning magnetic ruler method based on rulers with trapezoidal magnetic poles. At a fixed position on the ruler, the reading head …

Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Lojasiewicz Condition

In 1963, Polyak proposed a simple condition that is sufficient to show a global linear convergence rate for gradient descent. This condition is a special case of the Lojasiewicz inequality proposed in the same year, and it does not require strong convexity (or even convexity). In this work, we show that this much-older Polyak-Lojasiewicz (PL) …
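For reference, the PL inequality and the linear rate it yields for gradient descent with step size 1/L on an L-smooth function f can be sketched as follows (standard material, stated here in generic notation):

\begin{align*}
  &\tfrac{1}{2}\,\|\nabla f(x)\|^2 \;\ge\; \mu\,\bigl(f(x) - f^*\bigr) \qquad \text{(PL inequality)},\\[2pt]
  &x_{k+1} = x_k - \tfrac{1}{L}\,\nabla f(x_k)
    \;\;\Longrightarrow\;\;
    f(x_{k+1}) - f^* \;\le\; \Bigl(1 - \tfrac{\mu}{L}\Bigr)\bigl(f(x_k) - f^*\bigr).
\end{align*}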

Second-order optimality and beyond: characterization and evaluation complexity in convexly-constrained nonlinear optimization

High-order optimality conditions for convexly-constrained nonlinear optimization problems are analyzed. A corresponding (expensive) measure of criticality for arbitrary order is proposed and extended to define high-order $\epsilon$-approximate critical points. This new measure is then used within a conceptual trust-region algorithm to show that, if derivatives of the objective function up to order $q \geq 1$ …
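As a point of comparison, the usual first-order (q = 1) criticality measure for minimizing f over a convex feasible set $\mathcal{F}$, which measures of this kind generalize, is (a hedged sketch; the paper's arbitrary-order measure is more involved):

\[
  \chi_f(x) \;=\; \Bigl|\, \min_{\,x + d \in \mathcal{F},\ \|d\| \le 1} \nabla f(x)^{\top} d \,\Bigr| ,
  \qquad
  \chi_f(x) \le \epsilon \ \text{ defining an $\epsilon$-approximate first-order critical point.}
\]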

Approximate norm descent methods for constrained nonlinear systems

We address the solution of convex-constrained nonlinear systems of equations where the Jacobian matrix is unavailable or its computation/storage is burdensome. In order to efficiently solve such problems, we propose a new class of algorithms which are “derivative-free” both in the computation of the search direction and in the selection of the steplength. Search directions …
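To illustrate the flavor of such methods, here is a minimal Python sketch of a derivative-free, projected residual iteration with a norm-descent backtracking steplength for a box-constrained system F(x) = 0. The function name, acceptance rule, and parameters are illustrative assumptions, not the authors' algorithm:

import numpy as np

def projected_residual_method(F, x0, lo, hi, tol=1e-8, max_iter=500):
    """Hedged sketch: derivative-free norm-descent iteration for F(x) = 0
    with simple box constraints lo <= x <= hi.  The search direction is the
    projected negative residual, and the steplength is chosen by backtracking
    on ||F||; no Jacobian information is used anywhere."""
    x = np.clip(x0, lo, hi)
    Fx = F(x)
    for _ in range(max_iter):
        norm_Fx = np.linalg.norm(Fx)
        if norm_Fx <= tol:
            break
        d = np.clip(x - Fx, lo, hi) - x          # projected residual direction
        t = 1.0
        while t > 1e-12:
            x_new = x + t * d
            Fx_new = F(x_new)
            # accept the step if it (approximately) decreases the residual norm
            if np.linalg.norm(Fx_new) <= (1.0 - 1e-4 * t) * norm_Fx:
                break
            t *= 0.5
        x, Fx = x_new, Fx_new
    return x

# Illustrative use: a small nonlinear system with bound constraints
F = lambda x: np.array([x[0]**2 + x[1] - 1.0, x[0] + x[1]**2 - 1.0])
sol = projected_residual_method(F, np.array([0.5, 0.5]), 0.0, 1.0)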

Exact algorithms for bi-objective ring tree problems with reliability measures

We introduce bi-objective models for ring tree network design with a focus on network reliability within telecommunication applications. Our approaches generalize the capacitated ring tree problem (CRTP) which asks for a partially reliable topology that connects customers with different security requirements to a depot node by combined ring and tree graphs. While the CRTP aims …

Low-Rank Matrix Completion using Nuclear Norm with Facial Reduction

Minimization of the nuclear norm is often used as a surrogate convex relaxation for finding the minimum-rank completion (recovery) of a partial matrix. The minimum nuclear norm problem can be solved as a trace minimization semidefinite programming problem (SDP). The SDP and its dual are regular in the sense that they both satisfy strict …
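For concreteness, the standard formulation being referred to can be sketched as follows (generic notation, with a partial matrix M observed on an index set \Omega): nuclear norm minimization and its equivalent trace-minimization SDP.

\begin{align*}
  &\min_{X}\; \|X\|_{*}
     \quad \text{s.t.} \quad X_{ij} = M_{ij} \;\; \forall\, (i,j) \in \Omega ,\\[2pt]
  &\min_{X,\,W_1,\,W_2}\; \tfrac{1}{2}\bigl(\operatorname{tr} W_1 + \operatorname{tr} W_2\bigr)
     \quad \text{s.t.} \quad
     \begin{pmatrix} W_1 & X \\ X^{\top} & W_2 \end{pmatrix} \succeq 0,
     \quad X_{ij} = M_{ij} \;\; \forall\, (i,j) \in \Omega .
\end{align*}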

Convex Variational Formulations for Learning Problems

In this article, we introduce new techniques to solve the nonlinear regression problem and the nonlinear classification problem. Our benchmarks suggest that our method for regression is significantly more effective than classical methods and that our method for classification is competitive. Our list of classical methods includes least squares, random forests, decision trees, boosted …