Randomized Similar Triangles Method: A Unifying Framework for Accelerated Randomized Optimization Methods (Coordinate Descent, Directional Search, Derivative-Free Method)

In this paper, we consider smooth convex optimization problems with simple constraints and inexact oracle information, such as the value, partial derivatives, or directional derivatives of the objective function. We introduce a unifying framework that allows one to construct different types of accelerated randomized methods for such problems and to prove convergence rate theorems for them. …
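
For intuition, here is a minimal sketch of the deterministic method of similar triangles on which the randomized variants are built; in those variants the full gradient grad_f below is replaced by a cheap stochastic estimate (a single partial derivative, a directional derivative, or a finite-difference approximation), and simple constraints are handled through a proximal step. The step-size rule assumes an $L$-smooth objective; this is an illustration, not the paper's method.

```python
import numpy as np

def similar_triangles(grad_f, x0, L, iters=1000):
    """Deterministic method of similar triangles for an L-smooth convex f.

    Sketch for intuition only: the paper's randomized variants replace
    grad_f with coordinate, directional, or derivative-free estimates.
    """
    x = u = np.asarray(x0, dtype=float)
    A = 0.0
    for _ in range(iters):
        # Choose a so that L * a**2 = A + a (the "similar triangles" rule)
        a = (1.0 + np.sqrt(1.0 + 4.0 * L * A)) / (2.0 * L)
        A_new = A + a
        y = (A * x + a * u) / A_new   # extrapolation point
        u = u - a * grad_f(y)         # gradient step on the auxiliary sequence
        x = (A * x + a * u) / A_new   # convex combination back
        A = A_new
    return x
```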

A Robust Multi-Batch L-BFGS Method for Machine Learning

This paper describes an implementation of the L-BFGS method designed to deal with two adversarial situations. The first occurs in distributed computing environments where some of the computational nodes devoted to the evaluation of the function and gradient are unable to return results on time. A similar challenge occurs in a multi-batch approach in which …
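
A minimal sketch of the overlap device that makes the multi-batch approach robust, assuming a hypothetical helper grad_batch(x, idx) that returns the gradient sampled on the index set idx; the two-loop recursion is the standard L-BFGS one, and the details differ from the paper's implementation. The key point is that both gradients in the curvature difference y are evaluated on the same samples, namely the overlap between consecutive batches.

```python
import numpy as np

def two_loop(grad, S, Y):
    """Standard L-BFGS two-loop recursion: returns an approximation of H @ grad."""
    q = grad.astype(float).copy()
    rhos = [1.0 / (y @ s) for s, y in zip(S, Y)]
    alphas = []
    for (s, y), rho in zip(reversed(list(zip(S, Y))), reversed(rhos)):
        a = rho * (s @ q)
        q -= a * y
        alphas.append(a)
    if S:  # initial scaling from the most recent curvature pair
        s, y = S[-1], Y[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), rho, a in zip(zip(S, Y), rhos, reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s
    return q

def multibatch_step(x, S, Y, grad_batch, batch, overlap, lr=1.0, m=10):
    """One robust multi-batch step: the direction uses the current batch, but
    the curvature pair (s, y) is formed from gradients on the overlap between
    consecutive batches, so both terms of y see the same samples."""
    d = -two_loop(grad_batch(x, batch), S, Y)
    x_new = x + lr * d
    s = x_new - x
    y = grad_batch(x_new, overlap) - grad_batch(x, overlap)
    if s @ y > 1e-10 * (s @ s):   # keep the pair only if curvature is positive
        S.append(s); Y.append(y)
        if len(S) > m:            # limited memory: drop the oldest pair
            S.pop(0); Y.pop(0)
    return x_new
```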

On the behavior of Lagrange multipliers in convex and non-convex infeasible interior point methods

This paper analyzes sequences generated by infeasible interior point methods. In both convex and non-convex settings, we prove that reducing primal infeasibility at the same rate as complementarity ensures that the Lagrange multiplier sequence remains bounded, provided the limit point of the primal sequence admits a Lagrange multiplier, without any constraint qualification assumptions. We …
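
To make the claim concrete, here is the generic perturbed KKT system that such methods follow (a standard formulation, not necessarily the paper's exact setting). For $\min f(x)$ subject to $c(x) = 0$, $x \ge 0$, an infeasible interior point method drives the iterates toward

\[ \nabla f(x) + J_c(x)^T y - z = 0, \qquad c(x) = \frac{\mu}{\mu_0}\, c(x_0), \qquad X z = \mu e, \qquad (x, z) > 0, \]

where $X = \mathrm{diag}(x)$ and $e$ is the all-ones vector, so the primal infeasibility $\|c(x)\|$ shrinks in proportion to the complementarity measure $\mu$. It is this coupling that keeps the multiplier estimates $(y, z)$ bounded whenever the primal limit point admits a Lagrange multiplier.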

A Decomposition Method for MINLPs with Lipschitz Continuous Nonlinearities

Many mixed-integer optimization problems are constrained by nonlinear functions that do not possess desirable analytical properties like convexity or factorability, or that cannot even be evaluated exactly. This is the case, e.g., for problems constrained by differential equations or for models that rely on black-box simulation runs. For these problem classes, we present, analyze, and test …
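
As a one-line illustration of the kind of safe bound that Lipschitz continuity alone provides (a generic principle, not the paper's specific construction): if $f$ has Lipschitz constant $L$ and has been evaluated at sample points $x_1, \dots, x_k$, then for every $x$

\[ \max_i \bigl( f(x_i) - L \|x - x_i\| \bigr) \;\le\; f(x) \;\le\; \min_i \bigl( f(x_i) + L \|x - x_i\| \bigr), \]

so valid under- and overestimators can be built, and tightened by adding evaluation points, even when $f$ is available only through black-box simulation.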

Invex Optimization Revisited

Given a non-convex optimization problem, we study conditions under which every Karush-Kuhn-Tucker (KKT) point is a global optimizer. This property is known as KT-invexity and makes it possible to identify the subset of problems for which an interior point method always converges to a global optimizer. In this work, we provide necessary conditions for KT-invexity in $n$ dimensions and …
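
For reference, a differentiable function $f : \mathbb{R}^n \to \mathbb{R}$ is invex if there exists a kernel $\eta : \mathbb{R}^n \times \mathbb{R}^n \to \mathbb{R}^n$ such that

\[ f(x) - f(u) \;\ge\; \eta(x, u)^T \nabla f(u) \qquad \text{for all } x, u, \]

which holds if and only if every stationary point of $f$ is a global minimizer. KT-invexity is the analogous property for constrained problems: every KKT point is a global optimizer.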

Behavior of accelerated gradient methods near critical points of nonconvex functions

We examine the behavior of accelerated gradient methods in smooth nonconvex unconstrained optimization, focusing in particular on their behavior near strict saddle points. Accelerated methods are iterative methods that typically step along a direction that is a linear combination of the previous step and the gradient of the function evaluated at a point at or …
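
Concretely, the two classical templates covered by this description are the heavy-ball and Nesterov iterations:

\[ x_{k+1} = x_k - \alpha \nabla f(x_k) + \beta (x_k - x_{k-1}) \quad \text{(heavy ball)}, \]
\[ x_{k+1} = y_k - \alpha \nabla f(y_k), \qquad y_k = x_k + \beta (x_k - x_{k-1}) \quad \text{(Nesterov)}; \]

the first evaluates the gradient at the current iterate, the second at an extrapolated point, which is the distinction the truncated sentence above refers to.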

Simplex QP-based methods for minimizing a conic quadratic objective over polyhedra

We consider minimizing a conic quadratic objective over a polyhedron. Such problems arise in parametric value-at-risk minimization, portfolio optimization, and robust optimization with ellipsoidal objective uncertainty, and they can be solved by polynomial-time interior point algorithms for conic quadratic optimization. However, interior point algorithms are not well-suited for branch-and-bound algorithms for the discrete counterparts of …
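
A representative instance of this problem class, standard in mean-risk portfolio models (notation illustrative):

\[ \min_{x \in P} \; c^T x + \Omega \sqrt{x^T Q x}, \]

where $P$ is a polyhedron, $Q \succeq 0$ is a covariance-like matrix, and $\Omega > 0$ weights the ellipsoidal risk term against the linear expected-cost term.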

SDP-based Branch-and-Bound for Non-convex Quadratic Integer Optimization

Semidefinite programming (SDP) relaxations have been used intensively for solving discrete quadratic optimization problems, in particular in the binary case. For the general non-convex integer case with box constraints, the branch-and-bound algorithm Q-MIST, based on an extension of the well-known SDP relaxation for max-cut, has been proposed [11]. For solving the resulting SDPs, Q-MIST …
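
For context, the basic max-cut SDP relaxation that Q-MIST builds on reads (with $L$ the graph Laplacian and $e$ the all-ones vector):

\[ \max \; \tfrac{1}{4} \langle L, X \rangle \quad \text{s.t.} \quad \mathrm{diag}(X) = e, \;\; X \succeq 0. \]

The extension to general integer variables with box constraints replaces the fixed diagonal by constraints derived from the integer bounds; since a relaxation of this kind is solved at every branch-and-bound node, the efficiency of the SDP solver is critical.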

New quasi-Newton method for solving systems of nonlinear equations

In this report, we propose a new Broyden-type method for solving systems of nonlinear equations that uses first derivatives but is more efficient than the Newton method (measured by computational time) for larger dense systems. The new method updates QR decompositions of nonsymmetric approximations of the Jacobian matrix, so it requires $O(n^2)$ …
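
A minimal sketch of the $O(n^2)$ mechanism described above, using SciPy's rank-one QR update; the identity initial Jacobian approximation, full steps, and stopping rule are illustrative simplifications, not the paper's algorithm.

```python
import numpy as np
from scipy.linalg import qr, qr_update, solve_triangular

def broyden_qr(F, x0, tol=1e-10, max_iter=100):
    """Broyden's 'good' method with a QR-updated Jacobian approximation.

    Sketch only: B0 = I, full steps, no line search. F maps R^n -> R^n.
    """
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    Q, R = qr(np.eye(x.size))                # factors of B0 = I
    for _ in range(max_iter):
        d = solve_triangular(R, -Q.T @ Fx)   # solve B d = -F(x) via R d = -Q^T F(x)
        x_new = x + d
        F_new = F(x_new)
        if np.linalg.norm(F_new) < tol:
            return x_new
        s, y = x_new - x, F_new - Fx
        # Broyden update B+ = B + (y - B s) s^T / (s^T s), applied to the
        # QR factors as a rank-one correction in O(n^2) work
        Bs = Q @ (R @ s)
        Q, R = qr_update(Q, R, (y - Bs) / (s @ s), s)
        x, Fx = x_new, F_new
    return x
```

Updating the factors after each rank-one Broyden correction, instead of refactorizing, is what reduces the per-iteration cost from $O(n^3)$ to $O(n^2)$.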

Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization

A block version of the BFGS variable metric update formula and its modifications are investigated. Although this formula satisfies the quasi-Newton conditions with all stored difference vectors and, for quadratic objective functions, gives the best improvement of convergence in a certain sense, for general functions it does not …
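
One standard block generalization of the BFGS update, with $S = [s_1, \dots, s_m]$ and $Y = [y_1, \dots, y_m]$ collecting the stored difference vectors (stated here under the simplifying assumption that $Y^T S$ is symmetric positive definite; the paper's modifications concern what happens beyond this setting):

\[ B_+ \;=\; B \;-\; B S (S^T B S)^{-1} S^T B \;+\; Y (Y^T S)^{-1} Y^T, \]

which satisfies the quasi-Newton conditions $B_+ S = Y$ for all stored difference vectors simultaneously.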