A Riemannian ADMM

We consider a class of Riemannian optimization problems in which the objective is the sum of a smooth function and a nonsmooth function that is considered in the ambient space. This class of problems has important applications in machine learning and statistics, such as sparse principal component analysis, sparse spectral clustering, and orthogonal dictionary learning. We propose … Read more
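
As a rough illustration of the splitting idea behind such manifold-plus-nonsmooth problems (a generic sketch under assumed ingredients, not the algorithm proposed in the paper), the snippet below alternates a retraction-based step in the manifold variable, a proximal step on an ambient copy, and a dual update; the names riemannian_admm_sketch, prox_h and retract, and the sparse-PCA-style instance, are illustrative only.

import numpy as np

def riemannian_admm_sketch(grad_f, prox_h, retract, x0, rho=1.0, iters=100, step=0.1):
    """Generic ADMM-style splitting sketch for min_{x in M} f(x) + h(x):
    introduce an ambient copy y = x, then alternate a retraction-based step in x,
    a prox step in y, and a dual ascent step on the multiplier lam."""
    x, y = x0.copy(), x0.copy()
    lam = np.zeros_like(x0)
    for _ in range(iters):
        # x-step: gradient of f plus the augmented-Lagrangian coupling, then retract to the manifold
        g = grad_f(x) + lam + rho * (x - y)
        x = retract(x - step * g)
        # y-step: proximal operator of h evaluated at x + lam / rho
        y = prox_h(x + lam / rho, 1.0 / rho)
        # dual update
        lam = lam + rho * (x - y)
    return x, y

# Hypothetical instance: sparse-PCA-like objective on the unit sphere,
# f(x) = -x'Ax (smooth), h(x) = mu * ||x||_1 (nonsmooth, handled in the ambient space).
rng = np.random.default_rng(0)
B = rng.standard_normal((10, 10)); A = B @ B.T
mu = 0.5
grad_f = lambda x: -2 * A @ x
prox_h = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - mu * t, 0.0)  # soft-thresholding
retract = lambda v: v / np.linalg.norm(v)                               # projection onto the sphere
x0 = retract(rng.standard_normal(10))
x, y = riemannian_admm_sketch(grad_f, prox_h, retract, x0)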

Accelerated projected gradient algorithms for sparsity constrained optimization problems

We consider the projected gradient algorithm for the nonconvex best subset selection problem, which minimizes a given empirical loss function under an \(\ell_0\)-norm constraint. By decomposing the feasible set of the sparsity constraint into a finite union of linear subspaces, we present two acceleration schemes with global convergence guarantees, one by same-space extrapolation and … Read more
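
To make the subspace-decomposition viewpoint concrete, here is a minimal iterative-hard-thresholding sketch with a "same-support" extrapolation step (extrapolate only when consecutive iterates lie in the same linear subspace of the constraint). This is a hedged illustration rather than the paper's schemes; hard_threshold, accelerated_iht_sketch and the least-squares instance are made up for the example.

import numpy as np

def hard_threshold(v, s):
    """Projection onto the l0-ball {x : ||x||_0 <= s}: keep the s largest-magnitude entries."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-s:]
    out[idx] = v[idx]
    return out

def accelerated_iht_sketch(grad, x0, s, step=0.1, iters=200, beta=0.5):
    """Projected gradient (iterative hard thresholding) with a simple 'same-support'
    extrapolation: extrapolate only when consecutive iterates share the same support,
    i.e. lie in the same linear subspace of the sparsity constraint."""
    x_prev = x0.copy()
    x = hard_threshold(x0 - step * grad(x0), s)
    for _ in range(iters):
        same_support = np.array_equal(x != 0, x_prev != 0)
        z = x + beta * (x - x_prev) if same_support else x
        x_prev, x = x, hard_threshold(z - step * grad(z), s)
    return x

# Hypothetical least-squares instance: min ||Ax - b||^2 subject to ||x||_0 <= s.
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
b = A @ x_true
grad = lambda x: 2 * A.T @ (A @ x - b)
x_hat = accelerated_iht_sketch(grad, np.zeros(100), s=5, step=0.5 / np.linalg.norm(A, 2) ** 2)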

Continuous Equality Knapsack with Probit-Style Objectives

We study continuous equality knapsack problems with uniform, separable, non-convex objective functions that are continuous, strictly increasing, antisymmetric about a point, and have both concave and convex regions. For example, this model captures a simple allocation problem whose goal is to optimize an expected value, where the objective is a sum of cumulative distribution functions of … Read more
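
A tiny numerical illustration (not from the paper) of why such probit-style, S-shaped terms make the problem non-convex, assuming a standard-normal CDF for each item; the helper names phi and objective and the four-item instance are hypothetical.

import math

def phi(z, mu=0.0, sigma=1.0):
    """Normal CDF: one 'probit-style' S-shaped term (convex below mu, concave above)."""
    return 0.5 * (1.0 + math.erf((z - mu) / (sigma * math.sqrt(2.0))))

def objective(x, mu=0.0, sigma=1.0):
    """Uniform separable objective: a sum of identical CDF terms."""
    return sum(phi(xi, mu, sigma) for xi in x)

# Hypothetical instance: allocate a total budget b = 2 across n = 4 items (equality knapsack).
# Because each term is S-shaped rather than concave, the even split need not be optimal.
n, b, mu = 4, 2.0, 1.0
even = [b / n] * n                        # 0.5 each, all in the convex region below mu
concentrated = [b / 2, b / 2, 0.0, 0.0]   # push two items past the inflection point
print(objective(even, mu), objective(concentrated, mu))
# In this instance the concentrated allocation scores higher than the even split.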

Enhancements of Discretization Approaches for Non-Convex Mixed-Integer Quadratically Constrained Quadratic Programming: Part I

We study mixed-integer programming (MIP) relaxation techniques for the solution of non-convex mixed-integer quadratically constrained quadratic programs (MIQCQPs). We present MIP relaxation methods for non-convex continuous variable products. In Part I, we consider MIP relaxations based on separable reformulation. The main focus is the introduction of the enhanced separable MIP relaxation for non-convex quadratic products … Read more
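
For readers unfamiliar with the separable approach, the standard reformulation writes each bilinear term via squares, which can then be approximated by piecewise-linear estimators inside a MIP; the sketch below shows that baseline identity and a crude single-piece relaxation of a square (the paper's enhanced relaxations refine this considerably), with separable_product and square_envelope being illustrative helper names.

# Standard separable reformulation of a bilinear term:
#     x * y = ((x + y)**2 - (x - y)**2) / 4
# so every nonconvex product reduces to univariate squares, each of which can be bounded
# by piecewise-linear estimators in a MIP relaxation.

def separable_product(x, y):
    return ((x + y) ** 2 - (x - y) ** 2) / 4.0

assert abs(separable_product(1.7, -2.3) - (1.7 * -2.3)) < 1e-12

def square_envelope(s, lo, hi):
    """Crude relaxation of z = s**2 on [lo, hi]: the tangent at the midpoint gives a valid
    lower bound (convexity), the secant through the endpoints gives an upper bound."""
    mid = (lo + hi) / 2.0
    lower = mid ** 2 + 2.0 * mid * (s - mid)                       # supporting tangent
    upper = lo ** 2 + (hi ** 2 - lo ** 2) * (s - lo) / (hi - lo)   # secant overestimator
    return lower, upper

lb, ub = square_envelope(0.8, 0.0, 2.0)
assert lb <= 0.8 ** 2 <= ub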

Continuity of the conic hull

In a real Hilbert space V, the conic hull of a set G is the set cone(G) consisting of all nonnegative linear combinations of elements of G. Many optimization problems are sensitive to the changes in cone(G) that result from changes in G itself. Motivated by one such problem, we derive necessary and sufficient conditions for the … Read more
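
A small, self-contained illustration (not taken from the paper) of how sensitive cone(G) can be to G: with two nearly opposite generators in R^2, a far-away point lies in the conic hull for every positive tilt eps but not in the limiting cone at eps = 0; the helper in_cone_2d is hypothetical.

import numpy as np

def in_cone_2d(g1, g2, p, tol=1e-9):
    """Membership test for cone({g1, g2}) in R^2: p is in the conic hull iff p = a*g1 + b*g2
    with a, b >= 0 (assuming g1 and g2 are linearly independent)."""
    coeffs = np.linalg.solve(np.column_stack([g1, g2]), p)
    return bool(np.all(coeffs >= -tol))

eps = 1e-3
G_eps = (np.array([1.0, 0.0]), np.array([-1.0, eps]))
p = np.array([-50.0, 1.0])      # far out to the left, slightly above the axis
print(in_cone_2d(*G_eps, p))    # True: p = a*(1,0) + b*(-1,eps) with b = 1/eps, a = b - 50
# At eps = 0 the generators span only the x-axis, so cone(G_0) is the x-axis and p is not in it:
# a small perturbation of G changes cone(G) by a large amount, the kind of sensitivity the
# abstract is concerned with.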

Decentralized Stochastic Bilevel Optimization with Improved Per-Iteration Complexity

Bilevel optimization has recently received tremendous attention due to its great success in solving important machine learning problems such as meta-learning, reinforcement learning, and hyperparameter optimization. Extending single-agent training on bilevel problems to the decentralized setting is a natural generalization, and there has been a flurry of work studying decentralized bilevel optimization algorithms. However, it … Read more

A Projected-Search Interior Method for Nonlinear Optimization

This paper concerns the formulation and analysis of a new interior method for general nonlinearly constrained optimization that combines a shifted primal-dual interior method with a projected-search method for bound-constrained optimization. The method involves the computation of an approximate Newton direction for a primal-dual penalty-barrier function that incorporates shifts on both the primal and dual … Read more
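
Two ingredients mentioned in the abstract can be sketched generically, using textbook forms and without claiming to match the paper's formulation: a shifted log-barrier term that stays finite slightly outside the bounds, and a projected backtracking search whose trial points are projected onto the bounds before being tested; shifted_barrier, projected_armijo and the toy quadratic are illustrative.

import numpy as np

def shifted_barrier(f, x, mu, shift):
    """Generic shifted log-barrier for the bound x >= 0: the shift keeps the barrier finite
    slightly outside the feasible region, one ingredient of shifted interior methods."""
    return f(x) - mu * np.sum(np.log(x + shift))

def projected_armijo(f, grad, x, d, lower=0.0, alpha=1.0, c=1e-4, tau=0.5, max_back=30):
    """Projected-search backtracking: trial points are projected onto the bound constraints,
    so the search path is a piecewise-linear arc rather than a straight line."""
    fx, gx = f(x), grad(x)
    for _ in range(max_back):
        x_trial = np.maximum(x + alpha * d, lower)      # projection onto {x >= lower}
        if f(x_trial) <= fx + c * gx.dot(x_trial - x):  # Armijo condition along the projected arc
            return x_trial
        alpha *= tau
    return x

# Hypothetical usage on a toy quadratic with bounds x >= 0:
f = lambda x: 0.5 * np.sum((x - np.array([1.0, -2.0])) ** 2)
grad = lambda x: x - np.array([1.0, -2.0])
x = np.array([3.0, 3.0])
for _ in range(20):
    x = projected_armijo(f, grad, x, -grad(x))
print(x)  # approaches the bound-constrained minimizer (1, 0)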

An Explicit Spectral Fletcher-Reeves Conjugate Gradient Method for Bi-criteria Optimization

In this paper we propose a spectral Fletcher-Reeves conjugate gradient-like method (SFRCG) for solving unconstrained bi-criteria minimisation problems without using any scalarization technique. We suggest an explicit formula for computing a descent direction common to both criteria. This direction furthermore satisfies a sufficient descent property that does not depend on the line search nor … Read more
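
As a hedged sketch of the general idea (a descent direction common to two criteria, combined with a Fletcher-Reeves-style memory term), the code below uses the classical min-norm-of-the-convex-hull direction rather than the authors' explicit formula; common_descent_direction, bicriteria_fr_sketch and the quadratic instance are illustrative.

import numpy as np

def common_descent_direction(g1, g2):
    """Negative of the min-norm element of the convex hull of the two gradients: a standard
    way to obtain a direction that is a descent direction for both criteria when nonzero."""
    diff = g1 - g2
    denom = diff.dot(diff)
    lam = 0.5 if denom == 0 else np.clip(-(diff.dot(g2)) / denom, 0.0, 1.0)
    return -(lam * g1 + (1.0 - lam) * g2)

def bicriteria_fr_sketch(grad1, grad2, x0, step=0.1, iters=100):
    """Fletcher-Reeves-flavoured iteration for two criteria: take the common descent direction
    v_k and add a beta_k-weighted memory term with beta_k = ||v_k||^2 / ||v_{k-1}||^2."""
    x = x0.copy()
    v_prev, d = None, None
    for _ in range(iters):
        v = common_descent_direction(grad1(x), grad2(x))
        if d is None or v_prev is None or v_prev.dot(v_prev) == 0:
            d = v
        else:
            beta = v.dot(v) / v_prev.dot(v_prev)
            d = v + beta * d
        x = x + step * d
        v_prev = v
    return x

# Hypothetical bi-criteria instance: f1(x) = ||x - a||^2, f2(x) = ||x - b||^2.
a, b = np.array([1.0, 0.0]), np.array([0.0, 1.0])
grad1 = lambda x: 2 * (x - a)
grad2 = lambda x: 2 * (x - b)
x = bicriteria_fr_sketch(grad1, grad2, np.zeros(2))
print(x)  # lands near a Pareto-optimal point on the segment between a and b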