A Modified Projected Gradient Algorithm for Solving Quasiconvex Programming with Applications

In this manuscript, we introduce a novel projected gradient algorithm for solving quasiconvex optimization problems over closed convex sets. The key innovation of our new algorithm is an adaptive, parameter-free stepsize rule that requires no line search and avoids estimating constants such as the Lipschitz modulus. Unlike the recent self-adaptive approach given in [17], which typically produces …
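The paper's stepsize rule lies behind the truncation, so it is not reproduced here. As background, the following is a minimal sketch of the general template: a projected normalized-gradient method with a diminishing stepsize, a classical scheme for quasiconvex objectives. The quadratic objective and box constraint are invented for the demo and are not from the paper.

```python
import numpy as np

def project_box(x, lo, hi):
    # Euclidean projection onto the box [lo, hi]^n.
    return np.clip(x, lo, hi)

def projected_gradient(grad, x0, project, steps=500):
    # Normalized-gradient projected method with diminishing steps
    # 1/sqrt(k), a classical scheme for quasiconvex minimization
    # (not the paper's adaptive rule).
    x = x0.astype(float)
    for k in range(1, steps + 1):
        g = grad(x)
        n = np.linalg.norm(g)
        if n < 1e-12:
            break
        x = project(x - (1.0 / np.sqrt(k)) * g / n)
    return x

# Demo: minimize ||x - target||^2 over the box [-1, 1]^2; the
# minimizer is the projection of target onto the box, (1, -1).
target = np.array([2.0, -3.0])
grad = lambda x: 2 * (x - target)
x = projected_gradient(grad, np.zeros(2), lambda z: project_box(z, -1, 1))
```

The normalization makes the step length independent of the gradient magnitude, which matters for quasiconvex objectives whose gradients can vanish or blow up away from minimizers.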

Copositive and completely positive cones over symmetric cones of rank at least 5

We focus on copositive and completely positive cones over symmetric cones of rank at least $5$, and in particular investigate whether these cones are spectrahedral shadows. We extend known results for nonnegative orthants of dimension at least $5$ to general symmetric cones of rank at least $5$. Specifically, we prove that when the rank of …

Lower Bounds for Linear Minimization Oracle Methods Optimizing over Strongly Convex Sets

We consider the oracle complexity of constrained convex optimization given access to a Linear Minimization Oracle (LMO) for the constraint set and a gradient oracle for the $L$-smooth, strongly convex objective. This model includes Frank-Wolfe methods and their many variants. Over the problem class of strongly convex constraint sets $S$, our main result proves that …
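To make the oracle model concrete, here is a minimal Frank-Wolfe sketch over a strongly convex constraint set, the Euclidean ball, whose LMO has a closed form. The quadratic objective and the classical open-loop stepsize are generic textbook choices, not the paper's construction.

```python
import numpy as np

def lmo_ball(g, r=1.0):
    # Linear Minimization Oracle for the Euclidean ball of radius r:
    # argmin_{||s|| <= r} <g, s> = -r * g / ||g||.
    n = np.linalg.norm(g)
    return np.zeros_like(g) if n == 0 else -r * g / n

def frank_wolfe(grad, x0, lmo, iters=2000):
    # Classical Frank-Wolfe with the open-loop stepsize 2/(k+2);
    # each iteration calls the LMO once and the gradient oracle once.
    x = x0.astype(float)
    for k in range(iters):
        s = lmo(grad(x))
        gamma = 2.0 / (k + 2)
        x = (1 - gamma) * x + gamma * s
    return x

# Demo: minimize ||x - c||^2 over the unit ball with c outside the
# ball, so the minimizer c/||c|| lies on the boundary.
c = np.array([3.0, 4.0])               # ||c|| = 5, optimum at c/5
x = frank_wolfe(lambda z: 2 * (z - c), np.zeros(2), lmo_ball)
```

The iterate stays feasible by construction (it is a convex combination of feasible points), which is the appeal of LMO-based methods when projection is expensive.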

Improved Analysis of Restarted Accelerated Gradient and Augmented Lagrangian Methods via Inexact Proximal Point Frameworks

This paper studies a class of double-loop (inner-outer) algorithms for convex composite optimization. For unconstrained problems, we develop a restarted accelerated composite gradient method that attains the optimal first-order complexity in both the convex and strongly convex settings. For linearly constrained problems, we introduce inexact augmented Lagrangian methods, including a basic method and an outer-accelerated …
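The paper's restart scheme and inner-outer structure are not reproduced here. As background, the following sketches the restart idea in its simplest form: Nesterov's accelerated gradient with a function-value restart (as in O'Donoghue and Candès), on an invented quadratic test problem.

```python
import numpy as np

def agd_restart(grad, f, x0, L, iters=400):
    # Nesterov accelerated gradient with a function-value restart:
    # whenever the objective increases, the momentum is reset. This
    # recovers fast rates on strongly convex problems without
    # knowing the strong convexity modulus.
    x = y = x0.astype(float)
    t, fx = 1.0, f(x0)
    for _ in range(iters):
        x_new = y - grad(y) / L
        f_new = f(x_new)
        if f_new > fx:                 # restart: kill the momentum
            y, t = x, 1.0
            continue
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)
        x, t, fx = x_new, t_new, f_new
    return x

A = np.diag([1.0, 10.0])               # moderately conditioned quadratic
f = lambda z: 0.5 * z @ A @ z          # minimized at the origin
x = agd_restart(lambda z: A @ z, f, np.array([5.0, 5.0]), L=10.0)
```

A restart is never triggered twice in a row, since the gradient step that follows it is guaranteed to decrease the objective for a step 1/L.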

Convex analysis for composite functions without K-convexity

Composite functions have been studied for over 40 years and appear in a wide range of optimization problems. Convex analysis of these functions focuses on (i) conditions for convexity of the function based on properties of its components, (ii) formulas for the convex conjugate of the function based on those of its components and (iii) …
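As context for item (ii), a classical instance is the conjugate of a composition with a linear map. This is standard convex analysis, not the paper's new result:

```latex
% Conjugate of f = g \circ A for a linear map A; the infimum is
% attained under a suitable qualification condition.
(g \circ A)^{*}(y) \;=\; \inf_{z} \bigl\{\, g^{*}(z) \;:\; A^{\top} z = y \,\bigr\}.
```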

Convex duality contracts for production-grade mathematical optimization

Deploying mathematical optimization in autonomous production systems requires precise contracts for objects returned by an optimization solver. Unfortunately, conventions on dual solution and infeasibility certificates (rays) vary widely across solvers and classes of problems. This paper presents the theoretical framework used by MathOpt (a domain-specific language developed and used at Google) to unify these notions. …
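One concrete instance of the certificate conventions at issue: a Farkas-type dual ray proving infeasibility of a linear inequality system. The check below is standard LP duality, not MathOpt's API, and the example system is invented.

```python
import numpy as np

def certifies_infeasibility(A, b, y, tol=1e-9):
    # Farkas certificate for the system A x <= b: a vector y with
    # y >= 0, A^T y = 0 and b^T y < 0 proves the system infeasible
    # (it combines the inequalities into the contradiction 0 < 0).
    return (np.all(y >= -tol)
            and np.allclose(A.T @ y, 0, atol=tol)
            and b @ y < -tol)

# Demo: x <= 1 together with -x <= -2 (i.e. x >= 2) is infeasible.
A = np.array([[1.0], [-1.0]])
b = np.array([1.0, -2.0])
y = np.array([1.0, 1.0])   # A^T y = 0 and b^T y = -1 < 0
```

A production contract must pin down exactly this kind of object: which sign convention the ray uses, and which system it certifies.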

Revisiting Superlinear Convergence of Proximal Newton-Like Methods to Degenerate Solutions

We describe inexact proximal Newton-like methods for solving degenerate regularized optimization problems and for the broader problem of finding a zero of a generalized equation that is the sum of a continuous map and a maximal monotone operator. Superlinear convergence for both the distance to the solution set and a certain measure of first-order optimality …

Asymptotically tight Lagrangian dual of smooth nonconvex problems via improved error bound of Shapley-Folkman Lemma

In convex geometry, the Shapley–Folkman Lemma asserts that the nonconvexity of a Minkowski sum of $n$-dimensional bounded nonconvex sets does not accumulate once the number of summands exceeds the dimension $n$, and thus the sum becomes approximately convex. Originally published by Starr in the context of quasi-equilibrium in nonconvex market models in economics, the lemma …

Non-Convex Self-Concordant Functions: Practical Algorithms and Complexity Analysis

We extend the standard notion of self-concordance to non-convex optimization and develop a family of second-order algorithms with global convergence guarantees. In particular, two function classes – weakly self-concordant functions and F-based self-concordant functions – generalize the self-concordant framework beyond convexity, without assuming the Lipschitz continuity of the gradient or Hessian. For these function classes, …
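For reference, the standard (convex) self-concordance condition that these classes generalize, stated in its univariate form; the paper's weakly and F-based variants are behind the truncation:

```latex
% A convex f is self-concordant (univariate form) if
|f'''(x)| \;\le\; 2\, f''(x)^{3/2} \qquad \text{for all } x \in \operatorname{dom} f.
```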

Iteration complexity of the Difference-of-Convex Algorithm for unconstrained optimization: a simple proof

We propose a simple proof of the worst-case iteration complexity of the Difference of Convex functions Algorithm (DCA) for unconstrained minimization, showing that the norm of the objective function's gradients at the iterates converges to zero at the global rate $o(1/k)$. A small example is also provided indicating that the rate cannot …
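As a toy illustration of the DCA iteration (not the paper's example): for $f(x) = x^4 - x^2$ with the DC split $g(x) = x^4$, $h(x) = x^2$, the convex subproblem $x_{k+1} = \arg\min_x g(x) - h'(x_k)\,x$ has a closed form.

```python
def dca_step(xk):
    # One DCA iteration for f(x) = x^4 - x^2 with g(x) = x^4 and
    # h(x) = x^2: x_{k+1} = argmin_x x^4 - 2*xk*x. Setting the
    # derivative 4x^3 - 2*xk to zero gives x = (xk/2)^(1/3),
    # taken with the sign of xk.
    s = xk / 2.0
    return (abs(s) ** (1 / 3)) * (1 if s >= 0 else -1)

x = 1.0
for _ in range(60):
    x = dca_step(x)
# x approaches 1/sqrt(2), a global minimizer of x^4 - x^2
```

Each step solves a convex problem obtained by linearizing the concave part, which is the defining mechanism of DCA.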