A Semismooth Newton-Type Method for the Nearest Doubly Stochastic Matrix Problem

We study a semismooth Newton-type method for the nearest doubly stochastic matrix problem where both differentiability and nonsingularity of the Jacobian can fail. The optimality conditions for this problem are formulated as a system of strongly semismooth functions. We show that the so-called local error bound condition does not hold for this system. Thus the … Read more
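To fix ideas, here is a sketch of the underlying projection problem and the semismooth system its optimality conditions typically induce (notation mine; the paper's exact formulation may differ):
\[
\min_{X\in\mathbb{R}^{n\times n}}\ \tfrac12\|X-A\|_F^2
\quad\text{s.t.}\quad Xe=e,\ \ X^{\top}e=e,\ \ X\ge 0,
\]
where $e$ is the all-ones vector. Attaching multipliers $y,z\in\mathbb{R}^n$ to the row- and column-sum constraints gives $X(y,z)=\Pi_{+}\!\big(A+ye^{\top}+ez^{\top}\big)$, with $\Pi_{+}$ the entrywise projection onto the nonnegative orthant, and the optimality conditions reduce to the system
\[
\Theta(y,z)=\begin{pmatrix} X(y,z)\,e-e\\[2pt] X(y,z)^{\top}e-e \end{pmatrix}=0,
\]
whose components are strongly semismooth because $\Pi_{+}$ is piecewise affine.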

FISTA and Extensions – Review and New Insights

The purpose of this technical report is to review the main properties of an accelerated composite gradient (ACG) method commonly referred to as the Fast Iterative Shrinkage-Thresholding Algorithm (FISTA). In addition, we state a version of FISTA for solving both convex and strongly convex composite minimization problems and derive its iteration complexities to generate iterates … Read more
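For concreteness, here is a minimal FISTA sketch in Python for a composite problem $\min_x f(x)+h(x)$, instantiated as a LASSO problem (the instance, function names, and stepsize choice are mine and are not taken from the report; this shows only the standard convex variant, not the strongly convex or other ACG extensions discussed there).

import numpy as np

def soft_threshold(v, tau):
    # proximal operator of tau * ||.||_1 (componentwise soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista_lasso(A, b, lam, num_iters=500):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 with FISTA.
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient of the smooth part
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(num_iters):
        grad = A.T @ (A @ y - b)               # gradient of 0.5*||Ay - b||^2
        x_new = soft_threshold(y - grad / L, lam / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t ** 2)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # Nesterov extrapolation step
        x, t = x_new, t_new
    return x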

Alternative Regularizations for OA Algorithms for Convex MINLP

In this work, we extend the regularization framework from Kronqvist et al. (https://doi.org/10.1007/s10107-018-1356-3) by incorporating several new regularization functions and develop a regularized single-tree search method for solving convex mixed-integer nonlinear programming (MINLP) problems. We propose a set of regularization functions based on distance metrics and Lagrangean approximations, used in the projection problem for finding new … Read more
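As a rough sketch (generic notation of mine, not the paper's exact formulation): given linearizations of the convex constraints $g_j$ at previously visited points $(x_i,y_i)$ and a reference point $(\bar x,\bar y)$, a regularized projection problem of the following type selects the next trial point, where $\phi$ is one of the regularization functions (a distance metric such as a squared norm, or a Lagrangean-based approximation of the objective):
\[
\min_{(x,y)\in X\times Y}\ \ \phi\big((x,y)-(\bar x,\bar y)\big)
\quad\text{s.t.}\quad
g_j(x_i,y_i)+\nabla g_j(x_i,y_i)^{\top}
\begin{pmatrix} x-x_i\\ y-y_i \end{pmatrix}\le 0,
\qquad i=1,\dots,k,\ \ j\in J.
\]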

On the Convergence Results of a Class of Nonmonotone Accelerated Proximal Gradient Methods for Nonsmooth and Nonconvex Minimization Problems

In this paper, we consider a class of nonsmooth problems whose objective is the sum of a Lipschitz differentiable function and a nonsmooth, proper, lower semicontinuous function. We discuss the convergence rate of the function values for a nonmonotone accelerated proximal gradient method, which was proposed in “Huan Li and Zhouchen Lin: Accelerated proximal gradient … Read more
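As a rough illustration of the nonmonotone acceptance mechanism (a simplification of my own; the extrapolation and acceptance test of the cited algorithm differ in detail), consider the following Python sketch, where grad_f, prox_h, and F are user-supplied callables for the smooth gradient, the proximal map of the nonsmooth part, and the full objective:

import numpy as np

def nonmonotone_apg(grad_f, prox_h, F, x0, step, num_iters=200, delta=1e-4, eta=0.8):
    # prox_h(v, t) should return the proximal point of t*h at v; step is e.g. 1/L.
    x_prev = x0.copy()
    x = x0.copy()
    t = 1.0
    c, q = F(x0), 1.0                                 # nonmonotone reference value and its weight
    for _ in range(num_iters):
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t ** 2)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)   # extrapolated point
        z = prox_h(y - step * grad_f(y), step)        # accelerated proximal step
        if F(z) <= c - delta * np.linalg.norm(z - y) ** 2:
            x_next = z                                # sufficient decrease w.r.t. reference: accept
        else:
            v = prox_h(x - step * grad_f(x), step)    # fallback: plain proximal step from x
            x_next = z if F(z) <= F(v) else v
        q_next = eta * q + 1.0
        c = (eta * q * c + F(x_next)) / q_next        # update the nonmonotone reference value
        q, t = q_next, t_next
        x_prev, x = x, x_next
    return x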

Local Minimizers of the Crouzeix Ratio: A Nonsmooth Optimization Case Study

Given a square matrix $A$ and a polynomial $p$, the Crouzeix ratio is the supremum norm of $p$ on the field of values of $A$ divided by the 2-norm of the matrix $p(A)$. Crouzeix’s conjecture states that the globally minimal value of the Crouzeix ratio is 0.5, regardless of the matrix order and polynomial degree, … Read more
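In symbols (standard definitions, recalled here for convenience):
\[
f(p,A)=\frac{\displaystyle\max_{z\in W(A)}|p(z)|}{\|p(A)\|_2},
\qquad
W(A)=\{v^{*}Av:\ v\in\mathbb{C}^n,\ \|v\|_2=1\},
\]
and Crouzeix's conjecture asserts that $f(p,A)\ge\tfrac12$ for every polynomial $p$ and square matrix $A$. The value $\tfrac12$ is attained, for instance, by $p(z)=z$ and $A=\begin{pmatrix}0&2\\0&0\end{pmatrix}$: the field of values of this $A$ is the closed unit disk, so the numerator equals $1$, while $\|p(A)\|_2=\|A\|_2=2$.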

Radial Duality Part II: Applications and Algorithms

The first part of this work established the foundations of a radial duality between nonnegative optimization problems, inspired by the work of Renegar (2016). Here we utilize our radial duality theory to design and analyze projection-free optimization algorithms that operate by solving a radially dual problem. In particular, we consider radial subgradient, smoothing, and accelerated … Read more

A Nonmonotone Accelerated Proximal Gradient Method with Variable Stepsize Strategy for Nonsmooth and Nonconvex Minimization Problems

We propose a new nonmonotone accelerated proximal gradient method with a variable stepsize strategy for minimizing the sum of a nonsmooth function and a smooth one in the nonconvex setting. In this algorithm, the objective function value is allowed to increase at individual iterations, but it decreases from an overall point of view. The variable stepsize strategy does not … Read more
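Purely as an illustration of what a variable stepsize rule can look like (the Barzilai-Borwein rule below is one common choice, not necessarily the strategy proposed in the paper; names and safeguards are mine):

import numpy as np

def bb_stepsize(x, x_old, g, g_old, step_min=1e-10, step_max=1e10):
    # Barzilai-Borwein (BB1) stepsize from successive iterates and gradients, with safeguards.
    s, v = x - x_old, g - g_old
    sv = float(np.sum(s * v))
    step = float(np.sum(s * s)) / sv if sv > 0 else step_max   # fall back if curvature is nonpositive
    return min(max(step, step_min), step_max)

# The stepsize is then used in a proximal gradient step, e.g.
# x_next = prox_h(x - step * grad_f(x), step)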

Factorization of completely positive matrices using iterative projected gradient steps

We aim to factorize a completely positive matrix using an optimization approach that consists of minimizing a nonconvex smooth function over a convex and compact set. To solve this problem, we propose a projected gradient algorithm with parameters that take into account the effects of relaxation and inertia. Both projection and gradient … Read more
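A minimal Python sketch under simplifying assumptions of my own: the factorization $A \approx BB^{\top}$ is sought with entrywise nonnegative $B$, and projection onto the nonnegative orthant stands in for the convex compact set used in the paper; the relaxation parameter rho and the inertial parameter beta appear in the update.

import numpy as np

def projected_gradient_cp(A, r, num_iters=2000, step=None, rho=1.0, beta=0.5, seed=0):
    # Approximate A ~ B @ B.T with B >= 0 (A assumed completely positive, r = inner dimension).
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    B = rng.random((n, r))
    B_old = B.copy()
    if step is None:
        step = 1.0 / (4.0 * np.linalg.norm(A, 2) + 1e-12)   # crude fixed-stepsize heuristic
    for _ in range(num_iters):
        Y = B + beta * (B - B_old)                 # inertial extrapolation
        grad = 4.0 * (Y @ Y.T - A) @ Y             # gradient of ||Y Y^T - A||_F^2
        P = np.maximum(Y - step * grad, 0.0)       # projection onto the nonnegative orthant
        B_old, B = B, Y + rho * (P - Y)            # relaxed projected gradient update
    return B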

Algorithms for Difference-of-Convex (DC) Programs Based on Difference-of-Moreau-Envelopes Smoothing

In this paper we consider minimization of a difference-of-convex (DC) function with and without linear constraints. We first study a smooth approximation of a generic DC function, termed difference-of-Moreau-envelopes (DME) smoothing, where both components of the DC function are replaced by their respective Moreau envelopes. The resulting smooth approximation is shown to be Lipschitz differentiable, … Read more
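For reference, the standard ingredients of such a smoothing (the paper may use separate parameters for the two envelopes): with the Moreau envelope and its gradient
\[
e_{\lambda}g(x)=\min_{u}\Big\{g(u)+\tfrac{1}{2\lambda}\|u-x\|^{2}\Big\},
\qquad
\nabla e_{\lambda}g(x)=\tfrac{1}{\lambda}\big(x-\operatorname{prox}_{\lambda g}(x)\big),
\]
a DC function $F=g_1-g_2$ is smoothed as $F_{\lambda}=e_{\lambda}g_1-e_{\lambda}g_2$, so that
\[
\nabla F_{\lambda}(x)=\tfrac{1}{\lambda}\big(\operatorname{prox}_{\lambda g_2}(x)-\operatorname{prox}_{\lambda g_1}(x)\big),
\]
which is Lipschitz continuous (with modulus at most $2/\lambda$) because proximal mappings are nonexpansive.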

A Proximal Quasi-Newton Trust-Region Method for Nonsmooth Regularized Optimization

We develop a trust-region method for minimizing the sum of a smooth term f and a nonsmooth term h, both of which can be nonconvex. Each iteration of our method minimizes a possibly nonconvex model of f+h in a trust region. The model coincides with f+h in value and subdifferential at the center. We establish global … Read more
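In generic notation of my own (the exact model, norm, and quasi-Newton update used in the paper may differ), the subproblem solved at an iterate $x_k$ has the form
\[
\min_{s}\ \ f(x_k)+\nabla f(x_k)^{\top}s+\tfrac12\,s^{\top}B_k s+\psi(s;x_k)
\quad\text{s.t.}\quad \|s\|\le\Delta_k,
\]
where $B_k$ is a (possibly indefinite) quasi-Newton approximation of the Hessian of $f$ and $\psi(\cdot\,;x_k)$ models the nonsmooth term with $\psi(0;x_k)=h(x_k)$ and $\partial\psi(0;x_k)=\partial h(x_k)$, so that the full model agrees with $f+h$ in value and subdifferential at the trust-region center.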