A Newton-CG Algorithm with Complexity Guarantees for Smooth Unconstrained Optimization

We consider minimization of a smooth nonconvex objective function using an iterative algorithm based on Newton’s method and linear conjugate gradient, with explicit detection and use of negative curvature directions for the Hessian of the objective function. The algorithm closely tracks the Newton-conjugate gradient procedures developed in the 1980s, but includes enhancements that allow worst-case complexity …
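
As a rough illustration of the idea, here is a minimal Python sketch of one inexact Newton step that runs CG on the Newton system and bails out with a detected negative-curvature direction. The tolerance `eps`, the residual test, and all names are our illustrative choices, not the paper's procedure, which includes additional safeguards.

```python
import numpy as np

def newton_cg_step(grad, hess, eps=1e-6, max_iters=None):
    """One inexact Newton step via CG on H d = -g, returning early with a
    negative-curvature direction if one is detected along the way.
    Illustrative sketch only."""
    n = grad.size
    max_iters = max_iters or 2 * n
    d = np.zeros(n)
    r = grad.copy()            # residual of H d + g (d = 0 initially)
    p = -r                     # CG search direction
    for _ in range(max_iters):
        Hp = hess @ p
        curv = p @ Hp
        if curv < eps * (p @ p):            # (near-)negative curvature
            return p, "negative_curvature"
        alpha = (r @ r) / curv
        d = d + alpha * p
        r_new = r + alpha * Hp
        if np.linalg.norm(r_new) <= 0.5 * np.linalg.norm(grad):
            return d, "newton"              # sufficiently accurate step
        beta = (r_new @ r_new) / (r @ r)
        p = -r_new + beta * p
        r = r_new
    return d, "newton"
```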

An Improved Method of Total Variation Superiorization Applied to Reconstruction in Proton Computed Tomography

Previous work showed that total variation superiorization (TVS) improves reconstructed image quality in proton computed tomography (pCT). The structure of the TVS algorithm has evolved since then, and this work investigated whether the new algorithmic structure provides additional benefits to pCT image quality. Structural and parametric changes introduced to the original TVS algorithm included: (1) …
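
For readers unfamiliar with superiorization, the sketch below shows the generic pattern of interleaving TV-reducing perturbations with a feasibility-seeking step. The discrete TV gradient, the boundary handling, and the names `superiorize` / `feasibility_step` are simplified illustrations of ours, not the algorithm studied here.

```python
import numpy as np

def tv_gradient(x):
    """Gradient of a smoothed total variation of a 2-D image (simplistic
    boundary handling via append/roll; illustration only)."""
    dx = np.diff(x, axis=0, append=x[-1:, :])
    dy = np.diff(x, axis=1, append=x[:, -1:])
    mag = np.sqrt(dx**2 + dy**2 + 1e-8)
    gx, gy = dx / mag, dy / mag
    div = (gx - np.roll(gx, 1, axis=0)) + (gy - np.roll(gy, 1, axis=1))
    return -div                    # grad TV = -div( grad x / |grad x| )

def superiorize(x, feasibility_step, n_iters=50, a=0.99):
    """Generic superiorization pattern: perturb each iterate in a
    TV-reducing direction with summable step sizes, then apply one
    feasibility-seeking step (e.g., an ART sweep in pCT)."""
    beta = 1.0
    for _ in range(n_iters):
        v = tv_gradient(x)
        nv = np.linalg.norm(v)
        if nv > 0:
            x = x - beta * (v / nv)   # TV-reducing perturbation
        beta *= a                     # geometric, hence summable, steps
        x = feasibility_step(x)
    return x
```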

On Algebraic Proofs of Stability for Homogeneous Vector Fields

We prove that if a homogeneous, continuously differentiable vector field is asymptotically stable, then it admits a Lyapunov function which is the ratio of two polynomials (i.e., a rational function). We further show that when the vector field is polynomial, the Lyapunov inequalities on both the rational function and its derivative have sum of squares …
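
Schematically, the objects involved are as follows (notation ours, not the paper's), for a vector field $f$ and a rational Lyapunov candidate $V = p/q$ with $p$, $q$ polynomials and $q > 0$ away from the origin:

```latex
V(x) \;=\; \frac{p(x)}{q(x)}, \qquad
\dot V(x) \;=\; \nabla V(x)^{\top} f(x)
          \;=\; \frac{q(x)\,\nabla p(x)^{\top} f(x)
                      \;-\; p(x)\,\nabla q(x)^{\top} f(x)}{q(x)^{2}} .
```

Since the denominators are positive, the sign conditions $V > 0$ and $\dot V < 0$ reduce to polynomial inequalities on $p$ and on the numerator of $\dot V$; these are the Lyapunov inequalities that, per the abstract, admit sum of squares certificates when $f$ is polynomial.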

A Distributed Quasi-Newton Algorithm for Empirical Risk Minimization with Nonsmooth Regularization

We propose a communication- and computation-efficient distributed optimization algorithm using second-order information for solving ERM problems with a nonsmooth regularization term. Current second-order and quasi-Newton methods for this problem either do not work well in the distributed setting or work only for specific regularizers. Our algorithm uses successive quadratic approximations, and we describe how to …
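
A minimal single-machine sketch of one successive-quadratic-approximation step for an l1-regularized problem, with the subproblem handled by proximal gradient. It assumes a positive semidefinite model Hessian available through a hypothetical `hess_vec` operator, and omits the distributed solver and communication scheme that are the paper's contribution.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sqa_step(x, grad, hess_vec, lam, inner_iters=20):
    """One successive-quadratic-approximation step for
    min f(x) + lam*||x||_1: approximately minimize the subproblem
        grad@d + 0.5*d@(H d) + lam*||x + d||_1
    by proximal gradient. Assumes H (via hess_vec) is PSD."""
    rng = np.random.default_rng(0)
    v = rng.standard_normal(x.size)
    for _ in range(10):                  # crude largest-eigenvalue estimate
        v = hess_vec(v)
        v = v / np.linalg.norm(v)
    L = float(v @ hess_vec(v)) + 1e-12
    d = np.zeros_like(x)
    for _ in range(inner_iters):
        model_grad = grad + hess_vec(d)
        z = x + d - model_grad / L
        d = soft_threshold(z, lam / L) - x
    return x + d
```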

On an Elliptical Trust-Region Procedure for Ill-Posed Nonlinear Least-Squares Problems

In this paper we address the stable numerical solution of ill-posed nonlinear least-squares problems with small residual. We propose an elliptical trust-region reformulation of a Levenberg-Marquardt procedure. Thanks to an appropriate choice of the trust-region radius, the proposed procedure guarantees an automatic choice of the free regularization parameters that, together with a suitable stopping criterion, …
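
For orientation, here is a classical Levenberg-Marquardt loop with a gain-ratio update of the regularization parameter `lam`; in the paper that role is instead governed by the elliptical trust-region radius, which this sketch does not implement.

```python
import numpy as np

def levenberg_marquardt(residual, jac, x, n_iters=50, lam=1e-2):
    """Classical Levenberg-Marquardt for min 0.5*||r(x)||^2 with a
    gain-ratio update of lam (illustrative sketch)."""
    for _ in range(n_iters):
        r, J = residual(x), jac(x)
        g = J.T @ r
        if np.linalg.norm(g) < 1e-8:
            break
        p = np.linalg.solve(J.T @ J + lam * np.eye(x.size), -g)
        pred = -(g @ p) - 0.5 * (p @ (J.T @ J) @ p)   # model decrease
        r_new = residual(x + p)
        rho = (0.5 * (r @ r) - 0.5 * (r_new @ r_new)) / max(pred, 1e-16)
        if rho > 0:                      # accept the step
            x = x + p
            if rho > 0.75:
                lam = max(0.5 * lam, 1e-12)
        else:                            # reject; regularize more strongly
            lam = 2.0 * lam
    return x
```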

Inexact Successive Quadratic Approximation for Regularized Optimization

Successive quadratic approximations, or second-order proximal methods, are useful for minimizing functions that are a sum of a smooth part and a convex, possibly nonsmooth part that promotes regularization. Most analyses of iteration complexity focus on the special case of the proximal gradient method, or accelerated variants thereof. There have been only a few studies of …
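
The proximal gradient special case mentioned above corresponds to taking the quadratic model's Hessian to be L·I. A minimal sketch, with interface names of our choosing:

```python
import numpy as np

def proximal_gradient(grad_f, prox_psi, x0, L, n_iters=200):
    """Proximal gradient method: the successive-quadratic-approximation
    special case in which the model Hessian is L*I. prox_psi(z, t) must
    return argmin_u  psi(u) + (1/(2t)) * ||u - z||^2."""
    x = x0.copy()
    for _ in range(n_iters):
        x = prox_psi(x - grad_f(x) / L, 1.0 / L)
    return x

# e.g., psi = lam*||.||_1, whose prox is soft-thresholding:
lam = 0.1
prox_l1 = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - lam * t, 0.0)
```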

A Branch-and-Bound based Algorithm for Nonconvex Multiobjective Optimization

A new branch-and-bound based algorithm for smooth nonconvex multiobjective optimization problems with convex constraints is presented. The algorithm computes an $(\varepsilon,\delta)$-approximation of all globally optimal solutions. We introduce the algorithm, which uses selection rules, discarding tests, and termination tests. The discarding tests are the most important aspect, as they examine in different ways whether a box …
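
A toy version of one dominance-based discarding test (minimization convention; the eps-domination definition and archive interface are our illustrative choices, and the paper's tests are more refined):

```python
import numpy as np

def eps_dominates(a, b, eps):
    """Minimization: a eps-dominates b if a + eps <= b componentwise."""
    return bool(np.all(a + eps <= b))

def can_discard(box_lower_bound, archive, eps):
    """Discard a box when some already-computed objective vector in the
    archive eps-dominates a componentwise lower bound of the objectives
    over that box: the box then holds no further (eps-)efficient points."""
    return any(eps_dominates(u, box_lower_bound, eps) for u in archive)
```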

ADMM for Multiaffine Constrained Optimization

We propose an expansion of the scope of the alternating direction method of multipliers (ADMM). Specifically, we show that ADMM, when employed to solve problems with multiaffine constraints that satisfy certain easily verifiable assumptions, converges to the set of constrained stationary points if the penalty parameter in the augmented Lagrangian is sufficiently large. When the …
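
As a concrete toy instance, the sketch below runs ADMM on the biaffine constraint X @ Y - Z = 0 arising from a low-rank fitting problem. The problem choice and closed-form block updates are ours, illustrating the general multiaffine setting rather than reproducing the paper's analysis.

```python
import numpy as np

def multiaffine_admm(M, r, rho=1.0, n_iters=200, delta=1e-8):
    """ADMM for min 0.5*||Z - M||^2  s.t.  X @ Y - Z = 0: the constraint
    is biaffine in the blocks (X, Y), so each block update below is a
    linear least-squares solve and the Z update is a simple average."""
    m, n = M.shape
    rng = np.random.default_rng(0)
    X = rng.standard_normal((m, r))
    Y = rng.standard_normal((r, n))
    Z, W = M.copy(), np.zeros_like(M)
    for _ in range(n_iters):
        B = Z - W / rho
        X = B @ Y.T @ np.linalg.inv(Y @ Y.T + delta * np.eye(r))
        Y = np.linalg.inv(X.T @ X + delta * np.eye(r)) @ X.T @ B
        Z = (M + rho * (X @ Y) + W) / (1.0 + rho)
        W = W + rho * (X @ Y - Z)        # multiplier (dual) update
    return X, Y, Z
```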

A Riemannian Conjugate Gradient Algorithm with Implicit Vector Transport for Optimization on the Stiefel Manifold

In this paper, a reliable curvilinear search algorithm for solving optimization problems over the Stiefel manifold is presented. This method is inspired by the conjugate gradient method, with the purpose of obtaining a new search direction that guarantees descent of the objective function at each iteration. The merit of this algorithm lies in the fact …
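
A bare-bones sketch of the ingredients: tangent-space projection, a QR retraction, and a CG-type direction whose previous term is "transported" implicitly by projection onto the new tangent space. The fixed step size and PR+ coefficient are our simplifications; the paper uses a curvilinear search.

```python
import numpy as np

def proj_tangent(X, G):
    """Project G onto the tangent space of the Stiefel manifold at X."""
    S = X.T @ G
    return G - X @ (S + S.T) / 2.0

def qr_retract(X, xi):
    """QR retraction: map the tangent vector xi back onto the manifold."""
    Q, R = np.linalg.qr(X + xi)
    return Q * np.sign(np.diag(R))      # fix column signs for uniqueness

def stiefel_cg(grad_f, X, alpha=1e-2, n_iters=100):
    """CG-type iteration on the Stiefel manifold with implicit
    (projection-based) transport of the previous direction."""
    g = proj_tangent(X, grad_f(X))
    d = -g
    for _ in range(n_iters):
        X_new = qr_retract(X, alpha * d)
        g_new = proj_tangent(X_new, grad_f(X_new))
        g_old = proj_tangent(X_new, g)            # implicit transport
        beta = max(0.0, float(np.sum(g_new * (g_new - g_old)))
                        / max(float(np.sum(g * g)), 1e-16))   # PR+
        d = -g_new + beta * proj_tangent(X_new, d)
        X, g = X_new, g_new
    return X
```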

Algorithms and Convergence Results of Projection Methods for Inconsistent Feasibility Problems: A Review

The convex feasibility problem (CFP) is to find a feasible point in the intersection of finitely many closed convex sets. If the intersection is empty, the CFP is inconsistent and a feasible point does not exist. Nevertheless, there is a body of algorithmic research on inconsistent CFPs, focused mainly on two directions. One is oriented …
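
As one example of the least-squares viewpoint for inconsistent problems, here is a Cimmino-type simultaneous projection sketch; the helper names and the two-ball example are ours.

```python
import numpy as np

def ball_projector(center, radius):
    """Euclidean projection onto a closed ball (illustrative set)."""
    center = np.asarray(center, dtype=float)
    def P(x):
        d = x - center
        nd = np.linalg.norm(d)
        return x if nd <= radius else center + radius * d / nd
    return P

def simultaneous_projections(projectors, x0, n_iters=500):
    """Cimmino-type method: average the projections onto all sets. For an
    inconsistent CFP the iterates tend to a minimizer of the proximity
    function sum_i ||x - P_i(x)||^2, when such a minimizer exists."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = sum(P(x) for P in projectors) / len(projectors)
    return x

# Two disjoint unit balls make the CFP inconsistent; the limit here is
# the midpoint (0, 0) between them.
x = simultaneous_projections([ball_projector([-2.0, 0.0], 1.0),
                              ball_projector([ 2.0, 0.0], 1.0)],
                             [0.0, 5.0])
```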