On the Computation of Restricted Normal Cones

Restricted normal cones are of interest, for instance, in the theory of local error bounds, where they have recently been used to characterize the existence of a constrained Lipschitzian error bound. In this paper, we establish relations between two concepts for restricted normals. The first of these concepts was introduced in the late … Read more

Variational Theory and Algorithms for a Class of Asymptotically Approachable Nonconvex Problems

We investigate a class of composite nonconvex functions, where the outer function is the sum of univariate extended-real-valued convex functions and the inner function is the limit of difference-of-convex functions. A notable feature of this class is that the inner function may fail to be locally Lipschitz continuous. It covers a range of important yet … Read more

Constraint qualifications and strong global convergence properties of an augmented Lagrangian method on Riemannian manifolds

In recent years, augmented Lagrangian methods have been successfully applied to several classes of non-convex optimization problems, inspiring new developments in both theory and practice. In this paper we bring most of these recent developments from nonlinear programming to the context of optimization on Riemannian manifolds, including equality and inequality constraints. Much research has … Read more

Regularized methods via cubic subspace minimization for nonconvex optimization

The main computational cost per iteration of adaptive cubic regularization methods for solving large-scale nonconvex problems is the computation of the step \(s_k\), which requires an approximate minimizer of the cubic model. We propose a new approach in which this minimizer is sought in a low dimensional subspace that, in contrast to classical approaches, is … Read more
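As a rough sketch of the subspace idea (an illustrative construction, not the authors' method), the cubic model can be restricted to a low-dimensional subspace by substituting \(s = Vu\) for a matrix \(V\) with orthonormal columns, so that \(\|s\| = \|u\|\) and the model is minimized over the reduced variables \(u\). The function name, the two-dimensional Krylov basis, and the plain gradient-descent inner solver below are all assumptions made for the example:

```python
import numpy as np

def minimize_cubic_in_subspace(g, B, sigma, V, lr=0.05, iters=2000):
    """Minimize the cubic model m(s) = g^T s + 0.5 s^T B s + (sigma/3)||s||^3
    over the subspace s = V u, where V has orthonormal columns (so ||s|| = ||u||).
    The reduced problem is solved here by plain gradient descent (illustrative)."""
    Vg, VBV = V.T @ g, V.T @ B @ V          # reduced gradient and Hessian
    u = np.zeros(V.shape[1])
    for _ in range(iters):
        # gradient of the reduced cubic model
        u -= lr * (Vg + VBV @ u + sigma * np.linalg.norm(u) * u)
    return V @ u                            # lift the step back to R^n

# Toy data: 2-D Krylov-type subspace spanned by {g, Bg}
g = np.array([1.0, 1.0, 0.0, 0.0])
B = np.diag([1.0, 2.0, 3.0, 4.0])
V, _ = np.linalg.qr(np.column_stack([g, B @ g]))
s = minimize_cubic_in_subspace(g, B, sigma=1.0, V=V)
m = g @ s + 0.5 * s @ B @ s + np.linalg.norm(s) ** 3 / 3  # full model value, < 0
```

Because the reduced problem has only as many variables as the subspace has dimensions, each inner solve is cheap regardless of the ambient dimension.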

A novel algorithm for a broad class of nonconvex optimization problems

In this paper, we propose a new global optimization approach for solving nonconvex optimization problems in which the nonconvex components are sums of products of convex functions. A broad class of nonconvex problems can be written in this way, such as concave minimization problems, difference of convex problems, and fractional optimization problems. Our approach exploits … Read more

On Integrality in Semidefinite Programming for Discrete Optimization

It is well-known that by adding integrality constraints to the semidefinite programming (SDP) relaxation of the max-cut problem, the resulting integer semidefinite program is an exact formulation of the problem. In this paper we show similar results for a wide variety of discrete optimization problems for which SDP relaxations have been derived. Based on a … Read more
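For the max-cut case mentioned above, the standard SDP relaxation and its integer-constrained counterpart can be written as follows (a textbook formulation, not taken from the paper):

```latex
% Max-cut on a graph with Laplacian L, in {-1,1} variables:
\begin{align*}
\max_{X}\quad & \tfrac{1}{4}\,\langle L, X \rangle \\
\text{s.t.}\quad & \operatorname{diag}(X) = \mathbf{1}, \quad X \succeq 0
  && \text{(SDP relaxation)} \\
& X_{ij} \in \{-1, 1\} \ \ \forall\, i, j
  && \text{(added integrality)}
\end{align*}
```

Adding the integrality constraints forces \(X = xx^\top\) with \(x \in \{-1,1\}^n\), so the integer SDP is an exact formulation of max-cut rather than a relaxation.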

Behavior of Newton-type methods near critical solutions of nonlinear equations with semismooth derivatives

Having in mind singular solutions of smooth reformulations of complementarity problems, arising unavoidably when the solution in question violates strict complementarity, we study the behavior of Newton-type methods near singular solutions of nonlinear equations, assuming that the operator of the equation possesses a strongly semismooth derivative, but is not necessarily twice differentiable. These smoothness restrictions … Read more
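The slowdown at singular solutions is easy to see in the simplest smooth case (a toy illustration, not the semismooth setting studied in the paper): for \(f(x) = x^2\), the root \(x^* = 0\) is singular because \(f'(0) = 0\), and the Newton step merely halves the error, so convergence degrades from quadratic to linear with rate \(1/2\):

```python
# Newton iteration x_{k+1} = x_k - f(x_k)/f'(x_k) on f(x) = x^2.
# At the singular root x* = 0 we have f'(0) = 0, and the update
# reduces to x_{k+1} = x_k / 2: linear convergence with rate 1/2.
def newton(f, df, x, iters):
    for _ in range(iters):
        x = x - f(x) / df(x)
    return x

f, df = lambda x: x * x, lambda x: 2 * x
x10 = newton(f, df, 1.0, 10)  # after 10 steps the error is 2**-10
```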

Searching for Optimal Per-Coordinate Step-sizes with Multidimensional Backtracking

The backtracking line-search is an effective technique to automatically tune the step-size in smooth optimization. It guarantees performance comparable to that of the theoretically optimal step-size. Many approaches have been developed to instead tune per-coordinate step-sizes, also known as diagonal preconditioners, but none of the existing methods are provably competitive with the optimal per-coordinate step-sizes. We … Read more
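For reference, a minimal sketch of the classical scalar backtracking line-search (Armijo condition) that the abstract builds on; the multidimensional per-coordinate extension is the paper's contribution and is not reproduced here, and the function name and parameter choices below are illustrative:

```python
import numpy as np

def backtracking_step_size(f, grad, x, alpha0=1.0, beta=0.5, c=1e-4, max_iter=50):
    """Armijo backtracking: shrink alpha until sufficient decrease holds."""
    g = grad(x)
    d = -g                      # steepest-descent direction
    fx = f(x)
    alpha = alpha0
    for _ in range(max_iter):
        if f(x + alpha * d) <= fx + c * alpha * g.dot(d):
            return alpha        # sufficient-decrease condition satisfied
        alpha *= beta           # otherwise backtrack
    return alpha

# Example: ill-conditioned quadratic f(x) = 0.5 x^T A x
A = np.diag([1.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x = np.array([1.0, 1.0])
alpha = backtracking_step_size(f, grad, x)  # halved until decrease holds
```

A single scalar `alpha` must accommodate the worst-curvature coordinate, which is exactly the limitation that per-coordinate step-sizes (diagonal preconditioners) aim to remove.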

Adaptive Importance Sampling Based Surrogation Methods for Bayesian Hierarchical Models, via Logarithmic Integral Optimization

We explore Maximum a Posteriori inference of Bayesian Hierarchical Models (BHMs) with intractable normalizers, which are increasingly prevalent in contemporary applications and pose computational challenges when combined with nonconvexity and nondifferentiability. To address these, we propose the Adaptive Importance Sampling-based Surrogation method, which efficiently handles nonconvexity and nondifferentiability while improving the sampling approximation of the … Read more