A Partial PPA block-wise ADMM for Multi-Block Constrained Separable Convex Optimization

The alternating direction method of multipliers (ADMM) has proved effective for solving two-block separable convex optimization problems subject to linear constraints. However, it is not necessarily convergent when extended directly to the multi-block case. One remedy is to first regroup the multiple blocks of variables into two groups and then apply the classic ADMM to the … Read more
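For reference, the two-block problem template that the classic ADMM handles, and to which the regrouping remedy reduces the multi-block problem, is the following (a standard formulation, not taken from the paper itself):

$$
\min_{x,y}\; f(x) + g(y) \quad \text{s.t.} \quad Ax + By = b,
$$

with augmented Lagrangian $L_\beta(x,y,\lambda) = f(x) + g(y) - \lambda^{\top}(Ax + By - b) + \tfrac{\beta}{2}\|Ax + By - b\|^2$ and iteration

$$
x^{k+1} = \arg\min_x L_\beta(x, y^k, \lambda^k), \qquad
y^{k+1} = \arg\min_y L_\beta(x^{k+1}, y, \lambda^k), \qquad
\lambda^{k+1} = \lambda^k - \beta\,(Ax^{k+1} + By^{k+1} - b).
$$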

Efficient global unconstrained black box optimization

For the unconstrained optimization of black box functions, this paper introduces a new randomized algorithm called VRBBO. In practice, VRBBO matches the quality of other state-of-the-art algorithms at finding a local minimizer with reasonable accuracy, in both small and large dimensions. Although our theory guarantees only local minimizers, our heuristic techniques turn VRBBO into an efficient … Read more
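The excerpt gives no algorithmic details of VRBBO, so the sketch below is not VRBBO itself; it is only a generic random-direction descent loop, included to fix ideas about what a randomized derivative-free (black box) minimizer looks like. All names and parameters here are illustrative.

```python
import numpy as np

def random_direction_search(f, x0, n_iters=2000, step0=1.0, shrink=0.5, grow=2.0, seed=0):
    """Generic random-direction descent for a black-box function f (not VRBBO)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx, step = f(x), step0
    for _ in range(n_iters):
        d = rng.standard_normal(x.size)
        d /= np.linalg.norm(d)                      # random unit direction
        for trial in (x + step * d, x - step * d):  # try both orientations
            ft = f(trial)
            if ft < fx:                             # improvement: accept and expand step
                x, fx, step = trial, ft, step * grow
                break
        else:
            step *= shrink                          # no improvement: shrink step
    return x, fx

# Example: minimize a shifted quadratic in 5 dimensions
x_best, f_best = random_direction_search(lambda x: float(np.sum((x - 3.0) ** 2)), np.zeros(5))
```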

Adaptive Cubic Regularization methods with dynamic inexact Hessian information and applications to finite-sum minimization

We consider the Adaptive Regularization with Cubics (ARC) approach for solving nonconvex optimization problems and propose a new variant based on inexact Hessian information chosen dynamically. A theoretical analysis of the proposed procedure is given. The key property of the ARC framework, namely its optimal worst-case function/derivative evaluation bounds for reaching first- and second-order critical points, is guaranteed. … Read more
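For context, the cubic model used at each ARC iteration has the standard form below, where $B_k$ denotes the (here, inexact) Hessian approximation whose accuracy is chosen dynamically and $\sigma_k > 0$ is the adaptive regularization parameter (notation ours):

$$
m_k(s) = f(x_k) + \nabla f(x_k)^{\top} s + \tfrac{1}{2}\, s^{\top} B_k\, s + \tfrac{\sigma_k}{3}\, \|s\|^3 ,
$$

and the optimal worst-case bound referred to above is of order $O(\varepsilon^{-3/2})$ evaluations to reach an $\varepsilon$-approximate first-order critical point.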

A stochastic Levenberg-Marquardt method using random models with complexity results and application to data assimilation

Globally convergent variants of the Gauss-Newton algorithm are often the methods of choice to tackle nonlinear least-squares problems. Among such frameworks, Levenberg-Marquardt and trust-region methods are two well-established, similar paradigms. Both schemes have been studied when the Gauss-Newton model is replaced by a random model that is only accurate with a given probability. Trust-region schemes … Read more
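As a reminder of the deterministic template that the stochastic variant randomizes, a Levenberg-Marquardt step for $\min_x \tfrac{1}{2}\|r(x)\|^2$ solves the regularized Gauss-Newton system (standard notation, $J_k$ the Jacobian of $r$ at $x_k$, $\gamma_k \ge 0$ the regularization parameter):

$$
\left( J_k^{\top} J_k + \gamma_k I \right) s_k = - J_k^{\top} r(x_k), \qquad x_{k+1} = x_k + s_k ,
$$

where $\gamma_k$ plays a role analogous to the trust-region radius; in the stochastic setting the Gauss-Newton model built from $J_k$ is replaced by a random model that is accurate only with a given probability.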

A limited-memory optimization method using the infinitely many times repeated BNS update and conjugate directions

To improve the performance of the limited-memory variable metric L-BFGS method for large-scale unconstrained optimization, repeating some BFGS updates was proposed in [1, 2]. However, the extra updates need to be selected carefully, since the repeating process can be time-consuming. We show that for the limited-memory variable metric BNS method, matrix … Read more
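The updates being repeated are the standard inverse BFGS updates that underlie both L-BFGS and the BNS matrix representation; with $s_k = x_{k+1} - x_k$ and $y_k = \nabla f(x_{k+1}) - \nabla f(x_k)$,

$$
H_{k+1} = \left(I - \rho_k\, s_k y_k^{\top}\right) H_k \left(I - \rho_k\, y_k s_k^{\top}\right) + \rho_k\, s_k s_k^{\top},
\qquad \rho_k = \frac{1}{y_k^{\top} s_k},
$$

and a limited-memory method stores only the most recent few pairs $(s_k, y_k)$, applying the updates implicitly rather than forming $H_{k+1}$.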

A family of spectral gradient methods for optimization

We propose a family of spectral gradient methods whose stepsize is determined by a convex combination of the short Barzilai-Borwein (BB) stepsize and the long BB stepsize. It is shown that each member of the family shares a certain quasi-Newton property in the sense of least squares. The family also includes some other gradient methods as … Read more
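With $s_{k-1} = x_k - x_{k-1}$ and $y_{k-1} = g_k - g_{k-1}$, the long and short Barzilai-Borwein stepsizes and a convex combination of them read (the weight $\tau_k$ is our generic placeholder; the excerpt does not show the family's actual weighting rule):

$$
\alpha_k^{\mathrm{BB1}} = \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} y_{k-1}}\ (\text{long}),\qquad
\alpha_k^{\mathrm{BB2}} = \frac{s_{k-1}^{\top} y_{k-1}}{y_{k-1}^{\top} y_{k-1}}\ (\text{short}),\qquad
\alpha_k = \tau_k\, \alpha_k^{\mathrm{BB2}} + (1 - \tau_k)\, \alpha_k^{\mathrm{BB1}},\quad \tau_k \in [0,1],
$$

with the iteration $x_{k+1} = x_k - \alpha_k g_k$.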

Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods

This paper presents a finite-difference quasi-Newton method for the minimization of noisy functions. The method takes advantage of the scalability and power of BFGS updating and employs an adaptive procedure for choosing the differencing interval h, based on the noise estimation techniques of Hamming (2012) and Moré and Wild (2011). This noise estimation procedure … Read more
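A minimal sketch of the basic idea (ours, not the paper's code): estimate the noise level $\varepsilon_f$ in the function values, choose the forward-difference interval roughly proportional to $\sqrt{\varepsilon_f}$, and hand the resulting gradient to a BFGS routine. The constants and the adaptive re-estimation of h described in the paper are omitted.

```python
import numpy as np
from scipy.optimize import minimize

def fd_gradient(f, x, h):
    """Forward-difference gradient with a fixed differencing interval h."""
    fx = f(x)
    g = np.empty_like(x)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += h
        g[i] = (f(xp) - fx) / h
    return g

def minimize_noisy(f, x0, noise_level):
    # Rule of thumb: h ~ sqrt(noise level); the paper chooses h adaptively
    # from estimated noise and curvature, which this sketch skips.
    h = np.sqrt(noise_level)
    return minimize(f, x0, jac=lambda x: fd_gradient(f, x, h), method="BFGS")

# Example: a quadratic contaminated with uniform noise of magnitude ~1e-6
rng = np.random.default_rng(0)
f_noisy = lambda x: float(np.sum(x**2)) + 1e-6 * rng.uniform(-1.0, 1.0)
res = minimize_noisy(f_noisy, np.ones(5), noise_level=1e-6)
```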

A comparison of methods for traversing non-convex regions in optimization problems

This paper revisits the well-known problem of dealing with non-convex regions during the minimization of a nonlinear function F(x) by Newton-like methods. The proposal made here involves a curvilinear search along an approximation to the continuous steepest descent path defined by the solution of the ODE $dx/dt = -\nabla F(x)$. The algorithm we develop … Read more
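To illustrate the object being approximated (not the paper's curvilinear search itself), here is a crude explicit-Euler trace of the continuous steepest-descent path $dx/dt = -\nabla F(x)$ for the Rosenbrock function; the step length dt and iteration count are arbitrary choices for this toy example.

```python
import numpy as np

def grad_rosenbrock(x):
    """Gradient of F(x) = 100*(x2 - x1**2)**2 + (1 - x1)**2."""
    x1, x2 = x
    return np.array([-400.0 * x1 * (x2 - x1**2) - 2.0 * (1.0 - x1),
                     200.0 * (x2 - x1**2)])

def steepest_descent_path(x0, dt=5e-4, n_steps=50_000):
    """Explicit Euler integration of dx/dt = -grad F(x)."""
    x = np.asarray(x0, dtype=float)
    path = [x.copy()]
    for _ in range(n_steps):
        x = x - dt * grad_rosenbrock(x)
        path.append(x.copy())
    return np.array(path)

path = steepest_descent_path([-1.2, 1.0])
print(path[-1])   # slowly approaches the minimizer (1, 1)
```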

A Newton-CG Algorithm with Complexity Guarantees for Smooth Unconstrained Optimization

We consider minimization of a smooth nonconvex objective function using an iterative algorithm based on Newton’s method and the linear conjugate gradient method, with explicit detection and use of negative-curvature directions for the Hessian of the objective. The algorithm closely tracks the Newton-conjugate-gradient procedures developed in the 1980s, but includes enhancements that allow worst-case complexity … Read more
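A minimal sketch of the core building block, not the paper's full algorithm (which adds safeguards and the parameters needed for the complexity analysis): run linear conjugate gradient on the Newton system $H d = -g$ and stop early if a direction of negative curvature is detected, returning that direction instead of a Newton step.

```python
import numpy as np

def newton_cg_step(H, g, tol=1e-8, max_iter=None):
    """Linear CG on H d = -g with explicit negative-curvature detection.

    Returns (d, kind): kind is 'newton' if CG converged, 'truncated' on
    hitting the iteration limit, or 'neg_curv' if a direction p with
    p^T H p <= 0 was found (p is returned in that case).
    """
    n = g.size
    max_iter = max_iter or 2 * n
    d = np.zeros(n)
    r = g.copy()              # residual of H d + g = 0 at d = 0
    p = -r
    for _ in range(max_iter):
        Hp = H @ p
        curv = p @ Hp
        if curv <= 0.0:       # negative (or zero) curvature detected
            return p, "neg_curv"
        alpha = (r @ r) / curv
        d = d + alpha * p
        r_new = r + alpha * Hp
        if np.linalg.norm(r_new) <= tol * np.linalg.norm(g):
            return d, "newton"
        beta = (r_new @ r_new) / (r @ r)
        p = -r_new + beta * p
        r = r_new
    return d, "truncated"

# Example with an indefinite Hessian: CG reports a negative-curvature direction
H = np.diag([2.0, 1.0, -0.5])
g = np.array([1.0, 1.0, 1.0])
d, kind = newton_cg_step(H, g)   # kind == 'neg_curv'
```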

Cubic Regularization Method based on Mixed Factorizations for Unconstrained Minimization

Newton’s method for unconstrained optimization, subject to proper regularization or special trust-region procedures, finds first-order stationary points with precision $\varepsilon$ using at most $O(\varepsilon^{-3/2})$ function and derivative evaluations. However, the best-known implementations may require several matrix factorizations per iteration or may rely on rather expensive matrix decompositions. In this paper, we … Read more
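For context, one standard way to write the regularized Newton subproblem behind these $O(\varepsilon^{-3/2})$ bounds (notation ours): at the iterate $x_k$, the step $s_k$ (approximately) minimizes

$$
m_k(s) = \nabla f(x_k)^{\top} s + \tfrac{1}{2}\, s^{\top} \nabla^2 f(x_k)\, s + \tfrac{\sigma_k}{3}\, \|s\|^3 ,
$$

whose global minimizer satisfies $(\nabla^2 f(x_k) + \lambda I)\, s = -\nabla f(x_k)$ with $\lambda = \sigma_k \|s\|$ and $\nabla^2 f(x_k) + \lambda I \succeq 0$; solving shifted linear systems of this kind is typically where the matrix factorizations mentioned above enter an implementation.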