Trust-region algorithms: probabilistic complexity and intrinsic noise with applications to subsampling techniques

A trust-region algorithm is presented for finding approximate minimizers of smooth unconstrained functions whose values and derivatives are subject to random noise. It is shown that, under suitable probabilistic assumptions, the new method finds (in expectation) an $\epsilon$-approximate minimizer of arbitrary order $q > 0$ in at most $\mathcal{O}(\epsilon^{-(q+1)})$ inexact evaluations of the function and … Read more

A framework for convex-constrained monotone nonlinear equations and its special cases

This work addresses methods for solving convex-constrained monotone nonlinear equations. We first propose a framework obtained by combining a safeguard strategy on the search directions with a notion of approximate projections. The global convergence of the framework is established under appropriate assumptions, and some examples of methods which fall into this framework … Read more
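
For orientation, here is a minimal sketch of a classical hyperplane-projection scheme for monotone equations over a convex set, one of the kinds of methods such a framework is meant to cover. The function names and parameters below are ad hoc; the paper's safeguard strategy and its notion of approximate projections are not modeled.

```python
import numpy as np

def projected_monotone_solve(F, proj_C, x0, sigma=1e-4, beta=0.5, tol=1e-8, max_iter=500):
    """Classical hyperplane-projection scheme for monotone F(x) = 0 over a convex set C.

    Simplified sketch (not the paper's framework): F is assumed continuous and
    monotone, and proj_C is the projection onto C.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            break
        d = -Fx  # plain residual direction; a safeguard strategy would adjust d here
        # backtracking line search: find z = x + alpha*d with -F(z)^T d >= sigma*alpha*||d||^2
        alpha = 1.0
        while True:
            z = x + alpha * d
            Fz = F(z)
            if -(Fz @ d) >= sigma * alpha * (d @ d) or alpha < 1e-12:
                break
            alpha *= beta
        if np.linalg.norm(Fz) <= tol:  # the line-search point already solves the system
            return z
        # project x onto the hyperplane {u : F(z)^T (u - z) = 0}, then back onto C
        lam = (Fz @ (x - z)) / (Fz @ Fz)
        x = proj_C(x - lam * Fz)
    return x

# toy usage: F(x) = x (monotone) over the nonnegative orthant, solution x = 0
x_sol = projected_monotone_solve(lambda x: x, lambda v: np.maximum(v, 0.0),
                                 np.array([1.0, 2.0, 3.0]))
```

In the framework of the paper, the plain residual direction and the exact projection above would be replaced by safeguarded directions and approximate projections satisfying the stated assumptions.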

Sequential Domain Adaptation by Synthesizing Distributionally Robust Experts

Least squares estimators, when trained on a few target domain samples, may predict poorly. Supervised domain adaptation aims to improve the predictive accuracy by exploiting additional labeled training samples from a source distribution that is close to the target distribution. Given available data, we investigate novel strategies to synthesize a family of least squares estimator … Read more
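
For context, a generic distributionally robust least squares problem of the kind alluded to here can be written as follows; the notation ($\theta$ for the regression coefficients, $\rho$ for the ambiguity radius) is chosen for illustration and is not taken from the abstract.

\[
\min_{\theta}\;\; \sup_{\mathbb{Q}\,:\,W(\mathbb{Q},\widehat{\mathbb{P}}_N)\le\rho}\; \mathbb{E}_{\mathbb{Q}}\big[(y-\theta^{\top}x)^{2}\big],
\]

where $\widehat{\mathbb{P}}_N$ is an empirical distribution built from the available samples, $W$ is a distance between probability distributions, and $\rho$ controls how far the worst-case distribution may move away from it.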

Accelerated derivative-free spectral residual method for nonlinear systems of equations

Spectral residual methods are powerful tools for solving nonlinear systems of equations without derivatives. In a recent paper, it was shown that an acceleration technique based on the Sequential Secant Method can greatly improve their efficiency and robustness. In the present work, an R implementation of the method is presented. Numerical experiments with a widely … Read more
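
As a rough illustration of the basic iteration behind spectral residual methods, here is a bare-bones sketch with hypothetical names; it omits the nonmonotone line search and the secant acceleration that a practical implementation relies on.

```python
import numpy as np

def spectral_residual(F, x0, tol=1e-8, max_iter=1000):
    """Bare-bones spectral residual iteration for F(x) = 0 (no line search, no
    acceleration, so global convergence is NOT guaranteed by this sketch).
    The step is x_{k+1} = x_k - F(x_k)/sigma_k with a spectral
    (Barzilai-Borwein-type) coefficient built from function values only."""
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    sigma = 1.0
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        x_new = x - Fx / sigma
        F_new = F(x_new)
        s, y = x_new - x, F_new - Fx
        denom = s @ s
        sigma = (s @ y) / denom if denom > 0 else 1.0
        if not np.isfinite(sigma) or abs(sigma) < 1e-10:  # safeguard the coefficient
            sigma = 1.0
        x, Fx = x_new, F_new
    return x

# toy usage: componentwise monotone system tanh(x) + x = 0, whose solution is x = 0
x_sol = spectral_residual(lambda x: np.tanh(x) + x, np.ones(5))
```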

Accelerated derivative-free nonlinear least-squares applied to the estimation of Manning coefficients

A general framework for solving nonlinear least squares problems without using derivatives is proposed in the present paper, together with a new general global convergence theory. To cope with the case in which the number of variables is large (by the standards of derivative-free optimization), two dimension-reduction procedures are introduced. … Read more
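
In standard notation (assumed here), the underlying problem is the derivative-free minimization of a sum of squared residuals,

\[
\min_{x\in\mathbb{R}^{n}}\; f(x)=\tfrac{1}{2}\,\|F(x)\|_{2}^{2},\qquad F:\mathbb{R}^{n}\to\mathbb{R}^{m},
\]

where neither the Jacobian of $F$ nor the gradient of $f$ is available to the algorithm; the larger $n$ is, the more function evaluations a derivative-free method typically needs, which is what motivates the dimension-reduction procedures.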

Secant acceleration of sequential residual methods for solving large-scale nonlinear systems of equations

Sequential Residual Methods try to solve nonlinear systems of equations $F(x)=0$ by iteratively updating the current approximate solution along a residual-related direction. Since no Jacobian or matrix factorization has to be stored, memory requirements are minimal and, consequently, these methods are attractive for solving large-scale nonlinear systems. However, the convergence of these algorithms may be slow in critical cases, so acceleration procedures are welcome. … Read more
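
Schematically, and in notation assumed here rather than taken from the paper, a sequential residual iteration and a secant-type (Anderson-style) acceleration of it look like

\[
x_{k+1}=x_k-\sigma_k\,F(x_k),
\qquad
x_{k+1}^{\mathrm{acc}}=\sum_{i=0}^{m}\lambda_i\,x_{k-i},
\quad
\lambda\in\arg\min_{\sum_i\lambda_i=1}\Big\|\sum_{i=0}^{m}\lambda_i\,F(x_{k-i})\Big\|_{2},
\]

with a scalar coefficient $\sigma_k$ (for instance a spectral one) and a small window of $m+1$ recent iterates, so that only a few vectors need to be kept; this is one common way to realize a secant-type acceleration and may differ in detail from the scheme analyzed in the paper.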

Strong Evaluation Complexity of An Inexact Trust-Region Algorithm for Arbitrary-Order Unconstrained Nonconvex Optimization

A trust-region algorithm using inexact function and derivative values is introduced for solving unconstrained smooth optimization problems. This algorithm uses high-order Taylor models and allows the search for strong approximate minimizers of arbitrary order. The evaluation complexity of finding a $q$-th-order approximate minimizer using this algorithm is then shown, under standard conditions, to be $\mathcal{O}\big(\min_{j\in\{1,\ldots,q\}}\epsilon_j^{-(q+1)}\big)$ … Read more
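
In this line of work, $j$-th order optimality at a point $x$ is typically measured by the largest decrease achievable on the degree-$j$ Taylor model within a ball of radius $\delta$; schematically (notation assumed here),

\[
\phi_{f,j}^{\delta}(x)=f(x)-\min_{\|d\|\le\delta}T_{f,j}(x,d)
\;\le\;\epsilon_j\,\frac{\delta^{j}}{j!}
\quad\text{for all }j\in\{1,\ldots,q\},
\]

where $T_{f,j}(x,d)$ denotes the $j$-th order Taylor expansion of $f$ around $x$. A point satisfying such bounds is an $(\epsilon_1,\ldots,\epsilon_q)$-approximate minimizer of order $q$, and the displayed complexity bound counts the inexact evaluations needed to reach one.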

Spectral Residual Method for Nonlinear Equations on Riemannian Manifolds

In this paper, the spectral algorithm for nonlinear equations (SANE) is adapted to the problem of finding a zero of a given tangent vector field on a Riemannian manifold. The generalized version of SANE uses, in a systematic way, the tangent vector field as a search direction and a continuous real-valued function that adapts this … Read more
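
A toy illustration of the geometric ingredients involved (a tangent vector field, a retraction, a residual-type step), written for the unit sphere; this is a simplified sketch with ad hoc names and a fixed step length, not the globally convergent spectral scheme analyzed in the paper.

```python
import numpy as np

def retract(x, v):
    """Retraction on the unit sphere: step along the tangent vector v, then renormalize."""
    y = x + v
    return y / np.linalg.norm(y)

def tangent_proj(x, u):
    """Orthogonal projection of u onto the tangent space of the sphere at x."""
    return u - (x @ u) * x

def sphere_residual_iteration(V, x0, step=0.2, tol=1e-10, max_iter=200):
    """Naive residual-type iteration for a zero of a tangent vector field V on the
    unit sphere: x_{k+1} = R_{x_k}(-step * V(x_k)).  A fixed step is used here; the
    paper's method instead chooses a spectral coefficient and enforces convergence."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(max_iter):
        v = V(x)
        if np.linalg.norm(v) <= tol:
            break
        x = retract(x, -step * v)
    return x

# toy vector field V(x) = P_x(A x), whose zeros are the eigenvectors of the symmetric A
A = np.diag([3.0, 1.0, -2.0])
x_star = sphere_residual_iteration(lambda x: tangent_proj(x, A @ x),
                                   np.array([0.3, 0.9, 0.3]))
```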

On the abs-polynomial expansion of piecewise smooth functions

Tom Streubel has observed that for functions in abs-normal form, generalized Taylor expansions of arbitrary order $\bar{d}-1$ can be generated by algorithmic piecewise differentiation. Abs-normal form means that the real or vector valued function is defined by an evaluation procedure that involves the absolute value function $|\cdot|$ apart from arithmetic operations and $\bar{d}$ … Read more
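
A concrete example (chosen here for illustration, not taken from the paper) of a piecewise smooth function written with a single absolute-value node:

\[
\max(x_1,x_2)=\tfrac{1}{2}\big(x_1+x_2+|x_1-x_2|\big),
\]

with switching variable $z_1=x_1-x_2$; algorithmic piecewise differentiation propagates Taylor information through such $|\cdot|$ nodes, which is what yields the generalized expansions of order $\bar{d}-1$ mentioned above.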

A Nonmonotone Matrix-Free Algorithm for Nonlinear Equality-Constrained Least-Squares Problems

Least-squares problems form one of the most prominent classes of optimization problems, with numerous applications in scientific computing and data fitting. When such formulations aim at modeling complex systems, the optimization process must account for nonlinear dynamics by incorporating constraints. In addition, these systems often involve a large number of variables, which increases the difficulty … Read more
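
In standard notation (assumed here), the problem class in question can be written as

\[
\min_{x\in\mathbb{R}^{n}}\;\tfrac{1}{2}\,\|r(x)\|_{2}^{2}
\quad\text{subject to}\quad c(x)=0,
\]

where the residual vector $r$ measures the data-fitting error and the constraints $c$ encode the nonlinear dynamics of the system; a matrix-free algorithm accesses the Jacobians of $r$ and $c$ only through matrix-vector products, which keeps memory and per-iteration cost manageable when the number of variables is large.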