A linearly convergent stochastic recursive gradient method for convex optimization

The stochastic recursive gradient algorithm (SARAH) [8] has attracted much interest recently. It admits a simple recursive framework for updating stochastic gradient estimates. Motivated by this, in this paper we propose SARAH-I, a method incorporating importance sampling, for which a linear convergence rate of the sequence of distances between the iterates and the set of optima is proven under …
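For readers unfamiliar with the recursion, the sketch below illustrates a SARAH-style inner loop with importance sampling in Python. It is a minimal sketch under our own assumptions: the function names, the sampling distribution p_i ∝ L_i, and the 1/(n p_i) reweighting are illustrative choices, not the paper's SARAH-I algorithm or its step-size conditions.

```python
import numpy as np

def sarah_is(grad_i, n, w0, step, L, outer=20, inner=100, seed=0):
    """SARAH-style recursion with importance sampling (illustrative sketch).

    Minimizes (1/n) * sum_i f_i(w). grad_i(i, w) returns grad f_i(w);
    L[i] is a Lipschitz constant of grad f_i, used only to bias sampling.
    """
    rng = np.random.default_rng(seed)
    p = np.asarray(L, dtype=float)
    p = p / p.sum()                       # sample index i with prob. ~ L_i
    w = np.asarray(w0, dtype=float)
    for _ in range(outer):
        # Each outer loop restarts the recursion from a full gradient.
        v = sum(grad_i(i, w) for i in range(n)) / n
        w_prev, w = w, w - step * v
        for _ in range(inner):
            i = int(rng.choice(n, p=p))
            # SARAH recursion: correct the running estimate by the change in
            # the sampled component gradient; dividing by n * p_i reweights
            # the biased draw.
            v = (grad_i(i, w) - grad_i(i, w_prev)) / (n * p[i]) + v
            w_prev, w = w, w - step * v
    return w

# Toy least-squares usage: f_i(w) = 0.5 * (a_i @ w - b_i)**2.
A = np.random.default_rng(1).standard_normal((50, 5))
b = A @ np.ones(5)
gi = lambda i, w: (A[i] @ w - b[i]) * A[i]
Ls = np.sum(A * A, axis=1)               # per-component Lipschitz constants
w_hat = sarah_is(gi, n=50, w0=np.zeros(5), step=1e-2, L=Ls)
```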

Efficient combination of two lower bound functions in univariate global optimization

We propose a new method for solving univariate global optimization problems by combining a lower bound function of the αBB method (see [1]) with the lower bound function of the method developed in [4]. The new lower bound function is tighter than both of the original lower bound functions. We add a convex/concave test and a pruning step which …
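The truncated abstract does not say how the two bounds are combined; one natural combination that dominates both is their pointwise maximum. The sketch below uses that assumption inside a generic univariate branch-and-bound with midpoint sampling and pruning; all names, and the Lipschitz-type bounds in the usage example, are our own illustrative stand-ins, not the constructions of [1] or [4].

```python
import heapq
import math

def branch_and_bound(f, lb1, lb2, a, b, tol=1e-6, max_nodes=10_000):
    """Univariate branch-and-bound pruning with max(lb1, lb2) (a sketch).

    lb1(a, b) and lb2(a, b) must each return a valid lower bound of f on
    [a, b]; their pointwise maximum is again valid and at least as tight.
    """
    best_x = 0.5 * (a + b)
    best_f = f(best_x)
    heap = [(max(lb1(a, b), lb2(a, b)), a, b)]   # (combined bound, interval)
    nodes = 0
    while heap and nodes < max_nodes:
        bound, lo, hi = heapq.heappop(heap)
        if bound > best_f - tol:          # prune: cannot beat the incumbent
            continue
        mid = 0.5 * (lo + hi)
        fx = f(mid)                       # midpoint value is an upper bound
        if fx < best_f:
            best_f, best_x = fx, mid
        for l, h in ((lo, mid), (mid, hi)):
            child = max(lb1(l, h), lb2(l, h))    # combined lower bound
            if child < best_f - tol:
                heapq.heappush(heap, (child, l, h))
        nodes += 1
    return best_x, best_f

# Toy usage with two Lipschitz-type bounds anchored at opposite endpoints;
# K = 3.6 bounds |f'| on [-3, 3] for f(x) = sin(3x) + 0.1 x^2.
f = lambda x: math.sin(3 * x) + 0.1 * x * x
K = 3.6
lbA = lambda a, b: f(a) - K * (b - a)    # valid since f(x) >= f(a) - K (x - a)
lbB = lambda a, b: f(b) - K * (b - a)
x_star, f_star = branch_and_bound(f, lbA, lbB, -3.0, 3.0)
```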