Some new accelerated and stochastic gradient descent algorithms based on locally Lipschitz gradient constants

In this paper, we revisit the recently proposed stepsize for the gradient descent scheme, called NGD, introduced by [Hoai et al., A novel stepsize for gradient descent method, Operations Research Letters (2024) 53, doi: 10.1016/j.orl.2024.107072]. We first combine the NGD stepsize with two well-known acceleration techniques, the Heavy ball and Nesterov's methods. In …
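
For readers who want to see the two acceleration schemes mentioned above in isolation, here is a minimal sketch of plain Heavy-ball and Nesterov iterations on a smooth function. The NGD stepsize itself is defined in Hoai et al. (2024) and is not reproduced here; the fixed `stepsize`, `beta`, and the function names `heavy_ball` and `nesterov` are illustrative placeholders, not the paper's method.

```python
import numpy as np

def heavy_ball(grad, x0, stepsize=0.01, beta=0.9, iters=500):
    # Gradient descent with Heavy-ball (Polyak) momentum.
    # `stepsize` is a fixed placeholder; the NGD stepsize of Hoai et al.
    # would instead be computed from local Lipschitz-type estimates.
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for _ in range(iters):
        x_next = x - stepsize * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

def nesterov(grad, x0, stepsize=0.01, beta=0.9, iters=500):
    # Nesterov's accelerated gradient: extrapolate first, then take the
    # gradient step at the look-ahead point.
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for _ in range(iters):
        y = x + beta * (x - x_prev)
        x_prev, x = x, y - stepsize * grad(y)
    return x

# Toy usage: minimize the quadratic f(x) = 0.5 * x^T A x.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
print(heavy_ball(grad, [5.0, 5.0]))
print(nesterov(grad, [5.0, 5.0]))
```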

An infeasible interior-point arc-search method with Nesterov’s restarting strategy for linear programming problems

An arc-search interior-point method is a type of interior-point method that approximates the central path by an ellipsoidal arc, and it can often reduce the number of iterations. In this work, to further reduce the number of iterations and the computation time for solving linear programming problems, we propose two arc-search interior-point methods using Nesterov’s …
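
As a rough illustration of the arc-search idea, the sketch below parametrizes the ellipsoidal arc that arc-search interior-point methods use to approximate the central path and performs a simple acceptance search along it. The derivative computations, the neighborhood tests, and the Nesterov restarting strategy of the proposed methods are not reproduced; `arc_point`, `search_along_arc`, and `is_acceptable` are hypothetical names for illustration only.

```python
import numpy as np

def arc_point(z, z_dot, z_ddot, alpha):
    # Ellipsoidal-arc approximation of the central path used in
    # arc-search interior-point methods:
    #   z(alpha) = z - z_dot*sin(alpha) + z_ddot*(1 - cos(alpha)),
    # where z stacks the primal-dual iterate (x, lambda, s) and z_dot,
    # z_ddot are first/second derivatives of the central path at z,
    # obtained by solving two Newton-type linear systems (not shown).
    return z - z_dot * np.sin(alpha) + z_ddot * (1.0 - np.cos(alpha))

def search_along_arc(z, z_dot, z_ddot, is_acceptable,
                     alphas=np.linspace(np.pi / 2, 0.0, 50)):
    # Back off along the arc from alpha = pi/2 toward 0 until the trial
    # point passes the acceptance test (e.g. positivity of x and s plus
    # a neighborhood-of-the-central-path condition).
    for alpha in alphas:
        trial = arc_point(z, z_dot, z_ddot, alpha)
        if is_acceptable(trial):
            return trial, alpha
    return z, 0.0
```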

New analysis of linear convergence of gradient-type methods via unifying error bound conditions

The subject of linear convergence of gradient-type methods for non-strongly convex optimization has been widely studied by introducing several notions as sufficient conditions. Influential examples include the error bound property, the restricted strong convexity property, the quadratic growth property, and the Kurdyka-Łojasiewicz property. In this paper, we first define a group of error bound conditions …
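
For orientation, the block below records the standard pointwise forms in which the conditions named above usually appear for a smooth convex function; the constants and the exact statements used in the paper's unifying framework may differ and are not reproduced here.

```latex
% Standard forms of the named conditions, for a smooth convex f with
% optimal value f^* and solution set X^*; the constants \kappa, \mu > 0
% are problem dependent (assumed notation, not the paper's).
\begin{align*}
  \text{Error bound:}      &\quad \operatorname{dist}(x, X^*) \le \kappa \,\|\nabla f(x)\|, \\
  \text{Quadratic growth:} &\quad f(x) - f^* \ge \tfrac{\mu}{2}\,\operatorname{dist}(x, X^*)^2, \\
  \text{KL with exponent } \tfrac12 : &\quad f(x) - f^* \le \tfrac{1}{2\mu}\,\|\nabla f(x)\|^2 .
\end{align*}
```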