Exploring Nonlinear Distance Metrics for Lipschitz Constant Estimation in Lower Bound Construction for Global Optimization

Bounds play a crucial role in guiding optimization algorithms, improving their speed and quality, and providing optimality gaps. While Lipschitz constant-based lower bound construction is an effective technique, the quality of the resulting linear bounds depends on the function’s topological properties. In this research, we improve on these bounds by incorporating nonlinear distance metrics and surrogate approximations … Read more
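
As a point of reference for the linear bounds mentioned above, the classical Lipschitz minorant over sample points x_1, …, x_k (a generic sketch; L and the sample points are illustrative symbols, not notation from the paper) is

\[ f(x) \;\geq\; \max_{1 \le i \le k} \bigl( f(x_i) - L\,\|x - x_i\| \bigr), \]

a piecewise-linear lower bound whose tightness hinges on how conservatively the Lipschitz constant L must be chosen, which is the weakness that nonlinear distance metrics are meant to address.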

On Inexact Solution of Auxiliary Problems in Tensor Methods for Convex Optimization

In this paper we study the auxiliary problems that appear in p-order tensor methods for unconstrained minimization of convex functions with \nu-Hölder continuous pth derivatives. These auxiliary problems correspond to the minimization of a (p+\nu)-order regularization of the pth-order Taylor approximation of the objective. For the case p=3, we consider the use … Read more
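
For orientation, the regularized model referred to above can be written in the generic form used for tensor methods (H and \Phi_{x,p} are illustrative symbols, not necessarily the paper's notation):

\[ \min_{y}\; \Phi_{x,p}(y) + \frac{H}{p+\nu}\,\|y - x\|^{p+\nu}, \qquad \Phi_{x,p}(y) = f(x) + \sum_{i=1}^{p} \frac{1}{i!}\, D^{i} f(x)[y-x]^{i}, \]

so each iteration requires minimizing a regularized polynomial model in its own right, which is why the cost and accuracy of solving this auxiliary problem only inexactly matter.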

Tensor Methods for Finding Approximate Stationary Points of Convex Functions

In this paper we consider the problem of finding \epsilon-approximate stationary points of convex functions that are p-times differentiable with \nu-Hölder continuous pth derivatives. We present tensor methods with and without acceleration. Specifically, we show that the non-accelerated schemes take at most O(\epsilon^{-1/(p+\nu-1)}) iterations to reduce the norm of the gradient of the objective below … Read more
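
As a plain specialization of the quoted bound (arithmetic only, not an additional claim from the paper): for Lipschitz continuous pth derivatives, i.e. \nu = 1, the non-accelerated complexity reads

\[ \mathcal{O}\bigl(\epsilon^{-1/p}\bigr), \]

so, for instance, a third-order scheme (p = 3) needs on the order of \epsilon^{-1/3} iterations to drive the gradient norm below \epsilon.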

Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives

In this paper we study p-order methods for unconstrained minimization of convex functions that are p-times differentiable with $\nu$-Hölder continuous pth derivatives. We propose tensor schemes with and without acceleration. For the schemes without acceleration, we establish iteration complexity bounds of $\mathcal{O}\left(\epsilon^{-1/(p+\nu-1)}\right)$ for reducing the functional residual below a given $\epsilon\in (0,1)$. Assuming that $\nu$ … Read more
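
Again specializing the stated bound (simple substitution, not an extra result from the paper): for p = 2 and \nu = 1, i.e. Lipschitz continuous Hessians, the non-accelerated complexity is $\mathcal{O}(\epsilon^{-1/2})$, which corresponds to the functional residual decreasing at the rate $1/k^2$ in the iteration counter k.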

Algorithms for the quasiconvex feasibility problem

We study the behavior of subgradient projection algorithms for the quasiconvex feasibility problem of finding a point x^* in R^n that satisfies the inequalities f_i(x^*) \leq 0, for all i=1,2,…,m, where all functions are continuous and quasiconvex. We consider the consistent case when the solution set is nonempty. Since the Fenchel-Moreau subdifferential might … Read more
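
A generic form of the subgradient projection step studied in this setting (a sketch only; the particular subdifferential and step-size rule of the paper lie behind the truncation) is: if a constraint is violated at the current iterate, i.e. f_i(x^k) > 0, pick a subgradient g^k of f_i at x^k and set

\[ x^{k+1} = x^k - \alpha_k \,\frac{g^k}{\|g^k\|}, \qquad \alpha_k > 0, \]

where the normalization is natural for quasiconvex f_i, since the subdifferentials used in the quasiconvex literature (e.g. Plastria or Greenberg-Pierskalla) carry directional rather than magnitude information.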