Bounds play a crucial role in guiding optimization algorithms: they improve convergence speed and solution quality and provide optimality gaps. While Lipschitz constant-based lower bound construction is an effective technique, the quality of the resulting linear bounds depends on the geometric properties of the function. In this research, we improve upon this approach by incorporating nonlinear kernels and surrogate approximations to generate higher-quality bounds. We emphasize the importance of a flexible kernel that can adapt to the function at hand, and we examine the characteristics and properties of several proposed kernels. While the linear kernel with the Euclidean distance is popular in the literature for its simplicity and intuitive interpretation, we discuss the potential benefits of alternatives such as sublinear and superlinear kernels: sublinear kernels are advantageous in sparse-data settings, while superlinear kernels can capture nonlinear relationships between data points. Combining surrogate models with nonlinear kernels for Lipschitz constant estimation yields high-quality lower bounds that support more effective exploration and exploitation and more accurate optimality gap estimation.
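To make the construction concrete, the sketch below illustrates one way to kernelize the classical Lipschitz lower bound f(x) ≥ max_i [f(x_i) − L·‖x − x_i‖]: the Euclidean distance is passed through a kernel g, and the corresponding generalized constant is estimated empirically from sampled pairs, playing the role of the surrogate approximation. This is a minimal illustration under our own assumptions, not the paper's actual implementation; the function names (`estimate_kernel_constant`, `kernel_lower_bound`), the specific kernel exponents, and the pairwise empirical estimator are hypothetical choices, and because the pairwise estimate can undershoot the true constant, the resulting bound is heuristic rather than guaranteed.

```python
import numpy as np

# Candidate kernels g(r): the linear kernel gives the classical Lipschitz
# cone; sublinear and superlinear kernels are the nonlinear alternatives.
def linear(r):
    return r

def sublinear(r):       # Hölder-type kernel, exponent 1/2 (illustrative choice)
    return np.sqrt(r)

def superlinear(r):     # grows faster than linearly (illustrative choice)
    return r ** 1.5

def estimate_kernel_constant(X, y, kernel):
    """Empirical surrogate for the generalized Lipschitz constant:
    L_g = max_{i != j} |f(x_i) - f(x_j)| / g(||x_i - x_j||)."""
    ratios = []
    n = len(X)
    for i in range(n):
        for j in range(i + 1, n):
            d = kernel(np.linalg.norm(X[i] - X[j]))
            if d > 0:
                ratios.append(abs(y[i] - y[j]) / d)
    return max(ratios)

def kernel_lower_bound(x, X, y, L_g, kernel):
    """Pointwise lower bound: max_i [ f(x_i) - L_g * g(||x - x_i||) ]."""
    return max(yi - L_g * kernel(np.linalg.norm(x - xi))
               for xi, yi in zip(X, y))

# Usage on a toy 2-D function sampled at 30 random points.
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(30, 2))
y = np.sin(3.0 * X[:, 0]) + X[:, 1] ** 2
L_hat = estimate_kernel_constant(X, y, sublinear)
print(kernel_lower_bound(np.zeros(2), X, y, L_hat, sublinear))
```

Swapping `sublinear` for `linear` or `superlinear` changes how sharply the bound decays away from the sampled points, which is precisely the flexibility that the kernel choice provides.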