Bounds play a crucial role in guiding optimization algorithms: they improve convergence speed and solution quality and provide optimality gaps. Lipschitz-constant-based lower-bound construction is an effective technique, but the quality of the resulting linear bounds depends on the function's topological properties. In this research, we improve upon this technique by incorporating nonlinear distance metrics and surrogate approximations to generate tighter bounds. We emphasize the importance of a flexible distance metric that can adapt to the function at hand, and we examine the characteristics and properties of the different incorporated distance metrics. While the linear distance metric is popular in the literature for its simplicity and intuitive interpretation, we discuss the potential benefits of alternatives, such as sublinear and superlinear distance metrics, which are suited to Hölder continuous functions. Sublinear distance metrics are advantageous in sparse-data settings, while superlinear distance metrics can capture nonlinear relationships between data points. Combining surrogate models with nonlinear distance metrics for Lipschitz constant estimation yields high-quality lower bounds that contribute to more effective exploration and exploitation and more accurate optimality-gap estimation.
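To make the idea concrete, the following is a minimal sketch of the pointwise lower bound induced by a Hölder-type distance metric: for a function satisfying |f(x) − f(y)| ≤ L·‖x − y‖^α, every sample (x_i, f(x_i)) implies f(x) ≥ f(x_i) − L·‖x − x_i‖^α. The function name, the test function sin(x), and the constants L and α below are illustrative assumptions, not values from this work; α = 1 recovers the classical linear (Lipschitz) bound, while α < 1 and α > 1 correspond to the sublinear and superlinear metrics discussed above.

```python
import numpy as np

def holder_lower_bound(x, samples, values, L, alpha=1.0):
    """Pointwise lower bound from Hölder continuity:
        f(x) >= max_i [ f(x_i) - L * ||x - x_i||^alpha ].
    alpha = 1 gives the classical Lipschitz (linear) bound;
    alpha < 1 a sublinear metric, alpha > 1 a superlinear one.
    """
    dists = np.linalg.norm(samples - x, axis=1)  # distances to all samples
    return np.max(values - L * dists**alpha)     # tightest valid bound

# Illustrative use on f(x) = sin(x), which is 1-Lipschitz (assumed L=1, alpha=1)
xs = np.linspace(0.0, 2 * np.pi, 8).reshape(-1, 1)
fs = np.sin(xs).ravel()
x_query = np.array([1.0])
lb = holder_lower_bound(x_query, xs, fs, L=1.0, alpha=1.0)
# A valid lower bound never exceeds the true function value f(x_query)
assert lb <= np.sin(1.0) + 1e-12
```

The bound is the upper envelope of cones (α = 1) or cusp/bowl-shaped surfaces (α ≠ 1) anchored at the sample points, which is what makes its tightness sensitive to the choice of distance metric.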