Bounds play a vital role in guiding optimization algorithms: they enhance convergence, improve solution quality, and quantify optimality gaps. While Lipschitz-based lower bounds are well established, their effectiveness is often constrained by the function’s topological properties. To address these limitations, we propose an approach that integrates nonlinear distance metrics with surrogate approximations, yielding more adaptive and accurate lower bounds. A key aspect of our methodology is the flexibility of the chosen distance metric, which can be adapted to different function behaviors. In particular, we explore sublinear and superlinear metrics under the Hölder continuity assumption, demonstrating their capacity to capture local function characteristics beyond the scope of conventional linear bounds. Empirical evaluations on diverse benchmark problems show that our approach surpasses both standard Lipschitz-based and statistical methods in producing high-quality lower bounds. We further employ these refined bounds as an acquisition function within surrogate optimization, a common technique for expensive black-box problems in which a surrogate model approximates the true objective function. By leveraging these bounds to balance exploration and exploitation, we prioritize evaluations in regions with high potential for improvement. Overall, our framework not only offers more accurate and flexible lower bound estimates but also acts as a robust acquisition strategy that expedites convergence to near-optimal solutions in black-box optimization.
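For intuition, the sketch below illustrates the basic construction in its plainest form: a Hölder-type lower bound assembled from evaluated points, used as an acquisition rule that selects the candidate with the smallest (most optimistic) bound. This is a minimal sketch under assumed values of the Hölder constant H and exponent alpha, and the `next_query` helper with its random candidate search is a hypothetical illustration, not the method described above, which additionally integrates surrogate approximations.

```python
import numpy as np

def holder_lower_bound(x, X_eval, f_eval, H, alpha):
    """Pointwise Hölder lower bound
        L(x) = max_i [ f(x_i) - H * ||x - x_i||^alpha ],
    valid whenever |f(u) - f(v)| <= H * ||u - v||^alpha.
    alpha = 1 recovers the classical Lipschitz bound; alpha < 1
    (sublinear) and alpha > 1 (superlinear) give nonlinear
    distance metrics. H and alpha are assumed known here."""
    dists = np.linalg.norm(X_eval - x, axis=1)   # distances to evaluated points
    return np.max(f_eval - H * dists**alpha)     # tightest bound over all points

def next_query(X_eval, f_eval, H, alpha, lo, hi, n_candidates=2000, seed=0):
    """Hypothetical acquisition step: among random candidates in the
    box [lo, hi]^d, pick the one with the smallest lower bound.
    Candidates far from sampled points score well (exploration),
    as do candidates near low observed values (exploitation)."""
    rng = np.random.default_rng(seed)
    cand = rng.uniform(lo, hi, size=(n_candidates, X_eval.shape[1]))
    lb = np.array([holder_lower_bound(c, X_eval, f_eval, H, alpha) for c in cand])
    return cand[np.argmin(lb)]
```

In this simplified form, minimizing the lower bound plays the role of the acquisition function: the bound is loose far from sampled points and tight near them, so its minimizer naturally trades off exploring unsampled regions against refining promising ones.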