Adaptive Importance Sampling Based Surrogation Methods for Bayesian Hierarchical Models, via Logarithmic Integral Optimization

We explore Maximum a Posteriori inference of Bayesian Hierarchical Models (BHMs) with intractable normalizers, which are increasingly prevalent in contemporary applications and pose computational challenges when combined with nonconvexity and nondifferentiability. To address these challenges, we propose the Adaptive Importance Sampling-based Surrogation method, which efficiently handles nonconvexity and nondifferentiability while improving the sampling approximation of the intractable normalizer through variance reduction. Our analysis ensures its almost sure subsequential convergence to a surrogation stationary point, a necessary candidate for a local minimizer. Extensive numerical experiments demonstrate the efficiency and stability of our algorithm in enabling advanced BHMs with intractable normalizers that arise from enhanced modeling capability.
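
To give a concrete sense of the adaptive importance sampling idea referenced above, the following is a minimal, generic sketch of estimating an intractable normalizer Z = ∫ exp(log_unnorm(x)) dx while refitting a Gaussian proposal to the weighted samples for variance reduction. It is not the paper's algorithm; the function name adaptive_is_normalizer, the argument log_unnorm, and the Gaussian proposal family are illustrative assumptions.

```python
import numpy as np

def adaptive_is_normalizer(log_unnorm, dim, n_rounds=5, n_samples=2000, seed=0):
    """Estimate Z = \int exp(log_unnorm(x)) dx by adaptive importance sampling:
    after each round, the Gaussian proposal is moment-matched to the
    self-normalized weighted samples to reduce the estimator's variance.
    (Illustrative sketch only; not the method proposed in the paper.)"""
    rng = np.random.default_rng(seed)
    mu, cov = np.zeros(dim), np.eye(dim)          # initial proposal N(mu, cov)
    for _ in range(n_rounds):
        # draw from the current Gaussian proposal
        L = np.linalg.cholesky(cov)
        x = mu + rng.standard_normal((n_samples, dim)) @ L.T
        # log-density of each draw under the proposal
        diff = x - mu
        sol = np.linalg.solve(cov, diff.T).T
        log_q = -0.5 * (np.einsum("ij,ij->i", diff, sol)
                        + dim * np.log(2 * np.pi)
                        + np.linalg.slogdet(cov)[1])
        # importance log-weights and a numerically stabilized estimate of Z
        log_w = log_unnorm(x) - log_q
        m = log_w.max()
        Z_hat = np.exp(m) * np.mean(np.exp(log_w - m))
        # adapt the proposal: weighted mean/covariance of the samples
        w = np.exp(log_w - m)
        w /= w.sum()
        mu = w @ x
        cov = (x - mu).T @ ((x - mu) * w[:, None]) + 1e-6 * np.eye(dim)
    return Z_hat

# toy check: unnormalized standard Gaussian in 2D, true Z = (2*pi)^(d/2)
if __name__ == "__main__":
    log_p = lambda x: -0.5 * np.sum(x**2, axis=1)
    print(adaptive_is_normalizer(log_p, dim=2), 2 * np.pi)
```

In a MAP setting such as the one described above, an estimator of this kind would be re-invoked as the hyperparameters change; the variance reduction from adapting the proposal is what keeps the sampled approximation of the normalizer usable inside an outer optimization loop.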
