Estimating Marginal Likelihoods in Likelihood-Free Inference via Neural Density Estimation

Jean-Michel Marin


Date
February 3, 2026

The marginal likelihood, or evidence, plays a central role in Bayesian model selection but remains notoriously difficult to estimate in likelihood-free settings. In Simulation-Based Inference (SBI), neural density estimators offer powerful tools to approximate posteriors but typically do not provide evidence estimates. We address this gap by focusing on Sequential Neural Likelihood Estimation (SNLE). We first exploit the sequential nature of SNLE to design a Sequential Importance Sampling (SIS) methodology that reuses samples from the intermediate rounds of SNLE to obtain evidence estimates at no additional computational cost. A second proposal couples the final SNLE posterior with one additional neural posterior learning step, followed by Importance Sampling (IS). Across benchmark models with known ground truth, the IS methodology substantially improves stability and accuracy over a range of alternative proposals, notably the previously proposed Harmonic Mean estimator. The approach may also extend naturally to amortized SBI, offering a principled and practical tool for Bayesian model selection within the SBI paradigm.
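
To make the importance-sampling estimator described above concrete, the following is a minimal sketch of evidence estimation with a learned likelihood surrogate and a proposal distribution. The function log_surrogate_likelihood and the Gaussian proposal are hypothetical placeholders standing in for the SNLE likelihood emulator and the learned posterior proposal; this is not the paper's implementation, only an illustration of the estimator Z ≈ (1/N) Σ p̂(x|θ_i) π(θ_i) / q(θ_i).

```python
# Illustrative sketch of IS-based evidence estimation (not the paper's code).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical stand-in for the SNLE-trained likelihood emulator log p_phi(x_obs | theta).
def log_surrogate_likelihood(theta, x_obs):
    return stats.norm.logpdf(x_obs, loc=theta, scale=1.0)

# Prior pi(theta) and proposal q(theta); in the abstract's setting the proposal
# would be a learned (final or intermediate-round) posterior approximation.
prior = stats.norm(loc=0.0, scale=3.0)
proposal = stats.norm(loc=1.0, scale=0.8)

x_obs = 1.2
n_samples = 10_000
theta = proposal.rvs(size=n_samples, random_state=rng)

# Importance weights: w_i = p_phi(x_obs | theta_i) * pi(theta_i) / q(theta_i), in log space.
log_w = (
    log_surrogate_likelihood(theta, x_obs)
    + prior.logpdf(theta)
    - proposal.logpdf(theta)
)

# Log-evidence estimate via a numerically stable log-mean-exp.
log_evidence = np.logaddexp.reduce(log_w) - np.log(n_samples)
print(f"Estimated log evidence: {log_evidence:.3f}")
```

In this sketch, the accuracy of the estimate hinges on how well the proposal covers the high-likelihood region, which is the motivation for using a learned posterior approximation as the proposal rather than the prior.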