Fast approximate BayesBag model selection via Taylor approximation
In recent years, BayesBag has emerged as an effective remedy for the brittleness of traditional Bayesian model selection under model misspecification. However, computing BayesBag can be prohibitively expensive for large datasets, since it requires refitting each model on many bootstrap resamples of the data. In this talk, I propose a fast approximation of BayesBag model selection. This approximation, based on a Taylor expansion of the log marginal likelihood, achieves results comparable to BayesBag in a fraction of the computation time. I provide concrete bounds on the approximation error and establish that it vanishes asymptotically as the dataset grows. I demonstrate the utility of this approach on problems arising in forensic science and business.
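To make the idea concrete, here is a minimal illustrative sketch, not the talk's actual construction: in a toy conjugate Gaussian model (an assumption chosen so the log marginal likelihood has a closed form), the bagged model score averages the log marginal likelihood over bootstrap resamples, and a second-order Taylor expansion around the full-data sufficient statistics replaces the per-resample recomputation. All names and parameters below are hypothetical.

```python
# Illustrative sketch (assumed toy model): fast BayesBag-style scoring by
# Taylor-expanding the log marginal likelihood around the full-data
# sufficient statistics, instead of recomputing it per bootstrap resample.
import numpy as np

rng = np.random.default_rng(0)
n, sigma2, tau2 = 200, 1.0, 1.0          # data size, likelihood and prior variances
x = rng.normal(0.5, np.sqrt(sigma2), n)  # synthetic data

def log_marginal(t):
    """Exact log marginal likelihood of N(mu, sigma2) data with prior
    mu ~ N(0, tau2), written as a function of the weighted sufficient
    statistics t = (sum w_i x_i, sum w_i x_i^2); the total weight is
    fixed at n under the multinomial bootstrap."""
    s, q = t
    a = n / sigma2 + 1.0 / tau2
    return (-0.5 * n * np.log(2 * np.pi * sigma2) - q / (2 * sigma2)
            + s**2 / (2 * sigma2**2 * a) - 0.5 * np.log(tau2 * a))

# Gradient and Hessian at the full-data statistics, via central differences.
t0 = np.array([x.sum(), (x**2).sum()])
h = 1e-2
grad, hess = np.zeros(2), np.zeros((2, 2))
for i in range(2):
    e = np.zeros(2); e[i] = h
    grad[i] = (log_marginal(t0 + e) - log_marginal(t0 - e)) / (2 * h)
    hess[i, i] = (log_marginal(t0 + e) - 2 * log_marginal(t0)
                  + log_marginal(t0 - e)) / h**2
hess[0, 1] = hess[1, 0] = (log_marginal(t0 + [h, h]) - log_marginal(t0 + [h, -h])
                           - log_marginal(t0 + [-h, h])
                           + log_marginal(t0 + [-h, -h])) / (4 * h**2)

# Bagged score: exact recomputation vs. the quadratic Taylor surrogate.
exact, approx = [], []
for _ in range(500):
    xb = rng.choice(x, size=n, replace=True)   # one bootstrap resample
    tb = np.array([xb.sum(), (xb**2).sum()])
    exact.append(log_marginal(tb))
    d = tb - t0
    approx.append(log_marginal(t0) + grad @ d + 0.5 * d @ hess @ d)

print(np.mean(exact), np.mean(approx))
```

In this conjugate toy the log marginal likelihood is exactly quadratic in the sufficient statistics, so the surrogate matches the exact bagged score almost perfectly; the sketch only illustrates the mechanics, while the talk's bounds address the approximation error in general models.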