Scalable Bayesian Learning with posteriors

ICLR 2025 · 6 citations · #1296 of 3827 papers · 5 authors

Abstract

Although theoretically compelling, Bayesian learning with modern machine learning models is computationally challenging since it requires approximating a high-dimensional posterior distribution. In this work, we (i) introduce posteriors, an easily extensible PyTorch library hosting general-purpose implementations that make Bayesian learning accessible and scalable to large data and parameter regimes; (ii) present a tempered framing of stochastic gradient Markov chain Monte Carlo, as implemented in posteriors, that transitions seamlessly into optimization and unveils a minor modification to deep ensembles to ensure they are asymptotically unbiased for the Bayesian posterior; and (iii) demonstrate and compare the utility of Bayesian approximations through experiments, including an investigation into the cold posterior effect and applications with large language models.

posteriors repository: https://github.com/normal-computing/posteriors
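
The tempered framing mentioned in (ii) can be illustrated with a minimal sketch of stochastic gradient Langevin dynamics (SGLD), written in plain PyTorch rather than the posteriors API. A temperature parameter scales the injected Gaussian noise: temperature = 1 gives standard SGLD sampling of the posterior, while temperature → 0 removes the noise so the update reduces to stochastic gradient ascent on the log posterior, i.e. ordinary optimization. The names here (sgld_step, log_posterior) are illustrative assumptions, not the library's interface.

```python
import torch

def sgld_step(params, log_posterior, batch, lr=1e-3, temperature=1.0):
    """One tempered SGLD update on an iterable of parameter tensors.

    theta <- theta + lr * grad(log p(theta | batch)) + N(0, 2 * lr * temperature)
    """
    log_post = log_posterior(params, batch)
    grads = torch.autograd.grad(log_post, params)
    with torch.no_grad():
        for p, g in zip(params, grads):
            # Tempered noise: vanishes as temperature -> 0, leaving pure
            # gradient ascent on the log posterior (MAP optimization).
            noise = torch.randn_like(p) * (2 * lr * temperature) ** 0.5
            p.add_(lr * g + noise)
    return params
```

At temperature = 0 the step is exactly a stochastic gradient ascent step on the log posterior, which is the sense in which the abstract's tempered SGMCMC framing transitions seamlessly into optimization.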
