Bridging the Gap Between f-divergences and Bayes Hilbert Spaces

ICLR 2025

Abstract

We introduce a novel framework that generalizes $f$-divergences by incorporating locally non-convex divergence-generating functions. Using this extension, we define a new class of pseudo $f$-divergences, encompassing a wider range of distances between distributions that traditional $f$-divergences cannot capture. Among these, we focus on a particular pseudo divergence obtained from the metric induced by Bayes Hilbert spaces. Bayes Hilbert spaces are frequently used due to their inherent connection to Bayes' theorem; they allow sampling from potentially intractable posterior densities, a task that has otherwise remained challenging. In the more general setting, we prove that pseudo $f$-divergences are well defined and introduce a variational estimation framework that can be used in a statistical learning context. By applying this variational estimation framework to $f$-GANs, we achieve improved FID scores over existing $f$-GAN architectures and results competitive with the Wasserstein GAN, highlighting its potential for both theoretical research and practical applications in learning theory.
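
For orientation, the following are the standard definitions being generalized here; they are well-known background facts rather than material from the paper, and the paper's notation and precise conditions may differ. The classical $f$-divergence between distributions $P$ and $Q$ with densities $p$ and $q$, for a convex generator $f$ with $f(1) = 0$ (the convexity requirement being what this framework relaxes), is

$$D_f(P \,\|\, Q) = \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) dx,$$

and the variational lower bound underlying $f$-GAN-style estimation, with $f^{*}$ the convex conjugate of $f$ and $T$ ranging over a class of critic functions, is

$$D_f(P \,\|\, Q) \;\ge\; \sup_{T} \; \mathbb{E}_{x \sim P}\!\left[T(x)\right] - \mathbb{E}_{x \sim Q}\!\left[f^{*}(T(x))\right].$$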
