"kullback-leibler divergence" Papers
8 papers found
Bounds on $L_p$ Errors in Density Ratio Estimation via $f$-Divergence Loss Functions
Yoshiaki Kitazawa
ICLR 2025 poster
2 citations
DKDR: Dynamic Knowledge Distillation for Reliability in Federated Learning
Yueyang Yuan, Wenke Huang, Guancheng Wan et al.
NeurIPS 2025 poster
Provable Benefit of Annealed Langevin Monte Carlo for Non-log-concave Sampling
Wei Guo, Molei Tao, Yongxin Chen
ICLR 2025 poster · arXiv:2407.16936
17 citations
Stochastic variance-reduced Gaussian variational inference on the Bures-Wasserstein manifold
Hoang Phuc Hau Luu, Hanlin Yu, Bernardo Williams et al.
ICLR 2025 poster · arXiv:2410.02490
Variational Inference with Mixtures of Isotropic Gaussians
Marguerite Petit-Talamon, Marc Lambert, Anna Korba
NeurIPS 2025 poster · arXiv:2506.13613
A Diffusion Model Framework for Unsupervised Neural Combinatorial Optimization
Sebastian Sanokowski, Sepp Hochreiter, Sebastian Lehner
ICML 2024 poster
DistiLLM: Towards Streamlined Distillation for Large Language Models
Jongwoo Ko, Sungnyun Kim, Tianyi Chen et al.
ICML 2024 poster
Theoretical Guarantees for Variational Inference with Fixed-Variance Mixture of Gaussians
Tom Huix, Anna Korba, Alain Oliviero Durmus et al.
ICML 2024 poster
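All of the papers above revolve around the Kullback-Leibler divergence, D_KL(p ‖ q) = Σᵢ pᵢ log(pᵢ / qᵢ). As a quick refresher, here is a minimal NumPy sketch for discrete distributions; the function name `kl_divergence` and the example vectors are illustrative, not taken from any of the listed papers.

```python
import numpy as np

def kl_divergence(p, q):
    """KL(p || q) for discrete distributions given as probability vectors.

    Terms where p_i = 0 contribute 0 by the convention 0 * log 0 = 0.
    Assumes q_i > 0 wherever p_i > 0, otherwise the divergence is infinite.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Example: a fair coin measured against a heavily biased one.
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))
```

Note that D_KL is asymmetric: kl_divergence(p, q) and kl_divergence(q, p) generally differ, which is why variational-inference papers such as those above are careful about which direction they optimize.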