NeurIPS 2025 "generative modeling" Papers

18 papers found

A Black-Box Debiasing Framework for Conditional Sampling

Han Cui, Jingbo Liu

NeurIPS 2025 · poster · arXiv:2510.11071

Ambient Diffusion Omni: Training Good Models with Bad Data

Giannis Daras, Adrian Rodriguez-Munoz, Adam Klivans et al.

NeurIPS 2025 · spotlight · arXiv:2506.10038 · 12 citations

Assessing the quality of denoising diffusion models in Wasserstein distance: noisy score and optimal bounds

Vahan Arsenyan, Elen Vardanyan, Arnak Dalalyan

NeurIPS 2025 · poster · arXiv:2506.09681

CDFlow: Building Invertible Layers with Circulant and Diagonal Matrices

Xuchen Feng, Siyu Liao

NeurIPS 2025 · poster · arXiv:2510.25323

Contextual Thompson Sampling via Generation of Missing Data

Kelly W Zhang, Tianhui Cai, Hongseok Namkoong et al.

NeurIPS 2025 · poster · arXiv:2502.07064 · 2 citations

Cross-fluctuation phase transitions reveal sampling dynamics in diffusion models

Sai Niranjan Ramachandran, Manish Krishan Lal, Suvrit Sra

NeurIPS 2025 · poster · arXiv:2511.00124

Energy Matching: Unifying Flow Matching and Energy-Based Models for Generative Modeling

Michal Balcerak, Tamaz Amiranashvili, Antonio Terpin et al.

NeurIPS 2025 · poster · arXiv:2504.10612 · 8 citations

Fast Solvers for Discrete Diffusion Models: Theory and Applications of High-Order Algorithms

Yinuo Ren, Haoxuan Chen, Yuchen Zhu et al.

NeurIPS 2025 · poster · arXiv:2502.00234 · 29 citations

FlowDAS: A Stochastic Interpolant-based Framework for Data Assimilation

Siyi Chen, Yixuan Jia, Qing Qu et al.

NeurIPS 2025 · poster · arXiv:2501.16642 · 4 citations

FocalCodec: Low-Bitrate Speech Coding via Focal Modulation Networks

Luca Della Libera, Francesco Paissan, Cem Subakan et al.

NeurIPS 2025 · poster · arXiv:2502.04465 · 9 citations

High-Order Flow Matching: Unified Framework and Sharp Statistical Rates

Maojiang Su, Jerry Yao-Chieh Hu, Yi-Chen Lee et al.

NeurIPS 2025 · poster

Informed Correctors for Discrete Diffusion Models

Yixiu Zhao, Jiaxin Shi, Feng Chen et al.

NeurIPS 2025 · poster · arXiv:2407.21243 · 31 citations

Moment- and Power-Spectrum-Based Gaussianity Regularization for Text-to-Image Models

Jisung Hwang, Jaihoon Kim, Minhyuk Sung

NeurIPS 2025 · poster · arXiv:2509.07027

Proper Hölder-Kullback Dirichlet Diffusion: A Framework for High Dimensional Generative Modeling

Wanpeng Zhang, Yuhao Fang, Xihang Qiu et al.

NeurIPS 2025 · poster

Riemannian Flow Matching for Brain Connectivity Matrices via Pullback Geometry

Antoine Collas, Ce Ju, Nicolas Salvy et al.

NeurIPS 2025 · poster · arXiv:2505.18193 · 2 citations

Sampling 3D Molecular Conformers with Diffusion Transformers

J. Thorben Frank, Winfried Ripken, Gregor Lied et al.

NeurIPS 2025 · poster · arXiv:2506.15378 · 1 citation

TreeGen: A Bayesian Generative Model for Hierarchies

Marcel Kollovieh, Nils Fleischmann, Filippo Guerranti et al.

NeurIPS 2025 · poster

Why Masking Diffusion Works: Condition on the Jump Schedule for Improved Discrete Diffusion

Alan Amin, Nate Gruver, Andrew Wilson

NeurIPS 2025 · poster · arXiv:2506.08316 · 8 citations