2025 Poster Papers Matching "generative modeling"

37 papers found

A Black-Box Debiasing Framework for Conditional Sampling

Han Cui, Jingbo Liu

NeurIPS 2025 poster · arXiv:2510.11071

Assessing the quality of denoising diffusion models in Wasserstein distance: noisy score and optimal bounds

Vahan Arsenyan, Elen Vardanyan, Arnak Dalalyan

NeurIPS 2025 poster · arXiv:2506.09681

CDFlow: Building Invertible Layers with Circulant and Diagonal Matrices

Xuchen Feng, Siyu Liao

NeurIPS 2025 poster · arXiv:2510.25323

Conformal Generative Modeling with Improved Sample Efficiency through Sequential Greedy Filtering

Klaus-Rudolf Kladny, Bernhard Schölkopf, Michael Muehlebach

ICLR 2025 poster · arXiv:2410.01660 · 5 citations

Contextual Thompson Sampling via Generation of Missing Data

Kelly W Zhang, Tianhui Cai, Hongseok Namkoong et al.

NeurIPS 2025 poster · arXiv:2502.07064 · 2 citations

Continuous Diffusion for Mixed-Type Tabular Data

Markus Mueller, Kathrin Gruber, Dennis Fok

ICLR 2025 poster · arXiv:2312.10431 · 8 citations

Cross-fluctuation phase transitions reveal sampling dynamics in diffusion models

Sai Niranjan Ramachandran, Manish Krishan Lal, Suvrit Sra

NeurIPS 2025 poster · arXiv:2511.00124

Efficient Distribution Matching of Representations via Noise-Injected Deep InfoMax

Ivan Butakov, Alexander Semenenko, Alexander Tolmachev et al.

ICLR 2025 poster · arXiv:2410.06993 · 2 citations

Energy Matching: Unifying Flow Matching and Energy-Based Models for Generative Modeling

Michal Balcerak, Tamaz Amiranashvili, Antonio Terpin et al.

NeurIPS 2025 poster · arXiv:2504.10612 · 8 citations

Energy-Weighted Flow Matching for Offline Reinforcement Learning

Shiyuan Zhang, Weitong Zhang, Quanquan Gu

ICLR 2025 poster · arXiv:2503.04975 · 24 citations

Fast Solvers for Discrete Diffusion Models: Theory and Applications of High-Order Algorithms

Yinuo Ren, Haoxuan Chen, Yuchen Zhu et al.

NeurIPS 2025 poster · arXiv:2502.00234 · 29 citations

FlowDAS: A Stochastic Interpolant-based Framework for Data Assimilation

Siyi Chen, Yixuan Jia, Qing Qu et al.

NeurIPS 2025 poster · arXiv:2501.16642 · 4 citations

Flow matching achieves almost minimax optimal convergence

Kenji Fukumizu, Taiji Suzuki, Noboru Isobe et al.

ICLR 2025 poster · arXiv:2405.20879 · 12 citations

Flow to the Mode: Mode-Seeking Diffusion Autoencoders for State-of-the-Art Image Tokenization

Kyle Sargent, Kyle Hsu, Justin Johnson et al.

ICCV 2025 poster · arXiv:2503.11056 · 23 citations

FocalCodec: Low-Bitrate Speech Coding via Focal Modulation Networks

Luca Della Libera, Francesco Paissan, Cem Subakan et al.

NeurIPS 2025 poster · arXiv:2502.04465 · 9 citations

Generating Physical Dynamics under Priors

Zihan Zhou, Xiaoxue Wang, Tianshu Yu

ICLR 2025 poster · 4 citations

Generator Matching: Generative modeling with arbitrary Markov processes

Peter Holderrieth, Marton Havasi, Jason Yim et al.

ICLR 2025 poster · arXiv:2410.20587 · 43 citations

Go-with-the-Flow: Motion-Controllable Video Diffusion Models Using Real-Time Warped Noise

Ryan Burgert, Yuancheng Xu, Wenqi Xian et al.

CVPR 2025 poster · arXiv:2501.08331 · 59 citations

High-Order Flow Matching: Unified Framework and Sharp Statistical Rates

Maojiang Su, Jerry Yao-Chieh Hu, Yi-Chen Lee et al.

NeurIPS 2025 poster

Improving Neural Optimal Transport via Displacement Interpolation

Jaemoo Choi, Yongxin Chen, Jaewoong Choi

ICLR 2025 poster · arXiv:2410.03783 · 3 citations

Informed Correctors for Discrete Diffusion Models

Yixiu Zhao, Jiaxin Shi, Feng Chen et al.

NeurIPS 2025 poster · arXiv:2407.21243 · 31 citations

Integrating Protein Dynamics into Structure-Based Drug Design via Full-Atom Stochastic Flows

Xiangxin Zhou, Yi Xiao, Haowei Lin et al.

ICLR 2025 poster · arXiv:2503.03989 · 1 citation

LaGeM: A Large Geometry Model for 3D Representation Learning and Diffusion

Biao Zhang, Peter Wonka

ICLR 2025 poster · arXiv:2410.01295 · 11 citations

Latent Zoning Network: A Unified Principle for Generative Modeling, Representation Learning, and Classification

Zinan Lin, Enshu Liu, Xuefei Ning et al.

NeurIPS 2025 poster · arXiv:2509.15591

MET3R: Measuring Multi-View Consistency in Generated Images

Mohammad Asim, Christopher Wewer, Thomas Wimmer et al.

CVPR 2025 poster · arXiv:2501.06336 · 43 citations

MOFFlow: Flow Matching for Structure Prediction of Metal-Organic Frameworks

Nayoung Kim, Seongsu Kim, Minsu Kim et al.

ICLR 2025 poster · arXiv:2410.17270 · 5 citations

Moment- and Power-Spectrum-Based Gaussianity Regularization for Text-to-Image Models

Jisung Hwang, Jaihoon Kim, Minhyuk Sung

NeurIPS 2025 poster · arXiv:2509.07027

Multi-Modal and Multi-Attribute Generation of Single Cells with CFGen

Alessandro Palma, Till Richter, Hanyi Zhang et al.

ICLR 2025 poster · arXiv:2407.11734 · 7 citations

On the Feature Learning in Diffusion Models

Andi Han, Wei Huang, Yuan Cao et al.

ICLR 2025 poster · arXiv:2412.01021 · 13 citations

Physics-Informed Diffusion Models

Jan-Hendrik Bastek, WaiChing Sun, Dennis Kochmann

ICLR 2025 poster · arXiv:2403.14404 · 52 citations

Proper Hölder-Kullback Dirichlet Diffusion: A Framework for High Dimensional Generative Modeling

Wanpeng Zhang, Yuhao Fang, Xihang Qiu et al.

NeurIPS 2025 poster

Riemannian Flow Matching for Brain Connectivity Matrices via Pullback Geometry

Antoine Collas, Ce Ju, Nicolas Salvy et al.

NeurIPS 2025 poster · arXiv:2505.18193 · 2 citations

Sampling 3D Molecular Conformers with Diffusion Transformers

J. Thorben Frank, Winfried Ripken, Gregor Lied et al.

NeurIPS 2025 poster · arXiv:2506.15378 · 1 citation

Steering Protein Family Design through Profile Bayesian Flow

Jingjing Gong, Yu Pei, Siyu Long et al.

ICLR 2025 poster · arXiv:2502.07671 · 3 citations

TreeGen: A Bayesian Generative Model for Hierarchies

Marcel Kollovieh, Nils Fleischmann, Filippo Guerranti et al.

NeurIPS 2025 poster

Trivialized Momentum Facilitates Diffusion Generative Modeling on Lie Groups

Yuchen Zhu, Tianrong Chen, Lingkai Kong et al.

ICLR 2025 poster · arXiv:2405.16381 · 13 citations

Why Masking Diffusion Works: Condition on the Jump Schedule for Improved Discrete Diffusion

Alan Amin, Nate Gruver, Andrew Wilson

NeurIPS 2025 poster · arXiv:2506.08316 · 8 citations