Poster papers matching "gaussian mixture models"

13 papers found

Attention-based clustering

Rodrigo Maulen Soto, Pierre Marion, Claire Boyer

NeurIPS 2025 poster · arXiv:2505.13112

Beyond Model Collapse: Scaling Up with Synthesized Data Requires Verification

Yunzhen Feng, Elvis Dohmatob, Pu Yang et al.

ICLR 2025 poster · arXiv:2406.07515

Broadening Target Distributions for Accelerated Diffusion Models via a Novel Analysis Approach

Yuchen Liang, Peizhong Ju, Yingbin Liang et al.

ICLR 2025 poster · arXiv:2402.13901
11 citations

How Much is a Noisy Image Worth? Data Scaling Laws for Ambient Diffusion.

Giannis Daras, Yeshwanth Cherapanamjeri, Constantinos C Daskalakis

ICLR 2025 poster · arXiv:2411.02780
16 citations

Transformers are almost optimal metalearners for linear classification

Roey Magen, Gal Vardi

NeurIPS 2025 poster · arXiv:2510.19797
1 citation

Understanding Contrastive Learning via Gaussian Mixture Models

Parikshit Bansal, Ali Kavis, Sujay Sanghavi

NeurIPS 2025 poster
3 citations

Analyzing $D^\alpha$ seeding for $k$-means

Etienne Bamas, Sai Ganesh Nagarajan, Ola Svensson

ICML 2024 poster

Deep Equilibrium Models are Almost Equivalent to Not-so-deep Explicit Models for High-dimensional Gaussian Mixtures

Zenan Ling, Longbo Li, Zhanbo Feng et al.

ICML 2024 poster

Exploring Active Learning in Meta-Learning: Enhancing Context Set Labeling

Wonho Bae, Jing Wang, Danica J. Sutherland

ECCV 2024 poster · arXiv:2311.02879
1 citation

GMM-IKRS: Gaussian Mixture Models for Interpretable Keypoint Refinement and Scoring

Emanuele Santellani, Martin Zach, Christian Sormann et al.

ECCV 2024 poster · arXiv:2408.17149
3 citations

Is Temperature Sample Efficient for Softmax Gaussian Mixture of Experts?

Huy Nguyen, Pedram Akbarian, Nhat Ho

ICML 2024 poster

Theoretical insights for diffusion guidance: A case study for Gaussian mixture models

Yuchen Wu, Minshuo Chen, Zihao Li et al.

ICML 2024 poster

This Probably Looks Exactly Like That: An Invertible Prototypical Network

Zachariah Carmichael, Timothy Redgrave, Daniel Gonzalez Cedre et al.

ECCV 2024 poster · arXiv:2407.12200
6 citations