The Adaptive Complexity of Minimizing Relative Fisher Information

Citations: 0 · Rank: #2219 of 5858 papers at NeurIPS 2025 · Authors: 2

Abstract

Non-log-concave sampling from an unnormalized density is fundamental in machine learning and statistics. As datasets grow larger, computational efficiency becomes increasingly important, particularly in reducing adaptive complexity, namely the number of sequential rounds required by sampling algorithms. In this work, we initiate the study of the adaptive complexity of non-log-concave sampling within the framework of relative Fisher information introduced by Balasubramanian et al. (2022). To achieve relative Fisher information at most $\varepsilon^2$ with respect to the target distribution, we propose a novel algorithm that reduces the adaptive complexity from $\mathcal{O}(d^2/\varepsilon^4)$ to $\mathcal{O}(d/\varepsilon^2)$ by leveraging parallelism. Furthermore, we show that our algorithm is optimal in a specific regime of large $\varepsilon$. Our algorithm builds on a diagonally parallelized Picard iteration, while the lower bound is based on a reduction from the problem of finding stationary points.
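The abstract does not spell out the algorithm, but the core idea of trading sequential rounds for parallel work via Picard iteration can be sketched. Below is a minimal, hypothetical illustration: plain (not diagonally scheduled) Picard iteration applied to Euler-discretized Langevin dynamics, where each sweep evaluates all $N$ gradients of the current trajectory at once, so $K$ sweeps cost only $K$ adaptive rounds instead of $N$. The names `parallel_picard_langevin` and `grad_log_p` are my own, the NumPy loop stands in for genuinely parallel gradient evaluation, and the paper's actual scheme and guarantees differ.

```python
import numpy as np

def parallel_picard_langevin(grad_log_p, x0, h, N, K, rng):
    """Hypothetical sketch: Picard-parallelized Langevin sampling.

    Sequential Euler-Maruyama Langevin,
        x_{n+1} = x_n + h * grad_log_p(x_n) + sqrt(2h) * xi_n,
    needs N sequential gradient rounds. Picard iteration instead refines
    a whole length-N trajectory; within one sweep the N gradient calls
    are independent and can run in parallel (one adaptive round).
    """
    d = x0.shape[0]
    # Brownian increments are drawn once and held fixed across sweeps.
    noise = rng.standard_normal((N, d)) * np.sqrt(2.0 * h)
    traj = np.tile(x0, (N + 1, 1))  # initial guess: constant trajectory
    for _ in range(K):
        # One adaptive round: all N gradients depend only on the
        # previous sweep's trajectory, so they parallelize.
        grads = np.array([grad_log_p(traj[n]) for n in range(N)])
        drift = np.cumsum(h * grads, axis=0)
        diffusion = np.cumsum(noise, axis=0)
        traj[1:] = x0 + drift + diffusion  # Picard update of the path
    return traj[-1]

# Toy usage: sample from a standard Gaussian, whose score is -x.
rng = np.random.default_rng(0)
x = parallel_picard_langevin(lambda x: -x, np.zeros(5), h=0.01,
                             N=200, K=10, rng=rng)
```

At a fixed point of the sweep, `traj` satisfies exactly the Euler-Maruyama recursion, so the construction changes only the adaptive cost, not the discretization being solved.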

Citation History

Jan 26, 2026: 0 · Jan 27, 2026: 0 · Feb 3, 2026: 0