Accelerated Distance-adaptive Methods for Hölder Smooth and Convex Optimization

Citations: 0
Ranked #1774 of 5858 papers in NeurIPS 2025
Authors: 3

Abstract

This paper introduces new parameter-free first-order methods for convex optimization problems in which the objective function exhibits Hölder smoothness. Inspired by the recently proposed distance-over-gradient (DoG) technique, we propose an accelerated distance-adaptive method that achieves optimal anytime convergence rates for Hölder smooth problems without requiring prior knowledge of smoothness parameters or explicit parameter tuning. Importantly, our parameter-free approach removes the need to specify a target accuracy in advance, addressing a limitation of the universal fast gradient methods (Nesterov, Mathematical Programming, 2015). For convex stochastic optimization, we further present a parameter-free accelerated method that eliminates the need for line-search procedures. Preliminary experimental results highlight the effectiveness of our approach on convex nonsmooth problems and its advantages over existing parameter-free or accelerated methods.
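For orientation, the sketch below illustrates the distance-over-gradient (DoG) step-size idea the abstract refers to, in its basic (non-accelerated) form: the step size is the maximum distance travelled from the initial point divided by the root of the accumulated squared gradient norms. This is an assumption-laden illustration of the DoG rule, not the paper's accelerated distance-adaptive method; the function and parameter names are hypothetical.

```python
import numpy as np

def dog_gradient_descent(grad, x0, steps=1000, r_eps=1e-6):
    """Minimal sketch of a distance-over-gradient (DoG) style step size.

    Not the paper's accelerated method; it only illustrates the
    distance-adaptive idea: eta_t = (max distance from x0 so far)
    / sqrt(sum of squared gradient norms so far).
    """
    x0 = np.asarray(x0, dtype=float)
    x = x0.copy()
    max_dist = r_eps              # seed distance so the first step is nonzero
    grad_norm_sq_sum = 0.0        # running sum of ||g_i||^2
    for _ in range(steps):
        g = grad(x)
        grad_norm_sq_sum += float(np.dot(g, g))
        eta = max_dist / np.sqrt(grad_norm_sq_sum + 1e-12)
        x = x - eta * g
        max_dist = max(max_dist, float(np.linalg.norm(x - x0)))
    return x

# Usage: minimize ||x||^2 without tuning a learning rate.
x_star = dog_gradient_descent(lambda x: 2 * x, x0=np.ones(5))
```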

Citation History

0 citations recorded from Jan 25 to Jan 31, 2026.