Mathematics > Optimization and Control
Title: Adaptive Accelerated Composite Minimization
(Submitted on 6 May 2024)
Abstract: The choice of stepsize in first-order convex optimization is typically based on the smoothness constant and plays a crucial role in the performance of algorithms. Recently, there has been a resurgence of interest in adaptive stepsizes that do not explicitly depend on the smoothness constant. In this paper, we propose a novel adaptive stepsize rule based on function evaluations (i.e., zero-order information) that enjoys provable convergence guarantees for both accelerated and non-accelerated gradient descent. We further discuss the similarities and differences between the proposed stepsize regimes and existing stepsize rules (including Polyak and Armijo). Numerically, we benchmark our proposed algorithms against the state of the art on three classes of problems: smooth minimization (logistic regression, quadratic programming, log-sum-exponential, and approximate semidefinite programming), composite minimization ($\ell_1$-constrained and -regularized problems), and non-convex minimization (a cubic problem).
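The abstract does not state the proposed stepsize rule itself, but it names Armijo backtracking as a related function-evaluation-based scheme. As a minimal illustration of that idea, here is a sketch of gradient descent with an Armijo line search: the stepsize is chosen purely from function values, with no knowledge of the smoothness constant. All names and parameter values below are illustrative, not taken from the paper.

```python
import numpy as np

def armijo_gd(f, grad, x0, t0=1.0, beta=0.5, c=1e-4, iters=100):
    """Gradient descent with Armijo backtracking (illustrative sketch).

    The stepsize t is adapted at every iteration using only function
    evaluations (zero-order information), not the smoothness constant.
    """
    x = x0.astype(float)
    for _ in range(iters):
        g = grad(x)
        t = t0
        # Shrink t until the Armijo sufficient-decrease condition holds:
        # f(x - t g) <= f(x) - c * t * ||g||^2.
        while f(x - t * g) > f(x) - c * t * g.dot(g):
            t *= beta
        x = x - t * g
    return x

# Example: minimize a simple ill-conditioned quadratic f(x) = 0.5 x^T A x.
A = np.diag([1.0, 10.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x_star = armijo_gd(f, grad, np.array([5.0, 5.0]))
```

Note that each backtracking step costs one extra function evaluation; the tradeoff between this cost and the gain from adapted stepsizes is exactly the regime the paper's benchmarks probe.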
Submission history
From: Reza Rahimi Baghbadorani [view email][v1] Mon, 6 May 2024 12:28:42 GMT (2825kb,D)