Mathematics > Optimization and Control

Title: Adaptive Accelerated Composite Minimization

Abstract: The choice of stepsize in first-order convex optimization is typically based on the smoothness constant and plays a crucial role in the performance of algorithms. Recently, there has been a resurgence of interest in adaptive stepsizes that do not explicitly depend on the smoothness constant. In this paper, we propose a novel adaptive stepsize rule based on function evaluations (i.e., zero-order information) that enjoys provable convergence guarantees for both accelerated and non-accelerated gradient descent. We further discuss the similarities and differences between the proposed stepsize regimes and existing stepsize rules (including Polyak and Armijo). Numerically, we benchmark our proposed algorithms against the state of the art in three classes of problems: smooth minimization (logistic regression, quadratic programming, log-sum-exponential, and approximate semidefinite programming), composite minimization ($\ell_1$-constrained and regularized problems), and non-convex minimization (a cubic problem).
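To illustrate the family of stepsize rules the abstract refers to, here is a minimal sketch of gradient descent with an Armijo backtracking stepsize, one of the classical function-evaluation-based rules the paper compares against. This is a generic textbook scheme, not the paper's proposed rule; the function names and default parameters (`t0`, `beta`, `c`) are illustrative choices.

```python
import numpy as np

def armijo_gradient_descent(f, grad, x0, t0=1.0, beta=0.5, c=1e-4,
                            max_iter=1000, tol=1e-8):
    """Gradient descent with an Armijo backtracking stepsize.

    The stepsize is chosen from function evaluations alone, without
    explicit knowledge of the smoothness constant (illustrative sketch,
    not the rule proposed in the paper).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        t = t0
        # Backtrack until the Armijo sufficient-decrease condition holds:
        #   f(x - t g) <= f(x) - c * t * ||g||^2
        while f(x - t * g) > f(x) - c * t * g.dot(g):
            t *= beta
        x = x - t * g
    return x

# Usage: minimize the smooth quadratic f(x) = ||x||^2 / 2
f = lambda x: 0.5 * x.dot(x)
grad = lambda x: x
x_star = armijo_gradient_descent(f, grad, np.array([3.0, -4.0]))
```

Like the Polyak rule also mentioned in the abstract, this scheme adapts the stepsize per iteration from zero-order information, at the cost of extra function evaluations inside the backtracking loop.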
Subjects: Optimization and Control (math.OC)
Cite as: arXiv:2405.03414 [math.OC]
  (or arXiv:2405.03414v1 [math.OC] for this version)

Submission history

From: Reza Rahimi Baghbadorani [view email]
[v1] Mon, 6 May 2024 12:28:42 GMT (2825kb,D)