Mathematics > Optimization and Control

Title: Inexact subgradient methods for semialgebraic functions

Authors: Jérôme Bolte (TSE-R), Tam Le (UGA, LJK), Éric Moulines (CMAP, MBZUAI), Edouard Pauwels (TSE-R, IUF)
Abstract: Motivated by the widespread use of approximate derivatives in machine learning and optimization, we study inexact subgradient methods with non-vanishing additive errors and step sizes. In the nonconvex semialgebraic setting, under boundedness assumptions, we prove that the method provides points that eventually fluctuate close to the critical set at a distance proportional to $\epsilon^\rho$ where $\epsilon$ is the error in subgradient evaluation and $\rho$ relates to the geometry of the problem. In the convex setting, we provide complexity results for the averaged values. We also obtain byproducts of independent interest, such as descent-like lemmas for nonsmooth nonconvex problems and some results on the limit of affine interpolants of differential inclusions.
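To make the setting concrete, here is a minimal numerical sketch (not the paper's algorithm, and independent of its analysis) of a subgradient method with a non-vanishing additive oracle error on the semialgebraic function $f(x) = |x|$; the function, step size, and noise model are all illustrative assumptions. The late iterates do not converge but fluctuate near the critical point at a distance governed by the error level and the step size.

```python
import numpy as np

def subgradient_abs(x):
    # A subgradient of f(x) = |x|; any value in [-1, 1] is valid at x = 0.
    return np.sign(x) if x != 0 else 0.0

def inexact_subgradient_method(x0, step=0.01, eps=0.05, iters=2000, seed=0):
    # Inexact oracle: the true subgradient plus a bounded additive error
    # of magnitude at most eps that does NOT vanish along the iterations.
    rng = np.random.default_rng(seed)
    x = x0
    traj = []
    for _ in range(iters):
        g = subgradient_abs(x) + eps * rng.uniform(-1.0, 1.0)
        x = x - step * g
        traj.append(x)
    return np.array(traj)

traj = inexact_subgradient_method(x0=1.0)
# The tail of the trajectory stays in a small neighborhood of the
# critical point x* = 0 rather than converging to it exactly.
tail = traj[-500:]
print(np.max(np.abs(tail)))
```

With a constant step size the iterates first descend toward the critical point and then oscillate around it; shrinking either the step size or the oracle error tightens the band in which they fluctuate, loosely mirroring the $\epsilon^\rho$-type proximity described in the abstract.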
Subjects: Optimization and Control (math.OC); Machine Learning (stat.ML)
Cite as: arXiv:2404.19517 [math.OC]
  (or arXiv:2404.19517v1 [math.OC] for this version)

Submission history

From: Tam Le [view email]
[v1] Tue, 30 Apr 2024 12:47:42 GMT (36kb)
