Computer Science > Machine Learning

Title: High-probability Convergence Bounds for Nonlinear Stochastic Gradient Descent Under Heavy-tailed Noise

Abstract: Several recent works have studied the convergence in high probability of stochastic gradient descent (SGD) and its clipped variant. Compared to vanilla SGD, clipped SGD is practically more stable and has the additional theoretical benefit of logarithmic dependence on the failure probability. However, the convergence of other practical nonlinear variants of SGD, e.g., sign SGD, quantized SGD, and normalized SGD, which achieve improved communication efficiency or accelerated convergence, is much less understood. In this work, we study high-probability convergence bounds for a broad class of nonlinear SGD methods. For strongly convex loss functions with Lipschitz continuous gradients, we prove a logarithmic dependence on the failure probability, even when the noise is heavy-tailed. Strictly more general than the results for clipped SGD, our results hold for any nonlinearity with bounded (component-wise or joint) outputs, such as clipping, normalization, and quantization. Further, existing results under heavy-tailed noise assume bounded $\eta$-th central moments, with $\eta \in (1,2]$. In contrast, our refined analysis works even for $\eta = 1$, strictly relaxing the noise moment assumptions in the literature.
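
The following is a minimal sketch, not the paper's exact algorithm, of the kind of nonlinear SGD update the abstract describes: x_{t+1} = x_t - alpha_t * Phi(g_t), where Phi is a nonlinearity with bounded (component-wise or joint) output such as clipping, sign, or normalization. The quadratic objective, clipping threshold, step-size schedule, and Cauchy noise below are illustrative assumptions, chosen only to show a heavy-tailed setting where the second noise moment need not exist.

import numpy as np

def clip(g, tau=1.0):
    """Joint (norm) clipping: output is bounded, ||Phi(g)|| <= tau. Threshold tau is illustrative."""
    norm = np.linalg.norm(g)
    return g if norm <= tau else (tau / norm) * g

def sign(g):
    """Component-wise sign nonlinearity: each output component lies in {-1, 0, 1}."""
    return np.sign(g)

def normalize(g, eps=1e-12):
    """Normalized SGD: output has (at most) unit norm, i.e., jointly bounded."""
    return g / (np.linalg.norm(g) + eps)

def nonlinear_sgd_step(x, stoch_grad, nonlinearity, step_size):
    """Generic nonlinear SGD update: x <- x - alpha * Phi(g)."""
    return x - step_size * nonlinearity(stoch_grad(x))

# Example: strongly convex quadratic with heavy-tailed (Cauchy) gradient noise.
rng = np.random.default_rng(0)
A = np.diag([1.0, 4.0])
x_star = np.array([1.0, -2.0])

def stoch_grad(x):
    # True gradient of 0.5*(x - x_star)^T A (x - x_star) plus Cauchy noise,
    # whose variance (and mean) does not exist -- a heavy-tailed regime.
    return A @ (x - x_star) + rng.standard_cauchy(size=2)

x = np.zeros(2)
for t in range(1, 10_001):
    x = nonlinear_sgd_step(x, stoch_grad, clip, step_size=1.0 / t)
print("distance to optimum:", np.linalg.norm(x - x_star))

Swapping clip for sign or normalize in the loop gives the other bounded nonlinearities named in the abstract; the bounded output is what keeps the update stable despite the heavy-tailed noise.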
Comments: 27 pages, 3 figures
Subjects: Machine Learning (cs.LG); Optimization and Control (math.OC); Statistics Theory (math.ST); Machine Learning (stat.ML)
Cite as: arXiv:2310.18784 [cs.LG]
  (or arXiv:2310.18784v2 [cs.LG] for this version)

Submission history

From: Aleksandar Armacki [view email]
[v1] Sat, 28 Oct 2023 18:53:41 GMT (33kb)
[v2] Mon, 13 Nov 2023 20:07:45 GMT (293kb,D)
[v3] Mon, 4 Dec 2023 20:45:39 GMT (406kb,D)
[v4] Thu, 18 Apr 2024 03:53:19 GMT (412kb,D)
[v5] Fri, 19 Apr 2024 04:39:46 GMT (411kb,D)
[v6] Tue, 23 Apr 2024 20:22:30 GMT (412kb,D)
[v7] Wed, 1 May 2024 02:30:23 GMT (412kb,D)
