
Title: A Full Adagrad algorithm with O(Nd) operations

Authors: Antoine Godichon-Baggioni (LPSM (UMR 8001)), Wei Lu (LMI), Bruno Portier (LMI)
Abstract: A novel approach is proposed to overcome the computational challenges of the full-matrix Adaptive Gradient algorithm (Full AdaGrad) in stochastic optimization. By developing a recursive method that estimates the inverse of the square root of the covariance of the gradient, together with a streaming variant for the parameter updates, the study provides efficient and practical algorithms for large-scale applications. This strategy significantly reduces the complexity and resource demands typically associated with full-matrix methods, enabling more effective optimization. Moreover, the convergence rates of the proposed estimators and their asymptotic efficiency are established, and their effectiveness is demonstrated through numerical studies.
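The general idea in the abstract — precondition a stochastic gradient step with a recursively estimated inverse square root of the gradient covariance — can be illustrated with a minimal sketch. This is not the paper's algorithm: the actual update rule, step-size schedules, and the streaming construction that achieves O(Nd) total cost are the paper's contribution and differ from what follows (the sketch below costs O(d^2) per iteration). All function names, step-size exponents, and the fixed-point update are illustrative assumptions; the only grounded idea is that a matrix A_n tracking S^{-1/2}, with S = E[g g^T], is used to precondition the iterates.

```python
import numpy as np

def full_adagrad_sketch(grad, theta0, n_steps, lr=0.1, gamma=0.05):
    """Illustrative Full-AdaGrad-style loop (NOT the paper's exact scheme).

    Maintains a symmetric matrix A_n intended to track S^{-1/2}, where
    S = E[g g^T] is the gradient covariance, and uses it to precondition
    a decreasing-step stochastic gradient iteration.  The step-size
    exponents below are guesses chosen for the toy example, not values
    taken from the paper.
    """
    rng = np.random.default_rng(0)
    d = theta0.size
    theta = theta0.astype(float).copy()
    A = np.eye(d)  # running estimate of S^{-1/2}
    for n in range(1, n_steps + 1):
        g = grad(theta, rng)
        # Stochastic fixed-point step: the expected drift vanishes when
        # A S A = I, i.e. A = S^{-1/2}; the update keeps A symmetric.
        Ag = A @ g
        A += (gamma / n ** 0.75) * (np.eye(d) - np.outer(Ag, Ag))
        # Preconditioned stochastic gradient step.
        theta -= (lr / n ** 0.66) * (A @ g)
    return theta, A

# Toy usage: noisy gradients of f(theta) = 0.5 * ||theta - target||^2.
target = np.array([1.0, -1.0, 0.5])

def noisy_grad(theta, rng):
    return (theta - target) + 0.1 * rng.standard_normal(theta.size)

theta_hat, A_hat = full_adagrad_sketch(noisy_grad, np.zeros(3), 2000)
print(np.round(theta_hat, 2))  # should be close to target
```

Note that each iteration above performs matrix-vector products, i.e. O(d^2) work; the paper's streaming variant is precisely what brings the overall cost down to O(Nd) operations.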
Subjects: Statistics Theory (math.ST); Machine Learning (stat.ML)
Cite as: arXiv:2405.01908 [math.ST]
  (or arXiv:2405.01908v1 [math.ST] for this version)

Submission history

From: Antoine Godichon-Baggioni [view email]
[v1] Fri, 3 May 2024 08:02:08 GMT (1955kb,D)
