Statistics > Machine Learning

Title: A Majorization-Minimization Gauss-Newton Method for 1-Bit Matrix Completion

Abstract: In 1-bit matrix completion, the aim is to estimate an underlying low-rank matrix from a partial set of binary observations. We propose a novel method for 1-bit matrix completion called MMGN. Our method is based on the majorization-minimization (MM) principle, which in our setting yields a sequence of standard low-rank matrix completion problems. We solve each of these sub-problems by a factorization approach that explicitly enforces the assumed low-rank structure and then apply a Gauss-Newton method. Our numerical studies and an application to a real-data example show that, compared with existing methods, MMGN produces comparable, if not more accurate, estimates, is often significantly faster, and is less sensitive to the spikiness of the underlying matrix.
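
For readers who want a concrete picture of the approach described in the abstract, below is a minimal illustrative sketch in Python (NumPy) of an MM outer loop for 1-bit matrix completion, assuming a logistic link and +1/-1 observations. Each outer iteration majorizes the logistic negative log-likelihood by a quadratic surrogate (second-derivative bound 1/4), which reduces the update to a standard least-squares matrix completion sub-problem over a factorization M = U V^T. The inner solver here uses a few alternating least-squares sweeps as a simple stand-in for the paper's Gauss-Newton step; all function names, defaults, and the choice of link are assumptions for illustration, not the authors' MMGN implementation.

import numpy as np

def sigmoid(x):
    # Logistic link
    return 1.0 / (1.0 + np.exp(-x))

def mm_1bit_completion(Y, mask, rank, n_outer=50, n_inner=3, seed=0):
    """Illustrative sketch (not the authors' MMGN code).

    Estimate a low-rank matrix from binary observations Y (+1/-1) on the
    Boolean array `mask`. Each outer iteration majorizes the logistic
    negative log-likelihood by a quadratic surrogate, turning the update
    into a least-squares matrix completion problem solved over M = U V^T.
    """
    rng = np.random.default_rng(seed)
    n, m = Y.shape
    U = 0.1 * rng.standard_normal((n, rank))
    V = 0.1 * rng.standard_normal((m, rank))
    for _ in range(n_outer):
        M = U @ V.T
        # Gradient of -log sigmoid(Y * M) on observed entries: -Y * sigmoid(-Y * M)
        G = np.where(mask, -Y * sigmoid(-Y * M), 0.0)
        # Quadratic majorizer (curvature 1/4) => least-squares target on observed entries
        Z = np.where(mask, M - 4.0 * G, 0.0)
        # Inner solver: alternating least squares on observed entries,
        # used here as a stand-in for the paper's Gauss-Newton step.
        for _ in range(n_inner):
            for i in range(n):
                idx = mask[i]
                if idx.any():
                    U[i] = np.linalg.lstsq(V[idx], Z[i, idx], rcond=None)[0]
            for j in range(m):
                idx = mask[:, j]
                if idx.any():
                    V[j] = np.linalg.lstsq(U[idx], Z[idx, j], rcond=None)[0]
    return U @ V.T

The key step this sketch tries to convey is the majorization: bounding the logistic curvature by 1/4 makes each surrogate a least-squares fit to Z on the observed entries, i.e. a standard low-rank matrix completion problem. MMGN then solves that sub-problem with a Gauss-Newton method on the factored parameterization, which is where it departs from the alternating least-squares stand-in used above.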
Comments: 33 pages, 9 figures
Subjects: Machine Learning (stat.ML); Machine Learning (cs.LG)
Cite as: arXiv:2304.13940 [stat.ML]
  (or arXiv:2304.13940v1 [stat.ML] for this version)

Submission history

From: Xiaoqian Liu
[v1] Thu, 27 Apr 2023 03:16:52 GMT (2038kb,D)
[v2] Tue, 23 Apr 2024 02:10:25 GMT (199kb,D)
