Mathematics > Optimization and Control
Title: Explicit Second-Order Min-Max Optimization Methods with Optimal Convergence Guarantee
(Submitted on 23 Oct 2022 (v1), revised 6 Sep 2023 (this version, v3), latest version 23 Apr 2024 (v4))
Abstract: We propose and analyze exact and inexact regularized Newton-type methods for finding a global saddle point of \emph{convex-concave} unconstrained min-max optimization problems. Compared to first-order methods, our understanding of second-order methods for min-max optimization is relatively limited, as obtaining global rates of convergence with second-order information is much more involved. In this paper, we examine how second-order information can be used to speed up extra-gradient methods, even under inexactness. Specifically, we show that the proposed algorithms generate iterates that remain within a bounded set and the averaged iterates converge to an $\epsilon$-saddle point within $O(\epsilon^{-2/3})$ iterations in terms of a restricted gap function. Our algorithms match the theoretically established lower bound in this context and our analysis provides a simple and intuitive convergence analysis for second-order methods without any boundedness requirements. Finally, we present a series of numerical experiments on synthetic and real data that demonstrate the efficiency of the proposed algorithms.
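To make the setting concrete, below is a minimal sketch of the plain (first-order) extra-gradient method that the abstract's second-order methods build on, applied to a toy bilinear convex-concave saddle problem $\min_x \max_y x^\top A y$. This is an illustration of the baseline scheme only, not the paper's regularized Newton-type algorithm; the matrix, step size, and iteration count are arbitrary choices for this example.

```python
import numpy as np

# Toy convex-concave problem: f(x, y) = x^T A y, with its unique
# saddle point at x = y = 0. A is an arbitrary illustrative choice.
A = np.diag([1.0, 2.0, 3.0])

def grad_x(x, y):
    return A @ y       # partial derivative of f with respect to x

def grad_y(x, y):
    return A.T @ x     # partial derivative of f with respect to y

x = np.array([1.0, -1.0, 0.5])
y = np.array([0.5, 1.0, -1.0])
eta = 0.3  # step size chosen small enough for this toy problem

for _ in range(1000):
    # Extrapolation (half) step: gradient descent in x, ascent in y.
    x_half = x - eta * grad_x(x, y)
    y_half = y + eta * grad_y(x, y)
    # Update step: re-evaluate the gradients at the extrapolated point.
    x = x - eta * grad_x(x_half, y_half)
    y = y + eta * grad_y(x_half, y_half)

print(np.linalg.norm(x), np.linalg.norm(y))  # both shrink toward 0
```

The paper's contribution can be read against this template: replacing the gradient steps with (possibly inexact) regularized Newton-type steps improves the gap-function rate from the first-order $O(\epsilon^{-1})$ to the optimal $O(\epsilon^{-2/3})$.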
Submission history
From: Tianyi Lin [view email][v1] Sun, 23 Oct 2022 21:24:37 GMT (171kb)
[v2] Sun, 20 Nov 2022 02:29:15 GMT (172kb)
[v3] Wed, 6 Sep 2023 02:02:19 GMT (138kb)
[v4] Tue, 23 Apr 2024 15:40:29 GMT (139kb)