
Title: Nonlinear model reduction for operator learning

Abstract: Operator learning provides methods to approximate mappings between infinite-dimensional function spaces. Deep operator networks (DeepONets) are a notable architecture in this field. Recently, POD-DeepONet, an extension of DeepONet that combines proper orthogonal decomposition (POD) model reduction with neural networks, has outperformed other architectures in accuracy on several benchmark tests. We extend this idea to nonlinear model order reduction by proposing an efficient framework that combines neural networks with kernel principal component analysis (KPCA) for operator learning. Our results demonstrate the superior performance of KPCA-DeepONet over POD-DeepONet.
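
The sketch below illustrates the general idea described in the abstract (nonlinear model reduction of output functions via kernel PCA, plus a learned map from input-function samples to the latent coefficients). It is not the authors' implementation: the synthetic data, the RBF kernel, the use of scikit-learn's KernelPCA pre-image reconstruction, and the MLP standing in for the branch network are all illustrative assumptions.

# Minimal sketch (assumptions noted above), not the paper's actual KPCA-DeepONet code.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical data: paired input/output functions sampled on fixed grids.
n_samples, n_in, n_out, n_latent = 500, 64, 128, 8
U = rng.standard_normal((n_samples, n_in))             # input-function sensor values
S = np.tanh(U @ rng.standard_normal((n_in, n_out)))    # output-function values on the query grid

# 1) Nonlinear model reduction: kernel PCA of the output snapshots
#    (replacing the linear POD basis used by POD-DeepONet).
kpca = KernelPCA(n_components=n_latent, kernel="rbf",
                 fit_inverse_transform=True, alpha=1e-3)
Z = kpca.fit_transform(S)                              # latent coefficients per output function

# 2) Learn the map from input-function samples to latent KPCA coefficients
#    (a simple MLP stands in for the branch network; architecture is illustrative only).
branch = MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=2000, random_state=0)
branch.fit(U, Z)

# 3) Prediction: latent coefficients -> output function via the kernel pre-image map.
Z_pred = branch.predict(U[:10])
S_pred = kpca.inverse_transform(Z_pred)                # approximate reconstruction in function space

rel_err = np.linalg.norm(S_pred - S[:10]) / np.linalg.norm(S[:10])
print(f"relative L2 reconstruction error on training samples: {rel_err:.3e}")

In this toy setup the kernel pre-image map plays the role of the nonlinear reconstruction step; how KPCA-DeepONet actually reconstructs outputs from the latent space is described in the paper, not here.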
Comments: Published as a Tiny Paper at ICLR 2024 (Notable)
Subjects: Machine Learning (cs.LG); Numerical Analysis (math.NA)
Cite as: arXiv:2403.18735 [cs.LG]
  (or arXiv:2403.18735v1 [cs.LG] for this version)

Submission history

From: Hamidreza Eivazi [view email]
[v1] Wed, 27 Mar 2024 16:24:26 GMT (710kb,D)
