Condensed Matter > Statistical Mechanics

Title: Universality of neural dynamics on complex networks

Abstract: This paper discusses the capacity of graph neural networks to learn the functional form of ordinary differential equations that govern dynamics on complex networks. We propose the necessary elements for such a problem: inductive biases, a neural network architecture, and a learning task. Statistical learning theory suggests that the generalisation power of neural networks relies on the training and testing data being independent and identically distributed (i.i.d.). Although this assumption, together with an appropriate neural architecture and learning mechanism, is sufficient for accurate out-of-sample predictions of dynamics such as mass-action kinetics, by studying out-of-distribution generalisation in the case of diffusion dynamics we find that the neural network model: (i) has a generalisation capacity that depends on the first moment of the initial-value data distribution; (ii) implicitly learns the non-dissipative nature of the dynamics; and (iii) has an accuracy resolution limit of order $\mathcal{O}(1/\sqrt{n})$ for a system of size $n$.
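
A minimal sketch (not the authors' code) of the setup the abstract describes, assuming PyTorch: a small message-passing network parameterises the unknown ODE right-hand side, is trained on trajectories of linear diffusion $\dot{x} = -Lx$ with i.i.d. uniform initial values, and is then probed on initial values whose first moment lies outside the training distribution. The architecture, graph model, and all hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# --- Ground-truth diffusion dynamics on a random graph: dx/dt = -L x ---
n = 50                                    # number of nodes (assumed)
A = (torch.rand(n, n) < 0.1).float()      # Erdos-Renyi adjacency, p = 0.1 (assumed)
A = torch.triu(A, 1)
A = A + A.T                               # symmetrise, no self-loops
L = torch.diag(A.sum(1)) - A              # graph Laplacian

def diffusion_rhs(x):
    return -L @ x                         # true right-hand side

# --- Message-passing surrogate for the unknown RHS (hypothetical architecture):
# each node's derivative is a learned function of its own state and the sum
# of its neighbours' states.
class GraphODEFunc(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.phi = nn.Sequential(nn.Linear(2, hidden), nn.Tanh(),
                                 nn.Linear(hidden, 1))

    def forward(self, x):                 # x: (n, 1) node states
        agg = A @ x                       # aggregate neighbour states
        return self.phi(torch.cat([x, agg], dim=1))

def euler_rollout(rhs, x0, dt=0.01, steps=50):
    xs, x = [x0], x0
    for _ in range(steps):
        x = x + dt * rhs(x)               # explicit Euler step
        xs.append(x)
    return torch.stack(xs)                # (steps + 1, n, 1) trajectory

# --- Train on trajectories from i.i.d. initial conditions ---
func = GraphODEFunc()
opt = torch.optim.Adam(func.parameters(), lr=1e-3)
for step in range(500):
    x0 = torch.rand(n, 1)                 # initial values ~ U(0, 1)
    with torch.no_grad():
        target = euler_rollout(diffusion_rhs, x0)
    pred = euler_rollout(func, x0)
    loss = ((pred - target) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Out-of-distribution probe: shift the first moment of the initial values
x0_ood = torch.rand(n, 1) + 2.0           # mean shifted from 0.5 to 2.5
err = (euler_rollout(func, x0_ood)
       - euler_rollout(diffusion_rhs, x0_ood)).abs().mean()
print(f"mean OOD trajectory error: {err:.4f}")
```

In this sketch the training distribution fixes the first moment of the initial values, so the final print lets one observe how prediction error grows once that moment is shifted, in the spirit of finding (i) of the abstract.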
Subjects: Statistical Mechanics (cond-mat.stat-mech); Machine Learning (cs.LG); Social and Information Networks (cs.SI); Machine Learning (stat.ML)
Cite as: arXiv:2301.04900 [cond-mat.stat-mech]
  (or arXiv:2301.04900v1 [cond-mat.stat-mech] for this version)

Submission history

From: Vaiva Vasiliauskaite [view email]
[v1] Thu, 12 Jan 2023 09:44:59 GMT (164kb,D)
[v2] Tue, 15 Aug 2023 15:59:01 GMT (961kb,D)
[v3] Tue, 17 Oct 2023 09:09:53 GMT (864kb,D)
[v4] Wed, 24 Apr 2024 19:21:05 GMT (1015kb,D)
