Computer Science > Machine Learning

Title: Improving Subgraph-GNNs via Edge-Level Ego-Network Encodings

Abstract: We present a novel edge-level ego-network encoding for learning on graphs that can boost Message Passing Graph Neural Networks (MP-GNNs) by providing additional node and edge features or by extending message-passing formats. The proposed encoding is sufficient to distinguish Strongly Regular Graphs, a family of challenging 3-WL-equivalent graphs. We show theoretically that such an encoding is more expressive than node-based subgraph MP-GNNs. In an empirical evaluation on four benchmarks comprising 10 graph datasets, our results match or improve previous baselines on expressivity, graph classification, graph regression, and proximity tasks -- while reducing memory usage by 18.1x in certain real-world settings.
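The abstract contrasts edge-level with node-based subgraph extraction. As a rough illustration of the edge-level idea only (the paper's exact construction and encoding may differ), one can take the subgraph induced by the union of an edge's two endpoints' k-hop neighborhoods; the hypothetical helpers below are not from the paper:

```python
# Hypothetical sketch: an edge-level ego-network taken as the subgraph induced
# by the union of the two endpoints' k-hop neighborhoods. This illustrates the
# general idea only; the paper's actual construction may differ.
from collections import deque

def k_hop(adj, source, k):
    """Nodes within k hops of `source` (BFS over an adjacency dict)."""
    seen, frontier = {source}, deque([(source, 0)])
    while frontier:
        node, dist = frontier.popleft()
        if dist == k:
            continue
        for nb in adj[node]:
            if nb not in seen:
                seen.add(nb)
                frontier.append((nb, dist + 1))
    return seen

def edge_ego_network(adj, u, v, k=1):
    """Induced subgraph around edge (u, v): node set and internal edges."""
    nodes = k_hop(adj, u, k) | k_hop(adj, v, k)
    edges = {(a, b) for a in nodes for b in adj[a] if b in nodes and a < b}
    return nodes, edges

# Example: a 6-cycle 0-1-2-3-4-5-0.
adj = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
nodes, edges = edge_ego_network(adj, 0, 1, k=1)
print(sorted(nodes))  # [0, 1, 2, 5]
```

For the 6-cycle, the edge-level ego-network of edge (0, 1) at k=1 covers both endpoints' neighborhoods at once, which is what distinguishes it from extracting a separate ego-network per node.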
Comments: TMLR, graph neural networks, Weisfeiler-Lehman, expressivity, higher-order GNNs, 3-WL, 1-WL, edge-level, ego-networks
Subjects: Machine Learning (cs.LG); Artificial Intelligence (cs.AI)
Journal reference: Nurudin Alvarez-Gonzalez, Andreas Kaltenbrunner, Vicenç Gómez. Improving Subgraph-GNNs via Edge-Level Ego-Network Encodings. In Transactions on Machine Learning Research, 2024
Cite as: arXiv:2312.05905 [cs.LG]
  (or arXiv:2312.05905v2 [cs.LG] for this version)

Submission history

From: Francisco Nurudin Alvarez Gonzalez [view email]
[v1] Sun, 10 Dec 2023 15:05:23 GMT (618kb,D)
[v2] Thu, 2 May 2024 12:18:43 GMT (654kb,D)
