
Title: A Survey on Self-Supervised Pre-Training of Graph Foundation Models: A Knowledge-Based Perspective

Abstract: Graph self-supervised learning has become a go-to method for pre-training graph foundation models, including graph neural networks, graph transformers, and more recent large language model (LLM)-based graph models. A wide variety of knowledge patterns are embedded in the structure and properties of graphs and can be exploited for pre-training, yet a systematic overview of self-supervised pre-training tasks from the perspective of graph knowledge has been lacking. In this paper, we comprehensively survey and analyze the pre-training tasks of graph foundation models from a knowledge-based perspective, covering both microscopic knowledge (nodes, links, etc.) and macroscopic knowledge (clusters, global structure, etc.). The survey spans a total of 9 knowledge categories and 25 pre-training tasks, as well as various downstream task adaptation strategies. Furthermore, an extensive list of the related papers with detailed metadata is provided at this https URL
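To make the notion of a knowledge-based pretext task concrete, below is a minimal sketch (not taken from the survey itself) of one microscopic, link-level pre-training task of the kind such surveys catalog: masked link prediction. It assumes plain PyTorch; the toy GCN encoder, dimensions, and loss choice are illustrative assumptions, not the paper's method.

```python
# Sketch of a link-level self-supervised pretext task: hide an edge,
# then train a graph encoder to score the held-out (positive) pair
# above random (negative) pairs via a dot-product decoder.
# All names and hyperparameters here are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyGCN(nn.Module):
    """Two-layer GCN-style encoder on a dense normalized adjacency."""
    def __init__(self, in_dim, hid_dim, out_dim):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hid_dim)
        self.w2 = nn.Linear(hid_dim, out_dim)

    def forward(self, adj, x):
        h = F.relu(self.w1(adj @ x))   # one round of neighbor aggregation
        return self.w2(adj @ h)        # second round, no activation

def normalize_adj(adj):
    """Symmetric normalization D^{-1/2} (A + I) D^{-1/2}."""
    adj = adj + torch.eye(adj.size(0))
    d_inv_sqrt = adj.sum(dim=1).pow(-0.5)
    return d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]

# Toy graph: 6 nodes on a ring, random features.
edges = torch.tensor([[0, 1], [1, 2], [2, 3], [3, 4], [4, 5], [5, 0]])
n, feat_dim = 6, 8
x = torch.randn(n, feat_dim)
adj = torch.zeros(n, n)
adj[edges[:, 0], edges[:, 1]] = 1.0
adj[edges[:, 1], edges[:, 0]] = 1.0

# Mask one edge so the encoder never sees it during message passing.
masked = edges[0]
adj_masked = adj.clone()
adj_masked[masked[0], masked[1]] = 0.0
adj_masked[masked[1], masked[0]] = 0.0
adj_norm = normalize_adj(adj_masked)

model = TinyGCN(feat_dim, 16, 16)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for step in range(100):
    z = model(adj_norm, x)
    pos = (z[masked[0]] * z[masked[1]]).sum()  # score of held-out edge
    neg_pair = torch.randint(0, n, (2,))       # random pair; a sketch
    neg = (z[neg_pair[0]] * z[neg_pair[1]]).sum()  # (may rarely hit a real edge)
    # Binary cross-entropy: positive pair -> 1, negative pair -> 0.
    loss = F.binary_cross_entropy_with_logits(
        torch.stack([pos, neg]), torch.tensor([1.0, 0.0]))
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The same encoder-plus-pretext-loss pattern carries over to the other task families the survey organizes: node-level tasks swap the dot-product edge decoder for a feature-reconstruction head, and macroscopic tasks score cluster or whole-graph summaries instead of node pairs.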
Comments: Work in progress
Subjects: Machine Learning (cs.LG); Social and Information Networks (cs.SI)
Cite as: arXiv:2403.16137 [cs.LG]
  (or arXiv:2403.16137v1 [cs.LG] for this version)

Submission history

From: Ziwen Zhao
[v1] Sun, 24 Mar 2024 13:10:09 GMT (191kb,D)
