arxiv:2402.02464

A Graph is Worth K Words: Euclideanizing Graph using Pure Transformer

Published on Feb 4, 2024

AI-generated summary

GraphsGPT addresses the challenge of modeling non-Euclidean graphs by transforming them into learnable graph words in Euclidean space for accurate representation, generation, and manipulation.

Abstract

Can we model non-Euclidean graphs as pure language, or even as Euclidean vectors, while retaining their inherent information? The non-Euclidean property has posed a long-standing challenge in graph modeling. Although recent GNNs and Graphformers encode graphs as Euclidean vectors, recovering the original graph from those vectors remains difficult. We introduce GraphsGPT, featuring a Graph2Seq encoder that transforms non-Euclidean graphs into learnable graph words in Euclidean space, along with a GraphGPT decoder that reconstructs the original graph from the graph words to ensure information equivalence. We pretrain GraphsGPT on 100M molecules and report several interesting findings: (1) The pretrained Graph2Seq excels at graph representation learning, achieving state-of-the-art results on 8 of 9 graph classification and regression tasks. (2) The pretrained GraphGPT serves as a strong graph generator, demonstrated by its ability to perform both unconditional and conditional graph generation. (3) Graph2Seq+GraphGPT enables effective graph mixup in Euclidean space, overcoming the previously known non-Euclidean challenge. (4) Our proposed edge-centric GPT pretraining task proves effective for graphs, underscoring its success in both representation and generation.
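
For a concrete picture of the pipeline the abstract describes, below is a minimal, hypothetical PyTorch sketch (not the authors' released code): a Graph2Seq-style encoder compresses a flattened graph token sequence into K Euclidean "graph word" vectors, a GPT-style decoder reconstructs edge tokens from them, and graph mixup reduces to interpolating graph words. All module names, dimensions, and the edge-token vocabulary are illustrative assumptions.

import torch
import torch.nn as nn

class Graph2Seq(nn.Module):
    # Encode a flattened graph token sequence into K Euclidean "graph words".
    def __init__(self, vocab_size=512, d_model=256, k_words=8, n_layers=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # K learnable query vectors whose final states become the graph words.
        self.word_queries = nn.Parameter(torch.randn(k_words, d_model))
        layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, graph_tokens):                  # (batch, seq_len) token ids
        x = self.embed(graph_tokens)                  # (batch, seq_len, d_model)
        q = self.word_queries.expand(x.size(0), -1, -1)
        h = self.encoder(torch.cat([q, x], dim=1))    # prepend queries, encode jointly
        return h[:, :q.size(1)]                       # (batch, K, d_model) graph words

class GraphGPTDecoder(nn.Module):
    # Autoregressively reconstruct edge tokens conditioned on the graph words.
    def __init__(self, vocab_size=512, d_model=256, n_layers=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerDecoderLayer(d_model, nhead=8, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, n_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, prev_tokens, graph_words):      # teacher-forced training step
        tgt = self.embed(prev_tokens)
        causal = torch.triu(torch.full((tgt.size(1), tgt.size(1)), float("-inf")), diagonal=1)
        h = self.decoder(tgt, graph_words, tgt_mask=causal)
        return self.head(h)                           # logits over the next edge token

# Under these assumptions, graph mixup is plain interpolation in the graph-word space:
#   words = lam * encoder(tokens_a) + (1 - lam) * encoder(tokens_b), then decode a new graph.

The learnable queries here play the role of the paper's "graph words", and the next-edge-token objective mirrors the edge-centric GPT pretraining mentioned in the abstract; the paper's actual tokenization, losses, and architecture may differ.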

