Inductive Representation Learning on Large Graphs (GraphSAGE)
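GraphSAGE generates node embeddings by aggregating features from a node's neighborhood and combining them with the node's own features. The core update with a mean aggregator can be sketched in a few lines of NumPy; the function and parameter names below are illustrative and not part of this repo's API.

```python
import numpy as np

def sage_mean_layer(x, adj, w_self, w_neigh):
    """One GraphSAGE layer with mean aggregation (illustrative sketch).

    x       : (N, F) node feature matrix
    adj     : (N, N) binary adjacency matrix (no self-loops)
    w_self  : (F, D) weight applied to the node's own features
    w_neigh : (F, D) weight applied to the aggregated neighbor features
    """
    # Mean of neighbor features; guard against isolated nodes.
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1
    neigh = (adj @ x) / deg
    # Combine self and neighbor representations.
    h = x @ w_self + neigh @ w_neigh
    # L2-normalize each node embedding, as in the paper.
    norm = np.linalg.norm(h, axis=1, keepdims=True)
    norm[norm == 0] = 1
    return h / norm
```

Stacking two such layers (with a nonlinearity in between) gives each node a receptive field of its 2-hop neighborhood, which is what the full-graph training scripts below train end-to-end.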

Results

Full graph training

Run with the following command (available datasets: "cora", "citeseer", "pubmed"):

python train_full.py --dataset cora     # full graph

To run with a specific backend (available datasets: "cora", "citeseer", "pubmed"):

# use tensorflow backend
TL_BACKEND=tensorflow python train_full.py --dataset cora --lr 0.01 --hidden_dim 128 --drop_rate 0.7 --n_epoch 500
TL_BACKEND=tensorflow python reddit_sage.py --lr 0.0005 --hidden_dim 256 --drop_rate 0.8
# use paddle backend
TL_BACKEND=paddle python train_full.py --dataset cora --n_epoch 500 --lr 0.005 --hidden_dim 512 --drop_rate 0.7
CUDA_VISIBLE_DEVICES=5 TL_BACKEND=paddle python reddit_sage.py --lr 0.001 --hidden_dim 128 --drop_rate 0.8
# use pytorch backend
TL_BACKEND=torch python train_full.py --dataset cora --n_epoch 500 --lr 0.005 --hidden_dim 512 --drop_rate 0.8
TL_BACKEND=torch python reddit_sage.py --lr 0.005 --hidden_dim 128 --drop_rate 0.8
| Dataset | Cora | Reddit |
| --- | --- | --- |
| DGL | 83.3 | 94.95 |
| Paper | 83.3 | 95.0 |
| GammaGL(tf) | 82.44 ± 0.88 | 95.0 |
| GammaGL(th) | 81.13 ± 1.08 | 94.9 |
| GammaGL(pd) | 82.04 ± 0.33 | 91.2 |
| GammaGL(ms) | --.- | --.- |
  • We failed to reproduce the reported accuracy on 'Cora', even with DGL's code.
  • The reported model performance is the average of 5 runs.