Official implementation of GRPE. Our model achieves the second-best result on the PCQM4Mv2 dataset of the OGB-LSC leaderboard.
```bash
conda env create --file environment.yaml
conda activate chemprop
```
Download the pretrained weights from https://drive.google.com/drive/folders/1Oc3Ox0HAoJ5Hrihfp5-jFvStPIfFQAf9?usp=sharing and place them in a folder named `pretrained_weight`.
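A minimal shell sketch of the setup step above, assuming the weight files are downloaded manually from the Drive link and the repository root is the working directory:

```shell
# Create the folder the code expects (name taken from the instructions above);
# move the downloaded weight files into it.
mkdir -p pretrained_weight
```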
Please check `{dataset-name}.sh` for the detailed commands to reproduce the results.
- 4 GPUs (A100, 80 GiB) are required to run the experiments for PCQM4M, PCQM4Mv2, PCBA, and HIV.
- 1 GPU is required to run the experiments for MNIST and CIFAR10.
```bash
pip install git+https://github.com/lenscloth/GRPE

# Install the PyTorch & PyTorch Geometric versions matching your environment
pip install torch==1.9.0+cu111 torchvision==0.10.0+cu111 torchaudio==0.9.0 -f https://download.pytorch.org/whl/torch_stable.html
pip install torch-scatter torch-sparse torch-cluster torch-spline-conv torch-geometric -f https://data.pyg.org/whl/torch-1.9.0+cu111.html
```
```python
from grpe.pretrained import load_pretrained_fingerprint

fingerprint_model = load_pretrained_fingerprint(cuda=True)
finger = fingerprint_model.generate_fingerprint(
    [
        "CC(=O)NCCC1=CNc2c1cc(OC)cc2",
    ],
    fingerprint_stack=5,
)  # 1x3840 PyTorch tensor
```
Please cite our work using the BibTeX entry below:
```bibtex
@inproceedings{park2022grpe,
  title={GRPE: Relative Positional Encoding for Graph Transformer},
  author={Park, Wonpyo and Chang, Woong-Gi and Lee, Donggeon and Kim, Juntae and Hwang, Seungwon},
  booktitle={ICLR2022 Machine Learning for Drug Discovery},
  year={2022}
}
```