
Is there a code for GCN as a backbone? #3

Open
ujeong1 opened this issue Oct 3, 2022 · 1 comment

Comments

ujeong1 commented Oct 3, 2022

Hi,

I've tried to use GCNConv to build a vanilla GCN model (implemented below, since no such model is available in the repo).

However, the evaluation results do not seem reproducible, and the vanilla model even scores higher in some cases. For example, when I test the vanilla GCN on REDDIT-BINARY over multiple runs, I get an average accuracy above 90% (reporting the test accuracy of the model parameters from the epoch with the best validation accuracy).

Do you have any idea why this happens?

import torch
import torch.nn.functional as F
from torch.nn import Linear
from torch_geometric.nn import GCNConv, global_add_pool, global_mean_pool

class GCN(torch.nn.Module):
    def __init__(self, num_features=1, num_classes=1, num_hidden=32):
        super(GCN, self).__init__()

        dim = num_hidden

        # Five GCN layers, each followed by batch normalization
        self.conv1 = GCNConv(num_features, dim)
        self.bn1 = torch.nn.BatchNorm1d(dim)

        self.conv2 = GCNConv(dim, dim)
        self.bn2 = torch.nn.BatchNorm1d(dim)

        self.conv3 = GCNConv(dim, dim)
        self.bn3 = torch.nn.BatchNorm1d(dim)

        self.conv4 = GCNConv(dim, dim)
        self.bn4 = torch.nn.BatchNorm1d(dim)

        self.conv5 = GCNConv(dim, dim)
        self.bn5 = torch.nn.BatchNorm1d(dim)

        # Graph-level readout followed by a two-layer classifier
        self.fc1 = Linear(dim, dim)
        self.fc2 = Linear(dim, num_classes)

    def forward(self, x, edge_index, batch):
        x = F.relu(self.conv1(x, edge_index))
        x = self.bn1(x)
        x = F.relu(self.conv2(x, edge_index))
        x = self.bn2(x)
        x = F.relu(self.conv3(x, edge_index))
        x = self.bn3(x)
        x = F.relu(self.conv4(x, edge_index))
        x = self.bn4(x)
        x = F.relu(self.conv5(x, edge_index))
        x = self.bn5(x)
        # Pool node embeddings into a single graph-level embedding
        # x = global_add_pool(x, batch)
        x = global_mean_pool(x, batch)
        x = F.relu(self.fc1(x))
        # x = F.dropout(x, p=0.5, training=self.training)
        x = self.fc2(x)
        return F.log_softmax(x, dim=-1)
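
For reference, here is a minimal sketch of the selection protocol described above (test accuracy at the epoch with the best validation accuracy); train_one_epoch and evaluate are hypothetical helpers, not code from this repo:

# Sketch of the model-selection protocol; train_one_epoch and evaluate
# are hypothetical helpers (train for one epoch / return accuracy).
best_val_acc = 0.0
test_acc_at_best_val = 0.0
for epoch in range(num_epochs):
    train_one_epoch(model, train_loader, optimizer)
    val_acc = evaluate(model, val_loader)
    if val_acc > best_val_acc:
        # Record test accuracy whenever validation accuracy improves
        best_val_acc = val_acc
        test_acc_at_best_val = evaluate(model, test_loader)
print(f"test accuracy at best validation epoch: {test_acc_at_best_val:.4f}")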
@shelbycox

Hi,

I also attempted to implement the GCN network and came up with something slightly different (I am not one of the original authors). According to the G-Mixup paper, the GCN should be implemented as follows:

Four GNN layers and global mean pooling are applied. All the hidden units [are] set to 64. The activation function is ReLU

Here is my implementation:

import torch
import torch.nn.functional as F
from torch.nn import Linear
from torch_geometric.nn import GCNConv, global_mean_pool

class GCN(torch.nn.Module):
    def __init__(self, num_features=1, num_classes=1, num_hidden=64):
        # num_hidden defaults to 64, matching the paper's description
        super(GCN, self).__init__()

        dim = num_hidden

        # Four GCN layers, no batch normalization
        self.conv1 = GCNConv(in_channels=num_features, out_channels=dim)
        self.conv2 = GCNConv(in_channels=dim, out_channels=dim)
        self.conv3 = GCNConv(in_channels=dim, out_channels=dim)
        self.conv4 = GCNConv(in_channels=dim, out_channels=dim)

        self.fc1 = Linear(dim, dim)
        self.fc2 = Linear(dim, num_classes)

    def forward(self, x, edge_index, batch):
        x = F.relu(self.conv1(x, edge_index))
        x = F.relu(self.conv2(x, edge_index))
        x = F.relu(self.conv3(x, edge_index))
        x = F.relu(self.conv4(x, edge_index))
        x = global_mean_pool(x, batch)  # global mean pooling, as the paper specifies
        x = F.relu(self.fc1(x))
        x = self.fc2(x)
        return F.log_softmax(x, dim=-1)

Basically, I copied the GIN implementation the authors provided in models.py and made the following changes:

  1. replaced “GIN” with “GCN” (see the sketch below), and
  2. removed the batch normalization.
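
To make change (1) concrete, here is roughly how one layer translates, assuming the repo's GIN follows the common PyTorch Geometric pattern of wrapping a small MLP in GINConv (a sketch, not the authors' exact code):

import torch
from torch.nn import Sequential, Linear, ReLU
from torch_geometric.nn import GINConv, GCNConv

dim, num_features = 64, 1  # example sizes, not values from the repo

# GIN layer in the common PyG pattern: GINConv wraps a small MLP,
# followed by batch normalization (the part removed in the GCN variant).
nn1 = Sequential(Linear(num_features, dim), ReLU(), Linear(dim, dim))
conv1 = GINConv(nn1)
bn1 = torch.nn.BatchNorm1d(dim)

# Corresponding GCN layer: a single graph convolution, no MLP, no batch norm.
conv1 = GCNConv(in_channels=num_features, out_channels=dim)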

With this network, G-Mixup seems to improve performance (although we’ve only tested on some small datasets like PROTEINS). It would be great if the authors could comment on this issue!
