The code's learning rate hyperparameters seem inconsistent with the paper. #18
It is! I'm sorry for not getting back to you sooner; I think you are right. As I recall, just keep the same hyperparameters as SGC and give that a try. The hyperparameters of SGC and SSGC are very similar.
liangyx ***@***.***> wrote on Saturday, September 9, 2023, 00:35:
… Hello, the learning rate for Cora, Citeseer, and Pubmed is 0.02 in the paper, but it seems to be 0.2 in the implementation. Which one is correct? In addition, could you provide the code for Reddit, or the learning rate for Reddit?
Thank you for your reply.
Sorry, I have moved from Canberra to Sydney, and the record was lost. I remember that I saved the graph convolution result first, then trained an MLP on it. I did not use a specific weight for the original raw features, so it is an average pooling over the different hops' features. Then I only tried different hyperparameters for the MLP.
liangyx ***@***.***> wrote on Wednesday, October 4, 2023, 03:18:
… Thank you for your reply. In the arxiv code you provided two versions, MLP.py and SSGC-MLP.py. From an implementation point of view, is MLP.py the actual SSGC+MLP of the paper? By the way, could you provide the reproducible hyperparameters of SSGC+MLP on the arxiv dataset?
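The precompute-then-train pipeline described in the reply above can be sketched roughly as follows. This is a hypothetical illustration, not the repository's actual code: the function name, the default depth `k`, and the plain averaging over hops with no separate weight on the raw features are all assumptions based on the maintainer's description.

```python
import numpy as np

def ssgc_features(adj_norm, x, k=16):
    """Precompute SSGC-style node features by average pooling over hops.

    adj_norm: normalized adjacency matrix, shape (n, n) -- assumed
              already symmetrically normalized with self-loops.
    x:        raw node features, shape (n, d).
    k:        propagation depth (hypothetical default).

    Returns the average of the propagated features over hops 1..k.
    An MLP would then be trained on this fixed result, as described above.
    """
    out = np.zeros_like(x, dtype=float)
    h = x.astype(float)
    for _ in range(k):
        h = adj_norm @ h   # propagate one more hop
        out += h           # accumulate this hop's features
    return out / k         # average pooling across the k hops
```

Because the propagation is decoupled from training, this function runs once as preprocessing, and only the downstream MLP's hyperparameters need tuning, which matches the workflow described in the comment.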