running time on OAG_CS dataset #30
Hi, thanks for providing the awesome code of GPT-GNN.

I am trying to run your code on the OAG_CS dataset, but I am not sure if I am doing it right. In the paper, the reported pre-training time is about 10-12 hours for 400 epochs, while it took much longer on my side. Could you specify the computational resource requirements? For example, how many CPUs do I need to achieve a pre-training time of about 10 hours? I attached the output of my run as follows.

---

Hi: From your log, it seems the bottleneck is the sampling (which is conducted on the CPU). My previous setting was 8× CPU E5-2698 v4 @ 2.20GHz. (But the machine also ran other experiments, so it's just a reference.)

---

Thank you for the quick reply, and looking forward to the efficient version :)

---

Hi, is there any progress on the efficient version?