Best hyperparameters in your paper Section 4.2.3 #19
Hi,
I am trying to reproduce the results in your paper, but I could not find the best hyperparameters in the paper or the repo.
Can you share more information on the hyperparameters for each dataset?

Comments
Thanks for your interest. Please get the latest version from GitHub. For the parameter settings, please refer to the README file; for the corresponding training logs, please refer to the log files. Thanks.
Both the README and https://github.com/xiangwang1223/knowledge_graph_attention_network/blob/master/Log/training_log_amazon-book.log use the pretrained embeddings. I'm wondering what parameters are used when training from scratch.
@rugezhao @xiangwang1223 I think this code differs from the paper, for example in the KGE loss. In the paper, the KGE loss contains W_r, but it does not appear in the code (in the code, W_r is used to calculate attention)... I am rather confused about this approach.
Please CAREFULLY check lines 194-199 in KGAT.py, where the model parameters "trans_W" are used to calculate the KGE loss, which is CONSISTENT with Equation (1) in the paper; and check line 395, where the same parameters "trans_W" are used to calculate the attention scores, which is also CONSISTENT with Equation (4) in the paper. ALL the code matches the formulation in the paper.
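To make the point above concrete, here is a minimal NumPy sketch (illustrative only, not the repository's actual TensorFlow code; function names and shapes are assumptions) of how the same relation-specific projection `trans_W` (the paper's W_r) appears in both the TransR-style plausibility score of Equation (1) and the attention score of Equation (4):

```python
import numpy as np

def transr_plausibility(e_h, e_r, e_t, W_r):
    """In the spirit of Equation (1):
    g(h, r, t) = || W_r e_h + e_r - W_r e_t ||_2^2.
    A lower score means the triple (h, r, t) is more plausible."""
    return np.sum((W_r @ e_h + e_r - W_r @ e_t) ** 2)

def attention_score(e_h, e_r, e_t, W_r):
    """In the spirit of Equation (4):
    pi(h, r, t) = (W_r e_t)^T tanh(W_r e_h + e_r).
    The same projection W_r enters here as well."""
    return (W_r @ e_t) @ np.tanh(W_r @ e_h + e_r)
```

Here `e_h` and `e_t` live in the entity space, `e_r` in the relation space, and `W_r` is the relation-specific projection matrix between them.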
Thanks for your reply! But the KGE loss is in lines 229-252, and W_r really is not in the KGE loss. If I made a mistake, please point it out, thank you! knowledge_graph_attention_network/Model/KGAT.py, lines 229 to 252 at commit 6eb71fc.
I found my mistake; thank you for your correction, @xiangwang1223.
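For readers who hit the same confusion: the projection by W_r is applied in an earlier inference step, and the loss lines then operate on the already-projected embeddings, so "trans_W" does not appear textually in the loss itself. A hedged NumPy sketch of that two-step pattern (an assumption-laden illustration, not the repository's exact code):

```python
import numpy as np

def kge_pairwise_loss(e_h, e_r, e_pos_t, e_neg_t, W_r):
    # Step 1 (cf. lines 194-199): project entities into the relation
    # space -- this is the only place W_r enters.
    h_e = W_r @ e_h
    pos_t_e = W_r @ e_pos_t
    neg_t_e = W_r @ e_neg_t
    # Step 2 (cf. lines 229-252): a BPR-style pairwise loss over the
    # projected embeddings; W_r no longer appears here, which is easy
    # to misread as it being absent from the KGE loss altogether.
    pos_score = np.sum((h_e + e_r - pos_t_e) ** 2)
    neg_score = np.sum((h_e + e_r - neg_t_e) ** 2)
    return np.log1p(np.exp(-(neg_score - pos_score)))  # softplus
```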