SEAL - Utilizing multiple edge features #17
Hi, absolutely! You may replace the graph convolution layers in SEAL with a GNN that handles edge features, such as GINE. Since there are currently no link prediction datasets with edge features, I did not add such a GNN.
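For illustration, here is a minimal pure-Python sketch of the GINE update rule (Hu et al., "Strategies for Pre-training Graph Neural Networks"): each node aggregates ReLU(h_j + e_ji) over incoming edges and combines it with (1 + eps) * h_i before an MLP. All names here are illustrative toy stand-ins; a real model would use something like torch_geometric.nn.GINEConv instead.

```python
def relu(v):
    return [max(0.0, x) for x in v]

def vec_add(a, b):
    return [x + y for x, y in zip(a, b)]

def gine_update(h, edge_index, edge_attr, eps=0.0, mlp=lambda v: v):
    """Toy GINE layer: h_i' = MLP((1 + eps) * h_i + sum_j ReLU(h_j + e_ji)).

    h: list of node feature vectors; edge_index: list of (src, dst) pairs;
    edge_attr: parallel list of edge feature vectors (same dim as node
    features). The mlp is identity here for simplicity.
    """
    agg = [[0.0] * len(h[0]) for _ in h]
    for (src, dst), e in zip(edge_index, edge_attr):
        # Message along edge src -> dst carries the edge attribute.
        agg[dst] = vec_add(agg[dst], relu(vec_add(h[src], e)))
    return [mlp(vec_add([(1 + eps) * x for x in h[i]], agg[i]))
            for i in range(len(h))]

# Two nodes, one edge 0 -> 1 with a 2-dim edge attribute.
h = [[1.0, 2.0], [0.5, -1.0]]
out = gine_update(h, [(0, 1)], [[0.0, 1.0]])
# Node 0 has no incoming edge; node 1 adds ReLU([1.0, 3.0]) to its features.
```

The key point is that the edge attribute enters the message itself (h_j + e_ji), which is why GINE-style operators can consume a full edge feature vector rather than a single scalar weight.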
Thank you very much!
I am currently implementing this and have one remaining question. Currently you compress the edge features into a single edge weight, and the k_hop_subgraph extraction method relies on that scalar weight, which cannot carry a full attribute vector.
@bits-glitch One possible solution is to create a dictionary that maps each edge (i, j) to its edge attribute vector, and feed this dict into the subgraph extraction function. Another built-in data structure is PyTorch's "hybrid sparse COO tensors" (see the PyTorch sparse documentation). They support querying a dense tensor via sparse indices and slicing along a dense dimension, so you may convert edge_attr into this format and do the querying.
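A hypothetical sketch of the dictionary approach: map each directed edge (i, j) to its attribute vector, then, after the k-hop node set has been selected, look up the attributes of the induced edges. Function names and shapes here are illustrative, not SEAL's actual API.

```python
def build_edge_attr_dict(edge_index, edge_attr):
    """edge_index: list of (src, dst) pairs; edge_attr: parallel list
    of attribute vectors for those edges."""
    return {(i, j): attr for (i, j), attr in zip(edge_index, edge_attr)}

def subgraph_edge_attrs(nodes, edge_index, attr_dict):
    """Keep only edges whose endpoints both lie in the extracted node
    set, returning the induced edge list and the matching attribute rows."""
    node_set = set(nodes)
    sub_edges, sub_attrs = [], []
    for (i, j) in edge_index:
        if i in node_set and j in node_set:
            sub_edges.append((i, j))
            sub_attrs.append(attr_dict[(i, j)])
    return sub_edges, sub_attrs

edge_index = [(0, 1), (1, 2), (2, 3)]
edge_attr = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
d = build_edge_attr_dict(edge_index, edge_attr)
# Extract the subgraph induced by nodes {0, 1, 2}; edge (2, 3) is dropped.
edges, attrs = subgraph_edge_attrs([0, 1, 2], edge_index, d)
```

The hybrid sparse COO alternative would instead store edge_attr as a sparse tensor indexed by (i, j) with a dense trailing dimension, avoiding the Python dict at the cost of a conversion step.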
Thank you for the advice, that was really helpful!
Two questions remain:
Do we need to use a different GNN operator?
Hello SEAL Team,
Thank you very much for the great implementation!
I noticed that the node features and the edge weight can be incorporated into the SEAL learning process (via --use_feature and --use_edge_weight). Handling a single edge weight seems pretty straightforward to me, but have you thought about combining richer edge features (not just one weight) in SEAL? Is there a possibility of doing this as well?