
[Enhancement] Log loss values when training #6

Open
habaneraa opened this issue Oct 12, 2023 · 2 comments

Comments

@habaneraa

When pre-training or training the GNNs, it would be better if the per-step loss values were logged. The metric scores sometimes do not change even though the loss is decreasing, so the loss values can effectively indicate the progress of self-supervised learning.

A possible solution is to add a new key-value argument (e.g. `loss=f"{loss.detach().item()}"`) when calling `logger.info(get_format_variables())` in every training script.
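For illustration, a minimal self-contained sketch of the idea in a generic PyTorch training loop. The `get_format_variables` function below is only a stand-in for the framework's own helper, whose real signature may differ, and the toy model, data, and parameter names are hypothetical:

```python
import logging

import torch

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


def get_format_variables(**kwargs) -> str:
    """Stand-in for the framework's formatting helper; its real signature may differ."""
    return ", ".join(f"{key}: {value}" for key, value in kwargs.items())


# Toy model and data, for illustration only.
model = torch.nn.Linear(16, 16)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(32, 16)

for epoch in range(10):
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), x)
    loss.backward()
    optimizer.step()
    # Proposed change: pass the scalar loss as an extra key-value argument
    # so it appears in every training log line.
    logger.info(get_format_variables(epoch=f"{epoch:03d}",
                                     loss=f"{loss.detach().item():.6f}"))
```

Calling `loss.detach().item()` converts the loss tensor to a plain Python float without touching the autograd graph, so the logging itself does not affect training.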

@Marigoldwu
Owner

Sounds great! Thank you for your valuable suggestion; we will incorporate it in upcoming versions. Thanks again for using this framework!

@habaneraa
Author

This project is helpful. I'm glad to help build a more robust and user-friendly framework for DAGC!
