When pre-training/training the GNNs, it would be helpful if the loss value at each step were logged. The metric scores can sometimes stay flat even while the loss is decreasing, so the loss values are an effective indicator of the progress of self-supervised learning.
A possible solution is to add a new keyword argument (e.g. `loss=f"{loss.detach().item()}"`) when calling `logger.info(get_format_variables())` in every training script.
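A minimal sketch of what this could look like in a generic PyTorch-style training loop. Note that `get_format_variables` below is a hypothetical stand-in for the project's helper (I'm assuming it accepts keyword arguments and joins them into a log string), and `model`, `loader`, `loss_fn`, etc. are placeholders rather than names from the actual training scripts:

```python
import logging

logger = logging.getLogger(__name__)

# Hypothetical stand-in for the training scripts' get_format_variables helper,
# assumed here to accept keyword arguments and format them as "key: value" pairs.
def get_format_variables(**kwargs):
    return ", ".join(f"{key}: {value}" for key, value in kwargs.items())

def train_one_epoch(model, loader, optimizer, loss_fn, epoch):
    for step, batch in enumerate(loader):
        optimizer.zero_grad()
        loss = loss_fn(model(batch), batch.y)
        loss.backward()
        optimizer.step()

        # Proposed change: log the per-step loss alongside the existing variables.
        logger.info(get_format_variables(
            epoch=epoch,
            step=step,
            loss=f"{loss.detach().item():.4f}",
        ))
```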