feat: Kcrps #182
base: main
Conversation
validation: 8

training:
  forecaster: anemoi.training.train.forecaster.GraphEnsForecaster # refactor to use _target_ etc?
This is called `task` in the interpolator PR. Thinking about future downscaling, I think `task` may be a more future-proof name. Thoughts?
How significant is the refactor to use `_target_`?
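For background on what the `_target_` form buys: Hydra-style configs name the class to instantiate via a dotted import path, so swapping `GraphForecaster` for `GraphEnsForecaster` becomes a pure config change. Below is a minimal, hypothetical sketch of the resolution mechanism only (using `collections.Counter` as a stand-in target; in the real code this would presumably be handled by `hydra.utils.instantiate`):

```python
import importlib


def instantiate(config: dict):
    """Minimal stand-in for Hydra-style ``_target_`` instantiation:
    import the dotted path under ``_target_`` and call it with the
    remaining keys as keyword arguments."""
    cfg = dict(config)
    module_path, _, class_name = cfg.pop("_target_").rpartition(".")
    cls = getattr(importlib.import_module(module_path), class_name)
    return cls(**cfg)


# Hypothetical config: the class to build is chosen by config alone.
config = {"_target_": "collections.Counter", "red": 2, "blue": 1}
counter = instantiate(config)
```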
If everyone is happy with the current form then we can leave it as it is. I mainly wanted other opinions on this.
Would it also make sense to move the task-specific args under `task`, i.e. `multistep_input`?
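For illustration, one possible layout (key names beyond those already mentioned in this thread are hypothetical) would group the task-specific options under a single `task` block:

```yaml
training:
  task:
    _target_: anemoi.training.train.forecaster.GraphEnsForecaster
    multistep_input: 2  # hypothetical value; task-specific args live under task
```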
Just to flag that this PR will need a docs update. The API docs will be updated automatically with Sphinx, but it would be good to have some information explaining the new model interface and some of the major changes. Thank you!
"""Provide the model instance.""" | ||
kwargs = { | ||
"config": self.config, | ||
"data_indices": self.data_indices, | ||
"graph_data": self.graph_data, | ||
"truncation_data": self.truncation_data, |
With this I guess we make sure the truncation data matrix is available at inference time, right? My other question: in the same way we track the dataset paths/files in the catalogue, it could be worth tracking these files too. Have you discussed this already with Baudouin, @ssmmnn11? (in terms of keeping full traceability and backing up the files)
Yes, same as the graph data. I have not spoken to Baudouin about that yet.
I think we only need to store this in the `supporting_arrays` property.
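If the `supporting_arrays` route is taken, a minimal sketch might look like the following (class shape, argument types, and key names are all hypothetical, purely to illustrate exposing the truncation matrices alongside the checkpoint):

```python
class GraphEnsForecaster:  # hypothetical minimal sketch, not the real class
    def __init__(self, truncation_data):
        # truncation_data: mapping of name -> matrix (e.g. down/up truncation)
        self.truncation_data = truncation_data

    @property
    def supporting_arrays(self):
        """Arrays that should travel with the checkpoint so they are
        available at inference time (here: the truncation matrices)."""
        return {f"truncation_{name}": arr for name, arr in self.truncation_data.items()}


forecaster = GraphEnsForecaster({"down": [[1.0, 0.0], [0.0, 1.0]]})
arrays = forecaster.supporting_arrays
```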
forecaster: anemoi.training.train.forecaster.GraphForecaster

# select strategy
strategy:
In the init of the strategy we include the `read_group_size`; why not specify this at config level?
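For concreteness, the suggestion above could look something like this (the `read_group_size` name comes from the comment; the surrounding keys are hypothetical):

```yaml
# Hypothetical: expose read_group_size in the strategy config block
# instead of hard-wiring it in the strategy's __init__
strategy:
  read_group_size: 4
```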
for more information, see https://pre-commit.ci
A number of the config files seem to be left over from testing and need to be removed. Some even contain local paths.
batch[0] = self.allgather_batch(batch[0])
if len(batch) == 2:
    batch[1] = self.allgather_batch(batch[1])
Under what circumstances do we have `len(batch) == 2`?
Description
AIFS-CRPS
@mishooax, @MartinLeutbecher, @jakob-schloer, @Rilwan-Adewoyin, @mc4117, @JesperDramsch
To test, use the debug_ens config. Work in progress ...