Thanks for the organized repository and for sharing your code. Your paper on Self-Supervised Meta-Learning is interesting. I have a few questions, and I would appreciate it if you could help me understand the details of your paper.
In your paper, you set "Support samples per task = 80", "Query samples per task = 10", "Adaptation Steps (G) = 7", and "Meta-training Epochs = 1". Meanwhile, in your code, you fix the eval epochs to 5 by setting "inner_epochs = 5". I wonder how you chose these hyperparameters.
Also, does this mean that every SMLMT task used for meta-training contains exactly 80 + 10 examples?
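To make my question concrete, here is a minimal sketch of how I currently picture the task construction, based only on the numbers quoted above; the function and variable names are my own assumptions, not identifiers from your code:

```python
import random

# Hyperparameters as quoted from the paper (my reading, not your actual code)
SUPPORT_PER_TASK = 80   # "Support samples per task"
QUERY_PER_TASK = 10     # "Query samples per task"

def sample_task(examples):
    """Hypothetical sketch: draw one SMLMT task of exactly
    80 support + 10 query examples from a pool of examples."""
    batch = random.sample(examples, SUPPORT_PER_TASK + QUERY_PER_TASK)
    return batch[:SUPPORT_PER_TASK], batch[SUPPORT_PER_TASK:]

support, query = sample_task(list(range(1000)))
print(len(support), len(query))  # 80 10
```

Is this 80 + 10 split per task what the meta-training loop actually does, or does the task size vary?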
Thanks.