'patience' count resets when tuning with Hyperband #654
mountainML asked this question in Q&A · Unanswered · 0 replies
I'm using keras-tuner to perform a hyperparameter optimization of a neural network with Hyperband, passing an EarlyStopping callback to the search.
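For context, a minimal sketch of this kind of setup is shown below. Everything in it except `max_epochs=100` and `patience=15` (the values discussed below) is a placeholder: the data, the `build_model` function, the objective/monitored metric, and the directory/project names are only illustrative.

```python
import numpy as np
import tensorflow as tf
import keras_tuner as kt

# Placeholder data, just to keep the sketch self-contained.
x_train, y_train = np.random.rand(200, 8), np.random.rand(200)
x_val, y_val = np.random.rand(50, 8), np.random.rand(50)

def build_model(hp):
    # Placeholder hypermodel; the real one is more involved.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(hp.Int("units", 32, 256, step=32), activation="relu"),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

tuner = kt.Hyperband(
    build_model,
    objective="val_loss",       # assumed objective
    max_epochs=100,             # value used in this question
    directory="tuning_dir",     # placeholder directory/project names
    project_name="hyperband_es",
)

early_stopping = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss",         # assumed monitored quantity
    patience=15,                # value used in this question
)

tuner.search(
    x_train, y_train,
    validation_data=(x_val, y_val),
    callbacks=[early_stopping],
)
```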
By setting max_epochs=100, I noticed that during the Hyperband algorithm the training of a specific model is not performed in one shot, but in steps ("Running trials"):

1) from epoch 1 to epoch 3;
2) from epoch 4 to epoch 7;
3) from epoch 8 to epoch 13;
4) from epoch 14 to epoch 25;
5) from epoch 26 to epoch 50;
6) from epoch 51 to epoch 100.

So, at the end of each "Running trial", Keras saves the state of the training, so that the next "Running trial" can continue the training from that state.
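To make the pattern concrete, here is a minimal standalone sketch (outside the tuner) of what I understand each pair of consecutive "Running trials" to amount to: a fresh model.fit() call that resumes from the saved state, with the same EarlyStopping callback attached. The model and data are placeholders; the initial_epoch/epochs values mirror running trials 5) and 6) above.

```python
import numpy as np
import tensorflow as tf

# Placeholder data and model, just to keep the sketch self-contained.
x, y = np.random.rand(200, 8), np.random.rand(200)
x_val, y_val = np.random.rand(50, 8), np.random.rand(50)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

early_stopping = tf.keras.callbacks.EarlyStopping(monitor="val_loss", patience=15)

# "Running trial" 5): epochs 26 to 50.
model.fit(x, y, validation_data=(x_val, y_val),
          initial_epoch=25, epochs=50, callbacks=[early_stopping])

# "Running trial" 6): epochs 51 to 100, continuing from the weights above.
# The patience count appears to start over with this second fit() call, so the
# earliest the callback can stop is again start_epoch + patience (51 + 15 = 66),
# matching what I observe during the Hyperband search.
model.fit(x, y, validation_data=(x_val, y_val),
          initial_epoch=50, epochs=100, callbacks=[early_stopping])
```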
By setting patience=15, I noticed that the EarlyStopping callback stops the training during "Running trial" 5), in particular at epoch 41 (the first epoch at which EarlyStopping is able to act, because start_epoch + patience = 26 + 15 = 41). Then the training continues in "Running trial" 6), and EarlyStopping stops it at epoch 66 (that is, again at start_epoch + patience = 51 + 15 = 66).

Thus, even though EarlyStopping stops the training during running trial 5), the training still continues in running trial 6), and the patience count appears to restart at the beginning of each "Running trial" (whereas, as I understand it, it should be counted from epoch 1 and should never reset itself).

Is it the normal/expected behavior that the patience count automatically resets itself at the beginning of each "Running trial" when tuning with Keras Hyperband?