We applied IBQ to speech-related tasks to quantize the intermediate hidden-layer features of a speech encoder. However, at a certain point during training the quantization loss suddenly drops to zero, resulting in codebook collapse. Have you encountered a similar issue on your side?
Hi @yufan-aslp,
We did not observe this phenomenon during training on images. The sudden drop in quantization loss to zero may indicate that the model is overfitting to a very small subset of codes. To investigate this, you can print the selected indices during training and monitor the codebook usage. If this is indeed the case, we recommend modifying training hyperparameters such as the learning rate and loss weights.