This model is a fine-tuned version of facebook/nllb-200-distilled-600M on the juanjucm/OpenSLR-SpeechT-GL-EN dataset. On the evaluation set it achieves the per-epoch losses and BLEU scores reported in the training results table below.
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
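A minimal inference sketch using the transformers library. The checkpoint id below is the base model named in this card; to use the fine-tuned model, swap in its repository id. `glg_Latn` (Galician) and `eng_Latn` (English) are the NLLB-200 language codes, and the example sentence is an arbitrary placeholder.

```python
# Sketch: translating Galician -> English with an NLLB-200 checkpoint.
# Assumes `transformers` and `torch` are installed; replace the base-model
# id with the fine-tuned checkpoint's repository id to use this model.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "facebook/nllb-200-distilled-600M"  # swap for the fine-tuned repo id
tokenizer = AutoTokenizer.from_pretrained(model_id, src_lang="glg_Latn")
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "O tempo hoxe é moi bo."  # placeholder Galician sentence
inputs = tokenizer(text, return_tensors="pt")

# NLLB decodes into the language whose code is forced as the first token.
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("eng_Latn"),
    max_new_tokens=64,
)
translation = tokenizer.batch_decode(generated, skip_special_tokens=True)[0]
print(translation)
```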
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 14.2627 | 1.0 | 600 | 3.7799 | 61.8432 |
| 6.0125 | 2.0 | 1200 | 0.5403 | 66.7094 |
| 1.1534 | 3.0 | 1800 | 0.0243 | 69.1604 |
| 0.0748 | 4.0 | 2400 | 0.0147 | 70.7523 |
| 0.0125 | 5.0 | 3000 | 0.0131 | 73.1040 |
| 0.0095 | 6.0 | 3600 | 0.0126 | 73.2385 |
| 0.0081 | 7.0 | 4200 | 0.0122 | 73.8670 |
| 0.0072 | 8.0 | 4800 | 0.0122 | 73.6259 |
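The Bleu column above reports BLEU on the validation set (evaluations like this are typically computed with sacrebleu). As a reminder of what the metric measures, here is a minimal sketch of unsmoothed sentence-level BLEU with uniform 4-gram weights; it is illustrative only, not the scorer that produced the numbers above.

```python
# Minimal sentence-level BLEU sketch: modified n-gram precision for
# n = 1..4 combined geometrically, times a brevity penalty.
import math
from collections import Counter

def ngrams(tokens, n):
    """Count the n-grams (as tuples) occurring in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(reference, hypothesis, max_n=4):
    """Unsmoothed sentence-level BLEU with uniform n-gram weights."""
    ref, hyp = reference.split(), hypothesis.split()
    if not hyp:
        return 0.0
    log_precisions = []
    for n in range(1, max_n + 1):
        hyp_ngrams = ngrams(hyp, n)
        ref_ngrams = ngrams(ref, n)
        # Clipped overlap: each hypothesis n-gram counts at most as often
        # as it appears in the reference.
        overlap = sum(min(c, ref_ngrams[g]) for g, c in hyp_ngrams.items())
        total = max(sum(hyp_ngrams.values()), 1)
        if overlap == 0:
            return 0.0  # unsmoothed BLEU is zero if any precision is zero
        log_precisions.append(math.log(overlap / total))
    # Brevity penalty punishes hypotheses shorter than the reference.
    bp = 1.0 if len(hyp) > len(ref) else math.exp(1 - len(ref) / max(len(hyp), 1))
    return bp * math.exp(sum(log_precisions) / max_n)
```

A perfect match scores 1.0 and a hypothesis sharing no unigrams with the reference scores 0.0; production scorers such as sacrebleu add corpus-level aggregation, tokenization handling, and smoothing on top of this core.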