Hi, I use multilingual BERT (pre-trained weights downloaded from Google's official GitHub) as my PTM, as in Table 5 of your paper. Your XLT result for EN-ZH is F1 = 57.5 / EM = 37.3, but my F1 is only about 20. That is a large gap between our results, and I don't think hyperparameters alone explain it. Could you please release your source code in this repository? Thanks!
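For reference, here is roughly how I run the model, as a minimal sketch using HuggingFace Transformers rather than your training code (the checkpoint name, the example question/context, and the span-decoding step are my assumptions, not taken from the paper):

```python
# Minimal sketch of my setup (hypothetical; model name and inputs are my
# assumptions). Loads multilingual BERT with a QA head and extracts an
# answer span for a mixed EN-ZH (XLT-style) question/context pair.
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

model_name = "bert-base-multilingual-cased"  # mBERT, matching Google's official release
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)

question = "什么是跨语言迁移？"  # Chinese question over an English context
context = "Cross-lingual transfer trains a model on one language and tests it on another."

inputs = tokenizer(question, context, return_tensors="pt", truncation=True)
with torch.no_grad():
    outputs = model(**inputs)

# Pick the highest-scoring start/end token pair as the predicted answer span.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax()) + 1
answer = tokenizer.decode(inputs["input_ids"][0][start:end])
print(answer)
```

Note that the QA head here is randomly initialized until fine-tuned, so I first fine-tune on the English training split before the zero-shot EN-ZH evaluation; if your pipeline differs (e.g. translated training data or different span decoding), that alone could account for part of the gap.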