This repository was archived by the owner on Oct 31, 2023. It is now read-only.

Multi-lingual BERT Result of Chinese XLT task #13

Closed
ztl-35 opened this issue Jul 21, 2020 · 0 comments

Comments


ztl-35 commented Jul 21, 2020

Hi, I used multilingual BERT (with pre-trained weights downloaded from Google's official GitHub) as my PTM, as in Table 5 of your paper. Your XLT result for EN-ZH is F1 = 57.5 / EM = 37.3, but my F1 is only about 20%. That is a large gap, and I don't think hyperparameters alone can explain it. Could you please release your source code in this repository? Thanks!
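For reference, the F1/EM figures quoted above are typically computed SQuAD-style, and for Chinese spans the comparison is usually done at the character level. An evaluation mismatch (e.g. word-level instead of character-level tokens) can by itself produce a large F1 gap. A minimal sketch of that metric, assuming character-level scoring (not the paper's released evaluation code):

```python
from collections import Counter

def exact_match(prediction: str, gold: str) -> float:
    # EM: 1.0 if the predicted span matches the gold span exactly.
    return float(prediction.strip() == gold.strip())

def f1_score(prediction: str, gold: str) -> float:
    # Token-level F1; characters are used as tokens, as is common
    # for Chinese answer spans.
    pred_toks = list(prediction.strip())
    gold_toks = list(gold.strip())
    common = Counter(pred_toks) & Counter(gold_toks)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_toks)
    recall = num_same / len(gold_toks)
    return 2 * precision * recall / (precision + recall)
```

If the predicted span is a superset of the gold answer, EM is 0 but F1 gives partial credit, which is why the two numbers in the table differ so much.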

@ztl-35 ztl-35 closed this as completed Jul 22, 2020