Missing BERT code? #1

codyseally opened this issue Apr 5, 2020 · 3 comments

@codyseally

I was trying to run your code with some new ideas, but it breaks at this line:

text_embeddings = np.load("rumor_detection_acl2017/output_bert.npy")

How do you generate the output_bert.npy file? I can't seem to find the relevant code in the repo.

@j0k3w0n

j0k3w0n commented Mar 17, 2021

I found the same issue. I think it's NumPy data saved from the training done in Section 3 with RoBERTa. Did you find any kind of fix?

@codyseally
Author

> I found the same issue. I think it's NumPy data saved from the training done in Section 3 with RoBERTa. Did you find any kind of fix?

Sadly no. They don't include the BERT code that generates the embeddings. You can do it yourself, though: look at the Hugging Face Transformers library. It has everything, plug and play.

This: https://github.com/huggingface/transformers

With it you should be able to tokenize the tweets and encode them. Best of luck.
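For anyone landing here later, here is a minimal sketch of how a file like output_bert.npy could be produced with Hugging Face Transformers. To be clear, this is not the repo authors' actual pipeline: the model choice (bert-base-uncased), the mean-pooling strategy, the max length, and the example tweets are all assumptions on my part.

```python
import os

import numpy as np
import torch
from transformers import AutoModel, AutoTokenizer

# Assumption: a plain BERT base model with mean pooling over tokens.
# The original repo may have used a different model or pooling.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()


def embed(texts, batch_size=32):
    """Return one fixed-size vector per text (mean of token embeddings)."""
    vecs = []
    with torch.no_grad():
        for i in range(0, len(texts), batch_size):
            batch = tokenizer(
                texts[i:i + batch_size],
                padding=True,
                truncation=True,
                max_length=64,  # assumption: tweets are short
                return_tensors="pt",
            )
            out = model(**batch).last_hidden_state      # (B, T, 768)
            mask = batch["attention_mask"].unsqueeze(-1)  # (B, T, 1)
            # Average only over real (non-padding) tokens.
            summed = (out * mask).sum(dim=1)
            counts = mask.sum(dim=1).clamp(min=1)
            vecs.append((summed / counts).numpy())
    return np.concatenate(vecs, axis=0)


# Hypothetical inputs; in practice you would load the dataset's tweets here.
tweets = ["breaking: example rumor text", "another example tweet"]
embeddings = embed(tweets)  # shape: (len(tweets), 768)

os.makedirs("rumor_detection_acl2017", exist_ok=True)
np.save("rumor_detection_acl2017/output_bert.npy", embeddings)
```

The saved array can then be loaded exactly as the repo's code expects, via np.load("rumor_detection_acl2017/output_bert.npy"). Note that embeddings produced this way won't bit-match whatever the authors originally saved, so reproduced results may differ.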

@j0k3w0n

j0k3w0n commented Mar 19, 2021

Thank you for the tips. I'll try it. Although it would be great if they could provide the code so the results can be reproduced.
