Issues loading model to SentenceTransformer without passing HF token #3212
Similar issue: #3206. We're encountering similar issues in the CI of Haystack, when the CI runs from a fork (where we don't set the HF token).
Okay, I was able to reproduce the issue:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2", token="")
```

This fails. In the past, an invalid or empty token did not cause problems with public models. The problem is also that in certain CI environments, an empty environment variable is set for the token instead of it being left unset. This issue is likely not specific to Sentence Transformers, but rather due to changes in Hugging Face Hub. @tomaarsen It would be great if you could notify the maintainers of other related HF projects.
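A possible mitigation sketch for such CI setups, assuming the empty value arrives via an `HF_TOKEN`-style environment variable (this only helps if the library is not also reading the empty variable itself; not an official fix):

```python
import os

from sentence_transformers import SentenceTransformer

# Treat an empty token variable (common when a CI secret is unavailable, e.g. on forks)
# as "no token" before handing it to the library.
token = os.environ.get("HF_TOKEN") or None  # "" -> None

model = SentenceTransformer("all-MiniLM-L6-v2", token=token)
```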
Try `token=False`.
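For reference, the suggestion above would look roughly like this (an untested sketch; `token=False` asks `huggingface_hub` not to send any authentication token):

```python
from sentence_transformers import SentenceTransformer

# Explicitly disable authentication instead of passing an empty token string.
model = SentenceTransformer("all-MiniLM-L6-v2", token=False)
```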
Hello!
If you'd like to reproduce it with `huggingface_hub` directly:

```python
from huggingface_hub import hf_hub_download

hf_hub_download(
    repo_id="sentence-transformers/all-MiniLM-L6-v2",
    filename="config.json",
    token="",
    local_dir="tmp",
    cache_dir="tmp_cache",
)
```

I indeed think that this issue is best resolved outside of Sentence Transformers, but instead in `huggingface_hub`.
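For comparison, a sketch under the assumption that the repository stays public and ungated, so an anonymous download should work when no token is sent at all:

```python
from huggingface_hub import hf_hub_download

# Assumption for illustration: with token=None (or token=False) the file downloads
# anonymously for this public, ungated repo, whereas token="" reportedly fails.
path = hf_hub_download(
    repo_id="sentence-transformers/all-MiniLM-L6-v2",
    filename="config.json",
    token=None,
)
print(path)
```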
Interesting! Would definitely love to see this resolved somehow (I usually prefer not to set `HF_TOKEN`). It's not immediately obvious (although I didn't dive especially deep) why this happens with `token=""`. @tomaarsen, are you happy for me to open the issue in `huggingface_hub`?
I will admit that it's odd that it works in `transformers`.
Code to reproduce:
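A minimal sketch of the kind of reproduction described in the title, i.e. loading the model without passing any HF token (the exact call is an assumption):

```python
from sentence_transformers import SentenceTransformer

# Load a public, ungated model without passing a token argument.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

embeddings = model.encode(["Hello world"])
print(embeddings.shape)
```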
The above throws an error.
Bizarrely, this is fully resolved if I set the `HF_TOKEN` environment variable or pass the `token` argument, despite `sentence-transformers/all-MiniLM-L6-v2` being an ungated model. In addition, I have no issue loading it with `AutoModel`:
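A sketch of that `AutoModel` comparison (the exact call is an assumption):

```python
from transformers import AutoModel, AutoTokenizer

# The same repository loads fine through transformers without any token.
tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")
model = AutoModel.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")
```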
Some versioning info: