I was just wondering, can this https://huggingface.co/dennlinger/roberta-cls-consec model be used to compute cosine / dot-product similarities between two paragraphs of text, the way SentenceBERT can compute cosine similarities between two sentences?
Hi @desis123,
By default, I would say it cannot. Our models were trained in a combined-input setting (i.e., two paragraphs fed into the same forward pass, separated by a [SEP] token).
In comparison, late interaction models (or, more generally, dual encoders) process one paragraph at a time rather than two. Therefore, I would argue that our model is not particularly suited to producing meaningful standalone embeddings. The sketch below illustrates the difference.
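A minimal sketch of the two usage patterns, assuming the checkpoint loads as a standard Hugging Face sequence-classification model (the exact label semantics are an assumption and should be checked against the model card):

```python
# Cross-encoder style (what this model was trained for): both paragraphs go
# into ONE forward pass, joined by the tokenizer's separator token.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "dennlinger/roberta-cls-consec"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

paragraph_a = "First paragraph of text."
paragraph_b = "Second paragraph of text."

inputs = tokenizer(paragraph_a, paragraph_b, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))  # a pairwise score, not two embeddings

# Dual-encoder style (what SentenceBERT does): each paragraph is embedded
# separately, and the embeddings are compared with cosine similarity.
# Shown with a generic sentence-transformers checkpoint, not this model:
# from sentence_transformers import SentenceTransformer, util
# embedder = SentenceTransformer("all-MiniLM-L6-v2")
# emb = embedder.encode([paragraph_a, paragraph_b], convert_to_tensor=True)
# print(util.cos_sim(emb[0], emb[1]))
```

So the checkpoint gives you a score for a pair fed in together, but there is no separately trained embedding space in which cosine or dot-product similarity between independently encoded paragraphs would be meaningful.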