I am encountering an issue where I am unable to retrieve the inference process for the DeepSeek-R1 model. Despite following the standard procedures, the system fails to fetch or display the expected results during inference. I have verified the setup and the model, but the issue persists. Could you please help investigate the cause and suggest any possible solutions?
Recently, the API for R1 frequently experiences timeouts and ultimately returns an empty string. You might be encountering this type of issue.
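A simple workaround for intermittent timeouts and empty responses is to retry the call a few times before giving up. Here is a minimal sketch; `call_model` is a hypothetical stand-in for whatever function actually issues your API request:

```python
import time

def call_with_retry(call_model, attempts=3, delay=1.0):
    """Retry a model call that sometimes times out or returns an empty string."""
    for i in range(attempts):
        result = call_model()
        if result:  # non-empty response: success
            return result
        if i < attempts - 1:
            time.sleep(delay)  # back off briefly before retrying
    raise RuntimeError("model returned an empty response after retries")
```

Wrap your existing request function with this before concluding the model itself is at fault; if every attempt comes back empty, the problem is more likely server-side.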
I encountered the same issue as 2lianna mentioned. I think it boils down to the fact that, since the release of the DeepSeek reasoning models, responses now include the inference process ("thinking") from the model in addition to the "content" of the response.
This is the case even for ChatGPT if we click on the Reason button.
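To make the "thinking vs. content" split concrete, here is a sketch of extracting both fields from a response payload. The field names follow DeepSeek's documented `reasoning_content` addition to the OpenAI-style chat-completion format; the payload values here are illustrative, not a real API response:

```python
import json

# A trimmed payload shaped like a DeepSeek chat-completion response.
raw = json.dumps({
    "choices": [{
        "message": {
            "role": "assistant",
            "reasoning_content": "First, add 2 and 2 to get 4.",
            "content": "4",
        }
    }]
})

message = json.loads(raw)["choices"][0]["message"]
thinking = message.get("reasoning_content")  # the inference process ("thinking")
answer = message["content"]                  # the final answer
```

Clients that only read `message["content"]` will silently drop the reasoning, which may explain why the inference process appears to be missing.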