Is there a way to get the low-level request id and actual LLM response? #9968
richtong started this conversation in Feature Requests
Replies: 0 comments
I'm trying to debug SambaNova, which returns a 400 error, and I can't figure out where there is a log of exactly what is sent and received so I can help them debug it.
Same problem with a.ai GLM 4.6. I have a good contact there, and they need a request id for the queries that go wrong so they can trace them.
Is there a log like this somewhere deep in the code?
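
In case it helps while waiting for a built-in option: a minimal sketch of how raw traffic and provider request ids can be captured at the HTTP layer, assuming the provider calls ultimately go through an `httpx` client (and, hypothetically, that the tool lets you pass a custom `http_client` to an OpenAI-compatible SDK). This is not this project's API, just a generic workaround; the `x-request-id` header name is also an assumption and varies by provider.

```python
import httpx

def log_request(request: httpx.Request) -> None:
    # Dump exactly what is sent to the provider (method, URL, headers, body).
    print(f"--> {request.method} {request.url}")
    print(dict(request.headers))
    print(request.content.decode("utf-8", errors="replace"))

def log_response(response: httpx.Response) -> None:
    # The body is not read yet inside a response hook, so read it explicitly.
    response.read()
    print(f"<-- {response.status_code} {response.url}")
    # Providers typically return a trace/request id as a response header;
    # "x-request-id" is an assumption -- check the provider's docs.
    print("request id:", response.headers.get("x-request-id"))
    print(response.text)

# Attach the hooks to a client; pass this client to the SDK if the tool
# exposes such an option (e.g. OpenAI's python SDK accepts http_client=...).
client = httpx.Client(
    event_hooks={"request": [log_request], "response": [log_response]}
)
```

With this in place, every 400 response is printed together with the exact request body and any request id header, which is usually what the provider support team asks for.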