Description
Hey there 👋! I am trying to integrate a router-model solution with the Braintrust Proxy, and I ran into a problem: Perplexity and Mistral models are not working.
Steps to reproduce:

```js
import { OpenAI } from "openai";

const client = new OpenAI({
  baseURL: "https://api.braintrust.dev/v1/proxy",
  apiKey: "<API-KEY>",
});

async function main() {
  const start = performance.now();
  const response = await client.chat.completions.create({
    model: "pplx-7b-online",
    messages: [{ role: "user", content: "how's the weather in tokyo?" }],
    seed: 1, // A seed activates the proxy's cache
  });
  console.log(response.choices[0].message.content);
  console.log(response.model);
  console.log(`Took ${(performance.now() - start) / 1000}s`);
}

main();
```
Running this produces the following error:

```
Error code: 400 - {'error': {'message': "Invalid model 'pplx-7b-online'. Permitted models can be found in the documentation at https://docs.perplexity.ai/docs/model-cards.", 'type': 'invalid_model', 'code': 400}}
```
The model name is taken from the list of supported models here: https://www.braintrust.dev/docs/guides/proxy#list-of-supported-models-and-providers
In my account I have correctly set the Perplexity and Mistral API keys, and the request uses the correct key.

The request fails with the 400 error mentioned above.
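For completeness, a minimal sketch of programmatically detecting this specific failure from the parsed error body shown above (the `isInvalidModelError` helper is hypothetical, not part of the Braintrust or OpenAI APIs):

```js
// Hypothetical helper: returns true when a parsed proxy error body
// matches the "invalid_model" failure observed in this issue.
function isInvalidModelError(body) {
  return body?.error?.type === "invalid_model" && body?.error?.code === 400;
}

// The error body from the report above, parsed as an object:
const body = {
  error: {
    message:
      "Invalid model 'pplx-7b-online'. Permitted models can be found in the documentation at https://docs.perplexity.ai/docs/model-cards.",
    type: "invalid_model",
    code: 400,
  },
};

console.log(isInvalidModelError(body)); // true
```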
Expected Behaviour:
The request should return a 200 and output the model's response text.
Impact:
This issue blocks use of the service and prevents us from proposing new integrations for it.