
Conversation

@pkulkark
Member

Description

This PR updates the custom LLM models URL response handler so that it can handle responses in the following format:

{
  "models": {
    "Meta-Llama4-Maverick": {
      "name": "Meta-Llama4-Maverick",
      "display_name": "Maverick (Llama 4)"
    }
  }
}
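
For illustration, here is a minimal sketch of how a handler might extract model names from this response shape. The function name and the use of the requests library are assumptions for the sketch, not the XBlock's actual code:

import requests

def fetch_custom_llm_models(models_url, access_token):
    """Hypothetical handler: return the model names from the new response format."""
    response = requests.get(
        models_url,
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=10,
    )
    response.raise_for_status()
    payload = response.json()
    # The new format nests each model under a "models" mapping keyed by name.
    # "display_name" is available here but unused; see the note under "Other information".
    return [model["name"] for model in payload.get("models", {}).values()]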

Testing instructions

  1. Deploy this branch of the XBlock.
  2. Enable the custom service in Site Configuration. In your Open edX Django admin → Site Configurations, add:
{
    "ai_eval": {
        "USE_CUSTOM_LLM_SERVICE": true,
        "CUSTOM_LLM_MODELS_URL": "https://your-custom-service/models",
        "CUSTOM_LLM_COMPLETIONS_URL": "https://your-custom-service/completions",
        "CUSTOM_LLM_TOKEN_URL": "https://your-custom-service/oauth/token"
    }
}
  3. Set the following client credentials in Django settings for OAuth2 (e.g. via Tutor's configuration); a sketch of the resulting token exchange follows this list:
CUSTOM_LLM_CLIENT_ID = "your-client-id"
CUSTOM_LLM_CLIENT_SECRET = "your-client-secret"
  4. Add the XBlock component and verify that the models dropdown is populated correctly with the list of available models.
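
As referenced in step 3, here is a rough sketch of the token exchange the configuration above implies. The client-credentials grant type, the "access_token" response field, and the helper name are assumptions; only the settings names and the token URL come from this PR:

import requests
from django.conf import settings

def get_custom_llm_token(token_url):
    """Hypothetical helper: fetch a bearer token via the OAuth2 client credentials grant."""
    response = requests.post(
        token_url,
        data={
            "grant_type": "client_credentials",
            "client_id": settings.CUSTOM_LLM_CLIENT_ID,
            "client_secret": settings.CUSTOM_LLM_CLIENT_SECRET,
        },
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["access_token"]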

Other information

Note: display_name is currently ignored because the dropdown expects a plain list of model name strings, which are what get sent in the request.


@carlos-marquez-wgu left a comment


Hi @pkulkark, we tested this on our side and it is working, thanks!

@pkulkark merged commit 3f06b21 into artur/all-changes on Jan 14, 2026
5 checks passed
@pkulkark deleted the pooja/update-custom-llm-model-listing branch on January 14, 2026 at 18:15