@hamedkazemi
Description

Added OpenRouter.ai as a unified LLM provider option, allowing users to access multiple LLM providers (OpenAI, Anthropic, Google, etc.) through a single API key. This simplifies configuration management and provides a more flexible approach to LLM provider selection.

Related Issues

Resolves: Feature request for OpenRouter.ai support to simplify mcp_agent.secrets.yaml configuration

Changes Made

Configuration Files

  • mcp_agent.secrets.yaml: Added openrouter section with api_key field
  • mcp_agent.config.yaml:
    • Updated llm_provider options to include "openrouter"
    • Added openrouter configuration section with base_url, default_model, base_max_tokens, and retry_max_tokens settings

Code Implementation

  • utils/llm_utils.py:

    • Updated get_preferred_llm_class() to support OpenRouter provider selection
    • Added OpenRouter to the provider map (leverages OpenAIAugmentedLLM class with OpenRouter endpoint)
    • Updated get_default_models() to include OpenRouter model configuration
  • schema/mcp-agent.config.schema.json:

    • Added OpenRouterSettings definition with properties for api_key, base_url, and default_model
    • Updated llm_provider enum to include "openrouter" option
    • Added openrouter property to the main schema

Documentation

  • README.md:
    • Updated Quick Start section with OpenRouter configuration instructions
    • Added OpenRouter benefits explanation (single key, unified billing, easy model switching)
    • Provided model format examples (e.g., "anthropic/claude-sonnet-4", "openai/gpt-4")
    • Maintained backward compatibility notes

Technical Details

Architecture Decision

  • Reuses existing OpenAIAugmentedLLM class since OpenRouter provides an OpenAI-compatible API
  • No new dependencies required - leverages existing OpenAI client infrastructure
  • Model names use provider/model format as per OpenRouter specification
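A minimal sketch of what "OpenAI-compatible" means in practice: the existing client only needs a different `base_url` and API key, and model names carry the provider prefix. The helper function names here are hypothetical, not part of the PR:

```python
# Sketch only: because OpenRouter is OpenAI-compatible, the existing
# OpenAI client infrastructure can be pointed at it by overriding two
# settings. Both helpers below are illustrative.

def openrouter_client_kwargs(api_key: str,
                             base_url: str = "https://openrouter.ai/api/v1") -> dict:
    """Build the keyword arguments an OpenAI-style client needs for OpenRouter."""
    return {"base_url": base_url, "api_key": api_key}

def split_model_name(name: str) -> tuple[str, str]:
    """Split OpenRouter's provider/model format, e.g. 'anthropic/claude-sonnet-4'."""
    provider, sep, model = name.partition("/")
    if not sep:
        raise ValueError(f"Expected provider/model format, got {name!r}")
    return provider, model
```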

Benefits

  • Simplified Configuration: single API key for multiple LLM providers
  • Unified Billing: consolidated billing across all providers
  • Easy Model Switching: change models without managing multiple API keys
  • Backward Compatible: existing configurations continue to work unchanged
  • Cost Optimization: access to OpenRouter's competitive pricing

Checklist

  • Changes tested locally
  • Code reviewed
  • Documentation updated (README.md)
  • Configuration schema updated
  • Backward compatibility maintained
  • No breaking changes introduced

Testing

Manual Testing Performed

  • Verified OpenRouter provider selection works correctly
  • Tested fallback behavior when OpenRouter key is not configured
  • Confirmed backward compatibility with existing provider configurations
  • Validated schema changes with configuration validation
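The fallback behavior tested above can be sketched as follows. The function name, config shapes, and the choice of `"openai"` as the fallback provider are assumptions for illustration; the shapes mirror the configuration example below:

```python
# Sketch of the fallback path: if "openrouter" is selected but no key is
# present in secrets, fall back to a default provider instead of failing
# at request time. All names here are illustrative assumptions.

DEFAULT_PROVIDER = "openai"  # assumed fallback provider

def resolve_provider(config: dict, secrets: dict) -> str:
    """Return the effective provider, falling back when the key is missing."""
    provider = config.get("llm_provider", DEFAULT_PROVIDER)
    if provider == "openrouter" and not secrets.get("openrouter", {}).get("api_key"):
        return DEFAULT_PROVIDER
    return provider
```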

Configuration Example

```yaml
# mcp_agent.secrets.yaml
openrouter:
  api_key: "sk-or-v1-..."
```

```yaml
# mcp_agent.config.yaml
llm_provider: "openrouter"
openrouter:
  base_url: "https://openrouter.ai/api/v1"
  default_model: "anthropic/claude-sonnet-4"
  base_max_tokens: 40000
  retry_max_tokens: 32768
```

Additional Notes

Migration Path

Users can adopt OpenRouter incrementally:

  1. Keep existing provider configurations (no action required)
  2. Add OpenRouter key and switch llm_provider when ready
  3. Mix approaches - different agents can use different providers

Future Enhancements

  • Consider adding OpenRouter-specific features (price monitoring, fallback routing)
  • Potential integration with OpenRouter's model ranking/recommendation system
  • Support for OpenRouter's advanced parameters (temperature ranges, provider preferences)
  • Consider implementing a more general LLM provider interface to reduce per-provider special-casing

Breaking Changes

None. This is a purely additive feature that maintains full backward compatibility.

Screenshots/Examples

N/A - This is a configuration and backend integration feature without UI changes.
