
[Feature]: Allow for custom llm proxies #45

@0xthc

Description

Problem Statement

I would like to be able to route requests through a proxy instead of calling the provider APIs directly.

Proposed Solution

export interface ModelConfig {
  id: string
  name: string
  provider: ProviderId
  model: string
  description?: string
  available: boolean
  base_url?: string   // Add this: custom base URL for a proxy/gateway endpoint
  custom?: boolean    // Add this to identify custom models
}
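
With these fields in place, a proxied model could be declared like this (a hypothetical entry; the id, gateway URL, and model name are illustrative placeholders, not values from this repo):

const proxiedClaude: ModelConfig = {
  id: "claude-internal-proxy",
  name: "Claude (internal gateway)",
  provider: "anthropic",
  model: "claude-3-5-sonnet-latest",
  available: true,
  base_url: "https://llm-gateway.internal.example.com",
  custom: true
}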

import { ChatAnthropic } from "@langchain/anthropic"
import { ChatOpenAI } from "@langchain/openai"
import { ChatGoogleGenerativeAI } from "@langchain/google-genai"

function getModelInstance(
  modelId?: string,
  customConfig?: { base_url?: string }
): ChatAnthropic | ChatOpenAI | ChatGoogleGenerativeAI {
  const model = modelId || getDefaultModel()

  if (model.startsWith("claude")) {
    // A proxy may handle authentication itself, so fall back to a placeholder key
    const apiKey = getApiKey("anthropic") || "dummy-not-used"

    const config: any = {
      model,
      anthropicApiKey: apiKey
    }

    // Route requests through the custom base URL if one is provided
    if (customConfig?.base_url) {
      config.clientOptions = {
        baseURL: customConfig.base_url
      }
    }

    return new ChatAnthropic(config)
  }
  // ... rest of the function (OpenAI and Google branches)
}
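
A resolver extended this way could then be called with a per-model override, for example (a minimal sketch; the gateway URL is a placeholder):

const llm = getModelInstance("claude-3-5-sonnet-latest", {
  base_url: "https://llm-gateway.internal.example.com"
})
const reply = await llm.invoke("Say hello through the proxy")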

Use Case

I already have proxies set up as internal gateways.
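
One low-friction way to wire an existing gateway in would be an environment override rather than per-model config; a rough sketch (ANTHROPIC_BASE_URL is an assumed variable name, not an existing setting in this project):

// Hypothetical: read the gateway URL from the environment if set
const baseUrl = process.env.ANTHROPIC_BASE_URL
const llm = getModelInstance(undefined,
  baseUrl ? { base_url: baseUrl } : undefined)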


Labels: enhancement (New feature or request)
