Codex sends prompt_cache_retention to models that don't support it, causes 400 #478

@skrabe

Description

gpt-5.3-codex requests fail with 400 because CLIProxyAPIPlus injects prompt_cache_retention=24h and OpenAI rejects that parameter for this model.

Debug log:

request error, error status: 400, error message: {"detail":"Unsupported parameter: prompt_cache_retention"}

OpenAI's docs list only these models as supporting extended prompt cache retention: gpt-5.4, gpt-5.2, gpt-5.1, gpt-5.1-codex, gpt-5.1-codex-mini, gpt-5.1-codex-max, gpt-5, gpt-5-codex, gpt-4.1

gpt-5.3-codex is not on that list. The parameter also may not work through OAuth endpoints at all (only the direct API).

Workaround via config:

payload:
  filter:
    - models:
        - name: "gpt-5.3-codex*"
      params:
        - "prompt_cache_retention"

The fix should check the model against the supported list before injecting this parameter.
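A minimal sketch of that check in Go (the project's language). The allowlist below is copied from the docs list quoted above; the function name and the injection site are hypothetical, since I haven't traced where CLIProxyAPIPlus actually sets the parameter:

```go
package main

import "fmt"

// Models documented as supporting extended prompt cache retention,
// per the OpenAI docs list quoted in this issue.
var promptCacheRetentionModels = map[string]bool{
	"gpt-5.4":            true,
	"gpt-5.2":            true,
	"gpt-5.1":            true,
	"gpt-5.1-codex":      true,
	"gpt-5.1-codex-mini": true,
	"gpt-5.1-codex-max":  true,
	"gpt-5":              true,
	"gpt-5-codex":        true,
	"gpt-4.1":            true,
}

// supportsPromptCacheRetention reports whether prompt_cache_retention
// should be injected for the given model. Hypothetical helper: exact
// match only; date-suffixed variants would need extra handling.
func supportsPromptCacheRetention(model string) bool {
	return promptCacheRetentionModels[model]
}

func main() {
	fmt.Println(supportsPromptCacheRetention("gpt-5.1-codex")) // true
	fmt.Println(supportsPromptCacheRetention("gpt-5.3-codex")) // false
}
```

Gating the injection on this check would avoid the 400 without requiring users to add the payload filter workaround.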

Version: 6.9.5-0
