fix(responses): support OpenAI Responses tool loops #65

Open

saxyguy81 wants to merge 2 commits into CaddyGlow:main from saxyguy81:codex/responses-function-call-ids

Conversation


@saxyguy81 saxyguy81 commented May 10, 2026

Summary

  • Normalize OpenAI Responses function_call item IDs to fc_* while preserving call_id as call_*.
  • Add true OpenAI-compatible Responses route coverage for /responses, /v1/responses, and /codex/v1/responses.
  • Add local previous_response_id continuation state for Codex Responses tool loops, scoped by client auth context with TTL/max-entry cleanup.
  • Accept Droid/Factory Responses streaming fields such as stream_options.include_usage, plus Droid-sent prompt_cache_retention and safety_identifier, while stripping Codex-backend-unsupported fields before upstream dispatch.
  • Preserve valid Responses SSE for streaming tool calls, including response.function_call_arguments.*.item_id == fc_*, completed function_call.id == fc_*, and function_call.call_id == call_*.
  • Normalize provider/FastAPI 4xx failures into OpenAI-style JSON error envelopes so clients see rejected parameters instead of empty bodies.
  • Rebuild sparse Codex response.completed payloads from streamed output items when the final upstream completed event omits output.
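The ID normalization in the first bullet can be sketched roughly as below. This is a minimal illustration, not the PR's actual implementation (which lives in ccproxy's formatter/adapter modules); the helper name and the exact prefix-rewriting rules are assumptions.

```python
import re

def normalize_function_call_ids(item: dict) -> dict:
    """Hypothetical sketch: give a Responses function_call item an fc_-prefixed
    item id while keeping its call_id under the call_ prefix."""
    normalized = dict(item)
    item_id = normalized.get("id", "")
    call_id = normalized.get("call_id", "")
    # Rewrite e.g. "call_abc123" (or a differently prefixed id) to "fc_abc123".
    if item_id and not item_id.startswith("fc_"):
        suffix = re.sub(r"^(call_|fc_)", "", item_id)
        normalized["id"] = f"fc_{suffix}"
    # Ensure the tool-call correlation id carries the call_ prefix clients expect.
    if call_id and not call_id.startswith("call_"):
        normalized["call_id"] = f"call_{call_id}"
    return normalized
```

The same invariant (item `id` is `fc_*`, `call_id` is `call_*`) is what the streaming assertions later in this description check against.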

This remains parallel to #64; it is the Responses/OpenAI-provider compatibility PR. The latest commit folds in the Droid/Factory streaming and continuation compatibility work on top of the original fc_* fix.
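The continuation state described above (auth-scoped `previous_response_id` lookup with TTL and max-entry cleanup) could look roughly like the following sketch. Class and method names are illustrative assumptions; the real store is `ccproxy/plugins/codex/responses_state.py` and may differ in structure.

```python
import time
from dataclasses import dataclass, field

@dataclass
class ResponsesStateStore:
    """Illustrative previous_response_id store, keyed by (auth context,
    response id), with TTL expiry and a max-entry eviction bound."""
    ttl_seconds: float = 600.0
    max_entries: int = 256
    # (auth, response_id) -> (insert timestamp, saved state)
    _entries: dict = field(default_factory=dict)

    def put(self, auth: str, response_id: str, state: dict) -> None:
        self._cleanup()
        self._entries[(auth, response_id)] = (time.monotonic(), state)

    def get(self, auth: str, response_id: str):
        entry = self._entries.get((auth, response_id))
        if entry is None:
            return None
        ts, state = entry
        if time.monotonic() - ts > self.ttl_seconds:
            del self._entries[(auth, response_id)]
            return None
        return state

    def _cleanup(self) -> None:
        now = time.monotonic()
        for key in [k for k, (ts, _) in self._entries.items()
                    if now - ts > self.ttl_seconds]:
            del self._entries[key]
        # Evict oldest entries until there is room for one more.
        while len(self._entries) >= self.max_entries:
            oldest = min(self._entries, key=lambda k: self._entries[k][0])
            del self._entries[oldest]
```

Scoping the key by client auth context keeps one client's tool-loop continuations from being resumable by another client on the same proxy.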

Validation

  • git diff --check
  • uv run ruff check ccproxy/plugins/codex/plugin.py ccproxy/plugins/codex/config.py ccproxy/plugins/codex/adapter.py ccproxy/plugins/codex/routes.py ccproxy/plugins/codex/responses_state.py ccproxy/llms/models/openai.py ccproxy/llms/streaming/accumulators.py ccproxy/llms/formatters/common/identifiers.py ccproxy/streaming/deferred.py ccproxy/streaming/buffer.py ccproxy/streaming/errors.py tests/plugins/codex/integration/test_codex_basic.py tests/plugins/codex/unit/test_adapter.py tests/plugins/codex/unit/test_responses_state.py tests/unit/llms/streaming/test_accumulators.py tests/unit/llms/test_openai_responses_request_models.py tests/unit/streaming/test_buffer_parse_responses.py
  • uv run ruff format --check ccproxy/plugins/codex/plugin.py ccproxy/plugins/codex/config.py ccproxy/plugins/codex/adapter.py ccproxy/plugins/codex/routes.py ccproxy/plugins/codex/responses_state.py ccproxy/llms/models/openai.py ccproxy/llms/streaming/accumulators.py ccproxy/llms/formatters/common/identifiers.py ccproxy/streaming/deferred.py ccproxy/streaming/buffer.py ccproxy/streaming/errors.py tests/plugins/codex/integration/test_codex_basic.py tests/plugins/codex/unit/test_adapter.py tests/plugins/codex/unit/test_responses_state.py tests/unit/llms/streaming/test_accumulators.py tests/unit/llms/test_openai_responses_request_models.py tests/unit/streaming/test_buffer_parse_responses.py
  • uv run pytest tests/plugins/codex/unit/test_adapter.py tests/plugins/codex/unit/test_responses_state.py tests/unit/llms/test_openai_responses_request_models.py tests/unit/streaming/test_buffer_parse_responses.py tests/unit/llms/streaming/test_accumulators.py tests/plugins/codex/integration/test_codex_basic.py -q - 58 passed
  • uv run pytest tests/plugins/codex/unit/test_adapter.py tests/plugins/codex/unit/test_routes.py tests/plugins/codex/unit/test_responses_state.py tests/plugins/codex/integration/test_codex_basic.py tests/plugins/codex/integration/test_codex_websocket.py tests/unit/streaming/test_buffer_parse_responses.py tests/unit/llms/streaming/test_accumulators.py -q - 67 passed
  • uv run pytest tests/plugins/codex tests/unit/streaming/test_buffer_parse_responses.py tests/unit/llms/streaming/test_accumulators.py tests/integration/test_streaming_converters.py -q - 91 passed, 1 existing Pydantic deprecation warning
  • uv run mypy ccproxy/plugins/codex/adapter.py ccproxy/plugins/codex/routes.py ccproxy/plugins/codex/responses_state.py ccproxy/plugins/codex/plugin.py ccproxy/plugins/codex/config.py ccproxy/llms/models/openai.py ccproxy/llms/streaming/accumulators.py ccproxy/llms/formatters/common/identifiers.py ccproxy/streaming/deferred.py ccproxy/streaming/buffer.py ccproxy/streaming/errors.py tests/plugins/codex/integration/test_codex_basic.py tests/plugins/codex/unit/test_adapter.py tests/plugins/codex/unit/test_responses_state.py tests/unit/llms/streaming/test_accumulators.py tests/unit/llms/test_openai_responses_request_models.py tests/unit/streaming/test_buffer_parse_responses.py - success, 17 source files
  • ./Taskfile check - all checks passed; mypy success on 592 source files; 616 files already formatted

Live local smoke

  • Local ccproxy /health returned a passing status after restart.
  • Live curl verified streaming /responses, /v1/responses, and /codex/v1/responses with stream_options.include_usage.
  • Live curl verified streamed function tool call + previous_response_id continuation, with fc_* item IDs and call_* call IDs.
  • Live curl verified unknown previous_response_id returns OpenAI-style JSON error body.
  • Droid/Factory target passed:
droid exec \
  --cwd /Users/smhanan/Projects/inline \
  -m custom:ccproxy-chatgpt-5.5-0 \
  --auto medium \
  'Use the Execute tool to run exactly: pwd && git branch --show-current. Then reply beginning GPT_DROID_EXEC_OK.'

Output began with GPT_DROID_EXEC_OK and returned /Users/smhanan/Projects/inline plus branch codex/inline-typora-repair.
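The unknown-previous_response_id error body verified above follows OpenAI's public error envelope shape, roughly as sketched below. The helper name is hypothetical and the exact field values the proxy emits may differ.

```python
from typing import Optional

def openai_error_envelope(message: str, code: Optional[str] = None) -> dict:
    """Illustrative OpenAI-style JSON error envelope, as returned for
    rejected parameters instead of an empty body (field names follow
    OpenAI's documented error format)."""
    return {
        "error": {
            "message": message,
            "type": "invalid_request_error",
            "param": None,
            "code": code,
        }
    }
```

Returning this envelope with the upstream 4xx status lets Responses clients surface the rejected parameter rather than failing on an empty body.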

@saxyguy81 saxyguy81 marked this pull request as ready for review May 10, 2026 07:10
@saxyguy81 (Author)

Coordination note for reviewers:

This PR is intentionally parallel to #64 and #66.

There is no intended dependency between these PRs. They can be reviewed and merged independently; if one lands first, I will rebase the others only if GitHub reports a conflict.

@saxyguy81 changed the title from "fix(responses): normalize function call item ids" to "fix(responses): support OpenAI Responses tool loops" on May 11, 2026