
Conversation

@bolinfest bolinfest commented Oct 26, 2025

An AppServer client should be able to use any (model_provider, model) pair in the user's config. `NewConversationParams` already supported specifying the model; this PR expands it to support `model_provider` as well.


Stack created with Sapling. Best reviewed with ReviewStack.

@chatgpt-codex-connector
Contributor

💡 Codex Review

```rust
struct ProviderMatcher<'a> {
    filters: &'a [String],
    matches_openai: bool,
}

impl<'a> ProviderMatcher<'a> {
    fn new(filters: &'a [String]) -> Option<Self> {
        if filters.is_empty() {
            return None;
        }
        let matches_openai = filters.iter().any(|provider| provider == "openai");
        Some(Self {
            filters,
            matches_openai,
        })
    }

    fn matches(&self, session_provider: Option<&str>) -> bool {
        match session_provider {
            Some(provider) => self.filters.iter().any(|candidate| candidate == provider),
            None => self.matches_openai,
        }
    }
}
```
P1: Default filter drops older non-OpenAI sessions

The provider matcher treats a missing model_provider as matching only when the filter contains "openai". Because the app server now defaults ListConversations to filter on the current provider, any sessions recorded before this field existed (which therefore have model_provider = null) will vanish from the list whenever the configured provider is something other than OpenAI. Users who recorded Anthropic or other provider sessions before this change will have no way to discover or resume them unless they manually pass an empty filter. Consider treating None as the active provider or bypassing provider filtering when metadata is absent.
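The suggested fix could be sketched as follows. This is a hypothetical rewrite, not code from the PR: the `default_provider` parameter is an assumed way of passing the currently configured provider into the matcher so that legacy sessions with no recorded `model_provider` are attributed to it rather than to `"openai"`.

```rust
struct ProviderMatcher<'a> {
    filters: &'a [String],
    /// Provider assumed for sessions recorded before `model_provider`
    /// existed (hypothetical: the currently configured provider).
    default_provider: &'a str,
}

impl<'a> ProviderMatcher<'a> {
    fn new(filters: &'a [String], default_provider: &'a str) -> Option<Self> {
        if filters.is_empty() {
            return None;
        }
        Some(Self { filters, default_provider })
    }

    fn matches(&self, session_provider: Option<&str>) -> bool {
        // A missing provider is treated as the active provider, so older
        // non-OpenAI sessions still show up under their original provider.
        let provider = session_provider.unwrap_or(self.default_provider);
        self.filters.iter().any(|candidate| candidate == provider)
    }
}

fn main() {
    let filters = vec!["anthropic".to_string()];
    let matcher = ProviderMatcher::new(&filters, "anthropic").unwrap();
    // Legacy session with no recorded provider matches the active provider.
    assert!(matcher.matches(None));
    assert!(matcher.matches(Some("anthropic")));
    assert!(!matcher.matches(Some("openai")));
    println!("ok");
}
```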


@bolinfest bolinfest force-pushed the pr5793 branch 2 times, most recently from 735540b to 24d9adc on October 27, 2025 05:13
bolinfest added a commit that referenced this pull request Oct 27, 2025
Because conversations that use the Responses API can have encrypted
reasoning messages, trying to resume a conversation with a different
provider could lead to confusing "failed to decrypt" errors. (This is
reproducible by starting a conversation using ChatGPT login and resuming
it as a conversation that uses OpenAI models via Azure.)

This changes `ListConversationsParams` to take a `model_providers:
Option<Vec<String>>` and adds `model_provider` on each
`ConversationSummary` it returns so these cases can be disambiguated.
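The filtering semantics described here can be sketched as follows. Only `model_providers` and `model_provider` are named in the PR; every other field name (such as `path`) and the filter function itself are illustrative assumptions, not the actual `codex-rs` definitions.

```rust
#[derive(Debug, Default)]
struct ListConversationsParams {
    /// `None` means no provider filtering; `Some` keeps only sessions whose
    /// provider appears in the list.
    model_providers: Option<Vec<String>>,
}

#[derive(Debug)]
struct ConversationSummary {
    path: String,                   // assumed field, for illustration only
    model_provider: Option<String>, // new in this PR; `None` for old sessions
}

fn filter_conversations(
    params: &ListConversationsParams,
    all: Vec<ConversationSummary>,
) -> Vec<ConversationSummary> {
    match &params.model_providers {
        // No filter supplied: return everything.
        None => all,
        Some(filters) => all
            .into_iter()
            .filter(|summary| {
                summary
                    .model_provider
                    .as_deref()
                    .is_some_and(|p| filters.iter().any(|f| f == p))
            })
            .collect(),
    }
}

fn main() {
    let sessions = vec![
        ConversationSummary { path: "a.jsonl".into(), model_provider: Some("azure".into()) },
        ConversationSummary { path: "b.jsonl".into(), model_provider: Some("openai".into()) },
    ];
    let params = ListConversationsParams { model_providers: Some(vec!["azure".into()]) };
    let kept = filter_conversations(&params, sessions);
    assert_eq!(kept.len(), 1);
    assert_eq!(kept[0].model_provider.as_deref(), Some("azure"));
    println!("ok");
}
```

Returning `model_provider` on each summary is what lets a client avoid resuming a Responses API session under a different provider, which would otherwise surface as the "failed to decrypt" errors described above.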

Note this ended up making changes to
`codex-rs/core/src/rollout/tests.rs` because it had a number of cases
where it expected `Some` for the value of `next_cursor`, but the list of
rollouts was complete, so according to this docstring:


https://github.com/openai/codex/blob/bcd64c7e7231d6316a2377d1525a0fa74f21b783/codex-rs/app-server-protocol/src/protocol.rs#L334-L337

If there are no more items to return, then `next_cursor` should be
`None`. This PR updates that logic.
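The cursor rule the docstring describes can be illustrated with a minimal paging sketch (names and the index-based cursor are illustrative, not the rollout implementation):

```rust
/// Return one page of items plus the cursor for the next page.
/// `next_cursor` is `Some` only when items remain past this page.
fn page<'a>(items: &'a [&'a str], start: usize, page_size: usize) -> (&'a [&'a str], Option<usize>) {
    let end = (start + page_size).min(items.len());
    let next_cursor = if end < items.len() { Some(end) } else { None };
    (&items[start..end], next_cursor)
}

fn main() {
    let rollouts = ["a", "b", "c"];
    let (first, cursor) = page(&rollouts, 0, 2);
    assert_eq!(first, ["a", "b"]);
    assert_eq!(cursor, Some(2));
    // The list is now exhausted, so `next_cursor` must be `None`, not `Some`,
    // which is the expectation the rollout tests were corrected to match.
    let (rest, cursor) = page(&rollouts, 2, 2);
    assert_eq!(rest, ["c"]);
    assert_eq!(cursor, None);
    println!("ok");
}
```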

---
[//]: # (BEGIN SAPLING FOOTER)
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with [ReviewStack](https://reviewstack.dev/openai/codex/pull/5658).
* #5803
* #5793
* __->__ #5658
Base automatically changed from pr5658 to main October 27, 2025 09:03
@bolinfest bolinfest enabled auto-merge (squash) October 27, 2025 09:23
@bolinfest bolinfest merged commit 15fa228 into main Oct 27, 2025
33 of 36 checks passed
@bolinfest bolinfest deleted the pr5793 branch October 27, 2025 09:33
@github-actions github-actions bot locked and limited conversation to collaborators Oct 27, 2025
