Conversation

@bolinfest bolinfest commented Oct 24, 2025

Because conversations that use the Responses API can have encrypted reasoning messages, trying to resume a conversation with a different provider could lead to confusing "failed to decrypt" errors. (This is reproducible by starting a conversation using ChatGPT login and resuming it as a conversation that uses OpenAI models via Azure.)

This changes ListConversationsParams to take a model_providers: Option<Vec<String>> and adds a model_provider field to each ConversationSummary it returns so these cases can be disambiguated.
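For reference, the resulting shapes look roughly like this. This is a sketch: only the model_providers filter and the model_provider field come from this PR; the remaining fields and values are illustrative assumptions.

```rust
/// Sketch of the request/response shapes described above. Only the
/// `model_providers` filter and the `model_provider` field come from this PR;
/// the other fields are illustrative assumptions.
#[derive(Debug, Clone)]
pub struct ListConversationsParams {
    /// Page size for the listing (illustrative).
    pub page_size: usize,
    /// Opaque cursor from a previous page, if any (illustrative).
    pub cursor: Option<String>,
    /// Optional model provider filter (matches against session metadata).
    pub model_providers: Option<Vec<String>>,
}

#[derive(Debug, Clone)]
pub struct ConversationSummary {
    /// Path to the rollout file (illustrative).
    pub path: String,
    /// Optional model provider recorded for the session.
    pub model_provider: Option<String>,
}

fn main() {
    let params = ListConversationsParams {
        page_size: 25,
        cursor: None,
        model_providers: Some(vec!["openai".to_string()]),
    };
    let summary = ConversationSummary {
        path: "rollout-2025-10-24.jsonl".to_string(),
        model_provider: Some("openai".to_string()),
    };
    // A client can now skip sessions whose provider does not match its own,
    // avoiding "failed to decrypt" errors when resuming across providers.
    println!("{params:?} -> {summary:?}");
}
```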

Note this also required changes to codex-rs/core/src/rollout/tests.rs: it had a number of cases that expected Some for the value of next_cursor even though the list of rollouts was complete, which contradicts this docstring:

/// Opaque cursor to pass to the next call to continue after the last item.
/// If None, there are no more items to return.
#[serde(skip_serializing_if = "Option::is_none")]
pub next_cursor: Option<String>,

If there are no more items to return, then next_cursor should be None. This PR updates that logic.
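The corrected cursor semantics can be sketched as follows. The helper name and signature are hypothetical, not the actual rollout code; a cursor is only returned when the page filled up and more items may remain.

```rust
/// Sketch of the corrected `next_cursor` semantics: a cursor is only handed
/// back when the listing stopped because the page filled up, i.e. more items
/// may exist. Names here are illustrative, not the actual rollout code.
fn next_cursor_for_page(
    items_returned: usize,
    page_size: usize,
    last_id: Option<&str>,
) -> Option<String> {
    if items_returned < page_size {
        // The listing was exhausted: per the docstring, next_cursor must be None.
        None
    } else {
        // A full page: there may be more items, so return an opaque cursor.
        last_id.map(str::to_string)
    }
}

fn main() {
    // Full page: more items may exist, so a cursor is returned.
    assert_eq!(
        next_cursor_for_page(25, 25, Some("id-25")),
        Some("id-25".to_string())
    );
    // Short page: the list of rollouts was complete, so no cursor.
    assert_eq!(next_cursor_for_page(7, 25, Some("id-7")), None);
    println!("ok");
}
```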


Stack created with Sapling. Best reviewed with ReviewStack.

@bolinfest bolinfest force-pushed the pr5658 branch 3 times, most recently from 88bc431 to 85cd533 on October 26, 2025 06:42
pub cursor: Option<String>,
/// Optional model provider filter (matches against session metadata).
#[serde(skip_serializing_if = "Option::is_none")]
pub model_provider: Option<Vec<String>>,
Collaborator

Suggested change
pub model_provider: Option<Vec<String>>,
pub model_providers: Option<Vec<String>>,

Collaborator Author

Good catch: that was definitely a change I meant to make!

pub timestamp: Option<String>,
/// Optional model provider recorded for the session.
#[serde(skip_serializing_if = "Option::is_none")]
pub model_provider: Option<String>,
Collaborator

Should we always return a value even if we have to fall back internally?

Collaborator

That way all clients won't have to implement fallback logic.

Collaborator Author

Fair enough.

@bolinfest
Collaborator Author

@codex please review

Contributor

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

Comment on lines 388 to 398
    args: &crate::cli::ResumeArgs,
) -> anyhow::Result<Option<PathBuf>> {
    if args.last {
-       match codex_core::RolloutRecorder::list_conversations(&config.codex_home, 1, None, &[])
-           .await
+       let default_provider_filter = vec!["openai".to_string()];
+       match codex_core::RolloutRecorder::list_conversations(
+           &config.codex_home,
+           1,
+           None,
+           &[],
+           Some(default_provider_filter.as_slice()),
Contributor

P1: Use configured provider when filtering last session

The new filtering logic in resolve_resume_path now hard‑codes the provider filter to "openai". When the CLI is configured to use another backend (e.g., Azure), calling codex resume --last will look for the most recent conversation but the list call filters everything that isn’t labelled openai, so no Azure sessions can ever be returned even when they exist. This makes --last unusable for non‑OpenAI providers. The filter should come from config.model_provider_id, not a constant.

Useful? React with 👍 / 👎.
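The suggested fix can be sketched as follows. config.model_provider_id is taken from the review comment; the Config stand-in and the helper function are hypothetical.

```rust
/// Sketch of the suggested fix: derive the provider filter from the
/// configured provider instead of the literal "openai". `Config` here is a
/// stand-in for the real config type; only `model_provider_id` comes from
/// the review comment.
struct Config {
    model_provider_id: String,
}

fn provider_filter(config: &Config) -> Vec<String> {
    // e.g. "openai" for the default setup, "azure" for an Azure backend.
    vec![config.model_provider_id.clone()]
}

fn main() {
    let azure = Config {
        model_provider_id: "azure".to_string(),
    };
    // `codex resume --last` would now see Azure sessions instead of an empty list.
    assert_eq!(provider_filter(&azure), vec!["azure".to_string()]);
    println!("ok");
}
```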

Comment on lines 72 to 88
let page_loader: PageLoader = Arc::new(move |request: PageLoadRequest| {
    let tx = loader_tx.clone();
    tokio::spawn(async move {
        let provider_filter = vec![String::from("openai")];
        let page = RolloutRecorder::list_conversations(
            &request.codex_home,
            PAGE_SIZE,
            request.cursor.as_ref(),
            INTERACTIVE_SESSION_SOURCES,
            Some(provider_filter.as_slice()),
Contributor

P1: Resume picker filters out non-OpenAI sessions

The resume picker’s page loader builds the provider filter as vec!["openai"] (and the initial page load does the same). For environments configured to use a different provider, every call to RolloutRecorder::list_conversations will exclude their sessions, so the picker will render an empty list even when Azure conversations exist. The provider ID should come from the user’s configuration rather than being hard‑coded.


@bolinfest
Collaborator Author

@codex please review

Contributor

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.


Comment on lines 72 to 91
let page_loader: PageLoader = Arc::new(move |request: PageLoadRequest| {
    let tx = loader_tx.clone();
    tokio::spawn(async move {
        let provider_filter = vec![String::from("openai")];
        let page = RolloutRecorder::list_conversations(
            &request.codex_home,
            PAGE_SIZE,
            request.cursor.as_ref(),
            INTERACTIVE_SESSION_SOURCES,
            Some(provider_filter.as_slice()),
        )
        .await;
Contributor

P1: Avoid hard-coding resume picker to openai provider

The resume picker now builds every list_conversations request with vec!["openai"] as the provider filter. When Codex is configured to use a different provider (e.g., Azure), all recorded sessions are tagged with that provider and the picker will return an empty list because the filter never matches. Only the default OpenAI setup works; any non‑OpenAI configuration cannot resume prior conversations. The provider filter should come from the current configuration or the user’s selection rather than a literal string. The same constant appears both in the page loader and the initial page load.


@bolinfest
Collaborator Author

@codex please review

Contributor

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.


Comment on lines +833 to +842
let model_provider_filter = match model_provider {
    Some(providers) => {
        if providers.is_empty() {
            None
        } else {
            Some(providers)
        }
    }
    None => Some(vec![self.config.model_provider_id.clone()]),
};
Contributor

P1: Default provider filter hides legacy non-OpenAI sessions

The new list logic defaults to filtering by the current model_provider_id when the client omits model_providers. However, ProviderMatcher::matches treats any rollout that lacks model_provider metadata as matching only when the filter contains "openai". All rollouts recorded before this change (including Azure or other providers) have no model_provider field, so an Azure-configured client will now receive an empty list for those older sessions and has no way to discover them because the CLI/TUI always pass the default provider filter. This effectively makes existing non‑OpenAI conversations undiscoverable after upgrading. Consider treating missing metadata as matching the requested provider (or both) so that legacy sessions remain visible for their original provider.



@bolinfest
Collaborator Author

@codex please review

Contributor

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.


Comment on lines 359 to 380
struct ProviderMatcher<'a> {
    filters: &'a [String],
    matches_openai: bool,
}

impl<'a> ProviderMatcher<'a> {
    fn new(filters: &'a [String]) -> Option<Self> {
        if filters.is_empty() {
            return None;
        }

        let matches_openai = filters.iter().any(|provider| provider == "openai");
        Some(Self {
            filters,
            matches_openai,
        })
    }

    fn matches(&self, session_provider: Option<&str>) -> bool {
        match session_provider {
            Some(provider) => self.filters.iter().any(|candidate| candidate == provider),
            None => self.matches_openai,
Contributor

P1: Filtering hides all legacy sessions for non-openai providers

handle_list_conversations defaults to passing vec![self.config.model_provider_id] as a filter when the client omits model_providers, so installations whose default provider is Azure or any non‑"openai" value will filter by that provider. However ProviderMatcher::matches only treats a missing model_provider in the rollout file as a match when the filter contains "openai". Every preexisting session written before this change lacks the model_provider field, regardless of which provider produced it. As a result, after upgrading, users whose default provider is not "openai" will receive an empty list (or lose all old conversations) because their historical sessions never match the filter. The filtering logic should treat missing metadata as matching the provider being filtered (or at least the configured default) rather than hard‑coding "openai".

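The reviewer's suggestion can be sketched as: treat a rollout with no recorded provider as matching whatever provider is being filtered for. Names are adapted from the excerpt above; the exact fix adopted in the PR may differ.

```rust
/// Sketch of the reviewer's suggestion: a session with no recorded
/// `model_provider` (any pre-upgrade rollout) matches the provider being
/// filtered for, rather than only matching when the filter contains "openai".
struct ProviderMatcher<'a> {
    filters: &'a [String],
}

impl<'a> ProviderMatcher<'a> {
    fn matches(&self, session_provider: Option<&str>) -> bool {
        match session_provider {
            Some(provider) => self.filters.iter().any(|candidate| candidate == provider),
            // Legacy rollouts have no provider metadata; let them match so
            // they stay discoverable after upgrading, whatever the filter.
            None => true,
        }
    }
}

fn main() {
    let filters = vec!["azure".to_string()];
    let matcher = ProviderMatcher { filters: &filters };
    // A pre-upgrade session with no metadata remains visible to an Azure client.
    assert!(matcher.matches(None));
    // A session recorded under another provider is still filtered out.
    assert!(!matcher.matches(Some("openai")));
    println!("ok");
}
```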

@bolinfest
Collaborator Author

@codex please review

@chatgpt-codex-connector
Contributor

Codex Review: Didn't find any major issues. Keep them coming!


@bolinfest bolinfest merged commit 5907422 into main Oct 27, 2025
38 of 40 checks passed
@bolinfest bolinfest deleted the pr5658 branch October 27, 2025 09:03
@github-actions github-actions bot locked and limited conversation to collaborators Oct 27, 2025