feat: annotate conversations with model_provider for filtering #5658
Conversation
88bc431 to 85cd533 (Compare)
```rust
pub cursor: Option<String>,
/// Optional model provider filter (matches against session metadata).
#[serde(skip_serializing_if = "Option::is_none")]
pub model_provider: Option<Vec<String>>,
```
Suggested change:
```diff
-pub model_provider: Option<Vec<String>>,
+pub model_providers: Option<Vec<String>>,
```
Good catch: that was definitely a change I meant to make!
```rust
pub timestamp: Option<String>,
/// Optional model provider recorded for the session.
#[serde(skip_serializing_if = "Option::is_none")]
pub model_provider: Option<String>,
```
should we always return a value even if we have to fall back internally?
That way all clients won't have to implement fallback logic.
Fair enough.
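A minimal sketch of the server-side fallback being discussed, assuming the configured default provider id is available where summaries are built (the helper name is hypothetical):

```rust
/// Sketch: always populate `model_provider` on a summary so clients never
/// need their own fallback logic. `default_provider_id` would come from the
/// server's configuration; `fill_model_provider` is a hypothetical helper.
fn fill_model_provider(
    recorded_provider: Option<String>,
    default_provider_id: &str,
) -> String {
    recorded_provider.unwrap_or_else(|| default_provider_id.to_string())
}
```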
@codex please review
💡 Codex Review
Here are some automated review suggestions for this pull request.
```diff
     args: &crate::cli::ResumeArgs,
 ) -> anyhow::Result<Option<PathBuf>> {
     if args.last {
-        match codex_core::RolloutRecorder::list_conversations(&config.codex_home, 1, None, &[])
-            .await
+        let default_provider_filter = vec!["openai".to_string()];
+        match codex_core::RolloutRecorder::list_conversations(
+            &config.codex_home,
+            1,
+            None,
+            &[],
+            Some(default_provider_filter.as_slice()),
```
Use configured provider when filtering last session
The new filtering logic in resolve_resume_path now hard‑codes the provider filter to "openai". When the CLI is configured to use another backend (e.g., Azure), calling codex resume --last will look for the most recent conversation but the list call filters everything that isn’t labelled openai, so no Azure sessions can ever be returned even when they exist. This makes --last unusable for non‑OpenAI providers. The filter should come from config.model_provider_id, not a constant.
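A minimal sketch of the fix this comment suggests, assuming `config.model_provider_id` holds the configured provider id; this is a fragment of `resolve_resume_path`, not a complete implementation:

```rust
// Sketch: derive the resume filter from configuration instead of the
// hard-coded "openai" literal shown in the diff above.
let provider_filter = vec![config.model_provider_id.clone()];
let page = codex_core::RolloutRecorder::list_conversations(
    &config.codex_home,
    1,
    None,
    &[],
    Some(provider_filter.as_slice()),
)
.await?;
```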
```rust
let page_loader: PageLoader = Arc::new(move |request: PageLoadRequest| {
    let tx = loader_tx.clone();
    tokio::spawn(async move {
        let provider_filter = vec![String::from("openai")];
        let page = RolloutRecorder::list_conversations(
            &request.codex_home,
            PAGE_SIZE,
            request.cursor.as_ref(),
            INTERACTIVE_SESSION_SOURCES,
            Some(provider_filter.as_slice()),
```
Resume picker filters out non‑OpenAI sessions
The resume picker’s page loader builds the provider filter as vec!["openai"] (and the initial page load does the same). For environments configured to use a different provider, every call to RolloutRecorder::list_conversations will exclude their sessions, so the picker will render an empty list even when Azure conversations exist. The provider ID should come from the user’s configuration rather than being hard‑coded.
@codex please review
💡 Codex Review
Here are some automated review suggestions for this pull request.
```rust
let page_loader: PageLoader = Arc::new(move |request: PageLoadRequest| {
    let tx = loader_tx.clone();
    tokio::spawn(async move {
        let provider_filter = vec![String::from("openai")];
        let page = RolloutRecorder::list_conversations(
            &request.codex_home,
            PAGE_SIZE,
            request.cursor.as_ref(),
            INTERACTIVE_SESSION_SOURCES,
            Some(provider_filter.as_slice()),
        )
        .await;
```
Avoid hard-coding resume picker to openai provider
The resume picker now builds every list_conversations request with vec!["openai"] as the provider filter. When Codex is configured to use a different provider (e.g., Azure), all recorded sessions are tagged with that provider and the picker will return an empty list because the filter never matches. Only the default OpenAI setup works; any non‑OpenAI configuration cannot resume prior conversations. The provider filter should come from the current configuration or the user’s selection rather than a literal string. The same constant appears both in the page loader and the initial page load.
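One way the picker could avoid the literal, sketched under the assumption that the configured provider id is threaded through `PageLoadRequest` (the `model_provider_id` field is hypothetical, added for illustration):

```rust
// Sketch: carry the configured provider on the request so the loader
// closure never hard-codes "openai".
let page_loader: PageLoader = Arc::new(move |request: PageLoadRequest| {
    let tx = loader_tx.clone();
    tokio::spawn(async move {
        let provider_filter = vec![request.model_provider_id.clone()];
        let page = RolloutRecorder::list_conversations(
            &request.codex_home,
            PAGE_SIZE,
            request.cursor.as_ref(),
            INTERACTIVE_SESSION_SOURCES,
            Some(provider_filter.as_slice()),
        )
        .await;
        // ... forward `page` over `tx` as in the existing code
    });
});
```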
@codex please review
💡 Codex Review
Here are some automated review suggestions for this pull request.
```rust
let model_provider_filter = match model_provider {
    Some(providers) => {
        if providers.is_empty() {
            None
        } else {
            Some(providers)
        }
    }
    None => Some(vec![self.config.model_provider_id.clone()]),
};
```
Default provider filter hides legacy non‑OpenAI sessions
The new list logic defaults to filtering by the current model_provider_id when the client omits model_providers. However, ProviderMatcher::matches treats any rollout that lacks model_provider metadata as matching only when the filter contains "openai". All rollouts recorded before this change (including Azure or other providers) have no model_provider field, so an Azure-configured client will now receive an empty list for those older sessions and has no way to discover them because the CLI/TUI always pass the default provider filter. This effectively makes existing non‑OpenAI conversations undiscoverable after upgrading. Consider treating missing metadata as matching the requested provider (or both) so that legacy sessions remain visible for their original provider.
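A minimal sketch of the alternative this comment suggests, assuming the matcher is told which provider should own legacy rollouts that lack metadata (the `legacy_default` field is hypothetical):

```rust
// Sketch: treat rollouts with no recorded provider as belonging to a
// configurable legacy default instead of always "openai".
struct ProviderMatcher<'a> {
    filters: &'a [String],
    legacy_default: &'a str, // hypothetical: e.g. the configured provider id
}

impl<'a> ProviderMatcher<'a> {
    fn matches(&self, session_provider: Option<&str>) -> bool {
        match session_provider {
            Some(provider) => self.filters.iter().any(|f| f == provider),
            // Missing metadata: match when the filter includes the legacy
            // default, so pre-existing sessions stay visible.
            None => self.filters.iter().any(|f| f == self.legacy_default),
        }
    }
}
```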
@codex please review
💡 Codex Review
Here are some automated review suggestions for this pull request.
codex-rs/core/src/rollout/list.rs (Outdated)
```rust
struct ProviderMatcher<'a> {
    filters: &'a [String],
    matches_openai: bool,
}

impl<'a> ProviderMatcher<'a> {
    fn new(filters: &'a [String]) -> Option<Self> {
        if filters.is_empty() {
            return None;
        }

        let matches_openai = filters.iter().any(|provider| provider == "openai");
        Some(Self {
            filters,
            matches_openai,
        })
    }

    fn matches(&self, session_provider: Option<&str>) -> bool {
        match session_provider {
            Some(provider) => self.filters.iter().any(|candidate| candidate == provider),
            None => self.matches_openai,
```
Filtering hides all legacy sessions for non-openai providers
handle_list_conversations defaults to passing vec![self.config.model_provider_id] as a filter when the client omits model_providers, so installations whose default provider is Azure or any non‑"openai" value will filter by that provider. However ProviderMatcher::matches only treats a missing model_provider in the rollout file as a match when the filter contains "openai". Every preexisting session written before this change lacks the model_provider field, regardless of which provider produced it. As a result, after upgrading, users whose default provider is not "openai" will receive an empty list (or lose all old conversations) because their historical sessions never match the filter. The filtering logic should treat missing metadata as matching the provider being filtered (or at least the configured default) rather than hard‑coding "openai".
@codex please review
Codex Review: Didn't find any major issues. Keep them coming!
Because conversations that use the Responses API can have encrypted reasoning messages, trying to resume a conversation with a different provider could lead to confusing "failed to decrypt" errors. (This is reproducible by starting a conversation using ChatGPT login and resuming it as a conversation that uses OpenAI models via Azure.)
This changes `ListConversationsParams` to take a `model_providers: Option<Vec<String>>` and adds `model_provider` on each `ConversationSummary` it returns so these cases can be disambiguated.

Note this ended up making changes to `codex-rs/core/src/rollout/tests.rs` because it had a number of cases where it expected `Some` for the value of `next_cursor`, but the list of rollouts was complete, so according to this docstring:

codex/codex-rs/app-server-protocol/src/protocol.rs, lines 334 to 337 in bcd64c7:

> If there are no more items to return, then `next_cursor` should be `None`.

This PR updates that logic.

Stack created with Sapling. Best reviewed with ReviewStack.
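For reference, a minimal sketch of the protocol shapes this PR touches, using the fields shown in the diffs above; any other field names are assumptions for illustration only:

```rust
use serde::{Deserialize, Serialize};
use std::path::PathBuf;

/// Request sketch: omit `model_providers` to fall back to the configured
/// default provider; an empty list disables filtering.
#[derive(Serialize, Deserialize)]
struct ListConversationsParams {
    #[serde(skip_serializing_if = "Option::is_none")]
    cursor: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    model_providers: Option<Vec<String>>,
}

/// Summary sketch: `model_provider` records which provider produced the
/// session so clients can avoid cross-provider resumes.
#[derive(Serialize, Deserialize)]
struct ConversationSummary {
    path: PathBuf, // assumed field for illustration
    #[serde(skip_serializing_if = "Option::is_none")]
    timestamp: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    model_provider: Option<String>,
}

/// Page sketch: `next_cursor` is `None` once the listing is complete.
#[derive(Serialize, Deserialize)]
struct ConversationsPage {
    items: Vec<ConversationSummary>, // assumed field name
    #[serde(skip_serializing_if = "Option::is_none")]
    next_cursor: Option<String>,
}
```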