Caution: Review failed. The pull request was closed or merged during review.

📝 Walkthrough

Adds propagation of a creator request ID into commitment state and reservation annotations; changes reconciliation to a two-phase Spec→Status host assignment; introduces feature flags and UUID-based request IDs for the report APIs; reduces logging verbosity; and pre-initializes Prometheus metric label series.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant API as "ChangeCommitments API"
    participant Processor as "Commitments Processor"
    participant ReservationMgr as "ReservationManager"
    participant Controller as "Reservation Controller"
    participant Scheduler
    Client->>API: submit request (includes/gets X-Request-ID)
    API->>Processor: forward request (context carries CreatorRequestID)
    Processor->>ReservationMgr: compute & apply commitment changes (attach CreatorRequestID)
    ReservationMgr->>ReservationMgr: create Reservation (annotate CreatorRequestID)
    ReservationMgr->>Controller: ensure Reservation Spec exists
    Controller->>Scheduler: select host -> write Spec.TargetHost
    Controller-->>Controller: return (end reconcile)
    Controller->>Controller: next reconcile reads Spec.TargetHost
    Controller->>ReservationMgr: sync Spec -> Status (set Status.Host, mark Ready)
    Controller->>Processor: report applied state
```
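The two-phase host assignment in the diagram can be sketched in plain Go. This is a minimal stand-in model, not the repository's actual controller: `Reservation`, `Spec.TargetHost`, and `Status.Host` are simplified stand-ins for the real CRD types, and `pickHost` replaces the scheduler call.

```go
package main

import "fmt"

// Minimal stand-ins for the Reservation CRD (assumption: the real types
// live in api/v1alpha1 and carry far more fields).
type ReservationSpec struct{ TargetHost string }
type ReservationStatus struct {
	Host  string
	Ready bool
}
type Reservation struct {
	Spec   ReservationSpec
	Status ReservationStatus
}

// pickHost is a placeholder for the scheduler's host selection.
func pickHost() string { return "host-a" }

// reconcile performs one pass of the two-phase assignment:
// phase 1 writes Spec.TargetHost and ends the reconcile; phase 2 (the
// next reconcile) copies Spec into Status and marks the object Ready.
func reconcile(r *Reservation) {
	if r.Spec.TargetHost == "" {
		r.Spec.TargetHost = pickHost() // phase 1: persist the decision in Spec
		return                         // end this reconcile; the update triggers the next one
	}
	r.Status.Host = r.Spec.TargetHost // phase 2: sync Spec -> Status
	r.Status.Ready = true
}

func main() {
	r := &Reservation{}
	reconcile(r) // first pass: host chosen, not yet Ready
	fmt.Println(r.Spec.TargetHost, r.Status.Ready)
	reconcile(r) // second pass: Status synced, Ready
	fmt.Println(r.Status.Host, r.Status.Ready)
}
```

Splitting the decision (Spec) from the observation (Status) this way keeps each reconcile pass idempotent: rerunning either phase leaves the object unchanged.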
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
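The `X-Request-ID` flow at the top of the diagram (accept the caller's ID, otherwise mint a UUID, then carry it in the request context) can be sketched with the Go standard library alone. The context key name and header handling here are assumptions for illustration, not the repository's actual middleware:

```go
package main

import (
	"context"
	"crypto/rand"
	"fmt"
	"net/http"
)

type ctxKey string

// ctxKeyRequestID is a hypothetical context key for the creator request ID.
const ctxKeyRequestID ctxKey = "creatorRequestID"

// newUUIDv4 builds a random RFC 4122 version-4 UUID from crypto/rand
// (stand-in for a UUID library such as google/uuid).
func newUUIDv4() string {
	var b [16]byte
	rand.Read(b[:]) // crypto/rand.Read does not fail in practice
	b[6] = (b[6] & 0x0f) | 0x40 // set version 4
	b[8] = (b[8] & 0x3f) | 0x80 // set RFC 4122 variant
	return fmt.Sprintf("%x-%x-%x-%x-%x", b[0:4], b[4:6], b[6:8], b[8:10], b[10:16])
}

// withRequestID reads X-Request-ID (or mints one) and stores it in the
// request context so downstream processors can attach it to created objects.
func withRequestID(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		id := r.Header.Get("X-Request-ID")
		if id == "" {
			id = newUUIDv4()
		}
		w.Header().Set("X-Request-ID", id) // echo the ID back to the caller
		ctx := context.WithValue(r.Context(), ctxKeyRequestID, id)
		next.ServeHTTP(w, r.WithContext(ctx))
	})
}

func main() {
	fmt.Println(len(newUUIDv4())) // 36 characters: 32 hex digits + 4 hyphens
}
```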
Suggested reviewers
Poem
🚥 Pre-merge checks: ✅ 1 passed | ❌ 2 failed (1 warning, 1 inconclusive)
🧹 Nitpick comments (1)
internal/scheduling/reservations/commitments/reservation_manager.go (1)
312-314: Consider setting the annotation only when `CreatorRequestID` is non-empty.

Currently, the annotation is always set, which will result in an empty annotation value for reservations created via the Syncer (where `CreatorRequestID` is not populated). While this is functionally harmless, it adds clutter to the object metadata.

♻️ Suggested improvement

```diff
+	annotations := map[string]string{}
+	if state.CreatorRequestID != "" {
+		annotations[v1alpha1.AnnotationCreatorRequestID] = state.CreatorRequestID
+	}
+
 	return &v1alpha1.Reservation{
 		ObjectMeta: metav1.ObjectMeta{
 			Name: name,
 			Labels: map[string]string{
 				v1alpha1.LabelReservationType: v1alpha1.ReservationTypeLabelCommittedResource,
 			},
-			Annotations: map[string]string{
-				v1alpha1.AnnotationCreatorRequestID: state.CreatorRequestID,
-			},
+			Annotations: annotations,
 		},
 		Spec: spec,
 	}
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@internal/scheduling/reservations/commitments/reservation_manager.go` around lines 312 - 314, The Annotations map currently always sets v1alpha1.AnnotationCreatorRequestID to state.CreatorRequestID, producing empty metadata for Syncer-created reservations; change the logic where Annotations is constructed (the block that assigns Annotations map) to only include the v1alpha1.AnnotationCreatorRequestID key when state.CreatorRequestID is non-empty (e.g., check state.CreatorRequestID != "" before adding), ensuring you initialize the map if needed and otherwise omit that key entirely.
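The suggested pattern can be exercised standalone. In this sketch a plain string constant stands in for `v1alpha1.AnnotationCreatorRequestID`, and `buildAnnotations` is a hypothetical helper, not a function from the repository:

```go
package main

import "fmt"

// Stand-in for v1alpha1.AnnotationCreatorRequestID (assumption: the real
// constant is declared in api/v1alpha1/reservation_types.go).
const annotationCreatorRequestID = "scheduling.example/creator-request-id"

// buildAnnotations includes the creator-request-ID key only when the state
// actually carries one, so Syncer-created objects get no empty annotation.
func buildAnnotations(creatorRequestID string) map[string]string {
	annotations := map[string]string{}
	if creatorRequestID != "" {
		annotations[annotationCreatorRequestID] = creatorRequestID
	}
	return annotations
}

func main() {
	fmt.Println(len(buildAnnotations("")))        // 0: Syncer path, key omitted
	fmt.Println(len(buildAnnotations("req-123"))) // 1: API path, key present
}
```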
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 72dd8326-7648-4eb2-8a88-0f522f8cc092
📒 Files selected for processing (7)
- api/v1alpha1/reservation_types.go
- internal/scheduling/reservations/commitments/api_change_commitments.go
- internal/scheduling/reservations/commitments/api_change_commitments_test.go
- internal/scheduling/reservations/commitments/controller.go
- internal/scheduling/reservations/commitments/controller_test.go
- internal/scheduling/reservations/commitments/reservation_manager.go
- internal/scheduling/reservations/commitments/state.go
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
internal/scheduling/reservations/commitments/api_change_commitments_monitor_test.go (1)
93-94: ⚠️ Potential issue | 🟡 Minor

Inconsistent label value: "success" vs "accepted".

The monitor pre-initializes `commitmentChanges` with label values `"accepted"` and `"rejected"` (see lines 50-52 of `api_change_commitments_monitor.go`), but this test uses `"success"`. If `"success"` is a valid result label in production, it should be pre-initialized in the monitor. If not, consider using `"accepted"` here for consistency.

The same applies to line 26 in `TestChangeCommitmentsAPIMonitor_MetricsRegistration`.

Proposed fix

```diff
- monitor.commitmentChanges.WithLabelValues("success").Add(5)
+ monitor.commitmentChanges.WithLabelValues("accepted").Add(5)
```

And in `TestChangeCommitmentsAPIMonitor_MetricsRegistration` (line 26):

```diff
- monitor.commitmentChanges.WithLabelValues("success").Inc()
+ monitor.commitmentChanges.WithLabelValues("accepted").Inc()
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@internal/scheduling/reservations/commitments/api_change_commitments_monitor_test.go` around lines 93 - 94, The test uses monitor.commitmentChanges.WithLabelValues("success") but the monitor pre-initializes labels as "accepted" and "rejected" in api_change_commitments_monitor.go; update the tests (e.g., TestChangeCommitmentsAPIMonitor_MetricsRegistration and the other failing case) to use "accepted" instead of "success", or alternatively add "success" to the pre-initialized labels in the monitor initialization where commitmentChanges is created so the label set matches tests; locate usages by searching for monitor.commitmentChanges.WithLabelValues(...) and the monitor initialization in api_change_commitments_monitor.go to apply the consistent fix.
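The behaviour behind this finding can be modelled without the Prometheus client. The real code uses client_golang's `CounterVec`; the `counterVec` type below is a deliberately tiny stand-in that only demonstrates why pre-initialization matters: pre-initialized label series are exported at value 0 immediately, while a mismatched label value silently creates a brand-new series.

```go
package main

import (
	"fmt"
	"sort"
)

// counterVec is a toy stand-in for a Prometheus CounterVec keyed by a
// single "result" label. It models only the pre-initialization behaviour
// discussed above, not the real client_golang API.
type counterVec struct{ series map[string]float64 }

func newCounterVec(preInit ...string) *counterVec {
	cv := &counterVec{series: map[string]float64{}}
	// Pre-initialize the known label values so each series exists at 0
	// right away, instead of appearing only on its first increment.
	for _, v := range preInit {
		cv.series[v] = 0
	}
	return cv
}

func (cv *counterVec) inc(label string) { cv.series[label]++ }

// labels returns the currently existing label values, sorted.
func (cv *counterVec) labels() []string {
	out := make([]string, 0, len(cv.series))
	for k := range cv.series {
		out = append(out, k)
	}
	sort.Strings(out)
	return out
}

func main() {
	cv := newCounterVec("accepted", "rejected")
	fmt.Println(cv.labels()) // both pre-initialized series exist before any traffic
	cv.inc("success")        // a mismatched label value silently creates a third series
	fmt.Println(cv.labels())
}
```

This is exactly the hazard in the test: incrementing `"success"` does not fail, it just adds a series that the monitor never pre-initialized.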
🧹 Nitpick comments (1)
internal/scheduling/reservations/commitments/api_change_commitments_monitor_test.go (1)
145-148: Comment accuracy note.

The comment states "Plus pre-initialized labels (accepted) - so >= 2 total", but the test adds "success" and "rejected", with "accepted" being pre-initialized. So the actual total would be 3 (success and rejected from the test, plus the pre-initialized accepted). The `>= 2` assertion still passes, but the comment could be clearer about what's actually counted.

Optional: Clarify comment

```diff
- // At minimum we expect the 2 labels we added (success, rejected)
- // Plus pre-initialized labels (accepted) - so >= 2 total
+ // At minimum we expect labels from test (success, rejected)
+ // Plus pre-initialized labels (accepted, rejected) - assertion relaxed to >= 2
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@internal/scheduling/reservations/commitments/api_change_commitments_monitor_test.go` around lines 145 - 148, The comment above the assertion in api_change_commitments_monitor_test.go is misleading: it claims "so >= 2 total" though the test adds "success" and "rejected" and there is a pre-initialized "accepted", meaning three labels are present; update the comment to accurately describe that the expected minimum is 3 (or explicitly list the three labels: success, rejected, accepted) and optionally adjust the assertion or leave it as-is; reference the assertion where len(family.Metric) is checked to change the comment text to reflect the three counted labels.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 49c96a15-edef-49d6-89ec-50f8a51f3218
📒 Files selected for processing (4)
- internal/scheduling/reservations/commitments/api_change_commitments_monitor.go
- internal/scheduling/reservations/commitments/api_change_commitments_monitor_test.go
- internal/scheduling/reservations/commitments/api_report_capacity_monitor.go
- internal/scheduling/reservations/commitments/api_report_usage_monitor.go
Test Coverage Report 📊: 68.5%