
Conversation


@courtneypacheco courtneypacheco commented Jul 14, 2025

This PR updates the constraints-dev.txt file using tox -e constraints.
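
For anyone reproducing the refresh locally, a minimal sketch (assuming the constraints tox environment named above and a git checkout) is:

```shell
# Regenerate the pinned development constraints via the repo's tox environment.
tox -e constraints

# Review exactly which pins changed before committing.
git diff -- constraints-dev.txt
```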

Summary by CodeRabbit

  • Dependencies

    • Updated core machine learning libraries (numpy, pandas, torch, transformers)
    • Upgraded web and async frameworks (aiohttp, httpx, uvicorn)
    • Added new ecosystem components and observability tools
    • Applied security and compatibility patches across packages
  • Infrastructure

    • Updated testing and development tooling

✏️ Tip: You can customize this high-level summary in your review settings.


sourcery-ai bot commented Jul 14, 2025

Reviewer's Guide

Regenerated the development constraints file by running tox -e constraints, updating pinned versions for all development dependencies.

File-Level Changes

Change: Regenerate constraints-dev.txt with updated dependency pins
Details: Executed tox -e constraints to refresh all dev dependency versions
Files: constraints-dev.txt

Tips and commands

Interacting with Sourcery

  • Trigger a new review: Comment @sourcery-ai review on the pull request.
  • Continue discussions: Reply directly to Sourcery's review comments.
  • Generate a GitHub issue from a review comment: Ask Sourcery to create an
    issue from a review comment by replying to it. You can also reply to a
    review comment with @sourcery-ai issue to create an issue from it.
  • Generate a pull request title: Write @sourcery-ai anywhere in the pull
    request title to generate a title at any time. You can also comment
    @sourcery-ai title on the pull request to (re-)generate the title at any time.
  • Generate a pull request summary: Write @sourcery-ai summary anywhere in
    the pull request body to generate a PR summary at any time exactly where you
    want it. You can also comment @sourcery-ai summary on the pull request to
    (re-)generate the summary at any time.
  • Generate reviewer's guide: Comment @sourcery-ai guide on the pull
    request to (re-)generate the reviewer's guide at any time.
  • Resolve all Sourcery comments: Comment @sourcery-ai resolve on the
    pull request to resolve all Sourcery comments. Useful if you've already
    addressed all the comments and don't want to see them anymore.
  • Dismiss all Sourcery reviews: Comment @sourcery-ai dismiss on the pull
    request to dismiss all existing Sourcery reviews. Especially useful if you
    want to start fresh with a new review - don't forget to comment
    @sourcery-ai review to trigger a new review!

Customizing Your Experience

Access your dashboard to:

  • Enable or disable review features such as the Sourcery-generated pull request
    summary, the reviewer's guide, and others.
  • Change the review language.
  • Add, remove or edit custom review instructions.
  • Adjust other review settings.

Getting Help


@sourcery-ai sourcery-ai bot left a comment


Hey @courtneypacheco - I've reviewed your changes and they look great!


Sourcery is free for open source - if you like our reviews please consider sharing them ✨
Help me be more useful! Please click 👍 or 👎 on each comment and I'll use the feedback to improve your reviews.

@courtneypacheco courtneypacheco changed the title chore: update constraints-dev.txt (2025-07-14) chore: update constraints-dev.txt (2025-07-21) Jul 21, 2025
@courtneypacheco courtneypacheco changed the title chore: update constraints-dev.txt (2025-07-21) chore: update constraints-dev.txt (2025-07-28) Jul 28, 2025
@courtneypacheco courtneypacheco changed the title chore: update constraints-dev.txt (2025-07-28) chore: update constraints-dev.txt (2025-08-04) Aug 4, 2025
@courtneypacheco courtneypacheco changed the title chore: update constraints-dev.txt (2025-08-04) chore: update constraints-dev.txt (2025-08-11) Aug 11, 2025
@courtneypacheco courtneypacheco changed the title chore: update constraints-dev.txt (2025-08-11) chore: update constraints-dev.txt (2025-08-18) Aug 18, 2025
@courtneypacheco courtneypacheco changed the title chore: update constraints-dev.txt (2025-08-18) chore: update constraints-dev.txt (2025-08-25) Aug 25, 2025
@mergify mergify bot added the ci-failure label Aug 25, 2025
@courtneypacheco courtneypacheco changed the title chore: update constraints-dev.txt (2025-08-25) chore: update constraints-dev.txt (2025-09-01) Sep 1, 2025
@mergify mergify bot added ci-failure and removed ci-failure labels Sep 1, 2025
@courtneypacheco courtneypacheco changed the title chore: update constraints-dev.txt (2025-09-01) chore: update constraints-dev.txt (2025-09-08) Sep 8, 2025
@mergify mergify bot added ci-failure and removed ci-failure labels Sep 8, 2025
@courtneypacheco courtneypacheco changed the title chore: update constraints-dev.txt (2025-09-08) chore: update constraints-dev.txt (2025-09-15) Sep 15, 2025
@mergify mergify bot added ci-failure and removed ci-failure labels Sep 15, 2025
@courtneypacheco courtneypacheco changed the title chore: update constraints-dev.txt (2025-09-15) chore: update constraints-dev.txt (2025-09-22) Sep 22, 2025
@mergify mergify bot removed the ci-failure label Sep 22, 2025
@mergify mergify bot added ci-failure and removed ci-failure labels Sep 29, 2025
@courtneypacheco courtneypacheco changed the title chore: update constraints-dev.txt (2025-09-29) chore: update constraints-dev.txt (2025-10-06) Oct 6, 2025
@mergify mergify bot added ci-failure and removed ci-failure labels Oct 6, 2025
@courtneypacheco courtneypacheco changed the title chore: update constraints-dev.txt (2025-10-06) chore: update constraints-dev.txt (2025-10-13) Oct 13, 2025
@mergify mergify bot added ci-failure and removed ci-failure labels Oct 13, 2025
@courtneypacheco courtneypacheco changed the title chore: update constraints-dev.txt (2025-10-13) chore: update constraints-dev.txt (2025-10-20) Oct 20, 2025
@mergify mergify bot added ci-failure and removed ci-failure labels Oct 20, 2025
@courtneypacheco courtneypacheco changed the title chore: update constraints-dev.txt (2025-10-20) chore: update constraints-dev.txt (2025-10-27) Oct 27, 2025
@mergify mergify bot added ci-failure and removed ci-failure labels Oct 27, 2025
@courtneypacheco courtneypacheco changed the title chore: update constraints-dev.txt (2025-10-27) chore: update constraints-dev.txt (2025-11-03) Nov 3, 2025
@courtneypacheco courtneypacheco changed the title chore: update constraints-dev.txt (2025-11-03) chore: update constraints-dev.txt (2025-11-10) Nov 10, 2025

coderabbitai bot commented Nov 10, 2025

Walkthrough

Updated constraints-dev.txt with broad dependency changes: many package version bumps, additions, removals and reordering across ML, web, observability, tooling, and LLM ecosystems. No changes to exported/public API declarations.

Changes

Cohort / File(s): every cohort below touches the single file constraints-dev.txt.

  • Single constraints file: Centralized, sweeping update of Python package constraints: numerous version bumps, additions (e.g., instructor, docstring-parser, fastapi-cloud-cli, grpcio, OpenTelemetry packages), removals/reorders, and LangChain ecosystem restructuring.
  • ML & Scientific stack: Upgrades: torch 2.5.1→2.6.0, torchaudio 2.5.1→2.6.0, torchvision 0.20.1→0.21.0; numpy 1.26.4→2.2.6; transformers 4.53.1→4.57.1; vllm, triton, llvmlite, and cupy-cuda12x bumped.
  • LangChain & LLM tooling: LangChain family reflowed and bumped (e.g., langchain ≈1.0.7); instructor added; gguf, mistral-common, openai, and related LLM packages updated.
  • Web & async frameworks: fastapi 0.115.14→0.121.x; uvicorn bumped; aiohttp reflowed to 3.13.2; httpx/httpx-sse and anyio upgraded.
  • Serialization, data & storage: pyarrow 20.0.0→22.0.0; pandas series bumped to 2.3.x; orjson and pyyaml updated; protobuf set to 4.25.8; other data libraries adjusted.
  • Type validation & config: pydantic moved across the 2.11→2.12.x range; pydantic-core and pydantic-settings updated; typing-extensions bumped.
  • Observability & infra: OpenTelemetry packages (api, sdk, exporters, proto, semantic-conventions) added/updated; prometheus-client and related infra libraries changed.
  • Dev tooling & QA: Test and lint tooling bumps (pytest, pytest-asyncio, pytest-cov, pre-commit, mypy, astroid), plus packaging and CI helpers updated; docstring-parser and deprecated added.
  • Tokenization & text: tiktoken 0.9.0→0.12.0; tokenizers, regex, markdown-it-py, and other text/token libraries bumped.
  • Misc / Utilities: Eclectic bumps and reorders: accelerate, cloudpickle, pillow, requests, charset-normalizer, pyzmq, ray, zstandard, gitpython, greenlet, dnspython, watchfiles, and more.

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20–35 minutes

Areas needing extra attention:

  • NumPy 1.x → 2.x (1.26.4→2.2.6): compatibility and potential API changes.
  • PyTorch ecosystem (torch/torchaudio/torchvision/triton): binary/CUDA compatibility checks.
  • Protobuf pinned to 4.25.8 while other entries referenced 6.x — confirm consumer compatibility.
  • LangChain and LLM-tooling reorganization: ensure imports and integration points remain valid.
  • OpenTelemetry additions and exporter changes: verify initialization/config and dependency footprints.
  • New packages (e.g., instructor, fastapi-cloud-cli, grpcio): validate necessity, licensing and CI impacts.

Poem

🐰 I sniffed the constraints, hopped through each line,
Versions climbed higher — a wiggly design.
Torch took a leap, NumPy stretched tall,
LangChain rearranged its cozy hall,
I nibbled a package and danced down the vine 🥕✨

Pre-merge checks and finishing touches

✅ Passed checks (3 passed)
  • Description Check: ✅ Passed (check skipped because CodeRabbit’s high-level summary is enabled)
  • Title check: ✅ Passed (the title 'chore: update constraints-dev.txt (2025-11-24)' directly describes the main change: updating a constraints file with a date stamp, which aligns with the PR's purpose)
  • Docstring Coverage: ✅ Passed (no functions found in the changed files to evaluate docstring coverage; the docstring coverage check was skipped)
✨ Finishing touches
🧪 Generate unit tests (beta)
  • Create PR with unit tests
  • Post copyable unit tests in a comment
  • Commit unit tests in branch update-constraints

Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.


Comment @coderabbitai help to get the list of available commands and usage tips.

@courtneypacheco courtneypacheco changed the title chore: update constraints-dev.txt (2025-11-10) chore: update constraints-dev.txt (2025-11-17) Nov 17, 2025

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 78821cb and 6d4dc18.

📒 Files selected for processing (1)
  • constraints-dev.txt (2 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (6)
  • GitHub Check: Sourcery review
  • GitHub Check: mypy
  • GitHub Check: ruff
  • GitHub Check: pylint
  • GitHub Check: test: 3.11 on ubuntu-latest
  • GitHub Check: test: 3.11 on macos-latest
🔇 Additional comments (7)
constraints-dev.txt (7)

163-163: ⚠️ Protobuf downgrade requires verification.

Line 163 shows protobuf pinned to 4.25.8, which is a downgrade from 6.31.1. This is unusual for an automated constraints update. Verify:

  • Whether this downgrade resolves a known compatibility issue with dependent packages (ray, vllm, opentelemetry)
  • If there are backward-incompatibility issues preventing the use of protobuf 6.x
  • Whether this is intentional or a constraint resolution side effect

Given the CI-failure label on this PR, please confirm that the protobuf downgrade does not introduce security vulnerabilities or unintended API changes.
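
One way to answer these questions is to inspect the constraints file and the resolved environment directly; a sketch (assuming pipdeptree is available, which is not pinned in this file):

```shell
# Show the protobuf pin and any nearby comments in the constraints file.
rg -n 'protobuf' constraints-dev.txt

# In an environment installed from these constraints, list the packages that
# depend on protobuf and the version ranges they declare (ray, vllm,
# opentelemetry exporters, etc.).
pip install pipdeptree
pipdeptree --reverse --packages protobuf
```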


227-234: Verify PyTorch ecosystem version alignment.

Lines 227–234 show torch 2.6.0, torchaudio 2.6.0, torchvision 0.21.0, and triton 3.2.0 with transformers 4.57.1 and vllm 0.8.5.post1. These should be tightly aligned.

Verify:

  • Whether torch 2.6.0 officially supports triton 3.2.0
  • If torchvision 0.21.0 is the correct version for torch 2.6.0 (typically torchvision version should match torch's major.minor)
  • Whether xformers 0.0.29.post2 and xgrammar work correctly with this torch version

Confirm that the torch/torchvision/triton versions are officially compatible by checking PyTorch's version compatibility matrix.
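
A quick runtime check in the environment built from these constraints can confirm what actually resolved; this is only a sketch, not a substitute for PyTorch's published compatibility matrix:

```shell
# Mismatched torch/torchvision/torchaudio builds typically fail at import time,
# so importing and printing versions catches the most common problems.
python -c "
import torch, torchvision, torchaudio
print('torch      ', torch.__version__, '| cuda', torch.version.cuda)
print('torchvision', torchvision.__version__)
print('torchaudio ', torchaudio.__version__)
"

# triton ships with CUDA builds of torch; confirm the version that was pulled in.
python -c "import triton; print('triton', triton.__version__)"
```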


148-150: Cross-check pandas versions between constraints and requirements.

Lines 149–150 show pandas 2.3.3 and pandas-stubs 2.3.2.250926. Verify that the primary requirements.txt file is pinned to compatible pandas versions (the stubs version should match the pandas version).

Verify that both requirements.txt and requirements-dev.txt reference compatible pandas versions.
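
A grep across the pin files is usually enough for this cross-check (a sketch; the exact set of requirements files in the repository may differ):

```shell
# Compare the pandas and pandas-stubs pins across constraints and requirements.
rg -n '^pandas(-stubs)?==' constraints-dev.txt requirements*.txt 2>/dev/null
```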


133-143: Verifying the two remaining concerns (OpenAI 2.8.0 compatibility and the semantic-conventions package distinction) surfaced an important discrepancy in the constraints file that needs addressing.

Include both opentelemetry-semantic-conventions and opentelemetry-semantic-conventions-ai for vllm compatibility.

vllm 0.8.5.post1 requires opentelemetry-semantic-conventions-ai>=0.4.1,<0.5.0, which is a separate package from opentelemetry-semantic-conventions. The constraints file currently includes only opentelemetry-semantic-conventions==0.47b0. opentelemetry-semantic-conventions-ai is an extension for gen AI applications that defines additional attributes for spans useful for debugging and monitoring prompts, completions, token usage, etc. — distinct from the base semantic conventions package.

Additionally, openai 2.8.0 was released with API updates and Pydantic v1 compatibility, but verification for specific compatibility with instructor 1.13.0 and langchain-openai 1.0.3 is inconclusive from available sources.

Lines 133–143 should also include opentelemetry-semantic-conventions-ai==0.4.13 (matching the version pinned in the file) alongside the existing semantic-conventions package.
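
The claimed vllm requirement can be confirmed from package metadata rather than docs; a sketch, to be run in an environment where vllm is installed:

```shell
# Print vllm's declared dependencies that mention opentelemetry, including any
# opentelemetry-semantic-conventions-ai bound.
python -c "
from importlib.metadata import requires
for req in requires('vllm') or []:
    if 'opentelemetry' in req.lower():
        print(req)
"
```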


119-119: I need to verify compatibility with the other major packages mentioned, specifically the numpy requirements of transformers 4.57.1, and check the actual constraints file in the codebase:

Critical runtime risk: NumPy 2.2.6 incompatible with PyTorch 2.6.0 wheels.

NumPy 2.2.6 introduces ABI incompatibilities with PyTorch 2.6.0. PyTorch wheels compiled against NumPy 1.x will fail at import or runtime.

While scikit-learn 1.7.2 supports NumPy 2, this does not resolve the core PyTorch compatibility issue:

  • Import errors and crashes occur when calling PyTorch operations due to C-API/ABI layout differences between NumPy 1.x and 2.x
  • Conflicts arise when different packages are built against different NumPy ABIs in the same environment

Before merging, test that PyTorch 2.6.0 imports and runs correctly with NumPy 2.2.6 in your specific environment. If failures occur, revert to NumPy 1.26.4 or rebuild PyTorch against NumPy 2.x.
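
A short smoke test along these lines exercises the NumPy/PyTorch boundary directly (a sketch to run in the environment built from these constraints):

```shell
# ABI problems between NumPy 1.x-built extensions and NumPy 2.x usually show up
# on import or on the first tensor<->ndarray round trip, so test both.
python -c "
import numpy, torch
print('numpy', numpy.__version__, '| torch', torch.__version__)
t = torch.arange(4, dtype=torch.float32)
print('round trip sum:', torch.from_numpy(t.numpy()).sum().item())
"
```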


96-96: No breaking changes introduced by markdown-it-py 4.0.0 for this repository.

The upgrade is safe:

  • Repository requires Python 3.11+, which exceeds markdown-it-py 4.0.0's Python 3.10+ requirement
  • No direct imports of markdown-it-py found in the codebase (used only transitively via rich)
  • No API-breaking changes documented; only Python version support updates and CommonMark 0.31.2 spec compliance improvements

155-155: Pillow 12.0.0 upgrade is compatible with downstream packages.

torchvision 0.21.0 requires pillow >=5.3.0, !=8.3.*, and mistral-common requires pillow >=10.3.0, so Pillow 12.0.0 satisfies both. Breaking changes in Pillow 12.0.0 are API-level removals (ImageFile.raise_oserror, IptcImageFile helpers, ImageCms changes), not image I/O or format handling changes. These removed APIs are unlikely to affect the specified downstream packages. No further action needed; proceed with the upgrade.

Comment on lines 77 to 88
langchain==1.0.7 # via ragas
langchain-classic==1.0.0 # via langchain-community
langchain-community==0.4.1 # via ragas
langchain-core==1.0.5 # via langchain, langchain-classic, langchain-community, langchain-openai, langchain-text-splitters, langgraph, langgraph-checkpoint, langgraph-prebuilt, ragas
langchain-openai==1.0.3 # via ragas
langchain-text-splitters==1.0.0 # via langchain-classic
langdetect==1.0.9 # via lm-eval
langsmith==0.4.4 # via langchain, langchain-community, langchain-core
langgraph==1.0.3 # via langchain
langgraph-checkpoint==3.0.1 # via langgraph, langgraph-prebuilt
langgraph-prebuilt==1.0.4 # via langgraph
langgraph-sdk==0.2.9 # via langgraph
langsmith==0.4.43 # via langchain-classic, langchain-community, langchain-core


⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

Verify LangChain 1.0.7 API compatibility and migration path.

Lines 77–88 show a significant restructuring of the langchain ecosystem: langchain upgraded from 0.3.26 to 1.0.7, with correlated updates to langchain-core, langchain-classic, langchain-community, langchain-openai, and new packages like langchain-text-splitters. Version 1.0.0 is a milestone release that often includes breaking changes.

Verify:

  • Whether ragas (line 191) is compatible with langchain 1.0.7 (the integrations may have changed)
  • Whether langchain-classic is a new package or a renamed legacy support module
  • If code in the repository uses deprecated langchain APIs that break under 1.0.7

🌐 Web query:

What are the breaking changes in LangChain 1.0.7 from 0.3.26?

💡 Result:

Short answer — major breaking changes when moving from 0.3.26 -> 1.0.7 (summary):

  • LangChain v1 is a major refactor: core APIs and package surface were simplified and some legacy/previously‑deprecated APIs removed; much legacy functionality was moved to a separate package (langchain-classic / @langchain/classic / langchain_community split). [1][2][3][4]
  • Runtime / platform requirements raised: Python >= 3.10 (3.9 dropped) and Node >= 20 (Node 18 dropped). [2][3]
  • Import / package layout changed: new package/module boundaries (core, community, provider packages); update imports (don’t rely on internal dist/ imports). A LangChain CLI migration script and migration guides are provided. [2][4]
  • New agent/runtime model: create_agent / LangGraph runtime and new middleware system replace older agent entrypoints (some prebuilt agents deprecated/removed). Update code that used create_react_agent and other legacy agent helpers. [1][11]
  • Tool/function calling change: “tool calling” / new tool abstractions replace older function‑calling patterns in several integrations (migrations required for integrations that used function calling). [6][7]
  • Chat/LM API changes: return type signatures and chat model invocation signatures were adjusted (e.g., AIMessage vs BaseMessage), and OpenAI Responses API default output format changed (LC_OUTPUT_VERSION v0 to restore old behavior). Update custom chat models and code that depended on old return types. [3]
  • Build/output changes (JS): new build outputs / bundler approach — code that imported from dist/ or relied on old TS outputs must update imports. [2]
  • Removed deprecated methods and breaking simplifications: generate/predict deprecations already removed or refactored (see migration docs for exact renamed/removed functions and replacements). [2][3][4]

If you want, I can:

  • produce a file-by-file migration checklist for your codebase (requires a few sample imports/calls from your repo), or
  • extract the most load‑bearing removed/renamed symbols between 0.3.26 and 1.0.7 from the changelog and migration guide.

Sources: LangChain 1.0 announcement & changelog, JS and Python v1 migration guides, and the GitHub 1.0.7 release notes. [1][2][3][4]

References

  1. LangChain 1.0 announcement / changelog. [Changelog]. (changelog.langchain.com)
  2. LangChain v1 migration guide (JavaScript) — breaking changes and import/build notes. (docs.langchain.com)
  3. LangChain v1 migration guide (Python) — breaking changes (Python version, chat return types, Responses API). (docs.langchain.com)
  4. GitHub releases (langchain==1.0.7). (github.com)

Citations:


I need to verify compatibility and deprecated API usage in the codebase. Let me search for:

  1. Ragas version and compatibility with LangChain 1.0.7
  2. Actual langchain API usage patterns in the code that might rely on deprecated features

Wait—I should not skip cloning since I need to inspect the actual codebase files. Let me correct:




---

🏁 Script executed:

```shell
#!/bin/bash
# Check ragas version in constraints files
echo "=== Ragas version in constraints ==="
rg -n 'ragas' constraints-dev.txt
echo ""
echo "=== Langchain imports and usage patterns ==="
# Search for deprecated patterns: create_react_agent, generate, predict, old chat APIs
rg -n 'from langchain|import.*langchain|create_react_agent|\.generate\(|\.predict\(|BaseMessage' \
  -g '!**/test/**' \
  -g '!**/*.pyc' \
  --type=py \
  -C 2
```

Length of output: 3014


🏁 Script executed:

```shell
#!/bin/bash
# Check ragas version in constraints files
echo "=== Ragas version ==="
rg -n 'ragas' constraints-dev.txt constraints.txt requirements.txt pyproject.toml 2>/dev/null || echo "ragas version not found in standard files"
echo ""
echo "=== Langchain imports and usage patterns ==="
# Search for deprecated patterns: create_react_agent, generate, predict, old chat APIs
rg -n 'from langchain|import.*langchain|create_react_agent|\.generate\(|\.predict\(' \
  -g '!**/test/**' \
  --type=py \
  -C 2
```

Length of output: 3455


I need to verify ragas compatibility and check for broader langchain usage patterns. Let me search for more information and scan the code comprehensively.


🏁 Script executed:

```shell
#!/bin/bash
# Comprehensive langchain usage scan
echo "=== All langchain imports ==="
rg -n 'from langchain|import.*langchain' --type=py -C 1

echo ""
echo "=== All ChatOpenAI, ChatModel usage ==="
rg -n 'ChatOpenAI|ChatModel|from_pretrained|invoke|AIMessage' --type=py -B1 -A1

echo ""
echo "=== Python version requirement ==="
rg -n 'python_requires|requires-python' pyproject.toml setup.py setup.cfg 2>/dev/null || echo "No explicit Python version found in standard locations"
```

Length of output: 2000


🌐 Web query:

ragas 0.3.2 compatibility with LangChain 1.0.7

💡 Result:

Short answer: No — ragas 0.3.2 expects older LangChain packages and is not compatible with LangChain 1.0.7. Ragas docs recommend installing langchain-core and langchain-openai with version bounds like langchain-core>=0.2,<0.3 and langchain-openai>=0.1,<0.2, and show wrappers/integrations targeting the pre-1.0 LangChain API. [1][2]

References

  • Ragas installation / LangChain dependency recommendation. [1]
  • Ragas LangChain integration docs (wrappers and examples using langchain_openai). [2]

Resolve incompatibility between ragas 0.3.2 and LangChain 1.0.7.

Verification confirms a critical version conflict: ragas 0.3.2 is not compatible with LangChain 1.0.7, as ragas expects langchain-core>=0.2,<0.3 and langchain-openai>=0.1,<0.2, but the constraints file pins langchain-core==1.0.5 and langchain-openai==1.0.3.

When ragas executes, it will call internal LangChain APIs that have been removed or significantly changed in v1.0, causing runtime failures. Update either:

  • Ragas to a version compatible with LangChain 1.0.7, or
  • LangChain packages back to pre-1.0 versions compatible with ragas 0.3.2

Note: langchain-classic==1.0.0 is correctly introduced as the legacy module for v1.0 (answers the review concern on that point); the repository's single langchain import (from langchain_community.chat_models import ChatOpenAI in src/instructlab/eval/ragas.py:7) uses the correct v1.0 path but is overridden by ragas's internal incompatibility.

🤖 Prompt for AI Agents
In constraints-dev.txt around lines 77 to 88, the pinned langchain packages
(langchain-core==1.0.5, langchain-openai==1.0.3, langchain==1.0.7) conflict with
ragas 0.3.2 which expects langchain-core>=0.2,<0.3 and
langchain-openai>=0.1,<0.2; fix by either upgrading ragas to a release that
supports LangChain v1.x (preferred) or by downgrading the langchain packages to
the pre-1.0 versions ragas requires; update the constraint lines accordingly so
ragas and langchain package versions are mutually compatible and ensure any
other packages that pull langchain are aligned.
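
Before picking between upgrading ragas and downgrading LangChain, the bounds ragas actually declares can be read from its installed metadata (a sketch; assumes ragas 0.3.2 is installed in the dev environment):

```shell
# List the langchain-related requirements declared by the installed ragas
# distribution, so the constraint pins can be aligned against real metadata.
python -c "
from importlib.metadata import requires, version
print('ragas', version('ragas'))
for req in requires('ragas') or []:
    if 'langchain' in req.lower():
        print(' ', req)
"
```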

Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
@courtneypacheco courtneypacheco changed the title chore: update constraints-dev.txt (2025-11-17) chore: update constraints-dev.txt (2025-11-24) Nov 24, 2025

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 6d4dc18 and 1ac4c17.

📒 Files selected for processing (1)
  • constraints-dev.txt (2 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (4)
  • GitHub Check: Sourcery review
  • GitHub Check: mypy
  • GitHub Check: pylint
  • GitHub Check: test: 3.11 on ubuntu-latest
🔇 Additional comments (1)
constraints-dev.txt (1)

134-134: Review premise is incorrect; the cited concern does not apply to this codebase.

The codebase contains direct openai imports, contradicting the review's claim that it imports exclusively via langchain-openai:

  • from openai import Client as OpenAIClient (ragas.py)
  • from openai.types.chat import ChatCompletionMessageParam (ragas.py)
  • import openai (mt_bench_common.py)

The review specifically cites "ResponseFunctionToolCallOutputItem.output" changing from string to string|Array as a potential breaking change. This object type does not appear anywhere in the codebase.

The actual patterns used—client.chat.completions.create() and accessing response.choices[0].message.content—are standard APIs stable across openai 2.x versions. Exception handling uses standard openai exception types present in 2.8.1. The openai==2.8.1 pin satisfies the openai>=1.13.3 requirement.

Likely an incorrect or invalid review comment.

Comment on lines +78 to +89
langchain==1.0.8 # via ragas
langchain-classic==1.0.0 # via langchain-community
langchain-community==0.4.1 # via ragas
langchain-core==1.1.0 # via langchain, langchain-classic, langchain-community, langchain-openai, langchain-text-splitters, langgraph, langgraph-checkpoint, langgraph-prebuilt, ragas
langchain-openai==1.0.3 # via ragas
langchain-text-splitters==1.0.0 # via langchain-classic
langdetect==1.0.9 # via lm-eval
langsmith==0.4.4 # via langchain, langchain-community, langchain-core
langgraph==1.0.3 # via langchain
langgraph-checkpoint==3.0.1 # via langgraph, langgraph-prebuilt
langgraph-prebuilt==1.0.5 # via langgraph
langgraph-sdk==0.2.9 # via langgraph
langsmith==0.4.46 # via langchain-classic, langchain-community, langchain-core


⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

CRITICAL: ragas 0.3.2 is incompatible with LangChain 1.1.0/1.0.3 — unresolved from past review.

ragas 0.3.2 requires compatibility with langchain_core/langchain_openai for LLM and embedding integration, but this version expects langchain-core>=0.2,<0.3 and langchain-openai>=0.1,<0.2 (pre-1.0). The constraints file pins langchain-core==1.1.0 and langchain-openai==1.0.3, which are outside ragas's compatibility range and use the v1.0 API path (langchain_core.documents). When ragas calls LangChain internals, it will invoke removed or significantly changed v1.0 APIs, causing runtime failures.

Option A (preferred): Upgrade ragas to a version compatible with LangChain 1.1.0 (check if ragas ≥0.4.x or later supports v1.0).
Option B: Downgrade langchain packages to pre-1.0 versions compatible with ragas 0.3.2.

Ensure all downstream packages that depend on langchain (e.g., vllm, instructor) also align with the chosen langchain version.


I'll verify the key dependency compatibility claims in the review comment. Let me gather information about the critical incompatibilities and major version bumps.


🏁 Script executed:

```shell
#!/bin/bash
# Check for direct openai API usage (not wrapped by langchain-openai)
rg -n "from openai import|import openai|openai\." --type py -A 2 -B 2
```

Length of output: 4424


🏁 Script executed:

```shell
#!/bin/bash
# Check for ragas imports and usage patterns
rg -n "from ragas|import ragas" --type py -A 3
```

Length of output: 1192


🏁 Script executed:

```shell
#!/bin/bash
# Search for version-specific langchain imports or API usage
rg -n "from langchain_core|from langchain_openai|langchain_core\.|langchain_openai\." --type py | head -20
```

Length of output: 0


🌐 Web query:

ragas 0.3.2 langchain version requirements compatibility

💡 Result:

Ragas 0.3.2 expects the LangChain 0.2.x ecosystem (install guidance recommends langchain-core >=0.2,<0.3 and companion packages like langchain-openai >=0.1,<0.2). [1][2]

Sources:

  • Ragas installation docs (LangChain dependency bounds recommendation). [1]
  • conda package metadata for ragas 0.3.2 showing langchain / langchain-core / langchain-openai as dependencies. [2]

🌐 Web query:

numpy 2.2.6 pandas 2.3.3 compatibility

💡 Result:

Yes — pandas 2.3.3 works with NumPy 2.2.6. Pandas added general NumPy 2.x compatibility in the 2.2.x series (see v2.2.2 notes), and NumPy 2.2.6 is a current NumPy 2.x patch release. [1][2]

Sources:

  • pandas whatsnew (2.2.2) — NumPy 2.0 compatibility added. [1]
  • NumPy 2.2.6 release notes. [2]

🌐 Web query:

openai 2.8.1 breaking changes v2.x

💡 Result:

Summary: openai v2.8.1 is a patch release (2025-11-17) with a TypeScript typings bug fix (align input/output item types). It is not a breaking release for the v2.x line. [1][2]

Sources:

  • openai-python v2.8.1 changelog/commit (release entry showing 2.8.1). [1]
  • MR/notes referencing v2.8.1 and the bug fix (align types for TypeScript). [2]

CRITICAL: ragas 0.3.2 is incompatible with LangChain 1.1.0/1.0.3 — unresolved from past review.

ragas 0.3.2 requires langchain-core>=0.2,<0.3 and langchain-openai>=0.1,<0.2 (pre-1.0), but the constraints file pins langchain-core==1.1.0 and langchain-openai==1.0.3. These version ranges do not overlap. The codebase imports ragas directly (src/instructlab/eval/ragas.py), so when ragas attempts to use LangChain internals at runtime, it will invoke APIs that no longer exist or have changed in v1.x, causing failures.

Required fix: Either upgrade ragas to a v0.4.x or later version compatible with LangChain 1.1.0+, or downgrade langchain packages to pre-1.0 versions compatible with ragas 0.3.2. Verify that all downstream packages (vllm, instructor) align with the chosen langchain version.

🤖 Prompt for AI Agents
In constraints-dev.txt around lines 78 to 89, ragas 0.3.2 is incompatible with
the pinned LangChain packages (langchain-core==1.1.0 and
langchain-openai==1.0.3); update the constraints to either upgrade ragas to a
version compatible with LangChain 1.1.x (e.g., ragas>=0.4.x) or downgrade the
langchain packages to the pre-1.0 ranges ragas 0.3.2 requires
(langchain-core>=0.2,<0.3 and langchain-openai>=0.1,<0.2), and then run
dependency resolution tests to ensure all downstream packages (vllm, instructor,
etc.) align with the chosen LangChain version.
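
Those dependency resolution tests can start with a resolver dry run (a sketch; it assumes the constraints pair with a requirements-dev.txt file and that pip 22.2+ is available for --dry-run):

```shell
# Ask pip to resolve the full dev dependency set against the updated
# constraints without installing anything; conflicts surface as resolver errors.
python -m pip install --dry-run --ignore-installed \
    -r requirements-dev.txt -c constraints-dev.txt
```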

numpy==1.26.4 # via accelerate, cupy-cuda12x, datasets, evaluate, gguf, langchain-community, mistral-common, numba, numexpr, opencv-python-headless, outlines, pandas, pandas-stubs, peft, ragas, rouge-score, sacrebleu, scikit-learn, scipy, torchvision, transformers, vllm, xformers
numba==0.61.2 # via vllm
numexpr==2.14.1 # via lm-eval
numpy==2.2.6 # via accelerate, cupy-cuda12x, datasets, evaluate, gguf, langchain-community, mistral-common, numba, numexpr, opencv-python-headless, outlines, pandas, pandas-stubs, peft, ragas, rouge-score, sacrebleu, scikit-learn, scipy, torchvision, transformers, vllm, xformers


⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

Verify numpy 2.2.6 compatibility across the dependency tree.

Pandas 2.2.2 is the first version generally compatible with numpy 2.0, and the constraints file pins pandas 2.3.3, which should work with numpy 2.2.6. However, numpy 2.2.6 is a significant jump from 1.26.4 with potential breaking changes for compiled extensions (e.g., cupy-cuda12x, scipy, scikit-learn, torch). Modules compiled using NumPy 1.x cannot run in NumPy 2.0.0 and may crash; modules must be recompiled with NumPy 2.0 to support both versions.

Verify that all scientific packages in this constraint file (especially torch 2.6.0, scipy 1.16.3, scikit-learn 1.7.2, cupy-cuda12x 13.6.0, and torchvision 0.21.0) are wheels built with numpy 2.x compatibility.


I'll verify the numpy 2.2.6 compatibility concerns in the review comment by checking the constraints file and investigating the compatibility claims.





---

🏁 Script executed:

```shell
#!/bin/bash
# Find and display constraints-dev.txt to verify versions and context
fd -t f 'constraints-dev.txt' --exec cat {} \;
```

Length of output: 15934


🌐 Web query:

numpy 2.2.6 compatibility torch scipy scikit-learn cupy 2024 2025

💡 Result:

Short answer — NumPy 2.2.6 (released May 17, 2025) can break many scientific wheels that were built against NumPy 1.x. Practical compatibility notes:

  • NumPy 2.2.6 release date and notes. [1]
  • PyTorch: older torch wheels built against NumPy 1.x will error on NumPy 2.x (ABI break); PyTorch added NumPy‑2.0 support in the PyTorch 2.3 development line and CI was updated to support NumPy 2.x — upgrade to a torch wheel built against NumPy 2.x or rebuild from source. If you see “A module that was compiled using NumPy 1.x cannot be run in NumPy 2.x…” the workaround is to upgrade torch to a NumPy‑2‑compatible release or downgrade NumPy to <2.0. [2][3][4]
  • SciPy: SciPy releases document explicit NumPy version ranges — many SciPy versions expect NumPy <2.x (or specific upper bounds) so older SciPy wheels will warn/fail with NumPy 2.2.6. Use a SciPy release whose toolchain lists compatibility with NumPy 2.x or downgrade NumPy. Check the SciPy toolchain table for the SciPy version you use. [5][6]
  • scikit‑learn: follows NumPy ABI expectations of compiled C extensions; older scikit‑learn wheels built against NumPy 1.x can fail with NumPy 2.x — upgrade scikit‑learn to a wheel built against NumPy 2.x or use numpy<2 if stuck. (scikit‑learn doesn’t yet pin a single universal range; treat it like other compiled packages.) [see PyTorch/vision discussion re: ecosystem transition]. [4]
  • CuPy: many CuPy releases historically required NumPy 1.x (older install docs list NumPy 1.*). If you use CuPy, confirm the CuPy release notes/wheels for NumPy‑2 support; otherwise expect to need a CuPy build matched to your NumPy or to keep numpy<2. [7][8]

Actionable guidance

  • If you get the “compiled using NumPy 1.x” error: either downgrade NumPy (pip install "numpy<2") or upgrade the affected libraries to versions/wheels built against NumPy 2.x (or rebuild them from source against NumPy ≥2.0). [3][4][7]
  • Before changing anything, check the package’s release/CI notes or wheel metadata to confirm which NumPy ABI it was built against.

Sources

  1. NumPy releases (includes 2.2.6 entry). [numpy.org news]. [1]
  2. PyTorch issues and guidance about NumPy‑2 ABI and required rebuilds / CI updates. [pytorch/pytorch issue threads]. [2][4]
  3. Community reports / StackOverflow on the “module compiled using NumPy 1.x” error and practical workarounds (downgrade numpy or upgrade libs). [StackOverflow]. [3]
  4. PyTorch/vision discussion noting PyTorch’s NumPy‑2.0 support as part of the 2.3 effort. [pytorch/vision issue]. [4]
  5. SciPy toolchain/version tables showing supported NumPy version ranges per SciPy release. [scipy docs]. [5][6]
  6. CuPy installation docs (shows many CuPy releases target NumPy 1.x historically — verify current CuPy release notes for NumPy‑2 support). [cupy docs]. [7][8]

If you want, tell me the exact versions of torch, scipy, scikit‑learn or cupy you use and I’ll check whether those specific wheels are known to support NumPy 2.2.6 (or recommend exact versions to upgrade/downgrade).


🌐 Web query:

pandas 2.3.3 numpy 2.2 compatibility requirements

💡 Result:

Short answer: pandas 2.3.3 is compatible with NumPy 2.x (including NumPy 2.2). Pandas added general NumPy 2.0 compatibility in the 2.2.x series and later 2.3.x releases continue to support NumPy 2.x. [1][2]

Notes/recommendation:

  • If you hit import/runtime errors after upgrading (some third‑party C extensions or optional dependencies have had issues historically), check for patches or pins on those packages (see pandas issue/bug reports). If you need maximum stability, use the wheel provided by your package manager (conda) or pin to the known-working NumPy/pandas pair used in your environment. [3]

Sources:

  • pandas "What's new" (2.2.2) — pandas 2.2.2 added general compatibility with NumPy 2.0. [1]
  • pandas 2.3.x release notes / what's new (2.3.3). [2]
  • Example pandas GitHub issue reporting compatibility/import problems after mixing pandas and NumPy 2.0 (illustrates potential 3rd-party/extension issues). [3]

I need to verify the specific package versions flagged in the review. Let me search for their NumPy 2.x compatibility.

Verify numpy 2.2.6 compatibility—critical torch/torchvision version mismatch found.

The verification reveals a more urgent issue than the original concern: torchvision 0.21.0's wheel metadata declares a requirement for torch >=2.8.0, yet constraints pin torch 2.6.0, creating a direct installation conflict. This must be resolved before testing numpy compatibility.

Regarding the numpy 2.2.6 compatibility concern itself:

  • SciPy 1.13.0 (April 2024) is the first stable release supporting NumPy 2.x, and scipy 1.16.3 (October 2025) is well within the supported range.
  • torch 2.6.0 is post-PyTorch 2.3, which added NumPy 2.x support.
  • scikit-learn 1.4.2 includes support for numpy 2, and scikit-learn 1.7.2 is newer.
  • CuPy v13 is tested against NumPy 1.22–1.25 only; there is no documented support for numpy 2.x in CuPy 13.6.0. CuPy NumPy 2.0 compatibility work is targeted for CuPy v14, meaning 13.6.0 is not validated for numpy 2.x.

Actions required:

  1. Fix the torch/torchvision version mismatch: upgrade torchvision to 0.23+ or downgrade torch to 2.8+.
  2. Verify or downgrade cupy-cuda12x compatibility: numpy 2.2.6 wheels built for cupy 13.6.0 have not been tested and may fail at runtime.
🤖 Prompt for AI Agents
In constraints-dev.txt line 120, numpy is pinned to 2.2.6 but the constraints
reveal a critical torch/torchvision mismatch: torchvision 0.21.0 requires torch
>=2.8.0 while constraints pin torch to 2.6.0; update the constraints so torch
and torchvision are compatible (preferably bump torch to >=2.8.0 to match
torchvision, or else pin torchvision to a release that supports torch 2.6), and
separately verify CuPy compatibility with NumPy 2.x (either pin cupy-cuda12x to
a CuPy version that explicitly supports NumPy 2.x or downgrade NumPy to a 1.x
line until CuPy v14+ is available); update the constraints file accordingly and
run a pip dependency resolution check to confirm no further conflicts.
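
Both the torchvision-declared torch bound and CuPy's declared NumPy bound can be read from installed wheel metadata before deciding how to repin (a sketch; run against the environment built from these constraints):

```shell
# Print the torch requirement declared by the installed torchvision wheel and
# the numpy requirement declared by the installed cupy-cuda12x wheel.
python -c "
from importlib.metadata import requires
for pkg, needle in (('torchvision', 'torch'), ('cupy-cuda12x', 'numpy')):
    for req in requires(pkg) or []:
        if req.lower().startswith(needle):
            print(pkg, '->', req)
"

# Then let pip confirm the installed set is internally consistent.
python -m pip check
```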
