
Dependency resolution failure in workspaces with identical dependency lists that work in standalone packages #12272

Open
rayanramoul opened this issue Mar 18, 2025 · 3 comments
Labels: question (Asking for clarification or support)

Comments

@rayanramoul

Summary

First of all, thanks to the Astral team for the amazing tool that uv is!

The issue is rather simple: the same dependency list works when it lives in a standalone package, but fails when that same package is part of a workspace.

Reproduction

Given this pyproject.toml:

[project]
name = "subexample"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.10"
dependencies = []

[project.optional-dependencies]
cpu = [
  "torch==2.3.0",
  "torchvision==0.18.0",
  "torchmetrics==0.11.*",
  "jax==0.4.13",
  "jaxlib==0.4.13",
]
gpu = [
  "torch==2.3.0",
  "torchvision==0.18.0",
  "torchmetrics==0.11.*",
  "jax[cuda11_local]==0.4.13; platform_system != 'Darwin'",
  "jaxlib==0.4.13+cuda11.cudnn86; platform_system != 'Darwin'",
  "nvidia-cudnn-cu11>=8.6.0; platform_system != 'Darwin'",
]
tpu = [
  "torch==2.3.0",
  "torchvision==0.18.0",
  "torchmetrics==0.11.*",
  "jax[tpu]==0.4.13; platform_system != 'Darwin'",
  "jaxlib===0.4.13; platform_system != 'Darwin'",
]
tpu-torch = [
  "torch==2.3.0",
  "torchvision==0.18.0",
  "torchmetrics==0.11.*",
  "torch_xla[tpu]==2.3.0",
]

[[tool.uv.index]]
name = "pytorch-cuda11"
url = "https://download.pytorch.org/whl/cu118"
explicit = true

[[tool.uv.index]]
name = "pytorch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true

[tool.uv.sources]
torch = [
  { index = "pytorch-cuda11", extra = "gpu" },
  { index = "pytorch-cpu", extra = "cpu" },
]
torchvision = [
  { index = "pytorch-cuda11", extra = "gpu" },
  { index = "pytorch-cpu", extra = "cpu" },
]

[tool.uv]
prerelease = "allow"
find-links = [
  "https://storage.googleapis.com/jax-releases/jax_releases.html",
  "https://storage.googleapis.com/jax-releases/libtpu_releases.html",
  "https://storage.googleapis.com/jax-releases/jax_cuda_releases.html",
  "https://storage.googleapis.com/libtpu-releases/index.html",
]
conflicts = [ # so uv will resolve the groups independently
  [
    { extra = "gpu" },
    { extra = "cpu" },
    { extra = "tpu" },
    { extra = "tpu-torch" },
  ],
]
package = true

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.hatch.metadata]
allow-direct-references = true

[tool.hatch.build.targets.wheel]
packages = ["src/subexample"]

In a standalone package, uv sync works; but once this same package is a member of a root workspace, uv sync fails, with or without --all-packages.
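As a rough sketch of the workspace setup (the root name and member path below are illustrative, simplified from the actual repro), the workspace root pyproject.toml looks like this:

# Workspace root pyproject.toml (simplified sketch; name and member path are illustrative)
[project]
name = "example-workspace"
version = "0.1.0"
requires-python = ">=3.10"
dependencies = []

[tool.uv.workspace]
# subexample is the member whose pyproject.toml is shown above
members = ["subexample"]

Running uv sync from the workspace root then fails with: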

Using CPython 3.10.15
Creating virtual environment at: .venv
  × No solution found when resolving dependencies:
  ╰─▶ Because there is no version of libtpu-nightly==0.1.dev20240322 and torch-xla[tpu]==2.3.0 depends on libtpu-nightly==0.1.dev20240322, we can
      conclude that torch-xla[tpu]==2.3.0 cannot be used.
      And because subexample[tpu-torch] depends on torch-xla[tpu]==2.3.0 and your workspace requires subexample[tpu-torch], we can conclude that your
      workspace's requirements are unsatisfiable.

I created this reproduction repository, in which the folder simple-package-working/ contains the standalone-package version, which works, and workspace-not-working/ contains the same package wrapped in a workspace, which does not.
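Roughly, the repository is laid out like this (a simplified sketch; the exact internal paths may differ):

simple-package-working/
    pyproject.toml          # the pyproject.toml shown above; uv sync succeeds here
    src/subexample/
workspace-not-working/
    pyproject.toml          # workspace root declaring [tool.uv.workspace] members
    subexample/
        pyproject.toml      # same dependency list as above; uv sync fails here
        src/subexample/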

Thank you!

Platform

macOS arm64 and Ubuntu x86_64

Version

uv 0.6.7 (029b9e1 2025-03-17)

Python version

3.10

rayanramoul added the bug (Something isn't working) label on Mar 18, 2025
@charliermarsh (Member)

I think the issue here is that we (intentionally) don't look at tool.uv configuration outside of the workspace root (except for tool.uv.sources, which can be defined per member). Does it work as expected if you lift the find-links setting into the root pyproject.toml?
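For example, something like this in the root pyproject.toml:

# Workspace root pyproject.toml: settings lifted from the member's [tool.uv] section
[tool.uv]
prerelease = "allow"
find-links = [
  "https://storage.googleapis.com/jax-releases/jax_releases.html",
  "https://storage.googleapis.com/jax-releases/libtpu_releases.html",
  "https://storage.googleapis.com/jax-releases/jax_cuda_releases.html",
  "https://storage.googleapis.com/libtpu-releases/index.html",
]

(The same would apply to the other tool.uv settings in the member, like prerelease and conflicts, since those are also only read from the workspace root.)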

charliermarsh added the question (Asking for clarification or support) label and removed the bug label on Mar 18, 2025
@rayanramoul (Author)

> I think the issue here is that we (intentionally) don't look at tool.uv configuration outside of the workspace root (except for tool.uv.sources, which can be defined per member). Does it work as expected if you lift the find-links setting into the root pyproject.toml?

It solves it to a degree, but it then runs into the same problem described in this issue, which makes the CI and the uv-lock action fail every time.

@charliermarsh (Member)

Okay, that makes sense. That's the bug to fix, though -- I don't think there's anything to change here specifically.
