
uv rejects flash-attn wheels with local versions #12282

Closed

charliermarsh opened this issue Mar 18, 2025 · 0 comments · Fixed by #12285
Labels
bug Something isn't working

Comments

@charliermarsh (Member)

This seems to have caused some problems with the flash-attn wheels.

# pyproject.toml
...

[tool.uv]
environments = ["sys_platform == 'darwin'", "sys_platform == 'linux'"]
constraint-dependencies = ["torch==2.5.1"]

[tool.uv.sources]
flash_attn = [
  { url = "https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.3/flash_attn-2.7.3+cu12torch2.5cxx11abiFalse-cp310-cp310-linux_x86_64.whl", marker = "sys_platform == 'linux' and python_version == '3.10'"},
  { url = "https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.3/flash_attn-2.7.3+cu12torch2.5cxx11abiFalse-cp311-cp311-linux_x86_64.whl", marker = "sys_platform == 'linux' and python_version == '3.11'"},
  { url = "https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.3/flash_attn-2.7.3+cu12torch2.5cxx11abiFalse-cp312-cp312-linux_x86_64.whl", marker = "sys_platform == 'linux' and python_version == '3.12'"},
  { url = "https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.3/flash_attn-2.7.3+cu12torch2.5cxx11abiFalse-cp313-cp313-linux_x86_64.whl", marker = "sys_platform == 'linux' and python_version == '3.13'"}
]

This now causes `uv sync` to fail with:

error: Failed to parse `uv.lock`
  Caused by: The entry for package `flash-attn` v2.7.3 has wheel `flash_attn-2.7.3+cu12torch2.5cxx11abifalse-cp310-cp310-linux_x86_64.whl` with inconsistent version: v2.7.3+cu12torch2.5cxx11abifalse
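
For context, the relevant `uv.lock` entry looks roughly like the sketch below (field names and layout are approximate, not copied from an actual lockfile): the package is locked as plain 2.7.3, while the wheel filename embeds the `+cu12torch2.5cxx11abiFalse` local version, and lockfile validation rejects the mismatch.

# uv.lock (illustrative sketch, not verbatim)
[[package]]
name = "flash-attn"
version = "2.7.3"
wheels = [
  { url = "https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.3/flash_attn-2.7.3+cu12torch2.5cxx11abiFalse-cp310-cp310-linux_x86_64.whl" },
]
# The wheel filename encodes version 2.7.3+cu12torch2.5cxx11abiFalse, which does
# not match the bare "2.7.3" recorded above, so parsing the lockfile fails.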

Not sure if there is a better way to configure usage of these wheels.

Originally posted by @kleinhenz in #12235 (comment)
