Issues: ROCm/flash-attention
(forked from Dao-AILab/flash-attention)
- #136 [Issue]: ModuleNotFoundError: No module named 'rotary_emb' (Under Investigation), opened Apr 13, 2025 by arunhk3
- #120 [Issue]: Test failing with ROCm 6.3.1 on MI250X (Under Investigation), opened Jan 29, 2025 by al-rigazzi
- #71 [Feature]: Flash Attention 3 Support for MI300X GPUs (Feature Request, Under Investigation), opened Aug 1, 2024 by codinggosu
- #35 Merge to upstream flash-attention repo (Under Investigation, upstream), opened Jan 18, 2024 by ehartford
- #33 replace kernel implementation using CK tile-programming performant kernels (4 tasks), opened Jan 10, 2024 by carlushuang
- #29 Mi50 Support (Feature Request, pre-mi200, Under Investigation), opened Dec 31, 2023 by YehowshuaScaled
- #22 Feature request: Sliding Window Attention (Feature Request, function), opened Nov 29, 2023 by tjtanaa