Attention sink support #533
base: main
Conversation
kareemshaik80 commented Sep 25, 2025
- Support for a single sink logit in flash attention decode
- Add sink to softmax (see the sketch after this list)
- Command-line flag added to enable attention sink
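A minimal sketch of what these bullets describe, assuming the sink is a single scalar logit folded into the softmax normalizer at decode time; `softmax_with_sink` and `sink_logit` are illustrative names, not the PR's actual kernel code:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Softmax over decode attention scores with a single attention sink:
// the sink contributes exp(sink_logit) to the denominator but emits no
// value, so the remaining probabilities can sum to less than 1.
// Names here are hypothetical; the PR's kernel-side plumbing differs.
std::vector<float> softmax_with_sink(std::vector<float> scores, float sink_logit) {
  // Include the sink in the running max for numerical stability,
  // as a flash-attention-style online softmax would.
  float max_val = sink_logit;
  for (float s : scores) max_val = std::max(max_val, s);

  float denom = std::exp(sink_logit - max_val);  // sink term in the normalizer
  for (float s : scores) denom += std::exp(s - max_val);

  for (float& s : scores) s = std::exp(s - max_val) / denom;
  return scores;
}
```

Presumably the point of gating this behind a flag is that with the flag off the sink term never enters the normalizer and the kernel reduces to standard softmax.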
Commit: support for single sink logit in flash attention decode; add sink to softmax; command-line flag to enable attention sink. Signed-off-by: kareem <[email protected]>
Also need a paper/code reference to ensure this PR does what is intended.
applications/flash_attention_v2/kernel/xe_flash_attn_decode.hpp
not changed
you can refer to this: https://arxiv.org/pdf/2309.17453
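For readers following that reference: the paper's "Zero Sink" variant is SoftMax-off-by-one, and the single learned sink logit $s$ this PR appears to add generalizes it. A minimal statement, under that reading:

```latex
\operatorname{softmax}_{\mathrm{sink}}(x)_i \;=\; \frac{e^{x_i}}{e^{s} + \sum_{j=1}^{n} e^{x_j}},
\qquad s = 0 \;\Rightarrow\; \frac{e^{x_i}}{1 + \sum_{j=1}^{n} e^{x_j}}.
```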
Move test under unit tests.
examples/06_bmg_flash_attention/bmg_flash_attn_decode_runner.hpp
applications/flash_attention_v2/collective/xe_flash_attn_decode_softmax_epilogue.hpp
examples/06_bmg_flash_attention/bmg_flash_attn_decode_runner.hpp
Signed-off-by: kareem <[email protected]>
applications/flash_attention_v2/collective/xe_flash_attn_decode_softmax_epilogue.hpp
@kareemshaik80 I believe this implementation needs to change based on PR #547.