Commit a011c2e
[pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
pre-commit-ci[bot] committed Feb 25, 2025
1 parent 94f42bf commit a011c2e
Showing 1 changed file with 4 additions and 1 deletion.
5 changes: 4 additions & 1 deletion transformer_engine/pytorch/attention.py
@@ -507,7 +507,10 @@ def get_attention_backend(
     if use_flash_attention and (
         head_dim_qk > 256
         or head_dim_qk % 8 != 0
-        or (head_dim_qk > 192 and device_compute_capability not in ((8, 0), (9, 0), (10, 0), (12, 0)))
+        or (
+            head_dim_qk > 192
+            and device_compute_capability not in ((8, 0), (9, 0), (10, 0), (12, 0))
+        )
     ):
         if _flash_attn_is_installed:
             logger.debug(
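
The change is purely cosmetic: the pre-commit formatter split one long boolean clause across four lines, and the logic is unchanged. As a minimal sketch of what the check decides, the snippet below evaluates the same condition with hypothetical stand-in values for head_dim_qk and device_compute_capability (in Transformer Engine these come from the model config and the active CUDA device, not literals):

# Sketch only: stand-in values, not the real get_attention_backend() inputs.
head_dim_qk = 224                    # hypothetical Q/K head dimension
device_compute_capability = (8, 6)   # hypothetical GPU, e.g. sm_86
use_flash_attention = True

# Same condition as the diff: flash attention is filtered out when the head
# dim exceeds 256, is not a multiple of 8, or exceeds 192 on a compute
# capability outside the allow-listed set.
if use_flash_attention and (
    head_dim_qk > 256
    or head_dim_qk % 8 != 0
    or (
        head_dim_qk > 192
        and device_compute_capability not in ((8, 0), (9, 0), (10, 0), (12, 0))
    )
):
    use_flash_attention = False

print(use_flash_attention)  # False here: 224 > 192 and (8, 6) is not allow-listed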
