Enable llama fp8 masked_flash_attention 8 #984

Draft
AmosLewis wants to merge 6 commits into nod-ai:main from AmosLewis:enable_kernel_fp8_attn8

Commits

Commits on Feb 4, 2025

Commits on Feb 19, 2025