How to debug tex.fused_attn_bwd getting `cuDNN Error: [cudnn_frontend] Error: No execution plans support the graph` (#1591)
Labels: bug
Describe the bug
Fused attention backward fails with a RuntimeError that carries no informative message. Setting `CUDNN_LOGERR_DBG=1` and `CUDNN_LOGDEST_DBG=stderr` doesn't produce any additional output.
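For reference, turning on the cudnn-frontend graph logs and Transformer Engine's own debug logs, in addition to the backend cuDNN logs tried above, can sometimes reveal which execution-plan check rejected the graph. A sketch, assuming the repro is the attached `test.py`:

```shell
# Backend cuDNN logging (cuDNN 9 uses a single log level; 3 = errors+warnings+info)
export CUDNN_LOGLEVEL_DBG=3
export CUDNN_LOGDEST_DBG=stderr

# cudnn-frontend graph-level logging -- this layer emits the
# "No execution plans support the graph" error, so its log is the
# most likely place to see why each candidate plan was filtered out
export CUDNN_FRONTEND_LOG_INFO=1
export CUDNN_FRONTEND_LOG_FILE=stderr

# Transformer Engine's attention-backend selection log: shows whether
# FusedAttention was chosen and with which parameters
export NVTE_DEBUG=1
export NVTE_DEBUG_LEVEL=2

# then run the repro, e.g.: python test.py
```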
Error message
Steps/Code to reproduce bug
`test.py`
Expected behavior
The backward pass is expected to complete without errors.
Environment overview (please complete the following information)
H100, cuDNN 9.1.0, CUDA 12.3, Python 3.12.6, PyTorch 2.6.0+cu124
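For completeness, the version triple can also be collected programmatically. Note that `torch.backends.cudnn.version()` reports the cuDNN build PyTorch was linked against, which may differ from the system cuDNN 9.1.0 that TE compiled against; a mismatch there is worth ruling out. A small hypothetical check script:

```python
import sys

# Python version, e.g. "3.12.6"
py_ver = sys.version.split()[0]
print("Python:", py_ver)

try:
    import torch

    # PyTorch build and the CUDA / cuDNN versions it was compiled with
    print("PyTorch:", torch.__version__)
    print("CUDA (torch build):", torch.version.cuda)
    print("cuDNN (torch build):", torch.backends.cudnn.version())
except ImportError:
    # Environment without PyTorch installed
    print("PyTorch not installed")
```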
TE installed by compiling commit 8eb1712 from source with uv
compilation script