CUDA OOM error and VRAM is not in use #380

Open
EQXai opened this issue Feb 19, 2025 · 5 comments


EQXai commented Feb 19, 2025

```
CUDA error: out of memory
CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.
For debugging consider passing CUDA_LAUNCH_BLOCKING=1
Compile with TORCH_USE_CUDA_DSA to enable device-side assertions.
```
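As the error text suggests, setting `CUDA_LAUNCH_BLOCKING=1` before launching can make the stack trace point at the actual failing kernel rather than a later API call. A minimal sketch (the `main.py` entry point here is an assumption about a typical ComfyUI install):

```shell
# Synchronous kernel launches: CUDA errors surface at the failing call
# instead of a later one. Debugging only; this slows inference down.
export CUDA_LAUNCH_BLOCKING=1
# then start ComfyUI as usual, e.g.:
# python main.py
```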

I'm getting an OOM error on a 5090 with 32 GB of VRAM. The strange thing is that when running the workflow, the VRAM never fills up: it never goes beyond 12 GB before the OOM error is thrown. This only happens with SkyReels; regular Hunyuan works perfectly.

kijai (Owner) commented Feb 19, 2025

Probably due to CFG: the default behavior is to run cond and uncond batched in one pass, which considerably increases VRAM use. I have added an option to disable that and run them sequentially instead:

[Image: screenshot of the new option]
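For context, a minimal sketch of the batched-vs-sequential CFG tradeoff being described (this is illustrative, not the wrapper's actual code; `cfg_denoise` and the `model(x, t, c)` signature are assumptions):

```python
import torch

def cfg_denoise(model, x, t, cond, uncond, scale, batched=True):
    """Classifier-free guidance: combine conditional and unconditional
    predictions. Batched mode runs one forward pass over a doubled batch,
    roughly doubling peak activation memory; sequential mode runs two
    passes at half the peak memory."""
    if batched:
        # One pass: concatenate inputs along the batch dimension.
        x_in = torch.cat([x, x])
        c_in = torch.cat([cond, uncond])
        noise_cond, noise_uncond = model(x_in, t, c_in).chunk(2)
    else:
        # Two passes: lower peak VRAM, slightly slower.
        noise_cond = model(x, t, cond)
        noise_uncond = model(x, t, uncond)
    return noise_uncond + scale * (noise_cond - noise_uncond)
```

Both paths produce the same result; the option only trades speed for peak VRAM.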

EQXai (Author) commented Feb 19, 2025

> Probably due to CFG, as the default behavior is to do cond and uncond batched in one pass which considerably increases VRAM use. I have added option to disable that to do it sequentially instead:
>
> [Image]

I have tried it and it doesn't work. Could it be due to some incompatibility with the RTX 5000 series?

kijai (Owner) commented Feb 19, 2025

Hard to say; I do have a 5090 on another setup and it's working fine so far.

jwoeifjofwefawsfasd commented

Same OOM error with an RTX 5090. @EQXai, were you able to make any progress?

EQXai (Author) commented Feb 20, 2025

> same OOM error with rtx 5090 @EQXai were you able to get any progress?

No, but it could be anything; there are still a lot of things the RTX 5000 series doesn't support yet.
