ComfyUI Error Report
System Information
Python Version: 3.12.8 (main, Jan 14 2025, 22:49:36) [MSC v.1942 64 bit (AMD64)]
Embedded Python: false
PyTorch Version: 2.6.0+cu126
Devices
Name: cuda:0 NVIDIA GeForce RTX 4070 Ti SUPER : cudaMallocAsync
Type: cuda
VRAM Total: 17170956288
VRAM Free: 5840684976
Torch VRAM Total: 9999220736
Torch VRAM Free: 89245616
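Note: the figures above are raw byte counts, so 17170956288 bytes is roughly 16376 MiB, matching the "Total VRAM 16376 MB" line in the log below. A minimal sketch (not part of the original report) of how to read the same numbers with PyTorch:

```python
import torch

# Reproduce the numbers above: the report's totals are plain byte counts on cuda:0.
if torch.cuda.is_available():
    free_b, total_b = torch.cuda.mem_get_info(0)  # returns (free, total) in bytes
    print(f"VRAM total: {total_b / 2**20:.0f} MiB")  # 17170956288 B ~= 16376 MiB
    print(f"VRAM free:  {free_b / 2**20:.0f} MiB")
```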
Logs
2025-02-24T17:39:11.006602 - Adding extra search path custom_nodes D:\ComfyUI\custom_nodes
2025-02-24T17:39:11.006602 - Adding extra search path download_model_base D:\ComfyUI\models
2025-02-24T17:39:11.006602 - Adding extra search path custom_nodes C:\Users\Jerry\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes
2025-02-24T17:39:11.006602 - Setting output directory to: D:\ComfyUI\output
2025-02-24T17:39:11.006602 - Setting input directory to: D:\ComfyUI\input
2025-02-24T17:39:11.006602 - Setting user directory to: D:\ComfyUI\user
2025-02-24T17:39:11.154088 - [START] Security scan
2025-02-24T17:39:11.739813 - [DONE] Security scan
2025-02-24T17:39:11.831689 - ## ComfyUI-Manager: installing dependencies done.
2025-02-24T17:39:11.831689 - ** ComfyUI startup time: 2025-02-24 17:39:11.831
2025-02-24T17:39:11.831689 - ** Platform: Windows
2025-02-24T17:39:11.831689 - ** Python version: 3.12.8 (main, Jan 14 2025, 22:49:36) [MSC v.1942 64 bit (AMD64)]
2025-02-24T17:39:11.831689 - ** Python executable: D:\ComfyUI\.venv\Scripts\python.exe
2025-02-24T17:39:11.831689 - ** ComfyUI Path: C:\Users\Jerry\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI
2025-02-24T17:39:11.831689 - ** ComfyUI Base Folder Path: C:\Users\Jerry\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI
2025-02-24T17:39:11.831689 - ** User directory: D:\ComfyUI\user
2025-02-24T17:39:11.832689 - ** ComfyUI-Manager config path: D:\ComfyUI\user\default\ComfyUI-Manager\config.ini
2025-02-24T17:39:11.832689 - ** Log path: D:\ComfyUI\user\comfyui.log
2025-02-24T17:39:12.560625 - Prestartup times for custom nodes:
2025-02-24T17:39:12.560625 -    1.6 seconds: C:\Users\Jerry\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager
2025-02-24T17:39:13.522652 - Checkpoint files will always be loaded safely.
2025-02-24T17:39:13.622651 - Total VRAM 16376 MB, total RAM 65349 MB
2025-02-24T17:39:13.622651 - pytorch version: 2.6.0+cu126
2025-02-24T17:39:13.623651 - Set vram state to: NORMAL_VRAM
2025-02-24T17:39:13.623651 - Device: cuda:0 NVIDIA GeForce RTX 4070 Ti SUPER : cudaMallocAsync
2025-02-24T17:39:14.235558 - Using pytorch attention
2025-02-24T17:39:15.491464 - ComfyUI version: 0.3.14
2025-02-24T17:39:15.521465 - [Prompt Server] web root: C:\Users\Jerry\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\web_custom_versions\desktop_app
2025-02-24T17:39:15.849736 - [AnimateDiffEvo] - ERROR - No motion models found. Please download one and place in: ['D:\\ComfyUI\\custom_nodes\\comfyui-animatediff-evolved\\models', 'D:\\ComfyUI\\models\\animatediff_models']
2025-02-24T17:39:16.534621 - Total VRAM 16376 MB, total RAM 65349 MB
2025-02-24T17:39:16.534621 - pytorch version: 2.6.0+cu126
2025-02-24T17:39:16.534621 - Set vram state to: NORMAL_VRAM
2025-02-24T17:39:16.534621 - Device: cuda:0 NVIDIA GeForce RTX 4070 Ti SUPER : cudaMallocAsync
2025-02-24T17:39:16.626648 - ### Loading: ComfyUI-Manager (V3.17.7)
2025-02-24T17:39:16.626648 - ### ComfyUI Revision: UNKNOWN (The currently installed ComfyUI is not a Git repository)
2025-02-24T17:39:16.632220 - Import times for custom nodes:
2025-02-24T17:39:16.632220 - 0.0 seconds: C:\Users\Jerry\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\websocket_image_save.py
2025-02-24T17:39:16.632220 - 0.0 seconds: C:\Users\Jerry\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager
2025-02-24T17:39:16.632220 - 0.0 seconds: D:\ComfyUI\custom_nodes\comfyui-animatediff-evolved
2025-02-24T17:39:16.632220 - 0.0 seconds: D:\ComfyUI\custom_nodes\comfyui-kjnodes
2025-02-24T17:39:16.632220 - 0.1 seconds: D:\ComfyUI\custom_nodes\comfyui-videohelpersuite
2025-02-24T17:39:16.632722 - 0.1 seconds: D:\ComfyUI\custom_nodes\comfyui-hunyuanvideowrapper
2025-02-24T17:39:16.632722 - 0.6 seconds: D:\ComfyUI\custom_nodes\comfyui-hunyan3dwrapper
2025-02-24T17:39:16.639191 - Starting server
2025-02-24T17:39:16.639191 - To see the GUI go to: http://127.0.0.1:8000
2025-02-24T17:39:17.278575 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
2025-02-24T17:39:17.431153 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
2025-02-24T17:39:17.503163 - FETCH DATA from: C:\Users\Jerry\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\custom_nodes\ComfyUI-Manager\extension-node-map.json [DONE]
2025-02-24T17:39:17.685932 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json
2025-02-24T17:39:17.931092 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json
2025-02-24T17:39:17.993752 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
2025-02-24T17:39:22.740542 - got prompt
2025-02-24T17:39:24.349199 - encoded latents shape torch.Size([1, 16, 1, 68, 120])
2025-02-24T17:39:24.350199 - Loading text encoder model (clipL) from: D:\ComfyUI\models\clip\clip-vit-large-patch14
2025-02-24T17:39:24.724998 - Some weights of CLIPTextModel were not initialized from the model checkpoint at D:\ComfyUI\models\clip\clip-vit-large-patch14 and are newly initialized: ['text_model.embeddings.position_embedding.weight', 'text_model.embeddings.token_embedding.weight', 'text_model.encoder.layers.0.layer_norm1.bias', 'text_model.encoder.layers.0.layer_norm1.weight', 'text_model.encoder.layers.0.layer_norm2.bias', 'text_model.encoder.layers.0.layer_norm2.weight', 'text_model.encoder.layers.0.mlp.fc1.bias', 'text_model.encoder.layers.0.mlp.fc1.weight', 'text_model.encoder.layers.0.mlp.fc2.bias', 'text_model.encoder.layers.0.mlp.fc2.weight', 'text_model.encoder.layers.0.self_attn.k_proj.bias', 'text_model.encoder.layers.0.self_attn.k_proj.weight', 'text_model.encoder.layers.0.self_attn.out_proj.bias', 'text_model.encoder.layers.0.self_attn.out_proj.weight', 'text_model.encoder.layers.0.self_attn.q_proj.bias', 'text_model.encoder.layers.0.self_attn.q_proj.weight', 'text_model.encoder.layers.0.self_attn.v_proj.bias', 'text_model.encoder.layers.0.self_attn.v_proj.weight', 'text_model.encoder.layers.1.layer_norm1.bias', 'text_model.encoder.layers.1.layer_norm1.weight', 'text_model.encoder.layers.1.layer_norm2.bias', 'text_model.encoder.layers.1.layer_norm2.weight', 'text_model.encoder.layers.1.mlp.fc1.bias', 'text_model.encoder.layers.1.mlp.fc1.weight', 'text_model.encoder.layers.1.mlp.fc2.bias', 'text_model.encoder.layers.1.mlp.fc2.weight', 'text_model.encoder.layers.1.self_attn.k_proj.bias', 'text_model.encoder.layers.1.self_attn.k_proj.weight', 'text_model.encoder.layers.1.self_attn.out_proj.bias', 'text_model.encoder.layers.1.self_attn.out_proj.weight', 'text_model.encoder.layers.1.self_attn.q_proj.bias', 'text_model.encoder.layers.1.self_attn.q_proj.weight', 'text_model.encoder.layers.1.self_attn.v_proj.bias', 'text_model.encoder.layers.1.self_attn.v_proj.weight', 'text_model.encoder.layers.10.layer_norm1.bias', 'text_model.encoder.layers.10.layer_norm1.weight', 'text_model.encoder.layers.10.layer_norm2.bias', 'text_model.encoder.layers.10.layer_norm2.weight', 'text_model.encoder.layers.10.mlp.fc1.bias', 'text_model.encoder.layers.10.mlp.fc1.weight', 'text_model.encoder.layers.10.mlp.fc2.bias', 'text_model.encoder.layers.10.mlp.fc2.weight', 'text_model.encoder.layers.10.self_attn.k_proj.bias', 'text_model.encoder.layers.10.self_attn.k_proj.weight', 'text_model.encoder.layers.10.self_attn.out_proj.bias', 'text_model.encoder.layers.10.self_attn.out_proj.weight', 'text_model.encoder.layers.10.self_attn.q_proj.bias', 'text_model.encoder.layers.10.self_attn.q_proj.weight', 'text_model.encoder.layers.10.self_attn.v_proj.bias', 'text_model.encoder.layers.10.self_attn.v_proj.weight', 'text_model.encoder.layers.11.layer_norm1.bias', 'text_model.encoder.layers.11.layer_norm1.weight', 'text_model.encoder.layers.11.layer_norm2.bias', 'text_model.encoder.layers.11.layer_norm2.weight', 'text_model.encoder.layers.11.mlp.fc1.bias', 'text_model.encoder.layers.11.mlp.fc1.weight', 'text_model.encoder.layers.11.mlp.fc2.bias', 'text_model.encoder.layers.11.mlp.fc2.weight', 'text_model.encoder.layers.11.self_attn.k_proj.bias', 'text_model.encoder.layers.11.self_attn.k_proj.weight', 'text_model.encoder.layers.11.self_attn.out_proj.bias', 'text_model.encoder.layers.11.self_attn.out_proj.weight', 'text_model.encoder.layers.11.self_attn.q_proj.bias', 'text_model.encoder.layers.11.self_attn.q_proj.weight', 'text_model.encoder.layers.11.self_attn.v_proj.bias', 
'text_model.encoder.layers.11.self_attn.v_proj.weight', 'text_model.encoder.layers.2.layer_norm1.bias', 'text_model.encoder.layers.2.layer_norm1.weight', 'text_model.encoder.layers.2.layer_norm2.bias', 'text_model.encoder.layers.2.layer_norm2.weight', 'text_model.encoder.layers.2.mlp.fc1.bias', 'text_model.encoder.layers.2.mlp.fc1.weight', 'text_model.encoder.layers.2.mlp.fc2.bias', 'text_model.encoder.layers.2.mlp.fc2.weight', 'text_model.encoder.layers.2.self_attn.k_proj.bias', 'text_model.encoder.layers.2.self_attn.k_proj.weight', 'text_model.encoder.layers.2.self_attn.out_proj.bias', 'text_model.encoder.layers.2.self_attn.out_proj.weight', 'text_model.encoder.layers.2.self_attn.q_proj.bias', 'text_model.encoder.layers.2.self_attn.q_proj.weight', 'text_model.encoder.layers.2.self_attn.v_proj.bias', 'text_model.encoder.layers.2.self_attn.v_proj.weight', 'text_model.encoder.layers.3.layer_norm1.bias', 'text_model.encoder.layers.3.layer_norm1.weight', 'text_model.encoder.layers.3.layer_norm2.bias', 'text_model.encoder.layers.3.layer_norm2.weight', 'text_model.encoder.layers.3.mlp.fc1.bias', 'text_model.encoder.layers.3.mlp.fc1.weight', 'text_model.encoder.layers.3.mlp.fc2.bias', 'text_model.encoder.layers.3.mlp.fc2.weight', 'text_model.encoder.layers.3.self_attn.k_proj.bias', 'text_model.encoder.layers.3.self_attn.k_proj.weight', 'text_model.encoder.layers.3.self_attn.out_proj.bias', 'text_model.encoder.layers.3.self_attn.out_proj.weight', 'text_model.encoder.layers.3.self_attn.q_proj.bias', 'text_model.encoder.layers.3.self_attn.q_proj.weight', 'text_model.encoder.layers.3.self_attn.v_proj.bias', 'text_model.encoder.layers.3.self_attn.v_proj.weight', 'text_model.encoder.layers.4.layer_norm1.bias', 'text_model.encoder.layers.4.layer_norm1.weight', 'text_model.encoder.layers.4.layer_norm2.bias', 'text_model.encoder.layers.4.layer_norm2.weight', 'text_model.encoder.layers.4.mlp.fc1.bias', 'text_model.encoder.layers.4.mlp.fc1.weight', 'text_model.encoder.layers.4.mlp.fc2.bias', 'text_model.encoder.layers.4.mlp.fc2.weight', 'text_model.encoder.layers.4.self_attn.k_proj.bias', 'text_model.encoder.layers.4.self_attn.k_proj.weight', 'text_model.encoder.layers.4.self_attn.out_proj.bias', 'text_model.encoder.layers.4.self_attn.out_proj.weight', 'text_model.encoder.layers.4.self_attn.q_proj.bias', 'text_model.encoder.layers.4.self_attn.q_proj.weight', 'text_model.encoder.layers.4.self_attn.v_proj.bias', 'text_model.encoder.layers.4.self_attn.v_proj.weight', 'text_model.encoder.layers.5.layer_norm1.bias', 'text_model.encoder.layers.5.layer_norm1.weight', 'text_model.encoder.layers.5.layer_norm2.bias', 'text_model.encoder.layers.5.layer_norm2.weight', 'text_model.encoder.layers.5.mlp.fc1.bias', 'text_model.encoder.layers.5.mlp.fc1.weight', 'text_model.encoder.layers.5.mlp.fc2.bias', 'text_model.encoder.layers.5.mlp.fc2.weight', 'text_model.encoder.layers.5.self_attn.k_proj.bias', 'text_model.encoder.layers.5.self_attn.k_proj.weight', 'text_model.encoder.layers.5.self_attn.out_proj.bias', 'text_model.encoder.layers.5.self_attn.out_proj.weight', 'text_model.encoder.layers.5.self_attn.q_proj.bias', 'text_model.encoder.layers.5.self_attn.q_proj.weight', 'text_model.encoder.layers.5.self_attn.v_proj.bias', 'text_model.encoder.layers.5.self_attn.v_proj.weight', 'text_model.encoder.layers.6.layer_norm1.bias', 'text_model.encoder.layers.6.layer_norm1.weight', 'text_model.encoder.layers.6.layer_norm2.bias', 'text_model.encoder.layers.6.layer_norm2.weight', 'text_model.encoder.layers.6.mlp.fc1.bias', 
'text_model.encoder.layers.6.mlp.fc1.weight', 'text_model.encoder.layers.6.mlp.fc2.bias', 'text_model.encoder.layers.6.mlp.fc2.weight', 'text_model.encoder.layers.6.self_attn.k_proj.bias', 'text_model.encoder.layers.6.self_attn.k_proj.weight', 'text_model.encoder.layers.6.self_attn.out_proj.bias', 'text_model.encoder.layers.6.self_attn.out_proj.weight', 'text_model.encoder.layers.6.self_attn.q_proj.bias', 'text_model.encoder.layers.6.self_attn.q_proj.weight', 'text_model.encoder.layers.6.self_attn.v_proj.bias', 'text_model.encoder.layers.6.self_attn.v_proj.weight', 'text_model.encoder.layers.7.layer_norm1.bias', 'text_model.encoder.layers.7.layer_norm1.weight', 'text_model.encoder.layers.7.layer_norm2.bias', 'text_model.encoder.layers.7.layer_norm2.weight', 'text_model.encoder.layers.7.mlp.fc1.bias', 'text_model.encoder.layers.7.mlp.fc1.weight', 'text_model.encoder.layers.7.mlp.fc2.bias', 'text_model.encoder.layers.7.mlp.fc2.weight', 'text_model.encoder.layers.7.self_attn.k_proj.bias', 'text_model.encoder.layers.7.self_attn.k_proj.weight', 'text_model.encoder.layers.7.self_attn.out_proj.bias', 'text_model.encoder.layers.7.self_attn.out_proj.weight', 'text_model.encoder.layers.7.self_attn.q_proj.bias', 'text_model.encoder.layers.7.self_attn.q_proj.weight', 'text_model.encoder.layers.7.self_attn.v_proj.bias', 'text_model.encoder.layers.7.self_attn.v_proj.weight', 'text_model.encoder.layers.8.layer_norm1.bias', 'text_model.encoder.layers.8.layer_norm1.weight', 'text_model.encoder.layers.8.layer_norm2.bias', 'text_model.encoder.layers.8.layer_norm2.weight', 'text_model.encoder.layers.8.mlp.fc1.bias', 'text_model.encoder.layers.8.mlp.fc1.weight', 'text_model.encoder.layers.8.mlp.fc2.bias', 'text_model.encoder.layers.8.mlp.fc2.weight', 'text_model.encoder.layers.8.self_attn.k_proj.bias', 'text_model.encoder.layers.8.self_attn.k_proj.weight', 'text_model.encoder.layers.8.self_attn.out_proj.bias', 'text_model.encoder.layers.8.self_attn.out_proj.weight', 'text_model.encoder.layers.8.self_attn.q_proj.bias', 'text_model.encoder.layers.8.self_attn.q_proj.weight', 'text_model.encoder.layers.8.self_attn.v_proj.bias', 'text_model.encoder.layers.8.self_attn.v_proj.weight', 'text_model.encoder.layers.9.layer_norm1.bias', 'text_model.encoder.layers.9.layer_norm1.weight', 'text_model.encoder.layers.9.layer_norm2.bias', 'text_model.encoder.layers.9.layer_norm2.weight', 'text_model.encoder.layers.9.mlp.fc1.bias', 'text_model.encoder.layers.9.mlp.fc1.weight', 'text_model.encoder.layers.9.mlp.fc2.bias', 'text_model.encoder.layers.9.mlp.fc2.weight', 'text_model.encoder.layers.9.self_attn.k_proj.bias', 'text_model.encoder.layers.9.self_attn.k_proj.weight', 'text_model.encoder.layers.9.self_attn.out_proj.bias', 'text_model.encoder.layers.9.self_attn.out_proj.weight', 'text_model.encoder.layers.9.self_attn.q_proj.bias', 'text_model.encoder.layers.9.self_attn.q_proj.weight', 'text_model.encoder.layers.9.self_attn.v_proj.bias', 'text_model.encoder.layers.9.self_attn.v_proj.weight', 'text_model.final_layer_norm.bias', 'text_model.final_layer_norm.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
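Note: this transformers warning means no usable weight files were found in D:\ComfyUI\models\clip\clip-vit-large-patch14, so every CLIP-L layer was randomly initialized; the folder most likely holds only config/tokenizer files. Worth fixing independently of the crash below. A sketch of one way to repopulate the folder, assuming the missing-weights diagnosis is correct:

```python
from huggingface_hub import snapshot_download

# Assumption: the local folder has configs but no *.safetensors/*.bin weights,
# which is what makes transformers initialize every layer from scratch.
snapshot_download(
    repo_id="openai/clip-vit-large-patch14",  # same repo the workflow's text encoder node names
    local_dir=r"D:\ComfyUI\models\clip\clip-vit-large-patch14",
)
```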
2025-02-24T17:39:24.756984 - Text encoder to dtype: torch.float16
2025-02-24T17:39:24.756984 - Loading tokenizer (clipL) from: D:\ComfyUI\models\clip\clip-vit-large-patch14
2025-02-24T17:39:24.826411 - Loading text encoder model (llm) from: D:\ComfyUI\models\LLM\llava-llama-3-8b-text-encoder-tokenizer
2025-02-24T17:39:25.126958 - Loading checkpoint shards:   0%|          | 0/4 [00:00<?, ?it/s]
2025-02-24T17:39:25.647962 - FETCH ComfyRegistry Data: 5/35
2025-02-24T17:39:31.424167 - Loading checkpoint shards: 100%|██████████| 4/4 [00:06<00:00, 1.57s/it]
2025-02-24T17:39:33.313355 - Text encoder to dtype: torch.float16
2025-02-24T17:39:33.314354 - Loading tokenizer (llm) from: D:\ComfyUI\models\LLM\llava-llama-3-8b-text-encoder-tokenizer
2025-02-24T17:39:33.626198 - FETCH ComfyRegistry Data: 10/35
2025-02-24T17:39:36.130589 - llm prompt attention_mask shape: torch.Size([1, 161]), masked tokens: 31
2025-02-24T17:39:38.624967 - clipL prompt attention_mask shape: torch.Size([1, 77]), masked tokens: 34
2025-02-24T17:39:38.761812 - model_type FLOW
2025-02-24T17:39:38.761812 - The config attributes {'use_flow_sigmas': True, 'prediction_type': 'flow_prediction'} were passed to FlowMatchDiscreteScheduler, but are not expected and will be ignored. Please verify your scheduler_config.json configuration file.
2025-02-24T17:39:38.761812 - Scheduler config: FrozenDict({'num_train_timesteps': 1000, 'flow_shift': 9.0, 'reverse': True, 'solver': 'euler', 'n_tokens': None, '_use_default_values': ['n_tokens', 'num_train_timesteps']})
2025-02-24T17:39:38.761812 - Using accelerate to load and assign model weights to device...
2025-02-24T17:39:38.873082 - Requested to load HyVideoModel
2025-02-24T17:39:41.486239 - FETCH ComfyRegistry Data: 15/35
2025-02-24T17:39:46.569159 - loaded completely 13591.419415283202 13585.720825195312 False
2025-02-24T17:39:49.271718 - Input (height, width, video_length) = (544, 960, 73)
2025-02-24T17:39:49.277718 - The config attributes {'reverse': True, 'solver': 'euler'} were passed to DPMSolverMultistepScheduler, but are not expected and will be ignored. Please verify your scheduler_config.json configuration file.
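Note: both scheduler warnings (here and under "model_type FLOW" above) are benign; diffusers-style schedulers simply drop config keys their constructor does not accept. A minimal illustration of that behavior, my own sketch rather than the wrapper's actual code:

```python
import inspect

def accepted_config(scheduler_cls, config: dict) -> dict:
    """Keep only the keys the scheduler's __init__ accepts; the remainder
    is what triggers the "not expected and will be ignored" warnings."""
    params = set(inspect.signature(scheduler_cls.__init__).parameters) - {"self"}
    return {k: v for k, v in config.items() if k in params}
```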
2025-02-24T17:39:49.335344 - FETCH ComfyRegistry Data: 20/35
2025-02-24T17:39:49.744504 - Swapping 20 double blocks and 10 single blocks
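Note: the 20/10 split matches the HyVideoBlockSwap node in the attached workflow; as the workflow's own note says, block swap is a manual form of CPU offloading. An illustrative sketch of the idea, not the wrapper's actual implementation:

```python
import torch

def run_swapped(blocks, x, device="cuda"):
    # Offloaded blocks live on the CPU; each one is moved to the GPU only
    # for its own forward pass, trading transfer time for VRAM headroom.
    for block in blocks:
        block.to(device)
        x = block(x)
        block.to("cpu")
        torch.cuda.empty_cache()  # optionally release the paged-out weights
    return x
```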
2025-02-24T17:39:52.368317 - image_cond_latents shape: torch.Size([1, 16, 1, 68, 120])
2025-02-24T17:39:52.369318 - image_latents shape: torch.Size([1, 16, 19, 68, 120])
2025-02-24T17:39:52.610941 - Sampling 73 frames in 19 latents at 960x544 with 30 inference steps
2025-02-24T17:39:53.113948 -   0%|          | 0/30 [00:00<?, ?it/s]
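Note: these shapes are self-consistent, assuming the usual HunyuanVideo causal-VAE compression of 8x spatial and 4x temporal: 960x544 pixels become 120x68 latents, and 73 frames become 19 latent frames. Worked out:

```python
width, height, video_length = 960, 544, 73

# Assumed HunyuanVideo causal-VAE factors: 8x spatial, 4x temporal.
lat_w, lat_h = width // 8, height // 8   # 120, 68
lat_t = (video_length - 1) // 4 + 1      # 19
print(lat_t, lat_h, lat_w)               # 19 68 120 -> torch.Size([1, 16, 19, 68, 120])
```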
2025-02-24T17:39:53.182941 - !!! Exception during processing !!! 'NoneType' object is not callable
2025-02-24T17:39:53.186632 - Traceback (most recent call last):
  File "C:\Users\Jerry\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\execution.py", line 327, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Jerry\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\execution.py", line 202, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Jerry\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\execution.py", line 174, in _map_node_over_list
    process_inputs(input_dict, i)
  File "C:\Users\Jerry\AppData\Local\Programs\@comfyorgcomfyui-electron\resources\ComfyUI\execution.py", line 163, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\custom_nodes\comfyui-hunyuanvideowrapper\nodes.py", line 1281, in process
    out_latents = model["pipe"](
                  ^^^^^^^^^^^^^^
  File "D:\ComfyUI\.venv\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\custom_nodes\comfyui-hunyuanvideowrapper\hyvideo\diffusion\pipelines\pipeline_hunyuan_video.py", line 785, in __call__
    uncond = self.transformer(
             ^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\.venv\Lib\site-packages\torch\nn\modules\module.py", line 1739, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\.venv\Lib\site-packages\torch\nn\modules\module.py", line 1750, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\custom_nodes\comfyui-hunyuanvideowrapper\hyvideo\modules\models.py", line 1047, in forward
    img, txt = _process_double_blocks(img, txt, vec, block_args)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\custom_nodes\comfyui-hunyuanvideowrapper\hyvideo\modules\models.py", line 894, in _process_double_blocks
    img, txt = block(img, txt, vec, *block_args)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\.venv\Lib\site-packages\torch\nn\modules\module.py", line 1739, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\.venv\Lib\site-packages\torch\nn\modules\module.py", line 1750, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ComfyUI\custom_nodes\comfyui-hunyuanvideowrapper\hyvideo\modules\models.py", line 257, in forward
    attn = attention(
           ^^^^^^^^^^
  File "D:\ComfyUI\custom_nodes\comfyui-hunyuanvideowrapper\hyvideo\modules\attention.py", line 189, in attention
    x = flash_attn_varlen_func(
        ^^^^^^^^^^^^^^^^^^^^^^^
TypeError: 'NoneType' object is not callable
2025-02-24T17:39:53.187633 - Prompt executed in 30.45 seconds
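Note on the crash: `flash_attn_varlen_func` is None at call time, the classic symptom of flash-attn not being installed while the HyVideoModelLoader node is set to "flash_attn_varlen" (visible in the attached workflow). Libraries typically guard that import like this; a sketch of the pattern, not necessarily the file's exact contents:

```python
# Typical guarded import: if flash-attn is absent the name stays None,
# and calling it later raises: TypeError: 'NoneType' object is not callable.
try:
    from flash_attn import flash_attn_varlen_func
except ImportError:
    flash_attn_varlen_func = None
```

The likely fix is either installing a flash-attn build that matches this environment (Python 3.12, torch 2.6.0+cu126) or switching the loader's attention mode to sdpa; the workflow's own note says to use sageattention if you can, flash attention if you can't, and sdpa as a last resort.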
Attached Workflow
Please make sure that workflow does not contain any sensitive information such as API keys or passwords.
{"last_node_id":71,"last_link_id":100,"nodes":[{"id":57,"type":"HyVideoTorchCompileSettings","pos":[-1196.784912109375,-308.2270812988281],"size":[441,274],"flags":{},"order":0,"mode":0,"inputs":[],"outputs":[{"name":"torch_compile_args","type":"COMPILEARGS","links":[]}],"properties":{"Node name for S&R":"HyVideoTorchCompileSettings"},"widgets_values":["inductor",false,"default",false,64,true,true,false,false,false]},{"id":52,"type":"ImageConcatMulti","pos":[811.2913818359375,20.121240615844727],"size":[210,150],"flags":{},"order":20,"mode":0,"inputs":[{"name":"image_1","type":"IMAGE","link":71},{"name":"image_2","type":"IMAGE","link":85}],"outputs":[{"name":"images","type":"IMAGE","links":[73],"slot_index":0}],"properties":{},"widgets_values":[2,"right",false,null]},{"id":5,"type":"HyVideoDecode","pos":[651,-285],"size":[345.4285888671875,150],"flags":{},"order":18,"mode":0,"inputs":[{"name":"vae","type":"VAE","link":6},{"name":"samples","type":"LATENT","link":4}],"outputs":[{"name":"images","type":"IMAGE","links":[83],"slot_index":0}],"properties":{"Node name for S&R":"HyVideoDecode"},"widgets_values":[true,64,192,false]},{"id":62,"type":"Note","pos":[-270.23193359375,-210.43328857421875],"size":[393.7334289550781,58],"flags":{},"order":1,"mode":0,"inputs":[],"outputs":[],"properties":{},"widgets_values":["https://huggingface.co/Kijai/SkyReels-V1-Hunyuan_comfy/blob/main/skyreels_hunyuan_i2v_bf16.safetensors"],"color":"#432","bgcolor":"#653"},{"id":63,"type":"Note","pos":[-241.6318817138672,-498.63323974609375],"size":[329.93341064453125,58],"flags":{},"order":2,"mode":0,"inputs":[],"outputs":[],"properties":{},"widgets_values":["https://huggingface.co/Kijai/HunyuanVideo_comfy/blob/main/hunyuan_video_vae_bf16.safetensors"],"color":"#432","bgcolor":"#653"},{"id":45,"type":"ImageResizeKJ","pos":[-562.4298095703125,494.5414733886719],"size":[315,266],"flags":{},"order":14,"mode":0,"inputs":[{"name":"image","type":"IMAGE","link":56},{"name":"get_image_size","type":"IMAGE","shape":7,"link":null},{"name":"width_input","type":"INT","shape":7,"widget":{"name":"width_input"},"link":null},{"name":"height_input","type":"INT","shape":7,"widget":{"name":"height_input"},"link":null}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[63,71,84],"slot_index":0},{"name":"width","type":"INT","links":[69],"slot_index":1},{"name":"height","type":"INT","links":[70],"slot_index":2}],"properties":{"Node name for S&R":"ImageResizeKJ"},"widgets_values":[960,544,"lanczos",false,2,0,0,"center"]},{"id":58,"type":"HyVideoCFG","pos":[-214.0895538330078,757.076416015625],"size":[400,200],"flags":{},"order":3,"mode":0,"inputs":[],"outputs":[{"name":"hyvid_cfg","type":"HYVID_CFG","links":[90],"slot_index":0}],"properties":{"Node name for S&R":"HyVideoCFG"},"widgets_values":["chaotic, distortion, morphing",6,0,0.5,false]},{"id":34,"type":"VHS_VideoCombine","pos":[1073.3206787109375,-299.8507080078125],"size":[214.7587890625,376],"flags":{},"order":21,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":73},{"name":"audio","type":"AUDIO","shape":7,"link":null},{"name":"meta_batch","type":"VHS_BatchManager","shape":7,"link":null},{"name":"vae","type":"VAE","shape":7,"link":null}],"outputs":[{"name":"Filenames","type":"VHS_FILENAMES","links":null}],"properties":{"Node name for 
S&R":"VHS_VideoCombine"},"widgets_values":{"frame_rate":24,"loop_count":0,"filename_prefix":"HunyuanVideo_skyreel_I2V","format":"video/h264-mp4","pix_fmt":"yuv420p","crf":19,"save_metadata":true,"trim_to_audio":false,"pingpong":false,"save_output":true,"videopreview":{"hidden":false,"paused":false,"params":{"filename":"HunyuanVideo_skyreel_I2V_00036.mp4","subfolder":"","type":"output","format":"video/h264-mp4","frame_rate":24,"workflow":"HunyuanVideo_skyreel_I2V_00036.png","fullpath":"N:\\AI\\ComfyUI\\output\\HunyuanVideo_skyreel_I2V_00036.mp4"},"muted":false}}},{"id":60,"type":"ColorMatch","pos":[736.9130859375,366.1897888183594],"size":[315,102],"flags":{},"order":19,"mode":0,"inputs":[{"name":"image_ref","type":"IMAGE","link":84},{"name":"image_target","type":"IMAGE","link":83}],"outputs":[{"name":"image","type":"IMAGE","links":[85],"slot_index":0}],"properties":{"Node name for S&R":"ColorMatch"},"widgets_values":["mkl",1]},{"id":43,"type":"HyVideoEncode","pos":[-204.26951599121094,476.7945861816406],"size":[315,198],"flags":{},"order":16,"mode":0,"inputs":[{"name":"vae","type":"VAE","link":54},{"name":"image","type":"IMAGE","link":63}],"outputs":[{"name":"samples","type":"LATENT","links":[75],"slot_index":0}],"properties":{"Node name for S&R":"HyVideoEncode"},"widgets_values":[false,64,256,true,0.04,1]},{"id":61,"type":"Note","pos":[-1025.4185791015625,-458.9891357421875],"size":[258.9027099609375,83.07069396972656],"flags":{},"order":4,"mode":0,"inputs":[],"outputs":[],"properties":{},"widgets_values":["If you have working Triton install, torch compile will reduce VRAM use and increase speed about 30%"],"color":"#432","bgcolor":"#653"},{"id":64,"type":"HyVideoEnhanceAVideo","pos":[242.49143981933594,-388.3094482421875],"size":[352.79998779296875,154],"flags":{},"order":5,"mode":0,"inputs":[],"outputs":[{"name":"feta_args","type":"FETAARGS","links":[91]}],"properties":{"Node name for S&R":"HyVideoEnhanceAVideo"},"widgets_values":[3,true,true,0,1]},{"id":70,"type":"Note","pos":[265.79437255859375,-518.3580322265625],"size":[299.13330078125,65.1333236694336],"flags":{},"order":6,"mode":0,"inputs":[],"outputs":[],"properties":{},"widgets_values":["Enhance a video weight should be relative to the video lenght, if you get noisy results it's too strong"],"color":"#432","bgcolor":"#653"},{"id":30,"type":"HyVideoTextEncode","pos":[231.84727478027344,692.9142456054688],"size":[437.3100280761719,240.78936767578125],"flags":{},"order":15,"mode":0,"inputs":[{"name":"text_encoders","type":"HYVIDTEXTENCODER","link":35},{"name":"custom_prompt_template","type":"PROMPT_TEMPLATE","shape":7,"link":null},{"name":"clip_l","type":"CLIP","shape":7,"link":null},{"name":"hyvid_cfg","type":"HYVID_CFG","shape":7,"link":90}],"outputs":[{"name":"hyvid_embeds","type":"HYVIDEMBEDS","links":[74],"slot_index":0}],"properties":{"Node name for S&R":"HyVideoTextEncode"},"widgets_values":["FPS-24, Man walking forward towards the camera looking intensely at the camera holding his hand out, surrounded by polar bears, he starts laughing manically","bad quality video","video"]},{"id":59,"type":"HyVideoBlockSwap","pos":[-643.1146240234375,-82.39257049560547],"size":[315,130],"flags":{},"order":7,"mode":0,"inputs":[],"outputs":[{"name":"block_swap_args","type":"BLOCKSWAPARGS","links":[98],"slot_index":0}],"properties":{"Node name for 
S&R":"HyVideoBlockSwap"},"widgets_values":[20,10,false,false]},{"id":69,"type":"Note","pos":[-630.0859375,-265.2142333984375],"size":[275.66656494140625,85.13335418701172],"flags":{},"order":8,"mode":0,"inputs":[],"outputs":[],"properties":{},"widgets_values":["block swap is a manual way to do cpu offloading, use this to trade speed with memory use"],"color":"#432","bgcolor":"#653"},{"id":71,"type":"Note","pos":[-585.5635375976562,95.25086975097656],"size":[275.66656494140625,85.13335418701172],"flags":{},"order":9,"mode":0,"inputs":[],"outputs":[],"properties":{},"widgets_values":["Use sageattention if you can, flash attention if you can't... last resort use sdpa"],"color":"#432","bgcolor":"#653"},{"id":44,"type":"LoadImage","pos":[-922.2027587890625,493.7345886230469],"size":[315,314.0000305175781],"flags":{},"order":10,"mode":0,"inputs":[],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[56],"slot_index":0},{"name":"MASK","type":"MASK","links":null}],"properties":{"Node name for S&R":"LoadImage"},"widgets_values":["7c9942663abcb1fa390742eb07e45a8d.jpg","image"]},{"id":7,"type":"HyVideoVAELoader","pos":[-265.9999694824219,-394.7333679199219],"size":[379.166748046875,82],"flags":{},"order":11,"mode":0,"inputs":[{"name":"compile_args","type":"COMPILEARGS","shape":7,"link":null}],"outputs":[{"name":"vae","type":"VAE","links":[6,54],"slot_index":0}],"properties":{"Node name for S&R":"HyVideoVAELoader"},"widgets_values":["hunyuan_video_vae_bf16.safetensors","bf16"]},{"id":16,"type":"DownloadAndLoadHyVideoTextEncoder","pos":[-309.5799865722656,220.413330078125],"size":[441,202],"flags":{},"order":12,"mode":0,"inputs":[],"outputs":[{"name":"hyvid_text_encoder","type":"HYVIDTEXTENCODER","links":[35]}],"properties":{"Node name for S&R":"DownloadAndLoadHyVideoTextEncoder"},"widgets_values":["Kijai/llava-llama-3-8b-text-encoder-tokenizer","openai/clip-vit-large-patch14","fp16",false,2,"disabled","offload_device"]},{"id":1,"type":"HyVideoModelLoader","pos":[-285,-102.5894546508789],"size":[426.1773986816406,242],"flags":{},"order":13,"mode":0,"inputs":[{"name":"compile_args","type":"COMPILEARGS","shape":7,"link":null},{"name":"block_swap_args","type":"BLOCKSWAPARGS","shape":7,"link":98},{"name":"lora","type":"HYVIDLORA","shape":7,"link":null}],"outputs":[{"name":"model","type":"HYVIDEOMODEL","links":[2],"slot_index":0}],"properties":{"Node name for S&R":"HyVideoModelLoader"},"widgets_values":["skyreels_hunyuan_i2v_bf16.safetensors","bf16","disabled","offload_device","flash_attn_varlen",true,true]},{"id":3,"type":"HyVideoSampler","pos":[266,-141],"size":[315,611.1666870117188],"flags":{},"order":17,"mode":0,"inputs":[{"name":"model","type":"HYVIDEOMODEL","link":2},{"name":"hyvid_embeds","type":"HYVIDEMBEDS","link":74},{"name":"samples","type":"LATENT","shape":7,"link":null},{"name":"image_cond_latents","type":"LATENT","shape":7,"link":75},{"name":"stg_args","type":"STGARGS","shape":7,"link":null},{"name":"context_options","type":"HYVIDCONTEXT","shape":7,"link":null},{"name":"feta_args","type":"FETAARGS","shape":7,"link":91},{"name":"width","type":"INT","widget":{"name":"width"},"link":69},{"name":"height","type":"INT","widget":{"name":"height"},"link":70},{"name":"teacache_args","type":"TEACACHEARGS","shape":7,"link":null}],"outputs":[{"name":"samples","type":"LATENT","links":[4],"slot_index":0}],"properties":{"Node name for 
S&R":"HyVideoSampler"},"widgets_values":[512,320,73,30,1,9,15,"fixed",1,1,"SDE-DPMSolverMultistepScheduler"]}],"links":[[2,1,0,3,0,"HYVIDEOMODEL"],[4,3,0,5,1,"LATENT"],[6,7,0,5,0,"VAE"],[35,16,0,30,0,"HYVIDTEXTENCODER"],[54,7,0,43,0,"VAE"],[56,44,0,45,0,"IMAGE"],[63,45,0,43,1,"IMAGE"],[69,45,1,3,7,"INT"],[70,45,2,3,8,"INT"],[71,45,0,52,0,"IMAGE"],[73,52,0,34,0,"IMAGE"],[74,30,0,3,1,"HYVIDEMBEDS"],[75,43,0,3,3,"LATENT"],[83,5,0,60,1,"IMAGE"],[84,45,0,60,0,"IMAGE"],[85,60,0,52,1,"IMAGE"],[90,58,0,30,3,"HYVID_CFG"],[91,64,0,3,6,"FETAARGS"],[98,59,0,1,1,"BLOCKSWAPARGS"]],"groups":[],"config":{},"extra":{"ds":{"scale":0.8954302432552419,"offset":[1719.9563406180907,584.5552082990192]},"node_versions":{"comfyui-hunyuanvideowrapper":"1.0.3","comfyui-kjnodes":"1.0.5","comfyui-videohelpersuite":"1.5.2","comfy-core":"0.3.14"},"VHS_latentpreview":false,"VHS_latentpreviewrate":0,"VHS_MetadataImage":true,"VHS_KeepIntermediate":true},"version":0.4}
Additional Context
(Please add any additional context or steps to reproduce the error here)