runtime error
Exit code: 1. Reason:
ion_pytorch_model.safetensors: 100%|█████████▉| 4.17G/4.17G [00:10<00:00, 395MB/s]
vae/config.json: 0%|          | 0.00/739 [00:00<?, ?B/s]
vae/config.json: 100%|██████████| 739/739 [00:00<00:00, 4.68MB/s]
diffusion_pytorch_model.safetensors: 0%|          | 0.00/168M [00:00<?, ?B/s]
diffusion_pytorch_model.safetensors: 100%|█████████▉| 168M/168M [00:00<00:00, 254MB/s]
Loading pipeline components...: 0%|          | 0/4 [00:00<?, ?it/s]
Loading pipeline components...: 100%|██████████| 4/4 [00:00<00:00, 7.12it/s]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 31, in <module>
    pipe = StableDiffusionPipeline.from_pretrained(repo, torch_dtype=torch.float16).to(device)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/diffusers/pipelines/pipeline_utils.py", line 971, in from_pretrained
    raise ValueError(
ValueError: Pipeline <class 'diffusers.pipelines.stable_diffusion.pipeline_stable_diffusion.StableDiffusionPipeline'> expected {'vae', 'feature_extractor', 'image_encoder', 'tokenizer', 'text_encoder', 'unet', 'scheduler', 'safety_checker'}, but only {'vae', 'scheduler', 'tokenizer', 'text_encoder'} were passed.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 33, in <module>
    raise RuntimeError(f"Failed to load the model. Ensure the token has access to the repo. Error: {e}")
RuntimeError: Failed to load the model. Ensure the token has access to the repo. Error: Pipeline <class 'diffusers.pipelines.stable_diffusion.pipeline_stable_diffusion.StableDiffusionPipeline'> expected {'vae', 'feature_extractor', 'image_encoder', 'tokenizer', 'text_encoder', 'unet', 'scheduler', 'safety_checker'}, but only {'vae', 'scheduler', 'tokenizer', 'text_encoder'} were passed.
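The root cause is the first ValueError, not token access: the Hub repository passed as repo only provides vae, scheduler, tokenizer and text_encoder, while StableDiffusionPipeline also requires at least a unet (safety_checker, feature_extractor and image_encoder are optional). Below is a minimal sketch of one possible workaround, assuming the repo id and base checkpoint name are placeholders for your own: load the missing UNet from a complete Stable Diffusion 1.x checkpoint and pass it to from_pretrained explicitly.

    import torch
    from diffusers import StableDiffusionPipeline, UNet2DConditionModel

    device = "cuda" if torch.cuda.is_available() else "cpu"
    repo = "your-username/your-partial-model"   # placeholder for the repo used in app.py
    base = "runwayml/stable-diffusion-v1-5"     # placeholder: any complete SD 1.x checkpoint

    # The partial repo ships no unet subfolder, so load one from the full base checkpoint.
    unet = UNet2DConditionModel.from_pretrained(
        base, subfolder="unet", torch_dtype=torch.float16
    )

    # safety_checker, feature_extractor and image_encoder are optional; with the unet
    # supplied, the remaining missing components default to None.
    pipe = StableDiffusionPipeline.from_pretrained(
        repo,
        unet=unet,
        safety_checker=None,
        feature_extractor=None,
        torch_dtype=torch.float16,
    ).to(device)

Alternatively, point repo at a repository that contains a complete pipeline (model_index.json plus unet/, vae/, text_encoder/, tokenizer/ and scheduler/ subfolders), in which case the original from_pretrained call works unchanged.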