This repository was archived by the owner on Jun 28, 2024. It is now read-only.

Error running ghcr.io/evshiron/rocm_lab:rocm5.5-text-gen-webui 7dea7110f293 #13

@briansp2020

Description

starlette.websockets.WebSocketDisconnect: 1001
INFO:Loading TheBloke_Llama-2-13B-chat-GGML...
INFO:llama.cpp weights detected: models/TheBloke_Llama-2-13B-chat-GGML/llama-2-13b-chat.ggmlv3.q6_K.bin

INFO:Cache capacity is 0 bytes
llama.cpp: loading model from models/TheBloke_Llama-2-13B-chat-GGML/llama-2-13b-chat.ggmlv3.q6_K.bin
error loading model: unrecognized tensor type 14

llama_init_from_file: failed to load model
Exception ignored in: <function LlamaCppModel.__del__ at 0x7fdeac07f910>
Traceback (most recent call last):
  File "/root/text-generation-webui/modules/llamacpp_model.py", line 23, in __del__
    self.model.__del__()
AttributeError: 'LlamaCppModel' object has no attribute 'model'
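
For context, tensor type 14 in the GGML format corresponds to the q6_K k-quant type, so this error most likely means the llama.cpp build inside the image predates k-quant support. The follow-up AttributeError is only a side effect: the destructor runs after loading has already failed, so self.model was never assigned. A minimal sketch of a defensive __del__ (hypothetical; the actual modules/llamacpp_model.py may differ) that avoids masking the original load error:

```python
class LlamaCppModel:
    # Hypothetical sketch, not the actual text-generation-webui implementation.
    def __del__(self):
        # Loading can fail before self.model is ever assigned; guard so the
        # real "failed to load model" error is not hidden by an AttributeError.
        if getattr(self, "model", None) is not None:
            del self.model
```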
