
mlc_llm/serve/engine.py", line 101, in _convert_model_info assert isinstance(chat_config.conv_template, Conversation) AssertionError #2008

Closed
omkar806 opened this issue Mar 23, 2024 · 2 comments
Labels
bug Confirmed bugs

Comments

@omkar806

[screenshot of the error]

I am getting this error when running:

python -m mlc_llm.serve.server --model dist/phi-2-q4f16_1-MLC --model-lib-path dist/prebuilt/phi-2/phi-2-q4f16_1-metal.so --device "metal" --host 127.0.0.1 --port 8000

I installed the libraries with:

python3 -m pip install --pre -U -f https://mlc.ai/wheels mlc-llm-nightly mlc-ai-nightly
@omkar806 omkar806 added the bug Confirmed bugs label Mar 23, 2024
@anibohara2000
Contributor

This is because the format of conv_template in mlc-chat-config.json has changed, introduced in PR #1965. You need to re-generate your mlc-chat-config.json using the python -m mlc_llm gen_config ... command.
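For reference, a re-generation command might look like the sketch below. The model source path, quantization mode, and conv-template name are assumptions inferred from the serve command in the original report; adjust them to match where your original (unquantized) phi-2 weights and config actually live.

```shell
# Hypothetical example: regenerate mlc-chat-config.json so that
# conv_template uses the new (post-PR #1965) structured format.
# ./dist/models/phi-2 is an assumed path to the original model directory.
python -m mlc_llm gen_config ./dist/models/phi-2 \
    --quantization q4f16_1 \
    --conv-template phi-2 \
    -o dist/phi-2-q4f16_1-MLC
```

After this, the serve command from the report should pick up the regenerated dist/phi-2-q4f16_1-MLC/mlc-chat-config.json.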

@omkar806
Author

Hey, I also wanted to deploy this test API on Hugging Face Spaces. I tried to do it using the Python client, but I got a device_type = none error. Since I was hosting it in a Docker container, I am not sure whether this is the right way to host MLC-compiled models. I have also raised a separate bug report about this very recently. Any advice on this would be helpful.

@tqchen tqchen closed this as completed May 11, 2024