
Error when running with vllm serve #26

@alfiy

Description


The error message is as follows:
ValueError: Loading moonshotai/Moonlight-16B-A3B-Instruct requires you to execute the configuration file in that repo on your local machine. Make sure you have read the code there to avoid malicious use, then set the option trust_remote_code=True to remove this error.

Continuing with the command vllm serve "moonshotai/Moonlight-16B-A3B-Instruct" trust_remote_code=True produces the following error:
INFO 04-14 13:59:19 [__init__.py:239] Automatically detected platform cuda.
usage: vllm [-h] [-v] {chat,complete,serve,bench} ...
vllm: error: unrecognized arguments: trust_remote_code=True
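
The second error is a CLI usage problem: vLLM does not accept trust_remote_code=True as a positional key=value argument, so the parser rejects it. A likely fix, assuming a recent vLLM release where trust-remote-code is exposed as an engine flag, is to pass it as a flag:

vllm serve moonshotai/Moonlight-16B-A3B-Instruct --trust-remote-code

This should also resolve the first ValueError, since that error is raised precisely because the Moonlight repo ships custom configuration code that vLLM will only execute once trust_remote_code is enabled.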
