
UPSTREAM PR #21985: Feat/modelscope support #1354

Open

loci-dev wants to merge 7 commits into main from loci/pr-21985-feat-modelscope-support

Conversation

@loci-dev

Note

Source pull request: ggml-org/llama.cpp#21985

Overview

This PR introduces ModelScope integration for model downloading and loading.
Related PR: ggml-org/llama.cpp#20941

Key changes:

  1. Added new CLI arguments: -ms (ModelScope repo ID), -msf (file name), and -mst (access token).
  2. Added support for corresponding environment variables for seamless configuration.
  3. Enabled community model downloading via MODEL_ENDPOINT (defaults to https://modelscope.cn/), allowing flexible usage with or without an explicit endpoint.

Usage examples:

# 1. Download via ModelScope ID (uses default or MODEL_ENDPOINT)
./build/bin/llama-cli -ms Qwen/Qwen3-0.6B-GGUF:Q8_0 -p "hello"
or
MODEL_ENDPOINT=https://modelscope.cn/ ./build/bin/llama-cli -ms Qwen/Qwen3-0.6B-GGUF:Q8_0 -p "hello"

# 2. Specify file and token explicitly (or set 'MS_TOKEN' env)
./build/bin/llama-cli -ms <repo> -msf <file> -mst <token> -p "hello"
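The endpoint behavior described above (fall back to https://modelscope.cn/ when MODEL_ENDPOINT is unset) can be sketched as a shell snippet. This is an illustrative assumption of the resolution order, not the PR's actual implementation:

```shell
# Resolve the download endpoint: use MODEL_ENDPOINT if set,
# otherwise fall back to the ModelScope default.
endpoint="${MODEL_ENDPOINT:-https://modelscope.cn/}"
echo "$endpoint"
```

With no MODEL_ENDPOINT in the environment this prints the default; exporting MODEL_ENDPOINT before invoking llama-cli overrides it, which is what makes the two invocations in example 1 equivalent.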

Additional information

Requirements

@loci-review

loci-review Bot commented Apr 16, 2026

No summary available at this time.

@loci-review

loci-review Bot commented Apr 17, 2026

No summary available at this time.

@loci-dev loci-dev force-pushed the main branch 3 times, most recently from 7638ab4 to f1b46d5 Compare April 20, 2026 02:19