
Does vLLM currently support multi-GPU distributed training? #1671

Open
TolearnMo opened this issue Feb 11, 2025 · 3 comments

Comments

@TolearnMo

Why is only GPU 0 used by default, even when I set device_map="auto"?

@dustinwloring1988

I think Unsloth only supports one GPU for training.

@kallewoof

Unsloth does a bunch of optimizations that are currently only supported on single GPUs.

@shimmyshimmer
Collaborator

Single GPU only at the moment. Hopefully we'll have an update coming soon for multi-GPU.
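
In the meantime, a common workaround (standard CUDA behavior, not an Unsloth feature) is to choose which single GPU the run uses by setting CUDA_VISIBLE_DEVICES before anything initializes CUDA, rather than relying on device_map="auto". A minimal sketch; the checkpoint name and settings below are illustrative:

```python
import os

# Pin this process to one GPU *before* importing torch/unsloth:
# CUDA_VISIBLE_DEVICES is read when CUDA is first initialized.
os.environ["CUDA_VISIBLE_DEVICES"] = "1"  # e.g. train on the second GPU

from unsloth import FastLanguageModel

# Illustrative model and settings; substitute your own.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/llama-3-8b-bnb-4bit",
    max_seq_length=2048,
    load_in_4bit=True,
)
```

Inside the process the selected card is renumbered as cuda:0, so Unsloth's single-device code path works unchanged.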
