Switch from huggingface to vLLM for offline inference #36

Open
LeoBaro opened this issue Mar 14, 2025 · 0 comments
LeoBaro commented Mar 14, 2025

vLLM provides a simple API for running all kinds of models locally, and LangChain provides a wrapper for it.
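A minimal sketch of what the switch could look like, using the `VLLM` wrapper from `langchain-community` in place of a Hugging Face `transformers` pipeline. The model id is a placeholder, and running this assumes `vllm` and `langchain-community` are installed on a CUDA-capable machine:

```python
# Sketch only: the model id below is a placeholder, and this requires
# `pip install vllm langchain-community` plus a GPU with enough VRAM.
from langchain_community.llms import VLLM

# vLLM loads the weights once and handles batched offline generation
# itself, replacing the transformers-based pipeline used so far.
llm = VLLM(
    model="mistralai/Mistral-7B-Instruct-v0.2",  # placeholder model id
    trust_remote_code=False,
    max_new_tokens=256,
    temperature=0.7,
)

print(llm.invoke("What is offline batched inference?"))
```

Because `VLLM` implements the standard LangChain LLM interface, existing chains should keep working with only the LLM object swapped out.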

@LeoBaro LeoBaro self-assigned this Mar 14, 2025