
codegen: add vLLM as default inference engine #883

Merged: 1 commit merged into opea-project:main on Mar 25, 2025

Conversation

@lianhao (Collaborator) commented on Mar 20, 2025

Description

codegen: add vLLM as default inference engine
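For context, vLLM serves an OpenAI-compatible HTTP API, so making it the default inference engine keeps the client-side request path familiar. As a rough illustration (not code from this PR), the sketch below sends a code-completion request to such an endpoint; the base URL and model name are assumptions for illustration only:

```python
import requests

# Hypothetical values for illustration; the actual service address and
# served model are deployment-specific, not defined by this PR.
BASE_URL = "http://localhost:8000"  # assumed vLLM OpenAI-compatible server
MODEL = "Qwen/Qwen2.5-Coder-7B-Instruct"  # example code model (assumption)

resp = requests.post(
    f"{BASE_URL}/v1/completions",
    json={
        "model": MODEL,
        "prompt": "# Write a Python function that reverses a string\n",
        "max_tokens": 128,
        "temperature": 0.2,
    },
    timeout=60,
)
resp.raise_for_status()
# The OpenAI-style completions response carries the text in choices[0].
print(resp.json()["choices"][0]["text"])
```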

Issues

Fixes #869

Type of change


  • New feature (non-breaking change which adds new functionality)

Dependencies

None.

Tests

Verified via CI for the Xeon environment, and manually for the Gaudi environment.
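Since the Gaudi verification was manual, a minimal smoke test against a vLLM endpoint might look like the sketch below. The service address is an assumption; vLLM's OpenAI-compatible server does expose a liveness probe and a model listing:

```python
import requests

BASE_URL = "http://localhost:8000"  # assumed vLLM service address

# /health returns 200 when the server is live; /v1/models lists served models.
assert requests.get(f"{BASE_URL}/health", timeout=10).status_code == 200
models = requests.get(f"{BASE_URL}/v1/models", timeout=10).json()
print([m["id"] for m in models["data"]])
```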

@lianhao requested review from @poussa and @yongfengdu on Mar 20, 2025 at 01:09
@poussa merged commit 64df8b3 into opea-project:main on Mar 25, 2025
11 checks passed
Development

Successfully merging this pull request may close these issues.

[ci-auto] CodeGen: add vLLM backend
3 participants