codetrans: add vLLM as default inference engine #710

Triggered via pull request March 19, 2025 12:31
@lianhao
ready_for_review #881
Status: Success
Total duration: 6m 59s
Artifacts

pr-chart-e2e.yaml

on: pull_request_target
Get-test-matrix (4s)
Matrix: Chart-test
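
The job graph above (a Get-test-matrix job feeding a Chart-test matrix) corresponds to a workflow shaped roughly like the sketch below. Only the trigger (pull_request_target) and the two job names come from this run; the step contents, output names, runners, and the example chart list are assumptions for illustration.

```yaml
# Hypothetical sketch of pr-chart-e2e.yaml based on the job graph shown in this run.
# Step bodies, output names, and runners are assumptions, not the actual workflow.
name: E2E test with helm charts

on:
  pull_request_target:
    types: [opened, reopened, ready_for_review, synchronize]

jobs:
  Get-test-matrix:
    runs-on: ubuntu-latest
    outputs:
      matrix: ${{ steps.get-matrix.outputs.matrix }}
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - name: Build test matrix from changed charts
        id: get-matrix
        run: |
          # Placeholder: derive the list of charts touched by the PR.
          echo 'matrix=["codetrans"]' >> "$GITHUB_OUTPUT"

  Chart-test:
    needs: Get-test-matrix
    runs-on: ubuntu-latest
    strategy:
      matrix:
        example: ${{ fromJSON(needs.Get-test-matrix.outputs.matrix) }}
    steps:
      - uses: actions/checkout@v4
      - name: Run chart e2e test
        run: |
          # Placeholder: install the chart for the matrix entry and run e2e checks.
          echo "Testing ${{ matrix.example }}"
```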