
Commit 27c60c1

Merge branch 'main' into main

2 parents: ecbc4af + c3ee93c

688 files changed: +49,961 −14,190 lines


.circleci/config.yml (+206 −31)

(Large diff, not rendered by default.)

.circleci/requirements.txt (+1 −1)

@@ -1,5 +1,5 @@
 # used by CI/CD testing
-openai==1.54.0
+openai==1.66.1
 python-dotenv
 tiktoken
 importlib_metadata

.github/pull_request_template.md (+10 −6)

@@ -6,6 +6,16 @@

 <!-- e.g. "Fixes #000" -->

+## Pre-Submission checklist
+
+**Please complete all items before asking a LiteLLM maintainer to review your PR**
+
+- [ ] I have Added testing in the `tests/litellm/` directory, **Adding at least 1 test is a hard requirement** - [see details](https://docs.litellm.ai/docs/extras/contributing_code)
+- [ ] I have added a screenshot of my new test passing locally
+- [ ] My PR passes all unit tests on (`make test-unit`)[https://docs.litellm.ai/docs/extras/contributing_code]
+- [ ] My PR's scope is as isolated as possible, it only solves 1 specific problem
+
+
 ## Type

 <!-- Select the type of Pull Request -->
@@ -20,10 +30,4 @@

 ## Changes

-<!-- List of changes -->
-
-## [REQUIRED] Testing - Attach a screenshot of any new tests passing locally
-If UI changes, send a screenshot/GIF of working UI fixes
-
-

.github/workflows/ghcr_deploy.yml (+21 −6)

@@ -80,7 +80,6 @@ jobs:
     permissions:
       contents: read
       packages: write
-    #
     steps:
       - name: Checkout repository
         uses: actions/checkout@v4
@@ -112,7 +111,11 @@ jobs:
         with:
           context: .
           push: true
-          tags: ${{ steps.meta.outputs.tags }}-${{ github.event.inputs.tag || 'latest' }}, ${{ steps.meta.outputs.tags }}-${{ github.event.inputs.release_type }} # if a tag is provided, use that, otherwise use the release tag, and if neither is available, use 'latest'
+          tags: |
+            ${{ steps.meta.outputs.tags }}-${{ github.event.inputs.tag || 'latest' }},
+            ${{ steps.meta.outputs.tags }}-${{ github.event.inputs.release_type }}
+            ${{ github.event.inputs.release_type == 'stable' && format('{0}/berriai/litellm:main-{1}', env.REGISTRY, github.event.inputs.tag) || '' }},
+            ${{ github.event.inputs.release_type == 'stable' && format('{0}/berriai/litellm:main-stable', env.REGISTRY) || '' }}
           labels: ${{ steps.meta.outputs.labels }}
           platforms: local,linux/amd64,linux/arm64,linux/arm64/v8

@@ -151,8 +154,12 @@ jobs:
           context: .
           file: ./docker/Dockerfile.database
           push: true
-          tags: ${{ steps.meta-database.outputs.tags }}-${{ github.event.inputs.tag || 'latest' }}, ${{ steps.meta-database.outputs.tags }}-${{ github.event.inputs.release_type }}
-          labels: ${{ steps.meta-database.outputs.labels }}
+          tags: |
+            ${{ steps.meta-database.outputs.tags }}-${{ github.event.inputs.tag || 'latest' }},
+            ${{ steps.meta-database.outputs.tags }}-${{ github.event.inputs.release_type }}
+            ${{ github.event.inputs.release_type == 'stable' && format('{0}/berriai/litellm-database:main-{1}', env.REGISTRY, github.event.inputs.tag) || '' }},
+            ${{ github.event.inputs.release_type == 'stable' && format('{0}/berriai/litellm-database:main-stable', env.REGISTRY) || '' }}
+          labels: ${{ steps.meta-database.outputs.labels }}
           platforms: local,linux/amd64,linux/arm64,linux/arm64/v8

   build-and-push-image-non_root:
@@ -190,7 +197,11 @@ jobs:
           context: .
           file: ./docker/Dockerfile.non_root
           push: true
-          tags: ${{ steps.meta-non_root.outputs.tags }}-${{ github.event.inputs.tag || 'latest' }}, ${{ steps.meta-non_root.outputs.tags }}-${{ github.event.inputs.release_type }}
+          tags: |
+            ${{ steps.meta-non_root.outputs.tags }}-${{ github.event.inputs.tag || 'latest' }},
+            ${{ steps.meta-non_root.outputs.tags }}-${{ github.event.inputs.release_type }}
+            ${{ github.event.inputs.release_type == 'stable' && format('{0}/berriai/litellm-non_root:main-{1}', env.REGISTRY, github.event.inputs.tag) || '' }},
+            ${{ github.event.inputs.release_type == 'stable' && format('{0}/berriai/litellm-non_root:main-stable', env.REGISTRY) || '' }}
           labels: ${{ steps.meta-non_root.outputs.labels }}
           platforms: local,linux/amd64,linux/arm64,linux/arm64/v8

@@ -229,7 +240,11 @@ jobs:
           context: .
           file: ./litellm-js/spend-logs/Dockerfile
           push: true
-          tags: ${{ steps.meta-spend-logs.outputs.tags }}-${{ github.event.inputs.tag || 'latest' }}, ${{ steps.meta-spend-logs.outputs.tags }}-${{ github.event.inputs.release_type }}
+          tags: |
+            ${{ steps.meta-spend-logs.outputs.tags }}-${{ github.event.inputs.tag || 'latest' }},
+            ${{ steps.meta-spend-logs.outputs.tags }}-${{ github.event.inputs.release_type }}
+            ${{ github.event.inputs.release_type == 'stable' && format('{0}/berriai/litellm-spend_logs:main-{1}', env.REGISTRY, github.event.inputs.tag) || '' }},
+            ${{ github.event.inputs.release_type == 'stable' && format('{0}/berriai/litellm-spend_logs:main-stable', env.REGISTRY) || '' }}
           platforms: local,linux/amd64,linux/arm64,linux/arm64/v8

   build-and-push-helm-chart:
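The new `tags:` blocks rely on the GitHub Actions idiom `cond && value || ''`, which yields `value` when the condition is true and an empty string otherwise, so the two extra `main-*` tags only materialize for stable releases. A rough Python sketch of that selection logic (the `resolve_tags` helper and the simplified tag names are illustrative, not part of the workflow, which suffixes `steps.meta.outputs.tags` instead):

```python
def resolve_tags(release_type: str, tag: str, registry: str = "ghcr.io") -> list:
    """Mimic the workflow's conditional tag list for one image."""
    base = f"{registry}/berriai/litellm"
    tags = [
        # `tag || 'latest'` falls back to 'latest' when no tag input is given
        f"{base}:{tag or 'latest'}",
        f"{base}:{release_type}",
    ]
    if release_type == "stable":
        # These correspond to the `format(...)` expressions guarded by
        # `release_type == 'stable'`; for other release types the Actions
        # expression evaluates to '' and the tag is dropped.
        tags.append(f"{base}:main-{tag}")
        tags.append(f"{base}:main-stable")
    return [t for t in tags if t]

print(resolve_tags("stable", "v1.63.2"))
```

For a non-stable release only the first two tags survive, which matches the pre-change behavior.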

.github/workflows/helm_unit_test.yml (+27, new file)

@@ -0,0 +1,27 @@
+name: Helm unit test
+
+on:
+  pull_request:
+  push:
+    branches:
+      - main
+
+jobs:
+  unit-test:
+    runs-on: ubuntu-latest
+    steps:
+      - name: Checkout
+        uses: actions/checkout@v2
+
+      - name: Set up Helm 3.11.1
+        uses: azure/setup-helm@v1
+        with:
+          version: '3.11.1'
+
+      - name: Install Helm Unit Test Plugin
+        run: |
+          helm plugin install https://github.com/helm-unittest/helm-unittest --version v0.4.4
+
+      - name: Run unit tests
+        run:
+          helm unittest -f 'tests/*.yaml' deploy/charts/litellm-helm

.github/workflows/interpret_load_test.py (+36 −11)

@@ -52,6 +52,41 @@ def interpret_results(csv_file):
     return markdown_table


+def _get_docker_run_command_stable_release(release_version):
+    return f"""
+\n\n
+## Docker Run LiteLLM Proxy
+
+```
+docker run \\
+-e STORE_MODEL_IN_DB=True \\
+-p 4000:4000 \\
+ghcr.io/berriai/litellm:litellm_stable_release_branch-{release_version}
+```
+    """
+
+
+def _get_docker_run_command(release_version):
+    return f"""
+\n\n
+## Docker Run LiteLLM Proxy
+
+```
+docker run \\
+-e STORE_MODEL_IN_DB=True \\
+-p 4000:4000 \\
+ghcr.io/berriai/litellm:main-{release_version}
+```
+    """
+
+
+def get_docker_run_command(release_version):
+    if "stable" in release_version:
+        return _get_docker_run_command_stable_release(release_version)
+    else:
+        return _get_docker_run_command(release_version)
+
+
 if __name__ == "__main__":
     csv_file = "load_test_stats.csv"  # Change this to the path of your CSV file
     markdown_table = interpret_results(csv_file)
@@ -79,17 +114,7 @@ def interpret_results(csv_file):
     start_index = latest_release.body.find("Load Test LiteLLM Proxy Results")
     existing_release_body = latest_release.body[:start_index]

-    docker_run_command = f"""
-\n\n
-## Docker Run LiteLLM Proxy
-
-```
-docker run \\
--e STORE_MODEL_IN_DB=True \\
--p 4000:4000 \\
-ghcr.io/berriai/litellm:main-{release_version}
-```
-    """
+    docker_run_command = get_docker_run_command(release_version)
     print("docker run command: ", docker_run_command)

     new_release_body = (
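The core of the new `get_docker_run_command` dispatch is the substring check on the release version: anything containing "stable" points at the `litellm_stable_release_branch-*` image, everything else at `main-*`. A standalone sketch of just that branch selection (the `get_docker_image` helper is illustrative shorthand; the real functions return a full markdown block, not a bare image reference):

```python
def get_docker_image(release_version: str) -> str:
    """Pick the GHCR image reference the release notes will advertise."""
    if "stable" in release_version:
        # Stable releases are cut from the stable release branch
        return f"ghcr.io/berriai/litellm:litellm_stable_release_branch-{release_version}"
    # All other releases use the image built from main
    return f"ghcr.io/berriai/litellm:main-{release_version}"

print(get_docker_image("v1.63.2-stable"))
print(get_docker_image("v1.63.2"))
```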

.github/workflows/locustfile.py (+1 −1)

@@ -8,7 +8,7 @@ class MyUser(HttpUser):
     def chat_completion(self):
         headers = {
             "Content-Type": "application/json",
-            "Authorization": "Bearer sk-ZoHqrLIs2-5PzJrqBaviAA",
+            "Authorization": "Bearer sk-8N1tLOOyH8TIxwOLahhIVg",
             # Include any additional headers you may need for authentication, etc.
         }

.gitignore (+2)

@@ -77,3 +77,5 @@ litellm/proxy/_experimental/out/404.html
 litellm/proxy/_experimental/out/model_hub.html
 .mypy_cache/*
 litellm/proxy/application.log
+tests/llm_translation/vertex_test_account.json
+tests/llm_translation/test_vertex_key.json

.pre-commit-config.yaml (+1 −1)

@@ -22,7 +22,7 @@ repos:
     rev: 7.0.0  # The version of flake8 to use
     hooks:
       - id: flake8
-        exclude: ^litellm/tests/|^litellm/proxy/tests/
+        exclude: ^litellm/tests/|^litellm/proxy/tests/|^litellm/tests/litellm/|^tests/litellm/
         additional_dependencies: [flake8-print]
         files: litellm/.*\.py
     # - id: flake8
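pre-commit evaluates `exclude` as a Python regex against each file path, so the widened pattern can be sanity-checked directly. (Note that the `^litellm/tests/litellm/` alternative is already subsumed by the existing `^litellm/tests/`; only `^tests/litellm/` changes behavior.) A quick check:

```python
import re

# The new exclude pattern from .pre-commit-config.yaml
exclude = re.compile(
    r"^litellm/tests/|^litellm/proxy/tests/|^litellm/tests/litellm/|^tests/litellm/"
)

# Newly skipped: the unit-test directory introduced in this commit
assert exclude.search("tests/litellm/test_router.py")
# Already skipped before this change
assert exclude.search("litellm/tests/test_completion.py")
# Regular source files are still linted
assert not exclude.search("litellm/router.py")
print("exclude pattern behaves as expected")
```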

Makefile (+32, new file)

@@ -0,0 +1,32 @@
+# LiteLLM Makefile
+# Simple Makefile for running tests and basic development tasks
+
+.PHONY: help test test-unit test-integration lint format
+
+# Default target
+help:
+	@echo "Available commands:"
+	@echo "  make test               - Run all tests"
+	@echo "  make test-unit          - Run unit tests"
+	@echo "  make test-integration   - Run integration tests"
+	@echo "  make test-unit-helm     - Run helm unit tests"
+
+install-dev:
+	poetry install --with dev
+
+lint: install-dev
+	poetry run pip install types-requests types-setuptools types-redis types-PyYAML
+	cd litellm && poetry run mypy . --ignore-missing-imports
+
+# Testing
+test:
+	poetry run pytest tests/
+
+test-unit:
+	poetry run pytest tests/litellm/
+
+test-integration:
+	poetry run pytest tests/ -k "not litellm"
+
+test-unit-helm:
+	helm unittest -f 'tests/*.yaml' deploy/charts/litellm-helm

README.md (+6 −63)

@@ -40,7 +40,7 @@ LiteLLM manages:
 [**Jump to LiteLLM Proxy (LLM Gateway) Docs**](https://github.com/BerriAI/litellm?tab=readme-ov-file#openai-proxy---docs) <br>
 [**Jump to Supported LLM Providers**](https://github.com/BerriAI/litellm?tab=readme-ov-file#supported-providers-docs)

-🚨 **Stable Release:** Use docker images with the `-stable` tag. These have undergone 12 hour load tests, before being published.
+🚨 **Stable Release:** Use docker images with the `-stable` tag. These have undergone 12 hour load tests, before being published. [More information about the release cycle here](https://docs.litellm.ai/docs/proxy/release_cycle)

 Support for more providers. Missing a provider or LLM Platform, raise a [feature request](https://github.com/BerriAI/litellm/issues/new?assignees=&labels=enhancement&projects=&template=feature_request.yml&title=%5BFeature%5D%3A+).

@@ -64,7 +64,7 @@ import os

 ## set ENV variables
 os.environ["OPENAI_API_KEY"] = "your-openai-key"
-os.environ["ANTHROPIC_API_KEY"] = "your-cohere-key"
+os.environ["ANTHROPIC_API_KEY"] = "your-anthropic-key"

 messages = [{ "content": "Hello, how are you?","role": "user"}]

@@ -187,13 +187,13 @@ os.environ["LANGFUSE_PUBLIC_KEY"] = ""
 os.environ["LANGFUSE_SECRET_KEY"] = ""
 os.environ["ATHINA_API_KEY"] = "your-athina-api-key"

-os.environ["OPENAI_API_KEY"]
+os.environ["OPENAI_API_KEY"] = "your-openai-key"

 # set callbacks
 litellm.success_callback = ["lunary", "mlflow", "langfuse", "athina", "helicone"] # log input/output to lunary, langfuse, supabase, athina, helicone etc

 #openai call
-response = completion(model="anthropic/claude-3-sonnet-20240229", messages=[{"role": "user", "content": "Hi 👋 - i'm openai"}])
+response = completion(model="openai/gpt-4o", messages=[{"role": "user", "content": "Hi 👋 - i'm openai"}])
 ```

 # LiteLLM Proxy Server (LLM Gateway) - ([Docs](https://docs.litellm.ai/docs/simple_proxy))
@@ -340,64 +340,7 @@ curl 'http://0.0.0.0:4000/key/generate' \

 ## Contributing

-To contribute: Clone the repo locally -> Make a change -> Submit a PR with the change.
-
-Here's how to modify the repo locally:
-Step 1: Clone the repo
-
-```
-git clone https://github.com/BerriAI/litellm.git
-```
-
-Step 2: Navigate into the project, and install dependencies:
-
-```
-cd litellm
-poetry install -E extra_proxy -E proxy
-```
-
-Step 3: Test your change:
-
-```
-cd tests # pwd: Documents/litellm/litellm/tests
-poetry run flake8
-poetry run pytest .
-```
-
-Step 4: Submit a PR with your changes! 🚀
-
-- push your fork to your GitHub repo
-- submit a PR from there
-
-### Building LiteLLM Docker Image
-
-Follow these instructions if you want to build / run the LiteLLM Docker Image yourself.
-
-Step 1: Clone the repo
-
-```
-git clone https://github.com/BerriAI/litellm.git
-```
-
-Step 2: Build the Docker Image
-
-Build using Dockerfile.non_root
-```
-docker build -f docker/Dockerfile.non_root -t litellm_test_image .
-```
-
-Step 3: Run the Docker Image
-
-Make sure config.yaml is present in the root directory. This is your litellm proxy config file.
-```
-docker run \
--v $(pwd)/proxy_config.yaml:/app/config.yaml \
--e DATABASE_URL="postgresql://xxxxxxxx" \
--e LITELLM_MASTER_KEY="sk-1234" \
--p 4000:4000 \
-litellm_test_image \
---config /app/config.yaml --detailed_debug
-```
+Interested in contributing? Contributions to LiteLLM Python SDK, Proxy Server, and contributing LLM integrations are both accepted and highly encouraged! [See our Contribution Guide for more details](https://docs.litellm.ai/docs/extras/contributing_code)

 # Enterprise
 For companies that need better security, user management and professional support
@@ -467,4 +410,4 @@ If you have suggestions on how to improve the code quality feel free to open an
 ### Frontend
 1. Navigate to `ui/litellm-dashboard`
 2. Install dependencies `npm install`
-3. Run `npm run dev` to start the dashboard
+3. Run `npm run dev` to start the dashboard
