add batch api example to readme #331

Open · wants to merge 1 commit into base: `main`
27 changes: 27 additions & 0 deletions README.md
@@ -386,6 +386,33 @@ for model in models:
print(model)
```

### Batch Inference

The Batch API lets you submit large inference jobs that are completed within a 24-hour turnaround window. Below is an example; to learn more, refer to the [docs here](https://docs.together.ai/docs/batch-inference).
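The file you upload is a JSON Lines file with one request per line. A minimal sketch of building such a file, assuming the OpenAI-compatible batch request schema (`custom_id`, `body`) and a hypothetical model name — check the docs above for the exact fields:

```python
import json

# Hypothetical helper: write one JSONL line per request.
# The field names (custom_id, body) follow the OpenAI-compatible
# batch format and are an assumption -- verify against the docs.
def write_batch_file(path, prompts, model="meta-llama/Llama-3.3-70B-Instruct-Turbo"):
    with open(path, "w") as f:
        for i, prompt in enumerate(prompts):
            request = {
                "custom_id": f"request-{i}",
                "body": {
                    "model": model,
                    "messages": [{"role": "user", "content": prompt}],
                },
            }
            f.write(json.dumps(request) + "\n")
    return path

write_batch_file("simpleqa_batch_student.jsonl", ["What is the capital of France?"])
```

The `custom_id` lets you match each response in the output file back to its original request, since batch results are not guaranteed to arrive in submission order.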

```python
from together import Together

client = Together()

# Upload the batch file
batch_file = client.files.upload(file="simpleqa_batch_student.jsonl", purpose="batch-api")

# Create the batch job
batch = client.batches.create_batch(file_id=batch_file.id, endpoint="/v1/chat/completions")

# Monitor the batch status
batch_stat = client.batches.get_batch(batch.id)

# List all batches - contains other batches as well
client.batches.list_batches()

# Download the file content if job completed
if batch_stat.status == 'COMPLETED':
output_response = client.files.retrieve_content(id=batch_stat.output_file_id,
output="simpleqa_v3_output.jsonl")
```
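The downloaded output is also a JSONL file with one result per line. A minimal sketch of mapping results back to their requests, assuming each line carries a `custom_id` (the exact response schema is an assumption; see the docs above). The stand-in file written here only simulates what `retrieve_content` would download:

```python
import json

def read_batch_output(path):
    """Map each result line back to its custom_id (assumed field name)."""
    results = {}
    with open(path) as f:
        for line in f:
            record = json.loads(line)
            results[record["custom_id"]] = record
    return results

# Stand-in output file; a real one comes from client.files.retrieve_content.
with open("example_output.jsonl", "w") as f:
    f.write(json.dumps({"custom_id": "request-0", "response": {"status_code": 200}}) + "\n")

results = read_batch_output("example_output.jsonl")
```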

## Usage – CLI

### Chat Completions