
Commit 7117d73

Authored by vercel-ai-sdk[bot], nicoalbanese, and vercel[bot]

Backport: docs: update quickstart guides to use gateway (#10317)

This is an automated backport of #10292 to the release-v5.0 branch. FYI @nicoalbanese

Co-authored-by: Nico Albanese <[email protected]>
Co-authored-by: vercel[bot] <35613825+vercel[bot]@users.noreply.github.com>

1 parent f848eb1 · commit 7117d73

File tree: 4 files changed, +393 -173 lines changed

content/docs/02-getting-started/04-svelte.mdx

Lines changed: 106 additions & 48 deletions
````diff
@@ -1,13 +1,13 @@
 ---
 title: Svelte
-description: Welcome to the AI SDK quickstart guide for Svelte!
+description: Learn how to build your first agent with the AI SDK and Svelte.
 ---
 
 # Svelte Quickstart
 
 The AI SDK is a powerful Typescript library designed to help developers build AI-powered applications.
 
-In this quickstart tutorial, you'll build a simple AI-chatbot with a streaming user interface. Along the way, you'll learn key concepts and techniques that are fundamental to using the SDK in your own projects.
+In this quickstart tutorial, you'll build a simple agent with a streaming chat user interface. Along the way, you'll learn key concepts and techniques that are fundamental to using the SDK in your own projects.
 
 If you are unfamiliar with the concepts of [Prompt Engineering](/docs/advanced/prompt-engineering) and [HTTP Streaming](/docs/advanced/why-streaming), you can optionally read these documents first.
 
@@ -16,9 +16,9 @@ If you are unfamiliar with the concepts of [Prompt Engineering](/docs/advanced/p
 To follow this quickstart, you'll need:
 
 - Node.js 18+ and pnpm installed on your local development machine.
-- An OpenAI API key.
+- A [Vercel AI Gateway](https://vercel.com/ai-gateway) API key.
 
-If you haven't obtained your OpenAI API key, you can do so by [signing up](https://platform.openai.com/signup/) on the OpenAI website.
+If you haven't obtained your Vercel AI Gateway API key, you can do so by [signing up](https://vercel.com/d?to=%2F%5Bteam%5D%2F%7E%2Fai&title=Go+to+AI+Gateway) on the Vercel website.
 
 ## Set Up Your Application
 
@@ -32,74 +32,75 @@ Navigate to the newly created directory:
 
 ### Install Dependencies
 
-Install `ai` and `@ai-sdk/openai`, the AI SDK's OpenAI provider.
+Install `ai` and `@ai-sdk/svelte`, the AI package and AI SDK's Svelte bindings. The AI SDK's [Vercel AI Gateway provider](/providers/ai-sdk-providers/ai-gateway) ships with the `ai` package. You'll also install `zod`, a schema validation library used for defining tool inputs.
 
 <Note>
-  The AI SDK is designed to be a unified interface to interact with any large
-  language model. This means that you can change model and providers with just
-  one line of code! Learn more about [available providers](/providers) and
-  [building custom providers](/providers/community-providers/custom-providers)
-  in the [providers](/providers) section.
+  This guide uses the Vercel AI Gateway provider so you can access hundreds of
+  models from different providers with one API key, but you can switch to any
+  provider or model by installing its package. Check out available [AI SDK
+  providers](/providers/ai-sdk-providers) for more information.
 </Note>
 <div className="my-4">
   <Tabs items={['pnpm', 'npm', 'yarn', 'bun']}>
     <Tab>
-      <Snippet text="pnpm add -D ai @ai-sdk/openai @ai-sdk/svelte zod" dark />
+      <Snippet text="pnpm add -D ai@beta @ai-sdk/svelte@beta zod" dark />
    </Tab>
    <Tab>
-      <Snippet
-        text="npm install -D ai @ai-sdk/openai @ai-sdk/svelte zod"
-        dark
-      />
+      <Snippet text="npm install -D ai@beta @ai-sdk/svelte@beta zod" dark />
    </Tab>
    <Tab>
-      <Snippet text="yarn add -D ai @ai-sdk/openai @ai-sdk/svelte zod" dark />
+      <Snippet text="yarn add -D ai@beta @ai-sdk/svelte@beta zod" dark />
    </Tab>
    <Tab>
-      <Snippet text="bun add -d ai @ai-sdk/openai @ai-sdk/svelte zod" dark />
+      <Snippet text="bun add -d ai@beta @ai-sdk/svelte@beta zod" dark />
    </Tab>
  </Tabs>
 </div>
 
-### Configure OpenAI API Key
+### Configure your AI Gateway API key
 
-Create a `.env.local` file in your project root and add your OpenAI API Key. This key is used to authenticate your application with the OpenAI service.
+Create a `.env.local` file in your project root and add your AI Gateway API key. This key authenticates your application with the Vercel AI Gateway.
 
 <Snippet text="touch .env.local" />
 
 Edit the `.env.local` file:
 
 ```env filename=".env.local"
-OPENAI_API_KEY=xxxxxxxxx
+AI_GATEWAY_API_KEY=xxxxxxxxx
 ```
 
-Replace `xxxxxxxxx` with your actual OpenAI API key.
+Replace `xxxxxxxxx` with your actual Vercel AI Gateway API key.
 
 <Note className="mb-4">
-  Vite does not automatically load environment variables onto `process.env`, so
-  you'll need to import `OPENAI_API_KEY` from `$env/static/private` in your code
-  (see below).
+  The AI SDK's Vercel AI Gateway Provider will default to using the
+  `AI_GATEWAY_API_KEY` environment variable. Vite does not automatically load
+  environment variables onto `process.env`, so you'll need to import
+  `AI_GATEWAY_API_KEY` from `$env/static/private` in your code (see below).
 </Note>
 
 ## Create an API route
 
 Create a SvelteKit Endpoint, `src/routes/api/chat/+server.ts` and add the following code:
 
 ```tsx filename="src/routes/api/chat/+server.ts"
-import { createOpenAI } from '@ai-sdk/openai';
-import { streamText, type UIMessage, convertToModelMessages } from 'ai';
+import {
+  streamText,
+  type UIMessage,
+  convertToModelMessages,
+  createGateway,
+} from 'ai';
 
-import { OPENAI_API_KEY } from '$env/static/private';
+import { AI_GATEWAY_API_KEY } from '$env/static/private';
 
-const openai = createOpenAI({
-  apiKey: OPENAI_API_KEY,
+const gateway = createGateway({
+  apiKey: AI_GATEWAY_API_KEY,
 });
 
 export async function POST({ request }) {
   const { messages }: { messages: UIMessage[] } = await request.json();
 
   const result = streamText({
-    model: openai('gpt-4o'),
+    model: gateway('openai/gpt-5.1'),
     messages: convertToModelMessages(messages),
   });
 
````
````diff
@@ -108,18 +109,75 @@ export async function POST({ request }) {
 ```
 
 <Note>
-  If you see type errors with `OPENAI_API_KEY` or your `POST` function, run the
-  dev server.
+  If you see type errors with `AI_GATEWAY_API_KEY` or your `POST` function, run
+  the dev server.
 </Note>
 
 Let's take a look at what is happening in this code:
 
-1. Create an OpenAI provider instance with the `createOpenAI` function from the `@ai-sdk/openai` package.
+1. Create a gateway provider instance with the `createGateway` function from the `ai` package.
 2. Define a `POST` request handler and extract `messages` from the body of the request. The `messages` variable contains a history of the conversation between you and the chatbot and provides the chatbot with the necessary context to make the next generation. The `messages` are of UIMessage type, which are designed for use in application UI - they contain the entire message history and associated metadata like timestamps.
 3. Call [`streamText`](/docs/reference/ai-sdk-core/stream-text), which is imported from the `ai` package. This function accepts a configuration object that contains a `model` provider (defined in step 1) and `messages` (defined in step 2). You can pass additional [settings](/docs/ai-sdk-core/settings) to further customise the model's behaviour. The `messages` key expects a `ModelMessage[]` array. This type is different from `UIMessage` in that it does not include metadata, such as timestamps or sender information. To convert between these types, we use the `convertToModelMessages` function, which strips the UI-specific metadata and transforms the `UIMessage[]` array into the `ModelMessage[]` format that the model expects.
 4. The `streamText` function returns a [`StreamTextResult`](/docs/reference/ai-sdk-core/stream-text#result-object). This result object contains the [`toUIMessageStreamResponse`](/docs/reference/ai-sdk-core/stream-text#to-data-stream-response) function which converts the result to a streamed response object.
 5. Return the result to the client to stream the response.
 
````
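Step 3 of the walkthrough above hinges on `convertToModelMessages` stripping UI-only metadata before the history reaches the model. The sketch below illustrates that idea with hand-rolled types; it is a simplified stand-in, not the `ai` package's actual types or implementation, and the real function also handles tool calls, files, and other message part types.

```typescript
// Simplified stand-ins for the AI SDK's UIMessage / ModelMessage types,
// reduced to text parts only for illustration.
type UIMessage = {
  id: string; // UI-only metadata that the model never sees
  role: 'user' | 'assistant';
  parts: { type: 'text'; text: string }[];
};

type ModelMessage = { role: 'user' | 'assistant'; content: string };

// Conceptual version of convertToModelMessages: drop metadata, flatten parts.
function toModelMessages(messages: UIMessage[]): ModelMessage[] {
  return messages.map(({ role, parts }) => ({
    role,
    content: parts.map(part => part.text).join(''),
  }));
}
```

The point is only the shape of the transformation: ids, timestamps, and part structure stay on the UI side, while the model receives plain role/content pairs.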
````diff
+## Choosing a Provider
+
+The AI SDK supports dozens of model providers through [first-party](/providers/ai-sdk-providers), [OpenAI-compatible](/providers/openai-compatible-providers), and [community](/providers/community-providers) packages.
+
+This quickstart uses the [Vercel AI Gateway](https://vercel.com/ai-gateway) provider, which is the default [global provider](/docs/ai-sdk-core/provider-management#global-provider-configuration). This means you can access models using a simple string in the model configuration:
+
+```ts
+model: 'openai/gpt-5.1';
+```
+
+You can also explicitly import and use the gateway provider in two other equivalent ways:
+
+```ts
+// Option 1: Import from 'ai' package (included by default)
+import { gateway } from 'ai';
+model: gateway('openai/gpt-5.1');
+
+// Option 2: Install and import from '@ai-sdk/gateway' package
+import { gateway } from '@ai-sdk/gateway';
+model: gateway('openai/gpt-5.1');
+```
+
+### Using other providers
+
+To use a different provider, install its package and create a provider instance. For example, to use OpenAI directly:
+
+<div className="my-4">
+  <Tabs items={['pnpm', 'npm', 'yarn', 'bun']}>
+    <Tab>
+      <Snippet text="pnpm add @ai-sdk/openai@beta" dark />
+    </Tab>
+    <Tab>
+      <Snippet text="npm install @ai-sdk/openai@beta" dark />
+    </Tab>
+    <Tab>
+      <Snippet text="yarn add @ai-sdk/openai@beta" dark />
+    </Tab>
+    <Tab>
+      <Snippet text="bun add @ai-sdk/openai@beta" dark />
+    </Tab>
+  </Tabs>
+</div>
+
+```ts
+import { openai } from '@ai-sdk/openai';
+
+model: openai('gpt-5.1');
+```
+
+#### Updating the global provider
+
+You can change the default global provider so string model references use your preferred provider everywhere in your application. Learn more about [provider management](/docs/ai-sdk-core/provider-management#global-provider-configuration).
+
+Pick the approach that best matches how you want to manage providers across your application.
+
````
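The `'openai/gpt-5.1'` strings used throughout this new section follow a `creator/model-id` naming convention. As a rough sketch of the routing idea only (the helper below is ours, not part of the AI SDK or the gateway), resolving such a string means splitting on the first slash:

```typescript
// Hypothetical helper illustrating the 'creator/model-id' string convention.
type ModelRef = { provider: string; modelId: string };

function parseModelString(model: string): ModelRef {
  const slash = model.indexOf('/');
  if (slash === -1) {
    throw new Error(`expected "provider/model-id", got "${model}"`);
  }
  // Split on the first slash only, so model ids may themselves contain slashes.
  return { provider: model.slice(0, slash), modelId: model.slice(slash + 1) };
}
```

A global provider then maps the `provider` part to a concrete SDK provider instance; the provider-management docs linked above describe the real mechanism.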
````diff
 ## Wire up the UI
 
 Now that you have an API route that can query an LLM, it's time to set up your frontend. The AI SDK's [UI](/docs/ai-sdk-ui) package abstracts the complexity of a chat interface into one class, `Chat`.
````
````diff
@@ -195,8 +253,8 @@ Let's enhance your chatbot by adding a simple weather tool.
 Modify your `src/routes/api/chat/+server.ts` file to include the new weather tool:
 
 ```tsx filename="src/routes/api/chat/+server.ts" highlight="2,3,17-31"
-import { createOpenAI } from '@ai-sdk/openai';
 import {
+  createGateway,
   streamText,
   type UIMessage,
   convertToModelMessages,
@@ -205,17 +263,17 @@ import {
 } from 'ai';
 import { z } from 'zod';
 
-import { OPENAI_API_KEY } from '$env/static/private';
+import { AI_GATEWAY_API_KEY } from '$env/static/private';
 
-const openai = createOpenAI({
-  apiKey: OPENAI_API_KEY,
+const gateway = createGateway({
+  apiKey: AI_GATEWAY_API_KEY,
 });
 
 export async function POST({ request }) {
   const { messages }: { messages: UIMessage[] } = await request.json();
 
   const result = streamText({
-    model: openai('gpt-4o'),
+    model: gateway('openai/gpt-5.1'),
     messages: convertToModelMessages(messages),
     tools: {
       weather: tool({
````
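The hunk above cuts off at the opening of the `weather` tool. As a rough, dependency-free sketch of the shape a tool takes (a plain object with names of our choosing, not the `ai` package's `tool()` helper or its zod `inputSchema`), an execute function receives validated input and returns a JSON-serializable result:

```typescript
// Hypothetical stand-in for a weather tool; the quickstart's real version uses
// tool({ description, inputSchema, execute }) from the 'ai' package with zod.
type WeatherInput = { location: string };

const weatherTool = {
  description: 'Get the weather in a location (fahrenheit)',
  // Deterministic result so the sketch is easy to verify; a real tool would
  // call a weather API or return a simulated value.
  execute: async ({ location }: WeatherInput) => ({
    location,
    temperature: 72,
  }),
};
```

The model decides when to call the tool; the SDK validates the model's arguments against the input schema before `execute` runs.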
````diff
@@ -316,8 +374,8 @@ To solve this, you can enable multi-step tool calls using `stopWhen`. By default
 Modify your `src/routes/api/chat/+server.ts` file to include the `stopWhen` condition:
 
 ```ts filename="src/routes/api/chat/+server.ts" highlight="15"
-import { createOpenAI } from '@ai-sdk/openai';
 import {
+  createGateway,
   streamText,
   type UIMessage,
   convertToModelMessages,
@@ -326,17 +384,17 @@ import {
 } from 'ai';
 import { z } from 'zod';
 
-import { OPENAI_API_KEY } from '$env/static/private';
+import { AI_GATEWAY_API_KEY } from '$env/static/private';
 
-const openai = createOpenAI({
-  apiKey: OPENAI_API_KEY,
+const gateway = createGateway({
+  apiKey: AI_GATEWAY_API_KEY,
 });
 
 export async function POST({ request }) {
   const { messages }: { messages: UIMessage[] } = await request.json();
 
   const result = streamText({
-    model: openai('gpt-4o'),
+    model: gateway('openai/gpt-5.1'),
     messages: convertToModelMessages(messages),
     stopWhen: stepCountIs(5),
     tools: {
````
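The hunk headers above note that `stopWhen: stepCountIs(5)` lets the model take up to 5 steps. A minimal sketch of that control flow, as our own reimplementation of the idea rather than the `ai` package's actual `stepCountIs` helper:

```typescript
// A step here is one model generation (possibly including tool calls).
type Step = { toolCalls: number };

// stepCountIs(n): a predicate that reports true once n steps have completed.
const stepCountIs =
  (n: number) =>
  ({ steps }: { steps: Step[] }) =>
    steps.length >= n;

// The multi-step loop keeps generating while the stop condition is false.
function runSteps(stopWhen: (state: { steps: Step[] }) => boolean): number {
  const steps: Step[] = [];
  while (!stopWhen({ steps })) {
    steps.push({ toolCalls: 1 }); // pretend each iteration is one model step
  }
  return steps.length;
}
```

With this shape, generation continues after a tool result until the predicate reports the step budget is spent (or, in the real SDK, until the model stops on its own).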
````diff
@@ -369,8 +427,8 @@ By setting `stopWhen: stepCountIs(5)`, you're allowing the model to use up to 5
 Update your `src/routes/api/chat/+server.ts` file to add a new tool to convert the temperature from Fahrenheit to Celsius:
 
 ```tsx filename="src/routes/api/chat/+server.ts" highlight="32-45"
-import { createOpenAI } from '@ai-sdk/openai';
 import {
+  createGateway,
   streamText,
   type UIMessage,
   convertToModelMessages,
@@ -379,17 +437,17 @@ import {
 } from 'ai';
 import { z } from 'zod';
 
-import { OPENAI_API_KEY } from '$env/static/private';
+import { AI_GATEWAY_API_KEY } from '$env/static/private';
 
-const openai = createOpenAI({
-  apiKey: OPENAI_API_KEY,
+const gateway = createGateway({
+  apiKey: AI_GATEWAY_API_KEY,
 });
 
 export async function POST({ request }) {
   const { messages }: { messages: UIMessage[] } = await request.json();
 
   const result = streamText({
-    model: openai('gpt-4o'),
+    model: gateway('openai/gpt-5.1'),
     messages: convertToModelMessages(messages),
     stopWhen: stepCountIs(5),
     tools: {
````
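The final hunk trails off before the new tool's body, but the conversion it performs is plain arithmetic. A standalone sketch, where the helper name and the one-decimal rounding are our choices for illustration:

```typescript
// Fahrenheit to Celsius: subtract 32, scale by 5/9, round to one decimal.
function fahrenheitToCelsius(fahrenheit: number): number {
  const celsius = ((fahrenheit - 32) * 5) / 9;
  return Math.round(celsius * 10) / 10;
}
```

The model can then chain the two tools in one turn: fetch the weather in Fahrenheit, feed that result into the conversion tool, and answer in Celsius, which is exactly the multi-step behaviour `stopWhen: stepCountIs(5)` enables.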
