Backport: docs: update quickstart guides to use gateway (#10317)
This is an automated backport of #10292 to the release-v5.0 branch. FYI @nicoalbanese
Co-authored-by: Nico Albanese <[email protected]>
Co-authored-by: vercel[bot] <35613825+vercel[bot]@users.noreply.github.com>

---
description: Learn how to build your first agent with the AI SDK and Svelte.
---

# Svelte Quickstart

The AI SDK is a powerful TypeScript library designed to help developers build AI-powered applications.

In this quickstart tutorial, you'll build a simple agent with a streaming chat user interface. Along the way, you'll learn key concepts and techniques that are fundamental to using the SDK in your own projects.

If you are unfamiliar with the concepts of [Prompt Engineering](/docs/advanced/prompt-engineering) and [HTTP Streaming](/docs/advanced/why-streaming), you can optionally read these documents first.

To follow this quickstart, you'll need:

- Node.js 18+ and pnpm installed on your local development machine.
- A [Vercel AI Gateway](https://vercel.com/ai-gateway) API key.

If you haven't obtained your Vercel AI Gateway API key, you can do so by [signing up](https://vercel.com/d?to=%2F%5Bteam%5D%2F%7E%2Fai&title=Go+to+AI+Gateway) on the Vercel website.

## Set Up Your Application

### Install Dependencies

Install `ai` and `@ai-sdk/svelte`: the core AI package and the AI SDK's Svelte bindings. The AI SDK's [Vercel AI Gateway provider](/providers/ai-sdk-providers/ai-gateway) ships with the `ai` package. You'll also install `zod`, a schema validation library used for defining tool inputs.
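
Assuming you're using pnpm (as listed in the prerequisites), the install command looks something like this:

<Snippet text="pnpm add ai @ai-sdk/svelte zod" />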

<Note>
  The AI SDK is designed to be a unified interface to interact with any large
  language model. This means that you can change model and providers with just
  one line of code! Learn more about [available providers](/providers).
</Note>

Create a `.env.local` file in your project root and add your AI Gateway API key. This key authenticates your application with the Vercel AI Gateway.

<Snippet text="touch .env.local" />

Edit the `.env.local` file:

```env filename=".env.local"
AI_GATEWAY_API_KEY=xxxxxxxxx
```

Replace `xxxxxxxxx` with your actual Vercel AI Gateway API key.

<Note className="mb-4">
  The AI SDK's Vercel AI Gateway provider will default to using the
  `AI_GATEWAY_API_KEY` environment variable. Vite does not automatically load
  environment variables onto `process.env`, so you'll need to import
  `AI_GATEWAY_API_KEY` from `$env/static/private` in your code (see below).
</Note>

## Create an API route

Create a SvelteKit endpoint, `src/routes/api/chat/+server.ts`, and add the following code:
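
A minimal sketch of such an endpoint, based on the walkthrough below (your exact code may differ slightly):

```ts
// src/routes/api/chat/+server.ts -- a sketch, assuming the default gateway setup
import {
  createGateway,
  streamText,
  convertToModelMessages,
  type UIMessage,
} from 'ai';
import { AI_GATEWAY_API_KEY } from '$env/static/private';

// Create a gateway provider instance with the API key loaded by SvelteKit.
const gateway = createGateway({ apiKey: AI_GATEWAY_API_KEY });

export async function POST({ request }: { request: Request }) {
  // The client sends the full conversation history as UIMessage[].
  const { messages }: { messages: UIMessage[] } = await request.json();

  // Strip UI metadata and stream the model's response.
  const result = streamText({
    model: gateway('openai/gpt-5.1'),
    messages: convertToModelMessages(messages),
  });

  // Convert the result into a streamed response for the UI.
  return result.toUIMessageStreamResponse();
}
```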

<Note>
  If you see type errors with `AI_GATEWAY_API_KEY` or your `POST` function, run
  the dev server.
</Note>

Let's take a look at what is happening in this code:

1. Create a gateway provider instance with the `createGateway` function from the `ai` package.
2. Define a `POST` request handler and extract `messages` from the body of the request. The `messages` variable contains a history of the conversation between you and the chatbot and provides the chatbot with the necessary context to make the next generation. The `messages` are of `UIMessage` type, which are designed for use in application UI - they contain the entire message history and associated metadata like timestamps.
3. Call [`streamText`](/docs/reference/ai-sdk-core/stream-text), which is imported from the `ai` package. This function accepts a configuration object that contains a `model` provider (defined in step 1) and `messages` (defined in step 2). You can pass additional [settings](/docs/ai-sdk-core/settings) to further customise the model's behaviour. The `messages` key expects a `ModelMessage[]` array. This type is different from `UIMessage` in that it does not include metadata, such as timestamps or sender information. To convert between these types, we use the `convertToModelMessages` function, which strips the UI-specific metadata and transforms the `UIMessage[]` array into the `ModelMessage[]` format that the model expects.
4. The `streamText` function returns a [`StreamTextResult`](/docs/reference/ai-sdk-core/stream-text#result-object). This result object contains the [`toUIMessageStreamResponse`](/docs/reference/ai-sdk-core/stream-text#to-data-stream-response) function, which converts the result to a streamed response object.
5. Return the result to the client to stream the response.

## Choosing a Provider

The AI SDK supports dozens of model providers through [first-party](/providers/ai-sdk-providers), [OpenAI-compatible](/providers/openai-compatible-providers), and [community](/providers/community-providers) packages.

This quickstart uses the [Vercel AI Gateway](https://vercel.com/ai-gateway) provider, which is the default [global provider](/docs/ai-sdk-core/provider-management#global-provider-configuration). This means you can access models using a simple string in the model configuration:

```ts
model: 'openai/gpt-5.1';
```

You can also explicitly import and use the gateway provider in two other equivalent ways:

```ts
// Option 1: Import from 'ai' package (included by default)
import { gateway } from 'ai';
model: gateway('openai/gpt-5.1');

// Option 2: Install and import from '@ai-sdk/gateway' package
import { gateway } from '@ai-sdk/gateway';
model: gateway('openai/gpt-5.1');
```

### Using other providers

To use a different provider, install its package and create a provider instance. For example, to use OpenAI directly:
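
A rough sketch of this approach, assuming the `@ai-sdk/openai` package and an `OPENAI_API_KEY` environment variable (neither of which this guide sets up):

```ts
// Install with: pnpm add @ai-sdk/openai
import { createOpenAI } from '@ai-sdk/openai';
import { OPENAI_API_KEY } from '$env/static/private';

// Create an OpenAI provider instance...
const openai = createOpenAI({ apiKey: OPENAI_API_KEY });

// ...and reference it in your streamText call instead of the gateway model string:
// model: openai('gpt-5.1'),
```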

You can change the default global provider so string model references use your preferred provider everywhere in your application. Learn more about [provider management](/docs/ai-sdk-core/provider-management#global-provider-configuration).

Pick the approach that best matches how you want to manage providers across your application.

## Wire up the UI

Now that you have an API route that can query an LLM, it's time to set up your frontend. The AI SDK's [UI](/docs/ai-sdk-ui) package abstracts the complexity of a chat interface into one class, `Chat`.
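
A rough sketch of how that class is used from a Svelte component's `<script>` block (assuming the `Chat` API exposed by `@ai-sdk/svelte`; see the UI docs for the exact shape):

```ts
import { Chat } from '@ai-sdk/svelte';

// Create a Chat instance; by default it talks to the /api/chat endpoint created above.
const chat = new Chat({});

// chat.messages holds the reactive UIMessage[] history you render in your markup.
// chat.sendMessage({ text: 'What is the weather in London?' }) appends a user
// message and streams the assistant's reply into chat.messages.
```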

Let's enhance your chatbot by adding a simple weather tool.

Modify your `src/routes/api/chat/+server.ts` file to include the new weather tool:
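
The tool definition added there looks roughly like this sketch (using `zod` for the input schema, as mentioned in the install step; the guide's actual code may differ in detail):

```ts
import { tool } from 'ai';
import { z } from 'zod';

// A `weather` tool the model can call; pass it via the `tools` option of streamText.
export const weatherTool = tool({
  description: 'Get the weather in a location (in Fahrenheit)',
  inputSchema: z.object({
    location: z.string().describe('The location to get the weather for'),
  }),
  // A fake implementation that returns a random temperature.
  execute: async ({ location }) => {
    const temperature = Math.round(Math.random() * (90 - 32) + 32);
    return { location, temperature };
  },
});

// In +server.ts:
// streamText({ model: 'openai/gpt-5.1', messages: ..., tools: { weather: weatherTool } })
```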