# This example uses OpenAI, but you can use any LLM provider of choice
export OPENAI_API_KEY=<your-openai-api-key>
# For LangSmith API keys linked to multiple workspaces, set the LANGSMITH_WORKSPACE_ID environment variable to specify which workspace to use.
export LANGSMITH_WORKSPACE_ID=<your-workspace-id>
```
</CodeGroup>

<Info>
If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:

`export LANGCHAIN_CALLBACKS_BACKGROUND=true`

If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:

`export LANGCHAIN_CALLBACKS_BACKGROUND=false`

See [this LangChain.js guide](https://js.langchain.com/docs/how_to/callbacks_serverless) for more information.
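To make the serverless recommendation concrete, here is a minimal sketch of a handler that flushes tracing work before returning. The handler name, event shape, and model name are illustrative assumptions; `awaitAllCallbacks` from `@langchain/core/callbacks/promises` is the helper described in the guide linked above.

```typescript
// Minimal sketch of a serverless handler (handler name, event shape, and model
// name are illustrative). With LANGCHAIN_CALLBACKS_BACKGROUND=false, callback
// work is awaited inline; awaitAllCallbacks() additionally flushes any pending
// tracing work before the function returns.
import { awaitAllCallbacks } from "@langchain/core/callbacks/promises";
import { ChatOpenAI } from "@langchain/openai";

export async function handler(event: { question: string }) {
  const model = new ChatOpenAI({ model: "gpt-4o-mini" });
  try {
    const response = await model.invoke(event.question);
    return { answer: response.content };
  } finally {
    // Ensure LangSmith traces are sent before the serverless runtime freezes.
    await awaitAllCallbacks();
  }
}
```

Combined with `LANGCHAIN_CALLBACKS_BACKGROUND=false`, this pattern keeps traces from being dropped when the runtime tears down the process immediately after the response is returned.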
# This example uses OpenAI, but you can use any LLM provider of choice
export OPENAI_API_KEY=<your-openai-api-key>
```

```bash TypeScript
export LANGSMITH_TRACING=true
export LANGSMITH_API_KEY=<your-api-key>
# This example uses OpenAI, but you can use any LLM provider of choice
export OPENAI_API_KEY=<your-openai-api-key>
```
</CodeGroup>

<Info>
If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:

`export LANGCHAIN_CALLBACKS_BACKGROUND=true`

If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:

`export LANGCHAIN_CALLBACKS_BACKGROUND=false`

See [this LangChain.js guide](https://js.langchain.com/docs/how_to/callbacks_serverless) for more information.
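For reference, no tracing-specific code is needed once these variables are exported in the shell that runs your application. A minimal sketch (the model name is illustrative) of a LangChain.js call whose run then appears as a trace in LangSmith:

```typescript
import { ChatOpenAI } from "@langchain/openai";

async function main() {
  // Model name is illustrative; any chat model integration is traced the same way.
  const model = new ChatOpenAI({ model: "gpt-4o-mini" });

  // With LANGSMITH_TRACING=true and LANGSMITH_API_KEY set in the environment,
  // this invocation is sent to LangSmith as a trace automatically.
  const response = await model.invoke("Hello, world!");
  console.log(response.content);
}

main();
```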