Commit c49447b

remove redundant codegroup for langsmith quickstart (#1257)
Content is identical
1 parent ce6e28d commit c49447b

File tree

2 files changed: +17 -55 lines changed

src/langsmith/trace-with-langchain.mdx

Lines changed: 5 additions & 19 deletions
````diff
@@ -35,9 +35,7 @@ pnpm add @langchain/openai @langchain/core
 
 ### 1. Configure your environment
 
-<CodeGroup>
-
-```bash Python
+```bash wrap
 export LANGSMITH_TRACING=true
 export LANGSMITH_API_KEY=<your-api-key>
 # This example uses OpenAI, but you can use any LLM provider of choice
@@ -46,26 +44,14 @@ export OPENAI_API_KEY=<your-openai-api-key>
 export LANGSMITH_WORKSPACE_ID=<your-workspace-id>
 ```
 
-```bash TypeScript
-export LANGSMITH_TRACING=true
-export LANGSMITH_API_KEY=<your-api-key>
-# This example uses OpenAI, but you can use any LLM provider of choice
-export OPENAI_API_KEY=<your-openai-api-key>
-# For LangSmith API keys linked to multiple workspaces, set the LANGSMITH_WORKSPACE_ID environment variable to specify which workspace to use.
-export LANGSMITH_WORKSPACE_ID=<your-workspace-id>
-```
-
-</CodeGroup>
-
 <Info>
-If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:
-
-`export LANGCHAIN_CALLBACKS_BACKGROUND=true`
+If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:
 
-If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:
+`export LANGCHAIN_CALLBACKS_BACKGROUND=true`
 
-`export LANGCHAIN_CALLBACKS_BACKGROUND=false`
+If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:
 
+`export LANGCHAIN_CALLBACKS_BACKGROUND=false`
 </Info>
 
 ### 2. Log a trace
````
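The `<Info>` guidance retained above (background callbacks on a long-running server, foreground callbacks in serverless) can be sketched as a small shell snippet. `IS_SERVERLESS` is a hypothetical flag used only for illustration; it is not defined by the docs or by LangChain:

```shell
# Hypothetical convention: set IS_SERVERLESS=1 in serverless deploys.
if [ "${IS_SERVERLESS:-0}" = "1" ]; then
  # Serverless: let tracing callbacks finish before the function returns.
  export LANGCHAIN_CALLBACKS_BACKGROUND=false
else
  # Long-running server: background callbacks reduce request latency.
  export LANGCHAIN_CALLBACKS_BACKGROUND=true
fi
echo "$LANGCHAIN_CALLBACKS_BACKGROUND"
```

With `IS_SERVERLESS` unset the snippet takes the server branch and prints `true`.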

src/langsmith/trace-with-langgraph.mdx

Lines changed: 12 additions & 36 deletions
````diff
@@ -39,9 +39,7 @@ pnpm add @langchain/openai @langchain/langgraph
 
 ### 2. Configure your environment
 
-<CodeGroup>
-
-```bash Python
+```bash wrap
 export LANGSMITH_TRACING=true
 export LANGSMITH_API_KEY=<your-api-key>
 # This example uses OpenAI, but you can use any LLM provider of choice
@@ -50,27 +48,16 @@ export OPENAI_API_KEY=<your-openai-api-key>
 export LANGSMITH_WORKSPACE_ID=<your-workspace-id>
 ```
 
-```bash TypeScript
-export LANGSMITH_TRACING=true
-export LANGSMITH_API_KEY=<your-api-key>
-# This example uses OpenAI, but you can use any LLM provider of choice
-export OPENAI_API_KEY=<your-openai-api-key>
-# For LangSmith API keys linked to multiple workspaces, set the LANGSMITH_WORKSPACE_ID environment variable to specify which workspace to use.
-export LANGSMITH_WORKSPACE_ID=<your-workspace-id>
-```
-
-</CodeGroup>
-
 <Info>
-If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:
+If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:
 
-`export LANGCHAIN_CALLBACKS_BACKGROUND=true`
+`export LANGCHAIN_CALLBACKS_BACKGROUND=true`
 
-If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:
+If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:
 
-`export LANGCHAIN_CALLBACKS_BACKGROUND=false`
+`export LANGCHAIN_CALLBACKS_BACKGROUND=false`
 
-See [this LangChain.js guide](https://js.langchain.com/docs/how_to/callbacks_serverless) for more information.
+See [this LangChain.js guide](https://js.langchain.com/docs/how_to/callbacks_serverless) for more information.
 </Info>
 
 ### 3. Log a trace
@@ -243,34 +230,23 @@ pnpm add openai langsmith @langchain/langgraph
 
 ### 2. Configure your environment
 
-<CodeGroup>
-
-```bash Python
+```bash wrap
 export LANGSMITH_TRACING=true
 export LANGSMITH_API_KEY=<your-api-key>
 # This example uses OpenAI, but you can use any LLM provider of choice
 export OPENAI_API_KEY=<your-openai-api-key>
 ```
 
-```bash TypeScript
-export LANGSMITH_TRACING=true
-export LANGSMITH_API_KEY=<your-api-key>
-# This example uses OpenAI, but you can use any LLM provider of choice
-export OPENAI_API_KEY=<your-openai-api-key>
-```
-
-</CodeGroup>
-
 <Info>
-If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:
+If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:
 
-`export LANGCHAIN_CALLBACKS_BACKGROUND=true`
+`export LANGCHAIN_CALLBACKS_BACKGROUND=true`
 
-If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:
+If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:
 
-`export LANGCHAIN_CALLBACKS_BACKGROUND=false`
+`export LANGCHAIN_CALLBACKS_BACKGROUND=false`
 
-See [this LangChain.js guide](https://js.langchain.com/docs/how_to/callbacks_serverless) for more information.
+See [this LangChain.js guide](https://js.langchain.com/docs/how_to/callbacks_serverless) for more information.
 </Info>
 
 ### 3. Log a trace
````
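The merged quickstart blocks expect the same environment variables for both Python and TypeScript, which is why a single `bash` block suffices. A minimal sketch of a pre-flight check for those variables (a hypothetical helper, not part of the commit or the LangSmith SDK) might look like:

```python
# Hypothetical pre-flight check: verify the quickstart's environment
# variables before enabling tracing. Works on any string mapping, so it
# can be tested without touching os.environ.
REQUIRED = ("LANGSMITH_TRACING", "LANGSMITH_API_KEY", "OPENAI_API_KEY")

def tracing_ready(env):
    """True when LANGSMITH_TRACING is 'true' and all required keys are non-empty."""
    return env.get("LANGSMITH_TRACING", "").lower() == "true" and all(
        env.get(k) for k in REQUIRED
    )

example = {
    "LANGSMITH_TRACING": "true",
    "LANGSMITH_API_KEY": "<your-api-key>",
    "OPENAI_API_KEY": "<your-openai-api-key>",
}
print(tracing_ready(example))  # → True
print(tracing_ready({}))       # → False
```

In real use you would pass `os.environ`; `LANGSMITH_WORKSPACE_ID` is deliberately left out of `REQUIRED` since it is only needed for API keys linked to multiple workspaces.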
