feat(ts): prompts ts sdk #6136

Closed · wants to merge 26 commits
4cf5883 refactor(prompts): Reorganize phoenix js client exports (cephalization, Jan 12, 2025)
9e7d013 feat(js): ts prompt sdk (mikeldking, Jan 21, 2025)
87ec13e cleanup (mikeldking, Jan 21, 2025)
381ab3b wip (mikeldking, Jan 21, 2025)
325b177 Lint (cephalization, Jan 22, 2025)
8130b2f Initial getPrompt implementation (cephalization, Jan 22, 2025)
9fabd76 Scaffold provider transformers (cephalization, Jan 23, 2025)
4f6889e Replace prompt variables (cephalization, Jan 23, 2025)
990ec7f Add prompt formatting test (cephalization, Jan 23, 2025)
871e4ed Log tool calls in example (cephalization, Jan 23, 2025)
404a6af Tweak prompt examples (cephalization, Jan 24, 2025)
d1062a1 Implement anthropic parsing / conversion (cephalization, Jan 24, 2025)
f613c52 Add stripped down prompt application example (cephalization, Jan 24, 2025)
7f95760 Remove debug code (cephalization, Jan 24, 2025)
3b2b271 Publish preview releases of phoenix-client (cephalization, Jan 25, 2025)
70b401f Update pkg pr new workflow (cephalization, Jan 25, 2025)
b73f782 Change working dir (cephalization, Jan 25, 2025)
1853fcf Update lockfile path (cephalization, Jan 25, 2025)
34b3a23 Recursively build (cephalization, Jan 25, 2025)
67b75d5 Bump generated types (cephalization, Jan 25, 2025)
e263a9b Update experimental package publish workflow (cephalization, Jan 27, 2025)
54ba18d Update tests and schemas (cephalization, Jan 27, 2025)
2a1ff6f Add anthropic tests (cephalization, Jan 27, 2025)
cb7c438 Documentation (cephalization, Jan 28, 2025)
17986b5 Regenerate schemas (cephalization, Jan 28, 2025)
1bb314f Fix config test (cephalization, Jan 28, 2025)
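The "Replace prompt variables" and "Add prompt formatting test" commits suggest mustache-style templating, which the examples below rely on (each prompt is assumed to have a `{{ question }}` variable). A minimal sketch of that idea, as an assumption rather than the SDK's actual implementation:

```typescript
// Hypothetical sketch of mustache-style variable replacement.
// `replaceVariables` is an illustrative name, not part of the phoenix-client API.
function replaceVariables(
  template: string,
  variables: Record<string, string>
): string {
  // Swap each {{ name }} placeholder for its value; leave unknown names intact
  return template.replace(/\{\{\s*(\w+)\s*\}\}/g, (match, name) =>
    name in variables ? variables[name] : match
  );
}

console.log(
  replaceVariables("What is {{ question }}?", {
    question: "the capital of France",
  })
);
// → What is the capital of France?
```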
55 changes: 55 additions & 0 deletions .github/workflows/typescript-packages-publish-experimental.yml
@@ -0,0 +1,55 @@
name: Publish Any Pull Request
# TODO: Only publish on pull requests by arize team members
on:
  pull_request:

jobs:
  # JOB to run change detection
  changes:
    runs-on: ubuntu-latest
    # Required permissions
    permissions:
      pull-requests: read
    # Set job outputs to values from filter step
    outputs:
      workflow_file: ${{ steps.filter.outputs.workflow_file }}
      phoenix-client: ${{ steps.filter.outputs.phoenix-client }}
    steps:
      # For pull requests it's not necessary to checkout the code
      - uses: dorny/paths-filter@v3
        id: filter
        with:
          filters: |
            phoenix-client:
              - 'js/packages/phoenix-client/**'
            workflow_file:
              - '.github/workflows/typescript-packages-publish-experimental.yml'

  publish-experimental-packages:
    needs: changes
    if: ${{ needs.changes.outputs.phoenix-client == 'true' || needs.changes.outputs.workflow_file == 'true' }}
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - run: corepack enable
        working-directory: ./js
      - uses: actions/setup-node@v4
        with:
          node-version: 20
          cache: "pnpm"
          cache-dependency-path: ./js

      - name: Install dependencies
        working-directory: ./js
        run: pnpm install

      - name: Build
        working-directory: ./js
        run: pnpm -r build

      - name: Publish
        working-directory: ./js
        run: pnpx pkg-pr-new publish ./packages/phoenix-client
4 changes: 2 additions & 2 deletions js/examples/apps/phoenix-experiment-runner/index.ts
@@ -1,10 +1,10 @@
 /* eslint-disable no-console */
+import { createClient } from "@arizeai/phoenix-client";
 import {
   asEvaluator,
-  createClient,
   runExperiment,
   type RunExperimentParams,
-} from "@arizeai/phoenix-client";
+} from "@arizeai/phoenix-client/experimental";
 import { intro, outro, select, spinner, log, confirm } from "@clack/prompts";
 import { Factuality, Humor } from "autoevals";
 import dotenv from "dotenv";
1 change: 1 addition & 0 deletions js/package.json
@@ -28,6 +28,7 @@
     "@typescript-eslint/eslint-plugin": "^6.21.0",
     "@typescript-eslint/parser": "^6.21.0",
     "eslint": "^8.57.1",
+    "pkg-pr-new": "^0.0.39",
     "prettier": "^3.4.1",
     "rimraf": "^5.0.10",
     "tsc-alias": "^1.8.10",
6 changes: 4 additions & 2 deletions js/packages/phoenix-client/README.md
@@ -18,13 +18,15 @@ The client will automatically read environment variables from your environment,
 The following environment variables are used:
 
 - `PHOENIX_HOST` - The base URL of the Phoenix API.
+- `PHOENIX_API_KEY` - The API key to use for authentication.
 - `PHOENIX_CLIENT_HEADERS` - Custom headers to add to all requests. A JSON stringified object.
 
 ```bash
-PHOENIX_HOST=http://localhost:6006 PHOENIX_CLIENT_HEADERS='{"Authorization": "bearer xxxxxx"}' pnpx tsx examples/list_datasets.ts
+PHOENIX_HOST='http://localhost:12345' PHOENIX_API_KEY='xxxxxx' PHOENIX_CLIENT_HEADERS='{"X-Custom-Header": "123"}' pnpx tsx examples/list_datasets.ts
 # emits the following request:
-# GET http://localhost:6006/v1/datasets
+# GET http://localhost:12345/v1/datasets
 # headers: {
 #   "X-Custom-Header": "123",
+#   "Authorization": "bearer xxxxxx",
 # }
 ```
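The README change implies that `PHOENIX_API_KEY` becomes a bearer `Authorization` header merged with any `PHOENIX_CLIENT_HEADERS`. A self-contained sketch of that merge, assuming this behavior from the documented request output (the helper name is hypothetical, not part of phoenix-client):

```typescript
// Hypothetical sketch: combine PHOENIX_API_KEY and PHOENIX_CLIENT_HEADERS
// into one header map, as the README's example request suggests.
function buildHeaders(
  apiKey?: string,
  headersJson?: string
): Record<string, string> {
  // PHOENIX_CLIENT_HEADERS is a JSON-stringified object of custom headers
  const custom: Record<string, string> = headersJson
    ? JSON.parse(headersJson)
    : {};
  // PHOENIX_API_KEY is assumed to map to a bearer Authorization header
  return apiKey ? { ...custom, Authorization: `bearer ${apiKey}` } : custom;
}

console.log(buildHeaders("xxxxxx", '{"X-Custom-Header": "123"}'));
// → { 'X-Custom-Header': '123', Authorization: 'bearer xxxxxx' }
```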
57 changes: 57 additions & 0 deletions js/packages/phoenix-client/examples/apply_prompt.ts
@@ -0,0 +1,57 @@
/* eslint-disable no-console */
import { createClient, getPrompt, toSDK } from "../src";
import OpenAI from "openai";

const PROMPT_NAME = process.env.PROMPT_NAME!;
const OPENAI_API_KEY = process.env.OPENAI_API_KEY!;

// get first argument from command line
const question = process.argv[2];

if (!question) {
  throw new Error(
    "Usage: pnpx tsx examples/apply_prompt.ts 'What is the capital of France?'\nAssumes that the prompt has a variable named 'question'\nAssumes that the prompt is openai with an openai model"
  );
}

if (!OPENAI_API_KEY) {
  throw new Error("OPENAI_API_KEY must be provided in the environment");
}

const client = createClient({
  options: {
    baseUrl: "http://localhost:6006",
  },
});

const openai = new OpenAI({
  apiKey: OPENAI_API_KEY,
});

const main = async () => {
  const prompt = await getPrompt({
    client,
    prompt: { name: PROMPT_NAME },
  });

  const openAIParams = toSDK({
    prompt,
    sdk: "openai",
    variables: {
      question,
    },
  });

  if (!openAIParams) {
    throw new Error("Prompt could not be converted to OpenAI params");
  }

  const response = await openai.chat.completions.create({
    ...openAIParams,
    stream: false,
  });

  console.log(response.choices[0]?.message.content);
};

main();
124 changes: 124 additions & 0 deletions js/packages/phoenix-client/examples/apply_prompt_anthropic.ts
@@ -0,0 +1,124 @@
/* eslint-disable no-console */
import { createClient, getPrompt, toSDK } from "../src";
import { Anthropic } from "@anthropic-ai/sdk";
import { PromptLike } from "../src/types/prompts";

const PROMPT_NAME = process.env.PROMPT_NAME!;
const PROMPT_TAG = process.env.PROMPT_TAG!;
const PROMPT_VERSION_ID = process.env.PROMPT_VERSION_ID!;
const ANTHROPIC_API_KEY = process.env.ANTHROPIC_API_KEY!;

// get first argument from command line
const question = process.argv[2];

if (!question) {
  throw new Error(
    "Usage: pnpx tsx examples/apply_prompt_anthropic.ts 'What is the capital of France?'\nAssumes that the prompt has a variable named 'question'"
  );
}

if (!ANTHROPIC_API_KEY) {
  throw new Error("ANTHROPIC_API_KEY must be provided in the environment");
}

const client = createClient({
  options: {
    baseUrl: "http://localhost:6006",
  },
});

const anthropic = new Anthropic({
  apiKey: ANTHROPIC_API_KEY,
});

const main = async () => {
  const promptArgument: PromptLike | null = PROMPT_VERSION_ID
    ? { versionId: PROMPT_VERSION_ID }
    : PROMPT_TAG && PROMPT_NAME
      ? { name: PROMPT_NAME, tag: PROMPT_TAG }
      : PROMPT_NAME
        ? { name: PROMPT_NAME }
        : null;
  if (!promptArgument) {
    throw new Error(
      `Either PROMPT_VERSION_ID, PROMPT_TAG and PROMPT_NAME, or PROMPT_NAME must be provided in the environment`
    );
  }
  console.log(`Getting prompt ${JSON.stringify(promptArgument)}`);

  // TODO: Apply variable replacement to the prompt
  const prompt = await getPrompt({
    client,
    prompt: promptArgument,
  });

  if (!prompt) {
    throw new Error("Prompt not found");
  }

  console.log(
    `Loaded prompt: ${prompt.id}\n${prompt.description ? `\n${prompt.description}` : ""}`
  );

  console.log(`Converting prompt to Anthropic params`);

  const anthropicParams = toSDK({
    prompt,
    sdk: "anthropic",
    variables: {
      question,
    },
  });

  if (!anthropicParams) {
    throw new Error("Prompt could not be converted to Anthropic params");
  }

  // @ts-expect-error Anthropic doesn't support these parameters
  delete anthropicParams.frequency_penalty;
  // @ts-expect-error Anthropic doesn't support these parameters
  delete anthropicParams.presence_penalty;

  console.log(`Applying prompt to Anthropic`);
  const response = await anthropic.messages.create({
    ...anthropicParams,
    // we may not have an anthropic model saved in the prompt
    model: "claude-3-5-sonnet-20240620",
    // TODO: should this be strongly typed inside of toSDK results if sdk: "anthropic"?
    stream: true,
  });

  console.log(`Streaming response from Anthropic:\n\n`);

  let responseText = "";
  let responseJson = "";
  for await (const chunk of response) {
    if (chunk.type === "message_delta") {
      console.clear();
      console.log("Input:\n");
      console.log(JSON.stringify(anthropicParams.messages, null, 2));
      console.log("\nOutput:\n");
      try {
        console.log(JSON.stringify(JSON.parse(responseText), null, 2));
        console.log(JSON.stringify(JSON.parse(responseJson), null, 2));
      } catch {
        console.log(responseText);
        console.log(responseJson);
      }
    } else if (chunk.type === "content_block_delta") {
      console.clear();
      if (chunk.delta.type === "text_delta") {
        responseText += String(chunk.delta.text);
      }
      if (chunk.delta.type === "input_json_delta") {
        responseJson += chunk.delta.partial_json;
      }
      console.log(responseText);
    }
  }

  console.log("\n\n");
  console.log(`Done!`);
};

main();
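The Anthropic example deletes `frequency_penalty` and `presence_penalty` by hand before calling `anthropic.messages.create`. A generic version of that cleanup step can be sketched as follows (the helper name is illustrative, not part of the phoenix-client API):

```typescript
// Hypothetical helper: drop parameters the target SDK does not support,
// generalizing the two `delete` calls in the example above.
function omitKeys<T extends Record<string, unknown>>(
  params: T,
  keys: string[]
): Record<string, unknown> {
  // Keep every entry whose key is not on the unsupported list
  return Object.fromEntries(
    Object.entries(params).filter(([key]) => !keys.includes(key))
  );
}

const cleaned = omitKeys(
  { temperature: 0.2, frequency_penalty: 0.5, presence_penalty: 0 },
  ["frequency_penalty", "presence_penalty"]
);
console.log(cleaned); // → { temperature: 0.2 }
```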
110 changes: 110 additions & 0 deletions js/packages/phoenix-client/examples/apply_prompt_openai.ts
@@ -0,0 +1,110 @@
/* eslint-disable no-console */
import { createClient, getPrompt, toSDK } from "../src";
import OpenAI from "openai";
import { PromptLike } from "../src/types/prompts";

const PROMPT_NAME = process.env.PROMPT_NAME!;
const PROMPT_TAG = process.env.PROMPT_TAG!;
const PROMPT_VERSION_ID = process.env.PROMPT_VERSION_ID!;
const OPENAI_API_KEY = process.env.OPENAI_API_KEY!;

// get first argument from command line
const question = process.argv[2];

if (!question) {
  throw new Error(
    "Usage: pnpx tsx examples/apply_prompt_openai.ts 'What is the capital of France?'\nAssumes that the prompt has a variable named 'question'"
  );
}

if (!OPENAI_API_KEY) {
  throw new Error("OPENAI_API_KEY must be provided in the environment");
}

const client = createClient({
  options: {
    baseUrl: "http://localhost:6006",
  },
});

const openai = new OpenAI({
  apiKey: OPENAI_API_KEY,
});

const main = async () => {
  const promptArgument: PromptLike | null = PROMPT_VERSION_ID
    ? { versionId: PROMPT_VERSION_ID }
    : PROMPT_TAG && PROMPT_NAME
      ? { name: PROMPT_NAME, tag: PROMPT_TAG }
      : PROMPT_NAME
        ? { name: PROMPT_NAME }
        : null;
  if (!promptArgument) {
    throw new Error(
      `Either PROMPT_VERSION_ID, PROMPT_TAG and PROMPT_NAME, or PROMPT_NAME must be provided in the environment`
    );
  }
  console.log(`Getting prompt ${JSON.stringify(promptArgument)}`);

  // TODO: Apply variable replacement to the prompt
  const prompt = await getPrompt({
    client,
    prompt: promptArgument,
  });

  if (!prompt) {
    throw new Error("Prompt not found");
  }

  console.log(
    `Loaded prompt: ${prompt.id}\n${prompt.description ? `\n${prompt.description}` : ""}`
  );

  console.log(`Converting prompt to OpenAI params`);

  const openAIParams = toSDK({
    prompt,
    sdk: "openai",
    variables: {
      question,
    },
  });

  if (!openAIParams) {
    throw new Error("Prompt could not be converted to OpenAI params");
  }

  console.log(`Applying prompt to OpenAI`);
  const response = await openai.chat.completions.create({
    ...openAIParams,
    // we may not have an openai model saved in the prompt
    model: "gpt-4o-mini",
    // TODO: should this be strongly typed inside of toSDK results if sdk: "openai"?
    stream: true,
  });

  console.log(`Streaming response from OpenAI:\n\n`);

  let responseText = "";
  for await (const chunk of response) {
    if (chunk.choices[0]?.delta?.content) {
      responseText += chunk.choices[0]?.delta?.content;
      console.clear();
      console.log("Input:\n");
      console.log(JSON.stringify(openAIParams.messages, null, 2));
      console.log("\nOutput:\n");
      try {
        console.log(JSON.stringify(JSON.parse(responseText), null, 2));
      } catch {
        console.log(responseText);
      }
    } else if (chunk.choices[0]?.delta?.tool_calls) {
      console.log(chunk.choices[0]?.delta?.tool_calls);
    }
  }

  console.log("\n\n");
  console.log(`Done!`);
};

main();
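Both streaming examples use the same display trick: the accumulated delta text is pretty-printed as JSON when it parses, and shown raw otherwise. Extracted as a standalone sketch (the function name is illustrative only):

```typescript
// Sketch of the pretty-print fallback used in the streaming loops above:
// format accumulated text as indented JSON when it parses, otherwise
// return it unchanged.
function prettyOrRaw(text: string): string {
  try {
    return JSON.stringify(JSON.parse(text), null, 2);
  } catch {
    return text;
  }
}

console.log(prettyOrRaw('{"city":"Paris"}')); // pretty-printed JSON
console.log(prettyOrRaw("partial stream te")); // raw text, unchanged
```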