44 changes: 41 additions & 3 deletions .gitignore
@@ -1,4 +1,42 @@
node_modules/
# Dependencies
node_modules

# Build outputs
dist
build
*.tsbuildinfo
.turbo
dist/
.env
.turbo-tsconfig.json

# Environment files
.env
.env.local
.env.production
.env.bak

# IDE
.idea
.vscode
.zed
.DS_Store

# Test coverage
coverage
.nyc_output

# Logs
*.log
logs

# Cache
cache
.cache
tokencache

# Temporary files
*.tmp
*.temp
.tmp

# Bundler artifacts
tsup.config.bundled_*.mjs
129 changes: 45 additions & 84 deletions build.ts
@@ -1,101 +1,62 @@
#!/usr/bin/env bun

const externalDeps = [
"@elizaos/core",
"@ai-sdk/openai",
"@openrouter/ai-sdk-provider",
"ai",
"undici",
"dotenv",
];
import { $ } from "bun";

async function build() {
const totalStart = Date.now();
const pkg = await Bun.file("package.json").json();
const externalDeps = [
...Object.keys(pkg.dependencies ?? {}),
...Object.keys(pkg.peerDependencies ?? {}),
];

// Use the clean script from package.json
if (pkg.scripts?.clean) {
console.log("🧹 Cleaning...");
await $`bun run clean`.quiet();
}
Comment on lines +12 to +16

⚠️ Potential issue | 🔴 Critical

Don't run the repo-wide clean script as part of every build.

package.json Line 50 deletes node_modules. Running that here removes the local tsc binary and installed typings before the declaration step, and the catch block below turns the resulting failure into a green build. Use a build-only cleanup for artifacts, and let declaration generation fail the build.

Suggested fix
-  // Use the clean script from package.json
-  if (pkg.scripts?.clean) {
-    console.log("🧹 Cleaning...");
-    await $`bun run clean`.quiet();
-  }
+  console.log("🧹 Cleaning build artifacts...");
+  await $`rm -rf dist .turbo-tsconfig.json tsconfig.tsbuildinfo`.quiet();
@@
-    } catch (error) {
-      console.warn(`⚠️  TypeScript declaration generation had errors (${((Date.now() - dtsStart) / 1000).toFixed(2)}s)`);
-      console.warn("   Build will continue - fix type errors when possible");
+    } catch (error) {
+      console.error(`❌ TypeScript declaration generation failed (${((Date.now() - dtsStart) / 1000).toFixed(2)}s)`);
+      throw error;
     }

Also applies to: 36-44

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@build.ts` around lines 12 - 16, The build script currently runs the repo-wide
clean via the pkg.scripts?.clean check and await $`bun run clean`.quiet(), which
deletes node_modules and hides failures; replace this with a build-scoped
cleanup (e.g., invoke a dedicated "clean:build" or "clean:dist" npm script) or
remove the clean invocation entirely so node_modules/tsc remain intact, and stop
suppressing errors so declaration generation failures surface (remove .quiet()
or the surrounding catch that swallows errors). Update both occurrences (the one
using pkg.scripts?.clean / await $`bun run clean`.quiet() and the similar block
at lines 36-44) to call the build-only clean script or no clean and allow errors
to propagate.
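The suggested replacement above shells out to `rm -rf`; an equivalent in-script version (artifact paths assumed from this repo's .gitignore, not confirmed by the PR) keeps the cleanup build-scoped so node_modules and the local tsc binary survive:

```typescript
import { rm } from "node:fs/promises";

// Hypothetical build-scoped clean: removes only listed build artifacts,
// never node_modules, so the declaration step still has tsc and typings.
const BUILD_ARTIFACTS = ["dist", ".turbo-tsconfig.json", "tsconfig.tsbuildinfo"];

async function cleanBuildArtifacts(paths: string[] = BUILD_ARTIFACTS): Promise<void> {
  // force: true makes rm a no-op for paths that don't exist
  await Promise.all(paths.map((p) => rm(p, { recursive: true, force: true })));
}
```

Because `force: true` tolerates missing paths, this is safe to run on a fresh checkout as well as after a previous build.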


// Node build
const nodeStart = Date.now();
console.log("🔨 Building @elizaos/plugin-openrouter for Node...");
await Bun.build({
entrypoints: ["src/index.node.ts"],
outdir: "dist/node",
const esmStart = Date.now();
console.log("🔨 Building @elizaos/plugin-openrouter...");
const esmResult = await Bun.build({
entrypoints: ["src/index.ts"],
outdir: "dist",
target: "node",
format: "esm",
sourcemap: "external",
minify: false,
external: [...externalDeps],
});
console.log(`✅ Node build complete in ${((Date.now() - nodeStart) / 1000).toFixed(2)}s`);

Build output paths don't match package.json exports

High Severity

The rewritten build.ts outputs a single ESM bundle to dist/index.js (from entrypoint src/index.ts, outdir dist), but package.json still references the old multi-target paths: dist/node/index.node.js, dist/cjs/index.node.cjs, dist/browser/index.browser.js, and dist/node/index.d.ts. None of these files will exist after a build, so the package will fail to resolve for any consumer — Node imports, CJS requires, browser imports, and TypeScript type resolution will all break.


// Browser build
const browserStart = Date.now();
console.log("🌐 Building @elizaos/plugin-openrouter for Browser...");
await Bun.build({
entrypoints: ["src/index.browser.ts"],
outdir: "dist/browser",
target: "browser",
format: "esm",
sourcemap: "external",
minify: false,
external: externalDeps,
});
Comment on lines +18 to 28
Copilot AI Mar 9, 2026


The build now only outputs a single ESM bundle to dist/ (entrypoint src/index.ts), but the package metadata expects dist/node/index.node.js, dist/browser/index.browser.js, and dist/cjs/index.node.cjs (see package.json exports/main/module/types). As-is, consumers will get missing-file runtime errors. Either restore the multi-target outputs/rename steps, or update package.json entrypoints/exports/types to match the new output layout.

Comment on lines +18 to 28

⚠️ Potential issue | 🔴 Critical

Update the published entrypoints to match the new build layout.

This build now emits a single ESM dist tree, but package.json Lines 5-20 still publish dist/node/*, dist/cjs/*, and dist/browser/*. In this state, consumers will resolve files that are never created after the build change.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@build.ts` around lines 18 - 28, The package's published entrypoints still
reference non-existent directories (dist/node/*, dist/cjs/*, dist/browser/*)
while the build (see esmStart and the Bun.build call which outputs to outdir
"dist" with format "esm") now emits a single ESM dist tree; update package.json
exports/exportsField and main/module/browser fields to point to the new single
dist output (e.g., entrypoint(s) under "dist/") and remove or replace any
references to dist/node, dist/cjs, and dist/browser so consumers resolve the
actual files produced by Bun.build.
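For illustration, a package.json sketch matching the single-bundle layout this build now emits (filenames assumed from the Bun.build config above: outdir `dist`, entrypoint `src/index.ts`, plus the `.d.ts` tree from `tsc --project tsconfig.build.json` — verify against the actual tsconfig outDir):

```json
{
  "type": "module",
  "main": "dist/index.js",
  "module": "dist/index.js",
  "types": "dist/index.d.ts",
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/index.js"
    }
  }
}
```

Dropping the `dist/node/*`, `dist/cjs/*`, and `dist/browser/*` entries means CJS consumers lose `require()` support, so this is only correct if the package is intentionally ESM-only.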

console.log(`✅ Browser build complete in ${((Date.now() - browserStart) / 1000).toFixed(2)}s`);

// Node CJS build
const cjsStart = Date.now();
console.log("🧱 Building @elizaos/plugin-openrouter for Node (CJS)...");
const cjsResult = await Bun.build({
entrypoints: ["src/index.node.ts"],
outdir: "dist/cjs",
target: "node",
format: "cjs",
sourcemap: "external",
minify: false,
external: [...externalDeps],
});
if (!cjsResult.success) {
console.error(cjsResult.logs);
throw new Error("CJS build failed");
if (!esmResult.success) {
console.error(esmResult.logs);
throw new Error("ESM build failed");
}
try {
const { rename } = await import("node:fs/promises");
await rename("dist/cjs/index.node.js", "dist/cjs/index.node.cjs");
} catch (e) {
console.warn("CJS rename step warning:", e);
}
console.log(`✅ CJS build complete in ${((Date.now() - cjsStart) / 1000).toFixed(2)}s`);
console.log(`✅ Build complete in ${((Date.now() - esmStart) / 1000).toFixed(2)}s`);

// TypeScript declarations
const dtsStart = Date.now();
console.log("📝 Generating TypeScript declarations...");
const { mkdir, writeFile } = await import("node:fs/promises");
const { $ } = await import("bun");
await $`tsc --project tsconfig.build.json`;
await mkdir("dist/node", { recursive: true });
await mkdir("dist/browser", { recursive: true });
await mkdir("dist/cjs", { recursive: true });
await writeFile(
"dist/node/index.d.ts",
`export * from '../index';
export { default } from '../index';
`
);
await writeFile(
"dist/browser/index.d.ts",
`export * from '../index';
export { default } from '../index';
`
);
await writeFile(
"dist/cjs/index.d.ts",
`export * from '../index';
export { default } from '../index';
`
);
console.log(`✅ Declarations generated in ${((Date.now() - dtsStart) / 1000).toFixed(2)}s`);
if (true) { // Always generate .d.ts
console.log("📝 Generating TypeScript declarations...");
try {
await $`tsc --project tsconfig.build.json`;
console.log(`✅ Declarations generated in ${((Date.now() - dtsStart) / 1000).toFixed(2)}s`);
} catch (error) {
console.warn(`⚠️ TypeScript declaration generation had errors (${((Date.now() - dtsStart) / 1000).toFixed(2)}s)`);
console.warn(" Build will continue - fix type errors when possible");

Copilot AI Mar 9, 2026


tsc --project tsconfig.build.json errors are swallowed and the build continues, which can publish a package with missing/invalid .d.ts (and CI may still pass). Consider failing the build on declaration generation errors (or at least make this behavior opt-in via an env flag) so type breaks don’t ship silently.

Suggested change
console.warn(" Build will continue - fix type errors when possible");
console.warn(" Build will continue - fix type errors when possible");
if (process.env.FAIL_BUILD_ON_DTS_ERROR === "true") {
throw (error instanceof Error ? error : new Error("TypeScript declaration generation failed"));
}

}
} else {
console.log("🔍 Type checking...");
try {
Comment on lines 27 to +47

Dead else branch behind if (true) constant

The declaration generation block uses if (true) { ... } else { ... }, making the else branch permanently unreachable dead code:

if (true) { // Always generate .d.ts
  // ... tsc emit
} else {
  // ... tsc --noEmit --incremental (never runs)
}

This clutters the build script and obscures intent. The else block should be removed entirely.

await $`tsc --noEmit --incremental --project tsconfig.build.json`;
console.log(`✅ Type check passed in ${((Date.now() - dtsStart) / 1000).toFixed(2)}s`);
} catch (error) {
console.warn(`⚠️ Type checking had errors (${((Date.now() - dtsStart) / 1000).toFixed(2)}s)`);
console.warn(" Build will continue - fix type errors when possible");
}
}

Unreachable else branch due to if (true)

Low Severity

The if (true) condition on the .d.ts generation block makes the entire else branch (lines 45–54, type-check-only mode) unreachable dead code. This looks like leftover development/debugging code that was accidentally committed.



console.log(`🎉 All builds completed in ${((Date.now() - totalStart) / 1000).toFixed(2)}s`);
console.log(`🎉 All builds finished in ${((Date.now() - totalStart) / 1000).toFixed(2)}s`);
}

await build();


build().catch((err) => {
console.error("Build failed:", err);
process.exit(1);
});
7 changes: 4 additions & 3 deletions package.json
@@ -26,7 +26,7 @@
"dependencies": {
"@ai-sdk/openai": "^2.0.32",
"@ai-sdk/ui-utils": "1.2.11",
"@elizaos/core": "^1.7.0",
"@elizaos/core": "workspace:*",
cursor[bot] marked this conversation as resolved.

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🌐 Web query:

Does publishing an npm package with a runtime dependency set to "workspace:*" work for consumers outside the monorepo, and how do npm, pnpm, yarn, and bun handle rewriting workspace protocol versions on publish?

💡 Result:

No—publishing a package that still has runtime deps like "dep": "workspace:*" generally will not work for consumers outside your monorepo, because workspace: is meant to be resolved against local workspace packages. With plain npm, consumers typically hit errors like “Unsupported URL Type "workspace:"” when installing such a manifest. [1]

How publish/pack rewriting works by package manager:

  • npm (npm publish / npm pack): does not rewrite workspace:* to a real semver range/version (so publishing with workspace:* leaks that spec to the registry and breaks consumers). [1][2]
  • pnpm (pnpm pack / pnpm publish): rewrites workspace: deps at pack/publish time:
    • workspace:* → 1.5.0 (the referenced workspace package’s version)
    • workspace:~ → ~1.5.0
    • workspace:^ → ^1.5.0
    • other workspace:<range> → corresponding semver range (and also rewrites certain alias/path workspace forms). [3]
  • Yarn (Berry) (yarn npm publish): rewrites workspace: deps on publish, e.g. if the workspace version is 3.2.1:
    • workspace:^ → ^3.2.1
    • workspace:~ → ~3.2.1
    • workspace:* → =3.2.1
    • workspace:^3.0.0 → ^3.0.0. [4]
  • Bun (bun publish / publishing behavior): rewrites workspace: versions when publishing, e.g.
    • workspace:* → 1.0.1
    • workspace:^ → ^1.0.1
    • workspace:~ → ~1.0.1
    • workspace:1.0.2 → 1.0.2 (explicit version wins). [5]

Practical takeaway: if you publish libraries meant for external consumers, ensure your published tarball contains only normal semver (or normal npm alias specs like npm:pkg@version)—which happens automatically with pnpm/Yarn/Bun, but not with npm, unless you add your own prepack/prepublish rewrite step. [3][4][5][2]

Sources: [1] https://stackoverflow.com/questions/69220042/npm-err-unsupported-url-type-workspace-workspace, [2] https://nesbitt.io/2026/01/18/workspaces-and-monorepos-in-package-managers.html, [3] https://dimava.github.io/pnpm/workspaces/, [4] https://yarnpkg.com/features/workspaces, [5] https://bun.sh/docs/pm/workspaces


Pin the dependency version or verify the publish tool rewrites workspace:*.

Publishing "@elizaos/core": "workspace:*" as-is will fail for consumers installing outside the monorepo if using npm (which does not rewrite workspace protocol). If using pnpm, Yarn, or Bun, these tools automatically rewrite workspace:* to the actual version at publish time. For npm or custom publish pipelines, either pin a specific semver range here or confirm your release step rewrites workspace dependencies before publishing.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@package.json` at line 29, The package.json currently lists the dependency
"@elizaos/core": "workspace:*", which will break installs for consumers using
npm; either replace the workspace:* spec with a pinned semver (e.g. the actual
version or a semver range) for "@elizaos/core" or ensure your publish pipeline
rewrites workspace:* to the real version before publishing (update the publish
script or tool configuration that publishes package.json); locate the dependency
entry for "@elizaos/core" in package.json and make the change or add a
verification step in your release/publish scripts to rewrite workspace protocol
for npm consumers.
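If the package must remain publishable with plain npm, a prepack rewrite along these lines would mirror what pnpm/Yarn/Bun do automatically (a hypothetical helper, not part of this repo; the exact rewrite rules per manager are listed in the web query result above):

```typescript
// Hypothetical prepack step: rewrites workspace: protocol specs to
// concrete semver before `npm publish`, since npm does not do this.
type DepMap = Record<string, string>;

function rewriteWorkspaceSpecs(deps: DepMap, localVersions: DepMap): DepMap {
  const out: DepMap = {};
  for (const [name, spec] of Object.entries(deps)) {
    if (!spec.startsWith("workspace:")) {
      out[name] = spec; // normal semver spec, pass through untouched
      continue;
    }
    const range = spec.slice("workspace:".length);
    const version = localVersions[name];
    if (!version) throw new Error(`No local version known for ${name}`);
    // workspace:* -> exact version; workspace:^ / workspace:~ -> prefixed range;
    // workspace:<explicit range> -> that range as-is
    out[name] =
      range === "*" ? version : range === "^" || range === "~" ? range + version : range;
  }
  return out;
}
```

Running this over `dependencies` in a `prepack` script (and restoring the original manifest in `postpack`) keeps `workspace:*` out of the published tarball.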

"@openrouter/ai-sdk-provider": "^1.2.0",
"ai": "^5.0.47",
"undici": "^7.16.0"
Expand All @@ -37,7 +37,8 @@
"@types/node": "^24.5.2",
"dotenv": "^17.2.2",
"prettier": "3.6.2",
"typescript": "^5.9.3"
"typescript": "^5.9.3",
"bun-types": "^1.2.21"
},
"scripts": {
"build": "bun run build.ts",
Expand Down Expand Up @@ -185,4 +186,4 @@
}
}
}
}
}
35 changes: 25 additions & 10 deletions src/index.ts
@@ -9,6 +9,7 @@ import {
type ImageGenerationParams,
type TextEmbeddingParams,
} from '@elizaos/core';
import type { Tool, ToolChoice } from 'ai';
import { initializeOpenRouter } from './init';
import { handleTextSmall, handleTextLarge } from './models/text';
import { handleObjectSmall, handleObjectLarge } from './models/object';
@@ -49,21 +50,35 @@ export const openrouterPlugin: Plugin = {
models: {
[ModelType.TEXT_SMALL]: async (
runtime: IAgentRuntime,
params: GenerateTextParams
) => {
return handleTextSmall(runtime, params);
params: GenerateTextParams & {
tools?: Record<string, Tool>;
toolChoice?: ToolChoice<Record<string, Tool>>;
},

Tools and toolChoice params accepted but silently dropped

Medium Severity

The tools and toolChoice parameters are newly added to the TEXT_SMALL and TEXT_LARGE handler signatures, but buildGenerateParams in text.ts never includes them in the generateParams object passed to the AI SDK's generateText/streamText. Any caller providing tools will have them silently ignored, with no error or warning.


): Promise<string> => {
const result = await handleTextSmall(runtime, params);
return typeof result === 'string' ? result : JSON.stringify(result);

JSON.stringify on TextStreamResult breaks streaming

Medium Severity

When handleTextSmall / handleTextLarge returns a TextStreamResult (streaming mode), the fallback JSON.stringify(result) serializes an object containing ReadableStream and Promise properties, producing useless output like {"textStream":{},"text":{},...}. The previous code returned the result directly, preserving the stream. Now streaming callers receive a garbage string instead of an operable stream object.


},
Comment on lines +57 to 60
Copilot AI Mar 9, 2026


handleTextSmall/handleTextLarge can return a TextStreamResult (contains ReadableStream + Promises). JSON.stringify(result) will either drop most fields or produce {}, and it prevents the runtime from consuming the stream (breaking onStreamChunk-style streaming). The model handler should return the streaming shape expected by @elizaos/core instead of stringifying it.

[ModelType.TEXT_LARGE]: async (
runtime: IAgentRuntime,
params: GenerateTextParams
) => {
return handleTextLarge(runtime, params);
params: GenerateTextParams & {
tools?: Record<string, Tool>;
toolChoice?: ToolChoice<Record<string, Tool>>;
},
): Promise<string> => {
const result = await handleTextLarge(runtime, params);
return typeof result === 'string' ? result : JSON.stringify(result);
Comment on lines +58 to +69

Streaming result silently broken when stream: true

When a caller passes stream: true in the params, handleTextSmall/handleTextLarge returns a TextStreamResult object (containing a ReadableStream and several Promise fields). However, the wrapper in index.ts falls through to JSON.stringify(result) on any non-string return value:

const result = await handleTextSmall(runtime, params);
return typeof result === 'string' ? result : JSON.stringify(result);

JSON.stringify on a TextStreamResult will produce something like {"text":{},"usage":{},"finishReason":{}} because Promise and ReadableStream instances do not serialize to JSON. The actual stream is silently discarded, and the caller gets back a useless string instead of an error or the real stream. If streaming is not intended to be exposed through this interface, the stream option should be blocked or the TextStreamResult path should be handled explicitly.

Comment on lines 51 to +69

⚠️ Potential issue | 🟠 Major

The text-model wrappers still break streaming.

handleTextSmall() / handleTextLarge() can now return TextStreamResult, but these entrypoints stringify any non-string result. That drops the stream instead of consuming it, and onStreamChunk is never bridged to the internal stream flag, so the openrouter_test_streaming case at Lines 158-175 won't receive chunks.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/index.ts` around lines 51 - 69, The wrapper currently JSON.stringifys any
non-string return from handleTextSmall/handleTextLarge which drops streaming
results; update both ModelType.TEXT_SMALL and ModelType.TEXT_LARGE handlers to
detect TextStreamResult (or an object with onStreamChunk/stream fields) and pass
it through instead of stringifying, and if present wire params.stream and
params.onStreamChunk into the returned TextStreamResult (e.g., set result.stream
= result.stream ?? params.stream and result.onStreamChunk = result.onStreamChunk
?? params.onStreamChunk) so the internal stream flag and onStreamChunk are
bridged to the caller; otherwise fallback to returning typeof result ===
'string' ? result : JSON.stringify(result).
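A minimal sketch of the pass-through this prompt describes, assuming the TextStreamResult shape defined in src/models/text.ts (a structural check is used here since the interface is type-only and leaves no runtime marker):

```typescript
// Structural stand-in for the plugin's TextStreamResult interface.
interface TextStreamResultLike {
  textStream: unknown;
  text: Promise<string>;
}

// Runtime guard: a streaming result carries both textStream and text.
function isTextStreamResult(value: unknown): value is TextStreamResultLike {
  return (
    typeof value === "object" &&
    value !== null &&
    "textStream" in value &&
    "text" in value
  );
}

// Replacement for the stringify-everything fallback: preserve the stream.
function normalizeTextResult(result: unknown): string | TextStreamResultLike {
  if (typeof result === "string") return result;
  if (isTextStreamResult(result)) return result; // do not serialize a stream
  return JSON.stringify(result); // last-resort fallback for plain objects
}
```

Note this only preserves the stream object; bridging `params.onStreamChunk` to `textStream`, as the prompt also requires, would need an additional consumer loop.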

},
[ModelType.OBJECT_SMALL]: async (runtime: IAgentRuntime, params: ObjectGenerationParams) => {
return handleObjectSmall(runtime, params);
[ModelType.OBJECT_SMALL]: async (
runtime: IAgentRuntime,
params: ObjectGenerationParams,
): Promise<Record<string, unknown>> => {
return handleObjectSmall(runtime, params) as Promise<Record<string, unknown>>;
},
[ModelType.OBJECT_LARGE]: async (runtime: IAgentRuntime, params: ObjectGenerationParams) => {
return handleObjectLarge(runtime, params);
[ModelType.OBJECT_LARGE]: async (
runtime: IAgentRuntime,
params: ObjectGenerationParams,
): Promise<Record<string, unknown>> => {
return handleObjectLarge(runtime, params) as Promise<Record<string, unknown>>;
},
[ModelType.IMAGE_DESCRIPTION]: async (
runtime: IAgentRuntime,
4 changes: 2 additions & 2 deletions src/models/image.ts
@@ -54,10 +54,10 @@ export async function handleImageDescription(
];

try {
const model = openrouter.chat(modelName);
const model = openrouter.chat(modelName) as import("ai").LanguageModel;

const { text: responseText } = await generateText({
model: model,
model,
messages: messages,
maxOutputTokens: maxOutputTokens,
});
28 changes: 18 additions & 10 deletions src/models/text.ts
@@ -1,10 +1,18 @@
import type {
GenerateTextParams,
IAgentRuntime,
TextStreamResult,
} from "@elizaos/core";
import type { GenerateTextParams, IAgentRuntime } from "@elizaos/core";
import { logger, ModelType } from "@elizaos/core";
import { generateText, streamText } from "ai";
import type { LanguageModel } from "ai";

/** Result shape when streaming text (matches AI SDK streamText return) */
export interface TextStreamResult {
textStream: ReadableStream<string>;
text: Promise<string>;
usage: Promise<{ promptTokens: number; completionTokens: number; totalTokens: number } | undefined>;
finishReason: Promise<string | undefined>;
}

/** Params extended with optional stream flag used by this plugin */
type TextParamsWithStream = GenerateTextParams & { stream?: boolean };

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
set -euo pipefail

echo "Text-model entrypoints exposing tools/toolChoice:"
rg -n -C2 'tools\?:|toolChoice\?:' src/index.ts

echo
echo "Fields actually forwarded into generateParams:"
sed -n '14,56p' src/models/text.ts

Repository: elizaos-plugins/plugin-openrouter

Length of output: 2227


🏁 Script executed:

#!/bin/bash
set -euo pipefail

echo "=== Checking if generateText/streamText are called with tools/toolChoice ==="
rg -n 'generateText|streamText' src/models/text.ts -A 5

echo
echo "=== Checking where the params are passed to buildGenerateParams ==="
rg -n 'buildGenerateParams' src/models/text.ts -B 2 -A 10

echo
echo "=== Checking the complete function signatures in index.ts that accept tools ==="
sed -n '50,70p' src/index.ts

echo
echo "=== Checking how buildGenerateParams result is used ==="
grep -n 'buildGenerateParams' src/models/text.ts
rg -n 'const.*generateParams\|return.*generateParams' src/models/text.ts -A 3

Repository: elizaos-plugins/plugin-openrouter

Length of output: 4055


🏁 Script executed:

#!/bin/bash
set -euo pipefail

echo "=== Checking package.json for 'ai' library version ==="
cat package.json | grep -A 2 '"ai"'

echo
echo "=== Checking imports and types from 'ai' library ==="
head -20 src/models/text.ts

echo
echo "=== Checking if tools are used anywhere else in the codebase ==="
rg -n 'tools\|toolChoice' --type ts --type tsx | grep -v 'node_modules' | head -20

Repository: elizaos-plugins/plugin-openrouter

Length of output: 1201


🌐 Web query:

ai library v5.0.47 generateText streamText tools support documentation

💡 Result:

For the ai npm package (Vercel AI SDK) v5.x—including functions like generateText, streamText, and tool calling—the relevant documentation is here:

  • AI SDK Core – Generating Text (overview of generateText and streamText): [1]
  • AI SDK Core – Tools & Tool Calling (how to define tools, tool schemas, and tool-call loops/steps): [2]
  • Reference – AI SDK Core (API index for generateText(), streamText(), and related utilities): [3]
  • Reference – generateText (full parameter/return reference): [4]
  • AI SDK UI – Chatbot with Tool Calling (how tool calls stream and how to use tools with useChat + streamText): [5]
  • (Background) AI SDK 5 announcement (concepts like step control / tool loop primitives): [6]

Sources: [1] [2] [3] [4] [5] [6]


🌐 Web query:

vercel ai sdk v5 generateText streamText function signature parameters tools toolChoice

💡 Result:

generateText / streamText (AI SDK v5) — tools + toolChoice params

tools

  • Type: ToolSet (an object/map of tool names → tool definitions). Tools must be model-supported for tool calling.
    (v5.ai-sdk.dev)

Example shape:

tools: {
  weather: tool({
    description: '...',
    inputSchema: z.object({ location: z.string() }),
    execute: async ({ location }, options) => { /* ... */ },
  }),
}

(v5.ai-sdk.dev)

toolChoice?

  • Type:
toolChoice?:
  | 'auto'
  | 'none'
  | 'required'
  | { type: 'tool'; toolName: string }
  • Meaning: default 'auto'; 'none' disables tool execution; 'required' forces tool usage; {type:'tool', toolName} forces a specific tool.
    (v5.ai-sdk.dev)

Related (often used with tools)

  • activeTools?: Array<TOOLNAME> / Array<keyof TOOLS> — limit which tools are available for the model to call (all active by default). (v5.ai-sdk.dev)
  • prepareStep?: (...) => PrepareStepResult<TOOLS> — can override per-step toolChoice and activeTools. (v5.ai-sdk.dev)



Forward tools and toolChoice into the generated request.

The Vercel AI SDK v5 (which is the version used here) supports tools and toolChoice parameters in both generateText and streamText. Currently, src/index.ts lines 53–66 accept these fields for text models, but buildGenerateParams() never includes them in the request object passed to generateText/streamText. Tool-enabled calls silently degrade to plain text generation.

Also extend TextParamsWithStream to include tool parameters so direct callers can express the capability.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/models/text.ts` at line 15, Add forwarding for tool parameters: extend
the TextParamsWithStream type to include tools?: GenerateTextParams['tools'] and
toolChoice?: GenerateTextParams['toolChoice'], then update
buildGenerateParams(...) to copy through tools and toolChoice from the incoming
params into the object returned for generateText/streamText (same place where
you set model, prompt, temperature, etc.). Make sure both the non-stream and
stream branches (the paths that call generateText and streamText) include these
fields so tool-enabled requests are passed intact to generateText/streamText.
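A minimal sketch of the forwarding step the prompt asks for, with the ai package's Tool/ToolChoice types simplified to placeholders (the real buildGenerateParams would spread these into its existing request object):

```typescript
// Simplified stand-ins for the ai package's ToolSet / ToolChoice types.
type ToolParams = { tools?: Record<string, unknown>; toolChoice?: unknown };

// Copy tools/toolChoice into the request only when the caller supplied
// them, so generateText/streamText defaults apply otherwise.
function withToolParams<T extends object>(base: T, params: ToolParams): T & ToolParams {
  return {
    ...base,
    ...(params.tools !== undefined && { tools: params.tools }),
    ...(params.toolChoice !== undefined && { toolChoice: params.toolChoice }),
  };
}
```

Applied in both the `generateText` and `streamText` branches, this keeps tool-enabled calls from silently degrading to plain text generation.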


import { createOpenRouterProvider } from "../providers";
import { getSmallModel, getLargeModel } from "../utils/config";
@@ -16,7 +24,7 @@ import { emitModelUsageEvent } from "../utils/events";
function buildGenerateParams(
runtime: IAgentRuntime,
modelType: typeof ModelType.TEXT_SMALL | typeof ModelType.TEXT_LARGE,
params: GenerateTextParams,
params: TextParamsWithStream,
) {
const { prompt, stopSequences = [] } = params;
const temperature = params.temperature ?? 0.7;
Comment on lines 24 to 30
Copilot AI Mar 9, 2026


buildGenerateParams takes params that (via src/index.ts) may include tools / toolChoice, but these fields are currently ignored and never forwarded into generateText/streamText params. If tool calling is intended for AI SDK v5 support, include these fields in generateParams (or remove the added typing to avoid implying support).

@@ -34,7 +42,7 @@
modelType === ModelType.TEXT_SMALL ? "TEXT_SMALL" : "TEXT_LARGE";

const generateParams: Parameters<typeof generateText>[0] = {
model: openrouter.chat(modelName),
model: openrouter.chat(modelName) as LanguageModel,
prompt: prompt,
system: runtime.character.system ?? undefined,
temperature: temperature,
@@ -88,7 +96,7 @@ function handleStreamingGeneration(
async function generateTextWithModel(
runtime: IAgentRuntime,
modelType: typeof ModelType.TEXT_SMALL | typeof ModelType.TEXT_LARGE,
params: GenerateTextParams,
params: TextParamsWithStream,
): Promise<string | TextStreamResult> {
const { generateParams, modelName, modelLabel, prompt } =
buildGenerateParams(runtime, modelType, params);
Comment on lines 96 to 102
Copilot AI Mar 9, 2026


generateTextWithModel now accepts a params shape that can include onStreamChunk (used by the plugin’s streaming test in src/index.ts), but the implementation only enables streaming when params.stream is set and never consumes streamResult.textStream to invoke onStreamChunk. As a result, callers providing onStreamChunk won’t receive any chunks. Consider mapping onStreamChunk -> streaming mode and piping the stream to the callback.

@@ -127,7 +135,7 @@
*/
export async function handleTextSmall(
runtime: IAgentRuntime,
params: GenerateTextParams,
params: TextParamsWithStream,
): Promise<string | TextStreamResult> {
return generateTextWithModel(runtime, ModelType.TEXT_SMALL, params);
}
@@ -141,7 +149,7 @@ export async function handleTextSmall(
*/
export async function handleTextLarge(
runtime: IAgentRuntime,
params: GenerateTextParams,
params: TextParamsWithStream,
): Promise<string | TextStreamResult> {
return generateTextWithModel(runtime, ModelType.TEXT_LARGE, params);
}