Conversation
…eam attribution and streamText
Walkthrough

Build script and TypeScript config updated for single ESM output and dynamic externals; model handler types extended (tools/toolChoice, streaming results, object outputs); OpenRouter provider wrapped to expose LanguageModel with supportedUrls; .gitignore and package.json adjusted.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant R as Runtime
    participant P as OpenRouter Provider (proxy)
    participant M as LanguageModel (augmented)
    R->>P: createOpenRouterProvider(runtime)
    P->>P: proxy wraps original provider.chat
    R->>P: call chat(modelId)
    P->>P: original.chat(modelId) -> model
    P->>M: withSupportedUrls(model) -> modelWithUrls
    P-->>R: return modelWithUrls (LanguageModel with supportedUrls)
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
```diff
+    const result = await handleTextSmall(runtime, params);
+    return typeof result === 'string' ? result : JSON.stringify(result);
   },
   [ModelType.TEXT_LARGE]: async (
     runtime: IAgentRuntime,
-    params: GenerateTextParams
-  ) => {
-    return handleTextLarge(runtime, params);
+    params: GenerateTextParams & {
+      tools?: Record<string, Tool>;
+      toolChoice?: ToolChoice<Record<string, Tool>>;
+    },
+  ): Promise<string> => {
+    const result = await handleTextLarge(runtime, params);
+    return typeof result === 'string' ? result : JSON.stringify(result);
```
Streaming result silently broken when stream: true
When a caller passes stream: true in the params, handleTextSmall/handleTextLarge returns a TextStreamResult object (containing a ReadableStream and several Promise fields). However, the wrapper in index.ts falls through to JSON.stringify(result) on any non-string return value:
```ts
const result = await handleTextSmall(runtime, params);
return typeof result === 'string' ? result : JSON.stringify(result);
```

JSON.stringify on a TextStreamResult will produce something like `{"text":{},"usage":{},"finishReason":{}}` because Promise and ReadableStream instances do not serialize to JSON. The actual stream is silently discarded, and the caller gets back a useless string instead of an error or the real stream. If streaming is not intended to be exposed through this interface, the stream option should be blocked or the TextStreamResult path should be handled explicitly.
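The failure mode is easy to reproduce in isolation. The following standalone sketch uses field names that mirror the description in this comment (not the plugin's actual TextStreamResult type) to show what JSON.stringify produces for promise- and stream-valued fields:

```javascript
// Promises and ReadableStreams carry no own enumerable properties,
// so JSON.stringify renders each of them as an empty object.
const fakeStreamResult = {
  textStream: new ReadableStream(), // global in Node 18+
  text: Promise.resolve("hello"),
  finishReason: Promise.resolve("stop"),
};

const serialized = JSON.stringify(fakeStreamResult);
console.log(serialized);
// Every stream and resolved value is lost; only empty objects remain.
```

The same empty-object output occurs for any object whose data lives behind accessors or internal slots, which is why the stringify fallback cannot be used as a catch-all.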
```ts
function withSupportedUrls<T extends object>(
  model: T
): T & { supportedUrls: Record<string, RegExp[]> } {
  return Object.assign({}, model, { supportedUrls: {} });
}
```
Object.assign shallow copy loses prototype methods on the model
withSupportedUrls creates a plain object copy of the model using Object.assign({}, model, ...):
```ts
function withSupportedUrls<T extends object>(model: T): T & { supportedUrls: Record<string, RegExp[]> } {
  return Object.assign({}, model, { supportedUrls: {} });
}
```

Object.assign only copies own enumerable properties. Any methods defined on the model's prototype chain (e.g. doGenerate, doStream, etc. from the AI SDK LanguageModel interface) will not be present on the resulting plain object. This means the wrapped model may appear to satisfy the TypeScript type but will throw at runtime when the AI SDK attempts to call prototype methods.
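The prototype loss is observable with any class instance. A minimal standalone sketch (the DemoModel class is hypothetical, standing in for an AI SDK model object):

```javascript
// A class method lives on the prototype, not as an own property.
class DemoModel {
  doGenerate() {
    return "generated";
  }
}

const model = new DemoModel();

// Object.assign only copies own enumerable properties, so the method is lost.
const copy = Object.assign({}, model, { supportedUrls: {} });
console.log(typeof copy.doGenerate); // not a function anymore

// Preserving the prototype chain with Object.create keeps the method reachable.
const preserved = Object.assign(
  Object.create(Object.getPrototypeOf(model)),
  model,
  { supportedUrls: {} },
);
console.log(typeof preserved.doGenerate);
```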
Consider using Object.setPrototypeOf or Object.create to preserve the prototype chain, or simply mutate the existing object:
```ts
function withSupportedUrls<T extends object>(model: T): T & { supportedUrls: Record<string, RegExp[]> } {
  (model as any).supportedUrls = {};
  return model as T & { supportedUrls: Record<string, RegExp[]> };
}
```

```ts
return new Proxy(provider, {
  get(target, prop) {
    if (prop === "chat") {
      return (modelId: string) =>
        withSupportedUrls(origChat(modelId)) as unknown as LanguageModel;
    }
    return Reflect.get(target, prop);
  },
```
Proxy Reflect.get missing the receiver argument
In the Proxy trap, Reflect.get(target, prop) is called without passing the third receiver argument:
```ts
return Reflect.get(target, prop);
```

Without the receiver, any accessor (getter) properties on the target whose implementation references this will see target as this rather than the proxy itself. This breaks the Proxy contract and can cause subtle issues when getters are accessed through the proxy. The correct form is:
```diff
 return new Proxy(provider, {
   get(target, prop) {
     if (prop === "chat") {
       return (modelId: string) =>
         withSupportedUrls(origChat(modelId)) as unknown as LanguageModel;
     }
-    return Reflect.get(target, prop);
+    return Reflect.get(target, prop, receiver);
   },
```
The trap signature should also accept receiver:
```ts
get(target, prop, receiver) {
  if (prop === "chat") { ... }
  return Reflect.get(target, prop, receiver);
}
```
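The receiver difference can be demonstrated with a getter that returns this. A small standalone sketch (not the plugin's actual provider object):

```javascript
const base = {
  get self() {
    return this; // which object `this` is depends on the receiver
  },
};

// Dropping the receiver: the getter runs with `this === target`.
const dropsReceiver = new Proxy(base, {
  get(target, prop) {
    return Reflect.get(target, prop);
  },
});

// Forwarding the receiver: the getter runs with `this === proxy`.
const forwardsReceiver = new Proxy(base, {
  get(target, prop, receiver) {
    return Reflect.get(target, prop, receiver);
  },
});

console.log(dropsReceiver.self === base);                // getter saw the target
console.log(forwardsReceiver.self === forwardsReceiver); // getter saw the proxy
```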
```ts
  external: externalDeps,
});
console.log(`✅ Browser build complete in ${((Date.now() - browserStart) / 1000).toFixed(2)}s`);

// Node CJS build
const cjsStart = Date.now();
console.log("🧱 Building @elizaos/plugin-openrouter for Node (CJS)...");
const cjsResult = await Bun.build({
  entrypoints: ["src/index.node.ts"],
  outdir: "dist/cjs",
  target: "node",
  format: "cjs",
  sourcemap: "external",
  minify: false,
  external: [...externalDeps],
});
if (!cjsResult.success) {
  console.error(cjsResult.logs);
  throw new Error("CJS build failed");
if (!esmResult.success) {
  console.error(esmResult.logs);
  throw new Error("ESM build failed");
}
try {
  const { rename } = await import("node:fs/promises");
  await rename("dist/cjs/index.node.js", "dist/cjs/index.node.cjs");
} catch (e) {
  console.warn("CJS rename step warning:", e);
}
console.log(`✅ CJS build complete in ${((Date.now() - cjsStart) / 1000).toFixed(2)}s`);
console.log(`✅ Build complete in ${((Date.now() - esmStart) / 1000).toFixed(2)}s`);

// TypeScript declarations
const dtsStart = Date.now();
console.log("📝 Generating TypeScript declarations...");
const { mkdir, writeFile } = await import("node:fs/promises");
const { $ } = await import("bun");
await $`tsc --project tsconfig.build.json`;
await mkdir("dist/node", { recursive: true });
await mkdir("dist/browser", { recursive: true });
await mkdir("dist/cjs", { recursive: true });
await writeFile(
  "dist/node/index.d.ts",
  `export * from '../index';
export { default } from '../index';
`
);
await writeFile(
  "dist/browser/index.d.ts",
  `export * from '../index';
export { default } from '../index';
`
);
await writeFile(
  "dist/cjs/index.d.ts",
  `export * from '../index';
export { default } from '../index';
`
);
console.log(`✅ Declarations generated in ${((Date.now() - dtsStart) / 1000).toFixed(2)}s`);
if (true) { // Always generate .d.ts
  console.log("📝 Generating TypeScript declarations...");
  try {
    await $`tsc --project tsconfig.build.json`;
    console.log(`✅ Declarations generated in ${((Date.now() - dtsStart) / 1000).toFixed(2)}s`);
  } catch (error) {
    console.warn(`⚠️ TypeScript declaration generation had errors (${((Date.now() - dtsStart) / 1000).toFixed(2)}s)`);
    console.warn(" Build will continue - fix type errors when possible");
  }
} else {
  console.log("🔍 Type checking...");
  try {
```
Dead else branch behind if (true) constant
The declaration generation block uses if (true) { ... } else { ... }, making the else branch permanently unreachable dead code:

```ts
if (true) { // Always generate .d.ts
  // ... tsc emit
} else {
  // ... tsc --noEmit --incremental (never runs)
}
```

This clutters the build script and obscures intent. The else block should be removed entirely.
```ts
  },
): Promise<string> => {
  const result = await handleTextSmall(runtime, params);
  return typeof result === 'string' ? result : JSON.stringify(result);
```
JSON.stringify on TextStreamResult breaks streaming
Medium Severity
When handleTextSmall / handleTextLarge returns a TextStreamResult (streaming mode), the fallback JSON.stringify(result) serializes an object containing ReadableStream and Promise properties, producing useless output like {"textStream":{},"text":{},...}. The previous code returned the result directly, preserving the stream. Now streaming callers receive a garbage string instead of an operable stream object.
Additional Locations (1)
```ts
      console.warn(`⚠️ Type checking had errors (${((Date.now() - dtsStart) / 1000).toFixed(2)}s)`);
      console.warn(" Build will continue - fix type errors when possible");
    }
  }
```
```ts
  external: [...externalDeps],
});
console.log(`✅ Node build complete in ${((Date.now() - nodeStart) / 1000).toFixed(2)}s`);
```
Build output paths don't match package.json exports
High Severity
The rewritten build.ts outputs a single ESM bundle to dist/index.js (from entrypoint src/index.ts, outdir dist), but package.json still references the old multi-target paths: dist/node/index.node.js, dist/cjs/index.node.cjs, dist/browser/index.browser.js, and dist/node/index.d.ts. None of these files will exist after a build, so the package will fail to resolve for any consumer — Node imports, CJS requires, browser imports, and TypeScript type resolution will all break.
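A minimal manifest matching the single-ESM layout might look like the following. The exact field values are an assumption based on the dist/index.js output described above, not the project's actual package.json:

```json
{
  "main": "dist/index.js",
  "module": "dist/index.js",
  "types": "dist/index.d.ts",
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/index.js"
    }
  }
}
```

Whichever layout is kept, the exports map and the build output need to be changed together so consumers resolve files that actually exist.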
Pull request overview
Updates the OpenRouter plugin to better interoperate with Vercel AI SDK v1/v2 expectations, including adapting provider model shapes and adjusting build/type tooling.
Changes:
- Adapt OpenRouter provider `chat()` to return an AI SDK `LanguageModel`-compatible shape (adds `supportedUrls`).
- Update text model plumbing/types to align with AI SDK v5 (`LanguageModel`, streaming result shape, tool-related param types).
- Simplify build output to a single ESM build and adjust TypeScript build settings / repo ignores.
Reviewed changes
Copilot reviewed 7 out of 8 changed files in this pull request and generated 7 comments.
Show a summary per file
| File | Description |
|---|---|
| tsconfig.build.json | Tightens TS build type checking and adds Node typings. |
| src/providers/openrouter.ts | Wraps OpenRouter provider to satisfy AI SDK v5 model expectations. |
| src/models/text.ts | Refactors text generation params/types and adds a local streaming result shape. |
| src/models/image.ts | Casts OpenRouter chat model to AI SDK LanguageModel. |
| src/index.ts | Extends text model params for tools and changes return handling. |
| package.json | Updates dependencies/devDependencies (currently with a merge conflict). |
| build.ts | Reworks build pipeline to emit only one ESM build and runs tsc for declarations. |
| .gitignore | Expands ignored artifacts (build outputs, IDE files, caches, logs). |
```ts
const esmStart = Date.now();
console.log("🔨 Building @elizaos/plugin-openrouter...");
const esmResult = await Bun.build({
  entrypoints: ["src/index.ts"],
  outdir: "dist",
  target: "node",
  format: "esm",
  sourcemap: "external",
  minify: false,
  external: [...externalDeps],
});
console.log(`✅ Node build complete in ${((Date.now() - nodeStart) / 1000).toFixed(2)}s`);

// Browser build
const browserStart = Date.now();
console.log("🌐 Building @elizaos/plugin-openrouter for Browser...");
await Bun.build({
  entrypoints: ["src/index.browser.ts"],
  outdir: "dist/browser",
  target: "browser",
  format: "esm",
  sourcemap: "external",
  minify: false,
  external: externalDeps,
});
```
The build now only outputs a single ESM bundle to dist/ (entrypoint src/index.ts), but the package metadata expects dist/node/index.node.js, dist/browser/index.browser.js, and dist/cjs/index.node.cjs (see package.json exports/main/module/types). As-is, consumers will get missing-file runtime errors. Either restore the multi-target outputs/rename steps, or update package.json entrypoints/exports/types to match the new output layout.
```ts
    console.log(`✅ Declarations generated in ${((Date.now() - dtsStart) / 1000).toFixed(2)}s`);
  } catch (error) {
    console.warn(`⚠️ TypeScript declaration generation had errors (${((Date.now() - dtsStart) / 1000).toFixed(2)}s)`);
    console.warn(" Build will continue - fix type errors when possible");
```
tsc --project tsconfig.build.json errors are swallowed and the build continues, which can publish a package with missing/invalid .d.ts (and CI may still pass). Consider failing the build on declaration generation errors (or at least make this behavior opt-in via an env flag) so type breaks don’t ship silently.
```diff
     console.warn(" Build will continue - fix type errors when possible");
+    if (process.env.FAIL_BUILD_ON_DTS_ERROR === "true") {
+      throw (error instanceof Error ? error : new Error("TypeScript declaration generation failed"));
+    }
```
```ts
): Promise<string> => {
  const result = await handleTextSmall(runtime, params);
  return typeof result === 'string' ? result : JSON.stringify(result);
},
```
handleTextSmall/handleTextLarge can return a TextStreamResult (contains ReadableStream + Promises). JSON.stringify(result) will either drop most fields or produce {}, and it prevents the runtime from consuming the stream (breaking onStreamChunk-style streaming). The model handler should return the streaming shape expected by @elizaos/core instead of stringifying it.
```diff
 async function generateTextWithModel(
   runtime: IAgentRuntime,
   modelType: typeof ModelType.TEXT_SMALL | typeof ModelType.TEXT_LARGE,
-  params: GenerateTextParams,
+  params: TextParamsWithStream,
 ): Promise<string | TextStreamResult> {
   const { generateParams, modelName, modelLabel, prompt } =
     buildGenerateParams(runtime, modelType, params);
```
generateTextWithModel now accepts a params shape that can include onStreamChunk (used by the plugin’s streaming test in src/index.ts), but the implementation only enables streaming when params.stream is set and never consumes streamResult.textStream to invoke onStreamChunk. As a result, callers providing onStreamChunk won’t receive any chunks. Consider mapping onStreamChunk -> streaming mode and piping the stream to the callback.
```diff
 function buildGenerateParams(
   runtime: IAgentRuntime,
   modelType: typeof ModelType.TEXT_SMALL | typeof ModelType.TEXT_LARGE,
-  params: GenerateTextParams,
+  params: TextParamsWithStream,
 ) {
   const { prompt, stopSequences = [] } = params;
   const temperature = params.temperature ?? 0.7;
```
buildGenerateParams takes params that (via src/index.ts) may include tools / toolChoice, but these fields are currently ignored and never forwarded into generateText/streamText params. If tool calling is intended for AI SDK v5 support, include these fields in generateParams (or remove the added typing to avoid implying support).
```ts
function withSupportedUrls<T extends object>(
  model: T
): T & { supportedUrls: Record<string, RegExp[]> } {
  return Object.assign({}, model, { supportedUrls: {} });
```
withSupportedUrls creates a shallow clone via Object.assign({}, model, ...), which drops the original prototype and non-enumerable properties. If the returned model relies on prototype methods/accessors, this can break model behavior at runtime. Prefer mutating/augmenting the original model object (e.g., define supportedUrls on it) instead of cloning it.
```diff
-  return Object.assign({}, model, { supportedUrls: {} });
+  (model as any).supportedUrls ??= {};
+  return model as T & { supportedUrls: Record<string, RegExp[]> };
```
Actionable comments posted: 4
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
package.json (1)
Lines 29-44: ⚠️ Potential issue | 🔴 Critical: Resolve the leftover merge conflict markers.

`package.json` is invalid JSON in its current state. `build.ts` line 6 now parses this file at startup, so the build fails before bundling or declaration generation even begins. Pick the intended `@elizaos/core` dependency line and remove the conflict markers.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@package.json` around lines 29 - 44, package.json contains leftover git conflict markers (<<<<<<<, =======, >>>>>>>) around the dependencies block which makes the file invalid; remove the conflict markers and keep a single `@elizaos/core` entry (choose the intended value, e.g., "@elizaos/core": "workspace:*" or "^1.7.0") and ensure the surrounding JSON structure is valid so the additional dependencies ("@openrouter/ai-sdk-provider", "ai", "undici") and the devDependencies block are properly placed under "dependencies" and "devDependencies". Update package.json so build.ts can parse it at startup without errors.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@build.ts`:
- Around line 18-28: The package's published entrypoints still reference
non-existent directories (dist/node/*, dist/cjs/*, dist/browser/*) while the
build (see esmStart and the Bun.build call which outputs to outdir "dist" with
format "esm") now emits a single ESM dist tree; update package.json
exports/exportsField and main/module/browser fields to point to the new single
dist output (e.g., entrypoint(s) under "dist/") and remove or replace any
references to dist/node, dist/cjs, and dist/browser so consumers resolve the
actual files produced by Bun.build.
- Around line 12-16: The build script currently runs the repo-wide clean via the
pkg.scripts?.clean check and await $`bun run clean`.quiet(), which deletes
node_modules and hides failures; replace this with a build-scoped cleanup (e.g.,
invoke a dedicated "clean:build" or "clean:dist" npm script) or remove the clean
invocation entirely so node_modules/tsc remain intact, and stop suppressing
errors so declaration generation failures surface (remove .quiet() or the
surrounding catch that swallows errors). Update both occurrences (the one using
pkg.scripts?.clean / await $`bun run clean`.quiet() and the similar block at
lines 36-44) to call the build-only clean script or no clean and allow errors to
propagate.
In `@src/index.ts`:
- Around line 51-69: The wrapper currently JSON.stringifys any non-string return
from handleTextSmall/handleTextLarge which drops streaming results; update both
ModelType.TEXT_SMALL and ModelType.TEXT_LARGE handlers to detect
TextStreamResult (or an object with onStreamChunk/stream fields) and pass it
through instead of stringifying, and if present wire params.stream and
params.onStreamChunk into the returned TextStreamResult (e.g., set result.stream
= result.stream ?? params.stream and result.onStreamChunk = result.onStreamChunk
?? params.onStreamChunk) so the internal stream flag and onStreamChunk are
bridged to the caller; otherwise fallback to returning typeof result ===
'string' ? result : JSON.stringify(result).
In `@src/models/text.ts`:
- Line 15: Add forwarding for tool parameters: extend the TextParamsWithStream
type to include tools?: GenerateTextParams['tools'] and toolChoice?:
GenerateTextParams['toolChoice'], then update buildGenerateParams(...) to copy
through tools and toolChoice from the incoming params into the object returned
for generateText/streamText (same place where you set model, prompt,
temperature, etc.). Make sure both the non-stream and stream branches (the paths
that call generateText and streamText) include these fields so tool-enabled
requests are passed intact to generateText/streamText.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: afcde0eb-fe9c-4e38-93b7-dc6847a9e901
📒 Files selected for processing (8)
- .gitignore
- build.ts
- package.json
- src/index.ts
- src/models/image.ts
- src/models/text.ts
- src/providers/openrouter.ts
- tsconfig.build.json
```ts
// Use the clean script from package.json
if (pkg.scripts?.clean) {
  console.log("🧹 Cleaning...");
  await $`bun run clean`.quiet();
}
```
Don't run the repo-wide clean script as part of every build.
package.json Line 50 deletes node_modules. Running that here removes the local tsc binary and installed typings before the declaration step, and the catch block below turns the resulting failure into a green build. Use a build-only cleanup for artifacts, and let declaration generation fail the build.
Suggested fix

```diff
- // Use the clean script from package.json
- if (pkg.scripts?.clean) {
-   console.log("🧹 Cleaning...");
-   await $`bun run clean`.quiet();
- }
+ console.log("🧹 Cleaning build artifacts...");
+ await $`rm -rf dist .turbo-tsconfig.json tsconfig.tsbuildinfo`.quiet();
@@
- } catch (error) {
-   console.warn(`⚠️ TypeScript declaration generation had errors (${((Date.now() - dtsStart) / 1000).toFixed(2)}s)`);
-   console.warn(" Build will continue - fix type errors when possible");
+ } catch (error) {
+   console.error(`❌ TypeScript declaration generation failed (${((Date.now() - dtsStart) / 1000).toFixed(2)}s)`);
+   throw error;
  }
```

Also applies to: 36-44
```ts
const esmStart = Date.now();
console.log("🔨 Building @elizaos/plugin-openrouter...");
const esmResult = await Bun.build({
  entrypoints: ["src/index.ts"],
  outdir: "dist",
  target: "node",
  format: "esm",
  sourcemap: "external",
  minify: false,
  external: [...externalDeps],
});
console.log(`✅ Node build complete in ${((Date.now() - nodeStart) / 1000).toFixed(2)}s`);

// Browser build
const browserStart = Date.now();
console.log("🌐 Building @elizaos/plugin-openrouter for Browser...");
await Bun.build({
  entrypoints: ["src/index.browser.ts"],
  outdir: "dist/browser",
  target: "browser",
  format: "esm",
  sourcemap: "external",
  minify: false,
  external: externalDeps,
});
```
Update the published entrypoints to match the new build layout.
This build now emits a single ESM dist tree, but package.json Lines 5-20 still publish dist/node/*, dist/cjs/*, and dist/browser/*. In this state, consumers will resolve files that are never created after the build change.
```diff
   [ModelType.TEXT_SMALL]: async (
     runtime: IAgentRuntime,
-    params: GenerateTextParams
-  ) => {
-    return handleTextSmall(runtime, params);
+    params: GenerateTextParams & {
+      tools?: Record<string, Tool>;
+      toolChoice?: ToolChoice<Record<string, Tool>>;
+    },
+  ): Promise<string> => {
+    const result = await handleTextSmall(runtime, params);
+    return typeof result === 'string' ? result : JSON.stringify(result);
   },
   [ModelType.TEXT_LARGE]: async (
     runtime: IAgentRuntime,
-    params: GenerateTextParams
-  ) => {
-    return handleTextLarge(runtime, params);
+    params: GenerateTextParams & {
+      tools?: Record<string, Tool>;
+      toolChoice?: ToolChoice<Record<string, Tool>>;
+    },
+  ): Promise<string> => {
+    const result = await handleTextLarge(runtime, params);
+    return typeof result === 'string' ? result : JSON.stringify(result);
```
The text-model wrappers still break streaming.
handleTextSmall() / handleTextLarge() can now return TextStreamResult, but these entrypoints stringify any non-string result. That drops the stream instead of consuming it, and onStreamChunk is never bridged to the internal stream flag, so the openrouter_test_streaming case at Lines 158-175 won't receive chunks.
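One way to implement the pass-through described here can be sketched standalone. The stream detection (checking for a textStream field) and the function name normalizeTextResult are assumptions for illustration, not the plugin's actual code:

```javascript
// Hypothetical sketch: only stringify plain object outputs; return
// strings and stream-shaped results untouched so the runtime can
// consume the chunks instead of receiving a useless serialization.
function normalizeTextResult(result) {
  if (typeof result === "string") return result;
  // A streaming result carries a ReadableStream; pass it through as-is.
  if (result && typeof result === "object" && "textStream" in result) {
    return result;
  }
  // Structured (object) outputs are serialized for string-only callers.
  return JSON.stringify(result);
}

console.log(normalizeTextResult("hello"));
console.log(normalizeTextResult({ answer: 42 }));
```

Whatever shape is chosen, the key property is that the stream object is returned by reference, never serialized.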
```ts
}

/** Params extended with optional stream flag used by this plugin */
type TextParamsWithStream = GenerateTextParams & { stream?: boolean };
```
🧩 Analysis chain
🏁 Script executed:

```bash
#!/bin/bash
set -euo pipefail
echo "Text-model entrypoints exposing tools/toolChoice:"
rg -n -C2 'tools\?:|toolChoice\?:' src/index.ts
echo
echo "Fields actually forwarded into generateParams:"
sed -n '14,56p' src/models/text.ts
```

Repository: elizaos-plugins/plugin-openrouter
🏁 Script executed:

```bash
#!/bin/bash
set -euo pipefail
echo "=== Checking if generateText/streamText are called with tools/toolChoice ==="
rg -n 'generateText|streamText' src/models/text.ts -A 5
echo
echo "=== Checking where the params are passed to buildGenerateParams ==="
rg -n 'buildGenerateParams' src/models/text.ts -B 2 -A 10
echo
echo "=== Checking the complete function signatures in index.ts that accept tools ==="
sed -n '50,70p' src/index.ts
echo
echo "=== Checking how buildGenerateParams result is used ==="
grep -n 'buildGenerateParams' src/models/text.ts
rg -n 'const.*generateParams\|return.*generateParams' src/models/text.ts -A 3
```

Repository: elizaos-plugins/plugin-openrouter
🏁 Script executed:

```bash
#!/bin/bash
set -euo pipefail
echo "=== Checking package.json for 'ai' library version ==="
cat package.json | grep -A 2 '"ai"'
echo
echo "=== Checking imports and types from 'ai' library ==="
head -20 src/models/text.ts
echo
echo "=== Checking if tools are used anywhere else in the codebase ==="
rg -n 'tools\|toolChoice' --type ts --type tsx | grep -v 'node_modules' | head -20
```

Repository: elizaos-plugins/plugin-openrouter
🌐 Web query:
ai library v5.0.47 generateText streamText tools support documentation
💡 Result:
For the ai npm package (Vercel AI SDK) v5.x—including functions like generateText, streamText, and tool calling—the relevant documentation is here:
- AI SDK Core – Generating Text (overview of `generateText` and `streamText`): [1]
- AI SDK Core – Tools & Tool Calling (how to define `tools`, tool schemas, and tool-call loops/steps): [2]
- Reference – AI SDK Core (API index for `generateText()`, `streamText()`, and related utilities): [3]
- Reference – `generateText` (full parameter/return reference): [4]
- AI SDK UI – Chatbot with Tool Calling (how tool calls stream and how to use tools with `useChat` + `streamText`): [5]
- (Background) AI SDK 5 announcement (concepts like step control / tool loop primitives): [6]

Sources: [1] [2] [3] [4] [5] [6]
🌐 Web query:
vercel ai sdk v5 generateText streamText function signature parameters tools toolChoice
💡 Result:
generateText / streamText (AI SDK v5) — tools + toolChoice params
tools
- Type: `ToolSet` (an object/map of tool names → tool definitions). Tools must be model-supported for tool calling. (v5.ai-sdk.dev)

Example shape:

```ts
tools: {
  weather: tool({
    description: '...',
    inputSchema: z.object({ location: z.string() }),
    execute: async ({ location }, options) => { /* ... */ },
  }),
}
```

toolChoice?
- Type:

```ts
toolChoice?:
  | 'auto'
  | 'none'
  | 'required'
  | { type: 'tool'; toolName: string }
```

- Meaning: default `'auto'`; `'none'` disables tool execution; `'required'` forces tool usage; `{ type: 'tool', toolName }` forces a specific tool. (v5.ai-sdk.dev)

Related (often used with tools)
- `activeTools?: Array<TOOLNAME>` / `Array<keyof TOOLS>`: limit which tools are available for the model to call (all active by default). (v5.ai-sdk.dev)
- `prepareStep?: (...) => PrepareStepResult<TOOLS>`: can override per-step `toolChoice` and `activeTools`. (v5.ai-sdk.dev)
Citations:
- 1: https://v5.ai-sdk.dev/docs/reference/ai-sdk-core/generate-text
- 2: https://v5.ai-sdk.dev/docs/ai-sdk-core/tools-and-tool-calling?utm_source=openai
- 3: https://v5.ai-sdk.dev/docs/reference/ai-sdk-core/generate-text
- 4: https://v5.ai-sdk.dev/docs/reference/ai-sdk-core/generate-text
- 5: https://v5.ai-sdk.dev/docs/reference/ai-sdk-core/generate-text
Forward tools and toolChoice into the generated request.
The Vercel AI SDK v5 (which is the version used here) supports tools and toolChoice parameters in both generateText and streamText. Currently, src/index.ts lines 53–66 accept these fields for text models, but buildGenerateParams() never includes them in the request object passed to generateText/streamText. Tool-enabled calls silently degrade to plain text generation.
Also extend TextParamsWithStream to include tool parameters so direct callers can express the capability.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/models/text.ts` at line 15, Add forwarding for tool parameters: extend
the TextParamsWithStream type to include tools?: GenerateTextParams['tools'] and
toolChoice?: GenerateTextParams['toolChoice'], then update
buildGenerateParams(...) to copy through tools and toolChoice from the incoming
params into the object returned for generateText/streamText (same place where
you set model, prompt, temperature, etc.). Make sure both the non-stream and
stream branches (the paths that call generateText and streamText) include these
fields so tool-enabled requests are passed intact to generateText/streamText.
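Applied to the plugin, the forwarding fix might look like the sketch below. The surrounding types are minimal hypothetical stand-ins (the real `TextParamsWithStream` and `buildGenerateParams` live in `src/models/text.ts`); the point is only the pass-through of `tools` and `toolChoice`.

```typescript
// Hypothetical minimal stand-ins for the plugin's real types in src/models/text.ts.
interface TextParamsWithStream {
  prompt: string;
  temperature?: number;
  stream?: boolean;
  tools?: Record<string, unknown>;
  toolChoice?: unknown; // 'auto' | 'none' | 'required' | { type: 'tool'; toolName: string }
}

// Sketch of the fix: copy tools/toolChoice through instead of dropping them.
function buildGenerateParams(model: string, params: TextParamsWithStream) {
  return {
    model,
    prompt: params.prompt,
    temperature: params.temperature ?? 0.7,
    // Without these two spreads, tool-enabled calls silently degrade
    // to plain text generation.
    ...(params.tools !== undefined ? { tools: params.tools } : {}),
    ...(params.toolChoice !== undefined ? { toolChoice: params.toolChoice } : {}),
  };
}

const built = buildGenerateParams('openrouter/some-model', {
  prompt: 'hi',
  tools: { weather: {} },
  toolChoice: 'required',
});
console.log('tools' in built, built.toolChoice); // true required
```

Because both the non-stream and stream branches call `buildGenerateParams`, forwarding the fields in one place covers `generateText` and `streamText` alike.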
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
package.json (1)
5-20: ⚠️ Potential issue | 🔴 Critical

Correct package.json entrypoints to match the actual build output.

The build only emits ESM output at `dist/index.js` and TypeScript declarations at `dist/index.d.ts`. The package.json manifest still declares CJS (`dist/cjs/index.node.cjs`), node-variant (`dist/node/index.node.js`), and browser (`dist/browser/index.browser.js`) files that are not produced. Update the `main`, `module`, `types`, and `exports` fields to reflect what the build actually creates:

- `"main"` → `"dist/index.js"` (or remove if ESM-only)
- `"module"` → `"dist/index.js"` (or remove if redundant)
- `"types"` → `"dist/index.d.ts"`
- `exports["."].import` → `"dist/index.js"`
- Remove or update `exports["."].require` (CJS is not built)
- Remove or update `exports["."].browser` (browser build is not built)

Consumers will fail at import/require/type-check time if these paths point to missing files.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@package.json` around lines 5 - 20, Update package.json entrypoints to match the actual build output: set "main" and "module" to "dist/index.js" (or remove them if you choose ESM-only), change "types" to "dist/index.d.ts", set exports["."].import to "dist/index.js", and remove or clear exports["."].require and exports["."].browser since no CJS or browser bundles are emitted; ensure the exports["."].default also points to "dist/index.js" if present so all entry fields reference existing dist/index.* artifacts.
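Assuming the build really emits only `dist/index.js` and `dist/index.d.ts`, a corrected manifest fragment (ESM-only, illustrative rather than the PR's actual fix) could look like:

```json
{
  "type": "module",
  "main": "dist/index.js",
  "module": "dist/index.js",
  "types": "dist/index.d.ts",
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/index.js"
    }
  }
}
```

Listing the `types` condition first lets TypeScript resolve declarations before the runtime entry under `moduleResolution: "bundler"`/`"node16"`.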
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@package.json`:
- Line 29: The package.json currently lists the dependency "@elizaos/core":
"workspace:*", which will break installs for consumers using npm; either replace
the workspace:* spec with a pinned semver (e.g. the actual version or a semver
range) for "@elizaos/core" or ensure your publish pipeline rewrites workspace:*
to the real version before publishing (update the publish script or tool
configuration that publishes package.json); locate the dependency entry for
"@elizaos/core" in package.json and make the change or add a verification step
in your release/publish scripts to rewrite workspace protocol for npm consumers.
---
Outside diff comments:
In `@package.json`:
- Around line 5-20: Update package.json entrypoints to match the actual build
output: set "main" and "module" to "dist/index.js" (or remove them if you choose
ESM-only), change "types" to "dist/index.d.ts", set exports["."].import to
"dist/index.js", and remove or clear exports["."].require and
exports["."].browser since no CJS or browser bundles are emitted; ensure the
exports["."].default also points to "dist/index.js" if present so all entry
fields reference existing dist/index.* artifacts.
| "@ai-sdk/openai": "^2.0.32", | ||
| "@ai-sdk/ui-utils": "1.2.11", | ||
| "@elizaos/core": "^1.7.0", | ||
| "@elizaos/core": "workspace:*", |
🧩 Analysis chain
🌐 Web query:
Does publishing an npm package with a runtime dependency set to "workspace:*" work for consumers outside the monorepo, and how do npm, pnpm, yarn, and bun handle rewriting workspace protocol versions on publish?
💡 Result:
No—publishing a package that still has runtime deps like "dep": "workspace:*" generally will not work for consumers outside your monorepo, because workspace: is meant to be resolved against local workspace packages. With plain npm, consumers typically hit errors like “Unsupported URL Type "workspace:"” when installing such a manifest. [1]
How publish/pack rewriting works by package manager:
- npm (`npm publish` / `npm pack`): does not rewrite `workspace:*` to a real semver range/version (so publishing with `workspace:*` leaks that spec to the registry and breaks consumers). [1][2]
- pnpm (`pnpm pack` / `pnpm publish`): rewrites `workspace:` deps at pack/publish time:
  - `workspace:*` → `1.5.0` (the referenced workspace package's version)
  - `workspace:~` → `~1.5.0`
  - `workspace:^` → `^1.5.0`
  - other `workspace:<range>` → the corresponding semver range (and also rewrites certain alias/path workspace forms). [3]
- Yarn (Berry) (`yarn npm publish`): rewrites `workspace:` deps on publish, e.g. if the workspace version is `3.2.1`:
  - `workspace:^` → `^3.2.1`
  - `workspace:~` → `~3.2.1`
  - `workspace:*` → `=3.2.1`
  - `workspace:^3.0.0` → `^3.0.0`. [4]
- Bun (`bun publish` / publishing behavior): rewrites `workspace:` versions when publishing, e.g.:
  - `workspace:*` → `1.0.1`
  - `workspace:^` → `^1.0.1`
  - `workspace:~` → `~1.0.1`
  - `workspace:1.0.2` → `1.0.2` (explicit version wins). [5]

Practical takeaway: if you publish libraries meant for external consumers, ensure your published tarball contains only normal semver (or normal npm alias specs like `npm:pkg@version`) — which happens automatically with pnpm/Yarn/Bun, but not with npm, unless you add your own prepack/prepublish rewrite step. [3][4][5][2]
Sources: [1] https://stackoverflow.com/questions/69220042/npm-err-unsupported-url-type-workspace-workspace, [2] https://nesbitt.io/2026/01/18/workspaces-and-monorepos-in-package-managers.html, [3] https://dimava.github.io/pnpm/workspaces/, [4] https://yarnpkg.com/features/workspaces, [5] https://bun.sh/docs/pm/workspaces
Pin the dependency version or verify the publish tool rewrites workspace:*.
Publishing "@elizaos/core": "workspace:*" as-is will fail for consumers installing outside the monorepo if using npm (which does not rewrite workspace protocol). If using pnpm, Yarn, or Bun, these tools automatically rewrite workspace:* to the actual version at publish time. For npm or custom publish pipelines, either pin a specific semver range here or confirm your release step rewrites workspace dependencies before publishing.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@package.json` at line 29, The package.json currently lists the dependency
"@elizaos/core": "workspace:*", which will break installs for consumers using
npm; either replace the workspace:* spec with a pinned semver (e.g. the actual
version or a semver range) for "@elizaos/core" or ensure your publish pipeline
rewrites workspace:* to the real version before publishing (update the publish
script or tool configuration that publishes package.json); locate the dependency
entry for "@elizaos/core" in package.json and make the change or add a
verification step in your release/publish scripts to rewrite workspace protocol
for npm consumers.
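For npm-based pipelines that cannot rely on pnpm/Yarn/Bun rewriting at publish time, a prepack step can do the rewrite itself. The sketch below mirrors pnpm's documented rules; `rewriteWorkspaceDeps` and the inline version map are hypothetical, and a real script would read each workspace package's package.json to build the map.

```typescript
// Illustrative prepack-style rewrite of workspace: specs, mirroring pnpm's rules.
type DepMap = Record<string, string>;

function rewriteWorkspaceDeps(deps: DepMap, workspaceVersions: DepMap): DepMap {
  const out: DepMap = {};
  for (const [name, spec] of Object.entries(deps)) {
    if (!spec.startsWith('workspace:')) {
      out[name] = spec; // normal semver specs pass through untouched
      continue;
    }
    const version = workspaceVersions[name];
    if (!version) throw new Error(`no workspace version known for ${name}`);
    const range = spec.slice('workspace:'.length);
    // pnpm-style: * -> exact version, ^/~ -> prefixed range,
    // anything else (e.g. workspace:^1.2.0) keeps the embedded range.
    out[name] =
      range === '*' ? version :
      range === '^' || range === '~' ? range + version :
      range;
  }
  return out;
}

const rewritten = rewriteWorkspaceDeps(
  { '@elizaos/core': 'workspace:*', zod: '^3.23.0' },
  { '@elizaos/core': '1.7.0' },
);
console.log(rewritten['@elizaos/core']); // 1.7.0
console.log(rewritten.zod); // ^3.23.0
```

Running such a step in a `prepack` script keeps the in-repo manifest on the workspace protocol while ensuring the published tarball carries only real semver specs.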
Cursor Bugbot has reviewed your changes and found 2 potential issues.
| function withSupportedUrls<T extends object>( | ||
| model: T | ||
| ): T & { supportedUrls: Record<string, RegExp[]> } { | ||
| return Object.assign({}, model, { supportedUrls: {} }); |
Object.assign loses prototype methods on model objects
High Severity
withSupportedUrls uses Object.assign({}, model, ...) which creates a shallow plain-object copy. The model returned by origChat(modelId) is a class instance (OpenRouterChatLanguageModel) whose critical methods like doGenerate and doStream live on the prototype chain. Object.assign only copies own enumerable properties, so those methods are silently dropped. When the AI SDK later calls model.doGenerate() or model.doStream(), it will throw a "not a function" error at runtime.
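The prototype loss is reproducible with any class instance, and a Proxy-based wrapper avoids it. `FakeChatModel` below is a hypothetical stand-in for `OpenRouterChatLanguageModel`; the fixed variant also forwards `receiver` so accessor properties keep their `this` binding.

```typescript
// Hypothetical stand-in for OpenRouterChatLanguageModel: the method lives
// on the prototype, just like doGenerate/doStream.
class FakeChatModel {
  readonly modelId = 'demo-model';
  doGenerate(): string {
    return `generated by ${this.modelId}`;
  }
}

// Buggy shim: Object.assign copies only own enumerable props, so the
// prototype method is silently dropped.
function withSupportedUrlsBroken<T extends object>(model: T) {
  return Object.assign({}, model, { supportedUrls: {} as Record<string, RegExp[]> });
}

// Fixed shim: a Proxy leaves the instance intact and only adds supportedUrls.
function withSupportedUrlsFixed<T extends object>(model: T) {
  return new Proxy(model, {
    get(target, prop, receiver) {
      if (prop === 'supportedUrls') return {} as Record<string, RegExp[]>;
      // Forwarding receiver keeps `this` correct for accessor properties.
      return Reflect.get(target, prop, receiver);
    },
  }) as T & { supportedUrls: Record<string, RegExp[]> };
}

const broken = withSupportedUrlsBroken(new FakeChatModel());
const fixed = withSupportedUrlsFixed(new FakeChatModel());

console.log(typeof (broken as unknown as Record<string, unknown>).doGenerate); // undefined
console.log(fixed.doGenerate()); // generated by demo-model
```

A plain `Object.setPrototypeOf`-preserving spread would not help either, since spread also copies only own enumerable properties; wrapping (Proxy) or subclassing are the safe options here.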
| params: GenerateTextParams & { | ||
| tools?: Record<string, Tool>; | ||
| toolChoice?: ToolChoice<Record<string, Tool>>; | ||
| }, |
Tools and toolChoice params accepted but silently dropped
Medium Severity
The tools and toolChoice parameters are newly added to the TEXT_SMALL and TEXT_LARGE handler signatures, but buildGenerateParams in text.ts never includes them in the generateParams object passed to the AI SDK's generateText/streamText. Any caller providing tools will have them silently ignored, with no error or warning.
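The same `typeof result === 'string' ? result : JSON.stringify(result)` normalization path is also what breaks streaming results elsewhere in this review: any non-string return gets stringified. A sketch of a safer normalization that passes streams through untouched is below; `normalizeTextResult` and the minimal `TextStreamResult` shape are hypothetical illustrations, not the plugin's actual code.

```typescript
// Hypothetical minimal stand-in for the plugin's local TextStreamResult type.
interface TextStreamResult {
  textStream: AsyncIterable<string>;
  text: Promise<string>;
}

function normalizeTextResult(
  result: string | TextStreamResult | object,
): string | TextStreamResult {
  if (typeof result === 'string') return result;
  if ('textStream' in result) return result as TextStreamResult; // pass stream through
  return JSON.stringify(result); // structured (object-generation) output
}

async function demo(): Promise<void> {
  async function* chunks() {
    yield 'hel';
    yield 'lo';
  }
  const streamResult: TextStreamResult = {
    textStream: chunks(),
    text: Promise.resolve('hello'),
  };
  const normalized = normalizeTextResult(streamResult);
  if (typeof normalized !== 'string') {
    let full = '';
    for await (const chunk of normalized.textStream) full += chunk;
    console.log(full); // hello
  }
  console.log(normalizeTextResult({ ok: true })); // {"ok":true}
}
demo();
```

The branch on `'textStream' in result` is the key: stream results keep their `ReadableStream`/promise fields for the caller to consume, while only genuinely structured outputs are serialized.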


Note
High Risk
High risk because it changes the published build artifacts and runtime model/provider types; `build.ts` now emits a single `dist/` ESM output while `package.json` exports still reference the old multi-target paths, which can break consumers at import time.

Overview

- Adds Vercel AI SDK v5 (v1/v2) compatibility by shimming OpenRouter `chat()` models to satisfy `LanguageModel` expectations (adds `supportedUrls`) and updating text/image handlers with AI SDK `LanguageModel` typing.
- Extends text generation to support streaming (`TextStreamResult`) and optional `tools`/`toolChoice` params, while normalizing plugin text model returns to `string` (stringifying non-string results).
- Simplifies the build pipeline to a single ESM build from `src/index.ts`, auto-derives externals from `package.json`, optionally runs `clean`, and always attempts `.d.ts` generation (continuing on type errors); `tsconfig.build.json` also tightens type checking (`skipLibCheck: false`).

Written by Cursor Bugbot for commit 1ac87a6. This will update automatically on new commits.
Summary by CodeRabbit
New Features
Chores
Greptile Summary
This PR updates the OpenRouter plugin to support Vercel AI SDK v1 and v2 (AI SDK v5 API surface), simplifies the build pipeline to a single ESM output, and introduces a Proxy-based shim to add the `supportedUrls` property expected by AI SDK v5's `LanguageModel` interface. However, there are several critical blockers and logic bugs that must be resolved before merging.

Critical issues:

- `package.json` contains unresolved Git merge conflict markers (`<<<<<<< Updated upstream` / `=======`), making the file invalid JSON and breaking all package tooling immediately.
- The `package.json` `exports`, `main`, `module`, and `types` fields still reference old multi-target build paths (`dist/node/index.node.js`, `dist/cjs/index.node.cjs`, `dist/browser/index.browser.js`) that the new simplified `build.ts` no longer produces — any consumer will get a "module not found" error at import time.

Logic bugs:

- In `src/index.ts`, if a caller passes `stream: true`, the returned `TextStreamResult` (containing `Promise` fields and a `ReadableStream`) is silently `JSON.stringify`'d, destroying the stream and returning a meaningless string to the caller.
- In `src/providers/openrouter.ts`, `withSupportedUrls` uses `Object.assign({}, model, ...)`, which creates a plain-object copy that loses all prototype-chain methods of the original model — any AI SDK method invocation on the wrapped model will fail at runtime.
- The `get` trap omits the `receiver` argument in `Reflect.get(target, prop)`, breaking accessor properties that rely on correct `this` binding through the proxy.

Style:

- `build.ts` contains a permanent dead `else` branch behind `if (true)` that should be removed.

Confidence Score: 1/5

- The blocking issues span `package.json` (merge conflict + broken exports), `src/index.ts` (streaming result serialization), and `src/providers/openrouter.ts` (prototype-chain loss in `withSupportedUrls`, missing Proxy receiver).

Important Files Changed

- `package.json`: `exports`/`main`/`module`/`types` fields all reference paths from the old multi-target build that no longer exist in the new single-target build.
- `src/providers/openrouter.ts`: shims `supportedUrls` onto OpenRouter models for AI SDK v5 compatibility, but `Object.assign` loses prototype-chain methods and the Proxy trap omits the `receiver` argument, both of which can cause runtime failures.
- `src/index.ts`: adds `Tool`/`ToolChoice` typing to model handlers and normalises return type to `string`, but silently serialises streaming `TextStreamResult` objects via `JSON.stringify`, destroying stream data when `stream: true` is passed.
- `src/models/text.ts`: changes the `TextStreamResult` interface from the `@elizaos/core` import to a local definition to decouple from the old SDK type; changes are otherwise sound.
- Casts models to `import("ai").LanguageModel` to satisfy the AI SDK v5 type; no functional changes.
- `build.ts`: auto-derives `externalDeps` from `package.json`; contains a permanent dead `else` branch behind `if (true)`.
- `tsconfig.build.json`: flips `skipLibCheck` from `true` to `false` and adds `"types": ["node"]`; stricter checking is good but may surface more type errors given the widespread `as unknown as LanguageModel` casts already present in the codebase.

Sequence Diagram
```mermaid
sequenceDiagram
    participant Caller
    participant Plugin as openrouterPlugin (index.ts)
    participant TextHandler as models/text.ts
    participant ProviderFactory as providers/openrouter.ts
    participant Proxy as Proxy(provider)
    participant OpenRouter as @openrouter/ai-sdk-provider
    participant AISDK as ai (SDK v5)
    Caller->>Plugin: useModel(TEXT_SMALL, params)
    Plugin->>TextHandler: handleTextSmall(runtime, params)
    TextHandler->>ProviderFactory: createOpenRouterProvider(runtime)
    ProviderFactory->>OpenRouter: createOpenRouter({apiKey, baseURL, headers})
    OpenRouter-->>ProviderFactory: provider
    ProviderFactory->>Proxy: new Proxy(provider, { get trap })
    Proxy-->>TextHandler: OpenRouterProviderV2
    TextHandler->>Proxy: .chat(modelName)
    Proxy->>OpenRouter: origChat(modelName)
    OpenRouter-->>Proxy: rawModel (no supportedUrls)
    Proxy->>Proxy: withSupportedUrls(rawModel) → Object.assign({}, rawModel, {supportedUrls:{}})
    Proxy-->>TextHandler: LanguageModel (cast)
    alt stream=false
        TextHandler->>AISDK: generateText(params)
        AISDK-->>TextHandler: {text, usage}
        TextHandler-->>Plugin: string
        Plugin-->>Caller: string
    else stream=true
        TextHandler->>AISDK: streamText(params)
        AISDK-->>TextHandler: StreamResult
        TextHandler-->>Plugin: TextStreamResult
        Plugin->>Plugin: JSON.stringify(TextStreamResult) ⚠️
        Plugin-->>Caller: broken JSON string (stream lost)
    end
```

Comments Outside Diff (1)
package.json, line 5-22

`exports` field references paths that no longer exist

The package.json `exports` field still points to the old multi-target build output paths that the new `build.ts` no longer produces. The new `build.ts` only outputs to `dist/` (e.g. `dist/index.js`). None of `dist/node/index.node.js`, `dist/cjs/index.node.cjs`, or `dist/browser/index.browser.js` are produced by the new build script. Any consumer that installs this package will get a "module not found" error at import time. The `exports`, `main`, `module`, and `types` fields all need to be updated to match the new build output.

Last reviewed commit: 053ab7a