
streaming support + vercel ai sdk v1 & v2 support #24

Open
odilitime wants to merge 2 commits into 1.x from odi-dev

Conversation

@odilitime
Member

@odilitime odilitime commented Mar 9, 2026

Note

High Risk
High risk because it changes the published build artifacts and runtime model/provider types; build.ts now emits a single dist/ ESM output while package.json exports still reference the old multi-target paths, which can break consumers at import time.

Overview
Adds Vercel AI SDK v5 (v1/v2) compatibility by shimming OpenRouter chat() models to satisfy LanguageModel expectations (adds supportedUrls) and updating text/image handlers with AI SDK LanguageModel typing.

Extends text generation to support streaming (TextStreamResult) and optional tools/toolChoice params, while normalizing plugin text model returns to string (stringifying non-string results).

Simplifies the build pipeline to a single ESM build from src/index.ts, auto-derives externals from package.json, optionally runs clean, and always attempts .d.ts generation (continuing on type errors); tsconfig.build.json also tightens type checking (skipLibCheck: false).
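The externals auto-derivation described above can be sketched as follows. This is an illustrative reconstruction, not the actual build.ts: the helper name and the sample manifest are assumptions.

```typescript
// Sketch: derive bundler externals from package.json so runtime
// dependencies are not inlined into the single ESM bundle.
type PkgManifest = {
  dependencies?: Record<string, string>;
  peerDependencies?: Record<string, string>;
};

function deriveExternals(pkg: PkgManifest): string[] {
  // Leave every declared dependency external; consumers resolve them.
  return [
    ...Object.keys(pkg.dependencies ?? {}),
    ...Object.keys(pkg.peerDependencies ?? {}),
  ];
}

// Hypothetical manifest for illustration only.
const externals = deriveExternals({
  dependencies: { "@elizaos/core": "workspace:*", ai: "^5.0.0" },
  peerDependencies: { "@openrouter/ai-sdk-provider": "^1.0.0" },
});
```

A build script would then pass such a list to its bundler's external option so the emitted bundle imports these packages at runtime instead of inlining them.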

Written by Cursor Bugbot for commit 1ac87a6. This will update automatically on new commits.

Summary by CodeRabbit

  • New Features

    • Optional tools and toolChoice support for text models
    • Streaming text generation with a new TextStreamResult shape
    • OpenRouter provider now exposes supported URLs on models
    • Object-focused model handlers now return structured objects for object models
  • Chores

    • Streamlined build for single ESM output
    • Updated ignore patterns and added env/turbo ignore entries
    • Added dev typing dependency and TypeScript config adjustments

Greptile Summary

This PR updates the OpenRouter plugin to support Vercel AI SDK v1 and v2 (AI SDK v5 API surface), simplifies the build pipeline to a single ESM output, and introduces a Proxy-based shim to add the supportedUrls property expected by AI SDK v5's LanguageModel interface. However, there are several critical blockers and logic bugs that must be resolved before merging.

Critical issues:

  • package.json contains unresolved Git merge conflict markers (<<<<<<< Updated upstream / =======), making the file invalid JSON and breaking all package tooling immediately.
  • The package.json exports, main, module, and types fields still reference old multi-target build paths (dist/node/index.node.js, dist/cjs/index.node.cjs, dist/browser/index.browser.js) that the new simplified build.ts no longer produces — any consumer will get a "module not found" error at import time.

Logic bugs:

  • In src/index.ts, if a caller passes stream: true, the returned TextStreamResult (containing Promise fields and a ReadableStream) is silently JSON.stringify'd, destroying the stream and returning a meaningless string to the caller.
  • In src/providers/openrouter.ts, withSupportedUrls uses Object.assign({}, model, ...) which creates a plain object copy that loses all prototype-chain methods of the original model — any AI SDK method invocation on the wrapped model will fail at runtime.
  • The Proxy get trap omits the receiver argument in Reflect.get(target, prop), breaking accessor properties that rely on correct this binding through the proxy.
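The prototype-loss bug can be reproduced in isolation. The toy class below stands in for the real OpenRouter model object; FakeModel and doGenerate are illustrative names, not the plugin's actual API.

```typescript
// Methods live on the prototype, as with class-based SDK model objects.
class FakeModel {
  doGenerate(): string {
    return "generated";
  }
}

const model = new FakeModel();

// Shallow copy: own enumerable properties only, prototype chain dropped.
const broken = Object.assign({}, model, { supportedUrls: {} });
const brokenHasMethod =
  typeof (broken as { doGenerate?: unknown }).doGenerate === "function";

// Preserving the prototype keeps inherited methods callable.
const intact = Object.assign(
  Object.create(Object.getPrototypeOf(model)),
  model,
  { supportedUrls: {} },
);
```

Here brokenHasMethod is false, so any doGenerate call on the shallow copy would throw at runtime, while intact.doGenerate() still works because the prototype chain is preserved.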

Style:

  • build.ts contains a permanent dead else branch behind if (true) that should be removed.

Confidence Score: 1/5

  • This PR is not safe to merge — the package.json has unresolved merge conflicts making it invalid JSON, and the exports field is misaligned with the new build output.
  • Score reflects two critical blockers: (1) the package.json contains raw Git merge conflict markers that will prevent any tooling from parsing the file, and (2) the exported paths in package.json no longer match what the new build script produces. Additionally there are logic bugs in the streaming path and the provider shim that would cause runtime failures even if the build issues were resolved.
  • Pay close attention to package.json (merge conflict + broken exports), src/index.ts (streaming result serialization), and src/providers/openrouter.ts (prototype-chain loss in withSupportedUrls, missing Proxy receiver).

Important Files Changed

  • package.json: Contains unresolved Git merge conflict markers making it invalid JSON; additionally, the exports/main/module/types fields all reference paths from the old multi-target build that no longer exist in the new single-target build.
  • src/providers/openrouter.ts: Introduces a Proxy-based wrapper to shim supportedUrls onto OpenRouter models for AI SDK v5 compatibility, but Object.assign loses prototype-chain methods and the Proxy trap omits the receiver argument, both of which can cause runtime failures.
  • src/index.ts: Adds Tool/ToolChoice typing to model handlers and normalises the return type to string, but silently serialises streaming TextStreamResult objects via JSON.stringify, destroying stream data when stream: true is passed.
  • src/models/text.ts: Moves the TextStreamResult interface from an @elizaos/core import to a local definition to decouple from the old SDK type; changes are otherwise sound.
  • src/models/image.ts: Minor compatibility cast to import("ai").LanguageModel to satisfy the AI SDK v5 type; no functional changes.
  • build.ts: Simplified to a single ESM-only build; removes the browser and CJS targets; adds dynamic externalDeps from package.json; contains a permanent dead else branch behind if (true).
  • tsconfig.build.json: Flips skipLibCheck from true to false and adds "types": ["node"]; stricter checking is good but may surface more type errors given the widespread as unknown as LanguageModel casts already present in the codebase.
  • .gitignore: Expanded with standard ignore patterns for IDE, logs, caches, and build artifacts; no issues.

Sequence Diagram

sequenceDiagram
    participant Caller
    participant Plugin as openrouterPlugin (index.ts)
    participant TextHandler as models/text.ts
    participant ProviderFactory as providers/openrouter.ts
    participant Proxy as Proxy(provider)
    participant OpenRouter as @openrouter/ai-sdk-provider
    participant AISDK as ai (SDK v5)

    Caller->>Plugin: useModel(TEXT_SMALL, params)
    Plugin->>TextHandler: handleTextSmall(runtime, params)
    TextHandler->>ProviderFactory: createOpenRouterProvider(runtime)
    ProviderFactory->>OpenRouter: createOpenRouter({apiKey, baseURL, headers})
    OpenRouter-->>ProviderFactory: provider
    ProviderFactory->>Proxy: new Proxy(provider, { get trap })
    Proxy-->>TextHandler: OpenRouterProviderV2
    TextHandler->>Proxy: .chat(modelName)
    Proxy->>OpenRouter: origChat(modelName)
    OpenRouter-->>Proxy: rawModel (no supportedUrls)
    Proxy->>Proxy: withSupportedUrls(rawModel) → Object.assign({}, rawModel, {supportedUrls:{}})
    Proxy-->>TextHandler: LanguageModel (cast)
    alt stream=false
        TextHandler->>AISDK: generateText(params)
        AISDK-->>TextHandler: {text, usage}
        TextHandler-->>Plugin: string
        Plugin-->>Caller: string
    else stream=true
        TextHandler->>AISDK: streamText(params)
        AISDK-->>TextHandler: StreamResult
        TextHandler-->>Plugin: TextStreamResult
        Plugin->>Plugin: JSON.stringify(TextStreamResult) ⚠️
        Plugin-->>Caller: broken JSON string (stream lost)
    end

Comments Outside Diff (1)

  1. package.json, lines 5-22

    exports field references paths that no longer exist

    The package.json exports field still points to the old multi-target build output paths that the new build.ts no longer produces:

    "main": "dist/cjs/index.node.cjs",
    "module": "dist/node/index.node.js",
    "types": "dist/node/index.d.ts",
    "exports": {
      ".": {
        "types": "./dist/node/index.d.ts",
        "import": "./dist/node/index.node.js",
        "require": "./dist/cjs/index.node.cjs",
        "browser": "./dist/browser/index.browser.js",
        "node": "./dist/node/index.node.js",
        ...
      }
    }

    The new build.ts only outputs to dist/ (e.g. dist/index.js). None of dist/node/index.node.js, dist/cjs/index.node.cjs, or dist/browser/index.browser.js are produced by the new build script. Any consumer that installs this package will get a "module not found" error at import time. The exports, main, module, and types fields all need to be updated to match the new build output.
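One possible realignment of the manifest, assuming the new build writes the bundle to dist/index.js and tsc emits declarations at dist/index.d.ts (both paths are assumptions to verify against the actual build output):

```json
{
  "main": "dist/index.js",
  "module": "dist/index.js",
  "types": "dist/index.d.ts",
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/index.js"
    }
  }
}
```

Since the new build emits ESM only, the require and browser conditions are dropped in this sketch; if CJS or browser consumers must remain supported, those build targets would need to be restored instead.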

Last reviewed commit: 053ab7a

Greptile also left 5 inline comments on this PR.

…eam attribution and streamText

Made-with: Cursor
Copilot AI review requested due to automatic review settings March 9, 2026 21:44
@coderabbitai

coderabbitai Bot commented Mar 9, 2026

Walkthrough

Build script and TypeScript config updated for single ESM output and dynamic externals; model handler types extended (tools/toolChoice, streaming results, object outputs); OpenRouter provider wrapped to expose LanguageModel with supportedUrls; .gitignore and package.json adjusted.

Changes

  • Build & Config (build.ts, tsconfig.build.json, .gitignore): Reworked the build to read externals from package.json, run an optional clean script, emit a single ESM build to dist, and always attempt declaration generation; TS config changed (skipLibCheck: false, Node types added). .gitignore patterns for node_modules, dist/, env files, and .turbo-tsconfig.json were modified.
  • Package Manifest (package.json): Updated @elizaos/core to workspace:* and added devDependency bun-types@^1.2.21.
  • Core API Surface (src/index.ts): Extended TEXT_SMALL/TEXT_LARGE handlers to accept optional tools and toolChoice and normalize non-string results to strings; OBJECT_SMALL/OBJECT_LARGE now return Promise<Record<string, unknown>>; added Tool and ToolChoice type imports.
  • Text Model Logic (src/models/text.ts): Added exported TextStreamResult and internal TextParamsWithStream; updated generate/build and handler signatures to support streaming; cast models to LanguageModel and preserved streaming/non-streaming flows.
  • Image & Provider (src/models/image.ts, src/providers/openrouter.ts): Image handler casts the model to LanguageModel and simplifies model property usage. OpenRouter provider adds OpenRouterProviderV2 types, a withSupportedUrls utility, and a proxy wrapper so chat(modelId) returns a LanguageModel augmented with supportedUrls; the exported provider return type changed.

Sequence Diagram(s)

sequenceDiagram
  participant R as Runtime
  participant P as OpenRouter Provider (proxy)
  participant M as LanguageModel (augmented)

  R->>P: createOpenRouterProvider(runtime)
  P->>P: proxy wraps original provider.chat
  R->>P: call chat(modelId)
  P->>P: original.chat(modelId) -> model
  P->>M: withSupportedUrls(model) -> modelWithUrls
  P-->>R: return modelWithUrls (LanguageModel with supportedUrls)

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Poem

🐰 I hopped through builds and typed each stream,
I nudged the router to wear a new gleam,
Tools tucked in pockets, objects dressed neat,
One ESM path, a tidy retreat,
I nibble code crumbs — joy in each commit!

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
  • Description Check: ✅ Passed. Check skipped because CodeRabbit's high-level summary is enabled.
  • Docstring Coverage: ✅ Passed. Docstring coverage is 87.50%, which meets the required threshold of 80.00%.
  • Title check: ✅ Passed. The title accurately describes the main objective of this PR: adding support for Vercel AI SDK v1 and v2, plus streaming support. It directly corresponds to the core changes across multiple files.


Comment thread package.json Outdated
Comment thread src/index.ts
Comment on lines +58 to +69
const result = await handleTextSmall(runtime, params);
return typeof result === 'string' ? result : JSON.stringify(result);
},
[ModelType.TEXT_LARGE]: async (
runtime: IAgentRuntime,
params: GenerateTextParams
) => {
return handleTextLarge(runtime, params);
params: GenerateTextParams & {
tools?: Record<string, Tool>;
toolChoice?: ToolChoice<Record<string, Tool>>;
},
): Promise<string> => {
const result = await handleTextLarge(runtime, params);
return typeof result === 'string' ? result : JSON.stringify(result);

Streaming result silently broken when stream: true

When a caller passes stream: true in the params, handleTextSmall/handleTextLarge returns a TextStreamResult object (containing a ReadableStream and several Promise fields). However, the wrapper in index.ts falls through to JSON.stringify(result) on any non-string return value:

const result = await handleTextSmall(runtime, params);
return typeof result === 'string' ? result : JSON.stringify(result);

JSON.stringify on a TextStreamResult will produce something like {"text":{},"usage":{},"finishReason":{}} because Promise and ReadableStream instances do not serialize to JSON. The actual stream is silently discarded, and the caller gets back a useless string instead of an error or the real stream. If streaming is not intended to be exposed through this interface, the stream option should be blocked or the TextStreamResult path should be handled explicitly.
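A hedged sketch of one way the wrapper could preserve streams: detect the streaming shape and pass it through, stringifying only plain objects. The textStream field name follows the review's description of TextStreamResult and is an assumption about the real interface.

```typescript
// Sketch of a stream-preserving wrapper (illustrative names throughout).
interface TextStreamResultLike {
  textStream: ReadableStream<string> | AsyncIterable<string>;
}

function isTextStreamResult(value: unknown): value is TextStreamResultLike {
  return typeof value === "object" && value !== null && "textStream" in value;
}

function normalizeHandlerResult(
  result: unknown,
): string | TextStreamResultLike {
  if (typeof result === "string") return result;
  // Pass streaming results through untouched instead of stringifying them.
  if (isTextStreamResult(result)) return result;
  // Only plain, serializable objects fall back to JSON.
  return JSON.stringify(result);
}
```

Whether the runtime accepts a TextStreamResult from this handler depends on @elizaos/core's expectations; rejecting stream: true with an explicit error is the conservative alternative.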

Comment on lines +16 to +20
function withSupportedUrls<T extends object>(
model: T
): T & { supportedUrls: Record<string, RegExp[]> } {
return Object.assign({}, model, { supportedUrls: {} });
}

Object.assign shallow copy loses prototype methods on the model

withSupportedUrls creates a plain object copy of the model using Object.assign({}, model, ...):

function withSupportedUrls<T extends object>(model: T): T & { supportedUrls: Record<string, RegExp[]> } {
  return Object.assign({}, model, { supportedUrls: {} });
}

Object.assign only copies own enumerable properties. Any methods defined on the model's prototype chain (e.g. doGenerate, doStream, etc. from the AI SDK LanguageModel interface) will not be present on the resulting plain object. This means the wrapped model may appear to satisfy the TypeScript type but will throw at runtime when the AI SDK attempts to call prototype methods.

Consider using Object.setPrototypeOf or Object.create to preserve the prototype chain, or simply mutate the existing object:

function withSupportedUrls<T extends object>(model: T): T & { supportedUrls: Record<string, RegExp[]> } {
  (model as any).supportedUrls = {};
  return model as T & { supportedUrls: Record<string, RegExp[]> };
}

Comment on lines +41 to +48
return new Proxy(provider, {
get(target, prop) {
if (prop === "chat") {
return (modelId: string) =>
withSupportedUrls(origChat(modelId)) as unknown as LanguageModel;
}
return Reflect.get(target, prop);
},

Proxy Reflect.get missing the receiver argument

In the Proxy trap, Reflect.get(target, prop) is called without passing the third receiver argument:

return Reflect.get(target, prop);

Without the receiver, any accessor (getter) properties on the target whose implementation references this will see target as this rather than the proxy itself. This breaks the Proxy contract and can cause subtle issues when getters are accessed through the proxy. The correct form is:

Suggested change
return new Proxy(provider, {
get(target, prop) {
if (prop === "chat") {
return (modelId: string) =>
withSupportedUrls(origChat(modelId)) as unknown as LanguageModel;
}
return Reflect.get(target, prop);
},
return Reflect.get(target, prop, receiver);

The trap signature should also accept receiver:

get(target, prop, receiver) {
  if (prop === "chat") { ... }
  return Reflect.get(target, prop, receiver);
}
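The this-binding difference is easy to demonstrate with toy objects (none of this is the plugin's code):

```typescript
// A getter whose result depends on `this`.
const base = {
  name: "base",
  get whoAmI() {
    return this.name;
  },
};

const proxyNoReceiver = new Proxy(base, {
  get(target, prop) {
    return Reflect.get(target, prop); // getter always sees `this === target`
  },
});

const proxyWithReceiver = new Proxy(base, {
  get(target, prop, receiver) {
    return Reflect.get(target, prop, receiver); // getter sees the real receiver
  },
});

// Objects inheriting from each proxy, overriding `name`:
const overlayA = Object.create(proxyNoReceiver, { name: { value: "overlay" } });
const overlayB = Object.create(proxyWithReceiver, { name: { value: "overlay" } });
```

Here overlayA.whoAmI yields "base" because the getter's this was pinned to the target, while overlayB.whoAmI yields "overlay"; dropping the receiver silently breaks any accessor that relies on correct this binding through the proxy.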

Comment thread build.ts
Comment on lines 27 to +47
external: externalDeps,
});
console.log(`✅ Browser build complete in ${((Date.now() - browserStart) / 1000).toFixed(2)}s`);

// Node CJS build
const cjsStart = Date.now();
console.log("🧱 Building @elizaos/plugin-openrouter for Node (CJS)...");
const cjsResult = await Bun.build({
entrypoints: ["src/index.node.ts"],
outdir: "dist/cjs",
target: "node",
format: "cjs",
sourcemap: "external",
minify: false,
external: [...externalDeps],
});
if (!cjsResult.success) {
console.error(cjsResult.logs);
throw new Error("CJS build failed");
if (!esmResult.success) {
console.error(esmResult.logs);
throw new Error("ESM build failed");
}
try {
const { rename } = await import("node:fs/promises");
await rename("dist/cjs/index.node.js", "dist/cjs/index.node.cjs");
} catch (e) {
console.warn("CJS rename step warning:", e);
}
console.log(`✅ CJS build complete in ${((Date.now() - cjsStart) / 1000).toFixed(2)}s`);
console.log(`✅ Build complete in ${((Date.now() - esmStart) / 1000).toFixed(2)}s`);

// TypeScript declarations
const dtsStart = Date.now();
console.log("📝 Generating TypeScript declarations...");
const { mkdir, writeFile } = await import("node:fs/promises");
const { $ } = await import("bun");
await $`tsc --project tsconfig.build.json`;
await mkdir("dist/node", { recursive: true });
await mkdir("dist/browser", { recursive: true });
await mkdir("dist/cjs", { recursive: true });
await writeFile(
"dist/node/index.d.ts",
`export * from '../index';
export { default } from '../index';
`
);
await writeFile(
"dist/browser/index.d.ts",
`export * from '../index';
export { default } from '../index';
`
);
await writeFile(
"dist/cjs/index.d.ts",
`export * from '../index';
export { default } from '../index';
`
);
console.log(`✅ Declarations generated in ${((Date.now() - dtsStart) / 1000).toFixed(2)}s`);
if (true) { // Always generate .d.ts
console.log("📝 Generating TypeScript declarations...");
try {
await $`tsc --project tsconfig.build.json`;
console.log(`✅ Declarations generated in ${((Date.now() - dtsStart) / 1000).toFixed(2)}s`);
} catch (error) {
console.warn(`⚠️ TypeScript declaration generation had errors (${((Date.now() - dtsStart) / 1000).toFixed(2)}s)`);
console.warn(" Build will continue - fix type errors when possible");
}
} else {
console.log("🔍 Type checking...");
try {

Dead else branch behind if (true) constant

The declaration generation block uses if (true) { ... } else { ... }, making the else branch permanently unreachable dead code:

if (true) { // Always generate .d.ts
  // ... tsc emit
} else {
  // ... tsc --noEmit --incremental (never runs)
}

This clutters the build script and obscures intent. The else block should be removed entirely.

Comment thread package.json
Comment thread src/index.ts
},
): Promise<string> => {
const result = await handleTextSmall(runtime, params);
return typeof result === 'string' ? result : JSON.stringify(result);

JSON.stringify on TextStreamResult breaks streaming

Medium Severity

When handleTextSmall / handleTextLarge returns a TextStreamResult (streaming mode), the fallback JSON.stringify(result) serializes an object containing ReadableStream and Promise properties, producing useless output like {"textStream":{},"text":{},...}. The previous code returned the result directly, preserving the stream. Now streaming callers receive a garbage string instead of an operable stream object.

Additional Locations (1)


Comment thread build.ts
console.warn(`⚠️ Type checking had errors (${((Date.now() - dtsStart) / 1000).toFixed(2)}s)`);
console.warn(" Build will continue - fix type errors when possible");
}
}

Unreachable else branch due to if (true)

Low Severity

The if (true) condition on the .d.ts generation block makes the entire else branch (lines 45–54, type-check-only mode) unreachable dead code. This looks like leftover development/debugging code that was accidentally committed.


Comment thread build.ts
external: [...externalDeps],
});
console.log(`✅ Node build complete in ${((Date.now() - nodeStart) / 1000).toFixed(2)}s`);


Build output paths don't match package.json exports

High Severity

The rewritten build.ts outputs a single ESM bundle to dist/index.js (from entrypoint src/index.ts, outdir dist), but package.json still references the old multi-target paths: dist/node/index.node.js, dist/cjs/index.node.cjs, dist/browser/index.browser.js, and dist/node/index.d.ts. None of these files will exist after a build, so the package will fail to resolve for any consumer — Node imports, CJS requires, browser imports, and TypeScript type resolution will all break.

Additional Locations (1)


Copilot AI left a comment


Pull request overview

Updates the OpenRouter plugin to better interoperate with Vercel AI SDK v1/v2 expectations, including adapting provider model shapes and adjusting build/type tooling.

Changes:

  • Adapt OpenRouter provider chat() to return an AI SDK LanguageModel-compatible shape (adds supportedUrls).
  • Update text model plumbing/types to align with AI SDK v5 (LanguageModel, streaming result shape, tool-related param types).
  • Simplify build output to a single ESM build and adjust TypeScript build settings / repo ignores.

Reviewed changes

Copilot reviewed 7 out of 8 changed files in this pull request and generated 7 comments.

Show a summary per file
  • tsconfig.build.json: Tightens TS build type checking and adds Node typings.
  • src/providers/openrouter.ts: Wraps the OpenRouter provider to satisfy AI SDK v5 model expectations.
  • src/models/text.ts: Refactors text generation params/types and adds a local streaming result shape.
  • src/models/image.ts: Casts the OpenRouter chat model to the AI SDK LanguageModel.
  • src/index.ts: Extends text model params for tools and changes return handling.
  • package.json: Updates dependencies/devDependencies (currently with a merge conflict).
  • build.ts: Reworks the build pipeline to emit only one ESM build and runs tsc for declarations.
  • .gitignore: Expands ignored artifacts (build outputs, IDE files, caches, logs).


Comment thread build.ts
Comment on lines +18 to 28
const esmStart = Date.now();
console.log("🔨 Building @elizaos/plugin-openrouter...");
const esmResult = await Bun.build({
entrypoints: ["src/index.ts"],
outdir: "dist",
target: "node",
format: "esm",
sourcemap: "external",
minify: false,
external: [...externalDeps],
});
console.log(`✅ Node build complete in ${((Date.now() - nodeStart) / 1000).toFixed(2)}s`);

// Browser build
const browserStart = Date.now();
console.log("🌐 Building @elizaos/plugin-openrouter for Browser...");
await Bun.build({
entrypoints: ["src/index.browser.ts"],
outdir: "dist/browser",
target: "browser",
format: "esm",
sourcemap: "external",
minify: false,
external: externalDeps,
});

Copilot AI Mar 9, 2026


The build now only outputs a single ESM bundle to dist/ (entrypoint src/index.ts), but the package metadata expects dist/node/index.node.js, dist/browser/index.browser.js, and dist/cjs/index.node.cjs (see package.json exports/main/module/types). As-is, consumers will get missing-file runtime errors. Either restore the multi-target outputs/rename steps, or update package.json entrypoints/exports/types to match the new output layout.

Comment thread build.ts
console.log(`✅ Declarations generated in ${((Date.now() - dtsStart) / 1000).toFixed(2)}s`);
} catch (error) {
console.warn(`⚠️ TypeScript declaration generation had errors (${((Date.now() - dtsStart) / 1000).toFixed(2)}s)`);
console.warn(" Build will continue - fix type errors when possible");

Copilot AI Mar 9, 2026


tsc --project tsconfig.build.json errors are swallowed and the build continues, which can publish a package with missing/invalid .d.ts (and CI may still pass). Consider failing the build on declaration generation errors (or at least make this behavior opt-in via an env flag) so type breaks don’t ship silently.

Suggested change
console.warn(" Build will continue - fix type errors when possible");
console.warn(" Build will continue - fix type errors when possible");
if (process.env.FAIL_BUILD_ON_DTS_ERROR === "true") {
throw (error instanceof Error ? error : new Error("TypeScript declaration generation failed"));
}

Comment thread src/index.ts
Comment on lines +57 to 60
): Promise<string> => {
const result = await handleTextSmall(runtime, params);
return typeof result === 'string' ? result : JSON.stringify(result);
},

Copilot AI Mar 9, 2026


handleTextSmall/handleTextLarge can return a TextStreamResult (contains ReadableStream + Promises). JSON.stringify(result) will either drop most fields or produce {}, and it prevents the runtime from consuming the stream (breaking onStreamChunk-style streaming). The model handler should return the streaming shape expected by @elizaos/core instead of stringifying it.

Comment thread src/models/text.ts
Comment on lines 96 to 102
async function generateTextWithModel(
runtime: IAgentRuntime,
modelType: typeof ModelType.TEXT_SMALL | typeof ModelType.TEXT_LARGE,
params: GenerateTextParams,
params: TextParamsWithStream,
): Promise<string | TextStreamResult> {
const { generateParams, modelName, modelLabel, prompt } =
buildGenerateParams(runtime, modelType, params);

Copilot AI Mar 9, 2026


generateTextWithModel now accepts a params shape that can include onStreamChunk (used by the plugin’s streaming test in src/index.ts), but the implementation only enables streaming when params.stream is set and never consumes streamResult.textStream to invoke onStreamChunk. As a result, callers providing onStreamChunk won’t receive any chunks. Consider mapping onStreamChunk -> streaming mode and piping the stream to the callback.
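The suggested mapping can be sketched as follows; the AsyncIterable stream shape and the onStreamChunk signature are assumptions inferred from the review, not the plugin's actual types.

```typescript
// Pipe a text stream into a chunk callback while accumulating full text.
async function pipeToCallback(
  textStream: AsyncIterable<string>,
  onStreamChunk: (chunk: string) => void,
): Promise<string> {
  let full = "";
  for await (const chunk of textStream) {
    onStreamChunk(chunk); // deliver each chunk as it arrives
    full += chunk;
  }
  return full; // accumulated text for callers that also want a final result
}
```

Inside the handler, the presence of params.onStreamChunk could switch generateTextWithModel into streaming mode and feed the stream result through a helper like this.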

Comment thread src/models/text.ts
Comment on lines 24 to 30
function buildGenerateParams(
runtime: IAgentRuntime,
modelType: typeof ModelType.TEXT_SMALL | typeof ModelType.TEXT_LARGE,
params: GenerateTextParams,
params: TextParamsWithStream,
) {
const { prompt, stopSequences = [] } = params;
const temperature = params.temperature ?? 0.7;

Copilot AI Mar 9, 2026


buildGenerateParams takes params that (via src/index.ts) may include tools / toolChoice, but these fields are currently ignored and never forwarded into generateText/streamText params. If tool calling is intended for AI SDK v5 support, include these fields in generateParams (or remove the added typing to avoid implying support).

function withSupportedUrls<T extends object>(
model: T
): T & { supportedUrls: Record<string, RegExp[]> } {
return Object.assign({}, model, { supportedUrls: {} });

Copilot AI Mar 9, 2026


withSupportedUrls creates a shallow clone via Object.assign({}, model, ...), which drops the original prototype and non-enumerable properties. If the returned model relies on prototype methods/accessors, this can break model behavior at runtime. Prefer mutating/augmenting the original model object (e.g., define supportedUrls on it) instead of cloning it.

Suggested change
return Object.assign({}, model, { supportedUrls: {} });
(model as any).supportedUrls ??= {};
return model as T & { supportedUrls: Record<string, RegExp[]> };

Comment thread package.json Outdated

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 4

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
package.json (1)

29-44: ⚠️ Potential issue | 🔴 Critical

Resolve the leftover merge conflict markers.

package.json is invalid JSON in its current state. build.ts Line 6 now parses this file at startup, so the build fails before bundling or declaration generation even begins. Pick the intended @elizaos/core dependency line and remove the conflict markers.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@package.json` around lines 29 - 44, package.json contains leftover git
conflict markers (<<<<<<<, =======, >>>>>>>) around the dependencies block which
makes the file invalid; remove the conflict markers and keep a single
`@elizaos/core` entry (choose the intended value, e.g., "@elizaos/core":
"workspace:*" or "^1.7.0") and ensure the surrounding JSON structure is valid so
the additional dependencies ("@openrouter/ai-sdk-provider", "ai", "undici") and
the devDependencies block are properly placed under "dependencies" and
"devDependencies". Update package.json so build.ts can parse it at startup
without errors.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@build.ts`:
- Around line 18-28: The package's published entrypoints still reference
non-existent directories (dist/node/*, dist/cjs/*, dist/browser/*) while the
build (see esmStart and the Bun.build call which outputs to outdir "dist" with
format "esm") now emits a single ESM dist tree; update package.json
exports/exportsField and main/module/browser fields to point to the new single
dist output (e.g., entrypoint(s) under "dist/") and remove or replace any
references to dist/node, dist/cjs, and dist/browser so consumers resolve the
actual files produced by Bun.build.
- Around line 12-16: The build script currently runs the repo-wide clean via the
pkg.scripts?.clean check and await $`bun run clean`.quiet(), which deletes
node_modules and hides failures; replace this with a build-scoped cleanup (e.g.,
invoke a dedicated "clean:build" or "clean:dist" npm script) or remove the clean
invocation entirely so node_modules/tsc remain intact, and stop suppressing
errors so declaration generation failures surface (remove .quiet() or the
surrounding catch that swallows errors). Update both occurrences (the one using
pkg.scripts?.clean / await $`bun run clean`.quiet() and the similar block at
lines 36-44) to call the build-only clean script or no clean and allow errors to
propagate.

In `@src/index.ts`:
- Around line 51-69: The wrapper currently JSON.stringifys any non-string return
from handleTextSmall/handleTextLarge which drops streaming results; update both
ModelType.TEXT_SMALL and ModelType.TEXT_LARGE handlers to detect
TextStreamResult (or an object with onStreamChunk/stream fields) and pass it
through instead of stringifying, and if present wire params.stream and
params.onStreamChunk into the returned TextStreamResult (e.g., set result.stream
= result.stream ?? params.stream and result.onStreamChunk = result.onStreamChunk
?? params.onStreamChunk) so the internal stream flag and onStreamChunk are
bridged to the caller; otherwise fallback to returning typeof result ===
'string' ? result : JSON.stringify(result).

In `@src/models/text.ts`:
- Line 15: Add forwarding for tool parameters: extend the TextParamsWithStream
type to include tools?: GenerateTextParams['tools'] and toolChoice?:
GenerateTextParams['toolChoice'], then update buildGenerateParams(...) to copy
through tools and toolChoice from the incoming params into the object returned
for generateText/streamText (same place where you set model, prompt,
temperature, etc.). Make sure both the non-stream and stream branches (the paths
that call generateText and streamText) include these fields so tool-enabled
requests are passed intact to generateText/streamText.

---

Outside diff comments:
In `@package.json`:
- Around line 29-44: package.json contains leftover git conflict markers
(<<<<<<<, =======, >>>>>>>) around the dependencies block which makes the file
invalid; remove the conflict markers and keep a single `@elizaos/core` entry
(choose the intended value, e.g., "@elizaos/core": "workspace:*" or "^1.7.0")
and ensure the surrounding JSON structure is valid so the additional
dependencies ("@openrouter/ai-sdk-provider", "ai", "undici") and the
devDependencies block are properly placed under "dependencies" and
"devDependencies". Update package.json so build.ts can parse it at startup
without errors.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: afcde0eb-fe9c-4e38-93b7-dc6847a9e901

📥 Commits

Reviewing files that changed from the base of the PR and between b51e019 and 053ab7a.

📒 Files selected for processing (8)
  • .gitignore
  • build.ts
  • package.json
  • src/index.ts
  • src/models/image.ts
  • src/models/text.ts
  • src/providers/openrouter.ts
  • tsconfig.build.json

Comment thread build.ts
Comment on lines +12 to +16
// Use the clean script from package.json
if (pkg.scripts?.clean) {
console.log("🧹 Cleaning...");
await $`bun run clean`.quiet();
}

⚠️ Potential issue | 🔴 Critical

Don't run the repo-wide clean script as part of every build.

package.json Line 50 deletes node_modules. Running that here removes the local tsc binary and installed typings before the declaration step, and the catch block below turns the resulting failure into a green build. Use a build-only cleanup for artifacts, and let declaration generation fail the build.

Suggested fix
-  // Use the clean script from package.json
-  if (pkg.scripts?.clean) {
-    console.log("🧹 Cleaning...");
-    await $`bun run clean`.quiet();
-  }
+  console.log("🧹 Cleaning build artifacts...");
+  await $`rm -rf dist .turbo-tsconfig.json tsconfig.tsbuildinfo`.quiet();
@@
-    } catch (error) {
-      console.warn(`⚠️  TypeScript declaration generation had errors (${((Date.now() - dtsStart) / 1000).toFixed(2)}s)`);
-      console.warn("   Build will continue - fix type errors when possible");
+    } catch (error) {
+      console.error(`❌ TypeScript declaration generation failed (${((Date.now() - dtsStart) / 1000).toFixed(2)}s)`);
+      throw error;
     }

Also applies to: 36-44

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@build.ts` around lines 12 - 16, The build script currently runs the repo-wide
clean via the pkg.scripts?.clean check and await $`bun run clean`.quiet(), which
deletes node_modules and hides failures; replace this with a build-scoped
cleanup (e.g., invoke a dedicated "clean:build" or "clean:dist" npm script) or
remove the clean invocation entirely so node_modules/tsc remain intact, and stop
suppressing errors so declaration generation failures surface (remove .quiet()
or the surrounding catch that swallows errors). Update both occurrences (the one
using pkg.scripts?.clean / await $`bun run clean`.quiet() and the similar block
at lines 36-44) to call the build-only clean script or no clean and allow errors
to propagate.

Comment thread build.ts
Comment on lines +18 to 28
const esmStart = Date.now();
console.log("🔨 Building @elizaos/plugin-openrouter...");
const esmResult = await Bun.build({
entrypoints: ["src/index.ts"],
outdir: "dist",
target: "node",
format: "esm",
sourcemap: "external",
minify: false,
external: [...externalDeps],
});
console.log(`✅ Node build complete in ${((Date.now() - nodeStart) / 1000).toFixed(2)}s`);

// Browser build
const browserStart = Date.now();
console.log("🌐 Building @elizaos/plugin-openrouter for Browser...");
await Bun.build({
entrypoints: ["src/index.browser.ts"],
outdir: "dist/browser",
target: "browser",
format: "esm",
sourcemap: "external",
minify: false,
external: externalDeps,
});

⚠️ Potential issue | 🔴 Critical

Update the published entrypoints to match the new build layout.

This build now emits a single ESM dist tree, but package.json Lines 5-20 still publish dist/node/*, dist/cjs/*, and dist/browser/*. In this state, consumers will resolve files that are never created after the build change.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@build.ts` around lines 18 - 28, The package's published entrypoints still
reference non-existent directories (dist/node/*, dist/cjs/*, dist/browser/*)
while the build (see esmStart and the Bun.build call which outputs to outdir
"dist" with format "esm") now emits a single ESM dist tree; update package.json
exports/exportsField and main/module/browser fields to point to the new single
dist output (e.g., entrypoint(s) under "dist/") and remove or replace any
references to dist/node, dist/cjs, and dist/browser so consumers resolve the
actual files produced by Bun.build.
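
For reference, a package.json entry shape consistent with a single-ESM dist output might look like this (illustrative only; the plugin's actual subpath exports may differ):

```json
{
  "main": "dist/index.js",
  "module": "dist/index.js",
  "types": "dist/index.d.ts",
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/index.js",
      "default": "./dist/index.js"
    }
  }
}
```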

Comment thread src/index.ts
Comment on lines 51 to +69
[ModelType.TEXT_SMALL]: async (
runtime: IAgentRuntime,
params: GenerateTextParams
) => {
return handleTextSmall(runtime, params);
params: GenerateTextParams & {
tools?: Record<string, Tool>;
toolChoice?: ToolChoice<Record<string, Tool>>;
},
): Promise<string> => {
const result = await handleTextSmall(runtime, params);
return typeof result === 'string' ? result : JSON.stringify(result);
},
[ModelType.TEXT_LARGE]: async (
runtime: IAgentRuntime,
params: GenerateTextParams
) => {
return handleTextLarge(runtime, params);
params: GenerateTextParams & {
tools?: Record<string, Tool>;
toolChoice?: ToolChoice<Record<string, Tool>>;
},
): Promise<string> => {
const result = await handleTextLarge(runtime, params);
return typeof result === 'string' ? result : JSON.stringify(result);

⚠️ Potential issue | 🟠 Major

The text-model wrappers still break streaming.

handleTextSmall() / handleTextLarge() can now return TextStreamResult, but these entrypoints stringify any non-string result. That drops the stream instead of consuming it, and onStreamChunk is never bridged to the internal stream flag, so the openrouter_test_streaming case at Lines 158-175 won't receive chunks.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/index.ts` around lines 51 - 69, The wrapper currently JSON.stringifys any
non-string return from handleTextSmall/handleTextLarge which drops streaming
results; update both ModelType.TEXT_SMALL and ModelType.TEXT_LARGE handlers to
detect TextStreamResult (or an object with onStreamChunk/stream fields) and pass
it through instead of stringifying, and if present wire params.stream and
params.onStreamChunk into the returned TextStreamResult (e.g., set result.stream
= result.stream ?? params.stream and result.onStreamChunk = result.onStreamChunk
?? params.onStreamChunk) so the internal stream flag and onStreamChunk are
bridged to the caller; otherwise fallback to returning typeof result ===
'string' ? result : JSON.stringify(result).
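
A minimal sketch of the pass-through fix, assuming a stream result exposes a `textStream` field (the field name and the `normalizeTextResult` helper are illustrative, not the plugin's actual API):

```typescript
// Hypothetical stream-result shape; the plugin's real field names may differ.
interface TextStreamResult {
  textStream: AsyncIterable<string>;
}

function isTextStreamResult(value: unknown): value is TextStreamResult {
  return typeof value === "object" && value !== null && "textStream" in value;
}

// Normalize a handler result: strings and streams pass through untouched,
// while any other object falls back to JSON so callers still get a string.
function normalizeTextResult(
  result: string | TextStreamResult | Record<string, unknown>,
): string | TextStreamResult {
  if (typeof result === "string") return result;
  if (isTextStreamResult(result)) return result; // keep the stream alive
  return JSON.stringify(result);
}
```

The key point is that the type guard runs before the JSON fallback, so a streaming result is never serialized away.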

Comment thread src/models/text.ts
}

/** Params extended with optional stream flag used by this plugin */
type TextParamsWithStream = GenerateTextParams & { stream?: boolean };

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
set -euo pipefail

echo "Text-model entrypoints exposing tools/toolChoice:"
rg -n -C2 'tools\?:|toolChoice\?:' src/index.ts

echo
echo "Fields actually forwarded into generateParams:"
sed -n '14,56p' src/models/text.ts

Repository: elizaos-plugins/plugin-openrouter

Length of output: 2227


🏁 Script executed:

#!/bin/bash
set -euo pipefail

echo "=== Checking if generateText/streamText are called with tools/toolChoice ==="
rg -n 'generateText|streamText' src/models/text.ts -A 5

echo
echo "=== Checking where the params are passed to buildGenerateParams ==="
rg -n 'buildGenerateParams' src/models/text.ts -B 2 -A 10

echo
echo "=== Checking the complete function signatures in index.ts that accept tools ==="
sed -n '50,70p' src/index.ts

echo
echo "=== Checking how buildGenerateParams result is used ==="
grep -n 'buildGenerateParams' src/models/text.ts
rg -n 'const.*generateParams\|return.*generateParams' src/models/text.ts -A 3

Repository: elizaos-plugins/plugin-openrouter

Length of output: 4055


🏁 Script executed:

#!/bin/bash
set -euo pipefail

echo "=== Checking package.json for 'ai' library version ==="
cat package.json | grep -A 2 '"ai"'

echo
echo "=== Checking imports and types from 'ai' library ==="
head -20 src/models/text.ts

echo
echo "=== Checking if tools are used anywhere else in the codebase ==="
rg -n 'tools\|toolChoice' --type ts --type tsx | grep -v 'node_modules' | head -20

Repository: elizaos-plugins/plugin-openrouter

Length of output: 1201


🌐 Web query:

ai library v5.0.47 generateText streamText tools support documentation

💡 Result:

For the ai npm package (Vercel AI SDK) v5.x—including functions like generateText, streamText, and tool calling—the relevant documentation is here:

  • AI SDK Core – Generating Text (overview of generateText and streamText): [1]
  • AI SDK Core – Tools & Tool Calling (how to define tools, tool schemas, and tool-call loops/steps): [2]
  • Reference – AI SDK Core (API index for generateText(), streamText(), and related utilities): [3]
  • Reference – generateText (full parameter/return reference): [4]
  • AI SDK UI – Chatbot with Tool Calling (how tool calls stream and how to use tools with useChat + streamText): [5]
  • (Background) AI SDK 5 announcement (concepts like step control / tool loop primitives): [6]

Sources: [1] [2] [3] [4] [5] [6]


🌐 Web query:

vercel ai sdk v5 generateText streamText function signature parameters tools toolChoice

💡 Result:

generateText / streamText (AI SDK v5) — tools + toolChoice params

tools

  • Type: ToolSet (an object/map of tool names → tool definitions). Tools must be model-supported for tool calling.
    (v5.ai-sdk.dev)

Example shape:

tools: {
  weather: tool({
    description: '...',
    inputSchema: z.object({ location: z.string() }),
    execute: async ({ location }, options) => { /* ... */ },
  }),
}

(v5.ai-sdk.dev)

toolChoice?

  • Type:
toolChoice?:
  | 'auto'
  | 'none'
  | 'required'
  | { type: 'tool'; toolName: string }
  • Meaning: default 'auto'; 'none' disables tool execution; 'required' forces tool usage; {type:'tool', toolName} forces a specific tool.
    (v5.ai-sdk.dev)

Related (often used with tools)

  • activeTools?: Array<TOOLNAME> / Array<keyof TOOLS> — limit which tools are available for the model to call (all active by default). (v5.ai-sdk.dev)
  • prepareStep?: (...) => PrepareStepResult<TOOLS> — can override per-step toolChoice and activeTools. (v5.ai-sdk.dev)

Citations:


Forward tools and toolChoice into the generated request.

The Vercel AI SDK v5 (which is the version used here) supports tools and toolChoice parameters in both generateText and streamText. Currently, src/index.ts lines 53–66 accept these fields for text models, but buildGenerateParams() never includes them in the request object passed to generateText/streamText. Tool-enabled calls silently degrade to plain text generation.

Also extend TextParamsWithStream to include tool parameters so direct callers can express the capability.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/models/text.ts` at line 15, Add forwarding for tool parameters: extend
the TextParamsWithStream type to include tools?: GenerateTextParams['tools'] and
toolChoice?: GenerateTextParams['toolChoice'], then update
buildGenerateParams(...) to copy through tools and toolChoice from the incoming
params into the object returned for generateText/streamText (same place where
you set model, prompt, temperature, etc.). Make sure both the non-stream and
stream branches (the paths that call generateText and streamText) include these
fields so tool-enabled requests are passed intact to generateText/streamText.
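
A sketch of the forwarding fix, assuming a `buildGenerateParams`-style helper (the names and shapes here are illustrative; the plugin's real types come from `@elizaos/core` and `ai`):

```typescript
// Illustrative param shape standing in for GenerateTextParams + extensions.
interface TextParams {
  prompt: string;
  temperature?: number;
  tools?: Record<string, unknown>;
  toolChoice?: "auto" | "none" | "required" | { type: "tool"; toolName: string };
}

// Copy tools/toolChoice through so both generateText and streamText receive them.
function buildGenerateParams(model: string, params: TextParams) {
  return {
    model,
    prompt: params.prompt,
    temperature: params.temperature ?? 0.7,
    // Forward tool parameters only when the caller supplied them.
    ...(params.tools ? { tools: params.tools } : {}),
    ...(params.toolChoice ? { toolChoice: params.toolChoice } : {}),
  };
}
```

Spreading the optional fields conditionally keeps tool-free requests byte-identical to the current behavior.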


@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
package.json (1)

5-20: ⚠️ Potential issue | 🔴 Critical

Correct package.json entrypoints to match the actual build output.

The build only emits ESM output at dist/index.js and TypeScript declarations at dist/index.d.ts. The package.json manifest still declares CJS (dist/cjs/index.node.cjs), node-variant (dist/node/index.node.js), and browser (dist/browser/index.browser.js) files that are not produced. Update the main, module, types, and exports fields to reflect what the build actually creates:

  • "main" → "dist/index.js" (or remove if ESM-only)
  • "module" → "dist/index.js" (or remove if redundant)
  • "types" → "dist/index.d.ts"
  • "exports["."].import" → "dist/index.js"
  • Remove or update "exports["."].require" (CJS is not built)
  • Remove or update "exports["."].browser" (browser build is not built)

Consumers will fail at import/require/type-check time if these paths point to missing files.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@package.json` around lines 5 - 20, Update package.json entrypoints to match
the actual build output: set "main" and "module" to "dist/index.js" (or remove
them if you choose ESM-only), change "types" to "dist/index.d.ts", set
exports["."].import to "dist/index.js", and remove or clear exports["."].require
and exports["."].browser since no CJS or browser bundles are emitted; ensure the
exports["."].default also points to "dist/index.js" if present so all entry
fields reference existing dist/index.* artifacts.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@package.json`:
- Line 29: The package.json currently lists the dependency "@elizaos/core":
"workspace:*", which will break installs for consumers using npm; either replace
the workspace:* spec with a pinned semver (e.g. the actual version or a semver
range) for "@elizaos/core" or ensure your publish pipeline rewrites workspace:*
to the real version before publishing (update the publish script or tool
configuration that publishes package.json); locate the dependency entry for
"@elizaos/core" in package.json and make the change or add a verification step
in your release/publish scripts to rewrite workspace protocol for npm consumers.

---

Outside diff comments:
In `@package.json`:
- Around line 5-20: Update package.json entrypoints to match the actual build
output: set "main" and "module" to "dist/index.js" (or remove them if you choose
ESM-only), change "types" to "dist/index.d.ts", set exports["."].import to
"dist/index.js", and remove or clear exports["."].require and
exports["."].browser since no CJS or browser bundles are emitted; ensure the
exports["."].default also points to "dist/index.js" if present so all entry
fields reference existing dist/index.* artifacts.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: f9b4960d-0bb8-45b7-afa7-603b1db48dc3

📥 Commits

Reviewing files that changed from the base of the PR and between 053ab7a and 1ac87a6.

📒 Files selected for processing (1)
  • package.json

Comment thread package.json
"@ai-sdk/openai": "^2.0.32",
"@ai-sdk/ui-utils": "1.2.11",
"@elizaos/core": "^1.7.0",
"@elizaos/core": "workspace:*",

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🌐 Web query:

Does publishing an npm package with a runtime dependency set to "workspace:*" work for consumers outside the monorepo, and how do npm, pnpm, yarn, and bun handle rewriting workspace protocol versions on publish?

💡 Result:

No—publishing a package that still has runtime deps like "dep": "workspace:*" generally will not work for consumers outside your monorepo, because workspace: is meant to be resolved against local workspace packages. With plain npm, consumers typically hit errors like “Unsupported URL Type "workspace:"” when installing such a manifest. [1]

How publish/pack rewriting works by package manager:

  • npm (npm publish / npm pack): does not rewrite workspace:* to a real semver range/version (so publishing with workspace:* leaks that spec to the registry and breaks consumers). [1][2]
  • pnpm (pnpm pack / pnpm publish): rewrites workspace: deps at pack/publish time:
    • workspace:* → 1.5.0 (the referenced workspace package’s version)
    • workspace:~ → ~1.5.0
    • workspace:^ → ^1.5.0
    • other workspace:<range> → corresponding semver range (and also rewrites certain alias/path workspace forms). [3]
  • Yarn (Berry) (yarn npm publish): rewrites workspace: deps on publish, e.g. if the workspace version is 3.2.1:
    • workspace:^ → ^3.2.1
    • workspace:~ → ~3.2.1
    • workspace:* → =3.2.1
    • workspace:^3.0.0 → ^3.0.0. [4]
  • Bun (bun publish / publishing behavior): rewrites workspace: versions when publishing, e.g.
    • workspace:* → 1.0.1
    • workspace:^ → ^1.0.1
    • workspace:~ → ~1.0.1
    • workspace:1.0.2 → 1.0.2 (explicit version wins). [5]

Practical takeaway: if you publish libraries meant for external consumers, ensure your published tarball contains only normal semver (or normal npm alias specs like npm:pkg@version)—which happens automatically with pnpm/Yarn/Bun, but not with npm, unless you add your own prepack/prepublish rewrite step. [3][4][5][2]

Sources: [1] https://stackoverflow.com/questions/69220042/npm-err-unsupported-url-type-workspace-workspace, [2] https://nesbitt.io/2026/01/18/workspaces-and-monorepos-in-package-managers.html, [3] https://dimava.github.io/pnpm/workspaces/, [4] https://yarnpkg.com/features/workspaces, [5] https://bun.sh/docs/pm/workspaces


Pin the dependency version or verify the publish tool rewrites workspace:*.

Publishing "@elizaos/core": "workspace:*" as-is will fail for consumers installing outside the monorepo if using npm (which does not rewrite workspace protocol). If using pnpm, Yarn, or Bun, these tools automatically rewrite workspace:* to the actual version at publish time. For npm or custom publish pipelines, either pin a specific semver range here or confirm your release step rewrites workspace dependencies before publishing.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@package.json` at line 29, The package.json currently lists the dependency
"@elizaos/core": "workspace:*", which will break installs for consumers using
npm; either replace the workspace:* spec with a pinned semver (e.g. the actual
version or a semver range) for "@elizaos/core" or ensure your publish pipeline
rewrites workspace:* to the real version before publishing (update the publish
script or tool configuration that publishes package.json); locate the dependency
entry for "@elizaos/core" in package.json and make the change or add a
verification step in your release/publish scripts to rewrite workspace protocol
for npm consumers.
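
For pipelines that publish with plain npm, a prepack step can rewrite the workspace specs itself. A sketch, assuming the caller supplies the sibling workspace versions (the `rewriteWorkspaceSpecs` helper is hypothetical, not an existing tool):

```typescript
// Rewrite "workspace:" specs to concrete semver before `npm publish`,
// mirroring the pnpm/yarn/bun rules: * → exact, ^/~ → prefixed range,
// explicit workspace:<version> → that version.
function rewriteWorkspaceSpecs(
  pkgJson: string,
  workspaceVersions: Record<string, string>,
): string {
  const pkg = JSON.parse(pkgJson);
  for (const field of ["dependencies", "devDependencies", "peerDependencies"]) {
    const deps: Record<string, string> | undefined = pkg[field];
    if (!deps) continue;
    for (const [name, spec] of Object.entries(deps)) {
      if (!spec.startsWith("workspace:")) continue;
      const version = workspaceVersions[name];
      if (!version) throw new Error(`unknown workspace package: ${name}`);
      const range = spec.slice("workspace:".length);
      deps[name] =
        range === "*" ? version
        : range === "^" || range === "~" ? range + version
        : range; // workspace:1.0.2 keeps the explicit version
    }
  }
  return JSON.stringify(pkg, null, 2);
}
```

Wiring this into a `prepack` script keeps the repo's package.json on `workspace:*` while the published tarball carries real semver.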


@cursor cursor Bot left a comment


Cursor Bugbot has reviewed your changes and found 2 potential issues.

Bugbot Autofix is OFF. To automatically fix reported issues with cloud agents, enable autofix in the Cursor dashboard.

function withSupportedUrls<T extends object>(
model: T
): T & { supportedUrls: Record<string, RegExp[]> } {
return Object.assign({}, model, { supportedUrls: {} });

Object.assign loses prototype methods on model objects

High Severity

withSupportedUrls uses Object.assign({}, model, ...) which creates a shallow plain-object copy. The model returned by origChat(modelId) is a class instance (OpenRouterChatLanguageModel) whose critical methods like doGenerate and doStream live on the prototype chain. Object.assign only copies own enumerable properties, so those methods are silently dropped. When the AI SDK later calls model.doGenerate() or model.doStream(), it will throw a "not a function" error at runtime.


Comment thread src/index.ts
params: GenerateTextParams & {
tools?: Record<string, Tool>;
toolChoice?: ToolChoice<Record<string, Tool>>;
},

Tools and toolChoice params accepted but silently dropped

Medium Severity

The tools and toolChoice parameters are newly added to the TEXT_SMALL and TEXT_LARGE handler signatures, but buildGenerateParams in text.ts never includes them in the generateParams object passed to the AI SDK's generateText/streamText. Any caller providing tools will have them silently ignored, with no error or warning.

Additional Locations (1)


@odilitime odilitime changed the title vercel ai sdk v1 & v2 support streaming support + vercel ai sdk v1 & v2 support Mar 9, 2026
