
Skill: validate that package versions in plans are current and not outdated #937

@harrisrobin

Description

  • I searched existing issues and this has not been proposed before

What problem does this solve?

When AI generates implementation plans that include specific package names and versions, those versions are often stale — the AI's training data has a knowledge cutoff, and popular packages (especially in the JS/TS/Python ecosystems) release frequently. The result is that plans include outdated package versions that produce deprecation warnings, security advisories, or broken installs, requiring manual correction before work can begin.

I run into this regularly: I ask for a plan that includes dependencies, the AI confidently pins versions that are a full major version behind the latest release, and I have to look up the current versions myself before I can start coding.

Proposed solution

A validate-package-versions skill (or a skill integrated into the existing planning workflow) that:

  1. Scans the current plan or a user-provided list of packages for version pins or version ranges
  2. Looks up the latest published version of each package against the appropriate registry (npm, PyPI, crates.io, etc.)
  3. Reports any packages whose planned version is significantly behind the latest release (e.g. a full major version behind, or older than a configurable staleness threshold)
  4. Optionally rewrites the plan with corrected, up-to-date version pins
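The lookup and staleness check (steps 2–3) could be quite small. Here is a minimal sketch using Python's standard library; the registry URLs are the real public endpoints, but the function names, the major-version threshold, and the semver parsing are illustrative assumptions, not a spec for the skill:

```python
# Sketch of steps 2-3: look up the latest published version and flag stale pins.
# Function names and the staleness rule are illustrative, not a proposed API.
import json
import re
import urllib.request

# Public registry metadata endpoints (npm returns "dist-tags.latest",
# PyPI returns "info.version" in its JSON response).
REGISTRY_URLS = {
    "npm": "https://registry.npmjs.org/{name}",
    "pypi": "https://pypi.org/pypi/{name}/json",
}

def parse_semver(version: str) -> tuple:
    """Extract (major, minor, patch) from a pin like '^4.17.21' or '4.17.21'."""
    m = re.search(r"(\d+)\.(\d+)\.(\d+)", version)
    if not m:
        raise ValueError(f"unrecognized version: {version!r}")
    return tuple(int(g) for g in m.groups())

def is_stale(pinned: str, latest: str, major_threshold: int = 1) -> bool:
    """Flag a pin whose major version trails the latest by the threshold or more."""
    return parse_semver(latest)[0] - parse_semver(pinned)[0] >= major_threshold

def latest_npm_version(name: str) -> str:
    """Query the npm registry for the latest published version (network call)."""
    url = REGISTRY_URLS["npm"].format(name=name)
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["dist-tags"]["latest"]
```

The threshold-by-major-version rule is just one possible staleness policy; an age-based cutoff (e.g. "published more than N months after the pinned version") would fit the same shape.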

The skill should be usable both reactively (run on an existing plan) and proactively (invoked automatically as part of a planning workflow via a hook or the writing-plans skill).
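The optional rewrite in step 4 could be a plain pin substitution over the plan text. A hedged sketch, assuming simple `name@x.y.z` pins (scoped npm packages and version ranges are deliberately not handled here; the function name and regex are illustrative):

```python
# Sketch of step 4: substitute stale name@version pins in a plan with
# current versions. Assumes a prior lookup produced the `latest` mapping.
import re

def rewrite_pins(plan_text: str, latest: dict) -> str:
    """Replace name@x.y.z pins with the latest known versions.

    `latest` maps package name -> latest version string; packages not in
    the mapping are left untouched.
    """
    def repl(m):
        name = m.group("name")
        if name in latest:
            return f"{name}@{latest[name]}"
        return m.group(0)  # unknown package: keep the original pin

    # Matches simple pins like lodash@4.17.21 (no scoped packages or ranges).
    return re.sub(r"(?P<name>[A-Za-z0-9_.-]+)@(?P<ver>\d+\.\d+\.\d+)",
                  repl, plan_text)
```

For example, `rewrite_pins("install lodash@3.10.1", {"lodash": "4.17.21"})` would update only the lodash pin and leave any package it has no data for unchanged.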

What alternatives did you consider?

  • Trust the user to check versions manually: This is what I do today. It's tedious and easy to forget, especially when there are many dependencies.
  • Ask the AI to always use latest versions: The AI can't know what "latest" is at runtime without a lookup; it will still use training-data versions.
  • Use a package-manager audit command after install: This catches the problem too late — after the plan has been accepted and dependencies installed — rather than flagging it during the planning phase.

The skill approach is better because it surfaces the problem before any code is written or installed, while the plan can still be cheaply corrected.

Is this appropriate for core Superpowers?

Yes. Package version staleness affects every developer working in any ecosystem with a package registry (JS/TS, Python, Rust, Go modules, etc.). It is not specific to any domain, company, or workflow. Any AI-assisted planning workflow that involves dependencies is susceptible to this problem.

Context

  • Using Claude Code (claude-sonnet-4-6) with the Superpowers harness
  • Hits this most often when asking for writing-plans output that includes npm/pnpm dependencies
  • Would be most valuable if the skill could be wired up as a hook that runs automatically after plan generation
