Feature Request: Multichannel “Quick Invocation” Mode (Daemon + Router) with Persistent Indexed Memory & Cache
Use case / motivation
A lot of people use Code Puppy like a “launcher assistant” (Raycast-style): short, mostly independent questions, minimal context, low latency.
In this usage pattern, it’s costly when the runtime:
- pulls too much conversational history by default,
- repeatedly invokes the same agents/tooling pathways for similar micro-requests,
- rebuilds context/plans redundantly across invocations instead of reusing cached artifacts and durable memory.
I’d like to propose a multichannel mode optimized for:
- independent prompts (stateless-ish per request),
- minimal context assembly (retrieve only what’s needed),
- incremental reuse of prior work via a persistent indexed memory + output cache,
- optional daemon/event-loop runtime so multiple clients can share the same cache/memory.
This feels adjacent to:
Proposed concept: “multichannel mode”
Definition: A runtime mode where each request is treated as an independent “turn” in a specific channel_id (client channel), with:
- short-term state scoped to that channel (small sliding window),
- long-term state stored in a persistent memory layer (indexed),
- caching of redundant agent/tool results across channels (where safe).
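To make the per-channel split concrete, here is a minimal sketch of what channel-scoped short-term state could look like. All names (`ChannelState`, `window_size`, etc.) are illustrative assumptions, not part of Code Puppy today:

```python
from collections import deque
from dataclasses import dataclass, field

# Hypothetical per-channel state: a small sliding window of recent turns.
# Long-term memory and the cross-channel cache would live in shared,
# persistent stores outside this object.
@dataclass
class ChannelState:
    channel_id: str
    window_size: int = 4  # short-term context kept per channel
    turns: deque = field(default_factory=deque)

    def add_turn(self, prompt: str, response: str) -> None:
        # Append the new turn and drop the oldest once past the window.
        self.turns.append((prompt, response))
        while len(self.turns) > self.window_size:
            self.turns.popleft()

    def context(self) -> list:
        # Only this small window is assembled into the next request,
        # which keeps "launcher-style" invocations cheap.
        return list(self.turns)
```

With `window_size=2`, only the two most recent turns survive, which is the “stateless-ish per request” behavior described above.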
Typical clients:
- CLI (interactive or non-interactive)
- a launcher (Raycast, Alfred, etc.)
- editor integration (VSCode, etc.)
Architecture sketch (high-level)
Option A (preferred long-term): Daemon + protocol (ACP)
code-puppy daemon runs as a long-lived event loop
- clients connect and send prompt events
- daemon maintains caches + memory + agent pool
Option B (MVP-friendly): stdio NDJSON subprocess
code-puppy --stdio --mode multichannel reads JSON events from stdin and writes JSON events to stdout
- wrappers can keep a single process alive (or restart while still persisting memory on disk)
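To make Option B concrete, one possible NDJSON framing is shown below: one JSON object per line on stdin (request events) and one per line on stdout (response events). The field names (`type`, `channel_id`, `text`, `cache_hit`) are assumptions for illustration; the proposal does not fix a schema:

```python
import json

# Hypothetical request event a wrapper would write to the subprocess's stdin.
request = {
    "type": "prompt",
    "channel_id": "raycast",
    "text": "convert 3 miles to km",
}

# NDJSON framing: one JSON object per line, newline-terminated.
line = json.dumps(request) + "\n"

# The daemon/subprocess parses each incoming line the same way.
event = json.loads(line)

# Hypothetical response event written back on stdout, echoing channel_id
# so clients can multiplex several channels over one process.
response = {
    "type": "response",
    "channel_id": event["channel_id"],
    "text": "(answer)",
    "cache_hit": False,
}
out_line = json.dumps(response) + "\n"
```

Because each event is a single line, a wrapper can keep one process alive and interleave channels freely, or restart it without losing the on-disk memory.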
Key building blocks
Supervisor (“shadow”) agent as the event-loop orchestrator
A thin orchestrator that:
- receives user input
- creates a structured record (turn log)
- routes the request to the minimal skill/agent
- captures output + tool results
- returns a single response payload to the client
- commits durable memory + cache entries
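The loop above could be sketched roughly as follows. Everything here is hypothetical (the router and agent call are stubs); the point is the shape: check cache, route to the minimal agent, commit memory + cache, return one payload:

```python
import hashlib

# Hypothetical shared stores across channels.
CACHE: dict = {}   # cross-channel result cache (only for safe, deterministic requests)
MEMORY: list = []  # durable turn log; a real version would be indexed and persisted

def route(text: str) -> str:
    # Stub router: pick the minimal skill/agent for the request.
    return "calculator" if any(c.isdigit() for c in text) else "general"

def handle_turn(channel_id: str, text: str) -> dict:
    # 1. Reuse a prior result when the same request was seen before.
    key = hashlib.sha256(text.encode()).hexdigest()
    if key in CACHE:
        return {"channel_id": channel_id, "text": CACHE[key], "cache_hit": True}

    # 2. Route to the minimal agent and capture its output.
    agent = route(text)
    output = f"[{agent}] handled: {text}"  # stand-in for the real agent/tool call

    # 3. Commit durable memory + cache entries, then return one payload.
    MEMORY.append({"channel_id": channel_id, "prompt": text, "agent": agent})
    CACHE[key] = output
    return {"channel_id": channel_id, "text": output, "cache_hit": False}
```

Note that the cache is keyed on the request content, not the channel, so a repeat micro-request from a different client is served from cache rather than re-invoking the agent.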