Just say what you want. In your terminal.
Install • Usage • Providers • Config
$ pls "compress all PNGs in this folder"
╭───────────────────────────────────────────────────────────╮
│ find . -name "*.png" -exec pngquant --quality=65-80 {} \; │
╰───────────────────────────────────────────────────────────╯
Run it? (Y/n)
✓ Done.
pip install pls-sh

That's it. The command is pls.
pls "find files bigger than 100MB"
pls "kill whatever is using port 3000"
pls "convert video.mp4 to gif"
pls "show disk usage sorted by size"
pls "rename all .jpeg files to .jpg"Works offline by default with Ollama. No API key, no internet, no telemetry.
pls "do something" --explain # also explains what the command does
pls "do something" --yes # skip confirmation, just run it
pls "do something" --dry-run # show command but don't run it
pls "do something" --provider openai
pls "do something" --model gpt-4o
pls --last # show the last generated command
echo "do something" | pls # pipe from stdinpls flags dangerous commands before running them. Stuff like rm -rf, chmod 777,
dd, piping random scripts into bash — all gets highlighted in red with a warning.
Dangerous commands flip the confirmation to opt-in (y/N instead of Y/n).
Press e at the confirmation prompt to edit the command before running it.
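A minimal sketch of how a danger check like this could work. The patterns and prompt wording below are illustrative assumptions, not pls's actual list:

```shell
# Return success (0) if the command matches a risky pattern.
# Pattern list is a made-up example, not pls's real one.
is_dangerous() {
  printf '%s' "$1" | grep -qE 'rm -rf|chmod 777|\bdd\b|curl[^|]*\|\s*(ba)?sh'
}

# Pick the confirmation default: opt-out normally, opt-in when dangerous.
prompt_for() {
  if is_dangerous "$1"; then
    echo "Run it? (y/N)"   # dangerous: default is "no"
  else
    echo "Run it? (Y/n)"   # safe: default is "yes"
  fi
}

prompt_for "rm -rf node_modules"   # → Run it? (y/N)
```

Flipping the default, rather than refusing outright, keeps the human in the loop without blocking legitimate use of these commands.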
Ollama is the default. Runs locally, no account needed.
# make sure ollama is running
ollama serve
ollama pull qwen3.5:2b
pls "list all docker containers"
# just works

LM Studio, llama.cpp, or any OpenAI-compatible server:
# LM Studio (runs on port 1234 by default)
pls config set default provider lmstudio
# any OpenAI-compatible endpoint (llama.cpp, vLLM, OpenRouter, etc.)
pls config set custom url http://localhost:8080
pls config set custom model my-model
pls config set custom api_key sk-... # optional
pls config set default provider custom

OpenAI and Anthropic if you want cloud models:
# either set env vars
export OPENAI_API_KEY=sk-...
export ANTHROPIC_API_KEY=sk-ant-...
# or save them in config
pls config set openai api_key sk-...
pls config set anthropic api_key sk-ant-...
# then use them
pls "do something" --provider openai
pls "do something" --provider anthropic
# or set a default
pls config set default provider anthropic

Config lives in ~/.config/pls/config.toml.
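Going by the commands above, the resulting config.toml could look something like this. The section and key names are assumptions inferred from the CLI, not verified against pls's actual schema:

```toml
[default]
provider = "anthropic"

[openai]
api_key = "sk-..."

[anthropic]
api_key = "sk-ant-..."

[custom]
url = "http://localhost:8080"
model = "my-model"
# api_key = "sk-..."  # optional
```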
pls config show # see current config
pls config set ... # change a value
pls config reset # back to defaults

- You type what you want in plain English (or any language, really)
- pls grabs context: your OS, shell, what's in the current directory
- Sends that plus your request to the LLM
- Shows you the command, asks for confirmation
- Runs it
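The context step above can be sketched in a few lines of shell. The prompt wording here is made up for illustration; pls's real prompt will differ:

```shell
# Collect the same kind of context pls sends along with a request:
# OS, shell, and a peek at the current directory.
build_prompt() {
  os=$(uname -s)                       # e.g. Linux, Darwin
  shell_name=${SHELL:-/bin/sh}         # fall back if $SHELL is unset
  files=$(ls | head -5 | tr '\n' ' ')  # first few entries only
  printf 'OS: %s\nShell: %s\nFiles: %s\nRequest: %s\n' \
    "$os" "$shell_name" "$files" "$1"
}

build_prompt "compress all PNGs in this folder"
```

Bundling this context is what lets the model pick platform-appropriate commands (BSD vs GNU flags, for instance) without you spelling it out.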
No history stored, no data sent anywhere (unless you use OpenAI/Anthropic).
MIT
