devops-ai-cli is a personal CLI tool written in Go, powered by Cobra and Viper.
Its features include:
- Initialize: create a default configuration file
- Render: display Markdown files in the terminal
- Explain: ask OpenWebUI AI models questions from the terminal
- Query: continue conversations with OpenWebUI from the terminal
- Optimize files: get AI recommendations
- Verify: check whether the tools listed in the config are installed
You need Go installed. Check with:
go version
If Go is not installed, you can install version 1.24.0 on Linux:
curl -fsSL https://golang.org/dl/go1.24.0.linux-amd64.tar.gz | sudo tar -C /usr/local -xzf -
export PATH=$PATH:/usr/local/go/bin
echo "export PATH=\$PATH:/usr/local/go/bin" >> ~/.bashrc
echo "export GOPATH=\$HOME/go" >> ~/.bashrc
echo "export GOROOT=/usr/local/go" >> ~/.bashrc
Now, verify the installation:
go version
Then build the CLI from the project root:
go mod tidy
go build -o devopscli main.go
The init command creates a default configuration file.
./devopscli init
This will create a configuration file at ~/.config/devopscli/config.yaml
openwebui:
  host: "http://localhost:3000"
  api_key: "your-api-key-here"
  model: "gpt-4-turbo"
debug: false
By default, devopscli uses ~/.config/devopscli/config.yaml
for its configuration. To use a custom location, set the DEVOPSCLI_CONFIG_LOCATION
environment variable:
export DEVOPSCLI_CONFIG_LOCATION="$HOME/.devopscli/config.yaml"
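Internally, a config loader built on Viper (which this tool uses) might resolve that path as in the sketch below; the function name and error handling are illustrative, not the actual implementation:
package config

import (
	"os"
	"path/filepath"

	"github.com/spf13/viper"
)

// LoadConfig resolves the config file path and reads it with Viper.
// It honors the DEVOPSCLI_CONFIG_LOCATION override described above.
func LoadConfig() (*viper.Viper, error) {
	v := viper.New()

	path := os.Getenv("DEVOPSCLI_CONFIG_LOCATION")
	if path == "" {
		home, err := os.UserHomeDir()
		if err != nil {
			return nil, err
		}
		path = filepath.Join(home, ".config", "devopscli", "config.yaml")
	}

	v.SetConfigFile(path)
	if err := v.ReadInConfig(); err != nil {
		return nil, err
	}
	return v, nil
}
Values can then be read with, for example, v.GetString("openwebui.host").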
The render command allows you to display Markdown files beautifully in the terminal.
./devopscli render -f example.md
For example, create a sample file and render it:
echo "# Hello, World!" > example.md
./devopscli render -f example.md
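Under the hood, render can be little more than a wrapper around the glamour library mentioned later in this README. A minimal sketch, assuming the file is read whole and rendered with glamour's built-in "dark" style (the helper name is illustrative):
package main

import (
	"fmt"
	"os"

	"github.com/charmbracelet/glamour"
)

// renderFile reads a Markdown file and prints it styled for the terminal.
func renderFile(path string) error {
	data, err := os.ReadFile(path)
	if err != nil {
		return err
	}

	// "dark" is one of glamour's built-in style names.
	out, err := glamour.Render(string(data), "dark")
	if err != nil {
		return err
	}

	fmt.Print(out)
	return nil
}

func main() {
	if err := renderFile("example.md"); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}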
The explain command allows you to ask OpenWebUI AI for explanations on various topics, and it returns a response formatted in Markdown.
./devopscli explain "what does the Kubernetes CrashLoopBackOff mean?"
To use this command, configure OpenWebUI API details in one of two ways:
📌 Option 1: Using config.yaml
openwebui:
  host: "http://localhost:3000"
  api_key: "your-api-key-here"
📌 Option 2: Using Environment Variables
export OPENWEB_API_KEY="your-secret-api-key"
export OPENWEB_API_HOST="http://localhost:3000"
✅ The CLI sends a request to OpenWebUI with your query.
✅ OpenWebUI processes the request and returns a response.
✅ The response is rendered as Markdown using glamour.
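A hedged sketch of that flow, assuming OpenWebUI's OpenAI-compatible /api/chat/completions endpoint with a Bearer API key; the request/response structs and helper names are illustrative, not the tool's exact implementation:
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"

	"github.com/charmbracelet/glamour"
)

type chatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type chatRequest struct {
	Model    string        `json:"model"`
	Messages []chatMessage `json:"messages"`
}

type chatResponse struct {
	Choices []struct {
		Message chatMessage `json:"message"`
	} `json:"choices"`
}

// explain sends the prompt to OpenWebUI and renders the Markdown reply.
func explain(host, apiKey, model, prompt string) error {
	body, err := json.Marshal(chatRequest{
		Model:    model,
		Messages: []chatMessage{{Role: "user", Content: prompt}},
	})
	if err != nil {
		return err
	}

	// Assumes the OpenAI-compatible chat completions route exposed by OpenWebUI.
	req, err := http.NewRequest(http.MethodPost, host+"/api/chat/completions", bytes.NewReader(body))
	if err != nil {
		return err
	}
	req.Header.Set("Authorization", "Bearer "+apiKey)
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()

	var parsed chatResponse
	if err := json.NewDecoder(resp.Body).Decode(&parsed); err != nil {
		return err
	}
	if len(parsed.Choices) == 0 {
		return fmt.Errorf("empty response from OpenWebUI")
	}

	out, err := glamour.Render(parsed.Choices[0].Message.Content, "dark")
	if err != nil {
		return err
	}
	fmt.Print(out)
	return nil
}

func main() {
	err := explain(os.Getenv("OPENWEB_API_HOST"), os.Getenv("OPENWEB_API_KEY"),
		"gpt-4-turbo", "what does the Kubernetes CrashLoopBackOff mean?")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}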
The query command allows you to ask OpenWebUI AI questions while maintaining context across multiple queries.
./devopscli query "What is Kubernetes?"
# Kubernetes Explained
Kubernetes is an open-source container orchestration system...
📌 **Conversation ID**: 1
This assigns the question a Conversation ID (1), allowing follow-ups.
To ask follow-up questions, use the --cid flag with the conversation ID:
./devopscli query "How does Kubernetes handle scaling?" --cid "1"
# Kubernetes Scaling
- Uses Horizontal Pod Autoscaler (HPA)
- Uses Cluster Autoscaler
- Supports Vertical Pod Autoscaling
This maintains context and follows up on the previous question.
./devopscli query --list
📌 **Previous Conversations:**
📌 1: What is Kubernetes?
📌 2: How does Kubernetes handle deployments?
This lets you see which past questions you can continue.
./devopscli query --delete 1
✅ Removes conversation ID 1 from storage.
./devopscli query --clear
🗑️ Deletes all stored conversations.
✅ Maintains conversation history
✅ Allows follow-up questions (--cid)
✅ Lists previous queries (--list)
✅ Deletes single (--delete) or all (--clear) conversations
✅ Uses OpenWebUI API for intelligent responses
✅ Outputs beautifully formatted Markdown responses
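How the history is persisted isn't covered above; one plausible approach is a JSON file of messages per conversation ID, so follow-ups with --cid can resend prior context. A rough sketch under that assumption, with hypothetical paths and types:
package main

import (
	"encoding/json"
	"os"
	"path/filepath"
	"strconv"
)

// message is one turn in a conversation (role is "user" or "assistant").
type message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

// conversation holds the full message history for one conversation ID.
type conversation struct {
	ID       int       `json:"id"`
	Messages []message `json:"messages"`
}

// conversationPath returns a hypothetical storage location such as
// ~/.config/devopscli/conversations/1.json.
func conversationPath(id int) (string, error) {
	home, err := os.UserHomeDir()
	if err != nil {
		return "", err
	}
	return filepath.Join(home, ".config", "devopscli", "conversations", strconv.Itoa(id)+".json"), nil
}

// save writes the conversation to disk so --cid follow-ups can reload it.
func save(c conversation) error {
	path, err := conversationPath(c.ID)
	if err != nil {
		return err
	}
	if err := os.MkdirAll(filepath.Dir(path), 0o755); err != nil {
		return err
	}
	data, err := json.MarshalIndent(c, "", "  ")
	if err != nil {
		return err
	}
	return os.WriteFile(path, data, 0o644)
}

// load reads a stored conversation so its messages can be prepended
// to the next request, preserving context.
func load(id int) (conversation, error) {
	var c conversation
	path, err := conversationPath(id)
	if err != nil {
		return c, err
	}
	data, err := os.ReadFile(path)
	if err != nil {
		return c, err
	}
	err = json.Unmarshal(data, &c)
	return c, err
}

func main() {
	c := conversation{ID: 1, Messages: []message{{Role: "user", Content: "What is Kubernetes?"}}}
	if err := save(c); err != nil {
		panic(err)
	}
}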
The optimize command allows you to send a code or configuration file (e.g., YAML, JSON, Python, Terraform, Shell scripts) to OpenWebUI AI, which will analyze it and provide optimization suggestions in Markdown format.
./devopscli optimize -f _extras/manifests/example-deployment.yaml
Example for a Terraform file:
./devopscli optimize -f infra.tf
Example for a Python script:
./devopscli optimize -f script.py
To use this command, configure OpenWebUI API details via a config file or environment variables.
openwebui:
  host: "http://localhost:3000"
  api_key: "your-api-key-here"
  model: "gemma:2b"
export OPENWEB_API_KEY="your-secret-api-key"
export OPENWEB_API_HOST="http://localhost:3000"
If you run:
./devopscli optimize -f _extras/manifests/example-deployment.yaml
The AI's response will be returned in Markdown format.
✅ Supports multiple file types (YAML, JSON, .py, .tf, .sh, etc.)
✅ Sends the file content to OpenWebUI for AI-based optimization
✅ Receives Markdown suggestions and beautifully renders them in the terminal
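A small sketch of how the file contents might be wrapped into a prompt before being sent to OpenWebUI; the prompt wording and helper name are illustrative assumptions, not the tool's exact behavior:
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// buildOptimizePrompt reads the target file and wraps it in an instruction
// asking the model for optimization suggestions in Markdown.
func buildOptimizePrompt(path string) (string, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return "", err
	}
	ext := filepath.Ext(path) // e.g. ".yaml", ".tf", ".py"
	prompt := fmt.Sprintf(
		"Review the following %s file and suggest optimizations. Respond in Markdown.\n\n%s",
		ext, string(data),
	)
	return prompt, nil
}

func main() {
	prompt, err := buildOptimizePrompt("infra.tf")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println(prompt)
}
The resulting prompt would then be sent to OpenWebUI the same way as in the explain sketch above, and the Markdown reply rendered with glamour.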
The verify command checks whether required DevOps tools are installed on your system. It reads the list of tools from config.yaml and reports their availability.
./devopscli verify
Define required tools in config.yaml:
tools:
  required:
    - kubectl
    - terraform
    - docker
    - helm
    - git
    - jq
    - curl
📌 **Verifying Required Tools:**
✅ kubectl
❌ terraform (Not Installed)
✅ docker
✅ helm
✅ git
❌ jq (Not Installed)
✅ curl
✅ Reads required tools from config.yaml
✅ Checks if each tool is installed
✅ Displays ✅ (installed) and ❌ (missing)
✅ Fast and lightweight!
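A minimal sketch of how such a check can be done in Go using exec.LookPath from the standard library; reading the list from config.yaml is replaced with a hard-coded slice for brevity:
package main

import (
	"fmt"
	"os/exec"
)

// verifyTools reports whether each required tool is found on PATH.
func verifyTools(tools []string) {
	fmt.Println("📌 Verifying Required Tools:")
	for _, tool := range tools {
		if _, err := exec.LookPath(tool); err != nil {
			fmt.Printf("❌ %s (Not Installed)\n", tool)
		} else {
			fmt.Printf("✅ %s\n", tool)
		}
	}
}

func main() {
	// In the real CLI these names would come from the tools.required list in config.yaml.
	verifyTools([]string{"kubectl", "terraform", "docker", "helm", "git", "jq", "curl"})
}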