A web-based client application for the PostgreSQL Model Context Protocol (PG-MCP) server that allows users to query databases using natural language.
More information on the PG-MCP server project is available in the Related Projects section below.
PG-MCP-Client provides a user-friendly web interface for interacting with PostgreSQL databases through the PG-MCP server. It leverages AI language models to translate natural language questions into SQL queries, making database exploration accessible to users without SQL expertise.
The client connects to a PG-MCP server, which provides database schema information and query execution capabilities, while an AI provider (Anthropic, Google, or OpenAI) handles the natural language understanding and SQL generation.
- Natural Language Interface: Query your PostgreSQL database using plain English
- Multiple LLM Support: Compatible with Anthropic (Claude), Google (Gemini), and OpenAI models
- Interactive UI: Clean, responsive web interface built with Tailwind CSS and HTMX
- Database Schema Exploration: Automatically retrieves and uses database schema for accurate query generation
- Query Visualization: Displays generated SQL and query results in a formatted table
- Configurable Settings: Easy configuration of API keys and connection details
- Python 3.13+
- A running PG-MCP server
- An API key from one of the supported AI providers (Anthropic, Google, or OpenAI)
```bash
# Clone the repository
git clone https://github.com/stuzero/pg-mcp-client.git
cd pg-mcp-client

# Create a .env file with your application secret
echo "APPLICATION_SECRET=your_secure_random_string" > .env

# Build and run with Docker
docker-compose up -d
```
This project uses uv for dependency management; see the uv installation instructions if you do not already have it installed.
```bash
# Clone the repository
git clone https://github.com/stuzero/pg-mcp-client.git
cd pg-mcp-client

# Install dependencies and create a virtual environment (.venv)
uv sync

# Activate the virtual environment
source .venv/bin/activate

# Create a .env file with your application secret
echo "APPLICATION_SECRET=your_secure_random_string" > .env

# Run the application
python -m client.app
```
- Access the web interface at http://localhost:8080
- Navigate to the Settings page and configure:
- LLM Provider (Anthropic, Gemini, or OpenAI)
- LLM API Key
- PG-MCP Server URL (e.g., http://localhost:8000/sse)
- PostgreSQL Connection String
- Go to the Query page
- Enter your question in natural language (e.g., "Show me the top 10 customers by revenue")
- View the generated SQL and query results
The application stores the following configuration in the user's session; example values are shown after the list:
- LLM Provider: Select from Anthropic (Claude), Gemini, or OpenAI
- LLM API Key: Your API key for the selected provider
- PG-MCP Server URL: The URL of your PG-MCP server's SSE endpoint
- Database URL: PostgreSQL connection string for the target database
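For reference, a typical set of values might look like the following. The hostnames, ports, and credentials here are placeholders for illustration, not values required by the project:

```text
LLM Provider:       Anthropic
LLM API Key:        your_provider_api_key
PG-MCP Server URL:  http://localhost:8000/sse
Database URL:       postgresql://user:password@localhost:5432/mydb
```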
PG-MCP-Client consists of:
- Web UI: Built with Starlette, Jinja2 templates, Tailwind CSS, and HTMX
- Agent Service: Connects to LLM providers and the PG-MCP server
- Query Processing (a code sketch follows the flow diagram below):
- Fetches database schema from PG-MCP server
- Sends schema + user question to LLM for SQL generation
- Executes generated SQL via PG-MCP server
- Formats and displays results
The application flow:
User Input → Schema Retrieval → LLM Processing → SQL Generation →
Query Execution → Results Display
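To make the flow concrete, here is a minimal, hypothetical sketch of the same pipeline written against the MCP Python SDK and the OpenAI SDK. The resource URI (`pgmcp://schema`), the tool name (`execute_query`), and its argument keys are assumptions for illustration rather than the documented PG-MCP interface, and the real client drives the LLM through PydanticAI instead of calling an LLM SDK directly:

```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client
from openai import AsyncOpenAI
from pydantic import AnyUrl

# Illustrative placeholders only; adjust to your PG-MCP server and database.
SERVER_URL = "http://localhost:8000/sse"
DB_URL = "postgresql://user:password@localhost:5432/mydb"


async def ask(question: str, api_key: str) -> None:
    async with sse_client(SERVER_URL) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # 1. Schema retrieval: ask the MCP server for schema information
            #    ("pgmcp://schema" is a hypothetical resource URI).
            schema = await session.read_resource(AnyUrl("pgmcp://schema"))
            schema_text = "\n".join(
                c.text for c in schema.contents if hasattr(c, "text")
            )

            # 2. LLM processing / SQL generation: send schema + user question.
            llm = AsyncOpenAI(api_key=api_key)
            completion = await llm.chat.completions.create(
                model="gpt-4o",
                messages=[
                    {
                        "role": "system",
                        "content": "Generate a single PostgreSQL query.\n"
                                   f"Schema:\n{schema_text}",
                    },
                    {"role": "user", "content": question},
                ],
            )
            sql = completion.choices[0].message.content

            # 3. Query execution: run the generated SQL via the MCP server
            #    ("execute_query" and its argument names are assumptions).
            result = await session.call_tool(
                "execute_query",
                arguments={"connection_string": DB_URL, "query": sql},
            )

            # 4. Results display: the web UI renders this as an HTML table;
            #    here we simply print the raw tool output.
            print(sql)
            print(result.content)


asyncio.run(ask("Show me the top 10 customers by revenue", "your_api_key"))
```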
- PG-MCP Server: The server component for PostgreSQL database access
- Model Context Protocol: The underlying protocol for AI agent interactions
- PydanticAI: The agent framework used by this client
Contributions are welcome! Areas for improvement:
- Additional LLM provider integrations
- Enhanced UI features
- Query history and management
- Improved error handling and debugging
- SQL query customization options