An AI-powered weather assistant built with LangGraph, MCP (Model Context Protocol), and Next.js. Ask natural language questions about weather and get real-time answers powered by OpenWeatherMap.
- Natural language weather queries (current conditions, forecasts, air quality, alerts)
- Streaming responses via Server-Sent Events
- LLM provider fallback (Google Gemini / Groq)
- MCP-based tool architecture with 5 weather tools
- Dark-mode chat UI with example query chips
- Dockerized with isolated internal networking
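The streaming responses arrive as Server-Sent Events. As a rough illustration of what a client has to do with that stream, here is a minimal, dependency-free parser for the `data:` lines of an SSE byte stream; the exact payload format the backend emits is an assumption, not taken from the repo.

```python
def parse_sse(raw: bytes) -> list[str]:
    """Extract the data payload of each event in an SSE byte stream."""
    events, data_lines = [], []
    for line in raw.decode().splitlines():
        if line.startswith("data:"):
            data_lines.append(line[5:].lstrip())
        elif line == "" and data_lines:
            # A blank line terminates an event.
            events.append("\n".join(data_lines))
            data_lines = []
    return events

print(parse_sse(b"data: Hel\n\ndata: lo\n\n"))  # ['Hel', 'lo']
```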
- Docker & Docker Compose (recommended) or Python 3.12+ and Node.js 22+
- OpenWeatherMap API key - get one free
- LLM provider - at least one of:
  - Google Cloud / Vertex AI: a service account JSON key with the Vertex AI API enabled (`GOOGLE_APPLICATION_CREDENTIALS` + `VERTEX_PROJECT`)
  - Groq: an API key (`GROQ_API_KEY`)
```bash
# 1. Clone and configure
git clone <your-repo-url> weatherwise-agent
cd weatherwise-agent
cp .env.example .env
# Edit .env with your API keys

# 2. Build and run
docker compose up --build

# 3. Open http://localhost:3000
```

The MCP server runs on an internal network with no exposed ports; only the agent backend (`:8000`) and frontend (`:3000`) are accessible from the host.
MCP server:

```bash
cd mcp-server
pip install -r requirements.txt
python server.py   # runs via stdio transport
```

Agent backend:

```bash
cd agent-backend
pip install -r requirements.txt
python main.py     # starts on http://localhost:8000
```

The agent spawns the MCP server as a subprocess (stdio) automatically when `MCP_SERVER_URL` is not set.
Frontend:

```bash
cd frontend
npm install
npm run dev   # starts on http://localhost:3000
```

API calls are proxied to `http://localhost:8000` by default (configurable via the `API_URL` env var).
The app is deployed on Google Cloud Run with all 3 services running as separate containers:
| Service | Live URL |
|---|---|
| Frontend | https://weatherwise-agent-frontend-ybn6xfzrsa-uc.a.run.app |
| Agent Backend | https://weatherwise-agent-backend-ybn6xfzrsa-uc.a.run.app |
| API Docs (Swagger) | https://weatherwise-agent-backend-ybn6xfzrsa-uc.a.run.app/docs |
| MCP Server | https://weatherwise-agent-mcp-ybn6xfzrsa-uc.a.run.app |
To deploy from scratch or redeploy after code changes:
```bash
./deploy.sh
```

The script handles everything: Artifact Registry, service account, Secret Manager, Docker builds (`linux/amd64`), and ordered Cloud Run deploys. See DEPLOYMENT.md for the full deployment guide, architecture, and troubleshooting.
```bash
cd mcp-server
pip install -r requirements.txt
pytest
```

```bash
cd agent-backend
pip install -r requirements.txt
pytest
```

Tests use pytest-asyncio for async test support. All tests use mocks and dummy API keys (set automatically via conftest.py), so no real credentials are needed.
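As a hedged sketch of that mocking approach, the snippet below exercises an async step with `unittest.mock.AsyncMock` instead of a real API call; `answer` is a made-up stand-in for the agent, not code from the repo.

```python
import asyncio
from unittest.mock import AsyncMock

async def answer(city: str, get_weather) -> str:
    # Hypothetical agent step: call one tool, format the result.
    data = await get_weather(city)
    return f"It is {data['temp']}°C in {city}."

def test_answer_with_mocked_tool():
    fake_tool = AsyncMock(return_value={"temp": 21})  # no real API key needed
    result = asyncio.run(answer("London", fake_tool))
    assert result == "It is 21°C in London."
    fake_tool.assert_awaited_once_with("London")

test_answer_with_mocked_tool()
print("ok")
```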
| Variable | Required | Default | Description |
|---|---|---|---|
| `OPENWEATHER_API_KEY` | Yes | - | OpenWeatherMap API key |
| `LLM_PROVIDER` | No | `google` | LLM provider: `google` or `groq` |
| `GOOGLE_APPLICATION_CREDENTIALS` | Local only | - | Path to GCP service account JSON (not needed on Cloud Run; ADC handles auth) |
| `VERTEX_PROJECT` | If provider=`google` | - | GCP project ID |
| `VERTEX_LOCATION` | No | `us-central1` | Vertex AI region |
| `GROQ_API_KEY` | If provider=`groq` | - | Groq API key |
| `MCP_SERVER_URL` | Docker only | - | SSE URL for MCP server (e.g., `http://mcp-server:8001/sse`) |
| `MCP_TRANSPORT` | Docker only | `stdio` | MCP server transport: `stdio` or `sse` |
| `API_URL` | Docker only | `http://localhost:8000` | Backend URL for frontend proxy |
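The provider variables interact: `LLM_PROVIDER` selects the primary, and the credentials determine whether a fallback is possible. The sketch below is an assumption about that selection logic, not the repo's `llm_provider.py`.

```python
def pick_provider(env: dict) -> str:
    """Choose an LLM provider from env vars, falling back to Groq when possible."""
    provider = env.get("LLM_PROVIDER", "google")  # default per the table above
    if provider == "google" and not env.get("VERTEX_PROJECT"):
        if env.get("GROQ_API_KEY"):
            return "groq"  # fall back when Vertex isn't configured
        raise RuntimeError("google provider needs VERTEX_PROJECT")
    return provider

print(pick_provider({"GROQ_API_KEY": "gsk_dummy"}))  # groq
```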
| Method | Path | Description |
|---|---|---|
| `POST` | `/api/chat` | Send a message, get a complete response |
| `GET` | `/api/chat/stream?message=...` | Stream a response via SSE |
| `GET` | `/api/health` | Health check |
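Since the streaming endpoint takes the message as a query parameter, it must be URL-encoded. A quick sketch of building that URL against the local default base:

```python
from urllib.parse import urlencode

base = "http://localhost:8000"
url = f"{base}/api/chat/stream?{urlencode({'message': 'Will it rain in Paris?'})}"
print(url)  # http://localhost:8000/api/chat/stream?message=Will+it+rain+in+Paris%3F
```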
```
weatherwise-agent/
├── agent-backend/          # FastAPI + LangGraph ReAct agent
│   ├── app.py              # API routes
│   ├── agent.py            # Agent orchestration and streaming
│   ├── mcp_client.py       # MCP client (stdio + SSE transport)
│   ├── llm_provider.py     # LLM configuration with fallback
│   ├── prompts.py          # System prompt
│   └── main.py             # Uvicorn entrypoint
├── mcp-server/             # FastMCP weather tools server
│   ├── server.py           # 5 MCP tools (geocode, weather, forecast, air quality, alerts)
│   ├── config.py           # OpenWeatherMap client configuration
│   └── schemas.py          # Pydantic response models
├── frontend/               # Next.js chat UI
│   └── src/app/page.tsx    # Chat interface with SSE streaming
├── Dockerfile.mcp          # MCP server image
├── Dockerfile.agent        # Agent backend image
├── Dockerfile.frontend     # Frontend multi-stage image
├── docker-compose.yml      # All 3 services with network isolation
├── deploy.sh               # GCP Cloud Run deployment script
└── DEPLOYMENT.md           # Deployment guide and troubleshooting
```
See ARCHITECTURE.md for design decisions and system overview.