English | 中文 | 日本語 | 한국어 | Español | Français | Deutsch
Expose your Linkly AI Desktop local MCP (Model Context Protocol) server to the internet via Cloudflare Workers.
This worker acts as a reverse proxy, allowing remote MCP clients (ChatGPT, Claude, and others) to access the MCP server running locally on your desktop through a secure WebSocket tunnel.
```
Remote MCP Client (ChatGPT/Claude...)
        ↓ HTTPS POST /mcp
Your Cloudflare Worker
        ↓ WebSocket (Hibernatable)
Durable Object (Tunnel Manager)
        ↑ WebSocket (long-lived connection)
Linkly AI Desktop App
        ↓ HTTP
Local MCP Server (127.0.0.1:60606/mcp)
```
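The key step in the diagram above is matching each remote client's HTTP request to the reply that eventually comes back over the single tunnel WebSocket. A minimal sketch of that correlation, using a per-request id (all names here are illustrative, not this project's actual API):

```typescript
// Sketch of request/response correlation over the tunnel: each POST to
// /mcp is wrapped with an id, sent down the WebSocket, and resolved when
// the desktop replies with the same id. Illustrative only.
type TunnelReply = { id: number; body: string };

class TunnelProxy {
  private nextId = 0;
  private pending = new Map<number, (body: string) => void>();

  // `send` stands in for websocket.send() toward the desktop app.
  constructor(private send: (frame: string) => void) {}

  // Called for each remote client POST to /mcp.
  forward(mcpRequestBody: string): Promise<string> {
    const id = this.nextId++;
    const promise = new Promise<string>((resolve) =>
      this.pending.set(id, resolve),
    );
    this.send(JSON.stringify({ id, body: mcpRequestBody }));
    return promise;
  }

  // Called when a frame arrives back from the desktop over the tunnel.
  onTunnelMessage(frame: string): void {
    const reply = JSON.parse(frame) as TunnelReply;
    const resolve = this.pending.get(reply.id);
    if (resolve) {
      this.pending.delete(reply.id);
      resolve(reply.body);
    }
  }
}
```

Because replies carry the request id, multiple in-flight MCP calls can share one WebSocket without getting mixed up.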
1. Clone this repository:

   ```bash
   git clone https://github.com/LinklyAI/linkly-ai-mcp-remote-worker.git
   cd linkly-ai-mcp-remote-worker
   ```

2. Install dependencies:

   ```bash
   pnpm install
   ```

3. Log in to Cloudflare:

   ```bash
   npx wrangler login
   ```

4. Deploy the worker:

   ```bash
   pnpm run deploy
   ```

5. Note down your worker URL (e.g., `https://linkly-ai-mcp-remote.<your-account>.workers.dev`).
- Open Linkly AI Desktop
- Go to Settings → MCP tab
- Enter your worker URL (without the `https://` prefix)
- Click Test to verify the connection
- Click Save to save the configuration
- Enable the tunnel switch on the MCP page
Once the tunnel is connected, configure your MCP client to use the remote endpoint:
Claude Desktop (`claude_desktop_config.json`):

```json
{
  "mcpServers": {
    "linkly-ai": {
      "url": "https://linkly-ai-mcp-remote.<your-account>.workers.dev/mcp"
    }
  }
}
```

Cursor (`.cursor/mcp.json`):

```json
{
  "mcpServers": {
    "linkly-ai": {
      "url": "https://linkly-ai-mcp-remote.<your-account>.workers.dev/mcp"
    }
  }
}
```

| Endpoint | Method | Description |
|---|---|---|
| `/` | GET | Worker info and available endpoints |
| `/health` | GET | Health check (shows tunnel status) |
| `/tunnel` | GET | WebSocket endpoint for the desktop tunnel |
| `/mcp` | POST | MCP endpoint for remote clients |
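For illustration, a remote client's call to `/mcp` is an ordinary JSON-RPC 2.0 POST, as defined by the Model Context Protocol. A minimal sketch of such a request (the worker URL is a placeholder for your own deployment):

```typescript
// Sketch of the JSON-RPC 2.0 message an MCP client POSTs to /mcp.
// The endpoint URL is a placeholder for your own deployed worker.
const endpoint =
  "https://linkly-ai-mcp-remote.<your-account>.workers.dev/mcp";

const listToolsRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/list",
  params: {},
};

// To send it, a client would do something like:
// fetch(endpoint, {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(listToolsRequest),
// });
```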
```bash
pnpm run dev
```

This starts a local development server at `http://localhost:8787`.

To regenerate the Cloudflare type bindings or run the test suite:

```bash
pnpm run cf-typegen
pnpm test
```

- Desktop Connection: Your Linkly AI Desktop app connects to the worker via WebSocket at `/tunnel`
- Hibernation: The Durable Object can hibernate during idle periods, reducing costs while keeping the WebSocket alive
- MCP Proxy: When a remote MCP client sends a request to `/mcp`, the worker forwards it through the WebSocket to your desktop
- Local Forwarding: Your desktop app receives the request and forwards it to your local MCP server
- Response Path: The response travels back through the same path
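The Local Forwarding step above can be sketched as follows. The local URL matches the diagram; the function name and the injected `fetchFn` parameter (used so the sketch runs without a live server) are assumptions, not this project's actual code:

```typescript
// Sketch of the desktop-side forwarding step: a frame arrives over the
// tunnel WebSocket, is relayed to the local MCP server, and the reply is
// sent back with the same correlation id. Illustrative only.
const LOCAL_MCP = "http://127.0.0.1:60606/mcp";

type Fetcher = (
  url: string,
  init: { method: string; headers: Record<string, string>; body: string },
) => Promise<{ text(): Promise<string> }>;

async function handleTunnelFrame(
  frame: string,
  send: (reply: string) => void, // stands in for websocket.send()
  fetchFn: Fetcher, // injected so the sketch is testable without a server
): Promise<void> {
  const { id, body } = JSON.parse(frame) as { id: number; body: string };
  const res = await fetchFn(LOCAL_MCP, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body,
  });
  // Echo the correlation id so the worker can match the response.
  send(JSON.stringify({ id, body: await res.text() }));
}
```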
- No Authentication: The current version has no authentication; the secrecy of your worker URL is the only access control.
- Self-Deployed: Each user deploys their own worker instance, providing isolation.
- HTTPS Only: All communications use HTTPS/WSS encryption.
For production use, consider adding authentication via Cloudflare Access or API keys.
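If you do add API-key auth, the worker-side check can be as small as the following sketch. The header handling and the idea of storing the secret with `wrangler secret put` are assumptions about how you might wire it up, not part of this project:

```typescript
// Hypothetical API-key gate for the /mcp endpoint: compare a bearer
// token from the Authorization header against a secret configured on
// the worker (e.g. via `wrangler secret put`). Illustrative only.
function isAuthorized(
  headers: Map<string, string>, // lowercase header names -> values
  secret: string,
): boolean {
  const auth = headers.get("authorization") ?? "";
  return auth === `Bearer ${secret}`;
}
```

Unauthorized requests would then get a 401 before anything is forwarded through the tunnel.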
This worker uses Cloudflare Durable Objects with WebSocket Hibernation, which is cost-effective:
- Workers: Free tier includes 100,000 requests/day
- Durable Objects: Pay-as-you-go, but hibernation minimizes active time
- WebSocket: Auto ping/pong doesn't wake the Durable Object
Typical personal usage should stay within free tier limits.
MIT License - see LICENSE for details.
- Linkly AI Desktop - AI-powered knowledge management desktop app
- Model Context Protocol - Open protocol for AI model context
