⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⣤⣴⣶⣾⣿⣿⣿⣿⣷⣶⣦⣤⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀⠀⠀⣠⣴⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣦⣄⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀⣠⣾⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣷⣄⠀⠀⠀⠀
⠀⠀⢀⣾⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣷⡀⠀⠀
⠀⢠⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⡄⠀
⢠⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⡄
⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿
STRETCHING TCP SINCE 2025
"I'm gonna be King of the Networking!" — Monkey D. TCP
Network programming playground. The Grand Line where packets learn to fight.
| Crate | Status | Description |
|---|---|---|
| basic-tcp-proxy | Ready | Async TCP proxy with metrics |
| echo-server | Ready | Simple echo server for testing |
| load-tester | Ready | Benchmarking and load testing |
| load-balancer | WIP | Multi-backend load balancer |
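
At its core, a proxy like `basic-tcp-proxy` accepts a client connection, dials the backend, and copies bytes in both directions until either side hangs up. Below is a minimal sketch of that loop with Tokio (the addresses match the quick-start defaults further down; the real crate layers metrics and error handling on top, so treat this as an illustration rather than the crate's actual code):

```rust
use tokio::io::copy_bidirectional;
use tokio::net::{TcpListener, TcpStream};

#[tokio::main]
async fn main() -> std::io::Result<()> {
    // Listen where clients connect; forward to the backend
    // (quick-start defaults: localhost:3000 -> localhost:8081).
    let listener = TcpListener::bind("127.0.0.1:3000").await?;

    loop {
        let (mut client, peer) = listener.accept().await?;
        tokio::spawn(async move {
            // Dial the backend and shuttle bytes both ways until either side closes.
            match TcpStream::connect("127.0.0.1:8081").await {
                Ok(mut backend) => {
                    if let Err(e) = copy_bidirectional(&mut client, &mut backend).await {
                        eprintln!("proxy error for {peer}: {e}");
                    }
                }
                Err(e) => eprintln!("backend connect failed: {e}"),
            }
        });
    }
}
```

`copy_bidirectional` also propagates shutdowns across the two streams, which is why a proxy this small is still usable for benchmarking on loopback.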
# Run the proxy (forwards localhost:3000 → localhost:8081)
just proxy
# Run echo server for testing
just echo
# Run tests
just test
# Run all quality checks
just quality
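
The echo backend started by just echo only needs to read bytes and write them straight back. A sketch of that loop, assuming the same Tokio setup and the backend address used above (the actual echo-server crate may differ in address and buffering):

```rust
use tokio::io::{AsyncReadExt, AsyncWriteExt};
use tokio::net::TcpListener;

#[tokio::main]
async fn main() -> std::io::Result<()> {
    // Assumed backend address; match whatever the proxy forwards to.
    let listener = TcpListener::bind("127.0.0.1:8081").await?;

    loop {
        let (mut socket, _) = listener.accept().await?;
        tokio::spawn(async move {
            let mut buf = [0u8; 4096];
            loop {
                match socket.read(&mut buf).await {
                    Ok(0) => break, // peer closed the connection
                    Ok(n) => {
                        // Echo the bytes back verbatim.
                        if socket.write_all(&buf[..n]).await.is_err() {
                            break;
                        }
                    }
                    Err(_) => break,
                }
            }
        });
    }
}
```

Because the payload comes back verbatim, latency numbers are easy to interpret: every byte sent returns exactly once.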
The proxy sustains ~90k requests/sec with sub-millisecond latency.

| Metric | Value |
|---|---|
| Peak RPS | 91,599 |
| p50 Latency | 110μs |
| p99 Latency | 294μs |
| Throughput | 90 MB/s |
| Max Connections | 500+ |
See the basic-tcp-proxy README for the full benchmark matrix.
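
p50 and p99 above are latency percentiles over the recorded request round-trips. For reference, a nearest-rank percentile over raw samples can be computed like this (illustrative only; not necessarily how the load tester aggregates its numbers):

```rust
use std::time::Duration;

/// Nearest-rank percentile over latency samples; `p` is 0..=100 (50 for p50, 99 for p99).
fn percentile(samples: &mut [Duration], p: usize) -> Option<Duration> {
    if samples.is_empty() {
        return None;
    }
    samples.sort_unstable();
    // Nearest-rank: ceil(p/100 * n), converted to a zero-based, in-bounds index.
    let rank = (p * samples.len()).div_ceil(100);
    Some(samples[rank.saturating_sub(1).min(samples.len() - 1)])
}

fn main() {
    // 1µs..=1000µs of synthetic samples.
    let mut samples: Vec<Duration> = (1u64..=1000).map(Duration::from_micros).collect();
    println!("p50 = {:?}", percentile(&mut samples, 50)); // 500µs
    println!("p99 = {:?}", percentile(&mut samples, 99)); // 990µs
}
```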
Run the built-in load tester to benchmark the proxy:
# Terminal 1: Start echo server
just echo
# Terminal 2: Start proxy
just proxy
# Terminal 3: Run load test
just load-test

Configure test scenarios in load_test.toml:
target_addr = "127.0.0.1:3000"
[[scenarios]]
name = "baseline"
connections = 10
duration_secs = 5
message_size = 1024
[[scenarios]]
name = "stress-test"
connections = 500
duration_secs = 5
message_size = 1024
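
Each `[[scenarios]]` table maps naturally onto a struct. A hedged sketch of how this file could be deserialized with `serde` and the `toml` crate (field names mirror the config above; the load tester's real types and error handling may differ):

```rust
use serde::Deserialize;

#[derive(Debug, Deserialize)]
struct Config {
    /// Address the load tester connects to (the proxy's listen address).
    target_addr: String,
    /// Each [[scenarios]] table in load_test.toml becomes one entry here.
    scenarios: Vec<Scenario>,
}

#[derive(Debug, Deserialize)]
struct Scenario {
    name: String,
    connections: usize,
    duration_secs: u64,
    message_size: usize,
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let raw = std::fs::read_to_string("load_test.toml")?;
    let config: Config = toml::from_str(&raw)?;
    for s in &config.scenarios {
        println!(
            "{}: {} conns x {} B for {}s against {}",
            s.name, s.connections, s.message_size, s.duration_secs, config.target_addr
        );
    }
    Ok(())
}
```

With a shape like this, adding a scenario is just another [[scenarios]] block in the file.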
crates/
├── basic-tcp-proxy/ # Async TCP proxy with metrics
├── echo-server/ # Simple echo server for testing
├── load-balancer/ # (WIP)
└── load-tester/ # Benchmarking and load testing
just fmt # Format code
just clippy # Run lints
just test # Run tests
just quality # All checks
just fix # Auto-fix issues

Requirements:

- Rust 1.85+ (edition 2024)
- Tokio runtime
License: MIT