feat: OpenClaw integration — secure LM provider management #720
GameWorldsDEV wants to merge 3 commits into steipete:main
Conversation
- WebSocket RPC client for authenticated config injection
- Keychain-based device pairing with gateway token
- Multi-provider LM config exporter (Codex, Ollama, Claude, Gemini)
- LM Hub UI with fallback editor and Ollama control panel
- New providers: Ollama Cloud, Ollama LAN, HTTPS LM
- Setup guide, client connection guide, third-party dev guide

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: cd0c16cb36
```swift
let export = exporter.export(
    ollamaResults: ollamaResults,
    codexAccounts: codexAccounts,
    codexbarVersion: version)
```
Pass reordered fallback chain into OpenClaw export
The inject path builds the export without a fallbackOrder, so the order the user edits in the pane is never applied during injection. In this same file, the UI allows reordering and even reports "press Inject to apply," yet this call always falls back to exporter defaults or other on-disk sources instead of self.fallbackProviders, so the injected fallback chain can diverge from what the user just saved.
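A minimal sketch of the suggested fix: thread the pane's edited chain into the export call. The `fallbackOrder` parameter name and `self.fallbackProviders` property are taken from the review comment; the actual signature in OpenClawExporter.swift may differ.

```swift
// Hypothetical sketch — pass the user's reordered chain instead of letting
// the exporter fall back to defaults/other disk sources.
let export = exporter.export(
    ollamaResults: ollamaResults,
    codexAccounts: codexAccounts,
    codexbarVersion: version,
    fallbackOrder: self.fallbackProviders)  // previously omitted
```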
```swift
return try await withCheckedThrowingContinuation { continuation in
    Task {
        await state.register(id: requestId, continuation: continuation)
```
Enforce per-RPC timeout for WebSocket requests
sendRPC registers a continuation and waits for receiveLoop to complete it, but there is no timeout path that resumes the continuation if the gateway never replies. timeoutSeconds is only applied to the URLSession request configuration and does not bound WebSocket message waits, so a stuck or partial gateway response can leave Inject hanging indefinitely.
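One common way to bound such a wait, and the approach the follow-up commit describes, is to race the RPC against a sleep in a throwing task group. This is a self-contained sketch, not the PR's actual code: `perform` stands in for the register-continuation-and-wait path in OpenClawGatewayClient.swift, and `RPCTimeoutError` is a hypothetical error type.

```swift
struct RPCTimeoutError: Error {}

// Race the RPC reply against a timeout; whichever child finishes first wins,
// and the loser is cancelled. A gateway that never replies now surfaces
// RPCTimeoutError instead of hanging the caller forever.
func withRPCTimeout<T: Sendable>(
    seconds: Double,
    perform: @escaping @Sendable () async throws -> T
) async throws -> T {
    try await withThrowingTaskGroup(of: T.self) { group in
        group.addTask { try await perform() }
        group.addTask {
            try await Task.sleep(nanoseconds: UInt64(seconds * 1_000_000_000))
            throw RPCTimeoutError()
        }
        let result = try await group.next()!  // first child to complete
        group.cancelAll()
        return result
    }
}
```

Note the continuation-based wait must also tolerate cancellation, so that the cancelled child does not leak a registered continuation in `state`.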
```swift
var p: [String: Any] = [
    "provider": profile.provider,
]
let modeValue: String = profile.mode ?? profile.type
p["mode"] = modeValue
```
Include auth profile keys when building config.patch
Patch construction drops profile.key and writes only provider/mode for each auth profile. As a result, any new api_key profile generated by the exporter (for example, Claude or Gemini keys) is patched without its credential, and existing keys cannot be updated via config.patch either, breaking provider auth sync through this injection path.
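A sketch of the suggested fix, with a stand-in `AuthProfile` type (the real type lives in the PR's exporter code and may carry more fields):

```swift
struct AuthProfile {
    let provider: String
    let type: String
    var mode: String?
    var key: String?   // the credential the current patch construction drops
}

// Build one config.patch entry per auth profile, including the key when
// present so api_key profiles (e.g. Claude/Gemini) arrive with credentials.
func patchEntry(for profile: AuthProfile) -> [String: Any] {
    var p: [String: Any] = ["provider": profile.provider]
    p["mode"] = profile.mode ?? profile.type
    if let key = profile.key {
        p["key"] = key
    }
    return p
}
```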
- P1: pass user's fallback order from UI to export (was using defaults)
- P1: include auth profile keys in config.patch payload
- P2: add per-RPC timeout using TaskGroup (prevents indefinite hang)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Summary
Adds OpenClaw integration to CodexBar, enabling it as a secure external LM provider manager.
Features
Security
Files Added
- OpenClawGatewayClient.swift — WebSocket RPC client
- OpenClawPairing.swift — Keychain device pairing
- OpenClawExporter.swift — Multi-provider config exporter
- PreferencesLMManagementPane.swift — LM Hub UI (modified)

AI Disclosure
This PR was developed with AI assistance (Claude). All changes were manually tested.