Technology Stack
Status: Decided - React Native + Tauri + Rust hybrid; stack is locked (see decisions below)
This document tracks technology stack options and decisions for Zelara.
Based on architectural constraints, we need:
- Cross-platform - Same codebase for Desktop (Windows/Mac/Linux), Mobile (iOS/Android), Web
- Edge-first - Runs locally, no server dependency
- On-device AI/CV - For recycling image validation
- Device linking - Cross-device communication for task offloading
- Conditional builds - Ability to include/exclude modules at build time
- Git submodule integration - Build system must work with dynamic submodules
Option 1: Flutter
Pros:
- True cross-platform (single codebase for all platforms)
- Compiled to native binaries (good performance)
- Built-in device communication patterns
- Good support for on-device ML (TensorFlow Lite integration)
- Hot reload for fast development
Cons:
- Dart language (less familiar, smaller ecosystem than JS/TS)
- Larger app size
- Web support less mature than mobile/desktop
Fit for Zelara:
- ✅ Cross-platform
- ✅ Edge-first (compiles to native)
- ✅ On-device AI (TFLite plugins)
- ✅ Device linking (isolates for task offloading, platform channels)
- ⚠️ Conditional builds (possible but less common pattern)
Option 2: React Native + Tauri
Pros:
- TypeScript/JavaScript (familiar, huge ecosystem)
- React Native for mobile (mature, widely used)
- Tauri for desktop (lightweight, Rust-powered, secure)
- Tauri can bundle web version too
- Rust backend for heavy processing (perfect for CV, calculations)
- Smaller app size than Electron
Cons:
- Two separate frameworks (RN for mobile, Tauri for desktop/web)
- Complexity in sharing code between RN and Tauri
- More moving parts
Fit for Zelara:
- ⚠️ Cross-platform (RN + Tauri = 2 frameworks, but both use JS/TS for UI)
- ✅ Edge-first (Tauri compiles to native with Rust backend)
- ✅ On-device AI (Rust can run ONNX/TensorFlow models efficiently)
- ✅ Device linking (Rust backend perfect for task offloading)
- ✅ Conditional builds (Tauri has modular plugin system)
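Tauri's conditional-build story rests on its modular plugin pattern: each feature is a self-contained plugin registered at startup, so modules can be included or excluded per build. A minimal sketch of that registration pattern in plain Rust (the `Plugin` trait and the plugin names here are illustrative, not Tauri's actual API):

```rust
// Illustrative plugin registry, loosely modeled on Tauri's plugin
// pattern. Trait and plugin names are hypothetical.
trait Plugin {
    fn name(&self) -> &'static str;
    fn init(&self) -> Result<(), String>;
}

struct RecyclingPlugin;
impl Plugin for RecyclingPlugin {
    fn name(&self) -> &'static str { "recycling" }
    fn init(&self) -> Result<(), String> { Ok(()) }
}

struct FinancePlugin;
impl Plugin for FinancePlugin {
    fn name(&self) -> &'static str { "finance" }
    fn init(&self) -> Result<(), String> { Ok(()) }
}

// Build the plugin list; a real build would gate each entry behind
// `#[cfg(feature = "...")]` so excluded modules never compile in.
fn build_plugins() -> Vec<Box<dyn Plugin>> {
    vec![Box::new(RecyclingPlugin), Box::new(FinancePlugin)]
}

fn main() {
    for p in build_plugins() {
        p.init().expect("plugin init failed");
        println!("registered: {}", p.name());
    }
}
```

The point of the sketch is that module inclusion is a compile-time list, which is exactly what the dynamic-submodule requirement needs.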
Option 3: React Native + Electron
Pros:
- TypeScript/JavaScript everywhere
- React Native for mobile (mature)
- Electron for desktop (very mature, huge ecosystem)
- Shared React codebase for UI
- Lots of examples and libraries
Cons:
- Electron apps are heavy (bundles Chromium)
- Poor performance compared to native
- Not ideal for edge/offline computing
- Large app size
Fit for Zelara:
- ⚠️ Cross-platform (Electron for desktop, RN for mobile, separate web build)
- ⚠️ Edge-first (Electron heavy, not optimized for edge computing)
- ⚠️ On-device AI (possible but slower than native)
- ⚠️ Device linking (possible but not ideal architecture)
- ⚠️ Conditional builds (possible but Electron not designed for modularity)
Verdict: Not ideal for Zelara's edge-first principle.
Language Option: TypeScript
Pros:
- Familiar to most developers
- Huge ecosystem (npm)
- Works with React Native, Tauri, Electron
- Good tooling, type safety
- AI assistants very familiar with TS
Cons:
- Not as performant as Rust for heavy compute
- Requires runtime (Node.js/Deno/Bun)
Fit for Zelara:
- ✅ UI layer (React/React Native)
- ⚠️ Heavy processing (CV, calculations) - not ideal
- ✅ Business logic (state management, skill tree)
Language Option: Rust
Pros:
- Extremely fast (compiled, zero-cost abstractions)
- Memory safe (no garbage collection overhead)
- Perfect for on-device AI/CV
- Works great with Tauri
- Growing ecosystem
Cons:
- Steep learning curve
- Slower development velocity
- Smaller ecosystem than JS/TS
- Less familiar to AI assistants (but still capable)
Fit for Zelara:
- ⚠️ UI layer (possible but less common)
- ✅ Heavy processing (CV, calculations) - perfect
- ✅ Backend logic (device linking, build system)
Language Option: Dart
Pros:
- Designed for Flutter
- Good performance
- Type-safe
- Hot reload
Cons:
- Only useful with Flutter
- Smaller ecosystem than JS/TS or Rust
- Less familiar
Fit for Zelara:
- ✅ Everything (if using Flutter)
- ❌ Not applicable (if not using Flutter)
Proposal:
- Mobile: React Native (TypeScript for UI)
- Desktop/Web: Tauri (TypeScript for UI, Rust for backend)
- Shared logic: TypeScript business logic (state, skill tree, API)
- Heavy processing: Rust (CV, calculations, device linking backend)
Architecture:
┌───────────────────────────────────────┐
│          TypeScript (Shared)          │
│  - Business logic                     │
│  - State management                   │
│  - Skill tree engine                  │
└───────────────────────────────────────┘
         │                     │
         ▼                     ▼
┌──────────────────┐  ┌──────────────────┐
│   React Native   │  │      Tauri       │
│     (Mobile)     │  │  (Desktop/Web)   │
│ - TypeScript UI  │  │ - TypeScript UI  │
│                  │  │ - Rust Backend   │
└──────────────────┘  └──────────────────┘
         │                     │
         └──────────┬──────────┘
                    ▼
          ┌─────────────────┐
          │   Rust (Heavy)  │
          │ - CV/AI         │
          │ - Calculations  │
          │ - Device link   │
          └─────────────────┘
Benefits:
- Leverage TypeScript for rapid development (UI, business logic)
- Leverage Rust for performance-critical tasks (CV, compute)
- Code reuse across platforms (shared TS logic)
- Best tool for each job
Challenges:
- Complexity of 2 frameworks + 2 languages
- Bridge between TS and Rust (FFI, platform channels)
- Build system more complex
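The TS↔Rust bridge challenge is concrete: in Tauri, the TypeScript UI calls Rust through commands serialized over IPC. A sketch of the Rust side (the `#[tauri::command]` attribute is shown as a comment so the sketch compiles standalone; the function name and validation logic are hypothetical):

```rust
// Sketch of a Rust function exposed to the TypeScript UI. In a real
// Tauri app this would carry the `#[tauri::command]` attribute and be
// registered with the app builder; here it is a plain function.

// #[tauri::command]
fn validate_recycling_photo(image_bytes: Vec<u8>) -> Result<String, String> {
    // Hypothetical check: the real implementation would decode the
    // image and run the ONNX classifier over it.
    if image_bytes.is_empty() {
        return Err("empty image".to_string());
    }
    Ok(format!("accepted {} bytes", image_bytes.len()))
}

fn main() {
    // The TypeScript side would call this roughly as:
    //   const result = await invoke("validate_recycling_photo", { imageBytes });
    let result = validate_recycling_photo(vec![0u8; 1024]);
    println!("{:?}", result);
}
```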
On-Device AI Option: TensorFlow Lite
Pros:
- Designed for mobile/edge
- Good model compression
- Works on all platforms
- Flutter plugins available
- React Native support via community packages
Cons:
- Focused on inference (not training)
- Limited model architectures
Fit for Zelara: ✅ Good for recycling image validation
On-Device AI Option: ONNX Runtime
Pros:
- Cross-platform (Windows, Mac, Linux, iOS, Android, Web)
- Framework-agnostic (convert from PyTorch, TensorFlow, etc.)
- Good performance
- Rust bindings available
Cons:
- Larger than TFLite
- More complex setup
Fit for Zelara: ✅ Flexible, works with Rust backend
On-Device AI Option: Platform-Native ML
Pros:
- Native platform integration
- Best performance on respective platforms
- Platform-optimized
Cons:
- Platform-specific (need separate models)
- More development overhead
Fit for Zelara: ⚠️ Best raw performance, but maintaining separate models per platform conflicts with the cross-platform requirement
On-Device AI Option: Custom Lightweight Models
Pros:
- Smallest size
- Fastest inference
- Full control
Cons:
- Requires ML expertise
- Need to train and maintain models
- May not match pre-trained model accuracy
Fit for Zelara: ⚠️ Maximum control, but the ML expertise and maintenance burden are hard to justify over pre-trained models
Rationale:
- TypeScript for UI and business logic (fast development, familiar to AI)
- Rust for heavy processing (CV, device linking, calculations)
- React Native for mobile (mature, widely used)
- Tauri for desktop/web (lightweight, Rust-powered, secure)
- ONNX Runtime for on-device AI (cross-platform, works with Rust)
Trade-offs:
- More complex than single-framework solution (Flutter)
- Better performance and flexibility than Electron
- Leverages strengths of each technology
Framework Decision: React Native + Tauri + Rust (Hybrid)
Rationale:
- React Native for mobile (mature, widely used, TypeScript)
- Tauri for desktop (lightweight, Rust-powered, secure)
- Best balance of performance, code reuse, and edge-first principles
- Leverages TypeScript for rapid UI/business logic development
- Leverages Rust for high-performance CV and device linking
Language Strategy: TypeScript + Rust Hybrid
- TypeScript: UI components, business logic, state management, skill tree
- Rust: On-device CV/ML, complex calculations, device linking backend
- Maximizes development speed (TS) while optimizing performance-critical paths (Rust)
On-Device AI: ONNX Runtime with Rust bindings
- Cross-platform (Desktop + Mobile)
- Framework-agnostic (can use PyTorch/TensorFlow models)
- Good performance with Rust
- Supports ML-based recycling validation (bag classification, content detection)
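Whichever runtime executes the model, the Rust side still has to turn raw logits into a decision. A sketch of that post-processing step for bag classification (the class labels and confidence threshold are illustrative, not decided values):

```rust
// Softmax over raw model logits, then accept the top class only if its
// probability clears a threshold. Labels and threshold are illustrative.
fn softmax(logits: &[f32]) -> Vec<f32> {
    let max = logits.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let exps: Vec<f32> = logits.iter().map(|&x| (x - max).exp()).collect();
    let sum: f32 = exps.iter().sum();
    exps.iter().map(|&e| e / sum).collect()
}

fn classify(
    logits: &[f32],
    labels: &[&'static str],
    threshold: f32,
) -> Option<(&'static str, f32)> {
    let probs = softmax(logits);
    let (idx, &p) = probs
        .iter()
        .enumerate()
        .max_by(|a, b| a.1.partial_cmp(b.1).unwrap())?;
    if p >= threshold { Some((labels[idx], p)) } else { None }
}

fn main() {
    let labels = ["clear_bag", "opaque_bag", "not_a_bag"];
    // Hypothetical logits from the ONNX session for one photo.
    let logits = [2.1f32, 0.3, -1.0];
    match classify(&logits, &labels, 0.6) {
        Some((label, p)) => println!("{} ({:.2})", label, p),
        None => println!("low confidence, ask user to retake the photo"),
    }
}
```

Rejecting low-confidence results and asking for a retake keeps validation honest without a server round-trip.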
MVP Platform: Mobile + Desktop (both required)
- Mobile: Primary UI, camera for recycling photos, daily tasks
- Desktop: Processing hub for CV/ML tasks, device linking server
- Proves device linking from day 1
Code Sharing Strategy: Duplicate UI components per app
- Each app (mobile, desktop) maintains own components
- Shared business logic via TypeScript packages in core
- Simpler dependency management, faster iteration
Alternatives Considered:
- Flutter: Simpler single framework but Dart language less familiar, conditional builds more complex
- Electron: Too heavy for edge-first principle, poor performance
- Pure TypeScript: Not performant enough for on-device CV/ML
Implementation Impact:
- Need Rust environment setup for desktop app development
- Need React Native environment for mobile app development
- Shared TypeScript packages in core for business logic
- ONNX models need conversion/optimization for edge deployment
With Zelara Desktop being redesigned as "Zelara Core" (the local AI hub), the tech stack is extended with AI model management and inference capabilities.
Model inference (ONNX): ONNX Runtime via Rust bindings; already in use for recycling CV.
Formalised into ai/ module structure in v0.6.0 for Finance OCR and categorization.
Model management: Models stored in OS app data dir (%APPDATA%\Zelara\models\ on Windows,
~/.zelara/models/ elsewhere). Download-on-demand, cached across sessions. model_manager.rs
handles download streaming, progress events, and in-memory session caching.
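The platform-specific cache directory described above can be resolved with the standard library alone. A sketch of that resolution (env-var handling simplified; a real model_manager.rs would also create the directory and handle permissions):

```rust
use std::env;
use std::path::PathBuf;

// Resolve the Zelara model cache directory: %APPDATA%\Zelara\models\ on
// Windows, ~/.zelara/models/ elsewhere. Falls back to the current
// directory if the relevant environment variable is missing.
fn models_dir() -> PathBuf {
    if cfg!(windows) {
        let appdata = env::var("APPDATA").unwrap_or_else(|_| ".".to_string());
        PathBuf::from(appdata).join("Zelara").join("models")
    } else {
        let home = env::var("HOME").unwrap_or_else(|_| ".".to_string());
        PathBuf::from(home).join(".zelara").join("models")
    }
}

fn main() {
    // model_manager.rs would stream downloads into this directory and
    // cache loaded sessions in memory across requests.
    println!("model cache: {}", models_dir().display());
}
```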
Local LLM (optional): llama-cpp-rs crate (Rust bindings for llama.cpp) with GGUF-format
models (Mistral 7B recommended). Feature-flagged (llm Cargo feature) so it's opt-in and
doesn't bloat the default build. Deferred to v0.7.0.
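The opt-in mechanism is an ordinary Cargo feature gate. A sketch of how the llm feature keeps llama.cpp out of the default build (function bodies are placeholders standing in for llama-cpp-rs calls):

```rust
// With the `llm` Cargo feature enabled, this branch would wrap
// llama-cpp-rs and load a GGUF model (e.g. Mistral 7B). The placeholder
// body lets the sketch compile without that dependency.
#[cfg(feature = "llm")]
fn complete(prompt: &str) -> String {
    format!("llm completion for: {}", prompt)
}

// Default build: the LLM code path is compiled out entirely, so callers
// get a clear error instead of a silently bloated binary.
#[cfg(not(feature = "llm"))]
fn complete(_prompt: &str) -> String {
    "local LLM not included in this build".to_string()
}

fn main() {
    println!("{}", complete("Summarize this month's spending"));
}
```

In Cargo.toml the feature would also pull in the optional dependency, so a default `cargo build` never compiles llama.cpp at all.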
Model formats:
- ONNX: classification and NLP models (PaddleOCR, DistilBERT, MobileNet)
- GGUF: local LLMs (Mistral 7B, Llama variants)
Dispatch protocol: Generic ai_task / ai_task_result WSS message pair.
Rust dispatcher.rs routes by capability string to the appropriate handler.
cv_processor.rs (recycling CV) is the existing prototype for this pattern.
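The capability-string routing in dispatcher.rs can be sketched as a plain handler map (the capability names and handler signature are illustrative; the real dispatcher routes deserialized ai_task messages arriving over WSS and replies with ai_task_result):

```rust
use std::collections::HashMap;

// Handlers take the task payload and return the ai_task_result body.
type Handler = fn(&str) -> String;

fn recycling_cv(payload: &str) -> String {
    format!("recycling_cv result for {}", payload)
}

fn finance_ocr(payload: &str) -> String {
    format!("finance_ocr result for {}", payload)
}

// Route an ai_task by its capability string, mirroring dispatcher.rs;
// unknown capabilities become an error result instead of a panic.
fn dispatch(
    capability: &str,
    payload: &str,
    handlers: &HashMap<&str, Handler>,
) -> Result<String, String> {
    handlers
        .get(capability)
        .map(|h| h(payload))
        .ok_or_else(|| format!("unknown capability: {}", capability))
}

fn main() {
    let mut handlers: HashMap<&str, Handler> = HashMap::new();
    handlers.insert("recycling_cv", recycling_cv);
    handlers.insert("finance_ocr", finance_ocr);

    // An ai_task arrives with a capability string and a payload; the
    // matching handler produces the ai_task_result to send back.
    println!("{:?}", dispatch("recycling_cv", "photo_123", &handlers));
    println!("{:?}", dispatch("unknown", "x", &handlers));
}
```

New capabilities then register a handler rather than touching the routing code, which is why cv_processor.rs generalizes cleanly to Finance OCR.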
Status: cv_processor.rs is the working prototype. ai/ module structure and generic
dispatch formalised in v0.6.0 alongside Finance OCR and categorization.
Technology stack is now locked. Implementation can proceed.