
Convert to Web AI to run the model locally in the browser, removing the need for a Python backend #1

@jasonmayes

Description


Web AI lets you run many models directly in the web browser via WebGPU, including LLMs such as the multimodal Gemma 3n, in real time on video feeds. I was wondering whether you could also offer your demo in pure Web AI form, running the model client side via either Google's LiteRT.js or Microsoft's ONNX Runtime Web?
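For reference, a minimal sketch of what client-side inference could look like with ONNX Runtime Web's WebGPU execution provider. The model path and input shape here are placeholders, not taken from this repo; the exported model's actual input spec would be used instead:

```typescript
import * as ort from 'onnxruntime-web/webgpu';

// Load an ONNX model and run inference entirely in the browser on WebGPU,
// with no server round trip. 'model.onnx' and the [1, 3, 224, 224] input
// shape are hypothetical placeholders for whatever the repo's model exports.
async function runInBrowser() {
  const session = await ort.InferenceSession.create('model.onnx', {
    executionProviders: ['webgpu'],
  });

  const input = new ort.Tensor(
    'float32',
    new Float32Array(1 * 3 * 224 * 224),
    [1, 3, 224, 224],
  );

  // Input/output names come from the exported graph; session.inputNames
  // and session.outputNames list them at runtime.
  const results = await session.run({ [session.inputNames[0]]: input });
  return results;
}
```

LiteRT.js would be the equivalent path for a TFLite-format export; either way the model weights are fetched once and cached by the browser, so subsequent runs are fully offline.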
