Modulus-AI is a multi-provider chat platform that offers a premium, customizable user experience. It integrates seamlessly with various AI models, letting users interact with multiple AI providers through a single interface. Advanced tool use, such as web search and wiki search, is supported, along with customizable RAG pipelines that extend the capabilities of AI interactions.
- Multi-provider chat interface (Ollama, OpenRouter, Gemini)
- Premium user interface
- Wide set of tools
- Backend: Haskell, Servant, PostgreSQL
- Frontend: React, TypeScript, Vite, TailwindCSS
- CI/CD: GitHub Actions
- Code Quality: Pre-commit hooks, HLint, fourmolu
- Deployment: GCP
- Model Serving: Ollama, OpenRouter, Gemini
- Email Service: Mailgun
- Docker
- Docker Compose
- Stack
- Pre-commit
- Mailgun API
- Ollama (optional, for local model serving)
```shell
git clone git@github.com:tusharad/modulus-ai.git
cd modulus-ai
cd modulus-ai-be
source export-env.sh
```

This will add the necessary environment variables to your shell session from the `.env.local` file. You must create this file based on the `.env.example` file provided in the repository.
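The contents of `export-env.sh` are not reproduced here; a minimal sketch of what such a script typically does is shown below (the `set -a` approach and the example variable are assumptions, not the repository's actual script):

```shell
# Sketch of an env-loading script (assumed implementation, not the
# repository's actual export-env.sh).
# Create an example .env.local for demonstration purposes only:
printf 'DB_URL=postgres://localhost/modulus\n' > .env.local

# `set -a` auto-exports every variable assigned while it is active,
# so plain KEY=VALUE lines in .env.local become environment variables.
set -a
. ./.env.local
set +a

echo "$DB_URL"   # → postgres://localhost/modulus
```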
```shell
make up
```

This command starts the Docker containers defined in `docker-compose.yml`, setting up the services the application needs, including the PostgreSQL database.
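The repository's `docker-compose.yml` is not reproduced here; a minimal sketch of a Compose file that brings up a PostgreSQL service might look like the following (the image version, credentials, and port mapping are illustrative assumptions):

```yaml
services:
  postgres:
    image: postgres:16            # illustrative version, not the repo's pin
    environment:
      POSTGRES_USER: modulus      # assumed credentials for local dev only
      POSTGRES_PASSWORD: modulus
      POSTGRES_DB: modulus
    ports:
      - "5432:5432"               # expose Postgres to the host
```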
```shell
stack run
```

This command compiles the project and starts the server using Stack, the Haskell project build tool.
The application should now be running and accessible at http://localhost:8081.
To run the backend test suite:

```shell
./scripts/run-tests.sh
```

To start the frontend development server:

```shell
cd modulus-ai-fe
npm install
npm run dev
```

This project is licensed under the MIT License - see the LICENSE file for details.
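During development the Vite dev server and the Haskell backend run on different ports, so API requests are commonly proxied. A sketch of such a `vite.config.ts` is shown below (the `/api` prefix is an assumption, not necessarily the repository's actual config; port 8081 matches the backend address above):

```typescript
import { defineConfig } from "vite";

// Illustrative dev-server config: forward /api requests from the Vite
// dev server to the Haskell backend on port 8081.
export default defineConfig({
  server: {
    proxy: {
      "/api": "http://localhost:8081",
    },
  },
});
```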
Contributions are welcome!

