
# 🏛️ uEnroll

A modern, open-source schedule builder for uOttawa.

Check it out here.

## Local Setup

> [!NOTE]
> You will need bun (v1.3.x) and Docker (latest) installed. Optional: install turbo and wrangler globally via bun.

### Initialize

Clone the repo:

```sh
git clone https://github.com/g-raman/uenroll.git
```

Install dependencies:

```sh
cd uenroll
bun install
```

Build dependencies:

```sh
bun run build
```

### Setup Database

Run a Postgres instance locally:

> [!WARNING]
> Make sure the Docker engine is running.

```sh
bun run db:up
```

Seed the database:

```sh
bun run db:seed
```

Running this command populates the database with seed data. See the Populate with real data section if you want more/better data, or alternatively the Request seed file section.

### Setup webapp

```sh
bun run dev:web
```

## Advanced Setup

### Database viewer

You can use the Drizzle client (recommended) to view the database and run SQL queries. You can also use any PostgreSQL-compatible database client of your choice (Beekeeper Studio, DBeaver, pgAdmin, etc.).

```sh
bun run db:studio
```

You can now open http://local.drizzle.studio in your web browser of choice.

> [!CAUTION]
> For Safari/Brave/macOS users, browser access to localhost is denied by default. You need to create a self-signed certificate for Drizzle Studio to work. For Brave, you also need to disable Shields for the site.

```sh
brew install mkcert
mkcert -install
```

Restart your studio and you should be able to view it now.

### Populate with real data

> [!CAUTION]
> Dev mode for Cloudflare Workflows is a little janky, and you will see a lot of errors on screen. It's fine to ignore these; the scraper should pick up most of the course catalogue. There are some issues around concurrency, so some courses will be missing. Refer to the Request seed file section for accurate data.

If you'd like to see actual data from the uOttawa public course registry, you can run the scraper against your local database instance.

Run the following command (assuming the database is running):

```sh
bun run dev:scraper
```

The default port for triggering cron jobs is 8787. If there are no port conflicts, run the following command to trigger the scraper:

```sh
curl http://localhost:8787/cdn-cgi/handler/scheduled
```

Depending on network bottlenecks, scraping should finish within 5-10 minutes.

> [!CAUTION]
> The Postgres instance running inside Docker doesn't persist data to your disk. If you want to keep the real data you scrape, modify the compose.yml file to persist data to disk so you don't have to keep re-running the scraper.
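As a rough sketch of that compose.yml change (the service name, image, and credentials here are illustrative assumptions; match them to the actual file in the repo), mounting a named volume at Postgres's data directory makes the data survive container restarts:

```yaml
# Hypothetical excerpt of compose.yml; adapt names to the real file.
services:
  postgres:
    image: postgres:latest
    environment:
      POSTGRES_PASSWORD: postgres
    ports:
      - "5432:5432"
    volumes:
      # Named volume: data persists across `docker compose down`/`up`
      - pgdata:/var/lib/postgresql/data

volumes:
  pgdata:
```

A named volume (as opposed to a bind mount) lets Docker manage the storage location, and `docker volume rm` can reset the database when you want a clean slate.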

### Request seed file

If you don't want to wait for the scraper to run, you can request the seed.sql file from me. Send me an e-mail.

Install PostgreSQL (version 18) via your package manager or from the website. We'll use the psql utility to dump the contents of the file into the local database:

```sh
psql 'postgresql://postgres:postgres@localhost:5432/postgres' < seed.sql
```
