# Optimize Images, Save the Planet
Curious about how it works? Here are the major components:
- Cloudinary
- co2.js
- Next.js
- Puppeteer + Sparticuz Chromium (serverless page fetch for `/api/scrape`)
- Supabase
- Netlify (see Netlify deploy below)
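Since co2.js is one of the components above, here is a rough sketch of the kind of per-byte carbon estimate it produces. The constants below are simplified illustrative assumptions, not the library's exact model; the app itself uses the real co2.js.

```javascript
// Illustrative only: a rough per-byte CO2 estimate in the spirit of co2.js.
// Both constants are simplified assumptions, not the library's real figures.
const KWH_PER_GB = 0.81;         // assumed network + device energy intensity
const GRID_G_CO2_PER_KWH = 442;  // assumed average grid carbon intensity

function estimateGramsCO2(bytes) {
  const gb = bytes / 1e9;
  return gb * KWH_PER_GB * GRID_G_CO2_PER_KWH;
}
```

Shrinking an image from 2 MB to 200 KB cuts its estimated footprint by the same 90%, which is the core idea behind the app.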
The app uses Tailwind CSS (see `tailwind.config.js`, `postcss.config.mjs`, and `src/styles/globals.css`). Sass is not used; layout and components use Tailwind utilities and small shared helpers (for example `src/lib/heroClasses.js`). Next.js may still install `sass` as an optional dependency of the framework; it is not part of this app's styles.
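A shared class helper of the kind mentioned above might look like this. The names and utility classes here are hypothetical illustrations, not the actual exports of `src/lib/heroClasses.js`.

```javascript
// Hypothetical sketch of a shared Tailwind class helper in the style of
// src/lib/heroClasses.js (names and classes are illustrative).
const heroHeading = "text-4xl font-bold tracking-tight";
const heroSubhead = "mt-4 text-lg text-gray-600";

// Components compose these, e.g. <h1 className={heroHeading}>…</h1>,
// so repeated utility strings live in one place.
module.exports = { heroHeading, heroSubhead };
```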
- Create a Supabase project and run the SQL in `supabase/migrations/20260421000000_initial.sql` (SQL editor or Supabase CLI).
- Add environment variables: `NEXT_PUBLIC_SUPABASE_URL` and `SUPABASE_SERVICE_ROLE_KEY` (server-only; used by API routes and `getServerSideProps`). Use the `service_role` secret from Settings → API, not the `anon` public key: `anon` is blocked by RLS on `sites`/`images`, which causes a `violates row-level security policy` error on insert if misconfigured.
- Optional: copy old cache rows from Xata into `sites`/`images` (`site_url`, `date_collected`, `screenshot`, and the three JSON text columns on `images`).
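One way to catch the anon/service_role mix-up early is to check the key before constructing a client: Supabase keys are JWTs whose payload carries a `role` claim. The helper below is a hypothetical sketch, not part of the app.

```javascript
// Hypothetical guard: fail fast if SUPABASE_SERVICE_ROLE_KEY was set to the
// anon key. Supabase API keys are JWTs with a "role" claim in the payload
// ("anon" or "service_role").
function assertServiceRoleKey(key) {
  const payload = key.split(".")[1];
  if (!payload) throw new Error("Not a JWT-shaped Supabase key");
  const claims = JSON.parse(Buffer.from(payload, "base64").toString("utf8"));
  if (claims.role !== "service_role") {
    throw new Error(`Expected a service_role key, got role "${claims.role}"`);
  }
  return true;
}

// Demo with a fake, locally-built token (not a real key):
const fakeClaims = Buffer.from(
  JSON.stringify({ role: "service_role" })
).toString("base64");
const fakeKey = `header.${fakeClaims}.sig`;
```

Calling this once at server startup turns a confusing RLS insert failure into an immediate, explicit configuration error.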
## Netlify deploy

- Connect the repo in the Netlify UI (or use the Netlify CLI). The included `netlify.toml` runs `npm run build` and enables `@netlify/plugin-nextjs` v5 (Next.js 15).
- Set the same environment variables you would use locally (Cloudinary, Supabase, optional GA, etc.) under Site configuration → Environment variables.
- Use Node 20+ for the build (set `NODE_VERSION` in Netlify or rely on `.nvmrc`; `package.json` declares `engines.node`).
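For reference, pinning the build command and Node version in `netlify.toml` looks roughly like this. This is a hypothetical excerpt; the repo's own `netlify.toml` already covers it.

```toml
# Hypothetical excerpt; the repo's netlify.toml already configures this.
[build]
  command = "npm run build"

[build.environment]
  NODE_VERSION = "20"
```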
`next build` alone is enough for local checks; Netlify runs the plugin during `netlify build` on their builders.
Image discovery runs in Node.js with Puppeteer and `@sparticuz/chromium-min` (the same stack as `/api/screenshot`). Set `CHROME_EXECUTABLE_PATH` locally to your Chrome/Chromium binary for faster dev; in production the function downloads the Sparticuz Chromium pack on cold start.
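The dev/production split described above can be sketched as follows. This is a minimal sketch assuming `puppeteer-core` and `@sparticuz/chromium-min`; the pack URL and function names are illustrative, not the app's actual code.

```javascript
// Pure helper: prefer a local Chrome binary in dev; null means "let the
// Sparticuz pack be downloaded on cold start" in production.
function resolveExecutablePath(env) {
  return env.CHROME_EXECUTABLE_PATH || null;
}

// Sketch of the launch path (illustrative; pack URL is hypothetical).
async function launchBrowser() {
  // Required lazily so this file parses even where the deps are absent.
  const puppeteer = require("puppeteer-core");
  const chromium = require("@sparticuz/chromium-min");
  const executablePath =
    resolveExecutablePath(process.env) ||
    (await chromium.executablePath(
      "https://example.com/chromium-pack.tar" // hypothetical pack location
    ));
  return puppeteer.launch({
    args: chromium.args,
    executablePath,
    headless: true,
  });
}
```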
Serverless time and bundle-size limits apply on any host. If `/api/scrape` times out on Netlify, increase the function timeout in the Netlify UI (plan-dependent) or run scraping on a machine with a longer limit.
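Inside the function itself, one common pattern is to enforce a deadline slightly below the platform limit so the client receives a clean error instead of a gateway timeout. This wrapper is a hypothetical sketch, not part of the app's handler.

```javascript
// Hypothetical guard for /api/scrape: reject before the platform kills
// the function, so the caller gets a clean error instead of a 502.
function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(
      () => reject(new Error(`Timed out after ${ms}ms`)),
      ms
    );
  });
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}
```

Usage would look like `await withTimeout(scrapePage(url), 9000)` on a host with a 10-second limit (the 9000 ms figure is an example, not a recommendation).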