SEO-SWARM is a local SEO production workspace that turns structured inputs into reports, briefs, and draft-ready assets. It combines a step-by-step workflow, reusable templates, and automation scripts that write everything into `data/outputs/<client>/`.
Use it to:
- scaffold a client workspace
- collect and validate inputs
- generate briefs, audits, and compliance checks
- preview outputs in a web UI
- NEW: manage agent workflows via a modern web dashboard
A modern React frontend for streamlined client onboarding and agent workflow management:
```shell
cd frontend
./start.sh   # Mac/Linux
# or
start.bat    # Windows
```

Features:
- 🚀 Client Onboarding Wizard - Step-by-step setup with auto-scaffolding
- 📋 Task Management - Create, track, and manage tasks across agents
- 👥 Agent Registry - View all 20+ available agents by category
- 📊 Real-time Updates - WebSocket integration for live progress
- 📁 Output Browser - Explore generated reports and content
- ⚙️ Configuration - Edit workflow settings via UI
Quick Start:
- Frontend UI: http://localhost:5173
- Backend API: http://localhost:8000
- API Docs: http://localhost:8000/docs
See frontend/QUICKSTART.md for detailed setup.
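If the dashboard will not load, a quick sanity check is whether the backend answers at its docs URL. This probe is a convenience sketch, not part of the repo; the URL matches the quick-start default above:

```python
import urllib.request
import urllib.error

def backend_up(url: str = "http://localhost:8000/docs", timeout: float = 2.0) -> bool:
    """Return True if the backend responds at `url` within `timeout` seconds."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

If this returns `False`, start the backend first, then reload the frontend at `http://localhost:5173`.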
- Workflow rules and roles: @docs/client-templates/swarm-roles.md
- Content templates: @docs/client-templates/webpage-templates.md, @docs/client-templates/article-templates.md
- Measurement system: @docs/seo/measurement-spec.md + intake/reporting templates
- @docs/seo/measurement-intake-template.md
- @docs/seo/measurement-reporting-template.md
- End-to-end execution flow: @docs/seo/swarm-execution-workflow.md
- Automation scripts grouped by category:
- @scripts/workflow/ (scaffolding and viewers)
- @scripts/generators/ (reports and briefs)
- @scripts/ingest/ (input ingestion and exports)
- @scripts/validation/ (linting and validation)
- Read the roles and guardrails: @docs/client-templates/swarm-roles.md
- Scaffold a client workspace:

```shell
python scripts/workflow/swarm_workflow.py --client "Client Name" --slug client-slug
```

- Optional: add `--site-url https://example.com` to crawl and cache the site HTML (also pre-fills `inputs.md`).
Audit runner (all steps, end-to-end):

```shell
python scripts/workflow/site_audit_runner.py --client "Client Name" --slug client-slug --site-url https://example.com
```

Crawl-only mode (cache + `inputs.md`):

```shell
python scripts/workflow/site_audit_runner.py --client "Client Name" --slug client-slug --site-url https://example.com --crawl-only
```

Fail fast (stop on first error):

```shell
python scripts/workflow/site_audit_runner.py --client "Client Name" --slug client-slug --site-url https://example.com --fail-fast
```

- Fill approved facts in `data/outputs/<client>/inputs.md`
  - Template: @docs/seo/inputs-template.md
- Run generators for briefs and reports (see below)
- Open the outputs viewer to browse and QA generated files:

```shell
python3 scripts/workflow/outputs_viewer.py
# or
./outputs_viewer.sh
```
- Treat `data/outputs/<client>/inputs.md` as the single source of truth. All downstream scripts reference it or derived JSONs.
- Use `--site-url` when scaffolding to seed `inputs.md` from cached HTML, then verify and edit as needed.
- Run the workflow in order (intake -> mapping -> briefs -> metadata -> compliance). See @docs/seo/swarm-execution-workflow.md.
- Use the outputs viewer to QA HTML and JSON outputs quickly without digging through folders.
- Keep input exports (GSC, GBP, GA4, rank tracker, crawl data) in `data/outputs/<client>/reports/` and ingest them via @scripts/ingest/.
- Validate schema HTML via @docs/seo/schema-approval-workflow.md before shipping.
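The real field list lives in @docs/seo/inputs-template.md, but conceptually the approved facts in `inputs.md` are simple key/value bullets. A hypothetical parser (field names and bullet format invented for illustration, not the repo's actual loader) might look like:

```python
import re

def parse_facts(markdown: str) -> dict:
    """Extract 'Key: value' bullets from inputs.md-style text into a dict.

    Matches lines like '- **Name**: Acme Plumbing' or '- Phone: 555-0100'.
    Keys are lowercased so lookups are case-insensitive.
    """
    facts = {}
    for line in markdown.splitlines():
        m = re.match(r"[-*]\s*\*{0,2}(\w[\w ]*?)\*{0,2}\s*:\s*(.+)", line.strip())
        if m:
            facts[m.group(1).strip().lower()] = m.group(2).strip()
    return facts
```

Because every downstream script keys off these facts, a quick parse-and-eyeball pass like this can catch missing NAP fields before generators run.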
- Create the client folder:

```shell
python scripts/workflow/swarm_workflow.py --client "Client Name" --slug client-slug
```

- Optional: add `--site-url https://example.com` to crawl and cache the site HTML.
- Fill in `data/outputs/<client>/inputs.md` with approved facts (NAP, services, hours, proof points).
  - Template: @docs/seo/inputs-template.md
- Capture measurement inputs:
- Use @docs/seo/measurement-intake-template.md
  - Optional generator: `python scripts/generators/measurement_intake_generator.py --client-slug client-slug --scaffold`
- Run a crawl cache if needed (see @scripts/generators/service_brief_generator.py requirements).
- Generate service briefs and supporting reports.
- Produce content briefs and draft pages/articles using templates.
- Intake + validation
  - Approved inputs in `data/outputs/<client>/inputs.md` (see @docs/seo/inputs-template.md)
  - Measurement intake per @docs/seo/measurement-intake-template.md
- Strategy + mapping
  - Keyword map + KPI targets: `python scripts/generators/keyword_map_kpi.py --client-slug client-slug --scaffold`
- Content production
  - Generate service briefs: `python scripts/generators/service_brief_generator.py --client-slug client-slug`
  - Summarize briefs: `python scripts/generators/brief_summary_report.py --client-slug client-slug`
  - Generate content briefs: `python scripts/generators/content_brief_generator.py --client-slug client-slug --scaffold`
- On-page + metadata
  - Metadata + internal link map: `python scripts/generators/metadata_internal_link_map.py --client-slug client-slug`
  - Internal link validation: `python scripts/validation/internal_link_validator.py --client-slug client-slug`
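The real checks live in `internal_link_validator.py`; as a rough illustration of the idea only (function name and data shapes invented here), an internal-link check boils down to flagging hrefs whose target page is not in the site's known page set:

```python
def find_broken_links(pages: dict[str, list[str]]) -> list[tuple[str, str]]:
    """pages maps each URL path to the internal hrefs found on that page.

    Returns (page, href) pairs whose target path is not a known page.
    """
    known = set(pages)
    return [(page, href)
            for page, hrefs in pages.items()
            for href in hrefs
            if href not in known]
```

Running the validator after the metadata + internal link map step catches links to pages that were renamed or never created.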
- Compliance
  - Draft compliance lint: `python scripts/validation/draft_compliance_lint.py --client-slug client-slug`
- Local asset updates
  - GBP update checklist: `python scripts/generators/gbp_update_checklist.py --client-slug client-slug`
  - Citation update log: `python scripts/generators/citation_update_log.py --client-slug client-slug --scaffold`
  - Local link outreach log: `python scripts/generators/local_link_outreach.py --client-slug client-slug --scaffold`
  - Review response templates: `python scripts/generators/review_response_templates.py --client-slug client-slug --scaffold`
  - Review export ingest: `python scripts/ingest/review_export_ingest.py --client-slug client-slug --input path/to/reviews.csv`
- Technical audit
  - Audit scaffold: `python scripts/generators/technical_seo_audit_scaffold.py --client-slug client-slug`
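Putting the phases together in order, a dry-run sketch might look like the script below. The client name and slug are placeholders, and the `run` wrapper only echoes each command, so nothing executes until you swap the `echo` out:

```shell
#!/bin/sh
CLIENT="Acme Plumbing"   # placeholder client name
SLUG="acme-plumbing"     # placeholder slug
run() { echo "would run: $*"; }   # replace `echo` with the real invocation when ready

run python scripts/workflow/swarm_workflow.py --client "$CLIENT" --slug "$SLUG"
run python scripts/generators/keyword_map_kpi.py --client-slug "$SLUG" --scaffold
run python scripts/generators/service_brief_generator.py --client-slug "$SLUG"
run python scripts/generators/metadata_internal_link_map.py --client-slug "$SLUG"
run python scripts/validation/draft_compliance_lint.py --client-slug "$SLUG"
```

This mirrors the intake -> mapping -> briefs -> metadata -> compliance order above; fill `inputs.md` after the scaffold step before letting the generators run for real.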
All scripts write under `data/outputs/<client>/reports/` unless noted. Use the outputs viewer to browse and QA results quickly.
- Service briefs: `service_brief_generator.py` -> `service-briefs/*.md`
- Brief summary: `brief_summary_report.py` -> `service-briefs-summary.md/.json`
- Content briefs: `content_brief_generator.py` -> `content-briefs/*.md` + `content-briefs.json`
- Article extraction: `article_cache_to_markdown.py` -> `articles/*.md` (extracts blog/article pages from cache)
- Metadata + internal link map: `metadata_internal_link_map.py` -> `metadata-internal-link-map.md/.json`
- Metadata linkmap ingest: `metadata_linkmap_ingest.py` -> `metadata-linkmap-input.json`
- Internal link validation: `internal_link_validator.py` -> report in `reports/`
- Measurement intake: `measurement_intake_generator.py` -> `measurement-intake.md/.json`
- Keyword map + KPI: `keyword_map_kpi.py` -> `keyword-map-kpi.md/.json`
- DataForSEO SERP fetch: `serp_dataforseo_fetch.py` -> `serp-export.json` + inputs
- GA4 export ingest: `ga4_export_ingest.py` -> `ga4-export.json` + `ga4-summary.json` (optional)
- Rank tracker export ingest: `rank_tracker_export_ingest.py` -> `rank-tracker-export.csv/.json`
- Draft compliance lint: `draft_compliance_lint.py` -> `draft-compliance-lint.md/.json`
- GBP checklist: `gbp_update_checklist.py` -> `gbp-update-checklist.md/.json`
- GBP export ingest: `gbp_export_ingest.py` -> `gbp-export.json` + `gbp-summary.json` (optional)
- Citation log: `citation_update_log.py` -> `citation-update-log.md/.json`
- Citation audit ingest: `citation_audit_ingest.py` -> `citation-log-input.json`
- Local link outreach log: `local_link_outreach.py` -> `local-link-outreach.md/.json`
- Review response templates: `review_response_templates.py` -> `review-response-templates.md/.json`
- GSC export ingest: `gsc_export_ingest.py` -> `gsc-export.json` + `gsc-summary.json` (optional)
- Review export ingest: `review_export_ingest.py` -> `review-templates-input.json`
- Technical SEO audit scaffold: `technical_seo_audit_scaffold.py` -> `technical-seo-audit.md/.json`
- Crawl export ingest: `crawl_export_ingest.py` -> `crawl-export.json` + `crawl-summary.json` (optional)
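All of these land under the per-client reports directory. A tiny helper (purely illustrative; the scripts build their own paths) shows the convention:

```python
from pathlib import Path

def report_path(slug: str, name: str, base: str = "data/outputs") -> Path:
    """Conventional location of a generated report for a client slug."""
    return Path(base) / slug / "reports" / name
```

For example, the keyword map JSON for slug `acme-plumbing` lives at `data/outputs/acme-plumbing/reports/keyword-map-kpi.json`.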
- Client outputs are intentionally ignored by git (`data/outputs/` is in `.gitignore`).
- Replace all placeholders with approved inputs before publishing.
- Copy `.env.example` to `.env` and set `DATAFORSEO_LOGIN`, `DATAFORSEO_PASSWORD`, and `DATAFORSEO_ENDPOINT` for SERP fetch automation.
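A minimal `.env` sketch with the three variables named above; every value here is a placeholder, so substitute the credentials and endpoint from your own DataForSEO account:

```shell
DATAFORSEO_LOGIN=your-login
DATAFORSEO_PASSWORD=your-password
# Placeholder URL; use the endpoint from your DataForSEO account settings.
DATAFORSEO_ENDPOINT=https://api.example.com/v3
```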