Heard is a macOS menu bar app that auto-detects Microsoft Teams meetings, records dual-track audio, and produces on-device transcripts with speaker diarization. See spec.md for the full product specification.
```bash
swift build              # compile
swift run Heard          # compile and launch (terminal — mic permission goes to terminal app)
./scripts/bundle.sh      # build Heard.app bundle (ad-hoc signed)
open build/Heard.app     # launch as proper app (mic permission goes to Heard)
swift package clean      # clean build artifacts
```

No Xcode project — this is a Swift Package Manager executable. macOS 15.0+ required.
- `spec.md` — Product spec (source of truth for features and architecture)
- `handoff.md` — Current implementation status and next steps
- `Sources/Heard/MTApp.swift` — App entry point
- `Sources/HeardCore/AppModel.swift` — Central state orchestration
- `Sources/HeardCore/Services.swift` — Detection, recording, pipeline, permissions
- `Sources/HeardCore/Views.swift` — All UI (menu bar dropdown + settings window)
- `Sources/HeardCore/CoreModels.swift` — Data types
- `Sources/HeardCore/Stores.swift` — Persistence layer
- `Info.plist` — App bundle metadata
- `Heard.entitlements` — Entitlements (audio input only, no sandbox)
- `scripts/bundle.sh` — Build script for the .app bundle
- `scripts/dmg.sh` — Distribution pipeline: release build → sign → notarize → staple → DMG → SHA256
- `.github/workflows/ci.yml` — CI: build + test on all pushes; release bundle + GitHub Release upload on tag push
- Treat `spec.md` as the product source of truth unless the user explicitly changes scope.
- Read `handoff.md` before making changes to understand current state.
- Update `handoff.md` after substantial implementation work.
- Prefer the real macOS-native path (IOKit, CoreAudio, CoreML) over cross-platform abstractions.
- Keep the app as a single-process menu bar application.
- Do not introduce cloud APIs, LLM integrations, or non-English transcription.
- Keep v1 focused on post-meeting transcription. Dictation is v2 placeholder scaffolding.
- Avoid broad refactors — make targeted changes that deliver the next integration step.
- The "Simulate Meeting" buttons are intentional for testing without a real Teams call. Keep them.
- `MenuBarExtra` with `.window` style — renders SwiftUI views in a floating panel
- `Window` scene with id `"settings"` — opened via `@Environment(\.openWindow)`
- Library target `HeardCore` + executable `Heard` + test executable `HeardTests`
- All persistence is JSON files in `~/Library/Application Support/Heard/`
- Pipeline stages run sequentially on a background task, one job at a time
- Meeting detection polls every 3 seconds via `IOPMCopyAssertionsByProcess()`
- Audio capture uses `CATapDescription` (app tap) + `AVAudioEngine` (mic)
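The detection poll could look roughly like the sketch below — a hypothetical illustration, not the code in `Services.swift`. It assumes Teams holds a display-sleep power assertion during an active call and that Teams processes can be matched by bundle identifier (the `com.microsoft.teams2` ID and helper names are assumptions):

```swift
import Foundation
#if os(macOS)
import AppKit
import IOKit.pwr_mgt
#endif

// Pure helper: does any assertion in the list prevent display sleep?
// ("AssertType" / "PreventUserIdleDisplaySleep" are the IOKit dictionary
// key and assertion-type strings.)
func containsDisplaySleepAssertion(_ assertions: [[String: Any]]) -> Bool {
    assertions.contains { ($0["AssertType"] as? String) == "PreventUserIdleDisplaySleep" }
}

#if os(macOS)
// One poll step: copy all power assertions grouped by PID, then check
// whether any Teams process holds a display-sleep assertion.
func teamsMeetingLikelyActive() -> Bool {
    var raw: Unmanaged<CFDictionary>?
    guard IOPMCopyAssertionsByProcess(&raw) == kIOReturnSuccess,
          let byPid = raw?.takeRetainedValue() as? [Int: [[String: Any]]] else { return false }
    let teamsPids = Set(NSWorkspace.shared.runningApplications
        .filter { $0.bundleIdentifier == "com.microsoft.teams2" }   // hypothetical bundle ID
        .map { Int($0.processIdentifier) })
    return byPid.contains { pid, assertions in
        teamsPids.contains(pid) && containsDisplaySleepAssertion(assertions)
    }
}
#endif
```

Polling this every 3 seconds (e.g. from a `Timer` or a sleeping `Task`) gives the meeting start/stop edges without any Teams API integration.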
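The one-job-at-a-time rule could be enforced with a small actor-based serial queue — a hypothetical sketch under assumed names, not the app's actual pipeline code:

```swift
import Foundation

// Hypothetical serial job queue: all stages for one recording run to
// completion before the next queued job starts.
actor SerialPipeline {
    private var pending: [String] = []   // queued recording IDs (illustrative)
    private var draining = false

    /// Enqueue a job; if nothing is running, drain the queue in FIFO order.
    func submit(_ recordingID: String, stage: (String) async -> Void) async {
        pending.append(recordingID)
        guard !draining else { return }  // the running drain loop will pick it up
        draining = true
        while !pending.isEmpty {
            let job = pending.removeFirst()
            await stage(job)             // e.g. transcribe → diarize → persist
        }
        draining = false
    }
}
```

Because actor methods are serialized, a second `submit` during a long-running stage only appends to `pending`; the drain loop picks it up when the current job finishes.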
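The JSON-files persistence can be pictured as one `Codable` file per record under the Application Support directory — a minimal sketch with hypothetical `MeetingRecord`/`MeetingStore` names; the real types live in `CoreModels.swift` and `Stores.swift`:

```swift
import Foundation

// Hypothetical record type standing in for the app's data models.
struct MeetingRecord: Codable {
    let id: UUID
    let title: String
    let startedAt: Date
}

// Minimal JSON store: one file per record, atomic writes.
struct MeetingStore {
    let directory: URL   // e.g. ~/Library/Application Support/Heard/

    func save(_ record: MeetingRecord) throws {
        try FileManager.default.createDirectory(at: directory, withIntermediateDirectories: true)
        let encoder = JSONEncoder()
        encoder.outputFormatting = [.prettyPrinted, .sortedKeys]
        let url = directory.appendingPathComponent("\(record.id.uuidString).json")
        try encoder.encode(record).write(to: url, options: .atomic)
    }

    func load(id: UUID) throws -> MeetingRecord {
        let url = directory.appendingPathComponent("\(id.uuidString).json")
        return try JSONDecoder().decode(MeetingRecord.self, from: Data(contentsOf: url))
    }
}
```

Plain JSON files keep the store inspectable and diff-friendly, which suits a single-process app with no concurrent writers.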
```bash
swift run HeardTests     # run the test suite
```

Manual testing:

- `./scripts/bundle.sh && open build/Heard.app`
- Click the menu bar icon → "Simulate Meeting Start" to exercise the full flow
- Use the Settings button to open preferences
- Running via `swift run` attributes mic permission to the terminal app, not Heard. Use the .app bundle for proper permissions.
- The `.window` MenuBarExtra panel has a max height — keep the dropdown content compact.
- The FluidAudio dependency is declared, but its models aren't available as CoreML yet.
- The worktree is at `.claude/worktrees/` — run commands from the worktree dir, not the main repo.