Synopses worth reading. On autopilot.

Point synop.stream at a YouTube channel, meeting page, or podcast feed. It produces editor-grade synopses — TLDR, decisions, roll calls, quotes, public comment — and publishes them to a subdomain you own. Local-first — runs entirely on your hardware.

For people who cover a beat — and want the synopsis, not the three-hour video.

Local-first — your model, your hardware, your data.

Closed beta

lapeer-commission.synop.stream/content/apr-8-2026

City Commission Regular Meeting — April 8, 2026

  • City of Lapeer
  • Apr 8, 2026
  • 1h 23m
  • 4,218 views
TLDR

The commission approved the FY2027 budget 6–1, tabled the downtown overlay zoning amendment, and heard public comment on the proposed traffic calming plan for Main Street.

Votes & Decisions
  • FY2027 Budget — Passed 6–1 (Commissioner Smith dissenting)
  • Downtown Overlay Zoning — Tabled, referred back to Planning
  • Main St. Traffic Calming — Public hearing scheduled for April 22

Captures the structure, not just the gist

Votes with roll calls. Quotes with attribution. Public comment digests. The synopsis takes a shape that matches your beat — not a generic three-bullet summary.

Filtered to what matters

Plain-English criteria: “commission meetings — skip sports and press conferences.” Quick checks first, full reads only when ambiguous. Everything rejected stays visible with the model’s reason — so nothing goes missing quietly.
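
The two-pass idea above can be sketched in a few lines of Python. This is an illustrative sketch, not synop.stream's actual code: the keyword lists, function names, and the `full_read` hook are all hypothetical stand-ins for the cheap metadata check, the escalation to a full-content read, and the reason that stays attached to rejected items.

```python
# Hypothetical sketch of the two-pass filter. Keyword lists and names are
# illustrative only; in practice the checks are driven by your plain-English
# criteria, not hard-coded strings.
SKIP_HINTS = ("sports", "press conference", "ribbon cutting")
KEEP_HINTS = ("commission", "budget", "hearing")

def metadata_pass(title: str) -> str:
    """Cheap check on the title alone: 'keep', 'skip', or 'ambiguous'."""
    t = title.lower()
    if any(h in t for h in SKIP_HINTS):
        return "skip"
    if any(h in t for h in KEEP_HINTS):
        return "keep"
    return "ambiguous"

def filter_item(title: str, full_read=None) -> tuple[bool, str]:
    """Return (kept, reason); the reason survives for the audit view."""
    verdict = metadata_pass(title)
    stage = "metadata pass"
    if verdict == "ambiguous" and full_read is not None:
        verdict = full_read(title)   # expensive full-content read, only when needed
        stage = "full-content pass"
    return verdict == "keep", f"{stage}: {verdict}"
```

The design point is the escalation: most items resolve on metadata alone, and the costly read runs only for the ambiguous remainder.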

Speaks your beat’s language

Drop in your board members, acronyms, and district names. The model uses them instead of hedging or generalizing — so the synopsis reads like someone who covers the beat wrote it.

Source always one click away

Raw transcript, full chat, complete comment thread — preserved alongside every synopsis. Verify a quote, check the moment, or share the original without leaving the page.

Local-first

Your model. Your hardware. Your data.

synop.stream runs end-to-end on your machine. Transcription happens on your chip, the LLM runs in your own Ollama, and your media never touches a third-party cloud. The only bytes that leave are the static site you choose to deploy — and only to the Cloudflare account you control.
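
Concretely, "the LLM runs in your own Ollama" means a request to a server on your own machine. The sketch below assumes Ollama's standard local HTTP API on its default port 11434; the model name and prompt wording are placeholders, not what synop.stream actually ships.

```python
import json
import urllib.request

# Ollama's default local endpoint; nothing here touches a third-party cloud.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(transcript: str, model: str = "llama3") -> dict:
    """Assemble the request body Ollama expects (prompt text is a placeholder)."""
    return {
        "model": model,
        "prompt": f"Write a TLDR of this meeting transcript:\n\n{transcript}",
        "stream": False,  # one complete response instead of a token stream
    }

def summarize_locally(transcript: str, model: str = "llama3") -> str:
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(transcript, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```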

  • No hosted service. No vendor pricing page. No SaaS rug-pull risk.
  • Swap the LLM when a better open model ships. No migration needed.
  • Public-records-friendly. Your transcripts stay on your hardware.
How it works

Four stages. One flow.

Set it up once. Let the scheduler do the rest.

  1. Sources

     A YouTube channel, a page of meeting minutes, or both. Add once; the scheduler watches from then on.

  2. Filter

     Cheap metadata pass first. Full-content re-check only when the metadata is ambiguous. Everything rejected stays visible in an audit view, with the reason.

  3. Summarize

     Template plus your reference context drive the main summary. Viewer comments and live chat get their own dedicated summaries. Raw originals stay one click away.

  4. Publish

     Static site built per project. Cloudflare Pages deploy, custom domain, DNS CNAME — all provisioned on first run.
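
For the publish stage, a minimal sketch of the deploy step looks like the following. `wrangler pages deploy` is Cloudflare's real CLI command for Pages; the wrapper function and the default project name are hypothetical, and the actual provisioning (custom domain, DNS CNAME) involves more than this one call.

```python
import subprocess

def deploy_command(site_dir: str, project: str) -> list[str]:
    """Build the wrangler invocation; the static site is the only upload."""
    return ["wrangler", "pages", "deploy", site_dir,
            "--project-name", project]

def publish(site_dir: str, project: str = "lapeer-commission") -> None:
    # Pushes the locally built site to the Cloudflare account you control.
    subprocess.run(deploy_command(site_dir, project), check=True)
```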

Example

A city beat, on autopilot.

A city watcher creates a project called “City Commission,” pastes the city’s YouTube channel URL, and writes a short criteria line: “Commission meetings and budget hearings. Skip ribbon cuttings and press conferences.” They add reference info — mayor’s name, commissioner ward assignments, a few program acronyms. They pick the Local Government template and toggle Auto-process + Auto-deploy.

Three weeks later, a dozen meeting synopses live on their subdomain. Each opens with a TLDR, then the votes with roll calls, a digest of public comment, a summary of viewer reactions, and the raw chat and comment transcripts one click away. They didn’t touch a button. The ones that didn’t fit — a ribbon cutting, a student video — landed in Filtered Out with the model’s reason attached, so nothing went missing quietly.

Live now: lapeer-commission.synop.stream — a city beat. · all-in.synop.stream — a podcast (different format, different shape of synopsis). Same engine, same templates, different beats.

Frequently asked

Honest answers about what you’re getting into.

What hardware do I need?

Apple Silicon Mac (M1 or newer). Larger models run more comfortably on Pro/Max/Ultra chips. We’ll share recommendations during onboarding so you pick the model that fits your machine.

What’s Ollama?

A free runtime that lets you run open-weight LLMs locally on your Mac. Install once, pick a model, you’re set. We walk you through it during onboarding. ollama.com

Do I need Cloudflare?

Only for the auto-deploy step. Synopses are produced locally; the built static site can be hosted anywhere — or read directly from your own machine.

Windows or Linux support?

Not yet. Apple Silicon’s local-LLM tooling is the most polished today. Linux is on the roadmap; Windows further out.

How long is the beta wait?

Onboarding is rolling and small. Expect a real conversation, not an autoresponder.

Can I export my data?

It’s already yours — every transcript, synopsis, and audit log lives on your hard drive in plain files. No lock-in by design.

Closed beta

Tell us what you’d build with it.

We’re onboarding a small group of early users. Best fits so far: city and county coverage, beat reporting archives, organization-internal digests, researchers tracking a topic across sources. If you have a clear use case and a Mac to run it on, we’d like to hear from you.

Write to us

Or write directly to beta@synop.stream

Built by an indie developer who wanted a slow, boring feed of local facts instead of a fast, angry feed of national ones.