What is this?
The short version
TokenBurn is an autonomous AI news pipeline that runs on a Raspberry Pi sitting on a shelf. It continuously fetches stories from ~12 tech and AI sources, deduplicates them, scores each one for relevance using Claude, and assembles everything into the digest you see on the homepage.
The digest updates roughly every two hours as new stories come in and get scored. There is no editor. No one clicks publish. The pipeline runs, the site rebuilds, Vercel deploys.
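The fetch-dedupe-score-assemble cycle can be sketched roughly as below. All names here (`run_cycle`, `story_key`, the table schema) are illustrative, not the real modules, and the Claude scoring call is stubbed out as a plain function argument:

```python
import hashlib
import sqlite3

def story_key(url: str) -> str:
    # Dedupe key: hash of the story URL, so re-fetched stories collapse to one row.
    return hashlib.sha256(url.encode()).hexdigest()

def run_cycle(conn: sqlite3.Connection, stories: list[dict], score_fn) -> list:
    """One pipeline pass: insert unseen stories, score them, return the digest."""
    conn.execute(
        """CREATE TABLE IF NOT EXISTS stories
           (key TEXT PRIMARY KEY, title TEXT, url TEXT, score REAL)"""
    )
    for s in stories:
        key = story_key(s["url"])
        # INSERT OR IGNORE makes already-seen stories a no-op (deduplication).
        cur = conn.execute(
            "INSERT OR IGNORE INTO stories (key, title, url) VALUES (?, ?, ?)",
            (key, s["title"], s["url"]),
        )
        if cur.rowcount:  # only newly seen stories get sent to the scorer
            conn.execute(
                "UPDATE stories SET score = ? WHERE key = ?", (score_fn(s), key)
            )
    # Digest: highest-scoring stories first.
    return conn.execute(
        "SELECT title, score FROM stories ORDER BY score DESC"
    ).fetchall()
```

In the real system `score_fn` would wrap a Claude API call; here any callable that maps a story to a number works, which also makes the cycle easy to test offline.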
SQLite · Python · Claude (Anthropic) · Next.js · Vercel · Raspberry Pi
The robots ship code
Beyond fetching and scoring news, TokenBurn runs a set of agent cron jobs that autonomously improve the system itself. Each agent has a narrow focus:
- `evolve`: Refactors and improves existing pipeline code
- `groom`: Triages issues and keeps the backlog healthy
- `discover`: Finds and adds new news sources
- `ideate`: Proposes new features and improvements
Each agent opens PRs against the repo. Some get merged, some don't. The changelog tracks what actually shipped.
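Since the agents run as cron jobs, the schedule might look like ordinary crontab entries. The cadences, paths, and runner script below are purely illustrative, not the real configuration:

```shell
# Illustrative crontab for the agent jobs; real paths and schedules differ.
0 */6 * * *   /home/pi/tokenburn/agents/run.sh evolve    # refactor pipeline code
30 2 * * *    /home/pi/tokenburn/agents/run.sh groom     # triage the backlog
0 4 * * 1     /home/pi/tokenburn/agents/run.sh discover  # hunt for new sources
0 5 * * 3     /home/pi/tokenburn/agents/run.sh ideate    # propose features
```

Keeping each agent as a separate cron entry means a misbehaving agent can be paused independently without touching the news pipeline itself.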
Why build this?
Partly to have a good AI news digest. Mostly to see what happens when you let autonomous agents run a production system end-to-end — fetching, processing, publishing, and even improving their own code. Building in public because it's more fun that way.