The first thing I built for TerminalFeed was a Bitcoin ticker. A single number at the top of a black page, updating once per second. I watched it for an hour and thought: that's it. That's the feeling. The site needs to feel like looking at a terminal.

Then I added a second panel. Then ten more. Now there are over thirty, and the original ticker is still at the top, but it's no longer the product. The product is the ambient feeling of being connected to thirty things happening in the world at once. This is a founder's note on what I've learned building that, and why a single-number dashboard is only ever a waypoint, never a destination.

The Ticker Was the Hook, Not the Answer

People find TerminalFeed by searching for Bitcoin tickers. They stay because of the panels around it. Space launches, earthquake alerts, prediction markets, GitHub trending, a live Wikipedia edits stream, which cloud services are currently down. The ticker earns the click. Everything else earns the tab pin.

This was not the plan. The plan was a crypto dashboard. What happened is that every time I added a non-crypto panel, my own time on the site went up. I started leaving it open on a second monitor for the feeling of being plugged in. Friends said the same thing. Something about the combination of real-time feeds was more compelling than any single one.

I think the reason is that real-time data has a psychological quality polling data does not. When numbers update in front of you, your brain treats them as presence. You feel like you're observing the world, not retrieving a stale report. The ticker was teaching me that feeling; the other panels scaled it.

The Panel Is the Atomic Unit

Every feed on TerminalFeed fits a single pattern: a rectangle with a title, a data surface, and a self-healing lifecycle. It fetches its own data. It handles its own errors. It doesn't care about the other panels. It can be moved, reordered, hidden, or dragged anywhere on the page, and nothing else breaks.

This seems obvious once you see it. It is not obvious while you're building. The naive version is to centralize state: one store that knows about all feeds, one fetcher that runs everything on a shared schedule. That's the version I started with. It broke every time I added a new panel, because every new feed forced me to modify the central store, which touched every other panel.

The working version is: every panel is its own universe. Its own hook for data. Its own error boundary. Its own rendering. If one panel crashes, the site doesn't. If one API goes down, one panel goes dim and the rest keep running. This pattern is not elegant. It duplicates code. It's the right pattern.

Never trust a single shared pipeline. If one thing breaks, don't let it take down anything else.

Self-Healing Is the Feature, Not the Error Handling

A real-time dashboard that shows error states is a broken product. The user didn't sign up to debug your API reliability. If a feed is down, you have three options: show stale data with a subtle indicator, hide the panel, or reconnect silently and hope. Never: show a red error banner and dump a stack trace.

TerminalFeed panels do all three depending on the failure mode. Soft failures (one slow fetch) show stale data and retry in the background. Hard failures (endpoint returns 500 for minutes) hide the panel from the layout until recovery. Reconnections happen without any UI event. The user should never know anything went wrong.
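The lifecycle above reduces to a small state machine. A minimal sketch, assuming a made-up threshold of three consecutive failures before a panel hides (the real thresholds and names here are illustrative):

```typescript
// Hypothetical per-panel lifecycle: soft failures keep stale data on
// screen, repeated failures hide the panel, any success restores it
// silently with no UI event.
type PanelPhase = "live" | "stale" | "hidden";

interface PanelHealth {
  phase: PanelPhase;
  consecutiveFailures: number;
}

const HIDE_AFTER = 3; // failures in a row before the panel leaves the layout

function onFetchResult(state: PanelHealth, succeeded: boolean): PanelHealth {
  if (succeeded) {
    // Recovery is silent: no banner, just fresh data again.
    return { phase: "live", consecutiveFailures: 0 };
  }
  const failures = state.consecutiveFailures + 1;
  return {
    phase: failures >= HIDE_AFTER ? "hidden" : "stale",
    consecutiveFailures: failures,
  };
}
```

One slow fetch leaves the panel in `stale` showing old data; a source that stays down long enough removes the panel from the layout until the next success.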

This sounds like coddling, but it's the only honest design. Users come to the dashboard to see the world. Any moment they spend processing an error state is a moment they're not consuming the data the site exists to deliver. Error handling should be invisible if at all possible.

The Worker Is Not a Proxy, It's a Trust Layer

A lot of real-time dashboards make one critical mistake: they let the browser call third-party APIs directly. CORS headers get set permissively, someone's API key gets exposed, rate limits fire, and the site breaks under load. We tried this. We paid for it. It doesn't scale and it leaks secrets.

Every external API on TerminalFeed now runs through a Cloudflare Worker that sits between the browser and the source. The Worker handles authentication, caches aggressively, applies per-endpoint TTLs, times out at 8 seconds, falls through to secondary sources, and never returns a 5xx to the client. On failure, it returns last-known-good data with a staleness flag.
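The "never return a 5xx" behavior can be sketched with the storage and upstream injected, so the logic is testable. This simplifies away TTL-per-endpoint config, the 8-second timeout, and secondary sources; the names are assumptions, not TerminalFeed's actual Worker code.

```typescript
// Sketch of the trust-layer cache: serve fresh cache when possible,
// refresh from upstream otherwise, and on upstream failure fall back
// to last-known-good data with a staleness flag instead of an error.
interface CacheEntry<T> {
  data: T;
  storedAt: number; // ms epoch
}

interface ServeResult<T> {
  data: T | null;
  stale: boolean;
}

async function serveWithFallback<T>(
  cache: Map<string, CacheEntry<T>>,
  key: string,
  ttlMs: number,
  now: number,
  upstream: () => Promise<T>
): Promise<ServeResult<T>> {
  const hit = cache.get(key);
  // Fresh cache hit: the upstream is never touched.
  if (hit && now - hit.storedAt < ttlMs) {
    return { data: hit.data, stale: false };
  }
  try {
    const data = await upstream();
    cache.set(key, { data, storedAt: now });
    return { data, stale: false };
  } catch {
    // Upstream is down: serve last-known-good rather than a 5xx.
    return hit ? { data: hit.data, stale: true } : { data: null, stale: true };
  }
}
```

This is also why 10 and 10,000 users cost about the same: within a TTL window, only the first request per key ever reaches the upstream.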

This trust layer is what makes the dashboard robust at scale. The same Worker serves 10 users or 10,000 users at similar cost because the cache absorbs most of the work. When a source API is degraded, we notice it only once the cache expires; the user never does, because we keep serving last-known-good data.

Latency Budgets Are Real

A 30-panel dashboard has 30 latency budgets. Each panel has its own frequency of truth: some things change every second (BTC), some every minute (stocks), some every hour (economic data). Treating them all with the same cadence wastes bandwidth on the slow-changing feeds and undercooks the fast ones.

The pattern: each panel declares its freshness requirement. The Worker caches accordingly. The client polls or subscribes accordingly. A stock price panel has no business running a WebSocket. A BTC ticker has no business polling on a 15-second interval.
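One way to express "each panel declares its freshness requirement" is a single function from a freshness budget to a transport. The threshold and halving rule below are made-up illustrations, not TerminalFeed's actual numbers:

```typescript
// Illustrative mapping from a panel's declared freshness window to a
// transport: sub-5-second truth gets a push channel, everything else
// polls at half its window so an update is never more than one
// interval late.
type Transport =
  | { kind: "stream" }                    // WebSocket or SSE
  | { kind: "poll"; intervalMs: number }; // plain interval polling

function transportFor(freshnessMs: number): Transport {
  if (freshnessMs < 5_000) return { kind: "stream" };
  return { kind: "poll", intervalMs: Math.floor(freshnessMs / 2) };
}
```

Under this rule a BTC ticker declaring a 1-second window gets a stream, while a stocks panel declaring a 1-minute window polls every 30 seconds.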

The tricky part is the long tail: disaster alerts that may fire once a month or twice a day. There's no reasonable polling cadence: if I poll for disaster alerts every 30 seconds and render "nothing" 98% of the time, I've wasted 98% of the fetches. Solution: SSE streams where available, long-TTL caches where not, and a banner only when something actually changes.

The Mobile Constraint Is a Feature

Phones force you to make hard choices you'd otherwise dodge. A ticker that looks fine on a 4K desktop looks garish at 375px wide. A 30-panel layout that's fun on a widescreen monitor is a scroll nightmare on a phone.

We ended up with a mobile layout that's narrower, slower to update, and deliberately less dense. Panels that make sense only on a desktop ("watch the Wikipedia firehose scroll past") are deprioritized on mobile. The ticker pauses when the tab is backgrounded. Animations are disabled.
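The "ticker pauses when the tab is backgrounded" behavior boils down to gating work on visibility. A hypothetical sketch, driven by explicit calls so it's testable without a DOM; in the browser the flag would come from the `visibilitychange` event on `document`:

```typescript
// Illustrative ticker that burns no cycles while the tab is hidden.
class VisibilityAwareTicker {
  private visible = true;
  updates = 0;

  // In the browser this would be wired to document.visibilityState.
  setVisible(visible: boolean): void {
    this.visible = visible;
  }

  // Called on every scheduler tick; does work only when visible.
  tick(): void {
    if (!this.visible) return; // backgrounded: skip the update
    this.updates += 1;
  }
}
```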

Designing for mobile forced us to answer: which panels are truly essential? The answer turned out to be about eight. On mobile, you want BTC, Fear and Greed, prediction markets, top stories, stock movers, earthquake alerts, what's trending, and the weather. Everything else is optional. We surface those first on small screens. The full 30+ panel experience is desktop-first, and we're at peace with that.

The Static HTML Is for Google, the React Is for Humans

Real-time dashboards have a fundamental SEO problem: everything interesting is hydrated by JavaScript. Googlebot can execute JS, but it doesn't do so reliably or promptly on every page. If all your content lives inside React, Google sees an empty shell.

Our solution: every page ships a static HTML content block with the latest article titles, descriptions, and internal links. On page load, React renders the live dashboard on top of it and the static block is hidden. Humans see the dashboard; crawlers see structured content. Both are happy.

This is not a hack. It's recognizing that "real-time" and "indexable" are orthogonal goals and serving each appropriately. The real-time layer lives in the browser. The indexable layer lives in the initial HTML. Neither interferes with the other.
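A sketch of the indexable layer: a function that renders the static, crawler-facing block from article data. The markup and names are assumptions, not TerminalFeed's actual output, and real code would HTML-escape every field.

```typescript
// Illustrative generator for the static SEO content block. The live
// React layer renders over this and hides it on mount; crawlers that
// never execute JavaScript still see a plain, indexable list.
interface ArticleStub {
  title: string;
  description: string;
  url: string;
}

function renderStaticSeoBlock(articles: ArticleStub[]): string {
  // NOTE: production code must HTML-escape title/description/url.
  const items = articles
    .map(
      (a) =>
        `<li><a href="${a.url}">${a.title}</a><p>${a.description}</p></li>`
    )
    .join("\n");
  return `<section id="seo-content">\n<ul>\n${items}\n</ul>\n</section>`;
}
```

The block is emitted into the initial HTML at build or edge-render time, so the crawler's view is complete before a single byte of JavaScript runs.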

Community Forms Around Panels, Not Products

The most delightful thing about running TerminalFeed is the feedback about individual panels. People write in because they love the Space Launches panel. They suggest new feeds for the Cyber Threats panel. They argue for and against the inclusion of specific prediction markets. Nobody writes in about the "product" as a whole.

This tells me the panel is the right unit of engagement. When someone cares about a panel, they care deeply. They want it to be better. They have opinions about its styling, its data source, its update cadence. Treating panels as mini-products rather than features has been the right model. Each one has its own maintainer mentality even though I'm the only maintainer.

The Ticker Is Still at the Top

After thirty panels and two years of iteration, the Bitcoin ticker is still the first thing you see when you load the site. I've thought about moving it. I've thought about rotating the hero between BTC, ETH, and stocks. I've kept coming back to the ticker.

It's partly an anchor: that one number tells you the site is live, that everything below it is real. It's partly a shorthand: most of our visitors arrive looking for crypto data, and the ticker confirms they're in the right place. It's partly identity: TerminalFeed without a BTC hero is a different site.

So the ticker stays. But the dashboard around it is the thing. If you only see the ticker, you missed the product. Scroll. There are earthquakes happening, space missions launching, prediction markets settling, code being committed, APIs going down and coming back up. The world is noisy. Watching it in real-time, one panel at a time, is the most satisfying second monitor I've ever built.

If you're building your own real-time thing, start with the hook: one live number that earns the click. Then build the dashboard around it slowly. The magic is not in any single panel. It's in the composition.

For the nuts and bolts of the ticker itself, see Bitcoin Ticker: How Live BTC Price Updates Actually Work. For the broader case on second monitors, Why Second Monitor Dashboards Matter.

See the Whole Thing

30+ real-time feeds on one page. The ticker is at the top. The product is below it.

Open the Dashboard · About