The idea
It started with a simple need. I wanted something on my second monitor. Not a static wallpaper, not a YouTube video on loop, but a living dashboard. Something that showed me what was happening in the world at a glance. Bitcoin price, stock market movement, breaking news, earthquakes, weather, all of it. In one place. With a dark terminal aesthetic that matched the rest of my dev setup.
I looked at what existed. Bloomberg terminals cost $24,000 a year. TradingView is great for charts but locks you into finance. News sites are cluttered with pop-ups and cookie banners. Nothing combined multiple data streams into a single, quiet, always-on display. So I decided to build it myself. One dark page, a grid of panels, each showing a different live feed. No accounts, no paywalls, no clutter. Just data in monospaced type on a black background.
The first version
The first version was embarrassingly simple. A single Next.js page deployed on Cloudflare Pages. Two panels: a Bitcoin price widget pulling from CoinGecko's free API, and a Hacker News feed using their Firebase endpoint. The styling was rough. The layout was broken on mobile. But it worked, and I found myself actually leaving it open on my second screen. That was the signal. If I used it, someone else would too.
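A first panel like that fits in a few lines. This is a sketch assuming CoinGecko's public /simple/price endpoint; the element id and price formatting are illustrative, not TerminalFeed's actual code:

```javascript
// Minimal Bitcoin panel: poll CoinGecko's free endpoint and render the price.
const COINGECKO_URL =
  "https://api.coingecko.com/api/v3/simple/price?ids=bitcoin&vs_currencies=usd";

function formatPrice(data) {
  // /simple/price responds with a shape like { bitcoin: { usd: 67000 } }
  const usd = data?.bitcoin?.usd;
  return usd == null ? "n/a" : `BTC $${usd.toLocaleString("en-US")}`;
}

async function updateBitcoinPanel() {
  const res = await fetch(COINGECKO_URL);
  // "btc-price" is a hypothetical element id for this sketch
  document.getElementById("btc-price").textContent = formatPrice(await res.json());
}
```

Call `updateBitcoinPanel()` on an interval and the panel stays current; everything else is styling.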
So I kept adding panels. One API at a time. CoinGecko for crypto prices. USGS for earthquakes (they have an excellent GeoJSON feed). Open-Meteo for weather, which is completely free with no API key required. Each integration taught me something new about rate limits, response formats, and caching strategies. The panel count grew from 2 to 5 to 10 to 20. Fear and Greed Index, NASA near-Earth objects, Wikipedia recent changes, Reddit, GitHub trending. Every addition made the whole thing more useful.
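The USGS integration gives a flavor of what each panel boils down to: fetch a feed, filter, format. The feed URL below is USGS's public all-day GeoJSON summary; the magnitude cutoff and row format are illustrative choices, not the site's actual code:

```javascript
// USGS publishes recent earthquakes as GeoJSON, refreshed continuously.
const USGS_URL =
  "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_day.geojson";

function toQuakeRows(geojson, minMagnitude = 2.5) {
  // Each feature carries properties.mag and a human-readable properties.place
  return geojson.features
    .filter((f) => f.properties.mag >= minMagnitude)
    .map((f) => `M${f.properties.mag.toFixed(1)} ${f.properties.place}`);
}

async function loadQuakes() {
  const res = await fetch(USGS_URL);
  return toQuakeRows(await res.json());
}
```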
The CORS problem
The biggest technical challenge early on was CORS. If you have ever tried to fetch data from a third-party API directly in the browser, you know the pain. Many APIs do not send cross-origin headers, so the browser blocks the response, logs a cryptic CORS error to the console, and your dashboard panel shows nothing but a red error state.
My first attempt at solving this was a shortcut: free CORS proxy services. They worked for about a week, then went down or started injecting ads. The real solution was Cloudflare Workers. I built a single Worker that sits between the browser and every external API, fetches data server-side, caches responses at the edge, and serves them with proper CORS headers.
This architecture solved more than just CORS. Instead of every visitor hitting 30+ external APIs simultaneously, they hit my Worker, which returns cached responses almost instantly. The dashboard loads in under a second because all data is already at the edge. Crypto prices refresh every 30 seconds. Earthquake data every 5 minutes. NASA data hourly. Each endpoint gets a TTL matching how often the underlying data actually changes.
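Condensed to its essentials, that Worker is a routing table plus a header rewrite. The route paths and TTL table below are a simplified sketch, not the real Worker's configuration, and its actual cache keys and error handling will differ:

```javascript
// Each route maps to an upstream API and a TTL matching how often
// the underlying data actually changes.
const ROUTES = {
  "/api/crypto": {
    upstream:
      "https://api.coingecko.com/api/v3/simple/price?ids=bitcoin&vs_currencies=usd",
    ttl: 30, // crypto prices: 30 seconds
  },
  "/api/quakes": {
    upstream:
      "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_day.geojson",
    ttl: 300, // earthquake data: 5 minutes
  },
};

function corsCacheHeaders(upstreamHeaders, ttl) {
  const headers = new Headers(upstreamHeaders);
  headers.set("Access-Control-Allow-Origin", "*"); // the CORS fix
  headers.set("Cache-Control", `public, max-age=${ttl}`); // let browsers cache too
  return headers;
}

const worker = {
  async fetch(request) {
    const route = ROUTES[new URL(request.url).pathname];
    if (!route) return new Response("not found", { status: 404 });
    // cf.cacheTtl asks Cloudflare to cache the upstream response at the edge,
    // so most visitors never trigger a request to the third-party API at all.
    const res = await fetch(route.upstream, {
      cf: { cacheTtl: route.ttl, cacheEverything: true },
    });
    return new Response(res.body, {
      status: res.status,
      headers: corsCacheHeaders(res.headers, route.ttl),
    });
  },
};
```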
Making it feel alive
Early on, the dashboard had a problem. It was technically functional but emotionally flat. A grid of static numbers on a dark background felt like a spreadsheet, not a living system. So I started adding micro-animations, and they changed everything.
Price flash animations were first. When a crypto price updates, the number briefly flashes green or red depending on direction. Your peripheral vision catches the flash even when you are focused on something else. That is exactly what a second-monitor dashboard needs: ambient awareness without demanding attention.
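The logic behind a flash like that is tiny: compare the new price to the last one and apply a short-lived CSS class. A sketch, with illustrative class names (the CSS animation behind them is assumed, not shown):

```javascript
// Decide which flash class a price change deserves, if any.
function flashClass(prevPrice, nextPrice) {
  if (prevPrice == null || nextPrice === prevPrice) return null;
  return nextPrice > prevPrice ? "flash-up" : "flash-down";
}

function applyFlash(el, prevPrice, nextPrice) {
  const cls = flashClass(prevPrice, nextPrice);
  if (!cls) return;
  el.classList.add(cls); // triggers the green/red CSS animation
  setTimeout(() => el.classList.remove(cls), 600); // remove so it can fire again
}
```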
Then came pulsing live indicators (small green dots that confirm a feed is active), new items sliding in with glow effects, Wikipedia edits streaming via Server-Sent Events, and a faint matrix rain animation behind the main grid. None of these animations are flashy. They are quiet. The dashboard should feel like a living organism breathing steadily in the corner of your screen, occasionally drawing your eye when something changes. Not screaming at you. Just present.
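The Wikipedia stream is the easiest of these to reproduce, because Wikimedia exposes recent changes as a public Server-Sent Events endpoint. The endpoint URL is real; the row formatting is an illustrative guess at panel output, not the actual code:

```javascript
// Each SSE message from the recentchange stream is a JSON object that
// includes at least { wiki, title, user }.
function formatEdit(change) {
  return `${change.wiki}: "${change.title}" by ${change.user}`;
}

function watchWikipedia(onRow) {
  const es = new EventSource(
    "https://stream.wikimedia.org/v2/stream/recentchange"
  );
  es.onmessage = (ev) => onRow(formatEdit(JSON.parse(ev.data)));
  return es; // caller keeps the handle so it can close() the stream
}
```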
The API platform
When I built the Cloudflare Worker to proxy and cache API calls, I did not plan to make it public. But the cached responses sitting at the edge were genuinely useful on their own. So I opened them up as a public API. The flagship endpoint, /api/briefing, returns a complete world snapshot in a single JSON response. Crypto prices, market indices, top news, weather, earthquake activity. One call, everything you need. No API key, no authentication.
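Server-side, an endpoint like that can be assembled by fanning out to already-cached per-feed handlers and merging the results. This is a sketch of the pattern, not the actual implementation; the fetchers map and field names are assumptions:

```javascript
// Run every feed fetcher in parallel and merge the results into one object,
// e.g. { crypto: {...}, markets: {...}, news: [...], weather: {...} }.
async function buildBriefing(fetchers) {
  const entries = await Promise.all(
    Object.entries(fetchers).map(async ([name, fn]) => [name, await fn()])
  );
  return Object.fromEntries(entries);
}
```

Because each fetcher reads from the edge cache rather than the upstream API, the combined response stays fast even though it spans many sources.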
The API found its audience in an unexpected place: AI agents. After I added an llms.txt file, language model agents started discovering TerminalFeed and using the briefing endpoint to ground themselves in current world state. One HTTP request instead of scraping ten different sources. Suddenly TerminalFeed was not just a website for humans. It was infrastructure for machines that need to understand the world.
The tools expansion
Adding developer tools was a natural extension. The core audience was already developers who came for the dashboard. They regularly needed to format JSON, test a regex, generate a UUID, or decode a JWT. Why send them to some ad-infested tool site when I could build those utilities right into TerminalFeed?
Each tool is a standalone page with its own SEO metadata, built in the same terminal aesthetic. JSON formatter, diff checker, hash generator, Base64 encoder, cron expression builder, color converter, and more. They all run entirely in the browser. Your data never leaves your machine.
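The browser-only pattern is the same for every tool. Taking the JSON formatter as the example (a sketch, not the production code): parse, re-serialize, and report errors locally, so nothing is ever sent to a server:

```javascript
// Format a JSON string entirely in the browser. Invalid input returns the
// parser's error message instead of throwing.
function formatJson(input, indent = 2) {
  try {
    return { ok: true, output: JSON.stringify(JSON.parse(input), null, indent) };
  } catch (err) {
    return { ok: false, output: err.message };
  }
}
```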
The strategy is simple. Each tool page captures organic search traffic for queries like "json formatter online" or "sha256 hash generator." A developer finds the tool through Google, uses it, discovers the dashboard. Some stick around. The tools are both useful on their own and a funnel to the main product. At last count, TerminalFeed has over 20 developer tools, and I keep adding more based on what I personally need. If I find myself visiting another site to do something that could live on TerminalFeed, that is a signal to build it.
What I learned
Building TerminalFeed taught me a lot about what works on the internet. The biggest lesson: simplicity wins. Every time.
Every feature that tried to be clever failed. Fancy interactive panels with custom controls nobody understood. A settings page with dozens of options nobody configured. Clever ideas solving problems people did not actually have. Every feature that just worked (reliably, fast, no signup required) succeeded. A panel showing the Bitcoin price. A tool that formats JSON. Simple things, done well, available instantly.
The site people leave open on their second monitor is not the one with the most features. It is the one that never breaks. Reliability is the feature. Speed is the feature. Not asking for an email address is the feature.
If you are building something similar, here is what I would tell you. Start with one data source you personally care about. Make it update in real time. Put it on a dark background. Stare at it for a week. If you keep it open, you are onto something. Then add a second panel. Let the product grow organically from your own usage. Do not plan 30 panels on day one. Plan one, and let the next 29 reveal themselves.
TerminalFeed started as a personal project for my second monitor. It became a dashboard, an API platform, and a developer toolkit. I did not plan that. I just kept building the thing I wanted to use.
That is probably the best advice I can give. Build what you want to use, and keep shipping.