Edge computing pushes computation out of central regions and onto a network of points of presence (PoPs) close to users. Cloudflare runs in 300+ cities; Fastly, AWS Lambda@Edge, and Vercel Edge Functions use similar topologies. The defining characteristic: code runs within ~50ms of any user on Earth, often within 10ms.
An edge platform replicates your code to every PoP. An incoming request is routed to the nearest PoP (typically via anycast) and handled there. Static assets are served from the edge cache (the original CDN use case); dynamic code (workers, edge functions) executes at the edge. The origin is contacted only when the local cache misses or the request needs central state.
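The cache-then-origin flow can be sketched as a toy model in TypeScript. The `Map`-based cache and the `originFetch` helper are stand-ins invented here for illustration, not the real platform APIs (a Worker would use the Cache API and `fetch()`):

```typescript
// Minimal model of per-PoP request handling: serve from the local
// cache when possible, fall back to the origin on a miss.
type EdgeResponse = { body: string; fromCache: boolean };

const popCache = new Map<string, string>(); // stand-in for a PoP's local cache

// Stand-in for a round trip to the central origin server.
async function originFetch(path: string): Promise<string> {
  return `origin:${path}`;
}

async function handleRequest(path: string): Promise<EdgeResponse> {
  const cached = popCache.get(path);
  if (cached !== undefined) {
    return { body: cached, fromCache: true }; // edge cache hit: no origin trip
  }
  const body = await originFetch(path); // cache miss: pay the origin latency once
  popCache.set(path, body); // populate the local cache for later requests
  return { body, fromCache: false };
}
```

The first request for a path pays the origin round trip; every subsequent request at that PoP is served locally, which is where the origin-offload numbers come from.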
Edge runtimes are typically resource-constrained (Cloudflare Workers: 128MB of memory, with per-request CPU limits on the order of tens of milliseconds, depending on plan) and use V8 isolates or Wasm for near-instant cold starts. Persistent state lives in eventually-consistent stores (Workers KV, Vercel's Edge Config) or, when strong consistency is needed, in single-location primitives like Durable Objects.
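Eventual consistency is the part that surprises people: a write is visible immediately at the PoP that accepted it, but other PoPs see it only after background replication. A toy model, with all class and method names invented here (this is not the Workers KV API):

```typescript
// Toy model of an eventually-consistent edge KV store: a write lands
// at one PoP immediately and reaches the others only after a
// background propagation step.
class EventuallyConsistentKV {
  private replicas: Map<string, Map<string, string>>;
  private pending: Array<{ origin: string; key: string; value: string }> = [];

  constructor(pops: string[]) {
    this.replicas = new Map(pops.map((p) => [p, new Map<string, string>()]));
  }

  put(pop: string, key: string, value: string): void {
    this.replicas.get(pop)!.set(key, value); // visible locally right away
    this.pending.push({ origin: pop, key, value }); // replicated later
  }

  get(pop: string, key: string): string | undefined {
    return this.replicas.get(pop)!.get(key); // reads are always local
  }

  // Simulates background replication to every other PoP.
  propagate(): void {
    for (const write of this.pending) {
      for (const [pop, store] of this.replicas) {
        if (pop !== write.origin) store.set(write.key, write.value);
      }
    }
    this.pending = [];
  }
}
```

Until `propagate()` runs, a read at a different PoP returns stale (or no) data, which is why these stores suit read-heavy, slowly-changing data rather than coordination.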
For latency-sensitive applications, edge execution can be an order of magnitude faster than a round trip to a centralized region. For high-volume sites, edge caching can cut origin load by 90%+ and reduces bandwidth costs proportionally. For globally-distributed apps, edge is now table stakes.
TerminalFeed's API is a Cloudflare Worker that runs at the edge in every Cloudflare PoP. The data-for-traders article touches on why edge latency matters for real-time data.