Serverless

DEVELOPMENT

Quick Definition

Serverless is a cloud computing model where you deploy individual functions (or small services) and the platform handles everything below: provisioning, scaling, patching, scheduling. You pay only for actual execution time, often billed in milliseconds. AWS Lambda, Cloudflare Workers, Vercel Functions, and Google Cloud Functions are the canonical examples.
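The "individual function" unit of deployment can be sketched as a single request handler with no server process around it. A minimal sketch in the shape of a Cloudflare Workers fetch handler; the greeting logic is purely illustrative:

```typescript
// The entire deployable unit: one object with a fetch handler.
// No listener, no port, no process lifecycle -- the platform owns all of that.
const worker = {
  async fetch(request: Request): Promise<Response> {
    // Read a query parameter and respond; everything else is the platform's job.
    const name = new URL(request.url).searchParams.get("name") ?? "world";
    return new Response(`hello, ${name}`, { status: 200 });
  },
};
```

The same handler shape serves one request a day or thousands per second; there is no capacity knob to turn.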

How it works

When a request comes in, the platform either reuses a warm instance of your function or spins up a new one (a "cold start", typically 50ms-2s depending on platform and language). The function runs, returns a response, and either stays warm for the next request or is torn down after some idle period. Scaling is automatic: one request or 10,000 concurrent requests both work without any capacity configuration, though each newly spawned instance pays the cold-start penalty.

Tradeoffs: serverless is great for spiky, unpredictable traffic but poorly suited to high-volume, steady-state workloads, where reserved compute is cheaper. Cold starts hurt latency most on low-traffic endpoints, since those requests are the most likely to hit a cold instance. Vendor lock-in varies: AWS Lambda is heavily entangled with the AWS ecosystem, while Cloudflare Workers is more portable.
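The cost tradeoff comes down to arithmetic. A back-of-envelope sketch; the per-millisecond, per-request, and reserved-VM rates below are assumed round numbers, not any provider's actual pricing:

```typescript
// Illustrative rates (assumptions, not real price sheets).
const PER_MS_RATE = 0.0000000021; // $ per ms of execution
const PER_REQUEST = 0.0000002;    // $ per invocation
const RESERVED_MONTHLY = 15;      // $ per month for a small always-on VM

function serverlessMonthlyCost(requestsPerMonth: number, avgMs: number): number {
  return requestsPerMonth * (PER_REQUEST + avgMs * PER_MS_RATE);
}

// Spiky, low-volume: 100k requests/month at 50ms each -- pennies.
const spiky = serverlessMonthlyCost(100_000, 50);

// Steady, high-volume: 500M requests/month at 50ms each -- far past the
// point where a reserved box would have been cheaper.
const steady = serverlessMonthlyCost(500_000_000, 50);
```

Under these assumed rates the spiky workload costs a few cents while the steady one runs into the hundreds of dollars, which is the crossover the paragraph above describes.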

Why it matters

Serverless dramatically reduces operational overhead for many use cases. For an indie developer or small team, "no servers to manage" is a real productivity win. For large applications, serverless still makes sense for event-driven or low-traffic surfaces.

Where you'll see this on TerminalFeed

TerminalFeed itself is a serverless app: the API is a Cloudflare Worker, the frontend is on Cloudflare Pages, and the cron jobs are scheduled Worker invocations.
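The "scheduled Worker invocations" piece could be wired up like this. The handler shape matches the Workers `scheduled()` API, but `refreshFeeds`, the schedule, and the wrangler.toml snippet are hypothetical, not TerminalFeed's actual code:

```typescript
// Hypothetical cron schedule, declared in wrangler.toml:
//   [triggers]
//   crons = ["*/15 * * * *"]   # every 15 minutes (assumed)

// Hypothetical stand-in for the real job; returns a count of feeds touched.
async function refreshFeeds(): Promise<number> {
  return 0;
}

const cronWorker = {
  // The platform invokes this on the cron schedule -- no daemon, no crontab host.
  async scheduled(event: { cron: string; scheduledTime: number }): Promise<void> {
    const n = await refreshFeeds();
    console.log(`cron ${event.cron}: refreshed ${n} feeds`);
  },
};
```

The appeal is the same as for request handling: the schedule lives in config, and there is no always-on machine whose only job is to wake up every 15 minutes.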