What is Serverless? A Simple Explanation for Developers


Serverless doesn’t mean “no servers.” It means you don’t manage servers. You write a function, deploy it, and the cloud provider runs it for you. You pay only when it executes.

```javascript
// This is a complete serverless function (AWS Lambda)
export async function handler(event) {
  const name = event.queryStringParameters?.name || 'World';
  return {
    statusCode: 200,
    body: JSON.stringify({ message: `Hello, ${name}!` }),
  };
}
```

Deploy this, and it handles 0 to 10,000 requests per second automatically. No servers to configure, no scaling to think about, no patches to install.

How it works

  1. You write a function
  2. You deploy it to a cloud provider
  3. A request comes in → the provider spins up your function → runs it → returns the response
  4. No requests? Nothing runs. You pay nothing.

The provider handles: servers, operating systems, scaling, load balancing, security patches, availability.

Serverless providers

| Provider | Service | Free tier |
| --- | --- | --- |
| AWS | Lambda | 1M requests/month |
| Google Cloud | Cloud Functions | 2M invocations/month |
| Cloudflare | Workers | 100K requests/day |
| Vercel | Serverless Functions | Generous hobby tier |
| Netlify | Functions | 125K invocations/month |
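Each provider has its own function signature, so code isn't directly portable between them. For comparison, here is roughly what the same hello-world looks like as a Cloudflare Worker, which uses the standard `Request`/`Response` web APIs instead of Lambda's event object (a sketch):

```javascript
// Sketch: the equivalent function as a Cloudflare Worker.
// Workers receive a web-standard Request and return a Response.
const worker = {
  async fetch(request) {
    const name = new URL(request.url).searchParams.get('name') || 'World';
    return new Response(JSON.stringify({ message: `Hello, ${name}!` }), {
      headers: { 'content-type': 'application/json' },
    });
  },
};

export default worker;
```

The logic is identical; only the wrapper changes. Frameworks like Serverless Framework or SST exist largely to paper over these per-provider differences.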

When to use serverless

Good fit:

  • APIs and webhooks
  • Cron jobs / scheduled tasks
  • Image processing, file handling
  • Low-traffic or spiky-traffic apps
  • MVPs and side projects (free tier is generous)
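To illustrate the cron-job case: a scheduled task is just another function. On AWS, for example, an EventBridge rule such as `rate(1 hour)` invokes it on a schedule, with the schedule living in your deployment config rather than your code. In this sketch, `deleteExpiredSessions` is a hypothetical helper standing in for real work:

```javascript
// Sketch: a scheduled cleanup task as a serverless function.
// A scheduler (e.g. AWS EventBridge) would invoke this periodically.
export async function cleanupHandler(event) {
  const cutoff = Date.now() - 24 * 60 * 60 * 1000; // 24 hours ago
  const deleted = await deleteExpiredSessions(cutoff);
  return { deleted };
}

// Hypothetical stand-in so the sketch is self-contained;
// a real implementation would delete rows from your database.
async function deleteExpiredSessions(cutoff) {
  return 0;
}
```

No server sits idle between runs, which is why scheduled tasks are such a natural fit: you pay for one hour's worth of invocations, not 24 hours of uptime.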

Not ideal:

  • Long-running processes (execution time is capped, often at 10-30 seconds for HTTP-triggered functions)
  • WebSocket connections (stateless by design)
  • Apps that need persistent in-memory state
  • High-throughput, consistent-traffic apps (a regular server is cheaper)

Serverless vs. traditional server

| | Serverless | Traditional server |
| --- | --- | --- |
| Scaling | Automatic | Manual or auto-scaling config |
| Cost at zero traffic | $0 | $5-50/month minimum |
| Cost at high traffic | Can get expensive | Predictable |
| Cold starts | Yes (first request is slower) | No |
| Deployment | Push a function | Deploy to a server |
| Maintenance | None | OS updates, security patches |
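To make the cost rows concrete, here is a back-of-the-envelope model using AWS Lambda's pay-per-use pricing (roughly $0.20 per million requests plus about $0.0000167 per GB-second of compute; exact rates vary by region and architecture, so treat these numbers as illustrative):

```javascript
// Back-of-the-envelope serverless cost model (illustrative AWS-like rates).
function monthlyLambdaCost(requests, avgDurationMs, memoryGb) {
  const requestCost = (requests / 1_000_000) * 0.20;
  const gbSeconds = requests * (avgDurationMs / 1000) * memoryGb;
  const computeCost = gbSeconds * 0.0000166667;
  return requestCost + computeCost;
}

// Spiky side project: 100K requests/month, 100ms at 128MB -> pennies.
console.log(monthlyLambdaCost(100_000, 100, 0.125).toFixed(2));

// Steady high traffic: 100M requests/month at the same profile costs
// tens of dollars, where a flat-rate server would be cheaper.
console.log(monthlyLambdaCost(100_000_000, 100, 0.125).toFixed(2));
```

This is the crossover the table describes: at low or spiky traffic serverless rounds to free, while at sustained high traffic the per-request pricing overtakes a fixed monthly server bill.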

The cold start problem

When a serverless function hasn’t been called recently, the provider needs to spin up a new instance. This “cold start” adds 100ms-2s of latency to the first request. Subsequent requests are fast.

Mitigation: keep functions small, use lightweight runtimes (Node.js, Python), or use providers with minimal cold starts (Cloudflare Workers).
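You can observe cold starts yourself. Module-scope code runs once per instance, not once per request, so a flag set at load time tells you whether a given invocation was that instance's first. A sketch:

```javascript
// Module scope runs once per container/instance, so this flag
// distinguishes the cold first invocation from warm ones.
let isWarm = false;

export async function handler(event) {
  const coldStart = !isWarm;
  isWarm = true;
  return {
    statusCode: 200,
    body: JSON.stringify({ coldStart }),
  };
}
```

The same property is why expensive setup (database connections, SDK clients) belongs at module scope: it runs once per cold start and is reused by every warm invocation after that.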

See also: What is CI/CD? | AWS CLI cheat sheet