Five years ago, I moved everything I could to AWS Lambda. It was the default answer for 'how do we host this?' at Thea Tech Solutions. We didn't want to manage servers, and the pay-per-use model sounded like financial magic.
But lately, I've been changing my tune. The honeymoon period with serverless functions is over. The cold starts are annoying, the execution time limits are stifling, and debugging a distributed async mess is not my idea of a good time.
This week, I've been architecting a new dashboard for a client, and I made a decision: I am not deploying a single Lambda function. Instead, I'm going all-in on Edge computing and long-running containers. Here is why this shift makes sense for modern web architecture in 2024.
The Problem with the 'Serverless' Default
Don't get me wrong—serverless has its place. If you have a sporadic, fire-and-forget task like processing an uploaded image or sending a webhook, it is fine. But for the critical path of your application? It introduces too much friction.
Cold starts are real. When a user clicks a button and waits two seconds for your function to spin up, that is a bad user experience. You can try to keep functions 'warm' with scheduled pings, but that feels like hacking around the platform rather than using it.

Vendor lock-in is tighter than it looks. Once your business logic lives inside a handler.js tied to Lambda's event and context objects, you are married to AWS. Moving that logic to Google Cloud or Azure means rewriting all the glue code.
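One way to soften the lock-in is to keep the Lambda-specific glue to a single thin layer. Here is a minimal sketch (the order shape and summarizeOrder function are illustrative, not from any real project):

```javascript
// Portable business logic — no AWS types anywhere. This part moves to any
// platform unchanged.
function summarizeOrder(order) {
  const total = order.items.reduce((sum, item) => sum + item.price * item.qty, 0);
  return { id: order.id, itemCount: order.items.length, total };
}

// The only place that knows about Lambda's event shape. In a real deployment
// this would be `exports.handler` in handler.js.
const handler = async (event) => {
  const order = JSON.parse(event.body);
  return { statusCode: 200, body: JSON.stringify(summarizeOrder(order)) };
};
```

If you later move to another platform, only the handler wrapper gets rewritten; summarizeOrder stays untouched.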
The Edge is the New Serverless
So, where am I going? To the Edge. Specifically, Cloudflare Workers.
The concept here is similar to serverless—you write code, deploy it, and don't worry about the underlying metal—but the execution model is fundamentally different. Instead of spinning up a heavy container somewhere in us-east-1, the code runs on lightweight V8 isolates distributed across 300+ cities globally.
For a Next.js application, this is a game changer. I am currently using the Next.js App Router with the @cloudflare/next-on-pages adapter. This allows me to deploy the entire application to the Edge.
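For a sense of how small the unit of deployment is, here is a sketch of a Workers-style handler (the /api/health route is a placeholder; in a real Worker this object is the module's default export):

```javascript
// Minimal Cloudflare Workers-style module: the platform invokes fetch() for
// every incoming request — no server process, no cold container to boot.
const worker = {
  async fetch(request) {
    const url = new URL(request.url);
    if (url.pathname === "/api/health") {
      return new Response(JSON.stringify({ ok: true }), {
        headers: { "content-type": "application/json" },
      });
    }
    return new Response("Not found", { status: 404 });
  },
};
```

Because this runs in a V8 isolate rather than a container, startup is effectively instant, which is the whole point.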
The Hybrid Approach: Edge + Containers
Moving to the Edge doesn't mean I abandon AWS entirely. I still need a place for heavy lifting—AI inference, complex data processing, or long-running jobs. For that, I am stepping back from Lambda and using AWS ECS (Elastic Container Service) or even just a managed EC2 instance.
Why? Because sometimes I just need a process that stays up. If I am running a Python script to crunch data, I don't want to worry about a 15-minute timeout. I want a server, a real server, that I can SSH into if things go sideways.
My new architecture looks like this:
* Static Assets & UI: Hosted on Vercel or Cloudflare Pages.
* API & Auth: Edge Functions (Cloudflare Workers). This handles session checks, lightweight database reads, and routing.
* Heavy Compute: AWS ECS. The Edge functions call the containerized backend only when necessary.
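The split between the Edge layer and the container layer can be as simple as a routing table. A sketch, where the origin URL and route prefixes are hypothetical:

```javascript
// Hypothetical ECS load-balancer origin and the routes that need real compute.
const HEAVY_ORIGIN = "https://compute.internal.example.com";
const HEAVY_PREFIXES = ["/api/reports", "/api/inference"];

// Decide per request: heavy routes get proxied to the container cluster,
// everything else is answered at the Edge.
function resolveBackend(pathname) {
  const heavy = HEAVY_PREFIXES.some((prefix) => pathname.startsWith(prefix));
  return heavy
    ? { target: "container", origin: HEAVY_ORIGIN }
    : { target: "edge", origin: null };
}
```

Inside a Worker, the container branch is just `fetch(HEAVY_ORIGIN + pathname, request)` — the Edge stays a thin, fast front door.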
The Supabase Factor
I am also leaning heavily on Supabase for this stack. Since Supabase exposes Postgres behind a built-in connection pooler, I can query it directly from the Edge.

With traditional serverless, you had to bolt on RDS Proxy because each concurrent Lambda opens its own database connection, and hundreds of them can exhaust Postgres's connection limit. From the Edge, those connections go through Supabase's pooler instead. Or, better yet, I can use the Supabase REST API directly from the client or the Edge, skipping a backend function entirely.
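The REST route is just HTTP: Supabase auto-generates a PostgREST endpoint per table. A sketch of building such a query from an edge function, where the project URL, anon key, and metrics table are placeholders:

```javascript
// Build a request against Supabase's auto-generated REST API (PostgREST).
// filters uses PostgREST's column=eq.value syntax for equality matches.
function buildSupabaseQuery(projectUrl, anonKey, table, filters) {
  const url = new URL(`${projectUrl}/rest/v1/${table}`);
  for (const [column, value] of Object.entries(filters)) {
    url.searchParams.set(column, `eq.${value}`);
  }
  return new Request(url, {
    headers: { apikey: anonKey, Authorization: `Bearer ${anonKey}` },
  });
}

// From a Worker, this is one line:
// const rows = await (await fetch(buildSupabaseQuery(env.SUPABASE_URL, env.SUPABASE_ANON_KEY, "metrics", { client_id: "42" }))).json();
```

No driver, no connection pool to think about — the database read is just another fetch.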
The Cost Reality
People talk about cost savings with serverless, but my AWS bill often tells a different story. Request counts and execution time add up fast.
Cloudflare's pricing model is ridiculously simple. You pay for requests, not compute time (mostly). For the high-traffic, low-compute workloads that make up the majority of web apps, the Edge is significantly cheaper.
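A back-of-envelope comparison makes the point. The rates below are published list prices at the time of writing (and ignore free tiers and Workers' CPU-time charges), so treat them as illustrative, not a quote:

```javascript
// Lambda bills per request plus per GB-second of execution.
function lambdaMonthlyCost(requests, avgSeconds, memoryGb) {
  const requestCost = (requests / 1e6) * 0.2;                        // $0.20 per 1M requests
  const computeCost = requests * avgSeconds * memoryGb * 0.0000166667; // per GB-second
  return requestCost + computeCost;
}

// Workers' paid plan bills (mostly) per request.
function workersMonthlyCost(requests) {
  return (requests / 1e6) * 0.3;                                     // $0.30 per 1M requests
}

// 10M requests/month at ~100ms each on 512MB:
// lambdaMonthlyCost(10e6, 0.1, 0.5) comes to roughly $10.33,
// workersMonthlyCost(10e6) to $3.00.
```

The gap widens as traffic grows, because the Lambda compute term scales with every millisecond of every invocation.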
The Takeaway
I'm not saying delete your Lambda functions today. If it works, it works. But for new projects? Stop defaulting to serverless.
The Edge offers the speed of serverless without the cold start latency. Containers offer the stability of traditional servers without serverless constraints like execution timeouts. Together they give me the performance I need and a debugging experience I actually enjoy.
If you are starting a new React Native or Next.js project today, try pushing your API logic to the Edge. You won't miss the cold starts.