Serverless is awesome (if you overlook inflated costs, dislike distributed computing, love vendor lock-in), say boffins

If 2019 is the year you try AWS Lambda et al, then here are pitfalls to look out for

Serverless computing, which does actually involve servers, has been touted as a way to reduce computing costs through its pay-per-use model and to free developers from operational concerns.

But – and there's always a but – researchers from the University of California at Berkeley contend that it's an expensive disappointment for all but a few simple applications.

In a paper distributed through preprint service arXiv, seven UCB boffins – Joseph M. Hellerstein, Jose Faleiro, Joseph E. Gonzalez, Johann Schleier-Smith, Vikram Sreekanti, Alexey Tumanov and Chenggang Wu – argue that today's serverless offerings are "a bad fit for cloud innovation and particularly bad for data systems innovation."

The paper, "Serverless Computing: One Step Forward, Two Steps Back," explains that serverless computing allows developers to upload their code to a cloud platform like AWS Lambda, Azure Functions, or Google Cloud Functions, and have it scale dynamically as needed. Rather than set up and deploy a full stack per host, from the operating system and web service to the high-level app, customers just provide their top-level application software and have it interface with the platform's API.

Provisioning of servers or virtual machines, and operational concerns such as operating system updates, are taken care of automatically and invisibly, and the customer gets billed only for the resources used.

Serverless computing, or functions as a service (FaaS), thus sounds rather appealing. But it's too good to be true, the researchers say. They allow that the serverless paradigm has some legitimate use cases – such as running business logic over a database – and they acknowledge the value of autoscaling. But the advantages are outweighed by the downsides, they say.

While moving the industry forward, "today's FaaS offerings also slide two major steps backward," the US-based computer scientists suggest. "First, they painfully ignore the importance of efficient data processing. Second, they stymie the development of distributed systems. This is curious since data-driven, distributed computing is at the heart of most innovation in modern computing."

A service like AWS Lambda, the paper explains, is fine for "embarrassingly parallel functions," like code that resizes images for a variety of client devices or performs object recognition on pictures. It's OK for preprocessing event streams before routing them to an analytics service. And it works for account creation logic, like triggering email notifications when users create online accounts.
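
A thumbnailing function is the canonical example of that embarrassingly parallel case: each uploaded image can be handled independently by its own short-lived invocation. The sketch below assumes Python with boto3 and the Pillow imaging library bundled into the deployment package; the bucket names and sizes are illustrative:

```python
# Sketch of the "embarrassingly parallel" case: an S3-triggered Lambda that
# resizes each uploaded image independently of every other one.
import io
import boto3
from PIL import Image  # assumes Pillow is shipped with the deployment package

s3 = boto3.client("s3")

def lambda_handler(event, context):
    for record in event["Records"]:                  # one record per uploaded object
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        img = Image.open(io.BytesIO(body))
        img.thumbnail((128, 128))                    # resize in place

        out = io.BytesIO()
        img.save(out, format="PNG")
        s3.put_object(Bucket=f"{bucket}-thumbnails", # illustrative output bucket
                      Key=key, Body=out.getvalue())
```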

But current FaaS offerings limit the ability to deal efficiently with data and distributed computing resources.

"As a result, serverless computing today is at best a simple and powerful way to run embarrassingly parallel computations or harness proprietary services," they say. "At worst, it can be viewed as a cynical effort to lock users into those services and lock out innovation."

AWS Lambda, the paper explains, shuts functions down after 15 minutes, and while the service may cache function state to facilitate a warm start the next time the code is invoked, there's no guarantee of returning to the same virtual machine. That means functions must be written to assume that in-memory state is not recoverable.
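
The practical upshot is that anything a function keeps in memory is, at best, a cache. A rough sketch of the pattern, assuming Python on Lambda, with a hypothetical expensive_lookup helper standing in for a read from durable storage:

```python
# Anything held at module level may survive a warm start of the same container,
# but the platform can discard it at any time, so it can only ever be a cache.
_cache = {}   # survives only if the next invocation lands on the same warm container

def lambda_handler(event, context):
    key = event["key"]                        # illustrative request field
    if key not in _cache:
        _cache[key] = expensive_lookup(key)   # hypothetical helper, e.g. a DB read
    return _cache[key]

def expensive_lookup(key):
    # Placeholder for a call to durable storage (S3, DynamoDB, etc.); durable
    # state has to live outside the function, because the VM it ran on is
    # neither addressable nor guaranteed to persist.
    return f"value-for-{key}"
```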

Bandwidth v brandwidth

FaaS services also suffer from I/O bottlenecks. The paper says that a recent study found Lambda's network bandwidth averaged 538Mbps, which was similar to Google's and Microsoft's offerings. Lambda also puts a user's functions on a single VM, so those functions have to share bandwidth.

"With 20 Lambda functions, average network bandwidth was 28.7Mbps — 2.5 orders of magnitude slower than a single SSD," the paper says.

The researchers note that Amazon's announcement of 100Gbps networking at its re:Invent event in November helps somewhat, but constraints remain: "Even with 100Gbps/64 cores, under load you get ∼200MBps per core, still an order of magnitude slower than a single SSD."
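
The arithmetic behind that quote is straightforward. A rough sanity check, with an illustrative figure for SSD throughput:

```python
# Back-of-envelope check of the quoted per-core figure (illustrative numbers only).
total_gbps = 100            # re:Invent announcement: 100Gbps instance networking
cores = 64
per_core_mbytes = total_gbps / cores * 1000 / 8   # Gbps -> MB/s
print(per_core_mbytes)      # ~195 MB/s per core, matching the ~200MBps quote
# A single modern NVMe SSD streams on the order of 2,000 MB/s, i.e. roughly an
# order of magnitude faster, which is the paper's point.
```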

Then there's Lambda's lack of network addressability, which means functions must rely on an intermediary service like S3 to maintain and share state. This makes applications much slower than they would be with point-to-point networking.
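
In code, that's the shuffle-through-storage pattern the paper complains about: one function writes its intermediate result to S3, the next reads it back, and every hop pays object-storage latency instead of a direct network round trip. A sketch, assuming Python with boto3; the bucket and key names are made up:

```python
# Two cooperating functions that cannot talk to each other directly, so they
# pass intermediate state through S3 instead.
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "example-intermediate-state"    # hypothetical shared bucket

def producer_handler(event, context):
    result = {"partial": 42}             # illustrative intermediate result
    s3.put_object(Bucket=BUCKET, Key="job-123/stage1.json",
                  Body=json.dumps(result).encode("utf-8"))

def consumer_handler(event, context):
    obj = s3.get_object(Bucket=BUCKET, Key="job-123/stage1.json")
    state = json.loads(obj["Body"].read())
    return state["partial"] * 2          # continue the computation in a second function
```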

Also, FaaS in its current form doesn't support APIs for specialized hardware, which is only becoming more common thanks to the demands of application-specific workloads.

The UC Berkeley crew contends that FaaS is an architectural anti-pattern that overlooks the realities of latency, bandwidth, and cost. They argue it stymies distributed computing and discourages innovation in hardware-accelerated software and in open source.

To support their points, the boffins conducted several tests. One involved training a machine learning model on Lambda and doing the same on AWS EC2. The result: Lambda was 21x slower and 7.3x more expensive than EC2. Another involved making live predictions from a trained machine learning model. That task cost 57x less on EC2 than on Lambda, and EC2 latency was much better.

Ory Segal, CTO of serverless security biz PureSec, in an email to The Register, said he finds it amazing that even with all the limitations mentioned in the paper, developers are still finding uses for these early serverless platforms.

The researchers, he argues, haven't sufficiently acknowledged the innovation happening in this space.

"I think that if what we are seeing is just the first generation of serverless, then it’s obvious that this is going to be the future of computing – just imagine how much innovation will be done one these platforms evolve even more," he said.

The researchers offer similar optimism about the future potential of cloud computing, but they're not sure whether what comes next will still be called "serverless" after the issues they've identified have been addressed. ®
