- Serverless computing promises to further abstract and thereby simplify VM/server infrastructure complexities, while supporting pay-as-you-go models.
- Serverless is a logical extension and evolution of the ideas of microservices, containerization tools, and cloud-oriented software development itself.
Serverless computing’s ability to strip server-management complexity out of next-generation application development and deployment will drive a number of cloud service rollouts this year, which raises the question: when should DevOps teams adopt this approach? This next evolution in cloud computing builds on the momentum of PaaS offerings, microservices development methodologies, and container orchestration tools like Docker Swarm and Kubernetes to support agile app development through functions-as-a-service (FaaS). First popularized a few years ago through Amazon’s AWS Lambda service, serverless computing – also known as FaaS – lets enterprise developers focus on writing code rather than managing servers. The model is straightforward: when an event is triggered, the platform automatically invokes a function inside a container that supplies the execution context and framework for the work, all aimed at reducing operational requirements. A number of cloud providers are rolling out FaaS-based services this year, so in addition to Amazon, buyers will be able to explore IBM OpenWhisk, Microsoft Azure Functions, Google Cloud Functions, and Pivotal Spring Cloud Function.
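The event-trigger model described above can be sketched as a minimal Lambda-style handler. This is an illustrative sketch, not any provider's actual schema: the event shape and the `handler` name are assumptions, though most FaaS platforms follow a similar signature of an event payload plus a platform-supplied context.

```python
import json

def handler(event, context=None):
    """Entry point the platform invokes automatically when an event fires.

    The platform provisions the container and passes in the execution
    context; the function body contains only business logic.
    """
    # Pull a field from the (hypothetical) event payload.
    name = event.get("name", "world")

    # Return a response; no server process lingers after this returns.
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

The developer never touches the VM or app server underneath: the platform scales handler instances up and down with event volume.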
The advantages of serverless computing are twofold: it reduces the work of provisioning and configuring the underlying virtual machine and app server infrastructure needed to support advanced event-based app development projects; and compute resources run more efficiently because functions execute only when needed. The result is applications that are easier to provision, secure, and manage; cheaper to run (via pure usage-based billing); and ultimately more scalable.
While the operational advantages are numerous, auto-scaling and provisioning remain complex. Serverless computing also requires retrofitting existing applications so that they handle each request independently under a request-based compute model, much as a microservices architecture does when moving away from traditional monolithic app development. Another hurdle to broader adoption is the daunting task of gauging the software architecture investments required for serverless and ‘cloud native’ in general. Moving to cloud infrastructure is a major paradigm shift from traditional on-premises compute, storage, and networking, and companies must weigh the benefit and cost of new, demanding application patterns that assume infrastructure spikes and accommodate potential failures.
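A minimal sketch of that retrofit step, using hypothetical function names: logic that once lived inside a long-running monolithic app server is wrapped so that each invocation handles exactly one request and carries no in-process state between calls.

```python
# Existing business logic, previously called from inside a
# long-lived monolithic app server process.
def compute_discount(order_total: float) -> float:
    """Apply a 10% discount to orders over 100 (illustrative rule)."""
    return order_total * 0.9 if order_total > 100 else order_total

# Retrofitted request-based entry point: one event in, one response out.
# No globals or session state survive between invocations, which is what
# lets the platform start and stop instances freely.
def handle_request(event: dict) -> dict:
    total = float(event["order_total"])
    return {"discounted_total": compute_discount(total)}
```

The same discipline that microservices impose (stateless, per-request handling) is what makes existing code fit the serverless model.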
Despite these hurdles, FaaS will most certainly enable enterprise developers to tackle new challenges, particularly those aimed at simplifying operational tasks that react quickly to change events, such as data queries that check for anomalies or validate records in order to reduce security risks and cumbersome manual work. Other early use cases include handling asynchronous message patterns and speeding up the extract, transform, and load (ETL) process with cloud-native data and real-time analytics. Beyond that, enterprises will continue to leverage emerging app development technologies, as well as traditional monolithic app development, where each is appropriate for the development style and use case. Because FaaS builds on recent development architectures and technologies, it is highly likely to actually accelerate the use of container tools and microservices development.
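The anomaly/validation use case above might look like the following sketch: a function triggered per change event that flags suspect records for review. The validation rules and event shape are hypothetical, chosen only to show the pattern.

```python
def validate_record(record: dict) -> list:
    """Return a list of issues found in a record (illustrative rules)."""
    issues = []
    if record.get("amount", 0) < 0:
        issues.append("negative amount")
    if not record.get("user_id"):
        issues.append("missing user_id")
    return issues

def on_change_event(event: dict) -> dict:
    """Invoked automatically on each data-change event; flags anomalies."""
    issues = validate_record(event.get("record", {}))
    return {"flagged": bool(issues), "issues": issues}
```

Because the function runs only when a change event arrives, the validation work consumes no resources between events.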
For a closer look at the pros and cons of serverless computing along with recaps of current offerings and strategies among leading cloud providers, please see “Serverless Computing: The Evolution of Server Virtualization and Containers/Microservices.”