Serverless architecture is a relatively new approach that is fast becoming a standard way to build mobile apps and the supporting infrastructure for websites. The concept can be difficult to grasp because the very name of the system is misleading.
If these products are “serverless”, where are they? If they aren’t on a server, where do they run, in mid-air? Without a server, what processor runs them? The fact is that these systems aren’t serverless – they run on cloud servers.
The name arose because these functions are stored and executed under a different charging structure. Traditionally, a software developer acquires a server and uses that resource to store and run packages. Now that cloud servers are available, a developer can get a dedicated virtual server or rent storage space and processing power from a cloud host. All the major cloud platforms offer serverless services. With these packages, you pay for the execution time of a function rather than buying blocks of storage or processing capacity.
Serverless services can host supporting systems, such as databases, as well as the principal function that gets activated by external requests.
The serverless hosting tariff is an on-demand usage charge rather than a charge per month for reserved resources.
Serverless programs
The serverless hosting model is used for microservices. A microservice is a small, self-contained function that delivers one element of a service-oriented architecture, and serverless hosting is the hardware delivery system beneath that model.
When writing a program, software developers are accustomed to extracting commonly-used sections of code for reuse. These become functions that can be stored at the top of the file for local use, or kept in an external file that the running program pulls in through a reference.
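As a minimal illustration of that familiar pattern (the function and values here are invented for the example), a commonly-used routine is pulled out once and then reused wherever it is needed:

```python
# A commonly-used routine extracted into a function near the top of the file.
# The same function could equally live in an external module and be imported
# wherever it is needed.
def slugify(title: str) -> str:
    """Turn a page title into a URL-friendly slug."""
    return "-".join(title.lower().split())


# The extracted routine is then reused throughout the program.
for heading in ["Serverless Architecture", "FaaS Providers"]:
    print(slugify(heading))
```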
Microservices are a library of functions stored on a cloud server. The microservice will provide an entire service, such as a form of movement in an animation, or a message posting function in a chat system.
A user-facing application can be delivered in a very lightweight form by making the finished web page or mobile app just a high-level list of function calls, with all of the processing performed on the serverless system that hosts each function. This is particularly important for mobile apps, where local processing needs to be kept to a minimum because the batteries of mobile devices drain quickly if their powerful processors are used to full capacity.
Microservices lie behind the application programming interfaces (APIs) that enable one application to integrate with another. The API calls an access function on one serverless host, which triggers an entire service that could be hosted on the same system or elsewhere. A package is made up of function calls that access a library of services, which again might be hosted on the same serverless platform or on another one entirely.
Developer kits can provide functions for layouts or actions in websites. These are calls to microservices that will trigger every time any page created in the development environment gets accessed. So, the same microservice can be run repeatedly, simultaneously, for many applications that have been built in the same environment.
There is a very small number of serverless hosting providers, so the chance that a supporting function will be hosted on the same system as the calling application is very high. However, these very large suppliers, such as AWS and Azure, have so many servers all over the world that the chance of the two connecting modules being resident on the same physical server is very low.
Microservices can depend, in a hierarchy, on many other microservices. The creator of a mobile app probably doesn’t know exactly where the services accessed through APIs run. Those API modules are themselves composed of functions supplied by libraries hosted elsewhere. So, not only do the developers of mobile apps not know where the functions they use run, they don’t even know where the functions that support those API-dependent services are hosted. There can be many, many layers of dependencies in modern microservices.
Serverless platforms
Serverless platforms offer Function-as-a-Service (FaaS). This is a type of Platform-as-a-Service (PaaS) that takes care of all of the resource needs of software modules and manages their interactions with other hosted functions and with external systems.
Each program that gets uploaded to the FaaS needs to be split into atomic functions that each provide a single service. The trigger that executes each function also needs to be specified within the definition of the function in the FaaS console.
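As a sketch of what one such atomic function might look like, the Python handler below follows AWS Lambda conventions; the event shape assumes an API gateway trigger, and the greeting logic is just an invented single-purpose service:

```python
import json

# A single-purpose ("atomic") function: its only service is building a greeting.
# In the FaaS console, this handler would be registered with a trigger,
# such as an HTTP request arriving through an API gateway (assumed here).
def lambda_handler(event, context):
    # API Gateway-style events carry query parameters in this field.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```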
The platform stores the function in memory and will run it whenever it is called. Servers operate in a cluster, so if the hosting server doesn’t have sufficient capacity, the responsibility for running the code is passed to another server. The actual physical servers involved in supporting the function are abstracted away.
This means that you don’t know exactly which server runs the function and it shouldn’t matter because the platform guarantees performance and availability. The user doesn’t have any responsibility for managing or monitoring the resources involved in delivering the functionality of the hosted module.
Serverless architecture uses
Serverless hosting is useful for providing short-run modules that might be demanded multiple times simultaneously. Failover provision and replication facilitate faster delivery around the world. These systems are useful in the following use cases:
- RESTful APIs Some platforms offer specific API management services, such as Amazon API Gateway.
- Rapid application development Segmenting an application into smaller functions enables developers to get something up and running quickly and then go back and fill out services by adding in functions.
- Continuous Integration and Continuous Delivery (CI/CD) A serverless hosting system can be linked into a development pipeline, with porting occurring automatically on successful test completion. Serverless functions can also be deployed to advance developed code along the pipeline for deployment on other platforms.
- Asynchronous processing Background tasks can be performed on function completion while new functions are triggered to serve the user (see the sketch after this list).
- Trigger activation Automated workflows that are triggered by a user action can be set up as a series of cloud-based functions.
- Mobile offloading Move functions to a FaaS to remove processing demands on mobile devices.
- Authentication Steps to authenticate users, such as multi-factor authentication, can be ported to FaaS to ensure consistency across devices.
- Vulnerability assessments Use microservices to validate new objects, such as containers, and ensure that they don’t contain configuration weaknesses.
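As an illustration of the asynchronous processing pattern above, the sketch below assumes an AWS setup: a Python function triggered by a file upload to an S3 bucket hands the slow work off to a background worker by posting a message to an SQS queue (the queue URL is a placeholder), so the user-facing response returns quickly.

```python
import json
import boto3

sqs = boto3.client("sqs")

# Placeholder queue URL - substitute the URL of your own queue.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/thumbnail-jobs"

def lambda_handler(event, context):
    """Triggered by an S3 upload; queues background work instead of doing it inline."""
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Hand the slow task (e.g. generating a thumbnail) to a background worker
        # that is subscribed to the queue.
        sqs.send_message(
            QueueUrl=QUEUE_URL,
            MessageBody=json.dumps({"bucket": bucket, "key": key}),
        )
    return {"queued": len(records)}
```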
Serverless architecture terminology
Like any area of IT, serverless architecture has its own terminology. The important terms to know include:
- Invocation The execution of a function.
- Cold Start The time it takes for a function to start up, which involves loading it into memory. Functions that are already in memory from a recent execution will have almost no cold start time.
- Concurrency Limit This is a service level condition that forms part of your contract, and it limits the number of simultaneous invocations in any one region.
- Duration The execution time of the function.
- Default Timeout and Maximum Timeout The length of time a function will be left running before being judged as faulty and terminated. The default applies unless you specify a timeout, and the maximum is the longest value you can choose (a configuration sketch follows this list).
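On AWS Lambda, for instance, these limits can be adjusted through the provider's API as well as in the console. The sketch below uses boto3 and a hypothetical function name; the concurrency call sets a per-function cap within the account's regional limit, and other platforms expose equivalent settings.

```python
import boto3

lambda_client = boto3.client("lambda")
FUNCTION_NAME = "demo-function"  # hypothetical function name

# Raise the timeout: the function is terminated if it runs longer than this.
lambda_client.update_function_configuration(
    FunctionName=FUNCTION_NAME,
    Timeout=30,  # seconds; the platform enforces its own maximum
)

# Cap simultaneous invocations of this one function (within the regional limit).
lambda_client.put_function_concurrency(
    FunctionName=FUNCTION_NAME,
    ReservedConcurrentExecutions=50,
)
```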
Serverless architecture strategies
The examples in the previous section can all be distilled into a DevOps strategy that works well with the serverless concept. Serverless architecture has strengths and weaknesses that push users of these services towards a particular style of development, and that discipline can work in their favor.
The cold start concept is a crucial factor that should shape a serverless strategy. If a function isn’t used for a while, the serverless system effectively archives it. The next time it is demanded, the server has to go through extra routines to get it running. However, once it is loaded into memory, a subsequent call that comes soon after will be delivered very quickly.
That means serverless systems aren’t ideal for infrequently used code. They are very good at delivering functions that are triggered again and again and often many times simultaneously. Therefore, if you have lots of assets that are the same behind the scenes but with different skins to make them look unique, then you will get the most out of serverless architecture.
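The effect is easy to see by timing two back-to-back invocations of the same function. This sketch assumes an AWS Lambda function called demo-function (a hypothetical name) and locally configured credentials; the first call may include a cold start, while the second usually lands on an already-warm instance.

```python
import time
import boto3

lambda_client = boto3.client("lambda")

def timed_invoke(function_name: str) -> float:
    """Invoke the function synchronously and return the round-trip time in seconds."""
    start = time.perf_counter()
    response = lambda_client.invoke(
        FunctionName=function_name,
        InvocationType="RequestResponse",
        Payload=b"{}",
    )
    response["Payload"].read()  # drain the response body
    return time.perf_counter() - start

# The first call may pay the cold start penalty; the second is usually much faster.
print("first call: ", timed_invoke("demo-function"))
print("second call:", timed_invoke("demo-function"))
```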
Moving to a serverless architecture
Adopting serverless architecture requires a few small conceptual steps.
First of all, think of the serverless system as a content delivery network (CDN) for functions rather than for entire web pages or media assets. When you sign up for a CDN service, your digital assets get copied to many servers around the world. Web-based businesses are happy with this action because it improves the delivery speed of websites and mobile apps and also provides a failover service.
Systems administrators are comfortable with their sites proliferating onto servers over which they have no control because they still have the original cloud host of their system to log into and manage. They feel that they have control simply because they have a primary location to deal with. However, they don’t have any security controls over the many other servers that actually serve the majority of visitors to the site.
In the case of the CDN, the web host to which you upload your sites is, in reality, no longer serving the public. It is a central repository through which you roll out updates to the real site that people visit. The logistics of serverless systems are very similar: you don’t know where your functions run, just as with a CDN you don’t know which server is delivering your site.
The second conceptual leap that needs to be made to get the most out of serverless architecture is to break up your applications into reusable units. Developers already do this to create function libraries that provide the same service again and again. Under current strategies, effectively, that function code gets copied into the body of the program when the application runs. With serverless systems, that code stays where it is and runs on whichever server the system controller selects.
To benefit from serverless architecture, you need to view a single application as a high-level piece of pseudocode. When you approach development from this angle, you will probably realize that this modular design is something that you are already doing.
You don’t need to compose an entire website, you just need to put: Company Header, Welcome Text, Menu, and Footer. Then Company Header becomes a cloud-hosted function and you can compose that with a series of plug-ins, such as Company Style Sheet, Logo, and Association Link.
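A minimal sketch of that composition approach, assuming each component has already been deployed as a hosted function behind its own URL (the endpoints here are placeholders):

```python
from urllib.request import urlopen

# Placeholder endpoints - each one is assumed to front a hosted function.
COMPONENTS = [
    "https://functions.example.com/company-header",
    "https://functions.example.com/welcome-text",
    "https://functions.example.com/menu",
    "https://functions.example.com/footer",
]

def render_page() -> str:
    """Compose the page as a high-level list of function calls."""
    fragments = []
    for url in COMPONENTS:
        with urlopen(url) as response:
            fragments.append(response.read().decode("utf-8"))
    return "\n".join(fragments)

print(render_page())
```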
You put the style sheet on the cloud server and then it can be called by all of the pages on your site. You might have a shopping cart function that many pages refer to. Once you get just a few layers down into the detail, you will see that you have already assembled your site from several mid-level units, such as the style sheet and the shopping cart. You trust these modules so much that you never want or need to drill into them and see that they are made up of smaller units that have been provided by someone else.
You aren’t that far from the program structure needed for a serverless system operating with FaaS infrastructure.
The last step in switching to serverless architecture is to let go of the belief that you need to keep all of your animals in a fenced field. You don’t need to maintain a server area with lots of extra space that you don’t use. You don’t even need to know where that server is, just like you don’t know what building houses the copy of your website that people in Hong Kong access.
FaaS providers
The major providers of serverless systems (FaaS) are:
- Amazon Web Services (AWS) Lambda
- Microsoft Azure Functions
- Google Cloud Functions
- Cloudflare Workers
- Oracle Cloud Functions
- Alibaba Function Compute
- IBM Cloud Functions
The big three globally in the FaaS market are the same as the top three cloud platforms: AWS, Azure, and GCP. Alibaba is the biggest player in China and the large volume of that market makes it statistically a global player without having much presence anywhere else in the world.
Cloudflare is a major CDN provider and so it caters very easily to FaaS concepts. Oracle and IBM are technically excellent but don’t have the marketing reach of the big three. However, these two providers each get a foot in the door by marketing their FaaS platforms internally to the customers that they have already gained for their business applications.
Monitoring serverless systems
It would be a mistake to think that serverless systems needn’t be monitored. At the same time, it is a delusion to think that a client of these systems could implement any remediation actions to head off performance impairment or shut down security loopholes.
Why bother detecting problems that you can’t fix? Well, some performance issues can be down to inefficient code and you can fix those. Above all, you need to ensure that your FaaS provider is not slacking and is providing the quality of service that you are paying for. If no customers ever check on the delivery by these serverless systems, providers wouldn’t be incentivized to ensure that the services work.
The major reason to implement serverless performance monitoring is to ensure that service level agreements are being met. Customers of FaaS platforms get a console that details throughput statistics but you might have accounts on several services and you can’t sit and watch all of those dashboards all of the time. Therefore, automated monitoring services for serverless accounts are worth the investment.
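As a starting point for that kind of automation, the sketch below pulls the last hour of average and maximum execution duration for a hypothetical AWS Lambda function from CloudWatch; a full monitoring service wraps this sort of polling in alert thresholds across all of your accounts.

```python
from datetime import datetime, timedelta
import boto3

cloudwatch = boto3.client("cloudwatch")

# Pull the last hour of duration statistics for one function (hypothetical name).
stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/Lambda",
    MetricName="Duration",
    Dimensions=[{"Name": "FunctionName", "Value": "demo-function"}],
    StartTime=datetime.utcnow() - timedelta(hours=1),
    EndTime=datetime.utcnow(),
    Period=300,  # 5-minute buckets
    Statistics=["Average", "Maximum"],
)

# Duration is reported in milliseconds.
for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], f'{point["Average"]:.1f} ms avg', f'{point["Maximum"]:.1f} ms max')
```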
Monitoring serverless systems is a specialized niche and not many of the major monitoring tool providers have got this service right yet. Take a look at The Best Serverless Monitoring Tools for a deep dive on this issue. If you haven’t got time to read that comparison report, here are the top five tools that we recommend.
- AppOptics (FREE TRIAL) Use this monitoring tool if you have an account with AWS Lambda. It will trace all of the supporting services for your hosted functions and maintain a live dependency map that speeds up root cause analysis. That enables you to lay the blame in the right place if things go wrong. The tool also has an Azure integration but no specific routines for Azure Functions. Get a 30-day free trial.
- Site24x7 Serverless Monitoring (FREE TRIAL) FaaS platform monitoring for AWS, GCP, and Azure in a bundle with onsite server and hybrid application monitoring. Start a 30-day free trial.
- Datadog Serverless Monitoring This tool also focuses on AWS Lambda performance and has comprehensive tracing and mapping functions for dependencies.
- Catalyst by Zoho A Function-as-a-Service platform that only charges when a hosted module goes into production and includes a code editor in its console as well as asset management screens.
- New Relic Serverless Monitoring FaaS monitoring for AWS, Azure, and GCP systems with dependency mapping and performance tracking. Free for 100 GB of data collection and processing per month.
- Dynatrace Monitor infrastructure for all cloud platforms or get a plan that includes application monitoring as well. Covers AWS Lambda, Google Cloud Functions, and Azure Functions.