You are in charge of your company's website and all the integration APIs. It is 3 a.m., and you wake up in a cold sweat. You recall reading about a recently identified Linux security vulnerability. Did the team patch the operating systems? Were the applications re-deployed? After a few moments, you realize it was just a flashback to the days before your company migrated to AWS. You remember that you are now leveraging a serverless application approach. You smile and calmly fall back to sleep.
SPR recently hosted an AWS Serverless Immersion Day, covering the value of serverless applications. Here, we continue the topic of the workshop, specifically how serverless architectures impact application security.
If you are not sure what serverless means, see our article Why You Should Use a Serverless Application. The definition that we like to use at SPR is that it "allows you to invest in delivering business value, not infrastructure."
Shared Responsibility Model
AWS states that "security is job zero," and this is a great attitude to take whenever building any application. The Shared Responsibility Model states that AWS is responsible for "Security of the Cloud," and the customer is responsible for "Security in the Cloud." Let's look at what that means in a traditional cloud application.
This shared responsibility model assumes you are not running your applications in a serverless fashion. It shows that you are responsible for securing the platform and the operating system of your EC2 instances. Let's see how a serverless approach changes this.
Serverless shifts more of the security to "security of the cloud" because AWS becomes responsible for more of the infrastructure.
As we move from server-full to server-less (see graph above), AWS becomes responsible for the security of the physical machines, operating system, and runtime environment.
Transitioning to serverless allows you to focus on securing the application layer through authentication and authorization. When developing serverless applications, all interactions with AWS infrastructure occur via service APIs, and AWS IAM protects all of those services.
Identity and Access Management
Security in a serverless environment shifts from infrastructure security to service-level security based on IAM.
While AWS IAM is the foundation for all AWS services, it is particularly crucial in serverless applications; it is the one service that everyone should understand how to use. Breaking an application into smaller micro-services enables applying the principle of least privilege, granting each service only enough authorization to perform its business task.
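To make least privilege concrete, here is a minimal sketch of an IAM policy for a hypothetical order-processing Lambda. The table name, region, account ID, and chosen actions are all illustrative assumptions, not details from the article:

```python
import json

# Least-privilege policy sketch for a hypothetical order-processing Lambda.
# The ARN and action list below are made-up examples for illustration.
least_privilege_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "OrdersTableReadWrite",
            "Effect": "Allow",
            # Grant only the specific actions the business task requires...
            "Action": ["dynamodb:GetItem", "dynamodb:PutItem"],
            # ...and only on the one table this micro-service owns.
            "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/Orders",
        }
    ],
}

print(json.dumps(least_privilege_policy, indent=2))
```

Note what the policy omits: no `dynamodb:*`, no `Resource: "*"`. If this service is ever compromised, the blast radius is limited to two actions on one table.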
Let us look a little deeper at how AWS secures two of the foundational serverless services: Lambda and API Gateway.
We’ve talked about using IAM to secure the services that a Lambda function needs to access. Now, we are going to look at how AWS secures the execution environment for Lambda functions.
When a Lambda function executes, AWS manages both the provisioning of the execution environment and the resources necessary to run your code.
What is an execution environment?
An execution environment provides the resources necessary to execute a Lambda function. The environment exists for the lifetime of an invocation; when the invocation completes, the environment can be destroyed or reused by a subsequent invocation of the same function. Environments are never shared between functions. This function-level isolation ensures that data from one function is never available to a different function.
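Environment reuse is observable from your code: anything initialized outside the handler survives a "warm" reuse of the same environment. A minimal sketch (the handler name and event shape are illustrative assumptions):

```python
# State defined outside the handler lives as long as the execution environment.
# A real Lambda would use this for things like database connections or caches.
invocation_count = 0


def handler(event, context):
    """Minimal Lambda-style handler that counts invocations per environment."""
    global invocation_count
    invocation_count += 1  # increments only while this environment is reused
    return {"environment_invocations": invocation_count}


# Simulating two invocations that land on the same (reused) environment:
print(handler({}, None))  # {'environment_invocations': 1}
print(handler({}, None))  # {'environment_invocations': 2}
```

A fresh (cold) environment would start the count back at zero, which is why handlers should never rely on this state being present.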
Every execution environment runs on a microVM powered by Firecracker, AWS's open-source virtualization technology. A microVM is dedicated to a single AWS account yet can be reused by execution environments across functions within that account, providing a robust isolation model.
This execution isolation ensures that environments are not able to access or modify data that belongs to other environments.
AWS provides all of this capability for us and abstracts away the details. As serverless developers, we can focus on the business value, and AWS manages the security of the execution environment.
API Gateway Security
How do we secure the entry point to our services? How can we be sure the principal accessing the endpoint authenticated successfully and is authorized to do so?
Let’s look at three ways to secure an API Gateway endpoint:
- API Key. API keys can be generated and are typically associated with a usage plan. Tying an API key to a usage plan lets you enforce throttling and quota limits so a caller cannot exceed a reasonable usage threshold.
- Cognito Authorizer. Cognito is an AWS service that manages and authenticates users. Users authenticate against the Cognito service and receive a security token, which is then presented on subsequent invocations of the API and validated.
- Lambda Authorizer. Using a Lambda allows for any custom authentication and authorization implementations.
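Of the three options, the Lambda authorizer is the most flexible: it receives the caller's token and returns an IAM policy allowing or denying the invocation. Here is a minimal sketch; the token check is a placeholder assumption (a real authorizer would validate a JWT or call an identity store), and the handler and token names are made up:

```python
# Minimal sketch of a Lambda TOKEN authorizer for API Gateway.
VALID_TOKENS = {"allow-me"}  # demo placeholder, not a real credential check


def authorizer_handler(event, context):
    """Return an IAM policy document that allows or denies the caller."""
    token = event.get("authorizationToken", "")
    effect = "Allow" if token in VALID_TOKENS else "Deny"
    return {
        "principalId": "demo-user",
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [
                {
                    "Action": "execute-api:Invoke",
                    "Effect": effect,
                    # methodArn identifies the API method being invoked
                    "Resource": event.get("methodArn", "*"),
                }
            ],
        },
    }


print(authorizer_handler({"authorizationToken": "allow-me", "methodArn": "arn:demo"}, None))
```

API Gateway caches the returned policy for a configurable TTL, so the authorizer does not have to run on every request.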
Notice how AWS manages the security of the cloud: it enforces rate limits and authenticates users against Cognito, leaving the security in the cloud to the customer. In this case, security in the cloud means managing the API keys, approving users in the Cognito user pool, or implementing custom authentication. AWS handles the lower-level infrastructure and mechanics of securing the API.
Let’s recap the main points from what we’ve learned:
- AWS is responsible for more security in a serverless environment.
- Using a serverless approach to application development increases the scope of security of the cloud that AWS manages, and reduces the security in the cloud that you have to manage.
- Serverless environments are all about events and APIs. AWS grants access to these through IAM roles and policies, allowing the fine-grained access needed to implement a least-privilege security policy.
- Learn and understand AWS IAM. AWS IAM is the foundation for serverless security, making it the essential service to know and understand.