
Serverless Security Explained

Serverless security is the extra layer of protection designed for applications built on a serverless architecture. In this type of cloud computing, you write the code (functions) while the cloud provider handles the servers. That division of responsibilities calls for a different approach to security.

Wiz Experts Team
7-minute read

Serverless Computing: A refresher

Serverless computing is a cloud computing model where you, the developer, focus on writing and deploying code without worrying about the underlying servers or infrastructure. The cloud provider handles everything from provisioning servers to scaling them based on usage.

Serverless architectures eliminate many chores that come with cloud infrastructure setup and maintenance, including security-related tasks like installing security patches for language runtimes and operating systems. However, using a serverless architecture doesn’t mean your apps are invulnerable.

Security remains a concern in serverless computing. The shared responsibility model of cloud services, including serverless computing, shifts some security responsibilities to the cloud service provider but doesn't eliminate all security concerns for the client.

This article explains serverless security, introduces common security threats for serverless applications, and gives actionable advice about preventing them. Ready to make the most of serverless architecture? Let’s dive in.

What is serverless security?

Serverless security consists of best practices and techniques that protect serverless workloads from unauthorized access. 

Function as a service (FaaS), where you just implement a function that responds to events, usually only exposes a fraction of the underlying instance features to the developer. Consequently, the hidden features are the cloud provider's responsibility. For organizations and developers, serverless security focuses on the new challenges serverless architectures bring, like keeping track of the increased number of cloud resources, each of which is a potential attack vector for a malicious user.
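
To make that concrete, here is roughly what such an event-driven function looks like on AWS Lambda. This is a minimal sketch with a hypothetical event shape; everything beneath the handler (OS, runtime, scaling) is the provider’s job:

    import json

    # A minimal AWS Lambda handler: you ship only this code, and the cloud
    # provider supplies and patches the OS, runtime, and scaling underneath it.
    def lambda_handler(event, context):
        # "event" carries the trigger payload (HTTP request, queue message, etc.)
        name = event.get("name", "world")
        return {
            "statusCode": 200,
            "body": json.dumps({"message": f"Hello, {name}!"}),
        }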

What benefits do serverless architectures bring?

Before we take a closer look at the security aspects, let’s review the reasons you would choose a serverless architecture in the first place.

Flexible managed services

Serverless services are a sub-category of managed services. Their main differentiator is on-demand billing, which means that if they go unused, you incur zero costs. For instance, FaaS lets you write complete programs in the language of your choice without worrying about infrastructure maintenance. Using FaaS, you can build your own backend without the downside of keeping an OS up to date or paying a monthly fee for an instance you might not use 100% of the time.

Improved security

Like with all managed services, the cloud provider takes care of the OS and runtime security in a serverless environment. 

Another upside? FaaS is mostly stateless, which makes it easier to maintain. Functions are distributed over different instances, and each execution is isolated and might not have access to the state of the previous one. If no state from previous executions is available, an attacker can’t abuse it.

Serverless architectures also have a smaller code footprint, since the cloud provider does most of the undifferentiated heavy lifting (networks, databases, gateways, and so on). And because functions are purpose-built for a small use case, it’s easier to keep track of them and ensure they have only the permissions they need.

On-demand billing

On-demand billing is a huge benefit that helps your bottom line. As we’ve mentioned, you only pay for what you use: if no function is running and no data sits in a database, you don’t pay for anything. A single serverless execution might cost more than the same work on traditional infrastructure that runs at full capacity all month, but full utilization is the exception, so it usually pays to shift the risk of idle infrastructure to the cloud provider.

Why do serverless apps require security?

With its drastically increased number of resources, serverless technology is hard to monitor. You can have hundreds of functions that use dozens of databases and queues. While the functions themselves are easy to observe, the services they use are not, and a monitoring solution has to account for those backing services as well. Otherwise, vulnerabilities can fly under the radar until it’s too late. 

Since every function can be an entry point to your system, you have to manage permissions for each, and the more functions you have, the more complex this can get. Let’s look at this in more detail, along with other common threats serverless architectures face.

Common serverless security threats

There are several reasons a serverless architecture might be susceptible to security vulnerabilities. Here are the seven most common:

1. Increased attack surface

Serverless architectures can consist of dozens, sometimes even hundreds, of small services that form a single application. This poses a risk for multiple reasons:

  • More services mean more cognitive load on the engineers who maintain them. While the managed nature of these services can lighten that load somewhat, issues might slip through the cracks if the number of services reaches a critical mass.

  • You can easily expose each serverless function to the public, creating an entry point into your system. It’s critical to keep every function in check so it doesn’t become a liability.

  • Managing permissions for many services can become a full-time job if you don’t implement a reasonable process from the outset. Without one, giving each function full access becomes tempting when a deadline looms.

2. Event data injection

Injection vulnerabilities are the bane of every publicly exposed service. (Think JavaScript injections for HTML, SQL injections for relational databases, and event injection for serverless architectures.) 

If you integrate user-supplied parts of your event data in commands without sanitation, your app can be susceptible to an injection attack. For example, when you build a script via string concatenation and use a URL or filename from a user input, you’re introducing considerable risk.
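
Here’s a hedged sketch of what that looks like in a Python function (the event field and commands are hypothetical): the first handler is injectable, while the second validates the input and avoids the shell entirely.

    import re
    import subprocess

    # UNSAFE: user input is concatenated into a shell command, so an event like
    # {"filename": "report.txt; rm -rf /"} smuggles in a second command.
    def handler_unsafe(event, context):
        subprocess.run("wc -l " + event["filename"], shell=True)

    # SAFER: allowlist-validate the value and pass it as a discrete argument,
    # with no shell involved.
    SAFE_NAME = re.compile(r"^[\w.-]+$")

    def handler_safer(event, context):
        filename = event["filename"]
        if not SAFE_NAME.fullmatch(filename):
            raise ValueError("unexpected characters in filename")
        subprocess.run(["wc", "-l", filename], check=True)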

3. Over-privileged functions

If you follow the principle of least privilege, your application will end up with airtight permissions. In theory, the permissions a serverless function requires are easier to assess than those of a monolithic service. After all, functions are small and purpose-built. 

However, incomplete knowledge of the available permissions or time pressure can lead an engineer to assign a function more permissions than it requires. And when a function that grew too large is split, each of the new functions usually needs only a fraction of its ancestor’s permissions, yet engineers often forget to trim them accordingly.

4. Compromised third-party code

While OS and runtime maintenance is your provider’s responsibility, you must choose libraries and frameworks and keep your application code up to date. Supply chain attacks are on the rise, and if you rely on third-party dependencies without checking their safety, you might install something that compromises your functions. 

5. Accidental state

Although serverless functions are commonly considered stateless, that’s not entirely true. When a cold start happens, the runtime loads your function from scratch, but if your function hasn’t been idle long enough to be evicted and receives another event, the runtime reuses the already loaded function. Everything your code does outside the handler function is cached and becomes state that could be susceptible to a security vulnerability.
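
A small Python sketch of how that happens (the cache and field names are made up for illustration):

    # Anything defined at module level survives "warm" invocations, because the
    # runtime reuses the already loaded function instead of starting from scratch.
    cache = {}  # persists across invocations on the same warm instance

    def lambda_handler(event, context):
        user = event["user"]
        # If secrets or per-user data end up in this dict, a later invocation on
        # the same instance could read or poison what an earlier one left behind.
        if user not in cache:
            cache[user] = {"greeting": f"Hello, {user}"}
        return cache[user]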

6. Side channel attacks

Using a VPC (a private network inside the cloud without direct public internet access) can lead to the false assumption that all services inside the VPC are secure because nobody can directly access the network from the outside. This assumption can lead people to give internal services more privileges than required. That’s a big problem: An external attacker has many tools in their arsenal, even without direct access. And even peered VPCs can be at risk.

7. Billing attacks

Paying on demand is great, but there are downsides. A denial of wallet (DoW) attack is designed to take an app offline by racking up usage charges until the owner has no money to pay for it. According to OWASP, DoW attacks are more of a threat in serverless than DoS attacks.
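
One common mitigation on AWS is to cap how far a function can scale, so a traffic flood can’t run up the bill without limit. A minimal boto3 sketch, with a hypothetical function name:

    import boto3

    lambda_client = boto3.client("lambda")

    # Reserve (and thereby cap) the function's concurrency so an attacker-driven
    # flood of events can't scale it, and the bill, indefinitely.
    lambda_client.put_function_concurrency(
        FunctionName="orders-api",  # hypothetical function name
        ReservedConcurrentExecutions=50,
    )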

A few simple serverless security best practices

Now that we understand the potential issues, let’s look at the top seven prevention methods:

1. Grant minimal permissions

Always follow the principle of least privilege. If a function doesn’t write to a service, don’t give it write access. If a function uses only one bucket, don’t give it access to all buckets.
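
On AWS, for example, that translates into an IAM policy scoped to the one action and bucket the function actually needs. A sketch with boto3 (the bucket, role, and policy names are hypothetical):

    import json

    import boto3

    iam = boto3.client("iam")

    # Allow reads from one specific bucket only: no wildcard actions, no wildcard resources.
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::invoice-uploads/*",  # hypothetical bucket
        }],
    }

    iam.put_role_policy(
        RoleName="invoice-parser-role",          # hypothetical execution role
        PolicyName="read-invoice-uploads-only",
        PolicyDocument=json.dumps(policy),
    )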

2. Take advantage of API gateways

Don’t expose all of your serverless functions directly to the public; use an API gateway instead. That way, you have only one entry point into your application and can manage public access in a central location.
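
On AWS you can enforce this with a resource-based policy that lets only your gateway invoke the function. A hedged boto3 sketch (the function name and ARN are placeholders):

    import boto3

    lambda_client = boto3.client("lambda")

    # Permit invocations only from the one API Gateway identified by SourceArn,
    # keeping the gateway as the single public entry point.
    lambda_client.add_permission(
        FunctionName="orders-api",           # placeholder function
        StatementId="allow-orders-gateway",
        Action="lambda:InvokeFunction",
        Principal="apigateway.amazonaws.com",
        SourceArn="arn:aws:execute-api:us-east-1:123456789012:abc123/*/*/orders",  # placeholder
    )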

3. Employ command query responsibility separation

Split your functions into reading and writing functions. This keeps each function’s code footprint smaller and easier to monitor, and if one of the two is compromised, the other is likely to be unaffected.
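
In practice this can be as simple as deploying two narrowly scoped handlers instead of one, each under its own IAM role; the table and function names below are purely illustrative:

    import json

    import boto3

    table = boto3.resource("dynamodb").Table("orders")  # hypothetical table

    # Query side: deployed with a role that only allows dynamodb:GetItem.
    def get_order_handler(event, context):
        item = table.get_item(Key={"order_id": event["order_id"]}).get("Item")
        return {"statusCode": 200, "body": json.dumps(item, default=str)}

    # Command side: a separate function whose role only allows dynamodb:PutItem.
    def put_order_handler(event, context):
        table.put_item(Item={"order_id": event["order_id"], "status": "new"})
        return {"statusCode": 201, "body": json.dumps({"ok": True})}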

4. Scan your code

Follow the shift-left approach and solve your issues early in development. Utilize code security scanners that run in the IDE and your CI/CD pipeline to ensure you catch issues before they hit the cloud.
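
For Python functions, for instance, a static analyzer such as Bandit can act as a simple pipeline gate. A minimal sketch, assuming Bandit is installed in your CI image and your handlers live under a hypothetical src/functions directory:

    import subprocess
    import sys

    # Run the scanner over the functions' source and fail the build on findings
    # (Bandit exits non-zero when it reports issues).
    result = subprocess.run(["bandit", "-r", "src/functions"])
    sys.exit(result.returncode)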

5. Prioritize monitoring, logging, and tracing

Use monitoring and observability tools and services in production. Otherwise, you’ll have no way to discover what went wrong if you get hacked. Monitoring, logging, and tracing are essential components of ongoing maintenance.
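
At a minimum, emit structured logs that tie every record to the specific invocation so you can reconstruct what happened later. A small sketch for a Python Lambda handler:

    import json
    import logging

    logger = logging.getLogger()
    logger.setLevel(logging.INFO)

    def lambda_handler(event, context):
        # context.aws_request_id uniquely identifies this invocation, which makes
        # it possible to correlate logs, traces, and alerts during an investigation.
        logger.info(json.dumps({
            "request_id": context.aws_request_id,
            "event_source": event.get("source", "unknown"),
            "message": "processing started",
        }))
        # ... business logic goes here ...
        return {"statusCode": 200}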

6. Secure function URLs

If API gateways aren’t an option and you must use function URLs, make sure to keep them secure. We’ve published a detailed guide on our blog, so check it out.

Pro tip

Lambda function URLs may be simple, but like any other externally exposed resource in your cloud environment, they need to be properly secured. While functions behind an API gateway or load balancer rely on the secure configuration of those frontend services, function URLs must be secured independently, and misconfigured instances are an attractive target for malicious actors hoping to cause damage or gain access to sensitive data.
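
On AWS, the decisive setting is the URL’s auth type: AWS_IAM requires signed requests, while NONE leaves the URL open to anyone who finds it. A hedged boto3 sketch (the function name is a placeholder):

    import boto3

    lambda_client = boto3.client("lambda")

    # Create the function URL with IAM authentication rather than AuthType="NONE",
    # so only callers presenting valid, signed credentials can invoke it directly.
    lambda_client.create_function_url_config(
        FunctionName="report-exporter",  # placeholder function
        AuthType="AWS_IAM",
    )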


7. Leverage agentless scanning

Every sufficiently complex system includes multiple technologies to complete its tasks, and you might not be able to go 100% serverless for various reasons. But whether you’re completely or only partially serverless, it’s essential to have full visibility into your infrastructure. Agentless scanning gives you that visibility across serverless and traditional workloads alike, without requiring you to deploy and maintain agents on every resource. 

Wiz's approach to serverless security

Serverless computing offers numerous benefits, but it also introduces unique security challenges. As organizations increasingly adopt serverless containers like AWS Fargate and Azure Container Apps (ACA), it's crucial to implement robust security measures tailored to these environments.

Wiz has recently expanded its capabilities to address these evolving needs by extending its runtime sensor coverage to include serverless containers, offering a comprehensive defense-in-depth strategy for serverless workloads. This expansion brings several key advantages:

  1. Enhanced Visibility: The Wiz runtime sensor now provides deep visibility into serverless container processes, even without direct host access.

  2. Threat Detection and Response: Real-time detection and response capabilities allow organizations to identify and mitigate threats promptly.

  3. Custom Rule Creation: Users can create tailored rules to detect suspicious processes and network behavior, with the ability to trigger specific response actions.

  4. Runtime Hunting: The sensor monitors all serverless container events, centralizing this data to facilitate proactive threat hunting and simplify investigations.

  5. Vulnerability Validation: Wiz helps prioritize remediation efforts by identifying which vulnerabilities are actually exploitable in the runtime environment.

This extension of Wiz's cloud-native security platform ensures that organizations can maintain robust security measures across their entire cloud infrastructure, from code to runtime, without sacrificing the agility and scalability benefits of serverless architectures.
