Serverless Secrets: The Three Things Teams Get Wrong

Sam Goldstein

Database passwords, account passwords, API keys, private keys, other confidential data... A modern cloud application built from multiple microservices is full of confidential data that needs to be separated and managed. While researching how we would improve and automate secrets management for Stackery customers, I found that much of the advice online is bad. For example, quite a few popular tutorials suggest storing passwords in environment variables or AWS Parameter Store. These are bad ideas that make your serverless apps less secure and introduce scalability problems.

Here are the top 3 bad ideas for handling serverless secrets:

1. Storing Secrets in Environment Variables

Using environment variables to pass environment configuration information into your serverless functions is a common best practice for separating config from your source code. However, environment variables should never be used to pass secrets such as passwords, API keys, credentials, and other confidential information.

Never store secrets in environment variables: the risk of accidental exposure is exceedingly high, which is why you should never pass them to Lambda functions this way. For example:

  • Many app frameworks print all environment variables for debugging or error reporting.
  • Application crashes usually result in environment variables getting logged in plain text.
  • Environment variables are passed down to child processes and can be used in unintended ways.
  • There have been many malicious packages found in popular package repositories which intentionally send environment variables to attackers.

At Stackery we never put secrets in environment variables. Instead we fetch secrets from AWS Secrets Manager at runtime and store them in local variables while they're in use. This makes it very difficult for secrets to be logged or otherwise exfiltrated from the runtime environment.
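As a minimal sketch of this pattern (not Stackery's actual implementation), a Lambda handler can fetch a secret from AWS Secrets Manager into a local variable on each invocation. The secret name "my-app/db-credentials" is a hypothetical example; the code requires the boto3 SDK and the secretsmanager:GetSecretValue permission at runtime.

```python
import json

def get_secret(secret_id):
    # boto3 is imported lazily so the module loads even where the AWS SDK
    # isn't installed; at runtime this calls Secrets Manager directly.
    import boto3
    client = boto3.client("secretsmanager")
    response = client.get_secret_value(SecretId=secret_id)
    # The secret stays in a local variable -- nothing is written to
    # os.environ, so env-var dumps and crash logs won't see it.
    return json.loads(response["SecretString"])

def handler(event, context):
    creds = get_secret("my-app/db-credentials")  # hypothetical secret name
    # ... connect to the database with creds["username"] / creds["password"] ...
```

The key property is that the secret's lifetime is limited to the invocation's local scope instead of living in the process environment.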

2. Storing Secrets in the Wrong Places

If you're dealing with secrets, they should always be encrypted at rest and in transit. By now we all know that keeping secrets in source code is a bad idea. So if secrets can't live in git with your code, where should you keep them? There's a lot of bad advice online suggesting that AWS Systems Manager Parameter Store (aka SSM) is a good place to store your secrets. Like environment variables, Parameter Store is good for configuration but terrible for secrets.

AWS Systems Manager Parameter Store falls short as a secrets backend in a few key areas:

  1. Parameters aren't encrypted at rest by default and are often displayed in plain text in the AWS Console UI. Encryption only applies to entries using the recently added SecureString type.
  2. Parameter Store is free but heavily rate limited. It doesn't accommodate traffic spikes, so you can't rely on fetching secrets from it at runtime under load. To avoid throttling your Lambdas, you end up relying on environment variables to pass Parameter Store values in.
  3. You should never store secrets in environment variables.

At Stackery we use AWS Secrets Manager, which stores secrets securely with fine-grained access policies, scales to handle traffic spikes, and is straightforward to query at runtime.
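One way to keep runtime fetching cheap under traffic spikes is to cache the secret in memory across warm invocations, so a burst of requests doesn't become a burst of API calls. This is a sketch under assumptions, not Stackery's implementation; the TTL and the injectable fetch parameter (used here for testability) are illustrative choices.

```python
import json
import time

_CACHE = {}          # secret_id -> (value, fetched_at)
_TTL_SECONDS = 300   # illustrative refresh interval

def get_secret_cached(secret_id, now=None, fetch=None):
    """Return a secret, caching it in memory across warm invocations so a
    traffic spike doesn't translate into a spike of Secrets Manager calls."""
    now = time.time() if now is None else now
    cached = _CACHE.get(secret_id)
    if cached and now - cached[1] < _TTL_SECONDS:
        return cached[0]  # cache hit: no API call
    if fetch is None:
        import boto3  # lazy import; requires the AWS SDK at runtime
        client = boto3.client("secretsmanager")
        fetch = lambda sid: json.loads(
            client.get_secret_value(SecretId=sid)["SecretString"])
    value = fetch(secret_id)
    _CACHE[secret_id] = (value, now)
    return value
```

Because the Lambda execution environment is reused between invocations, the module-level cache persists for the life of the warm container, and each container refreshes the secret at most once per TTL window.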

3. Bad IAM Permissions

Each function in your application should only have access to the secrets it needs to do its work. However, it's very common for teams to run configurations (often unintentionally) where every function is granted access by default to all secrets from all environments. These "/*" permissions mean a compromised function in a test environment can be used to fetch all production secrets from the secrets store. This is a bad idea for obvious reasons. Permissions should be tightly scoped by environment and usage, with functions defaulting to no secrets access.

At Stackery we automatically scope an IAM role per function and per Fargate container task, limiting AWS Secrets Manager access to the environment the function is running in and to the set of secrets required by that specific function.
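A least-privilege IAM policy for this pattern might look like the following sketch. The account ID, region, and the staging/order-service/* secret path are hypothetical placeholders; the point is that the Resource ARN is pinned to one environment and one service, rather than the overly broad "/*" over all secrets.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "secretsmanager:GetSecretValue",
      "Resource": "arn:aws:secretsmanager:us-west-2:123456789012:secret:staging/order-service/*"
    }
  ]
}
```

A function in the staging environment holding this role can read only its own service's staging secrets; production secret ARNs simply don't match the policy, so a compromised test function can't reach them.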

Managing Serverless Environment Secrets with Stackery

Our team has learned a lot about managing serverless secrets from running production serverless applications and working with many serverless teams and pioneers. We've folded these best practices back into Stackery so serverless teams can easily layer secure secrets management onto their existing projects. If you're curious to read more about how Stackery handles secrets, check out Environment Secrets in the Stackery Docs.


© 2022 Stackery. All rights reserved.