Serverless IaC Framework

Posted on 13 January 2022, updated on 18 January 2022.

In the previous article, we saw how to deploy a simple serverless infrastructure from the AWS console. We also noticed that this is not a desirable way to operate, especially when it comes to maintaining a consistent infrastructure. For this purpose, a fantastic approach exists: Infrastructure as Code (IaC). Several tools can deploy serverless on AWS this way; I will test Terraform, the Serverless Framework, and AWS SAM here.

Deploying with Terraform

What is Terraform?

Simply put, Terraform is a powerful infrastructure-as-code (IaC) tool that you can use to create, update, and version your cloud infrastructure. It lets administrators quickly provision and reconfigure infrastructure from a single source of truth, in an idempotent manner.

Example implementation

To reproduce the same example with Terraform, I invite you to first download Terraform and set up your credentials to use it with AWS. Once this is done, create a folder and run the terraform init command.

The target folder will have the following final organization:

.
└── terraform-api-dynamo/
    ├── LambdaFunctionOverHttps.js
    ├── LambdaFunctionOverHttps.zip
    ├── data.tf
    ├── main.tf
    ├── policies/
    │   ├── policy.json
    │   └── assume_role_policy.json
    ├── terraform.tfstate
    ├── terraform.tfstate.backup
    └── variables.tf
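
The data.tf and variables.tf files are not shown in this article. Keep in mind that, at a minimum, the project needs an AWS provider configuration somewhere; here is a minimal sketch (the var.aws_region variable and the file split are illustrative assumptions, not the project's actual content):

terraform {
  required_providers {
    aws = {
      source = "hashicorp/aws"
    }
  }
}

# Illustrative: the region could come from a variable declared in variables.tf
provider "aws" {
  region = var.aws_region
}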

 

Create Lambda function

To create the lambda function, create a file LambdaFunctionOverHttps.js and copy the function code into it.
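
If you do not have the function code from the previous article at hand, here is a minimal sketch of what the handler could look like, assuming the classic "operation in the payload" DynamoDB pattern (the payload format and operation names are assumptions, not necessarily the exact code used in this project):

// LambdaFunctionOverHttps.js -- illustrative sketch, not the project's exact code
const AWS = require('aws-sdk');
const dynamo = new AWS.DynamoDB.DocumentClient();

// Expects an event such as:
// { "operation": "create", "tableName": "lambda-apigateway", "payload": { "Item": { "id": "1" } } }
exports.handler = async (event) => {
  const { operation, tableName, payload } = event;
  if (tableName) {
    payload.TableName = tableName;
  }

  switch (operation) {
    case 'create':
      return dynamo.put(payload).promise();
    case 'read':
      return dynamo.get(payload).promise();
    case 'update':
      return dynamo.update(payload).promise();
    case 'delete':
      return dynamo.delete(payload).promise();
    case 'list':
      return dynamo.scan(payload).promise();
    default:
      throw new Error(`Unrecognized operation "${operation}"`);
  }
};

The same handler code can be reused later for the app.js file of the SAM example.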

We also need to prepare the policies required by the lambda. To do this, create a policies folder containing the following two files:

  • assume_role_policy.json
{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Principal": {
          "Service": 
            "lambda.amazonaws.com"
            
        },
        "Action": "sts:AssumeRole"
      }
    ]
  }
  • policy.json
{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Sid": "Stmt1428341300017",
        "Action": [
          "dynamodb:DeleteItem",
          "dynamodb:GetItem",
          "dynamodb:PutItem",
          "dynamodb:Query",
          "dynamodb:Scan",
          "dynamodb:UpdateItem"
        ],
        "Effect": "Allow",
        "Resource": "*"
      },
      {
        "Sid": "",
        "Resource": "*",
        "Action": [
          "logs:CreateLogGroup",
          "logs:CreateLogStream",
          "logs:PutLogEvents"
        ],
        "Effect": "Allow"
      }
    ]
  }

Finally, in the main.tf file, you have to declare in HCL (HashiCorp Configuration Language) the resources needed to deploy a lambda function:

resource "aws_iam_role_policy" "lambda_policy" {
  name = "lambda_policy"
  role = aws_iam_role.role_for_LDC.id
  policy = file("policies/policy.json")
}


resource "aws_iam_role" "role_for_LDC" {
  name = "lambda-assume-role"
  assume_role_policy = file("policies/assume_role_policy.json")
}


resource "aws_lambda_function" "LambdaFunctionOverHttps" {
  function_name = "LambdaFunctionOverHttps"
  filename      = "LambdaFunctionOverHttps.zip"
  role          = aws_iam_role.role_for_LDC.arn
  handler       = "LambdaFunctionOverHttps.handler"
  runtime       = "nodejs12.x"
}

Notice that the file expected by the lambda resource is a .zip. To generate it, we can simply run the zip command: zip LambdaFunctionOverHttps.zip LambdaFunctionOverHttps.js, which compresses the function code into an archive.
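
Alternatively, Terraform can build the archive itself with the archive_file data source (a plausible use for the data.tf file listed in the tree above, although this is only a suggestion, not necessarily what the original project does):

# data.tf (illustrative): let Terraform build the zip from the source file
data "archive_file" "lambda_zip" {
  type        = "zip"
  source_file = "${path.module}/LambdaFunctionOverHttps.js"
  output_path = "${path.module}/LambdaFunctionOverHttps.zip"
}

The aws_lambda_function resource can then point its filename at data.archive_file.lambda_zip.output_path (and use output_base64sha256 as source_code_hash) so the archive is rebuilt whenever the code changes.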

Create the API Gateway

To create the REST API, we only need to modify the main.tf file by adding the following resources:

resource "aws_api_gateway_rest_api" "apiLambda" {
  name        = "DynamoDBOperations"
}

resource "aws_api_gateway_resource" "Resource" {
  rest_api_id = aws_api_gateway_rest_api.apiLambda.id
  parent_id   = aws_api_gateway_rest_api.apiLambda.root_resource_id
  path_part   = "DynamoDBManager"
}

resource "aws_api_gateway_method" "Method" {
   rest_api_id   = aws_api_gateway_rest_api.apiLambda.id
   resource_id   = aws_api_gateway_resource.Resource.id
   http_method   = "POST"
   authorization = "NONE"
}

resource "aws_api_gateway_integration" "lambdaInt" {
   rest_api_id = aws_api_gateway_rest_api.apiLambda.id
   resource_id = aws_api_gateway_resource.Resource.id
   http_method = aws_api_gateway_method.Method.http_method

   integration_http_method = "POST"
   type                    = "AWS"
   uri                     = aws_lambda_function.LambdaFunctionOverHttps.invoke_arn
}

resource "aws_api_gateway_method_response" "response_200" {
  rest_api_id = aws_api_gateway_rest_api.apiLambda.id
  resource_id = aws_api_gateway_resource.Resource.id
  http_method = aws_api_gateway_method.Method.http_method
  status_code = "200"
  response_models = { "application/json" = "Empty" }
}

resource "aws_api_gateway_integration_response" "MyDemoIntegrationResponse" {
  rest_api_id = aws_api_gateway_rest_api.apiLambda.id
  resource_id = aws_api_gateway_resource.Resource.id
  http_method = aws_api_gateway_method.Method.http_method
  status_code = aws_api_gateway_method_response.response_200.status_code

  depends_on = [
    aws_api_gateway_integration.lambdaInt
  ]
}

resource "aws_api_gateway_deployment" "apideploy" {
   depends_on = [aws_api_gateway_integration.lambdaInt]

   rest_api_id = aws_api_gateway_rest_api.apiLambda.id
   stage_name  = "Prod"
}

resource "aws_lambda_permission" "apigw" {
   statement_id  = "AllowExecutionFromAPIGateway"
   action        = "lambda:InvokeFunction"
   function_name = aws_lambda_function.LambdaFunctionOverHttps.function_name
   principal     = "apigateway.amazonaws.com"
   source_arn = "${aws_api_gateway_rest_api.apiLambda.execution_arn}/${aws_api_gateway_deployment.apideploy.stage_name}/POST/DynamoDBManager"
}

output "base_url" {
  value = aws_api_gateway_deployment.apideploy.invoke_url
}

 

Create DynamoDB Table

The DynamoDB table is the simplest resource to create in our context: it is enough to add the corresponding resource to main.tf:

resource "aws_dynamodb_table" "ddbtable" {
  name             = "lambda-apigateway"
  hash_key         = "id"
  billing_mode   = "PROVISIONED"
  read_capacity  = 5
  write_capacity = 5
  attribute {
    name = "id"
    type = "S"
  }
}

 

Deploy the solution

To deploy the solution, Terraform provides simple commands. If you have already run terraform init (which initializes the working directory and the state of your application), you only have to run terraform apply, review the changes that will be made to your infrastructure, and confirm by typing "yes" so that Terraform can add, modify, or delete resources in your AWS account as shown in the plan.
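
In practice, the full command sequence looks like this:

terraform init    # initialize the working directory and download the AWS provider
terraform plan    # preview the resources that will be created
terraform apply   # review the plan and type "yes" to deploy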

You can test the setup the same way as before with a curl from your command line or from within the AWS Console directly.
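
For example, assuming the operation-based payload format from the previous article (the field names below are tied to that function code and are therefore assumptions):

# Hypothetical test call: create an item through the API
curl -X POST \
  -H "Content-Type: application/json" \
  -d '{"operation": "create", "tableName": "lambda-apigateway", "payload": {"Item": {"id": "1234ABCD", "name": "Bob"}}}' \
  "$(terraform output -raw base_url)/DynamoDBManager"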

The whole project is available here: terraform-api-dynamo. At the time of writing, the project was done as a POC; it is in NO WAY an example of good IaC or JS practice. The code is not clean and deserves a rework.

Conclusion

  • Terraform is a great IaC tool and it solves the problems we ran into when setting up an infra through the console, but:
    • It does not let you fully exploit the strengths of serverless
      • The advantage of serverless is that most of the complexity is managed by the cloud provider. With Terraform, however, you have to declare every resource that would otherwise be implicitly managed by AWS; for example, the API alone requires 8 resources to be configured
      • We lose part of the point of serverless, since we have to manage 40 different resources whereas the whole appeal of serverless is having fewer things to manage
    • Terraform was initially designed for more classical infrastructures, with fewer components to plug together and fewer interactions between them (policies to define, code to zip, permissions to grant, etc.)
    • It is not necessarily worth doing this in Terraform: serverless solutions are tied to the chosen provider anyway, so you might as well use that provider's dedicated tool for your IaC (the multi-cloud argument loses much of its appeal here).
    • Terraform does have one advantage, though: it is more granular about what you deploy. It only deploys what you ask it to and does nothing "under the hood", which helps you understand all the mechanisms at work on the cloud provider side and gives you a better grip on what is deployed.

With the rise of serverless across the various cloud providers, specialized frameworks dedicated to deploying serverless infrastructure have emerged. It is fair to ask whether these frameworks really have advantages over good old Terraform.

Deploying with Serverless/SAM

Serverless Framework

What is it? How does it work?

The Serverless Framework lets you build applications on AWS Lambda and other next-gen cloud services that auto-scale and only charge you when they run. This lowers the total cost of running and operating your apps, enabling you to build more and manage less.

The Serverless Framework is a command-line tool that uses an easy, approachable YAML syntax to deploy both your code and the cloud infrastructure needed for a wide range of serverless use cases. It's a multi-language framework that supports Node.js, TypeScript, Python, Go, Java, and more. It's also completely extensible via over 1,000 plugins that can add more serverless use cases and workflows to the Framework.

The Serverless Framework is an open-source IaC tool available on GitHub. It allows you to deploy serverless applications on different cloud providers, including AWS.

Example implementation

To begin with, once you've installed the Serverless CLI and set up your AWS credentials, run the serverless command to initialize a new project. It will guide you through the setup of your serverless application.

The final project’s structure will look like this:

.
└── serverless-api-dynamo/
    ├── README.md
    ├── app.js
    └── serverless.yml
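
To give an idea of the syntax, a minimal serverless.yml for the same Lambda + API + DynamoDB setup could look like the sketch below (service name, region, and resource details are illustrative assumptions, not the project's actual file):

# serverless.yml -- illustrative sketch for the same Lambda + API Gateway + DynamoDB setup
service: serverless-api-dynamo

provider:
  name: aws
  runtime: nodejs12.x
  region: eu-west-1
  iam:
    role:
      # Statements attached to the generated function role
      # (newer syntax; older versions use provider.iamRoleStatements instead)
      statements:
        - Effect: Allow
          Action:
            - dynamodb:PutItem
            - dynamodb:GetItem
            - dynamodb:UpdateItem
            - dynamodb:DeleteItem
            - dynamodb:Scan
            - dynamodb:Query
          Resource: "*"

functions:
  lambdaFunctionOverHttps:
    handler: app.handler
    events:
      - http:
          path: dynamodbmanager
          method: post

resources:
  Resources:
    MyTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: lambda-apigateway
        BillingMode: PAY_PER_REQUEST
        AttributeDefinitions:
          - AttributeName: id
            AttributeType: S
        KeySchema:
          - AttributeName: id
            KeyType: HASH

A single serverless deploy command then packages the code and provisions the whole stack.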


Pros / Cons

Pros

  • Allows you to have a single framework for multiple cloud providers
  • Easy to use
  • Fast to run

Cons

  • The documentation is not the most obvious
  • Less abstraction than SAM

My feelings

The Serverless Framework is quite pleasant to use and rather intuitive. It comes with multiple easy-to-use examples, which let you deploy an infrastructure in less than 5 minutes.

It also allows you to use a dashboard to monitor the use of your application (your stack). However, this dashboard is still very basic and seems to me to be much too light to monitor a real application in production.

Serverless also offers a lot of community-developed plugins, which can be an advantage but requires extra vigilance about how rigorously they are implemented, especially for the security of your infrastructure.

The big negative point I noticed is that the documentation, although extensive, is not always clear, and you sometimes end up on forums (hello StackOverflow) for questions that should normally be answered by the documentation.

SAM 

What is it? How does it work?

The AWS Serverless Application Model (SAM) is an open-source framework for building serverless applications. It provides shorthand syntax to express functions, APIs, databases, and event source mappings. With just a few lines per resource, you can define the application you want and model it using YAML. During deployment, SAM transforms and expands the SAM syntax into AWS CloudFormation syntax, enabling you to build serverless applications faster.

To get started with building SAM-based applications, use the AWS SAM CLI. SAM CLI provides a Lambda-like execution environment that lets you locally build, test, and debug applications defined by SAM templates or through the AWS Cloud Development Kit (CDK). You can also use the SAM CLI to deploy your applications to AWS, or create secure continuous integration and deployment (CI/CD) pipelines that follow best practices and integrate with AWS' native and third party CI/CD systems.

SAM and SAM CLI are open-sourced under the Apache 2.0 license. You can contribute new features and enhancements to serverless-application-model or aws-sam-cli.

It is a framework driven by YAML files, coupled with a CLI tool that deploys the infrastructure described in those files. SAM "compiles" the SAM files into CloudFormation templates before deploying the infrastructure.

 

Example implementation

To begin with, once you've installed the SAM CLI and set up your AWS credentials, run the sam init command. It will guide you through the setup of your SAM application.

The final project’s structure will look like this:

.
└── sam-api-dynamo/
    ├── README.md
    ├── events/
    │   └── event.json
    ├── lambdaFunctionOverHttps/
    │   └── app.js
    ├── samconfig.toml
    └── template.yaml

Create Lambda function

To create the lambda function, create a folder named lambdaFunctionOverHttps. Inside that folder, create a file named app.js and copy the function's JavaScript code into it.

Now you have to configure the template.yaml file and describe your serverless infrastructure in it. To do so, you can copy the following code:

AWSTemplateFormatVersion: "2010-09-09"
Transform: AWS::Serverless-2016-10-31

Globals:
  Function:
    Timeout: 3

Resources:
  LambdaFunctionOverHttps:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: lambdaFunctionOverHttps/
      Handler: app.handler
      Runtime: nodejs12.x
      Architectures:
        - x86_64
      Policies:
        - DynamoDBCrudPolicy:
            TableName: !Ref MyTable
      Events:
        DynamoDBOperations:
          Type: Api
          Properties:
            Path: /dynamodbmanager
            Method: post

The good thing is that this short piece of code configures:

  • The lambda
  • The API that will call the lambda
  • The policy of the lambda (to allow it to write to the DynamoDB table)

If you have been following along, all that is left is to configure the DynamoDB table.

Create DynamoDB table

To configure DynamoDB, nothing could be simpler: SAM provides simplified resources so you don't have to worry about too many parameters. Add the following under the Resources section:

  MyTable:
    Type: AWS::Serverless::SimpleTable
    Properties:
      TableName: lambda-apigateway

(Optional) We can also add outputs to make the deployed resources easier to find:

Outputs:
  LambdaFunctionOverHttps:
    Description: "LambdaFunctionOverHttps Lambda function ARN"
    Value: !GetAtt LambdaFunctionOverHttps.Arn
  LambdaFunctionOverHttpsIamRole:
    Description: "Implicit IAM Role created for the LambdaFunctionOverHttps function"
    Value: !GetAtt LambdaFunctionOverHttpsRole.Arn

Deploy the application


To deploy a SAM application, you can run sam build, which implicitly "translates" your SAM code into CloudFormation. Once it has finished, run sam deploy --guided, which walks you through the deployment of your application. And... that's it! Your application is ready to be used by hundreds of users. Easy, right?
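
In short, the deployment boils down to two commands (the --guided run also saves your answers to samconfig.toml, which you can see in the project tree above):

sam build             # build the artifacts and prepare the template
sam deploy --guided   # prompts for stack name, region, and confirmation, then deploys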

You can test the setup the same way as before with a curl from your command line or from within the AWS Console directly.

The whole project is available as well. At the time of writing, the project was done as a POC; it is in NO WAY an example of good IaC or JS practice. The code is not clean and deserves a rework.

 

Pros / Cons

Pros

  • Suitable for deploying serverless infrastructure on AWS
  • Easy to use
  • Handles the creation of implicit resources (policies, etc.) by itself and embeds a lot of abstraction
  • Based on CloudFormation and maintained by AWS developers
    • CloudFormation is the engine on which SAM is based; if you wonder "why not use CloudFormation directly?", the answer is precisely the extra abstraction SAM adds on top of it
    • Highly compatible with CloudFormation
  • Follows AWS Best Practices

Cons

  • Requires a build step before deployment (this can be long, especially if the infrastructure becomes more complex)

Based on my experience, SAM is the technology I prefer for deploying serverless applications on AWS.

Conclusion

To conclude, the most suitable tool depends on your use case and the objectives you want to achieve. Above all, the most important thing is to stay consistent in your technical stack; multiplying tools is not necessarily the ideal solution.
While the specialized frameworks (Serverless Framework and SAM) are handy, if the main IaC technology on a project is Terraform, it is not worth adding them to your stack, and using Terraform to deploy simple serverless infrastructures is the better idea. However, if you are starting a fully serverless project on AWS and no stack has been adopted yet, tools like AWS SAM or the Serverless Framework are much more relevant than approaching it with Terraform as you would a classical infrastructure.

What about you: do you think serverless will supplant Kubernetes in the future? That new IaC frameworks could replace Terraform by covering traditional infrastructure and serverless in a more complete way?