<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/"
    xmlns:atom="http://www.w3.org/2005/Atom" xmlns:media="http://search.yahoo.com/mrss/" version="2.0">
    <channel>
        
        <title>
            <![CDATA[ Ifeanyi Otuonye - freeCodeCamp.org ]]>
        </title>
        <description>
            <![CDATA[ Browse thousands of programming tutorials written by experts. Learn Web Development, Data Science, DevOps, Security, and get developer career advice. ]]>
        </description>
        <link>https://www.freecodecamp.org/news/</link>
        <image>
            <url>https://cdn.freecodecamp.org/universal/favicons/favicon.png</url>
            <title>
                <![CDATA[ Ifeanyi Otuonye - freeCodeCamp.org ]]>
            </title>
            <link>https://www.freecodecamp.org/news/</link>
        </image>
        <generator>Eleventy</generator>
        <lastBuildDate>Fri, 15 May 2026 22:29:33 +0000</lastBuildDate>
        <atom:link href="https://www.freecodecamp.org/news/author/REXTECH/rss.xml" rel="self" type="application/rss+xml" />
        <ttl>60</ttl>
        
            <item>
                <title>
                    <![CDATA[ How to Build a Serverless CRUD REST API with the Serverless Framework, Node.js, and GitHub Actions ]]>
                </title>
                <description>
                    <![CDATA[ Serverless computing emerged as a response to the challenges of traditional server-based architectures. With serverless, developers no longer need to manage or scale servers manually. Instead, cloud providers handle infrastructure management, allowin... ]]>
                </description>
                <link>https://www.freecodecamp.org/news/how-to-build-a-serverless-crud-rest-api/</link>
                <guid isPermaLink="false">66c63e8f9aca8203eaa3e7e3</guid>
                
                    <category>
                        <![CDATA[ APIs ]]>
                    </category>
                
                    <category>
                        <![CDATA[ Node.js ]]>
                    </category>
                
                    <category>
                        <![CDATA[ GitHub ]]>
                    </category>
                
                    <category>
                        <![CDATA[ serverless ]]>
                    </category>
                
                    <category>
                        <![CDATA[ serverless framework ]]>
                    </category>
                
                <dc:creator>
                    <![CDATA[ Ifeanyi Otuonye ]]>
                </dc:creator>
                <pubDate>Wed, 21 Aug 2024 19:22:55 +0000</pubDate>
                <media:content url="https://cdn.hashnode.com/res/hashnode/image/upload/v1724267592147/e9dc4429-6475-4d35-b0e8-81c116f769b8.jpeg" medium="image" />
                <content:encoded>
                    <![CDATA[ <p>Serverless computing emerged as a response to the challenges of traditional server-based architectures. With serverless, developers no longer need to manage or scale servers manually. Instead, cloud providers handle infrastructure management, allowing teams to focus solely on writing and deploying code.</p>
<p>Serverless solutions automatically scale based on demand and offer a pay-as-you-go model. This means that you only pay for the resources your application actually uses. This approach significantly reduces operational overhead, increases flexibility and accelerates development cycles, making it an attractive option for modern application development.</p>
<p>By abstracting server management, Serverless platforms let you concentrate on business logic and application functionality. This leads to faster deployments and more innovation. Serverless architectures are also event-driven, which means they can automatically respond to real-time events and scale to meet user demands without manual intervention.</p>
<h2 id="heading-table-of-contents">Table of Contents</h2>
<ol>
<li><p><a class="post-section-overview" href="#heading-important-concepts-to-understand">Important Concepts to Understand</a></p>
<ul>
<li><p><a class="post-section-overview" href="#heading-application-programming-interface-api">Application Programming Interface (API)</a></p>
</li>
<li><p><a class="post-section-overview" href="#heading-http-methods">HTTP Methods</a></p>
</li>
<li><p><a class="post-section-overview" href="#heading-amazon-api-gateway">Amazon API Gateway</a></p>
</li>
<li><p><a class="post-section-overview" href="#heading-amazon-dynamodb">Amazon DynamoDB</a></p>
</li>
<li><p><a class="post-section-overview" href="#heading-serverless-crud-application">Serverless CRUD Application</a></p>
</li>
<li><p><a class="post-section-overview" href="#heading-the-serverless-framework">The Serverless Framework</a></p>
</li>
<li><p><a class="post-section-overview" href="#heading-github-actions">GitHub Actions</a></p>
</li>
<li><p><a class="post-section-overview" href="#heading-postman">Postman</a></p>
</li>
</ul>
</li>
<li><p><a class="post-section-overview" href="#heading-prerequisites">Prerequisites</a></p>
</li>
<li><p><a class="post-section-overview" href="#heading-our-use-case">Our Use Case</a></p>
</li>
<li><p><a class="post-section-overview" href="#heading-tutorial-objectives">Tutorial Objectives</a></p>
</li>
<li><p><a class="post-section-overview" href="#heading-how-to-get-started-clone-the-git-repository">How to Get Started: Clone the Git Repository</a></p>
</li>
<li><p><a class="post-section-overview" href="#heading-step-1-set-up-the-serverless-framework-environment">Step 1: Set up the Serverless Framework Environment</a></p>
</li>
<li><p><a class="post-section-overview" href="#heading-step-2-define-the-api-in-the-serverless-yaml-file">Step 2: Define the API in the Serverless YAML File</a></p>
</li>
<li><p><a class="post-section-overview" href="#heading-step-3-develop-the-lambda-functions-for-crud-operations">Step 3: Develop the Lambda Functions for CRUD Operations</a></p>
<ul>
<li><p><a class="post-section-overview" href="#heading-create-coffee-lambda-function">Create Coffee Lambda function</a></p>
</li>
<li><p><a class="post-section-overview" href="#heading-get-coffee-lambda-function">Get Coffee Lambda function</a></p>
</li>
<li><p><a class="post-section-overview" href="#heading-update-coffee-lambda-function">Update Coffee Lambda function</a></p>
</li>
<li><p><a class="post-section-overview" href="#heading-delete-coffee-lambda-function">Delete Coffee Lambda function</a></p>
</li>
</ul>
</li>
<li><p><a class="post-section-overview" href="#heading-step-4-set-up-cicd-pipeline-multi-stage-deployments-for-dev-and-prod-environments">Step 4: Set Up CI/CD Pipeline Multi-stage Deployments for Dev and Prod Environments</a></p>
</li>
<li><p><a class="post-section-overview" href="#heading-step-5-test-the-dev-and-prod-pipelines">Step 5: Test the Dev and Prod Pipelines</a></p>
</li>
<li><p><a class="post-section-overview" href="#heading-step-6-test-and-validate-prod-and-dev-apis-using-postman">Step 6: Test and Validate Prod and Dev APIs using Postman</a></p>
</li>
<li><p><a class="post-section-overview" href="#heading-conclusion">Conclusion</a></p>
</li>
</ol>
<p>Before diving into the technical details, we'll go over some key background concepts.</p>
<h2 id="heading-important-concepts-to-understand">Important Concepts to Understand</h2>
<h3 id="heading-application-programming-interface-api">Application Programming Interface (API)</h3>
<p>An Application Programming Interface (API) allows different software applications to communicate and interact with each other. It defines the methods and data formats that applications can use to request and exchange information for integration and data sharing between diverse systems.</p>
<h3 id="heading-http-methods">HTTP Methods</h3>
<p>HTTP methods or request methods are a critical component of web services and APIs. They indicate the desired action to be performed on a resource in a given request URL.</p>
<p>The most commonly used methods in RESTful APIs are:</p>
<ul>
<li><p><strong>GET</strong>: used to retrieve data from a server</p>
</li>
<li><p><strong>POST</strong>: sends data, included in the body of the request, to create or update a resource</p>
</li>
<li><p><strong>PUT</strong>: updates or replaces an existing resource or creates a new resource if it doesn’t exist</p>
</li>
<li><p><strong>DELETE</strong>: deletes the specified data from the server.</p>
</li>
</ul>
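<p>For a concrete feel (a hedged sketch: the URL and request bodies here are placeholders, not part of this tutorial's API), these are the request options a browser client might pass to <strong>fetch()</strong> for each method:</p>
<pre><code class="lang-javascript">// A hypothetical endpoint, used purely for illustration
const url = 'https://api.example.com/coffee';

// The method on each request tells the server which action to perform;
// each of these objects would be passed to fetch(url, options)
const createRequest = { method: 'POST', body: JSON.stringify({ coffee_blend: 'Espresso' }) };
const listRequest   = { method: 'GET' };
const updateRequest = { method: 'PUT', body: JSON.stringify({ order_status: 'Ready' }) };
const deleteRequest = { method: 'DELETE' };
</code></pre>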
<h3 id="heading-amazon-api-gateway">Amazon API Gateway</h3>
<p>Amazon API Gateway is a fully managed service that makes it easy for developers to create, publish, maintain, monitor and secure APIs at scale. It acts as an entry point for multiple APIs, managing and controlling the interactions between clients (such as web or mobile applications) and backend services.</p>
<p>It also provides various functions, including request routing, security, authentication, caching and rate limiting that help simplify the management and deployment of APIs.</p>
<h3 id="heading-amazon-dynamodb">Amazon DynamoDB</h3>
<p>DynamoDB is a fully managed NoSQL database service designed for high scalability, low latency, and replication of data across multiple regions.</p>
<p>DynamoDB stores data in a schema-less format, allowing for flexible and fast storage and retrieval of structured and semi-structured data. It is commonly used for building scalable and responsive applications in cloud-based environments.</p>
<h3 id="heading-serverless-crud-application">Serverless CRUD Application</h3>
<p>A serverless CRUD application still provides the ability to <strong>Create, Read, Update and Delete</strong> data, but the architecture and components involved differ from traditional server-based applications.</p>
<p><strong>Create</strong> involves adding new entries to a DynamoDB table. The <strong>Read</strong> operation retrieves data from a DynamoDB table. <strong>Update</strong> updates existing data in DynamoDB. And the <strong>Delete</strong> operation deletes data from DynamoDB.</p>
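<p>In DynamoDB terms (a summary sketch of the mapping this tutorial implements), each CRUD verb boils down to a single DynamoDB action:</p>
<pre><code class="lang-javascript">// How each CRUD operation maps onto the DynamoDB action the Lambda performs
const crudToDynamoAction = {
  Create: 'dynamodb:PutItem',
  Read: 'dynamodb:Scan',      // a single-item read would use dynamodb:GetItem
  Update: 'dynamodb:UpdateItem',
  Delete: 'dynamodb:DeleteItem'
};
</code></pre>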
<h3 id="heading-the-serverless-framework">The Serverless Framework</h3>
<p>The Serverless Framework is an open-source tool that simplifies the deployment and management of serverless applications across multiple cloud providers, including AWS. It abstracts away the complexity of provisioning and managing infrastructure by allowing developers to define their infrastructure as code using a YAML file.</p>
<p>The framework handles the deployment, scaling and updating of serverless functions, APIs and other resources.</p>
<h3 id="heading-github-actions">GitHub Actions</h3>
<p>GitHub Actions is a powerful CI/CD automation tool that allows developers to automate their software workflows directly from their GitHub repository.</p>
<p>With GitHub Actions, you can create custom pipelines triggered by events such as code pushes, pull requests, or branch merges. These workflows are defined in YAML files within the repository and can perform tasks like testing, building and deploying applications to various environments.</p>
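<p>As an illustrative sketch (the file path, branch name, and stage here are placeholders; the actual pipeline for this project is set up in Step 4), a minimal deployment workflow looks like this:</p>
<pre><code class="lang-yaml"># .github/workflows/deploy.yml -- illustrative sketch only
name: deploy
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm install
      # The Serverless Framework reads this environment variable to authenticate
      - run: npx serverless deploy --stage dev
        env:
          SERVERLESS_ACCESS_KEY: ${{ secrets.SERVERLESS_ACCESS_KEY }}
</code></pre>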
<h3 id="heading-postman">Postman</h3>
<p>Postman is a popular collaboration platform that simplifies the process of designing, testing, and documenting APIs. It offers a user-friendly interface for developers to create and send HTTP requests, test API endpoints, and automate testing workflows.</p>
<p>Alright, now that you're familiar with the tools and technologies we'll use here, let's dive in.</p>
<h2 id="heading-prerequisites">Prerequisites</h2>
<ul>
<li><p>Node.js and npm installed</p>
</li>
<li><p>AWS CLI configured with access to your AWS account</p>
</li>
<li><p>A Serverless Framework account</p>
</li>
<li><p>The Serverless Framework installed globally on your local machine</p>
</li>
</ul>
<h2 id="heading-our-use-case">Our Use Case</h2>
<p>Meet Alyx, an entrepreneur who has recently been learning about serverless architecture. She's read about how it's a powerful and efficient way to build backends for web applications, offering a more modern approach to web application development.</p>
<p>She wants to apply what she's learned so far about the fundamentals of AWS serverless computing. She knows that serverless doesn’t mean there are no servers involved – rather, it just abstracts away the management and provisioning of servers. And now she wants to focus solely on writing code and implementing business logic.</p>
<p>Let’s check out how Alyx, the owner of a thriving coffee shop, begins to leverage serverless architecture for the backend of her web application.</p>
<p>Alyx’s Coffee Haven, an online coffee shop, offers an array of coffee blends and treats for sale. Initially, Alyx managed the shop’s orders and inventory with traditional web hosting services and operations, where she handled multiple servers and resources. But as her coffee shop grew in popularity, she started facing an increasing number of orders, especially during peak hours and seasonal promotions.</p>
<p>Managing the servers and ensuring the application could handle the surge in traffic became a challenge for Alyx. She found herself constantly worrying about server capacity, scalability, and the cost of maintaining the infrastructure.</p>
<p>She also wanted to introduce new features like personalized recommendations and loyalty programs, but this became a daunting task given the limitations of her traditional setup.</p>
<p>Then Alyx learned about the concept of serverless. She likened a serverless backend to a barista who automatically brews coffee in real-time, without her having to worry about the intricate details of the coffee-making process.</p>
<p>Excited by this idea, Alyx decided to migrate her coffee shop’s backend to a serverless platform using AWS Lambda, AWS API Gateway, and Amazon DynamoDB. This setup will let her focus more on crafting the perfect coffee blends and treats for her customers.</p>
<p>With serverless, each customer’s order becomes an event that triggers a series of serverless functions. Separate AWS Lambda functions process the orders and handle all the business logic behind the scenes. One function creates a customer’s order, another retrieves it, and others update an order’s status or delete it.</p>
<p>Alyx no longer needs to worry about managing servers, as the serverless platform automatically scales up and down based on incoming order requests. The cost-efficiency of serverless is also huge for Alyx: with a pay-as-you-go model, she only pays for the actual compute time her functions consume, offering her a more cost-effective solution for her growing business.</p>
<p>But she doesn’t stop there! She also wants to automate everything, from deploying infrastructure to updating her application whenever there’s a new change. By utilizing Infrastructure as Code (IaC) with the Serverless Framework, she can define all her infrastructure in code and manage it easily.</p>
<p>On top of that, she sets up GitHub Actions for continuous integration and delivery (CI/CD), so that every change she makes is automatically deployed through a pipeline, whether it’s a new feature in development or a hot fix for production.</p>
<h2 id="heading-tutorial-objectives">Tutorial Objectives</h2>
<ul>
<li><p>Set up the Serverless Framework environment</p>
</li>
<li><p>Define an API in the YAML file</p>
</li>
<li><p>Develop AWS Lambda functions to process CRUD operations</p>
</li>
<li><p>Set up multi-stage deployments for Dev and Prod</p>
</li>
<li><p>Test the Dev and Prod pipelines</p>
</li>
<li><p>Test and validate Dev and Prod APIs using Postman</p>
</li>
</ul>
<h2 id="heading-how-to-get-started-clone-the-git-repository">How to Get Started: Clone the Git Repository</h2>
<p>So you can follow along with this tutorial more effectively, go ahead and clone the project’s repository from my GitHub. You can do that <a target="_blank" href="https://github.com/ifeanyiro9/coffee-shop-serverless-crud-api-nodejs">by going here</a>. As we move forward, feel free to edit the files as necessary.</p>
<p>After cloning the repository, you will notice the presence of multiple files in your folder, as you can see in the image below. We’ll use all of these files to build our serverless coffee shop API.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1724353622612/2dd67caa-1a30-4511-afc5-babfaa0c5b82.png" alt="File structure" class="image--center mx-auto" width="600" height="400" loading="lazy"></p>
<h2 id="heading-step-1-set-up-the-serverless-framework-environment">Step 1: Set up the Serverless Framework Environment</h2>
<p>To set up the Serverless Framework environment for automated deployments, you'll need to authenticate your Serverless Framework account via the CLI.</p>
<p>This requires creating an access key that lets the CI/CD pipeline authenticate securely with your Serverless Framework account without exposing your credentials. By signing into your Serverless account and generating an access key, you enable the pipeline to deploy your serverless application automatically from the build configuration file.</p>
<p>To do this, head to your Serverless account and <a target="_blank" href="https://app.serverless.com/settings/accessKeys">navigate to the Access Keys section</a>. Click on “+add,” name it SERVERLESS_ACCESS_KEY, and then create the key.</p>
<p>Once you’ve created your access key, be sure to copy and store it securely. You'll use this key as a secret variable in your GitHub repository to authenticate and authorize your CI/CD pipeline.</p>
<p>It will provide access to your Serverless Framework account during the deployment process. You’ll add this key to your GitHub repository’s secrets later, so your pipeline can securely use it to deploy the serverless resources without exposing sensitive information in your codebase.</p>
<p>Now, let’s define the AWS resources as code in the <strong>serverless.yml</strong> file.</p>
<h2 id="heading-step-2-define-the-api-in-the-serverless-yaml-file">Step 2: Define the API in the Serverless YAML File</h2>
<p>In this file, you'll define the core infrastructure and functionality of the Coffee Shop API using the Serverless Framework’s YAML configuration.</p>
<p>This file defines the AWS services being utilized, including API Gateway, Lambda functions for CRUD operations, and DynamoDB for data storage.</p>
<p>You'll also configure an IAM role so the Lambda functions have the necessary permissions to interact with the DynamoDB service.</p>
<p>The API Gateway is set up with appropriate HTTP methods (<strong>POST</strong>, <strong>GET</strong>, <strong>PUT</strong>, and <strong>DELETE</strong>) to handle incoming requests and trigger the corresponding Lambda functions.</p>
<p>Let’s check out the code:</p>
<pre><code class="lang-yaml"><span class="hljs-attr">service:</span> <span class="hljs-string">coffee-shop-api</span>
<span class="hljs-attr">frameworkVersion:</span> <span class="hljs-string">'4'</span>

<span class="hljs-attr">provider:</span>
  <span class="hljs-attr">name:</span> <span class="hljs-string">aws</span>
  <span class="hljs-attr">runtime:</span> <span class="hljs-string">nodejs20.x</span>
  <span class="hljs-attr">region:</span> <span class="hljs-string">us-east-1</span>
  <span class="hljs-attr">stage:</span> <span class="hljs-string">${opt:stage}</span>
  <span class="hljs-attr">iam:</span>
    <span class="hljs-attr">role:</span>
      <span class="hljs-attr">statements:</span>
        <span class="hljs-bullet">-</span> <span class="hljs-attr">Effect:</span> <span class="hljs-string">Allow</span>
          <span class="hljs-attr">Action:</span>
            <span class="hljs-bullet">-</span> <span class="hljs-string">dynamodb:PutItem</span>
            <span class="hljs-bullet">-</span> <span class="hljs-string">dynamodb:GetItem</span>
            <span class="hljs-bullet">-</span> <span class="hljs-string">dynamodb:Scan</span>
            <span class="hljs-bullet">-</span> <span class="hljs-string">dynamodb:UpdateItem</span>
            <span class="hljs-bullet">-</span> <span class="hljs-string">dynamodb:DeleteItem</span>
          <span class="hljs-attr">Resource:</span> <span class="hljs-string">arn:aws:dynamodb:${self:provider.region}:*:table/CoffeeOrders-${self:provider.stage}</span>

<span class="hljs-attr">functions:</span>
  <span class="hljs-attr">createCoffee:</span>
    <span class="hljs-attr">handler:</span> <span class="hljs-string">createCoffee.handler</span>
    <span class="hljs-attr">environment:</span>
      <span class="hljs-attr">COFFEE_ORDERS_TABLE:</span> <span class="hljs-string">CoffeeOrders-${self:provider.stage}</span>
    <span class="hljs-attr">events:</span>
      <span class="hljs-bullet">-</span> <span class="hljs-attr">http:</span>
          <span class="hljs-attr">path:</span> <span class="hljs-string">coffee</span>
          <span class="hljs-attr">method:</span> <span class="hljs-string">post</span>

  <span class="hljs-attr">getCoffee:</span>
    <span class="hljs-attr">handler:</span> <span class="hljs-string">getCoffee.handler</span>
    <span class="hljs-attr">environment:</span>
      <span class="hljs-attr">COFFEE_ORDERS_TABLE:</span> <span class="hljs-string">CoffeeOrders-${self:provider.stage}</span>
    <span class="hljs-attr">events:</span>
      <span class="hljs-bullet">-</span> <span class="hljs-attr">http:</span>
          <span class="hljs-attr">path:</span> <span class="hljs-string">coffee</span>
          <span class="hljs-attr">method:</span> <span class="hljs-string">get</span>

  <span class="hljs-attr">updateCoffee:</span>
    <span class="hljs-attr">handler:</span> <span class="hljs-string">updateCoffee.handler</span>
    <span class="hljs-attr">environment:</span>
      <span class="hljs-attr">COFFEE_ORDERS_TABLE:</span> <span class="hljs-string">CoffeeOrders-${self:provider.stage}</span>
    <span class="hljs-attr">events:</span>
      <span class="hljs-bullet">-</span> <span class="hljs-attr">http:</span>  
          <span class="hljs-attr">path:</span> <span class="hljs-string">coffee</span>  
          <span class="hljs-attr">method:</span> <span class="hljs-string">put</span>  

  <span class="hljs-attr">deleteCoffee:</span>  
    <span class="hljs-attr">handler:</span> <span class="hljs-string">deleteCoffee.handler</span>
    <span class="hljs-attr">environment:</span>
      <span class="hljs-attr">COFFEE_ORDERS_TABLE:</span> <span class="hljs-string">CoffeeOrders-${self:provider.stage}</span>
    <span class="hljs-attr">events:</span>
      <span class="hljs-bullet">-</span> <span class="hljs-attr">http:</span>
          <span class="hljs-attr">path:</span> <span class="hljs-string">coffee</span>
          <span class="hljs-attr">method:</span> <span class="hljs-string">delete</span>
<span class="hljs-attr">resources:</span>
  <span class="hljs-attr">Resources:</span>
    <span class="hljs-attr">CoffeeTable:</span>
      <span class="hljs-attr">Type:</span> <span class="hljs-string">AWS::DynamoDB::Table</span>
      <span class="hljs-attr">Properties:</span>
        <span class="hljs-attr">TableName:</span> <span class="hljs-string">CoffeeOrders-${self:provider.stage}</span>
        <span class="hljs-attr">AttributeDefinitions:</span>
          <span class="hljs-bullet">-</span> <span class="hljs-attr">AttributeName:</span> <span class="hljs-string">OrderId</span>
            <span class="hljs-attr">AttributeType:</span> <span class="hljs-string">S</span>
          <span class="hljs-bullet">-</span> <span class="hljs-attr">AttributeName:</span> <span class="hljs-string">CustomerName</span>
            <span class="hljs-attr">AttributeType:</span> <span class="hljs-string">S</span>
        <span class="hljs-attr">KeySchema:</span>
          <span class="hljs-bullet">-</span> <span class="hljs-attr">AttributeName:</span> <span class="hljs-string">OrderId</span>
            <span class="hljs-attr">KeyType:</span> <span class="hljs-string">HASH</span>
          <span class="hljs-bullet">-</span> <span class="hljs-attr">AttributeName:</span> <span class="hljs-string">CustomerName</span>
            <span class="hljs-attr">KeyType:</span> <span class="hljs-string">RANGE</span>
        <span class="hljs-attr">BillingMode:</span> <span class="hljs-string">PAY_PER_REQUEST</span>
</code></pre>
<p>The <strong>serverless.yml</strong> configuration defines how Alyx's Coffee Shop API will run in a serverless environment on AWS. The <strong>provider</strong> section specifies that the application will use AWS as the cloud provider, with <strong>Node.js</strong> as the runtime environment.</p>
<p>The region is set to <strong>us-east-1</strong> and the <strong>stage</strong> variable allows for dynamic deployment across different environments, like dev and prod. This means that the same code can deploy to different environments, with resources being named accordingly to avoid conflicts.</p>
<p>In the <strong>iam</strong> section, permissions are granted to Lambda functions to interact with the DynamoDB table. The <strong>${self:provider.stage}</strong> syntax dynamically names the DynamoDB table, so that each environment has its own separate resources, like <strong>CoffeeOrders-dev</strong> for the development environment and <strong>CoffeeOrders-prod</strong> for production. This dynamic naming helps manage multiple environments without manually configuring separate tables for each one.</p>
<p>The <strong>functions</strong> section defines the four core Lambda functions, <strong>createCoffee</strong>, <strong>getCoffee</strong>, <strong>updateCoffee</strong> and <strong>deleteCoffee</strong>. These handle the CRUD operations for the Coffee Shop API.</p>
<p>Each function is connected to a specific HTTP method in the API Gateway, such as <strong>POST</strong>, <strong>GET</strong>, <strong>PUT</strong> and <strong>DELETE</strong>. These functions interact with the DynamoDB table that’s dynamically named based on the current stage.</p>
<p>The last <strong>resources</strong> section defines the DynamoDB table itself. It sets up the table with the attributes <strong>OrderId</strong> and <strong>CustomerName</strong>, which are used as the primary key. The table is configured to use a pay-per-request billing mode, making it cost-effective for Alyx's growing business.</p>
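<p>One practical consequence of this composite key (shown here with a hypothetical helper and example values): any single-item DynamoDB call against this table must supply both key attributes, not just the <strong>OrderId</strong>:</p>
<pre><code class="lang-javascript">// With a composite key (OrderId HASH + CustomerName RANGE), single-item
// operations like get, update, and delete must supply BOTH attributes.
function buildKeyParams(tableName, orderId, customerName) {
  return {
    TableName: tableName,
    Key: { OrderId: orderId, CustomerName: customerName }
  };
}

// e.g. params for fetching one order from the dev table
const params = buildKeyParams('CoffeeOrders-dev', 'order-123', 'Alyx');
</code></pre>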
<p>By automating the deployment of these resources using the Serverless Framework, Alyx can easily manage her infrastructure, freeing her from the burden of manually provisioning and scaling resources.</p>
<h2 id="heading-step-3-develop-the-lambda-functions-for-crud-operations">Step 3: Develop the Lambda Functions for CRUD Operations</h2>
<p>In this step, we implement the core logic of Alyx’s Coffee Shop API by creating Lambda functions with JavaScript that perform the essential CRUD operations <strong>createCoffee</strong>, <strong>getCoffee</strong>, <strong>updateCoffee</strong> and <strong>deleteCoffee</strong>.</p>
<p>These functions utilize the AWS SDK to interact with AWS services, particularly DynamoDB. Each function will be responsible for handling specific API requests such as creating an order, retrieving orders, updating order statuses, and deleting orders.</p>
<h3 id="heading-create-coffee-lambda-function">Create Coffee Lambda function</h3>
<p>This function creates an order:</p>
<pre><code class="lang-javascript">const AWS = require('aws-sdk');
const dynamoDb = new AWS.DynamoDB.DocumentClient();
const { v4: uuidv4 } = require('uuid');

module.exports.handler = async (event) =&gt; {
  const requestBody = JSON.parse(event.body);
  const customerName = requestBody.customer_name;
  const coffeeBlend = requestBody.coffee_blend;
  const orderId = uuidv4();

  const params = {
    TableName: process.env.COFFEE_ORDERS_TABLE,
    Item: {
      OrderId: orderId,
      CustomerName: customerName,
      CoffeeBlend: coffeeBlend,
      OrderStatus: 'Pending'
    }
  };

  try {
    await dynamoDb.put(params).promise();
    return {
      statusCode: 200,
      body: JSON.stringify({ message: 'Order created successfully!', OrderId: orderId })
    };
  } catch (error) {
    return {
      statusCode: 500,
      body: JSON.stringify({ error: `Could not create order: ${error.message}` })
    };
  }
};
</code></pre>
<p>This Lambda function handles the creation of a new coffee order in the DynamoDB table. First we import the AWS SDK and initialize a <strong>DynamoDB.DocumentClient</strong> to interact with DynamoDB. The <strong>uuid</strong> library is also imported to generate unique order IDs.</p>
<p>Inside the <strong>handler</strong> function, we parse the incoming request body to extract customer information, such as the customer's name and preferred coffee blend. A unique <strong>orderId</strong> is generated using <strong>uuidv4()</strong> and this data is prepared for insertion into DynamoDB.</p>
<p>The <strong>params</strong> object defines the table where the data will be stored, with <strong>TableName</strong> dynamically set to the value of the environment variable <strong>COFFEE_ORDERS_TABLE</strong>. The new order includes fields such as <strong>OrderId</strong>, <strong>CustomerName</strong>, <strong>CoffeeBlend</strong>, and an initial status of <strong>Pending</strong>.</p>
<p>In the <strong>try</strong> block, the code attempts to add the order to the DynamoDB table using the <strong>put()</strong> method. If successful, the function returns a status code of <strong>200</strong> with a success message and the <strong>OrderId</strong>. If there’s an error, the code catches it and returns a <strong>500</strong> status code along with an error message.</p>
<h3 id="heading-get-coffee-lambda-function">Get Coffee Lambda function</h3>
<p>This function retrieves all coffee items:</p>
<pre><code class="lang-javascript"><span class="hljs-keyword">const</span> AWS = <span class="hljs-built_in">require</span>(<span class="hljs-string">'aws-sdk'</span>);
<span class="hljs-keyword">const</span> dynamoDb = <span class="hljs-keyword">new</span> AWS.DynamoDB.DocumentClient();

<span class="hljs-built_in">module</span>.exports.handler = <span class="hljs-keyword">async</span> () =&gt; {
  <span class="hljs-keyword">const</span> params = {
    <span class="hljs-attr">TableName</span>: process.env.COFFEE_ORDERS_TABLE
  };

  <span class="hljs-keyword">try</span> {
    <span class="hljs-keyword">const</span> result = <span class="hljs-keyword">await</span> dynamoDb.scan(params).promise();
    <span class="hljs-keyword">return</span> {
      <span class="hljs-attr">statusCode</span>: <span class="hljs-number">200</span>,
      <span class="hljs-attr">body</span>: <span class="hljs-built_in">JSON</span>.stringify(result.Items)
    };
  } <span class="hljs-keyword">catch</span> (error) {
    <span class="hljs-keyword">return</span> {
      <span class="hljs-attr">statusCode</span>: <span class="hljs-number">500</span>,
      <span class="hljs-attr">body</span>: <span class="hljs-built_in">JSON</span>.stringify({ <span class="hljs-attr">error</span>: <span class="hljs-string">`Could not retrieve orders: <span class="hljs-subst">${error.message}</span>`</span> })
    };
  }
};
</code></pre>
<p>This Lambda function retrieves all coffee orders from the DynamoDB table, exemplifying a scalable, serverless approach to reading data.</p>
<p>We again use the AWS SDK to initialize a <strong>DynamoDB.DocumentClient</strong> instance to interact with DynamoDB. The <strong>handler</strong> function constructs the <strong>params</strong> object, specifying the <strong>TableName</strong>, which is dynamically set using the <strong>COFFEE_ORDERS_TABLE</strong> environment variable.</p>
<p>The <strong>scan()</strong> method retrieves all items from the table. Again, if the operation is successful, the function returns a status code of <strong>200</strong> along with the retrieved items in JSON format. In case of an error, a <strong>500</strong> status code and an error message are returned.</p>
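<p>One caveat worth noting: <strong>scan()</strong> returns at most 1 MB of data per call, so a table that grows beyond that needs pagination via <strong>LastEvaluatedKey</strong>. The sketch below uses a stubbed client (standing in for the real DocumentClient) to show the shape of that loop:</p>

```javascript
// Pagination sketch with a stubbed client in place of the
// DocumentClient. Each scan() call returns one "page"; the loop
// follows LastEvaluatedKey until it comes back undefined.
function makeStubClient(pages) {
  let call = 0;
  return { scan: () => ({ promise: async () => pages[call++] }) };
}

async function scanAll(client, tableName) {
  const items = [];
  let lastKey;
  do {
    const result = await client
      .scan({ TableName: tableName, ExclusiveStartKey: lastKey })
      .promise();
    items.push(...result.Items);
    lastKey = result.LastEvaluatedKey;
  } while (lastKey);
  return items;
}

const client = makeStubClient([
  { Items: [{ OrderId: '1' }, { OrderId: '2' }], LastEvaluatedKey: { OrderId: '2' } },
  { Items: [{ OrderId: '3' }] } // no LastEvaluatedKey: final page
]);

scanAll(client, 'CoffeeOrders').then((items) => console.log(items.length));
```

<p>For the small tables in this tutorial a single <strong>scan()</strong> call is fine; the loop only matters once the table outgrows one response page.</p>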
<h3 id="heading-update-coffee-lambda-function">Update Coffee Lambda function</h3>
<p>This function updates a coffee item by its ID:</p>
<pre><code class="lang-javascript"><span class="hljs-keyword">const</span> AWS = <span class="hljs-built_in">require</span>(<span class="hljs-string">'aws-sdk'</span>);
<span class="hljs-keyword">const</span> dynamoDb = <span class="hljs-keyword">new</span> AWS.DynamoDB.DocumentClient();

<span class="hljs-built_in">module</span>.exports.handler = <span class="hljs-keyword">async</span> (event) =&gt; {
  <span class="hljs-keyword">const</span> requestBody = <span class="hljs-built_in">JSON</span>.parse(event.body);
  <span class="hljs-keyword">const</span> { order_id, new_status, customer_name } = requestBody;

  <span class="hljs-keyword">const</span> params = {
    <span class="hljs-attr">TableName</span>: process.env.COFFEE_ORDERS_TABLE,
    <span class="hljs-attr">Key</span>: {
      <span class="hljs-attr">OrderId</span>: order_id,
      <span class="hljs-attr">CustomerName</span>: customer_name
    },
    <span class="hljs-attr">UpdateExpression</span>: <span class="hljs-string">'SET OrderStatus = :status'</span>,
    <span class="hljs-attr">ExpressionAttributeValues</span>: {
      <span class="hljs-string">':status'</span>: new_status
    }
  };

  <span class="hljs-keyword">try</span> {
    <span class="hljs-keyword">await</span> dynamoDb.update(params).promise();
    <span class="hljs-keyword">return</span> {
      <span class="hljs-attr">statusCode</span>: <span class="hljs-number">200</span>,
      <span class="hljs-attr">body</span>: <span class="hljs-built_in">JSON</span>.stringify({ <span class="hljs-attr">message</span>: <span class="hljs-string">'Order status updated successfully!'</span>, <span class="hljs-attr">OrderId</span>: order_id })
    };
  } <span class="hljs-keyword">catch</span> (error) {
    <span class="hljs-keyword">return</span> {
      <span class="hljs-attr">statusCode</span>: <span class="hljs-number">500</span>,
      <span class="hljs-attr">body</span>: <span class="hljs-built_in">JSON</span>.stringify({ <span class="hljs-attr">error</span>: <span class="hljs-string">`Could not update order: <span class="hljs-subst">${error.message}</span>`</span> })
    };
  }
};
</code></pre>
<p>This Lambda function handles updating the status of a specific coffee order in the DynamoDB table.</p>
<p>The <strong>handler</strong> function extracts the <strong>order_id</strong>, <strong>new_status</strong>, and <strong>customer_name</strong> from the request body. It then constructs the <strong>params</strong> object to specify the table name and the primary key for the order (using <strong>OrderId</strong> and <strong>CustomerName</strong>). The <strong>UpdateExpression</strong> sets the new status of the order.</p>
<p>In the <strong>try</strong> block, the code attempts to update the order in DynamoDB using the <strong>update()</strong> method. If the update succeeds, the function returns a status code of <strong>200</strong> with a success message. If an error occurs, it catches the error and returns a <strong>500</strong> status code along with an error message.</p>
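<p>The <strong>:status</strong> token is a value placeholder that DynamoDB resolves from <strong>ExpressionAttributeValues</strong>. A hypothetical helper (the table name and IDs below are made-up example values) makes the shape easy to inspect. Note that if an attribute name ever collided with a DynamoDB reserved word, you would also need a <strong>#name</strong> placeholder via <strong>ExpressionAttributeNames</strong>:</p>

```javascript
// Hypothetical helper: builds the same params object the handler uses.
// ':status' is a value placeholder resolved from ExpressionAttributeValues.
function buildUpdateParams(tableName, orderId, customerName, newStatus) {
  return {
    TableName: tableName,
    Key: { OrderId: orderId, CustomerName: customerName },
    UpdateExpression: 'SET OrderStatus = :status',
    ExpressionAttributeValues: { ':status': newStatus }
  };
}

// Example values only; real order IDs come from the create endpoint.
const params = buildUpdateParams('CoffeeOrders-dev', 'abc-123', 'REXTECH', 'Ready');
console.log(params.UpdateExpression);
```

<p>Extracting the params into a helper like this also lets you assert on the expression and key shape in tests without calling DynamoDB.</p>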
<h3 id="heading-delete-coffee-lambda-function">Delete Coffee Lambda function</h3>
<p>This function deletes a coffee item by its ID:</p>
<pre><code class="lang-javascript"><span class="hljs-keyword">const</span> AWS = <span class="hljs-built_in">require</span>(<span class="hljs-string">'aws-sdk'</span>);
<span class="hljs-keyword">const</span> dynamoDb = <span class="hljs-keyword">new</span> AWS.DynamoDB.DocumentClient();

<span class="hljs-built_in">module</span>.exports.handler = <span class="hljs-keyword">async</span> (event) =&gt; {
  <span class="hljs-keyword">const</span> requestBody = <span class="hljs-built_in">JSON</span>.parse(event.body);
  <span class="hljs-keyword">const</span> { order_id, customer_name } = requestBody;

  <span class="hljs-keyword">const</span> params = {
    <span class="hljs-attr">TableName</span>: process.env.COFFEE_ORDERS_TABLE,
    <span class="hljs-attr">Key</span>: {
      <span class="hljs-attr">OrderId</span>: order_id,
      <span class="hljs-attr">CustomerName</span>: customer_name
    }
  };

  <span class="hljs-keyword">try</span> {
    <span class="hljs-keyword">await</span> dynamoDb.delete(params).promise();
    <span class="hljs-keyword">return</span> {
      <span class="hljs-attr">statusCode</span>: <span class="hljs-number">200</span>,
      <span class="hljs-attr">body</span>: <span class="hljs-built_in">JSON</span>.stringify({ <span class="hljs-attr">message</span>: <span class="hljs-string">'Order deleted successfully!'</span>, <span class="hljs-attr">OrderId</span>: order_id })
    };
  } <span class="hljs-keyword">catch</span> (error) {
    <span class="hljs-keyword">return</span> {
      <span class="hljs-attr">statusCode</span>: <span class="hljs-number">500</span>,
      <span class="hljs-attr">body</span>: <span class="hljs-built_in">JSON</span>.stringify({ <span class="hljs-attr">error</span>: <span class="hljs-string">`Could not delete order: <span class="hljs-subst">${error.message}</span>`</span> })
    };
  }
};
</code></pre>
<p>The Lambda function deletes a specific coffee order from the DynamoDB table. In the handler function, the code parses the request body to extract the <strong>order_id</strong> and <strong>customer_name</strong>. These values are used as the primary key to identify the item to be deleted from the table. The <strong>params</strong> object specifies the table name and key for the item to be deleted.</p>
<p>In the <strong>try</strong> block, the code attempts to delete the order from DynamoDB using the <strong>delete()</strong> method. If successful, again it returns a <strong>200</strong> status code with a success message, indicating that the order was deleted. If an error occurs, the code catches it and returns a <strong>500</strong> status code along with an error message.</p>
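<p>One behavior to be aware of: DynamoDB's <strong>delete()</strong> succeeds even when no matching item exists, so the handler above returns <strong>200</strong> either way. If you wanted to distinguish the two cases, requesting <strong>ReturnValues: 'ALL_OLD'</strong> returns the deleted item's attributes, and an empty result could then map to a <strong>404</strong>. This sketch shows that refinement; it is not part of the original handler:</p>

```javascript
// Hypothetical refinement: ask DynamoDB to echo back the deleted item
// so a missing order can be surfaced as a 404 instead of a silent 200.
function buildDeleteParams(tableName, orderId, customerName) {
  return {
    TableName: tableName,
    Key: { OrderId: orderId, CustomerName: customerName },
    ReturnValues: 'ALL_OLD' // return the deleted item's attributes, if any
  };
}

function toResponse(result) {
  return result.Attributes
    ? { statusCode: 200, body: JSON.stringify({ message: 'Order deleted successfully!' }) }
    : { statusCode: 404, body: JSON.stringify({ error: 'Order not found' }) };
}

// result.Attributes is absent when nothing matched the key.
console.log(toResponse({}).statusCode, toResponse({ Attributes: { OrderId: 'x' } }).statusCode);
```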
<p>Now that we’ve explained each Lambda function, let’s set up a multi-stage CI/CD pipeline.</p>
<h2 id="heading-step-4-set-up-cicd-pipeline-multi-stage-deployments-for-dev-and-prod-environments">Step 4: Set Up CI/CD Pipeline Multi-stage Deployments for Dev and Prod Environments</h2>
<p>To set up AWS secrets in your GitHub repository, first navigate to the repository’s settings. Select <strong>Settings</strong> on the top right, then go to the bottom left and select <strong>Secrets and variables.</strong></p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1724352977158/9250d55a-941a-4bfd-9f7d-843e9b40d8b6.png" alt="Select &quot;Settings&quot; option in GitHub repo at top right." class="image--center mx-auto" width="600" height="400" loading="lazy"></p>
<p>Next, click on <strong>Actions</strong> as seen in the image below:</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1724353027861/52692cba-1bd1-4773-9441-a080af16f513.png" alt="Select &quot;Actions&quot; option to set secret variables for GitHub Actions." class="image--center mx-auto" width="600" height="400" loading="lazy"></p>
<p>From there, select <strong>New repository secret</strong> to create secrets.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1724353092604/a54b12fa-31e7-43d0-b4d5-2abe6a641181.png" alt="Select button to create new repository secret variables." class="image--center mx-auto" width="600" height="400" loading="lazy"></p>
<p>Your pipeline needs three secrets: <strong>AWS_ACCESS_KEY_ID</strong>, <strong>AWS_SECRET_ACCESS_KEY</strong>, and <strong>SERVERLESS_ACCESS_KEY</strong>.</p>
<p>Use your AWS account access key credentials for the first two, and the Serverless access key you saved earlier for <strong>SERVERLESS_ACCESS_KEY</strong>. These secrets securely authenticate your CI/CD pipeline, as seen in the image below.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1724353131423/5b4af7c7-ff3e-431f-a9ef-1ddf74fa9e46.png" alt="Three secret variables needed to authenticate to AWS and Serverless Framework account." class="image--center mx-auto" width="600" height="400" loading="lazy"></p>
<p>Make sure that your main branch is named “<strong>main</strong>,” as this will serve as the production branch. Next, create a new branch called “<strong>dev</strong>” for development work.</p>
<p>You can also create feature-specific branches, such as “<strong>dev/feature</strong>,” for more granular development. GitHub Actions will use these branches to deploy changes automatically, with <strong>dev</strong> representing the development environment and <strong>main</strong> representing production.</p>
<p>This branching strategy lets you manage the CI/CD pipeline efficiently, deploying new code changes whenever there's a merge into either the <strong>dev</strong> or <strong>main</strong> branch.</p>
<h3 id="heading-how-to-use-github-actions-to-deploy-the-yaml-file">How to Use GitHub Actions to Deploy the YAML File</h3>
<p>To automate the deployment process for the Coffee Shop API, you'll utilize GitHub Actions, which integrates with your GitHub repository.</p>
<p>This deployment pipeline is triggered whenever code is pushed to the main or dev branches. By configuring environment-specific deployments, you'll ensure that updates to the dev branch deploy to the development environment, while changes to the main branch trigger production deployments.</p>
<p>Now, let’s review the code:</p>
<pre><code class="lang-yaml"><span class="hljs-attr">name:</span> <span class="hljs-string">deploy-coffee-shop-api</span>

<span class="hljs-attr">on:</span>
  <span class="hljs-attr">push:</span>
    <span class="hljs-attr">branches:</span>
      <span class="hljs-bullet">-</span> <span class="hljs-string">main</span>
      <span class="hljs-bullet">-</span> <span class="hljs-string">dev</span>

<span class="hljs-attr">jobs:</span>
  <span class="hljs-attr">deploy:</span>
    <span class="hljs-attr">runs-on:</span> <span class="hljs-string">ubuntu-latest</span>

    <span class="hljs-attr">steps:</span>
    <span class="hljs-bullet">-</span> <span class="hljs-attr">name:</span> <span class="hljs-string">Checkout</span> <span class="hljs-string">code</span>
      <span class="hljs-attr">uses:</span> <span class="hljs-string">actions/checkout@v3</span>

    <span class="hljs-bullet">-</span> <span class="hljs-attr">name:</span> <span class="hljs-string">Setup</span> <span class="hljs-string">Node.js</span>
      <span class="hljs-attr">uses:</span> <span class="hljs-string">actions/setup-node@v3</span>
      <span class="hljs-attr">with:</span>
        <span class="hljs-attr">node-version:</span> <span class="hljs-string">'20.x'</span>

    <span class="hljs-bullet">-</span> <span class="hljs-attr">name:</span> <span class="hljs-string">Install</span> <span class="hljs-string">dependencies</span>
      <span class="hljs-attr">run:</span> <span class="hljs-string">|
        cd coffee-shop-api
        npm install
</span>
    <span class="hljs-bullet">-</span> <span class="hljs-attr">name:</span> <span class="hljs-string">Install</span> <span class="hljs-string">Serverless</span> <span class="hljs-string">Framework</span>
      <span class="hljs-attr">run:</span> <span class="hljs-string">npm</span> <span class="hljs-string">install</span> <span class="hljs-string">-g</span> <span class="hljs-string">serverless</span>

    <span class="hljs-bullet">-</span> <span class="hljs-attr">name:</span> <span class="hljs-string">Deploy</span> <span class="hljs-string">to</span> <span class="hljs-string">AWS</span> <span class="hljs-string">(Dev)</span>
      <span class="hljs-attr">if:</span> <span class="hljs-string">github.ref</span> <span class="hljs-string">==</span> <span class="hljs-string">'refs/heads/dev'</span>
      <span class="hljs-attr">run:</span> <span class="hljs-string">|
        cd coffee-shop-api
        npx serverless deploy --stage dev
</span>      <span class="hljs-attr">env:</span>
        <span class="hljs-attr">AWS_ACCESS_KEY_ID:</span> <span class="hljs-string">${{</span> <span class="hljs-string">secrets.AWS_ACCESS_KEY_ID</span> <span class="hljs-string">}}</span>
        <span class="hljs-attr">AWS_SECRET_ACCESS_KEY:</span> <span class="hljs-string">${{</span> <span class="hljs-string">secrets.AWS_SECRET_ACCESS_KEY</span> <span class="hljs-string">}}</span>
        <span class="hljs-attr">SERVERLESS_ACCESS_KEY:</span> <span class="hljs-string">${{secrets.SERVERLESS_ACCESS_KEY}}</span>

    <span class="hljs-bullet">-</span> <span class="hljs-attr">name:</span> <span class="hljs-string">Deploy</span> <span class="hljs-string">to</span> <span class="hljs-string">AWS</span> <span class="hljs-string">(Prod)</span>
      <span class="hljs-attr">if:</span> <span class="hljs-string">github.ref</span> <span class="hljs-string">==</span> <span class="hljs-string">'refs/heads/main'</span>
      <span class="hljs-attr">run:</span> <span class="hljs-string">|
        cd coffee-shop-api
        npx serverless deploy --stage prod
</span>      <span class="hljs-attr">env:</span>
        <span class="hljs-attr">AWS_ACCESS_KEY_ID:</span> <span class="hljs-string">${{</span> <span class="hljs-string">secrets.AWS_ACCESS_KEY_ID</span> <span class="hljs-string">}}</span>
        <span class="hljs-attr">AWS_SECRET_ACCESS_KEY:</span> <span class="hljs-string">${{</span> <span class="hljs-string">secrets.AWS_SECRET_ACCESS_KEY</span> <span class="hljs-string">}}</span>
        <span class="hljs-attr">SERVERLESS_ACCESS_KEY:</span> <span class="hljs-string">${{secrets.SERVERLESS_ACCESS_KEY}}</span>
</code></pre>
<p>The GitHub Actions YAML configuration is what automates the deployment process of the Coffee Shop API to AWS using the Serverless Framework. The workflow triggers whenever changes are pushed to the main or dev branches.</p>
<p>It begins by checking out the repository’s code, then setting up Node.js with version 20.x to match the runtime used by the Lambda functions. After that, it installs the project dependencies by navigating to the <strong>coffee-shop-api</strong> directory and running <strong>npm install</strong>.</p>
<p>The workflow also installs the Serverless Framework globally, allowing the serverless CLI to be used for deployments. Depending on which branch is updated, the workflow conditionally deploys to the appropriate environment.</p>
<p>If the changes are pushed to the dev branch, it deploys to the dev stage. If they are pushed to the main branch, it deploys to the prod stage. The deployment commands, <code>npx serverless deploy --stage dev</code> or <code>npx serverless deploy --stage prod</code> are executed within the coffee-shop-api directory.</p>
<p>For a secure deployment, the workflow accesses AWS credentials and the Serverless access key via environment variables stored in GitHub Secrets. This allows the CI/CD pipeline to authenticate with AWS and the Serverless Framework without exposing sensitive information in the repository.</p>
<p>Now, we can proceed to test out the pipeline.</p>
<h2 id="heading-step-5-test-the-dev-and-prod-pipelines">Step 5: Test the Dev and Prod Pipelines</h2>
<p>First, you'll need to verify that the main (prod) branch is called “<strong>main</strong>”. Then create a dev branch called “<strong>dev</strong>”. Once you make any valid changes to the dev branch, commit them to trigger the GitHub Actions pipeline. This will automatically deploy the updated resources to the development environment. After verifying everything in dev, you can then merge the dev branch into the main branch.</p>
<p>Merging changes into the main branch also automatically triggers the deployment pipeline for the production environment. This way, all necessary updates are applied and production resources are deployed seamlessly.</p>
<p>You can monitor the deployment process and review detailed logs of each GitHub Actions run by navigating to the <strong>Actions</strong> tab in your GitHub repository.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1724353173167/f1775dbc-732c-432d-9ee0-9572b8b9908f.png" alt="Select &quot;Actions&quot; in the top right of GitHub repository options." class="image--center mx-auto" width="600" height="400" loading="lazy"></p>
<p>The logs provide visibility into each step of the pipeline, helping you verify that everything is working as expected.</p>
<p>You can select any build run to review detailed logs for both the development and production environment deployments so you can track the progress and ensure that everything is running smoothly.</p>
<p>Navigate to the specific build run in GitHub Actions, as demonstrated in the image below. There, you can view the execution details and outcomes for either the development or production pipelines.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1724353205715/dd221126-4fed-4032-8b51-e883f1177173.png" alt="Pipeline run logs for the different branch environments (main, dev)" class="image--center mx-auto" width="600" height="400" loading="lazy"></p>
<p>Make sure to thoroughly test both the development and production environments to confirm successful pipeline execution.</p>
<h2 id="heading-step-6-test-and-validate-prod-and-dev-apis-using-postman">Step 6: Test and Validate Prod and Dev APIs using Postman</h2>
<p>Now that the APIs and resources are deployed and configured, we need to locate the unique API endpoints (URLs) generated by AWS to begin making requests to test functionality.</p>
<p>You can test basic API functionality by simply pasting these URLs into a web browser. The API URLs are found in the output of your CI/CD build.</p>
<p>To retrieve them, navigate to the GitHub Actions logs, select the most recent environment’s successful build, and click <strong>deploy</strong> to check the deployment details for the generated API endpoints.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1724353236275/7cbed3e1-d65a-4fa6-9dff-9974d1c2022a.png" alt="&quot;Deploy&quot; button that allows you to view log details." class="image--center mx-auto" width="600" height="400" loading="lazy"></p>
<p>Click on the <strong>Deploy to AWS</strong> stage for the selected environment (Prod or Dev) in your GitHub Actions logs. Once there, you’ll find the generated API URL.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1724353272312/43eee369-618f-45f9-b9aa-6ffb6e19061b.png" alt="Detailed logs of a specific build run to review for errors or success." class="image--center mx-auto" width="600" height="400" loading="lazy"></p>
<p>Copy and save this URL, as it will be needed when testing your API’s functionality. This URL is your gateway to verifying that the deployed API works as expected.</p>
<p>Now copy one of the generated API URLs and paste it into your browser. You will see an empty array or list in the response. This confirms that the API is functioning correctly and that you are successfully retrieving data from the DynamoDB table.</p>
<p>Even though the list is empty, it indicates that the API can connect to the database and return information.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1724353307388/23791725-71d7-4b1d-908c-c0f5e0fb073b.png" alt="Empty list result when inserting API URL in browser." class="image--center mx-auto" width="600" height="400" loading="lazy"></p>
<p>To verify that your API works across both environments, repeat the steps for the other API environment (Prod and Dev).</p>
<p>For more comprehensive testing, we’ll use Postman to exercise all four API methods (<strong>Create</strong>, <strong>Read</strong>, <strong>Update</strong>, and <strong>Delete</strong>) in both the development and production environments.</p>
<p>To test the <strong>GET</strong> method, use Postman to send a GET request to the API endpoint URL. You will receive the same response, an empty list of coffee orders, confirming that the API can retrieve data successfully, as seen at the bottom of the image below.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1724353336998/17fff84a-a784-464f-a89e-9c73f3e863a0.png" alt="Testing the GET method using Postman." class="image--center mx-auto" width="600" height="400" loading="lazy"></p>
<p>To actually create an order, let’s test the <strong>POST</strong> method. Use Postman again to make a POST request to the API endpoint, providing the customer’s name and coffee blend in the request body, as shown below:</p>
<pre><code class="lang-json">{
  <span class="hljs-attr">"customer_name"</span>: <span class="hljs-string">"REXTECH"</span>,
  <span class="hljs-attr">"coffee_blend"</span>: <span class="hljs-string">"Black"</span>
}
</code></pre>
<p>The response will be a success message containing the unique OrderId of the newly placed order.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1724353370197/4f3ab8df-4f1f-4c66-888c-4069b60151f9.png" alt="Testing the POST method using Postman." class="image--center mx-auto" width="600" height="400" loading="lazy"></p>
<p>Verify that the new order was saved by reviewing the items in the environment’s DynamoDB table:</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1724353402967/afbd2080-b66f-46ac-ac79-24a4d360871d.png" alt="Verifying new order is stored in DynamoDB table." class="image--center mx-auto" width="600" height="400" loading="lazy"></p>
<p>To test the <strong>PUT</strong> method, make a PUT request to the API endpoint, providing the previous order ID and a new order status in the request body as shown below:</p>
<pre><code class="lang-json">{                                                 
  <span class="hljs-attr">"order_id"</span>: <span class="hljs-string">"42a81c27-1421-4025-9bef-72b14e723c34"</span>,
  <span class="hljs-attr">"new_status"</span>: <span class="hljs-string">"Ready"</span>,                                             
  <span class="hljs-attr">"customer_name"</span>: <span class="hljs-string">"REXTECH"</span>                                             
}
</code></pre>
<p>The response will be a success message confirming the update, along with the OrderId of the updated order.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1724353432881/f5354746-9b42-4fc9-bb70-5c18f076ecea.png" alt="Testing the PUT method using Postman." class="image--center mx-auto" width="600" height="400" loading="lazy"></p>
<p>You can also verify that the order status was updated from the DynamoDB table item.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1724353463923/e6a2978c-bbb5-49c0-9b94-36ea404b8c11.png" alt="Verifying order status update in DynamoDB table." class="image--center mx-auto" width="600" height="400" loading="lazy"></p>
<p>To test the <strong>DELETE</strong> method, use Postman to make a DELETE request, providing the previous order ID and the customer name in the request body as shown below:</p>
<pre><code class="lang-plaintext">{                                                 
  "order_id": "42a81c27-1421-4025-9bef-72b14e723c34",
  "customer_name": "REXTECH"
}
</code></pre>
<p>The response will be a success message confirming the deletion, along with the OrderId of the deleted order.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1724353509090/e61a8ab8-7ce3-44b1-a122-34d29b5a5734.png" alt="Testing the DELETE method using Postman." class="image--center mx-auto" width="600" height="400" loading="lazy"></p>
<p>Again, you can verify that the order has been deleted in the DynamoDB table.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1724353541300/d6ed82aa-12ca-4cc2-9b0b-1b86be9557ee.png" alt="Verifying empty items in DynamoDB table." class="image--center mx-auto" width="600" height="400" loading="lazy"></p>
<h2 id="heading-conclusion">Conclusion</h2>
<p>That’s it – congratulations! You’ve successfully completed all the steps. We’ve built a serverless REST API that supports CRUD (<strong>Create, Read, Update, Delete</strong>) functionality with API Gateway, Lambda, DynamoDB, the Serverless Framework, and Node.js, automating deployment of approved code changes with GitHub Actions.</p>
<p>If you’ve gotten this far, <strong>thanks for reading!</strong> I hope it was worthwhile to you.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1724353582971/091ac912-1d87-4179-addc-cc81a90c8657.png" alt="091ac912-1d87-4179-addc-cc81a90c8657" class="image--center mx-auto" width="847" height="502" loading="lazy"></p>
<p><a target="_blank" href="https://www.linkedin.com/in/ifeanyi-otuonye/">Ifeanyi Otuonye</a> is a 6X AWS Certified Cloud Engineer skilled in DevOps, Technical Writing and instructional expertise as a Technical Instructor. He is motivated by his eagerness to learn and develop and thrives in collaborative environments. Before transitioning to the Cloud, he spend six years as a Professional Track and Field athlete.</p>
<p>In early 2022, he embarked on a mission to become a Cloud/DevOps Engineer through self-study and a six-month accelerated Cloud program.</p>
<p>In May 2023, he accomplished that goal, landing his first Cloud Engineering role, and he has now set another personal mission: to empower other individuals on their journey to the Cloud.</p>
 ]]>
                </content:encoded>
            </item>
        
            <item>
                <title>
                    <![CDATA[ How to Choose the Right IaC Tool – AWS CDK, CloudFormation, and Terraform Compared ]]>
                </title>
                <description>
                    <![CDATA[ Infrastructure as Code (IaC) has become a cornerstone of modern cloud resource management. It enables developers and engineers to manage their cloud resources with the same level of control and precision as application code.  When you're working with... ]]>
                </description>
                <link>https://www.freecodecamp.org/news/comparing-iac-tools-aws-cdk-cloudformation-terraform/</link>
                <guid isPermaLink="false">66c37738b737bb2ce7073282</guid>
                
                    <category>
                        <![CDATA[ Cloud Computing ]]>
                    </category>
                
                    <category>
                        <![CDATA[ Infrastructure as code ]]>
                    </category>
                
                <dc:creator>
                    <![CDATA[ Ifeanyi Otuonye ]]>
                </dc:creator>
                <pubDate>Mon, 03 Jun 2024 21:27:31 +0000</pubDate>
                <media:content url="https://www.freecodecamp.org/news/content/images/2024/06/Level-Up-Tech-Design-Portfolio.jpg" medium="image" />
                <content:encoded>
                    <![CDATA[ <p>Infrastructure as Code (IaC) has become a cornerstone of modern cloud resource management. It enables developers and engineers to manage their cloud resources with the same level of control and precision as application code. </p>
<p>When you're working with AWS, among the tools at the forefront of utilizing IaC are AWS CloudFormation, AWS Cloud Development Kit (CDK), and HashiCorp’s Terraform. </p>
<p>Each of these IaC tools offers unique features and approaches to infrastructure management. This makes them suitable for different scenarios and preferences, and they can help you automate and standardize your or your team's cloud resource deployments.</p>
<p>This article will provide a high-level comparison of these three tools, focusing on their capabilities, abstraction levels, and practical use cases. You'll explore how these tools enable you to programmatically create and manage complex cloud infrastructures. </p>
<p>Specifically, the focus will be on deploying a three-tier architecture networking infrastructure. It'll include deploying a Virtual Private Cloud (VPC) configured with multiple subnets, route tables, an internet gateway, and NAT gateways to showcase the unique capabilities and syntax of each IaC tool.</p>
<p>By the end of this article, you'll gain a thorough understanding of the functionalities of these tools so you can make an informed decision when selecting one to build resilient, scalable and efficiently managed cloud infrastructures.</p>
<p>Without further ado, let’s get this party started!</p>
<h2 id="heading-what-well-cover">What We'll Cover:</h2>
<ol>
<li><a class="post-section-overview" href="#heading-what-is-infrastructure-as-code-iac">What is Infrastructure as Code (IaC)?</a></li>
<li><a class="post-section-overview" href="#heading-prerequisites">Prerequisites</a></li>
<li><a class="post-section-overview" href="#heading-use-case-scenario">Use Case Scenario</a></li>
<li><a class="post-section-overview" href="#heading-iac-tool-code-examples">IaC Tool Code Examples</a></li>
<li><a class="post-section-overview" href="#heading-analysis-and-comparison">Analysis and Comparison</a></li>
<li><a class="post-section-overview" href="#heading-why-choose-one-over-the-other">Why Choose One Over the Other?</a></li>
</ol>
<h2 id="heading-what-is-infrastructure-as-code-iac">What is Infrastructure as Code (IaC)?</h2>
<p>Infrastructure as Code is a key DevOps practice that involves managing and provisioning infrastructure resources by defining them as code in configuration files, instead of relying on manual processes and settings.</p>
<p>If you want to learn more about IaC basics, <a target="_blank" href="https://www.freecodecamp.org/news/infrastructure-as-code-basics/">here's a helpful guide to get you started</a>.</p>
<p>Now let's learn a bit more about the three tools we'll be comparing in this overview.</p>
<h3 id="heading-what-does-aws-cloudformation-do">What Does AWS CloudFormation Do?</h3>
<p>AWS CloudFormation uses YAML or JSON templates to describe, and then automatically and securely provision, the infrastructure resources your applications need across all regions and accounts in your AWS cloud environment.</p>
<h3 id="heading-what-does-aws-cloud-development-kit-cdk-do">What Does AWS Cloud Development Kit (CDK) Do?</h3>
<p>AWS Cloud Development Kit is a software development framework specifically used for defining cloud infrastructure in code. It ultimately provisions resources through AWS CloudFormation. </p>
<p>AWS CDK uses familiar programming languages like TypeScript, JavaScript, Python, Java, and others to define reusable cloud components known as constructs. These are then shared and used to create complex and scalable cloud architectures.</p>
<h4 id="heading-whats-a-construct">What's a construct?</h4>
<p>In the context of AWS CDK, a construct represents a cloud component that encapsulates certain functionality and configuration in a reusable form.</p>
<h3 id="heading-what-does-terraform-do">What Does Terraform Do?</h3>
<p>Terraform is an IaC tool created by HashiCorp that allows you to define both low-level and high-level components of your cloud infrastructure using a declarative configuration language. </p>
<p>It is cloud-agnostic and is capable of managing multi-provider setups within a single configuration.</p>
<h4 id="heading-what-does-cloud-agnostic-mean">What does cloud-agnostic mean?</h4>
<p>Cloud-agnostic refers to the ability of a tool or service to operate across different cloud providers without significant changes to its operational procedures or architecture.</p>
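<p>For instance, a single Terraform configuration can declare more than one provider side by side. Here's a minimal sketch (the project ID and regions are placeholders, not part of the article's scenario):</p>

```hcl
# Two providers declared in one configuration: Terraform can plan
# and apply resources for both in a single run.
provider "aws" {
  region = "us-east-1"
}

provider "google" {
  project = "my-gcp-project" # placeholder project ID
  region  = "us-central1"
}
```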
<p>Alright, now that you understand the tools we'll be discussing, let's dive in.</p>
<h2 id="heading-prerequisites">Prerequisites</h2>
<ul>
<li>AWS Account with an IAM User with admin permissions</li>
<li>Basic knowledge and use of AWS CloudFormation, AWS CDK, and Terraform</li>
<li>Basic understanding of YAML, Python, and the HashiCorp Configuration Language</li>
<li>Experience with an Integrated Development Environment (IDE)</li>
</ul>
<h2 id="heading-use-case-scenario">Use Case Scenario</h2>
<p>You’re a Cloud Network Engineer at REXTECH Corp, a startup on the verge of launching a new online service that offers digital content streaming. As the service is expected to attract a substantial user base right from the start, you need to deploy a highly scalable, reliable, and secure cloud infrastructure that can handle peak traffic and provide continuous availability.</p>
<p>Your manager has mandated a cloud network solution that not only meets these performance requirements but also allows for rapid scaling and efficient management. </p>
<p>In response to this, you are tasked with automating the deployment of a three-tier architecture networking infrastructure. It needs to have a Virtual Private Cloud (VPC) that includes multiple subnets across multiple Availability Zones (AZs), NAT gateways, and route tables to ensure resiliency and optimal configuration.</p>
<p>With the need for agility and maintainability in your infrastructure, you decide to evaluate and choose between AWS CloudFormation, AWS CDK, and Terraform for this project. </p>
<p>Before you evaluate each tool's application to the scenario, let’s break down the deployment resource components.</p>
<p>This deployment involves configuring a VPC with two public subnets for frontend-facing web servers, two private subnets for servers in the application tier, and another two private subnets to host a multi-AZ database. All subnets will be deployed across multiple AZs, with connectivity between components configured through route tables and an internet gateway.</p>
<p>Also, two NAT gateways in the public subnets will ensure that the resources in the private subnets of the application tier can securely access the internet for updates and inter-service communication without direct exposure to the outside world.</p>
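<p>Before writing any IaC, it helps to sanity-check the addressing plan. Here's a quick sketch using Python's standard <strong>ipaddress</strong> module to carve the six /24 subnets out of the 10.0.0.0/16 VPC range, matching the CIDRs used in the templates that follow (the tier labels are just for illustration):</p>

```python
import ipaddress

# The VPC's overall address space.
vpc = ipaddress.ip_network("10.0.0.0/16")

# Carve out /24 subnets; skip the first so the plan starts at 10.0.1.0/24.
carved = list(vpc.subnets(new_prefix=24))[1:7]

tiers = ["public-one", "public-two", "app-one", "app-two", "db-one", "db-two"]
plan = dict(zip(tiers, (str(s) for s in carved)))

for name, cidr in plan.items():
    print(f"{name}: {cidr}")
```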
<p>Now, let’s learn how you can automate the creation of this solution using all three IaC tools: <strong>AWS</strong> <strong>CloudFormation</strong>, <strong>AWS CDK</strong> and <strong>Terraform</strong>.</p>
<h2 id="heading-iac-tool-code-examples">IaC Tool Code Examples</h2>
<h3 id="heading-aws-cloudformation-example">AWS CloudFormation Example</h3>
<p>AWS CloudFormation allows you to define your desired infrastructure using a declarative JSON or YAML configuration file. But you must define the interdependencies and connections between resources yourself using intrinsic functions like <strong>!Ref</strong> to reference other resources, <strong>!GetAtt</strong> to read their attributes, and <strong>!GetAZs</strong> to select Availability Zones dynamically.</p>
<p>Below is how you define the three-tier networking solution using AWS CloudFormation:</p>
<pre><code class="lang-yaml"><span class="hljs-attr">AWSTemplateFormatVersion:</span> <span class="hljs-string">'2010-09-09'</span>
<span class="hljs-attr">Resources:</span>
  <span class="hljs-attr">MyVPC:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::VPC'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">CidrBlock:</span> <span class="hljs-string">'10.0.0.0/16'</span>
      <span class="hljs-attr">EnableDnsSupport:</span> <span class="hljs-literal">true</span>
      <span class="hljs-attr">EnableDnsHostnames:</span> <span class="hljs-literal">true</span>

  <span class="hljs-attr">InternetGateway:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::InternetGateway'</span>

  <span class="hljs-attr">AttachGateway:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::VPCGatewayAttachment'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">VpcId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">MyVPC</span>
      <span class="hljs-attr">InternetGatewayId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">InternetGateway</span>

  <span class="hljs-attr">PublicSubnetOne:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::Subnet'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">VpcId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">MyVPC</span>
      <span class="hljs-attr">CidrBlock:</span> <span class="hljs-string">'10.0.1.0/24'</span>
      <span class="hljs-attr">AvailabilityZone:</span> <span class="hljs-type">!Select</span> [<span class="hljs-number">0</span>, <span class="hljs-type">!GetAZs</span> <span class="hljs-string">''</span>]
      <span class="hljs-attr">MapPublicIpOnLaunch:</span> <span class="hljs-literal">true</span>

  <span class="hljs-attr">PublicSubnetTwo:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::Subnet'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">VpcId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">MyVPC</span>
      <span class="hljs-attr">CidrBlock:</span> <span class="hljs-string">'10.0.2.0/24'</span>
      <span class="hljs-attr">AvailabilityZone:</span> <span class="hljs-type">!Select</span> [<span class="hljs-number">1</span>, <span class="hljs-type">!GetAZs</span> <span class="hljs-string">''</span>]
      <span class="hljs-attr">MapPublicIpOnLaunch:</span> <span class="hljs-literal">true</span>

  <span class="hljs-attr">PrivateSubnetAppOne:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::Subnet'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">VpcId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">MyVPC</span>
      <span class="hljs-attr">CidrBlock:</span> <span class="hljs-string">'10.0.3.0/24'</span>
      <span class="hljs-attr">AvailabilityZone:</span> <span class="hljs-type">!Select</span> [<span class="hljs-number">0</span>, <span class="hljs-type">!GetAZs</span> <span class="hljs-string">''</span>]

  <span class="hljs-attr">PrivateSubnetAppTwo:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::Subnet'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">VpcId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">MyVPC</span>
      <span class="hljs-attr">CidrBlock:</span> <span class="hljs-string">'10.0.4.0/24'</span>
      <span class="hljs-attr">AvailabilityZone:</span> <span class="hljs-type">!Select</span> [<span class="hljs-number">1</span>, <span class="hljs-type">!GetAZs</span> <span class="hljs-string">''</span>]

  <span class="hljs-attr">PrivateSubnetDBOne:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::Subnet'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">VpcId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">MyVPC</span>
      <span class="hljs-attr">CidrBlock:</span> <span class="hljs-string">'10.0.5.0/24'</span>
      <span class="hljs-attr">AvailabilityZone:</span> <span class="hljs-type">!Select</span> [<span class="hljs-number">0</span>, <span class="hljs-type">!GetAZs</span> <span class="hljs-string">''</span>]

  <span class="hljs-attr">PrivateSubnetDBTwo:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::Subnet'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">VpcId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">MyVPC</span>
      <span class="hljs-attr">CidrBlock:</span> <span class="hljs-string">'10.0.6.0/24'</span>
      <span class="hljs-attr">AvailabilityZone:</span> <span class="hljs-type">!Select</span> [<span class="hljs-number">1</span>, <span class="hljs-type">!GetAZs</span> <span class="hljs-string">''</span>]

  <span class="hljs-attr">EIPOne:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::EIP'</span>

  <span class="hljs-attr">EIPTwo:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::EIP'</span>

  <span class="hljs-attr">NATGatewayOne:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::NatGateway'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">AllocationId:</span> <span class="hljs-type">!GetAtt</span> <span class="hljs-string">'EIPOne.AllocationId'</span>
      <span class="hljs-attr">SubnetId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">PublicSubnetOne</span>

  <span class="hljs-attr">NATGatewayTwo:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::NatGateway'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">AllocationId:</span> <span class="hljs-type">!GetAtt</span> <span class="hljs-string">'EIPTwo.AllocationId'</span>
      <span class="hljs-attr">SubnetId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">PublicSubnetTwo</span>

  <span class="hljs-attr">PublicRouteTable:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::RouteTable'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">VpcId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">MyVPC</span>

  <span class="hljs-attr">PublicRoute:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::Route'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">RouteTableId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">PublicRouteTable</span>
      <span class="hljs-attr">DestinationCidrBlock:</span> <span class="hljs-string">'0.0.0.0/0'</span>
      <span class="hljs-attr">GatewayId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">InternetGateway</span>

  <span class="hljs-attr">PublicSubnetOneRouteTableAssociation:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::SubnetRouteTableAssociation'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">SubnetId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">PublicSubnetOne</span>
      <span class="hljs-attr">RouteTableId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">PublicRouteTable</span>

  <span class="hljs-attr">PublicSubnetTwoRouteTableAssociation:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::SubnetRouteTableAssociation'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">SubnetId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">PublicSubnetTwo</span>
      <span class="hljs-attr">RouteTableId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">PublicRouteTable</span>

  <span class="hljs-attr">PrivateAppRouteTableOne:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::RouteTable'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">VpcId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">MyVPC</span>

  <span class="hljs-attr">PrivateAppRouteTableTwo:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::RouteTable'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">VpcId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">MyVPC</span>

  <span class="hljs-attr">PrivateAppRouteOne:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::Route'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">RouteTableId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">PrivateAppRouteTableOne</span>
      <span class="hljs-attr">DestinationCidrBlock:</span> <span class="hljs-string">'0.0.0.0/0'</span>
      <span class="hljs-attr">NatGatewayId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">NATGatewayOne</span>

  <span class="hljs-attr">PrivateAppRouteTwo:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::Route'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">RouteTableId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">PrivateAppRouteTableTwo</span>
      <span class="hljs-attr">DestinationCidrBlock:</span> <span class="hljs-string">'0.0.0.0/0'</span>
      <span class="hljs-attr">NatGatewayId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">NATGatewayTwo</span>

  <span class="hljs-attr">PrivateSubnetAppOneRouteTableAssociation:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::SubnetRouteTableAssociation'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">SubnetId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">PrivateSubnetAppOne</span>
      <span class="hljs-attr">RouteTableId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">PrivateAppRouteTableOne</span>

  <span class="hljs-attr">PrivateSubnetAppTwoRouteTableAssociation:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::SubnetRouteTableAssociation'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">SubnetId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">PrivateSubnetAppTwo</span>
      <span class="hljs-attr">RouteTableId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">PrivateAppRouteTableTwo</span>

  <span class="hljs-attr">PrivateDBRouteTable:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::RouteTable'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">VpcId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">MyVPC</span>

  <span class="hljs-attr">PrivateSubnetDBOneRouteTableAssociation:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::SubnetRouteTableAssociation'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">SubnetId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">PrivateSubnetDBOne</span>
      <span class="hljs-attr">RouteTableId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">PrivateDBRouteTable</span>

  <span class="hljs-attr">PrivateSubnetDBTwoRouteTableAssociation:</span>
    <span class="hljs-attr">Type:</span> <span class="hljs-string">'AWS::EC2::SubnetRouteTableAssociation'</span>
    <span class="hljs-attr">Properties:</span>
      <span class="hljs-attr">SubnetId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">PrivateSubnetDBTwo</span>
      <span class="hljs-attr">RouteTableId:</span> <span class="hljs-type">!Ref</span> <span class="hljs-string">PrivateDBRouteTable</span>
</code></pre>
<p>This YAML template creates the intended VPC, two public subnets, four private subnets, an internet gateway, two Elastic IP addresses, two NAT gateways, and the route tables and associations that tie them together. Here, you also leverage AWS CloudFormation’s capabilities to link resources and manage dependencies explicitly.</p>
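<p>If other stacks later need to reference these network resources, you could append an <strong>Outputs</strong> section with exports. Here's a sketch (the export names are made up for illustration):</p>

```yaml
Outputs:
  VpcId:
    Description: ID of the three-tier VPC
    Value: !Ref MyVPC
    Export:
      Name: three-tier-vpc-id
  PublicSubnetIds:
    Description: Public subnet IDs for the web tier
    Value: !Join [',', [!Ref PublicSubnetOne, !Ref PublicSubnetTwo]]
    Export:
      Name: three-tier-public-subnets
```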
<h3 id="heading-aws-cdk-example">AWS CDK Example</h3>
<p>When using the AWS CDK, you define cloud resources in an imperative programming style. It offers an abstraction over AWS CloudFormation with more flexibility through constructs, which can encapsulate multiple resources into a single logical unit. It also allows you to use loops, conditionals, and other programming logic to dynamically generate your resources.</p>
<p>Configuring resources like subnets is simplified by grouping them under <strong>subnet_configuration</strong> in a VPC construct, which automatically handles subnet associations for you.</p>
<p>Below, you'll use the Python programming language to define the three-tier solution with AWS CDK:</p>
<pre><code class="lang-python"><span class="hljs-keyword">from</span> constructs <span class="hljs-keyword">import</span> Construct
<span class="hljs-keyword">from</span> aws_cdk <span class="hljs-keyword">import</span> (
    Stack,
    aws_ec2 <span class="hljs-keyword">as</span> ec2
)

<span class="hljs-class"><span class="hljs-keyword">class</span> <span class="hljs-title">MyVpcStack</span>(<span class="hljs-params">Stack</span>):</span>
    <span class="hljs-function"><span class="hljs-keyword">def</span> <span class="hljs-title">__init__</span>(<span class="hljs-params">self, scope: Construct, id: str, **kwargs</span>):</span>
        super().__init__(scope, id, **kwargs)

        <span class="hljs-comment"># Create a VPC with specific configurations</span>
        vpc = ec2.Vpc(self, <span class="hljs-string">"MyVpc"</span>,
                      ip_addresses=ec2.IpAddresses.cidr(<span class="hljs-string">"10.0.0.0/16"</span>),
                      max_azs=<span class="hljs-number">2</span>,
                      subnet_configuration=[
                          ec2.SubnetConfiguration(
                              name=<span class="hljs-string">"PublicSubnet"</span>,
                              subnet_type=ec2.SubnetType.PUBLIC,
                              cidr_mask=<span class="hljs-number">24</span>
                          ),
                          ec2.SubnetConfiguration(
                              subnet_type=ec2.SubnetType.PRIVATE_WITH_EGRESS,
                              name=<span class="hljs-string">"PrivateSubnet1"</span>,
                              cidr_mask=<span class="hljs-number">24</span>
                          ),
                          ec2.SubnetConfiguration(
                              subnet_type=ec2.SubnetType.PRIVATE_ISOLATED,
                              name=<span class="hljs-string">"PrivateSubnet2"</span>,
                              cidr_mask=<span class="hljs-number">24</span>
                          )
                      ],
                      nat_gateways=<span class="hljs-number">2</span>,  <span class="hljs-comment"># Number of NAT Gateways</span>
                      )
</code></pre>
<p>As you can see, this AWS CDK Python script is more concise and lets you work with a familiar high-level programming language, which provides powerful abstractions through reusable constructs.</p>
<h3 id="heading-terraform-example">Terraform Example</h3>
<p>Terraform’s approach involves defining infrastructure using a declarative configuration language. But it differs from AWS CloudFormation in how it manages state and dependencies, and it allows more controlled resource creation, updating, and destruction through blocks like <strong>resource</strong>, <strong>provider</strong>, and <strong>variable</strong>.</p>
<p>Here’s how you define the same solution with Terraform:</p>
<pre><code class="lang-hcl">provider "aws" {
  region = "us-east-1"
}

resource "aws_vpc" "my_vpc" {
  cidr_block           = "10.0.0.0/16"
  enable_dns_support   = true
  enable_dns_hostnames = true
}

# Public Subnets
resource "aws_subnet" "public_subnet_one" {
  vpc_id                  = aws_vpc.my_vpc.id
  cidr_block              = "10.0.1.0/24"
  map_public_ip_on_launch = true
  availability_zone       = "us-east-1a"
}

resource "aws_subnet" "public_subnet_two" {
  vpc_id                  = aws_vpc.my_vpc.id
  cidr_block              = "10.0.2.0/24"
  map_public_ip_on_launch = true
  availability_zone       = "us-east-1b"
}

# Private Subnets for Application Tier
resource "aws_subnet" "private_app_subnet_one" {
  vpc_id            = aws_vpc.my_vpc.id
  cidr_block        = "10.0.3.0/24"
  availability_zone = "us-east-1a"
}

resource "aws_subnet" "private_app_subnet_two" {
  vpc_id            = aws_vpc.my_vpc.id
  cidr_block        = "10.0.4.0/24"
  availability_zone = "us-east-1b"
}

# Private Subnets for Database Tier
resource "aws_subnet" "private_db_subnet_one" {
  vpc_id            = aws_vpc.my_vpc.id
  cidr_block        = "10.0.5.0/24"
  availability_zone = "us-east-1a"
}

resource "aws_subnet" "private_db_subnet_two" {
  vpc_id            = aws_vpc.my_vpc.id
  cidr_block        = "10.0.6.0/24"
  availability_zone = "us-east-1b"
}

resource "aws_internet_gateway" "igw" {
  vpc_id = aws_vpc.my_vpc.id
}

resource "aws_nat_gateway" "nat_gateway_one" {
  allocation_id = aws_eip.nat_one.id
  subnet_id     = aws_subnet.public_subnet_one.id
}

resource "aws_nat_gateway" "nat_gateway_two" {
  allocation_id = aws_eip.nat_two.id
  subnet_id     = aws_subnet.public_subnet_two.id
}

resource "aws_eip" "nat_one" {
  domain = "vpc"
}

resource "aws_eip" "nat_two" {
  domain = "vpc"
}

# Public Route Table
resource "aws_route_table" "public_rt" {
  vpc_id = aws_vpc.my_vpc.id

  route {
    cidr_block = "0.0.0.0/0"
    gateway_id = aws_internet_gateway.igw.id
  }
}

# Private Route Tables for Application Tier
resource "aws_route_table" "private_app_rt_one" {
  vpc_id = aws_vpc.my_vpc.id

  route {
    cidr_block     = "0.0.0.0/0"
    nat_gateway_id = aws_nat_gateway.nat_gateway_one.id
  }
}

resource "aws_route_table" "private_app_rt_two" {
  vpc_id = aws_vpc.my_vpc.id

  route {
    cidr_block     = "0.0.0.0/0"
    nat_gateway_id = aws_nat_gateway.nat_gateway_two.id
  }
}

# Private Route Tables for Database Tier
resource "aws_route_table" "private_db_rt" {
  vpc_id = aws_vpc.my_vpc.id
}

# Route Table Associations
resource "aws_route_table_association" "public_subnet_one_association" {
  subnet_id      = aws_subnet.public_subnet_one.id
  route_table_id = aws_route_table.public_rt.id
}

resource "aws_route_table_association" "public_subnet_two_association" {
  subnet_id      = aws_subnet.public_subnet_two.id
  route_table_id = aws_route_table.public_rt.id
}

resource "aws_route_table_association" "private_app_subnet_one_association" {
  subnet_id      = aws_subnet.private_app_subnet_one.id
  route_table_id = aws_route_table.private_app_rt_one.id
}

resource "aws_route_table_association" "private_app_subnet_two_association" {
  subnet_id      = aws_subnet.private_app_subnet_two.id
  route_table_id = aws_route_table.private_app_rt_two.id
}

resource "aws_route_table_association" "private_db_subnet_one_association" {
  subnet_id      = aws_subnet.private_db_subnet_one.id
  route_table_id = aws_route_table.private_db_rt.id
}

resource "aws_route_table_association" "private_db_subnet_two_association" {
  subnet_id      = aws_subnet.private_db_subnet_two.id
  route_table_id = aws_route_table.private_db_rt.id
}
</code></pre>
<p>This script shows how Terraform allows for a modular approach to infrastructure as code, with explicit definitions and dependency management with syntax that is relatively easy to read and write.</p>
<h2 id="heading-analysis-and-comparison">Analysis and Comparison</h2>
<p>When choosing between AWS CloudFormation, AWS CDK, and Terraform for managing cloud infrastructure, you have to consider a number of factors. In this article, we'll specifically focus on <strong>ease of use</strong>, <strong>flexibility</strong>, <strong>scalability</strong>, <strong>language support</strong>, and the <strong>ability to handle complex environments</strong>. </p>
<h3 id="heading-ease-of-use-and-learning-curve">Ease of Use and Learning Curve</h3>
<p>AWS CloudFormation offers a JSON- or YAML-based template format. This is straightforward for defining infrastructure, but can become complex as infrastructure grows. It requires an understanding of CloudFormation-specific syntax and AWS resource definitions, which can mean a steeper learning curve for those not already familiar with them.</p>
<p>AWS CDK uses familiar programming languages like Python, JavaScript, TypeScript and Java. This can make it more accessible for developers already familiar with these languages. </p>
<p>Also, since AWS CDK lets you define infrastructure in actual code, you can express logic, conditionals, and loops naturally, and it abstracts away much of the boilerplate that AWS CloudFormation requires. This simplifies the development process.</p>
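<p>As a plain-Python illustration of that point (this sketch only builds plain dictionaries, it doesn't call the actual CDK API), a loop can generate per-tier subnet settings that a static template would have to spell out one by one:</p>

```python
# Illustrative only: generate subnet settings for each tier and AZ with a loop,
# something a static JSON/YAML template can't express directly.
tiers = [("public", True), ("app", False), ("db", False)]
azs = ["a", "b"]

subnet_configs = [
    {"name": f"{tier}-subnet-{az}", "public": is_public, "cidr_mask": 24}
    for tier, is_public in tiers
    for az in azs
]

for cfg in subnet_configs:
    print(cfg)
```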
<p>Terraform uses its own domain-specific language, HashiCorp Configuration Language (HCL), which is designed to be easily readable and writable by humans. While it can be easy to learn, you'll need to be familiar with another new language. However, its declarative nature allows clear definitions of <strong>what</strong> the infrastructure should look like without the need of specifying <strong>how</strong> to achieve it.</p>
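<p>For example, in HCL you declare the end state you want, and Terraform figures out the API calls needed to get there. Here's a standalone fragment (separate from the full example earlier) showing a VPC whose CIDR comes from a variable:</p>

```hcl
# Declare *what* should exist; Terraform plans *how* to create it.
variable "vpc_cidr" {
  type    = string
  default = "10.0.0.0/16"
}

resource "aws_vpc" "example" {
  cidr_block = var.vpc_cidr
}
```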
<h3 id="heading-flexibility-and-cloud-provider-support">Flexibility and Cloud Provider Support</h3>
<p>AWS CloudFormation is tightly integrated with AWS and is updated in tandem with AWS services. But it’s inherently limited to AWS, making it less suitable for hybrid or multi-cloud environments.</p>
<p>AWS CDK also primarily targets AWS services but supports the use of AWS CloudFormation custom resources to manage resources outside of AWS. Still, it doesn’t naturally lend itself to managing multi-cloud resources as directly as Terraform.</p>
<p>Terraform is designed to be cloud-agnostic, supporting multiple providers including AWS, Microsoft Azure, Google Cloud Platform and others. This makes it an ideal choice for complex deployments spanning more than one cloud provider.</p>
<h3 id="heading-scalability-and-maintainability">Scalability and Maintainability</h3>
<p>AWS CloudFormation templates can become unwieldy and difficult to manage as projects scale. AWS provides nested stacks as a way to manage large infrastructures, but even with this capability, keeping track of many stacks can become cumbersome.</p>
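<p>A nested stack is itself declared as a resource: the parent template points at a child template stored in S3. Here's a sketch (the bucket URL and parameter name are placeholders):</p>

```yaml
NetworkStack:
  Type: 'AWS::CloudFormation::Stack'
  Properties:
    TemplateURL: 'https://s3.amazonaws.com/my-templates-bucket/network.yaml' # placeholder
    Parameters:
      VpcCidr: '10.0.0.0/16'
```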
<p>AWS CDK provides high-level abstractions and modular constructs, making it easier to manage and scale large infrastructures by breaking them down into smaller, reusable components.</p>
<p>Terraform excels in managing large-scale infrastructures due to its modular approach. By using Terraform modules, you can reuse configurations and ensure consistency across deployments.</p>
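<p>For instance, a module written once can be instantiated per environment with different inputs (the module path and variable name below are hypothetical):</p>

```hcl
# Reuse one network module across environments with different inputs.
module "network_staging" {
  source   = "./modules/three-tier-network" # hypothetical local module
  vpc_cidr = "10.1.0.0/16"
}

module "network_production" {
  source   = "./modules/three-tier-network"
  vpc_cidr = "10.0.0.0/16"
}
```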
<h3 id="heading-community-support-and-ecosystem">Community Support and Ecosystem</h3>
<p>AWS CloudFormation has great adoption and support from AWS with a large user base, but its community contributions are limited to sharing templates.</p>
<p>AWS CDK is open-source and has a growing community, especially among developers preferring to use general-purpose programming languages for infrastructure management. The ecosystem includes a rich set of high-level constructs developed by both AWS and the community.</p>
<p>Terraform benefits from strong community engagement and a vast ecosystem of providers and modules shared publicly in the Terraform Registry. Its wide adoption across different platforms also helps foster a large and active community.</p>
<h3 id="heading-code-length-and-complexity">Code Length and Complexity</h3>
<p>AWS CloudFormation scripts tend to be verbose, requiring detailed specifications of every property. This can lead to lengthy and complex templates for larger infrastructures.</p>
<p>AWS CDK scripts are typically shorter and less complex due to the use of programming constructs that abstract away much of the detailed specifications required in AWS CloudFormation.</p>
<p>Terraform configurations strike a balance, being more concise than AWS CloudFormation but typically more verbose than AWS CDK due to its declarative nature, which requires explicit resource and configuration definitions.</p>
<h2 id="heading-why-choose-one-over-the-other">Why Choose One Over the Other?</h2>
<p>When choosing between AWS CloudFormation, AWS CDK, and Terraform, consider each tool's unique features, operational principles, and your own preferences.</p>
<p>Now I'll share recommendations based on this article's information to help you figure out when it’s best to use each of these IaC tools.</p>
<ul>
<li>AWS CloudFormation is suitable when you're looking for stable, native AWS tooling and don’t need to manage resources outside of AWS. It’s particularly great when compliance with specific AWS configurations is required.</li>
<li>Choose AWS CDK when you prefer using standard programming languages and enjoy the benefits of object-oriented techniques to create reusable and modular cloud components. It is usually more appealing to developers who want to apply software development best practices to infrastructure provisioning.</li>
<li>Terraform is the leading choice for multi-cloud environments, or if you need a tool that is both powerful and flexible enough to manage complex architectures. It is also the right choice if you anticipate integrating a variety of cloud services and need a unified approach to manage them.</li>
</ul>
<p>Though these recommendations are based on the distinct makeup of each of these IaC tools, I advise you to gain some hands-on experience with each one so you can decide which best aligns with the specific skills and needs of your team and projects.</p>
<p>If you’ve made it this far, <strong>thanks so much for reading!</strong> I hope it was worthwhile to you.</p>
<p>If you want to learn more about me and my story of transitioning from a Pro Athlete to a Cloud Engineer, connect with me <strong><a target="_blank" href="https://www.linkedin.com/in/ifeanyi-otuonye/">here on LinkedIn</a></strong>.</p>
 ]]>
                </content:encoded>
            </item>
        
    </channel>
</rss>
