AWS Multi-Account GitOps Deployment 3: AWS GitHub Deployment

Welcome to the final installment of our series on multi-account GitOps deployment on AWS. After setting up a multi-account AWS organizational structure and integrating AWS accounts with GitHub Actions using short-lived credentials, we’re now ready to dive into the GitHub Actions deployment process.

The Importance of CI/CD in Modern Software Development

Continuous Integration and Continuous Delivery (CI/CD) is one of the most important practices in modern software development. It allows developers to build, test, and see the effect of their changes to the codebase in an automated fashion. Among the many providers available for creating pipelines, such as GitHub Actions, GitLab CI/CD, and AWS CodePipeline, we will use GitHub Actions for this series. The reason is that GitHub Actions is free, integrates seamlessly with GitHub, one of the best code repositories, and benefits from a huge number of high-quality custom actions created and maintained by the community.

Setting Up the GitHub Repository for Deployment

In this tutorial, we will create a simple serverless API consisting of a single Lambda function behind an API Gateway that responds differently depending on the environment the stack is deployed in.

API response with respect to environment

Initialize the Repository: Begin by creating a new directory for your project and initializing it as a git repository.

mkdir simple-api && cd simple-api
git init
npx projen new awscdk-app-ts

Setting Up AWS CDK and Projen: Update the .projenrc.ts configuration file similar to the following:

import { awscdk } from 'projen';
import { ApprovalLevel } from 'projen/lib/awscdk';

const project = new awscdk.AwsCdkTypeScriptApp({
  authorEmail: 'utku.demir@luminis.eu',
  authorName: 'Utku Demir',
  cdkVersion: '2.96.2',
  defaultReleaseBranch: 'main',
  name: 'simple-api',
  description: 'A CDK project for Simple Api GitOps Deployments',
  github: false,
  projenrcTs: true,
  keywords: [
    'AWS CDK',
    'projen',
    'Typescript',
    'Deployment',
  ],
  requireApproval: ApprovalLevel.NEVER,
  gitignore: ['.idea'],
  license: 'MIT',
  licensed: true,
});
project.synth();
After updating, generate the project:

yarn projen

Implementing the CDK Stack: Create a new file named simple_api_stack.ts under src with the configuration for our stack:

import { Stack, StackProps } from 'aws-cdk-lib';
import * as apigateway from 'aws-cdk-lib/aws-apigateway';
import { EndpointType } from 'aws-cdk-lib/aws-apigateway';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import { Construct } from 'constructs';

export interface SimpleApiStackProps extends StackProps {
  environment: string;
}

export class SimpleApiStack extends Stack {
  constructor(scope: Construct, id: string, props: SimpleApiStackProps) {
    super(scope, id, props);

    // Define the Lambda function
    const helloLambda = new lambda.Function(this, 'HelloLambda', {
      runtime: lambda.Runtime.NODEJS_18_X,
      handler: 'index.handler',
      code: lambda.Code.fromInline(`
                exports.handler = async function(event, context) {
                    return {
                        statusCode: 200,
                        body: JSON.stringify({ message: "Hello, World from ${props.environment} environment!" })
                    };
                };
            `),
    });

    // Define the API Gateway
    new apigateway.LambdaRestApi(this, 'Endpoint', {
      handler: helloLambda,
      proxy: true,
      deploy: true,
      cloudWatchRole: true,
      endpointTypes: [EndpointType.EDGE],
    });
  }
}

Here, we create an edge-optimized API Gateway REST API with a single Lambda handler that returns a message formatted with the current environment when a GET request is made to the base URL. For example, when deployed to the dev environment, a GET request returns {"message": "Hello, World from dev environment!"}.

Lastly, edit main.ts under src to include this stack as follows:

import { App } from 'aws-cdk-lib';
import { SimpleApiStack } from './simple_api_stack';


// for development, use account/region from cdk cli
const devEnv = {
  account: process.env.AWS_ACCOUNT,
  region: process.env.AWS_REGION,
};

const app = new App();

const environment = process.env.ENVIRONMENT || 'dev';

new SimpleApiStack(app, `${environment}-simple-api`, {
  env: devEnv,
  environment: environment,
});

app.synth();

One important thing to note here is that the ENVIRONMENT environment variable will be injected through GitHub Actions.

Integrating AWS with GitHub Actions

Repository Environments: Under the settings of the repository, you will find the Environments tab. This allows you to manage the environments for your application. Create all the environments, namely dev, test, and prod, as follows:

Addition of dev, test and prod GitHub repository environments

Environment Secrets and Variables: Set up the necessary secrets and variables on each environment. These store sensitive information such as AWS account details, and ensuring they are securely managed rather than hard-coded in your scripts is critical.

Addition of AWS_ACCOUNT, AWS_REGION and ENVIRONMENT environment secrets and variables

Here, AWS_ACCOUNT is the 12-digit account id corresponding to the environment that the stack will be deployed in.

Environment Protection: Add protection rules to the test and prod environments. This step is crucial to ensure that changes can only be deployed to these environments under certain conditions, such as after code reviews or passing automated tests. It works like a manual approval gate. Keep in mind that for private repositories this is only possible if your organization is on a GitHub Enterprise plan. If that is not the case for your organization, I would suggest taking a look at the excellent custom manual approval action made by trstringer, linked in the references; a minimal usage sketch follows the screenshot below.

Enable required reviewers for your test and prod environments
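
For completeness, here is a rough sketch of how that alternative could be wired into a deploy job as an extra step before the actual deployment. This is an assumption based on the action's documented inputs, so double-check the input names against its README before relying on it:

      - name: Wait for manual approval
        uses: trstringer/manual-approval@v1
        with:
          secret: ${{ github.token }}
          # Placeholder: comma-separated GitHub usernames allowed to approve
          approvers: your-github-username
          minimum-approvals: 1

The action opens an issue and waits for an approving comment, so the job running it also needs issues: write in its permissions block.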

Repository Secrets: Add the DEPLOY_ROLE as a repository secret in GitHub. This role is what GitHub Actions assumes to interact with your AWS environment securely, and since it lives in the deployment account, it is the same for all environments.

Add repository secret of DEPLOY_ROLE

Replace the censored part with your deployment AWS account id.
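
For reference, the value of this secret is the full ARN of the deploy role created in the previous part of this series; it will look something like arn:aws:iam::123456789012:role/deploy-role, where both the account id and the role name are placeholders to be replaced with your own.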

Creating the Deployment Workflow

Here are the necessary steps to create a GitHub Actions workflow that will deploy our CDK stack to the AWS cloud.

  • Workflow File: Write a GitHub Actions workflow file. This file will define the steps and conditions under which your code is deployed to AWS.
  • Triggering Deployments: Set up triggers for deployment. For example, you might trigger deployments on pushing to specific branches or tagging a release.
  • Deployment Steps: Detail the steps for deploying your application. This may include building your application, running tests, and using AWS CDK commands to deploy to AWS.
  • Post-Deployment Verification: Implement steps to verify the successful deployment of your application. This might include health checks, smoke tests, or rollback procedures in case of failure.

Following is the deploy workflow file for our simple API, covering the dev environment:

# .github/workflows/deploy.yml
name: deploy

on:
  push:
    branches:
      - main

jobs:
  build:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: ./simple-api
    steps:
      - name: Pull repository
        uses: actions/checkout@v3

      - name: Setup Nodejs and npm
        uses: actions/setup-node@v3
        with:
          node-version: "18"

      - name: Setup yarn
        run: npm install -g yarn

      - name: Setup Nodejs with yarn caching
        uses: actions/setup-node@v3
        with:
          node-version: "18"
          cache: yarn
          cache-dependency-path: './simple-api/yarn.lock'

      - name: Install dependencies
        run: yarn install --frozen-lockfile

      - name: Build and test the project
        run: npx projen build

  deploy-dev-simple-api:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: ./simple-api
    needs: build
    environment: dev
    env:
      AWS_ACCOUNT: ${{ secrets.AWS_ACCOUNT }}
      AWS_REGION: ${{ vars.AWS_REGION }}
      ENVIRONMENT: ${{ vars.ENVIRONMENT }}
    permissions:
      id-token: write
      contents: read
    steps:
      - name: Pull repository
        uses: actions/checkout@v3

      - name: Setup Nodejs and npm
        uses: actions/setup-node@v3
        with:
          node-version: "18"

      - name: Setup yarn
        run: npm install -g yarn

      - name: Setup Nodejs with yarn caching
        uses: actions/setup-node@v3
        with:
          node-version: "18"
          cache: yarn
          cache-dependency-path: './simple-api/yarn.lock'

      - name: Install dependencies
        run: yarn install --frozen-lockfile

      - name: Assume deploy role
        uses: aws-actions/configure-aws-credentials@v2
        with:
          role-to-assume: ${{ secrets.DEPLOY_ROLE }}
          aws-region: ${{ vars.AWS_REGION }}

      - name: Deploy the application
        run: npx projen deploy --all

This workflow file triggers when a push is made to our main branch. It first builds and tests the stack and then runs the deploy job for the dev environment. To implement the test and prod deploy jobs, copy the dev job and replace dev with test, then follow the same process for prod. It is also crucial to make these jobs run sequentially so that the deployments to the environments don’t happen all at once. To achieve that, the test deployment must depend on dev and the prod deployment on test: set the needs field to the name of the dev or test job, respectively. A sketch of the resulting test job is shown below.
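
As an illustration, here is a minimal sketch of what the test job could look like; the job name deploy-test-simple-api is my own choice, and the checkout, Node.js, yarn, and dependency installation steps are identical to the dev job, so they are abbreviated here:

  deploy-test-simple-api:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: ./simple-api
    # Only runs after the dev deployment has succeeded
    needs: deploy-dev-simple-api
    # Picks up the test environment's secrets, variables and protection rules
    environment: test
    env:
      AWS_ACCOUNT: ${{ secrets.AWS_ACCOUNT }}
      AWS_REGION: ${{ vars.AWS_REGION }}
      ENVIRONMENT: ${{ vars.ENVIRONMENT }}
    permissions:
      id-token: write
      contents: read
    steps:
      # ... same checkout, Node.js setup, yarn setup and install steps as in the dev job ...
      - name: Assume deploy role
        uses: aws-actions/configure-aws-credentials@v2
        with:
          role-to-assume: ${{ secrets.DEPLOY_ROLE }}
          aws-region: ${{ vars.AWS_REGION }}

      - name: Deploy the application
        run: npx projen deploy --all

The prod job mirrors this one with needs: deploy-test-simple-api and environment: prod.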

For this tutorial, I haven’t included the fourth step, post-deployment health checks, but in a production environment it is crucial to make sure everything is working as expected; a minimal example of such a check is sketched after this paragraph.
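
As a starting point, such a check could be a single extra step at the end of each deploy job that calls the freshly deployed endpoint and fails the job if it does not respond. The sketch below is only a rough idea: it assumes the endpoint URL can be read from the CloudFormation stack outputs, and the output index may need adjusting for your stack:

      - name: Smoke test the endpoint
        run: |
          # Read the endpoint URL from the stack outputs (the Endpoint output is generated by LambdaRestApi)
          API_URL=$(aws cloudformation describe-stacks \
            --stack-name "${ENVIRONMENT}-simple-api" \
            --query "Stacks[0].Outputs[0].OutputValue" \
            --output text)
          # Fail the step, and therefore the job, if the API does not answer successfully
          curl --fail --silent --show-error "$API_URL"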

The Deployment of the AWS CDK Stack

Finally, when we push our application to the remote main branch, we should see our workflow start running. Make sure the working directories within the workflow file are set correctly to achieve a successful run!

Workflow runs in your repository actions pane

When the deployment finishes, you should be able to call the API endpoint presented in the stack outputs within the workflow logs; see the note after the screenshot below.

Access the API endpoint assigned by API Gateway in GitHub Actions
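
In our case, with the default prod stage that LambdaRestApi creates, the URL follows the usual API Gateway shape, so calling something like https://<api-id>.execute-api.<region>.amazonaws.com/prod/ (placeholder values) from a terminal or browser should return the environment-specific greeting from the Lambda function.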

Here is how the manual approval step appears on the GitHub Actions workflow page:

Manual approval step in a GitHub workflow

Conclusion on AWS GitHub Deployment

As we conclude our series on Multi-Account GitOps Deployment on AWS using GitHub Actions, let’s reflect on the journey we’ve taken and the key takeaways from the process.

Embracing Automation and Efficiency

Through this series, we’ve explored how to effectively leverage automation to manage and deploy resources across multiple AWS accounts. By integrating AWS with GitHub Actions, we’ve demonstrated a powerful combination of cloud infrastructure management and modern CI/CD practices.

Advantages of Multi-Account Strategy

The multi-account strategy on AWS, when combined with GitOps practices, offers enhanced security, better resource segregation, and more granular control over access and billing. This approach is particularly beneficial for larger teams and organizations aiming to scale their cloud infrastructure efficiently.

The Power of GitHub Actions in a Multi-Account Setup

GitHub Actions stands out as a versatile and user-friendly tool for CI/CD, providing a seamless integration with GitHub repositories. Its ability to handle complex workflows, along with the vast community-driven actions available, makes it an excellent choice for teams looking to implement GitOps.

Key Learning Points

  • Setting Up the Foundation: The importance of establishing a strong organizational structure within AWS to support multi-account deployment.
  • Integrating Tools: How to effectively integrate AWS with GitHub Actions, taking advantage of short-lived credentials for enhanced security.
  • Deployment Workflows: Crafting GitHub Actions workflows to automate the deployment process, ensuring consistency and reliability.
  • Security and Best Practices: Emphasizing the importance of security best practices, particularly in managing secrets and permissions.

Encouraging Continuous Learning

The landscape of cloud computing and DevOps is ever-evolving. This series, while comprehensive, is just the beginning. I encourage you to keep experimenting, learning, and adapting to new tools and methodologies.

Your Next Steps

With the foundation set and the knowledge gained, you’re now equipped to extend and customize your GitOps workflows. Experiment with different types of deployments, explore new GitHub Actions, and continue to refine your AWS multi-account management strategy. As you make progress, share your learnings and collaborate with the community. The collective knowledge and experiences help everyone grow and innovate.

Final Thoughts

Thank you for joining me in this series. I hope it has been enlightening and empowering. As you embark on your journey of cloud engineering, remember that the path to mastery is through continuous practice and adaptation.

References

For further reading and advanced topics, here are some resources:


Happy Cloud Engineering and until next time!
