Deployed in a Day: Github Actions, Docker, and AWS App Runner | by Shaun Ganley | Mar, 2022

My exact process to deploy applications on the cloud


All developers will at some point find themselves with an application they have built and run on their local machine that they would love to have running in the cloud, but lack the time or knowledge to deploy it without getting their hands dirty in server configuration.

As part of a training program I run for trainee software engineers, we build a simple application that allows a user to create, edit and delete employees. The trainees start by building a database, then build a REST API to perform CRUD operations, and finally add a user interface. The approach taken below will work for any framework that plays nicely with Docker.

Part of modern-day software engineering involves much more than writing code and throwing it over the wall to a Platforms team who are responsible for deploying the app. In order to develop well-rounded software engineers, it’s vital that they understand how these applications go from running on a local machine to running in production.

The process below outlines how I used GitHub Actions, Docker and AWS App Runner to deploy the applications mentioned above in the cloud.

Let’s start with the database layer. This is quite simple, as cloud providers offer a limited number of ways to host your database in the cloud. As my background is in AWS, I’m using the Amazon Relational Database Service (RDS), which allows you to create your database with a few simple clicks via the console. This can also be automated using Infrastructure as Code tools such as Terraform or the AWS CDK, which makes it easy to scale your application across any number of test environments if needed.
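If you prefer the command line to the console, the same instance can be created with the AWS CLI. A minimal sketch, assuming a MySQL engine and placeholder identifiers and credentials:

```shell
# Create a small MySQL instance for the demo
# (instance identifier, username and password here are placeholders)
aws rds create-db-instance \
  --db-instance-identifier employee-demo-db \
  --db-instance-class db.t3.micro \
  --engine mysql \
  --allocated-storage 20 \
  --master-username admin \
  --master-user-password '<your_password>' \
  --publicly-accessible
```

The instance takes a few minutes to become available; the endpoint it exposes is what you’ll use as the host in the steps below.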

Now that we have our database created, we can move on to creating the database structure which we will use for our application data. As a common best practice for database development, you should never run Data Definition Language (DDL) statements directly against your database; instead, script them so they are re-runnable and reviewable. Using this database script, you can create a GitHub Actions job on the repository where you store the script to load it into your database. This can run on a pull request if you use feature environments, or on commit to master when you want to update your test environment.

mysql --host=<your_host> --user=<your_username> --password=<your_password> <your_database_name> < employeesdb.sql
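A minimal workflow for running that command in CI might look like the following. The secret names and the script path (employeesdb.sql) are assumptions for illustration; the mysql client is preinstalled on GitHub’s ubuntu-latest runners.

```yaml
name: Load database schema
on:
  push:
    branches: [ master ]
jobs:
  load-schema:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      # Run the scripted DDL against the database using repository secrets
      - name: Apply schema script
        run: |
          mysql --host=${{ secrets.DB_HOST }} --user=${{ secrets.DB_USERNAME }} \
            --password=${{ secrets.DB_PASSWORD }} ${{ secrets.DB_NAME }} < employeesdb.sql
```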

Next up, we want to containerise our applications. In my case I’m using a Java Dropwizard REST API and a Node.js Express frontend, but as mentioned above this approach will work for any application that can run under Docker. A container in this context packages all of the code and dependencies required to run your application anywhere that supports containers, and Docker lets you build and run those containers.

GitHub Actions has a Docker image workflow template that will generate a configuration file for you to run the docker build command on merge to master. Once GitHub Actions has built the container image, we need to store it somewhere so that we can use it to run the applications. In my case I’m using AWS Elastic Container Registry (ECR) to store my API and UI images; there are plenty of alternatives, but ECR integrates easily with whichever AWS service you want to use to run the application.
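An ECR repository needs to exist before the workflow can push to it. A one-off sketch with the AWS CLI, using the repository name that appears in the workflow below:

```shell
# Create the repository that the workflow pushes images into
aws ecr create-repository \
  --repository-name employee_demo \
  --region eu-west-1
```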

From the sample code below you can see the use of secrets. GitHub allows you to create secrets at the repository or account level. You can use these secrets to store the credentials needed to push the container to a container registry; below, I have access credentials for a user with permission to access ECR. You can also use secrets to store the database connection details for your API, and in your frontend you can use a secret to store your API URL to make it configurable at build time. As you can see, the docker build command passes the secrets to the container at this stage, which means we don’t need to store the secrets where we run our application. In the case of AWS App Runner this matters, because the service doesn’t allow secrets to be passed to the application, only unencrypted environment variables.

name: Docker Image CI

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: eu-west-1
      - name: Login to Amazon ECR
        id: login-ecr
        uses: aws-actions/amazon-ecr-login@v1
      - name: Build & push the Docker image
        env:
          DB_PASSWORD: ${{ secrets.DB_PASSWORD }}
          DB_USERNAME: ${{ secrets.DB_USERNAME }}
          DB_HOST: ${{ secrets.DB_HOST }}
          DB_NAME: ${{ secrets.DB_NAME }}
          ECR_REGISTRY: ${{ steps.login-ecr.outputs.registry }}
          ECR_REPOSITORY: employee_demo
          IMAGE_TAG: api
        run: |
          docker build --build-arg DB_PASSWORD=${DB_PASSWORD} --build-arg DB_USERNAME=${DB_USERNAME} --build-arg DB_HOST=${DB_HOST} --build-arg DB_NAME=${DB_NAME} -t $ECR_REGISTRY/$ECR_REPOSITORY:$IMAGE_TAG .
          docker push $ECR_REGISTRY/$ECR_REPOSITORY:$IMAGE_TAG
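On the image side, build args passed with --build-arg are received by ARG instructions in the Dockerfile and can be baked in as environment variables. A minimal Dockerfile sketch for a Dropwizard API; the jar name, config file name, and port are assumptions for illustration:

```dockerfile
FROM eclipse-temurin:11-jre

# Build-time arguments supplied by the workflow's --build-arg flags
ARG DB_PASSWORD
ARG DB_USERNAME
ARG DB_HOST
ARG DB_NAME

# Expose them to the application at runtime
ENV DB_PASSWORD=${DB_PASSWORD} \
    DB_USERNAME=${DB_USERNAME} \
    DB_HOST=${DB_HOST} \
    DB_NAME=${DB_NAME}

WORKDIR /app
COPY target/employee-api.jar app.jar
COPY config.yml config.yml

EXPOSE 8080
# Dropwizard's standard entry point: run the "server" command with a config file
CMD ["java", "-jar", "app.jar", "server", "config.yml"]
```

One trade-off to be aware of: values baked into the image this way are visible via docker history, so keep the image in a private registry such as ECR.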

Now that we have our container built and stored, we just need somewhere to run the application. Again, there are lots of options for running containers in the cloud; which one to choose comes down to you, the needs of your application, and how much control you want over your server configuration.

For this simple demo application I didn’t need fine-grained control over configuration, so I decided to use AWS App Runner, a fully managed service that runs your container and gives you a URL to access your application within a few minutes. You can use Infrastructure as Code to automate this deployment to test environments if required. App Runner also scales automatically, which means you only pay for the compute your application uses: the less traffic to your application, the lower your AWS bill at the end of the month.

App Runner is currently limited to only a few regions so you might want to check which region is nearest to you and create your ECR repository in the same region as your App Runner service.
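Creating the service can also be scripted. A hedged sketch with the AWS CLI, assuming the image pushed earlier and an IAM access role that allows App Runner to pull from ECR (account ID and role name are placeholders):

```shell
# Create an App Runner service from the ECR image pushed by the workflow
aws apprunner create-service \
  --service-name employee-api \
  --region eu-west-1 \
  --source-configuration '{
    "ImageRepository": {
      "ImageIdentifier": "<account_id>.dkr.ecr.eu-west-1.amazonaws.com/employee_demo:api",
      "ImageRepositoryType": "ECR",
      "ImageConfiguration": { "Port": "8080" }
    },
    "AuthenticationConfiguration": {
      "AccessRoleArn": "arn:aws:iam::<account_id>:role/<apprunner_access_role>"
    }
  }'
```

The response includes the ServiceUrl, which is the public URL App Runner assigns to your application.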
