Introducing the AWS CodeDeploy Multiplexer

Now you can package applications from one or more GitHub repositories into a single artifact that's ready to deploy.


At Rackspace, a common request we get from customers is the ability to deploy multiple applications onto a single EC2 Auto Scaling group using AWS CodeDeploy.

Unfortunately, CodeDeploy doesn't handle this scenario very well on its own. Generally, the best approach is to package all of the application code into a single artifact and deploy it in one operation.

To make this easier for our customers, we built a solution that takes applications from one or more GitHub repositories and packages them into a single artifact ready to be deployed with CodeDeploy. The solution combines a small CLI tool that runs in AWS CodeBuild with an API Gateway webhook that automatically builds a new artifact every time code is pushed to GitHub.


How it works

GitHub sends a webhook for every push event to API Gateway, which passes the payload on to a Lambda function. The Lambda function does a quick check and determines which artifacts should be built based on the Git branch the push originated from.
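For illustration, the branch name comes from the ref field in GitHub's push payload. The Lambda function itself is written in Python, but the extraction is equivalent to the following (push-event.json is a hypothetical saved copy of the payload):

    $ jq -r '.ref' push-event.json
    refs/heads/master
    $ jq -r '.ref | sub("refs/heads/"; "")' push-event.json
    master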

Once the artifacts are determined, the Lambda function initiates a CodeBuild run to build them. CodeBuild runs a Python-based CLI tool that downloads zip files from GitHub, takes the AppSpec file and application code from each zip file, and combines them into a single artifact. As part of the artifact creation, the Python code rewrites the directory paths in the newly created AppSpec file.

For example, consider two application repositories, each containing its own AppSpec file and application code.

The above two application repos will be merged into a new artifact with the following structure:
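The original diagram isn't reproduced here, so the layout below is only a hypothetical sketch; the repository names and directory names are illustrative assumptions:

    app-one/                    (first source repository)
        appspec.yml
        scripts/
        src/

    app-two/                    (second source repository)
        appspec.yml
        scripts/
        src/

    combined artifact:
        appspec.yml             (one AppSpec file combining both repos' files and
                                 hooks sections, with source paths rewritten to
                                 app-one/... and app-two/...)
        app-one/
            scripts/
            src/
        app-two/
            scripts/
            src/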


Once the artifact is created, CodeBuild pushes it into S3, where it's ready to be deployed using CodeDeploy via the AWS CLI or the AWS Management Console.
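For illustration, a deployment from the artifact bucket via the AWS CLI might look like this (the application name, deployment group, and object key are placeholders for your own values):

    $ aws deploy create-deployment \
        --application-name <codedeploy-application> \
        --deployment-group-name <deployment-group> \
        --s3-location bucket=aws-cd-artifacts,key=<artifact-key>.zip,bundleType=zip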

Create the bucket and configure the solution

This solution requires an S3 bucket to store the configuration file as well as the deployment package. This post uses aws-cd-artifacts as the bucket name; be sure to choose your own unique bucket name when following along. Create the S3 bucket:
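For example, with the AWS CLI:

    $ aws s3 mb s3://aws-cd-artifacts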

Once the S3 bucket is created, the configuration file and code package need to be uploaded. To generate the package, GNU Make and zip need to be installed on the local machine. This solution was created on OS X, but running it on a Linux desktop should work as well.

Clone the GitHub repository and run the Make command from the root of the repository.
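For example (the repository URL is a placeholder for the solution repository referenced in this post):

    $ git clone <solution-repository-url>
    $ cd <repository-directory>
    $ make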

There should now be a file named build.zip in the directory; upload it to the newly created bucket.
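For example (the object key here is an assumption; place the package wherever the CloudFormation template expects it):

    $ aws s3 cp build.zip s3://aws-cd-artifacts/build.zip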

Finally, a configuration file needs to be created. We recommend copying the existing multiplexer.json.example file to multiplexer.json and editing it with the applications that need to be merged. Once it has been updated, upload it to S3:
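For example (again, the destination key is an assumption; use whatever location the solution expects):

    $ cp multiplexer.json.example multiplexer.json
    $ # edit multiplexer.json to list the applications/repositories to merge
    $ aws s3 cp multiplexer.json s3://aws-cd-artifacts/multiplexer.json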

Now that everything is created and uploaded, the next step is to create a CloudFormation stack using the template located in the root of the solution repository (cloudformation.yaml). Make sure the GitHub token has access to all of the repositories listed in the configuration file; it is used to authenticate with GitHub and download the individual repositories.
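A sketch of the stack creation with the AWS CLI; the stack name is arbitrary, and the parameter names shown here (GitHubToken, ArtifactBucket) are assumptions, so check the Parameters section of cloudformation.yaml for the actual names:

    $ aws cloudformation create-stack \
        --stack-name codedeploy-multiplexer \
        --template-body file://cloudformation.yaml \
        --capabilities CAPABILITY_IAM \
        --parameters ParameterKey=GitHubToken,ParameterValue=<github-token> \
                     ParameterKey=ArtifactBucket,ParameterValue=aws-cd-artifacts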

Finally, ensure the GitHub webhooks are created on each repository so that any pushes trigger a new artifact build. Use the outputs of the new stack to fill in the GitHub webhook information, and ensure the content type is set to application/json.

[Screenshot: GitHub webhook settings with the payload URL from the stack outputs and content type application/json]
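The webhooks can be created from each repository's settings page in GitHub. If you would rather script the step, a sketch against the GitHub REST API looks like the following (the owner, repository, token, and stack-output URL are placeholders):

    $ curl -X POST \
        -H "Authorization: token <github-token>" \
        -H "Accept: application/vnd.github.v3+json" \
        https://api.github.com/repos/<owner>/<repo>/hooks \
        -d '{
              "name": "web",
              "active": true,
              "events": ["push"],
              "config": {
                "url": "<api-gateway-url-from-stack-outputs>",
                "content_type": "json"
              }
            }'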

The flexibility of the solution means you should only have to deploy it once and configure it for all of the applications you manage across your fleet. It can also be extended into a full CD solution by using S3 notifications and AWS Lambda to kick off the deployment process. Of course, we hope that at some point AWS CodeDeploy will handle multiple application sources natively, but for now this solution provides a good workaround.

Want to find out more? Visit Rackspace to learn about all of the things our AWS certified engineers are doing.

Jim Rosser is a DevOps Engineer at Rackspace. He specializes in architecting and building custom automation and systems on AWS so customers can test and ship software more quickly and efficiently.
