Abhi's Experiments

It's a place to write about my professional experience and experiments.

AWS CodeBuild Custom Notification Messages

Recently we got a chance to implement AWS CodePipeline for a project. The pipeline has four stages: CodeCommit, CodeBuild, CodeDeploy, and a Test stage using Jenkins. There was a requirement to send notifications for each state change in these stages, so we created an SNS topic and added subscriptions to it, allowing the AWS services to publish notifications to that topic.

Enabling notifications in CodeCommit and CodeDeploy is pretty straightforward. In CodeCommit you can create a trigger with the event "Push to existing branch" and set our SNS topic as the target. Similarly, in CodeDeploy we can create a trigger from the deployment group with the events (DeploymentStart, DeploymentSuccess, DeploymentFailure, DeploymentStop, DeploymentReady, DeploymentRollback) and our SNS topic as the target.
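For reference, the CodeCommit side of this can also be scripted. Here is a minimal Boto3 sketch; the repository name and topic ARN are placeholders, and the "updateReference" event is the API-level equivalent of "Push to existing branch":

import boto3

codecommit = boto3.client("codecommit")

# Hypothetical names; replace with your repository and SNS topic ARN.
codecommit.put_repository_triggers(
    repositoryName="my-repo",
    triggers=[
        {
            "name": "notify-on-push",
            "destinationArn": "arn:aws:sns:us-east-1:123456789012:my-topic",
            "branches": [],                 # an empty list means all branches
            "events": ["updateReference"],  # fires on pushes to existing branches
        }
    ],
)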

But the problem with CodeBuild is that there is no built-in trigger for CodeBuild state changes, so we have to use CloudWatch Events to trigger the notifications. In CloudWatch we created a rule with an event pattern and the SNS topic as the target. On the rule creation page we have to select "Event Pattern" and "Build event pattern to match events by service", then choose CodeBuild as the "Service Name" and "All Events" as the "Event Type". This generates an event pattern covering the CodeBuild build statuses (IN_PROGRESS, SUCCEEDED, FAILED, STOPPED).
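The same rule can be created programmatically. A minimal Boto3 sketch (the rule name and topic ARN are assumptions) might look like this:

import json
import boto3

events = boto3.client("events")

# Match every CodeBuild build state change (IN_PROGRESS, SUCCEEDED, FAILED, STOPPED).
events.put_rule(
    Name="codebuild-state-change",
    EventPattern=json.dumps({
        "source": ["aws.codebuild"],
        "detail-type": ["CodeBuild Build State Change"],
    }),
    State="ENABLED",
)

# Send matched events to the SNS topic (hypothetical ARN).
events.put_targets(
    Rule="codebuild-state-change",
    Targets=[{"Id": "sns-topic", "Arn": "arn:aws:sns:us-east-1:123456789012:my-topic"}],
)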

But the CloudWatch event notifications are sent as emails with a static subject ("AWS Notification Message"), which isn't helpful for the users who subscribed to the notifications: they have to open each email and read the body to understand the state of the CodeBuild run (success or failure, which build, etc.). So we came up with a solution to customize the notification from CloudWatch.

While creating the rule in CloudWatch, instead of setting the SNS topic as the target, we set an AWS Lambda function as the target. The Lambda function receives the entire event as JSON, which we can use to prepare the subject and body of the notification message, and we used the AWS SDK for Python (Boto3) to publish the message to the SNS topic. A minimal sketch of such a Lambda function is below; the environment variable name and the message format are assumptions, not the exact function we used:
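import json
import os
import boto3

# Hypothetical: the target SNS topic ARN is supplied via an environment variable.
TOPIC_ARN = os.environ["SNS_TOPIC_ARN"]

sns = boto3.client("sns")

def lambda_handler(event, context):
    # CloudWatch delivers the CodeBuild state change in the event's "detail" field.
    detail = event.get("detail", {})
    project = detail.get("project-name", "unknown-project")
    status = detail.get("build-status", "UNKNOWN")

    # Put the useful information in the subject so it is visible without opening
    # the email; keep the full event in the body for reference.
    subject = "CodeBuild {}: {}".format(project, status)
    body = json.dumps(event, indent=2)

    sns.publish(TopicArn=TOPIC_ARN, Subject=subject, Message=body)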


Apache External Authentication

I have created a web application that acts as a middleware between a mobile app and a web service provider. Some requests should be handled by the middleware itself, and others should be forwarded to the web service provider.

We can add an Apache proxy to forward the requests to the web service provider, but each request should be validated before it is forwarded. If the validation happens inside the application, we can't use the Apache proxy, so the validation has to be implemented within Apache itself.

There is a custom Apache module for this kind of external authentication called mod-auth-external, which lets you specify the path of an external authentication program. Whenever Apache gets a request it executes that program, and based on its result (exit status 0 for success, non-zero for failure) Apache either continues with the request or returns 401 Unauthorized.
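As an illustration, a minimal external authenticator in Python might look like the sketch below. It assumes the module's default "pipe" method, where Apache writes the username and password on two stdin lines and treats exit status 0 as authenticated; the credential check itself is a placeholder:

#!/usr/bin/env python3
import sys

def main():
    # With the "pipe" method, Apache sends the username and the password
    # on two separate lines of standard input.
    username = sys.stdin.readline().strip()
    password = sys.stdin.readline().strip()

    # Placeholder check; a real validator would verify the credentials
    # against the middleware, a database, or an upstream service.
    if username == "demo" and password == "secret":
        sys.exit(0)  # success: Apache continues with the request
    sys.exit(1)      # failure: Apache returns 401 Unauthorized

if __name__ == "__main__":
    main()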


Distributed Job Processing with Celery

Recently I got a task to create a job processing application. The application fetches data from an API, parses it, and creates jobs from the data.

Each job records a livestream using ffmpeg. Multiple jobs will be running simultaneously, so the idea is to build a distributed job processing application.

We selected Celery, an asynchronous distributed task queue written in Python, and used RabbitMQ as the message broker.

We designed the infrastructure with the application running on multiple instances. One instance acts as the Celery master, where we execute the script to fetch data from the API, parse it, and add jobs to the queue. The other instances run Celery workers pointing to the same RabbitMQ server as the message broker:

from celery import Celery

# Both the broker and the result backend point at the shared RabbitMQ server.
app = Celery('tasks',
             backend='amqp://guest@192.168.1.3//',
             broker='amqp://guest@192.168.1.3//')
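To make the flow concrete, here is a hedged sketch of what such a recording task could look like; the task name, its arguments, and the ffmpeg options are assumptions for illustration:

import subprocess
from celery import Celery

app = Celery('tasks',
             backend='amqp://guest@192.168.1.3//',
             broker='amqp://guest@192.168.1.3//')

@app.task
def record_stream(stream_url, output_path, duration):
    # Record `duration` seconds of the livestream without re-encoding.
    subprocess.run(
        ['ffmpeg', '-y', '-i', stream_url, '-t', str(duration),
         '-c', 'copy', output_path],
        check=True,
    )
    return output_path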

We are using a cron job on the Celery master to execute the script periodically. The Celery master and the Celery workers point to the same RabbitMQ server, so whenever a new job is added to the queue it is picked up by one of the workers.
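The master-side script that cron invokes could be as simple as the following sketch; the API endpoint and the job fields are hypothetical:

import requests
from tasks import record_stream  # the task module sketched above

def enqueue_jobs():
    # Hypothetical API endpoint returning a list of jobs to record.
    jobs = requests.get('http://api.example.com/jobs').json()
    for job in jobs:
        # .delay() places the task on the RabbitMQ queue for a worker to pick up.
        record_stream.delay(job['stream_url'], job['output_path'], job['duration'])

if __name__ == '__main__':
    enqueue_jobs()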

We are also using Flower, a Celery monitoring tool, on the Celery master to check the status of the jobs and for other monitoring.
