Within a month to a fully automated infrastructure on the AWS cloud


Creating your own Kafka service despite a mountain of challenges

As one of LINKIT’s more experienced cloud professionals, Anthony Potappel is not easily surprised by a challenge. Yet a recent assignment tested him in new ways: an insurer hired him for less than a month to set up a custom Kafka service for a pilot application on the AWS Cloud. For the client, it was a welcome first step towards its long-term goal of running the majority of its infrastructure on the AWS Cloud.

What is Kafka?

Kafka, officially Apache Kafka, is an open-source streaming platform originally developed at LinkedIn. Today it is widely used as a kind of intermediary for processing real-time data feeds: producers publish data to feeds (topics), and systems or applications that subscribe to those feeds receive the data in real time. Examples include matching passengers and vehicles at Uber, or real-time analysis of infrastructure for maintenance at Shell.
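
As a quick illustration of that publish/subscribe model, here is a minimal sketch in Python using the confluent-kafka client. The broker address, topic name and consumer group are placeholders, not details from the project.

```python
from confluent_kafka import Producer, Consumer

BROKER = "broker.example.com:9092"  # placeholder broker address

# A producer publishes events to a topic (a "data feed").
producer = Producer({"bootstrap.servers": BROKER})
producer.produce("ride-requests", key="passenger-42", value='{"lat": 52.1, "lon": 4.3}')
producer.flush()  # block until the message is delivered

# A consumer subscribes to the same topic and receives events in real time.
consumer = Consumer({
    "bootstrap.servers": BROKER,
    "group.id": "vehicle-matcher",    # placeholder consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["ride-requests"])

msg = consumer.poll(timeout=5.0)
if msg is not None and msg.error() is None:
    print(msg.key(), msg.value())
consumer.close()
```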

Linking an application with a SaaS service

The main problem Anthony ran into was connecting the new application to a third-party SaaS service. The integration required a public endpoint on the Kafka cluster, and that is exactly what Amazon Managed Streaming for Apache Kafka (MSK) does not offer: its brokers are only reachable from within a VPC network, or over a VPN connection into it. Unfortunately, neither option was supported by the SaaS party.
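
To illustrate the limitation: even when you ask MSK for its connection string, the broker hostnames it returns resolve to private IP addresses inside the cluster’s VPC. A small sketch with boto3, using a made-up cluster ARN:

```python
import boto3

kafka = boto3.client("kafka")  # the boto3 client for Amazon MSK

# Hypothetical cluster ARN; replace with a real one.
resp = kafka.get_bootstrap_brokers(
    ClusterArn="arn:aws:kafka:eu-west-1:123456789012:cluster/pilot/abc-123"
)

# These hostnames resolve to private IPs, so they are only reachable
# from inside the VPC (or over a VPN/peering connection into it).
print(resp["BootstrapBrokerStringTls"])
```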

Cook in the AWS kitchen

To Anthony’s advantage, AWS does offer a very extensive kit of building blocks and automation tools to assemble such a service yourself, based on your own recipe. And that is what Anthony got to work with!

Lots of components

Developing your own Kafka service is not an everyday task, however. A huge number of components have to be wired together: think of the Elastic Container Service (ECS) with Auto Scaling, the SSM Parameter Store, Route 53, PrivateCA and Lambda functions. The resulting environment is then accessed via a VPN from a second AWS account, which adds a firewall and Intrusion Detection and Prevention Systems. Only after all of this is in place can the endpoint be exposed publicly. A long list of opaque terms, and everything has to be exactly right.

Advantages and disadvantages of utilizing your own service

Assembling all these components into one service yourself brought both pros and cons. ECS, for example, offers the advantage that standard Confluent images can be used for Kafka. However, these images still had to be extended with an extra security patch, so that (automatically generated) passwords can be loaded from the SSM Parameter Store. Port mapping was used to transfer the data safely from one AWS account to the other, a feature that is not yet part of AWS’s standard MSK service, so it had to be set up manually.
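
The Parameter Store side of that patch comes down to only a few lines. A minimal sketch of how a broker’s startup code might fetch such a generated password with boto3; the parameter name is a hypothetical example:

```python
import boto3

ssm = boto3.client("ssm")

# Hypothetical parameter name; the value is stored encrypted (SecureString)
# and decrypted by SSM on retrieval.
param = ssm.get_parameter(
    Name="/kafka/pilot/broker-password",
    WithDecryption=True,
)
broker_password = param["Parameter"]["Value"]
```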

PrivateCA

What made this project special for Anthony was the PrivateCA component. PrivateCA is used to control the issuance of (SSL) certificates and keys, so that Kafka nodes can communicate with each other securely and only the application has access to the cluster.
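
Issuing a node certificate from PrivateCA goes through the ACM PCA API. A minimal sketch with boto3, assuming an existing CA and a CSR generated on the Kafka node; the CA ARN and file name are placeholders:

```python
import boto3

pca = boto3.client("acm-pca")
CA_ARN = "arn:aws:acm-pca:eu-west-1:123456789012:certificate-authority/example"  # placeholder

with open("kafka-node.csr", "rb") as f:  # CSR generated on the Kafka node
    csr = f.read()

# Ask the private CA to sign the CSR.
issued = pca.issue_certificate(
    CertificateAuthorityArn=CA_ARN,
    Csr=csr,
    SigningAlgorithm="SHA256WITHRSA",
    Validity={"Value": 365, "Type": "DAYS"},
)

# Wait until the certificate is issued, then fetch it.
pca.get_waiter("certificate_issued").wait(
    CertificateAuthorityArn=CA_ARN,
    CertificateArn=issued["CertificateArn"],
)
cert = pca.get_certificate(
    CertificateAuthorityArn=CA_ARN,
    CertificateArn=issued["CertificateArn"],
)
print(cert["Certificate"])  # PEM, ready for the node's keystore
```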

Setting this up came with its fair share of struggle. PrivateCA is one of the few AWS services that cannot be fully provisioned via CloudFormation. CloudFormation matters here because a single recipe lets you roll out to the four environments of the DTAP pipeline (development, test, acceptance, production), which saves a lot of time and effort.

Python Lambda

To fix this, Anthony wrote Python Lambdas that call the appropriate APIs via the AWS SDK, generate a private key and upload it to AWS. As a result, the requested environment is now 100 percent automated via CloudFormation, in line with the AWS infrastructure-as-code standard!
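
The glue here is the CloudFormation custom-resource pattern: the template invokes a Lambda function, which performs the API calls CloudFormation cannot do itself and then reports back. A condensed sketch of what such a handler might look like, not Anthony’s actual code: the cfnresponse module is the one AWS provides for functions defined inline in a template, the cryptography package would have to come from a Lambda layer, and the parameter name is a placeholder.

```python
import boto3
import cfnresponse  # provided by AWS for Lambda code defined inline in a template
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

ssm = boto3.client("ssm")
PARAM_NAME = "/kafka/pilot/node-private-key"  # placeholder


def handler(event, context):
    try:
        if event["RequestType"] == "Create":
            # Generate a private key and upload it to AWS, encrypted at rest.
            key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
            pem = key.private_bytes(
                encoding=serialization.Encoding.PEM,
                format=serialization.PrivateFormat.TraditionalOpenSSL,
                encryption_algorithm=serialization.NoEncryption(),
            )
            ssm.put_parameter(
                Name=PARAM_NAME,
                Value=pem.decode(),
                Type="SecureString",
                Overwrite=True,
            )
        elif event["RequestType"] == "Delete":
            ssm.delete_parameter(Name=PARAM_NAME)
        # Tell CloudFormation the custom resource succeeded.
        cfnresponse.send(event, context, cfnresponse.SUCCESS, {}, PARAM_NAME)
    except Exception:
        cfnresponse.send(event, context, cfnresponse.FAILED, {}, PARAM_NAME)
```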

Quite an achievement!

Despite the many challenges, Anthony managed to deliver the required configuration code and accompanying documentation in less than a month. An achievement he is rightly proud of. The environment is now being tested, and Anthony occasionally steps in for extra support. After this phase, he expects the service to run independently, without any intervention for issues.

Completion

Because Anthony worked together with the client’s team and documented the application well, the client now has the opportunity to continue the work on its own.

This project aligns perfectly with the LINKIT way of working: sharing knowledge while working together with the customer in a transparent and pragmatic way, towards a business-driven result!

Do you also face an AWS challenge? Could you use some extra advice? Anthony would be happy to discuss his AWS experience.