Feeding IoT Device Telemetry Data to Kafka-based Applications

With the newly released support for the Apache Kafka protocol in Event Hubs, Azure IoT Hub customers can now easily feed their IoT device telemetry data into Kafka-based applications for further downstream processing or analysis. This gives customers with existing Kafka-based applications the flexibility to adopt Azure IoT Hub faster, without rewriting any part of their applications upfront. Customers can start using IoT Hub's native support for messaging, device management, and configuration management right away, and defer migrating their telemetry processing applications to use Event Hubs natively until a later time.

Applicable Customer Scenarios

The ability to feed IoT device telemetry data into Kafka-based processing applications is valuable in several scenarios:

  • A primary scenario involves a prospective Azure IoT Hub customer with legacy data processing applications that are already written to interface with Kafka clusters. With the support for Kafka in Event Hubs, the customer can defer the need to make upfront changes to such applications as part of onboarding to Azure IoT Hub. The new feature enables a faster IoT Hub adoption cycle for the customer with lower upfront development costs.
  • A secondary scenario involves a customer who would like to keep an existing Kafka-based telemetry processing application as is, perhaps to use it for processing telemetry data emitted as Kafka streams by other disparate sources (e.g., non-Azure-IoT-Hub-managed devices). Similarly, the new feature benefits the customer by reducing churn in existing Kafka-based applications.
  • A final scenario involves a customer who is simply evaluating Azure IoT Hub's capabilities for future adoption, and wants to make only the minimal set of changes needed to validate IoT Hub's capabilities, without significantly modifying any peripheral data processing systems.


Using Kafka-based Applications with IoT Hub Telemetry

As shown in the diagram below, IoT Hub acts as a conduit for the device telemetry data. This data is persisted in Event Hubs, and ultimately consumed by downstream applications. To enable your Kafka-based application to retrieve this data, you need to add a Kafka-enabled Event Hub as an endpoint/route to IoT Hub, and configure your Kafka-based application with the connection string of your Event Hub where the data is stored. These steps are described in more detail below.

  1. Follow this guide to create a new Event Hubs namespace and event hub. Ensure the "Enable Kafka" option is selected during the namespace creation process.
  2. You now need to add your Event Hub as a custom endpoint in your IoT Hub, using either the Azure portal or the CLI. To use the portal, your IoT Hub and Event Hub must be in the same region and under the same subscription. Go to your IoT Hub dashboard page in the portal, open the "Endpoints" tab, and click "Add". Enter a name for your endpoint, and select "Event Hub" under "Endpoint Type". Under "Event Hub Namespace" select the namespace created in Step 1, and then select the Event Hub name. Click "OK" to create the endpoint. Alternatively, if your Event Hub and IoT Hub are in different regions or under different subscriptions, you can use the IoT Hub CLI to add your Event Hub as a custom endpoint. In that case, install the Azure CLI with the IoT Hub extension and run the following command (substitute your own information, and ensure that your Event Hubs connection string specifies the Event Hub name as EntityPath).
    az iot hub update -g [YOUR_IOT_HUB_RESOURCE_GROUP] -n [YOUR_IOT_HUB_NAME] --add properties.routing.endpoints.eventHubs connectionString='Endpoint=sb://[YOUR_EVENT_HUBS_NAMESPACE_FQDN];SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=[YOUR_EVENT_HUBS_KEY];EntityPath=[YOUR_EVENT_HUB_NAME_FROM_STEP_1]' name=[YOUR_ENDPOINT_NAME] subscriptionId=[YOUR_IOT_HUB_SUBSCRIPTION_ID] resourceGroup=[YOUR_IOT_HUB_RESOURCE_GROUP]
  3. Next, you will add a route so that IoT Hub persists the telemetry data emitted by devices to your Kafka-enabled Event Hub endpoint (from Step 2). To do so, go to your IoT Hub dashboard in the portal, and open the "Routes" tab. Click "Add", and enter a name for your route. Under "Data source" select "Device Messages", and then select the endpoint you created in Step 2. Enter `true` in the "Query string" box so that all messages from your devices match this route. Finally, click "Save" to save your new route. Alternatively, you can use the IoT Hub CLI as follows to add a new route (substitute [YOUR_ENDPOINT_NAME] with the endpoint name you used in Step 2).
    az iot hub update -g [YOUR_IOT_HUB_RESOURCE_GROUP] -n [YOUR_IOT_HUB_NAME] --add properties.routing.routes "{'condition':'true', 'endpointNames':['[YOUR_ENDPOINT_NAME]'], 'isEnabled':True, 'name':'[YOUR_ROUTE_NAME]', 'source':'DeviceMessages'}"
  4. The next step is to enable consumption of device telemetry data in your Kafka-based application by updating its connection string to that of the Event Hub you created previously and added to IoT Hub. You can use the QuickStart Kafka consumer code available here. Note that you will need to install a number of prerequisites, as outlined here. Assuming that you cloned the [Java QuickStart code](https://github.com/Azure/azure-event-hubs), follow the steps below to configure, compile, and run the code:
    • Update bootstrap.servers=[YOUR_EVENT_HUBS_NAMESPACE_FQDN] in azure-event-hubs/samples/kafka/quickstart/consumer/src/main/resources/consumer.config with your Event Hubs namespace FQDN (the FQDN is normally in this format: [YOUR_EVENT_HUBS_NAMESPACE].servicebus.windows.net:9093; Kafka traffic uses port 9093).
    • Update sasl.jaas.config and set password="[YOUR_EVENT_HUBS_CONNECTION_STRING]" to your Kafka-enabled Event Hub's connection string from step 1.
    • Update the TOPIC constant defined in azure-event-hubs/samples/kafka/quickstart/src/main/java/com/example/app/TestConsumer.java and set it to your hub's name in Event Hubs (a hub's name in Event Hubs is a counterpart to Kafka topics).
    • On a terminal, compile the code using mvn clean package.
    • On a terminal, run the consumer code using mvn exec:java -Dexec.mainClass="TestConsumer". The consumer will periodically poll your Event Hubs for events and print them out on the console.
  5. Finally, you can use any of IoT Hub's sample code in Java, Node.js, Python, or .NET to send device-to-cloud telemetry messages to your IoT Hub. These messages will be routed to your Event Hub endpoint and can be consumed by your Kafka consumer. The flow of events from an IoT device into the Kafka-based application is depicted in the figure below.
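For reference, after the edits in Step 4 your consumer.config should look roughly like the sketch below. The namespace and key values shown are illustrative placeholders; the security.protocol and sasl.mechanism settings are the standard values required for Kafka clients connecting to Event Hubs.

```properties
# Kafka endpoint of your Event Hubs namespace (Kafka traffic uses port 9093)
bootstrap.servers=[YOUR_EVENT_HUBS_NAMESPACE].servicebus.windows.net:9093

# Event Hubs requires TLS plus SASL PLAIN authentication
security.protocol=SASL_SSL
sasl.mechanism=PLAIN

# The username is the literal string "$ConnectionString"; the password is the
# full Event Hubs connection string (including EntityPath) from Step 1
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
    username="$ConnectionString" \
    password="Endpoint=sb://[YOUR_EVENT_HUBS_NAMESPACE].servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=[YOUR_EVENT_HUBS_KEY];EntityPath=[YOUR_EVENT_HUB_NAME]";
```

Passing the entire connection string as the SASL password lets unmodified Kafka clients authenticate to Event Hubs without any code changes.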

Azure IoT Hub device telemetry feed into Kafka-based applications

