Stream into Event Hubs for Apache Kafka

This quickstart shows how to stream into Kafka-enabled Event Hubs without changing your protocol clients or running your own clusters. You learn how to point your existing Kafka producers and consumers at Kafka-enabled Event Hubs with only a configuration change in your applications. Azure Event Hubs supports Apache Kafka version 1.0.

Note

This sample is available on GitHub.

Prerequisites

To complete this quickstart, make sure you have the following prerequisites:

  * An Azure subscription
  * A Java Development Kit (JDK)
  * Apache Maven, to build and run the samples
  * Git, to clone the sample repository

Create a Kafka-enabled Event Hubs namespace

  1. Sign in to the [Azure portal][Azure portal], and click Create a resource at the top left of the screen.

  2. Search for Event Hubs and select the options shown here:

    Search for Event Hubs in the portal

  3. Provide a unique name and enable Kafka on the namespace. Click Create.

    Create a namespace

  4. Once the namespace is created, on the Settings tab, click Shared access policies to get the connection string.

    Click Shared access policies

  5. You can use the default RootManageSharedAccessKey policy or add a new one. Click the policy name and copy the connection string.

    Select a policy

  6. Add this connection string to your Kafka application configuration.

You can now stream events from your applications that use the Kafka protocol into Event Hubs.
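The Kafka `bootstrap.servers` value used in the next section is derived from the namespace in the connection string you copied: the same host, on port 9093. As a minimal sketch (the `BootstrapServer` class and method name are illustrative, not part of the sample repository), the mapping can be expressed as:

```java
// Illustrative helper: derive the Kafka bootstrap server from an Event Hubs
// connection string of the form
//   Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...
// Kafka clients connect to the same host on port 9093.
class BootstrapServer {
    static String fromConnectionString(String connectionString) {
        for (String part : connectionString.split(";")) {
            if (part.startsWith("Endpoint=sb://")) {
                // Strip the sb:// scheme and any trailing slash, then append the Kafka port.
                String host = part.substring("Endpoint=sb://".length()).replaceAll("/$", "");
                return host + ":9093";
            }
        }
        throw new IllegalArgumentException("No Endpoint found in connection string");
    }
}
```

For example, a connection string for the namespace `mynamespace` yields the bootstrap server `mynamespace.servicebus.windows.net:9093`.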

Send and receive messages with Kafka in Event Hubs

  1. Clone the Azure Event Hubs repository.

  2. Navigate to azure-event-hubs/samples/kafka/quickstart/producer.

  3. Update the configuration details for the producer in src/main/resources/producer.config as follows:

    bootstrap.servers={YOUR.EVENTHUBS.FQDN}:9093
    security.protocol=SASL_SSL
    sasl.mechanism=PLAIN
    sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="{YOUR.EVENTHUBS.CONNECTION.STRING}";
    
  4. Run the producer code and stream into Kafka-enabled Event Hubs:

    mvn clean package
    mvn exec:java -Dexec.mainClass="TestProducer"                                    
    
  5. Navigate to azure-event-hubs/samples/kafka/quickstart/consumer.

  6. Update the configuration details for the consumer in src/main/resources/consumer.config as follows:

    bootstrap.servers={YOUR.EVENTHUBS.FQDN}:9093
    security.protocol=SASL_SSL
    sasl.mechanism=PLAIN
    sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="{YOUR.EVENTHUBS.CONNECTION.STRING}";
    
  7. Run the consumer code and consume from Kafka-enabled Event Hubs using your Kafka clients:

    mvn clean package
    mvn exec:java -Dexec.mainClass="TestConsumer"                                    
    
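The sample producer and consumer read their settings from the `.config` files above with `java.util.Properties` before constructing the Kafka client. A common failure mode is leaving a `{YOUR...}` placeholder unfilled, so a quick sanity check like the following can catch that early (a sketch; the `ConfigCheck` class is illustrative and not part of the sample repository):

```java
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.Properties;

// Illustrative helper: load a Kafka client config and verify that the four
// settings from the producer.config/consumer.config steps are present and
// that no {YOUR...} placeholder was left unfilled.
class ConfigCheck {
    static final String[] REQUIRED = {
        "bootstrap.servers", "security.protocol", "sasl.mechanism", "sasl.jaas.config"
    };

    static Properties loadAndValidate(String source) {
        Properties props = new Properties();
        try {
            props.load(new StringReader(source));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        for (String key : REQUIRED) {
            String value = props.getProperty(key);
            if (value == null || value.contains("{YOUR")) {
                throw new IllegalStateException("Missing or placeholder value for: " + key);
            }
        }
        return props;
    }
}
```

In the samples themselves, the loaded `Properties` object is what gets passed to the `KafkaProducer` or `KafkaConsumer` constructor.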

If your Kafka-enabled event hub has incoming events, the consumer should now begin receiving them.

Next steps

In this article, you learned how to stream into Kafka-enabled Event Hubs without changing your protocol clients or running your own clusters. To learn more, continue with the following tutorial: