Stream into Event Hubs for the Kafka Ecosystem

Note

This sample is available on GitHub

This quickstart shows how to stream into Kafka-enabled Event Hubs without changing your protocol clients or running your own clusters. You learn how to point your existing Kafka producers and consumers at Kafka-enabled Event Hubs with nothing more than a configuration change in your applications. The Azure Event Hubs for Kafka ecosystem supports Apache Kafka version 1.0 and later.

Prerequisites

To complete this quickstart, make sure you have the following prerequisites:

  - An Azure subscription and a Kafka-enabled Event Hubs namespace (you need its fully qualified domain name and connection string)
  - Java Development Kit (JDK) 1.7 or later
  - Apache Maven
  - Git

Send and receive messages with Kafka in Event Hubs

  1. Clone the Azure Event Hubs repository.

  2. Navigate to azure-event-hubs/samples/kafka/quickstart/producer.

  3. Update the configuration details for the producer in src/main/resources/producer.config as follows:

    bootstrap.servers={YOUR.EVENTHUBS.FQDN}:9093
    security.protocol=SASL_SSL
    sasl.mechanism=PLAIN
    sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="{YOUR.EVENTHUBS.CONNECTION.STRING}";
    
  4. Run the producer code and stream into Kafka-enabled Event Hubs:

    mvn clean package
    mvn exec:java -Dexec.mainClass="TestProducer"                                    
    
  5. Navigate to azure-event-hubs/samples/kafka/quickstart/consumer.

  6. Update the configuration details for the consumer in src/main/resources/consumer.config as follows:

    bootstrap.servers={YOUR.EVENTHUBS.FQDN}:9093
    security.protocol=SASL_SSL
    sasl.mechanism=PLAIN
    sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="{YOUR.EVENTHUBS.CONNECTION.STRING}";
    
  7. Run the consumer code and process events from Kafka-enabled Event Hubs using your Kafka clients:

    mvn clean package
    mvn exec:java -Dexec.mainClass="TestConsumer"                                    
    

If your Kafka-enabled event hub has events (for example, those sent by the producer above), the consumer now begins receiving them.
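The configuration change in steps 3 and 6 is the only thing that distinguishes this setup from a regular Kafka client. As a minimal sketch using only the JDK, the same properties could also be built in code; `KafkaConfigSketch`, `eventHubsKafkaProps`, and the example values below are hypothetical, not part of the sample:

```java
import java.util.Properties;

public class KafkaConfigSketch {

    // Builds the same properties shown in producer.config / consumer.config.
    // namespaceFqdn and connectionString are placeholders for your own
    // Event Hubs namespace values.
    static Properties eventHubsKafkaProps(String namespaceFqdn, String connectionString) {
        Properties props = new Properties();
        // Kafka-enabled Event Hubs listens on port 9093.
        props.put("bootstrap.servers", namespaceFqdn + ":9093");
        // TLS plus SASL PLAIN, with the literal username "$ConnectionString"
        // and the namespace connection string as the password.
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"$ConnectionString\" "
                + "password=\"" + connectionString + "\";");
        return props;
    }

    public static void main(String[] args) {
        Properties p = eventHubsKafkaProps(
                "mynamespace.servicebus.windows.net",
                "Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...");
        p.forEach((k, v) -> System.out.println(k + "=" + v));
    }
}
```

Passing these properties (plus the usual serializer or deserializer settings) to a standard KafkaProducer or KafkaConsumer is all that is needed to target a Kafka-enabled event hub instead of a self-hosted Kafka cluster.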

Next steps