Stream into Event Hubs for Kafka Ecosystem

Note

This sample is available on GitHub

This quickstart shows how to stream into Kafka-enabled Event Hubs without changing your protocol clients or running your own clusters. You learn how your existing Kafka producers and consumers can talk to Kafka-enabled Event Hubs with just a configuration change in your applications. Azure Event Hubs for Kafka Ecosystem supports Apache Kafka version 1.0.

Prerequisites

To complete this quickstart, make sure you have:

  * A Kafka-enabled Event Hubs namespace, along with its fully qualified domain name and connection string (used in the producer and consumer configuration below).
  * The Java Development Kit (JDK) and Apache Maven, to build and run the sample producer and consumer.
  * Git, to clone the sample repository.

Send and receive messages with Kafka in Event Hubs

  1. Clone the Azure Event Hubs repository.

  2. Navigate to azure-event-hubs/samples/kafka/quickstart/producer.

  3. Update the configuration details for the producer in src/main/resources/producer.config as shown here.

    bootstrap.servers={YOUR.EVENTHUBS.FQDN}:9093
    security.protocol=SASL_SSL
    sasl.mechanism=PLAIN
    sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="{YOUR.EVENTHUBS.CONNECTION.STRING}";
    
  4. Run the producer code and stream into Kafka-enabled Event Hubs (a minimal sketch of the producer logic follows these steps).

    mvn clean package
    mvn exec:java -Dexec.mainClass="TestProducer"                                    
    
  5. Navigate to azure-event-hubs/samples/kafka/quickstart/consumer.

  6. Update the configuration details for the consumer in src/main/resources/consumer.config as shown here.

    bootstrap.servers={YOUR.EVENTHUBS.FQDN}:9093
    security.protocol=SASL_SSL
    sasl.mechanism=PLAIN
    sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="{YOUR.EVENTHUBS.CONNECTION.STRING}";
    
  7. Run the consumer code and process events from Kafka-enabled Event Hubs using your Kafka clients (a matching consumer sketch also follows these steps).

    mvn clean package
    mvn exec:java -Dexec.mainClass="TestConsumer"                                    
    

If your Event Hubs Kafka cluster has events queued from the producer, you should now start receiving them from the consumer.
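The following is a minimal sketch of what a producer such as the sample's TestProducer might look like; it is illustrative rather than the sample's exact code. It assumes the producer.config file from step 3 is in the path shown, a recent kafka-clients dependency is on the classpath, and a topic (event hub) named "test" exists in your namespace.

    import java.io.FileReader;
    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class TestProducer {
        public static void main(String[] args) throws Exception {
            // Load the Event Hubs endpoint and SASL settings shown in step 3
            Properties props = new Properties();
            try (FileReader reader = new FileReader("src/main/resources/producer.config")) {
                props.load(reader);
            }
            props.put("key.serializer", "org.apache.kafka.common.serialization.LongSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            String topic = "test";  // illustrative; use an event hub (topic) in your namespace
            try (KafkaProducer<Long, String> producer = new KafkaProducer<>(props)) {
                for (int i = 0; i < 10; i++) {
                    // Each record is written to the Kafka-enabled event hub named by 'topic'
                    producer.send(new ProducerRecord<>(topic, System.currentTimeMillis(), "Test Data " + i));
                }
                producer.flush();  // make sure all records are sent before exiting
            }
        }
    }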

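A corresponding consumer sketch, again illustrative rather than the sample's exact code: it assumes the consumer.config from step 6, the same "test" topic, and a kafka-clients dependency of version 2.0 or later (older 1.0 clients pass a long millisecond timeout to poll instead of a Duration).

    import java.io.FileReader;
    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class TestConsumer {
        public static void main(String[] args) throws Exception {
            // Load the Event Hubs endpoint and SASL settings shown in step 6
            Properties props = new Properties();
            try (FileReader reader = new FileReader("src/main/resources/consumer.config")) {
                props.load(reader);
            }
            props.put("group.id", "$Default");  // consumer group; $Default exists on every event hub
            props.put("key.deserializer", "org.apache.kafka.common.serialization.LongDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

            String topic = "test";  // illustrative; must match the topic the producer wrote to
            try (KafkaConsumer<Long, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList(topic));
                while (true) {
                    // Poll the event hub and print whatever the producer sent
                    ConsumerRecords<Long, String> records = consumer.poll(Duration.ofMillis(1000));
                    for (ConsumerRecord<Long, String> record : records) {
                        System.out.printf("offset = %d, value = %s%n", record.offset(), record.value());
                    }
                }
            }
        }
    }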
Next steps