Ingest data from Event Hub into Azure Data Explorer

Azure Data Explorer is a fast and highly scalable data exploration service for log and telemetry data. Azure Data Explorer offers ingestion (data loading) from Event Hubs, a big data streaming platform and event ingestion service. Event Hubs can process millions of events per second in near real-time. In this article, you create an event hub, connect to it from Azure Data Explorer, and see data flow through the system.


Sign in to the Azure portal

Sign in to the Azure portal.

Create an event hub

In this article, you generate sample data and send it to an event hub. The first step is to create an event hub. You do this by using an Azure Resource Manager template in the Azure portal.

  1. To create an event hub, use the following button to start the deployment. Right-click and select Open in new window, so you can follow the rest of the steps in this article.

    Deploy to Azure

    The Deploy to Azure button takes you to the Azure portal to fill out a deployment form.


  2. Select the subscription where you want to create the event hub, and create a resource group named test-hub-rg.

    Create a resource group

  3. Fill out the form with the following information.

    Deployment form

    Use defaults for any settings not listed in the following table.

    | Setting | Suggested value | Field description |
    |---------|-----------------|-------------------|
    | Subscription | Your subscription | Select the Azure subscription that you want to use for your event hub. |
    | Resource group | test-hub-rg | Create a new resource group. |
    | Location | West US | Select West US for this article. For a production system, select the region that best meets your needs. Create the event hub namespace in the same Location as the Kusto cluster for best performance (most important for event hub namespaces with high throughput). |
    | Namespace name | A unique namespace name | Choose a unique name that identifies your namespace, for example mytestnamespace. The domain name is appended to the name you provide. The name can contain only letters, numbers, and hyphens. It must start with a letter, end with a letter or number, and be between 6 and 50 characters long. |
    | Event hub name | test-hub | The event hub sits under the namespace, which provides a unique scoping container. The event hub name must be unique within the namespace. |
    | Consumer group name | test-group | Consumer groups enable multiple consuming applications to each have a separate view of the event stream. |
  4. Select Purchase, which acknowledges that you're creating resources in your subscription.

  5. Select Notifications on the toolbar to monitor the provisioning process. It might take several minutes for the deployment to succeed, but you can move on to the next step now.
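
The namespace naming rules in the table above can be checked locally before you submit the deployment form. Here's a minimal sketch, assuming a regex interpretation of the stated rules:

```python
import re

# Rules from the deployment table (regex interpretation is an assumption):
#  - letters, numbers, and hyphens only
#  - must start with a letter, end with a letter or number
#  - 6 to 50 characters long
NAMESPACE_RE = re.compile(r"^[A-Za-z][A-Za-z0-9-]{4,48}[A-Za-z0-9]$")

def is_valid_namespace_name(name: str) -> bool:
    """Return True if `name` satisfies the namespace naming rules above."""
    return NAMESPACE_RE.fullmatch(name) is not None
```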


Create a target table in Azure Data Explorer

Now you create a table in Azure Data Explorer, to which Event Hubs will send data. You create the table in the cluster and database provisioned in Prerequisites.

  1. In the Azure portal, navigate to your cluster then select Query.

    Query application link

  2. Copy the following command into the window and select Run to create the table (TestTable) which will receive the ingested data.

    .create table TestTable (TimeStamp: datetime, Name: string, Metric: int, Source: string)

    Run create query

  3. Copy the following command into the window and select Run to map the incoming JSON data to the column names and data types of the table (TestTable).

    .create table TestTable ingestion json mapping 'TestMapping' '[{"column":"TimeStamp", "Properties": {"Path": "$.timeStamp"}},{"column":"Name", "Properties": {"Path":"$.name"}} ,{"column":"Metric", "Properties": {"Path":"$.metric"}}, {"column":"Source", "Properties": {"Path":"$.source"}}]'
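
To make the mapping concrete, here is a hypothetical event body showing how TestMapping's JSON paths pull its fields into the TestTable columns (the values are invented for illustration):

```python
import json

# A hypothetical event body. Each JSON path in TestMapping selects one field:
#   $.timeStamp -> TimeStamp (datetime)
#   $.name      -> Name (string)
#   $.metric    -> Metric (int)
#   $.source    -> Source (string)
payload = '{"timeStamp": "2020-01-01T12:00:00Z", "name": "demo", "metric": 42, "source": "EventHubMessage"}'
event = json.loads(payload)
print(event["metric"])
```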

Connect to the event hub

Now you connect to the event hub from Azure Data Explorer. When this connection is in place, data that flows into the event hub streams to the test table you created earlier in this article.

  1. Select Notifications on the toolbar to verify that the event hub deployment was successful.

  2. Under the cluster you created, select Databases then TestDatabase.

    Select test database

  3. Select Data ingestion and Add data connection. Then fill out the form with the following information. Select Create when you are finished.

    Event hub connection

    Data Source:

    | Setting | Suggested value | Field description |
    |---------|-----------------|-------------------|
    | Data connection name | test-hub-connection | The name of the connection you want to create in Azure Data Explorer. |
    | Event hub namespace | A unique namespace name | The name you chose earlier that identifies your namespace. |
    | Event hub | test-hub | The event hub you created. |
    | Consumer group | test-group | The consumer group defined in the event hub you created. |
    | Event system properties | Select relevant properties | The Event Hubs system properties. If there are multiple records per event message, the system properties are added to the first record only. When you add system properties, create or update the table schema and mapping to include the selected properties. |
    | Compression | None | The compression type of the Event Hubs message payload. Supported compression types: None, GZip. |

    Target table:

    There are two options for routing the ingested data: static and dynamic. For this article, you use static routing, where you specify the table name, data format, and mapping. Therefore, leave My data includes routing info unselected.

    | Setting | Suggested value | Field description |
    |---------|-----------------|-------------------|
    | Table | TestTable | The table you created in TestDatabase. |
    | Data format | JSON | Supported formats are Avro, CSV, JSON, MULTILINE JSON, PSV, SOHSV, SCSV, TSV, TSVE, TXT, ORC, and PARQUET. |
    | Column mapping | TestMapping | The mapping you created in TestDatabase, which maps incoming JSON data to the column names and data types of TestTable. Required for JSON and MULTILINE JSON; optional for other formats. |


    • Select My data includes routing info to use dynamic routing, where your data includes the necessary routing information as seen in the sample app comments. If both static and dynamic properties are set, the dynamic properties override the static ones.
    • Only events enqueued after you create the data connection are ingested.
    • You can also set the compression type via dynamic properties as seen in the sample app.
    • The Avro, ORC, and PARQUET formats, as well as event system properties, aren't supported with GZip-compressed payloads.
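
With dynamic routing, the routing information travels as application properties on each event. A rough Python sketch of what those properties might look like; the property names below are assumptions, so verify them against the sample app comments before relying on them:

```python
# Hypothetical application properties attached to each event for dynamic
# routing. The property names are assumptions; check the sample app comments
# for the exact names your cluster expects.
routing_properties = {
    "Table": "TestTable",                        # target table
    "Format": "json",                            # data format
    "IngestionMappingReference": "TestMapping",  # mapping to apply
}

print(routing_properties["Table"])
```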

Event system properties mapping


  • System properties are supported for single-record events.
  • For CSV mapping, properties are added at the beginning of the record. For JSON mapping, properties are added according to the name that appears in the drop-down list.

If you selected Event system properties in the Data Source section, you must include the selected properties in the table schema and mapping.

Table schema example

If your data includes three columns (Timespan, Metric, and Value) and the properties you include are x-opt-enqueued-time and x-opt-offset, create or alter the table schema by using this command:

    .create-merge table TestTable (Timespan: datetime, Metric: string, Value: int, EventHubEnqueuedTime: datetime, EventHubOffset: string)

CSV mapping example

Run the following command to create the CSV mapping, which adds the system properties to the beginning of the record. Note the ordinal values.

    .create table TestTable ingestion csv mapping "CsvMapping1"
    '['
    '   { "column" : "Timespan", "Properties":{"Ordinal":"2"}},'
    '   { "column" : "Metric", "Properties":{"Ordinal":"3"}},'
    '   { "column" : "Value", "Properties":{"Ordinal":"4"}},'
    '   { "column" : "EventHubEnqueuedTime", "Properties":{"Ordinal":"0"}},'
    '   { "column" : "EventHubOffset", "Properties":{"Ordinal":"1"}}'
    ']'
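
As a sanity check, here is what a CSV record laid out for CsvMapping1 looks like, with the two system properties occupying ordinals 0 and 1 and the data columns at 2 through 4 (all values invented):

```python
# A hypothetical CSV record as CsvMapping1 would read it:
#   ordinal 0 -> EventHubEnqueuedTime
#   ordinal 1 -> EventHubOffset
#   ordinal 2 -> Timespan, 3 -> Metric, 4 -> Value
record = "2020-01-01T11:59:58Z,12345,2020-01-01T12:00:00Z,demo,42"
fields = record.split(",")

enqueued_time, offset = fields[0], fields[1]
timespan, metric, value = fields[2], fields[3], fields[4]
```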

JSON mapping example

Data is added by using the system property names as they appear in the Data connection blade's Event system properties list. Run this command:

    .create table TestTable ingestion json mapping "JsonMapping1"
    '['
    '    { "column" : "Timespan", "Properties":{"Path":"$.timestamp"}},'
    '    { "column" : "Metric", "Properties":{"Path":"$.metric"}},'
    '    { "column" : "Value", "Properties":{"Path":"$.metric_value"}},'
    '    { "column" : "EventHubEnqueuedTime", "Properties":{"Path":"$.x-opt-enqueued-time"}},'
    '    { "column" : "EventHubOffset", "Properties":{"Path":"$.x-opt-offset"}}'
    ']'
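
Note that the system property names keep their hyphens, so in the ingested JSON they are ordinary object keys matched by the $.x-opt-... paths. A small sketch with invented record contents:

```python
import json

# A hypothetical ingested record. Hyphenated system property names such as
# x-opt-enqueued-time are plain JSON keys, not attribute names.
record = json.loads(
    '{"timestamp": "2020-01-01T12:00:00Z", "metric": "demo", "metric_value": 42,'
    ' "x-opt-enqueued-time": "2020-01-01T11:59:58Z", "x-opt-offset": "12345"}'
)
enqueued = record["x-opt-enqueued-time"]  # -> EventHubEnqueuedTime column
offset = record["x-opt-offset"]           # -> EventHubOffset column
```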

Copy the connection string

When you run the sample app listed in Prerequisites, you need the connection string for the event hub namespace.

  1. Under the event hub namespace you created, select Shared access policies, then RootManageSharedAccessKey.

    Shared access policies

  2. Copy Connection string - primary key. You paste it in the next section.

    Connection string

Generate sample data

Use the sample app you downloaded to generate data.

  1. Open the sample app solution in Visual Studio.

  2. In the program.cs file, update the connectionString constant to the connection string you copied from the event hub namespace.

    const string eventHubName = "test-hub";
    // Copy the connection string ("Connection string-primary key") from your Event Hub namespace.
    const string connectionString = @"<YourConnectionString>";
  3. Build and run the app. The app sends messages to the event hub, and it prints out status every ten seconds.

  4. After the app has sent a few messages, move on to the next step: reviewing the flow of data into your event hub and test table.
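
Under the hood, the sample app serializes JSON records shaped like the ones TestMapping expects. A rough Python sketch of the payload generation (the field names come from the mapping created earlier; the values and the helper are invented, and the actual app is C#):

```python
import json
import random
from datetime import datetime, timezone

def make_events(count):
    """Build `count` sample telemetry payloads shaped for TestMapping.
    Field names match the mapping; values are made up."""
    return [
        json.dumps({
            "timeStamp": datetime.now(timezone.utc).isoformat(),
            "name": f"device-{i}",
            "metric": random.randint(0, 100),
            "source": "SampleApp",
        })
        for i in range(count)
    ]

events = make_events(10)
print(f"Prepared {len(events)} messages for test-hub")
```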

Review the data flow

With the app generating data, you can now see the flow of that data from the event hub to the table in your cluster.

  1. In the Azure portal, under your event hub, you see the spike in activity while the app is running.

    Event hub graph

  2. To check how many messages have made it to the database so far, run the following query in your test database.

    TestTable
    | count
  3. To see the content of the messages, run the following query:

    TestTable

    The result set should look like the following:

    Message result set


    • Azure Data Explorer has an aggregation (batching) policy for data ingestion, designed to optimize the ingestion process. By default, the policy is configured to 5 minutes or 500 MB of data, so you may experience latency. See batching policy for aggregation options.
    • Event hub ingestion also includes an Event Hubs response time of 10 seconds or 1 MB.
    • To remove this lag, configure your table to support streaming. See streaming policy.

Clean up resources

If you don't plan to use your event hub again, clean up test-hub-rg, to avoid incurring costs.

  1. In the Azure portal, select Resource groups on the far left, and then select the resource group you created.

    If the left menu is collapsed, select the Expand button to expand it.

    Select resource group to delete

  2. Under test-hub-rg, select Delete resource group.

  3. In the new window, type the name of the resource group to delete (test-hub-rg), and then select Delete.

Next steps