Enrich data and ingest to event hub using the Stream Analytics no code editor
This article describes how you can use the no code editor to easily create a Stream Analytics job. The job continuously reads data from your Event Hubs instance, enriches the incoming data with SQL reference data, and then continuously writes the results to an event hub.
Prerequisites
- Your Azure Event Hubs and SQL reference data resources must be publicly accessible and can't be behind a firewall or secured in an Azure virtual network.
- The data in your Event Hubs must be serialized in either JSON, CSV, or Avro format.
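To illustrate the serialization requirement, the following sketch shows an event payload being serialized to JSON, one of the three supported formats, before it's sent to Event Hubs. The event and its field names are hypothetical, for illustration only:

```python
import json

# Hypothetical telemetry event; the field names are illustrative only.
event = {"deviceId": "sensor-01", "temperature": 22.5, "timestamp": "2023-01-01T00:00:00Z"}

# Event Hubs delivers opaque byte payloads; serializing to JSON (one of the
# formats the no code editor can deserialize) keeps the fields discoverable
# so they show up in the editor's field list and data preview.
body = json.dumps(event).encode("utf-8")
print(body.decode("utf-8"))
```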
Develop a Stream Analytics job to enrich event hub data
In the Azure portal, locate and select the Azure Event Hubs instance.
Select Features > Process Data and then select Start on the Enrich data and ingest to Event Hub card.
Enter a name for the Stream Analytics job, then select Create.
Specify the Serialization type of your data in the Event Hubs window and the Authentication method that the job will use to connect to the Event Hubs. Then select Connect.
When the connection is established successfully and you have data streams flowing into your Event Hubs instance, you'll immediately see two things:
- Fields that are present in the input data. You can choose Add field or select the three dot symbol next to a field to remove, rename, or change its type.
- A live sample of incoming data in the Data preview table under the diagram view. It automatically refreshes periodically. You can select Pause streaming preview to see a static view of the sample input data.
Select the Reference SQL input tile to connect to the reference SQL database.
Select the Join tile. In the right configuration panel, choose a field from each input to join the incoming data from the two inputs.
Select the Manage tile. In the Manage fields configuration panel, choose the fields you want to output to the event hub. If you want to add all the fields, select Add all fields.
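Together, the Join and Manage steps above amount to a lookup enrichment followed by a field projection. A minimal Python sketch of the equivalent logic, assuming a hypothetical `deviceId` join key and illustrative reference rows:

```python
# Hypothetical SQL reference data, keyed on deviceId (names are illustrative).
reference = {
    "sensor-01": {"building": "B1", "owner": "facilities"},
    "sensor-02": {"building": "B2", "owner": "operations"},
}

def enrich(event, output_fields):
    """Join an incoming event with reference data on deviceId,
    then keep only the fields selected for output."""
    ref = reference.get(event["deviceId"])
    if ref is None:
        return None  # no matching reference row: an inner join drops the event
    merged = {**event, **ref}
    return {k: merged[k] for k in output_fields if k in merged}

row = enrich({"deviceId": "sensor-01", "temperature": 22.5},
             ["deviceId", "temperature", "building"])
print(row)  # {'deviceId': 'sensor-01', 'temperature': 22.5, 'building': 'B1'}
```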
Select the Event Hub tile. In the Event Hub configuration panel, fill in the required parameters and connect, similar to the input event hub configuration.
Optionally, select Get static preview/Refresh static preview to see the data preview that will be ingested into the event hub.
To start the job, specify:
- The number of Streaming Units (SUs) the job runs with. SUs represent the amount of compute and memory allocated to the job. We recommend that you start with three and then adjust as needed.
- Output data error handling: specifies the behavior you want when a job's output to your destination fails due to data errors. By default, your job retries until the write operation succeeds. You can also choose to drop such output events.
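The two error-handling choices above can be sketched as follows. This is not the job's actual implementation; `write`, the exception type, and the event values are hypothetical stand-ins for the job's output path:

```python
def handle_output(event, write, policy="retry", max_attempts=3):
    """Illustrative sketch of the two output-error policies:
    'retry' keeps attempting the write; 'drop' discards the event on failure."""
    attempts = 0
    while True:
        attempts += 1
        try:
            write(event)
            return "written"
        except ValueError:  # stand-in for a data conversion error
            if policy == "drop":
                return "dropped"
            if attempts >= max_attempts:  # bounded here for the sketch;
                return "gave-up"          # the job itself retries until success

calls = []
def flaky_write(event):
    calls.append(event)
    if len(calls) < 2:
        raise ValueError("bad data")

print(handle_output({"id": 1}, flaky_write))  # first attempt fails, retry succeeds: 'written'
```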
After you select Start, the job starts running within two minutes, and the metrics open in the tab section below the diagram view.
You can also see the job under the Process Data section on the Stream Analytics jobs tab. Select Open metrics to monitor it or stop and restart it, as needed.
Next steps
Learn more about Azure Stream Analytics and how to monitor the job you've created.