Use Azure Stream Analytics tools for Visual Studio

Azure Stream Analytics tools for Visual Studio are now generally available. These tools give Stream Analytics users a richer experience for authoring complex queries, testing them locally, and troubleshooting jobs. You can also export an existing Stream Analytics job into a Visual Studio project.


In this tutorial, you learn how to use Stream Analytics tools for Visual Studio to create, author, locally test, manage, and debug your Stream Analytics jobs. After you create the job, you can set up a continuous integration and deployment process to Azure by using the CI/CD NuGet package. To learn more, see the article on using Stream Analytics Visual Studio tools to set up a CI/CD pipeline.

After completing this tutorial, you will be able to:

  • Familiarize yourself with the Stream Analytics tools for Visual Studio.
  • Configure and deploy a Stream Analytics job.
  • Test your job locally with local sample data.
  • Use job monitoring to troubleshoot issues.
  • Export existing jobs to projects.


You need the following prerequisites to complete this tutorial:

  • Finish the steps up to "Create a Stream Analytics job" in the tutorial Build an IoT solution by using Stream Analytics.
  • Install Visual Studio 2017, Visual Studio 2015, or Visual Studio 2013 Update 4. Enterprise (Ultimate/Premium), Professional, and Community editions are supported. Express edition is not supported.
  • Follow the installation instructions to install Stream Analytics tools for Visual Studio.

Create a Stream Analytics project

In Visual Studio, select File > New Project. In the templates list on the left, select Stream Analytics, and then select Azure Stream Analytics Application. At the bottom of the page, enter the project Name, Location, and Solution name as you do for other projects.

New project creation

The project Toll is generated in Solution Explorer.

Toll project in Solution Explorer

Choose the correct subscription

  1. In Visual Studio, on the View menu, select Server Explorer.

  2. Sign in with your Azure account.

Define input sources

  1. In Solution Explorer, expand the Inputs node and rename Input.json to EntryStream.json. Double-click EntryStream.json.

  2. For Input Alias, enter EntryStream. Note that the input alias is used in the query script.

  3. For Source Type, select Data Stream.

  4. For Source, select Event Hub.

  5. For Service Bus Namespace, select the TollData option in the drop-down list.

  6. For Event Hub name, select entry.

  7. For Event Hub Policy Name, select RootManageSharedAccessKey (the default value).

  8. For Event Serialization Format, select Json, and for Encoding, select UTF8.

    Your settings look like this:

    Input settings

  9. At the bottom of the page, select Save to finish the wizard. Now you can add another input source to create the exit stream. Right-click the Inputs node, and select New Item.

    New Item

  10. In the pop-up window, choose Stream Analytics Input, and change the Name to ExitStream.json. Select Add.

    Add New Item

  11. Double-click ExitStream.json in the project, and follow the same steps as the entry stream to fill in the fields. For Event Hub Name, be sure to enter exit, as shown in the following screenshot:

    ExitStream settings

    Now you have defined two input streams.

    Two input streams

    Next, you add reference data input for the blob file that contains car registration data.

  12. Right-click the Inputs node in the project, and then follow the same process for the stream inputs. For Source Type, select Reference data, and for Input Alias, enter Registration.

    Registration settings

  13. Select the Storage Account option that contains TollData. The container name is TollData, and the Path Pattern is registration.json. This file name is case sensitive and must be lowercase.

  14. Select Save to finish the wizard.

Now all the inputs are defined.

Define output

  1. In Solution Explorer, expand the Outputs node and double-click Output.json.

  2. For Output Alias, enter output. For Sink, select SQL Database.

  3. For the Database name, enter TollDataDB.

  4. For User Name, enter tolladmin. For Password, enter 123toll!. For Table, enter TollDataRefJoin.

  5. Select Save.

    Output settings

Stream Analytics query

This tutorial attempts to answer several business questions that are related to toll data. We constructed queries that can be used in Stream Analytics to provide relevant answers. Before you start your first Stream Analytics job, let's explore a simple scenario and the query syntax.

Introduction to Stream Analytics query language

Let's say that you need to count the number of vehicles that enter a toll booth. Because this stream of events is continuous, you have to define a period of time. Let's modify the question to be "How many vehicles enter a toll booth every three minutes?" This measurement is commonly referred to as the tumbling count.

Let's look at the Stream Analytics query that answers this question:

    SELECT TollId, System.Timestamp AS WindowEnd, COUNT(*) AS Count 
    FROM EntryStream TIMESTAMP BY EntryTime 
    GROUP BY TUMBLINGWINDOW(minute, 3), TollId 

As you can see, Stream Analytics uses a query language that's like SQL. It adds a few extensions to specify time-related aspects of the query.

For more details, read about time management and the windowing constructs used in the query on MSDN.
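Tumbling windows are only one of the windowing constructs. As a hedged sketch (the hopping-window variant is an assumption for illustration, not part of this tutorial's scenario), the same count could be reported every minute over the trailing three minutes by swapping in a hopping window:

```sql
-- Hopping window: a 3-minute window that advances every 1 minute,
-- so each event can fall into multiple overlapping windows.
SELECT TollId, System.Timestamp AS WindowEnd, COUNT(*) AS Count
FROM EntryStream TIMESTAMP BY EntryTime
GROUP BY HOPPINGWINDOW(minute, 3, 1), TollId
```

The third argument is the hop size; a tumbling window is the special case where the hop size equals the window size.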

Now that you have written your first Stream Analytics query, test it by using sample data files located in your TollApp folder in the following path:


This folder contains the following files:

  • Entry.json
  • Exit.json
  • Registration.json
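To orient yourself in the sample data, an entry event in Entry.json has roughly the following shape. The field names here follow the TollApp sample scenario; treat the values as illustrative:

```json
{
  "TollId": 1,
  "EntryTime": "2014-09-10T12:01:00.000Z",
  "LicensePlate": "JNB 7001",
  "State": "NY",
  "Make": "Honda",
  "Model": "CRV",
  "VehicleType": 1,
  "VehicleWeight": 0,
  "Toll": 7,
  "Tag": 123456789
}
```

The `TIMESTAMP BY EntryTime` clause in the query tells Stream Analytics to use this field, rather than the event's arrival time, for windowing.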

Question: Number of vehicles entering a toll booth

In the project, double-click Script.asaql to open the script in the editor. Paste the script in the previous section into the editor. The query editor supports IntelliSense, syntax coloring, and an error marker.

Query editor
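The output table you configured earlier is named TollDataRefJoin, which hints at a later step in the TollApp scenario: joining the stream with the Registration reference data. As a hedged sketch (column names such as `RegistrationId` are assumed from the sample files), such a join could look like:

```sql
-- Join the entry stream with the Registration reference input
-- to look up each vehicle's registration record by license plate.
-- Reference-data joins don't require a DATEDIFF time bound.
SELECT EntryStream.EntryTime, EntryStream.LicensePlate,
       EntryStream.TollId, Registration.RegistrationId
FROM EntryStream TIMESTAMP BY EntryTime
JOIN Registration
ON EntryStream.LicensePlate = Registration.LicensePlate
```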

Test Stream Analytics queries locally

You can first compile the query to check for syntax errors.

  1. To validate this query against sample data, add local sample data by right-clicking the input and selecting Add local input.

    Add local input

  2. In the pop-up window, select the sample data from your local path. Select Save.

    Add local input

    A file named local_EntryStream.json is added automatically to your inputs folder.

    Local input folder file list

  3. Select Run Locally in the query editor. Or you can press F5.

    Run Locally

    You can find the output path in the console output. Press any key to open the result folder.

    Local run

  4. Check the results in the local folder.

    Local folder result

Sample input

You can also sample input data from live input sources to a local file. Right-click the input configuration file, and select Sample Data.

Sample Data

Note that you can sample only event hubs or IoT hubs for now. Other input sources are not supported. In the pop-up dialog box, fill in the local path to save the sample data. Select Sample.

Sample data configuration

You can see the progress in the Output window.

Sample data output

Submit a Stream Analytics query to Azure

  1. In the query editor, select Submit To Azure.

    Submit To Azure

  2. Select Create a New Azure Stream Analytics Job. For Job Name, enter TollApp. Choose the correct Subscription in the drop-down list. Select Submit.

    Submit Job

Start the job

Now your job is created, and the job view opens automatically.

  1. Select the green arrow button to start the job.

    Start job button

  2. Choose the default setting, and select Start.

    Start Job

    You can see that the job status has changed to Running, and input/output events are flowing.

    Job Summary and Metrics

Check results in Visual Studio

  1. Open Visual Studio Server Explorer, and right-click the TollDataRefJoin table.

  2. Select Show Table Data to see the output of your job.

    Show Table Data

View job metrics

Some basic job statistics are shown in Job Metrics.

Job Metrics

List the job in Server Explorer

In Server Explorer, select Stream Analytics Jobs and then select Refresh. Your job appears under Stream Analytics jobs.

Jobs list

Open the job view

Expand your job node, and double-click the Job View node to open a job view.

Job View

Export an existing job to a project

There are two ways you can export an existing job to a project.

  • In Server Explorer, under the Stream Analytics Jobs node, right-click the job node. Select Export to New Stream Analytics Project.

    Export to New Stream Analytics Project

    The generated project appears in Solution Explorer.

    Solution Explorer job

  • In the job view, select Generate Project.

    Generate Project

Known issues and limitations

  • Local testing doesn't work if your query has geo-spatial functions.
  • Editor support isn't available for adding or changing JavaScript UDFs.
  • Local testing doesn't support saving output in JSON format.
  • Power BI output and Azure Data Lake Store output aren't supported.

Next steps