I'm trying to send data from my IoT Hub input to my SQL database output, but the tables stay empty?

Snack Overflow 0 Reputation points
2024-03-20T14:33:58.6733333+00:00

Hello everyone,

I'm currently working on a university project, connecting my Raspberry Pi Pico with Azure.

The Pico is sending temperature data, which is received at the IoT Hub input of my Stream Analytics job. But when I try to send the data to my database, the job starts, yet nothing is written and the tables stay empty. On the simulation tab, I get the warning: Output 'DatabasePicoData' is not partitioned. Set MaxWriterCount to 0.

Does anyone have an idea what I could do? I already checked the test results against the SQL table schema, and all columns match perfectly.

First I tried to separate the incoming data into three different tables. When that didn't work, I tried to funnel all the data into one table for later separation, but that didn't work either.

Thanks in advance

Azure IoT Hub
Azure Stream Analytics

3 answers

  1. Sedat SALMAN 13,160 Reputation points
    2024-03-20T14:48:14.84+00:00

    Please start by checking that the data types and structure of the incoming data from your IoT Hub align with the schema of your SQL database table columns. Then use the "Test" functionality in the Stream Analytics job output configuration: send a sample of your input data and verify whether it successfully writes to your SQL database. If your Azure Stream Analytics output to the SQL database is not partitioned, MaxWriterCount should usually be set to 1 (though 0, as the warning suggests, might be worth an experiment). Having multiple writers to a non-partitioned output can lead to conflicts.
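
    For example, explicit casts in the Stream Analytics query can rule out silent type-conversion failures on the way to SQL. This is only a sketch — the column names and the input alias [IoTHubInput] are assumptions; adapt them to your own schema (the output name DatabasePicoData is taken from your warning message):

    SELECT
        CAST(deviceId AS nvarchar(max)) AS deviceId,
        CAST(temperature AS float) AS temperature,
        EventEnqueuedUtcTime AS eventTime
    INTO
        [DatabasePicoData]
    FROM
        [IoTHubInput]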

    1 person found this answer helpful.

  2. Sander van de Velde 28,386 Reputation points MVP
    2024-03-20T15:53:31.9666667+00:00

    Hello @Snack Overflow,

    welcome to this moderated Azure community forum.

    Do you see the messages arrive in Azure Stream Analytics?

    Can you output the IoT messages to an alternative output like an Event Hub or a storage account, just to test the flow?

    Have you considered using another database, or are you bound to SQL Server? I ask because IoT devices generate time-related facts and observations. For those, a time-series database is much more useful once the number of rows increases: querying billions of rows in Azure Data Explorer takes just a few seconds, not minutes or more.

    Check out this blog post about the free version of Azure Data Explorer.


  3. LeelaRajeshSayana-MSFT 13,456 Reputation points
    2024-03-21T00:50:17.69+00:00

    Hi @Snack Overflow, thank you for posting the question here. I created a test scenario of uploading the data to a SQL table and could see the data flow from the Stream Analytics job to the SQL DB table, even though I see the same warning in the job simulation as you do. Please see the below image for reference (screenshot omitted).

    My Stream Analytics job consists of a simple query that pulls all the data and pushes it into a single DB table. Below is my query:

    SELECT
        *
    INTO
        [iotdb]
    FROM
        [RajIoTHub]
    

    As @Sedat SALMAN pointed out, you should perform a test on the data and create a table based on the schema returned. I performed the Test Query and created a table with the same column names and data types. For the string data type, I used text in the DB table.

    Below is the data I received from the IoT Hub input in Test results (screenshot omitted).

    Here is the corresponding table I created in the SQL DB (screenshot omitted).
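
    As a sketch, a destination table for typical Pico telemetry might look like the following. The business columns (deviceId, temperature) are assumptions — match yours exactly to what the Test results return; EventProcessedUtcTime and EventEnqueuedUtcTime are the Stream Analytics system columns that appear when you SELECT *:

    CREATE TABLE dbo.PicoData (
        deviceId nvarchar(max),
        temperature float,
        EventProcessedUtcTime datetime2,
        EventEnqueuedUtcTime datetime2
    );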

    In addition to the above checks, please also review whether access from Azure services is enabled on the Azure SQL server. You can review the Networking -> Public access section of the SQL server for this setting (screenshot omitted).
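
    If that setting is off, the same allowance can also be made with the special 0.0.0.0 firewall rule. As a sketch, run the following against the server's master database (AllowAllWindowsAzureIps is the conventional rule name the portal uses for "Allow Azure services"):

    EXECUTE sp_set_firewall_rule
        @name = N'AllowAllWindowsAzureIps',
        @start_ip_address = '0.0.0.0',
        @end_ip_address = '0.0.0.0';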

    If you still face any issues with writing data to the SQL DB table, enable diagnostic logs on the Stream Analytics job and the database. For additional investigation, inspect the generated logs by running the query below, which captures all output data errors that occurred while writing the results of the query to the outputs in your job. Here is the query for your reference.

    AzureDiagnostics
    | where ResourceProvider == "MICROSOFT.STREAMANALYTICS"
        and parse_json(properties_s).DataErrorType in (
            "OutputDataConversionError.RequiredColumnMissing",
            "OutputDataConversionError.ColumnNameInvalid",
            "OutputDataConversionError.TypeConversionError",
            "OutputDataConversionError.RecordExceededSizeLimit",
            "OutputDataConversionError.DuplicateKey")
    | project TimeGenerated, Resource, Region_s, OperationName, properties_s, Level, _ResourceId
    

    Hope this helps! Please let us know if you need any additional assistance in the comments below. We would be happy to assist you further.


    If the response helped, please click Accept Answer and Yes for the answer provided. Doing so helps other community members with a similar issue identify the solution. I highly appreciate your contribution to the community.
