What is the best solution for a live dashboard using Power BI?
For an IoT solution where a device sends large volumes of sensor data from the 10-12 sensors it has, what is the best way to build: a live dashboard showing the last sensor values and device state, and a live dashboard showing 1 hour of sensor values …
Stream Analytics job doesn't load properly
Hello, I am facing a problem with Stream Analytics. I set up a job without any issue, but after starting it, I can't do anything with the service: I can't start, stop, delete, move, or even refresh the job on the overview page. I don't see any…
How can I force cast JSON-number values to float?
Exactly what the image shows. I have numbers that don't always have decimal parts, and I need them to be float. As the image shows, this doesn't work today. What am I doing wrong?
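In the Stream Analytics query language, the usual fix is an explicit CAST to float in the query. The same coercion can be reproduced locally with Python's json module, which is only an illustration of the behavior asked about, not an ASA feature; the sensor payload below is hypothetical.

```python
import json

# Hypothetical sensor payload: some numbers arrive without a decimal part.
payload = '{"temperature": 21, "humidity": 48.5, "pressure": 1013}'

# parse_int=float coerces every JSON integer to a Python float,
# so downstream consumers always see one numeric type.
reading = json.loads(payload, parse_int=float)

print(reading)
print(all(isinstance(v, float) for v in reading.values()))  # → True
```

The `parse_int` hook is called on every integer literal during parsing, so integers and decimals end up as the same type without per-field casting.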
Low-level mistake - naming mismatch found in Stream Analytics
I have been using Stream Analytics. I have iothub-pj001, iothub-pj005, and iothub-pj009 in the IoT Hub, and these are set as inputs in Stream Analytics. I found that the option for iothub-pj001 was missing last night when I was trying to add a new…
Not able to use ML model to post and retrieve data
Hi, I am using an ML library model that produces the output below on the test page. False Alarm Prediction using Two Class Decision Forest [Predictive Exp.]' test returned ["8/10/2023 6:15:00 AM","User…
Set up an edge hub input for Azure Stream Analytics using the Azure CLI or Python SDK
I am trying to create an Azure Stream Analytics job using the Azure CLI or Python SDK with both the input and output as edge hub. I tried using None and GatewayMessageBus as the input type, but neither worked. It gives an error message: (BadRequest)…
Watermark keeps increasing and events are not processed in Stream Analytics
I have IoT package data coming in from Event Hub to Stream Analytics and then output to a Delta Lake. Before switching to Delta Lake, I had the output as CSV and it worked fine: no watermark delays, and resource utilization was stable and low. After switching…
I can't use the Azure Stream Analytics no-code editor
Hi community, it seems that I can't use the no-code editor in my Azure Stream Analytics resource. I think I meet all the prerequisites for using the no-code editor, but I am not sure whether the inputs and outputs are something I can set, as once I set those this…
Azure Stream Analytics at-least-once guarantee is not 100% accurate
I have an Azure Stream Analytics (ASA) job that processes data from Event Hubs as input. This job is for the finance business, so it relies heavily on ASA's at-least-once event delivery guarantee. But recently we found out that some data…
Processing nested JSON data in Stream Analytics for real-time visualization
Hello, I'm trying to get my sensor data into Power BI via Stream Analytics so I can visualize it in real time. I'm getting nested JSON messages sent to the IoT Hub. These messages are from two sensors and can be recognized by the nAdr variable…
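A common pre-processing step for this kind of question is flattening each nested message into per-sensor rows before it reaches Power BI. The message shape below is hypothetical, invented to show the idea; only the nAdr field name comes from the question.

```python
import json

# Hypothetical nested gateway message: two sensors identified by nAdr.
raw = json.dumps({
    "gateway": "gw-01",
    "sensors": [
        {"nAdr": 1, "data": {"temp": 21.5, "hum": 40}},
        {"nAdr": 2, "data": {"temp": 19.0, "hum": 55}},
    ],
})

def flatten(message: str) -> list[dict]:
    """Turn one nested gateway message into flat per-sensor rows."""
    doc = json.loads(message)
    rows = []
    for sensor in doc["sensors"]:
        row = {"gateway": doc["gateway"], "nAdr": sensor["nAdr"]}
        row.update(sensor["data"])  # hoist nested readings to top level
        rows.append(row)
    return rows

for row in flatten(raw):
    print(row)
```

Each flat row carries the sensor identifier alongside its readings, which is the shape dashboard tools generally expect.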
Send an alert email upon a new event in the Azure Auth Methods | Registration and Reset log
Hi folks, I'd like to send an email to my admin team when a user in my org adds a new authentication method. These events appear in the Azure portal in: Azure Active Directory > Security > Authentication Methods > Registration & reset…
Additional device states (say, error states): which is the right place, device twin or telemetry?
Hello Team, we are setting up an IoT solution for our new device. The device is complex, so there will be various states and error states. I am trying to figure out the right place to update these states: device twin or telemetry? As of…
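A rule of thumb often applied to this question: slowly-changing, last-known state belongs in the device twin's reported properties (queryable from the cloud at any time), while high-frequency readings go out as telemetry. The sketch below only illustrates the two payload shapes as plain dictionaries; the property names are hypothetical and no Azure SDK is involved.

```python
import json

# Twin reported-properties patch: low-frequency, last-known device state.
twin_patch = {
    "state": "error",
    "errorCode": 17,  # hypothetical error code
    "lastStateChange": "2024-01-01T00:00:00Z",
}

# Telemetry message: high-frequency, time-series readings that flow
# through IoT Hub routing and Stream Analytics.
telemetry = {
    "timestamp": "2024-01-01T00:00:05Z",
    "temperature": 21.7,
    "state": "error",  # duplicated so stream queries can filter on it
}

print(json.dumps(twin_patch))
print(json.dumps(telemetry))
```

Duplicating the coarse state in telemetry, as sketched here, lets streaming queries react to state changes without a twin lookup.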
Can we debug the "InputDeserializerError.InvalidData" input JSON?
Hi Team, we already have a document for the error "InputDeserializerError.InvalidData": https://learn.microsoft.com/en-us/azure/stream-analytics/data-errors#inputdeserializererrorinvaliddata We also have a Q&A document for the…
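When hunting down InputDeserializerError.InvalidData, one practical approach is to replay the captured raw payloads through a strict JSON parser locally and report exactly where parsing fails. The sketch below is a generic diagnostic, assuming you can export the offending messages to text; it is not an ASA feature, and the payloads are hypothetical.

```python
import json

# Hypothetical batch of captured event payloads, one JSON document each.
payloads = [
    '{"deviceId": "d1", "temp": 21.5}',
    '{"deviceId": "d2", "temp": }',  # malformed: missing value
    '{"deviceId": "d3", "temp": 19.0}',
]

def find_bad_payloads(messages):
    """Return (index, error message) for every payload json can't parse."""
    bad = []
    for i, msg in enumerate(messages):
        try:
            json.loads(msg)
        except json.JSONDecodeError as exc:
            bad.append((i, str(exc)))
    return bad

for idx, err in find_bad_payloads(payloads):
    print(f"payload {idx}: {err}")
```

The decoder error includes the line and column of the failure, which usually pinpoints the field the deserializer choked on.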
Azure Event Hubs Header X-Frame-Options
Hello, I have the following problem in this architecture: Integration: I have connected the output of a Stream Analytics job to an Event Hub; this Event Hub is inside a virtual network and is used to send data to an Azure Function that then takes the data and sends…
If we schedule Event Hubs Capture with both Parquet and Delta in the Standard tier, will we be charged extra for more than one file format?
I am using an Event Hub in the Standard tier with 2 TUs and capture enabled. In addition to capturing in Parquet, I use a Stream Analytics job to store the same data in Delta, JSON, and CSV as well. The Standard tier is billed at $73/TU/month. If I use…
Do we incur extra charges for using Event Hubs Capture to stream data into Cosmos DB?
Do we incur extra charges for using Event Hubs Capture to stream data into Cosmos DB? I am using a Standard tier Azure Event Hub with 2 Throughput Units in the US East region. I am also capturing data in Parquet, JSON, CSV, and Delta Lake format in ADLS Gen2. In…
Does capturing events in Parquet using a Stream Analytics job cost extra on top of the standard $73/TU/month in the Event Hub Standard tier?
Hello people, I am working with Azure Event Hubs to pull a data stream and capture it in Parquet format, which is the most suitable format for my solution. I am using a Standard tier Event Hubs namespace with 2 TUs and capture enabled on the Event Hub. As per the…
Azure Stream Analytics reference data input suddenly getting truncated to first 50 records after upgrade to V2 plan
An existing Stream Analytics job, which has not been changed for a couple of years and has been working happily, supports a few hundred IoT devices "in the wild". As part of the job, it relies on several (4x) reference data inputs. All inputs are…
Struggling with Event Hub Errors: Unable to Trigger Functions After Changing Partition Count
I have built a system where I send messages from devices to IoT Hub, pass them through Stream Analytics, and trigger Functions from Event Hub. I conducted a series of tests at this point and confirmed that Functions were triggered successfully. Later, I…
Azure Event Hub receiving JSON with an extraneous outer array bracket into Stream Analytics
As you can see below, my events are being received with an outer array bracket; however, they only ever contain 1 actual record (this is desired). I understand that this is probably how the source data comes in, and if it were received as an actual JSON…
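Outside of ASA, the unwrap this question implies can be sketched as: if each event is a one-element JSON array, take the single object out before further processing. (Within the ASA query itself, treating the input as an array or using GetArrayElement is the usual route.) The payload below is hypothetical.

```python
import json

# Hypothetical event: a single record wrapped in an outer array bracket.
raw_event = '[{"deviceId": "d42", "temp": 20.1}]'

def unwrap(event: str) -> dict:
    """Strip the outer one-element array and return the record itself."""
    parsed = json.loads(event)
    if isinstance(parsed, list):
        if len(parsed) != 1:
            raise ValueError(f"expected exactly 1 record, got {len(parsed)}")
        return parsed[0]
    return parsed  # already a bare object

record = unwrap(raw_event)
print(record["deviceId"])  # → d42
```

Raising on multi-element arrays, rather than silently taking the first record, makes the "only ever 1 record" assumption explicit and fails loudly if the source ever violates it.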