question

JemimahPriyadarshini-8153 asked MartinJaffer-MSFT commented

How to use Stream Analytics to simply export data from blob storage to table storage at regular intervals?

Hi,
I am creating a Stream Analytics job in Azure where data is exported from blob storage and saved into Azure table storage. The export works when the Stream Analytics job starts, but it does not happen again while the job is still running, since the query is executed only once. As a result, the table storage does not hold the updated data from the blob. Can you please help me export the data into table storage at regular intervals, without any aggregation?
Thanks,
Jemimah

azure-stream-analytics, azure-table-storage


MartinJaffer-MSFT answered MartinJaffer-MSFT edited

Hello @JemimahPriyadarshini-8153 and welcome to Microsoft Q&A.

If I understand correctly, you are using Stream Analytics to transfer data from blob to table storage. The data is being moved correctly. The problem is that new data is not detected, and hence not moved.

Are you appending data to the same existing blob the job just read? That use-case is not supported by Stream Analytics. A different architecture or a different service should be used. I would need to know more before I could recommend an alternative.

Stream Analytics does not support adding content to an existing blob file. Stream Analytics will view each file only once, and any changes that occur in the file after the job has read the data are not processed. Best practice is to upload all the data for a blob file at once and then add additional newer events to a different, new blob file.
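
If the process writing the blobs is under your control, a rough sketch of that pattern with the Python storage SDK (azure-storage-blob) could look like the following. The connection string, container name, and event format are placeholders for illustration, not details from your setup:

    # Sketch: write each batch of events to a new, uniquely named blob instead of
    # appending to an existing one, so Stream Analytics picks it up as a fresh file.
    import json
    from datetime import datetime, timezone

    from azure.storage.blob import BlobServiceClient

    CONNECTION_STRING = "<storage-account-connection-string>"  # placeholder
    CONTAINER_NAME = "telemetry"                               # placeholder


    def upload_batch(events):
        """Upload one batch of events as a brand-new blob named by timestamp."""
        service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
        container = service.get_container_client(CONTAINER_NAME)

        # A new blob name per batch, e.g. events/2021-03-01T23-00-00Z.json
        stamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H-%M-%SZ")
        blob_name = f"events/{stamp}.json"

        # Line-separated JSON is a format a Stream Analytics blob input can read.
        payload = "\n".join(json.dumps(e) for e in events)
        container.upload_blob(name=blob_name, data=payload, overwrite=False)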

If the blob is being written while the job is still running, that is a different issue.



JemimahPriyadarshini-8153 answered MartinJaffer-MSFT commented

Hi @MartinJaffer-MSFT

Thanks for your reply.
Our idea is to move the Log Analytics (telemetry) data to Azure table storage for long-term storage and to build a dashboard on the table storage data.
We used a diagnostic setting to move data from Log Analytics in Application Insights to Azure blob storage containers.
From the blob container, we are trying to stream the data into Azure table storage.
The blob file pattern is shown in the example below:

Filename/y=2021/m=03/d=01/h=23/m=00/file.blob

So blob files are created for every hour of the day.
Please suggest how the streaming of data in these blob files can be handled.
If I have to wait for a blob file to be finished, the data will be delayed by one hour.
Also, how can the Stream Analytics job be defined to accept the date and time dynamically in this format? The only formats accepted are {yyyy/mm/dd} or {hh/mm}, but my format is y=yyyy/m=mm/d=dd...



Hello @JemimahPriyadarshini-8153, apologies for the delayed response.
I do not think Stream Analytics can work in this case. I suggest you try Azure Data Factory instead.

The reason I suggest Azure Data Factory over Azure Stream Analytics is that the Data Factory event/storage trigger can fire on blob updates, while Stream Analytics only triggers on blob creation. Data Factory also has connectors for both table storage and blob storage.

If your files from Log Analytics are in 1-hour chunks, then I am convinced they are written in smaller pieces; a connection is not likely kept open for a whole hour. This means Stream Analytics starts and finishes reading before Log Analytics finishes writing, so it completes before all the data is there.
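
If Data Factory feels like too much for this, another option, purely as a sketch and not something we settled on here, is a small scheduled script (for example an Azure Functions timer or a cron job) that copies each finished hourly blob into table storage using the Python SDKs (azure-storage-blob and azure-data-tables). The connection strings, container and table names, hourly prefix, and the PartitionKey/RowKey choices below are assumptions for illustration only:

    # Sketch: copy the previous (completed) hour's blobs into table storage on a schedule.
    import json
    from datetime import datetime, timedelta, timezone

    from azure.storage.blob import BlobServiceClient
    from azure.data.tables import TableServiceClient

    BLOB_CONNECTION = "<storage-account-connection-string>"   # placeholder
    TABLE_CONNECTION = "<storage-account-connection-string>"  # placeholder
    CONTAINER_NAME = "insights-logs"                          # placeholder
    TABLE_NAME = "Telemetry"                                  # placeholder


    def copy_previous_hour():
        """Copy all blobs written during the previous hour into table storage."""
        hour = datetime.now(timezone.utc) - timedelta(hours=1)
        # Mirrors the y=/m=/d=/h=/m= layout from the example path above;
        # adjust the leading "Filename/" segment to your actual container layout.
        prefix = hour.strftime("Filename/y=%Y/m=%m/d=%d/h=%H/m=00/")

        container = BlobServiceClient.from_connection_string(
            BLOB_CONNECTION).get_container_client(CONTAINER_NAME)
        table = TableServiceClient.from_connection_string(
            TABLE_CONNECTION).create_table_if_not_exists(TABLE_NAME)

        for blob in container.list_blobs(name_starts_with=prefix):
            text = container.download_blob(blob.name).readall().decode("utf-8")
            for i, line in enumerate(l for l in text.splitlines() if l.strip()):
                record = json.loads(line)  # assumes line-separated JSON records
                # Assumed keys: partition by hour, row by blob name + line number.
                # Nested JSON fields would need flattening before insert; omitted here.
                record["PartitionKey"] = hour.strftime("%Y%m%d%H")
                record["RowKey"] = f"{blob.name}-{i}".replace("/", "_")
                table.upsert_entity(record)

Running something like this from an Azure Functions timer trigger would give you the "regular intervals" behavior described in the question, at the cost of maintaining the code yourself.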

