How can I use Stream Analytics to do a simple export of data from blob storage to table storage at regular intervals?

Jemimah Priyadarshini 1 Reputation point
2021-05-02T06:19:11.99+00:00

Hi,
I am creating a Stream Analytics job in Azure that exports data and saves it into Azure Table Storage. The export works when the Stream Analytics job starts, but no further export happens while the job keeps running, because the query is executed only once. As a result, the table storage does not hold the updated data from the blob. Can you please help me export the data into table storage at regular intervals, without any aggregation?
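The query itself is a plain pass-through with no aggregation, roughly like the sketch below (the input and output alias names here are placeholders for whatever is configured in the job):

```sql
-- Pass-through query: every event read from the blob input is written
-- unchanged to the table storage output; no windowing or aggregation.
-- [blobinput] and [tableoutput] are placeholder alias names.
SELECT
    *
INTO
    [tableoutput]
FROM
    [blobinput]
```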
Thanks,
Jemimah

Azure Table Storage: An Azure service that stores structured NoSQL data in the cloud.
Azure Stream Analytics: An Azure real-time analytics service designed for mission-critical workloads.

2 answers

  1. MartinJaffer-MSFT 26,011 Reputation points
    2021-05-03T21:13:12.063+00:00

    Hello @Jemimah Priyadarshini and welcome to Microsoft Q&A.

    If I understand correctly, you are using Stream Analytics to transfer data from blob to table storage. The data is being moved correctly. The problem is that new data is not detected, and hence not moved.

    Are you appending data to the same existing blob the job has already read? That use case is not supported by Stream Analytics. A different architecture or a different service should be used. I would need to know more before I could recommend an alternative.

    Stream Analytics does not support adding content to an existing blob file. Stream Analytics will view each file only once, and any changes that occur in the file after the job has read the data are not processed. Best practice is to upload all the data for a blob file at once and then add additional newer events to a different, new blob file.
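    To illustrate, with hypothetical container and file names, the unsupported and supported layouts look like this:

    ```
    Not supported: new events appended to a blob the job has already read
        logs/current.json    (batch 1 written, then batch 2 appended later)

    Supported: each new batch of events written to its own new blob
        logs/2021/05/03/21/batch-001.json
        logs/2021/05/03/22/batch-002.json
    ```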

    If the blob is being written while the job is still running, that is a different issue.


  2. Jemimah Priyadarshini 1 Reputation point
    2021-05-05T08:19:05.973+00:00

    Hi @MartinJaffer-MSFT

    Thanks for your reply.
    Our idea is to move the Log Analytics (telemetry) data to Azure Table Storage for long-term storage and to build a dashboard on top of the table storage data.
    We used a diagnostic setting to move the data from Log Analytics in Application Insights to Azure blob storage containers.
    From the blob container we are trying to stream the data into Azure Table Storage.
    The blob file pattern is shown in the example below:

    Filename/y=2021/m=03/d=01/h=23/m=00/file.blob

    So blob files are created for every hour of the day.
    Please suggest how streaming data can be handled with these blob files.
    If I have to wait for a blob file to be finished, the data will be delayed by one hour.
    Also, how do I define the job to accept the date and time dynamically in this format? The only formats the stream analytics job accepts are {yyyy/mm/dd} or {hh/mm}, but my format is y=yyyy/m=mm/d=dd...
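    For reference, the blob input of the job is defined roughly like the sketch below (the storage account and container names are placeholders). As far as I can see, the pathPattern only accepts the {date} and {time} tokens, and {date} expands to a contiguous segment such as 2021/03/01, so literals like y= and m= cannot be placed between the date parts:

    ```json
    {
      "name": "blobinput",
      "properties": {
        "type": "Stream",
        "datasource": {
          "type": "Microsoft.Storage/Blob",
          "properties": {
            "storageAccounts": [ { "accountName": "mystorageaccount" } ],
            "container": "insights-logs",
            "pathPattern": "{date}/{time}",
            "dateFormat": "yyyy/MM/dd",
            "timeFormat": "HH"
          }
        },
        "serialization": {
          "type": "Json",
          "properties": { "encoding": "UTF8" }
        }
      }
    }
    ```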