Thanks for your reply.
Our idea is to move the Log Analytics (telemetry) data to Azure Table Storage for long-term storage and to build a dashboard on top of the Table Storage data.
We used a diagnostic setting to export the data from Log Analytics in Application Insights to Azure Blob Storage containers.
From the blob container, we are trying to stream the data into Azure Table Storage.
The blob file path pattern looks like this:
Filename/y=2021/m=03/d=01/h=23/m=00/file.blob
So a new blob file is created for every hour of the day.
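To make the hourly layout concrete, here is a small sketch (the `blob_prefix` helper and the `Filename` root are my own illustration, not part of the export pipeline) that builds the prefix a consumer would list blobs under for a given hour:

```python
from datetime import datetime

def blob_prefix(root: str, ts: datetime) -> str:
    """Build the hourly blob path prefix in the diagnostic-export layout,
    e.g. Filename/y=2021/m=03/d=01/h=23/m=00/ (minute is fixed at 00,
    since one folder is produced per hour)."""
    return (f"{root}/y={ts.year:04d}/m={ts.month:02d}/"
            f"d={ts.day:02d}/h={ts.hour:02d}/m=00/")

print(blob_prefix("Filename", datetime(2021, 3, 1, 23)))
# → Filename/y=2021/m=03/d=01/h=23/m=00/
```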
Please suggest how streaming data from these blob files can be handled.
If I have to wait for each blob file to be finalized, the data will be delayed by one hour.
Also, how do I define the Stream Analytics job input so it accepts the date and time dynamically in this format? The only accepted formats seem to be {yyyy/mm/dd} or {hh/mm}, but my format is y=yyyy/m=mm/d=dd...
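To illustrate the mismatch (the token names below are how I understand the built-in placeholders; the exact supported formats may differ):

```
# What the Stream Analytics blob input path pattern expects:
logs/{date}/{time}/            # date format e.g. yyyy/MM/dd, time format e.g. HH

# What the diagnostic-setting export actually writes:
Filename/y=yyyy/m=MM/d=dd/h=HH/m=mm/
```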