Question

DivakarKrishnan-8551 asked · PRADEEPCHEEKATLA-MSFT commented

Architecture for Log data analysis

Dear Team,

We have a requirement to ingest network user traffic log data and build a Power BI report on top of it.

Since it is log data, we don't want to store it in our Azure SQL Data Warehouse; we would like an alternative, cost-effective Azure architecture to ingest and store this data.

Log Data Sample Below:
logdata-sample.png (36.5 KiB)

Source for this file:
A network application from a vendor generates this file hourly and transfers it to our SFTP server, so we need to process 24 files per day, each containing a large volume of log data.

Please advise which architecture and which Azure services best fit this scenario; I would prefer a live-streaming approach to ingest the data. Kindly share your suggestions.
Thanks,
Divakar


Tags: azure-data-factory, azure-data-explorer

Hello @DivakarKrishnan-8551,

Following up to see if the suggestions below were helpful. If you have any further queries, do let us know.

  • Please don't forget to click Accept Answer or upvote whenever the information provided helps you.
NandanHegde-7720 answered · DivakarKrishnan-8551 commented

Hey,
Assuming you already have Azure SQL DW available, you can use ADF to copy the files from SFTP to Azure Blob Storage, then define an external table to read the data from the files rather than explicitly loading it into Synapse. Power BI can then query the external table.

Alternatively, you can use ADF to land the files in Blob Storage and have Power BI access the blob files directly.
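Since the sample image isn't readable here, the actual log schema is unknown. As a minimal sketch (file content and column names are hypothetical), this is the kind of per-file parsing that either Power BI or a pre-processing step would do once ADF has landed the hourly files in Blob Storage:

```python
import csv
import io

# Hypothetical hourly log content -- the real schema comes from the vendor file.
hourly_log = """timestamp,user,bytes_sent
2022-05-01T00:00:01,alice,512
2022-05-01T00:00:07,bob,2048
"""

def parse_hourly_log(text):
    """Parse one hourly log file into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

rows = parse_hourly_log(hourly_log)
total_bytes = sum(int(r["bytes_sent"]) for r in rows)
print(len(rows), total_bytes)  # 2 2560
```

With the external-table approach this parsing stays in Synapse (the file format is declared once in the external table definition), so nothing like this runs client-side.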


Hi @NandanHegde-7720 ,


When accessing blob files directly via Power BI, will there be any performance impact in Power BI?

24 files will be generated per day, and I expect each file to have roughly 30,000 to 50,000 records, so up to about 1.2 million records per day.

I am just curious about Power BI performance capabilities with this approach. Please suggest.

Thanks,
Divakar
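For reference, the volume figures above work out as follows (a quick back-of-envelope sketch using the numbers from this thread):

```python
# Upper-bound volume estimate from the figures in the question.
files_per_day = 24
max_records_per_file = 50_000

daily_records = files_per_day * max_records_per_file
monthly_records = daily_records * 30

print(daily_records, monthly_records)  # 1200000 36000000
```

So the worst case is about 1.2M rows/day and roughly 36M rows/month, which is the scale any candidate store needs to handle.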

NandanHegde-7720 replied to DivakarKrishnan-8551:

Hey, accessing files from blob at the current data volume shouldn't cause any performance issues.

DavidBroggy-5270 answered · 2 votes

Hi @DivakarKrishnan-8551,
Plan B would be to use Azure Data Factory with Log Analytics.
1.2M records per day should be trivial for a Log Analytics workspace.
I work with billions of records per day using Sentinel, which depends on Log Analytics for everything.

If this helps please accept my solution and upvote.
Or just have a nice day.
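If you go the Log Analytics route, note that pushing custom log data into a workspace goes through an ingestion API rather than ADF alone. As one hedged sketch: the legacy HTTP Data Collector API (since superseded by the Logs Ingestion API) authorizes each POST with an HMAC-SHA256 signature over a fixed string; the workspace ID and key below are placeholders:

```python
import base64
import hashlib
import hmac

def build_signature(workspace_id, shared_key_b64, date_rfc1123, content_length):
    """Build the Authorization header for the legacy Log Analytics HTTP
    Data Collector API: 'SharedKey {workspaceId}:{base64(HMAC-SHA256(...))}'."""
    string_to_sign = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date_rfc1123}\n/api/logs"
    )
    key = base64.b64decode(shared_key_b64)  # workspace key is base64-encoded
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"

# Placeholder credentials -- real values come from the workspace's
# Agents management blade.
auth = build_signature(
    "demo-workspace-id",
    base64.b64encode(b"demo-key").decode(),
    "Mon, 01 Jan 2024 00:00:00 GMT",
    128,
)
```

The resulting header goes on the POST to `https://{workspaceId}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01` along with the JSON payload; a scheduled Azure Function or Logic App between the SFTP drop and this call is a common pattern.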

