
Asked by SoumyaBanerjee-9177

Sentinel to Blob

I want to transfer data from a Log Analytics workspace (Sentinel) to Azure Blob Storage through the data export option. I have set up 30 days of retention in the LAW.

From the 31st day onward, I want the data in Blob Storage. Is there a way to send data from the LAW to Azure Blob through data export only after it is 30 days old? (The first 30 days of retention in the LAW are free anyway.)

Data in LAW (30 days) ---> Data export ----> Azure Blob (days 31-180) ------> Automatically delete data from Blob that has crossed 180 days

I want the data stored in Blob for 180 days to be deleted automatically after that.

In storage lifecycle management, I can see a base-blob option, "Last modified". If I select 30 days here, will it only move the data that has crossed 30 days in the LAW and not the current data?

Similarly, to delete data that has crossed 180 days, should I select "if base blobs were last modified more than 180 days ago"?






Tags: azure-storage-accounts, azure-blob-storage, microsoft-sentinel

Answered by AndrewBlumhardt-1137

My understanding is that data export sends data at ingestion time; there is no option to export only after the retention period. Once the data starts flowing, the difference (exported during or after retention) is really unimportant, at least in the 30-90 day range. You could be more targeted with a logic app, but the added cost and complexity may not be worth the effort. https://docs.microsoft.com/en-us/azure/azure-monitor/logs/logs-data-export

Sentinel increased the free retention period to 90 days. Extended retention in Log Analytics is very cost-effective in the short term. If you only require 180 days, I think you will find the additional 90 days within Log Analytics affordable and simple to manage. Extended retention in Log Analytics can be expensive for large datasets after 8-12 months; the pricing calculator can help with extended-storage budgeting if needed.

Also note that Log Analytics supports table-level retention settings, configured with an ARM template, if you have varying retention requirements for individual tables. This option can make native retention more affordable.
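A rough sketch of that ARM-based, table-level setting, deployed as a `workspaces/tables` sub-resource. The workspace name (`my-workspace`) is a placeholder, and the API version should be checked against the current docs before use:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "type": "Microsoft.OperationalInsights/workspaces/tables",
      "apiVersion": "2021-12-01-preview",
      "name": "my-workspace/SecurityEvent",
      "properties": {
        "retentionInDays": 180
      }
    }
  ]
}
```

Deploying one such resource per table lets each table carry its own retention period independently of the workspace default.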

You should also check out the new basic logs, archive option, and ingestion-time filtering. These new options will have a big impact on archival strategy. https://techcommunity.microsoft.com/t5/azure-observability-blog/the-next-evolution-of-azure-monitor-logs/ba-p/3143195






Many thanks @AndrewBlumhardt-1137. Say I keep the data in Blob (the client wants it) for 180 days; how will the data that has passed 180 days be deleted?

In storage lifecycle management, I can see a base-blob option, "Last modified". Will setting this to 180 days do the job for auto-deletion?



Answered by AndrewBlumhardt-1137

I know very little about the blob storage tiers or lifecycle management options. Maybe someone else can chime in there. Maybe run a short test to verify.

I would still recommend working with your client on a more manageable solution.

This may help: https://docs.microsoft.com/en-us/azure/storage/blobs/lifecycle-management-overview


Many thanks for your help.

Answered by SudiptaChakraborty-1767

@SoumyaBanerjee-9177 :

The following sample blob lifecycle management rule runs its actions on objects that exist inside sample-container and whose names start with blob1.

Actions:

  • Tier blob to cool tier 30 days after last modification

  • Tier blob to archive tier 90 days after last modification

  • Delete blob 2,555 days (seven years; in your case it would be 180 days) after last modification

  • Delete previous versions 90 days after creation


    {
      "rules": [
        {
          "enabled": true,
          "name": "sample-rule",
          "type": "Lifecycle",
          "definition": {
            "actions": {
              "version": {
                "delete": {
                  "daysAfterCreationGreaterThan": 90
                }
              },
              "baseBlob": {
                "tierToCool": {
                  "daysAfterModificationGreaterThan": 30
                },
                "tierToArchive": {
                  "daysAfterModificationGreaterThan": 90
                },
                "delete": {
                  "daysAfterModificationGreaterThan": 2555
                }
              }
            },
            "filters": {
              "blobTypes": [
                "blockBlob"
              ],
              "prefixMatch": [
                "sample-container/blob1"
              ]
            }
          }
        }
      ]
    }
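A minimal sketch adapting the sample rule above to the 180-day requirement discussed in this thread: keep only the delete action, and point `prefixMatch` at the container your data export rule writes to. The rule name and container prefix below are placeholders, not values from your environment:

```python
import json

# Lifecycle policy with a single action: delete base blobs 180 days
# after last modification. The container prefix is a placeholder for
# wherever the Log Analytics data export writes its blobs.
policy = {
    "rules": [
        {
            "enabled": True,
            "name": "delete-after-180-days",
            "type": "Lifecycle",
            "definition": {
                "actions": {
                    "baseBlob": {
                        # Auto-delete exported blobs after 180 days
                        "delete": {"daysAfterModificationGreaterThan": 180}
                    }
                },
                "filters": {
                    "blobTypes": ["blockBlob"],
                    "prefixMatch": ["my-export-container/"]  # placeholder
                }
            }
        }
    ]
}

# Serialize to the JSON document you would paste into the portal's
# lifecycle-management code view or submit via the management API.
print(json.dumps(policy, indent=2))
```

Because the only action is a base-blob delete, nothing is tiered; blobs simply age out 180 days after they were last modified, which for exported data is effectively 180 days after export.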



Hi @SudiptaChakraborty-1767, I have set up the blob rule as below. Will it automatically delete whatever data crosses 180 days? Also, I have selected the snapshots and versions options as well; I'm not sure how, when, and by whom they are created, or whether I should exclude them. If possible, can we connect?
(Four screenshots of the lifecycle management rule configuration were attached.)
Answered by clivewatson-9831

What will you do with the data in Blob? Sentinel is moving to the (preview) archive feature for 90-day to 7-year retention (the data will automatically age off after the period is reached).
This feature allows full searching (using KQL) and per-table restore (as Andrew mentioned: https://docs.microsoft.com/en-gb/azure/azure-monitor/logs/cost-logs#log-data-retention-and-archive). If you plan to use the data again, then this, in my opinion, is the better option; if you never plan to use the data, then Blob may be an option. You set archive once per table, and that's it (via the REST API https://docs.microsoft.com/en-gb/azure/azure-monitor/logs/data-retention-archive?tabs=api-1%2Capi-2#set-retention-and-archive-policy-by-table or via a Sentinel workbook on GitHub).
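As a sketch of that per-table REST call: assuming the Tables API (a PUT on the workspace's `tables/{tableName}` resource, as in the linked doc), the body is small JSON where `totalRetentionInDays` covers interactive retention plus archive. For the 180-day scenario in this thread, with 90 free interactive days, that might look like:

```json
{
  "properties": {
    "retentionInDays": 90,
    "totalRetentionInDays": 180
  }
}
```

Here the first 90 days stay interactively searchable and the remaining 90 sit in archive; check the API version and exact property names against the current documentation before relying on this.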

You need to check the cost difference between Blob and archive for the way you plan to use the data (as you don't say, I don't know what you will do with the data in the Blob).

Over the years I've seen many people move data to Blob with no plan for how to use it again; then they have problems when they try, normally when they need to do a quick check!


Many thanks to both of you, @AndrewBlumhardt-1137 and @clivewatson-9831.

The customer was avoiding this method as it is in preview. So I have been able to set the archive through the PowerShell code mentioned in https://github.com/Azure/Azure-Sentinel/blob/master/Tools/Archive-Log-Tool/ArchiveLogsTool-PowerShell/Configure-Long-Term-Retention.ps1 .

https://techcommunity.microsoft.com/t5/microsoft-sentinel-blog/ingest-archive-search-and-restore-data-in-microsoft-sentinel/ba-p/3195126

However, I could not find the Azure Activity table through this. How can I set up archiving for the "Azure Activity" and "Usage" tables? I would also need to set up six months of archiving for these two tables.

