Log Analytics workspace data export in Azure Monitor (preview)
Log Analytics workspace data export in Azure Monitor allows you to continuously export data from selected tables in your Log Analytics workspace to an Azure storage account or Azure Event Hubs as it's collected. This article provides details on this feature and steps to configure data export in your workspaces.
Once data export is configured for your Log Analytics workspace, any new data sent to the selected tables in the workspace is automatically exported to your storage account hourly or to your event hub in near-real-time.
All data from included tables is exported without a filter. For example, when you configure a data export rule for the SecurityEvent table, all data sent to the SecurityEvent table is exported starting from the configuration time.
Other export options
Log Analytics workspace data export continuously exports data from a Log Analytics workspace. Other options to export data for particular scenarios include the following:
- Scheduled export from a log query using a Logic App. This is similar to the data export feature but allows you to send filtered or aggregated data to Azure storage. This method is subject to log query limits, though. See Archive data from Log Analytics workspace to Azure storage using Logic App.
- One-time export using a Logic App. See Azure Monitor Logs connector for Logic Apps and Power Automate.
- One-time export to a local machine using a PowerShell script. See Invoke-AzOperationalInsightsQueryExport.
- Configuration can currently only be performed using CLI or REST requests. You cannot use the Azure portal or PowerShell.
- The --export-all-tables option in CLI and REST isn't supported and will be removed. You should provide the list of tables in export rules explicitly.
- Supported tables are currently limited to those specified in the supported tables section below. If the data export rule includes an unsupported table, the operation will succeed, but no data will be exported for that table. If the data export rule includes a table that doesn't exist, it will fail with the error Table <tableName> does not exist in the workspace.
- Your Log Analytics workspace can be in any region except for the following:
- Switzerland North
- Switzerland West
- Azure government regions
- The destination storage account or event hub must be in the same region as the Log Analytics workspace.
- Names of tables to be exported can be no longer than 60 characters for a storage account and no longer than 47 characters for an event hub. Tables with longer names will not be exported.
Log Analytics data export writes data as append blobs, which are currently in preview for Azure Data Lake Storage Gen2. You must open a support request before configuring export to this type of storage. Use the following details for the request.
- Issue type: Technical
- Subscription: Your subscription
- Service: Data Lake Storage Gen2
- Resource: Your resource name
- Summary: Requesting subscription registration to accept data from Log Analytics Data Export.
- Problem type: Connectivity
- Problem subtype: Connectivity issue
If the destination is unavailable, data export continues to retry sending data for up to 30 minutes. If the destination is still unavailable after 30 minutes, data is discarded until the destination becomes available.
There are currently no additional charges for the data export feature. Pricing for data export will be announced in the future, and a notice will be provided prior to the start of billing. If you choose to continue using data export after the notice period, you will be billed at the applicable rate.
Data is sent to storage accounts every hour. The data export configuration creates a container for each table in the storage account, with the name am- followed by the name of the table. For example, the table SecurityEvent would be sent to a container named am-SecurityEvent.
The storage account blob path is WorkspaceResourceId=/subscriptions/subscription-id/resourcegroups/<resource-group>/providers/microsoft.operationalinsights/workspaces/<workspace>/y=<four-digit numeric year>/m=<two-digit numeric month>/d=<two-digit numeric day>/h=<two-digit 24-hour clock hour>/m=00/PT1H.json. Because append blobs are limited to 50K writes in storage, the number of exported blobs can grow when the number of appends is high. The naming pattern for blobs in that case is PT1H_#.json, where # is the incremental blob count.
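For illustration only, a blob exported on June 1 at 15:00 UTC for a hypothetical subscription, resource group, and workspace would resolve to a path like the following (all values are placeholders):
WorkspaceResourceId=/subscriptions/00000000-0000-0000-0000-000000000000/resourcegroups/myresourcegroup/providers/microsoft.operationalinsights/workspaces/myworkspace/y=2021/m=06/d=01/h=15/m=00/PT1H.json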
The storage account data format is JSON lines. This means each record is delimited by a newline, with no outer records array and no commas between JSON records.
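As a sketch of the format, two hypothetical records in a PT1H.json blob would appear as two standalone JSON objects on consecutive lines (field names here are illustrative, not the full export schema):
{"TimeGenerated":"2021-06-01T15:04:12Z","Computer":"vm-web-01","EventID":4624}
{"TimeGenerated":"2021-06-01T15:04:13Z","Computer":"vm-web-02","EventID":4625}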
Log Analytics data export can write append blobs to immutable storage accounts when time-based retention policies have the allowProtectedAppendWrites setting enabled. This allows writing new blocks to an append blob, while maintaining immutability protection and compliance. See Allow protected append blobs writes.
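As a minimal sketch, assuming your Azure CLI version supports the --allow-protected-append-writes parameter and using placeholder account and container names, a time-based retention policy that still allows protected append writes could be configured like this:
az storage container immutability-policy create --account-name mystorageaccount --container-name am-securityevent --period 30 --allow-protected-append-writes true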
Data is sent to your event hub in near-real-time as it reaches Azure Monitor. An event hub is created for each data type that you export, with the name am- followed by the name of the table. For example, the table SecurityEvent would be sent to an event hub named am-SecurityEvent. If you want the exported data to reach a specific event hub, or if you have a table with a name that exceeds the 47-character limit, you can provide your own event hub name and export all data for the defined tables to it.
- The 'Basic' event hub SKU supports a lower event size limit, and some logs in your workspace can exceed it and be dropped. We recommend using a 'Standard' or 'Dedicated' event hub as the export destination.
- The volume of exported data often increases over time, and the event hub scale needs to increase to handle larger transfer rates and avoid throttling and data latency. Use the auto-inflate feature of Event Hubs to automatically scale up the number of throughput units and meet usage needs, as shown in the sketch after this list. See Automatically scale up Azure Event Hubs throughput units for details.
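As an illustrative sketch, with placeholder namespace, resource group, and region names, an Event Hubs namespace with auto-inflate enabled can be created with Azure CLI like this:
az eventhubs namespace create --resource-group myresourcegroup --name myeventhubns --location eastus --sku Standard --enable-auto-inflate --maximum-throughput-units 10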
Following are prerequisites that must be completed before configuring Log Analytics data export.
- The storage account and event hub must already be created and must be in the same region as the Log Analytics workspace. If you need to replicate your data to other storage accounts, you can use any of the Azure Storage redundancy options.
- The storage account must be StorageV1 or StorageV2. Classic storage isn't supported. A CLI sketch for creating a compatible account follows this list.
- If you have configured your storage account to allow access from selected networks, you need to add an exception in your storage account settings to allow Azure Monitor to write to your storage.
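For example, a StorageV2 account in the same region as the workspace can be created with Azure CLI (account, resource group, and region names are placeholders):
az storage account create --name mystorageaccount --resource-group myresourcegroup --location eastus --kind StorageV2 --sku Standard_LRS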
Enable data export
The following steps must be performed to enable Log Analytics data export. See the following sections for more details on each.
- Register resource provider.
- Allow trusted Microsoft services.
- Create one or more data export rules that define the tables to export and their destination.
Register resource provider
The following Azure resource provider needs to be registered for your subscription to enable Log Analytics data export.
This resource provider will probably already be registered for most Azure Monitor users. To verify, go to Subscriptions in the Azure portal. Select your subscription and then click Resource providers in the Settings section of the menu. Locate Microsoft.Insights. If its status is Registered, then it's already registered. If not, click Register to register it.
You can also use any of the available methods to register a resource provider as described in Azure resource providers and types. Following is a sample command using PowerShell:
Register-AzResourceProvider -ProviderNamespace Microsoft.insights
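The same registration can also be performed with Azure CLI:
az provider register --namespace Microsoft.Insights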
Allow trusted Microsoft services
If you have configured your Storage Account to allow access from selected networks, you need to add an exception to allow Azure Monitor to write to the account. From Firewalls and virtual networks for your storage account, select Allow trusted Microsoft services to access this storage account.
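If you prefer the command line, a sketch of the equivalent setting with Azure CLI (account and resource group names are placeholders) is:
az storage account update --name mystorageaccount --resource-group myresourcegroup --bypass AzureServices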
Create or update data export rule
A data export rule defines data to be exported for a set of tables to a single destination. You can create a rule for each destination.
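A minimal Azure CLI sketch for creating a rule that exports two tables to a storage account follows. The rule name, tables, and destination resource ID are placeholders; to target an event hub instead, the destination can be an Event Hubs namespace resource ID.
az monitor log-analytics workspace data-export create --resource-group myresourcegroup --workspace-name myworkspace --name ruleName --tables SecurityEvent Heartbeat --destination /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/myresourcegroup/providers/Microsoft.Storage/storageAccounts/mystorageaccount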
View data export configuration
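For example, a sketch with Azure CLI (rule, workspace, and resource group names are placeholders):
az monitor log-analytics workspace data-export show --resource-group myresourcegroup --workspace-name myworkspace --name ruleName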
Disable an export rule
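A sketch with Azure CLI, assuming the data-export update subcommand accepts the --enable flag to turn a rule off (rule, table, workspace, and resource group names are placeholders):
az monitor log-analytics workspace data-export update --resource-group myresourcegroup --workspace-name myworkspace --name ruleName --tables SecurityEvent --enable false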
Delete an export rule
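For example, a sketch with Azure CLI (rule, workspace, and resource group names are placeholders):
az monitor log-analytics workspace data-export delete --resource-group myresourcegroup --workspace-name myworkspace --name ruleName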
View all data export rules in a workspace
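For example, a sketch with Azure CLI (workspace and resource group names are placeholders):
az monitor log-analytics workspace data-export list --resource-group myresourcegroup --workspace-name myworkspace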
If the data export rule includes an unsupported table, the configuration will succeed, but no data will be exported for that table. If the table is later supported, then its data will be exported at that time.
If the data export rule includes a table that doesn't exist, it will fail with the error Table <tableName> does not exist in the workspace.
Supported tables are currently limited to those specified below. All data from the table will be exported unless limitations are specified. This list will be updated as support for additional tables is added.
|Table|Limitations|
|:---|:---|
|Alert|Partial support. Some of the data sent to this table is ingested through a storage account. This data is currently not exported.|
|ConfigurationData|Partial support. Some of the data is ingested through internal services that aren't supported for export. This data is currently not exported.|
|Event|Partial support. Some of the data sent to this table is ingested through a storage account. This data is currently not exported.|
|InsightsMetrics|Partial support. Some of the data is ingested through internal services that aren't supported for export. This data is currently not exported.|
|OfficeActivity|Partial support. Some of the data is ingested via webhooks from Office 365 into Log Analytics. This data is currently not exported.|
|Operation|Partial support. Some of the data is ingested through internal services that aren't supported for export. This data is currently not exported.|
|Perf|Partial support. Only Windows performance data is currently supported. Linux performance data is currently not exported.|
|Syslog|Partial support. Some of the data sent to this table is ingested through a storage account. This data is currently not exported.|
|Update|Partial support. Some of the data is ingested through internal services that aren't supported for export. This data is currently not exported.|
|WireData|Partial support. Some of the data is ingested through internal services that aren't supported for export. This data is currently not exported.|