Log Analytics workspace data export in Azure Monitor

Data export in a Log Analytics workspace lets you continuously export data from selected tables in your workspace to an Azure Storage Account or Azure Event Hubs as it arrives in the Azure Monitor pipeline. This article describes the feature and the steps to configure data export in your workspaces.

Overview

Data in Log Analytics is available for the retention period defined in your workspace, and used in various experiences provided in Azure Monitor and Azure services. There are cases where you need to use other tools:

  • Tamper-protected store for compliance – data can't be altered in Log Analytics once ingested, but it can be purged. Export to a Storage Account configured with immutability policies to keep data tamper protected.
  • Integration with Azure services and other tools – export to Event Hubs as data arrives and is processed in Azure Monitor.
  • Long-term retention of audit and security data – export to a Storage Account in the workspace's region, or replicate data to other regions using any of the Azure Storage redundancy options, including GRS and GZRS.

After you configure data export rules in a Log Analytics workspace, new data for the tables in those rules is exported from the Azure Monitor pipeline to your Storage Account or Event Hubs as it arrives.

Data is exported without a filter. For example, when you configure a data export rule for the SecurityEvent table, all data sent to that table is exported starting from the time you configure the rule.

Other export options

Log Analytics workspace data export continuously exports data that is sent to your Log Analytics workspace. Other options are available to export data for particular scenarios.

Limitations

  • All tables will eventually be supported in export, but export is currently limited to the tables specified in the supported tables section.
  • Legacy custom logs created with the HTTP Data Collector API aren't supported in export, but data in DCR-based custom logs can be exported.
  • You can define up to 10 enabled rules in your workspace. More rules are allowed when disabled.
  • Destinations must be in the same region as the Log Analytics workspace.
  • The Storage Account must be unique across rules in the workspace.
  • Table names can be no longer than 60 characters when exporting to a Storage Account and 47 characters when exporting to Event Hubs. Tables with longer names aren't exported.
  • Data export isn't supported in China currently.
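The table-name limits above can be checked with a short script. This is an illustrative sketch, not an official tool; the table name is just an example taken from the supported tables section of this article:

```shell
# Check a table name against the export name limits described above
# (60 characters for Storage Account export, 47 for Event Hubs).
table="AADDomainServicesDirectoryServiceAccess"   # example table from this article
len=${#table}
[ "$len" -le 60 ] && storage_ok=yes || storage_ok=no
[ "$len" -le 47 ] && eventhub_ok=yes || eventhub_ok=no
echo "${table}: ${len} chars, storage=${storage_ok}, eventhub=${eventhub_ok}"
```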

Data completeness

Data export is optimized for moving large volumes of data to your destinations, and in certain retry conditions it can include a small fraction of duplicated records. The export operation can fail when ingress limits are reached; see details under Create or update data export rule. In that case, retries continue for up to 30 minutes, and if the destination is still unavailable after that, data is discarded until the destination becomes available.

Pricing model

Data export charges are based on the volume of data exported, measured in bytes. The size of the data exported by Log Analytics data export is the number of bytes in the exported JSON-formatted data. Data volume is measured in GB (10^9 bytes).
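As a worked example of the billing unit (the exported byte volume below is a hypothetical figure, not a real measurement):

```shell
# Exported size is the JSON-formatted byte count, and 1 GB = 10^9 bytes
# (decimal gigabytes, not 2^30).
exported_bytes=2500000000               # hypothetical exported volume
billed_gb=$(awk -v b="$exported_bytes" 'BEGIN { printf "%.2f", b / 1e9 }')
echo "${billed_gb} GB"
```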

For more information, including the data export billing timeline, see Azure Monitor pricing.

Export destinations

The data export destination must be available before you create export rules in your workspace. Destinations don't have to be in the same subscription as your workspace. When using Azure Lighthouse, it's also possible to send data to destinations in another Azure Active Directory tenant.

Storage Account

You need 'write' permissions on both the workspace and the destination to configure a data export rule.

Don't use an existing Storage Account that holds other, non-monitoring data. Using a dedicated account gives you better control over access to the data and prevents reaching the storage ingress rate limit, failures, and latency.

To send data to an immutable Storage Account, set the immutability policy for the Storage Account as described in Set and manage immutability policies for Blob storage. You must follow all steps in that article, including enabling protected append blob writes.

The Storage Account must be StorageV1 or above and in the same region as your workspace. If you need to replicate your data to other Storage Accounts in other regions, you can use any of the Azure storage redundancy options, including "GRS" and "GZRS".

Data is sent to your Storage Account as it reaches Azure Monitor and is exported to destinations located in the workspace's region. A container is created in the Storage Account for each exported table, named am- followed by the table name. For example, the SecurityEvent table is exported to a container named am-SecurityEvent.

Blobs are stored in 5-minute folders with the path structure: WorkspaceResourceId=/subscriptions/<subscription-id>/resourcegroups/<resource-group>/providers/microsoft.operationalinsights/workspaces/<workspace>/y=<four-digit numeric year>/m=<two-digit numeric month>/d=<two-digit numeric day>/h=<two-digit 24-hour clock hour>/m=<two-digit 60-minute clock minute>/PT05M.json. An append blob is limited to 50,000 writes; when that limit is reached, more blobs are added to the folder as PT05M_#.json, where # is an incremental blob count.
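The folder structure above can be sketched in a few lines of shell. This is illustrative only; the workspace resource ID is a placeholder, and the example timestamp of 2023-04-07 09:17 UTC is assumed for the demonstration:

```shell
# Build the blob path Azure Monitor would use for a given UTC time,
# following the 5-minute folder structure described above.
# The workspace resource ID is a placeholder.
ws_id="/subscriptions/<subscription-id>/resourcegroups/<resource-group>/providers/microsoft.operationalinsights/workspaces/<workspace>"
y=2023; mo=04; d=07; h=09; min=17     # example UTC timestamp: 2023-04-07 09:17
# Round the minute down to the start of its 5-minute window: 17 -> 15.
win=$(printf '%02d' $(( min / 5 * 5 )))
path="WorkspaceResourceId=${ws_id}/y=${y}/m=${mo}/d=${d}/h=${h}/m=${win}/PT05M.json"
echo "$path"
```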

Blobs in the Storage Account are written in JSON Lines format: each record is delimited by a newline, with no outer records array and no commas between JSON records.
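For illustration, the following writes a blob in that shape. The record fields are made up for the example, not a real exported schema:

```shell
# JSON Lines shape of exported blobs: one JSON object per line,
# no outer array, no commas between records. Fields are illustrative.
printf '%s\n' \
  '{"TimeGenerated":"2023-04-07T09:15:12Z","Computer":"vm-01","EventID":4624}' \
  '{"TimeGenerated":"2023-04-07T09:15:14Z","Computer":"vm-02","EventID":4625}' > PT05M.json
records=$(wc -l < PT05M.json)
echo "records: $records"
```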

Event Hubs

You need 'write' permissions on both the workspace and the destination to configure a data export rule. The shared access policy for the Event Hubs namespace defines the permissions that the streaming mechanism has. Streaming to Event Hubs requires Manage, Send, and Listen permissions. To update the export rule, you must have the ListKey permission on that Event Hubs authorization rule.

Don't use an existing Event Hubs namespace that holds non-monitoring data, to prevent reaching the Event Hubs namespace ingress rate limit, failures, and latency.

Data is sent to your Event Hubs as it reaches Azure Monitor and is exported to destinations located in the workspace's region. You can create multiple export rules to the same Event Hubs namespace by providing a different event hub name in each rule. When an event hub name isn't provided, a default event hub is created for each table that you export, named am- followed by the table name. For example, the SecurityEvent table is sent to an event hub named am-SecurityEvent. The 'Basic' and 'Standard' namespace tiers support up to 10 event hubs. When exporting more than 10 tables to these tiers, either split the tables between several export rules to different Event Hubs namespaces, or provide an event hub name in the rule to export all tables to it.

Note

  • The 'Basic' Event Hubs namespace tier is limited – it supports a lower event size and has no Auto-inflate option to automatically scale up and increase the number of throughput units. Because the data volume to your workspace increases over time and Event Hubs scaling is required as a consequence, use the 'Standard', 'Premium', or 'Dedicated' Event Hubs tiers with the Auto-inflate feature enabled. See Automatically scale up Azure Event Hubs throughput units.
  • Data export can't reach Event Hubs resources when virtual networks are enabled. You have to enable the Allow trusted Microsoft services to bypass this firewall setting in the event hub to grant access to your Event Hubs resources.

Enable data export

The following steps must be performed to enable Log Analytics data export. See the following sections for more details on each.

  • Register resource provider.
  • Allow trusted Microsoft services.
  • Create one or more data export rules that define the tables to export and their destination.

Register resource provider

Azure resource provider Microsoft.Insights needs to be registered in your subscription to enable Log Analytics data export.

This resource provider is probably already registered for most Azure Monitor users. To verify, go to Subscriptions in the Azure portal. Select your subscription and then click Resource providers in the Settings section of the menu. Locate Microsoft.Insights. If its status is Registered, no action is needed; if not, click Register to register it.

You can also use any of the available methods to register a resource provider as described in Azure resource providers and types. Following is a sample command using CLI:

az provider register --namespace 'Microsoft.insights'

Following is a sample command using PowerShell:

Register-AzResourceProvider -ProviderNamespace Microsoft.insights

Allow trusted Microsoft services

If you have configured your Storage Account to allow access from selected networks, you need to add an exception to allow Azure Monitor to write to the account. From Firewalls and virtual networks for your Storage Account, select Allow trusted Microsoft services to access this Storage Account.

Monitor destinations

Important

Export destinations have limits and should be monitored to minimize throttling, failures, and latency. See Storage Accounts scalability and Event Hubs namespace quota.

Monitoring Storage Account

  1. Use a separate Storage Account for export.

  2. Configure alert on the metric:

    Scope: storage-name
    Metric namespace: Account
    Metric: Ingress
    Aggregation: Sum
    Threshold: 80% of max ingress per alert evaluation period. For example, the limit is 60 Gbps for general-purpose v2 in West US, so the threshold is 14,400 Gb per 5-minute evaluation period.
  3. Alert remediation actions

    • Use separate Storage Account for export that isn't shared with non-monitoring data.
    • Azure Storage standard accounts support higher ingress limit by request. To request an increase, contact Azure Support.
    • Split tables between more Storage Accounts.
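The example threshold in step 2 can be derived as follows, using the figures from the example above:

```shell
# 80% of the 60 Gbps example ingress limit over a 5-minute
# (300-second) alert evaluation period.
limit_gbps=60
period_seconds=300
threshold_gb=$(( limit_gbps * period_seconds * 80 / 100 ))
echo "${threshold_gb} Gb per 5-minute period"
```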

Monitoring Event Hubs

  1. Configure alerts on the metrics:

    Alert 1 – Scope: namespaces-name; Metric namespace: Event Hubs standard metrics; Metric: Incoming bytes; Aggregation: Sum; Threshold: 80% of max ingress per alert evaluation period. For example, the limit is 1 MB/s per unit ("TU" or "PU") and five units are used, so the threshold is 1,200 MB per 5-minute evaluation period.
    Alert 2 – Scope: namespaces-name; Metric namespace: Event Hubs standard metrics; Metric: Incoming requests; Aggregation: Count; Threshold: 80% of max events per alert evaluation period. For example, the limit is 1,000/s per unit ("TU" or "PU") and five units are used, so the threshold is 1,200,000 per 5-minute evaluation period.
    Alert 3 – Scope: namespaces-name; Metric namespace: Event Hubs standard metrics; Metric: Quota Exceeded Errors; Aggregation: Count; Threshold: 1% of requests. For example, requests per 5 minutes is 600,000, so the threshold is 6,000 per 5-minute evaluation period.
  2. Alert remediation actions

    • Use a separate Event Hubs namespace for export that isn't shared with non-monitoring data.
    • Configure the Auto-inflate feature to automatically scale up and increase the number of throughput units to meet usage needs.
    • Verify an increase of throughput units to accommodate the data volume.
    • Split tables between more namespaces.
    • Use the 'Premium' or 'Dedicated' tiers for higher throughput.
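Similarly, the Event Hubs thresholds in step 1 can be derived for the example of five throughput units:

```shell
# 80% of the per-unit limits (1 MB/s and 1,000 events/s) with five
# units in use, over a 5-minute (300-second) evaluation period.
units=5
period_seconds=300
mb_threshold=$(( 1 * units * period_seconds * 80 / 100 ))
req_threshold=$(( 1000 * units * period_seconds * 80 / 100 ))
echo "Incoming bytes: ${mb_threshold} MB; Incoming requests: ${req_threshold}"
```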

Create or update data export rule

A data export rule defines the destination and the tables for which data is exported. You can create up to 10 rules in the 'enabled' state in your workspace; more rules are allowed in the 'disabled' state. The Storage Account must be unique across rules in the workspace. Multiple rules can use the same Event Hubs namespace when sending to separate event hubs.

Note

  • You can include tables that aren't yet supported in export, and no data will be exported for these until the tables are supported.
  • The legacy custom log isn't supported in export. The next generation of custom logs, available in preview in early 2022, can be exported.
  • Export to Storage Account - a separate container is created in Storage Account for each table.
  • Export to Event Hubs - if Event Hub name isn't provided, a separate Event Hub is created for each table. The number of supported Event Hubs in 'Basic' and 'Standard' namespaces tiers is 10. When exporting more than 10 tables to these tiers, either split the tables between several export rules to different Event Hubs namespaces, or provide an Event Hub name in the rule to export all tables to it.
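Export rules can also be deployed programmatically, for example as an ARM template resource. The sketch below uses placeholder names and a Storage Account destination; verify the apiVersion and property names against the current Microsoft.OperationalInsights workspaces/dataExports template reference before use:

```json
{
  "type": "Microsoft.OperationalInsights/workspaces/dataExports",
  "apiVersion": "2020-08-01",
  "name": "<workspace-name>/export-security-tables",
  "properties": {
    "destination": {
      "resourceId": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
    },
    "tableNames": [ "SecurityEvent", "Heartbeat" ],
    "enable": true
  }
}
```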

In the Log Analytics workspace menu in the Azure portal, select Data Export from the Settings section and click New export rule from the top of the middle pane.

Follow the steps, then click Create.

Screenshot of data export rule configuration.

View data export rule configuration

In the Log Analytics workspace menu in the Azure portal, select Data Export from the Settings section.

Click a rule for configuration view.

Screenshot of data export rule view.

Disable or update an export rule

Export rules can be disabled to let you stop the export for a certain period, such as while you're testing. In the Log Analytics workspace menu in the Azure portal, select Data Export from the Settings section and click the status toggle to disable or enable an export rule.

Delete an export rule

In the Log Analytics workspace menu in the Azure portal, select Data Export from the Settings section, then click the ellipsis to the right of the rule and click Delete.

View all data export rules in a workspace

In the Log Analytics workspace menu in the Azure portal, select Data Export from the Settings section to view all export rules in the workspace.

Unsupported tables

If the data export rule includes an unsupported table, the configuration will succeed, but no data will be exported for that table. If the table is later supported, then its data will be exported at that time.

Supported tables

All data from the table will be exported unless limitations are specified. This list is updated as more tables are added.

 Table   Limitations 
AACAudit
AACHttpRequest
AADDomainServicesAccountLogon
AADDomainServicesAccountManagement
AADDomainServicesDirectoryServiceAccess
AADDomainServicesLogonLogoff
AADDomainServicesPolicyChange
AADDomainServicesPrivilegeUse
AADManagedIdentitySignInLogs
AADNonInteractiveUserSignInLogs
AADProvisioningLogs
AADRiskyServicePrincipals
AADRiskyUsers
AADServicePrincipalRiskEvents
AADServicePrincipalSignInLogs
AADUserRiskEvents
ABSBotRequests
ACRConnectedClientList
ACSAuthIncomingOperations
ACSBillingUsage
ACSCallDiagnostics
ACSCallSummary
ACSChatIncomingOperations
ACSNetworkTraversalIncomingOperations
ACSSMSIncomingOperations
ADAssessmentRecommendation
ADFActivityRun
ADFPipelineRun
ADFSSignInLogs
ADFTriggerRun
ADPAudit
ADPRequests
ADReplicationResult
ADSecurityAssessmentRecommendation
ADTDigitalTwinsOperation
ADTEventRoutesOperation
ADTModelsOperation
ADTQueryOperation
ADXCommand
ADXQuery
AegDataPlaneRequests
AegDeliveryFailureLogs
AegPublishFailureLogs
AEWAuditLogs
AgriFoodApplicationAuditLogs
AgriFoodFarmManagementLogs
AgriFoodFarmOperationLogs
AgriFoodJobProcessedLogs
AGSGrafanaLoginEvents
Alert Partial support – Data ingestion for Zabbix alerts isn't supported.
AlertEvidence
AlertInfo
AmlOnlineEndpointConsoleLog
ApiManagementGatewayLogs
AppCenterError
AppPlatformSystemLogs
AppServiceAppLogs
AppServiceAuditLogs
AppServiceConsoleLogs
AppServiceFileAuditLogs
AppServiceHTTPLogs
AppServicePlatformLogs
ASimDnsActivityLogs
ATCExpressRouteCircuitIpfix
AuditLogs
AutoscaleEvaluationsLog
AutoscaleScaleActionsLog
AWSCloudTrail
AWSGuardDuty
AWSVPCFlow
AzureAssessmentRecommendation
AzureAttestationDiagnostics
AzureDevOpsAuditing
BehaviorAnalytics
CassandraLogs
CDBCassandraRequests
CDBControlPlaneRequests
CDBDataPlaneRequests
CDBGremlinRequests
CDBMongoRequests
CDBPartitionKeyRUConsumption
CDBPartitionKeyStatistics
CDBQueryRuntimeStatistics
CIEventsAudit
CIEventsOperational
CloudAppEvents
CommonSecurityLog
ComputerGroup
ConfigurationData Partial support – some of the data is ingested through internal services that aren't supported in export. This portion is missing in export currently.
ContainerImageInventory
ContainerInventory
ContainerLog
ContainerLogV2
ContainerNodeInventory
ContainerServiceLog
CoreAzureBackup
DatabricksAccounts
DatabricksClusters
DatabricksDBFS
DatabricksInstancePools
DatabricksJobs
DatabricksNotebook
DatabricksSecrets
DatabricksSQLPermissions
DatabricksSSH
DatabricksWorkspace
DnsEvents
DnsInventory
DSMAzureBlobStorageLogs
DSMDataClassificationLogs
DSMDataLabelingLogs
Dynamics365Activity
EmailAttachmentInfo
EmailEvents
EmailPostDeliveryEvents
EmailUrlInfo
Event Partial support – data arriving from Log Analytics agent (MMA) or Azure Monitor Agent (AMA) is fully supported in export. Data arriving via Diagnostics extension agent is collected through storage while this path isn’t supported in export.
ExchangeAssessmentRecommendation
FailedIngestion
FunctionAppLogs
HDInsightAmbariClusterAlerts
HDInsightAmbariSystemMetrics
HDInsightHadoopAndYarnLogs
HDInsightHadoopAndYarnMetrics
HDInsightHBaseLogs
HDInsightHBaseMetrics
HDInsightHiveAndLLAPLogs
HDInsightHiveAndLLAPMetrics
HDInsightHiveQueryAppStats
HDInsightHiveTezAppStats
HDInsightKafkaLogs
HDInsightKafkaMetrics
HDInsightOozieLogs
HDInsightSecurityLogs
HDInsightSparkApplicationEvents
HDInsightSparkBlockManagerEvents
HDInsightSparkEnvironmentEvents
HDInsightSparkExecutorEvents
HDInsightSparkJobEvents
HDInsightSparkLogs
HDInsightSparkSQLExecutionEvents
HDInsightSparkStageEvents
HDInsightSparkStageTaskAccumulables
HDInsightSparkTaskEvents
Heartbeat
HuntingBookmark
IdentityDirectoryEvents
IdentityLogonEvents
IdentityQueryEvents
InsightsMetrics Partial support – some of the data is ingested through internal services that aren't supported in export. This portion is missing in export currently.
IntuneAuditLogs
IntuneDevices
IntuneOperationalLogs
KubeEvents
KubeHealth
KubeMonAgentEvents
KubeNodeInventory
KubePodInventory
KubeServices
LAQueryLogs
McasShadowItReporting
MCVPAuditLogs
MCVPOperationLogs
MicrosoftAzureBastionAuditLogs
MicrosoftDataShareReceivedSnapshotLog
MicrosoftDataShareSentSnapshotLog
MicrosoftHealthcareApisAuditLogs
NWConnectionMonitorPathResult
NWConnectionMonitorTestResult
OfficeActivity Partial support in government clouds – some of the data is ingested via webhooks from O365 into Log Analytics. This portion is currently missing in export.
OLPSupplyChainEntityOperations
OLPSupplyChainEvents
Operation Partial support – some of the data is ingested through internal services that aren't supported in export. This portion is missing in export currently.
Perf Partial support – only Windows perf data is currently supported. Linux perf data is currently missing in export.
PowerBIActivity
PowerBIDatasetsWorkspace
ProjectActivity
PurviewDataSensitivityLogs
PurviewScanStatusLogs
ResourceManagementPublicAccessLogs
SCCMAssessmentRecommendation
SCOMAssessmentRecommendation
SecurityAlert
SecurityBaseline
SecurityBaselineSummary
SecurityDetection
SecurityEvent Partial support – data arriving from Log Analytics agent (MMA) or Azure Monitor Agent (AMA) is fully supported in export. Data arriving via Diagnostics extension agent is collected through storage while this path isn’t supported in export.
SecurityIncident
SecurityIoTRawEvent
SecurityNestedRecommendation
SecurityRecommendation
SentinelAudit
SentinelHealth
SfBAssessmentRecommendation
SfBOnlineAssessmentRecommendation
SharePointOnlineAssessmentRecommendation
SignalRServiceDiagnosticLogs
SigninLogs
SPAssessmentRecommendation
SQLAssessmentRecommendation
SQLSecurityAuditEvents
StorageCacheOperationEvents
SucceededIngestion
SynapseBigDataPoolApplicationsEnded
SynapseBuiltinSqlPoolRequestsEnded
SynapseGatewayApiRequests
SynapseIntegrationActivityRuns
SynapseIntegrationPipelineRuns
SynapseIntegrationTriggerRuns
SynapseRbacOperations
SynapseScopePoolScopeJobsEnded
SynapseScopePoolScopeJobsStateChange
SynapseSqlPoolDmsWorkers
SynapseSqlPoolExecRequests
SynapseSqlPoolRequestSteps
SynapseSqlPoolSqlRequests
SynapseSqlPoolWaits
Syslog Partial support – data arriving from Log Analytics agent (MMA) or Azure Monitor Agent (AMA) is fully supported in export. Data arriving via Diagnostics extension agent is collected through storage while this path isn’t supported in export.
ThreatIntelligenceIndicator
UCClient
UCClientUpdateStatus
UCDeviceAlert
UCServiceUpdateStatus
UCUpdateAlert
Update Partial support – some of the data is ingested through internal services that aren't supported in export. This portion is missing in export currently.
UpdateRunProgress
UpdateSummary
Usage
UserAccessAnalytics
UserPeerAnalytics
Watchlist
WindowsEvent
WindowsFirewall
WireData Partial support – some of the data is ingested through internal services that aren't supported in export. This portion is missing in export currently.
WorkloadDiagnosticLogs
WVDAgentHealthStatus
WVDCheckpoints
WVDConnections
WVDErrors
WVDFeeds
WVDManagement

Next steps