Log Analytics workspace data export in Azure Monitor (preview)

Log Analytics workspace data export in Azure Monitor allows you to continuously export data from selected tables in your Log Analytics workspace to an Azure storage account or Azure Event Hubs as it's collected. This article provides details on this feature and steps to configure data export in your workspaces.

Overview

Once data export is configured for your Log Analytics workspace, any new data sent to the selected tables in the workspace is automatically exported to your storage account in hourly append blobs or to your event hub in near-real-time.

Data export overview

All data from included tables is exported without a filter. For example, when you configure a data export rule for the SecurityEvent table, all data sent to the SecurityEvent table is exported starting from the configuration time.

Other export options

Log Analytics workspace data export continuously exports data from a Log Analytics workspace. Azure Monitor also provides other options to export data for particular scenarios.

Limitations

  • Configuration can currently be performed using the CLI or REST requests. The Azure portal and PowerShell aren't supported yet.
  • The --export-all-tables option in the CLI and REST isn't supported and will be removed. Provide the list of tables in export rules explicitly.
  • Supported tables are limited to those specified in the supported tables section below.
  • Existing custom log tables aren't supported in export. A new version of custom logs, available in March 2022, will be supported.
  • If the data export rule includes an unsupported table, the operation succeeds, but no data is exported for that table until the table becomes supported.
  • If the data export rule includes a table that doesn't exist, it fails with the error "Table <tableName> does not exist in the workspace".
  • You can define up to 10 enabled rules in your workspace. Additional rules are allowed, but only in a disabled state.
  • Destinations must be unique across all export rules in your workspace.
  • Destinations must be in the same region as the Log Analytics workspace.
  • Table names can be no longer than 60 characters when exporting to a storage account and 47 characters when exporting to an event hub. Tables with longer names won't be exported.
  • Data export will eventually be available in all regions, but it's currently supported only in the following regions:
    • Australia Central
    • Australia East
    • Australia Southeast
    • Brazil South
    • Canada Central
    • Central India
    • Central US
    • East Asia
    • East US
    • East US 2
    • France Central
    • Germany West Central
    • Japan East
    • Korea Central
    • North Central US
    • North Europe
    • South Africa North
    • South Central US
    • Southeast Asia
    • Switzerland North
    • Switzerland West
    • UAE North
    • UK South
    • UK West
    • West Central US
    • West Europe
    • West US
    • West US 2

Data completeness

Data export continues to retry sending data for up to 30 minutes if the destination is unavailable. If the destination is still unavailable after 30 minutes, data is discarded until the destination becomes available.

Cost

Currently, there are no additional charges for the data export feature. Pricing for data export will be announced in the future and a notice period provided prior to the start of billing. If you choose to continue using data export after the notice period, you will be billed at the applicable rate.

Export destinations

The data export destination must be created before you create the export rule in your workspace. The destination doesn't have to be in the same subscription as your workspace. When you use Azure Lighthouse, data can also be sent to a destination in another Azure Active Directory tenant.

Storage account

You need 'write' permissions on both the workspace and the destination to configure a data export rule. Don't use an existing storage account that holds other, non-monitoring data; a dedicated account lets you better control access to the data and avoids reaching the storage ingestion rate limit, which causes throttling.

To send data to immutable storage, set the immutable policy for the storage account as described in Set and manage immutability policies for Blob storage. You must follow all steps in that article, including enabling protected append blob writes.

The storage account must be StorageV1 or above and in the same region as your workspace. If you need to replicate your data to storage accounts in other regions, you can use any of the Azure Storage redundancy options, including GRS and GZRS.

Data is sent to storage accounts as it reaches Azure Monitor and is stored in hourly append blobs. A container is created in the storage account for each exported table, named am- followed by the table name. For example, the table SecurityEvent would be sent to a container named am-SecurityEvent.

Note

We recommend using a separate storage account for export to ensure proper ingress rate allocation and to reduce throttling, failures, and latency events.

Starting 15 October 2021, blobs are stored in five-minute folders in the following path structure: WorkspaceResourceId=/subscriptions/<subscription-id>/resourcegroups/<resource-group>/providers/microsoft.operationalinsights/workspaces/<workspace>/y=<four-digit numeric year>/m=<two-digit numeric month>/d=<two-digit numeric day>/h=<two-digit 24-hour clock hour>/m=<two-digit 60-minute clock minute>/PT05M.json. Because an append blob is limited to 50,000 writes in storage, the number of exported blobs can grow when the number of appends is high. In that case, blobs are named PT05M_#.json, where # is the incremental blob count.
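As an illustration of this layout, the sketch below builds the path for a record collected at 14:37 UTC on 2 November 2021, which falls into the h=14/m=35 five-minute folder. The subscription ID, resource group, and workspace names are hypothetical placeholders.

```shell
# Hypothetical identifiers; substitute your own values.
sub="00000000-0000-0000-0000-000000000000"
rg="my-resource-group"
ws="my-workspace"

# A record collected at 2021-11-02 14:37 UTC lands in the 14:35 folder,
# since folders are aligned to five-minute boundaries.
path="WorkspaceResourceId=/subscriptions/${sub}/resourcegroups/${rg}/providers/microsoft.operationalinsights/workspaces/${ws}/y=2021/m=11/d=02/h=14/m=35/PT05M.json"
echo "${path}"
```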

The storage account data format is JSON Lines: each record is delimited by a newline, with no outer records array and no commas between JSON records.
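A minimal sketch of what this format looks like in an exported blob; the field values below are hypothetical, not actual exported data:

```shell
# Two sample records in JSON Lines format: one JSON object per line,
# no enclosing array, no commas between records.
blob='{"TimeGenerated":"2021-11-02T14:35:10Z","Computer":"vm1"}
{"TimeGenerated":"2021-11-02T14:35:12Z","Computer":"vm2"}'

# Because each line is a complete record, counting non-empty lines counts records.
records=$(printf '%s\n' "$blob" | grep -c .)
echo "$records"
```

This line-delimited layout means a blob can be processed one record at a time without loading or parsing the whole file as a single JSON document.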

Storage sample data

Event hub

You need 'write' permissions on both the workspace and the destination to configure a data export rule. The shared access policy for the event hub namespace defines the permissions that the streaming mechanism has. Streaming to an event hub requires Manage, Send, and Listen permissions. To update the export rule, you must have the ListKey permission on that Event Hubs authorization rule.

The event hub namespace needs to be in the same region as your workspace.

Data is sent to your event hub as it reaches Azure Monitor. An event hub is created for each data type that you export, named am- followed by the table name. For example, the table SecurityEvent would be sent to an event hub named am-SecurityEvent. If you want the exported data to reach a specific event hub, or if a table name exceeds the 47-character limit, you can provide your own event hub name and export all data for the defined tables to it.
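The default naming convention can be sketched as follows; the table name is an example from this article:

```shell
# Default event hub name: "am-" followed by the table name.
table="SecurityEvent"
hub="am-${table}"
echo "${hub}"   # am-SecurityEvent

# Tables whose names exceed 47 characters can't use this default naming
# when exporting to an event hub; check the length before relying on it.
echo "${#table}"
```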

Note

  • The 'Basic' event hub tier has a lower event size limit; some logs in your workspace might exceed it and be dropped. Use the 'Standard', 'Premium', or 'Dedicated' tier for the export destination.
  • The volume of exported data increases over time, and consequently scaling is required for higher ingress rates. Use the Auto-inflate feature to automatically scale up and increase the number of throughput units to meet usage needs. See Automatically scale up Azure Event Hubs throughput units.
  • Use a separate event hub namespace for proper ingress rate allocation and to reduce throttling, failures, and latency events.
  • Data export can't reach event hub resources when virtual networks are enabled. You must enable the Allow trusted Microsoft services to bypass this firewall setting in the event hub to grant access to your Event Hubs resources.

Important

The 'Basic' and 'Standard' event hub namespace tiers support up to 10 event hubs each. If you export more than 10 tables, either split the tables between several export rules to different event hub namespaces, or provide an event hub name in the export rule and export all tables to that event hub.

Enable data export

The following steps must be performed to enable Log Analytics data export. See the following sections for more details on each.

  • Register resource provider.
  • Allow trusted Microsoft services.
  • Create one or more data export rules that define the tables to export and their destination.

Register resource provider

The following Azure resource provider needs to be registered for your subscription to enable Log Analytics data export.

  • Microsoft.Insights

This resource provider is probably already registered for most Azure Monitor users. To verify, go to Subscriptions in the Azure portal. Select your subscription, and then click Resource providers in the Settings section of the menu. Locate Microsoft.Insights. If its status is Registered, no action is needed; if not, click Register to register it.

You can also use any of the available methods to register a resource provider as described in Azure resource providers and types. Following is a sample command using PowerShell:

Register-AzResourceProvider -ProviderNamespace Microsoft.insights
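Since the export configuration itself is done through the CLI, you may prefer the equivalent Azure CLI commands. This sketch assumes an authenticated Azure CLI session against the target subscription:

```shell
# Register the Microsoft.Insights resource provider for the current subscription.
az provider register --namespace 'Microsoft.Insights'

# Check the registration state afterwards.
az provider show --namespace 'Microsoft.Insights' --query registrationState
```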

Allow trusted Microsoft services

If you have configured your Storage Account to allow access from selected networks, you need to add an exception to allow Azure Monitor to write to the account. From Firewalls and virtual networks for your storage account, select Allow trusted Microsoft services to access this storage account.

Storage account firewalls and virtual networks
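The same exception can be sketched with the Azure CLI; the resource names below are placeholders, and an authenticated session is assumed:

```shell
# Allow trusted Microsoft services (including Azure Monitor) to bypass the
# storage account network rules. Resource names are placeholders.
az storage account update \
  --resource-group my-resource-group \
  --name mystorageaccount \
  --bypass AzureServices
```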

Create or update data export rule

A data export rule defines the tables for which data is exported and the destination. You can have up to 10 enabled rules in your workspace; additional rules can be added, but only in a disabled state. Destinations must be unique across all export rules in the workspace.

Data export destinations have limits, and they should be monitored to minimize export throttling, failures, and latency. See storage account scalability and event hub namespace quotas.

Recommendations for storage account

  1. Use a separate storage account for export

  2. Configure an alert on the metric below with the following settings:

    • Scope: storage-name
    • Metric namespace: Account
    • Metric: Ingress
    • Aggregation: Sum
    • Threshold: 80% of the maximum storage ingress rate. For example, it's 60 Gbps for general-purpose v2 in West US.
  3. Alert remediation actions

    • Use a separate storage account for export
    • Azure Storage standard accounts support a higher ingress limit by request. To request an increase, contact Azure Support
    • Split tables between additional storage accounts

Recommendations for event hub

  1. Configure metric alerts with the following settings:

    • Incoming bytes – Scope: namespaces-name, Metric namespace: Event Hub standard metrics, Aggregation: Sum, Threshold: 80% of the maximum ingress per 5 minutes. For example, it's 1 MB/s per unit (TU or PU).
    • Incoming requests – Scope: namespaces-name, Metric namespace: Event Hub standard metrics, Aggregation: Count, Threshold: 80% of the maximum events per 5 minutes. For example, it's 1,000 events/s per unit (TU or PU).
    • Quota Exceeded Errors – Scope: namespaces-name, Metric namespace: Event Hub standard metrics, Aggregation: Count, Threshold: between 1% and 5% of requests.
  2. Alert remediation actions

    • Increase the number of units (TU or PU)
    • Split tables between additional namespaces
    • Use the 'Premium' or 'Dedicated' tier for higher throughput

The export rule should include tables that exist in your workspace. Run this query to get a list of available tables in your workspace.

find where TimeGenerated > ago(24h) | distinct Type

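As a sketch of creating a rule with the Azure CLI (all resource names and the subscription ID below are hypothetical placeholders, and an authenticated session is assumed):

```shell
# Create a rule that exports SecurityEvent and Heartbeat data to a storage account.
az monitor log-analytics workspace data-export create \
  --resource-group my-resource-group \
  --workspace-name my-workspace \
  --name export-rule-storage \
  --tables SecurityEvent Heartbeat \
  --destination "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-resource-group/providers/Microsoft.Storage/storageAccounts/mystorageaccount"

# A rule exporting to an event hub namespace instead: an event hub named
# am-<TableName> is created per table unless you supply an event hub name.
az monitor log-analytics workspace data-export create \
  --resource-group my-resource-group \
  --workspace-name my-workspace \
  --name export-rule-eventhub \
  --tables SecurityEvent \
  --destination "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-resource-group/providers/Microsoft.EventHub/namespaces/my-namespace"
```

Note that the destination is passed as a full Azure resource ID, which is how the rule distinguishes a storage account from an event hub namespace.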

View data export rule configuration

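A sketch using the Azure CLI, assuming the hypothetical rule name from the creation example and an authenticated session:

```shell
# Show the configuration of a single export rule.
az monitor log-analytics workspace data-export show \
  --resource-group my-resource-group \
  --workspace-name my-workspace \
  --name export-rule-storage
```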

Disable an export rule

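A sketch using the Azure CLI, with placeholder names and an authenticated session assumed; disabling a rule keeps its configuration but stops the export:

```shell
# Disable an export rule without deleting it.
az monitor log-analytics workspace data-export update \
  --resource-group my-resource-group \
  --workspace-name my-workspace \
  --name export-rule-storage \
  --enable false
```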

Delete an export rule

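A sketch using the Azure CLI, with placeholder names and an authenticated session assumed:

```shell
# Delete an export rule. Data already exported to the destination is not affected.
az monitor log-analytics workspace data-export delete \
  --resource-group my-resource-group \
  --workspace-name my-workspace \
  --name export-rule-storage
```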

View all data export rules in a workspace

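A sketch using the Azure CLI, with placeholder names and an authenticated session assumed:

```shell
# List all export rules defined in the workspace.
az monitor log-analytics workspace data-export list \
  --resource-group my-resource-group \
  --workspace-name my-workspace
```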

Unsupported tables

If the data export rule includes an unsupported table, the configuration will succeed, but no data will be exported for that table. If the table is later supported, then its data will be exported at that time.

If the data export rule includes a table that doesn't exist, it will fail with the error "Table <tableName> does not exist in the workspace".

Supported tables

Supported tables are currently limited to those specified below. All data from the table is exported unless limitations are specified. This list is updated as support for additional tables is added.

 Table   Limitations 
AACAudit
AACHttpRequest
AADDomainServicesAccountLogon
AADDomainServicesAccountManagement
AADDomainServicesDirectoryServiceAccess
AADDomainServicesLogonLogoff
AADDomainServicesPolicyChange
AADDomainServicesPrivilegeUse
AADManagedIdentitySignInLogs
AADNonInteractiveUserSignInLogs
AADProvisioningLogs
AADRiskyUsers
AADServicePrincipalSignInLogs
AADUserRiskEvents
ABSBotRequests
ACSAuthIncomingOperations
ACSBillingUsage
ACSChatIncomingOperations
ACSSMSIncomingOperations
ADAssessmentRecommendation
ADFActivityRun
ADFPipelineRun
ADFSSignInLogs
ADFTriggerRun
ADPAudit
ADPDiagnostics
ADPRequests
ADReplicationResult
ADSecurityAssessmentRecommendation
ADTDigitalTwinsOperation
ADTEventRoutesOperation
ADTModelsOperation
ADTQueryOperation
ADXCommand
ADXQuery
AegDeliveryFailureLogs
AegPublishFailureLogs
AEWAuditLogs
Alert
AmlOnlineEndpointConsoleLog
ApiManagementGatewayLogs
AppCenterError
AppPlatformSystemLogs
AppServiceAppLogs
AppServiceAuditLogs
AppServiceConsoleLogs
AppServiceFileAuditLogs
AppServiceHTTPLogs
AppServicePlatformLogs
AuditLogs
AutoscaleEvaluationsLog
AutoscaleScaleActionsLog
AWSCloudTrail
AWSGuardDuty
AWSVPCFlow
AzureAssessmentRecommendation
AzureDevOpsAuditing
BehaviorAnalytics
BlockchainApplicationLog
BlockchainProxyLog
CDBCassandraRequests
CDBControlPlaneRequests
CDBDataPlaneRequests
CDBGremlinRequests
CDBMongoRequests
CDBPartitionKeyRUConsumption
CDBPartitionKeyStatistics
CDBQueryRuntimeStatistics
CommonSecurityLog
ComputerGroup
ConfigurationData Partial support – some of the data is ingested through internal services that aren't supported for export. This portion is currently missing in export.
ContainerImageInventory
ContainerInventory
ContainerLog
ContainerLogV2
ContainerNodeInventory
ContainerServiceLog
CoreAzureBackup
DatabricksAccounts
DatabricksClusters
DatabricksDBFS
DatabricksInstancePools
DatabricksJobs
DatabricksNotebook
DatabricksSecrets
DatabricksSQLPermissions
DatabricksSSH
DatabricksWorkspace
DeviceNetworkInfo
DnsEvents
DnsInventory
DummyHydrationFact
Dynamics365Activity
EmailAttachmentInfo
EmailEvents
EmailPostDeliveryEvents
EmailUrlInfo
Event Partial support – data arriving from the Log Analytics agent (MMA) or Azure Monitor Agent (AMA) is fully supported in export. Data arriving via the Diagnostics extension agent is collected through storage, and this path isn't supported in export.
ExchangeAssessmentRecommendation
FailedIngestion
FunctionAppLogs
HDInsightAmbariClusterAlerts
HDInsightAmbariSystemMetrics
HDInsightHadoopAndYarnLogs
HDInsightHadoopAndYarnMetrics
HDInsightHBaseLogs
HDInsightHBaseMetrics
HDInsightHiveAndLLAPLogs
HDInsightHiveAndLLAPMetrics
HDInsightHiveTezAppStats
HDInsightJupyterNotebookEvents
HDInsightKafkaLogs
HDInsightKafkaMetrics
HDInsightOozieLogs
HDInsightSecurityLogs
HDInsightSparkApplicationEvents
HDInsightSparkBlockManagerEvents
HDInsightSparkEnvironmentEvents
HDInsightSparkExecutorEvents
HDInsightSparkJobEvents
HDInsightSparkLogs
HDInsightSparkSQLExecutionEvents
HDInsightSparkStageEvents
HDInsightSparkStageTaskAccumulables
HDInsightSparkTaskEvents
Heartbeat
HuntingBookmark
InsightsMetrics Partial support – some of the data is ingested through internal services that aren't supported for export. This portion is currently missing in export.
IntuneAuditLogs
IntuneDevices
IntuneOperationalLogs
KubeEvents
KubeHealth
KubeMonAgentEvents
KubeNodeInventory
KubePodInventory
KubeServices
LAQueryLogs
McasShadowItReporting
MCCEventLogs
MicrosoftAzureBastionAuditLogs
MicrosoftDataShareReceivedSnapshotLog
MicrosoftDataShareSentSnapshotLog
MicrosoftDataShareShareLog
MicrosoftHealthcareApisAuditLogs
NWConnectionMonitorPathResult
NWConnectionMonitorTestResult
OfficeActivity Partial support in government clouds – some of the data is ingested via webhooks from Office 365 into Log Analytics. This portion is currently missing in export.
Operation Partial support – some of the data is ingested through internal services that aren't supported for export. This portion is missing in export currently.
Perf Partial support – only Windows perf data is currently supported. The Linux perf data is missing in export currently.
PowerBIDatasetsWorkspace
PurviewScanStatusLogs
SCCMAssessmentRecommendation
SCOMAssessmentRecommendation
SecurityAlert
SecurityBaseline
SecurityBaselineSummary
SecurityCef
SecurityDetection
SecurityEvent Partial support – data arriving from the Log Analytics agent (MMA) or Azure Monitor Agent (AMA) is fully supported in export. Data arriving via the Diagnostics extension agent is collected through storage, and this path isn't supported in export.
SecurityIncident
SecurityIoTRawEvent
SecurityNestedRecommendation
SecurityRecommendation
SentinelHealth
SfBAssessmentRecommendation
SfBOnlineAssessmentRecommendation
SharePointOnlineAssessmentRecommendation
SignalRServiceDiagnosticLogs
SigninLogs
SPAssessmentRecommendation
SQLAssessmentRecommendation
SQLSecurityAuditEvents
SucceededIngestion
SynapseBigDataPoolApplicationsEnded
SynapseBuiltinSqlPoolRequestsEnded
SynapseGatewayApiRequests
SynapseIntegrationActivityRuns
SynapseIntegrationPipelineRuns
SynapseIntegrationTriggerRuns
SynapseRbacOperations
SynapseSqlPoolDmsWorkers
SynapseSqlPoolExecRequests
SynapseSqlPoolRequestSteps
SynapseSqlPoolSqlRequests
SynapseSqlPoolWaits
Syslog Partial support – data arriving from the Log Analytics agent (MMA) or Azure Monitor Agent (AMA) is fully supported in export. Data arriving via the Diagnostics extension agent is collected through storage, and this path isn't supported in export.
ThreatIntelligenceIndicator
Update Partial support – some of the data is ingested through internal services that aren't supported for export. This portion is missing in export currently.
UpdateRunProgress
UpdateSummary
Usage
UserAccessAnalytics
UserPeerAnalytics
Watchlist
WindowsEvent
WindowsFirewall
WireData Partial support – some of the data is ingested through internal services that aren't supported for export. This portion is missing in export currently.
WorkloadDiagnosticLogs
WVDAgentHealthStatus
WVDCheckpoints
WVDConnections
WVDErrors
WVDFeeds
WVDManagement

Next steps