Monitoring Azure Blob Storage
When you have critical applications and business processes that rely on Azure resources, you want to monitor those resources for their availability, performance, and operation. This article describes the monitoring data that's generated by Azure Blob Storage and how you can use the features of Azure Monitor to analyze and alert on this data.
Monitor overview
The Overview page in the Azure portal for each Blob storage resource includes a brief view of the resource usage, such as requests and hourly billing. This information is useful, but only a small amount of the monitoring data is available. Some of this data is collected automatically and is available for analysis as soon as you create the resource. You can enable additional types of data collection with some configuration.
What is Azure Monitor?
Azure Blob Storage creates monitoring data by using Azure Monitor, which is a full-stack monitoring service in Azure. Azure Monitor provides a complete set of features to monitor your Azure resources as well as resources in other clouds and on-premises.
Start with the article Monitoring Azure resources with Azure Monitor, which describes the following:
- What is Azure Monitor?
- Costs associated with monitoring
- Monitoring data collected in Azure
- Configuring data collection
- Standard tools in Azure for analyzing and alerting on monitoring data
The following sections build on this article by describing the specific data gathered from Azure Storage. Examples show how to configure data collection and analyze this data with Azure tools.
Monitoring data
Azure Blob Storage collects the same kinds of monitoring data as other Azure resources, which are described in Monitoring data from Azure resources.
See Azure Blob Storage monitoring data reference for detailed information on the metrics and resource logs created by Azure Blob Storage.
Metrics and logs in Azure Monitor support only Azure Resource Manager storage accounts. Azure Monitor doesn't support classic storage accounts. If you want to use metrics or logs on a classic storage account, you need to migrate to an Azure Resource Manager storage account. For more information, see Migrate to Azure Resource Manager.
You can continue using classic metrics and logs if you want to. In fact, classic metrics and logs are available in parallel with metrics and logs in Azure Monitor. That support remains in place until Azure Storage retires legacy metrics and logs.
Collection and routing
Platform metrics and the Activity log are collected automatically, but can be routed to other locations by using a diagnostic setting.
To collect resource logs, you must create a diagnostic setting. When you create the setting, choose blob as the type of storage that you want to enable logs for. Then, specify one or more of the following categories of operations for which you want to collect logs.
Category | Description |
---|---|
StorageRead | Read operations on objects. |
StorageWrite | Write operations on objects. |
StorageDelete | Delete operations on objects. |
Note
Data Lake Storage Gen2 doesn't appear as a storage type. That's because Data Lake Storage Gen2 is a set of capabilities available to Blob storage.
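Programmatically, the three categories map onto the `logs` array of a diagnostic-setting payload. Here's a minimal Python sketch of that shape, assuming the Azure Monitor diagnostic settings REST schema; the workspace ID is a placeholder, and you should verify field names against the current API version:

```python
import json

# Hypothetical sketch of the "logs" portion of a diagnostic-setting payload,
# enabling all three Blob storage log categories. Field names follow the
# Azure Monitor diagnostic settings REST schema (verify against the current
# API version before use).
blob_log_categories = ["StorageRead", "StorageWrite", "StorageDelete"]

diagnostic_setting = {
    "properties": {
        "workspaceId": "<log-analytics-workspace-resource-ID>",  # placeholder
        "logs": [
            {"category": category, "enabled": True}
            for category in blob_log_categories
        ],
    }
}

payload = json.dumps(diagnostic_setting, indent=2)
```

The same shape applies whether you submit it through the REST API, an ARM template, or a policy definition.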
Creating a diagnostic setting
This section shows you how to create a diagnostic setting by using the Azure portal, PowerShell, and the Azure CLI, with steps specific to Azure Storage. For general guidance about how to create a diagnostic setting, see Create diagnostic setting to collect platform logs and metrics in Azure.
Tip
You can also create a diagnostic setting by using an Azure Resource Manager template or by using a policy definition. A policy definition can ensure that a diagnostic setting is created for every account that is created or updated.
This section doesn't describe templates or policy definitions.
To view an Azure Resource Manager template that creates a diagnostic setting, see Diagnostic setting for Azure Storage.
To learn how to create a diagnostic setting by using a policy definition, see Azure Policy built-in definitions for Azure Storage.
1. Sign in to the Azure portal.
2. Navigate to your storage account.
3. In the Monitoring section, select Diagnostic settings.
4. Choose blob as the type of storage that you want to enable logs for.
5. Select Add diagnostic setting.
The Diagnostic settings page appears.
6. In the Name field of the page, enter a name for this resource log setting. Then, select which operations you want logged (read, write, and delete operations), and where you want the logs to be sent.
Archive logs to a storage account
If you choose to archive your logs to a storage account, you'll pay for the volume of logs that are sent to the storage account. For specific pricing, see the Platform Logs section of the Azure Monitor pricing page. You can't send logs to the same storage account that you are monitoring with this setting. This would lead to recursive logs in which a log entry describes the writing of another log entry. You must create an account or use another existing account to store log information.
1. Select the Archive to a storage account checkbox, and then select the Configure button.
2. In the Storage account drop-down list, select the storage account that you want to archive your logs to, and then select the Save button.
Important
You can't set a retention policy. However, you can manage the retention policy of a log container by defining a lifecycle management policy. To learn how, see Optimize costs by automating Azure Blob Storage access tiers.
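As an illustration of that approach, here's a hedged Python sketch of a lifecycle management rule that deletes archived log blobs after a time window. The rule name, 90-day window, and container prefix are illustrative; the structure follows the Azure Storage lifecycle management policy schema, which you should verify before use:

```python
import json

# Hypothetical lifecycle rule that deletes archived log blobs 90 days after
# modification. Resource logs are written as append blobs to containers
# named insights-logs-<operation>. All values here are illustrative.
lifecycle_policy = {
    "rules": [
        {
            "name": "expire-archived-logs",  # illustrative rule name
            "enabled": True,
            "type": "Lifecycle",
            "definition": {
                "filters": {
                    "blobTypes": ["appendBlob"],
                    "prefixMatch": ["insights-logs-storagewrite"],
                },
                "actions": {
                    "baseBlob": {
                        "delete": {"daysAfterModificationGreaterThan": 90}
                    }
                },
            },
        }
    ]
}

policy_json = json.dumps(lifecycle_policy, indent=2)
```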
Note
Before you choose a storage account as the export destination, see Archive Azure resource logs to understand prerequisites on the storage account.
Stream logs to Azure Event Hubs
If you choose to stream your logs to an event hub, you'll pay for the volume of logs that are sent to the event hub. For specific pricing, see the Platform Logs section of the Azure Monitor pricing page. You'll need access to an existing event hub, or you'll need to create one before you complete this step.
1. Select the Stream to an event hub checkbox, and then select the Configure button.
2. In the Select an event hub pane, choose the namespace, name, and policy name of the event hub that you want to stream your logs to.
3. Select the Save button.
Send logs to Azure Log Analytics
- Select the Send to Log Analytics checkbox, select a Log Analytics workspace, and then select the Save button. You'll need access to an existing Log Analytics workspace, or you'll need to create one before you complete this step.
Important
You can't set a retention policy. However, you can manage the data retention period of Log Analytics at the workspace level or even specify different retention settings by data type. To learn how, see Change the data retention period.
Send to a partner solution
You can also send platform metrics and logs to certain Azure Monitor partners. You must first install a partner integration into your subscription. Configuration options will vary by partner. Check the Azure Monitor partner integrations documentation for details.
Analyzing metrics
For a list of all Azure Monitor supported metrics, which includes Azure Blob Storage, see Azure Monitor supported metrics.
You can analyze metrics for Azure Storage with metrics from other Azure services by using Metrics Explorer. Open Metrics Explorer by choosing Metrics from the Azure Monitor menu. For details on using this tool, see Getting started with Azure Metrics Explorer.
For example, you can view Transactions at the account level.
For metrics that support dimensions, you can filter the metric by the desired dimension value. For example, you can view Transactions at the account level for a specific operation by selecting a value for the API Name dimension.
For a complete list of the dimensions that Azure Storage supports, see Metrics dimensions.
Metrics for Azure Blob Storage are in these namespaces:
- Microsoft.Storage/storageAccounts
- Microsoft.Storage/storageAccounts/blobServices
Analyze metrics by using code
Azure Monitor provides the .NET SDK to read metric definitions and values. The sample code shows how to use the SDK with different parameters. You need to use version 0.18.0-preview or later for storage metrics.
In these examples, replace the <resource-ID> placeholder with the resource ID of the entire storage account or of the Blob storage service. You can find these resource IDs on the Endpoints pages of your storage account in the Azure portal. Replace the <subscription-ID> placeholder with the ID of your subscription. For guidance on how to obtain values for <tenant-ID>, <application-ID>, and <AccessKey>, see Use the portal to create an Azure AD application and service principal that can access resources.
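As a rough illustration of what those two resource ID forms look like, here's a Python sketch; the helper names are hypothetical and not part of any Azure SDK:

```python
# Illustrative helpers (not part of any Azure SDK) that build the two
# resource ID forms used in the samples below from their components.
def account_resource_id(subscription_id: str, resource_group: str, account_name: str) -> str:
    """Resource ID of the storage account itself."""
    return (
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Storage/storageAccounts/{account_name}"
    )

def blob_service_resource_id(subscription_id: str, resource_group: str, account_name: str) -> str:
    """Resource ID of the Blob storage service: the account ID plus /blobServices/default."""
    return account_resource_id(subscription_id, resource_group, account_name) + "/blobServices/default"
```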
List the account-level metric definition
The following example shows how to list a metric definition at the account level:
public static async Task ListStorageMetricDefinition()
{
    var resourceId = "<resource-ID>";
    var subscriptionId = "<subscription-ID>";
    var tenantId = "<tenant-ID>";
    var applicationId = "<application-ID>";
    var accessKey = "<AccessKey>";

    MonitorManagementClient readOnlyClient = AuthenticateWithReadOnlyClient(tenantId, applicationId, accessKey, subscriptionId).Result;
    IEnumerable<MetricDefinition> metricDefinitions = await readOnlyClient.MetricDefinitions.ListAsync(resourceUri: resourceId, cancellationToken: new CancellationToken());

    foreach (var metricDefinition in metricDefinitions)
    {
        // Enumerate metric definition:
        //   Id
        //   ResourceId
        //   Name
        //   Unit
        //   MetricAvailabilities
        //   PrimaryAggregationType
        //   Dimensions
        //   IsDimensionRequired
    }
}
Reading account-level metric values
The following example shows how to read UsedCapacity data at the account level:
public static async Task ReadStorageMetricValue()
{
    var resourceId = "<resource-ID>";
    var subscriptionId = "<subscription-ID>";
    var tenantId = "<tenant-ID>";
    var applicationId = "<application-ID>";
    var accessKey = "<AccessKey>";

    MonitorManagementClient readOnlyClient = AuthenticateWithReadOnlyClient(tenantId, applicationId, accessKey, subscriptionId).Result;
    Microsoft.Azure.Management.Monitor.Models.Response Response;

    string startDate = DateTime.Now.AddHours(-3).ToUniversalTime().ToString("o");
    string endDate = DateTime.Now.ToUniversalTime().ToString("o");
    string timeSpan = startDate + "/" + endDate;

    Response = await readOnlyClient.Metrics.ListAsync(
        resourceUri: resourceId,
        timespan: timeSpan,
        interval: System.TimeSpan.FromHours(1),
        metricnames: "UsedCapacity",
        aggregation: "Average",
        resultType: ResultType.Data,
        cancellationToken: CancellationToken.None);

    foreach (var metric in Response.Value)
    {
        // Enumerate metric value:
        //   Id
        //   Name
        //   Type
        //   Unit
        //   Timeseries
        //     - Data
        //     - Metadatavalues
    }
}
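The timespan string that the C# sample builds is just an ISO-8601 "start/end" pair. Here's a small Python sketch of the same calculation, purely for illustration; Python's `isoformat()` output differs slightly from .NET's round-trip "o" format, but both are ISO-8601 timestamps:

```python
from datetime import datetime, timedelta, timezone

def metrics_timespan(hours_back=3, now=None):
    """Build a 'start/end' ISO-8601 timespan covering the last few hours,
    mirroring the startDate/endDate logic in the C# sample above."""
    end = now or datetime.now(timezone.utc)
    start = end - timedelta(hours=hours_back)
    return f"{start.isoformat()}/{end.isoformat()}"
```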
Reading multidimensional metric values
For multidimensional metrics, you need to define metadata filters if you want to read metric data on specific dimension values.
The following example shows how to read metric data for a metric that supports multiple dimensions:
public static async Task ReadStorageMetricValueTest()
{
    // Resource ID for Blob storage
    var resourceId = "/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Storage/storageAccounts/{storageAccountName}/blobServices/default";
    var subscriptionId = "<subscription-ID>";
    // How to identify the tenant ID, application ID, and access key: https://azure.microsoft.com/documentation/articles/resource-group-create-service-principal-portal/
    var tenantId = "<tenant-ID>";
    var applicationId = "<application-ID>";
    var accessKey = "<AccessKey>";

    MonitorManagementClient readOnlyClient = AuthenticateWithReadOnlyClient(tenantId, applicationId, accessKey, subscriptionId).Result;
    Microsoft.Azure.Management.Monitor.Models.Response Response;

    string startDate = DateTime.Now.AddHours(-3).ToUniversalTime().ToString("o");
    string endDate = DateTime.Now.ToUniversalTime().ToString("o");
    string timeSpan = startDate + "/" + endDate;

    // You can define a metadata filter when a metric supports dimensions.
    // Add more conditions with the 'or' and 'and' operators, for example:
    // BlobType eq 'BlockBlob' or BlobType eq 'PageBlob'
    ODataQuery<MetadataValue> odataFilterMetrics = new ODataQuery<MetadataValue>(
        string.Format("BlobType eq '{0}'", "BlockBlob"));

    Response = readOnlyClient.Metrics.List(
        resourceUri: resourceId,
        timespan: timeSpan,
        interval: System.TimeSpan.FromHours(1),
        metricnames: "BlobCapacity",
        odataQuery: odataFilterMetrics,
        aggregation: "Average",
        resultType: ResultType.Data);

    foreach (var metric in Response.Value)
    {
        // Enumerate metric value:
        //   Id
        //   Name
        //   Type
        //   Unit
        //   Timeseries
        //     - Data
        //     - Metadatavalues
    }
}
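Because the OData filter is a plain string, composing multi-dimension filters is just string building. Here's a hypothetical Python helper sketching that idea, joining conditions with `and` (the same pattern applies to `or`):

```python
def dimension_filter(**dimensions):
    """Compose an OData-style metric filter such as "BlobType eq 'BlockBlob'".
    Illustrative helper only; multiple dimensions are joined with 'and'."""
    return " and ".join(
        f"{name} eq '{value}'" for name, value in dimensions.items()
    )
```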
Analyzing logs
You can access resource logs as a blob in a storage account, as event data, or through Log Analytics queries.
For a detailed reference of the fields that appear in these logs, see Azure Blob Storage monitoring data reference.
Log entries are created only if there are requests made against the service endpoint. For example, if a storage account has activity in its blob endpoint but not in its table or queue endpoints, only logs that pertain to the blob service are created. Azure Storage logs contain detailed information about successful and failed requests to a storage service. This information can be used to monitor individual requests and to diagnose issues with a storage service. Requests are logged on a best-effort basis.
Log authenticated requests
The following types of authenticated requests are logged:
- Successful requests
- Failed requests, including timeout, throttling, network, authorization, and other errors
- Requests that use a shared access signature (SAS) or OAuth, including failed and successful requests
- Requests to analytics data (classic log data in the $logs container and classic metric data in the $metric tables)
Requests made by the Blob storage service itself, such as log creation or deletion, aren't logged. For a full list of the logged data, see Storage logged operations and status messages and Storage log format.
Log anonymous requests
The following types of anonymous requests are logged:
- Successful requests
- Server errors
- Timeout errors for both client and server
- Failed GET requests with the error code 304 (Not Modified)
All other failed anonymous requests aren't logged. For a full list of the logged data, see Storage logged operations and status messages and Storage log format.
Accessing logs in a storage account
Logs appear as blobs stored in a container in the target storage account. Data is collected and stored inside a single blob as a line-delimited JSON payload. The name of the blob follows this naming convention:
https://<destination-storage-account>.blob.core.windows.net/insights-logs-<storage-operation>/resourceId=/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.Storage/storageAccounts/<source-storage-account>/blobServices/default/y=<year>/m=<month>/d=<day>/h=<hour>/m=<minute>/PT1H.json
Here's an example:
https://mylogstorageaccount.blob.core.windows.net/insights-logs-storagewrite/resourceId=/subscriptions/208841be-a4v3-4234-9450-08b90c09f4/resourceGroups/myresourcegroup/providers/Microsoft.Storage/storageAccounts/mystorageaccount/blobServices/default/y=2019/m=07/d=30/h=23/m=12/PT1H.json
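As a sketch, the naming convention can be unpacked with a regular expression. Note that `m=` appears twice: once for the month and once for the minute. This parser is illustrative only, under the assumption that the convention above holds:

```python
import re

def parse_log_blob_path(path: str) -> dict:
    """Illustrative parser (not an official API) for the log-blob naming
    convention. 'm=' appears twice: month first, minute second."""
    match = re.search(
        r"resourceId=(?P<resource_id>.+?)"
        r"/y=(?P<year>\d{4})/m=(?P<month>\d{2})/d=(?P<day>\d{2})"
        r"/h=(?P<hour>\d{2})/m=(?P<minute>\d{2})/PT1H\.json$",
        path,
    )
    if match is None:
        raise ValueError("path doesn't match the log blob naming convention")
    return match.groupdict()

# The example path from this article:
example = (
    "https://mylogstorageaccount.blob.core.windows.net/insights-logs-storagewrite"
    "/resourceId=/subscriptions/208841be-a4v3-4234-9450-08b90c09f4"
    "/resourceGroups/myresourcegroup/providers/Microsoft.Storage"
    "/storageAccounts/mystorageaccount/blobServices/default"
    "/y=2019/m=07/d=30/h=23/m=12/PT1H.json"
)
fields = parse_log_blob_path(example)
```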
Accessing logs in an event hub
Logs sent to an event hub aren't stored as a file, but you can verify that the event hub received the log information. In the Azure portal, go to your event hub and verify that the incoming messages count is greater than zero.
You can access and read log data that's sent to your event hub by using security information and event management and monitoring tools. For more information, see What can I do with the monitoring data being sent to my event hub?.
Accessing logs in a Log Analytics workspace
You can access logs sent to a Log Analytics workspace by using Azure Monitor log queries.
For more information, see Get started with Log Analytics in Azure Monitor.
Data is stored in the StorageBlobLogs table. Logs for Data Lake Storage Gen2 don't appear in a dedicated table. That's because Data Lake Storage Gen2 isn't a separate service; it's a set of capabilities that you can enable in your storage account. If you've enabled those capabilities, logs continue to appear in the StorageBlobLogs table.
Sample Kusto queries
Here are some queries that you can enter in the Log search bar to help you monitor your Blob storage. These queries are written in Kusto Query Language (KQL).
Important
When you select Logs from the storage account resource group menu, Log Analytics is opened with the query scope set to the current resource group. This means that log queries will only include data from that resource group. If you want to run a query that includes data from other resources or data from other Azure services, select Logs from the Azure Monitor menu. See Log query scope and time range in Azure Monitor Log Analytics for details.
Use these queries to help you monitor your Azure Storage accounts:
To list the 10 most common errors over the last three days:

    StorageBlobLogs
    | where TimeGenerated > ago(3d) and StatusText !contains "Success"
    | summarize count() by StatusText
    | top 10 by count_ desc

To list the top 10 operations that caused the most errors over the last three days:

    StorageBlobLogs
    | where TimeGenerated > ago(3d) and StatusText !contains "Success"
    | summarize count() by OperationName
    | top 10 by count_ desc

To list the top 10 operations with the longest end-to-end latency over the last three days:

    StorageBlobLogs
    | where TimeGenerated > ago(3d)
    | top 10 by DurationMs desc
    | project TimeGenerated, OperationName, DurationMs, ServerLatencyMs, ClientLatencyMs = DurationMs - ServerLatencyMs

To list all operations that caused server-side throttling errors over the last three days:

    StorageBlobLogs
    | where TimeGenerated > ago(3d) and StatusText contains "ServerBusy"
    | project TimeGenerated, OperationName, StatusCode, StatusText

To list all requests with anonymous access over the last three days:

    StorageBlobLogs
    | where TimeGenerated > ago(3d) and AuthenticationType == "Anonymous"
    | project TimeGenerated, OperationName, AuthenticationType, Uri

To create a pie chart of operations used over the last three days:

    StorageBlobLogs
    | where TimeGenerated > ago(3d)
    | summarize count() by OperationName
    | sort by count_ desc
    | render piechart
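If you've archived logs to a storage account instead of a workspace, you can approximate the first query offline by counting status texts across the line-delimited JSON entries. This Python sketch assumes a `statusText` field in the raw log JSON, which may differ from the StorageBlobLogs column name, so check your own log entries:

```python
import json
from collections import Counter

def top_errors(ndjson_lines, n=10):
    """Count non-success status texts in line-delimited JSON log entries,
    roughly mirroring the first Kusto query above. The 'statusText' field
    name is an assumption about the raw log schema."""
    counts = Counter(
        entry["statusText"]
        for entry in map(json.loads, ndjson_lines)
        if "Success" not in entry.get("statusText", "")
    )
    return counts.most_common(n)

# Illustrative log lines, not real Azure output:
sample_lines = [
    json.dumps({"statusText": "Success"}),
    json.dumps({"statusText": "AuthorizationFailure"}),
    json.dumps({"statusText": "AuthorizationFailure"}),
]
result = top_errors(sample_lines)
```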
Feature support
This table shows how this feature is supported in your account and the impact on support when you enable certain capabilities.
Logs in Azure Monitor
Storage account type | Blob Storage (default support) | Data Lake Storage Gen2 1 | NFS 3.0 1 | SFTP 1 |
---|---|---|---|---|
Standard general-purpose v2 | ✓ | ✓ | ✓ | ✓ |
Premium block blobs | ✓ | ✓ | ✓ | ✓ |
Metrics in Azure Monitor
Storage account type | Blob Storage (default support) | Data Lake Storage Gen2 1 | NFS 3.0 1 | SFTP 1 |
---|---|---|---|---|
Standard general-purpose v2 | ✓ | ✓ | ✓ | ✓ |
Premium block blobs | ✓ | ✓ | ✓ | ✓ |
1 Data Lake Storage Gen2, Network File System (NFS) 3.0 protocol, and SSH File Transfer Protocol (SFTP) support all require a storage account with a hierarchical namespace enabled.
2 Feature is supported at the preview level.
FAQ
Does Azure Storage support metrics for Managed Disks or Unmanaged Disks?
No. Azure Compute supports metrics on disks. For more information, see Per disk metrics for Managed and Unmanaged Disks.
Next steps
Get started with any of these guides.
Guide | Description |
---|---|
Gather metrics from your Azure Blob Storage containers | Create charts that show metrics (contains step-by-step guidance). |
Monitor, diagnose, and troubleshoot your Azure Storage | Troubleshoot storage account issues (contains step-by-step guidance). |
Monitor storage with Azure Monitor Storage insights | A unified view of storage performance, capacity, and availability. |
Best practices for monitoring Azure Blob Storage | Guidance for common monitoring and troubleshooting scenarios. |
Getting started with Azure Metrics Explorer | A tour of Metrics Explorer. |
Overview of Log Analytics in Azure Monitor | A tour of Log Analytics. |
Azure Monitor Metrics overview | The basics of metrics and metric dimensions. |
Azure Monitor Logs overview | The basics of logs and how to collect and analyze them. |
Transition to metrics in Azure Monitor | Move from Storage Analytics metrics to metrics in Azure Monitor. |
Azure Blob Storage monitoring data reference | A reference of the logs and metrics created by Azure Blob Storage. |
Troubleshoot performance issues | Common performance issues and guidance about how to troubleshoot them. |
Troubleshoot availability issues | Common availability issues and guidance about how to troubleshoot them. |
Troubleshoot client application errors | Common issues with connecting clients and how to troubleshoot them. |