Monitoring Azure Files

When you have critical applications and business processes that rely on Azure resources, you want to monitor those resources for their availability, performance, and operation. This article describes the monitoring data that's generated by Azure Files and how you can use the features of Azure Monitor to analyze and alert on this data.

Monitor overview

The Overview page in the Azure portal for each Azure Files resource includes a brief view of the resource usage, such as requests and hourly billing. This information is useful, but only a small amount of the monitoring data is available. Some of this data is collected automatically and is available for analysis as soon as you create the resource. You can enable additional types of data collection with some configuration.

What is Azure Monitor?

Azure Files creates monitoring data by using Azure Monitor, which is a full-stack monitoring service in Azure. Azure Monitor provides a complete set of features to monitor your Azure resources and resources in other clouds and on-premises.

Start with the article Monitoring Azure resources with Azure Monitor, which describes the following:

  • What is Azure Monitor?
  • Costs associated with monitoring
  • Monitoring data collected in Azure
  • Configuring data collection
  • Standard tools in Azure for analyzing and alerting on monitoring data

The following sections build on this article by describing the specific data gathered from Azure Files. Examples show how to configure data collection and analyze this data with Azure tools.

Monitoring data

Azure Files collects the same kinds of monitoring data as other Azure resources, which are described in Monitoring data from Azure resources.

See Azure Files monitoring data reference for detailed information on the metrics and logs created by Azure Files.

Metrics and logs in Azure Monitor support only Azure Resource Manager storage accounts. Azure Monitor doesn't support classic storage accounts. If you want to use metrics or logs on a classic storage account, you need to migrate to an Azure Resource Manager storage account. See Migrate to Azure Resource Manager.

Collection and routing

Platform metrics and the Activity log are collected automatically, but can be routed to other locations by using a diagnostic setting.

To collect resource logs, you must create a diagnostic setting. When you create the setting, choose file as the type of storage that you want to enable logs for. Then, specify any of the following categories of operations for which you want to collect logs:

  • StorageRead: Read operations on objects.
  • StorageWrite: Write operations on objects.
  • StorageDelete: Delete operations on objects.

To get the list of SMB and REST operations that are logged, see Storage logged operations and status messages and Azure Files monitoring data reference.

Creating a diagnostic setting

You can create a diagnostic setting by using the Azure portal, PowerShell, the Azure CLI, or an Azure Resource Manager template.

Note

Azure Storage logs in Azure Monitor is in public preview and is available for preview testing in all public cloud regions. This preview enables logs for blobs (which includes Azure Data Lake Storage Gen2), files, queues, and tables. This feature is available for all storage accounts that are created with the Azure Resource Manager deployment model. See Storage account overview.

For general guidance, see Create diagnostic setting to collect platform logs and metrics in Azure.

  1. Sign in to the Azure portal.

  2. Navigate to your storage account.

  3. In the Monitoring section, click Diagnostic settings (preview).

  4. Choose file as the type of storage that you want to enable logs for.

  5. Click Add diagnostic setting.

    The Diagnostic settings page appears.

  6. In the Name field of the page, enter a name for this Resource log setting. Then, select which operations you want logged (read, write, and delete operations), and where you want the logs to be sent.

Archive logs to a storage account

If you choose to archive your logs to a storage account, you'll pay for the volume of logs that are sent to the storage account. For specific pricing, see the Platform Logs section of the Azure Monitor pricing page.

  1. Select the Archive to a storage account checkbox, and then click the Configure button.

  2. In the Storage account drop-down list, select the storage account that you want to archive your logs to, click the OK button, and then click the Save button.

    Important

    You can't set a retention policy. However, you can manage the retention policy of a log container by defining a lifecycle management policy. To learn how, see Optimize costs by automating Azure Blob Storage access tiers.

    Note

    Before you choose a storage account as the export destination, see Archive Azure resource logs to understand prerequisites on the storage account.

Stream logs to Azure Event Hubs

If you choose to stream your logs to an event hub, you'll pay for the volume of logs that are sent to the event hub. For specific pricing, see the Platform Logs section of the Azure Monitor pricing page.

  1. Select the Stream to an event hub checkbox, and then click the Configure button.

  2. In the Select an event hub pane, choose the namespace, name, and policy name of the event hub that you want to stream your logs to.

  3. Click the OK button, and then click the Save button.

Send logs to Azure Log Analytics

  1. Select the Send to Log Analytics checkbox, select a Log Analytics workspace, and then click the Save button.
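
If you'd rather script this configuration than click through the portal, the diagnostic setting can also be created with the Microsoft.Azure.Management.Monitor .NET SDK (the same SDK used for the metric samples later in this article). The following is a minimal, illustrative sketch rather than the only supported approach; the method name, setting name, and resource IDs are placeholders, it assumes an already-authenticated MonitorManagementClient, and it sends all three log categories to a Log Analytics workspace:

    // Assumes: using System.Collections.Generic; using System.Threading.Tasks;
    //          using Microsoft.Azure.Management.Monitor;
    //          using Microsoft.Azure.Management.Monitor.Models;
    public static async Task CreateFileDiagnosticSetting(MonitorManagementClient client)
    {
        // Scope the setting to the file service of the storage account.
        var resourceId = "/subscriptions/<subscription-ID>/resourceGroups/<resource-group>" +
            "/providers/Microsoft.Storage/storageAccounts/<account-name>/fileServices/default";

        var setting = new DiagnosticSettingsResource
        {
            // Destination: a Log Analytics workspace. You could instead set StorageAccountId
            // (archive) or EventHubAuthorizationRuleId and EventHubName (streaming).
            WorkspaceId = "/subscriptions/<subscription-ID>/resourceGroups/<resource-group>" +
                "/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>",

            // Collect the read, write, and delete log categories described earlier.
            Logs = new List<LogSettings>
            {
                new LogSettings { Category = "StorageRead", Enabled = true },
                new LogSettings { Category = "StorageWrite", Enabled = true },
                new LogSettings { Category = "StorageDelete", Enabled = true }
            }
        };

        await client.DiagnosticSettings.CreateOrUpdateAsync(resourceId, setting, "files-logging");
    }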

Analyzing metrics

You can analyze metrics for Azure Storage with metrics from other Azure services by using Metrics Explorer. Open Metrics Explorer by choosing Metrics from the Azure Monitor menu. For details on using this tool, see Getting started with Azure Metrics Explorer.

For metrics that support dimensions, you can filter the metric with the desired dimension value. For a complete list of the dimensions that Azure Storage supports, see Metrics dimensions. Metrics for Azure Files are in these namespaces:

  • Microsoft.Storage/storageAccounts
  • Microsoft.Storage/storageAccounts/fileServices

For a list of all Azure Monitor supported metrics, which includes Azure Files, see Azure Monitor supported metrics.

Accessing metrics

Azure Monitor provides the .NET SDK to read metric definitions and values. The sample code shows how to use the SDK with different parameters. You need to use version 0.18.0-preview or later of the SDK for storage metrics.

In these examples, replace the <resource-ID> placeholder with the resource ID of the entire storage account or the Azure Files service. You can find these resource IDs on the Properties pages of your storage account in the Azure portal.

Replace the <subscription-ID> variable with the ID of your subscription. For guidance on how to obtain values for <tenant-ID>, <application-ID>, and <AccessKey>, see Use the portal to create an Azure AD application and service principal that can access resources.
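
The samples below call a helper named AuthenticateWithReadOnlyClient that this article doesn't define. A minimal sketch of such a helper, assuming a service principal client secret and the Microsoft.Azure.Management.Monitor and Microsoft.Rest.ClientRuntime.Azure.Authentication packages, might look like the following:

    public static async Task<MonitorManagementClient> AuthenticateWithReadOnlyClient(
        string tenantId, string applicationId, string accessKey, string subscriptionId)
    {
        // Sign in as the Azure AD application (service principal) with its client secret.
        Microsoft.Rest.ServiceClientCredentials credentials =
            await Microsoft.Rest.Azure.Authentication.ApplicationTokenProvider.LoginSilentAsync(
                tenantId, applicationId, accessKey);

        // Return a Monitor client scoped to the subscription that contains the storage account.
        return new MonitorManagementClient(credentials) { SubscriptionId = subscriptionId };
    }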

List the account-level metric definition

The following example shows how to list a metric definition at the account level:

    public static async Task ListStorageMetricDefinition()
    {
        var resourceId = "<resource-ID>";
        var subscriptionId = "<subscription-ID>";
        var tenantId = "<tenant-ID>";
        var applicationId = "<application-ID>";
        var accessKey = "<AccessKey>";


        MonitorManagementClient readOnlyClient = AuthenticateWithReadOnlyClient(tenantId, applicationId, accessKey, subscriptionId).Result;
        IEnumerable<MetricDefinition> metricDefinitions = await readOnlyClient.MetricDefinitions.ListAsync(resourceUri: resourceId, cancellationToken: new CancellationToken());

        foreach (var metricDefinition in metricDefinitions)
        {
            // Enumerate metric definition:
            //    Id
            //    ResourceId
            //    Name
            //    Unit
            //    MetricAvailabilities
            //    PrimaryAggregationType
            //    Dimensions
            //    IsDimensionRequired
        }
    }
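
Inside that loop you can read the commented properties directly from each definition. For example, a minimal way to print each metric's name and unit:

    foreach (var metricDefinition in metricDefinitions)
    {
        // Name is a LocalizableString; its Value property holds the metric name.
        Console.WriteLine($"{metricDefinition.Name.Value} ({metricDefinition.Unit})");
    }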

Reading account-level metric values

The following example shows how to read UsedCapacity data at the account level:

    public static async Task ReadStorageMetricValue()
    {
        var resourceId = "<resource-ID>";
        var subscriptionId = "<subscription-ID>";
        var tenantId = "<tenant-ID>";
        var applicationId = "<application-ID>";
        var accessKey = "<AccessKey>";

        MonitorManagementClient readOnlyClient = AuthenticateWithReadOnlyClient(tenantId, applicationId, accessKey, subscriptionId).Result;

        Microsoft.Azure.Management.Monitor.Models.Response Response;

        string startDate = DateTime.Now.AddHours(-3).ToUniversalTime().ToString("o");
        string endDate = DateTime.Now.ToUniversalTime().ToString("o");
        string timeSpan = startDate + "/" + endDate;

        Response = await readOnlyClient.Metrics.ListAsync(
            resourceUri: resourceId,
            timespan: timeSpan,
            interval: System.TimeSpan.FromHours(1),
            metricnames: "UsedCapacity",

            aggregation: "Average",
            resultType: ResultType.Data,
            cancellationToken: CancellationToken.None);

        foreach (var metric in Response.Value)
        {
            // Enumerate metric value
            //    Id
            //    Name
            //    Type
            //    Unit
            //    Timeseries
            //        - Data
            //        - Metadatavalues
        }
    }
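
Inside that loop, each metric contains one or more time series, and each series carries the hourly data points requested above. A minimal sketch that prints the averaged values:

    foreach (var metric in Response.Value)
    {
        foreach (var timeSeries in metric.Timeseries)
        {
            foreach (var dataPoint in timeSeries.Data)
            {
                // Because the request asked for the "Average" aggregation,
                // the Average property of each data point is populated.
                Console.WriteLine($"{metric.Name.Value} at {dataPoint.TimeStamp:o}: {dataPoint.Average}");
            }
        }
    }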

Reading multidimensional metric values

For multidimensional metrics, you need to define metadata filters if you want to read metric data on specific dimension values.

The following example shows how to read metric data for a metric that supports multiple dimensions:

    public static async Task ReadStorageMetricValueTest()
    {
        // Resource ID for Azure Files
        var resourceId = "/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Storage/storageAccounts/{storageAccountName}/fileServices/default";
        var subscriptionId = "<subscription-ID>";
        // How to identify Tenant ID, Application ID and Access Key: https://azure.microsoft.com/documentation/articles/resource-group-create-service-principal-portal/
        var tenantId = "<tenant-ID>";
        var applicationId = "<application-ID>";
        var accessKey = "<AccessKey>";

        MonitorManagementClient readOnlyClient = AuthenticateWithReadOnlyClient(tenantId, applicationId, accessKey, subscriptionId).Result;

        Microsoft.Azure.Management.Monitor.Models.Response Response;

        string startDate = DateTime.Now.AddHours(-3).ToUniversalTime().ToString("o");
        string endDate = DateTime.Now.ToUniversalTime().ToString("o");
        string timeSpan = startDate + "/" + endDate;
        // When a metric supports dimensions, you can define a metadata filter on a dimension value.
        // More conditions can be added with the 'or' and 'and' operators, for example:
        // ResponseType eq 'Success' or ResponseType eq 'ServerTimeoutError'
        ODataQuery<MetadataValue> odataFilterMetrics = new ODataQuery<MetadataValue>(
            string.Format("ResponseType eq '{0}'", "Success"));

        // Transactions is a multidimensional metric. It supports the ResponseType, GeoType,
        // and ApiName dimensions (and the FileShare dimension for premium file shares).
        Response = await readOnlyClient.Metrics.ListAsync(
                        resourceUri: resourceId,
                        timespan: timeSpan,
                        interval: System.TimeSpan.FromHours(1),
                        metricnames: "Transactions",
                        odataQuery: odataFilterMetrics,
                        aggregation: "Total",
                        resultType: ResultType.Data,
                        cancellationToken: CancellationToken.None);

        foreach (var metric in Response.Value)
        {
            // Enumerate metric value
            //    Id
            //    Name
            //    Type
            //    Unit
            //    Timeseries
            //        - Data
            //        - Metadatavalues
        }
    }

Analyzing logs

You can access resource logs as a blob in a storage account, as event data, or through Log Analytics queries.

To get the list of SMB and REST operations that are logged, see Storage logged operations and status messages and Azure Files monitoring data reference.

Note

Azure Storage logs in Azure Monitor is in public preview and is available for preview testing in all public cloud regions. This preview enables logs for blobs (which includes Azure Data Lake Storage Gen2), files, queues, tables, premium storage accounts in general-purpose v1, and general-purpose v2 storage accounts. Classic storage accounts aren't supported.

Log entries are created only if there are requests made against the service endpoint. For example, if a storage account has activity in its file endpoint but not in its table or queue endpoints, only logs that pertain to the Azure Files service are created. Azure Storage logs contain detailed information about successful and failed requests to a storage service. This information can be used to monitor individual requests and to diagnose issues with a storage service. Requests are logged on a best-effort basis.

Log authenticated requests

The following types of authenticated requests are logged:

  • Successful requests
  • Failed requests, including timeout, throttling, network, authorization, and other errors
  • Requests that use Kerberos, NTLM, or shared access signature (SAS), including failed and successful requests
  • Requests to analytics data (classic log data in the $logs container and classic metric data in the $metric tables)

Requests made by the Azure Files service itself, such as log creation or deletion, aren't logged. For a full list of the SMB and REST requests that are logged, see Storage logged operations and status messages and Azure Files monitoring data reference.

Accessing logs in a storage account

Logs appear as blobs stored in a container in the target storage account. Data is collected and stored inside a single blob as a line-delimited JSON payload. The name of the blob follows this naming convention:

https://<destination-storage-account>.blob.core.windows.net/insights-logs-<storage-operation>/resourceId=/subscriptions/<subscription-ID>/resourceGroups/<resource-group-name>/providers/Microsoft.Storage/storageAccounts/<source-storage-account>/fileServices/default/y=<year>/m=<month>/d=<day>/h=<hour>/m=<minute>/PT1H.json

Here's an example:

https://mylogstorageaccount.blob.core.windows.net/insights-logs-storagewrite/resourceId=/subscriptions/
208841be-a4v3-4234-9450-08b90c09f4/resourceGroups/myresourcegroup/providers/Microsoft.Storage/storageAccounts/mystorageaccount/fileServices/default/y=2019/m=07/d=30/h=23/m=12/PT1H.json
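
Because each hourly PT1H.json blob is a line-delimited JSON payload, you can read the entries with any blob client. The following is a small sketch using the Azure.Storage.Blobs library (an assumption about tooling, not the only option; the method name and parameter are placeholders). It lists the blobs in the insights-logs-storagewrite container of the destination account and prints each log line:

    // Assumes: using System; using System.Threading.Tasks; using Azure.Storage.Blobs;
    public static async Task ReadArchivedFileLogs(string destinationAccountConnectionString)
    {
        // Archived resource logs land in containers named insights-logs-<category>.
        var container = new BlobContainerClient(destinationAccountConnectionString, "insights-logs-storagewrite");

        await foreach (var blobItem in container.GetBlobsAsync())
        {
            // Each PT1H.json blob holds one hour of entries, one JSON object per line.
            var download = await container.GetBlobClient(blobItem.Name).DownloadContentAsync();
            foreach (var line in download.Value.Content.ToString()
                .Split(new[] { '\n' }, StringSplitOptions.RemoveEmptyEntries))
            {
                Console.WriteLine(line);
            }
        }
    }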

Accessing logs in an event hub

Logs sent to an event hub aren't stored as a file, but you can verify that the event hub received the log information. In the Azure portal, go to your event hub and verify that the incoming messages count is greater than zero.
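
Besides checking the incoming messages count in the portal, you can do a quick programmatic spot check by reading a few events with the Azure.Messaging.EventHubs library. The following sketch (an assumption about tooling; the method name is a placeholder) prints event bodies for up to 30 seconds and then stops:

    // Assumes: using System; using System.Threading; using System.Threading.Tasks;
    //          using Azure.Messaging.EventHubs.Consumer;
    public static async Task PeekStreamedLogs(string eventHubConnectionString, string eventHubName)
    {
        await using var consumer = new EventHubConsumerClient(
            EventHubConsumerClient.DefaultConsumerGroupName, eventHubConnectionString, eventHubName);

        // Read for 30 seconds only; this is a spot check, not a production consumer.
        using var cancellation = new CancellationTokenSource(TimeSpan.FromSeconds(30));
        try
        {
            await foreach (var partitionEvent in consumer.ReadEventsAsync(cancellation.Token))
            {
                // Each event body is a JSON payload that contains one or more log records.
                Console.WriteLine(partitionEvent.Data.EventBody.ToString());
            }
        }
        catch (OperationCanceledException)
        {
            // The 30-second window elapsed.
        }
    }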

Audit logs

You can access and read log data that's sent to your event hub by using security information and event management and monitoring tools. For more information, see What can I do with the monitoring data being sent to my event hub?.

Accessing logs in a Log Analytics workspace

You can access logs sent to a Log Analytics workspace by using Azure Monitor log queries. Data is stored in the StorageFileLogs table.

For more information, see Log Analytics tutorial.
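
You can also run workspace queries programmatically. The following is a small sketch using the Azure.Monitor.Query library (an assumption about tooling, not the only option; the method name is a placeholder). It counts SMB operations recorded in StorageFileLogs over the last seven days:

    // Assumes: using System; using System.Threading.Tasks;
    //          using Azure.Identity; using Azure.Monitor.Query;
    public static async Task CountSmbOperations(string workspaceId)
    {
        var client = new LogsQueryClient(new DefaultAzureCredential());

        string query = @"StorageFileLogs
    | where Protocol == ""SMB""
    | summarize count() by OperationName";

        // Restrict the query to the last seven days of data in the workspace.
        var response = await client.QueryWorkspaceAsync(
            workspaceId, query, new QueryTimeRange(TimeSpan.FromDays(7)));

        foreach (var row in response.Value.Table.Rows)
        {
            Console.WriteLine($"{row["OperationName"]}: {row["count_"]}");
        }
    }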

Sample Kusto queries

Here are some queries that you can enter in the Log search bar to help you monitor your Azure file shares. These queries are written in the Kusto query language (KQL).

Important

When you select Logs from the storage account resource group menu, Log Analytics is opened with the query scope set to the current resource group. This means that log queries will only include data from that resource group. If you want to run a query that includes data from other resources or data from other Azure services, select Logs from the Azure Monitor menu. See Log query scope and time range in Azure Monitor Log Analytics for details.

Use these queries to help you monitor your Azure file shares:

  • View SMB errors over the last week:

    StorageFileLogs
    | where Protocol == "SMB" and TimeGenerated >= ago(7d) and StatusCode contains "-"
    | sort by StatusCode

  • Create a pie chart of SMB operations over the last week:

    StorageFileLogs
    | where Protocol == "SMB" and TimeGenerated >= ago(7d)
    | summarize count() by OperationName
    | sort by count_ desc
    | render piechart

  • View REST errors over the last week:

    StorageFileLogs
    | where Protocol == "HTTPS" and TimeGenerated >= ago(7d) and StatusText !contains "Success"
    | sort by StatusText asc

  • Create a pie chart of REST operations over the last week:

    StorageFileLogs
    | where Protocol == "HTTPS" and TimeGenerated >= ago(7d)
    | summarize count() by OperationName
    | sort by count_ desc
    | render piechart

To view the list of column names and descriptions for Azure Files, see StorageFileLogs.

For more information on how to write queries, see Log Analytics tutorial.

Alerts

Azure Monitor alerts proactively notify you when important conditions are found in your monitoring data. They allow you to identify and address issues in your system before your customers notice them. You can set alerts on metrics, logs, and the activity log.

The following are some example scenarios to monitor, along with the proper metric to use for each alert:

  • File share is throttled: use the Transactions metric with the Response type dimension and, for premium file shares only, the FileShare dimension.
  • File share size is 80% of capacity: use the File Capacity metric and, for premium file shares only, the FileShare dimension.
  • File share egress has exceeded 500 GiB in one day: use the Egress metric and, for premium file shares only, the FileShare dimension.

How to create alerts for Azure Files

  1. Go to your storage account in the Azure portal.

  2. Click Alerts and then click + New alert rule.

  3. Click Edit resource, select the File resource type and then click Done.

  4. Click Add condition and provide the following information for the alert:

    • Metric
    • Dimension name
    • Alert logic
  5. Click Add action groups and add an action group (email, SMS, etc.) to the alert either by selecting an existing action group or creating a new action group.

  6. Fill in the Alert details like Alert rule name, Description, and Severity.

  7. Click Create alert rule to create the alert.

Note

If you create an alert and it's too noisy, adjust the threshold value and alert logic.

How to create an alert if a file share is throttled

  1. Go to your storage account in the Azure portal.

  2. In the Monitoring section, click Alerts, and then click + New alert rule.

  3. Click Edit resource, select the File resource type for the storage account and then click Done. For example, if the storage account name is contoso, select the contoso/file resource.

  4. Click Add condition to add a condition.

  5. From the list of signals supported for the storage account, select the Transactions metric.

  6. On the Configure signal logic blade, click the Dimension name drop-down and select Response type.

  7. Click the Dimension values drop-down and select the appropriate response types for your file share.

    For standard file shares, select the following response types:

    • SuccessWithShareIopsThrottling
    • SuccessWithThrottling
    • ClientShareIopsThrottlingError

    For premium file shares, select the following response types:

    • SuccessWithShareEgressThrottling
    • SuccessWithShareIngressThrottling
    • SuccessWithShareIopsThrottling
    • ClientShareEgressThrottlingError
    • ClientShareIngressThrottlingError
    • ClientShareIopsThrottlingError

    Note

    If the response types are not listed in the Dimension values drop-down, the resource has not been throttled. To add the dimension values, select Add custom value next to the Dimension values drop-down list, enter the response type (for example, SuccessWithThrottling), select OK, and then repeat these steps to add all applicable response types for your file share.

  8. For premium file shares, click the Dimension name drop-down and select File Share. For standard file shares, skip to step #10.

    Note

    If the file share is a standard file share, the File Share dimension will not list the file share(s) because per-share metrics are not available for standard file shares. Throttling alerts for standard file shares will be triggered if any file share within the storage account is throttled and the alert will not identify which file share was throttled. Since per-share metrics are not available for standard file shares, the recommendation is to have one file share per storage account.

  9. Click the Dimension values drop-down and select the file share(s) that you want to alert on.

  10. Define the alert parameters (threshold value, operator, aggregation granularity and frequency of evaluation) and click Done.

    Tip

    If you are using a static threshold, the metric chart can help determine a reasonable threshold value if the file share is currently being throttled. If you are using a dynamic threshold, the metric chart will display the calculated thresholds based on recent data.

  11. Click Add action groups to add an action group (email, SMS, etc.) to the alert either by selecting an existing action group or creating a new action group.

  12. Fill in the Alert details like Alert rule name, Description, and Severity.

  13. Click Create alert rule to create the alert.

How to create an alert if the Azure file share size is 80% of capacity

  1. Go to your storage account in the Azure portal.

  2. In the Monitoring section, click Alerts and then click + New alert rule.

  3. Click Edit resource, select the File resource type for the storage account and then click Done. For example, if the storage account name is contoso, select the contoso/file resource.

  4. Click Add condition to add a condition.

  5. From the list of signals supported for the storage account, select the File Capacity metric.

  6. For premium file shares, click the Dimension name drop-down and select File Share. For standard file shares, skip to step #8.

    Note

    If the file share is a standard file share, the File Share dimension will not list the file share(s) because per-share metrics are not available for standard file shares. Alerts for standard file shares are based on all file shares in the storage account. Since per-share metrics are not available for standard file shares, the recommendation is to have one file share per storage account.

  7. Click the Dimension values drop-down and select the file share(s) that you want to alert on.

  8. Enter the Threshold value in bytes. For example, if the file share size is 100 TiB and you want to receive an alert when the file share size is 80% of capacity, the threshold value in bytes is 87960930222080 (the snippet after these steps shows how to compute this value).

  9. Define the rest of the alert parameters (aggregation granularity and frequency of evaluation) and click Done.

  10. Click Add action groups to add an action group (email, SMS, etc.) to the alert either by selecting an existing action group or creating a new action group.

  11. Fill in the Alert details like Alert rule name, Description, and Severity.

  12. Click Create alert rule to create the alert.
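
The byte thresholds used in these procedures are just the target size converted to bytes. A quick sketch of the arithmetic behind the two values in this article (80% of a 100 TiB share, and the 500 GiB daily egress threshold used in the next procedure):

    // 1 GiB = 2^30 bytes; 1 TiB = 2^40 bytes.
    const long OneGiB = 1L << 30;
    const long OneTiB = 1L << 40;

    // 80% of a 100 TiB share = 87,960,930,222,080 bytes.
    long capacityThreshold = 100 * OneTiB * 80 / 100;

    // 500 GiB of egress per day = 536,870,912,000 bytes.
    long egressThreshold = 500 * OneGiB;

    Console.WriteLine($"{capacityThreshold} {egressThreshold}");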

How to create an alert if the Azure file share egress has exceeded 500 GiB in a day

  1. Go to your storage account in the Azure portal.

  2. In the Monitoring section, click Alerts and then click + New alert rule.

  3. Click Edit resource, select the File resource type for the storage account and then click Done. For example, if the storage account name is contoso, select the contoso/file resource.

  4. Click Add condition to add a condition.

  5. From the list of signals supported for the storage account, select the Egress metric.

  6. For premium file shares, click the Dimension name drop-down and select File Share. For standard file shares, skip to step #8.

    Note

    If the file share is a standard file share, the File Share dimension will not list the file share(s) because per-share metrics are not available for standard file shares. Alerts for standard file shares are based on all file shares in the storage account. Since per-share metrics are not available for standard file shares, the recommendation is to have one file share per storage account.

  7. Click the Dimension values drop-down and select the file share(s) that you want to alert on.

  8. Enter 536870912000 bytes for Threshold value.

  9. Click the Aggregation granularity drop-down and select 24 hours.

  10. Select the Frequency of evaluation and click Done.

  11. Click Add action groups to add an action group (email, SMS, etc.) to the alert either by selecting an existing action group or creating a new action group.

  12. Fill in the Alert details like Alert rule name, Description, and Severity.

  13. Click Create alert rule to create the alert.

Next steps