
Azure Storage analytics logging

Storage Analytics logs detailed information about successful and failed requests to a storage service. This information can be used to monitor individual requests and to diagnose issues with a storage service. Requests are logged on a best-effort basis.

Storage Analytics logging is not enabled by default for your storage account. You can enable it in the Azure portal; for details, see Monitor a storage account in the Azure portal. You can also enable Storage Analytics programmatically via the REST API or the client library. Use the Set Blob Service Properties, Set Queue Service Properties, and Set Table Service Properties operations to enable Storage Analytics for each service.

Log entries are created only if there are requests made against the service endpoint. For example, if a storage account has activity in its Blob endpoint but not in its Table or Queue endpoints, only logs pertaining to the Blob service will be created.

Note

Storage Analytics logging is currently available only for the Blob, Queue, and Table services. However, premium storage accounts are not supported.

Note

The features described in this article are now available to accounts that have a hierarchical namespace. To review limitations, see the Known issues with Azure Data Lake Storage Gen2 article.

Requests logged

Logging authenticated requests

The following types of authenticated requests are logged:

  • Successful requests

  • Failed requests, including timeout, throttling, network, authorization, and other errors

  • Requests using a shared access signature (SAS) or OAuth, including failed and successful requests

  • Requests to analytics data

    Requests made by Storage Analytics itself, such as log creation or deletion, are not logged. A full list of the logged data is documented in the Storage Analytics Logged Operations and Status Messages and Storage Analytics Log Format topics.

Logging anonymous requests

The following types of anonymous requests are logged:

  • Successful requests

  • Server errors

  • Timeout errors for both client and server

  • Failed GET requests with error code 304 (Not Modified)

How logs are stored

All logs are stored in block blobs in a container named $logs, which is automatically created when Storage Analytics is enabled for a storage account. The $logs container is located in the blob namespace of the storage account, for example: http://<accountname>.blob.core.windows.net/$logs. This container cannot be deleted once Storage Analytics has been enabled, though its contents can be deleted. If you use your storage-browsing tool to navigate to the container directly, you will see all the blobs that contain your logging data.

Note

The $logs container is not displayed when a container listing operation is performed, such as the List Containers operation. It must be accessed directly. For example, you can use the List Blobs operation to access the blobs in the $logs container.

As requests are logged, Storage Analytics will upload intermediate results as blocks. Periodically, Storage Analytics will commit these blocks and make them available as a blob. It can take up to an hour for log data to appear in the blobs in the $logs container because of the frequency at which the storage service flushes the log writers. Duplicate records may exist for logs created in the same hour. You can determine if a record is a duplicate by checking the RequestId and Operation number.
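The duplicate check described above can be sketched as follows. This is an illustrative example only: the dictionary key names (`request_id`, `operation_number`) are placeholders, not the official log field names, and parsing the log entries into dictionaries is assumed to have happened elsewhere.

```python
def deduplicate(entries):
    """Drop log entries that share the same (RequestId, operation number) pair.

    `entries` is an iterable of dicts; the key names used here are
    illustrative, not the official Storage Analytics field names.
    """
    seen = set()
    unique = []
    for entry in entries:
        key = (entry["request_id"], entry["operation_number"])
        if key not in seen:
            seen.add(key)
            unique.append(entry)
    return unique

entries = [
    {"request_id": "a1", "operation_number": 0, "op": "GetBlob"},
    {"request_id": "a1", "operation_number": 0, "op": "GetBlob"},  # duplicate
    {"request_id": "b2", "operation_number": 0, "op": "PutBlob"},
]
print(len(deduplicate(entries)))  # 2
```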

If you have a high volume of log data, with multiple files for each hour, you can examine the blob metadata fields to determine what data a log contains. This is also useful because there can sometimes be a delay while data is written to the log files: the blob metadata gives a more accurate indication of the blob content than the blob name.

Most storage-browsing tools enable you to view the metadata of blobs; you can also read this information using PowerShell or programmatically. The following PowerShell snippet is an example of filtering the list of log blobs by name to specify a time, and by metadata to identify just those logs that contain write operations.

Get-AzureStorageBlob -Container '$logs' |
Where-Object {
    $_.Name -match 'table/2014/05/21/05' -and
    $_.ICloudBlob.Metadata.LogType -match 'write'
} |
ForEach-Object {
    "{0}  {1}  {2}  {3}" -f $_.Name,
    $_.ICloudBlob.Metadata.StartTime,
    $_.ICloudBlob.Metadata.EndTime,
    $_.ICloudBlob.Metadata.LogType
}

For information about listing blobs programmatically, see Enumerating Blob Resources and Setting and Retrieving Properties and Metadata for Blob Resources.

Log naming conventions

Each log will be written in the following format:

<service-name>/YYYY/MM/DD/hhmm/<counter>.log

The following table describes each attribute in the log name:

Attribute Description
<service-name> The name of the storage service. For example: blob, table, or queue
YYYY The four-digit year for the log. For example: 2011
MM The two-digit month for the log. For example: 07
DD The two-digit day for the log. For example: 31
hh The two-digit hour that indicates the starting hour for the logs, in 24-hour UTC format. For example: 18
mm The two-digit number that indicates the starting minute for the logs. Note: This value is unsupported in the current version of Storage Analytics, and its value will always be 00.
<counter> A zero-based counter with six digits that indicates the number of log blobs generated for the storage service in an hour time period. This counter starts at 000000. For example: 000001

The following is a complete sample log name that combines the above examples:

blob/2011/07/31/1800/000001.log

The following is a sample URI that can be used to access the above log:

https://<accountname>.blob.core.windows.net/$logs/blob/2011/07/31/1800/000001.log

When a storage request is logged, the resulting log name correlates to the hour when the requested operation completed. For example, if a GetBlob request was completed at 6:30 PM on 7/31/2011, the log would be written with the following prefix: blob/2011/07/31/1800/
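The naming convention above is regular enough to parse mechanically. As a minimal sketch (the helper name `parse_log_name` is our own, not part of any Azure SDK), a regular expression can split a log blob name into its attributes:

```python
import re

# Pattern mirrors <service-name>/YYYY/MM/DD/hhmm/<counter>.log
LOG_NAME = re.compile(
    r"^(?P<service>blob|table|queue)/"
    r"(?P<year>\d{4})/(?P<month>\d{2})/(?P<day>\d{2})/"
    r"(?P<hour>\d{2})(?P<minute>\d{2})/(?P<counter>\d{6})\.log$"
)

def parse_log_name(name):
    """Split a Storage Analytics log blob name into its attributes."""
    m = LOG_NAME.match(name)
    if m is None:
        raise ValueError(f"not a Storage Analytics log name: {name}")
    return m.groupdict()

parts = parse_log_name("blob/2011/07/31/1800/000001.log")
print(parts["service"], parts["hour"], parts["counter"])  # blob 18 000001
```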

Log metadata

All log blobs are stored with metadata that can be used to identify what logging data the blob contains. The following table describes each metadata attribute:

Attribute Description
LogType Describes whether the log contains information pertaining to read, write, or delete operations. This value can include one type or a combination of all three, separated by commas.

Example 1: write

Example 2: read,write

Example 3: read,write,delete
StartTime The earliest time of an entry in the log, in the form YYYY-MM-DDThh:mm:ssZ. For example: 2011-07-31T18:21:46Z
EndTime The latest time of an entry in the log, in the form YYYY-MM-DDThh:mm:ssZ. For example: 2011-07-31T18:22:09Z
LogVersion The version of the log format.

The following list displays complete sample metadata using the above examples:

  • LogType=write
  • StartTime=2011-07-31T18:21:46Z
  • EndTime=2011-07-31T18:22:09Z
  • LogVersion=1.0
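Given metadata like the sample above, selecting the logs of interest reduces to two checks: does LogType include the operation you care about, and does the [StartTime, EndTime] range overlap your query window? A sketch (the function names are our own; fetching the metadata dictionaries from the blobs is assumed to have happened already):

```python
from datetime import datetime, timezone

def parse_ts(value):
    """Parse the YYYY-MM-DDThh:mm:ssZ timestamps used in log metadata."""
    return datetime.strptime(value, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)

def matches(metadata, wanted_type, window_start, window_end):
    """True if the blob's LogType includes wanted_type and its
    [StartTime, EndTime] range overlaps the query window."""
    types = metadata["LogType"].split(",")
    start = parse_ts(metadata["StartTime"])
    end = parse_ts(metadata["EndTime"])
    return wanted_type in types and start <= window_end and end >= window_start

meta = {
    "LogType": "write",
    "StartTime": "2011-07-31T18:21:46Z",
    "EndTime": "2011-07-31T18:22:09Z",
    "LogVersion": "1.0",
}
print(matches(meta, "write",
              parse_ts("2011-07-31T18:00:00Z"),
              parse_ts("2011-07-31T19:00:00Z")))  # True
```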

Enable Storage logging

You can enable Storage logging with the Azure portal, PowerShell, and the Storage SDKs.

Enable Storage logging using the Azure portal

In the Azure portal, use the Diagnostics settings (classic) blade to control Storage Logging, accessible from the Monitoring (classic) section of a storage account's menu blade.

You can specify the storage services that you want to log, and the retention period (in days) for the logged data.

Enable Storage logging using PowerShell

You can use PowerShell on your local machine to configure Storage Logging in your storage account: use the Azure PowerShell cmdlet Get-AzureStorageServiceLoggingProperty to retrieve the current settings, and the cmdlet Set-AzureStorageServiceLoggingProperty to change them.

The cmdlets that control Storage Logging use a LoggingOperations parameter that is a string containing a comma-separated list of request types to log. The three possible request types are read, write, and delete. To switch off logging, use the value none for the LoggingOperations parameter.

The following command switches on logging for read, write, and delete requests in the Queue service in your default storage account, with retention set to five days:

Set-AzureStorageServiceLoggingProperty -ServiceType Queue -LoggingOperations read,write,delete -RetentionDays 5  

The following command switches off logging for the Table service in your default storage account:

Set-AzureStorageServiceLoggingProperty -ServiceType Table -LoggingOperations none  

For information about how to configure the Azure PowerShell cmdlets to work with your Azure subscription and how to select the default storage account to use, see How to install and configure Azure PowerShell.

Enable Storage logging programmatically

In addition to using the Azure portal or the Azure PowerShell cmdlets to control Storage Logging, you can also use one of the Azure Storage APIs. For example, if you are using a .NET language, you can use the Storage Client Library.

CloudBlobClientCloudQueueClientCloudTableClient都有一些方法,如都使用 setservicepropertiessetservicepropertiesasync 等方法,它们采用ServiceProperties对象作为参数.The classes CloudBlobClient, CloudQueueClient, and CloudTableClient all have methods such as SetServiceProperties and SetServicePropertiesAsync that take a ServiceProperties object as a parameter. 可以使用ServiceProperties对象配置存储日志记录。You can use the ServiceProperties object to configure Storage Logging. 例如,以下C#代码片段显示了如何更改记录的内容和队列日志记录的保留期:For example, the following C# snippet shows how to change what is logged and the retention period for queue logging:

var storageAccount = CloudStorageAccount.Parse(connStr);  
var queueClient = storageAccount.CreateCloudQueueClient();  
var serviceProperties = queueClient.GetServiceProperties();  

serviceProperties.Logging.LoggingOperations = LoggingOperations.All;  
serviceProperties.Logging.RetentionDays = 2;  

queueClient.SetServiceProperties(serviceProperties);  

For more information about using a .NET language to configure Storage Logging, see the Storage Client Library Reference.

For general information about configuring Storage Logging using the REST API, see Enabling and Configuring Storage Analytics.

Download Storage logging log data

To view and analyze your log data, you should download the blobs that contain the log data you are interested in to a local machine. Many storage-browsing tools enable you to download blobs from your storage account; you can also use AzCopy, the command-line copy tool provided by the Azure Storage team, to download your log data.

To make sure you download the log data you are interested in, and to avoid downloading the same log data more than once:

  • Use the date and time naming convention for blobs containing log data to track which blobs you have already downloaded for analysis, so that you avoid re-downloading the same data.

  • Use the metadata on the blobs containing log data to identify the specific period for which a blob holds log data, and so identify the exact blob you need to download.
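The first point above amounts to simple bookkeeping: keep a local record of the log blob names you have already fetched and skip them on the next run. A minimal sketch, assuming a local JSON state file (the file name and function names are our own, not part of any tool):

```python
import json
from pathlib import Path

# Local bookkeeping file; the name is arbitrary.
STATE_FILE = Path("downloaded_logs.json")

def load_downloaded():
    """Return the set of log blob names already downloaded."""
    if STATE_FILE.exists():
        return set(json.loads(STATE_FILE.read_text()))
    return set()

def mark_downloaded(blob_names):
    """Record blob names as downloaded so later runs skip them."""
    done = load_downloaded() | set(blob_names)
    STATE_FILE.write_text(json.dumps(sorted(done)))

def new_blobs(candidate_names):
    """Return only the log blobs that have not been fetched yet."""
    done = load_downloaded()
    return [n for n in candidate_names if n not in done]
```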

To get started with AzCopy, see Get started with AzCopy.

The following example shows how you can download the log data for the Queue service for the hours starting at 09 AM, 10 AM, and 11 AM on 20 May 2014.

azcopy copy 'https://mystorageaccount.blob.core.windows.net/$logs/queue' 'C:\Logs\Storage' --include-path '2014/05/20/09;2014/05/20/10;2014/05/20/11' --recursive

To learn more about how to download specific files, see Download specific files.

When you have downloaded your log data, you can view the log entries in the files. These log files use a delimited text format that many log reading tools are able to parse, including Microsoft Message Analyzer (for more information, see the guide Monitoring, Diagnosing, and Troubleshooting Microsoft Azure Storage). Different tools have different facilities for formatting, filtering, sorting, and searching the contents of your log files. For more information about the Storage Logging log file format and content, see Storage Analytics Log Format and Storage Analytics Logged Operations and Status Messages.
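If you prefer to parse the files yourself, a sketch using Python's csv module follows. It assumes the semicolon-delimited layout described in the Storage Analytics Log Format reference; the sample entry below is fabricated and truncated, since real entries contain many more fields.

```python
import csv
import io

def parse_log_line(line):
    """Split one Storage Analytics log entry into its fields.

    Entries are assumed to be semicolon-delimited; csv handles
    quoted fields that themselves contain semicolons.
    """
    return next(csv.reader(io.StringIO(line), delimiter=";"))

# Fabricated, truncated sample: version, request start time,
# operation type, request status, HTTP status code.
sample = "1.0;2011-07-31T18:21:46.0000000Z;GetBlob;Success;200"
fields = parse_log_line(sample)
print(fields[2], fields[3], fields[4])  # GetBlob Success 200
```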

Next steps