
Upload logs to Azure Monitor

Periodically, you can export logs and then upload them to Azure. Exporting and uploading logs also creates and updates the data controller, SQL managed instance, and PostgreSQL Hyperscale server group resources in Azure.

Note

During the preview period, there is no cost for using Azure Arc enabled data services.

Note

As a preview feature, the technology presented in this article is subject to the Supplemental Terms of Use for Microsoft Azure Previews.

The latest updates are available in the release notes.

For the current update, deployment of the Arc data controller in direct connectivity mode is supported only from the Azure portal.

For additional limitations and more details, check the known issues.

Before you begin

Before you can upload logs, you need to:

  1. Create a Log Analytics workspace
  2. Assign the ID and shared key to environment variables

Create a Log Analytics workspace

To create a Log Analytics workspace, run the following command, and then set the access information into environment variables.

Note

Skip this step if you already have a workspace.

az monitor log-analytics workspace create --resource-group <resource group name> --workspace-name <some name you choose>
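
For example, an invocation that matches the sample output below might look like this (the resource group and workspace names are purely illustrative):

az monitor log-analytics workspace create --resource-group user-arc-demo --workspace-name user-logworkspace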

Example output:

{
  "customerId": "d6abb435-2626-4df1-b887-445fe44a4123",
  "eTag": null,
  "id": "/subscriptions/<Subscription ID>/resourcegroups/user-arc-demo/providers/microsoft.operationalinsights/workspaces/user-logworkspace",
  "location": "eastus",
  "name": "user-logworkspace",
  "portalUrl": null,
  "provisioningState": "Succeeded",
  "resourceGroup": "user-arc-demo",
  "retentionInDays": 30,
  "sku": {
    "lastSkuUpdate": "Thu, 30 Jul 2020 22:37:53 GMT",
    "maxCapacityReservationLevel": 3000,
    "name": "pergb2018"
  },
  "source": "Azure",
  "tags": null,
  "type": "Microsoft.OperationalInsights/workspaces"
}

Assign the ID and shared key to environment variables

Save the Log Analytics workspace customerId as an environment variable to be used later. The three commands below are for the Windows command prompt, PowerShell, and macOS/Linux bash, respectively:

SET WORKSPACE_ID=<customerId>
$Env:WORKSPACE_ID='<customerId>'
export WORKSPACE_ID='<customerId>'
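
Alternatively, rather than copying the customerId out of the JSON output by hand, you can query it directly with the Azure CLI. A minimal bash sketch, using the same placeholder resource group and workspace names as above:

# Queries the workspace and stores its customerId in WORKSPACE_ID.
export WORKSPACE_ID=$(az monitor log-analytics workspace show --resource-group <resource group name> --workspace-name <workspace name> --query customerId -o tsv)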

This command returns the access keys required to connect to your Log Analytics workspace:

az monitor log-analytics workspace get-shared-keys --resource-group MyResourceGroup --workspace-name MyLogsWorkpace

Example output:

{
  "primarySharedKey": "JXzQp1RcGgjXFCDS3v0sXoxPvbgCoGaIv35lf11Km2WbdGFvLXqaydpaj1ByWGvKoCghL8hL4BRoypXxkLr123==",
  "secondarySharedKey": "p2XHSxLJ4o9IAqm2zINcEmx0UWU5Z5EZz8PQC0OHpFjdpuVaI0zsPbTv5VyPFgaCUlCZb2yEbkiR4eTuTSF123=="
}

Save the primary key in an environment variable to be used later (again for the command prompt, PowerShell, and bash, respectively):

SET WORKSPACE_SHARED_KEY=<primarySharedKey>
$Env:WORKSPACE_SHARED_KEY='<primarySharedKey>'
export WORKSPACE_SHARED_KEY='<primarySharedKey>'
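
As with the workspace ID, the key can be captured without copying it by hand. A minimal bash sketch, assuming the same placeholder resource group and workspace names:

# Retrieves the primary shared key and stores it in WORKSPACE_SHARED_KEY.
export WORKSPACE_SHARED_KEY=$(az monitor log-analytics workspace get-shared-keys --resource-group <resource group name> --workspace-name <workspace name> --query primarySharedKey -o tsv)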

Set final environment variables and confirm

Set the SPN authority URL in an environment variable (command prompt, PowerShell, and bash, respectively):

SET SPN_AUTHORITY=https://login.microsoftonline.com
$Env:SPN_AUTHORITY='https://login.microsoftonline.com'
export SPN_AUTHORITY='https://login.microsoftonline.com'
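
The verification step below also checks the service principal variables (SPN_TENANT_ID, SPN_CLIENT_ID, SPN_CLIENT_SECRET), which this article assumes you have already set for uploading data. If they are not set yet, a bash sketch with placeholder values looks like this:

# Placeholders: substitute the values from the service principal you created for uploads.
export SPN_CLIENT_ID='<service principal appId>'
export SPN_TENANT_ID='<service principal tenant ID>'
export SPN_CLIENT_SECRET='<service principal password>'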

Verify environment variables

If you want, check to make sure that all the required environment variables are set (the three blocks below are for the Windows command prompt, PowerShell, and bash, respectively):

echo %WORKSPACE_ID%
echo %WORKSPACE_SHARED_KEY%
echo %SPN_TENANT_ID%
echo %SPN_CLIENT_ID%
echo %SPN_CLIENT_SECRET%
echo %SPN_AUTHORITY%
$Env:WORKSPACE_ID
$Env:WORKSPACE_SHARED_KEY
$Env:SPN_TENANT_ID
$Env:SPN_CLIENT_ID
$Env:SPN_CLIENT_SECRET
$Env:SPN_AUTHORITY
echo $WORKSPACE_ID
echo $WORKSPACE_SHARED_KEY
echo $SPN_TENANT_ID
echo $SPN_CLIENT_ID
echo $SPN_CLIENT_SECRET
echo $SPN_AUTHORITY
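
A small bash sketch that fails fast if any of the required variables is missing (variable names as used above):

# Prints the name of any required variable that is empty or unset.
for v in WORKSPACE_ID WORKSPACE_SHARED_KEY SPN_TENANT_ID SPN_CLIENT_ID SPN_CLIENT_SECRET SPN_AUTHORITY; do
  [ -z "${!v}" ] && echo "Missing: $v"
done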

With the environment variables set, you can upload logs to the log workspace.

Upload logs to Azure Monitor

To upload logs for your Azure Arc enabled SQL managed instances and Azure Arc enabled PostgreSQL Hyperscale server groups, run the following CLI commands:

  1. Log in to the Azure Arc data controller with the Azure Data CLI (azdata).

    azdata login
    

     Follow the prompts to set the namespace, the administrator username, and the password.

  2. Export all logs to the specified file:

    azdata arc dc export --type logs --path logs.json
    
  3. Upload logs to an Azure Monitor Log Analytics workspace:

    azdata arc dc upload --path logs.json
    

View your logs in the Azure portal

Once your logs are uploaded, you should be able to query them using the log query explorer as follows:

  1. Open the Azure portal, search for your workspace by name in the search bar at the top, and then select it.
  2. Select Logs in the left panel.
  3. Select Get Started (or select the links on the Getting Started page to learn more about Log Analytics if you are new to it).
  4. Follow the tutorial to learn more about Log Analytics if this is your first time using it.
  5. Expand Custom Logs at the bottom of the list of tables; you will see a table called 'sql_instance_logs_CL'.
  6. Select the 'eye' icon next to the table name.
  7. Select the 'View in query editor' button.
  8. The query editor now contains a query that shows the most recent 10 events in the log.
  9. From here, you can experiment with querying the logs using the query editor, set alerts, and so on (a command-line equivalent is sketched after this list).
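
If you prefer the command line, roughly the same query can be run with the Azure CLI. A minimal sketch, assuming your CLI version includes the az monitor log-analytics query command (in some versions it ships as the log-analytics extension) and that WORKSPACE_ID is still set from earlier:

# Returns the 10 most recent rows from the custom log table.
az monitor log-analytics query --workspace $WORKSPACE_ID --analytics-query "sql_instance_logs_CL | take 10"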

Automating uploads (optional)

If you want to upload metrics and logs on a scheduled basis, you can create a script and run it on a timer every few minutes. Below is an example of automating the uploads using a Linux shell script.

In your favorite text/code editor, add the following script to a file, and then save it as an executable script file such as .sh (Linux/macOS) or .cmd, .bat, or .ps1 (Windows).

azdata arc dc export --type logs --path logs.json --force
azdata arc dc upload --path logs.json

Make the script file executable

chmod +x myuploadscript.sh

Run the script every 20 minutes:

watch -n 1200 ./myuploadscript.sh

You could also use a job scheduler like cron or Windows Task Scheduler, or an orchestrator like Ansible, Puppet, or Chef.
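
For example, a minimal bash sketch that registers a crontab entry to run the script every 20 minutes (the script path is a placeholder):

# Appends a crontab entry; adjust the path to wherever you saved the script.
(crontab -l 2>/dev/null; echo "*/20 * * * * /path/to/myuploadscript.sh") | crontab -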

Next steps

Upload metrics and logs to Azure Monitor

Upload usage data, metrics, and logs to Azure Monitor

Upload billing data to Azure and view it in the Azure portal

View the Azure Arc data controller resource in the Azure portal