Collect text and IIS logs with Azure Monitor agent (preview)
This article describes how to configure the collection of file-based text logs, including logs generated by IIS on Windows computers, with the Azure Monitor agent. Many applications log information to text files instead of standard logging services such as Windows Event log or Syslog.
Prerequisites
To complete this procedure, you need the following:
- A Log Analytics workspace where you have at least contributor rights.
- Permissions to create Data Collection Rule objects in the workspace.
- An agent with a supported log file, as described in the next section.
- Azure Monitor collects entries from log files created by IIS, so you must configure IIS for logging. Azure Monitor supports only IIS log files stored in W3C format and doesn't support custom fields or IIS Advanced Logging. It doesn't collect logs in NCSA or IIS native format. (With the legacy Log Analytics agent, IIS log collection is configured from the Agent configuration menu; no configuration is required other than selecting Collect W3C format IIS log files.)
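To check the W3C requirement before you start, you can query the IIS configuration from PowerShell. The following is a minimal sketch, assuming IIS and its WebAdministration module are installed on the agent computer:

```powershell
# Requires the WebAdministration module that ships with IIS; run from an elevated prompt.
Import-Module WebAdministration

# Read the log format configured for site defaults. The expected value is "W3C".
Get-WebConfigurationProperty -Filter "system.applicationHost/sites/siteDefaults/logFile" -Name "logFormat"

# The format can be overridden per site, so check each site and its log directory too.
Get-Website | Select-Object Name,
    @{ n = 'LogFormat';    e = { $_.logFile.logFormat } },
    @{ n = 'LogDirectory'; e = { $_.logFile.directory } }
```

Any site that reports a format other than W3C won't have its logs collected.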
Log files supported
IIS logs must be in W3C format. Other log files must meet the following criteria to be collected:
- The log file must be stored on a local drive of a virtual machine, virtual machine scale set, or Arc-enabled server with the Azure Monitor agent installed.
- Each entry in the log file must be delineated with an ISO 8601 formatted time stamp or an end of line.
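For example, each line of the following hypothetical log file would be collected as a separate record, because each entry starts with an ISO 8601 formatted timestamp:

```
2021-12-01T14:05:01Z INFO  Application started
2021-12-01T14:05:03Z ERROR Connection to database failed
2021-12-01T14:05:04Z WARN  Retrying connection
```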
- The log file must not use circular logging, log rotation where the file is overwritten with new entries, or renaming where the file is renamed and the same file name is reused for continued logging.
Steps to collect text logs
The steps to configure log collection are as follows. The detailed steps for each are provided in the sections below:
- Create a new table in your workspace to receive the collected data. (not required for IIS logs)
- Create a data collection endpoint for the Azure Monitor agent to connect to.
- Create a data collection rule to define the structure of the log file and destination of the collected data.
- Create association between the data collection rule and the agent collecting the log file.
Note
This feature is currently in public preview and isn't completely implemented in the Azure portal. This tutorial uses Azure Resource Manager templates for steps that can't yet be performed with the portal.
Create new table in Log Analytics workspace
The custom table must be created before you can send data to it. When you create the table, you provide its name and a definition for each of its columns.
Note
This step isn't required to collect an IIS log. The table W3CIISLog will be used for IIS logs.
Use the Tables - Update API to create the table with the PowerShell code below. This code creates a table called MyTable_CL with two columns. Modify this schema to collect a different table.
Important
Custom tables must use a suffix of _CL.
Click the Cloud Shell button in the Azure portal and ensure the environment is set to PowerShell.
Copy the following PowerShell code and replace the Path parameter with the appropriate values for your workspace in the Invoke-AzRestMethod command. Paste it into the Cloud Shell prompt to run it.

$tableParams = @'
{
    "properties": {
        "schema": {
            "name": "MyTable_CL",
            "columns": [
                { "name": "TimeGenerated", "type": "DateTime" },
                { "name": "RawData", "type": "String" }
            ]
        }
    }
}
'@

Invoke-AzRestMethod -Path "/subscriptions/{subscription}/resourcegroups/{resourcegroup}/providers/microsoft.operationalinsights/workspaces/{workspace}/tables/MyTable_CL?api-version=2021-12-01-preview" -Method PUT -payload $tableParams
Create data collection endpoint
A data collection endpoint (DCE) is required for the agent to connect to send the data to Azure Monitor. The DCE must be located in the same region as the Log Analytics Workspace where the data will be sent. If you already have a data collection endpoint for the agent, then you can use the existing one.
In the Azure portal's search box, type in template and then select Deploy a custom template.
Click Build your own template in the editor.
Paste the Resource Manager template below into the editor and then click Save. You don't need to modify this template since you will provide values for its parameters.
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "dataCollectionEndpointName": {
      "type": "string",
      "metadata": { "description": "Specifies the name of the Data Collection Endpoint to create." }
    },
    "location": {
      "type": "string",
      "metadata": { "description": "Specifies the location in which to create the Data Collection Endpoint." }
    }
  },
  "resources": [
    {
      "type": "Microsoft.Insights/dataCollectionEndpoints",
      "name": "[parameters('dataCollectionEndpointName')]",
      "location": "[parameters('location')]",
      "apiVersion": "2021-04-01",
      "properties": {
        "networkAcls": { "publicNetworkAccess": "Enabled" }
      }
    }
  ],
  "outputs": {
    "dataCollectionEndpointId": {
      "type": "string",
      "value": "[resourceId('Microsoft.Insights/dataCollectionEndpoints', parameters('dataCollectionEndpointName'))]"
    }
  }
}

On the Custom deployment screen, specify a Subscription and Resource group to store the data collection endpoint, and then provide a Name for the data collection endpoint. The Location should be the same location as the workspace. The Region will already be populated and is used for the location of the data collection endpoint.
Click Review + create and then Create when you review the details.
Once the DCE is created, select it so you can view its properties. Note the Logs ingestion URI since you'll need this in a later step.
Click JSON View to view other details for the DCE. Copy the Resource ID since you'll need this in a later step.
Create data collection rule
The data collection rule (DCR) defines the schema of the data being collected from the log file, the transformation that will be applied to it, and the destination workspace and table the transformed data will be sent to.
The data collection rule requires the resource ID of your workspace. Navigate to your workspace in the Log Analytics workspaces menu in the Azure portal. From the Properties page, copy the Resource ID and save it for later use.
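As an alternative to the portal, you can retrieve the workspace resource ID with the Az PowerShell module. This is a sketch in which the resource group and workspace names are placeholders you must fill in:

```powershell
# Requires the Az.OperationalInsights module and an authenticated session (Connect-AzAccount).
$workspace = Get-AzOperationalInsightsWorkspace -ResourceGroupName "<resource group>" -Name "<workspace name>"

# The ResourceId property is the value the data collection rule template needs.
$workspace.ResourceId
```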
In the Azure portal's search box, type in template and then select Deploy a custom template.
Click Build your own template in the editor.
Paste one of the Resource Manager templates below into the editor and then change the following values:
- streamDeclarations: Defines the columns of the incoming data. This must match the structure of the log file.
- filePatterns: Specifies the location and file pattern of the log files to collect. This defines a separate pattern for Windows and Linux agents.
- transformKql: Specifies a transformation to apply to the incoming data before it's sent to the workspace.
Click Save.
Data collection rule for text log
See Structure of a data collection rule in Azure Monitor (preview) if you want to modify the text log DCR.
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "dataCollectionRuleName": {
      "type": "string",
      "metadata": { "description": "Specifies the name of the Data Collection Rule to create." }
    },
    "location": {
      "type": "string",
      "metadata": { "description": "Specifies the location in which to create the Data Collection Rule." }
    },
    "workspaceName": {
      "type": "string",
      "metadata": { "description": "Name of the Log Analytics workspace to use." }
    },
    "workspaceResourceId": {
      "type": "string",
      "metadata": { "description": "Specifies the Azure resource ID of the Log Analytics workspace to use." }
    },
    "endpointResourceId": {
      "type": "string",
      "metadata": { "description": "Specifies the Azure resource ID of the Data Collection Endpoint to use." }
    }
  },
  "resources": [
    {
      "type": "Microsoft.Insights/dataCollectionRules",
      "name": "[parameters('dataCollectionRuleName')]",
      "location": "[parameters('location')]",
      "apiVersion": "2021-09-01-preview",
      "properties": {
        "dataCollectionEndpointId": "[parameters('endpointResourceId')]",
        "streamDeclarations": {
          "Custom-MyLogFileFormat": {
            "columns": [
              { "name": "TimeGenerated", "type": "datetime" },
              { "name": "RawData", "type": "string" }
            ]
          }
        },
        "dataSources": {
          "logFiles": [
            {
              "streams": [ "Custom-MyLogFileFormat" ],
              "filePatterns": [ "C:\\JavaLogs\\*.log" ],
              "format": "text",
              "settings": { "text": { "recordStartTimestampFormat": "ISO 8601" } },
              "name": "myLogFileFormat-Windows"
            },
            {
              "streams": [ "Custom-MyLogFileFormat" ],
              "filePatterns": [ "//var//*.log" ],
              "format": "text",
              "settings": { "text": { "recordStartTimestampFormat": "ISO 8601" } },
              "name": "myLogFileFormat-Linux"
            }
          ]
        },
        "destinations": {
          "logAnalytics": [
            {
              "workspaceResourceId": "[parameters('workspaceResourceId')]",
              "name": "[parameters('workspaceName')]"
            }
          ]
        },
        "dataFlows": [
          {
            "streams": [ "Custom-MyLogFileFormat" ],
            "destinations": [ "[parameters('workspaceName')]" ],
            "transformKql": "source",
            "outputStream": "Custom-MyTable_CL"
          }
        ]
      }
    }
  ],
  "outputs": {
    "dataCollectionRuleId": {
      "type": "string",
      "value": "[resourceId('Microsoft.Insights/dataCollectionRules', parameters('dataCollectionRuleName'))]"
    }
  }
}

Data collection rule for IIS log
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "dataCollectionRuleName": {
      "type": "string",
      "metadata": { "description": "Specifies the name of the Data Collection Rule to create." }
    },
    "location": {
      "type": "string",
      "metadata": { "description": "Specifies the location in which to create the Data Collection Rule." }
    },
    "workspaceName": {
      "type": "string",
      "metadata": { "description": "Name of the Log Analytics workspace to use." }
    },
    "workspaceResourceId": {
      "type": "string",
      "metadata": { "description": "Specifies the Azure resource ID of the Log Analytics workspace to use." }
    },
    "endpointResourceId": {
      "type": "string",
      "metadata": { "description": "Specifies the Azure resource ID of the Data Collection Endpoint to use." }
    }
  },
  "resources": [
    {
      "type": "Microsoft.Insights/dataCollectionRules",
      "name": "[parameters('dataCollectionRuleName')]",
      "location": "[parameters('location')]",
      "apiVersion": "2021-09-01-preview",
      "properties": {
        "dataCollectionEndpointId": "[parameters('endpointResourceId')]",
        "dataSources": {
          "iisLogs": [
            {
              "streams": [ "Microsoft-W3CIISLog" ],
              "logDirectories": [ "C:\\inetpub\\logs\\LogFiles\\W3SVC1\\" ],
              "name": "myIisLogsDataSource"
            }
          ]
        },
        "destinations": {
          "logAnalytics": [
            {
              "workspaceResourceId": "[parameters('workspaceResourceId')]",
              "name": "[parameters('workspaceName')]"
            }
          ]
        },
        "dataFlows": [
          {
            "streams": [ "Microsoft-W3CIISLog" ],
            "destinations": [ "[parameters('workspaceName')]" ],
            "transformKql": "source"
          }
        ]
      }
    }
  ],
  "outputs": {
    "dataCollectionRuleId": {
      "type": "string",
      "value": "[resourceId('Microsoft.Insights/dataCollectionRules', parameters('dataCollectionRuleName'))]"
    }
  }
}

On the Custom deployment screen, specify a Subscription and Resource group to store the data collection rule and then provide the values defined in the template.
This includes a Name for the data collection rule and the Workspace Resource ID and Endpoint Resource ID. The Location should be the same location as the workspace. The Region will already be populated and is used for the location of the data collection rule.
Click Review + create and then Create when you review the details.
When the deployment is complete, expand the Deployment details box and click on your data collection rule to view its details. Click JSON View.
Change the API version to 2021-09-01-preview.
Copy the Resource ID for the data collection rule. You'll use this in the next step.
Create association with agent
The final step is to create a data collection association that associates the data collection rule to the agents with the log file to be collected. A single data collection rule can be used with multiple agents.
From the Monitor menu in the Azure portal, select Data Collection Rules and select the rule that you just created.
Select Resources and then click Add to view the available resources.
Select either individual agents to associate the data collection rule, or select a resource group to create an association for all agents in that resource group. Click Apply.
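If you prefer scripting to the portal, recent versions of the Az.Monitor PowerShell module include a New-AzDataCollectionRuleAssociation cmdlet that creates the same association. The following is a sketch in which the target resource ID, rule ID, and association name are placeholders:

```powershell
# Requires a recent Az.Monitor module and an authenticated session (Connect-AzAccount).
# All IDs below are placeholders; substitute your own values.
New-AzDataCollectionRuleAssociation `
    -TargetResourceId "/subscriptions/<subscription>/resourceGroups/<resource group>/providers/Microsoft.Compute/virtualMachines/<vm name>" `
    -AssociationName "myAssociation" `
    -RuleId "<data collection rule resource ID>"
```

A single rule can be associated with many machines this way, one association per target resource.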
Troubleshooting - text logs
Use the following steps to troubleshoot collection of text logs.
Check if any custom logs have been received
Start by checking whether any records have been collected for your custom log table by running the following query in Log Analytics. If no records are returned, check the other sections for possible causes. This query looks for entries in the last two days, but you can modify it for another time range. It can take 5-7 minutes for new data from your tables to be uploaded. Only new data is uploaded; any log file last written to before the DCR was created won't be uploaded.
<YourCustomLog>_CL
| where TimeGenerated > ago(48h)
| order by TimeGenerated desc
Verify that you created the custom table
As described in Create new table in Log Analytics workspace above, you must create the custom log table before you can send data to it.
Verify that the agent is sending heartbeats successfully
Verify that Azure Monitor agent is communicating properly by running the following query in Log Analytics to check if there are any records in the Heartbeat table.
Heartbeat
| where TimeGenerated > ago(24h)
| where Computer has "<computer name>"
| project TimeGenerated, Category, Version
| order by TimeGenerated desc
Verify that you specified the correct log location in the data collection rule
The data collection rule will have a section similar to the following. The filePatterns element specifies the path to the log file to collect from the agent computer. Check the agent computer to verify that this is correct.
"dataSources": [{
"configuration": {
"filePatterns": ["C:\\JavaLogs\\*.log"],
"format": "text",
"settings": {
"text": {
"recordStartTimestampFormat": "yyyy-MM-ddTHH:mm:ssK"
}
}
},
"id": "myTabularLogDataSource",
"kind": "logFile",
"streams": [{
"stream": "Custom-TabularData-ABC"
}
],
"sendToChannels": ["gigl-dce-00000000000000000000000000000000"]
}
]
This file pattern should correspond to the logs on the agent machine.
Verify that the text logs are being populated
The agent will only collect new content written to the log file being collected. If you are experimenting with the text logs collection feature, you can use the following script to generate sample logs.
# This script writes a new log entry at the specified interval indefinitely.
# Usage:
#   .\GenerateCustomLogs.ps1 [interval to sleep]
#
# Press Ctrl+C to terminate script.
#
# Example:
#   .\GenerateCustomLogs.ps1 5

param (
    [Parameter(Mandatory=$true)][int]$sleepSeconds
)

$logFolder = "c:\JavaLogs"
if (!(Test-Path -Path $logFolder))
{
    mkdir $logFolder
}

$logFileName = "TestLog-$(Get-Date -format yyyyMMddhhmm).log"
$count = 0
do
{
    $count++
    $randomContent = New-Guid
    $logRecord = "$(Get-Date -format s)Z Record number $count with random content $randomContent"
    $logRecord | Out-File "$logFolder\$logFileName" -Encoding utf8 -Append
    Start-Sleep $sleepSeconds
}
while ($true)
Share logs with Microsoft
If everything is configured properly, but you're still not collecting log data, use the following procedure to collect diagnostics logs for Azure Monitor agent to share with the Azure Monitor group.
- Open an elevated PowerShell window.
- Change to the directory C:\Packages\Plugins\Microsoft.Azure.Monitor.AzureMonitorWindowsAgent\[version]\.
- Run the script .\CollectAMALogs.ps1.
- Share the AMAFiles.zip file generated on the desktop.
Troubleshoot - IIS logs
Use the following steps to troubleshoot collection of IIS logs.
Check if any IIS logs have been received
Start by checking whether any records have been collected for your IIS logs by running the following query in Log Analytics. If no records are returned, check the other sections for possible causes. This query looks for entries in the last two days, but you can modify it for another time range.
W3CIISLog
| where TimeGenerated > ago(48h)
| order by TimeGenerated desc
Verify that the agent is sending heartbeats successfully
Verify that Azure Monitor agent is communicating properly by running the following query in Log Analytics to check if there are any records in the Heartbeat table.
Heartbeat
| where TimeGenerated > ago(24h)
| where Computer has "<computer name>"
| project TimeGenerated, Category, Version
| order by TimeGenerated desc
Verify that IIS logs are being created
Look at the timestamps of the log files and open the latest to see that the latest timestamps are present in the log files. The default location for IIS log files is C:\inetpub\logs\LogFiles\W3SVC1.
Verify that you specified the correct log location in the data collection rule
The data collection rule will have a section similar to the following. The logDirectories element specifies the path to the log file to collect from the agent computer. Check the agent computer to verify that this is correct.
"dataSources": [
{
"configuration": {
"logDirectories": ["C:\\scratch\\demo\\W3SVC1"]
},
"id": "myIisLogsDataSource",
"kind": "iisLog",
"streams": [{
"stream": "ONPREM_IIS_BLOB_V2"
}
],
"sendToChannels": ["gigl-dce-6a8e34db54bb4b6db22d99d86314eaee"]
}
]
This directory should correspond to the location of the IIS logs on the agent machine.
Verify that the IIS logs are W3C formatted
Open IIS Manager and verify that the logs are being written in W3C format. You can also open an IIS log file on the agent machine to confirm that the entries are in W3C format.
Share logs with Microsoft
If everything is configured properly, but you're still not collecting log data, use the following procedure to collect diagnostics logs for Azure Monitor agent to share with the Azure Monitor group.
- Open an elevated PowerShell window.
- Change to the directory C:\Packages\Plugins\Microsoft.Azure.Monitor.AzureMonitorWindowsAgent\[version]\.
- Run the script .\CollectAMALogs.ps1.
- Share the AMAFiles.zip file generated on the desktop.
Next steps
- Learn more about the Azure Monitor agent.
- Learn more about data collection rules.
- Learn more about data collection endpoints.