Sample code to send data to Azure Monitor using Logs ingestion API

This article provides sample code using the Logs ingestion API. Each sample requires the following components to be created before the code is run. See Tutorial: Send data to Azure Monitor using Logs ingestion API (Resource Manager templates) for a complete walkthrough of creating these components configured to support each of these samples.

  • Custom table in a Log Analytics workspace
  • Data collection endpoint (DCE) to receive data
  • Data collection rule (DCR) to direct the data to the target table
  • Microsoft Entra application with access to the DCR

Sample code

The following script uses the Azure Monitor Ingestion client library for .NET.

  1. Install the Azure Monitor Ingestion client library and the Azure Identity library. The Azure Identity library is required for the authentication used in this sample.

    dotnet add package Azure.Identity
    dotnet add package Azure.Monitor.Ingestion
    
  2. Create the following environment variables with values for your Microsoft Entra application. These values are used by DefaultAzureCredential in the Azure Identity library.

    • AZURE_TENANT_ID
    • AZURE_CLIENT_ID
    • AZURE_CLIENT_SECRET
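In a bash shell, for example, setting them might look like the following. The GUIDs and secret are placeholders; substitute the values from your own Microsoft Entra app registration.

```shell
# Placeholder values; replace with the values from your app registration.
export AZURE_TENANT_ID="00000000-0000-0000-0000-000000000000"
export AZURE_CLIENT_ID="00000000-0000-0000-0000-000000000000"
export AZURE_CLIENT_SECRET="your-client-secret"
```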
  3. Replace the variables in the following sample code with values from your DCE and DCR. You may also want to replace the sample data with your own.

    using Azure;
    using Azure.Core;
    using Azure.Identity;
    using Azure.Monitor.Ingestion;
    
    // Initialize variables
    var endpoint = new Uri("https://logs-ingestion-rzmk.eastus2-1.ingest.monitor.azure.com");
    var ruleId = "dcr-00000000000000000000000000000000";
    var streamName = "Custom-MyTableRawData";
    
    // Create credential and client
    var credential = new DefaultAzureCredential();
    LogsIngestionClient client = new(endpoint, credential);
    
    DateTimeOffset currentTime = DateTimeOffset.UtcNow;
    
    // Use BinaryData to serialize instances of an anonymous type into JSON
    BinaryData data = BinaryData.FromObjectAsJson(
        new[] {
            new
            {
                Time = currentTime,
                Computer = "Computer1",
                AdditionalContext = new
                {
                    InstanceName = "user1",
                    TimeZone = "Pacific Time",
                    Level = 4,
                    CounterName = "AppMetric1",
                    CounterValue = 15.3
                }
            },
            new
            {
                Time = currentTime,
                Computer = "Computer2",
                AdditionalContext = new
                {
                    InstanceName = "user2",
                    TimeZone = "Central Time",
                    Level = 3,
                    CounterName = "AppMetric1",
                    CounterValue = 23.5
                }
            },
        });
    
    // Upload logs
    try
    {
        Response response = client.Upload(ruleId, streamName, RequestContent.Create(data));
    }
    catch (Exception ex)
    {
        Console.WriteLine("Upload failed with exception: " + ex.Message);
    }
    
    // Logs can also be uploaded in a List
    var entries = new List<object>();
    for (int i = 0; i < 10; i++)
    {
        entries.Add(
            new {
                Time = currentTime,
                Computer = "Computer" + i.ToString(),
                AdditionalContext = i
            }
        );
    }
    
    // Make the request
    LogsUploadOptions options = new LogsUploadOptions();
    options.UploadFailed += Options_UploadFailed;
    await client.UploadAsync(ruleId, streamName, entries, options).ConfigureAwait(false);
    
    // Handler invoked for any batch of logs that fails to upload
    Task Options_UploadFailed(LogsUploadFailedEventArgs e)
    {
        Console.WriteLine(e.Exception);
        foreach (var log in e.FailedLogs)
        {
            Console.WriteLine(log);
        }
        return Task.CompletedTask;
    }
    
  4. Execute the code, and the data should arrive in your Log Analytics workspace within a few minutes.
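To confirm that the records arrived, you can run a query against the destination table in Log Analytics. The table name MyTable_CL below is an assumption based on the companion tutorial; substitute the name of your own custom table.

```kusto
MyTable_CL
| where TimeGenerated > ago(1h)
| take 10
```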

Troubleshooting

This section describes error conditions you might encounter and how to correct them.

Script returns error code 403

Ensure that you have the correct permissions for your application to the DCR. You might also need to wait up to 30 minutes for permissions to propagate.
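If the permissions are missing, the required role assignment can be granted with the Azure CLI. This sketch only prints the command so you can review it before running it yourself; the service principal object ID and DCR resource ID are placeholders to replace with your own values.

```shell
# Placeholders; substitute your app's service principal object ID and your DCR's resource ID.
ASSIGNEE="00000000-0000-0000-0000-000000000000"
SCOPE="/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Insights/dataCollectionRules/<dcr-name>"

# Monitoring Metrics Publisher is the role that grants ingestion rights on a DCR.
# Remove the echo to actually create the role assignment.
echo az role assignment create \
    --assignee "$ASSIGNEE" \
    --role "Monitoring Metrics Publisher" \
    --scope "$SCOPE"
```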

Script returns error code 413, or a TimeoutExpired warning with the message ReadyBody_ClientConnectionAbort in the response

The message is too large. The maximum message size is currently 1 MB per call.

Script returns error code 429

API limits have been exceeded. The limits are currently 500 MB of data per minute (whether compressed or uncompressed) and 300,000 requests per minute. Retry after the duration listed in the Retry-After header in the response.

Script returns error code 503

Ensure that you have the correct permissions for your application to the DCR. You might also need to wait up to 30 minutes for permissions to propagate.

You don't receive an error, but data doesn't appear in the workspace

The data might take some time to be ingested, especially the first time data is being sent to a particular table. It shouldn't take longer than 15 minutes.

IntelliSense in Log Analytics doesn't recognize the new table

The cache that drives IntelliSense might take up to 24 hours to update.

Next steps