This article provides sample code that uses the Logs ingestion API. Each sample requires the following components to be created before the code is run. See Tutorial: Send data to Azure Monitor using Logs ingestion API (Resource Manager templates) for a complete walkthrough of creating and configuring these components to support each of these samples.
- Custom table in a Log Analytics workspace
- Data collection rule (DCR) to direct the data to the target table
- Microsoft Entra application with access to the DCR
- Data collection endpoint (DCE) if you're using private link. Otherwise, use the DCR logs endpoint.
Sample code
The following script uses the Azure Monitor Ingestion client library for .NET.
Install the Azure Monitor Ingestion client library and the Azure Identity library. The Azure Identity library is required for the authentication used in this sample.
```dotnetcli
dotnet add package Azure.Identity
dotnet add package Azure.Monitor.Ingestion
```
Create the following environment variables with values for your Microsoft Entra application. These values are used by `DefaultAzureCredential` in the Azure Identity library.

- AZURE_TENANT_ID
- AZURE_CLIENT_ID
- AZURE_CLIENT_SECRET
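As a sketch, the variables can be set in a POSIX shell session like the one below. The GUIDs and the secret are placeholders, not real values; substitute the tenant ID, application (client) ID, and client secret of your own Microsoft Entra application.

```shell
# Placeholder values -- replace with your Microsoft Entra application's values.
# DefaultAzureCredential reads these three variables at runtime.
export AZURE_TENANT_ID="00000000-0000-0000-0000-000000000000"
export AZURE_CLIENT_ID="11111111-1111-1111-1111-111111111111"
export AZURE_CLIENT_SECRET="<your-client-secret>"
```

On Windows, set the same three variables with `setx` or through System Properties instead of `export`.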
Replace the variables in the following sample code with values from your DCR. You may also want to replace the sample data with your own.
```csharp
using Azure;
using Azure.Core;
using Azure.Identity;
using Azure.Monitor.Ingestion;

// Initialize variables
var endpoint = new Uri("https://my-url.monitor.azure.com");
var ruleId = "dcr-00000000000000000000000000000000";
var streamName = "Custom-MyTableRawData";

// Create credential and client
var credential = new DefaultAzureCredential();
LogsIngestionClient client = new(endpoint, credential);

DateTimeOffset currentTime = DateTimeOffset.UtcNow;

// Use BinaryData to serialize instances of an anonymous type into JSON
BinaryData data = BinaryData.FromObjectAsJson(
    new[]
    {
        new
        {
            Time = currentTime,
            Computer = "Computer1",
            AdditionalContext = new
            {
                InstanceName = "user1",
                TimeZone = "Pacific Time",
                Level = 4,
                CounterName = "AppMetric1",
                CounterValue = 15.3
            }
        },
        new
        {
            Time = currentTime,
            Computer = "Computer2",
            AdditionalContext = new
            {
                InstanceName = "user2",
                TimeZone = "Central Time",
                Level = 3,
                CounterName = "AppMetric1",
                CounterValue = 23.5
            }
        },
    });

// Upload logs
try
{
    var response = await client.UploadAsync(ruleId, streamName, RequestContent.Create(data)).ConfigureAwait(false);
    if (response.IsError)
    {
        throw new Exception(response.ToString());
    }
    Console.WriteLine("Log upload completed using content upload");
}
catch (Exception ex)
{
    Console.WriteLine("Upload failed with Exception: " + ex.Message);
}

// Logs can also be uploaded in a List
var entries = new List<object>();
for (int i = 0; i < 10; i++)
{
    entries.Add(
        new
        {
            Time = currentTime,
            Computer = "Computer" + i.ToString(),
            AdditionalContext = new
            {
                InstanceName = "user" + i.ToString(),
                TimeZone = "Central Time",
                Level = 3,
                CounterName = "AppMetric1" + i.ToString(),
                CounterValue = i
            }
        }
    );
}

// Make the request
try
{
    var response = await client.UploadAsync(ruleId, streamName, entries).ConfigureAwait(false);
    if (response.IsError)
    {
        throw new Exception(response.ToString());
    }
    Console.WriteLine("Log upload completed using list of entries");
}
catch (Exception ex)
{
    Console.WriteLine("Upload failed with Exception: " + ex.Message);
}
```
Execute the code. The data should arrive in your Log Analytics workspace within a few minutes.
Troubleshooting
This section describes different error conditions you might receive and how to correct them.
| Error | Description |
|---|---|
| Error code 403 | Ensure that your application has the correct permissions to the DCR. You might also need to wait up to 30 minutes for permissions to propagate. |
| Error code 413, or warning of `TimeoutExpired` with the message `ReadyBody_ClientConnectionAbort` in the response | The message is too large. The maximum message size is currently 1 MB per call. |
| Error code 429 | API limits have been exceeded. The limits are currently set to 500 MB of data per minute for both compressed and uncompressed data and 300,000 requests per minute. Retry after the duration listed in the `Retry-After` header in the response. |
| No data | The data might take some time to be ingested, especially the first time data is sent to a particular table. It shouldn't take longer than 15 minutes. |
| IntelliSense in Log Analytics doesn't recognize the new table. | The cache that drives IntelliSense might take up to 24 hours to update. |
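For the 403 case, a common fix is granting the application the Monitoring Metrics Publisher role on the DCR. The sketch below, which assumes you use the Azure CLI, builds the role assignment command with placeholder values and echoes it for review rather than running it; substitute your own application (client) ID and the full resource ID of your DCR.

```shell
# Placeholder values -- substitute your application (client) ID and the
# Azure resource ID of your data collection rule.
APP_ID="00000000-0000-0000-0000-000000000000"
DCR_SCOPE="/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Insights/dataCollectionRules/<dcr-name>"

# Build the az CLI command and echo it so it can be reviewed before running.
CMD="az role assignment create --assignee $APP_ID --role \"Monitoring Metrics Publisher\" --scope $DCR_SCOPE"
echo "$CMD"
```

After running the printed command with real values, allow up to 30 minutes for the permission to propagate before retrying the upload.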