azure-monitor-ingestion-java

Azure Monitor Ingestion SDK for Java. Send custom logs to Azure Monitor via Data Collection Rules (DCR) and Data Collection Endpoints (DCE).

31,392 stars
Complexity: easy

About this skill

This skill provides an interface for AI agents to understand and assist with implementing custom log ingestion into Azure Monitor from Java applications. It leverages the Azure Monitor Ingestion SDK for Java, enabling developers to send arbitrary data streams as custom logs. By integrating with Data Collection Rules (DCR) and Data Collection Endpoints (DCE), this skill ensures structured, efficient, and secure data transfer to Azure Monitor for monitoring, analytics, and alerting purposes. An AI agent can use this skill to generate the necessary Java code, configuration details, or guidance for developers seeking to implement robust logging solutions.

Best use case

Implementing custom logging for Java applications into Azure Monitor; Centralizing application logs for monitoring and troubleshooting; Sending custom metrics or operational data to Azure Monitor; Integrating Java-based microservices or serverless functions with Azure's logging infrastructure.

Expected outcome: a generated Java code snippet or complete example that sends custom logs to Azure Monitor; guidance on configuring the required Azure resources (DCR, DCE) to accept those logs; and successful ingestion of custom logs from a Java application into the specified Azure Monitor Log Analytics workspace.

Practical example

Example input

Generate Java code to send a custom log entry to Azure Monitor. The log should include a timestamp, a message 'User login successful', and a user ID 'user123'. Assume the Data Collection Endpoint is 'https://<your-dce-name>.eastus-1.ingest.monitor.azure.com' and DCR Immutable ID is 'dcr-immutable-id-example'.

Example output

{"code": "import com.azure.monitor.ingestion.LogsIngestionClient;\nimport com.azure.monitor.ingestion.LogsIngestionClientBuilder;\nimport com.azure.monitor.ingestion.models.LogsUploadOptions;\nimport com.azure.identity.DefaultAzureCredentialBuilder;\nimport java.time.OffsetDateTime;\nimport java.util.Arrays;\nimport java.util.List;\nimport java.util.Map;\nimport java.util.HashMap;\n\npublic class AzureMonitorLogSender {\n\n    private static final String DCR_IMMUTABLE_ID = \"dcr-immutable-id-example\";\n    private static final String LOGS_STREAM_NAME = \"CustomJavaLogs\"; // Must match your DCR stream\n    private static final String DATA_COLLECTION_ENDPOINT = \"https://<your-dce-name>.eastus-1.ingest.monitor.azure.com\";\n\n    public static void main(String[] args) {\n        try {\n            LogsIngestionClient client = new LogsIngestionClientBuilder()\n                .endpoint(DATA_COLLECTION_ENDPOINT)\n                .credential(new DefaultAzureCredentialBuilder().build())\n                .buildClient();\n\n            List<Map<String, Object>> logs = Arrays.asList(\n                new HashMap<String, Object>() {{\n                    put(\"TimeGenerated\", OffsetDateTime.now().toString());\n                    put(\"Message\", \"User login successful\");\n                    put(\"UserId\", \"user123\");\n                    put(\"Severity\", \"Informational\");\n                    put(\"OperationName\", \"UserLogin\");\n                }}\n            );\n\n            // Assuming 'CustomJavaLogs' is the table name defined in your DCR.\n            // Replace with the actual stream name configured in your DCR.\n            client.upload(DCR_IMMUTABLE_ID, LOGS_STREAM_NAME, logs, new LogsUploadOptions());\n            System.out.println(\"Logs sent successfully to Azure Monitor.\");\n\n        } catch (Exception e) {\n            System.err.println(\"Error sending logs: \" + e.getMessage());\n            e.printStackTrace();\n        }\n    }\n}\n// 
Note: Ensure your Data Collection Rule (DCR) is configured with a stream named 'CustomJavaLogs'\n// and maps to a custom table, e.g., 'MyCustomJavaLogs_CL' with columns matching\n// 'TimeGenerated', 'Message', 'UserId', 'Severity', 'OperationName' types."}

When to use this skill

  • When a Java application needs to send custom, structured, or unstructured logs directly to Azure Monitor.
  • When an AI agent is tasked with generating Java code to integrate logging with Azure Monitor.
  • When an agent needs to provide guidance on setting up Data Collection Rules and Endpoints for Java applications.

When not to use this skill

  • For sending logs to other monitoring platforms (e.g., Splunk, ELK stack).
  • When logs are already handled by standard logging frameworks (e.g., Log4j, SLF4J) and an existing Azure Monitor sink/appender is sufficient.
  • When the application is not written in Java.

Installation

Claude Code / Cursor / Codex

$ curl -o ~/.claude/skills/azure-monitor-ingestion-java/SKILL.md --create-dirs "https://raw.githubusercontent.com/sickn33/antigravity-awesome-skills/main/plugins/antigravity-awesome-skills-claude/skills/azure-monitor-ingestion-java/SKILL.md"

Manual Installation

  1. Download SKILL.md from GitHub
  2. Place it in .claude/skills/azure-monitor-ingestion-java/SKILL.md inside your project
  3. Restart your AI agent — it will auto-discover the skill

How azure-monitor-ingestion-java Compares

| Feature / Agent | azure-monitor-ingestion-java | Standard Approach |
|---|---|---|
| Platform Support | Claude | Limited / Varies |
| Context Awareness | High | Baseline |
| Installation Complexity | easy | N/A |

Frequently Asked Questions

What does this skill do?

Azure Monitor Ingestion SDK for Java. Send custom logs to Azure Monitor via Data Collection Rules (DCR) and Data Collection Endpoints (DCE).

Which AI agents support this skill?

This skill is designed for Claude.

How difficult is it to install?

The installation complexity is rated as easy. You can find the installation instructions above.

Where can I find the source code?

You can find the source code on GitHub using the link provided at the top of the page.

SKILL.md Source

# Azure Monitor Ingestion SDK for Java

Client library for sending custom logs to Azure Monitor using the Logs Ingestion API via Data Collection Rules.

## Installation

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-monitor-ingestion</artifactId>
    <version>1.2.11</version>
</dependency>
```

Or use Azure SDK BOM:

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.azure</groupId>
            <artifactId>azure-sdk-bom</artifactId>
            <version>{bom_version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

<dependencies>
    <dependency>
        <groupId>com.azure</groupId>
        <artifactId>azure-monitor-ingestion</artifactId>
    </dependency>
</dependencies>
```

## Prerequisites

- Data Collection Endpoint (DCE)
- Data Collection Rule (DCR)
- Log Analytics workspace
- Target table (custom or built-in: CommonSecurityLog, SecurityEvents, Syslog, WindowsEvents)

## Environment Variables

```bash
DATA_COLLECTION_ENDPOINT=https://<dce-name>.<region>.ingest.monitor.azure.com
DATA_COLLECTION_RULE_ID=dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
STREAM_NAME=Custom-MyTable_CL
```
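A startup routine can read these variables and fall back to placeholder values for local development. A minimal sketch; the `IngestionConfig` class name and the fallback strings are hypothetical, not part of the SDK:

```java
// Hypothetical config holder: reads the variables above, with placeholder fallbacks.
final class IngestionConfig {
    final String endpoint;
    final String ruleId;
    final String streamName;

    IngestionConfig() {
        this.endpoint = getOrDefault("DATA_COLLECTION_ENDPOINT",
                "https://<dce-name>.<region>.ingest.monitor.azure.com");
        this.ruleId = getOrDefault("DATA_COLLECTION_RULE_ID",
                "dcr-00000000000000000000000000000000");
        this.streamName = getOrDefault("STREAM_NAME", "Custom-MyTable_CL");
    }

    // Return the environment value, or the fallback when unset or blank.
    private static String getOrDefault(String name, String fallback) {
        String value = System.getenv(name);
        return (value == null || value.isBlank()) ? fallback : value;
    }
}
```

Pass `endpoint` to the client builder below; failing fast on missing required values is an equally valid choice in production.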

## Client Creation

### Synchronous Client

```java
import com.azure.identity.DefaultAzureCredential;
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.monitor.ingestion.LogsIngestionClient;
import com.azure.monitor.ingestion.LogsIngestionClientBuilder;

DefaultAzureCredential credential = new DefaultAzureCredentialBuilder().build();

LogsIngestionClient client = new LogsIngestionClientBuilder()
    .endpoint("<data-collection-endpoint>")
    .credential(credential)
    .buildClient();
```

### Asynchronous Client

```java
import com.azure.monitor.ingestion.LogsIngestionAsyncClient;

LogsIngestionAsyncClient asyncClient = new LogsIngestionClientBuilder()
    .endpoint("<data-collection-endpoint>")
    .credential(new DefaultAzureCredentialBuilder().build())
    .buildAsyncClient();
```

## Key Concepts

| Concept | Description |
|---------|-------------|
| Data Collection Endpoint (DCE) | Ingestion endpoint URL for your region |
| Data Collection Rule (DCR) | Defines data transformation and routing to tables |
| Stream Name | Target stream in the DCR (e.g., `Custom-MyTable_CL`) |
| Log Analytics Workspace | Destination for ingested logs |

## Core Operations

### Upload Custom Logs

```java
import java.util.List;
import java.util.ArrayList;

List<Object> logs = new ArrayList<>();
logs.add(new MyLogEntry("2024-01-15T10:30:00Z", "INFO", "Application started"));
logs.add(new MyLogEntry("2024-01-15T10:30:05Z", "DEBUG", "Processing request"));

client.upload("<data-collection-rule-id>", "<stream-name>", logs);
System.out.println("Logs uploaded successfully");
```

### Upload with Concurrency

For large log collections, enable concurrent uploads:

```java
import com.azure.monitor.ingestion.models.LogsUploadOptions;
import com.azure.core.util.Context;

List<Object> logs = getLargeLogs(); // Large collection

LogsUploadOptions options = new LogsUploadOptions()
    .setMaxConcurrency(3);

client.upload("<data-collection-rule-id>", "<stream-name>", logs, options, Context.NONE);
```

### Upload with Error Handling

Handle partial upload failures gracefully:

```java
LogsUploadOptions options = new LogsUploadOptions()
    .setLogsUploadErrorConsumer(uploadError -> {
        System.err.println("Upload error: " + uploadError.getResponseException().getMessage());
        System.err.println("Failed logs count: " + uploadError.getFailedLogs().size());
        
        // Option 1: Log and continue
        // Option 2: Throw to abort remaining uploads
        // throw uploadError.getResponseException();
    });

client.upload("<data-collection-rule-id>", "<stream-name>", logs, options, Context.NONE);
```

### Async Upload with Reactor

```java
import reactor.core.publisher.Mono;

List<Object> logs = getLogs();

asyncClient.upload("<data-collection-rule-id>", "<stream-name>", logs)
    .doOnSuccess(v -> System.out.println("Upload completed"))
    .doOnError(e -> System.err.println("Upload failed: " + e.getMessage()))
    .subscribe();
```

## Log Entry Model Example

```java
public class MyLogEntry {
    private String timeGenerated;
    private String level;
    private String message;
    
    public MyLogEntry(String timeGenerated, String level, String message) {
        this.timeGenerated = timeGenerated;
        this.level = level;
        this.message = message;
    }
    
    // Getters required for JSON serialization
    public String getTimeGenerated() { return timeGenerated; }
    public String getLevel() { return level; }
    public String getMessage() { return message; }
}
```
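On Java 16+, the POJO above can be a record instead; the compiler generates the constructor and accessors. Note the accessor names become `timeGenerated()` rather than `getTimeGenerated()`, so confirm that the JSON serializer in your azure-core version maps record components to the field names your DCR expects:

```java
// Record-based equivalent of MyLogEntry; components serialize as
// timeGenerated, level, message (verify against your DCR column names).
record MyLogEntry(String timeGenerated, String level, String message) { }
```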

## Error Handling

```java
import com.azure.core.exception.HttpResponseException;

try {
    client.upload(ruleId, streamName, logs);
} catch (HttpResponseException e) {
    System.err.println("HTTP Status: " + e.getResponse().getStatusCode());
    System.err.println("Error: " + e.getMessage());
    
    if (e.getResponse().getStatusCode() == 403) {
        System.err.println("Check DCR permissions and managed identity");
    } else if (e.getResponse().getStatusCode() == 404) {
        System.err.println("Verify DCE endpoint and DCR ID");
    }
}
```

## Best Practices

1. **Batch logs** — Upload in batches rather than one at a time
2. **Use concurrency** — Set `maxConcurrency` for large uploads
3. **Handle partial failures** — Use error consumer to log failed entries
4. **Match DCR schema** — Log entry fields must match DCR transformation expectations
5. **Include TimeGenerated** — Most tables require a timestamp field
6. **Reuse client** — Create once, reuse throughout application
7. **Use async for high throughput** — `LogsIngestionAsyncClient` for reactive patterns
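Practices 1 and 2 can be combined. The client already chunks large uploads internally, so explicit batching is optional; the sketch below uses a hypothetical `LogBatcher` helper for cases where you want control over batch boundaries (for example, to retry a failed batch independently):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: split a log collection into fixed-size batches,
// each of which can be passed to LogsIngestionClient.upload(...) separately.
final class LogBatcher {
    private LogBatcher() { }

    static <T> List<List<T>> partition(List<T> logs, int batchSize) {
        if (batchSize <= 0) {
            throw new IllegalArgumentException("batchSize must be positive");
        }
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < logs.size(); i += batchSize) {
            // subList is a view; copy it if the source list may change.
            batches.add(logs.subList(i, Math.min(i + batchSize, logs.size())));
        }
        return batches;
    }
}
```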

## Querying Uploaded Logs

Use azure-monitor-query to query ingested logs:

```java
// See azure-monitor-query skill for LogsQueryClient usage
String query = "MyTable_CL | where TimeGenerated > ago(1h) | limit 10";
```
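For completeness, a minimal read-back sketch with the separate `azure-monitor-query` library; the workspace ID is a placeholder, and the azure-monitor-query skill covers the full API:

```java
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.monitor.query.LogsQueryClient;
import com.azure.monitor.query.LogsQueryClientBuilder;
import com.azure.monitor.query.models.LogsQueryResult;
import com.azure.monitor.query.models.LogsTableRow;
import com.azure.monitor.query.models.QueryTimeInterval;

public class QueryUploadedLogs {
    public static void main(String[] args) {
        LogsQueryClient queryClient = new LogsQueryClientBuilder()
            .credential(new DefaultAzureCredentialBuilder().build())
            .buildClient();

        // "<workspace-id>" is the Log Analytics workspace GUID (placeholder).
        LogsQueryResult result = queryClient.queryWorkspace(
            "<workspace-id>",
            "MyTable_CL | where TimeGenerated > ago(1h) | limit 10",
            QueryTimeInterval.LAST_HOUR);

        // Print each row of the first (and here, only) result table.
        for (LogsTableRow row : result.getTable().getRows()) {
            System.out.println(row.getRow());
        }
    }
}
```

Keep in mind that newly ingested logs can take a few minutes to become queryable.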

## Reference Links

| Resource | URL |
|----------|-----|
| Maven Package | https://central.sonatype.com/artifact/com.azure/azure-monitor-ingestion |
| GitHub | https://github.com/Azure/azure-sdk-for-java/tree/main/sdk/monitor/azure-monitor-ingestion |
| Product Docs | https://learn.microsoft.com/azure/azure-monitor/logs/logs-ingestion-api-overview |
| DCE Overview | https://learn.microsoft.com/azure/azure-monitor/essentials/data-collection-endpoint-overview |
| DCR Overview | https://learn.microsoft.com/azure/azure-monitor/essentials/data-collection-rule-overview |
| Troubleshooting | https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/monitor/azure-monitor-ingestion/TROUBLESHOOTING.md |

## When to Use
Use this skill whenever a Java application needs to send custom logs to Azure Monitor via the Logs Ingestion API, as described in the overview above.

Related Skills

All from sickn33/antigravity-awesome-skills:

  • azure-storage-queue-ts (Cloud Integration; Claude): Azure Queue Storage JavaScript/TypeScript SDK (@azure/storage-queue) for message queue operations. Use for sending, receiving, peeking, and deleting messages in queues.
  • azure-storage-queue-py (Cloud Integration; Claude): Azure Queue Storage SDK for Python. Use for reliable message queuing, task distribution, and asynchronous processing.
  • azure-servicebus-dotnet (Cloud Integration; Claude, ChatGPT, Gemini): Azure Service Bus SDK for .NET. Enterprise messaging with queues, topics, subscriptions, and sessions.
  • azure-identity-py (Cloud Integration; Claude): Azure Identity SDK for Python authentication. Use for DefaultAzureCredential, managed identity, service principals, and token caching.
  • azure-eventgrid-py (Cloud Integration; Claude): Azure Event Grid SDK for Python. Use for publishing events, handling CloudEvents, and event-driven architectures.
  • azure-eventgrid-java (Cloud Integration; Claude, ChatGPT, Gemini): Build event-driven applications with Azure Event Grid SDK for Java. Use when publishing events, implementing pub/sub patterns, or integrating with Azure services via events.
  • azure-eventgrid-dotnet (Cloud Integration; Claude): Azure Event Grid SDK for .NET. Client library for publishing and consuming events with Azure Event Grid. Use for event-driven architectures, pub/sub messaging, CloudEvents, and EventGridEvents.
  • n8n-code-javascript (Programming & Development; Claude): Write JavaScript code in n8n Code nodes. Use when writing JavaScript in n8n, using $input/$json/$node syntax, making HTTP requests with $helpers, working with dates using DateTime, troubleshooting Code node errors, or choosing between Code node modes.
  • microsoft-azure-webjobs-extensions-authentication-events-dotnet (Identity Management / Authentication & Authorization; Claude): Microsoft Entra Authentication Events SDK for .NET. Azure Functions triggers for custom authentication extensions.
  • javascript-typescript-typescript-scaffold (Code Generation; Claude): You are a TypeScript project architecture expert specializing in scaffolding production-ready Node.js and frontend applications. Generate complete project structures with modern tooling (pnpm, Vite, N…
  • javascript-pro (Programming & Development; Claude): Master modern JavaScript with ES6+, async patterns, and Node.js APIs. Handles promises, event loops, and browser/Node compatibility.
  • javascript-mastery (Programming Language Help; Claude): 33+ essential JavaScript concepts every developer should know, inspired by [33-js-concepts](https://github.com/leonardomso/33-js-concepts).