azure-monitor-ingestion-py
Azure Monitor Ingestion SDK for Python. Use for sending custom logs to Log Analytics workspace via Logs Ingestion API.
About this skill
This skill provides a Python-based interface for AI agents to programmatically ingest custom log data into Azure Monitor Log Analytics workspaces. It uses the Azure Monitor Ingestion SDK for Python, allowing agents to send structured log messages, operational data, or bespoke telemetry that falls outside standard log collection methods. By configuring a Data Collection Endpoint (DCE) and a Data Collection Rule (DCR), agents can direct specific streams of custom data to designated tables in their Azure Monitor environment, improving observability, troubleshooting, and analysis of agent behavior and integrated systems.
Best use case
- An AI agent needs to report its decision-making process, internal states, or specific actions to a centralized monitoring system.
- An agent managing a system needs to push custom application-specific events or metrics to Azure Monitor for real-time analysis.
- Debugging and auditing AI agent interactions by logging detailed request/response cycles or error conditions directly to Log Analytics.
- Enriching existing Azure monitoring with agent-generated custom telemetry.
Custom logs, metrics, or events generated by the AI agent are ingested into the specified Log Analytics workspace table; the data is then queryable and visible within Azure Monitor for analysis, dashboarding, and alerting, giving improved observability and insight into the AI agent's operations and interactions.
Practical example
Example input
Agent, please send a custom log to Azure Monitor with the message 'Task completion detected' for task ID 'AGNT-456' and result 'SUCCESS', noting it as an 'Informational' event.
Example output
Successfully ingested custom log to Azure Monitor Log Analytics workspace for task ID 'AGNT-456'. The log entry should now be available for querying.
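A minimal sketch of what the agent could run to satisfy this request, assuming the environment variables described in the SKILL.md source below and a custom table whose DCR schema includes TaskId, Result, SeverityLevel, and Message columns (hypothetical names that must match your own DCR):
```python
# Hypothetical sketch: column names must match your DCR's stream schema.
import os
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient

client = LogsIngestionClient(
    endpoint=os.environ["AZURE_DCE_ENDPOINT"],
    credential=DefaultAzureCredential()
)

client.upload(
    rule_id=os.environ["AZURE_DCR_RULE_ID"],
    stream_name=os.environ["AZURE_DCR_STREAM_NAME"],
    logs=[{
        "TimeGenerated": datetime.now(timezone.utc).isoformat(),
        "TaskId": "AGNT-456",
        "Result": "SUCCESS",
        "SeverityLevel": "Informational",
        "Message": "Task completion detected",
    }],
)
```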
When to use this skill
- When your AI agent requires robust logging capabilities within an Azure environment
- When you need to send custom, structured data that isn't captured by standard Azure Monitor agents
- When you want to centralize agent-specific operational insights alongside other infrastructure and application logs in Log Analytics
- When developing agents where detailed, queryable historical data of their activities is crucial for performance analysis or compliance
When not to use this skill
- If your monitoring solution is not Azure Monitor
- If standard logging mechanisms (e.g., Application Insights, existing VM or container logs) already cover your needs
- When dealing with extremely high-volume, performance-critical logging where direct API calls might introduce latency
- If you require a simpler, local logging solution without cloud integration and advanced analytics
Installation
Claude Code / Cursor / Codex
Manual Installation
- Download SKILL.md from GitHub
- Place it in .claude/skills/azure-monitor-ingestion-py/SKILL.md inside your project
- Restart your AI agent — it will auto-discover the skill
How azure-monitor-ingestion-py Compares
| Feature / Agent | azure-monitor-ingestion-py | Standard Approach |
|---|---|---|
| Platform Support | Claude | Limited / Varies |
| Context Awareness | High | Baseline |
| Installation Complexity | medium | N/A |
Frequently Asked Questions
What does this skill do?
Azure Monitor Ingestion SDK for Python. Use for sending custom logs to Log Analytics workspace via Logs Ingestion API.
Which AI agents support this skill?
This skill is designed for Claude.
How difficult is it to install?
The installation complexity is rated as medium. You can find the installation instructions above.
Where can I find the source code?
You can find the source code on GitHub using the link provided at the top of the page.
Related Guides
AI Agents for Coding
Browse AI agent skills for coding, debugging, testing, refactoring, code review, and developer workflows across Claude, Cursor, and Codex.
Best AI Skills for Claude
Explore the best AI skills for Claude and Claude Code across coding, research, workflow automation, documentation, and agent operations.
AI Agents for Marketing
Discover AI agents for marketing workflows, from SEO and content production to campaign research, outreach, and analytics.
SKILL.md Source
# Azure Monitor Ingestion SDK for Python
Send custom logs to Azure Monitor Log Analytics workspace using the Logs Ingestion API.
## Installation
```bash
pip install azure-monitor-ingestion
pip install azure-identity
```
## Environment Variables
```bash
# Data Collection Endpoint (DCE)
AZURE_DCE_ENDPOINT=https://<dce-name>.<region>.ingest.monitor.azure.com
# Data Collection Rule (DCR) immutable ID
AZURE_DCR_RULE_ID=dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
# Stream name from DCR
AZURE_DCR_STREAM_NAME=Custom-MyTable_CL
```
## Prerequisites
Before using this SDK, you need:
1. **Log Analytics Workspace** — Target for your logs
2. **Data Collection Endpoint (DCE)** — Ingestion endpoint
3. **Data Collection Rule (DCR)** — Defines schema and destination
4. **Custom Table** — In Log Analytics (created via DCR or manually)
## Authentication
```python
from azure.monitor.ingestion import LogsIngestionClient
from azure.identity import DefaultAzureCredential
import os
client = LogsIngestionClient(
    endpoint=os.environ["AZURE_DCE_ENDPOINT"],
    credential=DefaultAzureCredential()
)
```
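DefaultAzureCredential tries several sources in turn (environment variables, managed identity, Azure CLI login, and so on), and whichever identity it resolves needs ingest permission on the DCR (typically the Monitoring Metrics Publisher role). If you prefer to pin the identity explicitly, a sketch using a service principal via ClientSecretCredential; the environment variable names are the standard ones read by azure-identity:
```python
# Alternative authentication sketch: explicit service principal credentials.
import os

from azure.identity import ClientSecretCredential
from azure.monitor.ingestion import LogsIngestionClient

credential = ClientSecretCredential(
    tenant_id=os.environ["AZURE_TENANT_ID"],
    client_id=os.environ["AZURE_CLIENT_ID"],
    client_secret=os.environ["AZURE_CLIENT_SECRET"],
)

client = LogsIngestionClient(
    endpoint=os.environ["AZURE_DCE_ENDPOINT"],
    credential=credential
)
```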
## Upload Custom Logs
```python
from azure.monitor.ingestion import LogsIngestionClient
from azure.identity import DefaultAzureCredential
import os
client = LogsIngestionClient(
    endpoint=os.environ["AZURE_DCE_ENDPOINT"],
    credential=DefaultAzureCredential()
)

rule_id = os.environ["AZURE_DCR_RULE_ID"]
stream_name = os.environ["AZURE_DCR_STREAM_NAME"]

logs = [
    {"TimeGenerated": "2024-01-15T10:00:00Z", "Computer": "server1", "Message": "Application started"},
    {"TimeGenerated": "2024-01-15T10:01:00Z", "Computer": "server1", "Message": "Processing request"},
    {"TimeGenerated": "2024-01-15T10:02:00Z", "Computer": "server2", "Message": "Connection established"}
]

client.upload(rule_id=rule_id, stream_name=stream_name, logs=logs)
```
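In practice you will usually stamp TimeGenerated at upload time rather than hard-coding it. A small sketch reusing the client, rule_id, and stream_name from above and the same Computer/Message schema:
```python
import socket
from datetime import datetime, timezone

def make_log(message: str) -> dict:
    # Build one record matching the Computer/Message schema shown above.
    return {
        "TimeGenerated": datetime.now(timezone.utc).isoformat(),
        "Computer": socket.gethostname(),
        "Message": message,
    }

client.upload(
    rule_id=rule_id,
    stream_name=stream_name,
    logs=[make_log("Application started"), make_log("Processing request")],
)
```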
## Upload from JSON File
```python
import json
with open("logs.json", "r") as f:
    logs = json.load(f)

client.upload(rule_id=rule_id, stream_name=stream_name, logs=logs)
```
## Custom Error Handling
Handle partial failures with a callback:
```python
failed_logs = []

def on_error(error):
    print(f"Upload failed: {error.error}")
    failed_logs.extend(error.failed_logs)

client.upload(
    rule_id=rule_id,
    stream_name=stream_name,
    logs=logs,
    on_error=on_error
)

# Retry failed logs
if failed_logs:
    print(f"Retrying {len(failed_logs)} failed logs...")
    client.upload(rule_id=rule_id, stream_name=stream_name, logs=failed_logs)
```
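If uploads can keep failing (for example during a transient outage), you may want to bound the retries instead of retrying once. A sketch with a simple capped backoff, reusing client, rule_id, and stream_name from above; the attempt count and delay are arbitrary choices:
```python
import time

def upload_with_retry(logs_to_send, max_attempts=3, delay_seconds=2):
    """Retry failed sub-batches a bounded number of times."""
    pending = logs_to_send
    for attempt in range(max_attempts):
        failures = []
        client.upload(
            rule_id=rule_id,
            stream_name=stream_name,
            logs=pending,
            on_error=lambda error: failures.extend(error.failed_logs),
        )
        if not failures:
            return []
        pending = failures
        time.sleep(delay_seconds * (attempt + 1))  # simple linear backoff
    return pending  # logs that still failed after all attempts
```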
## Ignore Errors
```python
def ignore_errors(error):
    pass  # Silently ignore upload failures

client.upload(
    rule_id=rule_id,
    stream_name=stream_name,
    logs=logs,
    on_error=ignore_errors
)
```
## Async Client
```python
import asyncio
from azure.monitor.ingestion.aio import LogsIngestionClient
from azure.identity.aio import DefaultAzureCredential
async def upload_logs():
    async with LogsIngestionClient(
        endpoint=endpoint,
        credential=DefaultAzureCredential()
    ) as client:
        await client.upload(
            rule_id=rule_id,
            stream_name=stream_name,
            logs=logs
        )

asyncio.run(upload_logs())
```
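For high-throughput scenarios you can also run several uploads concurrently over one async client. A sketch using asyncio.gather; the batch split is illustrative:
```python
import asyncio
import os

from azure.identity.aio import DefaultAzureCredential
from azure.monitor.ingestion.aio import LogsIngestionClient

async def upload_batches(batches):
    # Upload independent batches concurrently over a single client.
    credential = DefaultAzureCredential()
    async with LogsIngestionClient(
        endpoint=os.environ["AZURE_DCE_ENDPOINT"],
        credential=credential
    ) as client:
        await asyncio.gather(*(
            client.upload(
                rule_id=os.environ["AZURE_DCR_RULE_ID"],
                stream_name=os.environ["AZURE_DCR_STREAM_NAME"],
                logs=batch
            )
            for batch in batches
        ))
    await credential.close()

asyncio.run(upload_batches([logs[:2], logs[2:]]))
```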
## Sovereign Clouds
```python
from azure.identity import AzureAuthorityHosts, DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient
# Azure Government
credential = DefaultAzureCredential(authority=AzureAuthorityHosts.AZURE_GOVERNMENT)
client = LogsIngestionClient(
    endpoint="https://example.ingest.monitor.azure.us",
    credential=credential,
    credential_scopes=["https://monitor.azure.us/.default"]
)
```
## Batching Behavior
The SDK automatically:
- Splits logs into chunks of 1MB or less
- Compresses each chunk with gzip
- Uploads chunks in parallel
No manual batching needed for large log sets.
## Client Types
| Client | Purpose |
|--------|---------|
| `LogsIngestionClient` | Sync client for uploading logs |
| `LogsIngestionClient` (aio) | Async client for uploading logs |
## Key Concepts
| Concept | Description |
|---------|-------------|
| **DCE** | Data Collection Endpoint — ingestion URL |
| **DCR** | Data Collection Rule — defines schema, transformations, destination |
| **Stream** | Named data flow within a DCR |
| **Custom Table** | Target table in Log Analytics (ends with `_CL`) |
## DCR Stream Name Format
Stream names follow patterns:
- `Custom-<TableName>_CL` — For custom tables
- `Microsoft-<TableName>` — For built-in tables
## Best Practices
1. **Use DefaultAzureCredential** for authentication
2. **Handle errors gracefully** — use `on_error` callback for partial failures
3. **Include TimeGenerated** — Required field for all logs
4. **Match DCR schema** — Log fields must match DCR column definitions
5. **Use async client** for high-throughput scenarios
6. **Batch uploads** — SDK handles batching, but send reasonable chunks
7. **Monitor ingestion** — Check Log Analytics for ingestion status (see the query sketch after this list)
8. **Use context manager** — Ensures proper client cleanup
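To confirm that records actually landed (best practice 7), you can query the target table with the separate azure-monitor-query package. A sketch, assuming a hypothetical AZURE_LOG_ANALYTICS_WORKSPACE_ID variable and the MyTable_CL table from the examples above:
```python
# Verification sketch using the separate azure-monitor-query package
# (pip install azure-monitor-query). The workspace ID variable is hypothetical.
import os
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

query_client = LogsQueryClient(DefaultAzureCredential())

response = query_client.query_workspace(
    workspace_id=os.environ["AZURE_LOG_ANALYTICS_WORKSPACE_ID"],
    query="MyTable_CL | where TimeGenerated > ago(1h) | order by TimeGenerated desc",
    timespan=timedelta(hours=1),
)

for table in response.tables:
    for row in table.rows:
        print(row)
```
Newly ingested rows can take a few minutes to become queryable.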
## When to Use
This skill applies whenever you need to execute the workflow or actions described in the overview above.
Related Skills
azure-monitor-opentelemetry-py
Azure Monitor OpenTelemetry Distro for Python. Use for one-line Application Insights setup with auto-instrumentation.
azure-monitor-opentelemetry-exporter-py
Azure Monitor OpenTelemetry Exporter for Python. Use for low-level OpenTelemetry export to Application Insights.
manifest
Install and configure the Manifest observability plugin for your agents. Use when setting up telemetry, configuring API keys, or troubleshooting the plugin.
grafana-dashboards
Create and manage production-ready Grafana dashboards for comprehensive system observability.
distributed-tracing
Implement distributed tracing with Jaeger and Tempo for request flow visibility across microservices.
microsoft-azure-webjobs-extensions-authentication-events-dotnet
Microsoft Entra Authentication Events SDK for .NET. Azure Functions triggers for custom authentication extensions.
claude-monitor
Performance monitor for Claude Code and the local system. Diagnoses slowdowns, measures CPU/RAM/disk usage, checks API latency, and generates system health reports.
azure-web-pubsub-ts
Real-time messaging with WebSocket connections and pub/sub patterns.
azure-storage-queue-ts
Azure Queue Storage JavaScript/TypeScript SDK (@azure/storage-queue) for message queue operations. Use for sending, receiving, peeking, and deleting messages in queues.
azure-storage-queue-py
Azure Queue Storage SDK for Python. Use for reliable message queuing, task distribution, and asynchronous processing.
azure-storage-file-share-ts
Azure File Share JavaScript/TypeScript SDK (@azure/storage-file-share) for SMB file share operations.
azure-storage-file-share-py
Azure Storage File Share SDK for Python. Use for SMB file shares, directories, and file operations in the cloud.