
azure-data-tables-py

Azure Tables SDK for Python (Storage and Cosmos DB). Use for NoSQL key-value storage, entity CRUD, and batch operations.

31,392 stars
Complexity: medium

About this skill

This skill lets AI agents integrate directly with Azure Data Tables, covering both Azure Storage Tables and the Cosmos DB Table API. Using the Azure Tables SDK for Python, agents can manage NoSQL key-value entities with full create, read, update, and delete (CRUD) support, plus batched transactions for high-volume writes. With it, agents can store and retrieve structured data, maintain state across interactions, manage application configuration, or work with external data hosted on Azure's scalable, low-cost table storage.

Best use case

  • Storing and retrieving structured data for AI agent applications
  • Managing conversation history or user preferences for stateful interactions
  • Persisting application configuration settings
  • Tracking agent task progress or results
  • Building lightweight data backends for agent-driven services
  • Implementing a simple, scalable knowledge base


The AI agent will successfully connect to Azure Storage Tables or Cosmos DB Table API, perform requested data operations (e.g., adding an entity, retrieving data, updating records, or executing batch operations), and return confirmation or the requested data to the user.

Practical example

Example input

I need to add a new task to my 'AgentTasks' table. The task has a PartitionKey of 'DailyTasks', a RowKey of 'task_001', and properties 'Description': 'Review project reports', 'Status': 'Pending', 'DueDate': '2026-03-01'.

Example output

Successfully added task_001 to the 'AgentTasks' table with status 'Pending'.
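
Under the hood, that request maps to a single upsert. A minimal sketch of the entity the agent would write (the `TableClient` setup is covered in the SKILL.md source below, and the live call is commented out because it requires Azure credentials):

```python
# Entity for the example request; PartitionKey + RowKey together
# form the unique ID within the table.
task = {
    "PartitionKey": "DailyTasks",
    "RowKey": "task_001",
    "Description": "Review project reports",
    "Status": "Pending",
    "DueDate": "2026-03-01",
}

# With a configured TableClient (requires Azure credentials):
# table_client.upsert_entity(entity=task)
print(f"Successfully added {task['RowKey']} with status '{task['Status']}'.")
```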

When to use this skill

  • When the AI agent needs to store or retrieve structured data in a scalable NoSQL key-value store
  • When working within the Azure ecosystem and requiring persistent data storage
  • When performing high-volume entity operations that benefit from batching
  • When a flexible schema and fast access to data are prioritized over complex relational queries
  • When state management or user-specific data needs to be maintained across agent interactions

When not to use this skill

  • When complex relational queries, joins, or transactional integrity across multiple tables are required (a relational database like Azure SQL would be more suitable)
  • When dealing with large binary objects or unstructured files (Azure Blob Storage or Data Lake would be better)
  • When the agent lacks access to Azure credentials or the necessary permissions
  • For purely temporary data that does not need persistence
  • When latency requirements are so strict that a simple in-memory cache is more appropriate

Installation

Claude Code / Cursor / Codex

```bash
curl -o ~/.claude/skills/azure-data-tables-py/SKILL.md --create-dirs "https://raw.githubusercontent.com/sickn33/antigravity-awesome-skills/main/plugins/antigravity-awesome-skills-claude/skills/azure-data-tables-py/SKILL.md"
```

Manual Installation

  1. Download SKILL.md from GitHub
  2. Place it in .claude/skills/azure-data-tables-py/SKILL.md inside your project
  3. Restart your AI agent — it will auto-discover the skill

How azure-data-tables-py Compares

| Feature / Agent | azure-data-tables-py | Standard Approach |
|-----------------|----------------------|-------------------|
| Platform Support | Claude, ChatGPT, Gemini, Cursor, GitHub Copilot, DeepSeek, Aider, Continue | Limited / Varies |
| Context Awareness | High | Baseline |
| Installation Complexity | Medium | N/A |

Frequently Asked Questions

What does this skill do?

Azure Tables SDK for Python (Storage and Cosmos DB). Use for NoSQL key-value storage, entity CRUD, and batch operations.

Which AI agents support this skill?

This skill is designed for Claude, ChatGPT, Gemini, Cursor, GitHub Copilot, DeepSeek, Aider, Continue.

How difficult is it to install?

The installation complexity is rated as medium. You can find the installation instructions above.

Where can I find the source code?

You can find the source code on GitHub using the link provided at the top of the page.

SKILL.md Source

# Azure Tables SDK for Python

NoSQL key-value store for structured data (Azure Storage Tables or Cosmos DB Table API).

## Installation

```bash
pip install azure-data-tables azure-identity
```

## Environment Variables

```bash
# Azure Storage Tables
AZURE_STORAGE_ACCOUNT_URL=https://<account>.table.core.windows.net

# Cosmos DB Table API
COSMOS_TABLE_ENDPOINT=https://<account>.table.cosmos.azure.com
```

## Authentication

```python
from azure.identity import DefaultAzureCredential
from azure.data.tables import TableServiceClient, TableClient

credential = DefaultAzureCredential()
endpoint = "https://<account>.table.core.windows.net"

# Service client (manage tables)
service_client = TableServiceClient(endpoint=endpoint, credential=credential)

# Table client (work with entities)
table_client = TableClient(endpoint=endpoint, table_name="mytable", credential=credential)
```

## Client Types

| Client | Purpose |
|--------|---------|
| `TableServiceClient` | Create/delete tables, list tables |
| `TableClient` | Entity CRUD, queries |

## Table Operations

```python
# Create table
service_client.create_table("mytable")

# Create if not exists
service_client.create_table_if_not_exists("mytable")

# Delete table
service_client.delete_table("mytable")

# List tables
for table in service_client.list_tables():
    print(table.name)

# Get table client
table_client = service_client.get_table_client("mytable")
```

## Entity Operations

**Important**: Every entity requires `PartitionKey` and `RowKey` (together form unique ID).

### Create Entity

```python
entity = {
    "PartitionKey": "sales",
    "RowKey": "order-001",
    "product": "Widget",
    "quantity": 5,
    "price": 9.99,
    "shipped": False
}

# Create (fails if exists)
table_client.create_entity(entity=entity)

# Upsert (create or replace)
table_client.upsert_entity(entity=entity)
```

### Get Entity

```python
# Get by key (fastest)
entity = table_client.get_entity(
    partition_key="sales",
    row_key="order-001"
)
print(f"Product: {entity['product']}")
```

### Update Entity

```python
# Replace entire entity
entity["quantity"] = 10
table_client.update_entity(entity=entity, mode="replace")

# Merge (update specific fields only)
update = {
    "PartitionKey": "sales",
    "RowKey": "order-001",
    "shipped": True
}
table_client.update_entity(entity=update, mode="merge")
```

### Delete Entity

```python
table_client.delete_entity(
    partition_key="sales",
    row_key="order-001"
)
```

## Query Entities

### Query Within Partition

```python
# Query by partition (efficient)
entities = table_client.query_entities(
    query_filter="PartitionKey eq 'sales'"
)
for entity in entities:
    print(entity)
```

### Query with Filters

```python
# Filter by properties
entities = table_client.query_entities(
    query_filter="PartitionKey eq 'sales' and quantity gt 3"
)

# With parameters (safer)
entities = table_client.query_entities(
    query_filter="PartitionKey eq @pk and price lt @max_price",
    parameters={"pk": "sales", "max_price": 50.0}
)
```

### Select Specific Properties

```python
entities = table_client.query_entities(
    query_filter="PartitionKey eq 'sales'",
    select=["RowKey", "product", "price"]
)
```

### List All Entities

```python
# List all (cross-partition - use sparingly)
for entity in table_client.list_entities():
    print(entity)
```

## Batch Operations

```python
from azure.data.tables import TableTransactionError

# Batch operations (same partition only!)
operations = [
    ("create", {"PartitionKey": "batch", "RowKey": "1", "data": "first"}),
    ("create", {"PartitionKey": "batch", "RowKey": "2", "data": "second"}),
    ("upsert", {"PartitionKey": "batch", "RowKey": "3", "data": "third"}),
]

try:
    table_client.submit_transaction(operations)
except TableTransactionError as e:
    print(f"Transaction failed: {e}")
```

## Async Client

```python
from azure.data.tables.aio import TableServiceClient, TableClient
from azure.identity.aio import DefaultAzureCredential

async def table_operations():
    credential = DefaultAzureCredential()
    
    async with TableClient(
        endpoint="https://<account>.table.core.windows.net",
        table_name="mytable",
        credential=credential
    ) as client:
        # Create
        await client.create_entity(entity={
            "PartitionKey": "async",
            "RowKey": "1",
            "data": "test"
        })
        
        # Query
        async for entity in client.query_entities("PartitionKey eq 'async'"):
            print(entity)

import asyncio
asyncio.run(table_operations())
```

## Data Types

| Python Type | Table Storage Type |
|-------------|-------------------|
| `str` | String |
| `int` | Int64 |
| `float` | Double |
| `bool` | Boolean |
| `datetime` | DateTime |
| `bytes` | Binary |
| `UUID` | Guid |
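
As a quick sketch of the mapping above, the entity below exercises each supported Python type; the SDK converts them to the corresponding Table storage types on write. The service call is commented out since it needs a live `TableClient`:

```python
import uuid
from datetime import datetime, timezone

# One property per supported type; the SDK maps each automatically.
entity = {
    "PartitionKey": "types",
    "RowKey": "demo-001",
    "name": "Widget",                                      # String
    "count": 42,                                           # Int64
    "weight": 1.5,                                         # Double
    "active": True,                                        # Boolean
    "created": datetime(2026, 3, 1, tzinfo=timezone.utc),  # DateTime
    "payload": b"\x00\x01",                                # Binary
    "trace_id": uuid.uuid4(),                              # Guid
}

# table_client.upsert_entity(entity=entity)  # requires a live TableClient
```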

## Best Practices

1. **Design partition keys** for query patterns and even distribution
2. **Query within partitions** whenever possible (cross-partition is expensive)
3. **Use batch operations** for multiple entities in same partition
4. **Use `upsert_entity`** for idempotent writes
5. **Use parameterized queries** to prevent injection
6. **Keep entities small** — max 1MB per entity
7. **Use async client** for high-throughput scenarios
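
To illustrate practice 1, here is a hypothetical helper (not part of the SDK) that buckets entities by user and day: rows for one user's day land in a single partition (cheap point queries, batch-friendly), while different users and days spread writes evenly across partitions.

```python
from datetime import datetime, timezone

def make_keys(user_id: str, event_time: datetime, event_id: str) -> dict:
    """Partition by user + day: one user's daily rows share a partition,
    while distinct users/days distribute load across partitions."""
    return {
        "PartitionKey": f"{user_id}-{event_time:%Y%m%d}",
        "RowKey": event_id,
    }

keys = make_keys("alice", datetime(2026, 3, 1, tzinfo=timezone.utc), "evt-42")
# keys == {"PartitionKey": "alice-20260301", "RowKey": "evt-42"}
```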

## When to Use
Use this skill whenever an agent needs to create, query, update, or delete entities in Azure Storage Tables or the Cosmos DB Table API as part of its workflow.

Related Skills

All related skills below are from sickn33/antigravity-awesome-skills.

  • native-data-fetching (API Integration): Use when implementing or debugging ANY network request, API call, or data fetching. Covers fetch API, React Query, SWR, error handling, caching, offline support, and Expo Router data loaders (useLoaderData).
  • microsoft-azure-webjobs-extensions-authentication-events-dotnet (Identity Management / Authentication & Authorization): Microsoft Entra Authentication Events SDK for .NET. Azure Functions triggers for custom authentication extensions.
  • hugging-face-datasets (Data Management): Create and manage datasets on Hugging Face Hub. Supports initializing repos, defining configs/system prompts, streaming row updates, and SQL-based dataset querying/transformation. Designed to work alongside HF MCP server for comprehensive dataset workflows.
  • hugging-face-dataset-viewer (Data Access & Exploration): Query Hugging Face datasets through the Dataset Viewer API for splits, rows, search, filters, and parquet links.
  • gdpr-data-handling (Legal & Compliance): Practical implementation guide for GDPR-compliant data processing, consent management, and privacy controls.
  • fp-data-transforms (Data Transformation): Everyday data transformations using functional patterns: arrays, objects, grouping, aggregation, and null-safe access.
  • food-database-query (Nutrition): Food Database Query.
  • database (Workflow & Automation Bundles): Database development and operations workflow covering SQL, NoSQL, database design, migrations, optimization, and data engineering.
  • database-optimizer (Database Management): Expert database optimizer specializing in modern performance tuning, query optimization, and scalable architectures.
  • database-migrations-sql-migrations (Database Management): SQL database migrations with zero-downtime strategies for PostgreSQL, MySQL, and SQL Server. Focus on data integrity and rollback plans.
  • database-migrations-migration-observability (DevOps Tools): Migration monitoring, CDC, and observability infrastructure.
  • database-migration (Database Management): Master database schema and data migrations across ORMs (Sequelize, TypeORM, Prisma), including rollback strategies and zero-downtime deployments.