azure-blob-storage

Store and manage unstructured data with Azure Blob Storage. Create containers, upload and organize blobs, configure access tiers (Hot, Cool, Archive) for cost optimization, generate SAS tokens for secure temporary access, and set lifecycle management policies.

26 stars

Best use case

azure-blob-storage is best used when you need a repeatable AI agent workflow instead of a one-off prompt.


Teams using azure-blob-storage should expect more consistent output, faster repeated execution, and less prompt rewriting.

When to use this skill

  • You want a reusable workflow that can be run more than once with consistent structure.

When not to use this skill

  • You only need a quick one-off answer and do not need a reusable workflow.
  • You cannot install or maintain the underlying files, dependencies, or repository context.

Installation

Claude Code / Cursor / Codex

curl -o ~/.claude/skills/azure-blob-storage/SKILL.md --create-dirs "https://raw.githubusercontent.com/TerminalSkills/skills/main/skills/azure-blob-storage/SKILL.md"

Manual Installation

  1. Download SKILL.md from GitHub
  2. Place it in .claude/skills/azure-blob-storage/SKILL.md inside your project
  3. Restart your AI agent — it will auto-discover the skill

How azure-blob-storage Compares

| Feature / Agent | azure-blob-storage | Standard Approach |
| --- | --- | --- |
| Platform Support | Not specified | Limited / Varies |
| Context Awareness | High | Baseline |
| Installation Complexity | Unknown | N/A |

Frequently Asked Questions

What does this skill do?

Store and manage unstructured data with Azure Blob Storage. Create containers, upload and organize blobs, configure access tiers (Hot, Cool, Archive) for cost optimization, generate SAS tokens for secure temporary access, and set lifecycle management policies.

Where can I find the source code?

You can find the source code on GitHub using the link provided at the top of the page.

SKILL.md Source

# Azure Blob Storage

Azure Blob Storage is Microsoft's object storage solution for the cloud. It stores massive amounts of unstructured data — documents, images, videos, backups, and data lakes. Three access tiers (Hot, Cool, Archive) let you optimize costs based on access patterns.

## Core Concepts

- **Storage Account** — top-level namespace for all Azure Storage services
- **Container** — groups blobs, similar to a directory or S3 bucket
- **Blob** — a file (Block, Append, or Page blob types)
- **Access Tier** — Hot (frequent), Cool (infrequent, 30d min), Archive (rare, 180d min)
- **SAS Token** — Shared Access Signature for time-limited, scoped access
- **Lifecycle Policy** — automatic tier transitions and deletion rules
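The account/container/blob hierarchy maps directly onto the blob URL. A minimal sketch (the names reuse the examples from this document; the `blob_url` helper is illustrative, not part of any SDK):

```python
# Anatomy of a blob URL: https://<account>.blob.core.windows.net/<container>/<blob>
from urllib.parse import urlparse

def blob_url(account: str, container: str, blob: str) -> str:
    """Build the endpoint URL for a blob (no credentials included)."""
    return f"https://{account}.blob.core.windows.net/{container}/{blob}"

url = blob_url("myappstorageprod", "uploads", "releases/v1.2.0/app.zip")
print(url)

# Parsing it back: the first path segment is the container, the rest is the blob name
parsed = urlparse(url)
container, _, name = parsed.path.lstrip("/").partition("/")
print(container, name)
```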

## Storage Account Setup

```bash
# Create a storage account
az storage account create \
  --name myappstorageprod \
  --resource-group my-app-rg \
  --location eastus \
  --sku Standard_LRS \
  --kind StorageV2 \
  --access-tier Hot \
  --min-tls-version TLS1_2 \
  --allow-blob-public-access false
```

```bash
# Get connection string
az storage account show-connection-string \
  --name myappstorageprod \
  --resource-group my-app-rg \
  --query connectionString --output tsv
```
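The connection string is typically exported once so that later `az storage` data-plane commands and the SDK can pick it up from the environment. A sketch (`AZURE_STORAGE_CONNECTION_STRING` is the variable the az CLI reads):

```bash
# Export the connection string for subsequent az storage commands
export AZURE_STORAGE_CONNECTION_STRING=$(az storage account show-connection-string \
  --name myappstorageprod \
  --resource-group my-app-rg \
  --query connectionString --output tsv)
```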

## Container Operations

```bash
# Create a container
az storage container create \
  --name uploads \
  --account-name myappstorageprod \
  --auth-mode login
```

```bash
# List containers
az storage container list \
  --account-name myappstorageprod \
  --auth-mode login \
  --query '[].name' --output tsv
```

## Blob Operations

```bash
# Upload a file
az storage blob upload \
  --account-name myappstorageprod \
  --container-name uploads \
  --name releases/v1.2.0/app.zip \
  --file ./build/app.zip \
  --tier Hot \
  --auth-mode login
```

```bash
# Upload a directory
az storage blob upload-batch \
  --account-name myappstorageprod \
  --destination static \
  --source ./dist \
  --auth-mode login \
  --overwrite
```

```bash
# Download a blob
az storage blob download \
  --account-name myappstorageprod \
  --container-name uploads \
  --name releases/v1.2.0/app.zip \
  --file ./app.zip \
  --auth-mode login
```

```bash
# List blobs
az storage blob list \
  --account-name myappstorageprod \
  --container-name uploads \
  --prefix releases/ \
  --auth-mode login \
  --query '[].name' --output tsv
```

```bash
# Set blob access tier
az storage blob set-tier \
  --account-name myappstorageprod \
  --container-name backups \
  --name old-backup.tar.gz \
  --tier Archive \
  --auth-mode login
```

## SAS Tokens

```bash
# Generate a SAS token for a single blob (read access, 1 hour)
# (expiry uses GNU date; on macOS/BSD use: date -u -v+1H +%Y-%m-%dT%H:%MZ)
az storage blob generate-sas \
  --account-name myappstorageprod \
  --container-name uploads \
  --name reports/q4.pdf \
  --permissions r \
  --expiry $(date -u -d '+1 hour' +%Y-%m-%dT%H:%MZ) \
  --auth-mode login \
  --as-user \
  --output tsv
```

```bash
# Generate a container-level SAS (list + read, 24 hours)
# (expiry uses GNU date; on macOS/BSD use: date -u -v+24H +%Y-%m-%dT%H:%MZ)
az storage container generate-sas \
  --account-name myappstorageprod \
  --name uploads \
  --permissions lr \
  --expiry $(date -u -d '+24 hours' +%Y-%m-%dT%H:%MZ) \
  --auth-mode login \
  --as-user \
  --output tsv
```

```python
# Generate SAS token with Python SDK
from azure.storage.blob import BlobServiceClient, generate_blob_sas, BlobSasPermissions
from datetime import datetime, timedelta, timezone

account_name = "myappstorageprod"
account_key = "your-account-key"

sas_token = generate_blob_sas(
    account_name=account_name,
    container_name="uploads",
    blob_name="reports/q4.pdf",
    account_key=account_key,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1)
)
url = f"https://{account_name}.blob.core.windows.net/uploads/reports/q4.pdf?{sas_token}"
print(f"SAS URL: {url}")
```

```python
# Generate SAS for upload (write permission)
upload_sas = generate_blob_sas(
    account_name=account_name,
    container_name="uploads",
    blob_name="user-uploads/avatar.jpg",
    account_key=account_key,
    permission=BlobSasPermissions(write=True, create=True),
    expiry=datetime.now(timezone.utc) + timedelta(minutes=15)
)
```
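A client holding the upload SAS can then write the blob with a plain HTTP PUT, no SDK required. A standard-library sketch (the SAS URL and file bytes are placeholders; creating a block blob over raw HTTP requires the `x-ms-blob-type` header):

```python
import urllib.request

# Hypothetical SAS URL handed to the client (account, path, and token are placeholders)
sas_url = "https://myappstorageprod.blob.core.windows.net/uploads/user-uploads/avatar.jpg?sv=...&sig=..."

data = b"\xff\xd8\xff..."  # the file bytes to upload
req = urllib.request.Request(
    sas_url,
    data=data,
    method="PUT",
    headers={
        "x-ms-blob-type": "BlockBlob",  # required when creating a block blob via PUT
        "Content-Type": "image/jpeg",
    },
)
# urllib.request.urlopen(req) would perform the upload; the SAS token in the
# query string is the only credential the client needs.
```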

## Lifecycle Management

```json
// lifecycle-policy.json — auto-tier and expire blobs
{
  "rules": [
    {
      "name": "archiveLogs",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": ["blockBlob"],
          "prefixMatch": ["logs/"]
        },
        "actions": {
          "baseBlob": {
            "tierToCool": {"daysAfterModificationGreaterThan": 30},
            "tierToArchive": {"daysAfterModificationGreaterThan": 90},
            "delete": {"daysAfterModificationGreaterThan": 365}
          }
        }
      }
    },
    {
      "name": "cleanupSnapshots",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {"blobTypes": ["blockBlob"]},
        "actions": {
          "snapshot": {
            "delete": {"daysAfterCreationGreaterThan": 90}
          }
        }
      }
    }
  ]
}
```

```bash
# Apply lifecycle policy
az storage account management-policy create \
  --account-name myappstorageprod \
  --resource-group my-app-rg \
  --policy @lifecycle-policy.json
```
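The `archiveLogs` rule can be sanity-checked by replaying its thresholds against a blob's age. A minimal sketch in pure Python (no Azure calls; `target_tier` is illustrative and mirrors the JSON policy above):

```python
from datetime import datetime, timedelta, timezone

def target_tier(last_modified: datetime, now: datetime) -> str:
    """Mirror the archiveLogs rule: Cool after 30d, Archive after 90d, delete after 365d."""
    age_days = (now - last_modified).days
    if age_days > 365:
        return "Delete"
    if age_days > 90:
        return "Archive"
    if age_days > 30:
        return "Cool"
    return "Hot"

now = datetime.now(timezone.utc)
print(target_tier(now - timedelta(days=10), now))   # Hot
print(target_tier(now - timedelta(days=45), now))   # Cool
print(target_tier(now - timedelta(days=120), now))  # Archive
```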

## Python SDK Usage

```python
# Upload and download with Python SDK
from azure.storage.blob import BlobServiceClient

blob_service = BlobServiceClient.from_connection_string("your-connection-string")
container = blob_service.get_container_client("uploads")

# Upload
with open("report.pdf", "rb") as f:
    container.upload_blob(name="reports/2024/q4.pdf", data=f, overwrite=True)

# Download
blob = container.get_blob_client("reports/2024/q4.pdf")
with open("downloaded.pdf", "wb") as f:
    stream = blob.download_blob()
    f.write(stream.readall())

# List blobs
for blob in container.list_blobs(name_starts_with="reports/"):
    print(f"{blob.name} ({blob.size} bytes, tier: {blob.blob_tier})")
```

## AzCopy for Bulk Transfers

```bash
# Sync a local directory to blob storage
azcopy sync './dist' 'https://myappstorageprod.blob.core.windows.net/static?SAS_TOKEN' \
  --delete-destination true
```

```bash
# Copy between storage accounts
azcopy copy \
  'https://source.blob.core.windows.net/data/*?SAS' \
  'https://dest.blob.core.windows.net/data/?SAS' \
  --recursive
```

## Best Practices

- Disable public blob access by default; use SAS tokens for temporary sharing
- Use lifecycle management to automatically move cold data to cheaper tiers
- Use AzCopy for large-scale data transfers (parallel, resumable)
- Enable soft delete for accidental deletion recovery (set retention period)
- Use managed identities and RBAC instead of account keys when possible
- Set minimum TLS version to 1.2
- Use Cool tier for data accessed less than once per month
- Enable blob versioning for critical data protection
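Soft delete and versioning from the list above can be switched on with the CLI. A sketch against the example account (the 7-day retention window is illustrative; pick one that matches your recovery needs):

```bash
# Enable blob soft delete with a 7-day retention window
az storage blob service-properties delete-policy update \
  --account-name myappstorageprod \
  --enable true \
  --days-retained 7

# Enable blob versioning at the account level
az storage account blob-service-properties update \
  --account-name myappstorageprod \
  --resource-group my-app-rg \
  --enable-versioning true
```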

Related Skills

s3-storage

26
from TerminalSkills/skills

Manages S3-compatible object storage (AWS S3, MinIO, Cloudflare R2, DigitalOcean Spaces, Backblaze B2, Wasabi, Supabase Storage). Use when the user wants to create buckets, upload/download files, set up lifecycle policies, configure CORS, manage presigned URLs, implement multipart uploads, set up replication, handle versioning, configure access policies, or build file management features on top of S3-compatible APIs. Trigger words: s3, minio, r2, object storage, bucket, presigned url, multipart upload, lifecycle policy, s3 cors, storage backend, file storage, blob storage, spaces, backblaze, wasabi.

gcp-cloud-storage


Manage Google Cloud Storage for scalable object storage. Create and configure buckets, upload and organize objects, generate signed URLs for secure temporary access, set lifecycle rules for cost optimization, and configure access control.

azure-openai


Azure OpenAI Service — OpenAI models (GPT-4o, DALL-E 3, Whisper) on Azure infrastructure. Use when deploying OpenAI models with enterprise compliance (GDPR, HIPAA, SOC2), Azure-native auth via Managed Identity, content filtering, or VNET-isolated deployments. Same OpenAI API, hosted on Azure.

azure-functions


Build serverless applications with Azure Functions. Create HTTP and event-driven functions with input/output bindings, configure triggers for queues, timers, and blob storage. Use Durable Functions for stateful orchestration workflows.

azure-cosmos-db


Build globally distributed apps with Azure Cosmos DB. Work with multiple data models (document, key-value, graph), configure global replication with tunable consistency levels, manage throughput with RU/s, and query with SQL API.

azure-cli


Azure Command Line Interface for managing Microsoft Azure resources. Use when the user needs to create VMs, manage storage accounts, deploy functions, configure resource groups, and automate Azure operations from the terminal.
