Azure Blob Storage

Azure Blob Storage is Microsoft's object storage solution for the cloud. It stores massive amounts of unstructured data — documents, images, videos, backups, and data lakes. Three access tiers (Hot, Cool, Archive) let you optimize costs based on access patterns.


Best use case

This skill is most useful when you need a repeatable, agent-driven workflow for provisioning and managing blob storage, rather than a one-off prompt.

Teams using this skill should expect more consistent output, faster repeated execution, and less prompt rewriting.

When to use this skill

  • You want a reusable workflow that can be run more than once with consistent structure.

When not to use this skill

  • You only need a quick one-off answer and do not need a reusable workflow.
  • You cannot install or maintain the underlying files, dependencies, or repository context.

Installation

Claude Code / Cursor / Codex

curl -o ~/.claude/skills/azure-blob-storage/SKILL.md --create-dirs "https://raw.githubusercontent.com/ComeOnOliver/skillshub/main/skills/TerminalSkills/skills/azure-blob-storage/SKILL.md"

Manual Installation

  1. Download SKILL.md from GitHub
  2. Place it in .claude/skills/azure-blob-storage/SKILL.md inside your project
  3. Restart your AI agent — it will auto-discover the skill

How Azure Blob Storage Compares

Feature / Agent            Azure Blob Storage    Standard Approach
Platform Support           Not specified         Limited / Varies
Context Awareness          High                  Baseline
Installation Complexity    Unknown               N/A

Frequently Asked Questions

What does this skill do?

Azure Blob Storage is Microsoft's object storage solution for the cloud. It stores massive amounts of unstructured data — documents, images, videos, backups, and data lakes. Three access tiers (Hot, Cool, Archive) let you optimize costs based on access patterns.

Where can I find the source code?

You can find the source code on GitHub using the link provided at the top of the page.

SKILL.md Source

# Azure Blob Storage

Azure Blob Storage is Microsoft's object storage solution for the cloud. It stores massive amounts of unstructured data — documents, images, videos, backups, and data lakes. Three access tiers (Hot, Cool, Archive) let you optimize costs based on access patterns.

## Core Concepts

- **Storage Account** — top-level namespace for all Azure Storage services
- **Container** — groups blobs, similar to a directory or S3 bucket
- **Blob** — a file (Block, Append, or Page blob types)
- **Access Tier** — Hot (frequent), Cool (infrequent, 30d min), Archive (rare, 180d min)
- **SAS Token** — Shared Access Signature for time-limited, scoped access
- **Lifecycle Policy** — automatic tier transitions and deletion rules
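
These concepts map directly onto the Python SDK's client hierarchy. A minimal sketch, assuming the `azure-storage-blob` and `azure-identity` packages and a placeholder account name:

```python
# Sketch: the account/container/blob hierarchy as SDK clients.
# Assumes: pip install azure-storage-blob azure-identity
# "myappstorageprod" is a placeholder account name.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Storage Account -> one service client per account endpoint
service = BlobServiceClient(
    account_url="https://myappstorageprod.blob.core.windows.net",
    credential=DefaultAzureCredential(),  # az login / managed identity / env vars
)

# Container -> scoped client, similar to an S3 bucket
container = service.get_container_client("uploads")

# Blob -> a single object inside the container
blob = container.get_blob_client("releases/v1.2.0/app.zip")
print(blob.url)  # https://<account>.blob.core.windows.net/<container>/<blob>
```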

## Storage Account Setup

```bash
# Create a storage account
az storage account create \
  --name myappstorageprod \
  --resource-group my-app-rg \
  --location eastus \
  --sku Standard_LRS \
  --kind StorageV2 \
  --access-tier Hot \
  --min-tls-version TLS1_2 \
  --allow-blob-public-access false
```

```bash
# Get connection string
az storage account show-connection-string \
  --name myappstorageprod \
  --resource-group my-app-rg \
  --query connectionString --output tsv
```
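
Rather than hardcoding the connection string, one common pattern is to export it as `AZURE_STORAGE_CONNECTION_STRING` (which the `az storage` data-plane commands also read) and load it in code. A minimal sketch:

```python
# Sketch: load the connection string from the environment, not source code.
# Assumes AZURE_STORAGE_CONNECTION_STRING was exported beforehand, e.g.:
#   export AZURE_STORAGE_CONNECTION_STRING="$(az storage account show-connection-string ...)"
import os

from azure.storage.blob import BlobServiceClient

conn_str = os.environ["AZURE_STORAGE_CONNECTION_STRING"]  # fails loudly if unset
service = BlobServiceClient.from_connection_string(conn_str)
print(service.account_name)
```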

## Container Operations

```bash
# Create a container
az storage container create \
  --name uploads \
  --account-name myappstorageprod \
  --auth-mode login
```

```bash
# List containers
az storage container list \
  --account-name myappstorageprod \
  --auth-mode login \
  --query '[].name' --output tsv
```
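
The same container operations are available through the Python SDK. A short sketch, with the account name as a placeholder:

```python
# Sketch: create and list containers via the SDK.
from azure.core.exceptions import ResourceExistsError
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://myappstorageprod.blob.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),
)

try:
    service.create_container("uploads")
except ResourceExistsError:
    pass  # container already exists; creation becomes effectively idempotent

for props in service.list_containers():
    print(props.name)
```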

## Blob Operations

```bash
# Upload a file
az storage blob upload \
  --account-name myappstorageprod \
  --container-name uploads \
  --name releases/v1.2.0/app.zip \
  --file ./build/app.zip \
  --tier Hot \
  --auth-mode login
```

```bash
# Upload a directory
az storage blob upload-batch \
  --account-name myappstorageprod \
  --destination static \
  --source ./dist \
  --auth-mode login \
  --overwrite
```

```bash
# Download a blob
az storage blob download \
  --account-name myappstorageprod \
  --container-name uploads \
  --name releases/v1.2.0/app.zip \
  --file ./app.zip \
  --auth-mode login
```

```bash
# List blobs
az storage blob list \
  --account-name myappstorageprod \
  --container-name uploads \
  --prefix releases/ \
  --auth-mode login \
  --query '[].name' --output tsv
```

```bash
# Set blob access tier
az storage blob set-tier \
  --account-name myappstorageprod \
  --container-name backups \
  --name old-backup.tar.gz \
  --tier Archive \
  --auth-mode login
```
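
Tier changes are also exposed on the SDK's blob client. Note that moving a blob *out* of Archive is an asynchronous rehydration that can take hours; a sketch, with placeholder names:

```python
# Sketch: change a blob's access tier from Python.
# Rehydrating FROM Archive (back to Hot/Cool) is asynchronous and can take
# hours; archive_status on the properties reports rehydration in progress.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient

blob = BlobClient(
    account_url="https://myappstorageprod.blob.core.windows.net",  # placeholder
    container_name="backups",
    blob_name="old-backup.tar.gz",
    credential=DefaultAzureCredential(),
)

blob.set_standard_blob_tier("Archive")  # cheap storage, reads go offline
# Later, to read the data again:
blob.set_standard_blob_tier("Hot")      # starts rehydration
props = blob.get_blob_properties()
print(props.blob_tier, props.archive_status)  # e.g. "rehydrate-pending-to-hot"
```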

## SAS Tokens

```bash
# Generate a SAS token for a single blob (read access, 1 hour)
# Note: 'date -d' is GNU syntax; on macOS use e.g. date -u -v+1H +%Y-%m-%dT%H:%MZ
az storage blob generate-sas \
  --account-name myappstorageprod \
  --container-name uploads \
  --name reports/q4.pdf \
  --permissions r \
  --expiry $(date -u -d '+1 hour' +%Y-%m-%dT%H:%MZ) \
  --auth-mode login \
  --as-user \
  --output tsv
```

```bash
# Generate a container-level SAS (list + read, 24 hours; GNU date, see note above)
az storage container generate-sas \
  --account-name myappstorageprod \
  --name uploads \
  --permissions lr \
  --expiry $(date -u -d '+24 hours' +%Y-%m-%dT%H:%MZ) \
  --auth-mode login \
  --as-user \
  --output tsv
```

```python
# Generate SAS token with Python SDK
from azure.storage.blob import generate_blob_sas, BlobSasPermissions
from datetime import datetime, timedelta, timezone

account_name = "myappstorageprod"
account_key = "your-account-key"

sas_token = generate_blob_sas(
    account_name=account_name,
    container_name="uploads",
    blob_name="reports/q4.pdf",
    account_key=account_key,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1)
)
url = f"https://{account_name}.blob.core.windows.net/uploads/reports/q4.pdf?{sas_token}"
print(f"SAS URL: {url}")
```

```python
# Generate SAS for upload (write permission)
# Continues the previous snippet: reuses the imports, account_name, account_key
upload_sas = generate_blob_sas(
    account_name=account_name,
    container_name="uploads",
    blob_name="user-uploads/avatar.jpg",
    account_key=account_key,
    permission=BlobSasPermissions(write=True, create=True),
    expiry=datetime.now(timezone.utc) + timedelta(minutes=15)
)
```
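
A client holding the write SAS does not need the Azure SDK at all; it can PUT directly against the REST endpoint. A minimal sketch using `requests` (the `x-ms-blob-type` header is required by the Put Blob operation; `account_name` and `upload_sas` come from the snippets above):

```python
# Sketch: upload with a write SAS over plain HTTPS (no Azure SDK on the client).
# Suitable for small files; large uploads should use block-based upload instead.
import requests

url = (
    f"https://{account_name}.blob.core.windows.net"
    f"/uploads/user-uploads/avatar.jpg?{upload_sas}"
)
with open("avatar.jpg", "rb") as f:
    resp = requests.put(
        url,
        data=f,
        headers={"x-ms-blob-type": "BlockBlob"},  # required by Put Blob
    )
resp.raise_for_status()  # expect 201 Created on success
```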

## Lifecycle Management

```json
// lifecycle-policy.json — auto-tier and expire blobs
{
  "rules": [
    {
      "name": "archiveLogs",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": ["blockBlob"],
          "prefixMatch": ["logs/"]
        },
        "actions": {
          "baseBlob": {
            "tierToCool": {"daysAfterModificationGreaterThan": 30},
            "tierToArchive": {"daysAfterModificationGreaterThan": 90},
            "delete": {"daysAfterModificationGreaterThan": 365}
          }
        }
      }
    },
    {
      "name": "cleanupSnapshots",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {"blobTypes": ["blockBlob"]},
        "actions": {
          "snapshot": {
            "delete": {"daysAfterCreationGreaterThan": 90}
          }
        }
      }
    }
  ]
}
```

```bash
# Apply lifecycle policy
az storage account management-policy create \
  --account-name myappstorageprod \
  --resource-group my-app-rg \
  --policy @lifecycle-policy.json
```
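
The rules above fire on days since last modification, evaluated per blob by the service (roughly once a day). As a sanity check on the thresholds, here is a small pure-Python sketch of the same decision logic; it is illustrative only, since the real evaluation happens service-side:

```python
# Sketch: the archiveLogs rule's tiering logic, re-expressed in plain Python.
from datetime import datetime, timezone

def expected_action(last_modified: datetime) -> str:
    age_days = (datetime.now(timezone.utc) - last_modified).days
    if age_days > 365:
        return "delete"
    if age_days > 90:
        return "tierToArchive"
    if age_days > 30:
        return "tierToCool"
    return "keep in Hot"

print(expected_action(datetime(2024, 1, 1, tzinfo=timezone.utc)))
```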

## Python SDK Usage

```python
# Upload and download with Python SDK
from azure.storage.blob import BlobServiceClient

blob_service = BlobServiceClient.from_connection_string("your-connection-string")
container = blob_service.get_container_client("uploads")

# Upload
with open("report.pdf", "rb") as f:
    container.upload_blob(name="reports/2024/q4.pdf", data=f, overwrite=True)

# Download
blob = container.get_blob_client("reports/2024/q4.pdf")
with open("downloaded.pdf", "wb") as f:
    stream = blob.download_blob()
    f.write(stream.readall())

# List blobs
for blob in container.list_blobs(name_starts_with="reports/"):
    print(f"{blob.name} ({blob.size} bytes, tier: {blob.blob_tier})")
```
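
For anything beyond the happy path, it helps to handle the SDK's typed exceptions rather than letting them propagate. A short sketch, continuing from the `container` client above:

```python
# Sketch: defensive download using the SDK's typed exceptions.
from azure.core.exceptions import ResourceNotFoundError

blob = container.get_blob_client("reports/2024/q4.pdf")  # container from above
try:
    data = blob.download_blob().readall()
except ResourceNotFoundError:
    data = None  # the blob (or its container) does not exist
    print("blob not found; nothing to download")

# blob.exists() is the non-exception alternative, at the cost of a round trip
if blob.exists():
    print(blob.get_blob_properties().size, "bytes")
```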

## AzCopy for Bulk Transfers

```bash
# Sync a local directory to blob storage
azcopy sync './dist' 'https://myappstorageprod.blob.core.windows.net/static?SAS_TOKEN' \
  --delete-destination true
```

```bash
# Copy between storage accounts
azcopy copy \
  'https://source.blob.core.windows.net/data/*?SAS' \
  'https://dest.blob.core.windows.net/data/?SAS' \
  --recursive
```
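
When AzCopy runs inside automation, it is typically driven as a subprocess. A minimal sketch, assuming `azcopy` is on PATH and a container SAS token lives in a hypothetical `STATIC_SAS` environment variable:

```python
# Sketch: drive azcopy from Python for scheduled syncs.
# Assumes: azcopy installed and on PATH; STATIC_SAS holds a container SAS token.
import os
import subprocess

sas = os.environ["STATIC_SAS"]
dest = f"https://myappstorageprod.blob.core.windows.net/static?{sas}"  # placeholder

subprocess.run(
    ["azcopy", "sync", "./dist", dest, "--delete-destination", "true"],
    check=True,  # raise CalledProcessError if azcopy exits non-zero
)
```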

## Best Practices

- Disable public blob access by default; use SAS tokens for temporary sharing
- Use lifecycle management to automatically move cold data to cheaper tiers
- Use AzCopy for large-scale data transfers (parallel, resumable)
- Enable soft delete for accidental deletion recovery (set a retention period; see the sketch below)
- Use managed identities and RBAC instead of account keys when possible
- Set minimum TLS version to 1.2
- Use Cool tier for data accessed less than once per month
- Enable blob versioning for critical data protection
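
For the soft-delete recommendation above, a minimal sketch using the data-plane SDK; `RetentionPolicy` and `set_service_properties` are assumed to behave as in current `azure-storage-blob` releases, and the same setting is also exposed through the CLI (`az storage account blob-service-properties update`) and the portal:

```python
# Sketch: enable blob soft delete via service properties (data plane).
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient, RetentionPolicy

service = BlobServiceClient(
    account_url="https://myappstorageprod.blob.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),  # needs sufficient RBAC on the account
)
service.set_service_properties(
    delete_retention_policy=RetentionPolicy(enabled=True, days=14)
)
```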

Related Skills

azure-ml-deployer

Azure ML Deployer - Auto-activating skill for ML Deployment. Triggers on: "azure ml deployer". Part of the ML Deployment skill category.

azure-verified-modules

Azure Verified Modules (AVM) requirements and best practices for developing certified Azure Terraform modules. Use when creating or reviewing Azure modules that need AVM certification.

azure-image-builder

Build Azure managed images and Azure Compute Gallery images with Packer. Use when creating custom images for Azure VMs.

terraform-azurerm-set-diff-analyzer

Analyze Terraform plan JSON output for AzureRM Provider to distinguish between false-positive diffs (order-only changes in Set-type attributes) and actual resource changes. Use when reviewing terraform plan output for Azure resources like Application Gateway, Load Balancer, Firewall, Front Door, NSG, and other resources with Set-type attributes that cause spurious diffs due to internal ordering changes.

azure-static-web-apps

Helps create, configure, and deploy Azure Static Web Apps using the SWA CLI. Use when deploying static sites to Azure, setting up SWA local development, configuring staticwebapp.config.json, adding Azure Functions APIs to SWA, or setting up GitHub Actions CI/CD for Static Web Apps.

azure-resource-health-diagnose

Analyze Azure resource health, diagnose issues from logs and telemetry, and create a remediation plan for identified problems.

azure-pricing

Fetches real-time Azure retail pricing using the Azure Retail Prices API (prices.azure.com) and estimates Copilot Studio agent credit consumption. Use when the user asks about the cost of any Azure service, wants to compare SKU prices, needs pricing data for a cost estimate, mentions Azure pricing, Azure costs, Azure billing, or asks about Copilot Studio pricing, Copilot Credits, or agent usage estimation. Covers compute, storage, networking, databases, AI, Copilot Studio, and all other Azure service families.

azure-devops-cli

Manage Azure DevOps resources via CLI including projects, repos, pipelines, builds, pull requests, work items, artifacts, and service endpoints. Use when working with Azure DevOps, az commands, devops automation, CI/CD, or when user mentions Azure DevOps CLI.

azure-deployment-preflight

Performs comprehensive preflight validation of Bicep deployments to Azure, including template syntax validation, what-if analysis, and permission checks. Use this skill before any deployment to Azure to preview changes, identify potential issues, and ensure the deployment will succeed. Activate when users mention deploying to Azure, validating Bicep files, checking deployment permissions, previewing infrastructure changes, running what-if, or preparing for azd provision.

microsoft-azure-webjobs-extensions-authentication-events-dotnet

Microsoft Entra Authentication Events SDK for .NET. Azure Functions triggers for custom authentication extensions. Use for token enrichment, custom claims, attribute collection, and OTP customization in Entra ID. Triggers: "Authentication Events", "WebJobsAuthenticationEventsTrigger", "OnTokenIssuanceStart", "OnAttributeCollectionStart", "custom claims", "token enrichment", "Entra custom extension", "authentication extension".

azure-web-pubsub-ts

Build real-time messaging applications using Azure Web PubSub SDKs for JavaScript (@azure/web-pubsub, @azure/web-pubsub-client). Use when implementing WebSocket-based real-time features, pub/sub messaging, group chat, or live notifications.

azure-storage-queue-ts

Azure Queue Storage JavaScript/TypeScript SDK (@azure/storage-queue) for message queue operations. Use for sending, receiving, peeking, and deleting messages in queues. Supports visibility timeout, message encoding, and batch operations. Triggers: "queue storage", "@azure/storage-queue", "QueueServiceClient", "QueueClient", "send message", "receive message", "dequeue", "visibility timeout".