clade-deploy-integration
Deploy Claude-powered applications to Vercel, Fly.io, and Cloud Run with proper secrets management and streaming support. Use when working with deploy-integration patterns. Trigger with "deploy anthropic", "claude production deploy", "anthropic vercel", "deploy claude app".
Best use case
clade-deploy-integration is best used when you need a repeatable AI agent workflow instead of a one-off prompt.
Teams using clade-deploy-integration should expect more consistent output, faster repeated execution, and less prompt rewriting.
When to use this skill
- You want a reusable workflow that can be run more than once with consistent structure.
When not to use this skill
- You only need a quick one-off answer and do not need a reusable workflow.
- You cannot install or maintain the underlying files, dependencies, or repository context.
Installation
Claude Code / Cursor / Codex
Manual Installation
- Download SKILL.md from GitHub
- Place it in .claude/skills/clade-deploy-integration/SKILL.md inside your project
- Restart your AI agent; it will auto-discover the skill
How clade-deploy-integration Compares
| Feature / Agent | clade-deploy-integration | Standard Approach |
|---|---|---|
| Platform Support | Vercel, Fly.io, Cloud Run | Limited / Varies |
| Context Awareness | High | Baseline |
| Installation Complexity | Low (single SKILL.md file) | N/A |
Frequently Asked Questions
What does this skill do?
It deploys Claude-powered applications to Vercel, Fly.io, or Cloud Run with proper secrets management and streaming support, and activates on trigger phrases like "deploy anthropic", "claude production deploy", "anthropic vercel", and "deploy claude app".
Where can I find the source code?
You can find the source code on GitHub using the link provided at the top of the page.
SKILL.md Source
# Deploy Anthropic Integration
## Overview
Claude integrations are stateless API wrappers — a serverless function receives a user request, streams from the Messages API, and returns the response. No database, no connection pool, no persistent state.
## Vercel Edge Function (Recommended)
```typescript
// app/api/chat/route.ts (Next.js App Router)
import Anthropic from '@anthropic-ai/sdk';
export const runtime = 'edge';
export async function POST(req: Request) {
  const client = new Anthropic();
  const { messages, system } = await req.json();

  const stream = await client.messages.create({
    model: 'claude-sonnet-4-20250514',
    max_tokens: 4096,
    system: system || 'You are a helpful assistant.',
    messages,
    stream: true,
  });

  // Convert the Anthropic stream to a ReadableStream for SSE
  const encoder = new TextEncoder();
  const readable = new ReadableStream({
    async start(controller) {
      for await (const event of stream) {
        if (event.type === 'content_block_delta' && event.delta.type === 'text_delta') {
          controller.enqueue(encoder.encode(`data: ${JSON.stringify(event.delta)}\n\n`));
        }
      }
      controller.enqueue(encoder.encode('data: [DONE]\n\n'));
      controller.close();
    },
  });

  return new Response(readable, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
    },
  });
}
```
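On the client side, the SSE payload emitted by this route can be parsed back into text. A minimal sketch, assuming the route above is deployed at `/api/chat`; `extractTextDeltas` and `streamChat` are illustrative helpers, not part of any SDK:

```typescript
// Pull the text deltas out of one SSE chunk produced by the route above.
// Each event line looks like: data: {"type":"text_delta","text":"..."}
function extractTextDeltas(sseChunk: string): string[] {
  const deltas: string[] = [];
  for (const line of sseChunk.split('\n')) {
    if (!line.startsWith('data: ')) continue;
    const payload = line.slice('data: '.length);
    if (payload === '[DONE]') break;
    const delta = JSON.parse(payload);
    if (delta.type === 'text_delta') deltas.push(delta.text);
  }
  return deltas;
}

// Browser-side usage sketch: POST messages, read the stream, accumulate text.
async function streamChat(messages: { role: string; content: string }[]): Promise<string> {
  const res = await fetch('/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ messages }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let text = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    text += extractTextDeltas(decoder.decode(value, { stream: true })).join('');
  }
  return text;
}
```

In production you would also surface partial text to the UI as each chunk arrives rather than only returning the final string.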
## Instructions
### Step 1: Deploy to Vercel
```bash
# Add secret
vercel env add ANTHROPIC_API_KEY
# Deploy
vercel --prod
```
## Fly.io (Long-Running / WebSocket)
```dockerfile
FROM node:20-slim
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]
```
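The Dockerfile runs `node server.js` and installs production dependencies, which implies a package.json along these lines. A minimal sketch; the package name and SDK version range are placeholders:

```json
{
  "name": "my-claude-app",
  "private": true,
  "type": "module",
  "scripts": {
    "start": "node server.js"
  },
  "dependencies": {
    "@anthropic-ai/sdk": "^0.39.0"
  }
}
```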
```bash
fly launch --name my-claude-app
fly secrets set ANTHROPIC_API_KEY=sk-ant-api03-...
fly deploy
```
## Google Cloud Run
```bash
gcloud run deploy claude-api \
--source . \
--region us-central1 \
--allow-unauthenticated \
--set-secrets=ANTHROPIC_API_KEY=claude-key:latest \
--timeout=300 \
--concurrency=80
```
## Health Check
```typescript
// api/health.ts
import Anthropic from '@anthropic-ai/sdk';

export async function GET() {
  try {
    const client = new Anthropic();
    const msg = await client.messages.create({
      model: 'claude-haiku-4-5-20251001',
      max_tokens: 5,
      messages: [{ role: 'user', content: 'ping' }],
    });
    return Response.json({ status: 'healthy', model: msg.model });
  } catch (err) {
    const message = err instanceof Error ? err.message : String(err);
    return Response.json({ status: 'unhealthy', error: message }, { status: 503 });
  }
}
```
## Environment Variables
| Variable | Required | Description |
|----------|----------|-------------|
| `ANTHROPIC_API_KEY` | Yes | API key from console.anthropic.com |
| `ANTHROPIC_MODEL` | No | Default model ID (override per request) |
| `ANTHROPIC_MAX_TOKENS` | No | Default max tokens |
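The optional variables above can be resolved into per-deployment defaults that individual requests may still override. A sketch of one way to do this; `resolveConfig` and the fallback values are illustrative, not part of the SDK:

```typescript
// Resolve deployment defaults from the environment variables in the table above.
// Callers can still override model/max_tokens per request.
function resolveConfig(env: Record<string, string | undefined>) {
  return {
    model: env.ANTHROPIC_MODEL ?? 'claude-sonnet-4-20250514',
    maxTokens: Number(env.ANTHROPIC_MAX_TOKENS ?? 4096),
  };
}

// Usage: const { model, maxTokens } = resolveConfig(process.env);
```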
## Output
- Application deployed to chosen platform with streaming support
- `ANTHROPIC_API_KEY` stored in platform secrets manager
- Health check endpoint returning Claude connectivity status
- Environment-specific configuration (model, max_tokens) in place
## Error Handling
| Issue | Cause | Solution |
|-------|-------|----------|
| `FUNCTION_INVOCATION_TIMEOUT` | Claude response > function timeout | Set timeout to 300s. Use streaming. |
| Secret not found | Missing env var | Add via platform CLI |
| 529 in production | API overloaded | SDK retries automatically. Add fallback model. |
| CORS errors | Missing headers | Add CORS headers to API route |
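The 529 fallback row can be implemented as a small wrapper. A sketch, assuming the API client throws errors carrying a numeric `status` field (as the Anthropic SDK's API errors do); `withFallback` is a hypothetical helper:

```typescript
// Try each model in order, falling back only when the API reports
// 529 (overloaded). Any other error is surfaced immediately.
async function withFallback<T>(
  callModel: (model: string) => Promise<T>,
  models: string[],
): Promise<T> {
  let lastErr: unknown;
  for (const model of models) {
    try {
      return await callModel(model);
    } catch (err) {
      if ((err as { status?: number }).status !== 529) throw err;
      lastErr = err; // overloaded: try the next model
    }
  }
  throw lastErr;
}

// Usage sketch, wrapping the client call from the route above:
// const msg = await withFallback(
//   (model) => client.messages.create({ model, max_tokens: 4096, messages }),
//   ['claude-sonnet-4-20250514', 'claude-haiku-4-5-20251001'],
// );
```

Injecting the call as a function keeps the wrapper independent of any particular client, so it composes with the SDK's own automatic retries.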
## Examples
See Vercel Edge Function (with SSE streaming), Fly.io Dockerfile, Cloud Run deploy script, and Health Check endpoint above.
## Resources
- [Anthropic API Docs](https://docs.anthropic.com/en/api/getting-started)
- [Vercel AI SDK](https://sdk.vercel.ai/docs) (optional higher-level wrapper)
## Next Steps
See `clade-observability` for monitoring your Claude calls in production.
## Prerequisites
- Completed `clade-install-auth` and `clade-prod-checklist`
- Production Anthropic API key (separate from dev key)
- Platform CLI installed: `vercel`, `fly`, or `gcloud`
- Application code tested locally
Related Skills
zapier-integration-helper
Zapier Integration Helper - Auto-activating skill for Business Automation. Triggers on: zapier integration helper. Part of the Business Automation skill category.
vertex-ai-deployer
Vertex AI Deployer - Auto-activating skill for ML Deployment. Triggers on: vertex ai deployer. Part of the ML Deployment skill category.
sagemaker-endpoint-deployer
SageMaker Endpoint Deployer - Auto-activating skill for ML Deployment. Triggers on: sagemaker endpoint deployer. Part of the ML Deployment skill category.
orchestrating-deployment-pipelines
Deploy use when you need to work with deployment and CI/CD. This skill provides deployment automation and orchestration with comprehensive guidance and automation. Trigger with phrases like "deploy application", "create pipeline", or "automate deployment".
deploying-monitoring-stacks
This skill deploys monitoring stacks, including Prometheus, Grafana, and Datadog. It is used when the user needs to set up or configure monitoring infrastructure for applications or systems. The skill generates production-ready configurations, implements best practices, and supports multi-platform deployments. Use this when the user explicitly requests to deploy a monitoring stack, or mentions Prometheus, Grafana, or Datadog in the context of infrastructure setup.
deploying-machine-learning-models
This skill enables Claude to deploy machine learning models to production environments. It automates the deployment workflow, implements best practices for serving models, optimizes performance, and handles potential errors. Use this skill when the user requests to deploy a model, serve a model via an API, or put a trained model into a production environment. The skill is triggered by requests containing terms like "deploy model," "productionize model," "serve model," or "model deployment."
managing-deployment-rollbacks
Deploy use when you need to work with deployment and CI/CD. This skill provides deployment automation and orchestration with comprehensive guidance and automation. Trigger with phrases like "deploy application", "create pipeline", or "automate deployment".
kubernetes-deployment-creator
Kubernetes Deployment Creator - Auto-activating skill for DevOps Advanced. Triggers on: kubernetes deployment creator. Part of the DevOps Advanced skill category.
integration-test-setup
Integration Test Setup - Auto-activating skill for Test Automation. Triggers on: integration test setup. Part of the Test Automation skill category.
running-integration-tests
This skill enables Claude to run and manage integration test suites. It automates environment setup, database seeding, service orchestration, and cleanup. Use this skill when the user asks to "run integration tests", "execute integration tests", or any command that implies running integration tests for a project, including specifying particular test suites or options like code coverage. It is triggered by phrases such as "/run-integration", "/rit", or requests mentioning "integration tests". The plugin handles database creation, migrations, seeding, and dependent service management.
integration-test-generator
Integration Test Generator - Auto-activating skill for API Integration. Triggers on: integration test generator. Part of the API Integration skill category.
fathom-ci-integration
Test Fathom integrations in CI/CD pipelines. Trigger with phrases like "fathom CI", "fathom github actions", "test fathom pipeline".