coreweave-local-dev-loop
Set up local development workflow for CoreWeave GPU deployments. Use when building containers locally, testing YAML manifests, or iterating on model serving configurations before deploying. Trigger with phrases like "coreweave dev setup", "coreweave local testing", "develop for coreweave", "coreweave container build".
Best use case
coreweave-local-dev-loop is best used when you need a repeatable AI agent workflow instead of a one-off prompt.
Teams using coreweave-local-dev-loop should expect more consistent output, faster repeated execution, and less prompt rewriting.
When to use this skill
- You want a reusable workflow that can be run more than once with consistent structure.
When not to use this skill
- You only need a quick one-off answer and do not need a reusable workflow.
- You cannot install or maintain the underlying files, dependencies, or repository context.
Installation
Claude Code / Cursor / Codex
Manual Installation
- Download SKILL.md from GitHub
- Place it at `.claude/skills/coreweave-local-dev-loop/SKILL.md` inside your project
- Restart your AI agent — it will auto-discover the skill
How coreweave-local-dev-loop Compares
| Feature / Agent | coreweave-local-dev-loop | Standard Approach |
|---|---|---|
| Platform Support | Not specified | Limited / Varies |
| Context Awareness | High | Baseline |
| Installation Complexity | Unknown | N/A |
Frequently Asked Questions
What does this skill do?
It sets up a local development loop for CoreWeave GPU deployments: building containers locally, validating Kubernetes manifests with server-side dry-run, pushing to a registry, and iterating on model serving configurations before deploying to CoreWeave CKS.
Where can I find the source code?
You can find the source code on GitHub using the link provided at the top of the page.
SKILL.md Source
# CoreWeave Local Dev Loop

## Overview

Local development workflow for CoreWeave: build containers, test YAML manifests with dry-run, push to registry, and deploy to CoreWeave CKS.

## Prerequisites

- Completed `coreweave-install-auth` setup
- Docker installed locally
- Container registry access (Docker Hub, GHCR, or CoreWeave registry)

## Instructions

### Step 1: Project Structure

```
my-inference-service/
├── Dockerfile
├── src/
│   ├── server.py          # Inference server code
│   └── model_config.py    # Model configuration
├── k8s/
│   ├── deployment.yaml    # GPU deployment manifest
│   ├── service.yaml       # Service and ingress
│   └── hpa.yaml           # Horizontal pod autoscaler
├── scripts/
│   ├── build.sh           # Build and push container
│   └── deploy.sh          # Deploy to CoreWeave
├── .env.local
└── Makefile
```

### Step 2: Build and Push Container

```bash
# Build locally
docker build -t my-inference:latest .

# Tag for registry
docker tag my-inference:latest ghcr.io/myorg/my-inference:v1.0.0

# Push
docker push ghcr.io/myorg/my-inference:v1.0.0
```

### Step 3: Validate Manifests Before Deploy

```bash
# Dry-run against CoreWeave cluster
kubectl apply -f k8s/deployment.yaml --dry-run=server

# Diff against current state
kubectl diff -f k8s/deployment.yaml

# Check resource requests match available GPU types
kubectl get nodes -l gpu.nvidia.com/class=A100_PCIE_80GB --no-headers | wc -l
```

### Step 4: Deploy and Watch

```bash
kubectl apply -f k8s/
kubectl rollout status deployment/my-inference
kubectl logs -f deployment/my-inference
```

## Error Handling

| Error | Cause | Solution |
|-------|-------|----------|
| Image pull backoff | Wrong registry or no pull secret | Create an imagePullSecret and reference it in the deployment |
| CUDA mismatch | Container CUDA version vs. node driver version | Match the container's CUDA version to the node drivers |
| Dry-run fails | Invalid manifest | Fix YAML syntax |

## Resources

- [CoreWeave CKS Docs](https://docs.coreweave.com/docs/products/cks)
- [kubectl dry-run](https://kubernetes.io/docs/reference/kubectl/)

## Next Steps

See `coreweave-sdk-patterns` for inference client patterns.
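The `k8s/deployment.yaml` from the project layout above could be sketched as follows. This is an illustrative outline, not CoreWeave's canonical manifest: the image name, pull-secret name, and GPU class label are assumptions, and the `gpu.nvidia.com/class` value must match an actual node label in your cluster (the same label queried in Step 3).

```yaml
# k8s/deployment.yaml — illustrative sketch; image, secret, and GPU class are assumptions
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-inference
spec:
  replicas: 1
  selector:
    matchLabels:
      app: my-inference
  template:
    metadata:
      labels:
        app: my-inference
    spec:
      nodeSelector:
        gpu.nvidia.com/class: A100_PCIE_80GB   # must match a node label in your cluster
      containers:
        - name: server
          image: ghcr.io/myorg/my-inference:v1.0.0
          resources:
            limits:
              nvidia.com/gpu: 1                # one GPU per pod
      imagePullSecrets:
        - name: ghcr-pull-secret               # hypothetical secret name
```

Validating this file with `kubectl apply --dry-run=server` (Step 3) will catch schema errors before anything is scheduled.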
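The build/tag/push commands from Step 2 can be wrapped in the `scripts/build.sh` helper named in the project layout. A minimal sketch, assuming GHCR as the registry; `REGISTRY`, `IMAGE`, and `VERSION` are hypothetical defaults overridable from the environment, and a `DRY_RUN` guard (on by default here) prints the docker commands instead of running them:

```shell
#!/usr/bin/env sh
# build.sh — hypothetical helper wrapping the build/tag/push commands from Step 2.
# REGISTRY, IMAGE, and VERSION are illustrative defaults; override via environment.
set -eu

REGISTRY="${REGISTRY:-ghcr.io/myorg}"
IMAGE="${IMAGE:-my-inference}"
VERSION="${VERSION:-dev}"
TAG="$REGISTRY/$IMAGE:$VERSION"

# With DRY_RUN=1 (the default), print each command instead of executing it
run() {
  if [ "${DRY_RUN:-1}" = "1" ]; then
    echo "$@"
  else
    "$@"
  fi
}

run docker build -t "$TAG" .
run docker push "$TAG"
echo "built tag: $TAG"
```

Run `DRY_RUN=0 VERSION=v1.0.0 ./scripts/build.sh` to actually build and push a release tag.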
Related Skills
exa-local-dev-loop
Configure Exa local development with hot reload, testing, and mock responses. Use when setting up a development environment, writing tests against Exa, or establishing a fast iteration cycle. Trigger with phrases like "exa dev setup", "exa local development", "exa test setup", "develop with exa", "mock exa".
evernote-local-dev-loop
Set up efficient local development workflow for Evernote integrations. Use when configuring dev environment, setting up sandbox testing, or optimizing development iteration speed. Trigger with phrases like "evernote dev setup", "evernote local development", "evernote sandbox", "test evernote locally".
elevenlabs-local-dev-loop
Configure local ElevenLabs development with mocking, hot reload, and audio testing. Use when setting up a dev environment for TTS/voice projects, configuring test workflows, or building a fast iteration cycle with ElevenLabs audio. Trigger: "elevenlabs dev setup", "elevenlabs local development", "elevenlabs dev environment", "develop with elevenlabs", "test elevenlabs locally".
documenso-local-dev-loop
Set up local development environment and testing workflow for Documenso. Use when configuring dev environment, setting up test workflows, or establishing rapid iteration patterns with Documenso. Trigger with phrases like "documenso local dev", "documenso development", "test documenso locally", "documenso dev environment".
deepgram-local-dev-loop
Configure Deepgram local development workflow with testing and mocks. Use when setting up development environment, configuring test fixtures, or establishing rapid iteration patterns for Deepgram integration. Trigger: "deepgram local dev", "deepgram development setup", "deepgram test environment", "deepgram dev workflow", "deepgram mock".
databricks-local-dev-loop
Configure Databricks local development with Databricks Connect, Asset Bundles, and IDE. Use when setting up a local dev environment, configuring test workflows, or establishing a fast iteration cycle with Databricks. Trigger with phrases like "databricks dev setup", "databricks local", "databricks IDE", "develop with databricks", "databricks connect".
customerio-local-dev-loop
Configure Customer.io local development workflow. Use when setting up local testing, dev/staging isolation, or mocking Customer.io for unit tests. Trigger: "customer.io local dev", "test customer.io locally", "customer.io dev environment", "customer.io sandbox", "mock customer.io".
cursor-local-dev-loop
Optimize daily development workflow with Cursor IDE using Chat, Composer, Tab, and Git integration. Triggers on "cursor workflow", "cursor development loop", "cursor productivity", "cursor daily workflow", "cursor dev flow".
coreweave-webhooks-events
Monitor CoreWeave cluster events and GPU workload status. Use when tracking pod lifecycle events, monitoring GPU utilization, or alerting on inference service health changes. Trigger with phrases like "coreweave events", "coreweave monitoring", "coreweave pod alerts", "coreweave gpu monitoring".
coreweave-upgrade-migration
Upgrade CoreWeave deployments and migrate between GPU types. Use when migrating from A100 to H100, upgrading CUDA versions, or updating inference server versions. Trigger with phrases like "upgrade coreweave", "coreweave gpu migration", "coreweave cuda upgrade", "migrate coreweave".
coreweave-security-basics
Secure CoreWeave deployments with RBAC, network policies, and secrets management. Use when hardening GPU workloads, managing model access, or configuring namespace isolation. Trigger with phrases like "coreweave security", "coreweave rbac", "secure coreweave", "coreweave secrets".
coreweave-sdk-patterns
Production-ready patterns for CoreWeave GPU workload management with kubectl and Python. Use when building inference clients, managing GPU deployments programmatically, or creating reusable CoreWeave deployment templates. Trigger with phrases like "coreweave patterns", "coreweave client", "coreweave Python", "coreweave deployment template".