cursor2api
cursor2api proxy service management tool that converts Cursor IDE's free AI conversations into Anthropic Messages API / OpenAI Chat Completions API format. Supports Docker deployment, environment configuration, token refresh, and complete uninstallation. Use when user asks to: (1) Install or deploy cursor2api, (2) Configure cursor2api for OpenClaw/Claude Code/CC Switch, (3) Refresh or retrieve Cursor Session Token, (4) Uninstall cursor2api.
About this skill
cursor2api is a proxy service management tool that bridges Cursor IDE's integrated AI models with external AI coding agents. It converts Cursor's proprietary AI API calls into the widely compatible Anthropic Messages API or OpenAI Chat Completions API formats, so agents such as OpenClaw or Claude Code can use Cursor's AI capabilities as if they were talking directly to Anthropic or OpenAI. The skill covers Docker-based deployment, environment configuration, obtaining or refreshing the required `WorkosCursorSessionToken`, and complete uninstallation of the service. By standardizing the API interface, cursor2api removes the need for agents to carry custom integrations against Cursor's backend. Users typically employ this skill to expand the set of AI models available to their preferred coding agents. For developers and teams already using Cursor IDE, it offers a cost-effective way to bring Cursor's AI into automated workflows for code generation, refactoring, and debugging within their existing agent-driven environments, without incurring additional API costs from other providers.
Best use case
The primary use case for `cursor2api` is to enable AI coding agents like OpenClaw and Claude Code to access and utilize Cursor IDE's AI models through a standardized API. Developers and AI practitioners who wish to consolidate their AI toolchain or leverage Cursor's AI models within a broader agent-orchestrated development workflow, especially for cost efficiency or diverse model access, will find this skill particularly beneficial.
Expected outcome
The `cursor2api` proxy service will be installed and running, allowing external AI agents to communicate with Cursor IDE's AI via standard Anthropic/OpenAI API calls.
Practical example
Example input
Install the `cursor2api` proxy service using Docker and set up the necessary environment variables for OpenClaw with my Cursor session token.
Example output
cursor-api Docker container started successfully on port 3010. Please configure your OpenClaw environment by setting `ANTHROPIC_BASE_URL="http://localhost:3010/v1"` and then restart OpenClaw.
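The configuration step from the example output can be run directly in a shell. This is a sketch: the base URL follows the default `3010` port mapping shown above, and the token value is a placeholder for your own `WorkosCursorSessionToken`.

```shell
# Point OpenClaw's Anthropic-compatible client at the local proxy.
# The API key value is a placeholder -- substitute your WorkosCursorSessionToken.
export ANTHROPIC_BASE_URL="http://localhost:3010/v1"
export ANTHROPIC_API_KEY="your_WorkosCursorSessionToken"
echo "OpenClaw will now talk to $ANTHROPIC_BASE_URL"
```

After exporting these variables, restart OpenClaw so the new environment is picked up.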
When to use this skill
- When you need your AI agent (e.g., OpenClaw, Claude Code) to access Cursor IDE's AI capabilities.
- When you want to convert Cursor's internal AI API to standard Anthropic Messages or OpenAI Chat Completions formats.
- When deploying, configuring, or managing the `cursor2api` proxy service for your development environment.
- When you need to refresh or retrieve your `WorkosCursorSessionToken` for the `cursor2api` service.
When not to use this skill
- If you do not have a Cursor IDE account or an active AI subscription for its features.
- If you exclusively use Cursor IDE's AI directly within the IDE and do not require external API access.
- If your AI agent already has direct access to the specific Anthropic or OpenAI models you need.
- If you are unwilling or unable to manage Docker containers or Node.js services.
Installation
Claude Code / Cursor / Codex
Manual Installation
- Download SKILL.md from GitHub
- Place it at `.claude/skills/cursor2api/SKILL.md` inside your project
- Restart your AI agent; it will auto-discover the skill
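The manual steps above can be sketched as shell commands. The download URL is a placeholder, not the real one; substitute the skill's actual raw GitHub path.

```shell
# Create the skill directory inside the project and fetch SKILL.md into it.
SKILL_DIR=".claude/skills/cursor2api"
mkdir -p "$SKILL_DIR"
# Placeholder URL -- replace with the real raw GitHub path to SKILL.md:
# curl -fsSL "https://raw.githubusercontent.com/<owner>/<repo>/main/SKILL.md" \
#   -o "$SKILL_DIR/SKILL.md"
echo "Skill directory ready: $SKILL_DIR"
```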
Frequently Asked Questions
What does this skill do?
It runs a local proxy that converts Cursor IDE's internal AI API into the standard Anthropic Messages and OpenAI Chat Completions formats, and it manages the service's lifecycle: Docker deployment, environment configuration, session-token refresh, and uninstallation.
How difficult is it to install?
Installation complexity is medium: you need Docker (or Node.js 18+) and a valid Cursor session token. See the installation instructions above.
SKILL.md Source
# cursor2api
`cursor2api` bridges Cursor IDE's AI models with OpenClaw/Claude Code by converting Cursor's internal API into standard Anthropic/OpenAI formats.
**Architecture:**
```
OpenClaw / Claude Code
↓ (ANTHROPIC_BASE_URL)
cursor2api Docker/Node (:3010)
↓ (Session Token)
Cursor Official API
```
## Prerequisites
- Docker (for containerized deployment) or Node.js 18+ (for local)
- A Cursor account with active AI subscription
- `WorkosCursorSessionToken` from Cursor
## Quick Start
```bash
# 1. Get your WorkosCursorSessionToken (see references/token.md)
# 2. Start the service
docker run -d \
--name cursor-api \
-p 3010:3000 \
-e WORKOS_CURSOR_SESSION_TOKEN=your_token \
waitkafuka/cursor-api:latest
# 3. Configure OpenClaw
export ANTHROPIC_BASE_URL="http://localhost:3010/v1"
export ANTHROPIC_API_KEY="your_token"
export ANTHROPIC_DEFAULT_SONNET_MODEL="claude-sonnet-4-6"
# 4. Restart OpenClaw
openclaw gateway restart
```
## Core Operations
| Operation | Command |
|-----------|---------|
| **Install** | `docker run -d --name cursor-api -p 3010:3000 -e WORKOS_CURSOR_SESSION_TOKEN=token waitkafuka/cursor-api:latest` |
| **Status** | `docker ps \| grep cursor-api` |
| **Refresh Token** | See `references/token.md` |
| **Uninstall** | `docker stop cursor-api && docker rm cursor-api` |
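The table rows can be wrapped in a small dispatcher function. This is a sketch: the docker invocations mirror the table verbatim, while the `CURSOR_TOKEN` variable name is an assumption of this example, not part of the service.

```shell
# Minimal lifecycle helper around the operations above.
# CURSOR_TOKEN is an assumed env var holding your WorkosCursorSessionToken.
cursor2api() {
  case "$1" in
    install)
      docker run -d --name cursor-api -p 3010:3000 \
        -e WORKOS_CURSOR_SESSION_TOKEN="$CURSOR_TOKEN" \
        waitkafuka/cursor-api:latest ;;
    status)
      docker ps | grep cursor-api ;;
    uninstall)
      docker stop cursor-api && docker rm cursor-api ;;
    *)
      echo "usage: cursor2api {install|status|uninstall}"
      return 1 ;;
  esac
}
```

Usage: `CURSOR_TOKEN=your_token cursor2api install`, then `cursor2api status` to verify.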
## API Endpoints
| Endpoint | Format | Compatible With |
|----------|--------|-----------------|
| `http://localhost:3010/v1/messages` | Anthropic Messages API | OpenClaw, Claude Code |
| `http://localhost:3010/v1/chat/completions` | OpenAI Chat Completions | CC Switch, Universal |
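Once the container is running, either endpoint can be exercised with `curl`. The sketch below only builds and validates the request bodies; the commented-out sends require the live proxy and a real token, and the header names follow upstream Anthropic/OpenAI conventions, which the proxy may or may not enforce.

```shell
TOKEN="your_WorkosCursorSessionToken"  # placeholder
# Anthropic Messages body (for /v1/messages), model name from the Quick Start:
ANTHROPIC_BODY='{"model":"claude-sonnet-4-6","max_tokens":256,"messages":[{"role":"user","content":"hello"}]}'
# OpenAI Chat Completions body (for /v1/chat/completions):
OPENAI_BODY='{"model":"claude-sonnet-4-6","messages":[{"role":"user","content":"hello"}]}'
# To send (requires the running proxy on :3010):
# curl -s http://localhost:3010/v1/messages \
#   -H "x-api-key: $TOKEN" -H "content-type: application/json" -d "$ANTHROPIC_BODY"
# curl -s http://localhost:3010/v1/chat/completions \
#   -H "Authorization: Bearer $TOKEN" -H "content-type: application/json" -d "$OPENAI_BODY"
printf '%s\n' "$ANTHROPIC_BODY" | python3 -m json.tool > /dev/null
printf '%s\n' "$OPENAI_BODY" | python3 -m json.tool > /dev/null
echo "request bodies are valid JSON"
```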
## Documentation
| Document | Description |
|----------|-------------|
| [Installation Guide](references/installation.md) | Docker deployment, verification, troubleshooting |
| [Token Management](references/token.md) | Obtaining and refreshing WorkosCursorSessionToken |
| [Configuration](references/configuration.md) | OpenClaw, Claude Code, CC Switch setup |
| [Quick Reference](references/quick-reference.md) | One-page cheat sheet |
## ⚠️ Important Notes
- **ToS Risk**: Using third-party proxies may violate Cursor's Terms of Service
- **Token Expiry**: Session tokens expire periodically; monitor and refresh as needed
- **API Stability**: Cursor's internal API may change without notice