cursor2api

cursor2api proxy service management tool that converts Cursor IDE's free AI conversations into Anthropic Messages API / OpenAI Chat Completions API format. Supports Docker deployment, environment configuration, token refresh, and complete uninstallation. Use when user asks to: (1) Install or deploy cursor2api, (2) Configure cursor2api for OpenClaw/Claude Code/CC Switch, (3) Refresh or retrieve Cursor Session Token, (4) Uninstall cursor2api.

3,891 stars
Complexity: medium

About this skill

cursor2api is a proxy service management tool that bridges Cursor IDE's integrated AI models and external AI coding agents. It converts Cursor's proprietary AI API calls into the widely compatible Anthropic Messages API or OpenAI Chat Completions API formats, allowing agents such as OpenClaw or Claude Code to use Cursor's AI capabilities as if they were communicating directly with Anthropic or OpenAI. The skill covers the full service lifecycle: Docker-based deployment, environment configuration, refreshing or retrieving the required `WorkosCursorSessionToken`, and complete uninstallation.

By standardizing the API interface, cursor2api removes the need for custom agent integrations against Cursor's specific backend. Users typically employ this skill to expand the set of AI models available to their preferred coding agents. For developers and teams already using Cursor IDE, it offers a cost-effective way to bring Cursor's AI into automated workflows (code generation, refactoring, and debugging) within their existing agent-driven development environments, without incurring additional API costs from other providers.

Best use case

The primary use case for `cursor2api` is to enable AI coding agents like OpenClaw and Claude Code to access and utilize Cursor IDE's AI models through a standardized API. Developers and AI practitioners who wish to consolidate their AI toolchain or leverage Cursor's AI models within a broader agent-orchestrated development workflow, especially for cost efficiency or diverse model access, will find this skill particularly beneficial.

Expected outcome

The `cursor2api` proxy service will be installed and running, allowing external AI agents to communicate with Cursor IDE's AI via standard Anthropic/OpenAI API calls.

Practical example

Example input

Install the `cursor2api` proxy service using Docker and set up the necessary environment variables for OpenClaw with my Cursor session token.

Example output

cursor-api Docker container started successfully on port 3010. Please configure your OpenClaw environment by setting `ANTHROPIC_BASE_URL="http://localhost:3010/v1"` and then restart OpenClaw.

When to use this skill

  • When you need your AI agent (e.g., OpenClaw, Claude Code) to access Cursor IDE's AI capabilities.
  • When you want to convert Cursor's internal AI API to standard Anthropic Messages or OpenAI Chat Completions formats.
  • When deploying, configuring, or managing the `cursor2api` proxy service for your development environment.
  • When you need to refresh or retrieve your `WorkosCursorSessionToken` for the `cursor2api` service.

When not to use this skill

  • If you do not have a Cursor IDE account or an active AI subscription for its features.
  • If you exclusively use Cursor IDE's AI directly within the IDE and do not require external API access.
  • If your AI agent already has direct access to the specific Anthropic or OpenAI models you need.
  • If you are unwilling or unable to manage Docker containers or Node.js services.

Installation

Claude Code / Cursor / Codex

curl -o ~/.claude/skills/cursor2api/SKILL.md --create-dirs "https://raw.githubusercontent.com/openclaw/skills/main/skills/0xcjl/cursor2api/SKILL.md"

Manual Installation

  1. Download SKILL.md from GitHub
  2. Place it in .claude/skills/cursor2api/SKILL.md inside your project
  3. Restart your AI agent — it will auto-discover the skill
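The three steps above can be scripted. This is a sketch only: `SKILL_ROOT` is a stand-in for your project root, and the download line is commented out so the snippet also works offline (the URL matches the one-liner above).

```shell
# Manual installation sketch. SKILL_ROOT defaults to ./.claude; point it at
# your project's .claude directory for real use.
SKILL_ROOT="${SKILL_ROOT:-.claude}"

# Step 2: create the skill directory
mkdir -p "$SKILL_ROOT/skills/cursor2api"

# Step 1: download SKILL.md into place (uncomment when online)
# curl -fsSL -o "$SKILL_ROOT/skills/cursor2api/SKILL.md" \
#   "https://raw.githubusercontent.com/openclaw/skills/main/skills/0xcjl/cursor2api/SKILL.md"

# Step 3 is just restarting your agent; verify the directory exists first:
test -d "$SKILL_ROOT/skills/cursor2api" && echo "skill directory ready"
```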

How cursor2api Compares

| Feature | cursor2api | Standard Approach |
|---------|------------|-------------------|
| Platform Support | Not specified | Limited / Varies |
| Context Awareness | High | Baseline |
| Installation Complexity | Medium | N/A |

Frequently Asked Questions

What does this skill do?

This skill manages the `cursor2api` proxy service, which converts Cursor IDE's AI conversations into the Anthropic Messages API / OpenAI Chat Completions API formats so that external agents (OpenClaw, Claude Code, CC Switch) can use Cursor's models. It covers Docker deployment, environment configuration, session-token refresh, and complete uninstallation.

How difficult is it to install?

The installation complexity is rated as medium. You can find the installation instructions above.

Where can I find the source code?

You can find the source code on GitHub using the link provided at the top of the page.

SKILL.md Source

# cursor2api

`cursor2api` bridges Cursor IDE's AI models with OpenClaw/Claude Code by converting Cursor's internal API into standard Anthropic/OpenAI formats.

**Architecture:**
```
OpenClaw / Claude Code
         ↓ (ANTHROPIC_BASE_URL)
cursor2api Docker/Node (:3010)
         ↓ (Session Token)
Cursor Official API
```

## Prerequisites

- Docker (for containerized deployment) or Node.js 18+ (for local)
- A Cursor account with active AI subscription
- `WorkosCursorSessionToken` from Cursor
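The `WorkosCursorSessionToken` is typically copied out of a browser `Cookie` header; `references/token.md` covers the details. As an illustration only (the cookie name is real, the sample value below is made up), pulling it out of a raw header string can be sketched as:

```shell
# Hypothetical helper: extract WorkosCursorSessionToken from a Cookie header
# string copied from the browser's developer tools.
extract_cursor_token() {
  # $1: full Cookie header string; prints the token value, if present
  printf '%s' "$1" | tr ';' '\n' | sed -n 's/^[[:space:]]*WorkosCursorSessionToken=//p'
}

COOKIE='intercom-id=abc123; WorkosCursorSessionToken=user_01ABC%3A%3AeyJhbGci; theme=dark'
extract_cursor_token "$COOKIE"   # prints: user_01ABC%3A%3AeyJhbGci
```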

## Quick Start

```bash
# 1. Get your WorkosCursorSessionToken (see references/token.md)

# 2. Start the service
docker run -d \
  --name cursor-api \
  -p 3010:3000 \
  -e WORKOS_CURSOR_SESSION_TOKEN=your_token \
  waitkafuka/cursor-api:latest

# 3. Configure OpenClaw
export ANTHROPIC_BASE_URL="http://localhost:3010/v1"
export ANTHROPIC_API_KEY="your_token"
export ANTHROPIC_DEFAULT_SONNET_MODEL="claude-sonnet-4-6"

# 4. Restart OpenClaw
openclaw gateway restart
```

## Core Operations

| Operation | Command |
|-----------|---------|
| **Install** | `docker run -d --name cursor-api -p 3010:3000 -e WORKOS_CURSOR_SESSION_TOKEN=token waitkafuka/cursor-api:latest` |
| **Status** | `docker ps \| grep cursor-api` |
| **Refresh Token** | See `references/token.md` |
| **Uninstall** | `docker stop cursor-api && docker rm cursor-api` |

## API Endpoints

| Endpoint | Format | Compatible With |
|----------|--------|-----------------|
| `http://localhost:3010/v1/messages` | Anthropic Messages API | OpenClaw, Claude Code |
| `http://localhost:3010/v1/chat/completions` | OpenAI Chat Completions | CC Switch, Universal |
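For reference, the request shapes the two endpoints expect can be sketched as follows. The JSON bodies follow the upstream Anthropic and OpenAI formats; the auth header names (`x-api-key` vs. `Authorization: Bearer`) are assumptions carried over from those upstream APIs, and the `curl` calls are commented out because they need the Quick Start container running.

```shell
# Hypothetical request payloads for the two proxy endpoints; the model name
# comes from the Quick Start section above.
ANTHROPIC_BODY='{"model":"claude-sonnet-4-6","max_tokens":64,"messages":[{"role":"user","content":"ping"}]}'
OPENAI_BODY='{"model":"claude-sonnet-4-6","messages":[{"role":"user","content":"ping"}]}'

# Anthropic Messages format (OpenClaw, Claude Code):
# curl -s http://localhost:3010/v1/messages \
#   -H "x-api-key: $WORKOS_CURSOR_SESSION_TOKEN" \
#   -H "content-type: application/json" -d "$ANTHROPIC_BODY"

# OpenAI Chat Completions format (CC Switch, universal clients):
# curl -s http://localhost:3010/v1/chat/completions \
#   -H "Authorization: Bearer $WORKOS_CURSOR_SESSION_TOKEN" \
#   -H "content-type: application/json" -d "$OPENAI_BODY"
```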

## Documentation

| Document | Description |
|----------|-------------|
| [Installation Guide](references/installation.md) | Docker deployment, verification, troubleshooting |
| [Token Management](references/token.md) | Obtaining and refreshing WorkosCursorSessionToken |
| [Configuration](references/configuration.md) | OpenClaw, Claude Code, CC Switch setup |
| [Quick Reference](references/quick-reference.md) | One-page cheat sheet |

## ⚠️ Important Notes

- **ToS Risk**: Using third-party proxies may violate Cursor's Terms of Service
- **Token Expiry**: Session tokens expire periodically; monitor and refresh as needed
- **API Stability**: Cursor's internal API may change without notice
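Because tokens expire and the upstream API can change, a small health probe is worth keeping around. The sketch below only classifies an HTTP status code; the endpoint in the commented `curl` line is an assumption, so substitute whichever endpoint above you actually use.

```shell
# Minimal health-probe sketch for the proxy. check_proxy maps an HTTP status
# code to an actionable message.
check_proxy() {
  case "$1" in
    200) echo "ok" ;;
    401|403) echo "token expired or invalid: refresh WorkosCursorSessionToken" ;;
    *) echo "proxy unreachable or upstream API changed (status: $1)" ;;
  esac
}

# Wire it up once the container is running (endpoint is an assumption):
# status=$(curl -s -o /dev/null -w '%{http_code}' http://localhost:3010/v1/chat/completions)
# check_proxy "$status"

check_proxy 200   # prints: ok
```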

Related Skills

All from openclaw/skills, in the DevOps & Infrastructure category:

  • botlearn-healthcheck — BotLearn autonomous health inspector for OpenClaw instances across 5 domains (hardware, config, security, skills, autonomy); triggers on system check, health report, diagnostics, or scheduled heartbeat inspection.
  • Incident Postmortem Generator — Generate blameless incident postmortems from raw notes, Slack threads, or bullet points.
  • Post-Mortem & Incident Review Framework — Run structured post-mortems that actually prevent repeat failures. Blameless analysis, root cause identification, and action tracking.
  • afrexai-performance-engineering — Complete performance engineering system: profiling, optimization, load testing, capacity planning, and performance culture. Use when diagnosing slow applications, optimizing code/queries/infrastructure, load testing before launch, planning capacity, or building performance into CI/CD. Covers Node.js, Python, Go, Java, databases, APIs, and frontend.
  • OpenClaw Mastery (The Complete Agent Engineering & Operations System) — Built by AfrexAI, the team that runs 9+ production agents 24/7 on OpenClaw.
  • Legacy System Modernization Engine — Complete methodology for assessing, planning, and executing legacy system modernization, from monolith decomposition to cloud migration. Works for any tech stack, any scale.
  • Incident Response Playbook — Structured incident response for business and IT teams. Guides you through detection, triage, containment, resolution, and post-mortem, with auto-generated timelines and action items.
  • Git Engineering & Repository Strategy — Helps teams design branching strategies, implement code review workflows, manage monorepos, automate releases, and maintain healthy repository practices at scale.
  • Django Production Engineering — Complete methodology for building, scaling, and operating production Django applications. From project structure to deployment, security to performance: every decision framework a Django team needs.
  • IT Disaster Recovery Plan Generator — Build production-ready disaster recovery plans that actually get followed when things break.
  • afrexai-api-architect — Design, build, test, document, and secure production-grade APIs. Covers the full lifecycle from schema design through deployment, monitoring, and versioning. Use when designing new APIs, reviewing existing ones, generating OpenAPI specs, building test suites, or debugging production issues.
  • Agent Ops Runbook — Generate a production-ready operations runbook for deploying AI agents. Covers pre-deployment checklists, shadow mode → supervised → autonomous rollout stages, monitoring dashboards, rollback procedures, cost management, and incident response templates.