cursor-model-selection

Configure and select AI models in Cursor for Chat, Composer, and Agent mode. Triggers on "cursor model", "cursor gpt", "cursor claude", "change cursor model", "cursor ai model", "cursor auto mode".

25 stars

Best use case

cursor-model-selection is best used when you need a repeatable AI agent workflow instead of a one-off prompt.


Teams using cursor-model-selection should expect more consistent output, faster repeated execution, and less prompt rewriting.

When to use this skill

  • You want a reusable workflow that can be run more than once with consistent structure.

When not to use this skill

  • You only need a quick one-off answer and do not need a reusable workflow.
  • You cannot install or maintain the underlying files, dependencies, or repository context.

Installation

Claude Code / Cursor / Codex

curl -o ~/.claude/skills/cursor-model-selection/SKILL.md --create-dirs "https://raw.githubusercontent.com/ComeOnOliver/skillshub/main/skills/jeremylongshore/claude-code-plugins-plus-skills/cursor-model-selection/SKILL.md"

Manual Installation

  1. Download SKILL.md from GitHub
  2. Place it in .claude/skills/cursor-model-selection/SKILL.md inside your project
  3. Restart your AI agent — it will auto-discover the skill

How cursor-model-selection Compares

Feature / Agent           cursor-model-selection   Standard Approach
Platform Support          Not specified            Limited / Varies
Context Awareness         High                     Baseline
Installation Complexity   Unknown                  N/A

Frequently Asked Questions

What does this skill do?

Configure and select AI models in Cursor for Chat, Composer, and Agent mode. Triggers on "cursor model", "cursor gpt", "cursor claude", "change cursor model", "cursor ai model", "cursor auto mode".

Where can I find the source code?

You can find the source code on GitHub using the link provided at the top of the page.

SKILL.md Source

# Cursor Model Selection

Configure AI models for Chat, Composer, and Agent mode. Cursor supports models from OpenAI, Anthropic, Google, and its own proprietary models. Choosing the right model per task is a major productivity lever.

## Available Models

### Included with Cursor Subscription

| Model | Provider | Best For | Context |
|-------|----------|----------|---------|
| **GPT-4o** | OpenAI | General coding, fast responses | 128K |
| **GPT-4o-mini** | OpenAI | Simple tasks, cost-efficient | 128K |
| **Claude Sonnet** | Anthropic | Code quality, detailed explanations | 200K |
| **Claude Haiku** | Anthropic | Fast simple tasks | 200K |
| **cursor-small** | Cursor | Quick completions, simple edits | 8K |
| **Auto** | Cursor | Automatic model selection per query | Varies |

### Premium Models (count against fast request quota)

| Model | Provider | Best For | Context |
|-------|----------|----------|---------|
| **Claude Opus** | Anthropic | Complex architecture, hard bugs | 200K |
| **GPT-5** | OpenAI | Advanced reasoning, complex code | 128K+ |
| **o1 / o3** | OpenAI | Deep reasoning, mathematical logic | 128K |
| **Gemini 2.5 Pro** | Google | Design, large context analysis | 1M |

## Model Selection by Task

### Quick Reference

```
Bug fix in one file        → GPT-4o or Claude Sonnet
Multi-file refactoring     → Claude Sonnet or Opus
Architecture planning      → Claude Opus or GPT-5
Test generation            → GPT-4o (fast + good patterns)
Complex algorithm design   → o1/o3 reasoning models
Large codebase analysis    → Gemini 2.5 Pro (1M context)
Simple autocomplete        → cursor-small (automatic via Tab)
"I don't know"             → Auto mode
```

### How to Switch Models

**Per conversation:** Click the model name in the top-right of Chat or Composer panel.

**Default model:** `Cursor Settings` > `Models` > set default for Chat and Composer separately.

**Auto mode:** Select "Auto" as the model. Cursor picks the best model per query based on complexity and current server load.

## Bring Your Own Key (BYOK)

Use your own API keys to bypass Cursor's quota system. You pay the provider directly at their rates.

### Configuration

`Cursor Settings` > `Models` > enable `Use own API key`:

**OpenAI:**
```
API Key: sk-proj-xxxxxxxxxxxxxxxxxxxx
```

**Anthropic:**
```
API Key: sk-ant-xxxxxxxxxxxxxxxxxxxx
```

**Google (Gemini):**
```
API Key: AIzaSyxxxxxxxxxxxxxxxxxxxxxxxxx
```

### Azure OpenAI

For enterprise Azure deployments:

```
Cursor Settings > Models > Azure:
  API Key:       xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
  Endpoint:      https://my-instance.openai.azure.com
  Deployment:    gpt-4o-deployment-name
  API Version:   2024-10-21
```
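To sanity-check an Azure configuration, it helps to know how those four settings combine into a request URL. A sketch following Azure OpenAI's documented URL scheme, using the placeholder values from the block above:

```python
# Build the chat-completions URL for an Azure OpenAI deployment from the
# same settings Cursor asks for: endpoint, deployment name, API version.
# (The API key travels in a header, not the URL.)
def azure_chat_url(endpoint: str, deployment: str, api_version: str) -> str:
    """Combine Azure OpenAI settings into the chat-completions request URL."""
    return (f"{endpoint.rstrip('/')}/openai/deployments/{deployment}"
            f"/chat/completions?api-version={api_version}")

url = azure_chat_url("https://my-instance.openai.azure.com",
                     "gpt-4o-deployment-name", "2024-10-21")
```

If this URL returns a 404 when called directly (with your key in the `api-key` header), the deployment name or API version is wrong -- useful to verify before blaming Cursor's settings.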

### Adding Custom Models

For OpenAI-compatible providers (Ollama, LM Studio, Together AI):

1. `Cursor Settings` > `Models` > `Add Model`
2. Enter model name (e.g., `llama-3.1-70b`)
3. Enable `Override OpenAI Base URL`
4. Enter base URL: `http://localhost:11434/v1` (Ollama) or provider URL
5. Enter API key if required
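Behind the base-URL override, Cursor sends standard OpenAI-style chat requests. A sketch of the minimal request body, handy for verifying a local provider (e.g. Ollama at `http://localhost:11434/v1`) outside Cursor before wiring it in -- the model name is the example from step 2:

```python
# Build a minimal OpenAI-compatible /chat/completions request body, the
# same shape Cursor sends to an overridden base URL. Useful for testing
# a local provider with curl or a script before configuring Cursor.
def chat_payload(model: str, prompt: str) -> dict:
    """Return a minimal OpenAI-compatible chat-completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = chat_payload("llama-3.1-70b", "Explain list comprehensions.")
```

POSTing this JSON to `<base-url>/chat/completions` should return a completion; if it errors, fix the provider setup first.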

### BYOK Limitations

| Feature | Uses BYOK Key? | Uses Cursor Model? |
|---------|---------------|-------------------|
| Chat | Yes | -- |
| Composer | Yes | -- |
| Agent mode | Yes | -- |
| **Tab Completion** | **No** | **Always Cursor model** |
| **Apply from Chat** | **No** | **Always Cursor model** |

Tab Completion always uses Cursor's proprietary model regardless of BYOK configuration.

## Cost Optimization Strategies

### Tiered Model Usage

```
Tier 1 (Fast + Cheap):    cursor-small, GPT-4o-mini, Claude Haiku
  Use for: simple questions, syntax help, boilerplate

Tier 2 (Balanced):        GPT-4o, Claude Sonnet
  Use for: most coding tasks, debugging, refactoring

Tier 3 (Premium):         Claude Opus, GPT-5, o1/o3
  Use for: architecture decisions, critical bugs, complex logic
```

### Quota Management

Cursor subscription includes a monthly quota of "fast requests" (premium model uses). When exceeded, requests queue behind other users ("slow requests").

- Check remaining quota: `cursor.com/settings` > Usage
- Pro plan: ~500 fast requests/month
- Business plan: ~500 fast requests/month per seat

### Tips to Reduce Usage

1. Use Auto mode -- it picks cheaper models when they suffice
2. Start with Sonnet/GPT-4o, escalate to Opus/o1 only if needed
3. Write detailed prompts to avoid back-and-forth (fewer requests)
4. Use BYOK for heavy usage -- pay per token instead of per request
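Tip 4 is worth quantifying. A back-of-envelope sketch -- the request volume, token count, and per-million-token price below are illustrative assumptions for the arithmetic, not current Cursor or provider pricing:

```python
# Back-of-envelope BYOK cost estimate: total tokens times the provider's
# per-million-token price. All numbers in the example are ILLUSTRATIVE
# assumptions, not real pricing.
def monthly_byok_cost(requests: int, avg_tokens: int, price_per_mtok: float) -> float:
    """Estimated monthly BYOK spend in dollars."""
    return requests * avg_tokens * price_per_mtok / 1_000_000

# e.g. 2000 requests/month at ~3000 tokens each, $5 per million tokens:
cost = monthly_byok_cost(2000, 3000, 5.0)  # 2000 * 3000 * 5 / 1e6 = 30.0
```

Run the same arithmetic with your actual usage and your provider's published rates to decide whether BYOK beats the fast-request quota for you.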

## Model Behavior Differences

### Code Generation Style

```python
# Claude models: Verbose, well-documented, defensive
def process_order(order: Order) -> Result[ProcessedOrder, OrderError]:
    """Process an order through the payment and fulfillment pipeline.

    Args:
        order: The order to process.

    Returns:
        Result containing the processed order or an error.

    Raises:
        Never raises -- errors returned as Result.Err.
    """
    if not order.items:
        return Err(OrderError.EMPTY_ORDER)
    ...

# GPT models: Concise, pragmatic, fewer comments
def process_order(order: Order) -> ProcessedOrder:
    if not order.items:
        raise ValueError("Order has no items")
    ...
```

### Reasoning Models (o1, o3)

These models "think" before responding. They are slower but significantly better at:
- Multi-step logic problems
- Finding subtle bugs in complex code
- Mathematical or algorithmic optimization
- Understanding implicit requirements

They are overkill for simple tasks. Use them deliberately for hard problems.

## Enterprise Considerations

- **Model access control**: Admins can restrict which models team members access via the admin dashboard
- **Spending limits**: Set per-user or per-team spending caps when using BYOK
- **Compliance**: Some models route through different providers -- verify data handling per model
- **Azure preference**: Enterprise teams on Azure can route all requests through their own Azure OpenAI deployments
- **Audit**: Model selection per request is visible in usage analytics (Business/Enterprise plans)

## Resources

- [Cursor Models Documentation](https://docs.cursor.com/settings/models)
- [API Key Configuration](https://docs.cursor.com/advanced/api-keys)
- [Pricing and Plans](https://cursor.com/pricing)

Related Skills

adapting-transfer-learning-models

from ComeOnOliver/skillshub

This skill automates the adaptation of pre-trained machine learning models using transfer learning techniques. It is triggered when the user requests assistance with fine-tuning a model, adapting a pre-trained model to a new dataset, or performing transfer learning. It analyzes the user's requirements, generates code for adapting the model, includes data validation and error handling, provides performance metrics, and saves artifacts with documentation. Use this skill when you need to leverage existing models for new tasks or datasets, optimizing for performance and efficiency.

training-machine-learning-models

from ComeOnOliver/skillshub

Train machine learning models with automated workflows. Analyzes datasets, selects model types (classification, regression), configures parameters, trains with cross-validation, and saves model artifacts. Use when asked to "train model" or "evalua...

tracking-model-versions

from ComeOnOliver/skillshub

This skill enables the AI assistant to track and manage AI/ML model versions using the model-versioning-tracker plugin. Use it when the user asks to manage model versions, track model lineage, log model performance, or implement version control f...

threat-model-creator

from ComeOnOliver/skillshub

Threat Model Creator - Auto-activating skill for Security Advanced. Triggers on: "threat model creator". Part of the Security Advanced skill category.

tensorflow-savedmodel-creator

from ComeOnOliver/skillshub

Tensorflow Savedmodel Creator - Auto-activating skill for ML Deployment. Triggers on: "tensorflow savedmodel creator". Part of the ML Deployment skill category.

tensorflow-model-trainer

from ComeOnOliver/skillshub

Tensorflow Model Trainer - Auto-activating skill for ML Training. Triggers on: "tensorflow model trainer". Part of the ML Training skill category.

sequelize-model-creator

from ComeOnOliver/skillshub

Sequelize Model Creator - Auto-activating skill for Backend Development. Triggers on: "sequelize model creator". Part of the Backend Development skill category.

pytorch-model-trainer

from ComeOnOliver/skillshub

Pytorch Model Trainer - Auto-activating skill for ML Training. Triggers on: "pytorch model trainer". Part of the ML Training skill category.

modeling-nosql-data

from ComeOnOliver/skillshub

This skill enables Claude to design NoSQL data models. It activates when the user requests assistance with NoSQL database design, including schema creation, data modeling for MongoDB or DynamoDB, or defining document structures. Use this skill when the user mentions "NoSQL data model", "design MongoDB schema", "create DynamoDB table", or similar phrases related to NoSQL database architecture. It assists in understanding NoSQL modeling principles like embedding vs. referencing, access pattern optimization, and sharding key selection.

model-versioning-manager

from ComeOnOliver/skillshub

Model Versioning Manager - Auto-activating skill for ML Deployment. Triggers on: "model versioning manager". Part of the ML Deployment skill category.

model-registry-manager

from ComeOnOliver/skillshub

Model Registry Manager - Auto-activating skill for ML Deployment. Triggers on: "model registry manager". Part of the ML Deployment skill category.

model-quantization-tool

from ComeOnOliver/skillshub

Model Quantization Tool - Auto-activating skill for ML Deployment. Triggers on: "model quantization tool". Part of the ML Deployment skill category.