system-prompt
Edit or improve the AI system prompt used in DBX Studio's AI chat. Invoke when the user wants to change how the AI responds, its tone, tool usage order, or response format.
Best use case
system-prompt is best used when you need a repeatable AI agent workflow instead of a one-off prompt. It is especially useful for teams that repeatedly adjust how DBX Studio's AI chat responds — its tone, tool usage order, or response format.
Users should expect a more consistent workflow output, faster repeated execution, and less time spent rewriting prompts from scratch.
Practical example
Example input
Use the "system-prompt" skill to help with this workflow task. Context: Edit or improve the AI system prompt used in DBX Studio's AI chat. Invoke when the user wants to change how the AI responds, its tone, tool usage order, or response format.
Example output
A structured workflow result with clearer steps, more consistent formatting, and an output that is easier to reuse in the next run.
When to use this skill
- Use this skill when you want a reusable workflow rather than writing the same prompt again and again.
When not to use this skill
- Do not use this when you only need a one-off answer and do not need a reusable workflow.
- Do not use it if you cannot install or maintain the related files, repository context, or supporting tools.
Installation
Claude Code / Cursor / Codex
Manual Installation
- Download SKILL.md from GitHub
- Place it in `.claude/skills/system-prompt/SKILL.md` inside your project
- Restart your AI agent — it will auto-discover the skill
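The manual steps above can be sketched as shell commands; the download URL is not given on this page, so it is left as a placeholder — use the GitHub link provided at the top of the page:

```shell
# Create the skill directory inside your project root
mkdir -p .claude/skills/system-prompt

# Fetch SKILL.md into place (URL is a placeholder -- use the GitHub link
# shown at the top of this page):
#   curl -fsSL "<raw-github-url>/SKILL.md" -o .claude/skills/system-prompt/SKILL.md
```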
How system-prompt Compares
| Feature / Agent | system-prompt | Standard Approach |
|---|---|---|
| Platform Support | Not specified | Limited / Varies |
| Context Awareness | High | Baseline |
| Installation Complexity | Unknown | N/A |
Frequently Asked Questions
What does this skill do?
Edit or improve the AI system prompt used in DBX Studio's AI chat. Invoke when the user wants to change how the AI responds, its tone, tool usage order, or response format.
Where can I find the source code?
You can find the source code on GitHub using the link provided at the top of the page.
SKILL.md Source
# System Prompt Editor — DBX Studio
## Prompt Locations
There are **two** system prompts in this project:
### 1. Streaming Prompt (main, used in production)
**File**: [apps/api/src/routes/ai-stream.ts](../../../apps/api/src/routes/ai-stream.ts)
**Lines**: ~132–172 (with schema) and ~176–202 (without schema)
**Variable**: `contextPrompt` (built inline, not a constant)
### 2. oRPC Provider Prompt (used in `callAnthropicWithTools`, `callOpenAIWithTools`)
**File**: [apps/api/src/orpc/routers/ai/providersWithTools.ts](../../../apps/api/src/orpc/routers/ai/providersWithTools.ts)
**Variable**: `SYSTEM_PROMPT_WITH_TOOLS` (top of file)
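For orientation, a minimal sketch of the constant's shape — the actual wording lives at the top of `providersWithTools.ts`; the section headings here mirror the unified structure documented below:

```typescript
// Illustrative shape only -- the real text lives at the top of
// apps/api/src/orpc/routers/ai/providersWithTools.ts
const SYSTEM_PROMPT_WITH_TOOLS = [
  "You are DBX Studio's AI assistant — expert SQL analyst and data explorer.",
  "## Tools Available (ordered by when to use)",
  "## Response Rules",
  "## Chart Selection Guide",
  "## Query Safety",
].join("\n");
```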
## Current Prompt Structure (Streaming)
```
You are a SQL assistant...
## Tools Available ← list 5 tools
## Response Style ← 5 rules: be direct, show results, use tools, minimal explanation, SQL format
## Examples ← 2-3 concrete input/output examples
## Context ← dynamic schema from generateSQLPrompt()
Schema: "<schema>"
## User Query ← the actual user message
```
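The streaming prompt above is assembled inline in `ai-stream.ts` rather than through a helper; a minimal sketch of that assembly, using a hypothetical `buildContextPrompt` function (the names `enhancedPrompt`, `schema`, and the `'public'` fallback come from this document):

```typescript
// Hypothetical helper mirroring the inline contextPrompt assembly in
// apps/api/src/routes/ai-stream.ts; enhancedPrompt carries live schema context.
function buildContextPrompt(
  enhancedPrompt: string,
  schema: string | undefined,
  query: string,
): string {
  return [
    "You are a SQL assistant...",
    "## Context",
    enhancedPrompt,
    `Schema: "${schema || "public"}"`, // scopes queries; do not remove
    "## User Query",
    query,
  ].join("\n");
}
```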
## Prompt Design Rules for DBX Studio
1. **Results first** — answer the question before showing SQL
2. **Use tools always** — never guess schema or data
3. **Be concise** — this is a data tool, not a chatbot
4. **Show SQL only when asked** — use ```sql blocks with uppercase keywords
5. **Format numbers clearly** — "**1,247 orders**" not "1247"
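Rule 5 can be made concrete with a small formatter; `formatCount` is a hypothetical helper for illustration, not code from the repo:

```typescript
// Hypothetical helper for rule 5: bold, thousands-separated key values
function formatCount(n: number, noun: string): string {
  return `**${n.toLocaleString("en-US")} ${noun}**`;
}
```

Here `formatCount(1247, "orders")` yields the `**1,247 orders**` form the rule asks for.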
## When Editing the Prompt
- Keep the `## Tools Available` section in sync with actual tools in `tools.ts`
- Keep `## Examples` realistic to real user queries
- The `${enhancedPrompt}` injection must stay — it contains live schema context
- Do not remove `Schema: "${schema || 'public'}"` line — it scopes queries
- Both prompts (streaming + oRPC) should stay consistent in style
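The first rule above — keeping `## Tools Available` in sync with `tools.ts` — can be spot-checked mechanically. `missingTools` is a hypothetical helper; the tool-name list would come from the actual registrations in `tools.ts`:

```typescript
// Hypothetical sanity check: flag tools registered in tools.ts that the
// prompt's "## Tools Available" section never mentions.
function missingTools(prompt: string, toolNames: string[]): string[] {
  return toolNames.filter((name) => !prompt.includes(name));
}
```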
## Current Prompt Structure (as of last update)
Both prompts now follow this unified structure:
```
You are DBX Studio's AI assistant — expert SQL analyst and data explorer.
## Tools Available (ordered by when to use)
1. read_schema / get_table_schema — FIRST, when schema is unknown
2. execute_query / execute_sql_query — run SELECT/WITH queries
3. get_table_data / select_data — preview or filter rows
4. get_table_stats — distributions and row counts
5. generate_chart / generate_bar_graph — visualization
6. describe_table / get_enums — column details, enum values
## Response Rules
1. Results first — answer before explaining
2. Always use tools — never guess schema or data
3. Tool order matters (schema → query → chart)
4. Show SQL only when asked — use ```sql with UPPERCASE
5. Format numbers clearly — **bold** key values
6. No filler words
## Chart Selection Guide
[line / bar / pie / scatter / histogram guidance]
## Query Safety
[SELECT/WITH only, always LIMIT, quote identifiers]
## Context / Schema (streaming only)
{enhancedPrompt}
Schema: "{schema}"
## User Query
{query}
```
Related Skills
prompt-optimize
Expert prompt engineering skill that transforms Claude into "Alpha-Prompt" - a master prompt engineer who collaboratively crafts high-quality prompts through flexible dialogue. Activates when user asks to "optimize prompt", "improve system instruction", "enhance AI instruction", or mentions prompt engineering tasks.
design-system-patterns
Build scalable design systems with design tokens, theming infrastructure, and component architecture patterns. Use when creating design tokens, implementing theme switching, building component libraries, or establishing design system foundations.
system-environment-setup
Configure development and production environments for consistent and reproducible setups. Use when setting up new projects, Docker environments, or development tooling. Handles Docker Compose, .env configuration, dev containers, and infrastructure as code.
prompt-repetition
A prompt repetition technique for improving LLM accuracy. Achieves significant performance gains in 67% (47/70) of 70 benchmarks. Automatically applied on lightweight models (haiku, flash, mini).
tailwind-design-system
Build scalable design systems with Tailwind CSS, design tokens, component libraries, and responsive patterns. Use when creating component libraries, implementing design systems, or standardizing UI patterns.
systems-programming-rust-project
You are a Rust project architecture expert specializing in scaffolding production-ready Rust applications. Generate complete project structures with cargo tooling, proper module organization, and testing.
radix-ui-design-system
Build accessible design systems with Radix UI primitives. Headless component customization, theming strategies, and compound component patterns for production-grade UI libraries.
prompt-library
Curated collection of high-quality prompts for various use cases. Includes role-based prompts, task-specific templates, and prompt refinement techniques. Use when user needs prompt templates, role-play prompts, or ready-to-use prompt examples for coding, writing, analysis, or creative tasks.
prompt-engineering-patterns
Master advanced prompt engineering techniques to maximize LLM performance, reliability, and controllability in production. Use when optimizing prompts, improving LLM outputs, or designing production prompt templates.
prompt-engineer
Transforms user prompts into optimized prompts using frameworks (RTF, RISEN, Chain of Thought, RODES, Chain of Density, RACE, RISE, STAR, SOAP, CLEAR, GROW)
prompt-caching
Caching strategies for LLM prompts including Anthropic prompt caching, response caching, and CAG (Cache Augmented Generation) Use when: prompt caching, cache prompt, response cache, cag, cache augmented.
llm-application-dev-prompt-optimize
You are an expert prompt engineer specializing in crafting effective prompts for LLMs through advanced techniques including constitutional AI, chain-of-thought reasoning, and model-specific optimization.