clickhouse-common-errors
Diagnose and fix the top 15 ClickHouse errors — query failures, insert problems, memory limits, and merge issues. Use when encountering ClickHouse exceptions, debugging failed queries, or troubleshooting server-side errors. Trigger: "clickhouse error", "fix clickhouse", "clickhouse not working", "debug clickhouse", "clickhouse exception", "clickhouse syntax error".
Best use case
clickhouse-common-errors is best used when you need a repeatable AI agent workflow rather than a one-off prompt.
Teams using clickhouse-common-errors can expect more consistent output, faster repeated execution, and less prompt rewriting.
When to use this skill
- You want a reusable workflow that can be run more than once with consistent structure.
When not to use this skill
- You only need a quick one-off answer and do not need a reusable workflow.
- You cannot install or maintain the underlying files, dependencies, or repository context.
Installation
Claude Code / Cursor / Codex
Manual Installation
- Download SKILL.md from GitHub
- Place it at `.claude/skills/clickhouse-common-errors/SKILL.md` inside your project
- Restart your AI agent — it will auto-discover the skill
How clickhouse-common-errors Compares
| Feature / Agent | clickhouse-common-errors | Standard Approach |
|---|---|---|
| Platform Support | Not specified | Limited / Varies |
| Context Awareness | High | Baseline |
| Installation Complexity | Unknown | N/A |
Frequently Asked Questions
What does this skill do?
It provides a quick reference for the most common ClickHouse errors (query failures, insert problems, memory limits, and merge issues), with real error codes, diagnostic queries, and proven fixes.
Where can I find the source code?
You can find the source code on GitHub using the link provided at the top of the page.
SKILL.md Source
# ClickHouse Common Errors
## Overview
Quick reference for the most common ClickHouse errors with real error codes,
diagnostic queries, and proven solutions.
## Prerequisites
- Access to ClickHouse (client or HTTP interface)
- Ability to query `system.*` tables
## Error Reference
### 1. Too Many Parts (Code 252)
```
DB::Exception: Too many parts (600). Merges are processing significantly slower than inserts.
```
**Cause:** Each INSERT creates a new data part. Hundreds of tiny inserts per second
overwhelm the merge process.
**Fix:**
```sql
-- Check current part count per table
SELECT database, table, count() AS part_count
FROM system.parts WHERE active GROUP BY database, table ORDER BY part_count DESC;
-- Temporary: raise the limit
ALTER TABLE events MODIFY SETTING parts_to_throw_insert = 1000;
-- Permanent: batch your inserts (10K+ rows per INSERT)
-- See clickhouse-sdk-patterns for batching code
```
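The batching fix can be sketched in client code. A minimal sketch, assuming a `flush` callback that wraps your client's insert call (e.g. the Node.js client's `client.insert`); the 10K default mirrors the guidance above:

```javascript
// Buffer rows in memory and flush them as one INSERT once the batch is
// large enough, instead of issuing one INSERT (and creating one part) per row.
function createBatcher(flush, batchSize = 10000) {
  let buffer = [];
  return {
    add(row) {
      buffer.push(row);
      if (buffer.length >= batchSize) {
        const batch = buffer;
        buffer = [];
        flush(batch); // one INSERT for the whole batch -> one new part
      }
    },
    // Call on shutdown or on a timer so trailing rows are not lost.
    drain() {
      if (buffer.length > 0) {
        const batch = buffer;
        buffer = [];
        flush(batch);
      }
    },
  };
}
```

In production you would also flush on a time interval (e.g. every few seconds) so low-traffic periods do not hold rows indefinitely.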
### 2. Memory Limit Exceeded (Code 241)
```
DB::Exception: Memory limit (for query) exceeded: ... (MEMORY_LIMIT_EXCEEDED)
```
**Cause:** Query allocates more RAM than `max_memory_usage` (default ~10GB).
**Fix:**
```sql
-- Check what's consuming memory
SELECT query, memory_usage, peak_memory_usage
FROM system.processes ORDER BY peak_memory_usage DESC;
-- Option A: Increase limit for this query
SET max_memory_usage = 20000000000; -- 20GB
-- Option B: Reduce data scanned
SELECT ... FROM events
WHERE created_at >= today() - 7 -- Add time filters
LIMIT 10000; -- Cap result size
-- Option C: Enable disk spill for large sorts/GROUP BY
SET max_bytes_before_external_sort = 10000000000;
SET max_bytes_before_external_group_by = 10000000000;
```
### 3. Syntax Error (Code 62)
```
DB::Exception: Syntax error: ... Expected ... before ... (SYNTAX_ERROR)
```
**Common gotchas:** several "errors" people expect from other databases are actually valid ClickHouse syntax:
```sql
-- Identifier quoting: backticks, double quotes, and bare names are all accepted
SELECT `user_id` FROM events;
SELECT "user_id" FROM events;
SELECT user_id FROM events;
-- MySQL-style LIMIT offset, count works, but is easy to misread
SELECT * FROM events LIMIT 10, 20;   -- skips 10 rows, returns 20
-- Equivalent, and clearer, with an explicit OFFSET
SELECT * FROM events LIMIT 20 OFFSET 10;
-- Both != and <> are valid inequality operators
WHERE status != 'active';
WHERE status <> 'active';
```
### 4. Unknown Table (Code 60)
```
DB::Exception: Table default.events does not exist. (UNKNOWN_TABLE)
```
**Fix:**
```sql
-- List all tables in the database
SHOW TABLES FROM default;
-- Check all databases
SHOW DATABASES;
-- The table might be in a different database
SELECT database, name FROM system.tables WHERE name LIKE '%events%';
```
### 5. Timeout Exceeded (Code 159)
```
DB::Exception: Timeout exceeded: elapsed ... seconds, max ... (TIMEOUT_EXCEEDED)
```
**Fix:**
```sql
-- Increase timeout for this query
SET max_execution_time = 120; -- seconds
-- Find slow queries in history
SELECT
query,
query_duration_ms,
read_rows,
result_rows,
memory_usage
FROM system.query_log
WHERE type = 'QueryFinish'
ORDER BY query_duration_ms DESC
LIMIT 10;
```
### 6. Cannot Parse DateTime
```
DB::Exception: Cannot parse datetime ... (CANNOT_PARSE_DATETIME)
```
**Fix:**
```sql
-- ClickHouse expects: YYYY-MM-DD HH:MM:SS
-- Wrong: ISO 8601 with T and Z
INSERT INTO events (created_at) VALUES ('2025-01-15T10:30:00Z');
-- Fix: strip T and Z
INSERT INTO events (created_at) VALUES ('2025-01-15 10:30:00');
-- Or parse it explicitly
SELECT parseDateTimeBestEffort('2025-01-15T10:30:00Z');
```
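The same normalization can be done client-side before a row ever reaches ClickHouse. A minimal JavaScript sketch (the function name is ours) that rewrites ISO 8601 timestamps into the `YYYY-MM-DD HH:MM:SS` form shown above:

```javascript
// Convert an ISO 8601 timestamp (with T separator and Z or an offset)
// into the 'YYYY-MM-DD HH:MM:SS' UTC form ClickHouse's DateTime parser expects.
function toClickHouseDateTime(iso) {
  const d = new Date(iso);
  if (Number.isNaN(d.getTime())) throw new Error(`unparseable timestamp: ${iso}`);
  const pad = n => String(n).padStart(2, '0');
  return (
    `${d.getUTCFullYear()}-${pad(d.getUTCMonth() + 1)}-${pad(d.getUTCDate())} ` +
    `${pad(d.getUTCHours())}:${pad(d.getUTCMinutes())}:${pad(d.getUTCSeconds())}`
  );
}

// toClickHouseDateTime('2025-01-15T10:30:00Z') → '2025-01-15 10:30:00'
```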
### 7. Readonly Mode (Code 164)
```
DB::Exception: ... is in readonly mode (READONLY)
```
**Cause:** User lacks write permissions or server is in readonly mode.
**Fix:**
```sql
-- Check user permissions
SHOW GRANTS FOR CURRENT_USER;
-- Check server setting
SELECT name, value FROM system.settings WHERE name = 'readonly';
```
### 8. No Such Column (Code 16)
```
DB::Exception: Missing columns: 'user_name' ... (NO_SUCH_COLUMN_IN_TABLE)
```
**Fix:**
```sql
-- Inspect actual column names
DESCRIBE TABLE events;
-- Check column types too
SELECT name, type, default_kind, default_expression
FROM system.columns WHERE database = 'default' AND table = 'events';
```
### 9. Type Mismatch on Insert
```
DB::Exception: Cannot convert ... to UInt64 (TYPE_MISMATCH)
```
**Fix:**
```sql
-- Check expected types
DESCRIBE TABLE events;
-- Cast in your INSERT if needed
INSERT INTO events (user_id) VALUES (toUInt64('12345'));
-- In Node.js, ensure numeric types:
await client.insert({
table: 'events',
values: [{ user_id: 42 }], // number, not "42"
format: 'JSONEachRow',
});
```
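The cast can also be generalized client-side: coerce values to the column's numeric type before handing rows to the client. A sketch with a hypothetical schema map (in practice you can build it from `DESCRIBE TABLE` or `system.columns`):

```javascript
// Hypothetical column -> ClickHouse type map for the events table.
const schema = { user_id: 'UInt64', score: 'Float64', name: 'String' };

// Coerce string values in numeric columns to real numbers so the insert
// does not fail with TYPE_MISMATCH; non-numeric columns pass through.
function coerceRow(row) {
  const out = {};
  for (const [col, value] of Object.entries(row)) {
    const type = schema[col] ?? '';
    if (type.startsWith('UInt') || type.startsWith('Int') || type.startsWith('Float')) {
      const n = Number(value);
      if (Number.isNaN(n)) throw new Error(`cannot coerce ${col}=${value} to ${type}`);
      out[col] = n; // number, not "42"
    } else {
      out[col] = value;
    }
  }
  return out;
}
```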
### 10. Distributed Table Errors
```
DB::Exception: All connection tries failed. ... (ALL_CONNECTION_TRIES_FAILED)
```
**Fix:**
```sql
-- Check cluster health
SELECT * FROM system.clusters;
-- Check replica status
SELECT database, table, is_leader, total_replicas, active_replicas
FROM system.replicas;
```
## Diagnostic Queries
```sql
-- Currently running queries
SELECT query_id, user, query, elapsed, read_rows, memory_usage
FROM system.processes;
-- Kill a stuck query
KILL QUERY WHERE query_id = 'abc-123';
-- Recent errors from query log
SELECT event_time, query, exception_code, exception
FROM system.query_log
WHERE type = 'ExceptionWhileProcessing'
ORDER BY event_time DESC
LIMIT 20;
-- Disk usage by table
SELECT
database, table,
formatReadableSize(sum(bytes_on_disk)) AS size,
sum(rows) AS total_rows,
count() AS parts
FROM system.parts WHERE active
GROUP BY database, table
ORDER BY sum(bytes_on_disk) DESC;
-- Merge health
SELECT database, table, progress, elapsed, num_parts
FROM system.merges;
```
## Error Handling
| Error Code | Name | Category |
|------------|------|----------|
| 16 | NO_SUCH_COLUMN_IN_TABLE | Schema |
| 60 | UNKNOWN_TABLE | Schema |
| 62 | SYNTAX_ERROR | Query |
| 159 | TIMEOUT_EXCEEDED | Performance |
| 164 | READONLY | Permissions |
| 202 | TOO_MANY_SIMULTANEOUS_QUERIES | Concurrency |
| 241 | MEMORY_LIMIT_EXCEEDED | Resources |
| 252 | TOO_MANY_PARTS | Insert pattern |
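The table above can double as a lookup in error-handling code. A sketch; the `retryable` flags are our judgment calls (e.g. a timeout or concurrency limit may clear on retry, a syntax error never will), not an official ClickHouse classification:

```javascript
// Error-code lookup built from the reference table above.
const CLICKHOUSE_ERRORS = {
  16:  { name: 'NO_SUCH_COLUMN_IN_TABLE',       category: 'Schema',         retryable: false },
  60:  { name: 'UNKNOWN_TABLE',                 category: 'Schema',         retryable: false },
  62:  { name: 'SYNTAX_ERROR',                  category: 'Query',          retryable: false },
  159: { name: 'TIMEOUT_EXCEEDED',              category: 'Performance',    retryable: true  },
  164: { name: 'READONLY',                      category: 'Permissions',    retryable: false },
  202: { name: 'TOO_MANY_SIMULTANEOUS_QUERIES', category: 'Concurrency',    retryable: true  },
  241: { name: 'MEMORY_LIMIT_EXCEEDED',         category: 'Resources',      retryable: false },
  252: { name: 'TOO_MANY_PARTS',                category: 'Insert pattern', retryable: true  },
};

function classify(code) {
  return CLICKHOUSE_ERRORS[code] ?? { name: 'UNKNOWN', category: 'Unknown', retryable: false };
}
```

A caller can then decide between retrying with backoff and surfacing the error immediately based on `classify(err.code).retryable`.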
## Resources
- [Error Codes Reference](https://clickhouse.com/docs/knowledgebase)
- [System Tables](https://clickhouse.com/docs/operations/system-tables)
- [Query Log](https://clickhouse.com/docs/operations/system-tables/query_log)
## Next Steps
For comprehensive debugging, see `clickhouse-debug-bundle`.

Related Skills
fathom-common-errors
Diagnose and fix Fathom API errors including auth failures and missing data. Use when API calls fail, transcripts are empty, or webhooks are not firing. Trigger with phrases like "fathom error", "fathom not working", "fathom api failure", "fix fathom".
exa-common-errors
Diagnose and fix Exa API errors by HTTP code and error tag. Use when encountering Exa errors, debugging failed requests, or troubleshooting integration issues. Trigger with phrases like "exa error", "fix exa", "exa not working", "debug exa", "exa 429", "exa 401".
evernote-common-errors
Diagnose and fix common Evernote API errors. Use when encountering Evernote API exceptions, debugging failures, or troubleshooting integration issues. Trigger with phrases like "evernote error", "evernote exception", "fix evernote issue", "debug evernote", "evernote troubleshooting".
elevenlabs-common-errors
Diagnose and fix ElevenLabs API errors by HTTP status code. Use when encountering ElevenLabs errors, debugging failed TTS/STS requests, or troubleshooting voice cloning and streaming issues. Trigger: "elevenlabs error", "fix elevenlabs", "elevenlabs not working", "debug elevenlabs", "elevenlabs 401", "elevenlabs 429", "elevenlabs 400".
documenso-common-errors
Diagnose and resolve common Documenso API errors and issues. Use when encountering Documenso errors, debugging integration issues, or troubleshooting failed operations. Trigger with phrases like "documenso error", "documenso 401", "documenso failed", "fix documenso", "documenso not working".
deepgram-common-errors
Diagnose and fix common Deepgram errors and issues. Use when troubleshooting Deepgram API errors, debugging transcription failures, or resolving integration issues. Trigger: "deepgram error", "deepgram not working", "fix deepgram", "deepgram troubleshoot", "transcription failed", "deepgram 401".
cursor-common-errors
Troubleshoot common Cursor IDE errors: authentication, completion, indexing, API, and performance issues. Triggers on "cursor error", "cursor not working", "cursor issue", "cursor problem", "fix cursor", "cursor crash".
coreweave-common-errors
Diagnose and fix CoreWeave GPU scheduling, pod, and networking errors. Use when pods are stuck Pending, GPUs are not allocated, or experiencing CUDA and NCCL errors. Trigger with phrases like "coreweave error", "coreweave pod pending", "coreweave gpu not found", "coreweave debug", "fix coreweave".
cohere-common-errors
Diagnose and fix Cohere API v2 errors and exceptions. Use when encountering Cohere errors, debugging failed requests, or troubleshooting CohereError, CohereTimeoutError, rate limits. Trigger with phrases like "cohere error", "fix cohere", "cohere not working", "debug cohere", "cohere 429", "cohere 400".
coderabbit-common-errors
Diagnose and fix CodeRabbit common errors and configuration issues. Use when CodeRabbit is not reviewing PRs, posting duplicate comments, ignoring configuration, or behaving unexpectedly. Trigger with phrases like "coderabbit error", "fix coderabbit", "coderabbit not working", "debug coderabbit", "coderabbit broken".
clickup-common-errors
Diagnose and fix ClickUp API v2 errors by HTTP status and error code. Use when encountering ClickUp API errors, debugging failed requests, or troubleshooting OAUTH_* error codes, 401s, 429s, and 500s. Trigger: "clickup error", "fix clickup", "clickup not working", "clickup 401", "clickup 429", "OAUTH error", "debug clickup API".
clickhouse-webhooks-events
Ingest data into ClickHouse from webhooks, Kafka, and streaming sources with batching, dedup, and exactly-once patterns. Use when building data ingestion pipelines, consuming webhook payloads, or integrating Kafka topics into ClickHouse. Trigger: "clickhouse ingestion", "clickhouse webhook", "clickhouse Kafka", "stream data to clickhouse", "clickhouse data pipeline".