Apache Kafka

## Overview

Best use case

Apache Kafka is best used when you need a repeatable AI agent workflow instead of a one-off prompt.

Teams using Apache Kafka should expect more consistent output, faster repeated execution, and less prompt rewriting.

When to use this skill

  • You want a reusable workflow that can be run more than once with consistent structure.

When not to use this skill

  • You only need a quick one-off answer and do not need a reusable workflow.
  • You cannot install or maintain the underlying files, dependencies, or repository context.

Installation

Claude Code / Cursor / Codex

curl -o ~/.claude/skills/kafka/SKILL.md --create-dirs "https://raw.githubusercontent.com/ComeOnOliver/skillshub/main/skills/TerminalSkills/skills/kafka/SKILL.md"

Manual Installation

  1. Download SKILL.md from GitHub
  2. Place it in .claude/skills/kafka/SKILL.md inside your project
  3. Restart your AI agent — it will auto-discover the skill

How Apache Kafka Compares

| Feature | Apache Kafka | Standard Approach |
| --- | --- | --- |
| Platform Support | Not specified | Limited / Varies |
| Context Awareness | High | Baseline |
| Installation Complexity | Unknown | N/A |

Frequently Asked Questions

What does this skill do?

This skill walks through working with Apache Kafka, a distributed event streaming platform: local setup with Docker, producing and consuming events from Node.js and Python, and best practices such as partition keys and idempotent consumers.

Where can I find the source code?

You can find the source code on GitHub using the link provided at the top of the page.

SKILL.md Source

# Apache Kafka

## Overview

Kafka is a distributed event streaming platform for high-throughput, fault-tolerant messaging. It's the backbone of event-driven architectures — used for real-time data pipelines, event sourcing, log aggregation, and microservice communication.

## Instructions

### Step 1: Local Setup

```yaml
# docker-compose.yml — Kafka with KRaft (no ZooKeeper)
services:
  kafka:
    image: bitnami/kafka:latest
    ports:
      - "9092:9092"
    environment:
      KAFKA_CFG_NODE_ID: 1
      KAFKA_CFG_PROCESS_ROLES: broker,controller
      KAFKA_CFG_LISTENERS: PLAINTEXT://:9092,CONTROLLER://:9093
      KAFKA_CFG_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_CFG_CONTROLLER_QUORUM_VOTERS: 1@kafka:9093
      KAFKA_CFG_CONTROLLER_LISTENER_NAMES: CONTROLLER
```

### Step 2: Node.js Producer

```typescript
// producer.ts — Send events to Kafka
import { Kafka, Partitioners } from 'kafkajs'

const kafka = new Kafka({ brokers: ['localhost:9092'] })
const producer = kafka.producer({ createPartitioner: Partitioners.DefaultPartitioner })

await producer.connect()

// Send single event
await producer.send({
  topic: 'orders',
  messages: [
    {
      key: 'order-123',                    // partition key (events with the same key go to the same partition)
      value: JSON.stringify({
        orderId: 'order-123',
        userId: 'user-456',
        items: [{ sku: 'WIDGET-1', quantity: 2, price: 29.99 }],
        total: 59.98,
        createdAt: new Date().toISOString(),
      }),
    },
  ],
})

// Batch send (events here is an illustrative array of objects with an id field)
const events = [
  { id: 'order-124', total: 19.99 },
  { id: 'order-125', total: 5.49 },
]
await producer.sendBatch({
  topicMessages: [
    { topic: 'orders', messages: events.map(e => ({ key: e.id, value: JSON.stringify(e) })) },
  ],
})

await producer.disconnect()
```
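The `key` field is what drives partition selection. KafkaJS's default partitioner hashes the key with murmur2; the sketch below illustrates only the invariant, using Python with CRC32 as a stand-in hash (`partition_for` is a hypothetical helper, not part of any Kafka client):

```python
# Sketch of the partitioning invariant, not Kafka's real algorithm:
# kafkajs uses murmur2; CRC32 stands in here. Same key, same partition.
import zlib

def partition_for(key: str, num_partitions: int) -> int:
    # Deterministic hash modulo partition count picks the partition
    return zlib.crc32(key.encode("utf-8")) % num_partitions

# Repeated sends with key 'order-123' always target one partition,
# which is what preserves per-order event ordering.
assert partition_for("order-123", 3) == partition_for("order-123", 3)

# Different keys spread load across the available partitions.
spread = {partition_for(f"order-{i}", 3) for i in range(100)}
print(spread)  # some subset of {0, 1, 2}
```

Because partition assignment is deterministic, changing the partition count of a topic reshuffles which keys land where, so size topics generously up front.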

### Step 3: Consumer

```typescript
// consumer.ts — Process events from Kafka
import { Kafka } from 'kafkajs'

const kafka = new Kafka({ brokers: ['localhost:9092'] })
const consumer = kafka.consumer({ groupId: 'order-service' })

await consumer.connect()
await consumer.subscribe({ topic: 'orders', fromBeginning: false })

await consumer.run({
  eachMessage: async ({ topic, partition, message }) => {
    if (!message.value) return  // skip tombstones / null payloads
    const order = JSON.parse(message.value.toString())
    console.log(`Processing order ${order.orderId} from partition ${partition}`)

    // Process the order (idempotently — messages can be redelivered)
    await processOrder(order)
  },
})

// Graceful shutdown
process.on('SIGTERM', async () => {
  await consumer.disconnect()
})
```
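The idempotency comment in the handler matters because Kafka's default guarantee is at-least-once: after a crash or rebalance, already-processed messages can be delivered again. A minimal Python sketch of the pattern, where the `processed_ids` set stands in for a durable store such as a database unique constraint:

```python
# Sketch: idempotent processing under at-least-once delivery.
processed_ids: set[str] = set()
results: list[str] = []

def process_order(order: dict) -> None:
    if order["orderId"] in processed_ids:
        return  # duplicate delivery: skip the side effect
    processed_ids.add(order["orderId"])
    results.append(order["orderId"])  # stand-in for the real side effect

# Kafka may redeliver the same message after a rebalance or retry:
for msg in [{"orderId": "order-123"},
            {"orderId": "order-123"},   # duplicate delivery
            {"orderId": "order-124"}]:
    process_order(msg)

print(results)  # → ['order-123', 'order-124']  (duplicate had no effect)
```

In production the dedup check and the side effect must be atomic (e.g. insert the order id and the result in one transaction), otherwise a crash between them reintroduces duplicates.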

### Step 4: Python Consumer

```python
# consumer.py — Kafka consumer with confluent-kafka
import json

from confluent_kafka import Consumer

conf = {
    'bootstrap.servers': 'localhost:9092',
    'group.id': 'analytics-service',
    'auto.offset.reset': 'earliest',
}

consumer = Consumer(conf)
consumer.subscribe(['orders'])

while True:
    msg = consumer.poll(1.0)
    if msg is None:
        continue
    if msg.error():
        print(f"Error: {msg.error()}")
        continue

    order = json.loads(msg.value().decode('utf-8'))
    print(f"Processing: {order['orderId']}")
```

## Guidelines

- Use partition keys to ensure related events go to the same partition (ordering guarantee).
- Consumer groups enable parallel processing — each partition is consumed by one consumer in the group.
- Make consumers idempotent — Kafka guarantees at-least-once delivery by default.
- For managed Kafka: Confluent Cloud, AWS MSK, or Redpanda (Kafka-compatible, simpler).
- KRaft mode (no ZooKeeper) is production-ready since Kafka 3.3+.
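The consumer-group guideline can be made concrete: the group's partitions are divided among its members, so each partition has exactly one consumer in the group, and consumers beyond the partition count sit idle. A simplified round-robin assignor sketch (Kafka's actual range and sticky assignors differ in detail; `assign_round_robin` is illustrative only):

```python
def assign_round_robin(partitions: list[int], consumers: list[str]) -> dict[str, list[int]]:
    """Deal partitions to consumers like cards: each partition gets one owner."""
    assignment: dict[str, list[int]] = {c: [] for c in consumers}
    for i, p in enumerate(partitions):
        assignment[consumers[i % len(consumers)]].append(p)
    return assignment

# Six partitions, two consumers: every partition has exactly one owner.
print(assign_round_robin([0, 1, 2, 3, 4, 5], ["c1", "c2"]))
# → {'c1': [0, 2, 4], 'c2': [1, 3, 5]}

# More consumers than partitions: c3 gets nothing and sits idle.
print(assign_round_robin([0, 1], ["c1", "c2", "c3"]))
# → {'c1': [0], 'c2': [1], 'c3': []}
```

This is why partition count caps the parallelism of a consumer group: adding consumers past that number adds no throughput.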

Related Skills

kafka-stream-processor

from ComeOnOliver/skillshub

Kafka Stream Processor - Auto-activating skill for Data Pipelines. Triggers on: kafka stream processor. Part of the Data Pipelines skill category.

kafka-producer-consumer

Kafka Producer Consumer - Auto-activating skill for Backend Development. Triggers on: kafka producer consumer. Part of the Backend Development skill category.

deploying-kafka-k8s

Deploys Apache Kafka on Kubernetes using the Strimzi operator with KRaft mode. Use when setting up Kafka for event-driven microservices, message queuing, or pub/sub patterns. Covers operator installation, cluster creation, topic management, and producer/consumer testing. NOT when using managed Kafka (Confluent Cloud, MSK) or local development without K8s.

Upstash — Serverless Redis, Kafka & QStash

You are an expert in Upstash, the serverless data platform for Redis, Kafka, and QStash. You help developers add caching, rate limiting, session storage, message queuing, and scheduled jobs to serverless and edge applications — with HTTP-based APIs that work on Vercel Edge, Cloudflare Workers, and AWS Lambda without persistent connections.

KafkaJS — Apache Kafka Client for Node.js

You are an expert in KafkaJS, the pure JavaScript Apache Kafka client for Node.js. You help developers build event-driven architectures with producers, consumers, consumer groups, exactly-once semantics, SASL authentication, and admin operations — processing millions of events per second for real-time analytics, event sourcing, log aggregation, and microservices communication.

Apache Flink

Apache Spark

Apache Arrow — Columnar Data Format

Daily Logs

Record the user's daily activities, progress, decisions, and learnings in a structured, chronological format.

Socratic Method: The Dialectic Engine

This skill transforms Claude into a Socratic agent — a cognitive partner who guides users toward discovering knowledge through systematic questioning rather than direct instruction.

Sokratische Methode: Die Dialektik-Maschine

This skill turns Claude into a Socratic agent: a cognitive partner who guides users to discover knowledge through systematic questioning instead of instructing them directly.

College Football Data (CFB)

Before writing queries, consult `references/api-reference.md` for endpoints, conference IDs, team IDs, and data shapes.