KafkaJS — Apache Kafka Client for Node.js

You are an expert in KafkaJS, the pure JavaScript Apache Kafka client for Node.js. You help developers build event-driven architectures with producers, consumers, consumer groups, exactly-once semantics, SASL authentication, and admin operations — processing millions of events per second for real-time analytics, event sourcing, log aggregation, and microservices communication.


Best use case

This skill is best used when you need a repeatable AI agent workflow rather than a one-off prompt. Teams adopting it should expect more consistent output, faster repeated execution, and less prompt rewriting.

When to use this skill

  • You want a reusable workflow that can be run more than once with consistent structure.

When not to use this skill

  • You only need a quick one-off answer and do not need a reusable workflow.
  • You cannot install or maintain the underlying files, dependencies, or repository context.

Installation

Claude Code / Cursor / Codex

curl -o ~/.claude/skills/kafka-js/SKILL.md --create-dirs "https://raw.githubusercontent.com/ComeOnOliver/skillshub/main/skills/TerminalSkills/skills/kafka-js/SKILL.md"

Manual Installation

  1. Download SKILL.md from GitHub
  2. Place it in .claude/skills/kafka-js/SKILL.md inside your project
  3. Restart your AI agent — it will auto-discover the skill

How KafkaJS — Apache Kafka Client for Node.js Compares

| Feature | KafkaJS — Apache Kafka Client for Node.js | Standard Approach |
| --- | --- | --- |
| Platform Support | Not specified | Limited / Varies |
| Context Awareness | High | Baseline |
| Installation Complexity | Unknown | N/A |

Frequently Asked Questions

What does this skill do?

It primes your AI agent to act as a KafkaJS expert: building producers, consumers, and consumer groups, configuring exactly-once semantics and SASL authentication, and running admin operations for event-driven Node.js services.

Where can I find the source code?

You can find the source code on GitHub using the link provided at the top of the page.

SKILL.md Source

# KafkaJS — Apache Kafka Client for Node.js

You are an expert in KafkaJS, the pure JavaScript Apache Kafka client for Node.js. You help developers build event-driven architectures with producers, consumers, consumer groups, exactly-once semantics, SASL authentication, and admin operations — processing millions of events per second for real-time analytics, event sourcing, log aggregation, and microservices communication.

## Core Capabilities

### Producer

```typescript
import { Kafka, Partitioners, CompressionTypes } from "kafkajs";

const kafka = new Kafka({
  clientId: "my-app",
  brokers: ["kafka1:9092", "kafka2:9092", "kafka3:9092"],
  ssl: true,
  sasl: { mechanism: "plain", username: process.env.KAFKA_USER!, password: process.env.KAFKA_PASS! },
  retry: { initialRetryTime: 300, retries: 10 },
});

const producer = kafka.producer({
  createPartitioner: Partitioners.DefaultPartitioner,
  idempotent: true,                       // No duplicate messages on broker retries
  maxInFlightRequests: 1,                 // Required when idempotent is enabled
  transactionalId: "order-service",       // Needed only for the transactional example below
});

await producer.connect();

// Send single message
await producer.send({
  topic: "orders",
  messages: [
    {
      key: order.userId,                  // Same user → same partition → ordered
      value: JSON.stringify({ type: "order.created", data: order }),
      headers: { "correlation-id": requestId, "source": "order-service" },
    },
  ],
  compression: CompressionTypes.GZIP,
});

// Transactional send (atomic multi-topic)
const transaction = await producer.transaction();
try {
  await transaction.send({ topic: "orders", messages: [{ key: order.id, value: JSON.stringify(order) }] });
  await transaction.send({ topic: "notifications", messages: [{ key: order.userId, value: JSON.stringify(notification) }] });
  await transaction.commit();
} catch (err) {
  await transaction.abort();
}
```

### Consumer

```typescript
const consumer = kafka.consumer({
  groupId: "order-processor",
  sessionTimeout: 30000,
  heartbeatInterval: 3000,
  maxBytesPerPartition: 1048576,          // 1MB per partition fetch
});

await consumer.connect();
await consumer.subscribe({ topics: ["orders", "payments"], fromBeginning: false });

await consumer.run({
  eachMessage: async ({ topic, partition, message }) => {
    const event = JSON.parse(message.value!.toString());

    switch (topic) {
      case "orders":
        await processOrder(event);
        break;
      case "payments":
        await processPayment(event);
        break;
    }
  },
});

// Alternative: batch processing for throughput (call run() once, with either handler)
await consumer.run({
  eachBatch: async ({ batch, resolveOffset, heartbeat }) => {
    for (const message of batch.messages) {
      await processMessage(message);
      resolveOffset(message.offset);
      await heartbeat();                  // Prevent session timeout on long batches
    }
  },
});

// Graceful shutdown
const shutdown = async () => {
  await consumer.disconnect();
  await producer.disconnect();
  process.exit(0);
};
process.on("SIGTERM", shutdown);
```

## Installation

```bash
npm install kafkajs
```

## Best Practices

1. **Idempotent producer** — Enable `idempotent: true` for exactly-once delivery; prevents duplicate messages on retries
2. **Key-based partitioning** — Use message keys (userId, orderId) to ensure related events go to the same partition (ordered)
3. **Consumer groups** — Add more consumers to a group for horizontal scaling; Kafka auto-rebalances partitions
4. **Manual offset commits** — Commit offsets after processing, not before; prevents data loss on consumer crashes
5. **Heartbeat in batches** — Call `heartbeat()` during long batch processing to prevent session timeout
6. **Dead-letter topics** — Send failed messages to a DLT (`topic.DLT`) after retries; don't block the consumer
7. **Schema validation** — Use Avro/Protobuf with Schema Registry for strong typing across producers/consumers
8. **Compression** — Use GZIP or LZ4 compression; typically reduces network bandwidth by 60-80% for JSON payloads

Related Skills

All from ComeOnOliver/skillshub:

  • websocket-client-creator — WebSocket client creation (API Integration)
  • oauth-client-setup — OAuth client setup (API Integration)
  • kafka-stream-processor — Kafka stream processing (Data Pipelines)
  • kafka-producer-consumer — Kafka producers and consumers (Backend Development)
  • http-client-config — HTTP client configuration (API Integration)
  • api-client-generator — API client generation (API Integration)
  • building-n8n-nodes — Builds custom community nodes for n8n, covering declarative and programmatic styles, from scaffolding to publishing
  • apollo-client — Guide for building React applications with Apollo Client 4.x: queries, mutations, caching, and local state
  • react-flow-node-ts — React Flow node components with TypeScript types, handles, and Zustand integration
  • nodejs-best-practices — Node.js development principles: framework selection, async patterns, security, and architecture
  • nodejs-backend-patterns — Production-ready Node.js backends with Express/Fastify: middleware, error handling, auth, and API design
  • n8n-node-configuration — Operation-aware n8n node configuration guidance: property dependencies, required fields, and common patterns