meshtastic-detection

Receive DETECTION_SENSOR_APP alerts from Meshtastic LoRa devices via USB. When the remote sensor GPIO triggers (preset target detected), store the event and alert the user immediately.

3,891 stars

Best use case

meshtastic-detection is best used when you need a repeatable AI agent workflow instead of a one-off prompt.

Teams using meshtastic-detection should expect more consistent output, faster repeated execution, and less prompt rewriting.

When to use this skill

  • You want a reusable workflow that can be run more than once with consistent structure.

When not to use this skill

  • You only need a quick one-off answer and do not need a reusable workflow.
  • You cannot install or maintain the underlying files, dependencies, or repository context.

Installation

Claude Code / Cursor / Codex

curl -o ~/.claude/skills/meshtastic-detection/SKILL.md --create-dirs "https://raw.githubusercontent.com/openclaw/skills/main/skills/autume/meshtastic-detection/SKILL.md"

Manual Installation

  1. Download SKILL.md from GitHub
  2. Place it in .claude/skills/meshtastic-detection/SKILL.md inside your project
  3. Restart your AI agent — it will auto-discover the skill

How meshtastic-detection Compares

Feature                 | meshtastic-detection | Standard Approach
Platform Support        | Not specified        | Limited / Varies
Context Awareness       | High                 | Baseline
Installation Complexity | Unknown              | N/A

Frequently Asked Questions

What does this skill do?

Receive DETECTION_SENSOR_APP alerts from Meshtastic LoRa devices via USB. When the remote sensor GPIO triggers (preset target detected), store the event and alert the user immediately.

Where can I find the source code?

The source code lives in the openclaw/skills repository on GitHub (see the raw URL in the installation command above).

SKILL.md Source

# Meshtastic Detection Skill

Receive detection sensor alerts from a remote Meshtastic device over LoRa. When the remote device's GPIO pin triggers (preset target detected), the event is stored locally and requires immediate user notification via feishu.

## Prerequisites

- Meshtastic-compatible hardware connected via USB (RAK4631, T-Beam, Heltec, etc.)
- Python 3.10+ with `meshtastic` and `pypubsub` packages (venv at `{baseDir}/venv`)
- `usb_receiver.py` daemon running
- Quick setup: `cd {baseDir} && ./setup.sh`
- Detailed guide: `{baseDir}/references/SETUP.md`

## Architecture

```
┌──────────────────────────────────────────────────────────────┐
│                     USB Receiver Daemon                       │
├──────────────────────────────────────────────────────────────┤
│  LISTEN:  DETECTION_SENSOR_APP only (GPIO trigger events)    │
│  STORE:   data/sensor_data.jsonl (append per detection)      │
│  LATEST:  data/latest.json (most recent detection)           │
└──────────────────────────────────────────────────────────────┘

┌─────────────┐     USB      ┌──────────────┐
│  LoRa Node  │◄────────────►│ usb_receiver │
│  (Radio)    │              │   daemon     │
└─────────────┘              └──────┬───────┘
                                    │
                    ┌───────────────┼───────────────┐
                    │               │               │
                    ▼               ▼               ▼
          sensor_cli.py     event_monitor.py   OpenClaw cron
          (query data)      (check alerts)     (feishu alert)
```

## Quick Reference

### Run the Receiver

```bash
cd {baseDir}
source venv/bin/activate
python scripts/usb_receiver.py --port /dev/cu.usbmodem1CDBD4A896441
```

### Check for New Alerts

```bash
cd {baseDir}
./venv/bin/python scripts/event_monitor.py
```

Every `DETECTION_SENSOR_APP` record = high-priority alert. Output:

```json
{
  "alerts": [{"priority": "high", "sender": "!1dd29c50", "text": "alert detected", "received_at": "...", "channel": "ch0", "portnum": "DETECTION_SENSOR_APP"}],
  "summary": "🚨 3 new detection alert(s) from 3 record(s)",
  "alert_count": 3,
  "new_records": 3
}
```
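
The actual monitor lives in `scripts/event_monitor.py`; the following is only a rough sketch of the incremental approach (function name and state-file shape are illustrative, not the script's real internals). The idea is to persist a byte offset so each run re-reads only what was appended since the last run, and to rescan from the start if the file shrank because of rotation:

```python
import json
import os

def new_alerts(log_path, state_path):
    """Return alerts appended since the last run, tracked via a byte offset."""
    offset = 0
    if os.path.exists(state_path):
        with open(state_path, encoding="utf-8") as f:
            offset = json.load(f).get("offset", 0)
    alerts = []
    if not os.path.exists(log_path):
        return alerts
    if os.path.getsize(log_path) < offset:
        offset = 0  # file shrank: log was rotated, rescan from the start
    with open(log_path, encoding="utf-8") as f:
        f.seek(offset)
        while True:
            line = f.readline()
            if not line:
                break
            rec = json.loads(line)
            if rec.get("portnum") == "DETECTION_SENSOR_APP":
                alerts.append({"priority": "high", "sender": rec.get("sender")})
        offset = f.tell()  # readline (not iteration) keeps tell() usable here
    with open(state_path, "w", encoding="utf-8") as f:
        json.dump({"offset": offset}, f)
    return alerts
```

A production version would also guard against a partial trailing line written mid-append; this sketch assumes whole lines.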

### Query Historical Data

```bash
python scripts/sensor_cli.py latest
python scripts/sensor_cli.py stats --since 24h
python scripts/sensor_cli.py query --since 1h
```
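
The `--since` flag takes a window spec such as `1h` or `24h`. A minimal parser for such specs, assuming minute/hour/day suffixes only (the real CLI may accept more), could look like:

```python
from datetime import datetime, timedelta, timezone

UNITS = {"m": "minutes", "h": "hours", "d": "days"}

def since_cutoff(spec, now=None):
    """Turn a window spec like '90m', '1h', or '24h' into a UTC cutoff datetime."""
    now = now or datetime.now(timezone.utc)
    value, unit = int(spec[:-1]), spec[-1]
    return now - timedelta(**{UNITS[unit]: value})
```

Records whose ISO 8601 `received_at` parses (via `datetime.fromisoformat`) to a time at or after the cutoff fall inside the window.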

### Data Format

Each record in `data/sensor_data.jsonl`:

```json
{"received_at": "2026-03-04T11:07:06+00:00", "sender": "!1dd29c50", "channel": "ch0", "portnum": "DETECTION_SENSOR_APP", "data": {"type": "detection", "text": "alert detected"}}
```
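
Appending in this one-object-per-line shape, plus mirroring the newest record to `latest.json`, can be sketched as follows (illustrative helper, not the receiver's actual code):

```python
import json

def append_record(record, log_path, latest_path):
    """Append one detection as a JSON line, then mirror it to latest.json."""
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record, ensure_ascii=False) + "\n")
    with open(latest_path, "w", encoding="utf-8") as f:
        json.dump(record, f, ensure_ascii=False, indent=2)
```

One object per line keeps appends atomic enough for a single writer and lets readers recover cleanly from a truncated final line.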

**Only `DETECTION_SENSOR_APP` messages are captured.** This portnum means the remote sensor's GPIO pin was triggered — a preset target has been detected. **Every detection event requires immediate user alert.**

All other message types (TEXT_MESSAGE_APP, telemetry, position, etc.) are ignored.
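
As an illustrative sketch of that filter (the field names `decoded`, `portnum`, and `fromId` follow the packet dicts the meshtastic Python library passes to receive callbacks, but treat the exact shape as an assumption here):

```python
from datetime import datetime, timezone

def handle_packet(packet):
    """Build a record for detection packets; return None for everything else."""
    decoded = packet.get("decoded") or {}
    if decoded.get("portnum") != "DETECTION_SENSOR_APP":
        return None  # TEXT_MESSAGE_APP, telemetry, position, ... are dropped
    return {
        "received_at": datetime.now(timezone.utc).isoformat(),
        "sender": packet.get("fromId"),
        "channel": f"ch{packet.get('channel', 0)}",
        "portnum": "DETECTION_SENSOR_APP",
        "data": {"type": "detection", "text": decoded.get("text", "")},
    }
```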

### Log Rotation

`sensor_data.jsonl` is automatically rotated at **5 MB** (keeps 2 archive files, total max ~15 MB). Rotation is transparent — `event_monitor` auto-resets offset, `sensor_cli` reads across archives.
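
A size-based rotation of this kind (current file plus two archives, so roughly three times the 5 MB threshold on disk) can be sketched as follows; this is a hypothetical helper, and the receiver's real rotation logic may differ:

```python
import os

def rotate_if_needed(path, max_bytes=5 * 1024 * 1024, keep=2):
    """Rotate path -> path.1 -> path.2 once it grows past max_bytes."""
    if not os.path.exists(path) or os.path.getsize(path) < max_bytes:
        return False
    oldest = f"{path}.{keep}"
    if os.path.exists(oldest):
        os.remove(oldest)  # drop the oldest archive
    for i in range(keep - 1, 0, -1):
        src = f"{path}.{i}"
        if os.path.exists(src):
            os.replace(src, f"{path}.{i + 1}")  # shift archives down
    os.replace(path, f"{path}.1")  # current log becomes newest archive
    return True
```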

## Monitoring & Alerts

### Cron Job (Active)

The cron job runs `event_monitor.py` every 60 seconds and delivers alerts to feishu:

```bash
# Check status
openclaw cron list

# View run history
openclaw cron runs --id <job-id>

# Manual test
openclaw cron run <job-id>

# Edit config
openclaw cron edit <job-id> --timeout-seconds 60 --to <feishu-open-id>
```

Cron message template (for reference):

```
Run this command and report the output:
cd {baseDir} && ./venv/bin/python scripts/event_monitor.py
— If alert_count > 0, tell me how many alerts, the latest sender and time.
  If alert_count is 0, reply: 暂无新告警。
```

Key settings:
- `timeoutSeconds: 60` (agent needs ~20-40s)
- `channel: feishu`
- `delivery.to: ou_16c6dc8bda8ac97abfd0194568edee59`

### Alert Behavior

All `DETECTION_SENSOR_APP` events are treated as **high priority**. No rule configuration needed — every detection triggers an immediate alert. The alert message includes:
- Sender device ID
- Detection text (from remote sensor config)
- Timestamp

## Configuration

Edit `CONFIG.md` to customize:

- **Serial port** — USB device path
- **Notification channel** — `feishu` (configured in OpenClaw)

## Common Conversation Patterns

**User asks about recent detections:**
> "What was detected in the last hour?"

Run: `cd {baseDir} && ./venv/bin/python scripts/sensor_cli.py query --since 1h`

**User asks for statistics:**
> "Give me a summary of detections today"

Run: `cd {baseDir} && ./venv/bin/python scripts/sensor_cli.py stats --since 24h`

**User asks about system status:**
> "Is the sensor still working?"

Run: `cd {baseDir} && ./venv/bin/python scripts/sensor_cli.py status`

## Files

```
{baseDir}/
├── SKILL.md               # This file (agent instructions + metadata)
├── CONFIG.md              # User configuration
├── setup.sh               # One-click setup
├── scripts/
│   ├── usb_receiver.py    # USB serial daemon (DETECTION_SENSOR_APP only)
│   ├── event_monitor.py   # Incremental alert monitor
│   └── sensor_cli.py      # Query CLI
├── data/
│   ├── sensor_data.jsonl  # Detection records (auto-rotated at 5 MB)
│   ├── latest.json        # Most recent detection
│   └── monitor_state.json # Monitor byte offset + seen hashes
└── references/
    └── SETUP.md           # Detailed installation guide
```

## Troubleshooting

**"No records found"**
- Check that `usb_receiver.py` is running
- Verify USB device: `ls /dev/cu.usb*`

**"Resource temporarily unavailable"**
- Only one process can use the serial port. Check: `lsof /dev/cu.usbmodem*`

**Receiver connects but no data appears**
- The receiver only captures `DETECTION_SENSOR_APP` messages (other types are ignored)
- Run with `--debug` to see all packets: `python scripts/usb_receiver.py --port ... --debug`
- Verify the remote device is on the same channel and frequency
- Confirm the remote device has Detection Sensor Settings configured (GPIO pin monitoring)

**Cron job times out or fails delivery**
- Check: `openclaw cron runs --id <job-id>`
- Fix timeout: `openclaw cron edit <job-id> --timeout-seconds 60`
- Fix delivery: `openclaw cron edit <job-id> --to <feishu-open-id>`

Related Skills (all from openclaw/skills)

错敏信息检测 API (Sensitive Content Detection)

A FastAPI-based service for detecting sensitive or erroneous content in text: sensitive words, typos, and non-standard phrasing.

Content Moderation & Analysis

article-factory-wechat

Content & Documentation

humanizer

Remove signs of AI-generated writing from text. Use when editing or reviewing text to make it sound more natural and human-written. Based on Wikipedia's comprehensive "Signs of AI writing" guide. Detects and fixes patterns including: inflated symbolism, promotional language, superficial -ing analyses, vague attributions, em dash overuse, rule of three, AI vocabulary words, negative parallelisms, and excessive conjunctive phrases.

Content & Documentation

find-skills

Helps users discover and install agent skills when they ask questions like "how do I do X", "find a skill for X", "is there a skill that can...", or express interest in extending capabilities. This skill should be used when the user is looking for functionality that might exist as an installable skill.

General Utilities

tavily-search

Use the Tavily API for real-time web search and content extraction. Use when the user needs real-time web search results, research, or current information from the web. Requires a Tavily API key.

Data & Research

baidu-search

Search the web using Baidu AI Search Engine (BDSE). Use for live information, documentation, or research topics.

Data & Research

agent-autonomy-kit

Stop waiting for prompts. Keep working.

Workflow & Productivity

Meeting Prep

Your agent researches all attendees before calendar events, pulling LinkedIn profiles, recent company news, mutual connections, and conversation starters. Generates a briefing doc with talking points, icebreakers, and context so you show up informed and confident. Triggered automatically before meetings or on demand. Configure research depth, advance timing, and output format. Use when setting up meeting intelligence, researching specific attendees, generating pre-meeting briefs, or automating your prep workflow.

Workflow & Productivity

self-improvement

Captures learnings, errors, and corrections to enable continuous improvement. Use when: (1) A command or operation fails unexpectedly, (2) User corrects Claude ('No, that's wrong...', 'Actually...'), (3) User requests a capability that doesn't exist, (4) An external API or tool fails, (5) Claude realizes its knowledge is outdated or incorrect, (6) A better approach is discovered for a recurring task. Also review learnings before major tasks.

Agent Intelligence & Learning

botlearn-healthcheck

BotLearn autonomous health inspector for OpenClaw instances across 5 domains (hardware, config, security, skills, autonomy); triggers on system check, health report, diagnostics, or scheduled heartbeat inspection.

DevOps & Infrastructure

linkedin-cli

A bird-like LinkedIn CLI for searching profiles, checking messages, and summarizing your feed using session cookies.

Content & Documentation

notebooklm

An OpenClaw skill for the unofficial Google NotebookLM Python API. Supports content generation (podcasts, videos, slides, quizzes, mind maps, etc.), document management, and research automation. Triggers when the user wants to use NotebookLM to generate audio overviews, videos, or study materials, or to manage a knowledge base.

Data & Research