hunt-report

Crypto hunt report — aggregate 4-hour hunting logs into actionable intelligence

33 stars

Best use case

hunt-report is best used when you need a repeatable AI agent workflow instead of a one-off prompt.


Teams using hunt-report should expect more consistent output, faster repeated execution, and less prompt rewriting.

When to use this skill

  • You want a reusable workflow that can be run more than once with consistent structure.

When not to use this skill

  • You only need a quick one-off answer and do not need a reusable workflow.
  • You cannot install or maintain the underlying files, dependencies, or repository context.

Installation

Claude Code / Cursor / Codex

curl -o ~/.claude/skills/hunt-report/SKILL.md --create-dirs "https://raw.githubusercontent.com/aAAaqwq/AGI-Super-Team/main/skills/hunt-report/SKILL.md"

Manual Installation

  1. Download SKILL.md from GitHub
  2. Place it in .claude/skills/hunt-report/SKILL.md inside your project
  3. Restart your AI agent — it will auto-discover the skill

How hunt-report Compares

| Feature | hunt-report | Standard Approach |
|---|---|---|
| Platform Support | Not specified | Limited / Varies |
| Context Awareness | High | Baseline |
| Installation Complexity | Unknown | N/A |

Frequently Asked Questions

What does this skill do?

Crypto hunt report — aggregate 4-hour hunting logs into actionable intelligence

Where can I find the source code?

You can find the source code on GitHub using the link provided at the top of the page.

SKILL.md Source

# ⚡ Hunt Report Skill

You are Quant. Read the hunt logs from the last 4 hours and consolidate them into a report to push.

## Iron Rules
- **Execute no trades**: read results and report only
- New trades → crypto hunt execution
- Stop-losses → position management

## Step 1: Read the last 4h of hunt logs

```bash
echo "=== Hunt log, last 4h ==="
python3 -c "
import json, sys
from datetime import datetime, timedelta, timezone

cutoff = datetime.now(timezone.utc) - timedelta(hours=4)
entries = []
try:
    with open('${QUANT_WORKSPACE}/data/hunt-log.jsonl') as f:
        for line in f:
            line = line.strip()
            if not line: continue
            try:
                e = json.loads(line)
                ts = datetime.fromisoformat(e['ts'].replace('Z','+00:00'))
                if ts >= cutoff:
                    entries.append(e)
            except (KeyError, ValueError): pass
except FileNotFoundError:
    print('NO_LOG_FILE')
    sys.exit(0)

if not entries:
    print('NO_RECENT_ENTRIES')
    print(f'Cutoff: {cutoff.isoformat()}')
    # Show last 3 entries regardless of time
    try:
        with open('${QUANT_WORKSPACE}/data/hunt-log.jsonl') as f:
            all_lines = [l.strip() for l in f if l.strip()]
        print(f'Total entries in log: {len(all_lines)}')
        for l in all_lines[-3:]:
            print(l)
    except Exception: pass
    sys.exit(0)

print(f'Found {len(entries)} entries in last 4h')
trades_total = []
sweet_total = []
for e in entries:
    print(f\"--- {e.get('ts_local','?')} ---\")
    print(f\"Prices: {json.dumps(e.get('prices',{}))}\")
    sweets = e.get('sweet_spots', [])
    trades = e.get('trades', [])
    print(f\"Sweet spots: {len(sweets)} | Trades: {len(trades)}\")
    if sweets:
        for s in sweets:
            print(f\"  SWEET: {s.get('market','')} {s.get('side','')} @{s.get('price_c','')}¢ buf:{s.get('buffer_pct','')}% trend:{s.get('trend','')} entry:{s.get('entry','')}\")
    if trades:
        for t in trades:
            print(f\"  TRADE: {t.get('market','')} {t.get('side','')} \${t.get('amount_usd','')} @{t.get('price_c','')}¢\")
    else:
        print(f\"  Skip: {e.get('skipped_reason','unknown')}\")
    print(f\"  Summary: {e.get('summary','')}\")
    trades_total.extend(trades)
    sweet_total.extend(sweets)

print('\\n=== 4h Summary ===')
print(f'Scans: {len(entries)}')
print(f'Sweet spots found: {len(sweet_total)}')
print(f'Trades executed: {len(trades_total)}')
"
```
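The reader above expects one JSON object per line in hunt-log.jsonl, with at least an ISO-8601 UTC `ts` field; the other fields (`prices`, `sweet_spots`, `trades`, `skipped_reason`, `summary`) are read with safe defaults. A minimal sketch of the entry shape and the same 4-hour cutoff filter (all field values here are hypothetical):

```python
import json
from datetime import datetime, timedelta, timezone

now = datetime.now(timezone.utc)

# Hypothetical hunt-log.jsonl entry using the fields Step 1 reads.
entry = {
    "ts": (now - timedelta(hours=1)).strftime("%Y-%m-%dT%H:%M:%SZ"),
    "ts_local": "16:00",
    "prices": {"BTC": 95000, "ETH": 4800},
    "sweet_spots": [],
    "trades": [],
    "skipped_reason": "no sweet spot",
    "summary": "quiet scan",
}
line = json.dumps(entry)  # one line of hunt-log.jsonl

# Same filter as the skill: keep entries newer than now minus 4 hours.
cutoff = now - timedelta(hours=4)
e = json.loads(line)
ts = datetime.fromisoformat(e["ts"].replace("Z", "+00:00"))
print("recent" if ts >= cutoff else "stale")
```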

```bash
echo "=== Latest Elon hunt result ==="
cat ${WORKSPACE}/data/hunt-elon-latest.json 2>/dev/null || echo "No records"

echo "=== Portfolio snapshot ==="
cat ${WORKSPACE}/data/portfolio-snapshot.json 2>/dev/null || echo "No records"
```

## Step 2: Consolidate the report

Consolidate the output of Step 1 into:

```
🔍 Hunt Report @ HH:MM (last 4h)
━━━━━━━━━━━━━━
📈 BTC:$XX ETH:$XX SOL:$XX GOLD:$XX
💰 Portfolio:$XX | Cash:$XX

━━ Last 4h hunts (X scans) ━━
🎯 Sweet spots found: X
⚡ Trades executed: X

[List the key findings of each scan]
• HH:MM — [sweet spot/no opportunity] | [trade/skip reason]

━━ Successful trades ━━
[If there were trades, list the details]
• Market | Side | $Amount @Price¢

━━ Elon tweets ━━
🐦 [Elon market status/no active markets]

━━ Strategy status ━━
🟢 S1 sweet spot: [active/quiet]
🔵 S2 trend: [status]
🐦 S7 tweets: [status]
```

If the log file is missing or has no entries in the last 4h, note in the report: "⚠️ No data in the hunt log; check whether the hunt cron is writing to hunt-log.jsonl correctly."
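As a sketch, the 4h summary block of the template above can be assembled from the Step 1 totals (the counts here are hypothetical):

```python
scans, sweets, trades = 6, 2, 1  # hypothetical 4h totals from Step 1
summary = "\n".join([
    f"━━ Last 4h hunts ({scans} scans) ━━",
    f"🎯 Sweet spots found: {sweets}",
    f"⚡ Trades executed: {trades}",
])
print(summary)
```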

## Step 3: Push

**Push to Daniel's DM using the message tool**:
```
message(action='send', channel='telegram', target='${TELEGRAM_TARGET_ID}', message='report content')
```

## Step 4: Update memory
Append to memory/$(date +%Y-%m-%d).md
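A minimal sketch of that append, using the date-stamped path from the step above (the log line itself is a hypothetical example):

```python
import os
from datetime import datetime

# Mirror the shell's memory/$(date +%Y-%m-%d).md path (local time).
os.makedirs("memory", exist_ok=True)
path = datetime.now().strftime("memory/%Y-%m-%d.md")
with open(path, "a", encoding="utf-8") as f:
    f.write("- hunt report pushed: 6 scans, 1 trade\n")  # hypothetical entry
```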

## Step 5: Log cleanup (optional)

If hunt-log.jsonl exceeds 1000 lines, keep only the most recent 500:
```bash
LOG=${WORKSPACE}/data/hunt-log.jsonl
LINES=$(wc -l < "$LOG" 2>/dev/null || echo 0)
if [ "$LINES" -gt 1000 ]; then
  tail -n 500 "$LOG" > "${LOG}.tmp" && mv "${LOG}.tmp" "$LOG"
  echo "Trimmed hunt-log from $LINES to 500 lines"
fi
```
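The same trim policy, illustrated in Python with an in-memory stand-in for the log file:

```python
# Keep only the newest 500 lines once the log exceeds 1000,
# matching the shell snippet above; the list stands in for
# the lines of hunt-log.jsonl.
lines = [f'{{"scan": {i}}}' for i in range(1200)]
if len(lines) > 1000:
    lines = lines[-500:]
print(len(lines))
```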

## Changelog
- v2.0 (2026-03-17): Rewrite: read the rolling hunt-log.jsonl for the last 4h instead of latest.json
- v1.0 (2026-03-15): Migrated from a cron prompt to a skill

Related Skills

All of the following are from aAAaqwq/AGI-Super-Team.

token-reporter

Daily automated accounting of OpenClaw instance token consumption and work output, reported to a Feishu Bitable: scans JSONL logs to aggregate tokens by model, collects each agent's daily work summary, and writes the result to the Bitable. Triggers: 'token报告', 'token report', '日报', '每日汇报', '飞书上报'.

team-daily-report

Automatically summarizes agent, cron, and skill progress and key events across the team, then generates and pushes a structured daily report.

soushen-hunter

Soushen Hunter: high-performance Bing search and deep web-page information extraction built on Playwright.

clawbio-pharmgx-reporter

Pharmacogenomic report from DTC genetic data (23andMe/AncestryDNA)

crypto-hunt

Scan all crypto markets for sweet-spot opportunities and entry timing signals

wemp-operator

Full-featured WeChat Official Account operations: API wrappers for drafts, publishing, comments, users, media, broadcasts, statistics, menus, and QR codes.

Content & Documentation

zsxq-smart-publish

Publish and manage content on 知识星球 (zsxq.com). Supports talk posts, Q&A, long articles, file sharing, digest/bookmark, homework tasks, and tag management. Use when publishing content to 知识星球, creating/editing posts, uploading files/images/audio, managing digests, batch publishing, or formatting content for 知识星球.

zoom-automation

Automate Zoom meeting creation, management, recordings, webinars, and participant tracking via Rube MCP (Composio). Always search tools first for current schemas.

zoho-crm-automation

Automate Zoho CRM tasks via Rube MCP (Composio): create/update records, search contacts, manage leads, and convert leads. Always search tools first for current schemas.

ziliu-publisher

Ziliu (字流): an AI-driven multi-platform content distribution tool. Create once, adapt the layout automatically, and publish with one click to 16+ platforms (WeChat Official Accounts, Zhihu, Xiaohongshu, Bilibili, Douyin, Weibo, X, etc.). Use when the user needs multi-platform publishing, content layout, or format adaptation. Trigger words: 字流, ziliu, 多平台发布, 一键分发, 内容分发, 排版发布.

zhihu-post-skill

Zhihu article publishing: automated content creation and publishing on the Zhihu platform.

zendesk-automation

Automate Zendesk tasks via Rube MCP (Composio): tickets, users, organizations, replies. Always search tools first for current schemas.