repo-metadata
This skill should be used when the user asks to "generate repository metadata", "create catalog-info.yaml", "add repo metadata", "document repository structure", or mentions generating structured metadata for service catalog or architecture documentation.
Best use case
repo-metadata is best used when you need a repeatable AI agent workflow instead of a one-off prompt.
Teams using repo-metadata should expect more consistent output, faster repeated execution, and less prompt rewriting.
When to use this skill
- You want a reusable workflow that can be run more than once with consistent structure.
When not to use this skill
- You only need a quick one-off answer and do not need a reusable workflow.
- You cannot install or maintain the underlying files, dependencies, or repository context.
Installation
Claude Code / Cursor / Codex
Manual Installation
- Download SKILL.md from GitHub
- Place it at `.claude/skills/repo-metadata/SKILL.md` inside your project
- Restart your AI agent — it will auto-discover the skill
Frequently Asked Questions
What does this skill do?
It generates a structured `catalog-info.yaml` file describing a repository's role, dependencies, APIs, and events, so the repository can participate in a service catalog and cross-repository architecture analysis.
SKILL.md Source
# Repository Metadata
Generate structured `catalog-info.yaml` metadata for repositories using industry-standard conventions (based on Backstage catalog format). This metadata enables cross-repository architecture analysis and service catalog functionality.
## Purpose
Create and maintain `catalog-info.yaml` files that describe a repository's role in the broader architecture. This metadata feeds into architectural views, dependency graphs, and service groupings across the entire organization.
## When to Use
Trigger this skill when:
- User asks to "generate repo metadata" or "create catalog-info.yaml"
- User wants to document a repository for a service catalog
- User needs to prepare a repository for cross-repo architecture analysis
- User mentions "service catalog" or "component metadata"
## Metadata Schema
The `catalog-info.yaml` file follows Backstage conventions with Astrabit-specific extensions:
```yaml
apiVersion: astrabit.io/v1
kind: Component
metadata:
  name: service-name          # Required: Unique identifier
  description: Brief description
  tags:
    - backend
    - user-management
spec:
  # Service Classification
  type: service               # Required: service, gateway, worker, library, frontend, database
  category: backend           # Broader category
  domain: trading             # Business domain
  owner: platform-team        # Team responsible

  # Dependencies (Upstream)
  dependsOn:
    - component: auth-service
      type: service
    - component: user-db
      type: database

  # APIs Provided
  providesApis:
    - name: User API
      type: REST
      definition: ./openapi.yaml

  # APIs Consumed
  consumesApis:
    - name: Auth API
      providedBy: auth-service

  # Events Produced
  eventProducers:
    - name: user-events
      type: kafka
      topic: user.created
      schema: avro

  # Events Consumed
  eventConsumers:
    - name: order-events
      type: kafka
      topic: order.placed
      group: user-service-group

  # HTTP Routes (for gateways/services)
  routes:
    - path: /api/users/*
      methods: [GET, POST, PUT, DELETE]
      handler: this
    - path: /api/auth/*
      methods: [POST]
      forwardsTo: auth-service

  # Infrastructure
  runtime: nodejs             # nodejs, python, go, java, etc.
  framework: nestjs           # nestjs, fastapi, spring, etc.
```
## Generation Workflow
### Phase 1: Analyze Repository
Gather information about the repository:
1. **Use existing analysis scripts:**
```bash
python skills/repo-docs/scripts/analyze-repo-structure.py /path/to/repo
python skills/repo-docs/scripts/find-integration-points.py /path/to/repo
```
2. **Read existing documentation:**
- Check for `INTEGRATIONS.md` - contains upstream/downstream relationships
- Check for `ARCHITECTURE.md` - contains service role and dependencies
- Check for `README.md` - contains basic description and tech stack
3. **Detect from code:**
- Language from file extensions and package files
- Framework from dependencies
- Integration points from import patterns
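The code-detection step above can be sketched as a small helper. The file names and dependency keys below are illustrative assumptions, not the skill's actual implementation:

```python
import json
from pathlib import Path

# Hypothetical dependency-to-framework hints; the real detection rules
# live in references/detection-patterns.md and are not reproduced here.
FRAMEWORK_HINTS = {
    "@nestjs/core": "nestjs",
    "fastapi": "fastapi",
    "spring-boot-starter": "spring",
}

def detect_runtime(repo: Path) -> tuple[str, str]:
    """Infer (runtime, framework) from common package files."""
    if (repo / "package.json").exists():
        pkg = json.loads((repo / "package.json").read_text())
        deps = pkg.get("dependencies", {})
        framework = next((f for k, f in FRAMEWORK_HINTS.items() if k in deps), "unknown")
        return "nodejs", framework
    if (repo / "pyproject.toml").exists():
        text = (repo / "pyproject.toml").read_text()
        return "python", ("fastapi" if "fastapi" in text else "unknown")
    if (repo / "go.mod").exists():
        return "go", "unknown"
    return "unknown", "unknown"
```

In practice this would feed the `runtime` and `framework` fields of the generated metadata.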
### Phase 2: Generate Metadata
Based on analysis, generate `catalog-info.yaml` with detected values:
| Field | Detection Method |
|-------|------------------|
| `name` | Repo name or `package.json` `name` field |
| `description` | README title/description or generated from code |
| `type` | Inferred from code patterns (gateway has routes, worker has consumers only) |
| `runtime` | From package files (`package.json`, `pyproject.toml`, `go.mod`) |
| `framework` | From dependencies (`nestjs`, `fastapi`, `spring-boot`, etc.) |
| `dependsOn` | From integration point scanning |
| `eventProducers` | From `kafka.producer` or similar patterns |
| `eventConsumers` | From `@KafkaListener`, `@EventListener`, or similar patterns |
| `routes` | From `@Controller`, `@GetMapping`, router definitions |
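The pattern-based rows of the table can be approximated with a simple regex scan. The regexes below are illustrative; the authoritative patterns live in `references/detection-patterns.md`:

```python
import re

# Illustrative event-detection patterns; real projects will need more.
CONSUMER_PATTERNS = [r"@KafkaListener", r"@EventListener", r"kafka\.consumer"]
PRODUCER_PATTERNS = [r"kafka\.producer", r"producer\.send", r"\.emit\("]

def scan_events(source: str) -> dict[str, list[str]]:
    """Return which producer/consumer patterns appear in a source file."""
    return {
        "producers": [p for p in PRODUCER_PATTERNS if re.search(p, source)],
        "consumers": [p for p in CONSUMER_PATTERNS if re.search(p, source)],
    }
```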
### Phase 3: Present and Refine
Present the generated metadata to the user in a table format:
```markdown
Generated catalog-info.yaml:
| Field | Value | Source |
|-------|-------|--------|
| name | user-service | repo name |
| type | service | detected: has routes and consumers |
| runtime | nodejs | package.json |
| framework | nestjs | dependencies |
| domain | unknown | ❌ needs input |
| owner | unknown | ❌ needs input |
| dependsOn | auth-service, user-db | integration scan |
```
Prompt user to review and fill in missing fields (marked with ❌).
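The review table can be rendered from the detected fields with a helper along these lines (a sketch; the field layout and ❌ marker follow the example above):

```python
def render_review(fields: dict[str, tuple[str, str]]) -> str:
    """Render detected fields as a markdown review table.

    `fields` maps field name -> (value, source); values of "unknown"
    are flagged as needing user input.
    """
    lines = ["| Field | Value | Source |", "|-------|-------|--------|"]
    for name, (value, source) in fields.items():
        src = "❌ needs input" if value == "unknown" else source
        lines.append(f"| {name} | {value} | {src} |")
    return "\n".join(lines)
```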
### Phase 4: Write Metadata File
Write `catalog-info.yaml` to the repository root.
### Phase 5: Update Related Documentation
Offer to update related docs to reference the new metadata file:
- Add link to `catalog-info.yaml` in `README.md`
- Update `INTEGRATIONS.md` to be consistent with metadata
## Service Type Detection
| Type | Indicators |
|------|------------|
| **gateway** | Has `routes` with `forwardsTo`, handles external requests, minimal business logic |
| **service** | Has both `providesApis` and `consumesApis`, business logic |
| **worker** | Only `eventConsumers`, no HTTP routes, background processing |
| **library** | No APIs consumed, only provides, shared utilities |
| **frontend** | `type: frontend` in package.json, has build artifacts |
| **database** | Contains migrations, schemas, no application code |
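The indicator table above can be read as a decision function. This is a hedged sketch, not the skill's script; frontend and database detection are omitted because they rely on package metadata and repository contents rather than API signals:

```python
def classify(has_routes: bool, forwards: bool, provides_apis: bool,
             consumes_apis: bool, has_consumers: bool) -> str:
    """Map detected signals to a service type, following the table above."""
    if has_routes and forwards:
        return "gateway"          # routes with forwardsTo, minimal logic
    if has_consumers and not has_routes:
        return "worker"           # only event consumers, no HTTP
    if provides_apis and not consumes_apis and not has_routes:
        return "library"          # provides only, shared utilities
    if provides_apis or has_routes:
        return "service"          # business logic behind APIs/routes
    return "unknown"
```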
## Script Usage
Use `scripts/generate-metadata.py` to automate metadata generation:
```bash
# Generate from current directory
python skills/repo-metadata/scripts/generate-metadata.py
# Generate from specific repo
python skills/repo-metadata/scripts/generate-metadata.py /path/to/repo
# Output as JSON for inspection
python skills/repo-metadata/scripts/generate-metadata.py --format json
```
The script:
1. Runs repo structure analysis
2. Scans for integration points
3. Reads existing docs
4. Outputs `catalog-info.yaml` content
## Additional Resources
### Reference Files
- **`references/schema.md`** - Complete catalog-info.yaml schema reference
- **`references/detection-patterns.md`** - Patterns for detecting service characteristics
### Example Templates
- **`examples/catalog-info-template.yaml`** - Full template with all fields
- **`examples/catalog-info-gateway.yaml`** - Example gateway service
- **`examples/catalog-info-worker.yaml`** - Example worker service
- **`examples/catalog-info-library.yaml`** - Example shared library
## Quality Checklist
Before finalizing metadata, verify:
- [ ] `name` is unique across the organization
- [ ] `type` correctly classifies the service
- [ ] `domain` and `owner` are filled (not auto-detected)
- [ ] `dependsOn` lists all upstream dependencies
- [ ] `eventProducers` and `eventConsumers` are complete
- [ ] Routes are documented if this is a gateway/service
- [ ] File is valid YAML
- [ ] File is at repository root (`catalog-info.yaml`)
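As a final guard, the required-field items of the checklist can be sketched as a small validation helper (hypothetical; YAML parsing itself is out of scope here):

```python
# Required fields per the checklist; "unknown" counts as missing.
REQUIRED = ["name", "type", "domain", "owner"]

def missing_fields(metadata: dict) -> list[str]:
    """Return checklist fields that are absent or still set to "unknown"."""
    meta = metadata.get("metadata", {})
    spec = metadata.get("spec", {})
    merged = {"name": meta.get("name"),
              **{k: spec.get(k) for k in ("type", "domain", "owner")}}
    return [k for k in REQUIRED if not merged.get(k) or merged[k] == "unknown"]
```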