junta-leiloeiros
Collects and queries data on official auctioneers from all 27 Brazilian Commercial Boards (Juntas Comerciais). Multi-state scraper, SQLite database, FastAPI API, and CSV/JSON export.
About this skill
This skill, 'junta-leiloeiros', gives AI agents access to a dataset of official auctioneers from all 27 Brazilian Juntas Comerciais. A multi-state scraper gathers the data, which is stored in an SQLite database; agents can query it dynamically through a FastAPI API or request exports in CSV and JSON formats. It is useful for tasks that need up-to-date, verified information on Brazilian official auctioneers, streamlining data collection and analysis.
Best use case
Accessing and analyzing official Brazilian auctioneer data for market research, compliance checks, lead generation, or legal inquiries related to auctions in Brazil. It can also be used to integrate specific Brazilian business registry data into larger analytical systems or reports.
The AI agent will provide structured data about official Brazilian auctioneers, including their names, registration details, and associated Commercial Board. Output can be in raw data format (e.g., JSON), a summary, or a link to a generated CSV/JSON export, depending on the specific query.
Practical example
Example input
Give me a list of official auctioneers registered with the São Paulo Commercial Board, including their contact and registration details.
Example output
```json
[
  {
    "name": "João Silva",
    "registration_number": "JL-SP-12345",
    "junta_comercial": "JUCESP",
    "state": "SP",
    "contact": {
      "phone": "(11) 98765-4321",
      "email": "joao.silva@leiloeiro.com.br"
    }
  },
  {
    "name": "Maria Santos",
    "registration_number": "JL-SP-67890",
    "junta_comercial": "JUCESP",
    "state": "SP",
    "contact": {
      "phone": "(11) 91234-5678",
      "email": "maria.santos@leiloeiro.com.br"
    }
  }
]
```
Here are some official auctioneers registered with the São Paulo Commercial Board. Let me know if you need more details or a complete list in CSV.
When to use this skill
- Use this skill when the user explicitly requests information about 'leiloeiro junta' (auctioneer board), 'junta comercial leiloeiro' (commercial board auctioneer), or any related terms pertaining to official auctioneers in Brazil. It is ideal for queries about specific auctioneers, lists of auctioneers by state, or general statistics concerning official auctioneers across Brazil's commercial boards.
When not to use this skill
- Do not use this skill for information on auctioneers outside of Brazil, or for real-time auction bidding/management. It is also not suitable for collecting data on informal or non-official auctioneers, or for data unrelated to the Juntas Comerciais. If the user requires general business registration data that is not specific to auctioneers, other skills might be more appropriate.
Installation
Claude Code / Cursor / Codex
Manual Installation
- Download SKILL.md from GitHub
- Place it in `.claude/skills/junta-leiloeiros/SKILL.md` inside your project
- Restart your AI agent — it will auto-discover the skill
How junta-leiloeiros Compares
| Feature / Agent | junta-leiloeiros | Standard Approach |
|---|---|---|
| Platform Support | Claude, Cursor, Gemini, Codex | Limited / Varies |
| Context Awareness | High | Baseline |
| Installation Complexity | Medium | N/A |
Frequently Asked Questions
What does this skill do?
Collects and queries data on official auctioneers from all 27 Brazilian Commercial Boards (Juntas Comerciais). Multi-state scraper, SQLite database, FastAPI API, and CSV/JSON export.
Which AI agents support this skill?
This skill is designed for Claude, Cursor, Gemini, Codex.
How difficult is it to install?
The installation complexity is rated as medium. You can find the installation instructions above.
Where can I find the source code?
You can find the source code on GitHub using the link provided at the top of the page.
SKILL.md Source
# Skill: Official Auctioneers of Brazil's Juntas Comerciais
## Overview
Collects and queries data on official auctioneers from all 27 Brazilian Commercial Boards (Juntas Comerciais). Multi-state scraper, SQLite database, FastAPI API, and CSV/JSON export.
## When to Use This Skill
- When the user mentions "leiloeiro junta" or related topics
- When the user mentions "junta comercial leiloeiro" or related topics
- When the user mentions "scraper junta" or related topics
- When the user mentions "jucesp leiloeiro" or related topics
- When the user mentions "jucerja" or related topics
- When the user mentions "jucemg leiloeiro" or related topics
## Do Not Use This Skill When
- The task is unrelated to junta leiloeiros
- A simpler, more specific tool can handle the request
- The user needs general-purpose assistance without domain expertise
## How It Works
Collects public data on official auctioneers from all 27 state Commercial Boards,
persists it in a local SQLite database, and offers a REST API plus export in multiple formats.
## Location
```
C:\Users\renat\skills\junta-leiloeiros\
├── scripts/
│   ├── scraper/
│   │   ├── base_scraper.py      ← abstract base class
│   │   ├── states.py            ← registry of the 27 scrapers
│   │   ├── jucesp.py / jucerja.py / jucemg.py / jucec.py / jucis_df.py
│   │   └── generic_scraper.py   ← used by the remaining 22 states
│   ├── db.py                    ← SQLite database layer
│   ├── run_all.py               ← scraping orchestrator
│   ├── serve_api.py             ← FastAPI API
│   ├── export.py                ← export tool
│   └── requirements.txt
├── references/
│   ├── juntas_urls.md           ← URLs and status of all 27 boards
│   ├── schema.md                ← database schema
│   └── legal.md                 ← legal basis
└── data/
    ├── leiloeiros.db            ← SQLite database (created on first run)
    ├── scraping_log.json        ← log of each collection run
    └── exports/                 ← exported files
```
## Installation (One-Time)
```bash
pip install -r C:\Users\renat\skills\junta-leiloeiros\scripts\requirements.txt
# For sites that require JavaScript:
playwright install chromium
```
## Collect Data
```bash
# All 27 states
python C:\Users\renat\skills\junta-leiloeiros\scripts\run_all.py
# Specific states
python C:\Users\renat\skills\junta-leiloeiros\scripts\run_all.py --estado SP RJ MG
# Preview what would be collected, without running
python C:\Users\renat\skills\junta-leiloeiros\scripts\run_all.py --dry-run
# Control parallelism (default: 5)
python C:\Users\renat\skills\junta-leiloeiros\scripts\run_all.py --concurrency 3
```
## Per-State Statistics
```bash
python C:\Users\renat\skills\junta-leiloeiros\scripts\db.py
# Direct SQL
sqlite3 C:\Users\renat\skills\junta-leiloeiros\data\leiloeiros.db \
  "SELECT estado, COUNT(*) FROM leiloeiros GROUP BY estado"
```
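The same per-state aggregation can be reproduced from Python with the standard-library `sqlite3` module. This is a minimal sketch: it runs against an illustrative in-memory table rather than `data/leiloeiros.db`, and the sample rows are invented; only the table and column names (`leiloeiros`, `estado`) come from the SQL above.

```python
import sqlite3

# Illustrative in-memory copy; point connect() at data\leiloeiros.db for real data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE leiloeiros (nome TEXT, estado TEXT)")
conn.executemany(
    "INSERT INTO leiloeiros VALUES (?, ?)",
    [("João Silva", "SP"), ("Maria Santos", "SP"), ("Carlos Souza", "RJ")],
)

# Same aggregation as the sqlite3 CLI call above
counts = dict(
    conn.execute(
        "SELECT estado, COUNT(*) FROM leiloeiros GROUP BY estado ORDER BY estado"
    )
)
print(counts)  # {'RJ': 1, 'SP': 2}
```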
## Serve the REST API
```bash
python C:\Users\renat\skills\junta-leiloeiros\scripts\serve_api.py
# Interactive docs: http://localhost:8000/docs
```
**Endpoints:**
- `GET /leiloeiros?estado=SP&situacao=ATIVO&nome=silva&limit=100`
- `GET /leiloeiros/{estado}` — ex: `/leiloeiros/SP`
- `GET /busca?q=texto`
- `GET /stats`
- `GET /export/json`
- `GET /export/csv`
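From Python, the first endpoint can be called with only the standard library. This is a sketch under the assumption that `serve_api.py` is listening on `localhost:8000` (the docs URL above); `build_url` and `fetch_leiloeiros` are helper names invented here, not part of the skill.

```python
import json
import urllib.parse
import urllib.request

BASE = "http://localhost:8000"

def build_url(estado=None, situacao=None, nome=None, limit=None):
    """Assemble the query string for GET /leiloeiros, skipping unset filters."""
    params = {"estado": estado, "situacao": situacao, "nome": nome, "limit": limit}
    query = urllib.parse.urlencode({k: v for k, v in params.items() if v is not None})
    return f"{BASE}/leiloeiros?{query}" if query else f"{BASE}/leiloeiros"

def fetch_leiloeiros(**filters):
    """Issue the request and return the parsed JSON payload."""
    with urllib.request.urlopen(build_url(**filters)) as resp:
        return json.load(resp)

# Example: fetch_leiloeiros(estado="SP", situacao="ATIVO", nome="silva", limit=100)
```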
## Export Data
```bash
python C:\Users\renat\skills\junta-leiloeiros\scripts\export.py --format csv
python C:\Users\renat\skills\junta-leiloeiros\scripts\export.py --format json
python C:\Users\renat\skills\junta-leiloeiros\scripts\export.py --format all
python C:\Users\renat\skills\junta-leiloeiros\scripts\export.py --format csv --estado SP
```
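Exported CSV files can then be consumed downstream with `csv.DictReader`. The sketch below uses an invented two-row excerpt; the real column headers are defined by the exporter and `references/schema.md`, so treat `nome`, `estado`, and `situacao` here as assumptions.

```python
import csv
import io

# Hypothetical excerpt of an exported CSV (real headers may differ).
sample = io.StringIO(
    "nome,estado,situacao\n"
    "João Silva,SP,ATIVO\n"
    "Maria Santos,SP,ATIVO\n"
    "Carlos Souza,RJ,INATIVO\n"
)

# Filter for active São Paulo auctioneers.
ativos_sp = [
    row["nome"]
    for row in csv.DictReader(sample)
    if row["estado"] == "SP" and row["situacao"] == "ATIVO"
]
print(ativos_sp)  # ['João Silva', 'Maria Santos']
```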
## Use from Python Code
```python
import sys
sys.path.insert(0, r"C:\Users\renat\skills\junta-leiloeiros\scripts")
from db import Database

db = Database()
db.init()

# All active auctioneers in SP
leiloeiros = db.get_all(estado="SP", situacao="ATIVO")
# Search by name
resultados = db.search("silva")
# Statistics
stats = db.get_stats()
```
## Add a Custom Scraper
If a state needs specific logic (e.g. the site relies on JavaScript):
```python
# scripts/scraper/meu_estado.py
from typing import List

from .base_scraper import AbstractJuntaScraper, Leiloeiro

class MeuEstadoScraper(AbstractJuntaScraper):
    estado = "XX"
    junta = "JUCEX"
    url = "https://www.jucex.xx.gov.br/leiloeiros"

    async def parse_leiloeiros(self) -> List[Leiloeiro]:
        soup = await self.fetch_page()
        if not soup:
            return []
        # state-specific parsing logic goes here
        return [self.make_leiloeiro(nome="...", matricula="...")]
```
Register it in `scripts/scraper/states.py`:
```python
from .meu_estado import MeuEstadoScraper
SCRAPERS["XX"] = MeuEstadoScraper
```
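The registration step above follows a simple registry-dispatch pattern. Here is a self-contained sketch of how an orchestrator such as `run_all.py` might select scrapers by state code; the stand-in classes below are illustrative only and do not reproduce the real `AbstractJuntaScraper` interface.

```python
# Stand-in base class; the real one lives in scripts/scraper/base_scraper.py.
class AbstractJuntaScraper:
    estado = None

    def run(self):
        raise NotImplementedError

class MeuEstadoScraper(AbstractJuntaScraper):
    estado = "XX"

    def run(self):
        return f"scraped {self.estado}"

# Registry mapping state codes to scraper classes, as in states.py.
SCRAPERS = {"XX": MeuEstadoScraper}

def run_states(codes):
    """Instantiate and run the scraper registered for each known state code."""
    return [SCRAPERS[code]().run() for code in codes if code in SCRAPERS]

print(run_states(["XX", "YY"]))  # ['scraped XX'] ('YY' has no registered scraper)
```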
## References
- URLs for all boards: `references/juntas_urls.md`
- Database schema: `references/schema.md`
- Legal basis for the collection: `references/legal.md`
- Collection log: `data/scraping_log.json`
## Best Practices
- Provide clear, specific context about your project and requirements
- Review all suggestions before applying them to production code
- Combine with other complementary skills for comprehensive analysis
## Common Pitfalls
- Using this skill for tasks outside its domain expertise
- Applying recommendations without understanding your specific context
- Not providing enough project context for accurate analysis
## Related Skills
- `leiloeiro-avaliacao` - Complementary skill for enhanced analysis
- `leiloeiro-edital` - Complementary skill for enhanced analysis
- `leiloeiro-ia` - Complementary skill for enhanced analysis
- `leiloeiro-juridico` - Complementary skill for enhanced analysis
- `leiloeiro-mercado` - Complementary skill for enhanced analysis