x-twitter-scraper
X (Twitter) data platform skill — tweet search, user lookup, follower extraction, engagement metrics, giveaway draws, monitoring, webhooks, 19 extraction tools, MCP server.
Best use case
x-twitter-scraper is best used when you need a repeatable AI agent workflow instead of a one-off prompt.
Teams using x-twitter-scraper should expect more consistent output, faster repeated execution, and less prompt rewriting.
When to use this skill
- You want a reusable workflow that can be run more than once with consistent structure.
When not to use this skill
- You only need a quick one-off answer and do not need a reusable workflow.
- You cannot install or maintain the underlying files, dependencies, or repository context.
Installation
Claude Code / Cursor / Codex
Manual Installation
- Download SKILL.md from GitHub
- Place it in `.claude/skills/x-twitter-scraper/SKILL.md` inside your project
- Restart your AI agent — it will auto-discover the skill
How x-twitter-scraper Compares
| Feature / Agent | x-twitter-scraper | Standard Approach |
|---|---|---|
| Platform Support | Claude Code, Cursor, Codex, Gemini CLI, Copilot | Limited / Varies |
| Context Awareness | High | Baseline |
| Installation Complexity | Low (one `npx skills add` command or a manual clone) | N/A |
Frequently Asked Questions
What does this skill do?
X (Twitter) data platform skill — tweet search, user lookup, follower extraction, engagement metrics, giveaway draws, monitoring, webhooks, 19 extraction tools, MCP server.
Where can I find the source code?
You can find the source code on GitHub using the link provided at the top of the page.
SKILL.md Source
# X (Twitter) Scraper — Xquik
## Overview
Gives your AI agent full access to X (Twitter) data through the Xquik platform. Covers tweet search, user profiles, follower extraction, engagement metrics, giveaway draws, account monitoring, webhooks, and 19 bulk extraction tools — all via REST API or MCP server.
## When to Use This Skill
- User needs to search X/Twitter for tweets by keyword, hashtag, or user
- User wants to look up a user profile (bio, follower counts, etc.)
- User needs engagement metrics for a specific tweet (likes, retweets, views)
- User wants to check if one account follows another
- User needs to extract followers, replies, retweets, quotes, or community members in bulk
- User wants to run a giveaway draw from tweet replies
- User needs real-time monitoring of an X account (new tweets, follower changes)
- User wants webhook delivery of monitored events
- User asks about trending topics on X
## Setup
### Install the Skill
```bash
npx skills add Xquik-dev/x-twitter-scraper
```
Or clone manually into your agent's skills directory:
```bash
# Claude Code
git clone https://github.com/Xquik-dev/x-twitter-scraper.git .claude/skills/x-twitter-scraper
# Cursor / Codex / Gemini CLI / Copilot
git clone https://github.com/Xquik-dev/x-twitter-scraper.git .agents/skills/x-twitter-scraper
```
### Get an API Key
1. Sign up at [xquik.com](https://xquik.com)
2. Generate an API key from the dashboard
3. Set it as an environment variable or pass it directly
```bash
export XQUIK_API_KEY="xq_YOUR_KEY_HERE"
```
## Capabilities
| Capability | Description |
|---|---|
| Tweet Search | Find tweets by keyword, hashtag, from:user, "exact phrase" |
| User Lookup | Profile info, bio, follower/following counts |
| Tweet Lookup | Full metrics — likes, retweets, replies, quotes, views, bookmarks |
| Follow Check | Check if A follows B (both directions) |
| Trending Topics | Top trends by region (free, no quota) |
| Account Monitoring | Track new tweets, replies, retweets, quotes, follower changes |
| Webhooks | HMAC-signed real-time event delivery to your endpoint |
| Giveaway Draws | Random winner selection from tweet replies with filters |
| 19 Extraction Tools | Followers, following, verified followers, mentions, posts, replies, reposts, quotes, threads, articles, communities, lists, Spaces, people search |
| MCP Server | StreamableHTTP endpoint for AI-native integrations |
## Examples
**Search tweets:**
```
"Search X for tweets about 'claude code' from the last week"
```
**Look up a user:**
```
"Who is @elonmusk? Show me their profile and follower count"
```
**Check engagement:**
```
"How many likes and retweets does this tweet have? https://x.com/..."
```
**Run a giveaway:**
```
"Pick 3 random winners from the replies to this tweet"
```
**Monitor an account:**
```
"Monitor @openai for new tweets and notify me via webhook"
```
**Bulk extraction:**
```
"Extract all followers of @anthropic"
```
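Each natural-language prompt above resolves to one of the REST endpoints in the API reference that follows. As a hedged sketch, the tweet-search example could be issued directly like this (the `q` query-parameter name is an assumption; this page documents the endpoint path but not its parameters):

```python
import os
import urllib.parse
import urllib.request

def search_url(query: str) -> str:
    """Build the tweet-search URL; the `q` parameter name is assumed."""
    qs = urllib.parse.urlencode({"q": query})
    return f"https://xquik.com/api/v1/x/tweets/search?{qs}"

if __name__ == "__main__":
    # Sends a real request; requires a valid key in XQUIK_API_KEY
    req = urllib.request.Request(
        search_url("claude code"),
        headers={"x-api-key": os.environ["XQUIK_API_KEY"]},
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.read().decode())
```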
## API Reference
| Endpoint | Method | Purpose |
|----------|--------|---------|
| `/x/tweets/{id}` | GET | Single tweet with full metrics |
| `/x/tweets/search` | GET | Search tweets |
| `/x/users/{username}` | GET | User profile |
| `/x/followers/check` | GET | Follow relationship |
| `/trends` | GET | Trending topics |
| `/monitors` | POST | Create monitor |
| `/events` | GET | Poll monitored events |
| `/webhooks` | POST | Register webhook |
| `/draws` | POST | Run giveaway draw |
| `/extractions` | POST | Start bulk extraction |
| `/extractions/estimate` | POST | Estimate extraction cost |
| `/account` | GET | Account & usage info |
**Base URL:** `https://xquik.com/api/v1`
**Auth:** `x-api-key: xq_...` header
**MCP:** `https://xquik.com/mcp` (StreamableHTTP, same API key)
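Webhook deliveries are HMAC-signed. The signature header name and digest algorithm are not specified on this page, so the sketch below assumes a hex-encoded HMAC-SHA256 over the raw request body; consult the Xquik docs for the actual scheme:

```python
import hashlib
import hmac

def verify_webhook(secret: str, raw_body: bytes, signature: str) -> bool:
    """Recompute the HMAC over the raw body and compare in constant time.

    Assumes hex-encoded HMAC-SHA256; the real header name and encoding
    may differ, so check the Xquik webhook documentation.
    """
    expected = hmac.new(secret.encode(), raw_body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)
```

Verify against the raw bytes of the request body, not a re-serialized copy, since any whitespace or key-order change breaks the digest.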
## Repository
https://github.com/Xquik-dev/x-twitter-scraper
**Maintained By:** [Xquik](https://xquik.com)