Give your AI agents access to real-time Twitter data through the Model Context Protocol. Works with Claude, custom agents, and any MCP-compatible client.
1. Install the MCP server: npx @scrapebadger/mcp-server or Docker.
2. Configure your API key in the MCP server settings.
3. Connect your AI agent to the MCP server endpoint.
4. The agent can now call Twitter endpoints as MCP tools.

All 34+ Twitter endpoints are available as structured tools. See the /mcp page for detailed setup guides per client.
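For the Docker route, the same style of client config can launch the server as a container instead of via npx. This is a sketch only: the image name, flags, and environment wiring below are assumptions, not a documented invocation.

```json
{
  "mcpServers": {
    "scrapebadger": {
      "command": "docker",
      "args": ["run", "-i", "--rm",
               "-e", "SCRAPEBADGER_API_KEY",
               "scrapebadger/mcp-server"],
      "env": {
        "SCRAPEBADGER_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}
```

The -i flag keeps stdin open so the client can speak MCP over stdio, and --rm discards the container when the session ends.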
// Claude Desktop MCP configuration (~/.claude/claude_desktop_config.json)
{
  "mcpServers": {
    "scrapebadger": {
      "command": "npx",
      "args": ["-y", "@scrapebadger/mcp-server"],
      "env": {
        "SCRAPEBADGER_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}
// Now Claude can use tools like:
// - get_twitter_user_profile("elonmusk")
// - search_twitter_tweets("AI news")
// - get_twitter_followers("openai")

Model Context Protocol (MCP) is an open standard that lets AI models connect to external data sources and tools. ScrapeBadger provides a native MCP server for Twitter data.
Any MCP-compatible client: Claude Desktop, Claude Code, custom agents built with the Agent SDK, and any tool that supports MCP.
No. The MCP server exposes Twitter endpoints as structured tools with descriptions. The AI agent discovers and uses them automatically.
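Concretely, each endpoint is advertised to the client as an MCP tool definition: a name, a human-readable description, and a JSON Schema for its inputs. A sketch of what one such definition might look like — the field values here are assumptions, not the server's actual output:

```json
{
  "name": "get_twitter_user_profile",
  "description": "Fetch a Twitter user's public profile by username.",
  "inputSchema": {
    "type": "object",
    "properties": {
      "username": { "type": "string", "description": "Handle without the @" }
    },
    "required": ["username"]
  }
}
```

The agent reads these descriptions at connect time, which is why no prompt-side documentation of the endpoints is needed.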
The MCP server wraps the REST API in MCP tool format. For direct integration in your code, use the Python or Node.js SDK. For AI agent integration, use MCP.
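To illustrate the wrapping, a direct REST call can be sketched with the Python standard library. The base URL, path, and auth header name below are assumptions for illustration, not documented values — check the API reference for the real ones.

```python
import urllib.request

# Assumed values for illustration only.
BASE_URL = "https://api.scrapebadger.com/v1"

def profile_request(username: str, api_key: str) -> urllib.request.Request:
    """Build (but do not send) a request for a hypothetical profile endpoint.

    An MCP tool like get_twitter_user_profile would issue an equivalent
    call on the agent's behalf; urllib.request.urlopen(req) would send it.
    """
    req = urllib.request.Request(f"{BASE_URL}/twitter/user/{username}")
    req.add_header("X-API-Key", api_key)  # assumed header name
    return req

req = profile_request("elonmusk", "YOUR_API_KEY")
print(req.full_url)  # → https://api.scrapebadger.com/v1/twitter/user/elonmusk
```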
The official Python client for ScrapeBadger. Fully type-hinted, async-ready, and designed for data pipelines. Install with pip install scrapebadger.
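A usage sketch under stated assumptions — the client class name, attribute layout, and method names below are guesses at what an async client might expose, not the SDK's documented API:

```python
import asyncio

# Hypothetical import: the actual package layout and class name may differ.
try:
    from scrapebadger import ScrapeBadger  # assumed client class
except ImportError:
    ScrapeBadger = None  # SDK not installed in this environment

async def fetch_profile(username: str, api_key: str):
    """Fetch a profile via the (assumed) async client interface."""
    client = ScrapeBadger(api_key=api_key)
    return await client.twitter.user_profile(username)

if ScrapeBadger is not None:
    profile = asyncio.run(fetch_profile("elonmusk", "YOUR_API_KEY"))
```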
The official Node.js/TypeScript client for ScrapeBadger, with full TypeScript definitions, a Promise-based API, and tree-shakeable modules. Install with npm install scrapebadger.
Build automated Twitter data pipelines with n8n's visual workflow builder. Extract tweets, monitor brands, and trigger actions — all without writing code.