The official Node.js/TypeScript client for ScrapeBadger. Fully typed, Promise-based, and tree-shakeable. Install with npm install scrapebadger.
1. Install the SDK: npm install scrapebadger
2. Get your API key from scrapebadger.com/dashboard/api-keys
3. Initialize: new ScrapeBadger({ apiKey: "YOUR_KEY" })
4. Call any endpoint with full TypeScript autocomplete
5. Use async/await for clean, readable code
6. Check docs at docs.scrapebadger.com for the full reference
import ScrapeBadger from 'scrapebadger';
const sb = new ScrapeBadger({ apiKey: 'YOUR_API_KEY' });
// Get user profile
const user = await sb.twitter.users.getByUsername('elonmusk');
console.log(`${user.data.name}: ${user.data.followers_count} followers`);
// Search tweets
const results = await sb.twitter.tweets.advancedSearch({
  query: 'web scraping API lang:en',
  queryType: 'Latest',
});
// Get followers
const followers = await sb.twitter.users.getFollowers('elonmusk');
followers.data.forEach(f => console.log(f.username, f.followers_count));

Does the SDK include TypeScript types?
Yes. The SDK is written in TypeScript and ships with full type definitions. You get autocomplete for every endpoint and response field.
Can I use the SDK in the browser?
The SDK is designed for Node.js. For browser usage, call the REST API from your own backend instead. Never expose your API key in client-side code.
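One way to follow that advice is a thin server-side proxy: the browser calls your backend, and only the backend attaches the secret key. A minimal sketch below; note that the REST endpoint path and X-API-Key header are assumptions for illustration, not the documented ScrapeBadger API.

```typescript
// Sketch of a server-side proxy request builder. The upstream URL and
// header name are hypothetical; the point is that the key comes from the
// server's environment and never ships to the browser.
function buildProxyRequest(username: string, apiKey: string): Request {
  const url = new URL(
    `/twitter/users/${encodeURIComponent(username)}`,
    'https://api.scrapebadger.com' // assumed base URL
  );
  return new Request(url, {
    headers: { 'X-API-Key': apiKey }, // assumed auth header
  });
}

// Example wiring inside e.g. an Express or Next.js route handler:
// const req = buildProxyRequest('elonmusk', process.env.SCRAPEBADGER_API_KEY!);
// const upstream = await fetch(req);
// res.json(await upstream.json());
```

The browser then only ever talks to your own route, so the key stays server-side.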
Does it work with both ESM and CommonJS?
Yes. The SDK ships both ESM and CommonJS builds and works with any bundler or runtime.
Does the SDK support real-time streaming?
Yes. The SDK includes a StreamClient class for connecting to real-time tweet streams via WebSocket.
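A typical streaming setup attaches a handler that filters incoming tweet events before acting on them. The sketch below shows that pattern; the StreamClient constructor options, event name, and connect() call are assumptions based on the description above, not the documented API. The filter itself is plain, testable code.

```typescript
// Minimal shape of an incoming tweet event (assumed for this sketch).
interface TweetEvent {
  text: string;
  lang: string;
}

// A small, pure filter you might apply to each streamed event:
// keep English tweets that mention a keyword, case-insensitively.
function matchesTopic(event: TweetEvent, keyword: string): boolean {
  return (
    event.lang === 'en' &&
    event.text.toLowerCase().includes(keyword.toLowerCase())
  );
}

// Assumed wiring (sketch only — check the StreamClient docs for the real API):
// const stream = new StreamClient({ apiKey: process.env.SCRAPEBADGER_API_KEY! });
// stream.on('tweet', (tweet: TweetEvent) => {
//   if (matchesTopic(tweet, 'web scraping')) console.log(tweet.text);
// });
// await stream.connect({ query: 'web scraping' });
```

Keeping the filtering logic separate from the stream wiring makes it easy to unit-test without opening a WebSocket connection.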
The official Python client for ScrapeBadger. Fully type-hinted, async-ready, and designed for data pipelines. Install with pip install scrapebadger.
Build automated Twitter data pipelines with n8n's visual workflow builder. Extract tweets, monitor brands, and trigger actions — all without writing code.
Give your AI agents access to real-time Twitter data through the Model Context Protocol. Works with Claude, custom agents, and any MCP-compatible client.