Official Client Libraries

ScrapeBadger SDKs

Integrate ScrapeBadger into your applications with our official SDKs. Full type safety, async support, and automatic pagination built-in.

  • Async-First: Built for high-performance concurrent requests
  • Type Safe: Full type definitions for IDE autocompletion
  • Auto Pagination: Iterate through large datasets seamlessly
  • Real-Time Streaming: WebSocket-based live tweet delivery

🐍 Python SDK

Python 3.10+

v0.4.0
Install
pip install scrapebadger

or: uv add scrapebadger

Features

  • Async-first design with httpx & asyncio
  • Full type hints with Pydantic models
  • Automatic pagination with async iterators
  • Smart rate limit handling & exponential backoff retries
  • 50+ Twitter endpoints across 7 categories
  • Real-time WebSocket streaming with auto-reconnect
  • Stream monitors & filter rules management
  • Web scraping API (scrape, screenshot, extract, batch)
  • Webhook signature verification
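Webhook signature verification normally follows the standard HMAC pattern: recompute an HMAC-SHA256 over the raw request body with your webhook secret and compare it to the signature header in constant time. A minimal standalone sketch of that pattern (the secret format and signing scheme here are illustrative assumptions, not the SDK's actual API):

```python
import hashlib
import hmac

def verify_webhook(secret: str, payload: bytes, signature: str) -> bool:
    """Recompute the HMAC-SHA256 hex digest of the raw payload and
    compare it to the received signature in constant time."""
    expected = hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

# Example: a correctly signed payload verifies; a tampered one does not.
secret = "whsec_example"
payload = b'{"event": "tweet.created"}'
good = hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()
assert verify_webhook(secret, payload, good)
assert not verify_webhook(secret, b'{"event": "forged"}', good)
```

Always hash the raw bytes of the request body, not a re-serialized copy, since JSON re-encoding can change whitespace and key order and break the comparison.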
Quick Start
import asyncio
from scrapebadger import ScrapeBadger

async def main():
    async with ScrapeBadger(api_key="your-api-key") as client:
        # Get a user profile
        user = await client.twitter.users.get_by_username("elonmusk")
        print(f"{user.name} has {user.followers_count:,} followers")

        # Search tweets with automatic pagination
        async for tweet in client.twitter.tweets.search_all(
            "python programming",
            max_items=100
        ):
            print(f"@{tweet.username}: {tweet.text[:80]}...")

        # Real-time streaming via WebSocket
        async with client.twitter.stream.connect() as events:
            async for event in events:
                if event.type == "tweet":
                    print(f"New tweet: {event.tweet.text}")

asyncio.run(main())
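Helpers like `search_all` above typically wrap cursor-based pages in an async generator, so the caller iterates items while page fetches happen on demand. A self-contained sketch of that pattern (the `fetch_page` function, the page shape, and the cursor field are stand-ins, not ScrapeBadger internals):

```python
import asyncio
from typing import AsyncIterator, Optional

# Fake paged API: three pages of results linked by a cursor.
PAGES = {
    None: {"items": [1, 2, 3], "next_cursor": "a"},
    "a": {"items": [4, 5], "next_cursor": "b"},
    "b": {"items": [6], "next_cursor": None},
}

async def fetch_page(cursor: Optional[str]) -> dict:
    await asyncio.sleep(0)  # stand-in for a real HTTP request
    return PAGES[cursor]

async def search_all(max_items: Optional[int] = None) -> AsyncIterator[int]:
    """Yield items one at a time, fetching the next page only when needed."""
    cursor, yielded = None, 0
    while True:
        page = await fetch_page(cursor)
        for item in page["items"]:
            if max_items is not None and yielded >= max_items:
                return
            yield item
            yielded += 1
        cursor = page["next_cursor"]
        if cursor is None:
            return

async def main() -> None:
    # Stops after four items, mid-way through the second page.
    results = [item async for item in search_all(max_items=4)]
    print(results)  # [1, 2, 3, 4]

asyncio.run(main())
```

The generator never fetches a page it does not need: stopping early with `max_items` (or `break`) means later pages are never requested.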
🟢 Node.js SDK

Node.js 18+

v0.4.0
Install
npm install scrapebadger

or: pnpm add scrapebadger

Features

  • Full TypeScript support with type definitions
  • Dual ESM & CommonJS with tree-shaking
  • Async iterators for automatic pagination
  • Smart rate limit handling & exponential backoff retries
  • 50+ Twitter endpoints across 7 categories
  • Real-time WebSocket streaming (EventEmitter & AsyncIterator)
  • Stream monitors & filter rules management
  • Web scraping API (scrape, screenshot, extract, batch)
  • Webhook signature verification
Quick Start
import { ScrapeBadger } from "scrapebadger";

const client = new ScrapeBadger({ apiKey: "your-api-key" });

// Get a user profile
const user = await client.twitter.users.getByUsername("elonmusk");
console.log(`${user.name} has ${user.followersCount.toLocaleString()} followers`);

// Search tweets with automatic pagination
for await (const tweet of client.twitter.tweets.searchAll("python", {
  maxItems: 100,
})) {
  console.log(`@${tweet.username}: ${tweet.text.slice(0, 80)}...`);
}

// Real-time streaming via WebSocket
for await (const event of client.twitter.stream.connectIter()) {
  if (event.type === "tweet") {
    console.log(`New tweet: ${event.tweet.text}`);
  }
}
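Both SDKs list smart rate limit handling with exponential backoff retries. The core retry loop behind that feature usually looks like the following Python sketch: double the delay after each failure, cap it, and add jitter so many clients don't retry in lockstep (the attempt count, base delay, and cap here are illustrative defaults, not the SDKs' actual settings):

```python
import random
import time

def with_retries(call, max_attempts: int = 5, base: float = 0.5, cap: float = 8.0):
    """Call `call()`, retrying on exceptions with capped exponential
    backoff plus jitter; re-raise after the final attempt."""
    for attempt in range(max_attempts):
        try:
            return call()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            delay = min(cap, base * (2 ** attempt))  # 0.5s, 1s, 2s, 4s, ...
            time.sleep(delay * random.uniform(0.5, 1.0))  # jitter

# Example: a flaky call that fails twice, then succeeds on the third try.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("rate limited")
    return "ok"

print(with_retries(flaky, base=0.01))  # prints "ok"
```

Real clients also honor server hints such as a `Retry-After` header when one is present, using it instead of the computed delay.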

More SDKs Coming Soon

We're working on official SDKs for more languages. Have a request? Let us know on GitHub or Discord.

Go, Ruby, Java, PHP, C#

Ready to Get Started?

Sign up for free and get 1,000 credits to start building.