Build automated Twitter data pipelines with n8n's visual workflow builder. Extract tweets, monitor brands, and trigger actions — all without writing code.
1. Install the ScrapeBadger n8n node from the n8n community nodes marketplace.
2. Add a ScrapeBadger credential in n8n with your API key from scrapebadger.com/dashboard/api-keys.
3. Drag the ScrapeBadger node into your workflow and select the Twitter endpoint you need.
4. Configure parameters (username, search query, etc.) using n8n expressions or static values.
5. Connect to downstream nodes: Google Sheets, Slack, email, a database, or any of n8n's 500+ integrations.
6. Activate the workflow to run on a schedule or trigger.
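The expression support mentioned above lets a parameter pull its value from a previous node instead of being hard-coded. A minimal sketch, assuming a hypothetical upstream node that outputs a `keyword` field (the leading `=` marks the field as an n8n expression):

```json
{
  "parameters": {
    "resource": "twitter",
    "operation": "searchTweets",
    "query": "={{ $json.keyword }}"
  }
}
```

With a static value, the same field would simply read "query": "web scraping API".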
// Example n8n workflow JSON (import into n8n)
{
  "nodes": [
    {
      "name": "ScrapeBadger",
      "type": "n8n-nodes-scrapebadger",
      "parameters": {
        "resource": "twitter",
        "operation": "searchTweets",
        "query": "web scraping API",
        "queryType": "Latest"
      }
    },
    {
      "name": "Google Sheets",
      "type": "n8n-nodes-base.googleSheets",
      "parameters": {
        "operation": "appendOrUpdate",
        "sheetName": "Twitter Data"
      }
    }
  ]
}

Do I need to write code to use ScrapeBadger with n8n?
No. n8n is a visual workflow builder. You configure ScrapeBadger endpoints through a graphical interface and connect them to other tools.

Is there an official ScrapeBadger node for n8n?
Yes. ScrapeBadger maintains an official n8n community node that's regularly updated with new endpoints.

Can I run workflows on a schedule?
Yes. n8n supports cron-based scheduling. Run Twitter data collection hourly, daily, or at any custom interval.

Do workflows run through n8n consume API credits?
Yes. Each API call made through n8n uses credits, just like direct API calls. Monitor your usage in the ScrapeBadger dashboard.
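The cron-based scheduling mentioned above is configured on n8n's Schedule Trigger node, which you connect ahead of the ScrapeBadger node. A sketch of a trigger that fires at the top of every hour; the exact parameter schema can vary between n8n versions, so treat the field names as illustrative:

```json
{
  "name": "Every Hour",
  "type": "n8n-nodes-base.scheduleTrigger",
  "parameters": {
    "rule": {
      "interval": [
        { "field": "cronExpression", "expression": "0 * * * *" }
      ]
    }
  }
}
```

Swapping the expression for "0 9 * * 1" would instead run the collection every Monday at 09:00.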
Connect Twitter data to your entire tool stack. Use Zapier's webhook triggers with ScrapeBadger's webhook delivery for seamless automation.
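As a sketch of how webhook delivery fits together: the receiving trigger (in Zapier or n8n) gets a JSON payload for each completed collection. The field names below are illustrative assumptions, not ScrapeBadger's documented payload schema:

```json
{
  "event": "tweets.collected",
  "query": "web scraping API",
  "tweets": [
    {
      "id": "1234567890",
      "text": "Example tweet text",
      "author": "example_user"
    }
  ]
}
```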
The official Python client for ScrapeBadger. Fully type-hinted, async-ready, and designed for data pipelines. Install with pip install scrapebadger.
The official Node.js/TypeScript client for ScrapeBadger. Full TypeScript definitions, Promise-based API, and tree-shakeable. Install with npm install scrapebadger.