Community Node

Spider for n8n

The web data layer for your n8n workflows. Collect structured content, search the live web, and transform URLs into LLM-ready output — all from a single node.

INSTALL

One-command setup

Install from npm inside your n8n instance, then add your Spider API key as a credential.
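For self-hosted instances, the install is a single npm command followed by a restart. A minimal sketch — the package name `n8n-nodes-spider` is an assumption here; verify the exact name on npm before installing:

```shell
# Install the Spider community node into your n8n instance.
# NOTE: package name is an assumption — check npm for the published name.
npm install n8n-nodes-spider

# Restart n8n so it picks up the new node.
```

On recent n8n versions you can also install community nodes from the UI: Settings → Community Nodes → Install, then enter the package name.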

Add credentials

In n8n, go to Credentials → New → search for Spider API

// Credential fields
API Key: "your-spider-api-key"

// Get your key at
https://spider.cloud/credits/new
OPERATIONS

7 ways to get web data

Each operation gives your workflow a different way to access web content. Add the Spider node and pick what you need.

spider_crawl

Collect structured content across an entire site. Follow links, respect depth limits, return clean data.

spider_scrape

Extract content from a single URL. Get markdown, HTML, or plain text — ready for your next node.

spider_search

Query the web and get results with full page content. Feed real-time data into any workflow.

spider_links

Map site structure by extracting all hyperlinks from a page. Discover resources automatically.

spider_screenshot

Capture visual snapshots of any page. Full-page or viewport — useful for monitoring and archival.

spider_transform

Convert any URL into markdown, HTML, plain text, or XML. Reshape web content for downstream nodes.

spider_get_credits

Check your remaining API credit balance from within a workflow.
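Under the hood, each operation wraps a call to Spider's REST API using the key from your credential. As a rough sketch of what a Scrape operation does — the endpoint path and field names below are assumptions for illustration, so check the spider.cloud API docs for the exact contract:

```shell
# Hypothetical direct API call mirroring the spider_scrape operation.
# Endpoint and parameter names are assumptions — consult spider.cloud docs.
curl -s https://api.spider.cloud/scrape \
  -H "Authorization: Bearer $SPIDER_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"url": "https://example.com", "return_format": "markdown"}'
```

The node saves you from wiring this up by hand: it handles authentication, request shaping, and response parsing, so downstream nodes receive structured JSON.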

CAPABILITIES

Enterprise-grade data collection

LLM-ready output

Get results as Markdown, HTML, plain text, or XML. Structured content ready to feed into any AI model or data pipeline.

Access any site

Spider handles CAPTCHAs, fingerprinting, and bot detection transparently. Protected sites just work.

JavaScript rendering

Dynamic sites rendered in a real browser. SPAs, client-side content, infinite scroll — you get the full page data.

Global data access

Built-in proxy network for geo-restricted content. Access data from any region without managing infrastructure.

Clean content extraction

Automatically strips navigation, ads, and boilerplate. Get just the meaningful content, structured and ready to use.

100K+ pages/sec

Built for throughput. Collect data from entire sites in seconds — your workflows never wait on the data layer.

EXAMPLE

Web data in any workflow

Drop the Spider node between a trigger and any destination. Live web data flows into your pipeline in seconds.

n8n workflow: Extract pricing data → Slack
// 1. Schedule Trigger — every hour
// 2. Spider Node:
Operation: Scrape
URL:       https://example.com/pricing
Format:    Markdown
Anti-Bot:  Enabled

// 3. Slack Node:
Channel:   #pricing-updates
Message:   {{$json.content}}


Give your workflows a web data layer

Get your API key, install the node, and pipe live web data into any workflow in under a minute.