Give Your AI Tools
Web Superpowers
Connect Spider to Claude, Claude Code, Cursor, Windsurf, and any MCP-compatible tool. Your AI gets 13 tools for crawling, scraping, searching, and extracting web data at 100K+ pages per second.
One command. Any tool.
Install from npm — no build step, no dependencies to manage. Works with any tool that supports the Model Context Protocol.
Claude Code
Recommended
# One command setup
claude mcp add spider -- npx -y spider-cloud-mcp
# Set your API key
export SPIDER_API_KEY="your-api-key"
Claude Desktop
Add to claude_desktop_config.json
{
"mcpServers": {
"spider": {
"command": "npx",
"args": ["-y", "spider-cloud-mcp"],
"env": {
"SPIDER_API_KEY": "your-key"
}
}
}
}
Cursor
Add to .cursor/mcp.json in your project
{
"mcpServers": {
"spider": {
"command": "npx",
"args": ["-y", "spider-cloud-mcp"],
"env": {
"SPIDER_API_KEY": "your-key"
}
}
}
}
Windsurf
Add to ~/.codeium/windsurf/mcp_config.json
{
"mcpServers": {
"spider": {
"command": "npx",
"args": ["-y", "spider-cloud-mcp"],
"env": {
"SPIDER_API_KEY": "your-key"
}
}
}
}
13 tools your AI can use
Every tool maps directly to a Spider API endpoint. Your AI decides which tool to call based on context — no configuration needed.
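Under the hood, each invocation is a standard MCP `tools/call` request sent from the client to the server. A sketch of what a client might send for spider_scrape (the argument names here mirror the parameters shown in the example conversation later on this page; treat them as assumptions, not a full schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "spider_scrape",
    "arguments": {
      "url": "https://example.com",
      "return_format": "markdown"
    }
  }
}
```

Your AI assistant constructs messages like this for you; you never write them by hand.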
Core Tools (8 tools)
spider_crawl Crawl a website and extract content from multiple pages.
spider_scrape Scrape a single page — no crawling, just fetch and process one URL.
spider_search Search the web and optionally crawl the results.
spider_links Extract all links from a page without fetching content.
spider_screenshot Capture screenshots of web pages as base64 images.
spider_unblocker Access blocked content with anti-bot bypass and proxy rotation.
spider_transform Transform HTML to markdown or text — no network requests needed.
spider_get_credits Check your available API credit balance.
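The HTML-to-markdown conversion behind spider_transform can be pictured with a toy sketch. This is not Spider's implementation, only an illustration of the operation: a handful of tag rewrites, then stripping whatever markup remains.

```typescript
// Toy illustration of HTML -> markdown, in the spirit of spider_transform.
// Spider's actual converter is far more complete; this handles just a few
// tags to show the shape of the transformation.
function htmlToMarkdown(html: string): string {
  return html
    .replace(/<h1>(.*?)<\/h1>/g, "# $1\n")                  // headings
    .replace(/<strong>(.*?)<\/strong>/g, "**$1**")          // bold
    .replace(/<a href="(.*?)">(.*?)<\/a>/g, "[$2]($1)")     // links
    .replace(/<p>(.*?)<\/p>/g, "$1\n")                      // paragraphs
    .replace(/<[^>]+>/g, "")                                // strip leftovers
    .trim();
}

console.log(
  htmlToMarkdown('<h1>Docs</h1><p>See <a href="https://spider.cloud">Spider</a>.</p>')
);
```

Because the real transform runs server-side with no page fetch involved, it costs no crawl requests.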
AI Tools (5 tools, requires subscription)
spider_ai_crawl AI-guided crawling using natural language prompts.
spider_ai_scrape AI-powered structured data extraction from plain English.
spider_ai_search AI-enhanced semantic web search with relevance ranking.
spider_ai_browser AI browser automation — describe actions in natural language.
spider_ai_links AI-powered intelligent link extraction and filtering.
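For the AI tools, the natural-language instruction travels as an ordinary tool argument. A hypothetical spider_ai_scrape call might look like the sketch below; the `prompt` argument name is an assumption for illustration, not a documented parameter:

```json
{
  "method": "tools/call",
  "params": {
    "name": "spider_ai_scrape",
    "arguments": {
      "url": "https://example.com/pricing",
      "prompt": "Extract the pricing table as JSON with plan, price, and features"
    }
  }
}
```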
What your AI can do with Spider
Crawl Entire Sites
Ask your AI to "crawl the docs at example.com" and it calls spider_crawl with the right parameters automatically.
Search the Web
"Find recent articles about quantum computing" — your AI searches the web and returns results with full page content.
Extract Structured Data
"Get the pricing table from this page" — AI extracts exactly the data you describe and returns clean JSON.
Capture Screenshots
"Take a screenshot of this page" — get full-page or viewport screenshots rendered in a real browser.
Bypass Anti-Bot
Protected sites are no problem. Spider handles CAPTCHAs, fingerprinting, and bot detection with a 99.5% success rate.
Automate Browsers
"Click the login button, fill in my email, and submit" — AI drives a real browser to interact with web pages.
Built for AI workflows
Token-efficient
- Clean markdown output, no HTML noise
- Content filtering removes nav, ads, footers
- Return only the data your AI needs
Fastest API
- 100K+ pages per second throughput
- Sub-second response for single pages
- No rate limits on paid plans
Full coverage
- JavaScript rendering via headless Chrome
- Anti-bot bypass on protected sites
- Global proxy network for geo-access
See it in action
Here's what it looks like when your AI uses Spider tools in a real conversation.
You: Crawl the Spider docs and summarize the API endpoints.
Claude: I'll crawl the Spider documentation for you.
Using spider_crawl:
url: "https://spider.cloud/docs/api"
return_format: "markdown"
limit: 20
I found 12 API endpoints across the docs. Here's a summary:
1. POST /crawl — Crawl websites and return content
2. POST /scrape — Scrape a single page
3. POST /search — Search the web
4. POST /links — Extract links
...
Related Resources
Ready to connect your AI to the web?
Get your API key and install the MCP server in under a minute.