Libraries
Official SDKs are available for Python, JavaScript/TypeScript, Rust, and Go. Each library wraps the Spider API with typed methods for crawling, scraping, and search. All you need is an API key to get started.
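The examples below construct each client with no explicit key. A common pattern (an assumption here; verify against each SDK's reference) is that the client falls back to a `SPIDER_API_KEY` environment variable, which you can export once per shell or set in code:

```python
import os

# Set the API key in the process environment. The zero-argument
# constructors in the examples below are assumed to read
# SPIDER_API_KEY when no key is passed explicitly.
os.environ["SPIDER_API_KEY"] = "your-api-key"

print(os.environ["SPIDER_API_KEY"])
```

Setting the key in the environment keeps it out of source control; for production, inject it via your deployment's secret store instead.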
Python
Install the Python SDK with pip:
pip install spider_client
Crawl a website with Python:
from spider import Spider
app = Spider()
params = {"limit": 5}
crawl_result = app.crawl_url("https://example.com", params=params)
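The call returns the collected pages. A minimal sketch of iterating over them, assuming each page is a dict with "url" and "content" fields (an assumption; check the API reference for the exact response shape):

```python
# Hypothetical crawl result, mirroring the assumed response shape:
# a list of page dicts with "url" and "content" keys.
crawl_result = [
    {"url": "https://example.com", "content": "<html>...</html>"},
    {"url": "https://example.com/about", "content": "<html>...</html>"},
]

# Print each page's URL and content length.
for page in crawl_result:
    print(f'{page["url"]}: {len(page["content"])} chars')
```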
JavaScript / TypeScript
Install the JS/TS SDK. It works with Node.js, Deno, and Bun:
npm install @spider-cloud/spider-client
Crawl a website with JavaScript:
import { Spider } from "@spider-cloud/spider-client";
const app = new Spider();
const params = { limit: 5 };
const result = await app.crawlUrl("https://example.com", params);
Rust
Add the Rust SDK with cargo:
cargo add spider-client
Crawl a website with Rust:
use spider_client::{RequestParams, Spider};

let spider = Spider::new(None).expect("an API key is required");
let crawler_params = RequestParams { limit: Some(5), ..Default::default() };
let crawl_result = spider
    .crawl_url("https://example.com", Some(crawler_params), false, "application/json", None)
    .await
    .expect("failed to crawl the URL");
Go
Install the Go SDK with go get:
go get github.com/spider-rs/spider-clients/go
Crawl a website with Go:
package main

import (
	"context"
	"fmt"

	spider "github.com/spider-rs/spider-clients/go"
)

func main() {
	client := spider.New("")
	pages, err := client.CrawlURL(context.Background(), "https://example.com", &spider.SpiderParams{Limit: 5})
	if err != nil {
		panic(err)
	}
	for _, page := range pages {
		fmt.Printf("%s: %d chars\n", page.URL, len(page.Content))
	}
}
CLI
Install the CLI with cargo:
cargo install spider-cloud-cli
spider-cloud-cli auth --api_key YOUR_API_KEY
After authenticating, run any of the commands:
spider-cloud-cli crawl --url http://example.com --limit 5
Run spider-cloud-cli --help to list all available commands.