
Agent Tool

Give your AI crew eyes on the web

Spider integrates as a CrewAI tool. Your agents can crawl sites, scrape specific pages, and run web searches autonomously. The crew decides when and how to use the web.

1. Researcher: searches and gathers sources (SpiderTool, search mode)

2. Analyst: extracts structured data (SpiderTool, scrape mode)

3. Writer: synthesizes findings (context passed from the crew)

from crewai import Agent, Task, Crew
from crewai_tools import SpiderTool

# Create the Spider tool
spider_tool = SpiderTool()

# Define agents with web access
researcher = Agent(
    role="Senior Researcher",
    goal="Find accurate, up-to-date information from the web",
    backstory="An experienced researcher who verifies sources before citing them.",
    tools=[spider_tool],
    verbose=True,
)

analyst = Agent(
    role="Data Analyst",
    goal="Extract structured insights from raw web content",
    backstory="An analyst who turns raw research into clear, structured findings.",
    tools=[spider_tool],
)

# Define tasks
research_task = Task(
    description="Research the top 5 competitors in the AI search space. Crawl their websites and summarize their offerings.",
    agent=researcher,
    expected_output="A list of competitors with key features and pricing",
)

analysis_task = Task(
    description="Analyze the research and identify market gaps.",
    agent=analyst,
    expected_output="A strategic analysis with actionable recommendations",
)

# Run the crew
crew = Crew(agents=[researcher, analyst], tasks=[research_task, analysis_task])
result = crew.kickoff()
print(result)

One-Line Setup

Import SpiderTool and pass it to your agent. The tool handles authentication, rate limiting, and retries. Your agent just asks for web data.

Agent-Driven Discovery

Your crew decides what to crawl and when. The researcher agent can follow links across sites, pivot searches, and deep-dive into topics autonomously.
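The follow-links behavior can be pictured as a breadth-first loop. A minimal sketch, assuming a hypothetical `scrape` callable that returns `(content, links)` for a URL; a real researcher agent would let the LLM decide which links to pursue rather than taking them all:

```python
from collections import deque

def discover(start_url, scrape, max_pages=10):
    """Breadth-first link discovery across sites.

    `scrape` is a hypothetical stand-in returning (content, links)
    for a URL; an actual agent reasons about which links matter.
    """
    seen, queue, pages = set(), deque([start_url]), {}
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        content, links = scrape(url)
        pages[url] = content
        # Queue unvisited links for the next hop
        queue.extend(link for link in links if link not in seen)
    return pages
```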

Scrape + Search + Crawl

Spider exposes all three modes through the tool interface. Agents can scrape a single page, search the web, or crawl an entire domain based on the task.
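How an agent picks a mode depends on its own reasoning. As a rough illustration only, the choice can be sketched as a heuristic; the keyword rules below are hypothetical, since real agents decide via the LLM:

```python
def choose_mode(task: str) -> str:
    """Illustrative heuristic mapping a task description to a
    Spider mode. Real mode selection happens inside the agent's
    LLM reasoning, not via keyword matching."""
    text = task.lower()
    if "crawl" in text or "entire site" in text:
        return "crawl"       # whole-domain jobs
    if text.startswith("http"):
        return "scrape"      # a specific page was given
    return "search"          # open-ended questions hit the web index
```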

Structured Output

Spider returns clean markdown that LLMs parse reliably. No HTML soup for your agents to fight through. Better input means better reasoning.
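Markdown results can be concatenated straight into an agent's context. A small helper sketch, assuming result dicts with `url` and `content` keys (the same shape used in the search example further down):

```python
def build_context(pages: list[dict], max_chars: int = 2000) -> str:
    """Assemble Spider's markdown results into one prompt context.

    Skips results with empty content and truncates each page so a
    single long page cannot crowd out the rest of the context.
    """
    sections = []
    for page in pages:
        content = page.get("content")
        if not content:
            continue
        sections.append(f"## {page['url']}\n{content[:max_chars]}")
    return "\n\n".join(sections)
```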

Parallel Crews

Spider handles concurrency server-side. Multiple agents in your crew can make requests simultaneously without conflicting or hitting local resource limits.
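Because throttling happens server-side, client code can fan requests out with a plain thread pool and no backoff logic. A sketch, where `fetch` stands in for any Spider call (scrape, search, or crawl):

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_all(urls, fetch, max_workers=8):
    """Run Spider requests for many URLs concurrently.

    `fetch` is any callable taking a URL; Spider rate-limits on the
    server, so no client-side throttling is added here.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves input order, so zip pairs results correctly
        return dict(zip(urls, pool.map(fetch, urls)))
```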

Tool Composability

Combine SpiderTool with file writers, database tools, or API callers. Let your crew research the web, then act on findings in a single workflow.
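The "act on findings" step can be as simple as persisting the crew's output. In a real crew a file-writing tool from crewai_tools would do this; the helper below is a minimal hypothetical stand-in:

```python
from pathlib import Path

def save_report(findings: str, path: str = "report.md") -> Path:
    """Persist the crew's findings to disk as the final workflow step.
    A stand-in for a proper file-writer tool in the crew's toolbox."""
    out = Path(path)
    out.write_text(findings, encoding="utf-8")
    return out
```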

Research Crew with Live Web Search

Build a crew that searches the web, reads the results, and produces a report. All with real-time data.

from crewai_tools import tool
from spider import Spider

spider = Spider()  # reads SPIDER_API_KEY from the environment

@tool("Web Search")
def search_web(query: str) -> str:
    """Search the web and return relevant content."""
    results = spider.search(query, params={
        "search_limit": 5,
        "fetch_page_content": True,
        "return_format": "markdown",
    })
    return "\n\n".join(
        f"## {r['url']}\n{r['content'][:2000]}"
        for r in results if r.get("content")
    )

@tool("Site Crawler")
def crawl_site(url: str) -> str:
    """Crawl a website and return its contents."""
    pages = spider.crawl_url(url, params={
        "return_format": "markdown",
        "limit": 20,
    })
    return "\n\n".join(
        p["content"][:3000] for p in pages if p.get("content")
    )

# Pass search_web and crawl_site to your agents' tools=[...]
# alongside SpiderTool for full flexibility

Start building with CrewAI + Spider

Free credits on signup. No subscription required.