New — Desktop app with built-in MCP server

Crawl any website.
Talk to it with any AI.

The Crawl Tool is a desktop app that crawls and stores small-to-medium websites locally. It indexes and structures every page for fast querying, then exposes the data — along with specialised tools — to AI desktop apps like Claude Desktop, Cursor, Visual Studio Code, LM Studio, llama.cpp, and more.

thecrawltool.com
You

Are there any pages that have images without alt tags? List them and the img tags for me.

Scanning your local crawl via MCP…

pages_with_selector(site="example.com", selector="img:not([alt])")
→ 23 pages returned

Here are 23 pages with images missing alt text:

  • /blog/hero-image-guide — <img src="/photos/hero.jpg">
  • /products/widget — <img src="/img/widget-thumb.png">
  • /about/team — 4 images missing alt text
  • …20 more
You

Generate the alt text for all of them and save it as notes.

Why it's different

A crawler that speaks your AI's language.

Most SEO tools lock your data behind dashboards. The Crawl Tool puts your crawl on your machine and lets your AI query it directly — no seats, no exports, no limits.

Full-site crawling

Breadth-first crawling that renders JavaScript, follows links, and stays within your domain.

100% local storage

Every site gets its own SQLite database, stored in your app data folder. Your crawl never leaves your machine.

Built-in MCP server

HTTP and stdio transports work out of the box with Claude Desktop, Cursor, VS Code, and any MCP-compatible client.

Semantic search

Vector embeddings generated locally. Find pages and passages by meaning, not just keywords.
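Semantic search boils down to comparing embedding vectors by similarity. A minimal sketch using cosine similarity over toy three-dimensional vectors — real embeddings have hundreds of dimensions, and the page paths and scores here are illustrative, not the app's actual output:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" standing in for real locally generated vectors
pages = {
    "/pricing":  [0.9, 0.1, 0.0],
    "/blog/seo": [0.2, 0.8, 0.1],
    "/contact":  [0.1, 0.2, 0.9],
}
query = [0.85, 0.15, 0.05]  # pretend embedding of "how much does it cost?"

# Rank pages by how close their meaning is to the query
ranked = sorted(pages, key=lambda p: cosine(query, pages[p]), reverse=True)
print(ranked[0])  # the pricing page ranks first
```

Because similarity is computed over meaning rather than exact words, "how much does it cost?" surfaces the pricing page even though it never contains the word "cost".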

CSS-selector queries

Ask "which pages have an H1?" or "find pages with a .buy-button" — your AI runs real selector queries across the crawl.
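The app presumably runs these through a real CSS engine; as a stdlib-only illustration of what a selector like `img:not([alt])` matches, here is a hypothetical checker built on Python's `html.parser` (the class name and sample HTML are invented for the example):

```python
from html.parser import HTMLParser

class ImgAltChecker(HTMLParser):
    """Collect <img> tags lacking an alt attribute —
    the equivalent of the CSS selector img:not([alt])."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.missing.append(attrs.get("src", "?"))

html = '<img src="/photos/hero.jpg"><img src="/ok.png" alt="Widget photo">'
checker = ImgAltChecker()
checker.feed(html)
print(checker.missing)  # → ['/photos/hero.jpg']
```

Run across every stored page, this is how a query like the alt-text audit in the demo above can return a page-by-page list of offending `img` tags.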

In-memory SQL

Your AI can spin up short-term in-memory databases to transform, join, and aggregate data at scale — then discard them when done.
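A sketch of the pattern with Python's built-in `sqlite3` — the table and rows are made up, but a `:memory:` database behaves exactly like this: created on demand, queried, then discarded:

```python
import sqlite3

# Scratch database that exists only for this analysis, then vanishes
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE pages (url TEXT, status INTEGER, word_count INTEGER)")
db.executemany("INSERT INTO pages VALUES (?, ?, ?)", [
    ("/",          200, 540),
    ("/pricing",   200, 120),
    ("/old-promo", 404, 0),
])

# Aggregate: count pages per HTTP status code
rows = db.execute(
    "SELECT status, COUNT(*) FROM pages GROUP BY status ORDER BY status"
).fetchall()
print(rows)  # → [(200, 2), (404, 1)]
db.close()
```

Nothing is written to disk, so the AI can join, pivot, and aggregate crawl data freely without touching the stored crawl itself.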

Structured page data

Titles, meta, H1s, H2s, link graphs, status codes, image counts — everything your AI needs to reason about SEO.
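As one illustration of the audits this structured data enables, here is a hypothetical duplicate-title check; the `pages` table and its columns are invented for the example and may not match the app's actual schema:

```python
import sqlite3

# Hypothetical schema — the real per-site database may use different names
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE pages (url TEXT, title TEXT, h1 TEXT, status INTEGER)")
db.executemany("INSERT INTO pages VALUES (?, ?, ?, ?)", [
    ("/widgets/a", "Widgets", "Widgets",      200),
    ("/widgets/b", "Widgets", "More Widgets", 200),
    ("/about",     "About",   "About Us",     200),
])

# SEO audit: titles shared by more than one page
dupes = db.execute(
    "SELECT title, COUNT(*) FROM pages GROUP BY title HAVING COUNT(*) > 1"
).fetchall()
print(dupes)  # → [('Widgets', 2)]
```

With titles, headings, and status codes already in columns, an AI can express checks like this as one query instead of re-reading every page.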

No seats. No limits.

Crawl as many sites as you want. One license, one machine, unlimited pages. Pay once — or use it with one site for free.

How it works

From URL to AI chat in three steps.

01

Crawl

Open the app, paste a URL, and click Start. The Crawl Tool renders every page, extracts the good stuff, and stores it locally.

Desktop app
▸ https://example.com
  Crawling… 342 / 1000 pages
  ✓ Stored to example_com.sqlite
02

Connect

Drop a snippet into your AI client's config. Claude Desktop? Use stdio. Something else? Use the HTTP URL.

claude_desktop_config.json
{
  "mcpServers": {
    "the-crawl-tool": {
      "command": "node",
      "args": ["path/to/mcp-server.js"]
    }
  }
}
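For clients that connect over HTTP rather than stdio, the entry typically points at a URL instead of a command. The host, port, and path below are placeholders — use whatever address the app's MCP server reports:

```json
{
  "mcpServers": {
    "the-crawl-tool": {
      "url": "http://localhost:3000/mcp"
    }
  }
}
```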
03

Chat

Ask questions in plain English. Your AI calls the right MCP tools — search, selectors, semantics, SQL — and answers with data.

You → Claude
"Give me every product page with no meta description."
Who it's for

One tool. Three very happy audiences.

For SEOs

Run entire audits by chatting.

  • Find pages missing H1s, titles, or meta descriptions
  • Spot thin content, duplicate titles, orphaned pages
  • Cluster pages semantically to plan internal linking
  • Export findings as CSVs directly from the chat
For marketers

Understand any website in minutes.

  • Analyze competitors' entire site structure
  • Extract every CTA, headline, and pricing mention
  • Summarize product ranges or content strategies
  • Generate briefs grounded in real crawled data
For website owners

Know what's actually on your site.

  • Get a plain-English overview of your content
  • Ask "what's broken?" and get real answers
  • Track how pages evolve between crawls
  • Keep your data private — nothing is uploaded
Why site-wide AI matters

Site-wide AI prompts were impractical. Until now.

Before, asking an AI about your entire website meant working page by page — burning tokens, hitting context limits, and taking hours. The Crawl Tool changes that. Your crawl lives locally, indexed and ready for any AI to query at scale.

Practical at last

Site-wide analysis — find every missing alt tag, audit all meta descriptions, compare 500 pages side by side — is now a single prompt away instead of a multi-hour slog.

Save tokens, or go free

Because the crawl is pre-indexed, cloud APIs burn far fewer tokens and your quota stays intact — or skip them entirely and run a local model like Qwen 3.5 9B for free: fast, accurate, and entirely on your own hardware.

Automate your agents

Define a process and let your AI run it overnight. Generate full website reports, track SEO changes week over week, or build custom audit pipelines — no manual work required.

Pricing

Simple. Local-first. No surprises.

Start free. Upgrade when you need to crawl more websites.

Free

€0

Try it out. Crawl one website.

  • Crawl and save one website
  • All features included
Download free

Need a license key?

Get The Crawl Tool Pro for just €19.99/year — unlimited website crawls.

Get Your License Key
FAQ

Questions, answered.

How is this different from competitors?

We focus on collecting data and presenting it for the best possible AI queries about crawled websites. We don't try to analyse the data ourselves — when data is well presented, AI can do that analysis far better than any dashboard. And local AI is rapidly reaching genuinely usable quality. That means you choose your preferred chat interface and AI model — and if your machine is powerful enough to run advanced local AIs, you can pay zero AI costs.

Which AI clients does it work with?

Anything that speaks MCP — Claude Desktop, Cursor, VS Code's Copilot, LM Studio, llama.cpp, and more. We ship both stdio and HTTP transports so you can pick whichever your client supports.

Is my data sent anywhere?

No. Crawled data is stored in a local SQLite database on your machine. Your AI client only reads it when you ask it a question, and only through the MCP server running on your computer.

Does it work on macOS and Linux?

The app is built on Electron and is cross-platform. Today we ship a Windows installer; macOS and Linux builds are coming soon, with early access available for Pro and Agency customers.

Can I crawl JavaScript-heavy websites?

Yes. We use Puppeteer under the hood, so pages render exactly like they would in a real browser — JavaScript, lazy-loaded content, and all.

What happens to existing SaaS subscribers?

Existing subscribers get a free lifetime Pro license. Check your inbox — we've sent over migration instructions.

Your next SEO insight is a chat away.

Download The Crawl Tool, connect your AI, and ask anything about any website.