Full-site crawling
Breadth-first crawling that renders JavaScript, follows links, and stays within your domain.
The Crawl Tool is a desktop app that crawls and stores small-to-medium websites locally. It indexes and structures them for fast AI querying, then exposes them — along with specialised tools — to AI desktop apps like Claude, Cursor, Visual Studio Code, LM Studio, llama.cpp, and more.
Are there any pages that have images without alt tags? List them and the img tags for me.
Scanning your local crawl via MCP…
pages_with_selector(site="example.com", selector="img:not([alt])") → 23 pages returned
Here are 23 pages with images missing alt text:
<img src="/photos/hero.jpg">
<img src="/img/widget-thumb.png">
Generate the alt text for all of them and save it as notes.
Most SEO tools lock your data behind dashboards. The Crawl Tool puts your crawl on your machine and lets your AI query it directly — no seats, no exports, no limits.
Every site gets its own SQLite database, stored in your app data folder. Your crawl never leaves your machine.
HTTP and stdio transports work out of the box with Claude Desktop, Cursor, VS Code, and any MCP-compatible client.
Vector embeddings generated locally. Find pages and passages by meaning, not just keywords.
Ask "which pages have an H1?" or "find pages with a .buy-button" — your AI runs real selector queries across the crawl.
Your AI can spin up short-term in-memory databases to transform, join, and aggregate data at scale — then discard them when done.
Titles, meta, H1s, H2s, link graphs, status codes, image counts — everything your AI needs to reason about SEO.
Crawl as many sites as you want. One license, one machine, unlimited pages. Pay once — or use it with one site for free.
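To make the SQL and selector features above concrete, here is a minimal sketch of the kind of query an AI might run against a local crawl database. The table name and columns (`pages`, `url`, `title`, `meta_description`, `status_code`) are assumptions for illustration — the real per-site schema may differ.

```python
import sqlite3

# Hypothetical schema -- the actual per-site database layout may differ.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE pages (
        url TEXT PRIMARY KEY,
        title TEXT,
        meta_description TEXT,
        status_code INTEGER
    )
""")
conn.executemany(
    "INSERT INTO pages VALUES (?, ?, ?, ?)",
    [
        ("https://example.com/", "Home", "Welcome to Example", 200),
        ("https://example.com/pricing", "Pricing", None, 200),
        ("https://example.com/old", "Old page", None, 404),
    ],
)

# "Give me every live page with no meta description."
missing = [
    row[0]
    for row in conn.execute(
        "SELECT url FROM pages"
        " WHERE meta_description IS NULL AND status_code = 200"
    )
]
print(missing)  # ['https://example.com/pricing']
```

Because the crawl is just SQLite on your machine, any query your AI can phrase in SQL runs locally in milliseconds — no exports, no API round trips.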
Open the app, paste a URL, and click Start. The Crawl Tool renders every page, extracts the good stuff, and stores it locally.
▸ https://example.com Crawling… 342 / 1000 pages ✓ Stored to example_com.sqlite
Drop a snippet into your AI client's config. Claude Desktop? Use stdio. Something else? Use the HTTP URL.
{
  "mcpServers": {
    "the-crawl-tool": {
      "command": "node",
      "args": ["path/to/mcp-server.js"]
    }
  }
}
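If your client speaks HTTP rather than stdio, the entry might look like this instead — the URL and port here are placeholders, not the app's actual defaults; check the app's connection settings for the real address:

```json
{
  "mcpServers": {
    "the-crawl-tool": {
      "url": "http://localhost:PORT/mcp"
    }
  }
}
```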
Ask questions in plain English. Your AI calls the right MCP tools — search, selectors, semantics, SQL — and answers with data.
"Give me every product page with no meta description."
Before, asking an AI about your entire website meant working page by page — burning tokens, hitting context limits, and taking hours. The Crawl Tool changes that. Your crawl lives locally, indexed and ready for any AI to query at scale.
Site-wide analysis — find every missing alt tag, audit all meta descriptions, compare 500 pages side by side — is now a single prompt away instead of a multi-hour slog.
Use cloud APIs sparingly and keep your quota intact — or run local models like Qwen 3.5 9B for free: fast, accurate, and entirely on your hardware.
Define a process and let your AI run it overnight. Generate full website reports, track SEO changes week over week, or build custom audit pipelines — no manual work required.
Start free. Upgrade when you need to crawl more websites.
Try it out. Crawl one website.
Get The Crawl Tool Pro for just €19.99/year — unlimited website crawls.
Get Your License Key

We focus on data collection and presenting it for the best possible AI queries about crawled websites. We don't try to analyse the data ourselves — because when data is well presented, AI can do that analysis far better than any dashboard. And local AI is rapidly catching up to very usable levels. That means you choose your preferred chat interface and AI model. If your machine is powerful enough to run advanced local AIs, you can even pay zero AI costs.
Anything that speaks MCP — Claude Desktop, Cursor, VS Code's Copilot, LM Studio, llama.cpp, and more. We ship both stdio and HTTP transports so you can pick whichever your client supports.
No. Crawled data is stored in a local SQLite database on your machine. Your AI client only reads it when you ask it a question, and only through the MCP server running on your computer.
The app is built on Electron and is cross-platform. Today we ship a Windows installer; macOS and Linux builds are coming soon. Early access available for Pro and Agency customers.
Yes. We use Puppeteer under the hood, so pages render exactly like they would in a real browser — JavaScript, lazy-loaded content, and all.
Existing subscribers get a free lifetime Pro license. Check your inbox — we've sent migration instructions.
Download The Crawl Tool, connect your AI, and ask anything about any website.