Glossary

Plain-English SEO definitions, sourced from Google's documentation.

Crawl Error

A condition that prevents Googlebot from successfully fetching or processing a URL, such as a server failure, DNS issue or HTTP error response.

Definition

A crawl error is any issue that stops a search engine crawler from retrieving a page. Google reports these in Search Console's Page Indexing report, classifying URLs by reasons such as 'Server error (5xx)', 'Not found (404)', 'Redirect error', 'Blocked by robots.txt' or 'Soft 404'.

Crawl errors fall into two scopes: site-wide problems (DNS failures, repeated 5xx responses, robots.txt fetch errors that block all crawling) and URL-level problems specific to individual pages. Site-wide issues can throttle Googlebot's overall crawl rate for a host, while URL-level issues affect indexing only for the URLs involved. Search Console groups affected URLs by reason so site owners can diagnose patterns. Note that many reasons are descriptive rather than evaluative; for example, 'Crawled - currently not indexed' means Google fetched the page but chose not to index it, which is not an error at all.
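The reason buckets above can be sketched as a toy classifier. The bucket names mirror Search Console labels, but the logic is an illustrative assumption, not Google's actual classification (which also weighs signals like page content and redirect chains):

```python
def classify_crawl_result(status_code, content_length=None):
    """Map an HTTP response to a rough, Search Console-style bucket.

    Illustrative sketch only: real classification is more nuanced.
    content_length is an optional hint used to flag a possible soft 404
    (a 200 response whose body is effectively empty).
    """
    if 500 <= status_code <= 599:
        return "Server error (5xx)"        # often a site-wide signal
    if status_code == 404:
        return "Not found (404)"           # URL-level
    if 300 <= status_code <= 399:
        return "Redirect"
    if status_code == 200 and content_length == 0:
        return "Possible soft 404"         # 200 OK but no real content
    return "Fetched (200)"
```

A 5xx spike across many URLs points at the host itself, while scattered 404s usually just mean stale links.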

Examples

  • Server error reported in Search Console

    Search Console's Page Indexing report shows a spike of 'Server error (5xx)' URLs after a deployment. The team checks logs, finds a database connection bug and the URLs return to the indexed bucket once the fix ships.

  • Soft 404 detected by Google

    A category page with no products returns 200 OK but the page reads 'No results found'. Google labels it 'Soft 404' in Search Console because the response code does not match the empty content.
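The usual fix for the soft 404 above is to return a real 404 when there is nothing to show. A minimal sketch, assuming a simple handler that returns a status code and body (the function name and product shape are hypothetical):

```python
def render_category(products):
    """Return (status_code, body) for a category page.

    An empty category returns a real 404 instead of a 200 'No results
    found' page, so crawlers do not flag the URL as a soft 404.
    """
    if not products:
        return 404, "Page not found"
    body = "\n".join(p["name"] for p in products)
    return 200, body
```

The same idea applies in any framework: match the status code to the content, rather than serving an error message with 200 OK.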

Last updated: 12/05/2026