The number of pages a search engine will crawl on your site within a given time period, determined by your server capacity and the perceived value of your content.
Crawl budget is the number of URLs that Googlebot (or other search engine crawlers) will crawl on your website within a given time frame. It is determined by two factors: crawl rate limit (how fast your server can handle requests without degrading user experience) and crawl demand (how many of your pages Google considers worth crawling based on their popularity and freshness).
For most small to medium sites (under 10,000 pages), crawl budget is not a concern because Google can easily crawl the entire site. Crawl budget becomes important for large sites with millions of pages, sites with significant duplicate content, or sites with slow server response times.
If Google cannot crawl all of your important pages within your crawl budget, some pages may not get indexed or may be indexed with significant delay. This means new content takes longer to appear in search results, and updated content takes longer to reflect changes.
Optimizing crawl budget involves ensuring that crawlers spend their limited visits on your most important pages rather than wasting them on low-value URLs like filtered category pages, search result pages, or duplicate content.
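One common way to steer crawlers away from low-value URLs is with robots.txt Disallow rules. The sketch below is illustrative only: the paths and parameter patterns (/search, ?filter=, ?page=) are hypothetical examples, not rules from the source, and the right patterns depend entirely on your site's URL structure.

```txt
# Hypothetical robots.txt sketch: block common low-value URL patterns
# so crawlers spend their budget on canonical content pages.
User-agent: *
# Internal search result pages (assumed path)
Disallow: /search
# Filtered/faceted category URLs (assumed query parameters)
Disallow: /*?filter=
Disallow: /*?page=
```

Note that Disallow prevents crawling, not indexing: a blocked URL can still appear in results if it is linked externally, so pages that must be kept out of the index entirely may need a noindex directive instead.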
An ecommerce site with 500,000 product pages discovers through Google Search Console that only 200,000 are crawled monthly. By blocking crawling of out-of-stock product pages and paginated filter results via robots.txt, the site redirects crawl budget to active product pages, reducing the average time-to-index for new products from 14 days to 3 days.
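Before changing robots.txt, it helps to measure where crawlers actually spend their visits. A minimal sketch of that analysis, assuming standard access logs where the user agent contains "Googlebot": it groups crawler hits by top-level path section, with parameterized URLs counted separately. The sample log lines and path conventions are hypothetical.

```python
import re
from collections import Counter

# Hypothetical access-log sample; a real analysis would read the server's log file.
LOG_LINES = [
    '66.249.66.1 - - [10/May/2024] "GET /products/widget-a HTTP/1.1" 200 "Googlebot/2.1"',
    '66.249.66.1 - - [10/May/2024] "GET /search?q=widget HTTP/1.1" 200 "Googlebot/2.1"',
    '66.249.66.1 - - [10/May/2024] "GET /category?filter=red&page=7 HTTP/1.1" 200 "Googlebot/2.1"',
    '66.249.66.1 - - [10/May/2024] "GET /products/widget-b HTTP/1.1" 200 "Googlebot/2.1"',
]

def crawl_distribution(lines):
    """Count Googlebot requests per top-level path section.

    URLs with query strings are lumped under 'parameterized', since
    faceted/filtered URLs are a frequent source of wasted crawl budget.
    """
    counts = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue
        match = re.search(r'"GET (\S+)', line)
        if not match:
            continue
        path = match.group(1)
        if "?" in path:
            counts["parameterized"] += 1
        else:
            counts["/" + path.lstrip("/").split("/")[0]] += 1
    return counts

print(crawl_distribution(LOG_LINES))
```

If a large share of hits lands in the "parameterized" bucket, that is a signal that filter and search URLs are consuming budget that could go to product pages.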