Google Announces Sunset for Crawl Rate-Limiter Tool

Highlights:

  • Google will sunset the Crawl Rate-Limiter Tool on January 8th, 2024.
  • Why the Change: Improved crawling logic and alternative publisher tools render the rate limiter tool obsolete.
  • Impact on Website Owners: Google will now dynamically adjust crawling speed based on server responses for a more streamlined experience.

In a recent announcement, Google revealed that it will phase out the Crawl Rate-Limiter Tool in Search Console on January 8th, 2024. After more than a decade of reliable service, the tool has been rendered obsolete by improved crawling logic and the alternative tools now available to publishers.

Googlebot has become more responsive to how a site’s server handles its HTTP requests, adjusting its crawling speed based on the server’s responses. For instance, persistent HTTP 500 status codes trigger an immediate slowdown, and longer response times lead to an automatic reduction in crawling speed. Site owners facing overwhelming crawling can refer to Google’s help article for assistance.
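To make the mechanism concrete, here is a minimal sketch of how a site might signal Googlebot to slow down by returning a 5xx status under load. The status-code behavior (Googlebot reducing crawl speed when it keeps seeing server errors) is what the article describes; the load metric, the `MAX_CONCURRENT` threshold, and the handler itself are hypothetical illustrations, not anything Google prescribes.

```python
# Sketch: return HTTP 503 when the server is overloaded so that crawlers
# (including Googlebot) back off. Thresholds and the load counter are
# illustrative assumptions.
from http.server import BaseHTTPRequestHandler, HTTPServer

MAX_CONCURRENT = 50  # hypothetical capacity limit
active_requests = 0  # naive counter; a real server would track load properly


def status_for_load(active: int, limit: int = MAX_CONCURRENT) -> int:
    """Return 503 when the server is at or over capacity, else 200."""
    return 503 if active >= limit else 200


class ThrottlingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        status = status_for_load(active_requests)
        self.send_response(status)
        if status == 503:
            # Hint to clients when to retry; persistent 5xx responses are
            # what cause Googlebot to reduce its crawl rate.
            self.send_header("Retry-After", "3600")
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"ok" if status == 200 else b"server busy")


if __name__ == "__main__":
    HTTPServer(("", 8000), ThrottlingHandler).serve_forever()
```

The key design point is that the throttling decision lives entirely on the server: no Search Console setting is involved, which is exactly why Google considers the old tool redundant.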

Google Crawl Rate-Limiter Tool

The now-retired rate limiter tool, in contrast, took effect far more slowly: newly applied crawl limits could take more than a day to kick in. Fortunately, few website owners ever needed the tool, and those who did usually set the crawling speed to the lowest available setting.

With the tool being phased out, Google is also lowering the minimum crawling speed to a rate comparable to the old crawl limits. This preserves the effect of previous settings, particularly for sites with low search interest, and avoids wasting bandwidth.

With automated crawl-rate handling now mature, and in keeping with its commitment to simplicity, Google has chosen to deprecate the crawl rate limiter tool in Search Console. The Googlebot report form, however, remains available for reporting unusual Googlebot activity and emergencies.

Site owners should note that the most effective way to control crawl rates is to instruct Googlebot through server responses, as detailed in Google’s documentation. This move aligns with Google’s continuing efforts to enhance user experience and streamline its webmaster tools.

Hey, it's Adan! I'm passionate about exploring and critiquing innovative SaaS products, especially those focused on content creation and marketing. Oh, and I can't forget about my love for coffee :)