Control your Googlebot's crawl rate in Webmaster Tools

Google upgraded the crawl rate setting in Webmaster Tools so that webmasters experiencing problems with Googlebot can now provide more specific information. Only webmasters of root-level hosted domains have access to this setting. It's probably best to keep the default "Let Google determine my crawl rate" option. However, if Googlebot's crawling is straining your server's bandwidth, you can customize the crawl rate to the speed best suited to your web server(s). The custom crawl rate option lets you tell Googlebot the maximum number of requests per second and the number of seconds between requests that you feel are best for your environment.
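The two custom values are two views of the same pacing: the maximum requests per second is simply the inverse of the pause between requests. A minimal sketch (the function names here are illustrative, not part of any Google API) shows the relationship and what a given pause implies for daily request volume:

```python
def crawl_rate(seconds_between_requests: float) -> float:
    """Implied maximum requests per second for a given pause."""
    return 1.0 / seconds_between_requests

def daily_request_budget(seconds_between_requests: float) -> int:
    """Rough upper bound on crawler requests per day at this pacing."""
    return int(86400 / seconds_between_requests)

# A 10-second pause between requests means at most 0.1 requests
# per second, or roughly 8,640 requests per day.
print(crawl_rate(10))            # 0.1
print(daily_request_budget(10))  # 8640
```

This kind of back-of-the-envelope arithmetic can help you judge whether a custom setting is tight enough to relieve your server without starving the crawler.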

Please note: setting the crawl rate to a lower-than-default value may affect the coverage and freshness of your site in Google's search results.

Relatedly, Webmaster Tools now provides a single, dedicated page where you can view and adjust all the settings for your site. The settings that have been moved to the new Settings page are: geographic target, preferred domain, opting in to enhanced image search, and crawl rate control.

Whenever you change a setting, you are given the option to save or cancel the change. Note, however, that some settings are time-bound and will expire after a certain period.