Googlebot is Google's web crawler, which determines how much of each site it visits to crawl. Google has now given webmasters more control over this process.
The crawl rate determines how much time Googlebot spends crawling your site. Google wants to crawl each site thoroughly (so that searchers get the best results in the SERPs), but it also needs to keep the impact on your server's bandwidth low.
There is a default crawl setting; however, some sites have more specific needs, and it's in these situations that the new settings are useful.
Pooja Shah from the Google Webmaster Tools Team explains more:
“For a vast majority of sites, it’s probably best to choose the “Let Google determine my crawl rate” option, which is the default. However, if you’re an advanced user or if you’re facing bandwidth issues with your server, you can customize your crawl rate to the speed most optimal for your web server(s). The custom crawl rate option allows you to provide Googlebot insight to the maximum number of requests per second and the number of seconds between requests that you feel are best for your environment.”
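To illustrate what those two settings mean in practice, here is a minimal sketch (not Google's implementation, just a generic throttling example) of how a cap on requests per second translates into a fixed delay between a crawler's requests. The function name and URLs are hypothetical.

```python
def polite_fetch_schedule(urls, max_requests_per_second=2.0):
    """Space out crawl requests so the crawler never exceeds
    the configured maximum number of requests per second."""
    delay = 1.0 / max_requests_per_second  # seconds between requests
    schedule = []
    elapsed = 0.0
    for url in urls:
        # record the time offset at which this URL would be fetched
        schedule.append((round(elapsed, 3), url))
        elapsed += delay
    return schedule

# Two requests per second means one request every 0.5 seconds:
# polite_fetch_schedule(["/a", "/b", "/c"], 2.0)
# yields fetches at offsets 0.0, 0.5, and 1.0 seconds.
```

This is the trade-off the custom setting exposes: a lower maximum rate means a longer delay between requests and less load on your server, at the cost of a slower crawl.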
Shah outlines the changes in more detail here, including a warning that a customized crawl rate only remains in effect for 90 days, after which it resets to Google's recommended value.
A number of factors determine the crawl rate Google sets for your site, so if you do wish to customize it, do so with care.