This is a warning to any webmasters or online marketers who continue to block Googlebot from crawling their website pages: the block is now reflected in the search results and could hurt your click-through success.
Google has rolled out a new search feature that informs users when a site has blocked Googlebot from crawling a page or page elements, since no snippet information is available for the SERPs in that case.
Previously, if Googlebot was blocked from a site, Google would simply show a page description based on the limited data available about the page, and the result was typically an inaccurate or awkward representation of the page’s content. But that’s all about to change…
Now the SERP listing will show a message explaining why Google cannot display a search snippet:
A description for this result is not available because of this site’s robots.txt – learn more.
Searchers can click “learn more” on this message, though the linked page is really aimed at webmasters: it explains why some sites opt to block Googlebot and describes, for marketers and webmasters, how to block or unblock the search crawler.
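For reference, blocking or unblocking Googlebot is handled through the site’s robots.txt file. A minimal sketch is below; the directory path is a hypothetical example, not one from the article:

```
# Block Googlebot from a specific directory (example path)
User-agent: Googlebot
Disallow: /private/

# Allow all other crawlers full access
User-agent: *
Disallow:
```

To unblock Googlebot entirely, remove the Disallow rule (or leave its value empty, as in the second group above), and the crawler can index the pages and generate normal snippets again.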
For marketers, blocking Googlebot makes the SERP listing less attractive to searchers, which could mean fewer visitors to your website. Google urges companies to make their sites as user-friendly as possible, and this latest move is a step in that direction, nudging webmasters and online marketers away from inaccurate descriptions of their pages.
Do you think this slight change will make a difference to the success of clicks to your website from Google SERPs? Feel free to share your thoughts below.