Good news for webmasters and online publishers: Google, Yahoo! and Microsoft have united to support a single sitemap autodiscovery method. In a rare collaborative effort, the three search rivals have agreed on referencing sitemaps within the robots.txt file.
What are Sitemaps?
Sitemaps inform search engine crawlers of the pages within your website, how important each page is and how frequently you update it. For a great DIY guide to creating sitemaps, check out Matt’s post on Making Sitemaps Work for Your Website.
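As a minimal sketch (www.example.com and the values below are placeholders), a sitemap is simply an XML file listing your URLs along with that optional metadata:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://www.example.com/</loc>
      <lastmod>2007-04-11</lastmod>
      <changefreq>weekly</changefreq>
      <priority>0.8</priority>
    </url>
  </urlset>

Here lastmod, changefreq and priority are the fields that tell crawlers when a page last changed, how often it tends to change and how important it is relative to your other pages.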
What is a robots.txt File?
Robots.txt is a plain text file (hence the .txt) placed in the root directory of a site. It tells search engine robots which parts of your website they may and may not crawl. Since all the major search engine crawlers check this file, it’s the obvious place in which to reference a sitemap – hopefully resulting in better, faster website indexing.
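As a simple illustration (the path shown is hypothetical), a robots.txt file is just a short list of rules addressed to crawlers:

  User-agent: *
  Disallow: /private/

The first line addresses all robots; the second asks them to stay out of the /private/ directory while leaving the rest of the site open to crawling.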
The New Sitemap Protocol
Once you have created a sitemap for your site, upload it to your website’s root directory and add the following line (shown here with a placeholder URL) to your robots.txt file:
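  Sitemap: http://www.example.com/sitemap.xml

Per the sitemaps.org autodiscovery specification, the line is simply the word Sitemap followed by the full URL of your sitemap file – www.example.com above stands in for your own domain.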
(Speak to your webmaster if you don’t administer your own website)
Any search engine that supports sitemaps will now be able to discover your sitemap through your robots.txt file, making it easier for you to get all the content you want into all the search engines. Follow the guidelines from Google, Yahoo! or sitemaps.org to make sure you reference your sitemap in robots.txt correctly, and remember that changes to your site are best left to your webmaster or an IT professional!