Google has been locked in battle recently with some of the world’s biggest newspapers, but has called their bluff by announcing a simple solution to their problem.
Media owners and high-ranking officials in Australia, the United States and Europe have recently attacked Google, arguing that it steals their work by aggregating links on services such as Google News. The media outlets want Google to pay for the right to link to their content.
Google has come out with a simple solution, however, telling the newspapers that they can simply add Disallow lines to their robots.txt file and the problem is solved. This would see links to their content removed from Google and every other major search engine that follows the robots.txt protocol.
Here’s what Google’s Josh Cohen is saying about it all:
Like all other content owners, [news publishers] are in complete control when it comes not only to what content they make available on the web, but also who can access it and at what price.
For more than a decade, search engines have routinely checked for permissions before fetching pages from a web site.
Millions of webmasters around the world, including news publishers, use a technical standard known as the Robots Exclusion Protocol (REP) to tell search engines whether or not their sites, or even just a particular web page, can be crawled. Webmasters who do not wish their sites to be indexed can and do use these two lines to deny permission.
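For illustration, the two REP lines Cohen is referring to are the standard robots.txt directives for denying all crawlers access to an entire site. Placed in a file named robots.txt at the root of the domain, they look like this:

```
User-agent: *
Disallow: /
```

The `User-agent: *` line addresses all compliant crawlers, and `Disallow: /` tells them not to fetch any page on the site. A publisher could also be more selective, for example targeting only a specific crawler by name in the `User-agent` line, or disallowing only certain paths rather than the whole site.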
So is this just a case of sour grapes from newspapers wanting to make more money online? I can see how it might appear that Google is “stealing” the content, but surely they should be happy for the extra exposure without having to pay for it. What do you think? Feel free to share your thoughts below.