It’s often difficult to get a simple answer on the impact that site architecture has on search engine performance and website traffic.
Accordingly, a myriad of myths circulate the forums and blogs about how the search engines treat various architecture scenarios and practices. Google has come to the rescue (for their search engine at least) with a post addressing some of these myths, malicious practices and issues.
Luisella Mazza, a senior Search Quality Analyst at Google, went through some of the more common issues (both those caused by malicious hackers and those that are self-imposed) and myths at the recent SMX London conference.
Below I’ve summarized some of the discussion to clear up any misconceptions you might have about architecture’s impact on your website’s performance.
Myth 1: Traffic Drops Due to Duplicate Content
Most SEOs appreciate that duplicate content can be penalized by the search engines, but Luisella points out that such practices are only penalized when the intent is to fool the search engines, which is often not the case when similar content appears in various areas of a website.
Myth 2: Using Affiliate Programs Causes Traffic Drops
Using affiliate programs alone isn’t cause for concern. The issue lies in whether you are offering original and useful content to promote your affiliate links. As long as you offer up new and relevant content to support your affiliate links, you have as much right to a strong ranking as any other site.
Luisella makes special reference to malicious SEO practices as they are often the main cause of issues. Techniques such as hidden text and links most often cause sites to be penalized by search engines. The other practice is cloaking: using redirects to show search engines one version of the content while serving users a different experience.
Below is her first presentation, which highlights how to use Google’s tools to identify such malicious efforts. It’s a useful way to check whether your website has been hijacked.
Site Design & Architecture
Luisella then goes on to offer up the following common site design and architecture issues:
- First off, check that your robots.txt file has the correct status code and is not returning an error.
- Keep in mind some best practices when moving to a new site and the new “Change of address” feature recently added to Webmaster Tools.
- Review the settings of the robots.txt file to make sure no pages — particularly those rewritten and/or dynamic — are blocked inappropriately.
- Finally, make good use of the rel=”canonical” attribute to reduce the indexing of duplicate content on your domain. The example in the presentation shows how using this attribute helps Google understand that a duplicate can be clustered with the canonical and that the original, or canonical, page should be indexed.
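Two of the checks above are easy to automate. The sketch below, using only Python’s standard library, fetches a site’s robots.txt and reports its HTTP status code, and extracts the rel="canonical" URL from a page’s HTML. It is a minimal illustration, not a full audit tool; the example.com URL is purely illustrative.

```python
import urllib.request
import urllib.error
from html.parser import HTMLParser


def robots_txt_status(base_url):
    """Fetch /robots.txt and return its HTTP status code (e.g. 200 or 404)."""
    url = base_url.rstrip("/") + "/robots.txt"
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # HTTPError still carries the status code the server returned.
        return err.code


class _CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag seen."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            d = dict(attrs)
            if d.get("rel") == "canonical":
                self.canonical = d.get("href")


def find_canonical(html):
    """Return the canonical URL declared in the page, or None if absent."""
    parser = _CanonicalFinder()
    parser.feed(html)
    return parser.canonical


if __name__ == "__main__":
    # Illustrative only -- substitute your own domain.
    print(robots_txt_status("https://example.com"))
    page = '<head><link rel="canonical" href="https://example.com/page"></head>'
    print(find_canonical(page))
```

A 404 for robots.txt is usually harmless (everything is crawlable), but a 5xx error or a timeout can cause crawlers to back off, which is why checking the status code comes first in the list above.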
At the end of the day, search engines are in a constant state of flux as algorithms and rules change, so some level of traffic fluctuation over time is normal.
As Luisella points out, if you see any unexpected changes that seem out of the norm, check for malicious attacks first; if all seems okay, go back to basics with your site structure for the answers. Google’s Webmaster Center is a great place to start your research.
If you’ve encountered malicious or architecture issues with your site that have caused major search engine issues, share your stories and solutions below.