Ways To Combat Panda Penalisation

By Kev Strong | April 13, 2011 at 8:56 pm | SEO

With the recent Panda update, the usual scaremongering has reared its ugly head, along with the "gurus" overusing the term to appear knowledgeable and "down with the kidz" in business circles. Implemented to remove websites that returned low-quality content, whether via User Generated Content (UGC), duplicate content or similar, the Panda update cleaned up a lot of search results where UGC websites dominated. However, several websites have been caught up in the algorithmic change and are already experiencing a drop in exposure.

Whilst the majority of websites affected on the day of roll-out in the UK were big, national websites, the potential for some websites to be hit with Panda penalisation is still very real.  Simple problems caused by the bad practices of their website developers and/or their SEOs can create a myriad of headaches, resulting in "false positives" when under the Panda spotlight.

To combat Panda penalisation, there are a few things you can do to tighten up your website and ensure you aren't giving off any false signals.

Monitor Indexation

One problem regularly encountered when monitoring ranking fluctuations is duplicate content.  Google gets confused by your website and will change which pages it decides to rank for specific phrases when it bases the decision solely on on-page content. This regularly happens with freshly launched domains or website migrations.

Checking Indexation using RankTracker

The cause can be something as simple as Google indexing a printer-friendly version of your page, or something more extreme such as variations in URL capitalisation, with both versions returning a 200 (OK) server response code. The solution?

Canonical Link Element

With the introduction of the canonical link element (often referred to as the "canonical tag"), it is now much easier to control duplicate content on your website.  An example canonical tag looks like this:

<link rel="canonical" href="http://www.kevstrong.com/search-engine-optimisation/change-domains-and-retain-your-pagerank-in-under-2-hours/"/>

This ensures that you can serve pages as they would normally be shown on the website whilst telling the search engines which page to index and attribute authority to, all without impacting the user experience.
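As a brief sketch of how this plays out with the printer-friendly scenario above (the URLs below are hypothetical examples, not taken from this site), the print version of a page would carry a canonical element pointing back at the main version, so only the main version is indexed:

<!-- served at http://www.example.com/some-article/print/ (hypothetical printer-friendly URL) -->
<link rel="canonical" href="http://www.example.com/some-article/"/>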

Check For Duplicate Markup

As an SEO you should be doing your utmost to make every page of your website unique and relevant to the content on that specific page. Avoiding duplicate markup should be at the top of your on-page priority list, and regular checks will help you tighten up your website.

There are two ways to do this – both free.

Search Engines

Use the site: operator in Google to easily spot duplicate Title/Description tags.
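For example (these queries are illustrative, using this blog's own domain), a plain site: query lists every indexed page so repeated titles stand out, and combining it with the intitle: operator narrows the list to pages sharing a specific title:

site:www.kevstrong.com
site:www.kevstrong.com intitle:"Ways To Combat Panda Penalisation"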

Use Site: in Google

Spider Tool

Use a spider tool to analyse the on-page tags and elements of your pages.  I personally love Screaming Frog for this as it will also show you canonical tags for those pages and will easily export to Excel for further analysis.

Screaming Frog SEO Spider

Redirect All URLs To One Core Domain

Multi-lingual serving aside, your domains themselves can cause complications. Sometimes your website serves up different versions of your URLs, creating full-site duplication. Had this been a problem with this blog, for example, http://kevstrong.com would be classed as a different website to http://www.kevstrong.com.

If you have not already implemented canonicals on your website, installing a simple site-wide 301 redirect solves this problem.

Also ensure that any other URLs you have serving up your website (hyphenated variations etc.) are redirected using a 301 (permanent) redirect, so that only one version of your site exists in the search index.
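As a minimal sketch (assuming an Apache server with mod_rewrite enabled, and using this blog's domains purely as an example), a site-wide non-www to www redirect in .htaccess could look like this:

# Send any request for the bare domain to the www version with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^kevstrong\.com$ [NC]
RewriteRule ^(.*)$ http://www.kevstrong.com/$1 [R=301,L]

The equivalent can be configured as a server-level rule on other platforms; the key point is that the redirect is permanent, so the search engines consolidate the two versions into one.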

Change Your Product Descriptions

For large online retailers it is hard to create thousands of pages of content for products. However, one problem that many websites bring on themselves is using the content provided by the product manufacturers: the same information supplied to thousands of other websites. This is the prime reason several affiliate and coupon websites took a big hit in the Panda update.

Write unique descriptions for as many products as possible (100% of them if you have the manpower), and be creative with how you present the specifications (height, width, dimensions).

Use JavaScript When Serving UGC or Web Applets

You could choose to serve elements of your website that are generated by users (such as comments), or plugins that appear throughout your site, using JavaScript. This ensures that you cannot be maliciously targeted for duplicate content by spammers (although the likelihood of this being the basis for a penalty is slim to none) or penalised for showing the same content on every page.  A side benefit is increased site speed (a good example of this is thesun.co.uk).
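As a sketch of the approach (the /comments.html endpoint and the element ID are hypothetical examples, not from this post), the comments can be fetched and injected after the page has loaded, so the crawlable HTML contains only the unique page content:

<div id="comments"><!-- populated by JavaScript after the page loads --></div>
<script type="text/javascript">
// Request the user-generated comments separately so they are not part of the page source
var xhr = new XMLHttpRequest();
xhr.open("GET", "/comments.html", true);
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 200) {
    document.getElementById("comments").innerHTML = xhr.responseText;
  }
};
xhr.send();
</script>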

Thesun.co.uk with JavaScript disabled/enabled UGC

Obviously, there are elements outside of your control when it comes to the Panda update, but if you keep your own house in order then you are at least controlling what you can.

 

About the Author

Kev Strong

Kev Strong is an online marketing consultant at Newcastle upon Tyne-based digital marketing agency Mediaworks. A lover of all things search and an ex-web developer, Kev Strong (a.k.a. Goosh) is a specialist in advanced search engine optimisation.

One Comment

  1. Copywriting Newcastle (3 years ago)

    One thing I've not been doing is redirecting my http:// to my http://www. I'm going to add this to my list of SEO to-dos. Thanks for the details.
