
Websites from a bad neighbourhood

Published: 7 October 2012 

Before planning an SEO strategy for a website, it is important to get a clear picture of its history. Whether that history is good, bad or somewhere in between, Google does not forget, and it must be taken into account before planning a strategy; otherwise the campaign could become an uphill battle. The first place to look is Google Webmaster Tools, to check whether there have been any warnings, crawl errors or malware issues that could prevent the website from ranking.

Google views the internet as one big popularity contest where votes come in the form of followed links and social shares. One vote from a trustworthy source could be worth 200 votes from less trusted sources. Issues arise when these votes appear manipulative to the extent that they violate Google’s Webmaster Guidelines. A common example is a large number of linking domains that reside on the same hosting account and are registered under the same name. Sometimes this is done without any intention of gaming the system, but the domains can still incur penalties. This information is easy to find by doing a WhoIs search for admin details and using Netcraft to find domains that sit on the same netblock.
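
As a rough first pass before reaching for Netcraft, you can resolve each linking domain yourself and group them by netblock. Here is a minimal Python sketch of the idea; the domain names are placeholders standing in for a real backlink export, and grouping on the first three octets is only a crude approximation of a shared /24 netblock:

```python
import socket
from collections import defaultdict

# Placeholder list of linking domains, e.g. pulled from a backlink report.
linking_domains = [
    "example-blog-one.com",
    "example-blog-two.com",
    "example-directory.net",
]

def netblock(domain):
    """Resolve a domain and return its approximate /24 netblock."""
    ip = socket.gethostbyname(domain)
    return ".".join(ip.split(".")[:3]) + ".0/24"

groups = defaultdict(list)
for domain in linking_domains:
    try:
        groups[netblock(domain)].append(domain)
    except socket.gaierror:
        pass  # domain no longer resolves

# Netblocks hosting several linking domains are worth a closer look.
for block, domains in groups.items():
    if len(domains) > 1:
        print(block, domains)
```

Several linking domains resolving into the same small block is not proof of manipulation on its own, but it is exactly the kind of footprint worth cross-checking against the WhoIs records.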

Another common issue to be aware of is article spam. The Penguin algorithm was initially rolled out on April 24, 2012, with subsequent refreshes on May 26 and October 5. Also known as the webspam algorithm, it mainly targets over-optimised anchor text from untrustworthy sources. You can work out whether a website has been getting these types of links by checking the source of its backlinks and looking at the anchor text they use. I recommend viewing backlinks with Open Site Explorer, although this is also possible with Ahrefs. These tools can be used to draw conclusions about whether there have been any issues, or whether any are likely in the near future. An informative Penguin-related article was posted by John Doherty from Distilled, where he explains an effective way of diagnosing whether a website has been affected by the algorithm. It can be read here.
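
Both tools can export backlinks along with their anchor text, which makes it straightforward to tally the anchor distribution yourself. Below is a small sketch of that tally, assuming a CSV export with an anchor_text column; real Open Site Explorer and Ahrefs exports use their own column headings, so adjust to match:

```python
import csv
from collections import Counter

# Assumed export format: one row per backlink, with an "anchor_text" column.
anchors = Counter()
with open("backlinks.csv", newline="") as f:
    for row in csv.DictReader(f):
        anchors[row["anchor_text"].strip().lower()] += 1

total = sum(anchors.values())
print("Top anchors by share of the link profile:")
for anchor, count in anchors.most_common(10):
    print(f"{anchor!r}: {count} links ({count / total:.1%})")
```

A profile dominated by exact-match commercial anchors, rather than brand names and bare URLs, is the pattern most associated with Penguin.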

Being aware of previous link building is also a good way of anticipating future drops in rankings, and we keep track of websites other than our clients’ to see if our theories hold. Although unproven, a common pattern is that building high volumes of low-quality links in a short period of time produces diminishing returns in rankings. Low-quality links tend not to last long, either because they are removed or because the pages they sit on become de-indexed. The backlink history tool from Majestic SEO is a useful way of seeing whether this has been the case. Here is an example of a website that we keep track of that isn’t a client…
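
If the monthly new-link counts can be exported, spotting these bursts can be automated. The sketch below assumes a hand-built CSV with month and new_links columns taken from a backlink history chart such as Majestic SEO’s; the window size and spike threshold are arbitrary choices for illustration, not published rules:

```python
import csv

# Assumed CSV format: one row per month, columns "month" and "new_links".
months, counts = [], []
with open("backlink_history.csv", newline="") as f:
    for row in csv.DictReader(f):
        months.append(row["month"])
        counts.append(int(row["new_links"]))

# Flag months where new-link volume exceeds 3x the trailing average --
# a crude proxy for short bursts of high-volume link building.
WINDOW, SPIKE = 6, 3.0
for i in range(WINDOW, len(counts)):
    trailing = sum(counts[i - WINDOW:i]) / WINDOW
    if trailing and counts[i] > SPIKE * trailing:
        print(f"{months[i]}: {counts[i]} new links vs trailing avg {trailing:.0f}")
```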

There are other issues to watch out for, such as duplicate content, both internal and external. This post goes into more detail about how to fix duplicate content issues both onsite and offsite. Getting a clear understanding of a website’s history shows what direction to take and helps to prevent any nasty surprises in the future.

Ben Maden
