On 24 April 2012, Google released a major algorithmic update, known as ‘Penguin’, which could change the face of link building within the SEO industry forever. Initially thought to affect about 3% of queries, subsequent releases may affect up to 10% of all search queries (overlaps are likely to exist, so the real figure is probably closer to 6%).
This was the first time that mainstream news covered a significant Google algorithm change, with coverage in The New York Times, The Register in the UK and The Age in Australia.
The previous rounds of significant algorithm updates, dubbed Google Panda, primarily targeted on-site factors, including low-quality or duplicated content, high ad-to-content ratios and affiliate sites.
The Penguin update was designed to target suspicious link profiles, and many genuine businesses that relied solely on organic search traffic to generate revenue were impacted, seeing a reduction in organic traffic as a result of the update.
Google takes a number of factors into account when assessing the value of a link profile and the good faith in which it was created, including but not limited to:
- Anchor text distribution:
  - Keyword-focused terms
  - Brand terms
  - Generic ‘click here’ or naked URL terms
- Link velocity (how quickly a site gains links over time)
- Link quality and distribution
Sites found to be in violation of the Penguin formula saw significant ranking decreases almost instantaneously; a poll run by SEO Roundtable showed that 66% of participants saw a decline in organically generated search traffic.
Webmasters were encouraged to clean up their link profiles by contacting the sites providing links that contravened Google’s quality guidelines and requesting removal; however, the majority had little success, as many of these low-quality sites do not have the resources to manage such requests. The fact is, getting links removed from websites that are on autopilot is near impossible, as either contact details are not available or emails go unread and unanswered. Many made their best attempt and applied for reconsideration, only to be summarily denied by Google.
In June this year, Bing was the first search engine to release a ‘disavow’ tool. The tool is a seemingly easy way to request that these low-quality links not be counted when ascertaining the value of a website for ranking purposes. The Bing disavow web interface is fairly easy to use: simply select whether you want to disavow content from a page, site directory or an entire domain, enter the offending location and request the disavow.
Bing currently has a fairly low footprint in Australia, typically referring 3-7% of organic search traffic, and consequently the majority of search agencies paid little attention. What search consultants and businesses were crying out for was a disavow tool from Google.
The nervousness businesses were feeling at this time was compounded by a further round of link warnings distributed by Google soon after, on the 23rd of July. The wording was very similar to that which preceded the most recent set of Google penalties and consequently had the Internet in a state of panic. In this instance, even the super-white-hat SEOMoz received a suspicious link warning.
Following months of speculation within the SEO community, Google finally released a tool giving webmasters the ability to request that part or all of their link profiles be disavowed. There are a range of theories about the Google disavow tool, its effectiveness and the suggestion of hidden motives. Some have suggested that the disavow tool essentially crowdsources link-spam detection, allowing webmasters to report directly which links they believe to be low quality or spammy.
By declaring a portion of a link profile to be low quality or unethically generated, webmasters could effectively be putting their hand up to having historically done something wrong. Some webmasters are concerned that whilst Google may mitigate some of a Penguin-inspired penalty, there will always be a copy of a link-building ‘criminal record’ that could, in the future, be used in character judgements against a site and cause further issues. If faced with the need to repair a link profile, a sensible order of operations would be to:
- Attempt to outweigh negative links by acquiring a larger number of links from high-quality sites
- Continue to reach out to websites to request that low-quality links be removed
- Remove the affected pages from your site (404)
- Use Google’s disavow tool
I have not had a client receive a Google penalty from any of the Google algorithm updates and consequently have not had to go through the procedure of link profile remediation; however, our SEO team has discussed how we would tackle such an issue.
Reverse Link Outreach
Start with a good quality link database. My personal preference is Majestic SEO; however, Ahrefs and Open Site Explorer will work just as well. Create the most granular report possible for your domain: make sure it covers the root of your domain and generates the largest possible dataset (historical URLs). Download the database in CSV format and create a pivot table across all the data.
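As a rough illustration of that step, the following pandas sketch loads the export and builds the pivot. The backlinks.csv filename and the "Source URL", "Target URL" and "Anchor Text" column names are assumptions; each tool labels its export differently, so adjust them to match your download.

```python
import pandas as pd

# Load the full backlink export downloaded from your link database
links = pd.read_csv("backlinks.csv")

# Count how many source URLs point at each destination URL / anchor text pair
pivot = links.pivot_table(
    index=["Target URL", "Anchor Text"],
    values="Source URL",
    aggfunc="count",
).rename(columns={"Source URL": "Links"})

# Sort so the most heavily linked pages and anchors appear first
pivot = pivot.sort_values("Links", ascending=False)
print(pivot.head(20))
```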
Build your pivot to sort by destination URL, anchor text and, ultimately, the source URL. Locate the URL on your site that is receiving the largest number of external links and then analyse the anchor text weighting. If any one anchor text, other than brand terms, is responsible for more than 10% of the links, it is likely that there is some link profile manipulation; it is generally not natural behaviour for multiple sites to link to a site using the same non-brand anchor text.
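Continuing the same hypothetical export, this is one way to flag non-brand anchors that carry more than 10% of a page's links; the brand_terms list is a placeholder for your own brand names.

```python
# For the page receiving the most external links, work out what share
# of those links each anchor text accounts for.
top_page = links["Target URL"].value_counts().idxmax()
page_links = links[links["Target URL"] == top_page]

# Percentage of this page's links carried by each anchor text
anchor_share = page_links["Anchor Text"].value_counts(normalize=True) * 100

brand_terms = ("acme",)  # placeholder: replace with your own brand name(s)
for anchor, share in anchor_share.items():
    is_brand = any(term in str(anchor).lower() for term in brand_terms)
    if share > 10 and not is_brand:
        print(f"{anchor!r} carries {share:.1f}% of links to {top_page}")
```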
Review each source URL and assess whether the link looks to have been organically generated or manually created. Paid links (mainly blogroll or blog network) and mass article/directory submissions are the most common forms of link profile manipulation. If the link doesn’t look above board, contact the webmaster and request that the link be removed or given a rel=”nofollow” attribute; the latter effectively asks Google not to count the link when determining the value of a site.
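To keep track of how that outreach is going, a small script along the following lines can re-check each source URL and report whether the link is still live, has been removed, or now carries rel="nofollow". It assumes the requests and beautifulsoup4 packages; example.com and the sample source URL are placeholders.

```python
import requests
from bs4 import BeautifulSoup

MY_DOMAIN = "example.com"  # placeholder: your own domain

def check_backlink(source_url):
    """Return 'followed', 'nofollow', 'removed' or 'unreachable' for a source page."""
    try:
        html = requests.get(source_url, timeout=10).text
    except requests.RequestException:
        return "unreachable"
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        if MY_DOMAIN in a["href"]:
            rel_values = a.get("rel") or []
            return "nofollow" if "nofollow" in rel_values else "followed"
    return "removed"

# Example usage with a hypothetical source URL from your export
print(check_backlink("http://some-article-directory.example/page-about-widgets.html"))
```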
Attempting to outweigh low quality links with high quality links
The Google filters are based on characteristics of an unnatural link profile, including the ratio of high-quality, authoritative links to those of lower quality. Rather than removing links from the low end of the spectrum, which often has a very low success rate, why not focus your efforts on increasing the number of high-quality links you have? It usually takes just as much time to gain new high-value links as it does to remove old low-quality ones, but the flow-on effect is an improvement in overall link quality, which should increase your site’s ranking in the Google SERPs as well as increase brand awareness and deliver additional referral traffic from third-party sites.
404ing Penguin affected pages
If the issue appears to affect only a select few pages, simply telling Google that those pages no longer exist may be a way of escaping the penalty. Obviously this isn’t a solution for site-wide issues or home-page link profile shortcomings, but there are instances of this being effective.
Simply take the content from the old Penguin-affected page, rewrite it to some degree and then repost it at a new, clean URL. In place of the old content, serve Google a 404 error, indicating that any negative juju is to cease there. Provide a nice, easy-to-understand 404 page for any visitors legitimately arriving from those links, to help them find other content on your site.
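Purely as an illustration of that pattern, here is how the old and new URLs might be wired up in a tiny Flask app; the routes and page content are invented for the example, and in practice you would achieve the same thing in your own CMS or web server configuration.

```python
from flask import Flask

app = Flask(__name__)

# New, clean URL hosting the rewritten content
@app.route("/widgets-guide")
def widgets_guide():
    return "<h1>Our rewritten guide to widgets</h1>"

# Old, Penguin-affected URL now returns a friendly 404 so the
# penalised link profile no longer points at live content
@app.route("/cheap-widgets-seo-page")
def old_page():
    body = ("<h1>Page not found</h1>"
            "<p>This page has gone, but you may be after our "
            "<a href='/widgets-guide'>guide to widgets</a>.</p>")
    return body, 404

if __name__ == "__main__":
    app.run()
```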
Using Google’s disavow tool
Once again, use your favourite link database to generate the most complete set of backlinks to your site. This time we want to order all links in descending order of link value. Each database uses a different metric for link value; if you’re using Majestic SEO I recommend Source Trust Flow. The first step is to identify which of those links are likely to look low quality to Google. But how do you determine quality? Initially I’d flag anything with a low (0-5) Trust Flow as a likely candidate to be disavowed. Remember that links can be disavowed at both a page and a domain level. Be careful in your selection: some links on a particular domain may be legitimate while others could be perceived as attempts to manipulate the algorithm, and you only want to disavow the manipulative ones.
You can disavow a single URL by including it on its own line in your disavow file. To report an entire domain, use the domain: prefix (similar to the site: prefix in a Google search query).
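Pulling the selection and file-format steps together, here is a minimal sketch in Python. The export filename, the "Source URL" and "Source Trust Flow" column names and the page-versus-domain heuristic are assumptions (each tool labels its columns differently), so treat the output purely as a candidate list to review by hand before uploading.

```python
from urllib.parse import urlparse
import pandas as pd

links = pd.read_csv("backlinks.csv")
links["Source Domain"] = links["Source URL"].map(lambda u: urlparse(u).netloc)

# Flag anything whose source Trust Flow falls in the 0-5 band
candidates = links[links["Source Trust Flow"] <= 5]

page_level = set()    # individual URLs to disavow
domain_level = set()  # domains where every link was flagged

for domain, group in candidates.groupby("Source Domain"):
    if len(group) == (links["Source Domain"] == domain).sum():
        domain_level.add(domain)                 # all links from this domain look manipulative
    else:
        page_level.update(group["Source URL"])   # only some pages are suspect

with open("disavow.txt", "w") as f:
    f.write("# Low quality links flagged for review\n")
    for url in sorted(page_level):
        f.write(url + "\n")                      # one URL per line
    for domain in sorted(domain_level):
        f.write("domain:" + domain + "\n")       # domain: prefix disavows the whole domain
```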
Once you have compiled a list of all offending URLs and domains, save them to a .txt file in Notepad and head over to the Google disavow tool. Log in to your Google Webmaster Tools account and select your domain from the drop-down box.
You’ll be given the opportunity to upload your recently created text file containing all your suspect links and then send it off to Google.
The current advice is to wait 10-14 days for Google to process your list of disavowed links prior to taking any further action. If you haven’t seen any improvements in ranking and traffic in that period, proceed to file a reconsideration request and wait a further 3-4 weeks.
Once again, my recommendation is to use the link disavow tool only as a last resort. Disavowing links that Google has not identified any problems with could have a negative effect on your site and inadvertently flag the website as being in violation of Google guidelines going forward.