Part one of this guide explained the different types of Google penalties, and now you are ready to learn what caused the penalty in the first place, and how to fix it, right?
Finding out you have been hit with a Google penalty is always bad news, but the good news is that it can be fixed, no matter what caused it and no matter what type of penalty it is.
A manual penalty often displays a message in your Webmaster Tools under “Search Traffic > Manual Actions”. Any unnatural, deceptive, or manipulative links detected on your site will alert Google, and you will end up with a message like this on your screen.
If the content on your site is what triggered the penalty, you are likely to see a message like this one:
Algorithmic penalties rarely display a message, so figuring out the problem is left to the webmaster.
This part of the guide will discuss potential penalty-triggering offenses. Not every offense is listed, but the common ones covered here are the most likely causes of your penalty. They give webmasters a good starting point for repairing a penalty, as well as for learning how to avoid one in the future.
Just because you have not been hit with a Google penalty yet does not mean you will not be!
Yes, we know you WANT your site noticed by Google, but in a positive way, right? For now, though, we are going to stay focused on the negative attention your site is getting, or could get, from Google. Around part 7 of this guide, we will help you work on creating a more positive SEO campaign for your page.
Google offenses typically fall into three categories: on-site content, off-site backlinks, and technical code issues on the page.
Armed with the information in this part of the guide, you will be much more effective at removing penalties using the steps in the upcoming parts. We are not trying to drag this out; we want to give every webmaster the chance to fully understand each element, so they can create better websites, increase positive attention, and eliminate all negative attention from their site.
Algorithms can only do so much when it comes to spotting poor-quality content, but you might be surprised at exactly how much they can do. Google’s algorithms notice certain behaviors that are consistent with poor-quality text, such as excessive advertising, plagiarism, and even regular typos. If a site matches a pattern that resembles poor-quality content, it is flagged for a manual review. The algorithm can detect doorway pages, keyword stuffing, and duplicated content, among many other things that will alert the reviewer to the site, so be sure your site does not have any of these triggers.
Check that your content is original by using CopyScape or CopyGator; if it is not, remove it before Google removes you.
Spun content is a black hat SEO technique that Google has trained its algorithms to spot. Spun text reads horribly and rarely makes sense, and Google has no use for it in its results, so remove it from your site.
Algorithms also look at a site’s bounce rate to detect thin content. When Google users click through to your page, fail to find the information they searched for, and return to the results page with record speed, and that happens often enough, Google will flag your site for review.
The days of keyword stuffing are over, thank goodness! Who wants to read a 400-word article that repeats the same word 50 times? Not me, and guess what, not Google either! Instead of going overboard with one or two keywords, use long-tail keywords, synonyms, and other variations of the word. Yes, you want to optimize your headers, text content, and header tags, but try to mix up the wording for a better result.
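One quick sanity check is to measure how often a single keyword repeats relative to the total word count. The sketch below is a hypothetical helper, not a tool Google publishes; any threshold you compare the result against is your own judgment call.

```python
import re

def keyword_density(text, keyword):
    """Return the share of words in `text` that exactly match `keyword`."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A deliberately stuffed example: "cheap" is 3 of the 11 words.
article = "Buy cheap widgets. Our cheap widgets are the best cheap widgets."
print(round(keyword_density(article, "cheap"), 2))  # → 0.27
```

If one word accounts for a large fraction of a short article, that is exactly the repetition this section warns against; rewrite with synonyms and long-tail variations instead.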
Okay, so spun content, duplicated content, thin content, and keyword-stuffed content have all been removed from your site’s visible pages, but if you simply hide the text instead, Google can still see it. Using hidden divs, or setting keyword text to the same color as the site’s background, used to be an effective black hat SEO technique, but Google has caught on to the game, and those still doing it will be penalized.
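You can audit your own pages for these tricks before Google does. The sketch below is a rough heuristic that only scans inline styles for the two tells mentioned above (hidden elements and background-colored text); the background color is a hard-coded assumption here, where a real audit would read it from the stylesheet.

```python
import re

# Assumed page background color; in practice, pull this from your CSS.
BACKGROUND = "#ffffff"

HIDDEN_PATTERNS = [
    r"display\s*:\s*none",             # hidden div trick
    r"visibility\s*:\s*hidden",        # another way to hide an element
    rf"color\s*:\s*{re.escape(BACKGROUND)}",  # text matching the background
]

def find_hidden_text_flags(html):
    """Return the inline-style patterns found in `html` (heuristic only)."""
    return [p for p in HIDDEN_PATTERNS if re.search(p, html, re.IGNORECASE)]

page = '<div style="color: #ffffff">free loans free loans</div>'
print(len(find_hidden_text_flags(page)))  # → 1
```

This only catches inline styles, not rules applied from external stylesheets or scripts, so treat a clean result as a starting point rather than a guarantee.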
The Panda update is working to remove content farms so that better-quality sites can move up the Google ladder. Any user-generated or auto-generated content that exists primarily to capture search traffic or plaster advertisements will raise a red flag on your site. Some of the surviving content farms, such as Squidoo and Hubspot, are only still around because they hold higher standards that meet Google’s requirements for quality content, so if you plan on posting content there, make sure it is the best you can do; otherwise, it risks being removed.
Linking has been a way of gaining higher rankings in Google since the beginning, so why is it such a problem now? Google still looks favorably on websites with high-quality links, but the search engine giant is now examining sites much more closely than before and finding that many of these high-ranking sites have been using poor-quality links to push their rankings.
Part 4 of this guide will walk you through the proper ratio for anchor text. For now, all you need to know is that only a very small share of your links should be an exact match for the keyword you are targeting; anything more does not look natural to Google.
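If you already have a list of your backlinks’ anchor texts (from a link report), you can estimate that share yourself. This is a hypothetical helper with made-up anchor data, not the ratio calculation Part 4 will cover:

```python
def exact_match_ratio(anchors, keyword):
    """Share of anchor texts that exactly match `keyword` (case-insensitive)."""
    if not anchors:
        return 0.0
    matches = sum(1 for a in anchors if a.strip().lower() == keyword.lower())
    return matches / len(anchors)

# Hypothetical anchor texts pulled from a backlink report.
anchors = ["pet supplies", "click here", "example.com", "Pet Supplies", "great store"]
print(round(exact_match_ratio(anchors, "pet supplies"), 1))  # → 0.4
```

A result like 0.4 (40% exact-match anchors) would be far from the natural-looking profile this section describes; branded names, bare URLs, and generic phrases should dominate instead.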
Reciprocal linking is only approved by Google when it makes sense. If your website sells pet supplies, it would make sense to swap links with animal hospital websites, pet adoption centers, and other pet-related sites. What would not make sense is your pet supplies website swapping hundreds of links with sites about cooking, tennis, and Internet gaming.
Renting links is one way webmasters increase their rankings within the search engines, but it is a risky game. The rented link often creates one or more violations, so stay away from renting or buying links.
In addition to the link issues above, the items listed below can also alert Google to your site in a negative way:
Google’s search bots crawl the content on your site and check your code. Many webmasters overlook the fact that their code could be the cause of their Google penalty, but in many cases it does raise a red flag, so use Seositecheckup.com to analyze your code and correct any problems before Google notices them.
An XML sitemap helps inform Google every time you post new content, so even though it is not required, it should be part of your site.
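A sitemap is just an XML file listing your URLs under the sitemaps.org protocol. As a minimal sketch, you could generate one with Python’s standard library; the URLs below are placeholders for your own pages:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap per the sitemaps.org protocol."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc  # the page's full URL
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages; list every URL you want Google to crawl.
sitemap = build_sitemap(["https://example.com/", "https://example.com/blog"])
print(sitemap.count("<loc>"))  # → 2
```

Save the output as sitemap.xml at your site’s root and submit it in Webmaster Tools so Google learns about new content promptly.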
Unfortunately, anyone can report your website to Google, and that means competitors who want to play dirty can as well. The report only flags your site, it does not penalize it, but if Google then finds other issues, even technical ones, you could end up penalized.
Cloaking presents different content to the search engine than what is seen in the browser window. IP-delivery cloaking uses the visitor’s IP address to serve geo-targeted content to specific users. User-agent cloaking serves different content depending on whether the visitor is a regular user or a search engine bot. Cloaking is an old practice, so if it is affecting your site, it most likely was put there a while ago…
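You can spot user-agent cloaking on your own site by fetching the same URL twice, once with a browser User-Agent and once with Googlebot’s, and comparing the responses. The sketch below shows the idea; the comparison is demonstrated offline with canned responses, since live pages can also differ for legitimate reasons (session IDs, ads, timestamps).

```python
import hashlib
from urllib.request import Request, urlopen

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch(url, user_agent):
    """Fetch `url` while presenting the given User-Agent header."""
    request = Request(url, headers={"User-Agent": user_agent})
    with urlopen(request) as response:
        return response.read()

def looks_cloaked(body_for_bot, body_for_browser):
    """Flag a page whose bot and browser responses differ."""
    return (hashlib.sha256(body_for_bot).hexdigest()
            != hashlib.sha256(body_for_browser).hexdigest())

# Offline demonstration with canned responses:
print(looks_cloaked(b"<h1>Keyword spam</h1>", b"<h1>Welcome</h1>"))  # → True
```

A mismatch is only a starting clue, not proof: dynamic content differs between requests all the time, so diff the two bodies by hand before concluding your site is cloaking.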
In Part 3 of this guide, we will discuss how to quickly and effectively collect the data that will show you exactly what you need to change, and HOW.
Need a recap? Part One: Identifying a Google Penalty
Part Three: How to Gather Critical Data on Your Website