
The Day Google Changed Forever


Ever since the infamous Mayday Update, Google’s been using human “quality checkers” alongside its algorithm to ensure websites aren’t gaming their bots.

The goal of these quality checkers? Make sure the algorithm isn’t letting any spam websites slip through the cracks. By visiting websites flagged by the bots as potential spam sites, Google quality raters can give the final word on whether or not a website is valuable.

Of course, we’ve all been interested in what types of guidelines these quality raters go by (since they’re likely similar to what the algorithm targets) and now we finally get the chance to see.

In a document leaked several times over the past few years, Google lays out its guidelines for quality raters. You can browse the entire 160-page guideline at scribd.com, at least until they pull it down!

Although this document doesn’t necessarily contain SEO tips (it was designed for quality raters), we can glean some interesting information on how the Google algorithm works. And no – there’s no “magic bullet” algorithm trick in here. Instead, this document supports our hypothesis: Google’s looking for a quality user experience.

The Quality Rater’s Rating Scale

As we’ve discussed in past SEOSiteCheckup.com articles, relevance is KEY to ranking high on Google.

Your website must provide valuable information on whatever subject it’s targeting. So if your website targets “goldfish” but doesn’t have high-quality information relevant to goldfish, it won’t rank.

But thanks to the document leak, we can now see the five-point scale Google raters use to judge the relevance of a website:

1) Vital

2) Useful

3) Relevant

4) Slightly Relevant

5) Off-topic

So what does it all mean? First, let’s talk about Google’s “Searcher Intent” rule.

What’s the User Searching For?

In the leaked documents, Google talks about how it determines which websites to rank for a specific keyword. Some words have multiple meanings, so the algorithm does its best to surface the most relevant content first.

For example, when someone searches “Amazon,” they could mean a few things. They could mean the website, Amazon.com. Or they might want information on the Amazon Rainforest.

Well, if you do a search for Amazon, you’ll see what Google favors: Amazon.com rankings fill up more than half of the first page. Through substantial testing, Google has determined that its users view Amazon.com as the most relevant result for the keyword “Amazon.”

For another example of this in action, do a search for “Apple.”

What Makes a Site Relevant?

So now that you understand the intent rule, it’s time to discuss how a site is rated on the quality raters’ scale.

Take the highest rating, vital, for example. A vital website is one that must rank in the #1 position on page #1.

For example, if you search the name of presidential candidate Gary Johnson, you get his website at rank #1. Normally, when you search the name of a person, you get their Wikipedia page.

So vital is information that the user is DEFINITELY searching for. If they’re searching for “Target,” the vital result is Target.com.

Important note – just because you have an exact-match domain doesn’t make your website vital. The vital rating is reserved for websites that have the highest possible chance of being exactly what the searcher’s looking for.

Down the line, you have the “useful” rating. Now, this is where quality content comes into play. You could have the best content in the world, but if your website isn’t the vital result for your search phrase, “useful” is the rating you should be aiming for.

A useful website goes above and beyond relevance. Not only does a useful website provide relevant content, but it provides additional value to whoever’s reading it. “Useful” websites will have some type of visitor interaction, top-quality information, and a thriving community.

Conclusion: What this Means for SEOs

Really, this document doesn’t teach us brand new information. We’ve always speculated this was how Google handled rankings – all this document does is support our original ideas. Google cares about quality and relevance. 

It’s not enough just to have relevant information anymore. No, you need to be providing value with your content. Enough value that your website could receive a “useful” rating (if “vital” is impossible). 

The first page of Google is taken up by “vital” and “useful” websites. By putting your visitors first, you dramatically increase the odds that you’ll be seeing your own website on page #1.

So, Do We Still Optimize, and How Much is Too Much?

There’s been a lot of buzz lately about over-optimization. Ever since Panda 3.4 took down several major blog networks, Google’s been hinting at additional penalties coming to those who over-optimize their websites.

It’s a big problem, too. Many ambitious SEOs learn the basics of SEO and over-optimize their websites. They begin to see improvements, so they continue to optimize. And it’s not only ambitious beginners. No, there are entire SEO companies built on artificially inflating results by over-optimizing.

In future updates, expect to see waves of penalties hit over-optimized websites. Rumor has it one of these updates is right around the corner. Is your website in the crosshairs? Maybe. In today’s article, we’ll be talking about sites that are at risk and how you can protect yourself before it’s too late.

Creating Sites for Humans – Not Robots

Before we get into the nitty-gritty details, it’s worth mentioning that virtually all of these problems come from creating a website for robots – not humans.

If you over-optimize your website, it’s going to be ugly. It’s going to be littered with odd keywords and paragraphs that don’t make any sense. Additionally, the titles of your pages and the title of your site will make little to no sense to the human mind. To Google’s robots, however, your website may make a lot of sense. But since Google’s bots are trained to penalize sites whose content is written for search engines rather than people, you don’t want your site optimized for the Google bots.

So when you create your content, ask yourself, “is this written for humans or for the Google bots?” Keep this idea in mind as you read on.

Keyword Spam

This is one of the biggest problems facing beginner SEOs. Look at the following paragraph and see if you can spot the problem:

Welcome to Accounting Services Dallas. We offer the best accounting services in Dallas. If you find yourself in need of accounting services, don’t go anywhere else – come to accounting services Dallas!

This is a bit of an exaggeration, but you’ve probably seen this before. This is called “keyword stuffing” and it is a HUGE red flag to Google bots.

With keywords, more is NOT better. No, the best way to rank for a keyword is to use it as naturally as possible. So use it when it makes sense, and only when it makes sense. Instead of mentioning your main keyword 10 times per paragraph, use your main keyword once, but fill in related LSI keywords in a way that makes sense to the reader.
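To make the idea concrete, here is a rough sketch of how you might measure keyword density yourself. The function name, the threshold intuition, and the use of the Dallas example are all illustrative assumptions, not anything Google has published:

```python
import re

def keyword_density(text, keyword):
    """Rough estimate: share of the words that belong to the keyword phrase.

    This is an illustrative heuristic only, not Google's actual metric.
    """
    words = re.findall(r"[a-z']+", text.lower())
    kw_words = keyword.lower().split()
    n = len(kw_words)
    if not words or n == 0:
        return 0.0
    # Count non-overlapping-style matches of the phrase in the word stream.
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == kw_words)
    return hits * n / len(words)

stuffed = ("Welcome to Accounting Services Dallas. We offer the best "
           "accounting services in Dallas. If you find yourself in need of "
           "accounting services, don't go anywhere else - come to "
           "accounting services Dallas!")
density = keyword_density(stuffed, "accounting services")
print(f"{density:.0%} of the words belong to the target phrase")
```

Run on the stuffed paragraph above, roughly a quarter of all words turn out to belong to a single two-word phrase, which is exactly the kind of unnatural pattern a reader (or a bot) notices immediately.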

Multiple Pages with Similar Titles

This is another “trick” you’ll see fairly often: SEOs creating multiple pages with very similar keywords. So for our Dallas site, you may see a page titled “Dallas Accounting Services.” Then another page is called “Dallas CPA Services.” Then “Dallas Tax Services.” All these pages could easily be combined into one. And that’s exactly what human readers would prefer to read.

No one wants to read several pages that all contain the same information on slightly different keywords. And let’s face it – if you’re writing content on each of these keywords, the content is going to be garbage. It’s virtually impossible to create high quality unique content for slightly varying keywords.

Watch the Anchor Text

When most SEOs find their “main keywords,” they get tunnel vision. This is because the main keywords will always have the most potential traffic. So for both internal and external linking, these SEOs will use the main keyword as their anchor text virtually every single time.

So Google sees the site has a few hundred links. Great. Wait, what’s this? All the links have the anchor text “accounting services Dallas?” Instant red flag. On top of this, some SEO’s point all these same anchor text links at their homepage. That’s a HUGE red flag.

See, if these were natural links, they would have varying anchor text. And they wouldn’t all be pointing to the homepage!
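A quick way to audit your own link profile for this pattern is to tally how often each anchor text appears. The sketch below assumes a hypothetical list of (anchor, target) pairs and a made-up 50% warning threshold; neither reflects any known Google cutoff:

```python
from collections import Counter

# Hypothetical backlink profile: (anchor text, target URL) pairs.
links = [
    ("accounting services Dallas", "https://example.com/"),
    ("accounting services Dallas", "https://example.com/"),
    ("accounting services Dallas", "https://example.com/"),
    ("Dallas CPA firm", "https://example.com/services"),
    ("example.com", "https://example.com/"),
]

# Tally anchors case-insensitively and find the most common one.
anchors = Counter(anchor.lower() for anchor, _ in links)
top_anchor, top_count = anchors.most_common(1)[0]
share = top_count / len(links)

# 50% is an arbitrary illustrative threshold, not a documented limit.
if share > 0.5:
    print(f"Warning: {share:.0%} of links use the anchor '{top_anchor}'")
```

In this toy profile, 60% of the links share one exact-match anchor and most point at the homepage, which is the unnatural footprint described above. A natural profile would show many different anchors (brand name, bare URL, generic phrases) spread across many pages.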

Content is King

Lately, we’ve been talking about the growing role content plays in SEO.

The articles on this website are written to tell you what we DO know about Google, as well as to share our speculations about what Google might do in the future.

We can’t know everything – especially after the latest algorithm overhaul. After extensive testing and trial-and-error, though, we’ve been able to determine a framework for getting real results in the search engine rankings.

Today’s article will serve as an overview. By reading this article, you’ll understand exactly what you need to do on your website to increase your rankings.

The Game has Changed

If you asked an SEO expert last year how to get to page #1 on Google, they’d tell you to build backlinks.

And that’s about it.

Yes, we knew there were other determining factors. Testing showed us that backlinks played more of a role than anything else for SEO, though, so building backlinks became the primary SEO strategy.

A lot can change in a year. Although backlinks have long been the cornerstone of any good SEO campaign, now virtually all SEO experts will agree: content is the most important aspect of your SEO campaign.

After content creation, the line becomes a bit blurry. It’s true building backlinks is still important, but it’s becoming more and more clear that building a strong “social” following may be even more important.

This article isn’t about speculating, though, so let’s move right along… 

The New Content Strategy – Give the Readers What They Want

Since Google was founded more than a decade ago, there has been a constant battle between search engine optimizers and Google.

Through testing different factors and strategies, we were able to determine what kinds of factors Google took into consideration when ranking websites.

When doing this, though, many internet marketers missed the point of what Google is looking to do: provide their users with quality content.

Yes, it’s been over ten years. This fact hasn’t changed. Google STILL wants to deliver the best content to their users.

So why, then, do the vast majority of internet marketers skimp on quality content? Why is the most common article length 500 words? Why do people outsource content to foreign countries and get back articles that are hardly legible?

It’s simple – these people are looking for a “quick fix.” They want page #1 yesterday.

Unfortunately for them, with the Panda Update, it seems like Google’s finally figured out how to determine rankings based on the quality of your website.

Here’s what we know:

• In general, websites with longer content (1,000- to 4,000-word articles) rank better than websites with 500-word articles.

• Google knows if your website provides value. By looking at bounce rates and average time spent on your website, Google can determine whether or not people are finding the information they need.

• After the Panda Update, it would appear Google takes around 30 days to update your website’s rankings based on any changes you’ve made.

• Google’s leaning towards ranking websites higher if they have a large following on Twitter, Facebook, or other social media platforms.

• Links are still important, especially if you know how to get high quality and relevant links. The days of spam appear to be over, though.

• Google checks the ratio of content to advertisements on your website. If you have more ads than you have content, don’t expect to be ranking very well.
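The bullet points above can be folded into a simple self-audit checklist. To be clear, this is a toy sketch of the signals listed in this article, with arbitrary thresholds of our own choosing; it is in no way Google’s actual scoring:

```python
def content_health(word_count, bounce_rate, ad_blocks, content_blocks):
    """Toy checklist built from the signals above.

    All thresholds are illustrative guesses, not documented Google limits.
    """
    issues = []
    if word_count < 1000:
        issues.append("content shorter than ~1,000 words")
    if bounce_rate > 0.7:
        issues.append("high bounce rate suggests visitors aren't finding value")
    if ad_blocks > content_blocks:
        issues.append("more ad blocks than content blocks")
    return issues

# A hypothetical thin-content page: short article, high bounce rate, ad-heavy.
issues = content_health(word_count=500, bounce_rate=0.82,
                        ad_blocks=4, content_blocks=2)
print(issues)
```

A page like the hypothetical one here trips all three checks, which lines up with the overall theme: each signal on its own is weak, but together they describe a site that isn’t putting its visitors first.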

By looking at what we know, we can start to form a conclusion on what Google’s currently looking for.

Google likes websites with longer content. Longer content usually means more value being shared through your website. 

If you have long content, a low bounce rate, a large following on social media, a good amount of relevant and high-quality links, and a good content-to-advertisement ratio, you will rank well, usually on page #1.

What are all these factors symptoms of? A website that provides true value to its readers.

Moving Forward

During the past few years, a lot of internet marketers have fallen prey to the “rank quick” SEO philosophy.

It’s time to go back to the basics. Create a website that provides real value to the people who are visiting. Make sure your content gives readers what they want.

This really is the bottom line. Yes, there are other factors besides content Google is taking into consideration. However, if your content is top quality, these other pieces will naturally fall into place, leading to excellent rankings on Google.

Of course, Google may still have a few tricks up its sleeves. As always, the best way to get your website ranking well consistently is to be informed. Know what Google’s up to and create your content based around those guidelines.

 


