Unique SEO Content Rules Causing Algorisms Everywhere!

By Nik

Intro – Covered in this article:

  1. Seeing Google’s big picture when it comes to algorithm changes.
  2. How Google’s recent changes reflect a new attitude towards SEOs.
  3. What mechanism Google uses to disqualify duplicate content and how you can protect yourself.

It never stops changing, folks.  Google’s algorithm is the Bugs Bunny to an SEO world filled with Elmer Fudds.  Unless you’re going for some kind of a world record for BS-ing while scratching your head, you probably need a swift change of perspective that will keep you “in tune” with your white whale at all times.

Google’s algorithm will continue to be adjusted over time to fit one simple goal:  return the most relevant, helpful pages for any particular search.  To that end, there has been a dynamic shift in Google’s ability to attain this goal, changing from being a very reactive entity to one that is proactive.  Since there was such a low historical penetration within the internet space of proper SEO practices and a high prevalence of SEO trickery (or SEO mockery from Google’s perspective – note:  geeks hate mockery), Google’s major changes of yesteryear were laden with defense tactics, penalties and black/white thinking.

But what formerly seemed like a very emo view of the world has changed to one that is much more in-line with the hyperbolic hopefulness of generation-Obama.  Google is starting to see that SEO is not an island, but rather a part of a much larger organism of the internet.  This mentality is reflected in a few changes that I’ve seen recently:

Part I: Content is more important than ever before.

Google will always believe that people searching on the internet are, by majority, seeking knowledge and information.  Do you agree?  I personally do still, but also think it’s starting to change towards entertainment and shopping.  I think that regardless of what a person is looking for, they are bound to at least peruse content on a page.  Having unique content (150+ words) is more important now than I can ever remember.  Keeping up is akin to allowing your page to face search users.

Your Next Step: Write content for pages that deserve more face-time on the search engines.

Part II: Duplicate content is devalued more heartlessly than it was in the past.

It’s simply a matter of keeping up with inflation, that’s all!  Google chooses 10 results for the first-page SERP and they’re not going to stop.  After that, results only really matter until we hit the 100th-result mark.  So when your Google SERP says “Results 1-10 of about 1,000,000” or even “… of about 1,000,000,000” – to Google, everything past the first 100 or so is just a waste of space.  They can’t control how much content is being placed on sites, but they do have the freedom to control how many of those pages are pulled into the index – specifically, how much duplicate content is allowed within the index.  I call their control mechanism the “duplicate content threshold” because it makes sense to call it that.

Here’s how I believe it works: in the example below, 675 pages are all constructed based on one original piece of information.  Of those pages, 500 are directly copied.  At the other extreme, only 5 pages truly wrote that content from scratch, with just the original idea in mind.  The duplicates are 100% identical to the original content, while the “from scratch” pages are only 40% similar.

What People Did            How similar to original?    Pages that qualify
Copied original content    100%                        500
Switched words             95%                         150
Re-wrote sentences         85%                         20
Started from scratch       40%                         5
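How might a similarity percentage like the ones above be scored in the first place?  As a rough illustration only – this is my own sketch using Python’s standard-library difflib, not Google’s actual method – you can compare two pieces of text like this:

```python
# Hypothetical similarity score between an original text and a candidate.
# This is NOT Google's algorithm - just a standard-library illustration.
from difflib import SequenceMatcher

def similarity_pct(original: str, candidate: str) -> float:
    """Return a 0-100 similarity score between two texts."""
    return SequenceMatcher(None, original, candidate).ratio() * 100

original = "Unique content helps a page earn a spot in the index."
copied   = "Unique content helps a page earn a spot in the index."
rewrite  = "Writing unique content can help a page get indexed."

print(round(similarity_pct(original, copied)))   # 100 - an exact copy
print(round(similarity_pct(original, rewrite)))  # well below 100
```

A word-switched page would land somewhere between those two extremes – still high enough to look like a duplicate.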

So if your re-writes once passed the test by being less than 90% similar to the content you nearly plagiarized in the past, now the cutoff may be something more like 80%.  It’s a very easy mechanism for Google to seek high quality in an inflating pool of options.  It’s just a matter of Google adjusting a setting on their duplicate content threshold and all of a sudden, they no longer need to house tons of duplicate content.  Below, we see that by changing the threshold from 95% to 80%, they only need to serve up 6 pages – the 5 written from scratch, plus the original – out of everything that was based on that original content.

Your Next Step: Write that new content from scratch and don’t cut corners now, just to get penalized later.

Duplicate threshold    Pages housed by Google    Bandwidth saved (in total worth of pages)
95%                    175 + original            499
90%                    25 + original             674
80%                    5 + original              694
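The threshold mechanism itself is simple enough to sketch in code.  This is purely my guess at how such a cutoff could behave – the similarity buckets and page counts come from the 675-page example above – but it shows why a single setting change can empty so much of the index:

```python
# Hypothetical "duplicate content threshold": pages whose similarity to
# the original stays at or below the threshold remain in the index;
# anything above it gets dropped. Buckets from the article's example.
pages = [
    ("copied original content", 100, 500),
    ("switched words",           95, 150),
    ("re-wrote sentences",       85,  20),
    ("started from scratch",     40,   5),
]

def pages_housed(threshold: int) -> int:
    """Count pages kept in the index (the original is housed separately)."""
    return sum(count for _, sim, count in pages if sim <= threshold)

for threshold in (95, 90, 80):
    print(f"{threshold}% -> {pages_housed(threshold)} + original")
# 95% -> 175 + original
# 90% -> 25 + original
# 80% -> 5 + original
```

Tightening the threshold from 95% to 80% cuts the housed pages from 175 to 5 – which is exactly why cutting corners on re-writes is a losing bet.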

Part III: Do No Evil.  Just Do The Work.

Obviously there’s more to the story, but this is what matters on the content side and it should give you an idea of how you can prepare for the future.  Think like Google and she will like you – build your relationship on trust and understanding and she’ll be yours forever.

Your Next Step: Embrace the good and do no evil.
