How Google’s Algorithm Rules the Web Recap – SEO Monday



It’s SEO Monday, and I’m your presenter, Lauren. Today I’ll be recapping Wired Magazine’s detailed report on Google’s most significant additions and adaptations to its algorithm over the years. I’ve cherry-picked the most important information from the article and commented on what the changes mean for site owners and marketers alike.

Let’s start with a condensed history of Google’s search advances:

Backrub (1997): The search engine that morphed into the Google we know today. The breakthrough innovation: ranking results based on the number and quality of incoming links.

The Take Away: Google’s inbound-link calculation has remained a constant over the years and is very much present in today’s algorithm for ranking web pages.

Personalized Results (2005): Users can allow Google to provide individualized results based on their search behavior.

The Take Away: This foreshadowed the personalized search push in the last few months of 2009. A user does not even have to be signed in for search behavior to be collected (via cookies) and used in Google’s results. Search engine marketers braced for adverse effects from this update, but thankfully clients’ businesses have gone on as usual, if not stronger. Personalized search remains controversial, and I anticipate further changes to it as time goes on.

Caffeine (2009): Increased indexing efficiency and bandwidth, made it easier for engineers to add new signals, and improved the user experience with faster results.

The Take Away: The last quarter of 2009 brought several changes at Google, most notably Caffeine. These changes have forced us all to take a hard look at website performance, not only to improve potential customers’ experience but also to help pages rank better, since site speed is widely speculated to be among the 200-plus signals in the ranking algorithm.

Real-Time Results (2009): If you can’t beat ’em, join ’em! Google integrates Twitter, Facebook, news, and blog results into the prime real estate of the search results page for popular and breaking topics.

The Take Away: Want your five seconds of fame? Google uses signals similar to those for ranking pages to decide which posts to roll into real-time search. Google says [specifically about Twitter] it looks at retweets, followers, and organic versus bot activity to make sure only the best posts make it into the stream.

Absorbing The Competitors:

  1. Facebook – Introduced the notion of trusting friends’ content
  2. Twitter – Introduced the real-time search
  3. Yelp – Returns results by user ratings

Coming up this Year: Around 550 new improvements to its algorithm

Under the Hood: Google’s Algorithm

In 1997, the PageRank concept rated pages based on the number and importance of the links pointing to them. Larry Page and Sergey Brin credit PageRank as the company’s fundamental innovation.

The Take Away: While PageRank is still seen as a “tell-tale” of how much trust and authority a web page has, it should not be a metric that keeps you up at night. Google Webmaster Trends Analyst Susan Moskwa explains why Google removed the PageRank metric from Google Webmaster Tools: “We’ve been telling people for a long time that they shouldn’t focus on PageRank so much; many site owners seem to think it’s the most important metric for them to track, which is simply not true. We removed it because we felt it was silly to tell people not to think about it, but then to show them the data, implying that they should look at it.” Still, PageRank readouts remain installed on millions of toolbars. If there is one thing to take away from the availability of PageRank data, it is that it can flag a penalty (green bar = good; PageRank of 0 = investigation necessary).
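For the technically curious, the link-voting idea behind PageRank can be sketched as a simple power iteration. This is a toy illustration, not Google’s implementation: the graph, the page names, and the iteration count are all invented for this example, and the 0.85 damping factor comes from the original PageRank paper.

```python
# Minimal PageRank sketch via power iteration on a tiny, made-up link graph.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    ranks = {page: 1.0 / n for page in pages}  # start with equal rank
    for _ in range(iterations):
        new_ranks = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all pages.
                for other in pages:
                    new_ranks[other] += damping * ranks[page] / n
            else:
                # Each outgoing link "votes" with an equal share of rank.
                share = damping * ranks[page] / len(outlinks)
                for target in outlinks:
                    new_ranks[target] += share
        ranks = new_ranks
    return ranks

# Hypothetical three-page site: the most-linked-to page earns the top rank.
graph = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
ranks = pagerank(graph)
```

Because "home" receives the most (and strongest) incoming links, it ends up with the highest rank; the ranks always sum to 1, so each page’s score reads as its share of the total "vote."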

Crawling Webpages & Indexing: As Google’s bots pass through web pages, the content is collected and broken down into an index. So how does Google sort through the thousands, if not millions, of pages with similar themes? There are more than 200 signals that help rank indexed pages. Here are a few key staples of the calculation, according to Google:

  1. PageRank
  2. Title Tag [weighed especially heavy early on]
  3. Anchor Text [hyperlinked keywords]
  4. Freshness [of content]
  5. Location [from which you are searching]
  6. Popularity
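To picture how signals like those above might combine into a single score, here is a purely hypothetical weighted blend. Every weight, signal value, and function name below is invented for illustration; Google’s actual formula and weightings are secret.

```python
# Illustrative only: blending several ranking signals into one score.
# The signal names mirror the list above; the weights are made up.

SIGNAL_WEIGHTS = {
    "pagerank": 0.30,
    "title_match": 0.25,
    "anchor_text": 0.20,
    "freshness": 0.10,
    "location": 0.05,
    "popularity": 0.10,
}

def score(page_signals):
    """Weighted sum of per-signal scores, each assumed to lie in 0..1."""
    return sum(SIGNAL_WEIGHTS[name] * page_signals.get(name, 0.0)
               for name in SIGNAL_WEIGHTS)

# Two hypothetical pages: page_a has strong PageRank but a weak title
# match; page_b is weaker on links but matches the query far better.
page_a = {"pagerank": 0.9, "title_match": 0.2, "freshness": 0.5}
page_b = {"pagerank": 0.4, "title_match": 0.9, "anchor_text": 0.8,
          "freshness": 0.9}
```

The point of the sketch is that no single signal decides the ranking: a page with modest PageRank can still outrank a "stronger" page when its other signals line up better with the query.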

The Secret Sauce: Overall Importance [of a page] and Query Specific Relevance

Google’s ranking rests on a truly democratic system in which users “vote” for pages and concepts on the web [by linking one page to another]. This goes back to the Backrub foundation laid at Google’s inception. But there’s another twist to the voting floor – user-generated data is helping identify the results that meet the needs of a given search query. It comes through clicks on results, the words users substitute into a query when they’re unsatisfied with the results, and how queries match up with location (what we know as Personalized Search).

Feeling inspired? Two of Exclusive Concepts’ core services are our search engine optimization service and our managed conversion testing service. We offer free audits for both. Go online or give us a call for more information!

Thank you and I hope you enjoyed today’s installment of Your Daily Concept. We’ll see you tomorrow for PPC Tuesday!