The First Rule of the SEOpocalypse

An incredible amount of discussion erupted over the last few weekends after Matt Cutts left these choice words at SXSW.

What about the people optimizing really hard and doing a lot of SEO. We don’t normally pre-announce changes but there is something we are working in the last few months and hope to release it in the next months or few weeks. We are trying to level the playing field a bit. All those people doing, for lack of a better word, over optimization or overly SEO – versus those making great content and great site. We are trying to make GoogleBot smarter, make our relevance better, and we are also looking for those who abuse it, like too many keywords on a page, or exchange way too many links or go well beyond what you normally expect. We have several engineers on my team working on this right now.

This, of course, has left search engine optimizers preparing for what they expect to be Panda-esque rollouts in the next few weeks and months. Webmasters have every reason to be concerned: many have pointed out that some of Panda's negative signals were ostensibly once considered good, white-hat SEO (like keyword-specific content), and Matt's comments suggest that a site might be "overly optimized" without ever stepping outside Google's Quality Guidelines.

Hindsight is 20/20

Over the last several major Google updates, we have been able to pinpoint "tests" that Google ran in the preceding months. These quality tests are used to measure the impact of each algorithmic rollout on user behavior before committing to it engine-wide. Luckily, since Google has pre-announced its intentions, we have the opportunity to look back at the last few months of anecdotal evidence and try to separate what was happenstance from what were actually Google experiments with these new algorithmic updates. I go through a few of those below.

Keyword Stuffing

Guffaw. Yeah, crazy, right? I don't know of a single SEO these days who thinks, "I should add another keyword to my page, that will surely make the difference!" It has been years since I saw a site dinged for keyword stuffing. Or so I would have said. About 3 months ago, one of our clients, who owns a single-keyword domain (think Golf.com), got dinged in Google for, get this, having that single word on the page too many times. Literally, could you imagine Golf.com losing 100 positions for having the word "Golf" on the home page too often? Well, this is exactly what happened. At the time, I figured it was just a glitch, but it was incredibly strange. The Google Webmaster team confirmed it was keyword stuffing, we removed several of the instances, and a week later we were back in the top 10.
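If you want to sanity-check your own pages, the raw ratio of a target term to total body words is a quick first look. The snippet below is only an illustrative Python sketch; the sample copy and the 5% cutoff are my own assumptions, not anything Google has published.

```python
import re
from collections import Counter

def keyword_density(text, keyword):
    """Fraction of the page's words that are the target keyword."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

# Hypothetical body copy from a single-keyword domain's home page.
body = """Golf lessons, golf clubs, golf courses and golf vacations.
Our golf experts review golf gear so you can play better golf today."""

density = keyword_density(body, "golf")
# The 5% threshold is an arbitrary illustration, not a known Google limit.
if density > 0.05:
    print(f"'golf' makes up {density:.1%} of the copy -- consider trimming repeats")
```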

I wouldn't have given it another thought if Matt hadn't cited "like too many keywords on a page" as a signal. And then it occurred to me: the first rule of the SEOpocalypse is that what Google once learned to "ignore" is now a "signal". Optimizers have become too good at keeping each single ranking factor "below the radar," so instead Google can ramp up its attention on all of the factors but only penalize or devalue sites which violate several of them at once. The old-school SEO techniques you stopped paying attention to may now be part of combined signals used to determine who is actively trying to manipulate the search rankings. I think it is time to stop explicitly pointing out to Google, "hey, these are the keywords we want to rank for." Get rid of those meta keywords tags, pull out the excessive keyword usage in text, and try to act natural.
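To make the "combined signal" idea concrete, here is a minimal sketch of the kind of logic I am describing. The factor names, the soft limits, and the three-flag rule are all invented for illustration; nobody outside Google knows the real factors or thresholds.

```python
# Hypothetical illustration of a combined signal: no single factor trips
# anything on its own, but crossing several soft limits at once marks a
# page as over-optimized. Factor names and limits are invented, not Google's.
SOFT_LIMITS = {
    "keyword_density": 0.04,      # share of body words that are the target term
    "exact_anchor_ratio": 0.30,   # share of inbound links with exact-match anchors
    "meta_keyword_count": 10,     # terms stuffed into the meta keywords tag
    "sitewide_footer_links": 50,  # links repeated in every page's footer
}

def over_optimization_flags(metrics):
    """Return the factors whose measured value exceeds the soft limit."""
    return [name for name, limit in SOFT_LIMITS.items()
            if metrics.get(name, 0) > limit]

page = {
    "keyword_density": 0.05,
    "exact_anchor_ratio": 0.45,
    "meta_keyword_count": 15,
    "sitewide_footer_links": 20,
}

flags = over_optimization_flags(page)
# Any one of these might still be ignored; several together is the kind of
# combined pattern described above (the 3-flag rule is arbitrary).
if len(flags) >= 3:
    print("Likely over-optimization:", ", ".join(flags))
```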

Deindexation is Back

We have a few clients who are particularly aggressive with their SEO tactics. They recognize that ultimately these techniques will get them penalized, but an interesting old friend reared his ugly head a few weeks ago – deindexation. You may have heard the news about Google deindexing blog networks, which may be unrelated, but the timing is difficult for me to ignore, given its close proximity to the deindexing of two of these clients' sites.

There are two potential signals at play here – generated content and paid links. When Google said "level the playing field," I think this is exactly what they are talking about. Generated content and paid link networks give an unfair advantage to sophisticated or well-moneyed webmasters. These types of tactics have been used for years, in moderation, by webmasters with generally reputable sites. Things like generating tons of local pages and only switching out a few words are just not going to cut it anymore – not because those pages are insufficient to rank, but because they will act as a signal of over-optimization.
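Those swap-a-city-name pages leave an obvious footprint: remove the swapped words and the remaining text is nearly identical. A crude vocabulary-overlap check like the sketch below is enough to see it, and Google can obviously do far better. The sample pages and the 80% cutoff are made up for illustration.

```python
import re

def word_set(text):
    """Lower-cased set of word tokens from a page's body text."""
    return set(re.findall(r"[a-z0-9']+", text.lower()))

def jaccard(a, b):
    """Overlap between two word sets (1.0 means identical vocabulary)."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Two hypothetical "local" pages that only swap the city name.
page_a = "Best plumbing services in Springfield. Call our Springfield team today for fast local repairs."
page_b = "Best plumbing services in Shelbyville. Call our Shelbyville team today for fast local repairs."

similarity = jaccard(word_set(page_a), word_set(page_b))
# The 80% cutoff is an arbitrary illustrative threshold, not a known limit.
if similarity > 0.8:
    print(f"{similarity:.0%} vocabulary overlap -- these read as the same templated page")
```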

Google’s Most Wanted: Paid Link Networks

Individuals like myself have long lamented that Google seems to take a special interest in paid links (at least publicly) while allowing unsavory techniques like massive article syndication, forum and comment spam, etc. to go untouched. This is probably because the latter is far easier for GoogleBot to ignore and devalue, given the clear footprints left by those techniques. (If Eppie's LinkDetective can discover them, you know Google can.) Many of you may have received the "Google Webmaster Tools notice of detected unnatural links" recently. Interestingly enough, we found these notices very helpful, because they pointed us to a couple of our clients who still had legacy links from a very popular link network. In fact, the single unifying factor among all of the examples we have seen points back to one popular link network – even when the website just happened to be linked to from one of that network's advertisers.
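As a toy example of what a "footprint" looks like from the outside: link networks tend to reuse the same hosts to link out to many unrelated clients, so the domains linking to you keep linking to the same handful of other sites. The data and the 50% cutoff in this sketch are invented; the point is only how little analysis it takes to surface the pattern.

```python
from collections import defaultdict

# Hypothetical backlink data: referring domain -> the client sites it links to.
backlinks = {
    "articles-host-1.example": {"yoursite.com", "client-a.com", "client-b.com"},
    "articles-host-2.example": {"yoursite.com", "client-a.com", "client-b.com"},
    "articles-host-3.example": {"yoursite.com", "client-a.com"},
    "local-newspaper.example": {"yoursite.com"},
}

# Count how often each third-party site co-occurs with yours across referrers.
co_counts = defaultdict(int)
for ref_domain, targets in backlinks.items():
    for other in targets - {"yoursite.com"}:
        co_counts[other] += 1

total_refs = len(backlinks)
for other, n in sorted(co_counts.items(), key=lambda kv: -kv[1]):
    share = n / total_refs
    # The 50% cutoff is arbitrary; the repeated co-occurrence is the footprint.
    if share >= 0.5:
        print(f"{n}/{total_refs} referring domains also link to {other} ({share:.0%})")
```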

What was most frustrating about this is that the clients who reached out to this network were told, "A lot of our advertisers are receiving this. It is just an automated message, don't worry about it." My opinion is quite different – Google is buying itself goodwill so that when the shit hits the fan in a few weeks, it can say "I told you so."

This does not mean that every site that received the message will get hit; more likely than not, Google is purposefully over-reaching with these notifications to make sure that no one in GWT gets penalized without having received a warning first.

Let me be clear, though: if you are using a link network and got this message in GWT, you should strongly consider removing those links. Consider this a good time to determine which of those links are valuable and which are not, realizing that many of the links Google used to ignore are now part of a combined signal for detecting over-optimization.

It is essential that we start thinking about the combined weight of our activities as SEOs in a post-Panda world. We can no longer look at 100 factors and assume that as long as each one sits slightly below some historical threshold, we are safe. More importantly, we need to expect that Google is willing and able to run analysis for months at a time on a series of machine-learned factors and release punctuated updates.

4 Comments

  1. Eric Scism
    Mar 20, 2012

    I love that G is cutting down on over-optimized sites. Write great content and interlink it really well. Build links, let people find your content, and develop key followers – that's how it should be. Don't try and manipulate the ranking algos, just try and make your site the best it can be and let your customers/visitors promote it.

  2. Michael Martinez
    Mar 20, 2012

    Your last paragraph is the one most people need to pay attention to, I think. Well said overall, of course.

  3. Steve Plaguer
    Mar 21, 2012

    “In fact, the single unifying factor of all of the examples of this that we have seen point back to a single popular link network – even when the website just happened to be linked to from one of their advertisers.”

    So why hint around .. what popular network are you referring to — it may actually help others..

    The other thing .. it’s not always easy to remove links ..

    And thirdly .. what a great way to kill a competitor …

  4. Karen Madson
    Mar 25, 2012

    Great article. I own and operate a small bookkeeping business in Rogers, Arkansas, and have worked to make my site visible. When I heard Google was making changes I was concerned. After reading this article I feel better, knowing nothing I have done will be penalized.

Trackbacks/Pingbacks

  1. SearchCap: The Day In Search, March 20, 2012 | Market 7 - [...] The First Rule of the SEOpocalypse, The Google Cache [...]
  2. #2 Dilbilim ve arama motoru optimizasyonu « Emre Yağlı - [...] And here is Matt Cutts's statement from March 20: [link] [...]
