Building Your Site for Users is Not Enough.
I am not going to make many friends with this post, but I think it is about time someone said it. If your SEO (search engine optimization) firm is telling you to build websites “for the user” and not “for the bots”, fire them. Yes, that’s right. Politely tell them to stuff it, and move along to better things.
There is nothing wrong with building websites for the user, but there is something wrong with doing it in lieu of building websites for the bots – especially when the two are in no way mutually exclusive.
Let’s take a look at a real world example of how a website “built for the user” fails because it wasn’t also “built for the bots”.
We will look at OpenOffice.org, the popular alternative to Microsoft Office. (If you aren’t using OpenOffice, stop reading, go download it now, delete your copy of Microsoft Office, and then resume reading.)
First, as always, take a look at Google’s cache of the site to see if anything is going wrong…
http://www.google.com/search?hl=en&q=site%3Awww.openoffice.org&btnG=Google+Search
Oops. Notice that Google thinks only 20 pages of the site are valuable enough to rank…
http://www.google.com/search?q=site:www.openoffice.org&hl=en&lr=&start=10&sa=N
Why is this? Surely OpenOffice.org is designed for “the user experience”. Shouldn’t that be enough? Let’s delve a little deeper…
http://www.google.com/search?q=site:www.openoffice.org&hl=en&lr=&filter=0
Notice that the site is suffering from the damning “duplicate content penalty”. Could this be, gasp, an issue of designing for users and not bots? Yes. Absolutely. Unquestionably. If your SEO tells you to “just think about what is best for the user”, you too can end up in supplemental hell.
What is occurring is this: in designing such a user-friendly experience, they did not consider how the bot accesses information on the page. For example, let’s look at the Web Style Guide page, one of the pages affected by duplicate content:
http://www.openoffice.org/styles/
The web styles page is the definition of minimalist, information-driven content. However, without unique meta information, and with duplicate information occurring as the first content on the page (the top and side navigation bars and the search box), Google wrongly identifies this page as not being unique enough to rank. In fact, there are 194 lines of code and whitespace before you get to the first snippet of true unique content. The first 116 words read by Google, including the meta-keywords and meta-description, are IDENTICAL to every other page.
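To make that concrete, here is a rough sketch of what a bot sees when it reads a page laid out this way. This is illustrative markup only – the element names, ids, and wording are made up, not OpenOffice.org’s actual source:

  <head>
    <title>OpenOffice.org</title>
    <!-- These two tags are identical on every page of the site -->
    <meta name="description" content="OpenOffice.org: the free, open source office suite.">
    <meta name="keywords" content="openoffice, office suite, free download">
  </head>
  <body>
    <!-- Roughly 194 lines of navigation and search markup come first -->
    <div id="topnav"> ...identical links on every page... </div>
    <div id="sidenav"> ...identical links on every page... </div>
    <form id="search"> ...identical search box on every page... </form>

    <!-- Only here does the page's unique content begin -->
    <h1>Web Style Guide</h1>
    <p>...</p>
  </body>

By the time the bot reaches anything unique, it has already read the same head section and the same navigation it reads on every other page of the site.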
If this page were designed “for the bots”, instead of only with “the users” in mind, the page could look exactly the same but avoid these costly traffic-killers. So, what steps could be taken to make this “for the bots”…
(1) Unique Meta Description and Meta Keyword tags. Identical sitewide tags are one of the most common precursors to duplicate content penalties (a sketch of a page-specific head section follows this list).
(2) Move the top and left navigation bars and the search bar to the bottom of the page in the actual code, then use CSS (position:absolute;) to place them on the page as you see fit (see the second sketch below). This will cause Google to read the page’s unique content first.
(3) Sit down and watch your website escape duplicate content penalty hell.
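For step (1), a head section along these lines would do it. The wording here is hypothetical – the point is simply that no other page on the site shares it:

  <head>
    <title>Web Style Guide - OpenOffice.org</title>
    <!-- Hypothetical page-specific copy, unique to this one page -->
    <meta name="description" content="The OpenOffice.org web style guide: layout, markup, and formatting conventions for project pages.">
    <meta name="keywords" content="openoffice.org style guide, web styles, page layout">
  </head>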
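For step (2), the navigation markup moves to the end of the source and CSS puts it back where the user expects it. The ids and pixel offsets below are made up for illustration, but the visual result can be identical to the current page:

  <html>
  <head>
    <style type="text/css">
      /* Navigation sits late in the source but is positioned back at the top and left */
      #topnav  { position: absolute; top: 0; left: 0; width: 100%; }
      #sidenav { position: absolute; top: 60px; left: 0; width: 180px; }
      #content { margin: 60px 0 0 200px; }
    </style>
  </head>
  <body>
    <!-- Unique content now comes first in the source, so the bot reads it first -->
    <div id="content">
      <h1>Web Style Guide</h1>
      <p>...</p>
    </div>
    <!-- Navigation and search moved to the bottom of the source -->
    <div id="topnav">...</div>
    <div id="sidenav">...</div>
    <form id="search">...</form>
  </body>
  </html>

Nothing changes for the user looking at the rendered page; the only difference is the order in which the bot encounters the markup.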
The “build for the user” movement is for some reason gaining a lot of steam in the SEO community. To be honest, I think it is driven by the group of SEOs who used to succeed by relying solely on keyword density and the general non-competitiveness of the SEO field. Now that things have gotten substantially more difficult, their solution is not to bear down into the code and come up with unique, effective ways to improve your site’s rank, but to tell you to “focus on the users”.
I am telling you now that this is useless crap. Of course your site needs to be tailored to the users, but that is a question to be handled by web designers and web developers, not your SEO.
Let’s be real. Good SEO companies can successfully accomplish both. Spiders are getting smarter, while webmasters are also getting more SEO savvy. Something’s got to give soon. Build your site for SEO first, but PLEASE don’t over-sacrifice the user experience is all I’m saying. What have we learned in the past 2 years? Quality links are 80% of good SEO. Who’s going to link to a bogus spammer site?
That said — you do make some great points… and I enjoyed your article.
Of course you don’t sacrifice the user experience, but that is not the job of your SEO.