In Defense of Hats: White, Gray, Black and Blue
There has been quite a bit of talk lately about the resurgence of gray- and black-hat panel discussions at the SMX Advanced conference held recently in Seattle. I posted a lengthy comment at the tail end of Matt Cutts's post regarding the matter, but I felt that it deserved a little more attention. In short, no advanced SEO conference should shy away from the full gamut of SEO techniques, regardless of the stigma attached.

Black Hat Is Beating You: Regardless of your position on the ethics of black-hat techniques, the simple fact remains that these techniques exist in the wild and are being employed by your competitors. If the only information you gather from a discussion of these techniques is how to identify and out those activities,...
Really Solved: Another Common Site Review Problem
Matt Cutts wrote recently of a common site-review problem. Many sites prefer to store links within drop-down menus (the "select" element). Unfortunately, this nonstandard way of using JavaScript to link to pages within your site is quite difficult for search engines to spider (i.e., spiders like Googlebot have difficulty determining that your JavaScript code is meant to be interpreted as links). Here are a few examples:

http://www.pickwicktea.com/
http://www.evinrude.com/
http://www.yoofi.com

Luckily, there is a pretty easy solution to the issue. Start by creating a DIV tag with traditional text links inside, representing each of the items you would like to appear in your menu. Then just run the JavaScript I have included below to...
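The script itself is cut off in this excerpt, so what follows is only a sketch of the general approach in plain JavaScript: build a select menu out of the crawlable text links and hide the originals from users. The container id "nav-links" is a hypothetical placeholder, not something taken from the original post.

// A sketch only; the original post's script is not reproduced here.
// Assumes a hypothetical container like:
//   <div id="nav-links"><a href="/page1">Page 1</a> ...</div>
// The <a> tags stay in the markup for spiders; users get a drop-down.
function buildSelectMenu(containerId) {
  var container = document.getElementById(containerId);
  if (!container) return;

  var links = container.getElementsByTagName('a');
  var select = document.createElement('select');

  // The first option acts as a label and does not navigate.
  var label = document.createElement('option');
  label.text = 'Choose a page...';
  label.value = '';
  select.appendChild(label);

  // Copy each crawlable text link into the drop-down.
  for (var i = 0; i < links.length; i++) {
    var option = document.createElement('option');
    option.text = links[i].innerHTML;
    option.value = links[i].href;
    select.appendChild(option);
  }

  // Navigate when the user picks an option.
  select.onchange = function () {
    if (this.value) { window.location.href = this.value; }
  };

  // Show the drop-down and hide the plain links from users.
  container.parentNode.insertBefore(select, container);
  container.style.display = 'none';
}

buildSelectMenu('nav-links');

Because the plain text links remain in the page source, spiders can still follow them even though visitors only see the drop-down.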
Google & Matt Cutts Get it Wrong: First Priority Should Be Webmasters
While Matt Cutts promised us that his recent post would be "boring," I believe that it actually raised one of the most important questions regarding search ethics:

"Our highest duty has to be to our users, not to an individual webmaster." (Matt Cutts)

Unfortunately, Matt and the Googlers get this one very wrong. All relationships require some form of give and take, and for a relationship to function properly, consent must be a prerequisite in that give and take. The balance of give-and-take between users and Google is heavily tilted towards the user. In exchange for the possibility of my clicking on an advertisement at some point, with no promises, I can use Google's search. The goods being exchanged here are search results and...
Follow-Up Data on .org vs. .com vs. .net
I wanted to release a few more stats regarding the comparison of Google rankings for .org, .com, and .net. Unfortunately, I do have a sneaking suspicion that over the last day there has been some fluctuation due to a few factors:

- Additional noise as this story has spread across the web
- Google's unintentional indexing of the actual domains and not simply the experimental domains
- Potential "mention" bias from listings on newly registered domain sites (no links, however)

Nevertheless, here are some of the raw numbers coming out of the earliest reporting. These only include the previously revealed experimental subdomains, and not the other domains/subdomains on which we are now running the test. We will not reveal those domains until we have completed all...
Google Indexes Without Links or Sitemaps
I have long been convinced that Google uses its many tools to find and queue URLs for indexing, but I believe I have found indisputable proof. After beginning many of our experiments using our sitemaps technique (using Google Sitemaps to get pages indexed in order to control for the unpredictable weight of individual links), we noticed a trend: when using webmaster tools to get subdomains indexed, the domains themselves kept getting indexed. We painstakingly made sure that no URLs were ever linked to, much less the domains themselves, since stray links can and do compromise the quality of our results. Nevertheless, Google still finds a way. While I think there is no real problem with this practice, it does seem clear that Google will attempt to spider all...
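For reference, the Google Sitemaps protocol (now the sitemaps.org standard) is just a short XML file listing the URLs you want crawled; a minimal example, with a placeholder URL rather than one of our experimental subdomains, looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://test-subdomain.example.com/</loc>
    <changefreq>weekly</changefreq>
  </url>
</urlset>

Submitting a file like this through webmaster tools is enough to get the listed URLs queued for crawling with no inbound links required, which is exactly what makes the technique useful for controlling the link variable in these experiments.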
Google Showing Bias Towards .org TLDs
A bit of history: Of the many uses that Google's webmaster tools offers, I think the most profound is the ability to get sites easily indexed in Google without the use of links. While Google, like many search engines, has long allowed webmasters to "submit" their sites to be spidered, webmaster tools makes this easier while providing more robust data. Essentially, in our endless quest to uncover what is behind Google's ranking algorithm, we finally have a good way to control for "links" in the equation. Links make experimentation difficult: it is nearly impossible to create two identically valued links for the purpose of testing. (This is actually now possible, as well, and we will be using a technique to test out the...