The Virgin’s Guide to Viral Marketing
Since its public boom in 1996, the Internet and social media have mirrored human development in one important way: we’re now in the middle of the awkward, hormone-ridden age of middle school. As sites like Digg, Propeller, and Reddit blossom from niche political communities into girls sought after by acne-ridden social media marketers such as ourselves, we must keep a few things in mind. Like middle school, there will always be the “jock” marketers with epic mustaches, always getting laid first and bullying you around. If you find yourself to be one of these 13-year-olds admiring the upper-lip fur of sites such as Cracked, College Humor, the Huffington Post, or the Onion as you stare at the social networks’ beauties from...
2009 SEOMoz Search Ranking Factors Guide Released
If you haven’t taken the time to check it out yet, head immediately over to the Search Engine Ranking Factors Guide released by SEOMoz. This annual survey of top SEOs across the globe is a quintessential starting point for prioritizing SEO tactics. Key Quotes from Yours Truly: “If Google only ranked the ‘tried and true’, their results would be old and outdated. Recency is a valuable asset when links are hard to come by.” (On-Page Non-Keyword Ranking Factors) “The Link is King. All Hail the Link.” (Page-Specific Link Popularity Ranking Factors) “Any opportunity you have to tell Google explicitly which region your site is designed for — do it. Make their job as easy as possible.” (Geo-Targeting Factors) “The...
Bad News Gives Good Information
Short Post: This morning I became aware that a friend’s site had been banned by Google. My immediate thought was to check whether his site had been banned in the beta sandbox as well. It had. It is interesting to note that the current engine’s and the sandbox’s indexing, or at least their deindexing, are closely tied together.
LinkSleeve Moved to New Server
Excitingly, LinkSleeve, the distributed anti-link-spam service offered for free by Virante, has been moved to a new dedicated server. We have also updated the codebase to use a new database schema that should improve response times and reduce outages.
Weather Specific Backgrounds on Ubuntu with Apache, PHP, Cron and Weather.com
It has long been a goal of mine to create a system for my laptop that lets my background mimic the weather where I live (Durham, NC). When it is raining outside, I want a background of rain. When it is sunny, the sun. So, here is the basic step-by-step guide I used to cobble this together. 1. Install Apache on Ubuntu. (JTirrell makes a good point below for those who don’t otherwise need Apache: “Why install apache at all? php5-cli and just run the php file like a script.”) Open up a terminal and type in the following, entering your password when prompted. Afterwards, you should be able to open http://localhost/ in your browser. sudo apt-get install apache2 2. Install PHP. Open up a terminal and type in the...
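The cron-driven glue the post describes can be sketched as a small shell helper. This is a sketch under assumptions: the wallpaper paths, the `weather.php` script, and the condition strings are hypothetical stand-ins, not the post’s actual code.

```shell
#!/bin/sh
# Sketch: a hypothetical weather.php (run via php5-cli, per JTirrell's tip)
# would print the current Weather.com condition string; this helper maps
# that string to a wallpaper file. Paths and patterns are assumptions.
condition_to_wallpaper() {
  case "$1" in
    *rain*|*shower*) echo "$HOME/backgrounds/rain.jpg" ;;
    *sun*|*clear*)   echo "$HOME/backgrounds/sunny.jpg" ;;
    *cloud*)         echo "$HOME/backgrounds/cloudy.jpg" ;;
    *)               echo "$HOME/backgrounds/default.jpg" ;;
  esac
}

# A cron entry could run something like this every 15 minutes, then apply
# the wallpaper with the desktop's tool (gconftool-2 on GNOME 2-era Ubuntu):
#   WALLPAPER=$(condition_to_wallpaper "$(php /path/to/weather.php)")
#   gconftool-2 --type string \
#     --set /desktop/gnome/background/picture_filename "$WALLPAPER"
```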
The Triviality of On-Page HTML Tag Optimization
I have long speculated that on-page optimization was trivial. It meshed with my understanding of how a suspicious Google engineer may treat the content of a page in relationship to its rankings. Why trust anything a webmaster says about his or her content (keywords stuffed into H1, meta, or bold tags)? Why trust anything on a page that a user won’t get to preview before visiting (anything outside the Title and Meta-Description, by-and-large)? However, despite my speculations, I lacked the data to truly start making conclusions about the usage of keywords in specific tags. Until now. First, I am pleased to say that our micro-experimentation has found similar results to SEOMoz’s macro-experimentation. In particular, their finding that the H1 tag was no...
XML Sitemap Assisted Redirects: Advanced White Hat SEO
One of the most critical times for a site’s rankings occurs when there is a massive shift in URL structure across the site. Unfortunately, this is a common prescription for sites with unruly, multi-parameter URLs. Creating pretty, canonical URLs is easy enough, as is mapping old URLs to new ones with 301 redirects, but preventing duplicate content issues can be problematic. Each page on the web represents a destination that can be reached by links. Theoretically, without XML Sitemaps (or similar forms of direct page submission), there would be no way for Googlebot to find pages that are not connected by links. In our first example image, this site has a homepage and 4 subpages, connected by links, all of which have been cached by Google. Let’s...
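The old-to-new URL mapping the excerpt mentions is typically done with Apache’s mod_rewrite. A minimal sketch, assuming a hypothetical parameterized URL (the directives are standard mod_rewrite; the URL pattern is an invented example, not from the post):

```apache
# Hypothetical example: permanently (301) redirect an old parameterized URL
# to its new canonical form, carrying the id into the new path.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
RewriteRule ^product\.php$ /products/%1? [R=301,L]
```

The trailing `?` in the substitution strips the old query string so the redirect target is a clean canonical URL.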