Virante’s Software List Grows Again

I am proud to announce today the launch of AuthorRank Pro, the first and only tool that allows you to track your AuthorRank, AuthorTrust, and other metrics over time. The tool also helps you find great authors for your own site and great opportunities for writing on other sites. But what I am most proud of is the growing corpus of great software owned and operated by Virante.

Remove’em: The most comprehensive penalty-recovery tool online, which has removed over 1,000,000 bad backlinks from the web.

nTopic: Statistically proven to increase organic traffic when its recommended language is applied across your site.

Penguin Analysis: The Machine-Learned Penguin Risk Detector. Know just how vulnerable you are to the next Penguin update!

AuthorRank Pro: The...

The Most Devious Link Campaign in SEO History

Disclaimer: This article represents my own opinion and not that of my employer, clients, or business partners.

1,792,729

I want you to try to remember that number. One million, seven hundred ninety-two thousand, seven hundred twenty-nine. It is a big number, and it represents the most successful, devious link-building campaign of all time. What makes it successful is quite clear. The number you are looking at is none other than the number of unique websites, according to NerdyData, that have attempted to install authorship markup with links to Google+. Now, I am careful to say attempted because many of them have installed it incorrectly, but the backlink still exists. No doubt the 4,587,474 historical root linking domains “earned” by...
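The markup being counted here is simply a link back to a Google+ profile carrying rel="author". A minimal sketch of how a source-code search like NerdyData's might flag an "attempt" — a deliberately loose substring check, since the post's whole point is that even incorrect installations still leave the backlink in place (the example HTML is hypothetical):

```python
def attempts_authorship(html: str) -> bool:
    """Loosely detect attempted authorship markup: a rel="author"
    attribute plus a link to plus.google.com anywhere in the source.
    Correctness of the installation is NOT checked on purpose."""
    h = html.lower()
    return 'rel="author"' in h and "plus.google.com" in h

# Hypothetical page source with (possibly mis-installed) authorship markup
page = '<a href="https://plus.google.com/1234567890" rel="author">About Me</a>'
print(attempts_authorship(page))  # True

# A page with no Google+ author link would not be counted
print(attempts_authorship("<a href='/about' rel='author'>About</a>"))  # False
```

Because the check only requires the two substrings to co-occur, it counts broken installations too, which is exactly why the backlink count dwarfs the number of sites with working authorship.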

More on Google’s Javascript Handling

Many of you probably noticed my recent post without any substantive content. I was seeking to answer the following questions:

1. Does Google wait for timeouts and display that content in the index? Is that content searchable?
2. How does Google handle content generated at intervals in JavaScript?
3. Will Google index content that is only displayed after an action like a button click occurs?

We now have some pretty solid answers to each question. Does Google wait for timeouts, and is that content searchable? Yes. Google does wait for timeouts and displays that content in the index. That is to say, the content that was displayed after the timeout is included in the search index such that you can find it by searching Google for...

Testing JS: Nothing To See Here

Seriously, I am just testing some stuff out with Googlebot. The JavaScript running on this page should help us know a couple of things:

1. Does Google wait for timeouts and display that content in the index? Is that content searchable?
2. How does Google handle content generated at intervals in JavaScript?
3. Will Google index content that is only displayed after an action like a button click occurs?

It is worth pointing out that it appears Google is still asynchronously parsing JavaScript. This page was almost instantly indexed by Google, but the JavaScript-generated content has not been parsed. the marker blue wax among elephant made candle popular kids

Should we move to an all HTTPS web? No.

Joost de Valk has started a great discussion about HTTPS everywhere over at his blog, and it is well worth the read; however, I believe he has come to the wrong conclusions. The discussion was spurred on by Bing’s apparent move to HTTPS, which would influence the passing of referrer data, and subsequently keyword data, to webmasters from search queries. It is worth noting that, as of this writing, Bing’s HTTPS version is not working and Bing has made no announcement of a move. Much of the discussion in the SEO community revolves around Section 15.1.3 of RFC 2616, which states that... Clients SHOULD NOT include a Referer header field in a (non-secure) HTTP request if the referring page was transferred with a secure protocol. Subsequently, as a user...
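The RFC 2616 rule can be reduced to a one-line decision: the Referer header is suppressed only on the secure-to-insecure downgrade. A minimal sketch of that rule, ignoring browser-specific referrer policies and looking only at the schemes involved:

```python
def sends_referer(referring_scheme: str, target_scheme: str) -> bool:
    """Per RFC 2616 Section 15.1.3: clients SHOULD NOT send a Referer
    header on a non-secure (HTTP) request when the referring page was
    served over a secure protocol (HTTPS). Every other combination
    may carry the referrer."""
    return not (referring_scheme == "https" and target_scheme == "http")

# HTTPS search results -> plain-HTTP site: referrer (and keyword data) is lost
print(sends_referer("https", "http"))   # False
# HTTPS search results -> HTTPS site: the referrer may still be sent
print(sends_referer("https", "https"))  # True
```

This is why a search engine's move to HTTPS strips keyword data from plain-HTTP sites while HTTPS-to-HTTPS requests can still carry it, and why the "move everything to HTTPS" argument follows from this one clause.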

Simple DDOS Amplification Attack through XML Sitemap Generators

It was all too easy, really. Filling up a 10 Mb/s pipe and tearing down a website with just a handful of tabs open in a browser seems like something that should be out of the reach of the average web user, but SEOs like myself have made it all too easy by creating simple, largely unprotected tools for unauthorized spidering of websites. So, here is what is going on... Yesterday a great post was released about new on-site SEO crawlers that allow you to determine a host of SEO issues by merely typing in a domain name. This seems fantastic at first glance, but I immediately saw an opportunity when I realized that none of these tools – and really almost none of the free SEO tools out there – require any form of authentication that you actually own the...
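The amplification at work can be sketched with back-of-the-envelope arithmetic — a minimal sketch with hypothetical numbers, since no figures for any specific tool are given here:

```python
def amplification_factor(pages_crawled_per_request: int,
                         avg_page_bytes: int,
                         trigger_request_bytes: int) -> float:
    """Bytes the target site serves per byte the attacker sends.

    One small form submission to an unauthenticated crawling tool
    (trigger_request_bytes) causes the tool to fetch every page of
    the target (pages_crawled_per_request * avg_page_bytes)."""
    return (pages_crawled_per_request * avg_page_bytes) / trigger_request_bytes

# Hypothetical: a ~500-byte form POST triggers a crawl of 1,000 pages
# averaging 50 KB each -> the target serves ~100,000x the attacker's bytes.
factor = amplification_factor(1000, 50_000, 500)
print(round(factor))  # 100000
```

Open the same form in a handful of browser tabs pointed at several such tools and the multiplier stacks, which is how a 10 Mb/s pipe fills so easily. Requiring proof of site ownership before crawling removes the amplification entirely.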

The Disadvantages of Speed: Finding Exact Match Domains in Drop Lists

I recently wrote a post on the advantages of speed, specifically dealing with the ability to find exact-match domains. One of the disadvantages of speed is the classic hammer problem: if you have a hammer, everything looks like a nail. Because the lookups were so fast, I assumed I could just pound away — that the number of lookups could be egregiously large without greatly damaging performance. Eventually, though, that led to speed problems that would have forced more horizontal scaling. I. Was. Wrong. It hit me on New Year’s Eve that I had been looking at the problem all wrong. The lookup data was structured in a way that required the massive number of lookups. Subsequently,...
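The fix hinted at here — restructuring the data so you don't need one lookup per candidate — can be sketched as a set intersection: normalize the keyword list once, then make a single pass over the drop list instead of hammering the store with per-candidate queries. A minimal sketch with made-up keywords and drop-list entries, not the actual pipeline:

```python
def exact_match_domains(keywords, drop_list):
    """Return dropped domains whose name exactly matches a keyword
    (spaces and hyphens removed, TLD stripped)."""
    # Normalize the keyword list ONCE: "blue widgets" -> "bluewidgets".
    # This replaces one lookup per (keyword, domain) pair with a
    # single O(1) membership test per dropped domain.
    wanted = {k.replace(" ", "") for k in keywords}
    matches = set()
    for domain in drop_list:
        name, _, _tld = domain.partition(".")
        if name.replace("-", "") in wanted:
            matches.add(domain)
    return matches

keywords = {"blue widgets", "cheap flights", "seo tools"}          # made-up
drop_list = ["blue-widgets.com", "example.org", "seotools.net"]    # made-up
print(sorted(exact_match_domains(keywords, drop_list)))
# ['blue-widgets.com', 'seotools.net']
```

The hammer version asks the data store "is this a match?" once per candidate; the restructured version builds the index of what you want up front, so the cost scales with the size of the drop list rather than with keywords × domains.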