Huge Update

Google recently launched their webspam Penguin update. While they claim it only impacted about 3.1% of search queries, the 3.1% it impacted were largely in the “commercial transactional keywords worth a lot of money” category.

Based on the number of complaints online about it (there is even a petition!), this is likely every bit as large as Panda or the Florida update. A friend also mentioned that shortly after the update WickedFire & TrafficPlanet both had sluggish servers, yet another indication of the update's impact.

Spam vs OOP

Leading up to the launch, the update was sold as being about over-optimization. When it launched, however, it was given no pet name but was instead called the webspam update. Thus anyone who complained about the update was, by definition, a spammer.

A day after declaring that the update didn't have a name, Google changed positions and called it the Penguin update.

Why the quick turn around on the naming?

If you smoke a bunch of webmasters & then label them all as spammers, of course they are going to express outrage, look for the edge cases that make you look bad, & promote those. One of the first ones out of the gate on that front was a literally blank blogspot blog that was ranking #1 for make money online.

As I joked with Eli, if it is blank then they couldn’t have done anything wrong, right? 😀

Another site that got nailed by the update was Viagra.com. It has since been fixed, but it is pretty hard for Google to state that the sites that got hit are spam while blending the search ads into the results so much that users can't tell them apart & forcing Pfizer to buy their own brand to rank. Had that condition not been fixed quickly, I am pretty certain it would have led to lawsuits.

Google also put out a form to collect feedback about the update. They only ever do that when they know they went too far and need to refine it. Or, put another way, if this was the Penguin update then this is GoogleBot:

So Worried About Manipulation That They Manipulate Themselves

When I was a kid I used to collect baseball cards. As the price of pictures from sites like iStockphoto has gone up, I recently bought a few cards on eBay (in part for nostalgia & in part to have pictures for some of our blog posts). Yesterday I searched for baseball card holders for mini-cards & in the first page of search results was:

  • a big ecommerce site where the review on that product stated that the retailer described the quantity as being 10x what you actually get (the same site had other, better pages)
  • a user-driven aggregator site with a thin affiliate post made years ago & attributed to a site that no longer exists
  • a Facebook note that was auto-generated from a feed
  • an old blogspot splog
  • a broader tag page for a social site
  • a Yahoo! Shopping page that was completely empty


That blank Yahoo! Shopping page is what showed up in Google's cache too. So I am not claiming that they were spamming Google in any way, but rather that Google just has bad algorithms when it ranks literally blank pages simply because they are on an authoritative domain name.

The SERPs lacked expert blogs, forum discussions, & niche retailers. In short, too much emphasis on domain authority yet again.

Part of the idea of the web was that it could connect supply and demand directly, but an excessive focus on domain authority forces users to go through yet another set of arbitragers. Efforts to squeeze out micro-parasites have led to the creation of macro-parasites (and micro-parasites that ride on the macro-parasite platforms).

SEO-based Business Models

Now more than ever SEO requires threading the needle: being sufficiently aggressive to see results, but not so aggressive that you get clipped for it (and hopefully building enough protection that it becomes harder for others to clip you). That requires a tighter integration of the end-to-end process (tying efforts into analytics & analytics back into efforts), a willingness to view SEO through a broader marketing lens, & throwing up a number of Hail Mary passes that likely won't back out on their own but will lower your risk profile when combined with your other efforts.

And your business model is probably far more important than your SEO skill level. Imagine running a consulting company for a lot of small business customers at a few hundred dollars a month each, based on stable rankings, & then dealing with a tumultuous update that hits a number of them at the same time. Then they see an older (even abandoned) competing site of lower quality with fewer links ranking, and they think you are selling them a bag of smoke. These sorts of updates harm the ability to do SEO consulting for anyone who isn't consulting for the big brands. Yes, many people made it through this update unscathed, but how many of these sorts of updates can one manage to slide through before eventually getting clipped?

The Unknowable Future

As search evolves, invariably anyone who is doing well in the ecosystem will at some point face setbacks. Those may happen due to an algorithm update or an interface change where Google inserts itself in your market. If you never get hit, it means you were only operating at a fraction of your potential. If you consistently get hit, you are probably pushing too hard. Many trends can be predicted, but the future is unknowable, so set up a safety cushion when things are going well.

This year Google has moved faster than in any year in their history (massive link warnings, massive link penalties, tighter integration of Panda, & now Penguin), & the rate of change is only accelerating. Go back about 125 years and a candle wick adjuster was cutting-edge technology marketed as brand spanking new:

Blekko has a decently competitive search service which they manage to run for only a few million dollars a year. As computers get cheaper & Google collects more data, think of all the different data points they will be able to layer into their relevancy algorithms. In some markets Chrome has more market share than Internet Explorer does, & Android is another deep data source. And they can know which user data to trust most by tracking things like whether a user has a credit card or verified phone number on file & how often they use various services like Gmail or YouTube. Google+ is just icing on the cake.

At the same time, they need to improve. As the search algorithms get better, so do the business models that exploit them:

I asked Kristian Hammond what percentage of news would be written by computers in 15 years. His answer: “More than 90 percent.”

There will be many more casualties in that war.
