Posted by randfish

Over the past few days, I’ve seen a lot of questions in our Q+A section and across the blogosphere suggesting it’s time for some direct answers from the search engines on major issues that affect business practices, consulting, and website building. Here are the ones I believe are in desperate need of straight responses:

  1. To all the engines – does content on subdomains inherit the full ranking ability provided by the pay-level domain? Website builders deserve to know this ahead of time so they can be intelligent about the ways they design their site structure (via SEJ).
  2. To Yahoo! and Microsoft/Live – Are you going to offer geo-targeting options like Google does in their Webmaster Tools? If not, would you be willing to follow a common format in robots.txt or meta tags to let sites tell you which countries/languages their content is targeted towards?
  3. To Google – if a site creates two subdomains, one targeting Canada and one targeting the US with very similar or nearly the same content on each, can both of those subdomains operate successfully targeting their specified country without causing duplicate content issues? If not, why?
  4. To Google – although you claim to notify webmasters inside Webmaster Tools about penalties to their sites, I’ve seen many, many penalized domains and only one penalty message, ever. Why create and publicize the feature if you’re going to continue to withhold that information from site owners? Surely your algorithms are savvy enough to distinguish pure spam from legitimate sites and businesses that simply made mistakes or attempted bad practices – why not give these domains the benefit of the doubt? Does more penalty reporting actually correlate with more spam? I’d find that hard to believe.
  5. To Yahoo! and Microsoft/Live – Penalty reporting would be an excellent feature – will we ever see it?
  6. To Google – An extraordinarily small percentage of questions get answered or addressed by Google representatives in the Google Groups for Webmasters area, yet you could easily create a more open, communicative environment by allowing your analysts to participate actively. What fears are preventing that? The strategy of Webmaster Central was always to open up communication, yet the party line of "well, most threads get good answers from the community" ignores both the potential abilities of your staff and the publicly promoted strategy of the group – where’s the disconnect?
  7. To Yahoo! and Microsoft/Live – Any chance you’ll offer groups/forums where webmasters can interact with engine representatives?
  8. To Google – Yahoo!’s Dynamic URL Rewriting system is a clear leap forward in letting site owners properly canonicalize page-level content, and it is not a massively challenging process to implement – why not offer it (or something similar in meta tags or robots.txt)? Savvy SEOs can conditionally redirect, but most organizations don’t have this SEO intelligence – why punish them?
    (BTW – To Yahoo!ers reading this, the reason folks don’t use the system is because they still have to fix it manually for Google & Live, so it’s generally not worth the effort. It really is a great system.)
  9. To all the engines – can you offer some clear guidelines on what constitutes cloaking vs. IP delivery (or whatever other name you have for legitimately showing different content to humans and bots)? Many companies worry about the practice even when it’s wise and engines would probably approve, while others push the boundaries because those boundaries haven’t been well defined.
  10. To all the engines – would you consider offering a clear method for showing whether a site’s links no longer pass link juice? Google’s current process of sometimes lowering PageRank in the toolbar is particularly weak: those unsavvy enough to be buying links from low-quality directories and link-selling sites are often the same people who have no idea how to check what a domain’s PageRank used to be, or whether the current PR bar is meant to tell them something. Yahoo! and Live, on the other hand, offer no method whatsoever. This strategy would fit well with the goal of protecting consumers, and the downside that some sites escaping detection could still sell links for a time seems by far the lesser of the two evils here.
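To make question 2 concrete: the "common format" could be as simple as an agreed-upon robots.txt directive or meta tag. The syntax below is purely invented for illustration – no engine supports anything like it today, which is exactly the point:

```
# Hypothetical robots.txt extension (invented syntax, not supported
# by any engine) telling crawlers this site targets Canadian visitors
# in English and French:
User-agent: *
Geo-target: CA
Content-language: en, fr
```

An equivalent meta tag in each page’s head would serve the same purpose for sites without robots.txt access.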
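On question 8, the "conditional redirect" a savvy SEO would build is straightforward in principle: compute a canonical version of each requested URL with the content-irrelevant parameters stripped, then 301 any non-canonical request to it. Here’s a minimal sketch in Python; the parameter names in `IGNORED_PARAMS` are assumptions for illustration, and a real site would plug this into its own server or rewrite layer:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical list of parameters that don't change page content
# (session IDs, tracking tags) -- every site would maintain its own.
IGNORED_PARAMS = {"sessionid", "sid", "utm_source", "utm_medium", "ref"}

def canonical_url(url):
    """Return the canonical form of a URL with ignored params stripped.

    If the result differs from the requested URL, the site would issue
    a 301 redirect to it, so engines index only one version of the page.
    """
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

print(canonical_url("http://example.com/p?id=42&sessionid=abc123"))
# -> http://example.com/p?id=42
```

Yahoo!’s system lets you declare the equivalent of `IGNORED_PARAMS` to the engine instead of implementing the redirect yourself – which is why a shared standard would spare most organizations this work entirely.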
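The cloaking vs. IP delivery distinction in question 9 comes down to what the serving decision keys off. A minimal sketch, with hypothetical page names and bot user-agent substrings of my own choosing (these are not any engine’s published rules):

```python
# Assumed user-agent substrings for illustration only.
SEARCH_BOT_AGENTS = ("Googlebot", "Slurp", "msnbot")

def choose_content_ip_delivery(country):
    """IP delivery: every visitor from the same location, human or
    bot, gets the same page -- engines see what users see."""
    return "canada_page" if country == "CA" else "us_page"

def choose_content_cloaking(user_agent, country):
    """Cloaking: bots get a different page than humans coming from
    the same location -- the practice engines penalize."""
    if any(bot in user_agent for bot in SEARCH_BOT_AGENTS):
        return "keyword_stuffed_page"
    return choose_content_ip_delivery(country)
```

The first function varies content only by location, so a crawler and a human in Toronto see the same page; the second branches on bot-ness, which is the part the engines object to. Clear guidelines would tell companies which side of that line their setup falls on.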

Sometimes, I’m on the side of the search engines keeping things quiet, and I understand and empathize with their reasons. However, for these questions, I think straight answers are in their best interests. I must say that I’ve been generally impressed with the search engines offering more transparency of late, so I have hope that we might get good responses on some of these.

Please feel free to provide your own questions of similar nature in the comments if you’ve got them.
