Updated September 17, 2010 · Doc Sheldon
We all expect the rules to change periodically… that goes with the territory. If the goalposts didn’t get moved once in a while, the optimization game wouldn’t be nearly as much fun, right?
And to be fair, it’s perfectly natural that the rules change. The internet is still relatively young… very much in its formative stages. What used to work wonders is no longer as effective. Some things are discounted entirely. Other techniques, once commonplace, are now penalized.
Content cloaking, keyword stuffing, misleading anchor text, rampant redirects… just a few of the techniques that smart optimizers no longer use. Other things, such as meta keywords, now have little or no impact on the success of an optimization campaign, but carry no penalty for their use, either. Then again, they carry no real benefit. Even that point, with Google officially stating that meta keywords are ignored, remains contentious among many practitioners.
With all the constant changes (several per day on Google’s part alone), it’s no wonder that many optimizers disagree on so many aspects of their profession. Is anchor text still as valid a ranking factor? What about validation? Duplicate content? Inbound links from a bad neighborhood? Toolbar PR updates (that one’s always sure to get the more informed pulling out their hair)?
And those are just a few of the continual disagreements over what is. Start talking about what will be, and the sparks really begin to fly!
Steve Rubel (Nope! No link. Nothing to see here, folks… move along) recently published a post on his blog about SEO being irrelevant because of Google Instant. It has drawn 200 comments so far, and they’re still coming in, nine days later! To be fair, Steve may well have been banking on just that – controversy is always a good comment-sparker, and I imagine he got a fair amount of traffic out of it. Years ago, of course, there were folks who would make their slaves fight to the death and charge admission to watch. Some things never change, eh? But I digress!
The point is, Steve wasn’t the first to hint that some aspect of life-as-we-know-it had ceased to exist. It’s been going on for as long as the internet has been in play. And remarkably, in those 200 comments, there were nearly as many “I agree”s as there were insults. I find that disconcerting. No, make that disturbing. Scary, even.
One of the many upcoming changes we’re faced with now is the ability to implement RDFa, rather than the more elementary, poor-man’s version – microformats – to provide more information for the search engine to serve up to the user. Some feel it’s the wave of the future, and a major step toward what might be termed AI search, when it finally comes to pass. Others scoff at it as an unnecessary burden that won’t increase their revenue – only their workload.
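To make the contrast concrete, here is a minimal sketch of the same snippet marked up both ways. The vocabulary URL follows Google’s 2009-era rich-snippets conventions for RDFa; the names and values themselves are purely hypothetical illustrations, not from this article.

```html
<!-- Microformat (hCard): meaning is carried by agreed-upon class names -->
<div class="vcard">
  <span class="fn">Jane Smith</span>,
  <span class="title">Editor</span> at
  <span class="org">Example Publishing</span>
</div>

<!-- RDFa: meaning is carried by an explicit vocabulary, so a parser
     doesn't have to guess which class names are semantic -->
<div xmlns:v="http://rdf.data-vocabulary.org/#" typeof="v:Person">
  <span property="v:name">Jane Smith</span>,
  <span property="v:title">Editor</span> at
  <span property="v:affiliation">Example Publishing</span>
</div>
```

Both versions render identically to a human visitor; the difference is that the RDFa version hands the search engine unambiguous, machine-readable statements it can surface directly in a result listing.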
Whether AI search will ever become a reality, remains to be seen. Personally, I think a reasonable facsimile is not far off. But that’s just my opinion, based upon my gut reaction to what I see Google focusing on in the small bits of their development efforts that are visible.
I see RDFa and the oft-attendant common tags as instrumental not only in putting more information in front of the user via the SERPs, but also in providing the search engines with more data. The SEs’ parsing load may be significantly reduced as well, at least on those sites that implement them. I would not be at all surprised if Google (and the other SEs will surely follow suit) begins to bring increasing pressure to bear to get more sites into compliance. Why would they do that, you may ask?
- To put themselves in position to finally be able to effectively eliminate links as a factor (more on this in a moment)
- To bring sites into a structural pattern that better lends itself to an AI search algorithm
Links have been a necessary evil in establishing the PageRank of a page. And PageRank is one of the factors in establishing SERP rankings. Unfortunately, links have also been the most easily abused aspect of optimization efforts. Bought, sold, traded and inherited, they’ve made manipulation entirely too easy. By eliminating links as a major factor, AND by effectively forcing implementation of CTags, spamdexing would be greatly reduced.
Take that theory a step further. What would replace inbound links (IBLs) in the determination of PageRank?
PageRank could easily become yet another thing of the past. It was great, while it lasted. Certainly, it was necessary. But the search algorithms have developed sufficiently to be able to rank INSTANTLY, for a given search. PR needn’t be a constant. It could vary dramatically from one search to another, for the same site. And if that were the case, why not do away with it altogether?
Think how much simpler Google’s job would be. Imagine how much manipulation would evaporate.
I think we may see the goalposts moving to an entirely different stadium, before too long. Wouldn’t it be wise to be near the gate, before the announcement is made?