What Does the Future Hold for Us?

Updated October 30, 2010

Doc Sheldon

This is another post from my other blog, Ramblings of a Madman, that fits better here than there.


Search Engine Optimization has been around since the mid 1990s, and has gone through a lot of growing pains. Some were caused by the periodic shifts of priorities by the search engines, but most, I think, were our own fault, either individually or collectively.

Initially, the web was a virtual Dodge City… no law, no order, every man for himself. Looking back, for those of you who remember the Web-Ring days, it was all about links, and there were no “rules” to follow. Then the Marshal came to town, in the form of the search engines. They couldn’t set actual rules, per se, but they could penalize those that didn’t follow their guidelines, either by degrading a site’s position in the results pages or by eliminating it altogether. To some, it may have seemed like living in Langtry, Texas, during the reign of the famous “Law West of the Pecos”, Judge Roy Bean.


Jumping back from the late 19th century to the early 21st, however, it’s easy to realize that lawlessness and chaos aren’t conducive to growth. Some things, of course, will grow in spite of the chaos, as was the case with the Old West, and as has been the case with the internet. But eventually, those growing pains stop showing growth, and all that’s left is the pain.

The search engines need to serve up the best possible results in response to the users’ queries… that’s the cornerstone of their business plan. Fail in that, and ad sales decline quickly, users migrate to the competition to meet their needs, and even a behemoth corporation like Google can be brought to its knees in a remarkably short time.

The pseudo-law that came in the wake of the rise of the search engines was sorely needed, and the results have been nothing short of fantastic. The wealth of information now available at our fingertips makes the Industrial Revolution look like no more than a neighborhood block party.

Still, the initial efforts to aid in ranking sites, in order to improve the relevancy of results, brought with them a beast of many heads… the link. The search engines didn’t invent it – they just fed it, until it became unmanageable. Spamdexing crawled out from beneath the bed and became the boogeyman.


Don't let the Boogeyman getcha!

Having focused most of their developmental efforts on PageRank, which is driven only by backlink quality and quantity, the SEs were faced with the necessity of dedicating a HUGE amount of energy to containing that spam. Perhaps they would have been better advised to instead move away from links as a prime motivator for the SEO community. Easily said, but not so easily done, I imagine. A major shift like that would take time (and money, but then, they have plenty of that) to implement without drastic repercussions. A commodity – and don’t kid yourself, links have long since become a commodity – isn’t easily removed from an economy without creating havoc.

Enter 2010. By my guess, we’re at LEAST two years into a highly focused (and necessarily top-secret) plan to make links all but obsolete. Remove the incentive, and spamdexing will largely evaporate. Sounds easy on paper, doesn’t it? Well, let’s look at a few recent and ongoing developments and consider what they may mean to us.

  • Caffeine came out, dramatically increasing the speed at which pages were indexed – a necessary precursor to what some have called “real-time search”.
  • Mayday was simultaneously thrust upon us, supposedly having the most impact on long-tail keywords (there’s some subtlety here), but it’s still unknown what other intended effects were built into it.
  • Major new pushes toward the Semantic Web (a concept proposed by Tim Berners-Lee as early as 2001), in the form of microformats, RDFa and Ctags.

So what might be the effects of all these? Let’s take the last one first, since it’s key to the success of what I think is planned.

First, I think it’s obvious that microformats and RDFa technology will allow the search engines to deliver more relevant search results. In addition, they put more information on each individual result right on the SERP, allowing the user to get a better preview of what he might find on a site.
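To make that first point concrete, here’s a rough sketch of the idea: a page fragment marked up with RDFa-style property attributes, and a few lines of Python pulling out the property/value pairs – roughly the kind of structured data a crawler could use to build a richer preview on the SERP. The snippet, vocabulary and property names are invented for illustration, not taken from any particular spec.

```python
from html.parser import HTMLParser

# A hypothetical page fragment with RDFa-style markup. The vocabulary
# here ("Review", "itemReviewed", etc.) is illustrative only.
SNIPPET = """
<div about="/reviews/acme-widget" typeof="Review">
  <span property="itemReviewed">Acme Widget</span>
  <span property="rating">4.5</span>
  <span property="author">Doc Sheldon</span>
</div>
"""

class RDFaExtractor(HTMLParser):
    """Collects (property, text) pairs - roughly what a crawler
    could index in order to show richer previews on the SERP."""
    def __init__(self):
        super().__init__()
        self._prop = None      # property of the tag we're inside, if any
        self.triples = {}

    def handle_starttag(self, tag, attrs):
        # Remember the 'property' attribute, if this tag carries one.
        self._prop = dict(attrs).get("property")

    def handle_data(self, data):
        # Pair the tag's property with its visible text content.
        if self._prop and data.strip():
            self.triples[self._prop] = data.strip()
            self._prop = None

parser = RDFaExtractor()
parser.feed(SNIPPET)
print(parser.triples)
```

The point isn’t the parsing itself – it’s that once the data is labeled, the engine no longer has to guess which string is the product, which is the rating, and which is the author.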

Second, implementation of Ctags will create a deadly loop for the folks that would try to spam the search engines, in an effort to create relevancy where none exists. Go ahead and spam! You’ll only be spamming yourself!

As for the effects of Mayday… I know a lot of SEOs that have been testing, trying to determine what was done with this update, but most seem to agree that it’s essentially impossible to say with any certainty. And I think we can all agree that Caffeine was a logical step, even by itself. But as we provide more information to be parsed, the speed achieved through Caffeine will be integral to Google’s success in the days to come.

Links won’t go away. They’ve been there as long as the ‘net has existed, and they’ll be there for some time to come. But I think they’ll soon be a very minor contributor to the ranking algorithm. That means that much of the potential for spamming will disappear. We have the opportunity, finally, to unite behind a common purpose… provide the user with the most relevant results possible. The user, the search engines, the SEO and the site owner can all win!

If you see holes in my logic, or have a different take on these developments, please feel free to share your thoughts. This is my opinion, but it’s not cast in stone.


Since this post was originally published, we’ve seen Google Instant and Quick Scroll roll out, both of which created a furor. There’ll undoubtedly be more such surprises just around the corner, as Google continues to build a stronger position for the Semantic Web (Web 3.0).

I know a lot of folks that are pooh-poohing that idea, but then, more than a few folks pooh-poohed the notion that the world wasn’t flat, too. Time will tell.

If I have to eat crow, it won’t be an entirely new experience. I’d rather be wrong for thinking, than right, without thinking. 😉


  1. Dan @ Keyword Research Service says:

    I agree with the vast majority of your points there, Doc. But I must admit that I find it hard to even imagine links becoming a minor factor in ranking. Google’s algo made the world of SEO what it is today, with links a form of online currency, and changing that is simply inconceivable.


  2. Hi, Dan – Thanks for ringing in.

    I agree that it’s hard to imagine. The shift in emphasis for Google alone would be tremendous. For all the billions of sites… frightening. Nevertheless, that’s where I think we’re headed.
    The Semantic Web, when (or if, if you prefer) it’s finally realized, will make links essentially superfluous. I’m sure they’ll still be around, and probably still carry some weight. But if you think of it from a business standpoint, Google wants maximum control with minimum risk. Holding the semantic reins will give them the control that any business would want in their position, and pulling the teeth of linkspam will eliminate their single largest headache.
    Obviously, this is simply stating an opinion on my part. I can’t point to any “evidence”. I DO see, however, a lot of indicators that make me believe that they’re driving hard in the direction of semantics. Even the most recent changes could be supportive of that.

    Thanks for your comment… don’t be a stranger!

  3. That was a fascinating read, Doc. I well remember the days of the Webring. I was a kid and had two web sites in web rings, one about model rockets and the other about yo-yos… Good times. I also remember life before Google, in the form of HotBot and AltaVista. It’s weird to think that Google didn’t invent the Internet.

    I personally am not a fan of Google Instant but who knows, maybe it will grow on me…

  4. What’s up, Doc… Great article, man. I think your logic is pretty solid, and I get you as far as the impact of this upon the long tail. But it’s difficult for me to make the leap from there to Google eventually making links obsolete. Also, with respect to Mayday, you’re right that even now people don’t fully comprehend it, but there seems to be some agreement out there that this update was more about placing an emphasis on site architecture (like how many layers away your internal pages are from the homepage), which to me does not diminish the significance of links.

    That being said, this is still enough to scare the bejeebies out of many a webmaster or SEO. Sort of like the webmaster’s version of the apocalypse, if you ask me. So for SEO’s sake, I hope this doesn’t come to pass.

    Thanks for the great piece, Doc. Take care.


  5. Hi, Tristan-

    Thanks for stopping in, and making yourself heard! Yeah, I played in the Rings a bit. Things were definitely simpler then, eh? 😉

    It makes me feel really old, realizing that I remember the roll-out of all those search engines. We had a fairly extensive network where I worked in the early 90s, and had an IT guy that could sell refrigerators to Eskimos. He convinced our CEO that having internet capability would put us head & shoulders above the competition, so we got it very early on. It was pretty cool, being among the first to experience it.

    I’m not wild about Instant either. But I find it MUCH less intrusive than Quick Scroll (another rant to come, soon).

    Thanks again for your contribution, Tristan.

  6. Hey, Benin! Great to see you here.

    My logic on links diminishing in importance (versus becoming totally obsolete… I don’t expect that to happen) is a combination of two things:

    First, links, and the resultant spamdexing the ne’er-do-wells bring to the table, trying to game the system, have been a major thorn in Google’s side for years. It costs them money and resources to combat it, but more importantly, it costs them BIG-TIME in credibility. Credibility=revenue.

    Second, I don’t think anyone will dispute the trend to get closer and closer to the Semantic Web capability everyone would like to see. The impact that could have on Google’s effectiveness and efficiency would be tremendous. In my opinion, it would jump their search market percentage to the mid-to-high 90s, virtually overnight (think in terms of an attendant 10-15% increase in ad revenue).

    So my leap is this:

    How can they eliminate that thorn? Obviously, it has to be replaced with something… something at least as effective as links, right? Preferably, more so. And shifting more control to them – not on an individual-site basis, but more on a criteria basis – makes business sense for them. What better way than to essentially eliminate PageRank, in favor of a SemanticRank, computed on the fly?

    Effects? SERPs become more relevant, user search experience becomes more fulfilling, and that link-thorn essentially evaporates, at least sufficiently so as to no longer be the prime method of boosting one’s position in the SERPs. On the site-owner’s side, because of better SERPs (aided by the implementation of microformats|RDFa), traffic becomes much more targeted. That should result in better ROI.

    And don’t forget that the attraction for buying certain targeted ads will probably increase, as well.

    So, while I can’t point to any hard evidence, I think this is Google’s logical next step. And I see it as decidedly positive, if it happens. For those that buy several thousand links each month for their clients, it’s obviously going to necessitate some adjustment in strategy. For those that refuse to recognize any benefit to their ROI by the implementation of microformats|RDFa, it’s going to require some re-thinking. And of course, for Google, it’s going to be a lot of work to engineer such a change-over.

    But I suspect they’ve been working on something like this for a long time. 😉

    As to your apocalypse concern… I wouldn’t worry. SEO’s going to be around for a long while. We’re just going to have to evolve a bit, along with search technology.

    Thanks for your input, Benin. Don’t be a stranger!

    Disclaimer: The opinions stated in this comment are those of the poster, and by pure coincidence, happen to be the opinion of Doc Sheldon’s SXO Clinic, as well.

  7. Thanks, Doc. I gotcha. So it seems that everything is dependent upon when we might actually see a truly semantic web paradigm come into play. This sounds like something that could be way too big of a project for Google to implement single-handedly, though. What do you think?

  8. I think we’re seeing the Semantic Web come into play already, Benin. But it’s happening so gradually, in baby steps, that you don’t notice it if you’re not paying close enough attention.

    Think about years ago, when a search for “car” would only return “car” in the SERPs. Then after a while, they got to where they’d recognize “automobile”, “auto” or “vehicle”, as well. Now, you can search for “car” and if there’s a page out there that uses NONE of those terms, but instead uses “wheeled steed”, but the surrounding text mentions “upholstery”, “gasoline” and “steering”, it’ll show up in the results, and rank highly, if warranted. They can already determine, in many cases, what the page is about, and determine if it is relevant to the search query. That’s a level of semantic analysis, already.
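That “wheeled steed” example can be sketched in a few lines of Python. This is a deliberately toy model of the principle – the page never uses the query term, but its vocabulary overlaps with terms associated with the query. The association list here is hand-written for illustration; a real engine derives those relationships from enormous amounts of data, not a dictionary.

```python
import re

# Terms an engine might have learned to associate with "car".
# This set is invented purely for illustration.
RELATED = {
    "car": {"automobile", "auto", "vehicle", "upholstery",
            "gasoline", "steering", "engine"},
}

def semantic_match(query, page_text, threshold=2):
    """Return True if enough query-associated terms appear on the page,
    even when the query term itself never does."""
    words = set(re.findall(r"[a-z]+", page_text.lower()))
    hits = len(RELATED.get(query, set()) & words)
    return hits >= threshold

# A page about a "wheeled steed" that never says "car":
page = "My trusty wheeled steed has plush upholstery, sips gasoline and has light steering."
print(semantic_match("car", page))
```

Scale that idea up by a few billion pages and a few million learned associations, and you have one small piece of what “understanding what the page is about” means.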

    Now add in microformats or RDFa… a term like “Stanley” can immediately be properly classified as a man’s name or a brand, depending upon the markup. And all the inter-relational markup will be captured upon indexing, which will have the effect of making the return of results FASTER.

    Add to that, the ability of software to actually LEARN, and the task doesn’t seem so daunting. Particularly when you think about what has already been accomplished.

    I remember when Infoseek used to return a common result in 3 to 4 seconds – 6 or 7 seconds wasn’t unusual. Now, you can enter a long-tail term and get 14 million results in a hundred milliseconds! From billions of pages, not 50,000 or so.

    To answer your question, no, I don’t think the Semantic Web is out of Google’s reach. All they have to do is get people to give them the information.

    And they’ve been pretty effective at getting us to do what they want, so far. 😉

  9. Dan @ Keyword Research Service says:

    Hey Doc (and Benin),
    Now, as fascinating as your discussion is (and believe me when I say that it’s fascinating – I read it twice), I’d sure like to see a freshly brewed post on Doc’s blog.
    I think I’m on to your strategy here, Benin 🙂 You’ve been popping out 3-4 posts a week while keeping Doc occupied, neck-deep in comment discussions. 🙂

  10. LOL, Dan. I think you’re right… Benin is pulling some of Sun Tzu’s theories on me! Gonna have to keep an eye on him!

    I apologize for the lapse, and it’s been on my mind, believe me. I’ve been tweaking a bit, and I’m in the middle of a problematic CDN implementation here. I’ve also got a deadline coming up for a series of articles. Gimme a day or so, and I’ll catch up with you. 😉


  1. State of Search radioshow – episode 39: Bubble coming up, 2010 and the future - Podcasts - State of Search says:

    […] 2010 predictions Prediction by Doc Sheldon (who gave this prediction in the chatroom) […]

  2. […] according to a number of bloggers. Predictions of things to come have been made (including a couple here and here), and it’s been both challenging and entertaining to monitor the changes and try to […]