Google's ever-changing algorithm: that mysterious, secret combination of characteristics meant to convince the search engine that your site is in fact the best and most appropriate resource for the query typed into Google's search box.
In previous years, search engine optimization experts would agonize over data and statistics, trying to discern what keyword density was optimal, underscores or hyphens, strong or bold tags. These were the questions of the day. The algorithm would change as each technique was exploited.
From the Site:
Why Google's Link Based Search Algorithms are Here to Stay
Let us look at just one small competitive niche, say "Dallas Mortgage." This industry is fiercely competitive, and right now there are several dozen mortgage brokers actually in Dallas doing a decent job of SEO to rank for this one (relatively small) niche. Now add the fact that Dallas is a major market, so companies in Houston, Austin, San Antonio, etc. that can underwrite a loan anywhere in the state are focusing their SEO on this term as well. Lastly, add the literally thousands of affiliate marketers working to build leads for companies like LowerMyBills.com, Ditech, etc., who are also making mirror sites optimized for this term, and this one very small niche is pursued by thousands of people.
To accomplish this goal, some of these people are doing pure white hat (and getting owned, by the way), some are using varying levels of gray- to black-hat methods, and some (affiliates mostly) are doing pure spam. To rank for this term you have to play by Google's rules, and that means you must get links for it. Here is a news flash: no one is likely to give out links for "Dallas Mortgage" in the idealistic "democratic" way Google suggests we get links. So to rank for this term you either directly create, negotiate, request, buy, or beg links from quality sites.
Now, to my idealist white-hat SEO brethren the solution is simple: just pull the link component out of the equation and judge sites on their content. What could be wrong with that? To anyone with the ability to think forward even a little bit, the problem is like an oncoming train! Just go back to the fact that on-site SEO is simple to accomplish, easy to learn, and simple to teach. It only requires knowing and following standards, some very basic math, and some skill with keyword research. "So what?" scream the idealists.
Well, what this means is that all those thousands of people chasing "Dallas Mortgage" will now each create content with specific keyword densities, proper tags, etc. Some will "win" for the moment, and the losers will simply copy their techniques and try to do 1% better. Very soon the precise formula is determined, all the sites are using it, and they are in a statistical tie with each other. Also understand that, with the exception of perhaps some of the "made for AdSense" sites, most of these sites will actually lead the visitor to a source for a Dallas mortgage; they are not all junk, as many would claim. Does this stalemate sound familiar? It should if you have been around a decade or more, because it is very much how some of the first engines worked.
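To see why everyone ends up in a statistical tie, consider what "keyword density" actually is: just occurrences of the phrase divided by total words, a number anyone can compute and copy. Here is a minimal sketch of that calculation (the function name and the sample page are mine, for illustration; this is the generic metric SEOs chased, not any engine's real formula):

```python
import re

def keyword_density(text, phrase):
    """Rough keyword density: words consumed by the phrase divided by
    total word count. A simplified sketch of the on-page metric, not
    any search engine's actual formula."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count non-overlapping-agnostic matches of the phrase as a word window
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return (hits * n) / len(words) if words else 0.0

# Hypothetical over-optimized page
page = ("Looking for a Dallas mortgage? Our Dallas mortgage brokers "
        "underwrite Dallas mortgage loans across Texas.")
print(round(keyword_density(page, "Dallas mortgage") * 100, 1))  # → 40.0
```

The point is how trivially reproducible this is: once the "winning" density leaks, every competitor can hit the same number, and on-page signals alone can no longer separate them.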
So what happened next? We needed a "tie breaker": some way to choose between two sites that were both quality from a code standpoint, both had real sources of "Dallas mortgage" information, and both had a 2.5% (or whatever was in vogue at the time) keyword density for the term. What, short of a subjective and therefore flawed human review, was left for the search engines to use? Nothing but the infamous link. Why?
Because even though you can build your own links, even though you can buy them, even though you can build an entire series of sites just to pass link power around, some number of links will still be 100% beyond the control of the actual site owners. Right now we only have two choices here: human review or links as a component, and humans can be bribed, wrong, biased, etc. Links at least use math, and my friends, "math doesn't lie."
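The "math" behind link-based ranking is essentially PageRank-style power iteration: a page's score is fed by the scores of the pages linking to it. Here is a toy sketch of that idea (the page names and link graph are hypothetical, and this is the textbook concept, not Google's actual implementation):

```python
def pagerank(links, damping=0.85, iters=50):
    """Toy power-iteration PageRank over a dict of page -> outlinks.
    Illustrates the concept of link-based scoring only; Google's real
    algorithm is far more complex."""
    pages = set(links)
    for outs in links.values():
        pages.update(outs)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        # Every page keeps a small baseline score (the "teleport" term)...
        new = {p: (1 - damping) / len(pages) for p in pages}
        # ...and passes the rest of its score evenly along its outlinks.
        for page, outs in links.items():
            if outs:
                share = damping * rank[page] / len(outs)
                for out in outs:
                    new[out] += share
        rank = new
    return rank

# Hypothetical tie: two broker sites with identical on-page SEO that
# link to each other; one also earns an independent link from a blog.
web = {
    "broker-a": ["broker-b"],
    "broker-b": ["broker-a"],
    "blog": ["broker-a"],
}
ranks = pagerank(web)
print(ranks["broker-a"] > ranks["broker-b"])  # → True: the extra link breaks the tie
```

Notice that the blog's link is exactly the kind of signal neither broker controls, which is the whole argument for using links as the tie-breaker.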
Do I think we have swung too far, and links now have too much influence? Yes I do. I think it should be impossible for any page to rank for a term that is clearly not present on it at all. Yet Google "click here" and you find Adobe, and if you Googled "miserable failure" in the past you found George Bush and Michael Moore (thanks to bloggers Google bombing). Eventually Google had to hand-edit those results for Bush and Moore out of the rankings, because there were so many links nothing else would have made them go away.
I would have loved for Google to simply tinker with things so that a word must actually be on a page. Sure, keep the link portion, but if I look for "failure" on Google, I ain't looking for Bush or Moore (regardless of your opinion of either). What this leads us to, though, is a simple understanding: links are not going to stop driving rankings for a very long time. Google may move to put more weight back into content, which I would welcome, but links will be a driving force for a long time to come. I for one don't think removing them altogether would create some sort of democratic internet utopia, as others seem to believe it would.
What do you think? Is there too much weight on links? Would it be good if Google put more weight on content? Do you like things the way they are now? Or do you think I am wrong and TinPig is right, and that Google should just stop using links to rank sites at all? If so, how do we then break the 100 "ties" for a first-page ranking?