When the web was young, backlinks were the perfect way to measure a site’s popularity. Sites with more backlinks were more popular, and sites with backlinks from more popular sites were even more popular, as you would expect. And all was well.
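For the record, this recursive notion – a link counts for more when it comes from a popular site – is exactly what PageRank formalizes. A minimal power-iteration sketch (the tiny link graph and function name here are invented for illustration):

```cpp
#include <cassert>
#include <vector>

// Minimal PageRank power iteration over a small hypothetical link graph.
// links[i] lists the pages that page i links to.
std::vector<double> pagerank(const std::vector<std::vector<int>>& links,
                             double damping = 0.85, int iterations = 50) {
    const int n = static_cast<int>(links.size());
    std::vector<double> rank(n, 1.0 / n);
    for (int it = 0; it < iterations; ++it) {
        // Every page keeps a baseline share, plus damped shares from its in-links.
        std::vector<double> next(n, (1.0 - damping) / n);
        for (int i = 0; i < n; ++i) {
            if (links[i].empty()) {
                // Dangling page: spread its rank evenly across all pages.
                for (int j = 0; j < n; ++j) next[j] += damping * rank[i] / n;
            } else {
                for (int j : links[i])
                    next[j] += damping * rank[i] / links[i].size();
            }
        }
        rank = next;
    }
    return rank;
}
```

In a toy graph where page 0 links to pages 1 and 2, page 1 links to page 2, and page 2 links back to page 0, page 2 – the one endorsed by both other pages – ends up with the highest score, which is the behavior the paragraph above describes.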
But in today’s world, no matter how much bleach you apply to your white hat, you can’t get around that little bit of knowledge you’ve acquired. Backlinks = success on Google. By knowing this and “exploiting” it, you will knock out those websites that don’t attempt any “SEO”. Your white hat may look white, but really it’s gray. You will do all the right things to generate backlinks. Invariably, as much as Google would hate to admit it, this breeds a competitive landscape. “You can’t win if you don’t play.” It has now become essential to “ethically” build backlinks in order to rank in organic Google search results. And this is very bad for the quality of Google’s search results.
No matter how innocent you try to be, the cat is out of the bag. It is far too easy to get a large number of “encouraged” backlinks to a given site. When you search on Google the really big players come up first. And this is good. They have such a huge number of backlinks that they can pretty much be seen as genuine. The problem is the middle ground – the “long tail”. If a smart developer/designer releases a WordPress theme with a backlink and that theme is used by several hundred people, they get lots of PageRank. These WordPress themes can be used on popular blogs with plenty of PageRank themselves. This is “white hat” activity. But it’s really intelligently gaming the system. The SEO gurus know plenty of tricks to get backlinks and PR.
Because of this, many of Google’s more specific “long tail” searches are filled with SEO’d sites, both white hat and black hat spam. I know quite a few people who are actually switching to other search engines because of the backlink spam. “Quality backlinks” are just too easy to get, especially if you are willing to pay money for them. And that goes on every day.
Backlinks were a good idea at some point. Now they only show that you are trying really hard. The people who talk the loudest and most often aren’t always the best. And proliferating backlinks to your site with all the right keywords doesn’t mean your content is the best either. It just means that you are good at generating backlinks.
Until spiders have a clue what they are parsing, it will be hard to impossible to solve this problem while still using backlinks.
Social networks like digg.com, stumbleupon.com and reddit.com are “the next new thing”. By having people vote on pages, the crap filters to the bottom. This of course will have its own set of problems – but it will invariably be far better and more accurate than backlinks. At the end of the day some content will be brilliant – and still undiscovered. However, more data will yield better results.
How do you really judge the worth of a website? Is it how many people like it? The demographics of the people linking to it for specific topics? How long people spend on the site itself? These are fascinating questions that search engines will be forced to answer in the upcoming years – or they will become inconsequential.
This concept really applies to much of life, but to computer science in particular.
Why is it that new ways of doing things can have remarkable success at “the discovery stage” and end up being useful but disappointing in the field? Object-oriented programming turns into spaghetti class interdependencies and deep hierarchies in day-to-day use. A giant leap forward and two steps back. Software patterns turn into misuse and abuse of the Singleton pattern and inappropriate shoehorning. Extreme programming becomes an excuse to write unmaintainable code that passes each and every unit test. Simplicity becomes an excuse to build things so bare-bones as to be unusable. It seems like each new “promising technique” becomes troublesome once it goes mainstream. Still worth the trouble, but troublesome all the same.
You may find yourself scratching your head about why such issues aren’t discovered before these techniques gain mass acceptance. Why do they perform so well in “the R&D phase”? Even when these same things are tested in limited circles on “real world problems” they continue to seem like the next silver bullet.
The answer, unfortunately, is as hard to swallow as it is simple. Early on, the people developing and working with these new technologies are quite often some of the best and smartest in the field. By the time a concept migrates to the other 90% of the coding world, it finds itself slammed against the middle and trailing end of the bell curve. These portions of the curve will oftentimes misunderstand and misuse the new tools and techniques presented to them. It’s during these phases that the backlash usually kicks in. Guidelines and rules must be created to keep the average programmer from hanging himself with too much rope.
Oftentimes new paradigms go through a few separate stages of development and acceptance. In stage 1 the really smart R&D types hit it. These people are genuinely great at creating new solutions that are both groundbreaking and elegant. Once these “conceptual entrepreneurs” have firmed things up, it is ready for stage 2. In stage 2 the brightest of the “real world group” are brave enough to tackle integrating these concepts into production code. They become evangelists of the techniques they have had so much success with. Stage 3 involves more risk takers embracing the techniques advocated by stage 2. They pore over the concepts involved, grok them and integrate them into their code bases. It’s at stage 4 that the tide starts to turn. At stage 4 the middle of the bell curve is beginning to be pierced. The new mantra is gaining traction. By stage 5 everyone accepts the new paradigm as “the way to go”, based more on appeal to authority and peer pressure. At this stage a significant number may not understand the new techniques at all. They will advocate them but only graft them on top of their existing ways of doing things.
I like to think of myself as a stage 3 integrator. At that point things have hit my comfort zone and are worth a try. I will do everything I can to understand a new technique, but by no means have I created it or been the true daredevil integrating it into production code at stage 2. Later down the pipe, when I see these concepts misused and abused, I avidly follow the “shoring up” techniques that keep people from blowing body parts off by misusing a new technique.
No matter what the latest silver bullet appears to be, it is merely getting its justly deserved 15 minutes of fame during its integration period. These techniques will live on, but the focus will move on. It’s important not to throw the baby out with the bath water. New techniques may disappoint after several years of growth – but in the end they often stay part of our process. Object orientation, design patterns, components, agile programming – the list goes on. Don’t forget that silver bullets are still bullets even when they tarnish. Just because a concept hasn’t completely lived up to its hype doesn’t mean it can’t be a great and useful part of your process.
With all of the latest talk about virtual machines, no one can miss the impact they are bound to have soon. With dual-core and quad-core processors, virtual machines will just be good common sense from a number of perspectives. Web sites will be able to do things however they wish – they will merely be interacting with dumb terminals, pushing images back and forth.
In other news – Google brilliance. Using KISS (keep it simple, stupid), they implement only the bang-for-the-buck features and go light-years beyond anyone else. Microsoft proved more clever and agile than IBM back in the day. It certainly seems as though Google is proving itself more clever and agile than Microsoft. There will be no quick fix for Microsoft. By the time you are a force of nature, it is too late to reroute the butterfly wings that started you in the first place.
One of the better implementations of garbage collection for c++ can be found here:
A garbage collector for C and C++
I think in this modern day the time for automatic garbage collection in standard C++ has finally come. Manually reclaiming memory is both error prone and dangerous – calls to dead objects cause many of the surprise problems associated with C++. A number of sites provide details on how a concurrent garbage collection thread can actually outperform manual deletes. Proposals have been put forth for making garbage collection part of the C++ standard, and I am all for it. Other nice additions would be debug bounds checking for the STL, among other things. I still love C++, but removing some of its thorns would give it longer legs in a world being overtaken by Java and C#.
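Short of a full collector, the dead-object problem described above can be made concrete. This sketch uses the standard library’s reference counting (std::shared_ptr/std::weak_ptr) rather than the Boehm-style tracing collector linked above – a different technique, but it targets the same class of bug; the Session type is invented for illustration:

```cpp
#include <memory>

// Hypothetical resource type used to illustrate the dead-object problem.
struct Session {
    bool open = true;
};

// With shared/weak ownership, a "call to a dead object" becomes a detectable
// condition instead of undefined behavior. With a raw pointer there is no way
// to ask "is this still alive?" after delete - any dereference is UB.
bool session_is_open(const std::weak_ptr<Session>& handle) {
    if (auto s = handle.lock())   // promotes to a strong ref if still alive
        return s->open;
    return false;                 // the owner released it: a safe answer
}
```

This is reference counting, not tracing collection, so it will not reclaim cycles the way a true collector does – but it removes exactly the “surprise” failure mode the paragraph complains about: the handle knows when its object is dead.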
I’ve read many good blogs recently detailing why global singletons are evil. This thought has been brewing in me for a long time. No design pattern is easier for the novice to pick up, use and misuse than the Singleton. The (mis)perceived elegance of hiding global variables in a singleton is truly an evil thing. Sure, singletons are better than all-static access methods, and sure, all-static access methods are better than global variables and routines for the same thing. But the lesser of two evils (or three) is still evil. The desire to constrain a class to one instance is a good one when appropriate. I prefer the thought that singletons should return a valid object pointer on the first call and null on subsequent calls. This is what you would expect of a class that can only have one instance. Using this method you get one instance that you use in the same manner as normal class instances. There should be a support group for people who like to hide globals in singletons – for all their handwaving they aren’t fooling anyone ;)
I have seen projects where just about every class is a global singleton… how very (un)object-oriented. Even Vlissides jumps on the bandwagon in Pattern Hatching, bringing up his regrets about this pattern. One simple change can clean it up so nicely – only return the valid instance ONCE.
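That one simple change is easy to sketch. Here is a minimal version of a singleton that hands out its instance exactly once (the class and method names are invented, and the sketch ignores thread safety):

```cpp
#include <cstddef>

// A singleton that returns its one instance on the first call and nullptr
// afterward. The first caller owns the job of passing the pointer to whoever
// needs it, like any other object - nobody can treat the class as a hidden
// global, because a second acquire() fails loudly. (Not thread-safe; a sketch.)
class Config {
public:
    static Config* acquire() {
        static Config instance;      // constructed on first call
        static bool taken = false;
        if (taken) return nullptr;   // already handed out: refuse
        taken = true;
        return &instance;
    }

private:
    Config() = default;
    Config(const Config&) = delete;
    Config& operator=(const Config&) = delete;
};
```

The point of the design is that uniqueness is still enforced, but access is not globally ambient: code that wants the instance must receive it explicitly from whoever called acquire() first.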