
WebGL is finally ready for prime time – watch everything change! – Part 1 of 4

[Embedded demo: spinning Pebble smartwatch rendered in WebGL]

If you see a spinning Pebble smartwatch above, you are using a WebGL-enabled browser! Otherwise you’ve fallen back to a precomputed, spinnable 360-degree format, which isn’t even close….
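
For the curious, the detection behind that fallback is only a few lines of JavaScript. This is a minimal sketch, purely for illustration – the canvas id and the two demo functions are hypothetical stand-ins, not the actual code running on this page:

function getWebGLContext(canvas) {
  try {
    // "experimental-webgl" covers older browsers that shipped WebGL under that name.
    return canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
  } catch (e) {
    return null;
  }
}

var canvas = document.getElementById("pebble-viewer");  // hypothetical element id
var gl = getWebGLContext(canvas);

if (gl) {
  startSpinningPebbleDemo(gl);       // hypothetical: real-time WebGL rendering
} else {
  showPrecomputed360Viewer(canvas);  // hypothetical: precomputed 360-degree image fallback
}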

With the latest round of WebGL-supporting browsers and the hard push toward optimizing JavaScript engines, JavaScript “assembly” subsets that play well with those optimizers (asm.js), compilers that generate asm.js-compliant JavaScript (LLVM with Emscripten), and even direct support for LLVM-based virtual machines and totally safe, sandboxed code execution (NaCl, in Chrome only), we are FINALLY ready for a 3D web. And much more than that – these technologies will sound the death knell of traditional OS-specific apps….. again, FINALLY!
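
If you’ve never seen asm.js up close, here is a hand-written toy module, purely for illustration (real asm.js is typically generated by Emscripten, not written by hand). The “use asm” directive and the |0 coercions tell an optimizing engine that only 32-bit integers are involved, so it can compile the function ahead of time, much like native code:

function ToyAsmModule(stdlib, foreign, heap) {
  "use asm";
  function add(a, b) {
    a = a | 0;           // coerce parameter to int32
    b = b | 0;
    return (a + b) | 0;  // coerce result to int32
  }
  return { add: add };
}

var toy = ToyAsmModule(window, {}, new ArrayBuffer(0x10000));
console.log(toy.add(2, 3));  // 5 – still runs as ordinary JavaScript even without asm.js validation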

The History from the Standpoint of Applications and the Web

If you are impatient or pressed for time you can skip down past this – but having a feel for the background really helps the significance sink in.  The history with respect to mobile devices is in part 2, web 3D is coming up in part 3, and part 4 will wrap things up by explaining how WebGL in browsers should pull all app development into its fold and leave the other targets as afterthoughts.

Let’s start at the beginning (where most things usually start) with the schizophrenic, 1.5-steps-forward, 1.2-steps-back nature of the computer world, where the ecosystem can change overnight – a Darwinian process that works well in the end but is far from optimal with respect to efficiency. In the beginning everything was proprietary to specific processors…. great job security for developers. Some reasonable guys created C to be “a portable machine language,” and for a while things moved closer to the write-once, modify-for-various-targets, good-to-go model. A useful, orthogonal model of development – 1.5 steps forward. And remember, these systems were very different under the hood – but conveniently they were mostly crunching data and spitting out text. Over time things exploded as essential proprietary libraries were created for various targets, breaking many of the most useful elements of this paradigm – only 0.5 steps back. These systems did in fact have different capabilities, processors and memory footprints, so a universal abstraction layer couldn’t be built in many cases. So this did make some sense.

Moving into more modern times, hardware became nearly ubiquitous across operating systems – you probably had the same hardware underneath whether you were running Windows, Unix (including the Mac, being Unix of course), Linux, etc. The operating systems, however, which hosted a large number of the same applications (everyone wanted Photoshop to run the same way on all targets), remained proprietary with respect to their code. Thanks to the power of the fast new machines, code could be written with abstraction layers that made a write-once, build-for-multiple-targets approach practical; however, many programmers still chose the easier, more feature-rich approach of writing code specifically for each target. For companies that did nothing but port software, this was a cash cow.

When the internet took off, the ultimate force multiplier for a homogeneous, ubiquitous development abstraction layer was set in motion – oddly disguised as a hypertext viewer.  In fits and starts, thanks primarily to Microsoft’s compatibility games (in Internet Explorer, Java, etc.), getting the web experience to be reliable and consistent was a process of never-ending testing.  This did get fixed over time to a good degree.  And as these things really settled down in recent years, the stage was almost set; cross-browser compliance was close enough.  There were key missing ingredients, however.

One of these missing ingredients was performance – you just couldn’t get close enough to native performance out of apps developed for the web without security risks a mile wide, AKA ActiveX.  The first generation of JavaScript/Ajax apps fell really far short of the ideal mark – a huge step back in look and feel from native apps.  However, they were relatively safe and sandboxed (after some time).  Most importantly, collaboration became available in a much larger way, since application installation was no longer an issue or a rate-limiting factor in deployment and acceptance.  So they were good enough, and they created a strange, ever-expanding ecosystem of cross-bred internet applications.  In our guts, most developers realized these environments were first-class kluges, making the square-peg-in-a-round-hole metaphor look like an insane understatement.  HTML documents were never intended (or designed) for application development, and this environment really created a freak show of clunky technologies.  But they became irresistibly de facto – their limited functionality was simply too useful to end users.

Things were getting very close…. 2.5 steps forward for the concept of write once, run anywhere.  Everyone who was anyone wanted a web app built or customized for their company.  Over the course of several years, web-based app frameworks started to take hold – and sprout up everywhere.  The only things standing in the way of a universal “OS killer” app environment in browsers at this point were performance and a more unified, cohesive development experience (client-side functionality instead of everything on the server, support for multiple programming languages, something closer to symmetry between client-side and server-side code, etc.).

Side Note: 

Many would think Flash might have helped bridge these gaps – but the binary executable blob concept simply never sat well with a generation of developers raised on the complete transparency of underlying code and implementation that came with the standard web development paradigm.  Flash was doomed well before Steve Jobs pulled a Microsoft and refused to permit it on iOS devices.  Was he the benevolent guru of user experience he claimed to be, keeping the masses away from poorly performing Flash apps?  Of course not; his Bill Gates spidey senses were at work – Flash apps could be as capable as App Store apps and would be completely out from under Apple’s thumb.  This is the same reason iOS devices don’t support WebGL – it makes uncontrolled, high-quality apps possible.  But Apple will cave in on iOS just as Microsoft did with WebGL in Internet Explorer – we’ll talk more about this later.

HTML5 was coming down the pike, JavaScript engine optimizations were being implemented, and even rudimentary 3D using canvas and “software rendering” was coming along.  Things were getting so close you could almost smell it in the air.  And then the machines themselves turned into a wrench in the works (literally and figuratively)…….

Mobile phones came on the scene with Android and iOS and …… the old days were revisited – proprietary “OS apps” were back in full swing.  Let’s once again set back the clock and take a big step back…….

More coming in part 2!

How SEO will kill Google – or the problem with backlinks

When the web was young, backlinks were the perfect way to measure a site’s popularity. Sites with backlinks were more popular, and sites with backlinks from more popular sites were even more popular, as you would expect. And all was well.
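
That recursive notion – a link from a popular page is worth more than a link from an obscure one – is what Google’s original PageRank formalized. As a quick refresher (with d the damping factor, typically around 0.85, N the total number of pages, B(p) the set of pages linking to p, and L(q) the number of outbound links on page q):

PR(p) = \frac{1 - d}{N} + d \sum_{q \in B(p)} \frac{PR(q)}{L(q)}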

But in today’s world, no matter how much bleach you apply to your white hat, you can’t get around that little bit of knowledge you’ve acquired: backlinks = success on Google. By knowing this and “exploiting” it, you will knock out the websites that don’t attempt any “SEO”. Your white hat may look white, but really it’s gray. You will do all the right things to generate backlinks. Invariably, as much as Google would hate to admit it, this breeds a competitive landscape. “You can’t win if you don’t play.” It has now become essential to “ethically” build backlinks in order to rank in Google’s organic search results. And this is very bad for the quality of the links in those results.

No matter how innocent you try to be, the cat is out of the bag. It is far too easy to get a large number of “encouraged” backlinks to a given site. When you search on Google, the really big players come up first. And this is good: they have such a huge number of backlinks that they can pretty much be seen as genuine. It’s the middle ground – the “long tail” – where the problems are. If a smart developer/designer releases a WordPress theme with a backlink in it, and that theme is used by several hundred people, they get lots of PageRank. These WordPress themes can be used on popular blogs with plenty of PageRank themselves. This is “white hat” activity, but it’s really intelligently gaming the system. The SEO gurus know plenty of tricks to get backlinks and PageRank.
Because of this, many of the more specific Google “long tail” searches are filled with SEO’d sites, both white-hat and black-hat spam. I know quite a few people who are actually switching to other search engines because of the “backlink” spam. “Quality backlinks” are just too easy to get, especially if you are willing to pay money for them. And that goes on every day.

Backlinks were a good idea at some point. Now they mostly show that you are trying really hard. The people who talk the loudest and most often aren’t always the best. And proliferating backlinks to your site with all the right keywords to game the system doesn’t mean your content is the best either. It just means you are good at generating backlinks.

Until spiders have a clue what they are actually parsing, it will be hard, if not impossible, to solve this problem while still relying on backlinks.
Social networks like digg.com, stumbleupon.com and reddit.com are “the next new thing”. By having people vote on pages, the crap filters to the bottom. This will of course have its own set of problems – but it will invariably be far better and more accurate than backlinks. At the end of the day some content will be brilliant and good – and still go undiscovered. However, more data will yield better results.

How do you really judge the worth of a website? Is it how many people like it? The demographics of the people linking to it, weighed against specific topics? How long people spend on the site itself? These are fascinating questions that search engines will be forced to answer in the coming years – or they will become inconsequential.