Sunday, August 28, 2011

Technology Deja Vu

I feel like I am in the middle of that popular sci-fi film, "Back to the Future"!

Every new technology that hits the headlines seems to remind me of something I have seen before. It brings back memories of the past, it raises the same old questions, and the concept does not feel truly "novel".

Do you feel that way too? If you have been around in the IT industry for more than a couple of decades and have earned your programming chops on the trusty old mainframes, I bet you get that feeling, right?

Enough of talking in the abstract. Let us look at some examples...

Let us start with that prime example of new technology - the cloud. Wow, you can now have your software and services run anywhere out there in the wild, and access them at will! You need not know where they are running, you need not worry about resource constraints (kind of, since you can set it up to be elastic), and you need not worry about downtime. Heavenly, isn't it? Yes, it is, but is the concept new? Were things very different in the "multiprocessor" days of the mainframe? We never used to bother about where our processes and programs were running, and resources were not usually a constraint either. And downtime? Well, in my days as a Tandem (later Compaq NonStop, and then HP NonStop) programmer, I remember the demos at the Cupertino (California) labs, where customers were shown true redundancy - you could bring down a CPU, pull out a memory or network card, and voila! the system would continue as if nothing had happened! Was the concept of having your software programs, processes, and services running on a "processor farm" with true redundancy built in very different from the concept of your services running on a "server farm" with cross-region redundancy? The scale is different, of course, but I believe the template is the same.
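
If you are curious what that cross-region redundancy can look like from a client's point of view, here is a minimal TypeScript sketch - the region URLs are made up, and real cloud setups usually hide this behind load balancers and DNS rather than a hand-rolled loop, but the "keep going as if nothing happened" idea is the same:

```typescript
// A toy failover loop: if one "region" is down, quietly try the next one.
// The endpoints below are invented purely for illustration.
const regions = [
  "https://us-east.example.com/api/health",
  "https://eu-west.example.com/api/health",
  "https://ap-south.example.com/api/health",
];

async function fetchWithFailover(urls: string[]): Promise<Response> {
  let lastError: unknown;
  for (const url of urls) {
    try {
      const res = await fetch(url);
      if (res.ok) return res;                        // first healthy region wins
      lastError = new Error(`Status ${res.status} from ${url}`);
    } catch (err) {
      lastError = err;                               // network error: try the next region
    }
  }
  throw lastError;                                   // every region failed
}

fetchWithFailover(regions)
  .then((res) => console.log("Answered by:", res.url))
  .catch((err) => console.error("All regions down:", err));
```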

Hadoop is making waves with its capability to split large workloads into smaller chunks and then ship them to separate machines, which can then chew on these bite-sized pieces in parallel. That must be a new concept, right? Well, I think not. In the mainframe world, we had the concept of DB queries being broken down into smaller chunks during the "compilation" of the queries. These chunks would then be intelligently shipped off to the "disk processes" that the RDBMS ran close to each physical disk. The idea was that each work-unit of data processing would be done by the disk process closest to the physical disk the data was residing on - thereby guaranteeing the best performance. Again, the scale and infrastructure today are different, but the concept is tried and tested.
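
To make the split-and-ship idea concrete, here is a toy word count in TypeScript. Treat it as a sketch of the concept, not of Hadoop's actual API: the "workers" here are just local async functions standing in for remote mappers, whereas Hadoop ships the work to the nodes that already hold the data.

```typescript
type Counts = Map<string, number>;

// "Map" phase: each chunk of lines is counted independently.
async function countChunk(lines: string[]): Promise<Counts> {
  const counts: Counts = new Map();
  for (const line of lines) {
    for (const word of line.toLowerCase().split(/\W+/).filter(Boolean)) {
      counts.set(word, (counts.get(word) ?? 0) + 1);
    }
  }
  return counts;
}

// "Reduce" phase: merge the partial counts from every chunk.
function merge(partials: Counts[]): Counts {
  const total: Counts = new Map();
  for (const partial of partials) {
    for (const [word, n] of partial) {
      total.set(word, (total.get(word) ?? 0) + n);
    }
  }
  return total;
}

async function wordCount(lines: string[], chunkSize = 1000): Promise<Counts> {
  const chunks: string[][] = [];
  for (let i = 0; i < lines.length; i += chunkSize) {
    chunks.push(lines.slice(i, i + chunkSize));      // split the workload
  }
  const partials = await Promise.all(chunks.map(countChunk)); // "ship" and chew in parallel
  return merge(partials);
}

wordCount(["the quick brown fox", "the lazy dog"]).then((c) => console.log(c));
```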

And what about all the excitement about responsive and interactive web pages? Aren't they making the web pages heavier and heavier? Aren't they moving more and more processing to the client? Aren't the heavy JavaScript frameworks starting to look more like the client-server technology of the old days, with AJAX calls to servers, listeners, and callbacks? I leave it to you to decide...
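
To see just how familiar it looks, here is a bare-bones AJAX round trip in TypeScript. The endpoint and element ids are invented for illustration, but the shape - call the server, wait, update the page in a callback - is the same old client-server dance:

```typescript
// The classic AJAX pattern: the browser-side "client" calls a server endpoint
// asynchronously and repaints part of the page when the response arrives.
async function refreshOrders(): Promise<void> {
  const response = await fetch("/api/orders");       // hypothetical endpoint
  if (!response.ok) {
    throw new Error(`Server returned ${response.status}`);
  }
  const orders: { id: number; total: number }[] = await response.json();

  const list = document.getElementById("orders");    // hypothetical element id
  if (list) {
    list.innerHTML = orders
      .map((o) => `<li>Order ${o.id}: $${o.total}</li>`)
      .join("");
  }
}

// Listener wiring: the click handler plays the role of the old client-side
// event loop, and refreshOrders() is the remote call plus callback.
document.getElementById("refresh")?.addEventListener("click", () => {
  refreshOrders().catch((err) => console.error(err));
});
```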

So, what is really happening here? In my opinion, the new capabilities of the hardware and infrastructure are allowing us to use the old concepts in new ways. The massive scaling possibilities of connected processors are feeding the cloud and technologies like Hadoop. The increasing capabilities of mobile devices are fuelling heavier clients. Thus, it is more like old wine in new (and much larger) bottles! I would really love to see some truly new wine, though.
