Sunday, May 28, 2017

So you think you are a Technology Company? Think again.

I was recently asked to speak on technology at our company town hall, and since our company is in the middle of a transformation into a “Technology Company”, I decided to make that the anchor point of my talk. Why? Because I have found that most companies and their employees do not really “get” this concept, leading to varied interpretations, much confusion, and ultimately lost opportunities. I wanted to make sure our employees got it right.

As I thought through my talking points, I came up with a small list of misconceptions about what a Technology Company really is. I decided to share all the thoughts I had gathered here, in anticipation of an interesting discussion and feedback from a larger forum…

So what is a “Technology Company”? The most common explanation I have heard is this: a Technology Company is one that effectively uses technology to be successful in business. If you let yourself believe that, you are in deep trouble, in my humble opinion. Merely “using” technology may have been a good strategy in the 70s and 80s, but today, being technology-enabled is just “table stakes”. Can you name any business today that has not adopted technology? You can’t, right? Even local mom-and-pop shops now use mobile payments and take delivery orders on WhatsApp. So, would you call all of them Technology Companies? I am sure you wouldn’t. So, what is it that Technology Companies do differently?

Well, to answer that, let me borrow a term coined by Tom Peters. The term he uses is “Re-Imagine”, which I think perfectly captures the essence of what I am attempting to convey. Technology Companies are companies that use technology to Re-Imagine business, rather than to just automate or enable existing business. They use technology to find completely new ways of doing business; they even invent new kinds of business on the strength of technology. At a Technology Company, technology defines business, rather than the other way around. Ask the big banks, and they will tell you how they thought interconnecting their branches and installing billion-dollar “core-banking” systems to centralize their operations would be all the technology play that was needed, only to be left gaping at upstarts like Atom (UK) and Ally (US), the new “online-only” or “direct” banks that have redefined (Re-Imagined!) what the banking business is. This quote from an article in Wired magazine (http://www.wired.co.uk/article/digital-only-banks) sums it up nicely: “I could make a compelling argument to say that Atom is actually a data company that happens to have a banking license”. Well, there you have it. I could not have put it any better myself. But wait, it gets even better: “Atom's competitor, Mondo, is perhaps best known in the tech world because it has run a series of hackathons to imagine new banking functions”. See what I mean?

The second common fallacy I find is the idea that Technology Companies are about the latest and greatest technologies. I have seen companies become veritable museums of new technologies, in an effort to embrace all that is new and shiny, thinking that just being on the latest technology will give them a business advantage. They keep building “next-gen” versions of their existing products using new technologies, thinking that will automatically make the product better. No. Technology is just a tool; it is how you use it that counts. Yes, the latest technologies may have capabilities that could give you advantages, but only if you apply those capabilities to business in innovative ways. Try this exercise. List the top 50 companies by R&D spend. Then list the top 50 most innovative companies. See how many names in the two lists match. Surprised at the lack of correlation? You shouldn’t be. Re-Imagination happens when business ideas and technology capabilities come together in previously unimagined ways to produce completely unforeseen outcomes. That is why it is so difficult. That is why it so often blindsides established players. And that is why it is so valuable.

As I was finishing up my preparations for my talk, I tried to anticipate counter-points that would come up, so that I could go in prepared. One of the top questions I knew folks would have was – “Hey, we have been in this industry for decades, and have unique knowledge and expertise in the domain, and that is our biggest competitive advantage. Why do we need to become a Technology Company?” Very good question, and many companies think along these same lines and go – “Oh yeah, that’s right! So, if I use technology to automate or enable these unique business strengths I have, I should be a winner on all counts, right?” Umm…hold on. I know that logic sounds very reasonable. But don’t jump to conclusions yet; follow me closely here. The problem is that your “unique business knowledge and expertise” is competitive in the context of how business is done today! It is tuned to the way the business works today. What if the context changes to a “Re-Imagined” way of doing business? Would your well-practiced orchestra go completely out of tune? Would your years of ingrained processes and culture make it all the more difficult to turn the ship and catch a new prevailing wind? You bet they would! Tell me, did Amazon have decades of experience as a book-seller? Or was Uber a taxi-fleet operator for a century? They are true examples of how Technology Companies change the game.

Well, that brings us to one final interesting question – if you are a true Technology Company, do you always remain so? Can you rest on your laurels and be assured of continuing success? Aha! Life is never that simple. Remember, you rose to ascendancy because you were the first to use technology to “change” how business was done, so if you stop changing, what happens? Yes, you are right; your competitors will soon catch on. What was unique to you will soon become established business practice! Your days ahead of the pack will be numbered, and soon the herd will be all around you. And now, you will be as susceptible as the others to a new upstart who changes the game again. So you see, to retain your Technology Company badge, you will have to keep using technology to change the game. Look at Facebook, the guys who wrote the rule-book on social media. They are now trying their best to catch up to the new brats in town – Snapchat and Instagram! The only company (that I can think of) that has relentlessly held on to the Technology Company badge is Apple. No wonder both their customers and their shareholders love them so much.

Hope I was able to get you to think differently about this topic. So, what do you think? Do you agree? Which of the fallacies listed above have you encountered? Do you have some more examples? What are your examples of great Technology Companies? I would love to know. Or, do you disagree? If so, please enlighten me with your point of view through your comments.

Sunday, February 5, 2017

Flying the product kite - creative tussle between Product Management and Engineering

If you have built software products as part of either Engineering or Product Management, I am sure you have often wondered why these two roles sometimes seem to be at cross purposes. Product Management always seems to want the greatest features now, and Engineering always seems to be explaining why a feature is technically infeasible or difficult to complete in so short a time. You are very sure both sides are trying to bring their best to the game, and you keep wondering how they can play better as a team.

These kinds of experiences led me to the need for a simple analogy describing the relationship between Product Management and Engineering - something that teams could easily grasp, something that would stick. So where did that search lead me? It led me back in time to the days of childhood and the joy of flying kites!

In my model, Product Management can be pictured as a kite, soaring among the clouds, with Engineering as the little kid on the ground, holding the string firmly. Please see the picture below the next paragraph, where I have attempted to represent this visually. (Note: this is just my way of looking at this relationship. You readers may have other ideas or opinions; I would love to hear about them, and I look forward to your comments.)

Product Management is up there facing the winds of change blowing through the business environment, trying not to be left behind. They have their head truly in the clouds, thinking up grand new features to leave competitors languishing in the lower echelons. Being up high, they are also able to gaze at distant horizons and see the future of the industry, and they may also look through their telescopes at other kite-flying teams to see what the competitors are up to. Engineering, on the other hand, is the kid running around on the tough technology terrain, trying to avoid prickly technical problems and the hard rocks of architectural dead-ends. They are the anchor, grounding the product in solid engineering, the voice of practicality and logic and reason that keeps the kite from being torn apart by the wind or snared by the electric pole. And just like the skillful interaction between kite and flier helps them reach new heights, close coordination between Product Management and Engineering is the only way to launch a successful product and keep the organization's banners flying.


If you have flown kites, you know that the only way to make the kite rise is to pull on the string, against the wind. Similarly, Engineering needs to keep a firm hold on the string to guide Product Management through turbulent business scenarios. Another thing you will know from your kite-flying days is that to get the kite higher up in the sky, you need to successively pull on the string and then release it, allowing more and more of the string to play out and carry the kite higher and higher. This is very important for Engineering to understand. The successive pulls on the string are the equivalent of the engineering hardening of the product, where feature-creep is kept on a tight leash and the product's resilience and performance are improved. The successive relaxations of the string are the innovations, hackathons, new technology adoptions and release marathons that Engineering undertakes to feed Product Management's needs for better features, improved user experiences and insightful business intelligence.

Kite-flying disasters are quite common when either of the parties stops playing as a team. An unruly kite that fails to respond to the inputs of the flier ends up in tatters or high-up on a tree, and a flier that pulls too insistently on the string is left with either a stalled kite or a broken string. These are important lessons for Product Management and Engineering to keep in mind.

By the way, in no way do I want to imply that these roles of kite and flier are rigid and exclusive. Far from it. Good engineers are expected to understand business and be aware of developments in the domain, and if they do, they can also become partners to Product Management in driving features. I have seen many instances of this happening. I have also seen equally commendable cases of Product Management being cognizant of the challenges faced by technology, and working with Engineering to create a roadmap that provides enough space for deep technology transformations. So yes, the parts played by the kite and flier can overlap, and they often do. However, the analogy presented here does provide a very simple story of what the generic and ideal relationship between Product Management and Engineering should be like.

What do you think? Do you see your kite-flying lessons being as handy in product development and engineering as envisioned here? Or do you feel that product development is too serious a sport to be compared with mere kite flying? Do let me know through your feedback.

Saturday, August 6, 2016

Technological Advances, or just "degrees of separation"?

I am very sure you have heard of the "Six Degrees of Separation" theory as a party topic or at the office water-cooler. Interestingly, as computer hardware, software and networking have advanced through the ages, computing has gone through its own "degrees of separation".

In the beginning, everything was one big block - the hardware, the OS, the applications - everything came from the same vendor and ran on the same box that took up the space of a house! Things were simple, it was a close-knit family living in a single room.

Then, in the mid-to-late 1960s, the first "application software" was developed and sold by a third party. This was a major step, since until this point, business software applications had been bundled along with the hardware and OS, and no one thought it could be any other way. Now, the computer hardware companies were joined by a new class of software companies - the Independent Software Vendors. Thus were born giants like Microsoft.

Meanwhile, the dumb terminal had separated from the mother-ship, and we were on to the next era of separation - the client-server era! This hardware separation was soon copied on the software side too, and voila, we had our "two-tier architecture"! Well, three is always better than two, right? At least the architecture pundits thought so, and stretched the two-tier architecture to create the new "three-tier architecture", forcing the English dictionary to make room for a new word in our vocabulary - "middleware".
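As a toy sketch of what those tiers separate (all class and method names here are hypothetical, purely for illustration): the presentation tier talks only to the middle tier, and the middle tier talks only to the data tier.

```python
# A toy three-tier separation: each tier knows only about the tier below it.

class DataTier:
    """Data tier: owns storage; knows nothing about business rules or UI."""
    def __init__(self):
        self._accounts = {"alice": 100}

    def get_balance(self, user):
        return self._accounts[user]

class MiddleTier:
    """Middleware / business-logic tier: mediates between client and data."""
    def __init__(self, data_tier):
        self._data = data_tier

    def balance_report(self, user):
        return f"Balance for {user}: {self._data.get_balance(user)}"

class PresentationTier:
    """Client tier: renders results; never touches the data tier directly."""
    def __init__(self, middle_tier):
        self._middle = middle_tier

    def show(self, user):
        print(self._middle.balance_report(user))

PresentationTier(MiddleTier(DataTier())).show("alice")  # Balance for alice: 100
```

The point of the middleware layer is exactly this indirection: either end can be swapped out without the other noticing.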

Things were going well in the three-tier world for some time, and then the industry was bitten by the separation bug again. We started hearing about "distributed" systems. Everything could now be distributed - services, servers, databases, disks - and they could be distributed around the room, around the data center, or across larger LAN/WAN setups. We were now in the world of "n-tier" architecture, and our single-room-dwelling family had now separated, divided up into hundreds of sub-members, and sprawled out across town and country.

However, the story was far from over. Before we had time to lament the break-up of the close-knit family and their scattering all over terra-firma, the separation drama reached for the clouds. And when mobile joined the party, things went really crazy! As I sit here and type on my web browser, it looks like it is all happening on my lap (now, don't get any ideas, all I mean is it is happening on my laptop...), but in reality the server sending me this page could be anywhere on earth, the database storing my words could be at the other corner of the globe, and my precious text could be merrily flying around clouds, travelling over thin air or under-sea cables. It truly is mind-boggling. The single-room dwelling is now a global village. How separated is that?

But why limit our thoughts to mere earth? The farthest computer today is probably on the Voyager 1 spacecraft, which is a mind-boggling 20 billion kilometers away from the earth and counting. If a client on earth were to "ping" the server on Voyager 1, it would take about 38 hrs. to come back with a response, since that is the round-trip time for light to Voyager 1! (http://voyager.jpl.nasa.gov/where/) Talk about a really slow network! So, it is not hard to imagine the day when our computing cloud will be separated across millions of miles of interstellar space.
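That 38-hour figure is easy to sanity-check with a back-of-the-envelope calculation (the ~20.2 billion km distance is my approximation for mid-2016; the real number changes daily):

```python
# Round-trip "ping" time to Voyager 1, limited by the speed of light.
SPEED_OF_LIGHT_KM_PER_S = 299_792.458
distance_km = 20.2e9  # approximate Earth-to-Voyager-1 distance, mid-2016

one_way_seconds = distance_km / SPEED_OF_LIGHT_KM_PER_S
round_trip_hours = 2 * one_way_seconds / 3600
print(f"{round_trip_hours:.1f} hours")  # roughly 37-38 hours
```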

Well, enough about separation, it is time for my separation from this close-to-too-long post. See you soon in my next post.

Sunday, May 19, 2013

The Success Paradox

I have been advising many customer CTOs and VPs on their product strategies, product roadmaps and modernization efforts. In the past, I have also led new product developments, and have had engineering ownership of a mature banking product with over 500 customers. Looking back at what I learnt from all of this, I see a clear trend - the more success a product has had in the recent past, the higher the chances that engineering is nearing a dead-end. The greater the past success, the more difficult it is to achieve the next leap in engineering for the future success of the product. And, on top of that, the longer you wait to take the next leap in technology and engineering, the worse it gets.

Why does this happen?

Let us assume you head engineering, and have started building a new product. You have a clear vision of what the product needs to do, and how you are going to achieve it. The design, architecture and roadmap of the product are based on this initial vision. The choice of technologies and tools is similarly based on current needs and current availability. Everything goes well in development. You release the product into the market and sit back and relax, expecting to keep working on the roadmap at your own pace and priority. Suddenly, your product picks up! You have new customers signing up every day and, guess what, your plans are hijacked. The business wants to capitalize on the momentum and starts pressuring you to deliver new features faster, features that you had never planned for. Customers start getting pushy about their defects and their feature requests. The load on the system keeps increasing dramatically. You hire a larger team, go with the flow, start churning out releases by the dozen, add many new features, increase the infrastructure footprint, integrate with a bunch of partner products...you are running just to keep up!

The years fly by, and one fine day, business comes back and tells you - "the product is not good enough, and engineering does not seem to be able to give us what we need in time". What?! Are we talking about the same product that was beating the charts 5 years ago? Yes, we are. Unfortunately, while you were busy fixing bugs, adding new features, improving performance to meet the increasing load on the system, and fighting off impractical feature requests, the world moved on. Competitors have come out with cooler stuff built on newer technology. Their solutions are more modular and can integrate with other services. They are more nimble and agile. On the other hand, your technology, which was shiny and new at inception, is now rusty; your architecture looks dated and monolithic; your interfaces are not open enough. And guess what, over all those years, as you were madly keeping up with "business as usual", technical debt was silently creeping up behind you. The trickle of technical debt, which you always planned to catch up with in the next release, is now a mountain, blocking your way to agility, nimbleness and efficiency. Each new feature now takes longer to develop, and costs more. No wonder business is complaining!

I see this story repeated again and again.
So, what is the solution?

Well, once you get to this state, there is no easy way out. So, my suggestion is: never let yourself get to this stage. Keep "watering the roots" - keep looking at ways to improve the architecture, keep refactoring and catching up with tech debt, keep an eye out for new technologies and trends and adopt what is necessary, keep in tune with business strategy and align the product roadmap accordingly, and, of course, use an Agile or Lean development methodology. These will help, but will not insulate you completely. You will still have challenges. But just being aware of the paradox and taking adequate steps should make life much easier.

Saturday, January 12, 2013

Web 3.0 - are we there yet?

We are now all too familiar with Web 2.0. It has been around for some time, and we have heard a lot about how it has transformed the world of the internet. Now, with the advent of HTML5 and the rapid developments in "rich media" and the "responsive web", Web 2.0 already seems like a relic from the past. So why are we not getting to Web 3.0 yet?

Don't you know, Web 3.0 is already here! "Why did I not hear about it?" - you ask. Well...remember, Web 2.0 was more a marketing term than anything else. It was used to put a label on the state-of-the-art web at the time; it was a handle technology marketers could use. It was never really a "technical specification". So, though I find that in many ways we are already into Web 3.0, we are still waiting for someone to turn on the marketing and publicity blitz to make us sit up and take notice.

Why do I say we are already into Web 3.0? Just as Web 2.0 was defined by a few major things - democratization of the web, asynchronous calls (AJAX), and subscriptions and feeds (RSS etc.) - Web 3.0 is supposed to be built on four key concepts: the semantic web, personalization, artificial intelligence and "anytime anywhere" access. All of these are already available today in some form or other! Twine, which was first announced way back in 2007, was a good attempt at a semantic web. Though it did not succeed, it still laid the foundations. Today, many social networking and search sites use semantic search. iGoogle is the best example of personalization, and it is very much here. Artificial intelligence is evident in many features of popular sites and services, be it Facebook's Graph Search or Apple's Siri. And need I say anything about "anytime anywhere"? It is one of the most heard terms these days.

So believe you me, Web 3.0 is already here! If you are interested in knowing more about Web 3.0, this site has links to some wonderful material.

Sunday, August 28, 2011

Technology Deja Vu

I feel like I am in the middle of that popular sci-fi film - "Back to the Future" !

Every new technology that hits the headlines seems to remind me of something I have seen before. It brings back memories of the past; it raises the same old questions; the concept does not feel truly "novel".

Do you feel so too? If you have been around in the IT industry for more than a couple of decades and have earned your programming chops on the trusty old mainframes, I bet you do get that feeling, right?

Enough of talking in the abstract. Let us look at some examples...

Let us start with that prime example of new technology - the cloud. Wow, you can now have your software and services run anywhere out there in the wild, and access them at will! You need not know where they are running, you need not worry about resource constraints (kind of, since you can set it up to be elastic), you need not worry about downtime. Heavenly, isn't it? Yes it is, but is the concept new? Were things very different in the "multiprocessor" days of the mainframe? We never used to bother about where our processes and programs were running, and resources were not usually a constraint either. And downtime? Well, in my days as a Tandem (later Compaq NonStop, and then HP NonStop) programmer, I remember the demos at the Cupertino (California) labs, where customers were shown true redundancy - you could bring down a CPU, pull out a memory or network card, and voila!, the system would continue as if nothing had happened! Was the concept of having your software programs, processes and services running on a "processor farm" with true redundancy built in very different from the concept of your services running on a "server farm" with cross-region redundancy? The scale is different, of course, but I believe the template is the same.

Hadoop is making waves with its capability to split large workloads into smaller chunks and then ship them to separate machines, which can then chew on these bites in parallel. That must be a new concept, right? Well, I think not. In the mainframe world, we used to have the concept of DB queries being broken down into smaller chunks during "compilation" of the queries. These chunks would then be intelligently shipped off to the "disk processes" that the RDBMS would be running close to each physical disk. The idea was that each work-unit of data processing would be done by the disk process that was closest to the physical disk the data was residing on - thereby guaranteeing the best performance. Again, the scale and infrastructure today are different, but the concept is tried and tested.
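The split-then-merge pattern itself fits in a few lines. Here is a bare map/reduce word count in Python (not Hadoop's actual API; the chunks are "shipped" to a thread pool instead of to separate machines):

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def map_chunk(chunk):
    """Map step: each worker counts words in its own chunk, independently."""
    return Counter(chunk.split())

def reduce_counts(partial_counts):
    """Reduce step: merge the per-chunk counts into a single result."""
    total = Counter()
    for partial in partial_counts:
        total += partial
    return total

# Each string stands in for a block of a large file on a separate machine.
chunks = ["the quick brown fox", "jumps over the lazy dog", "the end"]

with ThreadPoolExecutor() as pool:
    partials = pool.map(map_chunk, chunks)

word_counts = reduce_counts(partials)
print(word_counts["the"])  # 3
```

Swap the thread pool for a cluster and the strings for file blocks, and you have the essence of both Hadoop's job model and the mainframe disk-process trick described above.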

And what about all the excitement around responsive and interactive web pages? Aren't they making web pages heavier and heavier? Aren't they moving more and more processing to the client? Aren't the heavy JavaScript frameworks, with their AJAX calls to servers, listeners and call-backs, starting to look like the client-server technology of the old days? I leave it to you to decide...

So, what is really happening here? In my opinion, the new capabilities of hardware and infrastructure are allowing us to use old concepts in new ways. The massive scaling possibilities of connected processors are feeding the cloud and technologies like Hadoop. The increasing capabilities of mobile devices are fuelling heavier clients. Thus, it is more like old wine in new (and much larger) bottles! I would dearly love to see some truly new wine, though.

Friday, June 10, 2011

Technology Turns Social

Social: Adjective, [attributive] relating to society or its organization, … suited to living in communities, living together in groups … with complex communication. Origin: … from Latin socialis 'allied'…[Oxford Dictionary].

As technologists, we should not forget that the consumers of technology are primarily humans – “social” animals.

Therefore, changes in social behavior often influence the direction of technology. And, when technology enables the natural tendencies and aspirations of society, a “tipping point” is quickly reached, propelling massive adoption. This happened with Facebook when it made “social networking” truly social with photo tagging, and is seemingly happening with Groupon, as evidenced by the sky-high valuation it commanded in a takeover bid.

Hence, to predict the technologies that are bubbling-up to the top now and will shape this decade, let us look at trends in society and social behavior, and see where that leads us…

People today are “on the move”, but need to maintain their complex communications. Therefore, “mobility technologies” are definitely going to be a growth area in the coming years.

Society has come to expect “infrastructure as a service”. Though recently popularized by cloud and virtualization vendors, such services are not new. Telecom services have been available for some time now as prepaid packages that can be “topped-up” as needed, and wireless broadband has replaced the modem and router at home. Virtualization and cloud computing will become equally prevalent.

Realizing the importance of social behavior, technology is trying to understand and predict it, like a dog chasing its tail! As a result, analytics, business intelligence and behavior/sentiment analysis are thriving. These will be areas to watch out for in the years ahead.

“Social” traits are now evident even in new computing paradigms. “Elastic computing”, which seamlessly distributes large workloads across a community of allied resources, is the theme for technologies like MapReduce, Hadoop/Hive, Terracotta/BigMemory, Azul Systems’ Zing JVM, and similar projects. This trend will keep gaining momentum in 2011. The growing volume of “relationship” data in “social networks” has given rise to “graph databases” and the related concept of “NoSQL”. These are still very early-stage, but will rapidly mature in the next couple of years.

Am I the only one thinking "social"? Of course not. The Gartner 2011 top 10 strategic technology list includes – Cloud Computing, Mobile Applications and Tablets, Social Communication/Collaboration, Video, Next-Generation Analytics, Social Analytics, Context-aware Computing, Storage-Class Memory, Ubiquitous Computing and Fabric-based Infrastructures. As you can clearly see, most of these technologies follow the themes of social usage patterns, mobility, infrastructure as a service and “computing-infrastructure communities”.

So, if you are a Technologist, better start thinking like a Sociologist!