Thursday, August 30, 2012

On the Abstraction of the Computer

I have a theory. I've had this theory for a while, and occasionally I subject those around me to a description of it. This is going to be one of those times.


First, a little background. I strongly think of the computer as a tool that enables us to perform tasks. Some of these tasks we did before computers were available, like balancing our checkbook, and the computer enables us to perform the task faster, or more accurately, or in some way better or easier. Other tasks did not exist before the computer came along, like surfing the web. Either way, the computer is simply a tool, and the tasks are really the important part of the equation.

Really, I think of the computer as a meta-tool, a toolbox or a workshop, basically. The applications are the individual tools. I think understanding this idea, in some form, is one of the key factors in whether a novice user progresses to a state where they are comfortable with alternate browsers like Chrome, or stays stuck in the "Internet Explorer == The Internet" mindset. The key is recognizing that the internet is the destination, that the browser is simply the tool of choice for getting there, and that there is more than one tool available that will work.

I use the browser in this example because these days it is the most common application for which a friend or family member will eventually suggest a replacement. I really do think that when this happens it is essentially a skill growth check for the user, to use RPG terms for a moment. If they get it, then it usually clicks for them, and they proceed to eventually discover other alternate applications. If they don't get it, then their skill growth is limited, at least in this area, until the next time they encounter the idea.

So, having laid out the background, let me tell you about my theory. My theory is that sometime in the near future, the act of computing will become abstracted and separated from the actual tool, the computer itself. I've held this theory since before tablets started getting popular, and it has only become more convincing to me since.

When I imagine this in my head, I always think of a fictional house, one with multiple computers in it. In this house, in the present day, if I want to use my personal finance software to work on my family's budget, I will go to the computer in our home office, because that is where I have that software installed. If my kids want to play a computer game, they will go to the computer in the game room, because it has the fancy graphics card and nice monitor. If I want to look up some random piece of trivia while watching a movie, I will grab a smartphone or a tablet or maybe a Chromebook, depending on what is within reach. I pick the computer to use for each task based on its convenience of access as well as its capabilities (is the app installed here? does it have the needed hardware?).

Imagine that I have 7 computing devices in my house. That isn't that many, I promise. I could get there with 2 smartphones, 1 laptop, 1 tablet, 2 gaming PCs, and 1 office PC. I didn't even get a home theater PC into the setup. How many of those devices are in use at any given point in time? How much processing power is going unused? It is really inefficient.

Look at the enterprise IT world, and this sort of scenario is exactly where the growth of virtualization came from. Instead of 10 separate servers for 10 different apps, each of which consumes 10% CPU on average, combine them onto 1 physical server and let the CPU run at 100%. Or combine them onto 2 servers, let each run at 50%, and you have spare capacity in case one server dies.
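To make that arithmetic concrete, here is a toy sketch in Python. The workload counts and CPU percentages are just the invented numbers from the example above, not measurements from any real system:

    # Ten hypothetical apps, each averaging 10% of one server's CPU.
    workloads = [0.10] * 10

    total_load = sum(workloads)  # 1.0 "servers' worth" of work
    for server_count in (10, 2, 1):
        per_server = total_load / server_count
        print(f"{server_count} server(s): {per_server:.0%} average CPU each")

    # Output:
    # 10 server(s): 10% average CPU each
    # 2 server(s): 50% average CPU each
    # 1 server(s): 100% average CPU each

The same consolidation logic applies to a house full of mostly idle devices: the total work is a small fraction of the total capacity, so in principle far fewer boxes could carry it.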

This is the reason I think computing will become abstracted. What if the computing resources in my house were a pool, instead of islands? What if my apps were centralized, and no matter which computer I was at, I could access anything I wanted? That would let me use the most convenient computer, which means I wouldn't have to go upstairs to the office to work on the budget, because I could access that application from the laptop, or in the future, from the tablet or smartphone.

But what about the gaming machines? What about specialized hardware? Well, I don't have a present-day solution, but I imagine that with an OnLive type of system, that hardware will also become part of the pool. What if I could sit down at the tablet, and it would use the graphics card from one of the gaming computers (one that nobody else was using) to render the graphics for the game I wanted to play? What if any screen, keyboard, and mouse (or touchscreen) combination could use any of the computing resources in the house to perform any task you wanted? Then you wouldn't have to worry about whether the machine in front of you had the capabilities. Once the capabilities were in your home computing pool, they could be accessed from any device. It would just be a matter of using the most convenient one.
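There is no real system behind this today, so treat the following as a thought experiment: a minimal Python sketch of what the pool's matchmaking might look like. Every node name, capability tag, and task in it is invented for illustration:

    from dataclasses import dataclass

    @dataclass
    class Node:
        """One machine in the hypothetical home computing pool."""
        name: str
        capabilities: set  # e.g. {"cpu", "gpu"}
        busy: bool = False

    pool = [
        Node("office-pc", {"cpu"}),
        Node("gaming-pc-1", {"cpu", "gpu"}),
        Node("gaming-pc-2", {"cpu", "gpu"}, busy=True),
    ]

    def schedule(needs):
        """Pick any idle node whose capabilities cover the task's needs."""
        for node in pool:
            if not node.busy and needs <= node.capabilities:
                node.busy = True
                return node
        return None

    # From the tablet, ask the pool to render a game. The work lands on
    # whichever gaming PC is free, regardless of which screen I'm at.
    node = schedule({"gpu"})
    print(node.name if node else "no capacity available")  # -> gaming-pc-1

The point of the sketch is that the device in your hand only needs to describe what the task requires; which box actually does the work becomes an implementation detail of the pool.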

But you would still be picking based on the size of the display, whether it has a mouse or a touchscreen, and so on. Honestly, I think that will go away too. In my vision, I see a time when any surface can become a display and/or an input device. Sure, you could still have dedicated monitors and mice for those times you really want to play a game and reflexes matter. But for that time you are in the kitchen and want to look up the recipe your mom emailed you, it can pop up on some unused counter space as a display with touch controls. Want to finish a movie while relaxing in a hot bath? It can be displayed on the wall in the bathroom. Essentially, this is a world where every surface could be a display on demand, and the actual computer boxes themselves have faded away to a room where you plug in computing modules to add capacity.

I think that once everything can be a computer, the computer as we think of it will become an outdated concept. The idea of having to go upstairs to work on your budget, instead of clearing some space at the kitchen table and having it displayed there, will seem as old-fashioned as driving around without turn-by-turn GPS on your phone and Pandora streaming in the background.

Combine this with wireless technology, and mobile devices can join the pool and access your resources when you are at home, then leave it and run on their own when you are gone. This would even work for devices like Google Glass, or other future technologies.

That is my theory. It's really more of a vision. I'm not claiming the implementation details will be correct, but I think that at a general level, it is the direction we are headed.

Now, why did I decide to share this with you today? I was listening to a podcast, and they were talking about VDI (Virtual Desktop Infrastructure). This is technology, like Citrix boxes, that provides virtual desktops and apps to you: virtual in that they run on centralized hardware, but are displayed on your screen. It's the new thin client.

They were speculating a little about where it was headed, and commented that the future might be one where we no longer buy a computer, but instead buy a desktop with apps installed, and just access that same desktop from wherever we happen to be.

That got me thinking about my theory, which also has to do with computers going away, and that is why I finally decided to write it down.