Complexity kills the computer industry

Posted by Rob Enderle, Principal Analyst, Enderle Group

Analyst Opinion - AMD brought out Shanghai, its latest part, in November, substantially ahead of schedule, and the chip arrived in a market hungry for efficient performance but short on the cash needed to buy new systems. This week Nehalem EP was released, arguably the biggest technology step Intel has made this decade. Both parts are less about raw performance and more about performance per watt. Right now, there is no other industry in which we care so much about what is inside the box; in most markets, we care only about the benefits a solution provides. I am certain the computer industry is getting ready to take us back to that future.

The first rebirth of the computer came when the ENIAC gave way to the mainframe and the technology moved out of the labs and into business; the second was the wave that created PCs, workstations, servers, and the networks that tied them together.
Look at the changes now under way: Microsoft is rethinking the data center as a system rather than a collection of parts, Cisco has entered the space arguing that the network plays an equal part, and OpenCL is redefining what GPUs can do. I think we are fast reaching a point where the CPU matters less and less and the system matters more.
 
In fact, looking at HP's recent workstation release, for which the company brought in BMW to redesign the product and blended Intel and Nvidia technology into a differentiated solution, it may well be that, for both servers and workstations, parts are becoming parts again, and that OEMs are beginning to rethink how they build workstations, servers, and, with concepts like OnLive, PCs.

Let's take a look at what may be the third major rebirth of the computer.   


Netbooks and smartphones

Starting from the client and moving out, the concepts behind the leading smartphones and netbooks have to do with their ability to live off the web. The iPhone doesn't even multitask, ensuring that each task gets the full performance intended for it, and netbooks are positioned the same way: focused on doing a few things well and far more dependent on back-end services than the PC ever was. In effect, these may be more like terminals with pretty user interfaces than PCs.


GPU + CPU = Platform

With the emergence of OpenCL, the GPU, which used to be focused solely on rendering images, is moving into compute-intensive roles that used to be the sole responsibility of the CPU, and the two are trending toward the same die. In some systems, at some future point, you may not be able to tell where one leaves off and the other begins. This concept appears to be at the heart of Intel's Larrabee effort.
But why should this only occur on the client side, if clients are becoming more limited anyway?
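
To make that shift concrete, here is a minimal sketch of the kind of work OpenCL enables: a short C host program that hands a data-parallel job (simply adding two vectors) to whatever compute device the runtime exposes, GPU and CPU alike. The vector-add kernel and all of the names in it are my own illustration, not anything from a shipping product, and error handling is trimmed for readability.

/* A minimal OpenCL 1.x sketch: the same host code drives a GPU or a
   CPU; the caller no longer cares which one is inside the box. */
#include <CL/cl.h>
#include <stdio.h>

#define N 1024

static const char *kernel_src =
    "__kernel void vadd(__global const float *a,\n"
    "                   __global const float *b,\n"
    "                   __global float *c) {\n"
    "    int i = get_global_id(0);\n"
    "    c[i] = a[i] + b[i];\n"
    "}\n";

int main(void) {
    float a[N], b[N], c[N];
    for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

    /* Take the first platform and whatever device it offers by default. */
    cl_platform_id plat;
    cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* Copy the inputs to the device and reserve space for the output. */
    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof(a), a, NULL);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof(b), b, NULL);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof(c), NULL, NULL);

    /* Build the kernel from source at run time, one reason the same
       program can target very different processors. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kernel_src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "vadd", NULL);

    clSetKernelArg(k, 0, sizeof(da), &da);
    clSetKernelArg(k, 1, sizeof(db), &db);
    clSetKernelArg(k, 2, sizeof(dc), &dc);

    /* Launch N work items, then read the result back. */
    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof(c), c, 0, NULL, NULL);

    printf("c[42] = %f\n", c[42]); /* expect 126.0 */
    return 0;
}

The telling detail is in the middle of the listing: the code asks for CL_DEVICE_TYPE_DEFAULT rather than for a GPU specifically, which is the "parts are becoming parts again" argument in miniature.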

Cisco and Microsoft: Rethinking servers

Microsoft's lab effort to rethink the datacenter around low-cost processors has created something like RAID, a concept that was originally built for storage. Applying the idea to processors, though, by replacing the "D" in that acronym with a "P", is somewhat problematic for the naming folks.

Google had been scaring the hardware vendors with its internally deployed, self-built servers, but those stayed within normal server architectures. Microsoft took the next, more revolutionary step of completely rethinking the concept of a server, suggesting you treat the datacenter the way you would treat a huge PC case. I'm still wrapping my mind around the idea of a massively multi-core datacenter; if we toss in blended CPU and GPU solutions, we suddenly have something with capabilities no current datacenter has.

Cisco approached this from another direction. Realizing that a cloud has, at its core, servers that must understand the network, the company went back to the drawing board and created servers designed from the start to be network aware. Combine these three concepts, custom-built super servers that are network aware and carry blended CPU + GPU processing power, and we get an explosion of real and virtual cores with a result that sounds almost brain-like to me.

OnLive: Moving gaming to the web

Why am I picking gaming as the example application, you ask? Because gaming has a variety of elements that servers aren't known for doing well: graphics, physics, and artificial intelligence, all of which are generally better suited to high-end workstations than to servers. But if servers can be made to take these loads, then there isn't much a workstation can do that a server cannot. In fact, because the room-sized custom servers we are imagining would have the massive processing capability needed to create a realistic virtual world, they should eventually be able to match or exceed any single workstation, or even a group of them.
 
OnLive is the first test bed for this concept. It may be a bit long on promise and short on execution at the moment, but it is well timed. If it can be made to work adequately, it becomes the proof point that makes everything else real and interesting.

Wrapping Up

Complexity is killing the computer industry. We simply have too much stuff, generally running at around 10% of capacity and chewing up energy inefficiently. Buyers, both individuals and corporations, are looking for ways to save, and yet they can't seem to translate all this unused performance into something beneficial. But it appears the industry is now starting to rethink computing. Part of this process is creating a framework that may become the third rebirth of the computer, one that delivers more of a computing service and the appliance-like PCs we have always wanted but were never able to get.
 
In 1977, Ken Olsen, then president of Digital Equipment Corporation (DEC), said there was no reason for anyone to want a personal computer. While history proved him wrong, I think he was generally right: the market went in the wrong direction. What we wanted was a personal computing service; we did not want to become computer technicians.
 
I believe we are beginning to see a market correction. The next decade will take us back to a world more in line with where we should have been: appliance-like clients tied to managed services. If that happens, most of us can go back to not caring about what is inside the box.

Rob Enderle is one of the last Inquiry Analysts. Inquiry Analysts are paid to stay up to date on current events and identify trends and either explain the trends or make suggestions, tactical and strategic, on how to best take advantage of them. Currently he provides his services to most of the major technology and media companies.

The opinions expressed in this commentary are solely those of the writer.