Opinion: Can AMD reinvent the microprocessor?

Posted by Wolfgang Gruener

Groundbreaking developments in the microprocessor space typically come along once a decade. Think about the Gigahertz race in the late 90s and the multi-core trend that started last year. AMD is now convinced that multi-core may be a short-lived era and is shifting towards what it calls "APUs." But is it really a breakaway from multi-core, and can AMD establish a new type of CPU all by itself?

AMD Fusion: Presentation details and background

It was impossible to miss AMD's glowing confidence at the firm's analyst meeting held last week. To the delight of analysts, the first row of executives had only good news to report for 2006, and the company was rewarded with a climbing stock price, which was up more than 8% for the day. From a technology point of view, the stunning news certainly was that AMD believes the current excitement about multi-core may be a doomed trend.

It is remarkable, because we are just learning to understand what may be possible when more than one engine is running your PC. We thought (and still believe) that multi-core holds the opportunity to enable the types of applications visionaries such as Bill Gates have been talking about over the past ten years or so. And now AMD tells us multi-core isn't the key? Where did that come from?

The 2007-2009 mission: To go where Intel can't go

While we don't know how AMD's newly laid out strategy will turn out in the end, it appears that there are at least two major factors that are driving the firm's desire to break away from today's multi-core microprocessor trend.

First, AMD has always been the second player in the microprocessor industry. The company has been locked into a market position that has been a decent business opportunity, but AMD never had the chance to surpass Intel. In recent years, the company has celebrated its most significant successes, cutting deep into Intel's strongest markets and grabbing market share and revenue.

While the true reasons behind those events are complex, there is no denying that AMD's approach of listening to customers and developing products that match what the market wants has been executed much more efficiently by the green guys. Being different (and faster) can actually pay off, even if you are not the market leader.

Second, AMD has been a fairly small company, comparing its pre-ATI headcount of 14,000 or so employees with Intel's 100,000. Intel has more engineers testing processors than AMD had on its entire CPU engineering staff. This scenario somewhat cemented the fact that AMD's opportunities would always be very limited and that the company would mostly have to follow the trends created by the market leader. As a result, the company's success would always depend on the failure or success of Intel's products. Trying to be different is the natural move and could provide the company with a way to escape this trap. Purchasing a company like ATI enables AMD to both close gaps to Intel and outrun the market leader in other areas.

We still think that Nvidia would have been the better choice for AMD (but then, we don't know what ATI really has to offer behind closed doors), but as insane as it may have sounded at first, the ATI acquisition begins to make sense. When the announcement was first made, most of us looked at ATI's chipset business, which could enable AMD to catch up with Intel in critical business segments, such as notebook platforms. But the company is not wasting any time putting ATI's assets to work, and catching up appears not to be the main focus of the deal: We believe that AMD will try to create new markets that are unreachable for Intel in the short term. An example of this strategy is dynamic graphics switching, which will allow notebooks to dynamically switch from discrete to integrated graphics when a system goes into battery mode.


APU - solution for the future?

The other, more significant example is "Fusion," a code-name for a processor that "fuses" a graphics core with a traditional CPU and the Northbridge at the silicon level. The reasoning behind that idea is that basic 3D graphics is well on its way to becoming a commodity and that graphics processors have many more talents that aren't exploited today: For example, the floating point performance of ATI's R580 processors is already almost ten times higher than that of today's high-end dual-core processors. Technically, the integration of a GPU into a CPU could enable AMD to build very low-end processors that do not need an integrated graphics chipset anymore - and use the same idea to build supercomputer microprocessors.

AMD recently told us that Fusion is very much in the first stages of development, so its capabilities are largely theoretical at this time. AMD, however, reiterated at the meeting with analysts that the first Fusion processors will be mobile processors and will provide advantages in terms of power efficiency when released. Performance is pure speculation at this time, but there is a good chance that the integration into the CPU could eliminate bottlenecks and bring some benefit over the chipset-integrated graphics we are used to today. AMD also expects that such processors will be cheaper to produce than separate CPUs and integrated graphics chipsets.

What is enticing about this idea is the fact that Fusion is basically the same approach that brought us the integrated memory controller in Opteron and Athlon 64. AMD has been quite successful with this approach, and integrating graphics into the CPU is certainly a novel idea that could work out similarly. AMD's roadmap suggests that the first Fusion processors will surface in the 2009 timeframe.

But AMD does not call Fusion a "GPU-CPU"; it calls it an "APU" (accelerated processing unit) - which implies more than just a different way of integrating graphics. In fact, you could call this a "Lego" approach, which enables the company to take different building blocks to create different processors with very specific target applications. AMD claims that this "modular" approach to building processors will enable the company to quickly react to changing market trends. On the higher end, the GPU could morph into a "stream processor" that won't accelerate graphics but instead offer its floating point horsepower for general computation. AMD envisions stream computing moving into the mass market at some point, but clearly, stream computing is a high-end topic today and non-existent in the mass market, as there are no off-the-shelf applications that take advantage of this kind of floating point capability.
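
To make the stream computing idea a bit more tangible, here is a minimal sketch - our own illustration, not AMD code - of the kind of workload a stream processor targets: a simple SAXPY kernel in C, where every element can be computed independently and could therefore be spread across hundreds of GPU arithmetic units instead of one or two CPU cores.

```c
/* Illustrative only: SAXPY (y = a*x + y), a classic data-parallel
 * floating point kernel. Each iteration is independent of the others,
 * which is exactly the structure a stream processor exploits. */
#include <stdio.h>

#define N 1000000

static float x[N], y[N];

void saxpy(int n, float a, const float *x, float *y)
{
    for (int i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];   /* independent per element */
}

int main(void)
{
    for (int i = 0; i < N; i++) {
        x[i] = 1.0f;
        y[i] = 2.0f;
    }
    saxpy(N, 3.0f, x, y);
    printf("y[0] = %f\n", y[0]);  /* expect 5.000000 */
    return 0;
}
```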

What concerns us about Fusion is how the consumer will be affected if this idea is successful and develops as promised by AMD today. The company dreams about building a "one-size-fits-all processor" - and that is true only from a development or manufacturing point of view. A processor that is performance-tailored to certain application scenarios will only reveal that performance in those areas. There is a good chance that we are moving from today's general purpose microprocessor to specialized processors, which means you could end up with multiple computers in your house, each specialized for different tasks.

Asked about a possible "fragmentation" of the microprocessor market, AMD told us that "people should not be worried about what a device is supposed to do" and believes that Fusion will make buying a computer - or any other device that could integrate an AMD processor - simpler. From today's view, however, we believe that specialized processors could make buying a computer more complicated. At least as long as the consumer continues to be exposed to the task of deciding on a processor for his computer.

Is multi-core dead?

Chief technology officer Phil Hester told analysts that going overboard with multi-core processors could be compared to the Gigahertz race. Of course, this was a shot against Intel, which has been promoting the idea of dozens or hundreds of cores in one processor. Looking back, we realistically have to say, however, that quickly increasing clock speeds was the right strategy around the turn of the century, but the industry should have turned the corner towards more power efficient technologies earlier. No doubt about it.

The multi-core talk today in fact sounds very similar to the Gigahertz trend around 2000. It appears to solve many of today's problems, such as high power consumption, and promises to increase application performance at the same time. While Intel has been promoting the idea of "many-core" processors, the company appears to have retreated from that strategy at least partially. Current roadmaps from Intel suggest that notebook and desktop PCs will be stuck at quad-core for a couple of years, while greater core counts (eight) could be realized in server and workstation processors before the end of the decade. AMD explicitly said it won't participate in such a many-core race.

That claim, however, refers exclusively to "homogeneous" multi-core processors with a number of identical cores. "Heterogeneous" designs, with processing cores that can be assigned different tasks, are not quite new and have been discussed by Intel in the past - AMD's APUs are heterogeneous multi-core processors that are likely to see an increasing number of cores as well. However, AMD is the first to lay out a clear direction for such heterogeneous multi-cores. So, multi-core isn't dead.

Intel, by the way, is also aware of the need for number-crunching horsepower: SVP David Perlmutter told us in an interview earlier this year that "integer and floating point performance will improve significantly" in the upcoming 45 nm core due in Q4 2007. Also, 2006 was clearly the year in which stream computing came out of the shadows: Take for example Clearspeed's CSX600 accelerator card or the PeakStream software - two concepts that show that this is a trend that makes a lot of sense and has great potential.

The software perspective

The real challenge in stream computing, however, may not be building "stream hardware." There aren't many applications that take advantage of floating point horsepower today - the field is mainly limited to scientific and financial applications. Mainstream success of such processors will largely depend on the availability of software - and it is unclear how long it will take until developers pick up the trend.

AMD thinks the adoption of stream computing may take about as long as it took developers to embrace 64-bit - two years. But realistically, 64-bit is still a high-end topic and virtually non-existent in the mainstream (at least until we have a need for computers with more than 4 GB of system memory). Intel came out with its 64-bit extensions two years after AMD, but still well in time to take part in the 64-bit trend when it begins to grow.
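
As a side note on why that 4 GB mark matters, here is a small illustrative C snippet - our own example, not tied to any particular AMD or Intel product: a 32-bit process works with 4-byte pointers and can therefore address at most about 4 GB, while a 64-bit build lifts that ceiling by many orders of magnitude.

```c
/* Illustrative only: the addressing limit that makes 64-bit relevant
 * once systems carry more than 4 GB of memory. On a 32-bit build,
 * pointers are 4 bytes and SIZE_MAX is roughly 4.29 billion; on a
 * 64-bit build, pointers are 8 bytes and the limit all but disappears. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    printf("pointer size: %zu bytes\n", sizeof(void *));
    printf("largest addressable object: %llu bytes\n",
           (unsigned long long)SIZE_MAX);
    return 0;
}
```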

Turning the microprocessor space in a different direction is a monumental task - certainly much more elaborate than anything AMD has done before. I agree with AMD that stream computing could take the same route as 64-bit, but this one is of a different caliber. Having the vision is one side of the story; making it a reality will be more difficult.