Background: What Fusion will be – and what not

Posted by Wolfgang Gruener

Chicago (IL) – We recently spent some time with AMD to learn more about its upcoming chip technologies. Not surprisingly, one focus was Fusion, AMD’s mysterious processor that will marry graphics and CPU in the 2009 or 2010 timeframe. We heard lots of buzzwords and did not see an actual product, but there was enough information to get a better idea of AMD’s plans: Join us for a closer look at what Fusion may look like in three or four years.

And yes, we are interested in your take on Fusion. Let us know what you think about AMD’s ideas and write a comment in the form at the end of this article.

AMD may be the perfect example of how quickly fortunes in the semiconductor industry can change. While the company had a fantastic run from the end of 2003 until the third quarter of 2006, it has been taking a relentless beating from Intel since then. AMD teased Intel for some time, probably a bit too much, and it’s payback time now: Intel appears to be stronger than ever and leaves no doubt that it will do everything it can to regain lost market share.

There is also the strange sensation that Intel has recently come up with several ideas that sound very familiar to what we had previously heard from AMD. Able to out-spend and out-resource its much smaller rival several times over, Intel at least in theory has the ability to copy ideas from AMD and introduce products within a similar time frame. We do not know whether Intel is in fact copying ideas from AMD, but AMD’s reaction – becoming more cautious about which product news it discloses and which it does not – is telling.

However, the overall situation has put AMD in the very inconvenient position of not being able to talk about new products in order to keep analysts, investors (and, yes, journalists as well) happy. As a result, negative headlines about the firm’s financial performance have dominated AMD coverage in recent months. Rather than waiting out the storm, AMD is now taking steps to break the current pattern: The company is getting more active again and said that it will provide updates about its technology and product plans on a more regular basis.

At a recent event, AMD briefed journalists about its key future technologies, some of which we have agreed not to discuss just yet and some of which clarified AMD’s direction. Much of the focus was put on the hybrid processor: While “Fusion” generated quite some interest and speculation last year, it has recently been reduced to a buzzword that hardly anyone (outside of AMD) understands. Occasionally, we even heard that Fusion may only exist on paper to justify the acquisition of ATI.

Let’s have a closer look at what we learned about this new processor concept.

Fusion: Graphics to start

Fusion is essentially the first phase in AMD’s vision of taking the traditional processor from the multicore era into what the company calls “accelerated computing”. Scheduled for a late 2009 or early 2010 release, Fusion will initially be offered as an entry-level/mainstream processor for notebooks. Over time, Fusion will develop into a broad family of products that will eventually span most (complex) microprocessor markets.

AMD believes that the combination of graphics and a traditional CPU core on one die will offer three key advantages: (1) a better performance/watt ratio; (2) more performance through reduced latency between the CPU and the GPU; (3) some economic benefits, as producing one Fusion processor may be cheaper than producing a CPU and a separate graphics core. Putting a relatively low-end CPU-GPU solution into a notebook – and not into a desktop – also makes sense, as users of such devices are unlikely to upgrade their system.

From a historical perspective, the integration of the GPU into the CPU could be considered a natural evolution. Just as the FPU was absorbed by the CPU more than a decade ago, it isn’t out of the question that basic graphics functionality could also become a feature of the CPU – a feature we would simply expect a few years down the road. Much more than is the case today, graphics could become a “default” commodity.

As in the past, however, there are certain concerns whenever such an integration move happens. There are reasons why GPUs have not been integrated into CPUs, and those reasons certainly include an economic component: Chip companies such as Intel want to sell as much silicon as they can, and combining two chips into one at least theoretically reduces the number (and revenues) of chips that can be sold. So, if it were just about graphics, Fusion could be an extremely risky move for AMD from a financial perspective. But, as it turns out, the idea behind Fusion is not really about graphics – it is about opening up the graphics processor to general purpose applications. And this is where Fusion gets interesting.

Roadmap: From graphics to general purpose applications

If you have been using computers for some time, you may be aware of those sci-fi applications that visionaries such as Bill Gates have described over the past two or three decades. Since those applications were usually promised to arrive within a few years, we have learned to take such promises with a grain of salt: Speech recognition isn’t really as reliable as it should be, you still have to be very creative when using search engines, and consumer-style face recognition has been a no-show so far.

Interestingly, GPUs have been gaining lots of processing capability since about 2002, a trend that has gone by almost unnoticed: GPUs adopted an increasingly parallel processing model, resulting in massive, largely untapped floating point horsepower. Today, two or three graphics cards are enough to achieve one Teraflop/s of performance – the same level of performance that would require about 30 current dual-core server processors (CPUs).
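
A quick back-of-the-envelope check, using only the ballpark figures in the preceding paragraph (a sketch, not vendor specifications), shows what per-chip throughput those numbers imply:

```c
#include <stdio.h>

int main(void)
{
    /* Approximate figures as stated above. */
    const double target_gflops = 1000.0; /* one Teraflop/s                 */
    const double gpu_cards     = 2.5;    /* "two or three" graphics cards  */
    const double cpu_chips     = 30.0;   /* current dual-core server CPUs  */

    /* Implied peak floating point throughput per chip. */
    printf("per graphics card: ~%.0f GFLOPS\n", target_gflops / gpu_cards); /* ~400 */
    printf("per server CPU:    ~%.0f GFLOPS\n", target_gflops / cpu_chips); /* ~33  */
    printf("GPU/CPU ratio:     ~%.0fx\n", cpu_chips / gpu_cards);           /* ~12x */
    return 0;
}
```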

According to AMD, this growing horsepower could come in handy for “general purpose” applications running on Fusion processors. Perhaps not the first generation of the family, but the second and third generations could take advantage of optimized software code and exploit the general purpose potential of the GPU. AMD expects this third Fusion generation to be available in the 2011-2013 timeframe.

Fusion will neither remain limited to processing graphics nor stay a mobile-only processor: It is planned to quickly penetrate other markets, such as the mainstream desktop, the enthusiast desktop, servers and high performance computing. AMD even believes that Fusion could go into the consumer electronics market, as the software stack for devices such as DTVs becomes more complex and requires x86 processors instead of embedded CPUs.

Based on a multicore model, the Fusion family carries the potential to spread out into many different processors within each market segment, each of which could carry a different number of CPU and graphics cores. AMD explained that the era of “one size fits all” processors may be over with Fusion – and conceivably, the ability to combine almost any number of CPUs with any number of GPUs and perhaps other special purpose processors on one die could be a recipe for a hugely confusing product portfolio. However, AMD representatives told TG Daily that it is unlikely that there will be more processor models per market segment than there are today – and suggested that finding a processor for a certain purpose may actually become easier than it is now. Rather than promoting Gigahertz, the company will switch to promoting the features of a processor – or the “experience” the buyer can expect from a certain processor.

The road to Fusion has several roadblocks that have to be cleared. Among them is the fact that while graphics cards have the quantitative processing power, they lack qualitative processing capability. Most GPUs are based on 32-bit processing (single precision), which is not enough: A wrong pixel here and there isn’t a big deal in games, but it certainly matters in other applications – for example, when financial models are calculated. Double precision capability (64-bit) will be one of the minimum requirements when GPUs are deployed to run general purpose applications.
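
To illustrate why the distinction matters (a minimal sketch in plain C, not tied to any GPU hardware or toolchain): single precision offers roughly seven decimal digits of precision, so rounding errors accumulate quickly in long-running calculations, while double precision keeps the same sum essentially exact.

```c
#include <stdio.h>

/* Accumulate a small value many times in single and double precision.
   The exact result is 10,000,000 * 0.1 = 1,000,000. */
int main(void)
{
    float  sum_single = 0.0f;
    double sum_double = 0.0;

    for (long i = 0; i < 10000000; i++) {
        sum_single += 0.1f;
        sum_double += 0.1;
    }

    /* The single precision sum drifts visibly away from 1,000,000;
       the double precision sum stays essentially exact at this scale. */
    printf("single precision: %.2f\n", sum_single);
    printf("double precision: %.2f\n", sum_double);
    return 0;
}
```

The same effect, scaled up to millions of financial transactions or scientific data points, is why double precision is treated as a baseline requirement for general purpose GPU work.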

And then there is the software challenge.

Changing the mindset of developers

The big question mark behind Fusion really is: How do you get developers excited about this processor concept, and how do you create an incentive for them to write applications that run on a GPU? Without the hardware, there appears to be no reason for developers to invest in something they do not know. And vice versa: A Fusion processor that offers just integrated graphics, without general purpose software that takes advantage of the GPU horsepower, makes about as much sense as a Ferrari driven in 25 mph zones.

Clearly, the CPU+GPU chips need to be exposed to applications, and a massive effort will be required to master this challenge: In the end, AMD is asking for expertise in two different styles of programming – one that addresses the sequential processing needs of the traditional CPU and one that creates massively parallel applications. Considering that three years after the introduction of the Athlon 64 we are still waiting for mass-market (consumer) 64-bit applications, and that multithreaded applications for dual-core processors have turned out to be a monumental task for many developers, there is some doubt about how quickly we will see widely available software that runs on general purpose GPUs.
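
As a rough illustration of those two styles (a minimal sketch in plain C, not AMD’s actual programming interface): the sequential version walks through the data one element at a time on a single CPU core, while the data-parallel version expresses the same work as an independent per-element function – the kind of workload that a GPU’s many stream processors could execute concurrently.

```c
#include <stdio.h>
#include <stddef.h>

/* Sequential style: one CPU core processes the array element by element. */
void scale_sequential(float *data, size_t n, float factor)
{
    for (size_t i = 0; i < n; i++)
        data[i] *= factor;
}

/* Data-parallel style: the same work written as an independent per-element
   "kernel". Because no element depends on another, a parallel runtime or a
   GPU could, in principle, launch all n invocations at the same time. */
void scale_element(float *data, size_t i, float factor)
{
    data[i] *= factor;
}

int main(void)
{
    float samples[4] = { 1.0f, 2.0f, 3.0f, 4.0f };

    scale_sequential(samples, 4, 2.0f);        /* CPU-style loop             */
    for (size_t i = 0; i < 4; i++)
        scale_element(samples, i, 0.5f);       /* per-element "kernel" calls */

    for (size_t i = 0; i < 4; i++)
        printf("%.1f ", samples[i]);           /* back to 1.0 2.0 3.0 4.0    */
    printf("\n");
    return 0;
}
```

The hard part AMD is alluding to is not a trivial loop like this one, but restructuring real applications so that their performance-critical work decomposes into such independent items in the first place.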

Common sense suggests that, in a best-case scenario, Fusion will need a mass-market killer application from the likes of Google or Microsoft to showcase its potential. AMD, however, believes that there is also potential for a newcomer in this market – a developer with fresh ideas for applications we aren’t imagining today. For a start, the company showcased a simple but very fast face recognition application that, at least in a demo scenario, already appeared to work well. The company also ran a small application that combined complex image rendering, movement recognition and physics: An image was captured by a web camera, rendered into a bitmap and then dissolved into thousands of separate elements. Users in front of the web camera were then able to push an object through the sea of particles simply by moving their hand from top to bottom as well as from front to back. The particles were physics-enabled and reacted to collisions in a natural way.

The big question remains: What happens if this new type of software isn’t available when Fusion arrives? AMD says that, in the end, Fusion will remain a processor with traditional CPU and GPU features. And if it really comes down to it, it will run your graphics anyway – just like any other CPU paired with a separate integrated graphics solution. That is not really a satisfying scenario, but it is a safety net of sorts for what could be considered one of the riskiest product decisions AMD has ever made.

Obviously, we do hope that someone will take advantage of this processing horsepower and come up with software that is more user-aware and can interact with humans in a more human-like way. Ultimately, this technology puts those fancy applications within reach: face recognition, speech recognition that relies on audio processing as well as the translation of lip movements, and image search engines that can deal with search terms such as “find all images with uncle Tom” – and actually come up with reasonable results.

How Fusion fits into Torrenza

AMD’s Torrenza initiative is slowly taking shape, with first products surfacing here and there. While Torrenza encourages hardware developers to create devices such as accelerators around AMD processors, Fusion will become the heart of an “accelerated Torrenza platform”. In AMD’s vision, a Fusion (accelerated) processor could be directly connected either (1) to a third-party HTX accelerator or (2) to a third-party socket-compatible accelerator (for example, on a dual-socket motherboard).

There is also the idea of connecting PCI Express-based accelerators to the chipset, which has to deal with latency issues and a certain amount of data overhead. Nevertheless, this specific connection appears to be the favored foundation for first-generation Torrenza systems. For example, Tarari has developed an accelerator card with the sole purpose of accelerating virus scans: While four Opteron cores can process about 300 Mb/s of virus scan data through the Kaspersky anti-virus package at 50% processor load, the Tarari card achieves about 6.2 Gb/s at about 2% processor load, AMD said.

Torrenza may go in different directions over time, as the platform model is established through a “learning by doing” approach, according to AMD.

Manufacturing challenges

AMD has enjoyed massive growth in recent years, if we forget the last two quarters for a moment. But that growth came with growing pains as well, some of which can be found in the firm’s manufacturing.

AMD underestimated the demand for its Turion 64 X2 processors in the fourth quarter of last year and, on the other side, overestimated the demand for its desktop processors. As a result, the company ended up with the wrong mix of products, which put it in the inconvenient position of having to choose which customers would get Turions and which would not. The company decided to give preferred treatment to its new Tier 1 customers (Dell, Lenovo) and disappoint smaller customers, who ended up buying Core 2 Duos from Intel.

Assuming that Fusion really goes in the direction AMD has laid out, and that it spreads out into a number of different processors across different product segments – reaching from high performance computing down to the notebook, and from the DTV down to portable devices – AMD will certainly need much more production flexibility than it has today.

Currently, the company produces processors in two of its own fabs (Fab 30/38 and Fab 36 in Dresden, Germany) as well as in three contracted fabs from Chartered, TSMC and UMC. However, only Fab 30 (which will become Fab 38 after its conversion to 300 mm wafer capability), Fab 36 and Chartered are able to produce processors on SOI, which Fusion is expected to use. The fabs of UMC and TSMC are currently utilized for “bulk” production of former ATI products such as graphics processors, handheld media processors and chipsets.

Any additional flexibility will help AMD compete with Intel – whether it is the addition of bulk capability to its own fabs (which they do not have at this time) or added SOI capacity. In this light, AMD’s recent announcement that it will build a 32 nm fab in Luther Forest in New York begins to make a whole lot of sense. Breaking ground is expected to happen between July 2007 and July 2009; the production ramp could start sometime between 2009 and 2011 – about the time when AMD expects Fusion to ramp into the market. There is no doubt that Luther Forest will be critical for AMD’s ability to deliver a reliable and flexible product mix of Fusion (and the other, non-accelerated processors that are expected to co-exist with Fusion).


Conclusion

I am still scratching my head over how successful the Fusion concept can be. There are some exciting elements in Fusion, especially the idea of supercomputing performance on your desktop: In virtually any segment of the market, certain applications could see 20x to 30x performance jumps. But Fusion isn’t quite a product yet. Keep in mind that it is a concept that is just beginning to take shape.

The software challenge is massive, and AMD will have to invest enormous energy to educate a new generation of developers and create enough incentive for them to tap the general purpose horsepower of a heterogeneous processor. The transition to 64-bit applications and multicore has taught us that convincing a critical mass of developers to switch to a new programming style is anything but easy.

It is unclear at this time which path Intel will take. Will both companies drive this idea of “accelerated computing”? Perhaps. But we don’t know for sure. AMD executives said that they don’t care what Intel will do and that they won’t turn around and follow Intel. However, chief technology officer Phil Hester noted that AMD would like to work with Intel on this topic: “It’s their call,” he said during a briefing.

Listening to AMD executives, product managers and engineers throughout a two-day event, we have little doubt that the company is betting big on Fusion – on its potential to reach new performance heights and to become something different from what Intel may offer. There is a lot of energy and enthusiasm surrounding this project and, from our subjective impression, the idea of Fusion should be powerful enough to fuel a new wave of fresh ideas and innovation. And there’s certainly nothing wrong with that.

What do you think? Does Fusion make sense? Which features would you be especially interested in seeing in such a chip? Let us know and write a comment in the form below.