You can’t tell me you haven’t asked this question yourself: Why exactly is Intel coming along right now with an integrated memory controller? And why does Intel now plan to put graphics capability into the CPU? Does AMD innovate while Intel follows? Here’s some food for thought.
So, let’s get right to it. Why is the integrated memory controller (IMC), a key feature that made AMD’s current Athlon 64/Opteron platform so successful, being developed for the next-gen Intel Nehalem platform? When I heard the news, a statement by Intel’s Pat Gelsinger from a 2005 IDF popped into my mind, in which the executive said that Intel would consider such a technology when the time was right.
Nehalem’s release must be the right time, apparently. But why? Intel had already developed such a technology in the past (see also the reader comments in our first Nehalem article) for its never-released Timna processor (some background on the development and the decision to scrap this chip can be found in our interview with Intel’s Mooly Eden). While AMD’s success with the integrated controller and customer pressure may have motivated Intel to rethink its IMC strategy, the official explanation is that multi-core has changed the landscape and will retire the concept of the good old FSB.
PR Manager George Alfs told me that engineers typically have certain tools they can use to improve a processor, and an IMC appeared to be the right approach to deal with the quick increase of threads in Nehalem. 45 nm Nehalem processors will be available with at least eight cores on the high end and, with the return of Hyper-Threading, there will be at least 16 threads in Intel’s fastest CPUs. “There is a lot of data going in and out. It makes a whole lot of sense to use an IMC in this architecture,” said Alfs.
Intel also said that it will be integrating graphics into the processor. We hadn’t really heard about this concept from the blue team until a few days ago. Could this idea be inspired by AMD? Intel’s news comes just about a year after AMD announced that it would leverage ATI know-how to build Fusion, a processor that will offer a graphics core on the low end and possibly a stream-processing core on higher-end versions.
It would be almost foolish to think that Intel had never thought about integrating its graphics technology into a processor. But integrating graphics into the processor goes against the very basic concept Intel’s business is built on: selling as many chips as it can. In today’s model, Intel sells CPUs and graphics processors separately, with two profit margins in place. In a future model, Intel may sell only one chip, with a profit margin far less than today’s. So far there really hasn’t been an incentive for Intel to integrate graphics into the CPU. Put Intel’s 40% market share in the graphics industry into this equation and you have one convincing reason for the company not to do it.
But market requirements and technologies are changing: “The CPU tends to absorb other components over time,” Alfs said. He also mentioned that integrated graphics have always been part of the Nehalem technology, which has been in development for about three years now.
I leave it up to you to decide how much influence AMD’s Fusion processor had on Intel. Interestingly, both approaches appear to be very similar: Alfs said that a graphics-equipped CPU would be positioned as a mainstream solution, while the company expects that there will always be a market for discrete graphics cards. Sounds like a Fusion competitor to me.
The most interesting part of this whole scenario will be timing. If Intel is able to roll out a graphics-equipped Nehalem processor close to the processor’s release date - which we expect to be the second half of 2008 - then Intel will have a huge advantage over AMD: Fusion is not expected to be unveiled until 2009/2010.
Is Intel copying AMD and throwing its enormous resources at every good idea AMD comes up with? Or is AMD just a bit more talkative about its ideas, and did the company leverage the idea of Fusion as a key reason to justify its acquisition of ATI? The answers really depend on your preferences and your point of view.
I can’t tell, and only certain executive ranks at both companies know for sure. But I doubt these are the real questions anyway. From a consumer perspective, both processor companies are in a highly competitive environment right now, which can only result in much better products. From AMD’s perspective, it really doesn’t matter whether Intel copies ideas or not: suing your competitor is only one side of the story. The green team knows the capabilities of the blue team and will need to keep coming up with ideas that differentiate its products - and we now know that it will be a challenge to position the IMC and Fusion, at least on the low end, as unique features.
Interestingly, one component is largely left out of the graphics discussion. If Intel and AMD aim to battle for the lion’s share of the graphics market, what does that mean for Nvidia? Yes, both AMD and Intel say that there will always be a market for discrete graphics. But will Nvidia be able to survive on the leftovers? We have discussed this topic with Nvidia extensively, and you can read the answers in an interview here on TG Daily on Monday.