Does AMD’s Spider catch Intel’s Penryn in its Web?
Analyst Opinion - The fourth quarter, particularly after Black Friday, is critically important to the PC industry. For the first time this decade, according to the CEA, PCs are at the top of US shoppers' wish lists, which indicates that volume should be very strong. While we often watch Apple, Dell, HP, Acer (Gateway), and Lenovo in this segment, the big fight is always between AMD and Intel.
Historically, Intel has been the power player and AMD has played catch-up. Unusually for this decade, however, AMD was out-executing Intel and entered this season several times in a rather strong position. This year Intel has been on fire, and this quarter is no exception, with what is arguably the strongest product launch the company has made this decade.
AMD certainly knew this was coming, took a dramatic risk in buying ATI, and quietly put together a counter-strategy called “Spider,” which combines ATI chipsets and graphics with its Phenom processor. This is a different approach to the competitive problem, one that targets Intel’s historic weakness: graphics performance. It is a significant departure from a focus on the microprocessor alone, an Intel strength that, by itself, is almost unmatched.
This all runs into what is a big problem right now: apathy toward quad-core anything. In fact, the market still doesn’t appear to get dual-core products, let alone the quad-core offerings Intel and AMD will be fighting over. And there is every likelihood that, when this quarter is over, quad-core offerings from either vendor won’t have made much of a difference.
Let’s have a look at this scenario and then move to AMD’s Spider play.
The quad-core quandary
We are moving into a multi-core world; I doubt anyone following technology could have missed this. The first to move were the game systems, with Microsoft’s Xbox 360, followed by Sony and Nintendo. They also gave us an early heads-up on what the problem with multi-core has always been: it can be a programming nightmare for software, like video games, that has traditionally been single-threaded.
To do multi-core right, you have to break work down into components that can be executed in parallel on different cores rather than in sequence on one core. This is like moving from playing the piano with one hand to two hands, and then to four or nine hands at the same time. That would be hard enough, but the pipelines feeding these activities are still largely designed for the older sequential loads, which has created strange performance bottlenecks that have to be designed around. So not only did software developers need to learn how to deal with this new world, hardware designers had to start rethinking how they did things as well. Fortunately, servers went multi-socket some time ago, and at least some of that knowledge (though not as much as you’d probably think) transferred, shortening the process.
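The decomposition described above can be sketched in a few lines of modern Python (an illustration of the general technique, not anything AMD or Intel shipped): the same batch of independent work items, such as video frames, is processed either one after another on a single core or fanned out across several worker processes.

```python
from concurrent.futures import ProcessPoolExecutor

def render_frame(frame_id):
    # Stand-in for CPU-heavy work (think encoding one video frame);
    # a simple sum keeps the example self-contained.
    return sum(range(frame_id * 1000))

def render_sequential(frames):
    # Single-core style: each frame handled in sequence.
    return [render_frame(f) for f in frames]

def render_parallel(frames, workers=4):
    # Multi-core style: frames farmed out to a pool of worker
    # processes, so independent frames can run on separate cores.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(render_frame, frames))

if __name__ == "__main__":
    frames = range(8)
    # Both approaches must produce identical results; only the
    # scheduling across cores differs.
    assert render_sequential(frames) == render_parallel(frames)
```

The catch the column describes is exactly what this hides: the decomposition only pays off when the frames really are independent, and shared resources (memory, disk, the “pipes”) can still serialize work that looks parallel on paper.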
More cores mean more power draw, and this comes at a time when folks are really focused on keeping power costs down. The idea of green pervades the technology space, and the initial multi-core products have been very power hungry. The good news is that, with a little thought, multi-core products can actually be more power efficient, because they can shut down unneeded cores dynamically. As it turns out, one core can still do a lot (like keep your email inbox up to date and your system virus free) and save you a ton of energy in the process. But that only happens if someone creates a platform that intelligently shuts down the cores, and while current products are vastly better than the initial offerings, they still appear well short of the potential.
Finally, folks are buying laptops in volume, not desktops, and you can’t get laptops with quad-core processors (well, not laptops you are likely to carry anyway), which locks these quads out of a good chunk of the market.
Advantages to quad-core
This doesn’t mean there aren’t reasons to use a quad-core in a desktop computer. I use an 8-core Intel machine with an Nvidia graphics card for my primary workstation and a quad-core AMD machine with an ATI graphics card for gaming.
One of the interesting things about Vista is that it was designed to scale to multiple cores, while Windows XP starts getting upset beyond two cores (it was created at a time when the only folks using two cores were at the nosebleed high end of the PC segment). I’m particularly impressed with the Vista SP1 beta, which is working much better with gaming now.
The real advantages only exist for applications that are highly multi-threaded, such as transcoding audio and video, photo editing, and video editing and rendering. Some recent video games make good use of multiple cores as well, but for now this technology is truly best for those using one of the media tools (or some kind of advanced scientific analysis). Where a lot of us really need the power right now is in graphics more than number crunching, and that takes us to Spider.
Spider: Graphics, graphics, graphics
As AMD rolls this platform to market, you will hear the chant “graphics, graphics, graphics,” and Spider gives AMD two potential advantages. The first is aggressively priced high-performance graphics. The second is that this is a tuned platform which, even at the low end, should have a performance advantage in graphics. Since graphics is where games often bottleneck, and games are what often drive us to higher-end systems, Spider becomes very interesting.
Now, as of this writing, benchmarks are just being completed and, initially, they look pretty good (these were done with engineering samples). The Penryn systems have benchmarked very well so far and have set a high bar for AMD to meet.
Still, if there is a way to pass Intel, graphics is the only path that makes sense, and AMD is now sharply focused on it. At the high end of Spider is a quad-graphics-card platform that has me drooling already, if not from excitement, then from the challenge of water cooling such a rig.
So what does all this mean? In the end, it is one more indication that the paths Intel and AMD are on are diverging. AMD is shifting to a much greater focus on, and reliance on, graphics, while at Intel the processor is still king and the company will focus on ensuring its processor remains on top. To succeed, AMD has to change how we look at the market and get us to subordinate our beliefs about either the CPU or the GPU to looking at the combination of both. Fortunately for AMD, we already seem to be going down that path as we realize that poorly tuned machines, regardless of how good their components are, don’t provide as good a value as well-tuned products.
For most of us currently shopping for PCs, this doesn’t matter much, because the current battle doesn’t touch laptops. But since the fight will move to laptops next year, it is worth watching how the battle develops so we can make better decisions in the future. For those of us in the market for a new mid-range desktop, this will matter a lot, and there could be significant differences between AMD- and Intel-based offerings. Over the next few days, watch the independent benchmarks that focus on what you want to do with the system, and make your decision accordingly.
So, “did AMD’s Spider catch Intel’s Penryn in its Web?” Maybe, but it will depend on what you are looking for in a system. Look for benchmarks on production systems to make your buying decisions.
Rob Enderle is one of the last Inquiry Analysts. Inquiry Analysts are paid to stay up to date on current events and identify trends and either explain the trends or make suggestions, tactical and strategic, on how to best take advantage of them. Currently he provides his services to most of the major technology and media companies.