AMD, NVIDIA and the Century of Graphics

Posted by ROB ENDERLE

For much of the PC's last three decades it has been all about the CPU, while the graphics card was largely a concern of engineers, media creators, and gamers.

But that has started to change dramatically. We are clearly entering a world where iPads, Google and HP tablets, and future multimedia Kindles replace books, and where the market shifts toward platforms that can create, edit, and modify both still images and multimedia content.

But it isn’t just that the visual aspects of computing are changing. Because graphics processing can approach supercomputer performance at affordable price points, it is changing everything from what we drive to how long we’ll live.

Let’s pause in this Apple-driven week, since Apple is the most visual of companies, to explore that.

Goodbye to ATI, Hello to the New AMD

A big indicator of this change came this week when AMD abandoned the ATI brand and officially made graphics a mainstream part of its core message.

This is in preparation for its move to Fusion, the first blend of CPU and GPU technology.

In many ways this is a rebirth for the company, which, since its inception as the designated number two to Intel, has always lived under Intel’s powerful shadow.

With this move, AMD takes the first visible step toward truly moving out from under Intel’s shadow and becoming something vastly different. It isn’t often that companies make changes at this level, and it doesn’t come without heavy risk.

But had they not done this they would always live as "second best," and with the massive move to GPU computing there is a chance the new AMD will become more than it otherwise could have been.

NVIDIA and the GPU Technology Conference

Intel has dominated the past decades of computing, and its quintessential event is IDF, the Intel Developer Forum, which kicks off in a few short weeks. No other vendor has ever successfully challenged Intel for the hearts and minds of developers, OEMs, and parts makers.

IDF remains the conference when it comes to PCs and servers, but NVIDIA is stepping up, and IDF is unchallenged no more. GTC, the GPU Technology Conference, launches on September 20th, just a few days after IDF.

The conference is designed to focus on what IDF doesn’t: the emerging world of GPU computing and the Emerging Companies that will be on the cutting edge of its first wave.

I’ll be hosting one of the Emerging Company panels personally as we explore how the next wave of computing will be built and meet those who are building it.

This conference will have sessions covering advancements in AI-driven automobiles and robotics, because GPU computing is wonderful for AI. It will have sessions on advances in medicine and modeling tied back to GPU computing’s progress in those areas. And there will be examples of projects, ranging from mapping the weather to exploring outer space, that would not have been completed had it not been for the introduction of GPU computing.

You may very well owe your extended life and future to the significant advances being made as a result of supercomputer-level performance becoming affordable much sooner than it otherwise would have.

In the end, this will likely be the first definitive year in which the future of computing is defined not just at IDF but at GTC, and in which those who attend both conferences have the greatest advantage.

Wrapping Up: A Bold New World

It is easy to look at the graphics candy that has resulted from the stunning visual updates to games like City of Heroes, StarCraft, and Mafia II, which, I have to admit, have eaten up a lot of my own time over the last month.

But it is the advancements in science, engineering, and media that will likely change our lives dramatically this decade. It is there that GPU computing is beginning to shine, and that is why we are suddenly watching both AMD and NVIDIA much more closely.

For, while they are competitors, they agree on one important thing: GPU computing will define the next age of performance advancement in the technology space. For my own part, I’m looking forward to seeing the first true AI, and GPU computing is likely to make that dream a reality.
