Tim Sweeney, Part 2: “DirectX 10 is the last relevant graphics API”

Posted by Theo Valich

Interview - In the second part of our interview, Tim Sweeney, creator of the Unreal game engine and CEO of Epic Games, discusses the challenges and dramatic changes that are just ahead for game developers and gamers: Graphics rendering may change completely, and Microsoft's DirectX interface may become less important. The successors of the Xbox 360 and PlayStation 3, due in 2012, could be running entirely on software pipelines.

The idea of extremely powerful graphics processors being used for general-purpose applications is a much-debated topic in the games industry as well - and Sweeney believes that the GPU and CPU are heading into a battle for the dominant position in the computer, with either one at risk of being pushed out of the market.

Tim Sweeney, Part 1: "PCs are good for anything, just not games"

Tim Sweeney, Part 3: "Unreal Engine 4.0 aims at next-gen console war"

TG Daily: In the first part of our interview you implied that software rendering might be coming back. Daniel Pohl, who rewrote Quake 3 and Quake 4 using ray-tracing [and is now working as a research scientist at Intel -ed], recently showed ray-tracing on a Sony UMPC, an ultraportable device equipped with a single-core processor. True, the resolution was much lower than on today's PCs, but it looked impressive. What are your thoughts on ray-tracing? How will 3D develop in the coming months and years?

Sweeney: Ray-tracing is a cool direction for future rendering techniques. There is also classic rendering, and there is the REYES scheme of dividing the scene into micro-polygons, and there are voxels. There are around five to ten different techniques, and they are all very interesting for the next generation of rendering.

Rendering can be done on the CPU. As soon as we have enough CPU cores and better vector support, these schemes might become practical for games. And as GPUs become more general, you will have the possibility of writing a rendering engine that runs directly on the GPU, bypassing DirectX as well as the graphics pipeline. For example, you can write a renderer in CUDA and run it on Nvidia hardware, bypassing all of their rasterization and everything else.

All a software renderer really does is take in some scene data - the positions of objects, texture maps and things like that - and output a rectangular grid of pixels. You can use different techniques to generate this grid; you don't have to use the GPU rasterizer to achieve this goal.
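
To make that concrete, here is a minimal sketch of what Sweeney describes - scene data in, a rectangular grid of pixels out - written as a CUDA kernel so the hardware rasterization path is never touched. It is a hypothetical toy with invented names, not code from Epic or Nvidia; a simple disk test stands in for a real ray-object intersection.

```cuda
// Hypothetical, illustrative renderer: one thread shades one pixel
// directly from scene data, bypassing the GPU's rasterizer entirely.
#include <cstdio>
#include <cuda_runtime.h>

struct Scene { float2 center; float radius; };   // one "object"

__global__ void render(uchar4* pixels, int w, int h, Scene s) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;

    // Shade this pixel: a 2D disk test stands in for a full
    // ray-object intersection.
    float dx = x - s.center.x, dy = y - s.center.y;
    unsigned char c = (dx * dx + dy * dy < s.radius * s.radius) ? 255 : 32;
    pixels[y * w + x] = make_uchar4(c, c, c, 255);
}

int main() {
    const int w = 640, h = 480;
    uchar4* pixels;
    cudaMallocManaged(&pixels, w * h * sizeof(uchar4));

    Scene s = { make_float2(w * 0.5f, h * 0.5f), 120.0f };
    dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
    render<<<grid, block>>>(pixels, w, h, s);
    cudaDeviceSynchronize();

    printf("center pixel: %d\n", pixels[(h / 2) * w + w / 2].x);  // 255: inside the disk
    cudaFree(pixels);
    return 0;
}
```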

TG Daily: What kind of advantage can be gained by avoiding the API? Most developers just use DirectX or OpenGL and that's about it. How does the Unreal Engine differ from the conventional approach?

Sweeney: There are significant advantages in doing it yourself, avoiding all the graphics API calls and overhead. With a direct approach, we can use techniques that require a wider frame buffer - things that DirectX just doesn't support. At Epic, we're using the GPU for general computation with pixel shaders. There is a lot we can do there, just by bypassing the graphics pipeline completely.
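
In the pixel-shader era, the trick Sweeney mentions meant drawing a full-screen quad and letting the pixel shader treat each output "pixel" as an element of a data array stored in a texture. A CUDA kernel expresses the same idea without the graphics detour; the sketch below is hypothetical (invented names, not Epic's code) and runs a generic per-element physics step on the GPU.

```cuda
// Hypothetical GPGPU sketch: non-graphics work per "pixel" - here,
// integrating one particle per thread for a single time step.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void step_particles(float2* pos, float2* vel, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    vel[i].y -= 9.8f * dt;          // gravity
    pos[i].x += vel[i].x * dt;
    pos[i].y += vel[i].y * dt;
}

int main() {
    const int n = 1 << 16;
    float2 *pos, *vel;
    cudaMallocManaged(&pos, n * sizeof(float2));
    cudaMallocManaged(&vel, n * sizeof(float2));
    for (int i = 0; i < n; ++i) {
        pos[i] = make_float2(0.0f, 0.0f);
        vel[i] = make_float2(1.0f, 2.0f);
    }

    step_particles<<<(n + 255) / 256, 256>>>(pos, vel, n, 1.0f / 60.0f);
    cudaDeviceSynchronize();

    printf("particle 0 after one step: (%f, %f)\n", pos[0].x, pos[0].y);
    cudaFree(pos);
    cudaFree(vel);
    return 0;
}
```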

TG Daily: What is the role of DirectX these days? DirectX 10 and the Vista-everything model promised things like more effects and a more direct approach to the hardware, claiming that lots of new built-in technologies would enable a console-like experience. DirectX 10.0 has been on the market for some time and the arrival of DirectX 10.1 is just ahead. What went right, what went wrong?

Sweeney: I don't think anything unusual happened there. DirectX 10 is a fine API. When Vista first shipped, DirectX 10 applications tended to be slower than DirectX 9, but that was to be expected: The hardware guys had been given many years and hundreds of man-years to optimize their DirectX 9 drivers, while with DirectX 10 they had to start from scratch. In the past weeks and months, we have seen DX10 drivers catch up with DX9 in terms of performance, and they are starting to surpass them.

I think the roadmap was sound, but DirectX 10 was just a small incremental improvement over DX9. The big news items with DirectX 9 were pixel and vertex shaders, which let you write arbitrary code; DX10 just takes that to a new level, offering geometry shaders and numerous new features and modes. It doesn't fundamentally change graphics the way DX9 did - that was a giant step beyond DirectX 7 and DirectX 8.

TG Daily: Since you are a member of Microsoft's advisory board for DirectX, you probably have a good idea what we will see next in DirectX. What can we expect and do you see a potential for a segmentation of APIs - all over again?

Sweeney: I think Microsoft is doing the right thing for the graphics API. There are many developers who will always want to program through the API - through DirectX these days, or a software renderer in the past. That will always be the right solution for them: It makes it easier to get things rendered on-screen, and if you know your resource allocation, you'll be just fine. But realistically, I think DirectX 10 is the last DirectX graphics API that is truly relevant to developers. In the future, developers will tend to write their own renderers that use both the CPU and the GPU - written in a GPU programming language rather than against DirectX. I think we're going to get there pretty quickly.

I expect that by the time the next generation of consoles arrives - around 2012, when Microsoft comes out with the successor of the Xbox 360 and Sony comes out with the successor of the PlayStation 3 - games will be running 100% on software-based pipelines. Yes, some developers will still use DirectX, but at some point, DirectX just becomes a software library on top of ... you know.

TG Daily: Hardware?

Sweeney: GPU hardware. And you can implement DirectX entirely in software, on the CPU. DirectX software rendering has always been there: Microsoft writes the reference rasterizer, which is a factor of 100 slower than what you really need. But it is there, and it shows that you can run an entire graphics pipeline in software. I think we're only a few years away from that approach being faster than the conventional API approach - and we will be able to ship games that way. Just think of Pixomatic software rendering.
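
For readers curious what "an entire graphics pipeline in software" boils down to, here is a deliberately tiny, hypothetical sketch of the innermost piece: edge functions testing every pixel against one triangle, entirely on the CPU. A real pipeline - let alone Microsoft's reference rasterizer - adds transforms, clipping, attribute interpolation and texturing around this loop.

```cuda
// Hypothetical CPU-only rasterizer core (plain host code, no GPU needed):
// cover a triangle by testing each pixel center against its three edges.
#include <cstdio>
#include <vector>

struct Vec2 { float x, y; };

// Signed-area edge function: positive if p lies to the left of edge a->b.
static float edge(const Vec2& a, const Vec2& b, const Vec2& p) {
    return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

int main() {
    const int W = 32, H = 16;
    std::vector<char> framebuffer(W * H, '.');

    // One triangle in screen space, consistently wound.
    Vec2 v0{4, 2}, v1{28, 6}, v2{10, 14};

    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            Vec2 p{x + 0.5f, y + 0.5f};           // sample at pixel center
            bool inside = edge(v0, v1, p) >= 0 &&
                          edge(v1, v2, p) >= 0 &&
                          edge(v2, v0, p) >= 0;
            if (inside) framebuffer[y * W + x] = '#';
        }
    }

    // "Present" the frame as ASCII art.
    for (int y = 0; y < H; ++y)
        printf("%.*s\n", W, &framebuffer[y * W]);
    return 0;
}
```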

TG Daily: That technique was awesome.

Sweeney: Yes, up to DirectX 8, we were actually able to use Pixomatic software rendering. In Unreal Tournament 2003, you could even play the game completely in software, running off the CPU - completely independent of whatever graphics hardware you had. It is only a matter of time before that level of performance is there for newer variants of DirectX; on a quad-core CPU, you should be able to do that sort of thing again - completely software-based DirectX rendering. Over time, I think the whole graphics API will become less relevant, just like any other Microsoft API. There are hundreds of them in Windows - file handling, user interface and things like that. It is just a layer for people who don't want direct access to the hardware.

Running Linux on a GPU is not a pipe-dream

TG Daily: If your vision comes true, it looks like graphics will take the best from the CPU and the GPU, with graphics hardware continuing its evolution from a fixed-function pipeline into what is basically an array of mini-processors that support almost the same data formats as the floating-point units on today's CPUs.

Sweeney: It is hard to say at what point we are going to see graphics hardware being able to understand C++ code, but data will be processed right on the GPU. Then you are going to get the GPU's computational functionality to a point where you can - not that this would be useful, but it will be a very important experiment - recompile the Linux kernel for a GPU and actually run it off the GPU, entirely by itself. At that point, the boundary between the CPU and the GPU becomes just a matter of performance trade-offs.
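
A small, hypothetical illustration of that trade-off: In CUDA, a routine marked __host__ __device__ is compiled for both processors, so where it runs becomes exactly the kind of performance decision Sweeney describes. The names below are invented for the example.

```cuda
// The same routine compiled for CPU and GPU; the caller picks the target.
#include <cstdio>
#include <cuda_runtime.h>

// nvcc emits both a host and a device version of this function.
__host__ __device__ float shade(float x) {
    return x * x * (3.0f - 2.0f * x);   // smoothstep-style response curve
}

__global__ void shade_all(float* data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] = shade(data[i]);
}

int main() {
    const int n = 8;
    float* data;
    cudaMallocManaged(&data, n * sizeof(float));
    for (int i = 0; i < n; ++i) data[i] = i / float(n);

    shade_all<<<1, n>>>(data, n);       // run it on the GPU...
    cudaDeviceSynchronize();
    float on_cpu = shade(0.5f);         // ...or call the very same code on the CPU

    printf("gpu: %f  cpu: %f\n", data[4], on_cpu);   // both print 0.5
    cudaFree(data);
    return 0;
}
```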

TG Daily: General purpose GPUs competing with CPUs? Do you already have any idea who might win this battle?

Sweeney: Hard to say at this time. Both can run any kind of code; GPUs are just much better optimized for highly parallel vector computing, while CPUs are better optimized for out-of-order execution, branching and operating-system types of things. Once they both have a complete feature set, things will get very interesting. We could see the CPU pushing GPUs out of the market entirely, if highly parallel computing comes to influence future CPU designs. Or we could see GPU vendors start pushing actual CPUs out of the market: Windows or any other operating system could run directly on a GPU. There are lots of possibilities.

TG Daily: In some way, this is already happening in the handheld world. STMicro recently launched a chip that integrates the ATI Z460 GPU, a.k.a. mini-Xenos [a trimmed-down version of the Xbox 360 GPU -ed]. Nvidia launched the APX 2500, a system-on-a-chip that uses an ARM11 core for CPU-type computing and an ultra-low-power (ULP) GeForce for the rest of the system. Intel is talking about such SoCs for the consumer electronics segment. Will we see something similar on the desktop?

Sweeney: It is unclear what these products actually are. As they become actual silicon, we will be able to see how far the miniaturization can go.

TG Daily: There are signs that a whole new market segment might reveal itself, enabling 3D performance with CPU-type computing on handheld-type devices, which so far have provided a pathetic 3D GUI experience for users.

Sweeney: Well, if we look at the iPhone, we can see that these low-power devices can actually be a very important part of our lives. Now you can really browse the web on a handheld - I mean, you can actually browse the web in a decent fashion. Look at my BlackBerry [8700]. There is a crappy little web browser thrown in to provide the basic functionality, and it just sucks - a horrible web experience. Apple's version is really good; it is usable. The video player and the YouTube integration are excellent. I definitely see those devices becoming a much more important part of our lives. For that reason, we need more and more compute power inside the same size package.

TG Daily: But realistically, will it ever be possible to run a high-end game on a handheld platform?

Sweeney: The way people go online and do things has replaced a lot of what we used to do on our PCs - but not everything, of course. You don't use your handheld to write a document in a word processor. You don't spend hours playing a game on a handheld, because the battery won't last and the screens are tiny. Why would you play on a handheld if you can play on a large screen and enjoy the full experience of a game?

These [small] devices are important, and I feel they will grow over time as processing capabilities increase. I believe that the next generation of mobile gaming will be quite impressive; I think we're only a few years away from a really good user experience [in all segments]. If you look at the PlayStation Portable or Nintendo's handhelds, they are so low-end and so low-performance that they are just not interesting to mainstream game developers. But within another generation or two, they will have enough power to run a scaled-down version of a high-end console or PC game. Reduce the level of detail, lower the resolution, and you will get the same game running on these devices.

TG Daily: Then it would not be far-fetched to expect games based on next-gen engines such as Unreal Engine 4 - even 4.5 or 5 or something like that - running on a device that fits in your pocket?

Sweeney: That is the great thing about this kind of scalability. On a 320x200 pixel screen, you have 30 times fewer pixels than on the highest-end PC monitor that is currently available. When you look at the performance figures, the actual scale is within reach. It should be possible to create a compelling next-gen experience on consoles, PCs and handhelds.
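
[For the record: 320x200 is 64,000 pixels; 30 times that is 1,920,000, which is what a 1600x1200 display pushes. -ed]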

This concludes the second part of our interview. We will publish the final segment tomorrow, focusing on Epic's plans for the Unreal Engine 4.0.

Read the first part of the interview here: Tim Sweeney, Part 1: "PCs are good for anything, just not games"