Interview – We are wrapping up our interview with EPIC Games CEO and Unreal creator Tim Sweeney today with an outlook on the future of video gaming. What will the next generation of game consoles bring? What can we expect from the next Unreal game engine? What about all those fancy brain-computer interface devices and future game controllers that will let you control your character in a game with your body rather than with a controller in your hand? Join us as we listen to Sweeney’s answers to those questions and gain insight into how video games will change over the next four or five years.
TG Daily: Throughout GDC, several companies have been presenting game controllers known as brain-computer interfaces. A while ago, I used OCZ's NIA device in Unreal Tournament 2004 and was blown away by its usability. How do you see the interface between us humans and the computer evolving, given that Nintendo has seen such huge success with its Wiimote?
Sweeney: I think the key challenge here is to look at all the cool things engineers are developing and identify which ones are just gimmicks, which ones are cool ideas that might benefit one part of the market but aren't fundamental changes, and which ones really change the way we work with computing interfaces. I still think that motion controllers, such as the Wii controller, have a limited purpose - sort of a gimmicky thing. Standing there holding a bunch of devices and moving them around wildly is great for party games, but I don't think that will fundamentally change the way people interact with computers. Humans are tactile beings, so things such as touchscreens fundamentally improve the user interface.
TG Daily: That brings us back to the iPhone, which we talked about earlier. Apple appears to have made a lot of progress in this area.
Sweeney: I agree. You are not just bringing up the map on the screen; you move it with your fingers, you zoom in and out. It's incredible that nobody thought of that earlier. For 3D editing tools, the touchscreen approach would be an excellent fit: you can grab vertices and drag them in different directions. A touchscreen could really improve and change things, and I think we might see that technology migrate to tools such as Maya. It is hard to tell how exactly that will pan out, but I see it as a much bigger improvement in computing than the motion stuff. Those are just neat toys.
The other big direction is head tracking - cameras built into consoles that watch you and detect, for example, your arm movement. It is just more natural, because it is somewhat annoying to hold a bunch of wired plastic doodads - wireless things you have to pick up and recharge every once in a while. To me, it's more compelling to just use free-form movement and have computers recognize your gestures.
TG Daily: You mean behavior analysis?
Sweeney: Yes, but I do not know how that will work out. We humans don't have great motion control when it comes to moving our arms around in free space, while we have astonishing control over fine movement - when we touch an object, for example. If you look at people, what we are optimized for is manipulating objects. That is what separates us from animals: we can manipulate tools and interact with complicated objects. Anything useful in that space is going to give us precise tactile feedback and enable us to be in touch with objects. That said, it is very hard to say how the user interface will evolve. I am not an expert in those areas.
TG Daily: At the end of the day, we all go back to basics at some point. According to one industry legend, Bill Gates and Microsoft engineers were roaming around inside Apple at a time when the two companies were friends. Gates asked Mac engineers how they had managed to develop hardware to control the mouse movement - and it turned out that Apple's engineers had written software to control it. When we look at the input devices of today, it seems that we are repeating what happened in the early 1980s.
Sweeney: Five years into the development of personal computers, we had exhausted all the major ideas, such as the keyboard, mouse, joystick and gamepad. But then you see something like Apple’s multi-touch, or that YouTube video of a big-screen interface where people walk around and simply manipulate objects projected onto it. That is entirely new stuff. No one has really done that before, and it is clear that there are still a lot of major ideas that haven't surfaced yet. As the technology improves, one thing is certain: as you increase the complexity of the user interface, you need more processing power.
The development of Unreal Engine 4 has begun
TG Daily: Let’s talk about your vision for the future and the next Unreal Engine. Where is EPIC going with Unreal Engine 3.5 and 4.0?
Sweeney: The Unreal Engine is really tied to the console cycle. We will continue to improve Unreal Engine 3 and add significant new features through the end of this console cycle, so it is normal to expect that we will add new stuff in 2011 and 2012. We're shipping Gears of War now, and we're just showing the next batch of major tech upgrades, such as soft-body physics, destructible environments and crowds. There is a long life ahead for Unreal Engine 3. Version 4 will exclusively target the next console generation - Microsoft's successor to the Xbox 360, Sony's successor to the PlayStation 3, and, if Nintendo ships a machine with similar hardware specs, that one as well. PCs will follow after that.
We also continuously work on transitions as we go through large portions of the engine: we completely throw out parts and build large subsystems from the ground up, while reusing the things that are still valid.
TG Daily: Like ...?
Sweeney: Internet bandwidth. In five years, bandwidth isn't going to be more than five to six times higher than it is today, so the network code we have in the engine now will stay the same. Our tools are still valid as well, but we will rewrite large sections of the engine around them as the new hardware develops.
TG Daily: What part of the engine will need a completely new development?
Sweeney: Our biggest challenge will be scaling to lots and lots of cores. UE3 uses functional subdivision across threads: we have the rendering thread that handles all in-game rendering, the gameplay thread that handles all gameplay and AI, and some helper threads for physics. We scale very well from dual-core to quad-core - you can actually see a significant performance increase when you run UT3 on a quad-core compared to a dual-core system.
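The functional subdivision Sweeney describes can be sketched in a few lines. This is a toy illustration, not Epic's actual code: each major subsystem owns exactly one thread, so parallelism is capped by the number of subsystems no matter how many cores the machine has. All names and workloads here are invented stand-ins.

```python
import threading

results = {}

def rendering_thread():
    # Stand-in for all in-game rendering work (UE3's dedicated render thread).
    results["frames_rendered"] = sum(1 for _ in range(100))

def gameplay_thread():
    # Stand-in for gameplay logic and AI (UE3's dedicated gameplay thread).
    results["ticks_simulated"] = sum(1 for _ in range(100))

# One thread per subsystem: with two subsystems, at most two cores stay busy,
# which is why this design stops scaling beyond a handful of cores.
threads = [threading.Thread(target=rendering_thread),
           threading.Thread(target=gameplay_thread)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(results)
```

The limitation is visible in the structure itself: adding cores adds nothing unless you also add subsystems, which is exactly the problem the next paragraph addresses.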
Down the road, we will have tens of processing cores to deal with, and we will need much, much finer-grained task parallelism to avoid being bottlenecked by single-threaded code. That, of course, requires us to rewrite very large portions of the engine. We are replacing our scripting system with something completely new, a highly threadable system. We're also replacing the rendering engine with something that can break the work into much smaller rendering tasks that threads can pick up in and out of order. There is a lot of work to do.
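By contrast, the finer-grained task parallelism Sweeney is aiming for splits a frame into many small independent tasks that a worker pool spreads over however many cores exist. The sketch below uses an invented "shade a chunk of pixels" task purely for illustration; the point is the shape of the scheduling, not the workload.

```python
from concurrent.futures import ThreadPoolExecutor

def shade_chunk(chunk):
    # Pretend unit of rendering work: brighten a chunk of 8-bit pixel values.
    return [min(255, px + 10) for px in chunk]

pixels = list(range(64))
# Break the frame into 8 small, independent tasks instead of one big job.
chunks = [pixels[i:i + 8] for i in range(0, len(pixels), 8)]

# The pool size can track the core count; more cores -> more tasks in flight,
# with no single subsystem thread acting as the bottleneck.
with ThreadPoolExecutor(max_workers=4) as pool:
    shaded = [px for chunk in pool.map(shade_chunk, chunks) for px in chunk]

print(len(shaded))  # 64 pixels processed across the pool
```

Because the tasks are independent, the same code scales from two cores to tens of cores simply by resizing the pool, which is the property a subsystem-per-thread design lacks.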
TG Daily: Have you already started working on Unreal Engine 4.0?
Sweeney: We have a small research and development effort dedicated to Unreal Engine 4. Basically, it is just me, but that team will ramp up to three or four engineers by the end of this year - and even more a year after that. In a way, we resemble a hardware company with our generational development of technology. We are going to have a team developing Unreal Engine 3 for years to come and a team ramping up on Unreal Engine 4. Then, as the next-gen transition begins, we will move everybody over. We actually do parallel development for multiple generations concurrently.
TG Daily: Stepping back, what do you see as the most significant technology trends these days?
Sweeney: When it comes to the PC, Intel will implement lots of extensions in its CPUs and Nvidia will integrate many extensions in its GPUs by the time next-gen consoles begin to surface. We are going to see some CPU cores that deal with gameplay logic, some GPU parts that run general computing - and two different compilers, one for the GPU and one for the CPU. The result will be a reduced dependence on bloated middleware that slows things down and shields the real functionality of the devices.
It would be great to be able to write code for one massively multi-core device that handles both general and graphics computation in the system. One programming language, one set of tools, one development environment - just one paradigm for the whole thing: large-scale multi-core computing. If you extrapolate Moore's Law from the number of cores Microsoft put in the Xbox 360, it is clear that around 2010 - at the beginning of the next decade - you will be able to put tens of CPU cores on one processor chip and have a perfectly usable, uniform computing environment. That will be an interesting time for graphics as well.
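The extrapolation is easy to check on the back of an envelope, under assumptions of our own (not Sweeney's stated figures): the Xbox 360 shipped in 2005 with 3 CPU cores, and a common reading of Moore's Law has transistor budgets, and hence potential core counts, doubling roughly every two years.

```python
def projected_cores(base_cores, base_year, target_year, doubling_years=2.0):
    # Simple exponential extrapolation: one doubling per `doubling_years`.
    return base_cores * 2 ** ((target_year - base_year) / doubling_years)

# Xbox 360: 3 cores in 2005; project forward to 2010.
cores_2010 = projected_cores(3, 2005, 2010)
print(round(cores_2010, 1))  # about 17 cores - i.e. "tens of cores"
```

The numbers are rough, but they show why "tens of cores around 2010" was a plausible reading of the trend at the time of the interview.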
At that point, we will have a physics engine that runs on a general computing device, and we will have a software renderer that supports far more features than you can achieve in DirectX, as a result of having general computation functionality. I think that will really change the world. It can happen as soon as the next console transition begins, and it brings a lot of economic benefits, especially if you look at the world of consoles or handhelds. You have one non-commodity computing chip hooked up directly to memory. We have an opportunity to economize the system and provide entirely new levels of computing performance and capabilities.
TG Daily: Let’s close the circle and return to the beginning of our interview: What does that mean for the PC?
Sweeney: Well, that trend could also migrate to the PC, of course. I can definitely see that at the beginning of the next decade: PCs will ship with a usable level of graphics functionality without any sort of graphics hardware in the system. Your graphics hardware will be a VGA or HDMI-out connector, and that's about it. The same thing happened with sound. For a long time, you had these high-end, ultra-expensive sound cards and different levels of sound acceleration. Then look at what happened when Vista arrived. Poof! Sound is 100% software-based, and with one operating system release, all that hardware acceleration was gone. We now have software sound, and the remaining hardware is used for digital-to-analog conversion. That is a great approach, because there is a lot more flexibility now: you have sound algorithms and treble control in software, and we got rid of all the hardware incompatibility issues that resulted from complicated fixed-function hardware and poorly written drivers. Things are much cleaner now and much more economical.
Simplifying the development process and making more efficient computer hardware is going to be the next big step for us all.
TG Daily: Thank you for your time.
We hope you enjoyed this three-part interview with one of the great personalities and inventors in the gaming industry as much as we did. As you may have noticed, Tim Sweeney did not talk much about actual hardware such as consoles, PCs and handhelds, but rather about how hardware manufacturers have to improve their products to enable game developers to drive the next generation of games. As soon as today’s limitations are removed, game developers will be able to create even more realistic worlds – and more people will be able to enjoy the experience.
At the end of the day, we want to be entertained by brilliant games on various platforms. The sooner manufacturers are able to remove limitations and abandon possibly wrong directions, the better for us consumers.