UPDATE: HD 3800: AMD’s Midrange Rebuttal - benchmarks


Price per performance: what it really means

It was almost two weeks ago that Nvidia launched its G92-based GeForce 8800GT graphics boards into the midrange segment. On the same day Nvidia brought its product to market, we unveiled what Rick Bergman, Vice President of the Graphics Product Group at AMD, shared with Tom’s Hardware about the HD 3800 series hardware. The performance of the GeForce 8800GT speaks for itself: it is almost as powerful as its bigger brothers, yet it currently sells for $280-320 depending on clock frequencies and features. Now we get to see AMD’s response.

 



Nvidia has owned the performance space for over a year in both single- and multi-GPU configurations, but this has always come at a premium. Based on forum posts, article comments and the emails we have received, some think the GeForce 8800GT is a bit expensive. Regardless of where you draw the line between expensive and cheap, the GT opened up high-performance gaming to the cost-conscious consumer. This is exactly where AMD has targeted its newest graphics product, code-named RV670. TG Daily reported that AMD would introduce the 3800 series for $180-220 depending on the model. This is exactly on target, as we were briefed that the 256 MB Radeon HD 3850 will retail for $179 and the 512 MB Radeon HD 3870 will sell for $219.

Marketing will spin off some metric like price per performance, but what does that really mean? In essence, a true midrange card gives gamers the performance they want at a price they can actually afford. This is a very attractive price point, and as our preliminary benchmarks will show, it will let even more gamers experience high-performance graphics within even tighter budgets.

AMD has not only reduced the GPU’s die size by moving to a 55 nm process, but has also updated its hardware to comply with Microsoft’s DirectX 10.1 specification. The 55 nm process matters for several reasons: RV670 requires about half the silicon to produce as Nvidia’s 8800GT, which should translate directly into lower pricing, more units per wafer (higher volume) and ultimately higher margins per wafer.
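To make the wafer economics concrete, here is a rough sketch of gross dies per wafer using a standard first-order edge-loss approximation. The die areas and the 300 mm wafer size below are illustrative assumptions for the sake of the arithmetic, not figures disclosed by AMD or Nvidia:

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order gross die count for a round wafer: usable area divided
    by die area, minus a correction for partial dies lost at the edge.
    No defect or yield modeling."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

# Hypothetical die areas chosen only to illustrate the ~2x relationship.
small_die = gross_dies_per_wafer(190.0)  # assumed ~190 mm^2 for a 55 nm part
large_die = gross_dies_per_wafer(330.0)  # assumed ~330 mm^2 for a 65 nm part
print(small_die, large_die)  # 323 vs. 177 gross dies under these assumptions
```

Under these assumed die sizes, the smaller chip nets roughly 1.8 times as many candidate dies from the same wafer, which is where the volume and margin advantages come from.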

Traditionally, the midrange part delivered 75% of the performance of the high end at 50% or less of its price, and looking back in history, die shrinks have been a key component of bringing these midrange parts to the masses. This new midrange part offers not only DX 10.1 and a 55 nm process; two-, three- and four-way CrossFire will also be supported on Vista. Although the driver was not made available for launch, AMD stated it will provide it to us in a few days. We have seen this running three times over the past two weeks, and we were told we would have the driver soon to begin testing of our own.

AMD previously reported that it can beat Nvidia’s thermal envelope, and a die shrink means less heat per transistor. In testing, internal monitoring shows that RV670 runs hotter at the die than R600, but this makes sense: even though the chip is smaller, it still gets hot under operation, and there is less surface area to spread heat away from the die. While the temperature reported to Catalyst Control Center is higher than on the HD 2900XT, you can touch the heatsink with your hand and not get burned.

There is more heat per square inch, but that is easily mitigated by a simple heatsink. Try touching the plastic shroud covering the GeForce 8800GT’s heatsink; you will move your hand away fast. Even though that card is quiet, it does run a bit hot.
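The "hotter die, cooler heatsink" observation comes down to power density: the smaller die concentrates its heat into less area even while drawing less total power. The wattage and die-area figures below are assumptions for illustration only, not measured values from our testing:

```python
# Power density comparison under assumed (not measured) figures.
def power_density(tdp_watts: float, die_area_mm2: float) -> float:
    """Average heat flux across the die, in W/mm^2."""
    return tdp_watts / die_area_mm2

# Hypothetical numbers: a large hot chip vs. a smaller, lower-power one.
r600_density = power_density(215.0, 420.0)   # assumed ~215 W over ~420 mm^2
rv670_density = power_density(105.0, 190.0)  # assumed ~105 W over ~190 mm^2

# The smaller chip shows the higher flux per unit area, so the die sensor
# reads hotter even though the cooler has less total heat to remove.
print(f"large die:  {r600_density:.2f} W/mm^2")
print(f"small die: {rv670_density:.2f} W/mm^2")
```

Under these assumed numbers the smaller die has the higher W/mm² figure, which matches what the Catalyst Control Center sensor reports, while the heatsink itself stays touchable because total dissipation is lower.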

Whether you use the “hand test” or a thermometer, the implementation of AMD’s mobile technology in these desktop parts is apparent. There are actually three power states in which the GPU operates. The first two are familiar: full performance with everything turned on, and idle with the 3D engine down-clocked. A new third state exists either when the GPU is doing 3D rendering for the Vista desktop or when the shaders are being used for video decoding or other application acceleration. The Radeon HD 3800 series will also have an updated Universal Video Decoder (UVD) for hardware acceleration of HD DVD and Blu-ray movies.

We are sure you are more interested in how these new cards perform compared to the existing GeForce 8800GT and Radeon HD 2900XT, so we will keep this article short and sweet and follow up with another covering power efficiency in each state, temperatures, video playback and any other intricacy you might like to know about.

Test Setup

We ran the standard benchmarks we have run in the past, but included Bioshock and Crysis to see what these loads can do to the new cards. For Bioshock we turned up all of the image quality settings under DX9 (Windows XP), and for Crysis we set everything in the game to High. We plan to follow up with additional settings to examine AA performance, as well as a couple more games. Since we had limited time before the launch deadline, we wanted to capture as much performance information as possible within the timeframe of the launch.

System Hardware

Processor: Intel Core 2 Extreme X6800 “Conroe” (2.93 GHz, 1,066 MHz FSB, 32 KB + 32 KB L1, 4 MB L2)

Platform (Nvidia): XFX MB-N680-ISH9, LGA 775, Nvidia nForce 680i SLI

Platform (ATI): Intel D975XBX, LGA 775, Intel 975X Express chipset

RAM: Corsair CM2X1024-9136C5D, 2x 1,024 MB DDR2 (CL5-5-5-15)

Hard Drive: Western Digital Raptor WD1500ADFD (150 GB, 10,000 rpm, 16 MB cache, SATA150)

Networking: nForce4 Gigabit Ethernet (Nvidia), Intel 82573E/82573L Gigabit Ethernet controller (Intel)

Graphics Cards

ATI Radeon HD 3870 512 MB GDDR4: 775 MHz core, 320 stream processors @ core frequency, 1,125 MHz memory (2.25 GHz DDR)

ATI Radeon HD 3850 256 MB GDDR3: 670 MHz core, 320 stream processors @ core frequency, 830 MHz memory (1.66 GHz DDR)

ATI Radeon HD 2900 XT 512 MB GDDR3: 740 MHz core, 320 stream processors @ core frequency, 825 MHz memory (1.65 GHz DDR)

Nvidia GeForce 8800 GT 512 MB GDDR3: 600 MHz core, 112 stream processors @ 1.50 GHz, 900 MHz memory (1.80 GHz DDR)

Power Supply: TopPower Powertrain 900 W

CPU Cooler: Zalman CNPS9700 LED

System Software & Drivers

OS: Microsoft Windows XP Professional 5.10.2600, Service Pack 2

DirectX Version: DirectX 9.0c

Graphics Drivers: ATI Catalyst 7.10 (HD 2900 XT); ATI sample_xp32-64_8-43 (HD 3850 & HD 3870); Nvidia Forceware 169.02

 

 


Benchmark results

3DMark 05

Looking at the 3DMark results, you will see a few things. First, the AMD HD series cards are capped at 1024x768 with AA and AF disabled: single-card and CrossFire scores differ by less than 100 marks. Notice that the Nvidia results are not much higher at the same setting. As you will see in other applications, the platform with the Intel chipset is not as fast as the Nvidia-based motherboard.

Looking at 1280x1024, you will notice an odd issue in CrossFire, where performance is lower than it theoretically should be. Overall, the GeForce 8800GT has the advantage in total score, but both newcomers trail it only by small margins. The Radeon HD 3850 has only 256 MB of frame buffer memory, so it tails off at high resolutions in a synthetic like 3DMark, which stresses all of the GPU’s subsystems, especially with AA enabled.

Doom 3

Under Doom 3 the system performance plays a significant role in determining what the maximum frame rates can be. All of the AMD cards tested on the Intel board are capped at 140 frames per second while the Nvidia board caps around 180. Looking at the data, the 3850 in CrossFire was able to break this threshold.

So where exactly are the bottlenecks, and how do they form? You can see that SLI also pushed past 180, but not by much. According to Terry Makedon (Catalyst Product Manager), the extra CPU load that CrossFire imposes under the test drivers might be one of the reasons we see limited performance in one area and not in another. We are planning on getting improved drivers from both AMD and Nvidia for the issues we see in Doom 3 as well as in Crysis. With 4x antialiasing and 8x anisotropic filtering, the 3870 pulls ahead of the 8800GT in single-card operation.

F.E.A.R.

The GeForce 8800GT keeps the AMD cards at bay under F.E.A.R. The largest gap is 10 frames per second without AA and with trilinear filtering, and 13 frames with 4x AA and 8x AF enabled. AMD told us there is a hardware issue inherent in the R600-based architecture that is handled better in RV670 and will be completely fixed in R700. In the meantime, Nvidia wins this round with F.E.A.R.

Elder Scrolls IV: Oblivion

Oblivion has been our torture test since it was released, and we have always run it with 4x AA to kill graphics cards. In dual-card configurations, all of the cards can handle the tested resolutions with the exception of the HD 3850, which just misses the minimum 30 fps boundary. Aside from a driver issue at 2560x1600 in single-GPU mode, the Radeon HD 3870 beats the GeForce 8800GT. Oblivion is shader intensive, and AMD’s new cards compare very well against the 8800GT. The indoor results show the platform limitations, but the GeForce 8800GT does very well even at the higher resolutions.

Bioshock

We used two different locations in the game to test, as they offer different view distances and events within their scenes. In dual-card operation, the 8800GT in SLI has a 10-frame lead over the AMD cards in the Farmer’s Market test. This is a closed area we use to show the effects of smoke, bees and reflections, and to see how the shaders perform. Since the scene is limited in overall view distance, some of the variance could be due to the platform. Past 10x7 and 12x10, the new Radeon HD 3870 surpasses the 8800GT. Moving to the second test in Neptune’s Bounty, the Nvidia GeForce 8800GT leads by 10 frames at 1024x768, but is then tied and loses the rest of the test.

Crysis

The last game on the list is the pre-release version of Crysis. There is a known “bug” with CrossFire that Crytek says will be patched one week after the game launches. Whether this is a competitive tactic (Intel paid $2 million and Nvidia $4 million to have their companies in the game’s marketing: in-game, on the box and in promotions) or merely a last-minute change, we can only leave to skepticism and the rumormongers. We hope to retest with the patch and with different drivers to see if we can get better results.

Looking past the unknowns, AMD’s scaling is broken under the current conditions. Additionally, with all of the settings turned to “High,” all of the cards can at least play the game in single-card operation at 1024x768; at 1280x1024, the 3850 drops out of the race. Nvidia stays close to 30 fps at 1600x1200, which is impressive. Until we get fixes for the game and/or better drivers, we will have to weigh Crysis less than the other established titles. It does, however, show that all of the midrange cards can play the game with all of the bells and whistles enabled.

Conclusions

At the end of the day, you have to consider what works best for you. Your decision should weigh four key factors: price, performance, scalability and features. While Nvidia held onto its performance crown in many areas, the Radeon HD 3870 did very well against it and even beat it in several situations.

So if you have cards that are comparable in performance, you have to compare prices, or more exactly ask yourself: what are you getting for the price? There is a new leader in price per frame per second: the Radeon HD 3850.
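Price per frame per second is simple to compute for yourself. The street prices below come from this article, but the average frame rates are placeholders, not our measured results; substitute the numbers from the charts for the game and resolution you care about:

```python
# Dollars per average frame per second -- lower is better.
# Prices are the launch figures cited above; fps values are placeholders.
cards = {
    "Radeon HD 3850": (179, 55.0),
    "Radeon HD 3870": (219, 62.0),
    "GeForce 8800GT": (280, 68.0),
}

def dollars_per_fps(price: float, fps: float) -> float:
    """Cost of each average frame per second delivered."""
    return price / fps

# Rank the cards from best to worst value under these placeholder fps numbers.
for name, (price, fps) in sorted(cards.items(),
                                 key=lambda kv: dollars_per_fps(*kv[1])):
    print(f"{name}: ${dollars_per_fps(price, fps):.2f} per fps")
```

Note that the ranking can flip per game and per resolution, which is why the metric only means something once you pick the workload that matches how you actually play.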

All of the midrange cards have something to offer. The Nvidia GeForce 8800GT holds more wins than losses, so it is still the best-performing midrange graphics card. The Radeon HD 3850 is the best value for the performance you get; looking at its performance in CrossFire, it makes for a very compelling system at $360, which is only $60-80 more than a single GeForce 8800GT. The AMD HD 3800 cards scale better overall: Nvidia has stated that only its high-end cards will support 3-way SLI, while AMD will scale to 2-, 3- and 4-way CrossFire. We will have to test how much extra performance each additional card adds relative to its cost, but at least AMD is offering the option.

The last item to weigh in your decision is the feature set of each card. AMD and Nvidia both have hardware video decoding, but AMD offers full VC-1 hardware decode. According to HighDefDigest.com, the majority of HD DVD titles (87%) use the VC-1 codec, while 10% use H.264. For HD DVD owners, this makes AMD’s Universal Video Decoder more valuable than Nvidia’s PureVideo solution.

Another feature AMD holds over the Nvidia GeForce 8800GT is full support for DirectX 10.1. DX10.1 brings some promising improvements for game development: tighter blending and filtering formats, a set of specified multi-sample antialiasing (MSAA) patterns, and the inclusion of custom filters. The biggest advances, however, may come from the extension of shader model (SM) 4.0 to SM4.1 and, more significantly, from the adoption of techniques using cube map arrays for effects like global illumination. Keep in mind that some of these will not be implemented in the short term.

Of all of the cards, the value and overall performance title goes to the AMD Radeon HD 3870. It has enough horsepower to play all of the games currently on the market very well. Purchasing any of the midrange cards will not leave you short on performance, but the AMD solutions, and especially the HD 3870, offer the most bang for the buck.

Correction

We incorrectly put 575 MHz as the Nvidia 8800 GT clock speed.  The actual speed is 600 MHz.  The 8800 GTX is the card with the 575 MHz clock speed.