
NVIDIA NV31 & NV34 Review

NVIDIA has created an all-inclusive line-up of 3D cards for every budget, and the NV31 and NV34 perform above their targeted markets.


Introduction


As a follow-up to our NVIDIA GeForce FX Family Preview, which we posted HERE last week, NVIDIA has sent us a couple of the new FX cards to test. We received an NV31 (GeForce FX 5600 Ultra) and an NV34 (GeForce FX 5200 Ultra), NVIDIA’s Performance and Mainstream market products, respectively. Scott took the higher-end 5600 Ultra for testing on his nForce2/XP2700+ computer, while I took the 5200 Ultra for testing on the Bjorn3D reference computer, an nForce2 with an AMD XP2000+. This review concentrates on the performance of these cards in the benchmark arena.

The Cards


I do want to go over the cards briefly before we look at the performance numbers. I highly recommend reading our GeForce FX Preview and GeForce FX Family Preview for more in-depth technology discussions.


NV34 – GeForce FX 5200 Ultra

The 5200 Ultra came with the now-familiar reference ducted heatsink/fan cooler. These have worked well and should be ideal for the lower-clocked 5200s. The DDR SDRAM modules (located on both the front and back of the card) are not cooled by heatsinks (more on cooling later). The 5200 reference card also includes external ports to connect VGA and DVI monitors, plus an S-Video port for TV connection, which pulls VIVO (video in/video out) duty on VIVO-capable cards. This arrangement will likely be the most common choice for the 5200 Ultra. The lower-end GeForce FX 5200 (non-Ultra) may not offer this much variety; we might see board manufacturers offer dual-VGA and even dual-DVI, as we’ve seen on several GeForce4 retail boards. A notable addition is the external power connector. NVIDIA requires this 4-pin connection to ensure (1) that any board with an acceptable AGP slot can run these new GeForce FXs, and (2) that the GPU and RAM simply receive proper power. Note the significant number of capacitors and other components at the right edge of the board, which are necessary to regulate the straight 12V feed from the power supply. Beyond that, the 5200 Ultra is not an exceptionally long card; it’s about 1cm longer than my Radeon 9700. The NV34 I received was clocked by default at 325MHz core and 650MHz memory on 2.5ns DDR. That 2.5ns DDR is rated to 800MHz (see our overclocking results, next page).
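As a quick back-of-the-envelope check, the 800MHz rating follows directly from the 2.5ns access time: the inverse of the access time gives the base clock, and DDR moves data twice per cycle. A minimal sketch (the function name is ours, purely illustrative):

```python
def ddr_rated_mhz(access_time_ns: float) -> float:
    """Effective DDR data rate implied by a RAM chip's access time.

    1 / access_time gives the base clock; DDR transfers data
    twice per clock cycle, so the effective rate is doubled.
    """
    base_clock_mhz = 1000.0 / access_time_ns  # ns period -> MHz
    return 2 * base_clock_mhz

# 2.5ns DDR, as found on the reference card:
print(ddr_rated_mhz(2.5))  # -> 800.0
```

By the same math, the stock 650MHz effective memory clock is well under what the chips are rated for, which hints at the overclocking headroom discussed later.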


NV31 – GeForce FX 5600 Ultra

Referencing NVIDIA’s reference picture for the 5600 Ultra (Scott’s digital camera is on the fritz), you’ll quickly notice that it looks pretty much identical to the 5200 Ultra. In the end, they are similar GPUs (except for some minor differences which we discussed HERE), and they can easily be supported by similar board designs. Oddly enough, Scott’s reference NV31 also had 2.5ns DDR onboard. I’d expect board manufacturers to take advantage of lower-speed RAM on production GeForce FX 5200s, but we’ll see.

Installation


Installation was a breeze under Windows XP for both cards. NVIDIA supplied us with Detonator drivers, version 42.72, the currently recommended FX driver release. Installation was as simple as removing the old card (a Ti4600 in Scott’s case and a Radeon 9700 in mine), inserting the new card, and rebooting. Upon boot-up, the driver executable installed the new drivers and prompted for a reboot. After this second boot, the cards worked as expected. Using the Coolbits registry trick, we added the ability to under/overclock our GPUs. The default clock speeds were 325/650 for the GeForce FX 5200 Ultra and 350/700 for the GeForce FX 5600 Ultra.

Next page: the test setup and benchmarks…

Test Setup & Benchmarks


Scott’s System:
  • Asus A7N8X nForce2 Motherboard
  • AMD XP2700+
  • 512MB Corsair XMS PC3200 DDR
  • On-board Soundstorm Audio
  • On-board 10/100 NIC
  • Windows XP Professional
  • Reference NVIDIA GeForce FX 5600 Ultra & BFG GeForce4 Ti4600
My System:

  • Leadtek K7NCR18D nForce2 Motherboard
  • AMD XP2000+
  • 512MB OCZ PC3000 DDR
  • On-board Soundstorm Audio
  • On-board 10/100 NIC
  • Windows XP Professional
  • Reference NVIDIA GeForce FX 5200 Ultra & Reference GeForce4 MX460

With the above systems, we ran our GPUs through Unreal Tournament 2003, 3DMark 2001SE, and the Comanche 4 benchmark demo. Note that we haven’t used any DirectX 9 benchmarks; the games tested are DirectX 8.1 at best.

GeForce FX 5600 Ultra (NV31)

Scott was surprised to see the GeForce FX 5600 Ultra trailing the Ti4600 by such a margin, but Comanche 4 is a demanding DirectX 8 engine. The 5600 does catch and even surpass the Ti4600 when antialiasing (AA) and anisotropic filtering (AF) are enabled, but not by any worthwhile margin, and at unplayable frame rates. However, these cards are dead even on pricing: street pricing on a Ti4600 is around $192, while the suggested retail price for the 5600 Ultra is $199.

UT2003 tells a similar story to Comanche 4. The Ti4600 bests the 5600 Ultra by 26-32% without AA or AF. However, the FX’s enhanced CineFX engine does show a performance boost when AA and AF are cranked up. At a modest 4x/4x setting (NVIDIA recommends testing up to 8x), the FX pulls away by 20%.

The UT2003 botmatch benchmarks are a little quirky, since the movement and actions of the bots can and do vary from run to run, but Scott chose to include them here as well. The botmatches add more effects and triangles for the GPUs to render. Obviously, the numbers have dropped quite a bit from the flyby because of this, but the game is quite playable at 4xAA and 4xAF on both cards at 1024×768. Overall, we’re again seeing the Ti4600 surpassing the FX 5600 Ultra without AA and AF, while the FX passes the Ti4600 with the extra features turned on.

Again we see the flip-flop with the AA and AF effects, but the 3DMark scores at 1600×1200 with 4xAA and 4xAF are off the charts! Scott reran these with the same result. NVIDIA certainly has 3DMark’s Max Payne engine beaten with the FX. I typically hold a score of 2000 as a minimum playable score, and we’re well beyond that at all points, except for the Ti4600 at 1600×1200 with 4xAA and 4xAF, where the extra processing requirements have taken their toll.

All in all, considering that the 5600 Ultra is to debut at the same price as current Ti4600s, it appears to me that the more advanced engine of the FX is the better choice based on these benchmarks. Trailing the Ti4600 in the non-AA/AF tests does not bother me, since the FX 5600 Ultra provides more than acceptable scores in all of these areas.

GeForce FX 5200 Ultra (NV34)

A quick note: the GeForce4 MX460 does not support 4x anisotropic filtering. I’ve run these scores at 2xAF on the MX460 and compared them to 4xAF scores on the 5200 Ultra, giving the MX an unfair advantage, but as you’ll see, it doesn’t help.

Unlike Scott’s tests of the FX 5600 Ultra, my testing of the FX 5200 Ultra already shows the 5200 surpassing the MX460 even without antialiasing and anisotropic filtering. As I’ve said, Comanche 4 is a tough benchmark to beat. At 1024×768 with 4xAA and 4xAF, the 5200 Ultra doubles the performance of the MX460, which was running at only 4xAA and 2xAF.

The Unreal Tournament 2003 graphs look very similar to the Comanche 4 graphs. Again, we’re seeing a doubling of performance at 1024×768 with AA and AF. At 1600×1200, I believe I’m hitting a CPU barrier, and the 5200 Ultra is not getting enough juice to outperform the MX460 at that resolution with AA and AF. There is only a slight, but noticeable, lead in the non-AA/AF tests.

3DMark shows a practically linear increase in performance of the 5200 Ultra over the MX460. It’s about 20-25% faster than the MX460 without AA and AF; turn AA and AF on, and we’re seeing a 50% performance boost.

Overclocking


Now here’s where it gets fun. Remember how I said that the 5200 Ultra and 5600 Ultra are running the same speed RAM? Well, our overclocking results show that and more.

After playing around with the overclocking settings in the advanced display properties of the NVIDIA driver set, I found a sweet spot at 362MHz on the core and 775MHz on the memory for the 5200 Ultra. That’s a nice 11% increase on the core and ~19% on the RAM. The best part is that I got far more than a 20% increase in performance! The graph above shows my score in 3DMark 2001SE at 1600×1200 with 4xAA and 4xAF enabled: a 280% increase in performance at these speeds! Let’s just hope that the board manufacturers aren’t shy with the speed of the RAM they put on their 5200 Ultras. In effect, I’m getting a 5600 Ultra, just without the Intellisample technology and with slightly slower RAMDACs. The core was running pretty hot when overclocked and benchmarked, though. Not too hot to touch, but definitely warmer than usual; good case cooling is in order.
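Those gains are simple arithmetic over the stock 325/650 clocks. A quick sketch, with a helper function of our own naming:

```python
def pct_increase(stock_mhz: float, oc_mhz: float) -> float:
    """Percentage gain of an overclocked speed over stock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100.0

# GeForce FX 5200 Ultra: stock 325/650, overclocked 362/775
print(round(pct_increase(325, 362), 1))  # core:   11.4
print(round(pct_increase(650, 775), 1))  # memory: 19.2
```

Note that 775MHz is still short of the 800MHz the 2.5ns chips are rated for, so the memory headroom isn’t surprising.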

Scott didn’t quite reach the level that I did on the memory, but he probably could have pushed it a little further. Scott chose to benchmark without AA and AF as a quick comparison, and to show that the overclocked FX 5600 Ultra now bests the Ti4600 even in this arena. His scores in 3DMark 2001SE show an 11% to 13% lead over the Ti4600 at usable overclock speeds.

Conclusions


Well, NVIDIA’s FXs are finally here. After playing with the NV34 GeForce FX 5200 Ultra, I came away impressed with the “Mainstream” market GPU’s performance. Scott was satisfied with the 5600 Ultra, given its price point. All in all, NVIDIA has created an all-inclusive line-up of 3D cards for every budget. Factoring in the overclocking potential, I don’t think you could go wrong with a 5200 Ultra or 5600 Ultra in your system.

With great high-setting (AA/AF) performance, DirectX 9 hardware across the board, and excellent overclocking potential, we’re awarding the GeForce FX 5200 Ultra and the GeForce FX 5600 Ultra a score of 9.5 out of 10 and the Bjorn3D Golden Bear Award.
