
Leadtek 8800 GT 256MB aka G92

The sample I’ve tested costs roughly $215, while the 512 MB model positions itself at around $250-260. Personally, I’d rather pay the difference and not worry about stuttering at higher resolutions with 4xAA / 16xAF applied.


INTRODUCTION

The past year went the way NVIDIA had always wished: stay on top and keep the performance crown. Not that they never did in the past, but this seems to be the trend, and it spans both the desktop and mobile markets, from high-end to low-end SKUs. As always, the first batch of high-end G80 cards was expensive and out of reach for a lot of people. Since performance wasn’t the sole deciding factor, the video chip giant had to make a smart move and introduce a cheaper take on the 8800 GTX. Built on the cheaper 65 nm process, this SKU, code-named G92, became a product for the masses while keeping most of the G80’s characteristics. Shrinking the die and selling it cheaper isn’t only NVIDIA’s strategy: if you remember, AMD did the same thing a while back with the Radeon HD 3850/3870, which brought high-performing cards to almost everyone. OK, enough of my gum flapping, let’s get down to business.

YET ANOTHER VALUABLE SKU?

Though I’m not planning on delving into a lot of detail, a brief explanation of the features and specs is in order. As I’ve already mentioned, the Leadtek PX8800 GT is a cut-down version of the more expensive 8800 boards, and its main features closely resemble the G80’s. There are differences, which I will go over in just a second, but here is what we already have.

  • NVIDIA® unified architecture with GigaThread™ technology:  Massively multi-threaded architecture supports thousands of independent, simultaneous threads, providing extreme processing efficiency in advanced, next generation shader programs.
  • NVIDIA® Lumenex™ Engine:   Delivers stunning image quality and floating point accuracy at ultra-fast frame rates.
  • Full Microsoft® DirectX® 10 Support:   World’s first DirectX 10 GPU with full Shader Model 4.0 support delivers unparalleled levels of graphics realism and film-quality effects.
  • Dual 400MHz RAMDACs:  Blazing-fast RAMDACs support dual QXGA displays with ultra-high, ergonomic refresh rates–up to 2048 x 1536 @ 85Hz.
  • Dual Link DVI:  Capable of supporting digital output for high resolution monitors (up to 2560×1600).
  • NVIDIA® SLI™ Technology:  Delivers up to 2x the performance of a single GPU configuration for unparalleled gaming experiences by allowing two graphics cards to run in parallel. The must-have feature for performance PCI Express graphics, SLI dramatically scales performance on over 60 top PC games.
  • PCI Express™ Support:  Designed to run perfectly with the next-generation PCI Express bus architecture. This new bus doubles the bandwidth of AGP 8X delivering over 4 GB/sec. in both upstream and downstream data transfers.
  • 16x Anti-aliasing:  Lightning fast, high-quality anti-aliasing at up to 16x sample rates obliterates jagged edges.
  • NVIDIA® PureVideo™ Technology:  The combination of high-definition video processors and NVIDIA DVD decoder software delivers unprecedented picture clarity, smooth video, accurate color, and precise image scaling for all video content to turn your PC into a high-end home theater. (Feature requires supported video software.)
  • OpenGL™ 2.0 Optimizations and Support:  Ensures top-notch compatibility and performance for all OpenGL applications.
  • NVIDIA® nView® Multi-Display Technology:  Advanced technology provides the ultimate in viewing flexibility and control for multiple monitors.

There are three important factors that make the GeForce 8800 GT a better run for the money. First and foremost, the GPU is manufactured on a 65 nm process, making it a smaller and cooler chip. The second change involves the way the G92 communicates with the rest of the system: PCI Express 2.0, which in my opinion is more of an evolution than a revolution (much like AGP 4x -> AGP 8x). The last change, often overlooked on enthusiast SKUs, is VP2, the video processor that handles HD video decoding.

Video Card              GeForce 8800 GT            GeForce 8800 GTX
GPU                     G92                        G80
Process                 65nm (TSMC fab)            90nm (TSMC fab)
Transistors             ~754 Million               ~681 Million
Memory Architecture     256-bit                    384-bit
Frame Buffer Size       256 MB GDDR3               768 MB GDDR3
Rasteriser              112 unified shaders,       128 unified shaders,
                        16 ROPs                    24 ROPs
Bus Type                PCI-e 2.0 x16              PCI-e 1.0 x16
Core Clock              600 MHz                    575 MHz
Memory Clock            1800 MHz DDR3 (effective)  1800 MHz DDR3 (effective)
Shader Clock            1500 MHz                   1350 MHz
RAMDACs                 2x 400 MHz DACs            2x 400 MHz DACs
VP2 / HD decoder        Yes                        No
Memory Bandwidth        57.6 GB/sec                86.4 GB/sec
Pixel Fillrate          9.6 GPixels/sec            13.8 GPixels/sec
Texture Fillrate        9.6 GTexels/sec            13.8 GTexels/sec
DirectX Version         10                         10
Pixel Shader            4.0                        4.0
Vertex Shader           4.0                        4.0
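The bandwidth and fillrate figures for the 8800 GT follow directly from the bus width and clocks. As a quick sanity check, here is my own arithmetic, using the chart’s ROPs × core clock convention for pixel fillrate:

```python
def mem_bandwidth_gbs(bus_bits, effective_mhz):
    """Memory bandwidth in GB/s: bus width (bits) x effective memory clock (MHz)."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

def pixel_fillrate_gps(rops, core_mhz):
    """Pixel fillrate in GPixels/s: ROP count x core clock (MHz)."""
    return rops * core_mhz * 1e6 / 1e9

# GeForce 8800 GT: 256-bit bus, 1800 MHz effective GDDR3, 16 ROPs @ 600 MHz
print(mem_bandwidth_gbs(256, 1800))  # 57.6 GB/s, matching the chart
print(pixel_fillrate_gps(16, 600))   # 9.6 GPixels/s, matching the chart
```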

 


GPU-Z output

From the chart above you can spot the obvious differences between the new GT (G92) and the old GTX (G80): a smaller process, more transistors, a slightly cut-down rasteriser, the PCI Express interface, clocks and VP2. Please note that our Leadtek PX8800 GT came with 256 MB of on-board RAM; a bit cheap in my opinion, as you will be severely limited at high resolutions. Applying FSAA will hurt performance even more, as you will soon find out.

THE CARD & BUNDLE

Our Leadtek PX8800 GT came inside a nicely designed box. If you look at the pictures, the material of the package isn’t your standard printed cardboard; its surface has a flashy feel to it and is easily spotted. The sides and back list the major features, package contents and system requirements. As far as the bundle is concerned, it looks generous. Along with the PX8800 GT you get:

  • DVI to VGA converter
  • HDTV cable
  • 6-pin power connector
  • Quick Installation guide
  • Driver CD
  • CyberLink PowerDVD 7.0
  • Orb
  • Neverwinter Nights 2

The card itself is covered by a well-integrated cooling system, while the back of the PCB looks rather standard. There are stickers indicating the model, so make sure you get what you’ve paid for. On the back I/O panel we get the regular display support: VGA compatible, VESA compatible BIOS for SVGA and DDC 1/2b/2b+, all available through dual DVI-I outputs and an S-Video port (HDTV ready). The interesting part is the clocks: the higher-end 512 MB model comes with exactly the same core / memory clocks as this 256 MB version (600 MHz core / 1800 MHz effective DDR3). Most, if not all, companies tend to lower the clocks, but Leadtek decided to make the two equal in everything except frame buffer. The last two pictures compare Leadtek’s PX8800 GT and the PowerColor HD 3850 Xtreme. Size-wise they’re about the same, except that the Radeon occupies two slots.

TESTING METHODOLOGY

Although the card gets memory limited at high resolutions, I’ve decided to drop all non-AA/AF modes. Gaming tests were performed at 4xAA / 16xAF, with image settings otherwise depending on game defaults. The following resolutions were used to test the performance of the card: 1024×768, 1280×1024 and 1600×1200. Due to the 256 MB framebuffer, some games simply refused to run with antialiasing enabled. As far as driver settings are concerned, the card was clocked at the standard 600 MHz core and 1800 MHz DDR3 memory. High quality AF was used for anisotropic filtering; all other settings were left at their defaults.
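To get a feel for why 4xAA is painful on a 256 MB card, here is a rough back-of-envelope estimate of render-target memory at the tested resolutions. This is my own simplified model (32-bit color and 32-bit depth/stencil per sample, plus resolved back and front buffers); real games also need textures, geometry and driver working buffers, so actual memory pressure is much higher:

```python
def aa_rendertarget_mb(width, height, samples=4):
    """Rough MSAA render-target footprint in MB:
    multisampled color + depth/stencil (4 bytes each per sample),
    plus single-sample resolved back and front buffers."""
    pixels = width * height
    msaa = pixels * samples * (4 + 4)   # color + depth/stencil, per sample
    resolve = pixels * 4 * 2            # resolved back buffer + front buffer
    return (msaa + resolve) / (1024 ** 2)

for res in [(1024, 768), (1280, 1024), (1600, 1200)]:
    print(res, round(aa_rendertarget_mb(*res), 1), "MB")
```

Even this optimistic estimate eats roughly 30, 50 and 73 MB of the 256 MB frame buffer at the three resolutions, before a single texture is loaded.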

PLATFORM

All of our benchmarks were run on an Intel Core 2 Duo platform clocked at 3.0 GHz. Performance of the Leadtek WinFast PX8800 GT was measured on an ASUS P5N-E SLI motherboard. The table below shows the test system configuration as well as the benchmarks used throughout this comparison.

As far as installation goes, I haven’t noticed anything wrong. The main system used was Windows XP with SP2; all DX10 tests were run under Vista. For overclocking, I used RivaTuner to determine the maximum overclock / temperature, along with ATITool to load the GPU and check for artifacts.

Testing Platform
Processor         Intel Core 2 Duo E6600 @ 3.0 GHz
Motherboard       ASUS P5N-E SLI
Memory            GeIL PC2-6400 DDR2 Ultra 2GB kit
Video card(s)     Leadtek WinFast PX8800 GT 256MB
                  PowerColor HD 3850 Xtreme
Hard drive(s)     Seagate SATA II ST3250620AS
                  Western Digital WD120JB
CPU Cooling       Cooler Master Hyper 212
Power supply      Thermaltake Toughpower 850W
Case              Thermaltake SopranoFX
Operating System  Windows XP SP2 32-bit
                  Windows Vista Business 32-bit
API, drivers      DirectX 9.0c / DirectX 10
                  NVIDIA Forceware 169.21 & 169.25
                  ATI CATALYST 8.1
Other software    ATITool, RivaTuner

Benchmarks
Synthetic         3DMark 2006
Games             Bioshock / FRAPS
                  Colin McRae: DiRT / FRAPS
                  Half-Life 2: Episode 1 / Bjorn3D custom timedemo
                  Unreal Tournament 3 demo / vCTF-Suspense flyby
                  World In Conflict / ingame benchmark
 

3DMark 06

DirectX 9: BIOSHOCK

DirectX 9: COLIN MCRAE DIRT

DirectX 9: HALF-LIFE 2 EPISODE ONE

DirectX 9: UNREAL TOURNAMENT 3

DirectX 9: WORLD IN CONFLICT

DirectX 10: BIOSHOCK

DirectX 10: UNREAL TOURNAMENT 3

DirectX 10: WORLD IN CONFLICT

POWER & TEMPERATURE

Power consumption is a critical factor when building a gaming station. Some people care how much power each component needs and some simply don’t care at all. I always want to know how many watts it takes to power a decent PC; that way I can buy an appropriate power supply and save some money for something else. Note that the results below will vary depending on your system configuration.

What we have above is a graph with two different sets of information: temperature (in Celsius) and peak power (in watts). These are the highest numbers I was able to record, not averages as with frames per second. Although our Leadtek WinFast PX8800 GT comes with 256 MB of memory, it is clocked just like any other GT in its class. During simple web browsing and document editing the card’s temperature reached 49°C and the system drew around 124 W. Because you don’t buy an 8800 GT just to surf the web, we took it for a spin with our favorite 3DMark06 benchmark. The results weren’t impressive, especially the temperature, which hovered around 74 degrees Celsius. As for power, the system drew 211 W.

OVERCLOCKING

Leadtek released their 256 MB 8800 GTs factory overclocked, meaning they are clocked the same as the 512 MB models. Ain’t that sweet? Although the clocks won’t help as much as a bigger frame buffer would, it’s a nice addition from Leadtek nonetheless. The card comes clocked at 600 MHz core and 900 MHz memory (1800 MHz effective). I’ve heard a lot of great things about the G92’s overclockability; let’s find out whether that checks out.

Overclocking results were achieved with the help of RivaTuner and ATITool. The latter helped me push the GPU and memory to their limits with its built-in artifact scanner; the last phase involved 3DMark06 testing. As far as results are concerned, our Leadtek WinFast PX8800 GT reached a maximum of 635 MHz on the core, while the memory ended up at 970 MHz (1940 MHz effective), all artifact free. Anything beyond these numbers resulted in either screen corruption or a random freeze-up / reset / BSOD.

Core clock difference (600 MHz stock): ~6%
Memory clock difference (900 MHz stock): ~8%
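For reference, the headroom figures come straight from the stock versus achieved clocks, rounded to the nearest percent:

```python
def oc_gain_pct(stock_mhz, oc_mhz):
    """Overclock headroom as a percentage over the stock clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

print(round(oc_gain_pct(600, 635)))  # core: 6 (%)
print(round(oc_gain_pct(900, 970)))  # memory: 8 (%)
```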

CONCLUSIONS

Most, if not all, has been said about the G92, not necessarily in this article, but elsewhere. I’m just a messenger bringing news about a particular product, in this case the Leadtek WinFast 8800 GT 256 MB. You might be wondering what I’m doing with this card when a new generation of GPUs is around the corner, but we’ve yet to see cards performing so well for this sum of money. Well, except for the PowerColor HD 3850 Xtreme, which kicked some serious four letters. This particular G92 from Leadtek is a tad late, but better late than never, right? To tell you the truth, the differences between G92 and G80 are similar to those between R600 and RV670. The whole idea of these kicker SKUs is familiar: similar rasterisers (taken from the higher-end models), PCI-e 2.0, UVD (RV670) or VP2 (G92), and a die shrink. G92, however, lacks DirectX 10.1 and Shader Model 4.1 support.

Some of you may be bothered by the 256 MB of on-board RAM. I don’t know why Leadtek or any other company would release a high-end card with only half the frame buffer. The sample I’ve tested costs roughly $215, while the 512 MB model positions itself at around $250-260. Personally, I’d rather pay that extra $35 and not worry about stuttering at higher resolutions with 4xAA / 16xAF applied. The card has great potential, but not with the crippled memory buffer. Of course you can enjoy all the details at 1024×768, but who uses that resolution today? That’s the only real gripe I have with the card, and a 512 MB model is available.

Pros:
+ Great for low res gaming / HTPCs
+ Factory overclocked
+ VP2 video processor
+ Good price point

Cons:
– Poor performance at high res with 4xAA/16xAF applied
– Competition has a better deal and 512 MB memory config
For its performance and price, the Leadtek WinFast PX8800 GT receives a score of 8 out of 10 (Very Good) and the Bjorn3D Seal of Approval award.
