iXBT Labs - Computer Hardware in Detail






NVIDIA GeForce 8800 GTS (G80):
Say Hello to DirectX 10 Again, or 96 Unified Processors Left in Action

Cards from XFX and BFG


  1. Introduction
  2. Video cards' features
  3. Testbed configuration, benchmarks
  4. Game test results (performance)

Once upon a time there lived... a user. A common user. OK, maybe already an a-d-v-a-n-c-e-d one, because he was experienced with computers. He liked to play games, especially RTS. He scorned the appearance of 3D accelerators - "who needs to pay extra money for game decorations?" But days turned to weeks and months, Glide was calling, post-processing was luring, texturing and raster units were bewitching...

The user got used to it... so much that he got addicted to 3D. He knew that if a game bore the 3dfx label, it must be cool. You could see another world behind the display; it was not flat anymore. And then came Unreal... The user got in up to his neck... Deeper and deeper... 128×128 16-bit textures were not enough anymore...

The new philosophy came to replace the old one - Rivism! Yeah, only Riva... Onward to better graphics... You all know what happened later... Riva morphed into GeForce. Voodoo went bankrupt (no one came to its shrines anymore to leave money there). RAGE died of shame for its bilinear filtering and changed its name to RADEON, a star name no one had heard of. Savage was caught, tied up, and thrown onto the VIA reservation, where the company still lives at its host's expense... Matrox... That's a real tragedy. Millennium was over, as was Voodoo; the famous name no longer attracted anyone... Matrox exerted itself to overcome its competitors... and then it gave birth to Parhelia. The two suns were dull and artificial; only people in Montreal liked them.

And there came shaders... Yep, at first programmers had drawn triangles on a white sheet, painted them, and told users about photorealism. Everything was simple before shaders: programmers used a slide rule (inside GPUs this was done by transistors; NVIDIA, ATI, and 3dfx had special departments to equip all transistors with slide rules) to multiply, divide, and take averages - all to get a color number. To paint a pixel with. But the number of transistors had to be GREATER than the number of pixels. That's all. If it was the other way around, the transistors needed a second pass... With a break, of course, because even such realism was very slow. On the other hand, it still yielded 35 fps, and all users were happy.

OK... shaders... Clever Microsoft said: enough! Enough of the stone age. 3D graphics should be processed in a different way. Transistors should learn to use a programmable calculator instead of a slide rule: enter several operations at once and get the result. It's much more flexible and easier for developers. The ball started rolling: SM 1.0, 1.1, 1.3, 1.4 (at this stage the Canadian company declared that SM 1.4 would be used only by itself, for nice demos). There was no stopping Microsoft - SM 2.0. That's where NVIDIA fizzled out, as it was carrying the dead body of 3dfx... It was too heavy... Before abandoning it, the company picked its pockets for valuable specialists, who successfully... flunked the launch of GeForce FX... Chasing ATI, NVIDIA carelessly kept the old bus and equipped its product with a "vacuum cleaner" to avoid overheating... To make a long story short, for two years the Californian company generated cheerful reports about technological advances while its engineers kept their heads down... They were designing SM 3.0! Microsoft was out there too, of course. So the combined efforts (bursts of speed from NVIDIA, consistent rake-stepping from ATI) brought both companies to the same level... Neither the 6800 Ultra nor the RADEON X800 XT was to be found... They were hidden so well that gamers offered lots of money for such a miracle. A year and a half earlier they would have fainted at $500; now they were ready to part with $600, to turn their computers into ovens, ruin their PSUs, and make graphics stunning. FarCry had been released, you know... True gamers will understand...

But that's nothing. Shaders were divided: they could either compute vertices or paint pixels. There was no flexibility. Yep, flexibility again - it's always in short supply. There cannot be excessive memory in a system unit, and there cannot be excessive flexibility for game developers. So... a new solution came from California, with unified shaders that could do everything, where each transistor was equipped with its own PDA, smartphone, and notebook. Besides, there were a great many of them... The core grew in size, but it still worked and remained manageable. Now the driver sizes up a game and distributes the roles. But this miracle with 128 processors has already been reviewed - it's the GeForce 8800 GTX.

But we know that the new series from NVIDIA consists of two cards now:

  • NVIDIA GeForce 8800 GTX (G80) 768MB GDDR3, 575/1350/1800 MHz, 128 unified processors/32 TMUs/24 ROPs — $599;
  • NVIDIA GeForce 8800 GTS (G80) 640MB GDDR3, 500/1200/1600 MHz, 96 unified processors/24 TMUs/20 ROPs — $449;

The second product will be reviewed here.
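The spec difference that matters most at high resolutions is raw memory bandwidth: bus width times effective clock. A minimal sketch of that arithmetic (the helper function is ours, not from any vendor tool):

```python
def mem_bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: int) -> float:
    """Raw memory bandwidth: bus width in bytes times effective transfer rate.

    Returns GB/s (1 GB = 10^9 bytes), the convention vendors quote.
    """
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

# GeForce 8800 GTX: 384-bit bus, 1800 MHz effective GDDR3
gtx = mem_bandwidth_gb_s(384, 1800)   # 86.4 GB/s
# GeForce 8800 GTS: 320-bit bus, 1600 MHz effective GDDR3
gts = mem_bandwidth_gb_s(320, 1600)   # 64.0 GB/s
print(f"GTX: {gtx:.1f} GB/s, GTS: {gts:.1f} GB/s")
```

So the GTS gives up roughly a quarter of the GTX's bandwidth, which is why the gap widens at high resolutions with AA enabled.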

By the way, it's already available in stores, so we reviewed production-line cards from two manufacturers. I already mentioned them before, so let's proceed straight to the examination.

Video Cards

XFX GeForce 8800 GTS 640MB PCI-E
GPU: GeForce 8800 GTS (G80)

Interface: PCI-Express x16

GPU frequencies (ROPs/Shaders): 512/1200 MHz (nominal — 512/1200 MHz)

Memory clock rate (physical (effective)): 800 (1600) MHz (nominal — 800 (1600) MHz)

Memory bus width: 320bit

Vertex processors: -

Pixel processors: -

Unified processors: 96

Texture Processors: 24

ROPs: 20

Dimensions: 220x100x32mm (the last figure is the maximum thickness of a video card).

PCB color: black.

RAMDACs/TMDS: in a separate chip.

Output connectors: 2×DVI, TV-Out.

VIVO: not available

TV-out: integrated into GPU.

Multi-GPU operation: SLI (integrated into GPU).

BFG GeForce 8800 GTS 640MB PCI-E
GPU: GeForce 8800 GTS (G80)

Interface: PCI-Express x16

GPU frequencies (ROPs/Shaders): 512/1200 MHz (nominal — 512/1200 MHz)

Memory clock rate (physical (effective)): 800 (1600) MHz (nominal — 800 (1600) MHz)

Memory bus width: 320bit

Vertex processors: -

Pixel processors: -

Unified processors: 96

Texture Processors: 24

ROPs: 20

Dimensions: 220x100x32mm (the last figure is the maximum thickness of a video card).

PCB color: black.

RAMDACs/TMDS: in a separate chip.

Output connectors: 2×DVI, TV-Out.

VIVO: not available

TV-out: integrated into GPU.

Multi-GPU operation: SLI (integrated into GPU).

XFX GeForce 8800 GTS 640MB PCI-E; BFG GeForce 8800 GTS 640MB PCI-E
The video card carries 640 MB of GDDR3 SDRAM in ten chips on the front side of the PCB.

Samsung (GDDR3) memory chips, 1.2 ns access time, which corresponds to a rated clock of up to about 833 (1666) MHz; the memory runs at 800 (1600) MHz.
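The rule of thumb behind that rating is just the reciprocal of the access time, which shows the chips run at 800 (1600) MHz with a small margin; a minimal sketch (the helper name is ours):

```python
def rated_clock_mhz(access_time_ns: float) -> float:
    """Rated physical clock implied by a memory chip's access time."""
    return 1000.0 / access_time_ns

samsung_1_2ns = rated_clock_mhz(1.2)   # ≈ 833 MHz physical
effective = 2 * samsung_1_2ns          # GDDR3 is double data rate: ≈ 1667 MHz
print(f"{samsung_1_2ns:.0f} MHz physical, {effective:.0f} MHz effective")
```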

Comparison with the reference design, front view
XFX GeForce 8800 GTS 640MB PCI-E; BFG GeForce 8800 GTS 640MB PCI-E Reference card NVIDIA GeForce 8800 GTX 768MB PCI-E

Comparison with the reference design, back view
XFX GeForce 8800 GTS 640MB PCI-E; BFG GeForce 8800 GTS 640MB PCI-E Reference card NVIDIA GeForce 8800 GTX 768MB PCI-E

We can see that the main difference between the 8800 GTS and the GTX is the power supply circuitry; it differs in length, so the cards' sizes differ too. The left parts of the PCBs are absolutely identical. In fact, the GTS uses the same PCB layout as the 384-bit card; it just lacks two memory chips, which accounts for the narrower bus and the smaller local video memory.

Pay attention to the PCB length - 220 mm again, instead of the GTX's 270 mm. The GTX could pose problems because of its length: the card grew by 5 cm and won't fit into some PC cases.

The photos also show that the GTS PCB requires ONE 6-pin PCI-E power cable from the PSU instead of two. But the PSU requirements haven't changed much: you still need an honest 400 W PSU with at least 20 A on the 12 V rail. The card itself consumes approximately 130 W under 3D load and about 90 W otherwise.
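To see why 20 A is asked for on the 12 V rail when the card alone draws 130 W, divide the power out; a rough sketch, where the ~100 W allowance for the rest of the 12 V load (CPU, drives, fans) is our assumption:

```python
def rail_current_a(power_w: float, rail_v: float = 12.0) -> float:
    """Current drawn from a supply rail for a given power load."""
    return power_w / rail_v

card_3d = rail_current_a(130)    # ≈ 10.8 A for the card alone under 3D load
# An assumed ~100 W of other 12 V loads shows why the requirement is
# 20 A of 12 V headroom rather than just the card's own share.
total = rail_current_a(130 + 100)
print(f"card: {card_3d:.1f} A, system estimate: {total:.1f} A")
```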

The cards have TV-Out with a unique jack. You will need a special adapter (usually shipped with a card) to output video to a TV-set via S-Video or RCA. You can read about TV Out here.

The cards are equipped with a pair of DVI connectors. They are dual-link DVI, allowing 1600×1200 and higher resolutions over the digital channel. Analog monitors with a d-Sub (VGA) interface are connected via DVI-to-d-Sub adapters. Maximum resolutions and frequencies:

  • 240 Hz Max Refresh Rate
  • 2048 x 1536 x 32bit @ 85Hz Max - analog interface
  • 2560 x 1600 @ 60Hz Max - digital interface
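The dual-link requirement for 2560×1600 follows from the DVI spec's 165 MHz single-link pixel clock ceiling: even counting active pixels alone (real timings add blanking on top), 2560×1600 at 60 Hz exceeds a single link. A minimal sketch:

```python
SINGLE_LINK_MAX_MHZ = 165.0  # DVI 1.0 single-link pixel clock ceiling

def active_pixel_clock_mhz(width: int, height: int, refresh_hz: int) -> float:
    """Lower bound on the pixel clock: active pixels only, no blanking."""
    return width * height * refresh_hz / 1e6

uxga = active_pixel_clock_mhz(1600, 1200, 60)   # ≈ 115 MHz: fits a single link
wqxga = active_pixel_clock_mhz(2560, 1600, 60)  # ≈ 246 MHz: needs dual link
print(wqxga > SINGLE_LINK_MAX_MHZ)  # True
```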

We should mention the core clock rates. We found out previously that raising the ROP clock raises the shader clock PROPORTIONALLY, not by the same absolute delta. That is:

(640 - 513)/513 = 24.7%. So we get: 1200 MHz × 1.247 ≈ 1497 MHz.
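That proportional-scaling rule can be sketched as a tiny helper (base clocks are the ones quoted above: 513 MHz ROP, 1200 MHz shader):

```python
def shader_clock_mhz(new_rop_mhz: float,
                     base_rop_mhz: float = 513.0,
                     base_shader_mhz: float = 1200.0) -> float:
    """G80 scales the shader clock by the same ratio as the ROP clock."""
    return base_shader_mhz * (new_rop_mhz / base_rop_mhz)

print(round(shader_clock_mhz(640)))  # ≈ 1497 MHz at a 640 MHz ROP overclock
```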

As for the products from BFG and XFX, they behave identically here, differing only in bundles and stickers on their coolers. We already mentioned the cooling system - the GTS cooler is similar to that of the 8800 GTX, only a tad shorter. I'll just remind you of its peculiarities by quoting a fragment of the 8800 GTX review.

GeForce 8800 GTX

The cooling system has been completely redesigned. The 7900 GTX carried a huge, slow central fan that drove air through the left and right heatsinks, whose heat pipes channeled heat away from the core and memory chips. Now we are back to the old turbine design: a fan at one end of a closed heatsink shroud drives air through it.

But don't get frustrated, recalling the horrible cooler from the X1900 XTX. Even though they are similar, there are some differences.

Firstly, the device drives ALL hot air out of a PC case.

Secondly, the turbine is very large and slow, so it's quiet even under heavy loads.

Thirdly, efficiency of the cooling system is reinforced by heat pipes at the side of the heatsink.

The cooler is not heavy, as copper is used only for the plate that touches the GPU. Other components are made of aluminum alloy. The design resembles devices from Arctic Cooling.

Now have a look at the processor.

8800 GTS — the G80 was manufactured in Week 37, 2006. A smear of paint on the cap is probably a mark for GPUs with locked processors


You can see well that there are practically no visual differences between GTS and GTX, which was to be expected.

Remember that graphics output is now handled by a separate chip - NVIO. This chip incorporates the RAMDACs and TMDS transmitters. That is, the units responsible for outputting graphics to monitors or TVs were removed from the G80 and implemented as a separate chip. It's a reasonable solution, because the RAMDACs won't be affected by interference from the shader unit operating at a very high frequency. The same chip also provides HDCP support.


XFX GeForce 8800 GTS 640MB PCI-E
The bundle is modest, but sufficient: User's manual, CD with drivers, DVI-to-d-Sub and S-Video-to-RCA adapters, TV cords, composite output adapter.

BFG GeForce 8800 GTS 640MB PCI-E
User's manual, CD with drivers, DVI-to-d-Sub and S-Video-to-RCA adapters, TV cords, composite output adapter. And a BONUS: a branded XL T-shirt from BFG and Teflon-coated mouse stickers (to make it glide faster). I already expressed my admiration for this bundle. There are no games in it - no matter, they would have been old games anyway.


XFX GeForce 8800 GTS 640MB PCI-E

It's a large box in a branded slip jacket.

All components are neatly arranged into sections. Along with an antistatic envelope, the card is secured in foam rubber to prevent damage in transit.

BFG GeForce 8800 GTS 640MB PCI-E

It's a large box in a branded slip jacket. The BFG pranksters had fun with the expression on the face of the character depicted on the box (a marketing specialist coping with the realities of the Russian market).

All components are thoroughly arranged into sections. Along with an antistatic envelope, the video card sits in soft packaging that protects it from jolts.

Installation and Drivers

Testbed configuration:

  • Intel Core2 Duo (775 Socket) based computer
    • CPU: Intel Core2 Duo Extreme X6800 (2930 MHz) (L2=4096K)
    • Motherboard: EVGA nForce 680i SLI on NVIDIA nForce 680i
    • RAM: 2 GB DDR2 SDRAM Corsair 1142MHz (CAS (tCL)=5; RAS to CAS delay (tRCD)=5; Row Precharge (tRP)=5; tRAS=15)
    • HDD: WD Caviar SE WD1600JD 160GB SATA
  • RADEON X1950 XTX (reference, PCI-E, 512MB GDDR4, 650/2000 MHz);
  • GeForce 7900 GTX (ASUS EN7900GTX, PCI-E, 512MB GDDR3, 650/1600 MHz);
  • GeForce 7950 GX2 (BFG, PCI-E, 2x512MB GDDR3, 2x500/1200 MHz)
  • Operating system: Windows XP SP2 DirectX 9.0c
  • Monitor: Dell 3007WFP (30") (and Mitsubishi Diamond Pro 2070sb (21")).
  • ATI CATALYST 6.10; NVIDIA Drivers 96.94.

VSync is disabled.

The latest RivaTuner beta, written by Alexei Nikolaychuk, now supports this accelerator.

We overclocked both cards, but the BFG card fared better: 640/2000 MHz! It demonstrated superb results. As the test results below will show, at these frequencies the GTS catches up with its top brother. I want to remind you again that overclocking results depend on the given sample; no one can guarantee that your sample will run at the same frequencies.

BFG GeForce 8800 GTS 640MB PCI-E

The first screenshot shows how the card performs at the standard frequencies.

The second screenshot shows its performance at increased frequencies, equal to operating frequencies of the 8800GTX.

The new version of RivaTuner, to be released within the next few days, can disable GPU processor clusters. Note that it can only DISABLE them so far. The work is in progress, and we cannot yet promise that it will be able to unlock anything. But we can already say that it can disable processors on the GTX card. You cannot expand the bus from 320 bit to 384 bit, of course - the GTS simply doesn't have enough memory chips.

I repeat: the work is in progress. Don't expect the author to add a feature that unlocks processors on the GTS card right away.

Many games have already hit the limits of the CPU and the system as a whole, so we don't see the potential of top video cards even in AA 4x and AF 16x modes. That's why, when testing previous top accelerators, we introduced a new HQ (High Quality) mode, which means:

  • ATI RADEON X1xxx: AA 6x, plus Adaptive AA, plus AF 16x High Quality
  • NVIDIA GeForce 7xxx/8xxx: AA 8x, plus TAA (MS with gamma correction), plus AF 16x.

Note that this was not chosen at random. It's a compromise: on the one hand, it's not much different from ATI's AA 6x; on the other, the GF 7xxx does not support AA above 8x. Besides, this mode is visually sufficient to my mind - you cannot see the differences with the naked eye. Keep in mind that the 8800 cards will be compared at high resolutions.

The screenshots above demonstrate the HQ mode.

And one more thing: we won't compare the cards or analyze their performance WITHOUT AA+AF; those numbers are published for reference only. Cards of this class should be loaded with at least AA 4x + AF 16x! We also publish data for the GTS card running at GTX frequencies, so you can evaluate the performance drop when a couple of processor clusters and a rasterizer are disabled.

Test results: performance comparison

You can look up the testbed configuration here

We used the following test applications:

  • Splinter Cell Chaos Theory v.1.04 (Ubisoft) — DirectX 9.0, multitexturing, maximum test settings, shaders 3.0 (for NVIDIA cards)/shaders 2.0 (for ATI cards); HDR OFF!

  • Half-Life 2 (Valve/Sierra) — DirectX 9.0, demo ixbt01. The tests were carried out with maximum quality, option -dxlevel 90, presets for video card types removed from dxsupport.cfg.

  • FarCry 1.33 (Crytek/UbiSoft), DirectX 9.0, multitexturing, demo from the Research level (-DEVMODE startup option), Very High test settings.

  • DOOM III (id Software/Activision) — OpenGL, multitexturing, test settings — High Quality (ANIS8x), demo ixbt1 (33MB!). We have a sample batch file to start the game automatically with increased speed and reduced jerking (precaching) d3auto.rar. (DON'T panic when the screen goes black for 5-10 seconds after the first menu. The demo will follow.)

  • 3DMark05 1.20 (FutureMark) — DirectX 9.0, multitexturing, trilinear filtering.

  • The Chronicles Of Riddick: Escape From Butcher Bay 1.10 (Starbreeze/Vivendi) — OpenGL, multitexturing, maximum texture quality, Shader 2.0, demo 44.

    I wish to thank Rinat Dosayev (AKA 4uckall) and Alexei Ostrovski (AKA Ducche), who have created a demo for this game. I also want to thank Alexei Berillo AKA Somebody Else for his help.

  • F.E.A.R. v.1.02 (Multiplayer) (Monolith/Sierra) — DirectX 9.0, multitexturing, maximum test settings, Soft shadows disabled.

  • Call Of Duty 2 DEMO (Ubisoft) — DirectX 9.0, multitexturing, test settings — maximum, shaders 2.0, tested in Benchemall, demo and a startup script, readme contains necessary instructions

  • 3DMark06 1.02 (FutureMark) — DirectX 9.0c, multitexturing, test settings — trilinear filtering.

You can download test results in EXCEL format: RAR and ZIP

FarCry, Research (No HDR)

Test results: FarCry Research (No HDR)

We can see that the 8800GTS in AA+AF mode offers performance on the level of the X1950, or a tad higher. But HQ mode (really justified in this game) makes the new card a winner.

The Middle-End product from the new NVIDIA series outperforms all previous leaders.

FarCry, Research (HDR)

Test results: FarCry Research (HDR)

HDR adds problems, the 8800 GTS is outperformed by the ATI card. However, HQ mode puts everything into place, the new card is victorious again.


F.E.A.R.

Test results: F.E.A.R.

There are no surprises here, the 8800GTX is the ultimate winner.

Splinter Cell Chaos Theory (No HDR)

Test results: SCCT (No HDR)

Our conclusions are similar to those for Far Cry: the GTS is outperformed in AA+AF mode, but it shoots forward in HQ mode.

Splinter Cell Chaos Theory (HDR)

Test results: SCCT (HDR)

The same thing.

Call Of Duty 2

Test results: CoD2

The same situation here: the new card outperforms the X1950 XTX in AA4x+AF16x mode, but the 7950GX2 is still faster.

Half-Life2: ixbt01 demo

Test results: Half-Life2, ixbt01

We shall analyze only HQ mode here. The 8800 GTS is victorious, no comments are necessary.

DOOM III High mode

Test results: DOOM III

Chronicles of Riddick, demo 44

Test results: Chronicles of Riddick, demo 44

Experienced readers know well that OpenGL games favour NVIDIA products. According to our tests, this concerns not only games based on id Software engines.

3DMark05: MARKS

Test results: 3DMark05 MARKS

In this case, the many processors and the high shader clock produce the expected result: victory everywhere; only the 7950GX2 with its two cores is out of reach.

3DMark06: SHADER 2.0 MARKS

Test results: 3DMark06 SM2.0 MARKS

We have the same conclusions here, even the 7950GX2 failed to outscore the new product.

3DMark06: SHADER 3.0 MARKS

Test results: 3DMark06 SM3.0 MARKS

The 8800GTS is evidently victorious.


XFX GeForce 8800 GTS 640MB PCI-E - I'll describe this card as the GeForce 8800 GTS in general. With its recommended price of $449, this accelerator effectively displaces the 7900 GTX and the 7950 GX2, which must get cheaper or perish. Besides, our tests show that the GTS successfully competes with the X1950 XTX, since it now possesses all the advantages of ATI cards (higher-quality anisotropy and support for HDR+AA). So if the 8800 GTS is priced on par with the X1950, you should choose the GTS. You'll have to think hard, though, if you need VIVO - it's available on ATI cards, but NVIDIA's current top cards lack it. Still, this minor detail shouldn't be a problem, I think.

As for the XFX card, it's just a reference card, so XFX adds only the bundle and the box. The shorter PCB should make the card very popular.

All GeForce 8800 GTX/GTS cards are manufactured at Flextronics and Foxconn plants, with quality controlled by NVIDIA; the company then sells them to its partners. None of NVIDIA's partners manufactures these cards on its own! Take it as an axiom. When you buy an 8800 GTX/GTS card, different vendors mean just different bundles and boxes. That's it!

BFG GeForce 8800 GTS 640MB PCI-E is practically a copy of the previous product - the same reference card with a different sticker - and it has a very good bundle.

On the whole, the GeForce 8800 GTS is a very good card for its price. We can recommend it, provided prices stay within the promised limits.

You should also keep in mind that GTS cards overclock easily. If you overclock yours to the level we reached, the card may match GTX performance! That certainly cannot be done with the X1950 XTX.

You can find more detailed comparisons of various video cards in our 3Digest.

BFG GeForce 8800 GTS 640MB PCI-E gets the Excellent Package award (November).

We also decided to give our Original Design award to NVIDIA for its excellent GeForce 8800 GTS.

ATI RADEON X1300-1600-1800-1900 Reference

NVIDIA GeForce 7300-7600-7800-7900 Reference

We express our thanks to
BFG Russia
and Mikhail Proshletsov personally
for the provided BFG video card

PSU for the testbed was kindly provided by TAGAN

The Dell 3007WFP monitor for the testbeds was kindly provided by NVIDIA

Andrey Vorobiev (anvakams@ixbt.com)
November 25, 2006


Copyright © Byrds Research & Publishing, Ltd., 1997–2011. All rights reserved.