iXBT Labs - Computer Hardware in Detail






NVIDIA GeForce 6600GT and 6600 (NV43):
Part 1 - Performance.


  1. Official specifications
  2. Architecture
  3. Video cards' features
  4. Testbed configurations, benchmarks, 2D quality
  5. Synthetic tests in D3D RightMark
  6. Synthetic tests in 3DMark03: FillRate Multitexturing
  7. Synthetic tests in 3DMark03: Vertex Shaders
  8. Synthetic tests in 3DMark03: Pixel Shaders
  9. Test results: Quake3 ARENA
  10. Test results: Serious Sam: The Second Encounter
  11. Test results: Return to Castle Wolfenstein
  12. Test results: Code Creatures DEMO
  13. Test results: Unreal Tournament 2003
  14. Test results: Unreal II: The Awakening
  15. Test results: RightMark 3D
  16. Test results: TRAOD
  17. Test results: FarCry
  18. Test results: Call Of Duty
  19. Test results: HALO: Combat Evolved
  20. Test results: Half-Life2(beta)
  21. Test results: Splinter Cell
  22. Test results: DOOM III
  23. Test results: 3DMark03 Game1
  24. Test results: 3DMark03 Game2
  25. Test results: 3DMark03 Game3
  26. Test results: 3DMark03 Game4
  27. Test results: 3DMark03 MARKS
  28. Conclusions

The beginning of autumn... Just as five years ago, when the first video card with hardware T&L support (GeForce256) was loudly announced, today we meet new products from NVIDIA. They do not belong to the High-End segment, but their announcement is no less interesting. The announcement actually took place earlier, in August, but it covered only the cards' features. Now, like many other media outlets, we have a chance to examine the newest Middle-End 3D accelerators in detail.

Until recently this market segment was dominated by ATI products: the RADEON 9600/PRO/XT and X600PRO/XT outscored their NVIDIA competitors, GeForce FX 5700/Ultra and PCX5750/5900, in modern games with active shader usage. Only the FX 5900XT, pushed down into this sector from "above", managed to become popular and challenge the supremacy of the Canadian products. And now...

"Nalu is on the way to the first prize... Ruby will have to hold the line..."

Yes, the mermaid and the valiant girl from the corresponding NVIDIA and ATI demos, which showcase the new technologies (SM3.0 from NVIDIA, 3Dc/SM2.0b from ATI), were not chosen as our heroines by chance. The new products from the Californian company reviewed today fully support Shader Model 3.0, like their elder brothers.

Will Ruby let Nalu catch up and take away her king diamond? New announcements from ATI in the same video card sector are just around the corner. What will be the outcome of this battle? We don't know yet. I think the article about RV410 will be no less interesting and gripping. But for now we'll put these questions aside and examine NV43 (GeForce 6600GT/6600) as if these cards were already on sale. Accordingly, their competitors will be the currently popular accelerators in the 150-200 USD price range, and of course the predecessors of the new video cards.

Looking ahead, I want to note that NV43 has built-in support for the PCI Express interface (or PCX), so AGP products are impossible without the HSI bridge. Thus they will be costlier to manufacture and will be launched later than their PCX counterparts (if they are launched at all - it will depend on demand). This is a considerable shortcoming of the new products today: the PCX sector has only just started to evolve, and demand for such platforms is still minimal. So no matter how wonderful the new product is, it is doomed to modest retail demand from the very beginning, because the benefits of upgrading from AGP to PCX are doubtful so far. On the other hand, the OEM market and PC assemblers, especially foreign ones, will not fail to use PCX solutions that are not as expensive as the top models but still meet modern DirectX requirements.

Besides, who knows - perhaps the launch of video cards attractive in terms of price/performance will boost interest in PCX as a whole. Time will tell. And don't forget that ATI will launch its RV410 with native support for PCX only, and the Canadian company does not have its own bidirectional AGP<->PCX bridge, so it will hardly be able to bring its new products to the AGP bus. However, this sector is overcrowded anyway, with plenty of solutions of similar performance, manufactured earlier or still in production.

It was very interesting for us to compare not only video cards on the same interface but also AGP and PCX implementations. Of course this is difficult, because the platforms differ considerably. But remember that we are in the Middle-End sector, where modern processors are quite capable of loading accelerators to 100%, and beyond a certain resolution the performance is not so platform-dependent. Below are the results of our cross-platform research.

And now let's return to the objects of today's analysis: NVIDIA NV43, or GeForce 6600GT/6600 (the series currently contains two video cards, which differ only in operating frequencies).

Official specifications on GeForce 6600GT/6600 (NV43)

  1. Codename of the chip is NV43
  2. 110nm process (TSMC) (!)
  3. 146 million transistors
  4. FC packaging (flip-chip, without a metal cap)
  5. 128-bit two-channel memory interface (!)
  6. Up to 256 MB DDR/GDDR-2/GDDR-3
  7. PCI Express16x bus interface integrated into the chip
  8. Interface conversion into AGP 8x using the bidirectional PCI Express<->AGP HSI bridge
  9. 8 pixel processors, each with a texture unit with random floating point and integer filtering (anisotropy up to 16x).
  10. 3 vertex processors, each with a texture unit, without filtering selected values (discrete sampling)
  11. Calculation, blending, and writing of up to 8 full (color, depth, stencil buffer) pixels per clock (tests demonstrate up to 4)
  12. Calculation and writing of up to 16 values of depth and stencil buffer per clock (no operations with color) (tests demonstrate up to 8)
  13. Support for "double-sided" stencil buffer
  14. Support for special geometry rendering optimizations to accelerate shadow algorithms based on a stencil buffer (so called Ultra Shadow II Technology), particularly widely used in the Doom III engine.
  15. All necessary things to support Pixel and Vertex Shaders 3.0, including dynamic branching in pixel and vertex processors, vertex texture fetch, etc.
  16. Floating point filtering of textures
  17. Floating point frame buffer (including blending operations)
  18. MRT
  19. 2 RAMDAC 400 MHz
  20. 2 DVI interfaces (require interface chips)
  21. TV-Out and TV-In (require interface chips)
  22. Programmable streaming video processor (for encoding, decoding, and video postprocessing purposes)
  23. 2D accelerator supporting all GDI+ functions
  24. Integrated thermal and power monitoring.

GeForce 6600 GT, specification of the reference card

  1. Core frequency 500 MHz
  2. Effective memory frequency 1 GHz (2*500 MHz)
  3. 128-bit memory bus
  4. GDDR-3 memory type
  5. 128 MB of memory
  6. 16 GB/sec Memory Bandwidth
  7. Theoretical fill rate 4 gigapixel/sec.
  8. Theoretical texture fetch speed 4 gigatexel/sec
  9. 1 x VGA (D-Sub) and 1 x DVI-I connector
  10. TV-Out
  11. Consumes up to 70 W (so PCI-Express cards need no additional power connector; a power supply unit of 300 W or more is recommended)
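The headline throughput figures in this list follow directly from the clocks and unit counts. A minimal sketch (Python, using only numbers from the specification above) reproduces them:

```python
# Peak theoretical figures for the reference GeForce 6600 GT,
# computed from the specification above.

core_mhz = 500        # core clock, MHz
pixel_pipes = 8       # pixel pipelines, one texture unit each
mem_eff_mhz = 1000    # effective GDDR3 frequency (2 x 500 MHz)
bus_bits = 128        # memory bus width, bits

# One pixel and one texel per pipeline per clock:
fill_gpix = core_mhz * pixel_pipes / 1000    # 4.0 gigapixel/sec
texel_gtex = core_mhz * pixel_pipes / 1000   # 4.0 gigatexel/sec

# Peak bandwidth = effective frequency times bus width in bytes:
bandwidth_gbs = mem_eff_mhz * bus_bits / 8 / 1000   # 16.0 GB/sec

print(fill_gpix, texel_gtex, bandwidth_gbs)
```

These are peak numbers, of course; how close real workloads get to them is exactly what the synthetic tests below measure.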

A list of existing cards based on NV43:

  • GeForce 6600GT: 500/500 (1000) MHz, 128MB GDDR3, PCI-Express 16x, 8 pixel and 3 vertex pipelines ($199) - competitor to NVIDIA GeForce PCX5900, ATI RADEON X600 XT(?), as well as to future ATI solutions (RV410);
  • GeForce 6600: 300/250-300 (500-600) MHz, 128/256MB DDR, PCI-Express 16x, 8 pixel and 3 vertex pipelines ($149) - competitor to NVIDIA GeForce PCX5750, ATI RADEON X600 PRO (X600 XT?).

Chip architecture

We didn't find any special architectural differences from NV40, which is not surprising – NV43 is a scaled (by means of reducing vertex and pixel processors and memory controller channels) solution based on the NV40 architecture. The differences are quantitative (bold elements on the diagram) but not qualitative – from the architectural point of view the chip remains practically unchanged.

Thus, we have 3 (there were 6) vertex processors and two (there were 4) independent pixel processors, each working on one quad (a 2x2 pixel fragment). Interestingly, PCI Express has now become the native (i.e. on-chip) bus interface, and AGP 8x cards will carry an additional bidirectional PCI-E <-> AGP bridge (shown with a dotted line), which has already been described.
Besides, note an important limiting factor – two-channel controller and a 128-bit memory bus – we'll analyze and discuss this fact later on.

The architecture of vertex and pixel processors and of the video processor remained the same – these elements were described in detail in our review of GeForce 6800 Ultra (link). And now let's consider potential quantitative and qualitative changes relative to NV40:

Theoretical considerations about what and how has been cut down

On the whole, at present we have the following series of solutions based on NV4X and R4XX:

Video card      Chip    Pipelines        Memory bus,     Memory type,      Core frequency,
                        (pixel/vertex)   bit             effective MHz     MHz

6800 Ultra      NV40    16/6             256 (4x64)      GDDR3 1100        400
6800 GT         NV40    16/6             256 (4x64)      GDDR3 1000        350
6800            NV40    12/5             256 (4x64)      DDR 700           325
6800 LE         NV40    8/4              256 (4x64)      DDR 700           ?
6600 GT         NV43    8/3              128 (2x64)      GDDR3 1000        500
6600            NV43    8/3              128 (2x64)      DDR 500-600-700   300

X800 XT         R420    16/6             256 (4x64)      GDDR3 1000        ?
X800 PRO        R420    12/6             256 (4x64)      GDDR3 1100        ?
X800 SE         R420    8/6              256 (4x64)      DDR 700           ?
X700 XT*        RV410   ?                256 (4x64) (?)  DDR (?)           ~500 (?)
X700 PRO/SE*    RV410   ?                128 (2x64)      DDR (?)           ~400 (?)

X600 and below: based on the architecture of the previous generation

*) the data is based on unverified rumours (beyond3d forum and other unofficial web resources), these products will soon be announced officially.

While the 6800 Ultra, GT, and plain 6800 look rather balanced in terms of memory bandwidth and fill rate, the 6800 LE will often run into insufficient fill rate - its memory bandwidth is excessive - while both 6600 models will above all suffer from insufficient bandwidth. The peak fill rate of the 6600 GT is almost 2/3 of that of the 6800 Ultra, while its memory bandwidth is less than half (not counting the potentially reduced caches and the two-channel memory controller).

Thus we can predict that the weak point in the 6600 series will be high resolutions and full-screen antialiasing modes, especially in simple applications. And the strong point – programs with long and complex shaders and anisotropic filtering without simultaneous MSAA. We'll check this assumption later in game and synthetic tests.

It's difficult to judge now how reasonable the move to a 128-bit memory bus was. On the one hand, it makes chip packaging cheaper and reduces the number of defective chips; on the other hand, the price difference between 256-bit and 128-bit PCBs is small and is more than offset by the price difference between ordinary DDR and still-expensive high-speed GDDR3 memory. Perhaps from the card manufacturers' point of view a 256-bit solution would be more convenient, at least if they had the choice. But from the point of view of NVIDIA, which manufactures the chips and very often sells them bundled with memory, the 128-bit solution with GDDR3 is more profitable. Its impact on performance is another story: the potential throttling of the chip's excellent capabilities (8 pipelines, a 500 MHz core frequency, and that's not the limit yet) by the considerably reduced memory bandwidth is obvious.

DDR 700 MHz x 256 bit = 22.4 GB/sec versus GDDR3 1000 MHz x 128 bit = 16 GB/sec.
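Both figures come from the usual rule of thumb: peak bandwidth equals the effective memory frequency times the bus width in bytes. A quick sketch of the arithmetic:

```python
def peak_bandwidth_gbs(effective_mhz: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/sec: effective clock times bus width in bytes."""
    return effective_mhz * bus_bits / 8 / 1000

print(peak_bandwidth_gbs(700, 256))    # 22.4 - DDR 700 on a 256-bit bus
print(peak_bandwidth_gbs(1000, 128))   # 16.0 - GDDR3 1000 on a 128-bit bus
```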

This fact is especially troubling against the background of rumours that the senior X700 model will be equipped with 256-bit memory.

However, note that NVIDIA still keeps the Ultra suffix in reserve - considering the great overclocking potential of the 110 nm process, we can expect a video card with a core frequency of about 600 MHz, 1100 or even 1200 MHz (in the future) memory, and the name 6600 Ultra. The question is how much it will cost. In the longer term we can forecast a renewed, performance-optimized 256-bit variant of the mainstream solution - let's call it NV46 - with 8 or even 12 pipelines and a 256-bit bus.

To all appearances, the vertex and pixel processors in NV43 remained the same, but the internal caches could have been reduced in proportion to the number of pipelines. However, the transistor count gives no cause for concern: given the modest cache sizes, it would be more reasonable to leave them as they were in NV40, thus compensating for the noticeable shortage of memory bandwidth. It's quite possible that the ALU array responsible for postprocessing, verification, Z generation, and pixel blending when writing results to the frame buffer (it contains rather many transistors) was also cut down in each pipeline relative to NV40. The reduced memory bandwidth will not allow writing 4 full gigapixels per second anyway, and the fill rate potential (8 pipelines at 500 MHz) will only be put to proper use with more or less complex shaders using more than 2 textures and the corresponding shader calculations.
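A back-of-the-envelope estimate (assuming a 32-bit color write plus a 32-bit depth/stencil write per pixel, and ignoring compression and caching) shows why the 4-gigapixel peak cannot be sustained for "full" pixels:

```python
peak_pixels_per_sec = 8 * 500e6    # 8 pipelines at 500 MHz = 4e9 pixels/sec
bytes_per_full_pixel = 4 + 4       # 32-bit color + 32-bit depth/stencil

# Bandwidth needed to write full pixels at the peak rate:
needed_gbs = peak_pixels_per_sec * bytes_per_full_pixel / 1e9   # 32.0 GB/sec
available_gbs = 16.0               # GDDR3 1000 MHz on a 128-bit bus

# Fill rate the memory can actually sustain for full pixels:
sustainable_gpix = available_gbs / bytes_per_full_pixel         # 2.0 gigapixel/sec
print(needed_gbs, sustainable_gpix)
```

So under these simplified assumptions the memory subsystem sustains only about half of the theoretical fill rate for full pixels, which is consistent with the "tests demonstrate up to 4" caveat in the specifications.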

We'll check all these assumptions in the course of our synthetic and game tests.

Before examining the card itself, we'll list our articles devoted to the analysis of the previous products, NV40/R420. It's already obvious that the NV43 architecture is a direct heir of NV40 (the chip was essentially halved).

Theoretical materials and reviews of video cards, which concern functional properties of the GPU ATI RADEON X800 (R420) and NVIDIA GeForce 6800 (NV40)

I repeat that this is only the first part of the review devoted to the performance of the new video cards. The qualitative components will be examined later in the second part (3D quality and video playback).

Now about the video card. Why do we mention two video cards in the title and actually review only one? The fact is that the 6600GT and 6600 differ only in operating frequencies, so we can emulate a GF 6600 with high fidelity by underclocking the 6600GT. That's what we did. Production GeForce 6600 cards will be equipped with ordinary DDR instead of GDDR3 (which differs in timings as well as frequency), and NVIDIA does not declare a strict memory frequency for these cards - clocks may range from 250 to 300 MHz - so we obviously cannot claim a 100% match between our results and those of a final GeForce 6600. But we can estimate, and that is useful. That's why we'll include a GeForce 6600 at 300/300 (600) MHz in our results (an extreme case). It's clear that real 6600 cards will demonstrate performance NO HIGHER than that on our diagrams, so we can gauge the range.

So, reference card GeForce 6600GT.

Video card

NVIDIA GeForce 6600GT

The card has a PCI-Express 16x interface and 128 MB of GDDR3 SDRAM in 4 chips on the front side of the PCB.

NVIDIA GeForce 6600GT
Samsung (GDDR3) memory chips. 2.0 ns memory access time, which corresponds to 500 (1000) MHz - the frequency at which the memory actually operates. GPU frequency: 500 MHz. 128-bit memory bus. Pixel pipelines x texture units: 8x1. 3 vertex pipelines.

Comparison with the reference design, front view
NVIDIA GeForce 6600GT NVIDIA GeForce PCX5900

NVIDIA GeForce 6800

Comparison with the reference design, back view
NVIDIA GeForce 6600GT/6600 NVIDIA GeForce PCX5900

NVIDIA GeForce 6800

The GF 6600GT design is obviously unique, not similar to any previous one. First of all, this is due to the reduced PCB dimensions allowed by the absence of the 256-bit bus, which strongly influences PCB size. A radical simplification of the power circuitry also contributed to the smaller PCB area (all PCX cards consuming less than 75 W require no external power, which simplifies the design). Our card stays under 75 W even at maximum load, so there is no need to connect it directly to the power supply unit.

Despite enormous frequencies for an 8-pipeline chip, the cooler is rather primitive.

NVIDIA GeForce 6600GT

Here is a familiar cooling device: a closed heatsink with an off-center fan driving air inside the heatsink along the card.

We may assume that manufacturers of these cards will experiment with their own coolers or will use the previous designs for GeForce4 Ti (GeForce FX 5600/5700).

The GPU die is relatively small (quite naturally - a 128-bit bus); it resembles the GeForce FX 5700, and the chip size is almost the same. But while NV36 contained only 4 pixel and 3 vertex pipelines within these dimensions, this chip packs twice as many pixel pipelines. The 0.11-micron process...

This video card has an important feature designed for the future: SLI support (that is, as in the days of Voodoo2, the total 3D performance can be increased by adding a second similar accelerator). For this purpose the upper edge of the card carries a connector for a special cable linking two video cards to synchronize their operation:

To finish the examination of the card, note that it supports VIVO, implemented with a Philips 7115 chip (we haven't come across this codec before, so our permanent researcher of multimedia add-ins and features of video cards, Alexei Samsonov, is looking forward to testing the new card - expect an article on this issue).

Now for overclocking. Thanks to the responsiveness of Alexei Nikolaychuk (the author of RivaTuner), this utility already supports NV43.

The number of pipelines in this card (both pixel and vertex) can be detected. The second screenshot shows that two quads (eight pixel pipelines) operate in this card.

So, the card ran stably at 590/590 (1180) MHz! Unprecedented potential! I can even assume that after the launch of ATI's RV410, NVIDIA will release a GeForce 6600 Ultra (that's probably why the senior model currently bears only the GT suffix).

At these frequencies the card worked with an external cooler. Here are the temperatures we registered:

Yes, sometimes the core heated to 88 degrees, but as you know, this is not the limit for these chips (they can heat up to 100 degrees). It's interesting to note that the external cooler actually cooled only memory, because its removal did not lead to any core temperature rise.

And here is what we see at regular frequencies:

[ The next part (2) ]

Andrey Vorobiev (anvakams@ixbt.com)
Alexander Medvedev (unclesam@ixbt.com)

07 September, 2004



Copyright © Byrds Research & Publishing, Ltd., 1997–2011. All rights reserved.