Gigabyte 3D1 (2xGeForce 6600GT) and K8NXP-SLI Motherboard








Contents

  1. Video card's features
  2. Testbed configurations, benchmarks, 2D quality
  3. Gigabyte K8NXP-SLI motherboard
  4. Test results: TRAOD
  5. Test results: FarCry Research
  6. Test results: FarCry Regulator
  7. Test results: FarCry Pier
  8. Test results: Half-Life2, ixbt01
  9. Test results: Half-Life2, ixbt02
  10. Test results: Half-Life2, ixbt03
  11. Test results: DOOM III
  12. Test results: 3DMark05 Game1
  13. Test results: 3DMark05 Game2
  14. Test results: 3DMark05 Game3
  15. Test results: 3DMark05 MARKS
  16. Conclusions



Today's review is unusual: it is devoted not to a single video card (or a pair of cards), but to an entire kit consisting of a unique accelerator built around two GeForce 6600GT GPUs and a motherboard based on the NVIDIA nForce4 SLI chipset.

This approach is dictated by Gigabyte's policy for the new 3D1 video card, which can operate only with the Gigabyte K8NXP-SLI motherboard. There is little point in writing a separate review of a video card that is tied to a particular motherboard.

We don't know whether the 3D1 will be sold separately or only as a kit component; Gigabyte has not yet decided. The video card will be issued in a limited edition. The K8NXP-SLI without the 3D1 will certainly appear on the market, and as usual there will also be a "no XP" modification, that is, one with fewer bells and whistles in the bundle.

What prompted the idea of combining two GeForce 6600GT processors on a single PCB? The product requires an SLI-capable motherboard anyway! I have thought hard about the expedience of such a solution, and nothing but marketing and advertising considerations came to mind.

Of course, the 3D1 guarantees SLI, because the two "conventional" video cards on one PCB are certain to be identical. According to our tests, SLI is a touchy thing; not every pair of video cards can form it easily, as the driver meticulously "tries" each card. But it's also clear that buying two GeForce 6600GT cards from the same manufacturer, especially together, gives a 99% guarantee that SLI will work.

What about the price difference? It goes without saying that two video cards are more expensive than one, even a top-notch one. But you lose one of SLI's advantages: flexible upgrading of graphics capacity. Normally you buy a single accelerator when you can afford it, and as soon as you have spare money you buy another one and build up 3D performance. Here you have to buy two accelerators at once, even if they sit on a single PCB. I repeat: this purchase is reasonable only if its price is considerably lower than the total price of two separate video cards.

Can this video card theoretically operate not only on an nForce4 SLI based motherboard but, say, on the iE7525 as well? Besides, Gigabyte has announced an i915 motherboard with SLI support. Theoretically it's possible, but I repeat that for now the 3D1 can operate only on the K8NXP-SLI. Perhaps it will later be supported by the i915 motherboard as well.

The manufacturer claims that 3D1 performance exceeds that of a GeForce 6600GT SLI setup (two video cards). This is not surprising; I'll explain the reasons below. Now it's high time to examine our main hero.

Video card



Gigabyte 3D1 (2xGeForce 6600GT)
The card has a PCI-Express x16 interface and 256 MB of GDDR3 SDRAM in total, distributed across 8 chips on the front side of the PCB.

Samsung memory chips with 1.6 ns access time, which corresponds to 625 (1250) MHz. The memory operates at 570 (1140) MHz. Both GPUs run at 500 MHz. The memory bus is 128 bit per processor, with 8 pixel pipelines and 3 vertex pipelines per GPU. Note the increased memory frequency: 570 MHz versus 500 MHz in serial GeForce 6600GT cards. Is this the source of the performance superiority promised by the manufacturer?
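As a quick sanity check of those memory figures (a back-of-the-envelope calculation, not data from Gigabyte): the rated frequency follows directly from the access time, and the DDR "effective" figure is simply double that:

\[ f_{\text{rated}} = \frac{1}{t_{\text{access}}} = \frac{1}{1.6\ \text{ns}} = 625\ \text{MHz} \quad (2 \times 625 = 1250\ \text{MHz effective}) \]

So the chips are rated for 625 (1250) MHz but clocked here at 570 (1140) MHz, below their rating yet noticeably above the 500 (1000) MHz of a stock GeForce 6600GT.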






Comparison with the reference design, front view: Gigabyte 3D1 (2xGeForce 6600GT) vs. reference NVIDIA GeForce 6600GT 128MB card

Comparison with the reference design, back view: Gigabyte 3D1 (2xGeForce 6600GT) vs. reference NVIDIA GeForce 6600GT 128MB card








It looks as if the card was designed by people who decided to decorate it with an unusual memory chip pattern for New Year's Eve: at first it's hard to understand why the chips are rotated 45 degrees, especially as the processors themselves are not. A little later I got an explanation from Gigabyte: this layout places the memory chips at equal distances from the processor, which according to the designers saves PCB room and makes the card more compact.

Of course, with two GPUs and twice the usual number of memory chips, the card's total power consumption exceeds the 75 W that a PCI-E slot can supply. That's why the card is equipped with an external power connector, as on senior models.

Let's have a look at the graphics processors.






These two chips are obviously no different from the regular 6600GT installed on separate video cards.

The card itself relies on a trick: its PCI-E x16 connector in fact carries not a single x16 link, but two x8 links. The single PCB simply accommodates two independent video cards; the only connection between them is an on-board counterpart of the bridge connector that is normally installed across two SLI video cards. The system still sees TWO VIDEO CARDS.

So it's clear that this video card will be supported only by motherboards that can route two x8 links to the first PCI-E x16 slot. Our conclusion: this product is doomed to be tied to Gigabyte motherboards. Although all nForce4 SLI motherboards can switch the x16 slot to x8 operation, they cannot normally route 2x8 to the first slot.

By the way, the first thing I did when I got the 3D1 was to install it into the ASUS A8N SLI, which had been in the testbed up to that moment. The system booted, I installed the video card's drivers, and then the system froze during driver initialization. Q.E.D.

Let's get on with the examination.

Gigabyte 3D1 (2xGeForce 6600GT)
The cooling system is a single massive device with two fans. The memory appears to be cooled as well, but that is not the case: the heatsink does not touch the memory chips. This is strange, considering that there are no chips on the back of the PCB and it would have been quite possible to cool all the hot chips. If you add thermal pads, however, their thickness is enough to press the heatsink tightly against the memory chips.

The fans are not that fast, so the cooler is very quiet.









We are not going to review the bundle here, because the card and the motherboard share common package contents (so you are advised to read the next chapter on the K8NXP-SLI).

Gigabyte K8NXP-SLI motherboard

To avoid encumbering this article, a brief description of this product is given on a separate page.

Installation and Drivers

Testbed configurations:

  • Athlon64 based computer
    • CPU: AMD Athlon64 2400 MHz (218MHz x 11; L2=512K) (~3800+);
    • Gigabyte K8NXP-SLI motherboard on NVIDIA nForce4 SLI chipset; ONE video card mode!
    • RAM: 1 GB DDR SDRAM 400MHz
    • HDD: WD Caviar SE WD1600JD 160GB SATA

  • Operating system: Windows XP SP2; DirectX 9.0c
  • Monitors: ViewSonic P810 (21") and Mitsubishi Diamond Pro 2070sb (21").
  • ATI drivers 6.497 (CATALYST 4.12); NVIDIA drivers 71.20.

VSync is disabled.

Both companies have enabled trilinear filtering optimizations in their drivers by default.

It should be noted that the 3D1 is detected by RivaTuner, so there are no problems with overclocking. As for the frequencies, we only managed to raise them to 540/1250 MHz (not much of an overclock for this GPU); there are severe limitations imposed by the PCB layout and by the stability of both processors under heat.

Test results

Before giving a brief evaluation of 2D quality, I will repeat that at present there is NO valid method for objective evaluation of this parameter, for the following reasons:

  1. 2D quality in most modern 3D accelerators depends dramatically on the specific sample, and it's impossible to evaluate every card.
  2. 2D quality depends not only on the video card, but also on the monitor and the cable.
  3. This parameter also depends heavily on the particular monitor-card combination; there are monitors that simply won't "work" with specific video cards.

As for the sample under review: paired with the Mitsubishi Diamond Pro 2070sb, the card demonstrated excellent quality at the following resolutions and refresh rates:

Gigabyte 3D1 (2xGeForce 6600GT) 1600x1200x85Hz, 1280x1024x100Hz, 1024x768x120Hz


Test results: performance comparison

We used the following test applications:

  • Tomb Raider: Angel of Darkness v.49 (Core Design/Eidos) – DirectX 9.0, Paris5_4 demo. The tests were conducted at maximum quality; only Depth of Field PS20 was disabled.

  • Half-Life2 (Valve/Sierra) – DirectX 9.0, demos (ixbt01, ixbt02, ixbt03). The tests were carried out at maximum quality, with the -dxlevel 90 option; presets for video card types were removed from dxsupport.cfg.

  • FarCry 1.3 (Crytek/UbiSoft) – DirectX 9.0, multitexturing, 3 demos from the Research, Pier, and Regulator levels (the game is started with the -DEVMODE option), Very High test settings.

  • DOOM III (id Software/Activision) – OpenGL, multitexturing, test settings – High Quality (ANIS8x), ixbt1 demo (33MB!). We have a sample batch file, d3auto.rar, to start the game automatically with increased speed and reduced jerking (precaching). (DO NOT BE AFRAID of the black screen after the first menu; that's how it should be. It will last 5-10 seconds and then the demo will start.)

  • 3DMark05 (FutureMark) – DirectX 9.0, multitexturing, test settings – trilinear.



TR:AoD, Paris5_4 DEMO



Test results: TRAOD




In fact, this test demonstrates the capacity of the shader units, which is why the 3D1 is an obvious leader here: its advantage nearly reaches 100% over the regular 6600GT. But don't forget that the memory operates at increased frequencies on this card.



FarCry, Research



Test results: FarCry Research




The advantage over the 6600GT does not exceed 63%, which can be explained by the game being CPU-bound: even the maximum load on the video card cannot exhaust its resources. By the way, there is practically no difference from a two-card SLI setup either. As for the competition with ATI, the 3D1 is defeated both by the X800 XT (a more expensive card, so that's fine) and by the X800 XL (which does not look good for the 3D1).



FarCry, Regulator



Test results: FarCry Regulator




The same picture.



FarCry, Pier



Test results: FarCry Pier




No comments; the same balance of forces. This game has long been bad luck for NVIDIA products :).



Half-Life2: ixbt01 demo



Test results: Half-Life2, ixbt01




This game is even more CPU-bound, which is why the gain from the 3D1 (as well as from SLI in general) does not even reach 50%. And again it is defeated by the ATI products. It should be said, though, that the GeForce 6800GT did not do badly.



Half-Life2: ixbt02 demo



Test results: Half-Life2, ixbt02




The picture is almost the same, the only difference being that the 3D1 falls further behind the ATI products and its senior brother, the 6800GT.



Half-Life2: ixbt03 demo



Test results: Half-Life2, ixbt03




An interesting case. The X800 XL is the obvious winner in this test, but our card demonstrates a noticeable gain over the 6600GT SLI, specifically in AA+AF modes (the faster memory playing its role?). And another thing: it is almost on a par with the 6800GT, which is obviously more expensive.
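A rough estimate based on the specifications quoted earlier (an illustration, not a measured figure) shows why the faster memory should matter precisely in bandwidth-hungry AA+AF modes; each GPU gets about 14% more memory bandwidth than a stock 6600GT:

\[ B_{\text{3D1}} = \frac{128\ \text{bit}}{8} \times 1140\ \text{MHz} \approx 18.2\ \text{GB/s per GPU}, \qquad B_{\text{6600GT}} = \frac{128\ \text{bit}}{8} \times 1000\ \text{MHz} = 16\ \text{GB/s} \]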



DOOM III



Test results: DOOM III




This game once again exploits the technological advantages of the NV4x architecture, and you can see how much gain SLI provides. The increased memory frequency brings about a 5% advantage over the 6600GT SLI. The card also compares well with the 6800GT (to say nothing of the defeat of the ATI X800).



3DMark05: Game1



Test results: 3DMark05 Game1






3DMark05: Game2



Test results: 3DMark05 Game2






3DMark05: Game3



Test results: 3DMark05 Game3






3DMark05: MARKS



Test results: 3DMark05 MARKS




In fact, the result could have been predicted from the TRAoD tests, because 3DMark05 lays special stress on shader performance. And here we have two 500 MHz processors with 16 pixel pipelines in total; the raw power is obvious. That's why the 3D1 is certainly the leader.
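As a rough illustration of that raw power (a theoretical peak derived from the specifications above, not a benchmark result), the combined pixel fill rate is double that of a single 6600GT:

\[ F_{\text{3D1}} = 2 \times 8\ \text{pipelines} \times 500\ \text{MHz} = 8000\ \text{Mpixel/s}, \qquad F_{\text{6600GT}} = 8 \times 500\ \text{MHz} = 4000\ \text{Mpixel/s} \]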

To wrap up the tests, I'd like to note that all the image quality problems we previously observed with SLI configurations have been fixed in the 71.20 drivers. That's very good.

Conclusions

One can say that the success of this KIT hinges on the price! If it's adequate, we'll have:

- an excellent motherboard with a full set of features for the AMD Athlon 64/FX platform (Socket 939). It has disadvantages as well (so far we have reviewed only the design peculiarities of the board); they are described here.

- a very powerful solution, the 3D1, whose performance nearly reaches that of the 6800GT at a lower price. It's clear that the more games oriented toward heavy shader use appear, the more advantageous the 3D1 will become.

But the 3D1 also has drawbacks. Firstly, it is actually outscored by the cheaper X800 XL in popular games, except for DOOM III. Secondly, there are the power supply requirements (an external power connection is mandatory, while the X800 XL does not need one).

Thirdly, the large dimensions of the 3D1: since this card can be used only with the K8NXP-SLI, the SATA connectors located right under the video card will be inaccessible.

Fourthly, the second PCI-E slot is disabled when the 3D1 is installed, so you cannot use it for a second video card to drive output to 4 monitors.

The main question remains whether this solution is needed at all. I repeat: if the price is right, lower than the price of a similar motherboard plus two 6600GT cards, then the purchase is justified. If it's cheaper than a motherboard plus an X800 XL, its success will be guaranteed.

Now have a look at the calendar: 2005 has begun, and the market is still split between the AGP and PCI-E sectors at roughly 95% to 5%. The question is whether this kit, or the K8NXP-SLI on its own, will become the engine that pulls PCI-E out of the low-demand mire.

Time will tell.

In our 3Digest you can find more detailed comparisons of various video cards.



According to the test results, Gigabyte 3D1 (2xGeForce 6600GT) gets the Original Design award (January).








Theoretical materials and reviews of video cards, which concern functional properties of the GPUs ATI RADEON X800 (R420)/X850 (R480)/X700 (RV410) and NVIDIA GeForce 6800 (NV40/45)/6600 (NV43)








Andrey Vorobiev (anvakams@ixbt.com)

February 16, 2005


