ASUS Extreme N6800 256MB Based on GeForce 6800 (SLI Mode) and A8N-SLI-Deluxe Motherboard








Contents

  1. Video card's features
  2. Testbed configurations, benchmarks, 2D quality
  3. Description of the ASUS A8N-SLI-Deluxe motherboard
  4. Test results: TRAOD
  5. Test results: FarCry Research
  6. Test results: FarCry Regulator
  7. Test results: FarCry Pier
  8. Test results: Half-Life2, ixbt01
  9. Test results: Half-Life2, ixbt02
  10. Test results: Half-Life2, ixbt03
  11. Test results: DOOM III
  12. Test results: 3DMark05 Game1
  13. Test results: 3DMark05 Game2
  14. Test results: 3DMark05 Game3
  15. Test results: 3DMark05 MARKS
  16. Conclusions



At the end of January we published our first article that reviewed a video card and a motherboard together. We did that because the Gigabyte 3D1 could operate only with a motherboard from the same manufacturer – the K8NXP-SLI.

Now I've decided to carry on the tradition and review another kit of sorts. It's only a loose kit, though, because Extreme N6800 video cards can operate with any motherboard that supports PCI Express.

So, in today's article you will read about the operation of the above-mentioned video card on its own and of two such cards in SLI mode on the ASUS A8N-SLI-Deluxe motherboard; you will also get a look at this motherboard, based on NVIDIA nForce4 SLI.

At the end of the review you will find a link to the list of our articles devoted to the latest-generation products, where you can learn what we have already written about the new NV41. It's a reworked NV40 core that lacks 4 pixel and 1 vertex pipelines, and its AGP support was replaced with a native PCI-E interface, so the chip supports the new bus on its own, without the HSI bridge. With its 12 pixel and 5 vertex pipelines, it has become a performance counterpart of the AGP GeForce 6800.

Our readers will learn about the features of this ASUSTeK product as well as about SLI operation and the nForce4 SLI motherboard.

In the article about NV41 and the cards based on this chip, I noted that the PCI-E sector was still small, yet the products aimed at it were so numerous that positioning any given card was very difficult: prices vary from day to day depending on novelty and demand. That's why the newly minted GeForce 6800 PCI-E cannot simply be put on the scales – it has no direct counterpart on the other side (ATI, of course). The closest one, the X800 XL, has appeared on sale but is too pricey, clearly more expensive than NV41. The cheaper X800 is sold as an unannounced 256 MB product, and it's not clear how long that will last, because a 128 MB version is due at a much lower price. It's a total muddle. That's why I'll compare the card with both the X800 XL and the X800 256MB.

A GeForce 6800 SLI pair, in turn, will be much more expensive than even the flagship GeForce 6800 Ultra and RADEON X850 XT PE. But it will be compared with them anyway.

So, let's start with GeForce 6800 PCI-E from ASUSTeK.

Video card



ASUS Extreme N6800 256MB
Interface: PCI-Express x16

Frequencies (chip / memory physical (memory effective)): 350/310 (620) MHz; nominal: 325/350 (700) MHz

Memory bus width: 256bit

Number of vertex pipelines: 5

Number of pixel pipelines: 12

Dimensions: 200x100x17mm (the last value is the maximum thickness of a video card in its heatsink section).

PCB color: dark blue.

Output connectors: DVI, d-Sub, S-Video.

VIVO: n/a

TV-out: integrated into GPU.






ASUS Extreme N6800 256MB
The card has 256 MB of DDR SDRAM in 8 chips on the front side of the PCB.

Hynix memory chips. 2.8ns memory access time, which corresponds to 350 (700) MHz.
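As a quick sanity check (a minimal sketch of my own, not something from ASUS's documentation), the clock ceiling implied by a 2.8 ns access-time rating follows directly from the cycle period:

def rated_clock_mhz(access_time_ns: float) -> float:
    # One cycle per access: a 2.8 ns period corresponds to 1000/2.8 ≈ 357 MHz.
    return 1000.0 / access_time_ns

physical = rated_clock_mhz(2.8)   # ~357 MHz physical clock
effective = 2 * physical          # ~714 MHz effective for DDR (two transfers per clock)
print(f"{physical:.0f} MHz physical, {effective:.0f} MHz effective")

Vendors round this down to the nearest standard grade, hence the 350 (700) MHz rating quoted above.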






Comparison with the reference design, front view
ASUS Extreme N6800 256MB
Reference card NVIDIA GeForce 6800 PCI-E











Comparison with the reference design, back view
ASUS Extreme N6800 256MB
Reference card NVIDIA GeForce 6800 PCI-E








The card obviously copies the reference design. It consumes less than 75 W, so it needs no external power connector. Perhaps that's one of the reasons why the memory frequency is set considerably lower than the chips' nominal rating.

Let's have a look at the graphics processor.






I have already written that the on-chip PCI-E interface and the cut-down pipelines gave the die a vertically elongated shape, unlike NV40. I'll repeat that this chip is manufactured in Taiwan, so NV41 is quite possibly produced by TSMC rather than IBM.

Let's get on with the examination.

ASUS Extreme N6800 256MB

The cooler is again what I call the "a la GeForce4" design: a closed heatsink with an off-center fan that sucks in air and drives it through the grille over the graphics processor. The heatsink has a copper base.

The rotational speed of the cooler is about 2000 rpm, so it produces almost no noise.

You can watch a video clip and evaluate the noise here (4.2MB, AVI DivX 5.1) and compare it to the background noise without the video card (740KB, AVI DivX 5.1).









Bundle

ASUS Extreme N6800 256MB
User's guide, CD with drivers and utilities, a disc with a DVD player, TV cable extensions, a DVI-to-d-Sub adapter, and the traditional ASUS software set in a branded box.







Package

ASUS Extreme N6800 256MB

The box is made of thick cardboard and has a bright design featuring the traditional male character. The contents are arranged in sections, and the card itself sits in a foam plastic tray, so damage in transit is unlikely (unless the box itself takes a heavy hit).






ASUS A8N-SLI-Deluxe Motherboard

To avoid encumbering this article, a brief description of this product has been moved to a separate page.

Installation and Drivers

Testbed configurations:

  • Athlon64 based computer
    • CPU: AMD Athlon64 2400 MHz (218MHz x 11; L2=512K) (~3800+);
    • ASUS A8N-SLI-Deluxe motherboard on NVIDIA nForce4 SLI
    • RAM: 1 GB DDR SDRAM 400MHz
    • HDD: WD Caviar SE WD1600JD 160GB SATA

  • Operating system: Windows XP SP2; DirectX 9.0c
  • Monitors: ViewSonic P810 (21") and Mitsubishi Diamond Pro 2070sb (21").
  • ATI drivers 6.512 (CATALYST 5.2); NVIDIA drivers 71.81.

VSync is disabled.

Both companies have enabled trilinear filtering optimizations in their drivers by default.

Test results

Before giving a brief evaluation of 2D, I will repeat that at present there is NO valid method for objective evaluation of this parameter due to the following reasons:

  1. 2D quality in most modern 3D accelerators depends dramatically on the specific sample, and it's impossible to evaluate every card.
  2. 2D quality depends not only on the video card, but also on the monitor and the cable.
  3. Specific monitor-card pairs have recently been shown to have a great impact on this parameter: there are monitors that simply won't "work" with certain video cards.

As for the samples under review, paired with the Mitsubishi Diamond Pro 2070sb these cards demonstrated identically excellent quality at the following resolutions and refresh rates:

ASUS Extreme N6800 256MB 1600x1200x85Hz, 1280x1024x100Hz, 1024x768x120Hz


Test results: performance comparison

I will note right away that we have already tested a single 6800 before. Besides, it will soon appear in our 3Digest. That's why we shall publish only the 6800 SLI test results.

We used the following test applications:

  • Tomb Raider: Angel of Darkness v.49 (Core Design/Eidos Software) – DirectX 9.0, Paris5_4 demo. The tests were conducted with quality set to maximum; only the Depth of Field PS20 effect was disabled.

  • Half-Life2 (Valve/Sierra) – DirectX 9.0, demos ixbt01, ixbt02, ixbt03. The tests were carried out with maximum quality and the -dxlevel 90 option; presets for video card types were removed from dxsupport.cfg.

  • FarCry 1.3 (Crytek/UbiSoft) – DirectX 9.0, multitexturing, three demos from the Research, Pier, and Regulator levels (the game is started with the -DEVMODE option), Very High test settings.

  • DOOM III (id Software/Activision) – OpenGL, multitexturing, test settings – High Quality (ANIS8x), demo ixbt1 (33MB!). We have a sample batch file, d3auto.rar, that starts the game automatically with increased speed and reduced jerking (precaching). (DO NOT BE AFRAID of the black screen after the first menu – that's how it should be! It will last 5-10 seconds and then the demo will start.)

  • 3DMark05 (FutureMark) – DirectX 9.0, multitexturing, test settings – trilinear filtering.



TR:AoD, Paris5_4 DEMO



Test results: TRAOD




This test demonstrates GPU power in terms of shader calculations, so it's hardly surprising that the 6800 SLI outscored a single card by nearly a factor of two. Is its 10-15% advantage over the top products enough to justify the expense? It all depends on prices: if two 6800 cards cost as much as a single 6800 Ultra or X850 XT PE, then the answer is yes.



FarCry, Research



Test results: FarCry Research




The 6800 SLI hasn't managed to reach a twofold advantage here, because the game is CPU-limited: performance is only 1.5 times as high. Moreover, our tandem lost the competition with the top products, especially the X850 XT PE, which was a full 20-25% faster.
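A simple Amdahl-style model (an illustration of my own; the GPU-bound share below is an assumed figure, not a measurement from these runs) shows why a CPU-limited scene caps the gains from a second card:

def sli_speedup(gpu_fraction: float, gpu_count: int = 2) -> float:
    # Only the GPU-bound share of frame time is split between the cards;
    # the CPU-bound remainder stays constant.
    cpu_fraction = 1.0 - gpu_fraction
    return 1.0 / (cpu_fraction + gpu_fraction / gpu_count)

print(sli_speedup(1.0))    # fully GPU-bound scene: the ideal 2.0x
print(sli_speedup(0.67))   # roughly two-thirds GPU-bound: ~1.5x, close to what we see here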



FarCry, Regulator



Test results: FarCry Regulator




The same picture.



FarCry, Pier



Test results: FarCry Pier




Actually, the 6800 SLI tandem has demonstrated its cost-ineffectiveness here as well.



Half-Life2: ixbt01 demo



Test results: Half-Life2, ixbt01




Here, considering only the AA+AF modes (it would be silly to buy two cards at 300-400 USD each and then disable these features), the 6800 SLI fares quite well – provided, of course, that the occasional quality issues in the drivers get fixed.



Half-Life2: ixbt02 demo



Test results: Half-Life2, ixbt02




A fiasco again. This scene is much more CPU-dependent, so the payoff from the 6800 SLI is minimal.



Half-Life2: ixbt03 demo



Test results: Half-Life2, ixbt03




The situation is similar to the previous one, so we can say nothing good about the 6800 SLI so far.



DOOM III



Test results: DOOM III




I suspect the developers fine-tuned the drivers for this game first of all, hence the brilliant results of the 6800 SLI tandem. But it's not only about drivers: the game itself loads the GPU heavily, so the dividends from SLI are considerable.



3DMark05: Game1



Test results: 3DMark05 Game1






3DMark05: Game2



Test results: 3DMark05 Game2






3DMark05: Game3



Test results: 3DMark05 Game3






3DMark05: MARKS



Test results: 3DMark05 MARKS




The situation is similar to TR:AoD – this test loads the GPU with shader calculations and, unlike a real game, is not held back by gameplay logic, so SLI demonstrates brilliant results. But are a couple of games and 3DMark worth buying two video cards for more than the price of any top accelerator?

Conclusions

So, we have reviewed the GeForce 6800 PCI-E from ASUSTeK (a single card and two such cards in SLI mode) plus the nForce4 SLI motherboard, which took part in all these tests.

The ASUS Extreme N6800 256MB is practically a copy of the reference design. It's more expensive than its AGP counterpart, first of all because it carries 256 MB of memory instead of 128. Note, though, that the memory is DDR1, so the price difference shouldn't be very large. I repeat that the memory operating frequency is reduced relative to the chips' nominal rating, and I have only one guess as to why: to stay under the 75 W power limit and avoid an external power connector (and the corresponding circuitry).

This card is positioned right between the GeForce 6600GT and GeForce 6800GT, sometimes dropping to the performance level of the former. But don't forget that the strong point of all 256-bit video cards is antialiasing: with AA enabled, the 6600GT is left far behind. I hope prices will settle down and such cards will sell for 300-320 USD.
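The reason AA favors 256-bit cards comes down to memory bandwidth, which antialiasing consumes heavily. Here is a back-of-the-envelope comparison (a sketch of my own; the 6600GT figures are the commonly quoted reference clocks, included purely for illustration):

def bandwidth_gb_s(bus_bits: int, effective_mhz: float) -> float:
    # bytes per transfer * transfers per second
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

print(bandwidth_gb_s(256, 620))    # ASUS N6800 as shipped: ~19.8 GB/s
print(bandwidth_gb_s(256, 700))    # at the nominal 700 MHz effective: ~22.4 GB/s
print(bandwidth_gb_s(128, 1000))   # reference GeForce 6600GT (128 bit, 1000 MHz effective): ~16.0 GB/s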

If prices do settle at that level, the 6800 SLI tandem will find its share of demand, because its total price will not be much higher than that of a single 6800 Ultra or X850 XT PE, and this "sweet couple" sometimes outscores those top accelerators – though not everywhere and not always. So it's too early to speak of SLI becoming popular with such cards. We should also remember that some driver glitches cause occasional quality issues, and that not all new games support SLI – the feature does not work in The Chronicles of Riddick, for example.

Here is an illustrative example from FarCry (1.9MB, DivX 5.1) with the load-balancing marker enabled, demonstrating which part of the scene is processed by which card – SLI in action. And here (3.3MB, DivX 5.1) you can see The Chronicles of Riddick, where SLI simply doesn't work: there is no marker, even though it's enabled in the drivers.

The modern incarnation of SLI is not just a dumb coupling of two video cards that works in every application a priori, as it was in the days of Voodoo2. It now depends on driver support: the driver must detect the game and enable a specific SLI optimization mode for it. Always bear in mind that a fresh game will not necessarily show its full potential in SLI mode – it may simply fail to use the second card until NVIDIA's programmers release a new driver version with support for that game.
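To illustrate the per-game profile idea (a toy sketch of my own – the table and mode names below are purely illustrative and have nothing to do with NVIDIA's actual driver internals):

SLI_PROFILES = {
    "farcry.exe": "SFR",   # split-frame rendering, as the marker clip above suggests
    "hl2.exe": "AFR",      # alternate-frame rendering
    # no entry for Riddick: SLI stays idle until a driver update adds a profile
}

def pick_sli_mode(executable: str) -> str:
    # Unknown titles fall back to single-GPU rendering.
    return SLI_PROFILES.get(executable.lower(), "single-GPU")

print(pick_sli_mode("FarCry.exe"))   # SFR
print(pick_sli_mode("riddick.exe"))  # single-GPU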

As for the ASUS A8N-SLI-Deluxe motherboard, this product is good in nearly every respect: convenient assembly, PCI-E x16 slots spaced far enough apart to accept even dual-slot video cards, rich BIOS settings, excellent operation. But the chipset cooler noise, which is hard to get rid of (not because the cooler can't be replaced, but because the BIOS demands an 8000 rpm speed – have a look at the utility screenshots showing this rotational speed), makes this otherwise excellent product very disappointing. Just imagine constantly listening to that bee-like buzz, accompanied by howling whenever the rpm changes – that's not acceptable for a High-End product.

The nForce4 SLI chip also runs very hot: even with this cooler it reaches 50-60 degrees, to say nothing of what a lower rotational speed would mean. The designers should have thought about this at the layout stage and left more space around the chipset for a large flat heatsink with a low-noise fan, instead of the puny thing that looks good but yells like a siren (3MB, AVI DivX 5.1).

In our 3Digest you can find more detailed comparisons of various video cards.





Theoretical materials and reviews of video cards, which concern functional properties of the GPU ATI RADEON X800 (R420)/X850 (R480)/X700 (RV410) and NVIDIA GeForce 6800 (NV40/45)/6600 (NV43)








We express our thanks to ASUSTeK
for the provided A8N-SLI-Deluxe motherboard


Andrey Vorobiev (anvakams@ixbt.com)

2005


