iXBT Labs - Computer Hardware in Detail






ATI CrossFire Based on RADEON X1900 XT, 2x512MB


  1. Introduction
  2. Video card and PSU features
  3. Testbed configurations, benchmarks
  4. Test results
  5. Conclusions

All our readers know well that at the end of January ATI announced a new series of desktop video accelerators - the RADEON X1900.

This series contains three cards: the RADEON X1900 XTX (the most powerful accelerator), the RADEON X1900 XT (a tad slower), and the RADEON X1900 XT CrossFire Edition. We have already examined the first two products. This article is devoted to CrossFire based on the X1900.

The technology itself has already been reviewed, so we shall not repeat ourselves. In brief, it's a counterpart of NVIDIA SLI: two accelerators work together to render 3D graphics, and ideally the final result is obtained twice as fast (in real life the gain is lower for a number of reasons: game peculiarities, CPU performance limits, other system resources, etc.).

There are different methods of sharing scene rendering between the cards; it's all up to the driver. NVIDIA SLI either slices a scene horizontally into two pieces, whose sizes can vary depending on the video requirements of a game, or has each accelerator process an entire frame (a complete scene) on its own, but in turns (even frames by one accelerator, odd frames by the other). CrossFire adds a third mode: dividing a scene into tiles in staggered rows and sharing them between the two video cards. That's just an outline; you can read the details in this article about CrossFire.
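The three sharing modes described above can be sketched with a toy scheduler. This is purely illustrative: the function and mode names are my own shorthand, not anything from ATI's or NVIDIA's drivers.

```python
# Toy illustration of the three load-sharing modes (not actual driver code):
# "scissor" slices each frame horizontally, "afr" alternates whole frames,
# "tile" deals out a checkerboard of tiles between the two GPUs.

def assign_work(mode, frame_no, frame_height=1200, tile=32):
    if mode == "scissor":
        # A real driver can move the split line to balance load; 50/50 here.
        split = frame_height // 2
        return {0: ("rows", 0, split), 1: ("rows", split, frame_height)}
    if mode == "afr":
        # Even frames go to GPU 0, odd frames to GPU 1.
        return {frame_no % 2: ("frame", frame_no)}
    if mode == "tile":
        # Checkerboard: tile (row, col) goes to GPU (row + col) % 2.
        n = frame_height // tile
        board = {0: [], 1: []}
        for r in range(n):
            for c in range(n):
                board[(r + c) % 2].append((r, c))
        return board
    raise ValueError(mode)

print(assign_work("afr", 7))  # odd frame, handled entirely by GPU 1
```

In the tiled mode each GPU renders half the tiles of every frame, which is why it tends to balance load well even when scene complexity is distributed unevenly across the screen.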

SLI has a number of advantages. Firstly, it appeared earlier and had time to take root: its drivers were updated and SLI support grew stronger. Secondly, SLI does not need a special card; any two video cards with identical GPUs will do (from the GeForce 6600 LE and higher).

All these SLI advantages automatically turn into disadvantages for CrossFire (CF). The technology arrived late, was poorly supported, and was represented only by outdated X850 cards. Plus it required a mandatory master card - the CF Edition.

But that was back in the summer and early autumn of 2005. Now the situation has changed. First of all, the new X1900 series includes a CF Edition, and these cards are already available in stores together with regular X1900 XT/XTX cards. So you can already build a tandem of a CF Edition and a regular X1900 card. There is no point in buying the XTX model, as the tandem will operate at XT frequencies anyway (the CF Edition works at those frequencies).

By the way, for this reason the X1900 CF is formally a duet of not quite the most powerful accelerators from ATI. But we'll get to that later.

There is also another issue: the motherboard. CF currently requires a motherboard based on the ATI RD480 or Intel 975X chipset. Besides, the late launch and reluctant distribution of such boards is a problem of its own. It's already obvious that if you have a system based on nForce4 or i955, you'll hardly want to switch to another motherboard for the sake of CF.

But if you plan on upgrading your old AGP platform to something new and getting the most powerful gaming system, it's worth weighing all the options. CF may not be the worst choice.

However, it's up to users to decide; prices and availability play a great role. Neither SLI nor CF boasts features bright enough to justify spending lots of money, whichever turns out to be more expensive.

Our mission is to demonstrate the new card.

So, CF is a bundle: a motherboard, two video cards, and software (just the usual drivers, nothing special). One of the video cards is a CF Edition model, and that's the very card we'll examine today. The motherboard will also be briefly reviewed below.

Video card

ATI RADEON X1900 XT CrossFire Edition, 512MB
Interface: PCI-Express x16

Frequencies (core/memory, physical (effective)): 625/725 (1450) MHz; nominal: 625/725 (1450) MHz

Memory bus width: 256bit

Number of vertex pipelines: 8

Number of pixel pipelines: 48

Number of texture processors: 16

Number of ROPs: 16

Dimensions: 205x100x32mm (the last figure is the maximum thickness of a video card).

PCB color: red.

Output connectors: 2 x DualLink DVI.

VIVO: not available

TV-out: not available.

ATI RADEON X1900 XT CrossFire Edition, 512MB
The video card carries 512 MB of GDDR3 SDRAM in eight chips on the front side of the PCB.

Samsung memory chips. 1.2ns memory access time, which corresponds to 800 (1600) MHz.
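The rated clock follows from the access time as f = 1/t; here is a quick sketch (the function name is mine). Exact arithmetic for 1.2 ns gives about 833 MHz, so the 800 (1600) MHz above is the conventional rounded rating; either way the chips have headroom over the card's 725 (1450) MHz.

```python
def rated_mhz(access_time_ns):
    """Maximum memory clock implied by the chip access time: f = 1/t."""
    return 1000.0 / access_time_ns  # a 1 ns period corresponds to 1000 MHz

clock = rated_mhz(1.2)
print(round(clock), round(2 * clock))  # physical and DDR-effective: 833 1667
```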

Comparison with the reference design, front view
ATI CrossFire based on RADEON X1900 XT, 2x512MB
Reference card ATI RADEON X1900 XT 512MB

Comparison with the reference design, back view
ATI CrossFire based on RADEON X1900 XT, 2x512MB
Reference card ATI RADEON X1900 XT 512MB

So we can see that the X1900 XT CFE is actually a regular X1900 XT deprived of VIVO and even of TV-out for the sake of CF: just a bare video card without the usual extras. Is that a disadvantage? I think not. If you plan on moving to CF later, that is, buying the second video card after the first, then the first card you buy should be a regular X1900 XT with VIVO; only the second one should be a CF Edition product.

The card is equipped with two DVI jacks (Dual-Link, of course). One of them is special:

A special compositing adapter is plugged into it.

One of its ends goes to the second video card (DVI), the other is used to connect a monitor.

That's how it looks assembled (outside a PC case). The CF Edition should be inserted into the slot farthest from the CPU. By the way, that's the second time I've noticed that ATI has everything the other way round: the slots are not counted from the CPU, and the GPUs on the video cards are upside down. What does it all mean? :)

That's how CF works:

I guess there is no point in examining the cooling system - it's a standard one, and I have already described this cooler before.

To conclude the CF Edition examination, let's have a look at the GPU:

It's an engineering sample of the R580, exactly like those installed in all X1900 XT/XTX cards. The difference between the CFE and regular cards comes down to the PCB and its additional elements, which composite the pieces into a whole 3D scene.

Installation and Drivers

Testbed configurations:

  • Athlon 64 (Socket 939) based computer
    • CPU: AMD Athlon 64 4000+ (2400 MHz) (L2=1024K)
    • Motherboard: ASUS A8R-MVP on ATI RADEON Xpress 200 CrossFire

    • RAM: 2 GB DDR SDRAM 400MHz (CAS (tCL)=2.5; RAS to CAS delay (tRCD)=3; Row Precharge (tRP)=3; tRAS=6)
    • HDD: WD Caviar SE WD1600JD 160GB SATA

  • Operating system: Windows XP SP2; DirectX 9.0c
  • Monitors: ViewSonic P810 (21") and Mitsubishi Diamond Pro 2070sb (21").
  • ATI CATALYST 5.13beta; NVIDIA drivers 81.98.

VSync is disabled.

According to system monitoring, the video card runs very hot: over 90°C under load! But I repeat that the main virtue of the cooling system is that it drives hot air out of the PC case, so you shouldn't worry too much. Besides, the fan speed does not rise much even at maximum temperature (the noise is hardly audible).

Test results: performance comparison

It's no secret that many games have already hit the limits of the CPU and the system as a whole, so even in AA 4x and AF 16x modes we don't see the full potential of top video cards. That's why we introduced a new HQ (High Quality) mode, which means:

  • ATI RADEON X1xxx: AA 6x, plus Adaptive AA, plus AF 16x High Quality

In fact, that's the maximum the latest-gen accelerators from ATI can demonstrate; their capacity is used at nearly 100%.

The NVIDIA GeForce 7800 GTX 512 SLI is represented by Golden Sample cards from Gainward, which is why the frequencies are 580/1760 MHz instead of 550/1700 MHz.

We used the following test applications:

  • Half-Life 2 (Valve/Sierra) — DirectX 9.0, demos ixbt01, ixbt02, ixbt03. The tests were carried out with maximum quality, the -dxlevel 90 startup option; presets for video card types were removed from dxsupport.cfg.

  • FarCry 1.33 (Crytek/UbiSoft), DirectX 9.0, multitexturing, 3 demos from Research, Pier, Regulator levels (-DEVMODE startup option), Very High test settings.

  • DOOM III (id Software/Activision) — OpenGL, multitexturing, test settings — High Quality (ANIS8x), demo ixbt1 (33MB!). We have a sample batch file to start the game automatically with increased speed and reduced jerking (precaching) d3auto.rar. (DO NOT BE AFRAID of the black screen after the first menu, that's how it should be! It will last 5-10 seconds and then the demo should start)

  • 3DMark05 (FutureMark) — DirectX 9.0, multitexturing, test settings — trilinear.

  • F.E.A.R. v.1.02 (Monolith/Sierra) — DirectX 9.0, multitexturing, test settings — maximum, Soft shadows OFF(!!!).

  • Splinter Cell: Chaos Theory v.1.04 (Ubisoft) — DirectX 9.0, multitexturing, maximum test settings, Shaders 3.0 (for NVIDIA cards) / Shaders 2.0 (for ATI cards); HDR OFF!

  • The Chronicles Of Riddick: Escape From Butcher Bay v.1.04 (Starbreeze/Vivendi) — OpenGL, multitexturing, test settings — maximum texture quality, Shader 2.0, demo 44 and demo ducche.

    I wish to thank Rinat Dosayev (AKA 4uckall) and Alexei Ostrovski (AKA Ducche), who have created a demo for this game. I also want to thank Alexei Berillo AKA Somebody Else for his help.

  • Call Of Duty 2 DEMO (Ubisoft) — DirectX 9.0, multitexturing, test settings — maximum, Shaders 2.0; tested in Benchemall with a demo and a startup script (the readme contains the necessary instructions).

Test results of ATI CrossFire on RADEON X1900 XT, 2x512MB

Game tests that heavily load vertex shaders, mixed pixel shaders 1.1 and 2.0, active multitexturing.

FarCry, Research

Test results: FarCry Research

The game is heavily limited by system resources, so the gain is not large even in the AA+AF mode; only the HQ mode demonstrated a substantial gain from CF.

However, that's without HDR. As soon as Patch 1.40, which supports this technology on ATI cards, is out, we shall carry out additional tests.

Game tests that heavily load vertex shaders, pixel shaders 2.0, active multitexturing.

F.E.A.R.
Test results: F.E.A.R.

The CF gain is quite good, but it still does not exceed 50%. As for the battle with the 7800 GTX 512 SLI, the situation is mixed, but CF still comes out the winner, as that SLI setup will obviously be more expensive.

Splinter Cell Chaos Theory

Test results: SCCT

CF demonstrates a very good performance gain, but SLI is still not defeated: we've got parity here. But keep in mind that CF may be cheaper.

Call Of Duty 2 DEMO

Test results: COD2

The situation is similar; we can even say that SLI is victorious.

Half-Life2: ixbt01 demo

Test results: Half-Life2, ixbt01

The game depends much on the processor, so only the HQ mode allowed this tandem to reveal its potential.

Game tests that heavily load pixel pipelines with texturing, active operations of the stencil buffer and shader units

DOOM III High mode

Test results: DOOM III

Chronicles of Riddick, demo 44

Test results: Chronicles of Riddick, demo 44

The CF gain is very high, but OpenGL, NVIDIA's strongest API, makes itself felt. While we can see parity between SLI and CF in DOOM III, Chronicles of Riddick favours the NVIDIA solution. Alas, OpenGL games will remain a headache for the Canadian company to the end.

Synthetic tests that heavily load shader units

3DMark05: MARKS

Test results: 3DMark05 MARKS

Surprisingly, the CF gain is not as high as we could have expected, and SLI demonstrates higher performance gains. That's why the duets demonstrate parity here, even though a single X1900 XT outperforms a 7800 GTX 512.

3DMark06: MARKS

Test results: 3DMark06 MARKS

Shader Model 2.0

Shader Model 3.0

CF pays good dividends in the SM2.0 tests, but SLI is again more efficient, so we have parity here as well.

SM3.0 brings success to CF, as even a single X1900 XT card is stronger than its competitor from NVIDIA. Besides, GeForce cards cannot use HDR together with AA.


Conclusions

ATI CrossFire based on RADEON X1900 XT, 2x512MB is currently the fastest tandem of 3D accelerators for games.

It's sometimes outperformed by a similar duet of two GeForce 7800 GTX 512MB cards, but on the whole it's a tad more powerful. Besides, that SLI setup operates at higher, overclocked frequencies; versus a regular 7800 GTX 512MB SLI, CF will certainly look like a winner. But is it justified to spend so much money on two accelerators for an average performance gain of 65-70%? There is no point in answering this question, as a regular user can hardly afford to spend nearly $1500 on such a video system. Only an enthusiast or a hardcore gamer will go for it, and such people will not be stopped by percentages.
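For reference, the gain figures quoted throughout are simple frame-rate ratios. A minimal sketch, with made-up fps numbers (60 and 100 are for illustration, not measurements from our tests):

```python
def cf_gain_percent(single_fps, dual_fps):
    """Gain of a two-card tandem over a single card, in percent."""
    return (dual_fps / single_fps - 1.0) * 100.0

# Illustrative only: 60 fps on one card vs 100 fps in CrossFire
# lands in the middle of the 65-70% average mentioned above.
print(round(cf_gain_percent(60, 100)))  # 67
```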

All the pros and cons have already been mentioned above; everything is crystal clear. As for me, if I faced such a choice, given that I'd have to change my platform completely, I wouldn't find the decision easy even if the two cards had the same price (as it is, we can expect the X1900 XT to be cheaper than the 7800 GTX 512MB even as they only start to appear on the shelves, because the latter cards are in short supply).

So it's up to our readers to decide. We've told everything we could about the new technology and its implementations. And don't forget about the voracious appetite of such cards! A single X1900 XT consumes over 120 W, so CF requires a very powerful PSU (certainly more powerful than a 500-600 W unit).
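A back-of-the-envelope power budget shows where such a PSU recommendation comes from. Only the ~120 W per card comes from the article; the other component draws and the 30% headroom factor are my own ballpark assumptions:

```python
# Rough system power budget for an X1900 CrossFire rig.
draw_watts = {
    "X1900 XT #1": 120,  # per-card figure quoted above
    "X1900 XT #2": 120,
    "CPU": 90,           # assumed draw of a high-end Athlon 64
    "motherboard, RAM, drives, fans": 70,  # assumed combined draw
}

total = sum(draw_watts.values())
headroom = 1.3  # keep ~30% reserve for load spikes and PSU ageing
print(total, round(total * headroom))  # 400 520
```

Even with these conservative assumptions the sustained draw lands around 400 W, so a quality unit of 500-600 W and above is a sensible floor.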

When this article was nearly completed, I suddenly had an idea: would it have been so hard to make all X1900 cards CrossFire Editions? It's quite possible to fit both VIVO and the compositing circuitry on one PCB. The upper Dual-DVI jack can be used even without CF: a monitor can be connected via the adapter. It would raise the cost by no more than $20 - nothing compared to $600. In return, there would be no problems with buying so-called master cards; any two X1900 cards would be able to form a CF tandem.

Moreover, the price of the X1900 XT CFE, which is $50 higher than that of the X1900 XT, seems too high to me. The card does not contain any super-expensive elements! Besides, we should deduct the cost of the Rage Theater chip, which is absent from the CFE. So this premium is just a marketing move.

We may even see CFE cards become cheaper than regular X1900 XT cards due to lack of demand: many people use TV-out, and the X1900 XT CFE does not have it.

You can find more detailed comparisons of various video cards in our 3Digest.

Theoretical materials and reviews of video cards concerning the functional properties of the ATI RADEON X1800 (R520) / X1900 (R580) / X1600 (RV530) / X1300 (RV515) and NVIDIA GeForce 6800 (NV40/45) / 7800 (G70) / 6600 (NV43) GPUs

Andrey Vorobiev (anvakams@ixbt.com)

February 3, 2006

Copyright © Byrds Research & Publishing, Ltd., 1997–2011. All rights reserved.