iXBT Labs - Computer Hardware in Detail

September 27, 2006


  1. Introduction
  2. Video card's features
  3. Testbed configuration, benchmarks
  4. Test results
  5. Conclusions

June and Computex seem not so long ago, but the summer is over, and dull autumn days await us.

But it's all different in the IT world, as if this branch of industry were located solely in the Southern Hemisphere: all peace and quiet in summer, while manufacturers and vendors grow increasingly active toward winter. The climax of the year is Christmas and the pre-holiday sales, followed by New Year's Eve. Then it's back to work, to dance the last pas after the spring turmoil of Computex, and off to rest until autumn.

That's the paradox of the human world, unlike nature. Only denizens of the Southern Hemisphere live in full accord with the IT rhythm, because 90% of all design and production is done in the Northern Hemisphere :-)

Nature is slowing down to get ready for cold weather, while we are expecting new video cards. And here they are. We recently reviewed the GeForce 7900 GS; now it's the turn of the ATI product announced back in August.

The Canadian manufacturer has declared a general trend of reducing prices on top accelerators. The recommended retail price for the X1950 XTX was set to 449 USD, so all lower-ranked cards should go down in price proportionally.

We have a nice surprise: the RADEON X1900 XT (a full-fledged chip, with nothing cut down!) with 256 MB of memory instead of 512 MB. The recommended price for this product is 279 USD.

Thus, the new line of top products from ATI will look like this (the figures in brackets are vertex pipelines/pixel shader units/texture units/ROPs):

  1. RADEON X1950 XTX, 512MB, 8/48/16/16, 650/2000 MHz, 449USD
  2. RADEON X1900 XTX, 512MB, 8/48/16/16, 650/1550 MHz, 399USD
  3. RADEON X1900 XT, 512MB, 8/48/16/16, 625/1450 MHz, 299USD
  4. RADEON X1900 XT, 256MB, 8/48/16/16, 625/1450 MHz, 279USD
  5. RADEON X1900 GT, 256MB, 8/36/12/12, 249USD
  6. RADEON X1950 PRO, 256MB, 8/36/12/12, 199USD - not yet launched

Here are competing cards from NVIDIA:

  1. GeForce 7950 GX2, 2x512MB, 2x(8/24/24/16), 500/1200 MHz, 449USD
  2. GeForce 7900 GTX, 512MB, 8/24/24/16, 650/1600 MHz, 399USD
  3. GeForce 7950 GT, 512MB, 8/24/24/16, 550/1320 MHz, 349USD
  4. GeForce 7950 GT, 256MB, 8/24/24/16, 550/1320 MHz, 299USD
  5. GeForce 7900 GT, 256MB, 8/24/24/16, 450/1320 MHz, 249USD
  6. GeForce 7900 GS, 256MB, 7/20/20/16, 450/1320 MHz, 199USD
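As a rough illustration of how these configurations compare on paper, the lineups above can be reduced to simple theoretical throughput figures: pixel shader units times core clock, and ROPs times core clock. This is only a back-of-the-envelope sketch using the numbers from the tables; real performance also depends on architecture, drivers, and memory bandwidth.

```python
# Back-of-the-envelope comparison of the closest rivals from the lists above.
# Tuples: (pixel shader units, ROPs, core clock in MHz, price in USD).
cards = {
    "RADEON X1900 XT 256MB": (48, 16, 625, 279),
    "GeForce 7900 GT 256MB": (24, 16, 450, 249),
    "GeForce 7950 GT 256MB": (24, 16, 550, 299),
}

for name, (ps_units, rops, clock, price) in cards.items():
    shader_ops = ps_units * clock   # theoretical shader throughput, Mops/s
    fillrate = rops * clock         # theoretical pixel fillrate, Mpix/s
    print(f"{name}: {shader_ops} shader Mops/s, {fillrate} Mpix/s, {price} USD")
```

On paper the 48 shader units at 625 MHz give the X1900 XT roughly two to three times the shader throughput of either GeForce, which is exactly the advantage ATI is banking on in this price segment.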

So the X1900 XT 256MB should compete both with the old 7900 GT and with the new 7950 GT, since NVIDIA has no clear-cut offering in the $279 segment. Unfortunately, we haven't obtained the 7950 GT 256MB yet, so we cannot compare our cards with it. Nor can we emulate it: GeForce 7800 GTX and 7900 GT cards with this memory capacity cannot run at a 550 MHz GPU frequency.

So the Canadian company is clearly trying to attract potential customers in the popular $250-280 segment with its X1900 XT, famous for the power of its 48 shader units, but now with halved memory capacity. We'll run our tests and see.

Video card

GPU: RADEON X1900 XT (R580)

Interface: PCI-Express x16

GPU frequencies: 621 MHz (nominal — 625 MHz)

Memory frequencies (physical (effective)): 720 (1440) MHz (nominal — 725 (1450) MHz)

Memory bus width: 256bit

Number of Shader Vertex Processors: 8

Number of Shader Pixel Processors: 48

Number of texture processors: 16

Number of ROPs: 16

Dimensions: 210x100x32 mm (the last figure is the maximum thickness of the card).

PCB color: red.

Output connectors: 2 x DVI (Dual-Link), TV-Out.

VIVO: available (RAGE Theater)

TV-out: integrated into GPU.

The video card has 256 MB of GDDR3 SDRAM allocated in eight chips on the front side of the PCB.

Samsung GDDR3 memory chips with 1.2 ns access time, which corresponds to 800 (1600) MHz.
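The memory figures above translate directly into peak bandwidth: GDDR3 transfers data on both clock edges, so the effective rate is twice the physical clock, and peak bandwidth is the bus width in bytes times the effective rate. A minimal sketch:

```python
def peak_bandwidth_gb_s(bus_width_bits: int, effective_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_width_bits / 8) * effective_mhz * 1e6 / 1e9

# The card as tested: 256-bit bus at 720 (1440 effective) MHz
print(peak_bandwidth_gb_s(256, 1440))  # 46.08 GB/s

# What the 1.2 ns chips are rated for: 800 (1600 effective) MHz
print(peak_bandwidth_gb_s(256, 1600))  # 51.2 GB/s
```

So the chips leave some headroom over the clocks ATI actually runs them at.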

Comparison with the reference design, front view
ATI RADEON X1900 XT 256MB PCI-E Reference card ATI RADEON X1900 XTX

Comparison with the reference design, back view
ATI RADEON X1900 XT 256MB PCI-E Reference card ATI RADEON X1900 XTX

We can see that the card is unchanged compared to its earlier counterparts; it differs only in the capacity of its memory chips.

The card has a TV-Out with a unique jack. You will need the special bundled adapters to output video to a TV set via S-Video or RCA. You can read about the TV-Out in more detail here. The product is also equipped with TV-In, which is used with a special adapter as well.

Analog monitors with a d-Sub (VGA) interface are connected via the special DVI-to-d-Sub adapters. Maximum resolutions and refresh rates:

  • 240 Hz Max Refresh Rate
  • 2048 x 1536 x 32bit x85Hz Max - analog interface
  • 2560 x 1600 @ 60Hz Max - digital interface
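The dual-link limit above is easy to sanity-check: a single-link DVI connection tops out at a 165 MHz pixel clock, while 2560x1600 at 60 Hz needs more than that even counting active pixels alone (blanking intervals push the real pixel clock higher still). A quick check, assuming the standard 165 MHz single-link limit:

```python
def active_pixel_rate_mhz(width: int, height: int, refresh_hz: int) -> float:
    """Lower bound on the required pixel clock (active pixels only, no blanking)."""
    return width * height * refresh_hz / 1e6

SINGLE_LINK_DVI_MHZ = 165  # single-link TMDS pixel clock limit

rate = active_pixel_rate_mhz(2560, 1600, 60)
print(rate)                          # 245.76 MHz of active pixels alone
print(rate > SINGLE_LINK_DVI_MHZ)   # True: dual-link DVI is required
```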

As for the cooler, the product is equipped with the same reference cooler as the X1900 XT/XTX. The device is quite complex, and it gets very noisy when the card heats up. We have described this poor cooler many times; it made many ATI fans groan and look for ways to get rid of it.

Look at the temperatures of the X1900 XT (to say nothing of the X1900 XTX)!

I also want to draw your attention to the fact that the real frequencies, both core and memory, of all X1900 XT/XTX cards were lowered by 4-10 MHz some time ago. But even this step does not solve the overheating problem. It's a fiery card, and the cooler is noisy (the noise gets almost intolerable in F.E.A.R., for example).

We noted in the X1950 XTX announcement that this noisy and awkward cooler had finally been replaced with a better solution. Unfortunately, its retention module is not compatible with older cards, so we have to put up with this monster, the reference cooler from the X1900 series. Users will again have to look for a replacement cooling system (at least those who are irritated by the noise; some people either don't run the card in heavy game modes or don't mind the noise).

It's an engineering sample, so the bundle and package are out of the question.

Installation and Drivers

Testbed configuration:

  • Athlon 64 (939Socket) based computer
    • CPU: AMD Athlon 64 4000+ (2400MHz) (L2=1024K)
    • Motherboard: MSI K8N Diamond Plus based on the NVIDIA nForce4 SLI X16 chipset
    • RAM: 2 GB DDR SDRAM 400MHz (CAS (tCL)=2.5; RAS to CAS delay (tRCD)=3; Row Precharge (tRP)=3; tRAS=6)
    • HDD: WD Caviar SE WD1600JD 160GB SATA

  • Operating system: Windows XP SP2 DirectX 9.0c
  • Monitor: Mitsubishi Diamond Pro 2070sb (21").
  • ATI CATALYST 6.8; NVIDIA Drivers 91.47.

VSync is disabled.

Test results: performance comparison

We used the following test applications:

  • Splinter Cell Chaos Theory v.1.04 (Ubisoft) — DirectX 9.0, multitexturing, test settings — maximum, shaders 3.0 (for NVIDIA cards)/shaders 2.0 (for ATI cards); HDR OFF!

  • Half-Life2 (Valve/Sierra) — DirectX 9.0, demo ixbt01. The tests were carried out with maximum quality, option -dxlevel 90; presets for video card types are removed from dxsupport.cfg.

  • FarCry 1.33 (Crytek/UbiSoft), DirectX 9.0, multitexturing, demo from the Research level (-DEVMODE startup option), Very High test settings.

  • DOOM III (id Software/Activision) — OpenGL, multitexturing, test settings — High Quality (ANIS8x), demo ixbt1 (33MB!). We have a sample batch file to start the game automatically with increased speed and reduced jerking (precaching) d3auto.rar. (DO NOT BE AFRAID of the black screen after the first menu, that's how it should be! It will last 5-10 seconds and then the demo should start)

  • 3DMark05 1.20 (FutureMark) — DirectX 9.0, multitexturing, test settings — trilinear.

  • 3DMark06 1.02 (FutureMark) — DirectX 9.0, multitexturing, test settings — trilinear.

  • The Chronicles Of Riddick: Escape From Butcher Bay 1.10 (Starbreeze/Vivendi) — OpenGL, multitexturing, maximum texture quality, Shader 2.0, demo 44.

    I wish to thank Rinat Dosayev (AKA 4uckall) and Alexei Ostrovski (AKA Ducche), who have created a demo for this game. I also want to thank Alexei Berillo AKA Somebody Else for his help.

  • F.E.A.R. v.1.02 (Multiplayer) (Monolith/Sierra) — DirectX 9.0, multitexturing, maximum test settings, Soft shadows disabled.

  • Call Of Duty 2 DEMO (Ubisoft) — DirectX 9.0, multitexturing, test settings — maximum, shaders 2.0, tested in Benchemall with the demo and a startup script; the readme contains the necessary instructions.

Summary performance diagrams

Game tests that heavily load vertex shaders, mixed pixel shaders 1.1 and 2.0, active multitexturing.

FarCry, Research

Test results: FarCry Research

Game tests that heavily load vertex shaders, pixel shaders 2.0, active multitexturing.


Test results: F.E.A.R

Splinter Cell Chaos Theory

Test results: SCCT

Call Of Duty 2 DEMO

Test results: COD2

Half-Life2: ixbt01 demo

Test results: Half-Life2, ixbt01

Game tests that heavily load pixel pipelines with texturing, active operations of the stencil buffer and shader units

DOOM III High mode

Test results: DOOM III

Chronicles of Riddick, demo 44

Test results: Chronicles of Riddick, demo 44

Synthetic tests that heavily load shader units

3DMark05: MARKS

Test results: 3DMark05 MARKS

3DMark06: Shader 2.0 MARKS

Test results: 3DMark06 Shader 2.0 MARKS

3DMark06: Shader 3.0 MARKS

Test results: 3DMark06 Shader 3.0 MARKS


ATI RADEON X1900 XT 256MB PCI-E. According to the tests, the new product (not actually new, it just has less memory and a better price) outscores the 7900 GT practically everywhere. For now, at least. Even where that card is cheaper than the X1900 XT, it would still make more sense to buy the latter. But we don't know about the 7950 GT 256MB yet; perhaps it will demonstrate a better price/performance ratio.

But I'd like to repeat that the X1xxx series has an important advantage: it supports HDR+AA, a very important feature for top cards, whose performance is sufficient for this mode. NVIDIA products cannot do that at all. Add to this anisotropic filtering of higher quality. There is also an opinion that the default quality of NVIDIA cards, with all their optimizations, is worse than that of ATI products, and that with optimizations disabled, NVIDIA accelerators lose much of their performance to regain the quality. Perhaps we'll analyze this issue in the near future; the last time we examined image quality was last year.

But still... The very noisy cooler and the card's overheating, which also heats other components inside your computer, stop us from calling the X1900 XT 256MB the best card in its class. Yes, its results are good, but other issues matter for top modern accelerators as well. We all know how hot these cards get and how noisy they are, ever since 2002 and the GeForce FX 5800 (NV30), whose howling cooler shocked its users. Our attitude to Hi-End cooling systems changed then: now we examine them thoroughly, and they play a larger role in choosing a card.

As for now, the only option is to buy an X1900 XT plus an aftermarket cooler (e.g. a Zalman), but will the total cost be worth it? If the 7950 GT comes equipped with the cooler from the 7900 GTX, that product will evidently be the more sensible choice.

The X1900 XT 256MB is good (it would have been our favorite choice for 279-300 USD), if not for the cooling system. :(

You can find more detailed comparisons of various video cards in our 3Digest.

Theoretical materials and reviews of video cards concerning the functional properties of the ATI RADEON X800 (R420)/X850 (R480)/X700 (RV410) and NVIDIA GeForce 6800 (NV40/45)/6600 (NV43) GPUs

PSU for the testbed was kindly provided by HIPER

Motherboard for the testbed was kindly provided by the company

Andrey Vorobiev (anvakams@ixbt.com)
