iXBT Labs - Computer Hardware in Detail






GeCube RADEON X800 128MB 256bit PCI-E

Based on ATI RADEON X800


  1. Introduction
  2. Video cards' features
  3. Testbed configurations, benchmarks
  4. Test results: summary performance diagrams
  5. Conclusions

The RADEON X800, a mid-range product, appeared at the end of 2004, but only 256MB cards were available at the time. They were complete copies of the reference X800XL design; the only differences from their elder sibling were the cut-down core and reduced memory frequency.

But the large capacity of GDDR3 memory did not make for reasonable prices. This product competes with both the GeForce 6600GT and the GeForce 6800, ranking approximately between them.

Launching a 128MB card with DDR1 would reduce manufacturing costs and, consequently, retail prices for such products.

But that would require redesigning the PCB to support DDR1 while retaining the 256bit memory bus. Some companies are tempted to simply leave half the memory chips off the card and launch an X800 128MB with GDDR3, the old PCB, and a memory bus cut down to 128 bit. Today's review includes a comparison with one such card.
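To see why halving the bus width matters, here is a minimal sketch of theoretical peak memory bandwidth, using the X800's nominal memory clocks from this review (the helper function name is my own, for illustration):

```python
def memory_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Theoretical peak memory bandwidth in GB/s:
    bytes moved per transfer times transfers per second."""
    bytes_per_transfer = bus_width_bits / 8          # bus width in bytes
    transfers_per_sec = effective_clock_mhz * 1e6    # effective (DDR) clock
    return bytes_per_transfer * transfers_per_sec / 1e9

# X800 128MB with the full 256bit bus: 350 MHz physical, 700 MHz effective
full_bus = memory_bandwidth_gb_s(256, 700)   # 22.4 GB/s
# the same memory behind a bus cut to 128 bit
half_bus = memory_bandwidth_gb_s(128, 700)   # 11.2 GB/s
```

Cutting the bus to 128 bit halves the theoretical bandwidth outright, which is exactly what the GeCube card avoids by keeping the full 256bit bus.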

Here is the first harbinger: a 128MB X800 that retains the full-fledged 256bit bus. This product from GeCube is the subject of our review.

As a reminder, almost all our articles now include video clips, brief but more illustrative of new products. They come in three formats: best quality (MPEG1), and average and low quality (WMV).

I would remind you that the RADEON X800 is a product priced at about 200 USD. It officially supports only the 256bit bus (though, as it turns out, it's easily converted to 128bit), has 12 pixel and 6 vertex pipelines, and supports DirectX 9.0c.
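Those pipeline counts translate directly into theoretical fillrate. A rough sketch, assuming one pixel per pipeline per clock (the GeForce 6600GT figures of 8 pipelines at 500 MHz are its stock specification):

```python
def fillrate_mpix_s(pixel_pipes, core_clock_mhz):
    """Theoretical pixel fillrate: one pixel per pipeline per clock."""
    return pixel_pipes * core_clock_mhz

x800 = fillrate_mpix_s(12, 400)            # 4800 Mpix/s
geforce_6600gt = fillrate_mpix_s(8, 500)   # 4000 Mpix/s
```

On raw fillrate the X800 holds a modest edge over the 6600GT, which is consistent with the card landing between the 6600GT and the 6800 in practice.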

Video Cards

GeCube RADEON X800 128MB 256bit PCI-E
Interface: PCI-Express x16

Frequencies, chip/memory physical (memory effective): 400/350 (700) MHz; nominal: 400/350 (700) MHz

Memory bus width: 256bit

Number of vertex pipelines: 6

Number of pixel pipelines: 12

Dimensions: 190x100x17mm (the last figure is the maximum thickness of a video card).

PCB color: red.

Output connectors: DVI, d-Sub, S-Video.

VIVO: Not available

TV-out: integrated into GPU.

GeCube RADEON X800 128MB 256bit PCI-E
The video card has 128 MB of DDR SDRAM housed in eight chips on the front and back sides of the PCB.

Samsung memory chips with 2.8ns access time, which corresponds to 350 (700) MHz.
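The rated clock can be sanity-checked against the access time: the maximum physical clock is roughly the reciprocal of the access time, and vendors typically rate chips slightly below that ceiling. A quick sketch (helper name is my own):

```python
def max_clock_mhz(access_time_ns):
    """Approximate maximum physical clock a memory chip supports:
    the reciprocal of its access time, converted from ns to MHz."""
    return 1000.0 / access_time_ns

ceiling = max_clock_mhz(2.8)   # ~357 MHz, so the 350 MHz rating fits
```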

Comparison with the reference design, front view
GeCube RADEON X800 128MB 256bit PCI-E
Reference card ATI RADEON X800 PCI-E

Comparison with the reference design, back view
GeCube RADEON X800 128MB 256bit PCI-E
Reference card ATI RADEON X800 PCI-E

GeCube engineers based their product on a new PCB with the same 256bit bus, where all seats for memory chips are occupied. Besides, it uses cheaper memory; such chips have been on the market for at least three years (the first ones appeared in the GeForce4 Ti 4600 in 2002). Let's hope that other ATI partners will follow suit.


Let's proceed with the examination.

GeCube RADEON X800 128MB 256bit PCI-E

The cooling system is elegant, refined, and effective; it generates almost no noise! Details are in the video clip (the link is published at the beginning of the article).

Note that the rear heatsinks are made of aluminum; the copper look is just paint. The main heatsink has a copper base and a heat pipe, filled with a low-boiling liquid, to improve heat removal.

GPU R430 (X800):


GeCube RADEON X800 128MB 256bit PCI-E
The card traditionally comes with the following bundle: user's manual, CD with drivers, CD with utilities, TV extension cords, and HDTV and DVI-to-d-Sub adapters. Plus a BONUS: a headset (headphones with a microphone) for internet chats. It's unclear, though, why this video card's bundle includes a bonus that seems more appropriate for a sound card.

Let's have a look at the box.

GeCube RADEON X800 128MB 256bit PCI-E

The box follows this company's traditional design: bright colors and a glossy coating. Cardboard sections inside accommodate the entire bundle, so no components should dangle loose.

Installation and Drivers

Testbed configurations:

  • Athlon 64 (Socket 754) based computer:
    • CPU: AMD Athlon 64 3700+ (L2=1024K)
    • Motherboard: ASUS K8V SE Deluxe based on VIA K8T800
    • RAM: 1 GB DDR SDRAM PC3200 (CAS (tCL)=2.5; RAS to CAS delay (tRCD)=3; Row Precharge (tRP)=3; tRAS=6)
    • HDD: Seagate Barracuda 7200.7 80GB SATA

  • Athlon 64 (Socket 939) based computer:
    • CPU: AMD Athlon 64 4000+ (L2=1024K)
    • Motherboard: ASUS A8N SLI Deluxe based on NVIDIA nForce4 SLI
    • RAM: 1 GB DDR SDRAM 400MHz (CAS (tCL)=2.5; RAS to CAS delay (tRCD)=3; Row Precharge (tRP)=3; tRAS=6)
    • HDD: WD Caviar SE WD1600JD 160GB SATA

  • Gigabyte RADEON X800 256MB, 400/990 MHz, 256bit, PCI-E
  • Galaxy GeForce 6600GT 128MB, 500/1000 MHz, 128bit, PCI-E
  • Operating system: Windows XP SP2; DirectX 9.0c
  • Monitors: ViewSonic P810 (21") and Mitsubishi Diamond Pro 2070sb (21").
  • ATI drivers 6.542 (CATALYST 5.6); NVIDIA drivers 77.72.

VSync is disabled.

Test results: performance comparison

We used the following test applications:

  • Tomb Raider: Angel of Darkness v.49 (Core Design/Eidos Software) — DirectX 9.0, Paris5_4 demo. The tests were conducted with quality set to maximum; only the PS2.0 Depth of Field effect was disabled.

  • Half-Life2 (Valve/Sierra) — DirectX 9.0, demos (ixbt01, ixbt02, ixbt03). The tests were carried out with maximum quality, the -dxlevel 90 option, and presets for video card types removed from dxsupport.cfg.

  • FarCry 1.3 (Crytek/UbiSoft), DirectX 9.0, multitexturing, 3 demos from Research, Pier, Regulator levels (-DEVMODE startup option), Very High test settings.

  • DOOM III (id Software/Activision) — OpenGL, multitexturing, test settings — High Quality (ANIS8x), demo ixbt1 (33MB!). We have a sample batch file, d3auto.rar, to start the game automatically with increased speed and reduced jerking (precaching). (Do not be afraid of the black screen after the first menu — that's normal. It lasts 5-10 seconds, and then the demo starts.)

  • 3DMark05 (FutureMark) — DirectX 9.0, multitexturing, test settings — trilinear.

  • F.E.A.R. (Multiplayer beta) (Monolith/Sierra) — DirectX 9.0, multitexturing, Maximum test settings, Soft shadows.

  • The Chronicles Of Riddick: Escape From Butcher Bay (Starbreeze/Vivendi) — OpenGL, multitexturing, test settings — maximum texture quality, Shader 2.0, demo 44 and demo ducche.

Overall performance

Game tests that heavily load pixel shaders 2.0.

TR:AoD, Paris5_4 DEMO

Test results: TRAOD

We use this outdated and not very popular game as an example of a classic "shader game", which heavily loads these units. As you can see, the 128MB 256bit card is outperformed by its 256MB counterpart, with the largest gap appearing in AA+AF modes. That's easy to explain: the memory on the second card operates at 990 MHz instead of 700 MHz. Meanwhile, the GeCube accelerator (X800 128MB 256bit) outperforms the GeForce 6600GT.

Game tests that heavily load vertex shaders, mixed pixel shaders 1.1 and 2.0, active multitexturing.

FarCry, Research

Test results: FarCry Research

FarCry, Regulator

Test results: FarCry Regulator

FarCry, Pier

Test results: FarCry Pier

The GeCube RADEON X800 128MB 256bit PCI-E is outperformed by the X800 256MB 256bit; the difference reaches 18% in heavy modes. I'd attribute that to the difference in memory frequencies rather than in memory capacity. However, this product beats the GeForce 6600GT.

F.E.A.R. (MP beta)

Test results: F.E.A.R. (MP beta)

F.E.A.R. is the newest game here; its release is planned for this autumn. It loads accelerators heavily, so absolute FPS figures at maximum quality are not high.

The GeCube RADEON X800 128MB 256bit PCI-E is outperformed by the Gigabyte RADEON X800 256MB 256bit PCI-E (GV-RX80256D) solely because of the different memory frequencies, as the gap grows with resolution.

Game tests that heavily load both vertex shaders and pixel shaders 2.0

Half-Life2: ixbt01 demo

Test results: Half-Life2, ixbt01

Half-Life2: ixbt02 demo

Test results: Half-Life2, ixbt02

Half-Life2: ixbt03 demo

Test results: Half-Life2, ixbt03

The accelerator from GeCube is generally very good. Of course, the X800 256MB turns out to be the fastest model, but that's the effect of its higher memory frequency.

Game tests that heavily load pixel pipelines with texturing, active operations of the stencil buffer and shader units

DOOM III High mode

Test results: DOOM III

Chronicles of Riddick, demo 44

Test results: Chronicles of Riddick, demo 44

Chronicles of Riddick, demo ducche

Test results: Chronicles of Riddick, demo ducche

There is not much to comment on: the GeForce 6600GT is clearly the strongest here. Otherwise, the balance of power between the two X800 variants remains the same (the 256MB X800 is the fastest, followed by the X800 128MB 256bit).

Synthetic tests that heavily load shader units

3DMark05: MARKS

Test results: 3DMark05 MARKS

The results are similar to those demonstrated in the first test. That's only natural, considering that the heaviest load falls on shader units again.


The GeCube RADEON X800 128MB 256bit PCI-E is a welcome development. Even though the product is slower than the 256MB modification, I think that in the majority of cases this is due to the different memory frequencies. Unfortunately, we don't have an X800 256MB at 400/700 MHz for a direct comparison. But that's not crucial: the lag is not catastrophic, considering this card's low price relative to the 256MB modification. That's why this card should be successful.

And don't forget that the same X800 name may denote accelerators that differ not only in memory frequency, but also in bus width.

You can find more detailed comparisons of various video cards in our 3Digest.

Theoretical materials and reviews of video cards concerning the functional properties of the ATI RADEON X800 (R420)/X850 (R480)/X700 (RV410) and NVIDIA GeForce 6800 (NV40/45)/6600 (NV43) GPUs

We express our thanks to
for the provided video card

Andrey Vorobiev (anvakams@ixbt.com)

August 8, 2005



Copyright © Byrds Research & Publishing, Ltd., 1997–2011. All rights reserved.