iXBT Labs - Computer Hardware in Detail






Gigabyte GeForce FX 5950 Ultra-GT
Gigabyte GeForce FX 5700 Ultra 128MB
on NVIDIA GeForce FX 5950/5700


  1. Video card features
  2. Testbed, test tools, 2D quality
  3. Test results: Performance Comparison
  4. Conclusion

Winter... PC enthusiasts are waiting for new announcements. Intel must be the first to take the floor: its advanced PCI Express technology will reach the mainstream market in spring, but we are mostly interested in what ATI and NVIDIA are going to tell us. Will the new solutions open the way to truly beautiful, almost realistic games? I'd like to believe it, but reality makes me doubt it.

Unfortunately, GPUs priced at $500-600 are not important for game developers (except for research departments, which welcome every new solution). They are mostly focused on mainstream accelerators. Besides, the market is still flooded with GeForce4 MX and GeForce2 cards; moreover, NVIDIA punished itself by releasing the GeForce MX 4000... Today the company claims that the GeForce FX 5200 is an excellent solution and the way to bring DX9 into the low-end sector, and simultaneously it ships a DX7 card into that same market. Although this is a budget office solution, its price is far from $20. NVIDIA already has lame 64-bit FX 5200 cards going down the price ladder, so why put an obstacle in their way? At the same time the company urges publishers to look forward instead of looking back at DX7 cards. I understand that NVIDIA's marketers will reproach me for not understanding their positioning policy, but as a user I do not care about it at all. I see the facts. While ATI is bombarding the low-end sector with its RADEON 9200SE cards, crying that they are much stronger than the 64-bit FX 5200 in DX9 and that the FX 5200 needs help in this war, NVIDIA releases a DX7 card and hands ATI new trumps in the marketing battle (it's clear that the RADEON 9200SE will win, at least thanks to its DX8.1 support).

We are still hoping to see games where all the beauty of Nature and other benchmarks appears in real 3D shooters or RPGs. Publishers make game developers remove all those beautiful scenes that require powerful state-of-the-art accelerators... Some even want their games to run on Riva TNT2 cards.

Let's get back on track. In autumn 2003, when NVIDIA's line gained the NV36/38 GPUs, its products were positioned the following way:

  • GeForce FX 5950 Ultra - $499
  • GeForce FX 5900 Ultra - $399
  • GeForce FX 5900 - $349
  • GeForce FX 5900XT - $299
  • GeForce FX 5700 Ultra - $199
  • GeForce FX 5700 - $149

We are mostly interested in two positions: the GeForce FX 5900XT at $299 and the GeForce FX 5700 Ultra at $199.

However, the GeForce FX 5900XT is now available at $230 and lower (!), while the FX 5700 Ultra hasn't even dropped below the $200 mark yet. In effect, the prices are almost equal. Does anybody need an accelerator with 4 texture units instead of 8, half as many ALUs (which determine shader speed), a 128-bit memory bus instead of a 256-bit one, and hot DDR-II instead of moderately warm 2.8ns DDR? Yes, I mean the FX 5700 Ultra vs. the FX 5900XT. In spite of the Ultra suffix, this card is a loser.
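The bandwidth side of that 128-bit vs. 256-bit comparison is easy to quantify. A minimal sketch, assuming the commonly quoted effective memory clocks (900 MHz for the FX 5700 Ultra's DDR-II, 700 MHz for the FX 5900XT's 2.8ns DDR):

```python
def mem_bandwidth_gbs(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return bus_width_bits / 8 * effective_clock_mhz / 1000

# GeForce FX 5700 Ultra: 128-bit bus, DDR-II at 450 (900 effective) MHz
print(mem_bandwidth_gbs(128, 900))   # 14.4 GB/s
# GeForce FX 5900XT: 256-bit bus, 2.8ns DDR at 350 (700 effective) MHz
print(mem_bandwidth_gbs(256, 700))   # 22.4 GB/s
```

So even with its slower DDR, the 5900XT's wider bus gives it roughly 50% more raw memory bandwidth.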

Why did they let the FX 5900XT strangle its junior sibling, which tops NVIDIA's middle-end sector? I do not believe that a corporation like NVIDIA has no sane marketers to manage its pricing policy.

Probably the reason is that they have to sell off the remaining FX 5900 chips (as new products are coming onto the scene), and considering that the RADEON 9800 is pressing down on the FX 5900, they have no choice but to cut prices.

GeForce FX 5700 Ultra cards are moving too slowly on the market, and their prices are hardly falling... The company is probably waiting for the 5900XT to run out: NVIDIA has no other solutions in this sector except the FX 5700 Ultra, and it will hardly cut prices for the upper FX 5900.

Nevertheless, the market will soon need all these cards. Once the 5900XT is sold out, prices will be trimmed.

By the way, the list below contains reviews of the accelerators mentioned above.

Theoretical materials and reviews of video cards concerning the functional properties of NVIDIA GeForce FX GPUs

So, today we are going to test the FX 5700 Ultra, or rather Gigabyte's version of it.

The FX 5950 Ultra-GT looks more interesting. GT means that the GPU clock is increased. This is actually the fastest FX 5950-based card except for Gainward's water-cooled solution. The core clock is raised from 475 to 520 MHz. The manufacturer selects processors that run steadily at such clock speeds; clearly there are few of them, and their prices will be higher. Anyway, we testers are very glad to see a unique solution in this ocean.



Gigabyte GeForce FX 5950 Ultra-GT

Gigabyte GeForce FX 5700 Ultra 128MB

Gigabyte GeForce FX 5700 Ultra 128MB
AGP x8/x4/x2, 128 MB DDR-II SDRAM in 8 chips on both PCB sides.

Samsung (GDDR2) 2.2ns memory chips, which corresponds to 450 (900) MHz; the memory runs at exactly these clock speeds. The GPU is clocked at 475 MHz. 128-bit memory bus.

Gigabyte GeForce FX 5950 Ultra-GT
AGP x8/x4/x2, 256 MB DDR SDRAM in 16 chips on both PCB sides.

Hynix 2ns memory chips (corresponds to 500 (1000) MHz); the memory is clocked at 475 (950) MHz, the GPU at 520 MHz (!). 256-bit memory bus.
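The clock ratings quoted for both cards follow directly from the chips' access time: a chip rated at N ns can nominally run at 1000/N MHz. A quick sketch of that rule of thumb:

```python
def rated_clock_mhz(access_time_ns: float) -> float:
    """Nominal clock a memory chip is rated for: one cycle per access time."""
    return 1000.0 / access_time_ns

# Samsung 2.2ns GDDR2 (FX 5700 Ultra): rated for ~455 MHz, run at 450 (900 DDR) MHz
print(round(rated_clock_mhz(2.2)))  # 455
# Hynix 2ns DDR (FX 5950 Ultra-GT): rated for 500 MHz, run below spec at 475 (950 DDR) MHz
print(rated_clock_mhz(2.0))         # 500.0
```

Note that the 5950 Ultra-GT's memory runs 25 MHz below its rating, leaving a little headroom, while the 5700 Ultra's runs right at spec.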

Comparison with the reference design, front view
Gigabyte GeForce FX 5950 Ultra-GT Reference card NVIDIA GeForce FX 5950 Ultra

Gigabyte GeForce FX 5700 Ultra Reference card NVIDIA GeForce FX 5700 Ultra

Comparison with the reference design, back view
Gigabyte GeForce FX 5950 Ultra-GT Reference card NVIDIA GeForce FX 5950 Ultra

Gigabyte GeForce FX 5700 Ultra Reference card NVIDIA GeForce FX 5700 Ultra

Both products are based on the reference design and painted sky-blue, the color traditional for Gigabyte. The cooler on the FX 5950 is the reference one as well. Remember that the default clock speeds of the Gigabyte GeForce FX 5950 Ultra-GT are 475/950 MHz. To get 520 MHz you need a patch for NVIDIA's drivers and VTuner (as with Gainward's Golden Sample line). The patch can be downloaded from Gigabyte's site.

Note: a Gigabyte BIOS with clock speeds of 520/950 MHz will soon be available for download, and then no patches will be needed.

Gigabyte GeForce FX 5950 Ultra-GT
The turbine-type cooler draws air from outside, drives it through the plastic pipe over the aluminum heatsink on the chip, and pushes it out. Since the air doesn't change direction, the cooler doesn't make much noise, though it gets louder in 3D mode when the turbine spins faster.

The cooler is bulky, which is why the card blocks the first PCI slot. The plastic air pipe features Gigabyte GT and NVIDIA logos. The memory chips are cooled with traditional aluminum heatsinks; the front one looks rather coarse.

Gigabyte GeForce FX 5700 Ultra
For this card Gigabyte ordered a cooler from a third-party company. By the way, it's very similar to Chaintech's. Although it's very big, it can't cope with the processor running at 500 MHz.

At the same time, it has a beautiful impeller that shines under UV light.

The FX 5700 Ultra has no external TV codec, so its TV-out is handled by the GPU. The FX 5950 Ultra-GT does have such a codec, which provides VIVO.

The box contents.

Gigabyte GeForce FX 5950 Ultra-GT 
Here you can find the software suite shown on the right, a user manual, a VIVO adapter/splitter, a DVI-to-D-Sub adapter, and TV extension cords.

Gigabyte GeForce FX 5700 Ultra 128MB
Here you can also find the software suite shown on the right, a user manual, a VIVO adapter/splitter, S-Video-to-RCA and DVI-to-D-Sub adapters, and TV extension cords.


Here are the boxes:

Gigabyte GeForce FX 5950 Ultra-GT 
Gigabyte completely changed its box style when it returned to NVIDIA's camp. Now the light-yellow boxes picture a virtual lady who calls on you to conquer the 3D peaks. Besides, this box is glossy and mirror-like. For the GT series the company developed a completely new package, with this particular card's name printed and even embossed on it.

Gigabyte GeForce FX 5700 Ultra 128MB
The package is similar in style but ivory-colored.


Testbed and drivers


  • Pentium 4 3200 MHz based computer:
    • Intel Pentium 4 3200 MHz CPU;
    • DFI LANParty Pro875 (i875P) mainboard; 
    • 1024 MB DDR SDRAM; 
    • Seagate Barracuda IV 40GB HDD; 
    • Windows XP SP1; DirectX 9.0b;
    • ViewSonic P810 (21") and ViewSonic P817 (21") monitors.
    • NVIDIA driver 53.03.

  • Athlon 64 3400+ based PC:
    • AMD Athlon 64 3400+ (2200 MHz = 220 MHz*10);
    • MSI K8T (VIA KT8); 
    • 1024 MB DDR400 SDRAM; 
    • Seagate Barracuda 7200.7 SATA 80GB; 
    • Windows XP SP1; DirectX 9.0b;
    • ViewSonic P810 (21") and ViewSonic P817 (21").
    • NVIDIA driver 53.03.

VSync off, S3TC off in applications. 

Test results

Before we start examining 2D quality, I should say that there is no complete technique for objective 2D quality estimation because:

  1. 2D quality depends heavily on the particular sample for almost all modern 3D accelerators; 
  2. Besides the video card, 2D quality depends on the monitor and cable; 
  3. Moreover, certain monitors might not work properly with certain video cards. 

With the ViewSonic P817 monitor and a BNC Bargo cable, the cards showed excellent quality at the following resolutions and refresh rates:

  • Gigabyte GeForce FX 5950 Ultra-GT: 1600x1200x85Hz, 1280x1024x100Hz, 1024x768x100Hz
  • Gigabyte GeForce FX 5700 Ultra 128MB: 1600x1200x85Hz, 1280x1024x100Hz, 1024x768x100Hz
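Whether a card can drive such modes cleanly is partly a RAMDAC bandwidth question. A rough sketch of the required pixel clock, assuming a typical ~25% blanking overhead rather than the exact VESA timing:

```python
def approx_pixel_clock_mhz(width: int, height: int, refresh_hz: int,
                           blanking_overhead: float = 0.25) -> float:
    """Rough pixel clock: visible pixels per frame times refresh rate, plus blanking."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

print(approx_pixel_clock_mhz(1600, 1200, 85))   # 204.0 MHz, well within a 400 MHz RAMDAC
print(approx_pixel_clock_mhz(1280, 1024, 100))  # ~164 MHz
```

Even the most demanding mode in the table needs only about half of the 400 MHz these RAMDACs offer, so signal quality here is limited by the output filters and cabling, not by the DAC.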

Test results: performance

Test applications:

  • Unreal 2: The Awakening (Infogrames) - DirectX 8.1, multitexturing, tested with Bench'emAll! 2.5beta
  • RightMark 3D (one of the game scenes) - DirectX 8.1, Dot3, cube texturing, shadow buffers, vertex and pixel shaders (1.1, 1.4). Test settings: pixel shaders 1.1, shadow buffers OFF.
  • Half-Life 2 (Valve/Sierra) - DirectX 9.0, two different demos (ixbt07 and coast), tested with anisotropic filtering enabled. Note: since this is the leaked beta version, the test results are of conditional interest only.
  • Tom Clancy's Splinter Cell v.1.2b (UbiSoft) - Direct3D, vertex/pixel shaders 1.1/2.0, Hardware T&L, Very High quality; demo 1_1_2_Tbilisi. AA doesn't work in this game. 
  • Call of Duty (MultiPlayer) (Infinity Ward/Activision) - OpenGL, multitexturing, ixbt1203demo, maximum test settings, S3TC ON 
  • Tomb Raider: Angel of Darkness v.49 (Core Design/Eidos) - DirectX 9.0, Paris5_4 demo, test settings are shown here

If you need the demo benchmarks, please email me. 



Conclusions

  1. Gigabyte GeForce FX 5950 Ultra-GT is the most powerful card among NVIDIA GPU-based solutions, except for the expensive water-cooled Gainward CoolFX. The increased core clock lets this card outscore its competitors, though not in modern shader games. At a moderate price this card could win a good market share, but that won't be easy, as complicated shader games haven't arrived yet. The disadvantage is that you have to download a separate patch to reach 520 MHz (however, an updated BIOS will soon be available). 

  2. Gigabyte GeForce FX 5700 Ultra 128MB is on a par with the RADEON 9600 XT. Sometimes it wins, but not in shader applications. Today such cards suffer a lot from the FX 5900 XT, and it's difficult to draw any conclusion; the price will be the determining factor. If it's $50 less than that of the FX 5900XT, the card will have every chance to succeed.


In our 3Digest you can find full comparative characteristics for video cards of this and other classes. 

Andrey Vorobiev (anvakams@ixbt.com)


Copyright © Byrds Research & Publishing, Ltd., 1997–2011. All rights reserved.