Gigabyte GeForce FX 5950 Ultra-GT
Gigabyte GeForce FX 5700 Ultra 128MB
on NVIDIA GeForce FX 5950/5700
CONTENTS
- Video card features
- Testbed, test tools, 2D quality
- Test results: Performance Comparison
- Conclusion
Winter... PC enthusiasts are waiting for new announcements. Intel
will probably be the first to take the floor: its PCI Express technology
will reach the mainstream market in spring. But we are mostly interested
in what ATI and NVIDIA have to tell us. Will the new solutions open
the way to truly beautiful, almost realistic games? I would like to
believe it, but reality makes me doubt.
Unfortunately, GPUs priced at $500-600 matter little to game developers
(except for research departments, which welcome every new solution).
Developers are mostly focused on mainstream accelerators. Besides, the
market is still overfilled with GeForce4 MX and GeForce2 cards; moreover,
NVIDIA has hurt itself by releasing the GeForce MX 4000... Today the
company affirms that the GeForce FX 5200 is an excellent solution and
the way to bring DX9 into the low-end sector, and at the same time it
sends a DX7 card into that very market. Although it is a budget office
solution, its price is far from $20. NVIDIA already has the lame 64-bit
FX 5200 going down the price ladder, so why put an obstacle in its way?!
Meanwhile, the company urges publishers to look forward instead of
looking back at DX7 cards. I understand that NVIDIA's marketers will
reproach me for not understanding their positioning policy, but I am
a user and I do not care about it at all. I see the facts.
While ATI is bombing the low-end sector with its RADEON 9200SE cards,
claiming they are much stronger than the 64-bit FX 5200 in DX9, and the
FX 5200 needs help in this war, NVIDIA releases a DX7 card, providing
ATI with new trumps in the marketing battle (it's clear that the RADEON
9200SE will win, at least due to its DX8.1 support).
We are still hoping to see games where all the beauty of Nature and
other benchmarks appears in real 3D shooters or RPGs. Instead, publishers
make game developers remove the gorgeous scenes that require powerful
state-of-the-art accelerators... Some of them even want their games
to run on Riva TNT2 cards.
Let's get back on track. In autumn 2003, when NVIDIA's line gained the
NV36/NV38 GPUs, its products were positioned the following way:
- GeForce FX 5950 Ultra - $499
- GeForce FX 5900 Ultra - $399
- GeForce FX 5900 - $349
- GeForce FX 5900XT - $299
- GeForce FX 5700 Ultra - $199
- GeForce FX 5700 - $149
We are mostly interested in two positions: the GeForce FX 5900XT at $299
and the GeForce FX 5700 Ultra at $199.
However, the GeForce FX 5900XT is now available at $230 and below (!),
while the FX 5700 Ultra has not even dropped under the $200 mark yet.
In effect, the prices are equal. Does anybody need an accelerator with
4 texture units instead of 8, half as many ALUs determining shader speed,
a 128-bit memory bus instead of a 256-bit one, and hot DDR-II instead
of moderately warm 2.8ns DDR? Yes, I mean the FX 5700 Ultra versus the
FX 5900XT. Despite the Ultra suffix, this card is the loser.
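As a rough illustration of what the narrower bus costs (a back-of-the-envelope
estimate of ours, not a vendor figure): peak memory bandwidth is the bus width
in bytes multiplied by the effective data rate. Assuming the FX 5900XT's 2.8ns
DDR runs at the roughly 700 MHz effective rate it is rated for, its 256-bit bus
gives 32 bytes x 700 MHz, or about 22.4 GB/s, while the FX 5700 Ultra's 128-bit
bus with 900 MHz effective DDR-II gives 16 bytes x 900 MHz, or about 14.4 GB/s.
Even with faster memory chips, the junior card ends up with roughly two thirds
of the senior card's bandwidth.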
Why did they let the FX 5900XT grab its junior sibling, the top card of
NVIDIA's mid-range sector, by the throat? I do not believe that a corporation
like NVIDIA has no sane marketers to manage its pricing policy.
Probably, the reason is that they have to sell off the remaining FX 5900
chips (new products are coming onto the scene), and since the RADEON 9800
is pressing down on the FX 5900, they have no choice but to cut prices.
The GeForce FX 5700 Ultra cards are moving too slowly on the market,
and their prices hardly fall... The company is probably waiting for the
5900XT to run out: it has no other solution in this sector except the
FX 5700 Ultra, and it will hardly cut prices for the higher-end FX 5900.
Nevertheless, the market will soon need all these cards. The 5900XT will
be sold out, and the prices will be trimmed.
By the way, the list below contains the reviews of the accelerators
mentioned above.
Theoretical materials and reviews of video cards concerning the
functional properties of the NVIDIA GeForce FX GPUs
- Analysis of the architecture
of NVIDIA NV30 (GeForce FX)
- NVIDIA GeForce FX
5800 Ultra (NV30) - single-page review
- NVIDIA GeForce FX
5800 Ultra (NV30) - multi-page review
- NVIDIA GeForce FX 5600
Ultra (NV31) and GeForce FX 5200 Ultra (NV34) - single-page
review
- NVIDIA GeForce FX 5600
Ultra (NV31) and GeForce FX 5200 Ultra (NV34) - multi-page review
- ASUS V9900 Ultra on
NVIDIA GeForce FX 5800 Ultra - AA and anisotropy quality
- Gainward FX Powerpack
Ultra/1000 Golden Sample and Gainward FX Powerpack Pro/660 TV/DVI
on NVIDIA GeForce FX 5800 Ultra and 5200 - scaling (performance
vs. CPU clock speed) of GeForce FX 5800 Ultra, performance of GeForce
FX 5200
- Leadtek WinFast
A300 Ultra MyVIVO on NVIDIA GeForce FX 5800 Ultra - performance
of GeForce FX 5800 Ultra vs. CPU clock speed in heavy modes with
AA and anisotropy enabled
- MSI FX5800 Ultra-TD8X
on NVIDIA GeForce FX 5800 Ultra
- Albatron, Chaintech, Gainward,
InnoVision, Leadtek, Palit and Prolink video cards on NVIDIA GeForce
FX 5200
- ASUSTeK NVIDIA GeForce
FX 5200/5600 video cards
- MSI FX5600-VTDR128 (MS-8912)
card on NVIDIA GeForce FX 5600
- Albatron, Leadtek and MSI
video cards on the NVIDIA GeForce FX 5200 Ultra
- Prolink PixelView GeForce FX 5600 256MB
Golden Limited on the NVIDIA GeForce FX 5600
- Gainward FX PowerPack Ultra/760 XP Golden
Sample on the NVIDIA GeForce FX 5900 (new revision)
- AOpen and Soltek cards on NVIDIA GeForce
FX 5600
- Leadtek WinFast A310 Ultra MyVIVO on NVIDIA
GeForce FX 5600 Ultra (350MHz revision)
- ATI vs NVIDIA: where are fair
duels? or Dishonest Treatment of the 3DMark
- MSI FX5900-VTD128 on NVIDIA GeForce FX
5900 - More on 3DMark 2003 (fruits of collaboration of NVIDIA
and FutureMark after signing the peaceful agreement)
- Gainward FX Powerpack Ultra/1200 Golden
Sample on NVIDIA GeForce FX 5900 - performance in NEW game tests,
analysis of performance of the FX 5900 without cheats
- Albatron Gigi GeForce FX 5900PV 128MB
on NVIDIA GeForce FX 5900 Ultra
- Inno3D Tornado GeForce FX5900 128MB on
NVIDIA GeForce FX 5900
- ASUS V9950 Ultra on NVIDIA GeForce FX
5900 Ultra - extreme overclocking
- AOpen Aeolus GeForce FX 5600S 256MB on
NVIDIA GeForce FX 5600
- ABIT, ASUSTeK and Chaintech cards on the
NVIDIA GeForce FX 5600 Ultra
- Gainward Powerpack FX Ultra/1600 GS CoolFX on the NVIDIA GeForce
FX 5900 Ultra - Drivers 51.75 beta
- Prolink PixelView GeForce FX 5900 128MB
on NVIDIA GeForce FX 5900 - Tomb Raider: Angel of Darkness Benchmark
- NVIDIA GeForce FX 5900SE/XT from Albatron
and Leadtek
- NVIDIA GeForce FX 5950 Ultra (NV38) and
GeForce FX 5700 Ultra (NV36)
- Gigabyte GeForce FX 5950 Ultra 256MB - drivers 52.70 vs 52.16
- Albatron Gigi GeForce FX 5700 Ultra 128MB
on NVIDIA GeForce FX 5700 Ultra
- Leadtek WinFast A380 Ultra MyVIVO 256MB
and ASUS V9950SE on NVIDIA GeForce FX 5950Ultra/5900SE
- Chaintech GeForce FX 5600XT 128MB 64bit;
Chaintech GeForce FX 5600XT 256MB 128bit; Chaintech GeForce FX 5700
Ultra 128MB; Chaintech GeForce FX 5900 128MB; Chaintech GeForce
FX 5950 Ultra 256MB
- Gainward CoolFX Ultra/1800 XP GS 256MB
on NVIDIA GeForce FX 5950Ultra
- MSI FX5950 Ultra-VTD256 & MSI FX5700
Ultra-TD128 on NVIDIA GeForce FX 5950Ultra/5700Ultra -
3DMark03 v.340 on drivers 52.16 & 53.0
- NVIDIA GeForce FX 5700 Ultra cards from
AOpen, InnoVision and Sparkle
- ASUS V9570 TD 256MB and ASUS V9560XT/TD
128MB (64bit) on NVIDIA GeForce FX 5700/5600XT
So, today we are going to test the FX 5700 Ultra, or rather Gigabyte's
version.
The FX 5950 Ultra-GT looks more interesting. GT means that the GPU clock
is increased: this is actually the fastest FX 5950 based card except
for the water-cooled solution from Gainward. The core clock is raised
from 475 to 520 MHz. The manufacturer selects chips that run steadily
at such speeds; obviously there are few of them, so the cards will cost
more. Anyway, we testers are always glad to see a unique solution in
this ocean of identical cards.
Cards
Gigabyte GeForce FX 5700 Ultra 128MB
AGP x8/x4/x2, 128 MB of DDR-II SDRAM in 8 chips on both sides of the PCB.
Samsung (GDDR2) 2.2ns memory chips, which corresponds to 450 (900) MHz;
the memory runs at exactly these clock speeds. The GPU is clocked at
475 MHz. 128-bit memory bus.
Gigabyte GeForce FX 5950 Ultra-GT
AGP x8/x4/x2, 256 MB of DDR SDRAM in 16 chips on both sides of the PCB.
Hynix 2ns memory chips (corresponds to 500 (1000) MHz); the memory is
clocked at 475 (950) MHz, the GPU at 520 MHz (!). 256-bit memory bus.
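A quick note on how those nanosecond ratings translate into clock speeds
(our own arithmetic, not Gigabyte's documentation): the rating is the chip's
cycle time, so its nominal clock is roughly 1 divided by that time. For the
2.2ns Samsung chips that is about 455 MHz, or roughly 910 MHz effective in
DDR mode, hence the 450 (900) MHz rating that the 5700 Ultra uses in full.
The 2ns Hynix chips are nominally good for 500 (1000) MHz, whereas the
5950 Ultra-GT runs them at 475 (950) MHz, which leaves a small safety margin.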
Comparison with the reference design, front view: Gigabyte GeForce FX 5950
Ultra-GT vs. the reference NVIDIA GeForce FX 5950 Ultra, and Gigabyte
GeForce FX 5700 Ultra vs. the reference NVIDIA GeForce FX 5700 Ultra.
Comparison with the reference design, back view: the same pairs of cards.
Both products are based on the reference design and painted sky-blue,
the color traditional for Gigabyte. The cooler on the FX 5950 is the
reference one as well. Remember that the default clock speeds of the
Gigabyte GeForce FX 5950 Ultra-GT are 475/950 MHz. To get 520 MHz you
need a patch for NVIDIA's drivers plus the VTuner utility (as with
Gainward's Golden Sample line). The patch can be downloaded from
Gigabyte's website.
Note: a Gigabyte BIOS with 520/950 MHz clock speeds will soon be
available for download, after which no patches will be needed.
The FX 5700 Ultra has no external TV codec, so its TV-out relies on
the GPU. The FX 5950 Ultra-GT does have such a codec, which handles VIVO.
The box contents.
Gigabyte GeForce FX 5950 Ultra-GT: a software suite, a user manual,
a VIVO adapter/splitter, a DVI-to-D-Sub adapter and TV extension cords.
Gigabyte GeForce FX 5700 Ultra 128MB: a software suite, a user manual,
a VIVO adapter/splitter, S-Video-to-RCA and DVI-to-D-Sub adapters,
and TV extension cords.
Here are the boxes:
Gigabyte GeForce FX 5950 Ultra-GT. Gigabyte completely changed its box
style after returning to NVIDIA's camp: the light yellow boxes now picture
a virtual lady who beckons toward the 3D peaks, and the box itself is
glossy, almost mirror-like. For the GT series the company has developed
a completely new package with the name of this very card printed and
even embossed on it.
Gigabyte GeForce FX 5700 Ultra 128MB. The package is of a similar style
but ivory-colored.
Testbed and drivers
Testbed:
- Pentium 4 3200 MHz based computer:
- Intel Pentium 4 3200 MHz CPU;
- DFI LANParty Pro875 (i875P) mainboard;
- 1024 MB DDR SDRAM;
- Seagate Barracuda IV 40GB HDD;
- Windows XP SP1; DirectX 9.0b;
- ViewSonic P810 (21") and ViewSonic
P817 (21") monitors.
- NVIDIA driver 53.03.
- Athlon 64 3400+ based PC:
- AMD Athlon 64 3400+ (2200 MHz = 220 MHz*10);
- MSI K8T (VIA KT8);
- 1024 MB DDR400 SDRAM;
- Seagate Barracuda 7200.7 SATA 80GB;
- Windows XP SP1; DirectX 9.0b;
- ViewSonic P810 (21") and ViewSonic P817 (21").
- NVIDIA driver 53.03.
VSync off, S3TC off in applications.
Test results
Before we start examining 2D quality, I should note that there is no
complete technique for objective 2D quality estimation, because:
- For almost all modern 3D accelerators, 2D quality strongly depends
on the particular card sample;
- Besides the video card, 2D quality depends on the monitor and the cable;
- Moreover, certain monitors may not work properly with certain
video cards.
With the ViewSonic P817 monitor and a BNC Bargo cable, the cards showed
excellent quality at the following resolutions and refresh rates:
Gigabyte GeForce FX 5950 Ultra-GT: 1600x1200x85Hz, 1280x1024x100Hz, 1024x768x100Hz
Gigabyte GeForce FX 5700 Ultra 128MB: 1600x1200x85Hz, 1280x1024x100Hz, 1024x768x100Hz
Test results: performance
Test applications:
- Unreal 2: The Awakening (Infogrames), DirectX 8.1, multitexturing,
tested with Bench'emAll! 2.5beta.
- RightMark 3D
(one of the game scenes) - DirectX 8.1, Dot3, cube texturing, shadow
buffers, vertex and pixel shaders (1.1, 1.4).
- Test settings: pixel shaders 1.1, shadow buffers OFF.
- Half-Life 2 (Valve/Sierra) - DirectX 9.0, two different demos
(ixbt07 and coast). Tested with anisotropic filtering enabled.
- Note! Since this is the leaked beta version, the test results
are only of conditional interest.
- Tom Clancy's Splinter Cell v.1.2b (UbiSoft) - Direct3D, Vertex/Pixel
Shaders 1.1/2.0, Hardware T&L, Very High quality; demo 1_1_2_Tbilisi.
AA doesn't work in this game.
- Call of Duty (MultiPlayer) (Infinity Ward/Activision) - OpenGL,
multitexturing, ixbt1203demo, test settings - maximum, S3TC ON
- Tomb Raider: Angel of Darkness v.49 (Core Design/Eidos Interactive)
- DirectX 9.0, Paris5_4 demo, test settings are shown here.
If you need the demo benchmarks please email me.
Performance
- 1. Call of Duty
- 2. Tom Clancy's Splinter Cell
- 3. Aquamark 3
- 4. Unreal 2: The Awakening
- 5. Tomb Raider: The Angel of Darkness
- 6. RightMark 3D
- 7. Half Life 2 beta
Conclusion
- The Gigabyte GeForce FX 5950 Ultra-GT is the most powerful card among
NVIDIA GPU based solutions, except for the expensive Gainward CoolFX.
The increased core clock lets this card outscore its competitors,
though not in modern shader-heavy games. At a moderate price the card
can win a good market share, but it won't be easy, since really
complicated shader games haven't arrived yet. The drawback is that
you have to download a separate patch to get 520 MHz (however, the
updated BIOS will be made available soon).
- The Gigabyte GeForce FX 5700 Ultra 128MB goes on a par with the RADEON
9600 XT. Sometimes it wins, but not in shader applications. Today such
cards suffer a lot from the FX 5900XT, and it is difficult to draw a
final conclusion: the price will be the determining factor. If it is
$50 less than that of the FX 5900XT, the card will have every chance
to succeed.
In our 3Digest you can find
full comparison characteristics for video cards of this and other
classes.