Leadtek GeForce 9500GT, 9800 GTX+, GTX 260/280
Three reference cards and two original products.
December 8, 2008
- Intel Core2 Extreme QX9650 (3000 MHz) CPU
- Zotac 790i Ultra motherboard on NVIDIA nForce 790i Ultra
- 2GB DDR3 SDRAM Corsair 2000MHz (CAS (tCL)=5, RAS to CAS delay (tRCD)=5, Row Precharge (tRP)=5, tRAS=15)
- WD Caviar SE WD1600JD 160GB SATA hard drive
- Tagan TG900-BZ 900W PSU
- Windows Vista 32bit SP1, DirectX 10.1
- Dell 3007WFP 30-inch monitor
- ATI CATALYST 8.9; NVIDIA Forceware 177.92
- VSync disabled
- Call Of Juarez (Techland/Ubisoft) -- DirectX 9.0, Shaders 3.0 (HDR), maximum quality settings; demo, batch file included.
- S.T.A.L.K.E.R. 1.003 (GSC Game World/THQ) -- DirectX 9.0, maximum quality settings (dynamic lighting enabled); demo, copy files to the savegames folder, run the game, load level 'ixbt3', and type "demo_play ixbt3" in the console.
- 3DMark Vantage 1.00 (FutureMark) -- DirectX 10.0, Shaders 4.0, multitexturing, 'Extreme' settings.
- CRYSIS 1.2 (Crytek/EA) -- DirectX 10.0, Shaders 4.0, 'Very High' settings, levels 'Rescue' and 'Harbor'; batch file, e-mail us to obtain the timedemo. We express gratitude to CRYTEK for creating a timedemo for iXBT.com / Digit-Life.
- Company Of Heroes Opposing Fronts (Relic Entertainment/THQ) -- DirectX 10.0, Shaders 4.0, maximum quality settings; batch file, run the game, invoke graphics settings and click the test button.
- World In Conflict 1.007 (Massive Entertainment/Sierra) -- DirectX 10.0, Shaders 4.0, 'Very High' settings with adjusted AA and AF; run the game, invoke graphics settings and click the test button.
- Devil May Cry 4 (CAPCOM) -- DirectX 10.0, 'Super High' settings with adjusted AA and AF; Scene 1 and Scene 4.
Call Of Juarez
World In Conflict
CRYSIS DirectX 10.0 (Very High), RESCUE
CRYSIS DirectX 10.0 (Very High), HARBOR
Company Of Heroes Opposing Fronts
3DMark Vantage Graphics MARKS
Devil May Cry 4 (Scene 1)
Devil May Cry 4 (Scene 4)
Leadtek WinFast PX9500GT 512MB GDDR2 is a strange card. As I have already mentioned, its pared-down memory bandwidth reduces 3D performance by 7-10%. If you want to buy such a card, bear this in mind. However, the lion's share of consumers in this segment do not aspire to high 3D image quality; they have other objectives in mind for such cards. So these cards may be quite popular, provided the price is reasonable. The famous quality of Leadtek products is still up to the mark here (only the fan's whirring spoils the impression a little).
Leadtek WinFast PX9500GT 512MB GDDR3 is a standard 9500GT card with all its pros and cons; we have already examined it in this review. Note the unusual design (a larger PCB), but that shouldn't bother users. Again, a reasonable price is more important. As in the previous case, this card is of very high quality.
Leadtek WinFast GTX280 1024MB is a reference GTX 280, examined earlier in this review. The bundle includes a bonus PC game.
Leadtek WinFast GTX260 (192sp) 896MB is also a reference card (examined here). And don't forget about GTX 260 cards with an improved core configuration (more stream processors, etc.). If I'm not mistaken, they come with the same price tag, so the GTX 260 with 192 processors should be leaving the market, although there are still a lot of them in stock.
Leadtek WinFast PX9800GTX+ 512MB is another reference card. Its increased clock rates (relative to the 9800 GTX) should help it compete with the RADEON HD 4850. However, the card manages to do so in performance only, not in price. Given that the 9800 GTX+ costs the same as the 4850, the GTX+ modification is a better choice. I mean 512 MB graphics cards, of course.
A few words about our benchmarks.
In our updated FRAPS review we illustrated how crude and inaccurate tests with this utility are. Yet testers have no other tools except the benchmarks built into games.
This article and its first part explain that it's sometimes possible to test games with integrated, identically looped demos. Even this implies many potential errors, since measurement accuracy depends on the tester: whether FRAPS is started and stopped in time or too late/early.
But I have run across situations when the demo load changes abruptly at the very beginning or end. A half-second delay in starting or stopping the utility then changes the average FPS by 15-20%. That's no longer a measurement error; such a test is a total waste of time. One time you start the test late, another time too early (not intentionally, of course), and you end up with completely different performance results.
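The effect described above is easy to reproduce on paper. The sketch below is a hypothetical illustration (not iXBT's actual tool, and the numbers are invented): a simulated demo whose first half-second renders a trivial scene at 300 FPS while the rest of the run averages 40 FPS. Starting a FRAPS-style counter just 0.5 seconds late makes that spike fall outside the measured window and visibly shifts the reported average:

```python
# Hypothetical sketch: how a short load spike at the start of a demo
# skews the average FPS depending on when the counter is started.

def average_fps(frame_times, start, stop):
    """Average FPS over frames whose start timestamps fall in [start, stop)."""
    t, counted, elapsed = 0.0, 0, 0.0
    for ft in frame_times:
        if start <= t < stop:
            counted += 1
            elapsed += ft
        t += ft
    return counted / elapsed

# Invented workload: 0.5 s of 300 FPS "loading" frames,
# then 29.5 s of steady 40 FPS gameplay.
frames = [1 / 300] * 150 + [1 / 40] * 1180

on_time = average_fps(frames, 0.0, 30.0)  # counter started on time (spike included)
late    = average_fps(frames, 0.5, 30.0)  # counter started 0.5 s late (spike missed)

print(f"on time: {on_time:.1f} FPS, late: {late:.1f} FPS")
```

Even in this mild made-up case the two runs differ by roughly 10%; with a sharper spike the gap grows further, which is why such a result reflects the tester's reaction time rather than the graphics card.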
But even that's not the most important thing. The fact is, there are almost no games with built-in demos anymore. So testers are forced to use a method that we deem totally unacceptable: they measure gaming performance by walking a straight line from a starting point in a scene to a selected destination (the nearest fence, tree, etc.).
We all understand that it's impossible to follow precisely the same route to the finish spot with different cards and in different resolutions. Besides, such games always introduce random elements, and objects may be placed slightly differently in the same scene.
Unfortunately, websites that publish a huge number of tests do not always reveal their test methods for each game (except for those with built-in benchmarks).
So we believe it's better to offer a limited number of game tests, each of which is crystal clear, accurate, and shows the actual differences between graphics cards.
We express our gratitude to Leadtek for the provided graphics cards.
PSU provided by TAGAN, monitor provided by NVIDIA.