This GPU is represented by the Leadtek WinFast A400 (NVIDIA GeForce 6800) card.
As of the end of the month the product has an MSRP of $299 (the real street price is about $350 given its newness) and supports DirectX 9.0c.
This $300-350 card supports all current technologies, including Shader Model 3.0. It has an AGP 8x/4x interface and 128MB of DDR SDRAM in 8 Hynix chips; the 2.8ns access time corresponds to 350 (700) MHz. The memory bus is 256 bits wide, and there are 12 pixel pipelines (3x4).
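The relationship between the chips' access time and the rated memory clock can be checked with a quick calculation. A minimal sketch (the 350 MHz rating and DDR doubling are from the card's specs above; the rounding-down convention is the usual vendor practice, not stated in the article):

```python
# Rough check of the memory clock implied by the chip access time.
# A 2.8 ns cycle time corresponds to a maximum clock of 1 / 2.8 ns,
# which vendors typically round down to a rated 350 MHz (700 MHz DDR effective).
access_time_ns = 2.8
max_clock_mhz = 1000 / access_time_ns          # about 357 MHz
rated_clock_mhz = 350                          # actual rating on this card
ddr_effective_mhz = rated_clock_mhz * 2        # DDR transfers data twice per cycle

print(f"max from timings: {max_clock_mhz:.0f} MHz")
print(f"rated: {rated_clock_mhz} MHz ({ddr_effective_mhz} MHz DDR effective)")
```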
Though I had my doubts, I flashed the GeForce 6800 with the BIOS of a GeForce 6800 Ultra, since, unlike ATI, NVIDIA doesn't fix memory timings or even memory capacity in the BIOS.
The 128MB of the GeForce 6800-turned-Ultra was detected correctly, but the 700 MHz memory clock was too high and produced garbage on screen, through which I could still see that Windows identified the card as an Ultra. After I restored the original BIOS, the clock rates reverted.
But in some tests I saw broken-line artifacts: FarCry, TRAoD, etc. were fine, while Splinter Cell and RtCW glitched and crashed.
The reason became clear when I looked at the results of the tests that did work: performance had increased compared to the stock GeForce 6800! The fillrate test (3DMark03) showed that 16 pipelines were active. In other words, restoring the 6800 BIOS did not bring the card back to 12 pipelines. Perhaps the 6800 Ultra BIOS reprogrammed some registers, for example, and changed the number of active pipelines.
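The pipeline count can indeed be inferred from a fillrate measurement, since theoretical fillrate is roughly pipelines × core clock. A minimal sketch of that inference (the 325 MHz core clock is the stock GeForce 6800 value; the fillrate figures are illustrative, not taken from the article):

```python
# Inferring the number of active pixel pipelines from a measured fillrate,
# as a 3DMark03 fillrate result allows: fillrate ≈ pipelines * core clock.
# The 325 MHz core clock is the stock GeForce 6800 value (an assumption here).
CORE_CLOCK_MHZ = 325

def pipelines_from_fillrate(fillrate_mpix_s, clock_mhz=CORE_CLOCK_MHZ):
    """Estimate the number of active pixel pipelines from a fillrate in Mpixels/s."""
    return round(fillrate_mpix_s / clock_mhz)

# Illustrative measurements: 12 pipelines would give ~3900 Mpix/s at this clock,
# while all 16 NV40 pipelines would give ~5200 Mpix/s.
print(pipelines_from_fillrate(3900))  # 12 (stock 6800)
print(pipelines_from_fillrate(5200))  # 16 (all NV40 pipelines active)
```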
So the question remains open. Perhaps our Aleksey Nikolaychuk will be able to fix this by forcing GPU register values.
As for why FarCry didn't crash while older tests failed: the explanation is simple. Not all of the 4 re-enabled pipelines were fully defective, only some of their units, such as texture caches, were. Perhaps these were units that FarCry didn't use.
You can read more about the GeForce 6 series in our reviews: theoretical and analytical graphics card reviews containing functional analysis of NVIDIA GPUs.
As of June 10, 2004, the latest NVIDIA drivers were 61.34 for Windows XP.
Andrey Vorobiev (Anvakams@ixbt.com)