The chip is represented here by the ABIT Siluro GF3 Ti500 64MB AGP video card.
It overclocks to 265/300 (600) MHz (see the summary diagrams).
The end of April 2001 brought a new NVIDIA product that differed fundamentally from all previous solutions in its memory optimizations. Memory bandwidth was known to be the most vulnerable part of all recent accelerators: new cards required ever faster and more expensive memory, driving prices up. The GeForce2 Ultra with 64MB of the most expensive 4ns memory marked the peak. The incredible prices and poor sales of those cards taught NVIDIA a lesson, and the company tried hard to account for its previous mistakes in the new product. Interestingly, GeForce3 used memory at the same clock rate (230 (460) MHz) as GeForce2 Ultra, while its core clock was even lower (200 MHz vs. 250 MHz). The result was an undisputed victory for GeForce3 in 32-bit color modes, with the bandwidth-starved GeForce2 Ultra only second. GeForce3 cards remained the industry leaders (in the 3D accelerator sector) until the fall of 2001, when the September Ti200/500 line put an end to the February firstling. Such a short life for a top product.
Now GeForce3 Ti200/500 is on the home stretch. The release of GeForce4 Ti, which is actually an improved GeForce3, passed sentence on the previous family. If GeForce4 MX struck at the GeForce2 line, then GeForce4 Ti was aimed at GeForce3 Ti200/500 - at Ti500 in particular - and successfully "killed" it. The fate of Ti200 is clear as well. Although it was a very profitable product for card manufacturers, for NVIDIA it was profitable only together with Ti500 (Ti200 was made from chips that failed Ti500 binning), so the end of Ti500 production meant the death of Ti200. Meanwhile, the performance of GeForce4 MX460, the top mainstream model, is on the level of Ti200, though it lacks some Ti200 capabilities, pixel shaders in particular. Moreover, ATI released RADEON 9000 at prices similar to the MX440-Ti200 range, making it hard for the latter to survive on the market. But Ti 4200 is already on its way to save the GeForce4 family...
You can read more about GeForce3 and its family in our reviews:
As of October 10, 2002, the latest NVIDIA drivers are 31.40, 40.52, 40.71, and 40.72 for Windows XP.
You can see there's almost no difference between the latest drivers. There is still criticism of the 27.20 and 27.30 versions (especially under Windows XP), but the 28.* versions (28.32 in particular) are much better. The newest 40.41 drivers add more D3D performance.
I recommend using the official 29.42 or 29.80 WHQL drivers as the most stable.
Note that the overlay settings tab is disabled in a number of 14.*, 21.88, and 22.* versions. It can be enabled with the RivaTuner utility.
You can get more information about problems with various NVIDIA drivers at the NVIDIA World website.
And now I want to draw your attention to another important 3D feature - filtering. As many of you know, all chipsets support bilinear filtering, which is an important instrument in MIP-mapping; some chipsets support trilinear filtering (true trilinear, not an approximation), and only a few support anisotropic filtering.
Since GeForce2 and GeForce3 do support anisotropic filtering, I recommend reading our article on this feature.
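To make the filtering terms above concrete, here is a minimal sketch of bilinear filtering: the sampled color is a weighted average of the four texels nearest the sample point. This is an illustration only - the function name and the tiny grayscale "texture" are my own, not anything from the hardware or its drivers.

```python
def bilinear_sample(texture, u, v):
    """Sample a 2D grayscale texture (list of rows of floats) at
    normalized coordinates u, v in [0, 1]."""
    h, w = len(texture), len(texture[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    # Interpolate horizontally along the two nearest rows,
    # then vertically between the results.
    top = texture[y0][x0] * (1 - fx) + texture[y0][x1] * fx
    bottom = texture[y1][x0] * (1 - fx) + texture[y1][x1] * fx
    return top * (1 - fy) + bottom * fy

tex = [[0.0, 1.0],
       [1.0, 0.0]]
print(bilinear_sample(tex, 0.5, 0.5))  # exact center: average of all four texels -> 0.5
```

Trilinear filtering does the same thing twice, on two adjacent MIP levels, and blends the two results; anisotropic filtering takes several such samples along the direction in which the texture is most compressed on screen.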
Note that GeForce3 supports new types of antialiasing based on multisampling (MSAA) technology, including the noteworthy Quincunx mode (read about it in our GeForce3 review). Here are two examples:
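The Quincunx mode owes its name to its 5-tap sample pattern: each final pixel blends its own sample (weight 1/2) with four diagonally neighboring samples (weight 1/8 each). Real Quincunx AA operates on multisample buffers inside the chip; the sketch below applies the same weighting as a post-filter over a plain image, purely to illustrate the pattern - an assumption for clarity, not how the hardware is implemented.

```python
def quincunx_filter(img):
    """Apply quincunx-style weighting to a grayscale image
    (list of rows of floats): 1/2 center + 1/8 per diagonal neighbor."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.5 * img[y][x]
            for dy, dx in ((-1, -1), (-1, 1), (1, -1), (1, 1)):
                # Clamp at the image edges instead of reading out of bounds.
                ny = min(max(y + dy, 0), h - 1)
                nx = min(max(x + dx, 0), w - 1)
                acc += 0.125 * img[ny][nx]
            out[y][x] = acc
    return out

flat = [[1.0] * 3 for _ in range(3)]
print(quincunx_filter(flat)[1][1])  # a flat area stays unchanged: 1.0
```

The same weighting explains both the strength and the weakness of Quincunx: it smooths edges almost like 4x AA at roughly the cost of 2x, but it also slightly blurs texture detail, since the filter cannot tell an aliased edge from legitimate high-frequency content.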
Interesting vertex and pixel shader techniques make it possible to achieve more realistic lighting effects and to create original solutions such as the semi-transparent objects in the right screenshot:
As far back as 1999 Matrox released a technology that was revolutionary for those years - bump mapping using environment maps, EMBM. Time passed, and other bump mapping techniques began to spread (though slowly), DOT3 most of all. Yet developers were slow to adopt these effective methods of realistically rendering complex 3D objects, whose surfaces are rarely smooth. The reason was the very small number of graphics accelerators capable of handling bump mapping of these types. There is also Emboss bump mapping, supported by almost all cards, but due to its low 3D realism and high resource requirements it has not become widespread. Only after the releases of NVIDIA GeForce and especially GeForce2 did some mediocre implementations of bump mapping, namely DOT3, begin to appear. EMBM is spreading as well, because it is now supported not only by the Matrox G400 but also by ATI RADEON and NVIDIA GeForce3. An example of EMBM can be seen in the water surface screenshot above. The game Giants is a good showcase of bump mapping:
DOT3 bump mapping is widely used in this game, and EMBM is used occasionally. I hope EMBM will see wider use than it has so far. Note that a new version of this game, using GeForce3 features, is included in the ELSA Gladiac 920 box. Read more about it in the ELSA Gladiac 920 review.
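The "DOT3" name describes the technique literally: per-pixel lighting intensity is the three-component dot product of a normal fetched from a normal map and the light direction, clamped to zero. The sketch below shows the arithmetic only; the function name and the sample normals are illustrative assumptions, not part of any real API.

```python
def dot3_intensity(normal, light_dir):
    """Diffuse intensity for DOT3 bump mapping: dot(N, L), clamped.
    Both vectors are assumed to be unit length."""
    d = sum(n * l for n, l in zip(normal, light_dir))
    return max(d, 0.0)  # clamp: texels facing away from the light stay dark

# A flat normal (pointing straight out of the surface) and one
# perturbed by the bump map (still unit length: 0.36 + 0.64 = 1).
flat_n = (0.0, 0.0, 1.0)
bumped_n = (0.6, 0.0, 0.8)
light = (0.0, 0.0, 1.0)  # light shining straight at the surface

print(dot3_intensity(flat_n, light))    # 1.0 - fully lit
print(dot3_intensity(bumped_n, light))  # 0.8 - dimmer where the bump tilts away
```

Because the perturbed normals vary per texel, a flat polygon lit this way shows the light and shadow of a rough surface, which is exactly the relief effect seen in Giants.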
The power and performance of GeForce3 are best demonstrated by the demo version of the upcoming X-Isle - Dinosaur Island game:
If that doesn't convince you (many consider X-Isle just a tech demo), here are screenshots from the upcoming hit Unreal2 (they were taken from the alpha version - may the developers forgive me for using stolen pictures, but the Web is full of this alpha material)!
We have no complaints about 3D quality in any of the games tested. Look at the shots taken on this card.
Also, 6.* drivers are incompatible with software for sound cards based on Yamaha processors, as well as with the SoftXG100 software synthesizer. The Start or Close Window buttons may be spoiled by "garbage" in 2D mode; a solution can be found in the iXBT Forum.