The Effect of Settings and Optimizations on Performance and Rendering Quality in ATI and NVIDIA Cards





Manufacturers of video chips started introducing various optimizations (to raise 3D performance) into their drivers long ago. Some time ago users were also granted official access to some of them. For example, starting from NVIDIA ForceWare 61.34, we can enable and disable optimizations of anisotropic and trilinear filtering in NVIDIA drivers. Drivers for video cards based on ATI GPUs also let users change some of their optimizations (for example, CATALYST A.I.). The X1000 series also introduced a high-quality anisotropic filtering method, which can be considered an optimization for quality.

Of course, the optimization options provided by ATI and NVIDIA differ considerably. And it's very difficult to evaluate and compare their effect on quality, as detecting flaws in 3D image quality and comparing texture filtering quality is much like looking for a black cat in a dark room. But it's still useful to test rendering quality on video cards based on GPUs from different manufacturers, especially considering possible changes in optimization algorithms. Though our 3Digest periodically publishes materials on rendering quality, in this article we'll analyze image quality in general. Special attention will be paid to texture filtering, as it has become the main target for various optimizations.

We'll also compare the performance of video cards from ATI and NVIDIA in three modes: "Performance" (all driver settings configured for maximum performance), "Default" (all driver settings left at the defaults offered by the GPU manufacturers, as usually used in tests), and "Quality" (driver settings, with some exceptions, configured for the highest 3D rendering quality). In this article we are going to analyze several popular games, which are often used in reviews of video cards from various manufacturers. We'll evaluate comparative performance in the three above-mentioned modes (the Default mode is taken as 100%, the other numbers show 3D performance relative to this mode). Besides, we'll publish two sets of screenshots for each game to evaluate rendering quality. We don't claim to have covered all possible driver configuration issues, but this should help you understand the effect of opposite video settings on GPUs from the two main manufacturers, and decide for yourself whether to enable driver optimizations that raise performance at the cost of quality, or vice versa.

Unfortunately, we cannot do a well-posed comparison of ATI and NVIDIA's optimizations, because it's impossible at this stage: drivers from these companies differ in the sets of parameters they expose. We speak only of those parameters that can be changed in the official control panel; other methods for raising performance or improving quality will not be covered in this article. So, here are the main NVIDIA parameters we are interested in: the general texture filtering setting ("Texture filtering") and several texture filtering optimizations ("Anisotropic mip filter optimization", "Anisotropic sample optimization", "Trilinear optimization"). There are also other NVIDIA settings: "Force mipmaps", "Negative LOD bias", and "Gamma correct antialiasing". ATI offers the following parameters of interest in terms of raising performance or quality: "High Quality AF", "CATALYST A.I.", and "Mipmap Detail Level". Even judging by the names of these settings, you can see that NVIDIA offers a tad more options for controlling texture filtering optimizations, and that the CATALYST A.I. option (ATI) looks mysterious. NVIDIA's optimizations that can be disabled refer to texture filtering only, while CATALYST A.I. covers some general optimizations, which ATI prefers to keep quiet about. But we shall still use various values for this setting in all test modes, whatever they mean.

Let's digress a little and speak about the essence of texture filtering optimizations, one of the most important and always controversial issues of 3D rendering quality. The most popular optimizations, used by both ATI and NVIDIA with various degrees of "boldness" (we are going to evaluate this degree a tad later by comparing pictures), are optimizations of trilinear and anisotropic filtering. The former reduce the amount of work done by the trilinear filtering algorithm by narrowing the filtered transition zones and shifting them toward mip level borders. As a result, distant mip levels become noisy and the borders between these levels become noticeable in 3D applications. In the worst case, distant mip levels use only bilinear filtering.
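To illustrate the idea (this is a minimal sketch in Python, not either vendor's actual driver code), here is how full trilinear filtering blends two mip levels across the whole transition range, and how a narrowed, "brilinear"-style blend zone saves texture samples at the cost of visible mip borders; the band width is a hypothetical tuning parameter:

def trilinear_weight(lod):
    """Full trilinear: blend factor between mip floor(lod) and floor(lod)+1."""
    return lod - int(lod)  # linear transition across the whole interval

def optimized_weight(lod, band=0.25):
    """Narrowed blend: only a small band around the mip border is blended;
    elsewhere a single mip level is sampled (plain bilinear), which is cheaper
    but makes mip level borders visible and distant levels noisier."""
    frac = lod - int(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if frac <= lo:
        return 0.0            # finer mip level only
    if frac >= hi:
        return 1.0            # coarser mip level only
    return (frac - lo) / (hi - lo)  # short ramp near the border

def filtered_texel(sample_fine, sample_coarse, lod, optimized=False):
    """Blend two bilinearly filtered samples taken from adjacent mip levels."""
    w = optimized_weight(lod) if optimized else trilinear_weight(lod)
    return sample_fine * (1.0 - w) + sample_coarse * w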

There can be various anisotropic filtering optimizations: for example, reducing the filtering level for distant mip levels, or angle-dependent anisotropy algorithms, where "proper" filtering is applied not to all surfaces, but only to the most important ones: horizontal, vertical, and those at certain angles. That leaves "inconvenient" angles, at which surfaces look blurred. There are also other popular optimizations of texture filtering, such as reducing the anisotropy level for low-contrast textures, increasing the mipmap LOD bias, and applying trilinear filtering only at certain texturing stages (most often at the first stage only, where most base textures are processed).
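Again as a rough sketch only (the real algorithms are proprietary, and the threshold values below are hypothetical), the two anisotropic optimizations mentioned above can be modelled as a cap on the anisotropy degree that depends on the mip level and on how far the surface is from a "convenient" angle:

MAX_ANISO = 16  # anisotropy degree requested by the application

def ideal_degree(du, dv):
    """Anisotropy degree from the screen-space footprint: the ratio of the
    longer derivative to the shorter one, clamped to the requested maximum."""
    longer, shorter = max(du, dv), max(min(du, dv), 1e-6)
    return max(1, min(MAX_ANISO, round(longer / shorter)))

def optimized_degree(du, dv, lod, surface_angle_deg):
    degree = ideal_degree(du, dv)
    if lod > 3.0:
        degree = min(degree, 4)       # fewer samples on distant mip levels
    off_axis = min(surface_angle_deg % 45.0, 45.0 - surface_angle_deg % 45.0)
    if off_axis > 10.0:
        degree = max(1, degree // 2)  # "inconvenient" angles get less filtering
    return degree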

Without a doubt, all these optimizations have a right to exist. They can be of much help to owners of weak video cards, as they help squeeze out precious FPS in modern games. But only if they can be officially enabled and disabled in driver settings, not when a driver is "quietly" optimized for a given application often used in performance comparisons, and at the cost of quality at that. We shall not try to prove that GPU manufacturers play unfairly, especially considering that 3D rendering quality is subjective. But we'll publish comparative performance results in various modes, as well as images to evaluate the rendering quality of video cards from the two leading manufacturers, and we'll add our subjective comments. Paying attention to texture filtering quality in the first place, we'll also consider antialiasing quality, but only if we suspect that driver settings affect it (which rarely happens). Our readers will have to decide for themselves who optimized what, and where.

Testbed Configuration and Driver Settings

Testbed configuration:

  • CPU: AMD Athlon 64 3200+ Socket 939
  • Motherboard: Foxconn WinFast NF4SK8AA-8KRS (NVIDIA nForce4 SLI)
  • RAM: 2048 MB DDR SDRAM PC3200
  • HDD: Seagate Barracuda 7200.7 120 Gb SATA
  • Operating system: Microsoft Windows XP Professional SP2

We used two video cards in our tests, based on GPUs from the main manufacturers:

The list of games we use in our tests

  • Far Cry 1.4 (Crytek/Ubisoft) — Direct3D, maximal settings, HDR rendering disabled.
    Built-in benchmark, Research demo.


  • F.E.A.R. 1.02 (Monolith/Sierra) — Direct3D, maximal settings, soft shadows disabled.
    Built-in benchmark.


  • Splinter Cell: Chaos Theory 1.05 (Ubisoft) — Direct3D, maximal settings, Shaders 3.0 and HDR are disabled.
    Built-in benchmark, Lighthouse demo.


  • Serious Sam 2 (Croteam/2K Games) — Direct3D/OpenGL, maximal settings, HDR rendering is disabled.
    Built-in benchmark, ZUMZUM demo.


  • Prey 1.0 (Human Head Studios/3D Realms) — OpenGL, maximal settings.
    Built-in benchmark, demo003


We've decided to use the 1280x1024 resolution in our tests (or 1280x960, if that resolution is not supported), as it's the most popular resolution these days, mostly promoted by 17" and 19" LCD monitors. We've enabled the following options in all games: trilinear filtering, 16x anisotropic filtering, and 4x MSAA (multisample antialiasing with four samples). As NVIDIA video cards cannot use multisampling with the FP16 formats used for HDR, HDR rendering was disabled in all games that support it (Splinter Cell: Chaos Theory, Serious Sam 2, and Far Cry). We always enabled antialiasing and anisotropic filtering in the applications instead of in the driver: firstly, because all modern games allow enabling them from the menu; secondly, because this method is more correct. As we have already mentioned, we used three test modes: Performance, Default, and Quality. Let's list the main driver settings in these modes for video cards with GPUs from ATI and NVIDIA...

Driver settings for the three modes:

ATI

Default:
Anti-Aliasing - "Let the application decide"
Anisotropic Filtering - "Let the application decide"
High Quality AF - "Off"
CATALYST A.I. - "Standard"
Mipmap Detail Level - "High Quality"
Adaptive Anti-Aliasing - "Off"
Temporal Anti-Aliasing - "Off"

Performance:
Anti-Aliasing - "Let the application decide"
Anisotropic Filtering - "Let the application decide"
High Quality AF - "Off"
CATALYST A.I. - "Advanced"
Mipmap Detail Level - "High Performance"
Adaptive Anti-Aliasing - "Off"
Temporal Anti-Aliasing - "Off"

Quality:
Anti-Aliasing - "Let the application decide"
Anisotropic Filtering - "Let the application decide"
High Quality AF - "On"
CATALYST A.I. - "Disable"
Mipmap Detail Level - "High Quality"
Adaptive Anti-Aliasing - "Off"
Temporal Anti-Aliasing - "Off"


NVIDIA

Default:
Anisotropic filtering - "Application-controlled"
Antialiasing settings - "Application-controlled"
Texture filtering - "Quality"
Anisotropic mip filter optimization - "Off"
Anisotropic sample optimization - "On"
Trilinear optimization - "On"
Force mipmaps - "None"
Negative LOD bias - "Allow"
Gamma correct antialiasing - "Off"
Transparency antialiasing - "Off"

Performance:
Anisotropic filtering - "Application-controlled"
Antialiasing settings - "Application-controlled"
Texture filtering - "High performance"
Anisotropic mip filter optimization - "On"
Anisotropic sample optimization - "On"
Trilinear optimization - "On"
Force mipmaps - "None"
Negative LOD bias - "Allow"
Gamma correct antialiasing - "Off"
Transparency antialiasing - "Off"

Quality:
Anisotropic filtering - "Application-controlled"
Antialiasing settings - "Application-controlled"
Texture filtering - "High quality"
Anisotropic mip filter optimization - "Off"
Anisotropic sample optimization - "Off"
Trilinear optimization - "Off"
Force mipmaps - "Trilinear"
Negative LOD bias - "Clamp"
Gamma correct antialiasing - "On"
Transparency antialiasing - "Off"

Some readers will certainly feel that we did it all wrong. But we have a standard answer in store: all these settings reflect our subjective approach and cannot claim to be the only correct ones. Nevertheless, we think these are exactly the settings an experienced user, who understands what each of them means, would choose to get everyday quality (Default), maximum quality (Quality), or maximum performance (Performance). Minor deviations are possible, with the reservation that a different set of settings would no longer represent maximum quality or maximum performance. The only setting where we departed from our own rules is Transparency Antialiasing / Adaptive Anti-Aliasing: it was disabled in our tests.

Far Cry (Direct3D)

Far Cry is still one of the most frequently used benchmarks among video card reviewers. This game is notable for a lot of surfaces with sharp textures at various angles, which makes it convenient for finding quality flaws in texture filtering. But first we are going to evaluate the effect of the selected settings on gaming performance. Our two video cards belong to different price segments, so we shall not compare their performance directly. Instead, we'll express the results obtained in the Performance and Quality modes relative to the Default mode, taking the average FPS in the latter mode as the 100% reference point.
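As a trivial example of this normalization (the FPS numbers here are made up purely for illustration, not measured results):

avg_fps = {"Performance": 69.0, "Default": 60.0, "Quality": 51.0}  # hypothetical values
relative = {mode: round(100.0 * fps / avg_fps["Default"]) for mode, fps in avg_fps.items()}
print(relative)  # {'Performance': 115, 'Default': 100, 'Quality': 85}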

The first test demonstrates quite a big difference in performance between these modes on the ATI and NVIDIA cards. In the case of the NVIDIA card, Performance mode with enabled optimizations (reflecting the maximum possible performance) and Quality mode with disabled optimizations (providing better 3D rendering quality) differ in speed from the default mode by approximately 10% (in opposite directions, of course). In the case of ATI, Performance mode yields almost nothing, and the average FPS in Quality mode differs from the Default mode less than on the NVIDIA card. So we assume we'll see a bigger difference in image quality on the NVIDIA screenshots. We tested this in practice using two test scenes:

Far Cry, Scene 1
[Screenshots: ATI vs. NVIDIA in Performance, Default, and Quality modes]

The first test scene in Far Cry demonstrates that the Quality and Default modes are practically identical in the case of ATI; they differ only in a higher detail level of distant mip levels in the former. But ATI's Performance mode differs a lot, which we couldn't have guessed from the performance results: we can see a reduced detail level at all mip levels, explained by the reduced Mipmap Detail Level setting. But the most interesting issue is the evident problems with lighting, which may be caused by texture manipulations at texturing stages other than the first one in the Advanced mode of CATALYST A.I. (or perhaps by a banal bug in the drivers). It's strange that such noticeable visual differences yielded almost no performance gain.

The NVIDIA screenshots suggest different conclusions - the key image differences come down to texture filtering quality. While the Quality mode image is similar to what we saw with ATI (it's actually difficult to distinguish between them in the quality modes), the other screenshots show noticeable moiré - an effect of an excessively aggressive optimization of trilinear filtering. It's especially noticeable in Performance mode, but you can make it out in Default mode as well, and the effect gets more pronounced in motion. As you can see in the performance tests, the benefit from all texture filtering optimizations at such an aggressive level reaches 10-20%. However, a user can always disable the optimizations - all of them (Quality mode, excellent quality) or selected ones. You should just keep in mind the resulting performance drop.

Far Cry, Scene 2
[Screenshots: ATI vs. NVIDIA in Performance, Default, and Quality modes]

The screenshots of the second scene in Far Cry demonstrate a similar situation: the quality modes of ATI and NVIDIA are similar, but the former has an advantage due to the high-quality anisotropic filtering that keeps surfaces at various angles detailed. Image quality is similar in the default modes, with no evident side effects of aggressive optimizations. And the performance modes from both vendors have the same problems as before: the ATI card demonstrates a reduced detail level at all mip levels and evident problems with lighting (light maps or pixel shaders), while NVIDIA reveals texture noise due to excessively optimized trilinear and anisotropic filtering.

F.E.A.R. (Direct3D)

The built-in benchmark in F.E.A.R., used in many reviews, is also one of the most popular 3D tests. The game is notable for complex pixel shaders and sharp textures, which highlight texture filtering flaws. Unfortunately, the game is very dark, which masks some flaws of optimized texture filtering. As with Far Cry, we are going to determine the effect of 3D settings on performance in F.E.A.R. first, and then on image quality.

This game shows a different performance picture. Video cards from both vendors demonstrate similar performance gains and drops when the default mode is changed to the performance and quality modes. Performance drops by 7-8% in the quality mode, while in the performance mode the speed grows only by 2-4%. So we can say the game is limited by fillrate or pixel shader execution speed. In other words, F.E.A.R. is an example of a game where optimizations are not much of a help. Let's see whether it makes sense to change the default settings at all.

F.E.A.R., Scene 1
[Screenshots: ATI vs. NVIDIA in Performance, Default, and Quality modes]

The first scene demonstrates a familiar situation. The ATI card displays a similar image in the Default and Quality modes; only the Performance mode is different - reduced mip level detail and an evident optimization of trilinear filtering, which you can see well on the floor texture. As for NVIDIA, it shows its usual texture noise. You can see it even in the default mode, but it gets really noticeable in Performance mode, especially in motion. Here is our subjective opinion: the quality modes are similar (ATI has a minor advantage, as usual, due to high-quality anisotropy), the default mode is a tad better on the ATI card, and the performance modes are just different. Some users will prefer the blurred mip levels with their noticeable borders, others the somewhat noisy textures.

F.E.A.R., Scene 2
[Screenshots: ATI vs. NVIDIA in Performance, Default, and Quality modes]

It's a similar situation, but the problems are more pronounced. The ATI video card shows some moiré even in the default mode, though it's less noticeable than on the NVIDIA card in the Default mode and much less noticeable than in the Performance mode. ATI's quality mode shows more details on the wall texture than the same mode on NVIDIA - probably the effect of the high-quality anisotropic filtering algorithm. On the other hand, ATI shows sharp seams in the wall texture approximately in the center (it's difficult to point them out on static screenshots). The default mode is a tad better on ATI: there is less moiré, though sharpness is lower. Image quality in the performance mode is up to our readers to evaluate - the two GPU companies have very different approaches to raising performance. Considering the similar relative performance, the choice of mode seems evident: Default or Quality. The Performance mode makes no sense in this game; apparently, performance in this test was not limited by texture sampling.

Splinter Cell: Chaos Theory (Direct3D)

The latest part of Splinter Cell is an excellent example of an older game that is still high-tech and popular among 3D testers. Its notable features are high-quality textures, normal mapping on all surfaces and parallax mapping on many of them, as well as stygian darkness owing to the genre of the game. Let's analyze the quality and performance of 3D rendering by the ATI and NVIDIA cards in the three modes described at the beginning of the article.

Performance in Splinter Cell: Chaos Theory differs between the Quality, Default, and Performance modes even less than in the F.E.A.R. tests. The speed in the performance mode is higher than in the default mode by just 2-3%. And the quality mode is slower than the default mode by just 6% on ATI (with high-quality anisotropy enabled) and by 3% on NVIDIA (all texture optimizations disabled). Like F.E.A.R., this game does not seem to be limited by texturing speed. Let's have a look at the screenshots. Do optimizations make sense for Splinter Cell?

Splinter Cell: Chaos Theory, Scene 1
[Screenshots: ATI vs. NVIDIA in Performance, Default, and Quality modes]

What?! NVIDIA sacrifices normal texture filtering for the sake of those measly 3-6% of performance gain?! It's strange, but ATI shows no improvement from high-quality anisotropic filtering in this scene. ATI's quality mode is almost no different from the default mode, and the performance mode is expectedly worse because of the increased mipmap LOD bias. NVIDIA's quality mode is good - there are almost no differences from ATI's image in the corresponding mode. But the other modes demonstrate poor texture filtering quality on a vertical wall, shown in the cropped fragments. In motion, the quality in the performance mode is horrible - all those pixels shimmer, and the impression is unattractive. The default mode is a tad better, but textures at certain mip levels are noticeably noisy there as well. OK, we've examined a vertical surface, now it's time to examine a horizontal one.

Splinter Cell: Chaos Theory, Scene 2
[Screenshots: ATI vs. NVIDIA in Performance, Default, and Quality modes]

NVIDIA has almost the same problems on horizontal surfaces in Splinter Cell: Chaos Theory (moiré instead of horrible stripes), but they are less pronounced. Reference quality is delivered by the Quality modes of both ATI and NVIDIA; practically the same picture is shown by the ATI card in the default mode (only some distant surfaces are less sharp); the other cases show various problems. Everything said above about ATI's Performance mode is confirmed - the main changes are caused by the reduced Mipmap Detail Level. NVIDIA's performance mode still has the same problem - horrible moiré and noise at middle mip levels; the deterioration in rendering quality is obviously not proportional to the measly performance gain. NVIDIA's default mode also shows signs of enabled texture filtering optimizations - the same moiré and noise in motion; though not as strong, it's still noticeable in some cases.

Serious Sam 2 (Direct3D)

Serious Sam 2 supports Direct3D as well as the OpenGL API, which is another advantage of this game for our purposes. The first advantage is its vast spaces and high-quality textures, which reveal all flaws in texture filtering and its optimizations. The game offers a lot of surfaces with complex textures: large textures, lots of layers. That's why texturing is one of the main limiting factors in this game. Let's analyze the relative performance of the game in our three modes. We can expect bigger differences than in the previous games.

Now that's quite another matter! Although the performance modes yield just 8% on both cards, the quality settings change ATI and NVIDIA's performance considerably: 24% on the ATI card and 16% on the NVIDIA card - these are very high figures. We can see right away that texture filtering optimizations (and maybe some application-specific optimizations in CATALYST A.I.) make sense here and may yield high performance gains. But maybe such optimizations strongly affect the resulting rendering quality? Let's have a look:

Serious Sam 2 (Direct3D), Scene 1
[Screenshots: ATI vs. NVIDIA in Performance, Default, and Quality modes]

I have deliberately chosen this scene and angle of view. Even though you will seldom look nearly vertically into the sky in the game, this angle and this object are good for determining the effect of various settings on image quality. Have a look: you can clearly see the positive effect of "High Quality AF" on image quality in ATI's quality mode. This high-quality algorithm does not have weak points such as inconvenient angles, where the quality of anisotropic filtering gets worse. It may seem that NVIDIA's performance mode offers the same quality, but that's not true: motion reveals distinct texture noise over a significant surface area, and it looks very bad.

The default modes of ATI and NVIDIA are OK, but not ideal. We can see problems with texture sharpness (additional anisotropy optimizations or just an inconvenient angle) on the first card and small artifacts in the form of "sand" on the second. NVIDIA's quality mode eliminates all these problems, but it still falls short of ATI's quality mode. However, ATI's performance drop in this game is also considerably greater. Let's analyze another example from Serious Sam 2 in Direct3D mode:

Serious Sam 2 (Direct3D), Scene 2
[Screenshots: ATI vs. NVIDIA in Performance, Default, and Quality modes]

That's what the game has been famous for since the first part: such long sight lines reveal filtering quality problems. However, GPU designers seem to have learned to cope with such situations in the time since the first part of the game. The quality is generally good if we don't take the performance modes into account. But let's examine the images closer...

There is almost no visual difference between ATI's Quality and Default modes. That's not good, because the performance drop in this mode was too high, and if there is no quality difference, the quality mode makes little sense. On the other hand, image quality in ATI's default mode is very good - a tad better than NVIDIA's default mode due to the lack of noise on some borders between mip levels. To say nothing of the performance mode, where the borders between mip levels become distinct (ATI's performance mode is traditionally ruined by the high mipmap LOD bias). For its part, NVIDIA's quality mode evidently catches up with ATI's quality: as there are no inconveniently inclined surfaces in the scene, it can boast the same quality level at a lower performance drop relative to the default mode.

Serious Sam 2 (OpenGL)

Although the game does not officially support the OpenGL renderer, it can still be enabled. We'll certainly do so, because modern, high-tech OpenGL games can be counted on the fingers of one hand. And though the OpenGL renderer does not support all of the game's effects, we are more interested in plain texturing, which is no worse than in the main Direct3D version. The demo for testing performance is the same, but we changed the scenes for screenshots to make things more interesting. Let's look at the performance first:

The OpenGL situation in Serious Sam 2 is even more interesting than in Direct3D. Either vendors don't bother optimizing unsupported modes in games (remember that you cannot enable the OpenGL API from the menu in Serious Sam 2), or less attention is paid to the OpenGL part of the drivers. Either way, the result is evident: 11% and 21% performance gain in the performance mode on the NVIDIA and ATI cards, and a 26% and 28% drop in average FPS in the quality mode compared to the default mode. That's the first time the NVIDIA card gains less from the performance mode and loses more in the quality mode. So it's very interesting to have a look at the rendering quality in our modes.

Serious Sam 2 (OpenGL), Scene 1
[Screenshots: ATI vs. NVIDIA in Performance, Default, and Quality modes]

Indeed, we can say the situation has changed. NVIDIA's default mode has more strong points here - inclined surfaces are sharper than ATI's, and we see no noise. On the other hand, NVIDIA's performance mode clearly shows rough optimizations of anisotropic filtering (its level decreases at distant mip levels), as well as some noise on the automatic weapon models with normal maps applied. NVIDIA's quality mode resembles the default mode, so it's not quite clear where the lost performance went. It's all crystal clear in the case of ATI - some performance is lost to high-quality anisotropic filtering: the Quality screenshot shows sharper textures on inclined surfaces than any other screenshot.

Serious Sam 2 (OpenGL), Scene 2
[Screenshots: ATI vs. NVIDIA in Performance, Default, and Quality modes]

ATI's screenshots of the second test scene again show almost no difference between the Quality and Default modes. In the case of NVIDIA, borders of mip levels appear in the default mode. The performance modes differ the most, as usual: both ATI and NVIDIA show mip level borders, which speaks of aggressive optimizations of trilinear filtering, and ATI also reduces detail in the mip levels. There is nothing new here; all the tendencies remain. That makes the relative performance results for Serious Sam 2 in OpenGL mode all the stranger. It especially concerns the quality modes, which do not demonstrate any special quality improvements, yet performance drops by more than a quarter.

Prey (OpenGL)

One of the few "pure" OpenGL games, which is based on the famous DOOM 3 engine. It's not as dark as the forefather of the engine and offers textures of better quality (it's only natural, comparing the release dates of these two games). It allows to evaluate the quality of texture filtering and determine possible effects of optimizations. But we are interested in this game in the first place, as one of the few representatives of OpenGL games. As usual, we are going to analyze the effect of settings on performance and quality on the screenshots from a couple of game scenes.

This situation again resembles no other - the performance modes remain within the standard 8-11% both for ATI and NVIDIA, but the quality modes show very different results. While 15% is a good average result for NVIDIA, 28% is too much for the ATI card, in our opinion. We don't know whether it's the effect of high-quality anisotropic filtering or the result of disabling an application-specific optimization via CATALYST A.I. But the fact remains: only extraordinary image quality can justify such a performance drop, otherwise there is no point in this quality mode.

Prey, Scene 1
[Screenshots: ATI vs. NVIDIA in Performance, Default, and Quality modes]

But we don't see it. There is no trace of a difference between ATI's Quality and Default modes, which makes it more likely that the performance drop comes from disabled application-specific optimizations for Prey... But the difference between NVIDIA's default and quality modes is visible - you cannot see texture noise at some mip levels in the latter. As usual, ATI's performance mode differs most of all in the texture detail across all mip levels. In the case of NVIDIA, it's the overly aggressive optimizations of texture filtering, resulting in noise even in static scenes, to say nothing of dynamics. Such artifacts are especially noticeable on a foreground bush. Let's have a look at the second scene; perhaps it will be different.

Prey, Scene 2
[Screenshots: ATI vs. NVIDIA in Performance, Default, and Quality modes]

The second example from Prey reveals the problems of the performance modes even better - ATI's image is all blurry, NVIDIA's is too sharp, so sharp that half the pixels shimmer in motion as if alive. The difference is not very noticeable in the other modes. The quality mode is very good on both video cards; the picture is almost identical, except for some trifles. NVIDIA's default mode differs from its quality mode in some texture noise; ATI's is no different at all. We don't understand where those hefty 15% and 28% of performance went. But we are given driver settings precisely so that we can choose an optimal combination of performance and quality.

Conclusions

Let's draw our conclusions on each mode separately, as they are so different. I repeat that some of the conclusions are a subjective opinion of the author of this article. Our readers are free to disagree.

  • Quality - the maximum image quality of the latest chips from ATI and NVIDIA, with all driver options set to maximum quality and optimizations disabled, is very good. It's very difficult to distinguish between the cards in most cases, if not impossible; both NVIDIA and ATI chips have only rare weak points in the quality mode, and their rendering is very much alike. But considering ATI's support for high-quality anisotropic filtering, we give the "Quality" laurels to ATI's modern video chips. The difference is not large in most cases - you will have to look for it really thoroughly - but it's still there.

  • Default - the most frequently used mode demonstrates an ambiguous situation: both NVIDIA and ATI have quality problems. It was even more difficult to find the differences here, and we have various gripes with the quality of chips from both companies (see above). But we can still say that ATI's default quality is higher: aggressive optimizations of texture filtering often let NVIDIA down, resulting in unpleasant artifacts in the form of noise, "sand", moiré, etc. ATI has some flaws too, of course, but they are rarer.

  • Performance - this situation is quite different; we can clearly see the different approaches of ATI and NVIDIA to the most effective performance optimizations. While ATI reduces filtering quality by optimizing trilinear filtering, reducing the level of anisotropic filtering, and increasing the mipmap LOD bias, NVIDIA enables aggressive optimizations of trilinear and anisotropic filtering that lead to the above-mentioned artifacts. It's hard to say which approach is better, but NVIDIA's methods usually yield a higher performance gain and/or a smaller quality drop - it's up to the user to choose. In our opinion, enabling only selected optimizations seems a better idea than CATALYST A.I., whose effect on a given game is vague.

On the whole, we can note that optimizations usually make a greater difference in performance and quality on GPUs from NVIDIA than from ATI. But what draws our attention most of all is the excessively aggressive (in our opinion) optimizations of trilinear and anisotropic texture filtering in the default mode. The settings offered by NVIDIA result in visible artifacts, like texture noise, in many games. Of course, we deliberately drew our readers' attention to the most illustrative fragments; they do not occur everywhere in real games, and not across the full screen. Besides, similar artifacts (noise or "sand" in textures) sometimes appear on ATI video cards in the default mode as well, but they are much rarer there.

As for NVIDIA's optimizations of trilinear and anisotropic filtering, they often result in noticeable texture noise even in the default mode. Such noise has a negative effect on the overall rendering quality in some games, and small performance gains cannot always justify such a drop in image quality. So we can say that the optimal settings for NVIDIA cards these days are the Quality mode, and the Default mode in rare cases; for ATI, the Default mode, and the Quality mode in some cases. But these conclusions are subjective. It's wrong to compare the performance of video cards with different quality settings. In our opinion, the optimal solution is to compare performance with default driver settings (maximum quality in some cases) and to provide several screenshots so that readers can evaluate rendering quality on their own.



Alexei Berillo aka SomeBody Else (sbe@ixbt.com)

October 4, 2006


