Full-screen antialiasing (FSAA) is meant to improve output image quality. Previously we covered only the performance side of this technology, but now we have decided to expand this section and cover quality as well. In this issue we continue our examination with a number of new cards.
There are various kinds of antialiasing (as well as anisotropic filtering) :-). Moreover, different companies implement AA in their own ways. We won't describe every variety in detail here, but we will sum them up.
So, what is antialiasing and how is it used? To answer this, we should turn to physics and linguistics. The latter part is simple: "anti" is clear, and "aliasing" means jaggies. Since the monitor has a fixed number of pixels horizontally and vertically, perfectly vertical or horizontal lines look ideal, with pixels following one another. But if a line is diagonal, it looks jagged, because pixels are organized in a 2D matrix limited by the screen resolution. So what can we do, if we can't change our monitors? We can still trick the user's eyes and mind. That is what various antialiasing algorithms do.
As the task is clear, the algorithm follows. Since pixel color is encoded in 8 to 32 bits (there can be more, but the human eye won't distinguish 64-bit color from 32-bit, for example), it can be substituted. For example, take two colors, white and black. They are opposites and produce a sharp border where they meet. So, if we draw a black line on a white background, it won't look straight (unless it's vertical or horizontal) but jagged. To make it smoother, we need to paint over the borders, i.e. blend the colors. In this case it's a blend of black and white, so we get gray on both sides of each step. In most cases the original line color is preserved, but sometimes a new one may replace it. The size of the pixel matrix matters as well: at 800x600 the jaggies are much more obvious than at 1600x1200, where the steps are smaller. In other words, a larger matrix means higher resolution and smaller pixels. In fact, using antialiasing at 1600x1200 is more of a whim compared to 800x600. But in reality everything is more complex.
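The black-on-white blending described above can be sketched in a few lines. This is a purely illustrative example (not any vendor's actual algorithm): an edge pixel's color is a linear blend of the line color and the background, weighted by how much of the pixel the line covers.

```python
# Illustrative sketch of edge blending for antialiasing.
# A pixel partially covered by a black line on a white background
# receives a gray blended by fractional coverage.
def blend(fg, bg, coverage):
    """Linear blend of foreground over background by fractional coverage (0..1)."""
    return round(fg * coverage + bg * (1.0 - coverage))

white, black = 255, 0
print(blend(black, white, 0.5))   # half-covered edge pixel -> mid gray (128)
print(blend(black, white, 0.25))  # barely touched pixel -> light gray (191)
print(blend(black, white, 1.0))   # fully covered pixel -> pure black (0)
```

The displacement points of a diagonal line get exactly these intermediate grays, which is why the eye perceives the line as smooth.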
To confuse you even more ;-) here are excerpts from NVIDIA World articles with our additions.
During antialiasing each pixel is divided into subpixels. The pixel color is averaged by some formula from the subpixel colors. Although the physical resolution remains the same, the effective resolution becomes considerably higher.
The two most popular approaches are supersampling and multisampling. Both define a pixel's color by blending subpixel (sample) colors, but the samples are generated differently.
Supersampling is the simplest and most straightforward method. The image is rendered at a virtual resolution several times higher than the actual one, then scaled and filtered down to the original resolution, so the color of each pixel is determined from several subpixels. This improves image quality considerably, but it loads the graphics card several times harder, hence the performance falloff: several colors must be calculated per pixel instead of one. For example, 2x2 AA at 800x600 requires rendering 800x2 by 600x2, i.e. 1600x1200.
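The downscaling step of 2x2 supersampling can be sketched as a simple box filter. This is a minimal illustration assuming grayscale values 0..255; real hardware filters each color channel and may use fancier filter kernels.

```python
# Minimal sketch of 2x2 supersampling's downscale step:
# the frame was rendered at double resolution, and each 2x2 block
# of subpixels is averaged into one output pixel.
def downsample_2x2(img):
    """Box-filter a 2H x 2W grayscale image down to H x W."""
    h, w = len(img) // 2, len(img[0]) // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = (img[2 * y][2 * x] + img[2 * y][2 * x + 1] +
                     img[2 * y + 1][2 * x] + img[2 * y + 1][2 * x + 1])
            out[y][x] = total // 4  # average of the four subpixels
    return out

# A hard black/white edge in the "virtual" double resolution...
hi_res = [[0,   0, 255, 255],
          [0,   0, 255, 255],
          [0, 255, 255, 255],
          [0, 255, 255, 255]]
# ...becomes a softened edge with an intermediate gray at actual resolution.
print(downsample_2x2(hi_res))  # -> [[0, 255], [127, 255]]
```

The cost is visible directly in the code: four subpixel colors had to exist for every output pixel, which is exactly the fourfold rendering load mentioned above.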
Multisampling is a more complex and intelligent approach, or rather a tool. The idea is simple as well: why blindly calculate N subpixels for every pixel when already-calculated samples can be reused again and again to form several resulting pixels? Moreover, some parts of the image don't need AA at all, so why use several subpixels where one is enough? Conversely, some parts need very good AA, so many more subpixels should be calculated there. This not only saves resources significantly, but can also yield better AA quality! The tool can be used as one likes; performance and quality depend on the implementation chosen by the graphics card or game developer.
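The core saving of multisampling can be sketched as follows. This is a simplified model of the general idea, not any specific vendor's pipeline: the expensive color (texturing, shading) is computed once per pixel, while only cheap coverage tests are done at several sample points, so interior pixels cost the same as without AA and only edge pixels get blended.

```python
# Simplified sketch of the multisampling idea: shade each pixel ONCE,
# but test polygon coverage at several sample points inside the pixel.
def msaa_pixel(shaded_color, bg_color, coverage_tests):
    """coverage_tests: one boolean per sample point (True = inside polygon)."""
    n = len(coverage_tests)
    covered = sum(coverage_tests)
    # Blend the single shaded color with the background by coverage fraction.
    return round(shaded_color * covered / n + bg_color * (n - covered) / n)

# Interior pixel: all 4 samples covered -> plain shaded color, no extra shading cost.
print(msaa_pixel(0, 255, [True, True, True, True]))     # -> 0
# Edge pixel: 1 of 4 samples covered -> smooth blend, still only one shade.
print(msaa_pixel(0, 255, [True, False, False, False]))  # -> 191
```

Compared with the supersampling sketch above, the shading work per pixel did not grow fourfold; only the coverage bookkeeping did.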
Now let's get to the makers' specific AA implementations, starting with the most unusual and original solution, from Matrox.
The FSAA supported by Parhelia-512 is currently unique and close to ideal. Essentially, it's supersampling with up to 16 samples per pixel, but it's performed ONLY (!) for polygon edge pixels (3-5% of a typical scene). The advantage is clear: unlike multisampling, it doesn't keep redundant data in memory or transfer it over the bus! The total frame buffer size grows only slightly, by at most a factor of two even at the maximum 16x setting. A special fast rendering pass determines the edge pixels: the graphics card marks only the pixels on polygon edges in a separate buffer, without calculating textures or filling the interior pixels. Besides, as only edges are processed, there is no loss of texture sharpness, which plagues FSAA and some hybrid MSAA techniques. However, this intelligent AA can still produce artifacts in some cases. It also can't correctly process edges mixed with transparent polygons (like clouds, glass, or fire).
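The edge-marking pass can be sketched in miniature. This is a rough software analogy of the idea, not Matrox's actual hardware logic: a cheap scan flags pixels that differ from a neighbor (i.e. sit on a border), and only those few pixels would then receive the expensive 16x supersampling.

```python
# Rough software analogy of an edge-marking pass: flag only pixels
# that sit on a color border; everything else skips the expensive AA.
def edge_mask(img):
    """Return a boolean mask marking pixels that differ from a right/down neighbor."""
    h, w = len(img), len(img[0])
    mask = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            for dy, dx in ((0, 1), (1, 0)):  # compare right and down neighbors
                ny, nx = y + dy, x + dx
                if ny < h and nx < w and img[y][x] != img[ny][nx]:
                    mask[y][x] = mask[ny][nx] = True
    return mask

# Tiny 3x3 frame with a black/white border (real scenes are far larger,
# which is why borders end up being only a few percent of all pixels).
img = [[0,   0, 255],
       [0,   0, 255],
       [0, 255, 255]]
mask = edge_mask(img)
flagged = sum(v for row in mask for v in row)
print(flagged, "of 9 pixels need AA")  # -> 6 of 9 pixels need AA
```

On this toy grid most pixels are flagged, but as the frame grows the border stays one pixel wide while the interior grows quadratically, which matches the 3-5% figure above.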
NVIDIA's AA is clearly described in articles at NVIDIA World. We won't repeat them, but will just provide the links. The basic article is here. This newer article describes NVIDIA AA technologies.
As for ATI AA, it's reviewed here along with other companies' antialiasing.
Let's now finish with the theory and move on to performance.
The following cards were used in these tests:
Tests were conducted on our AMD Athlon 64 3200+ based testbed using the FarCry game.
But we must know what we spend this performance on. As you can see, the falloffs are rather large, so we'd like to know whether it's really worth it. Below are screenshots from the tests, showing the actual quality of the different antialiasing modes.
During testing we found out that FarCry v1.1 doesn't let AA be forced from the NVIDIA 61.11 drivers' control panel, so we had to force it in-game. As you know, the settings are None, Low, Middle, and High, where Middle should be AA 4x and High should be AA 8x. Why "should"? Because the game refused to enable the High setting! Both Middle and High produced the same performance results, consistent with AA 4x.
Experimentally we found out that renaming farcry.exe miraculously enabled AA in the drivers' control panel, as well as enabling AA 8x in-game.
We also found out that the preferred AA activation method is in-game, not in the drivers. So NVIDIA disabled the latter, so that you can't mess with the control panel more than you are allowed to :)