
NVIDIA GeForce3 benchmarking with Vulpine GLMark



A few days ago we examined the NVIDIA GeForce3 GPU and the video cards based on it. Lamenting the lack of software able to exploit all the capabilities of the new processor, we tested the GeForce3 with the already classic Quake3 (OpenGL) and 3DMark 2000 (DirectX), while the GPU's new features could only be checked with examples from the DirectX 8.0 SDK.

Right after the release of this chip, MadOnion, the developer of 3DMark 2000, promised a grandiose new benchmark, 3DMark 2001, which would make use of all the capabilities and advantages of DirectX 8.0. MadOnion has shown impressive screenshots from the beta version and released a video clip of the program running in demo mode, but the release date is still kept secret. We hope it will debut at CeBIT 2001.

In the meantime, the German company Vulpine, which develops 3D rendering engines based on its Vulpine Vision technology, has released the Vulpine GLMark benchmark, built on one of those engines, the Vulpine Vision Game Engine, and running on top of OpenGL. Let's see whether the young company can compete with the venerable MadOnion.

Note that GLMark's main attraction is full support for the NVIDIA GeForce3: it demonstrates vertex and pixel shaders as well as bump mapping via Dot3 or EMBM (in all cases through the appropriate OpenGL extensions).
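GLMark's source is not public, so the specific extension names below are only assumptions about what a GeForce3-aware OpenGL engine of that era would query; this minimal sketch merely shows the standard detection mechanism:

```cpp
// On Windows, include <windows.h> before <GL/gl.h>.
#include <GL/gl.h>
#include <cstring>
#include <cstdio>

// True if the ICD's extension string contains the given token.
// A valid OpenGL context must be current when this is called.
static bool HasExtension(const char* name)
{
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return ext != nullptr && std::strstr(ext, name) != nullptr;
}

void ReportGeForce3Features()
{
    // Extension names are assumptions about what such a benchmark might look for.
    std::printf("Vertex shaders      : %s\n", HasExtension("GL_NV_vertex_program") ? "yes" : "no");
    std::printf("Pixel pipeline/EMBM : %s\n", HasExtension("GL_NV_texture_shader") ? "yes" : "no");
    std::printf("Dot3 bump mapping   : %s\n", HasExtension("GL_ARB_texture_env_dot3") ? "yes" : "no");
    std::printf("S3TC compression    : %s\n", HasExtension("GL_EXT_texture_compression_s3tc") ? "yes" : "no");
}
```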

Vulpine GLMark

At the moment only a 1.1 technology-preview version of the product is available, so any failures should be judged with that in mind.

So, here is the program's main menu. Most of the options are self-explanatory; I will only comment on a few of them:

  • GeForce3 (Compatibility section) enables a set of OpenGL extensions unique to this GPU (the details are not disclosed);
  • The program can auto-compress textures (like Quake3 does), and the Use Texture Compression item activates this mode (see the sketch after this list);
  • The Vertex Array Range item is unavailable on ATI RADEON based cards, since the corresponding extension is NVIDIA-specific. I hope that future versions will correct the current bias toward NVIDIA cards.
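For reference, texture "autocompression" in OpenGL is usually requested simply by asking the driver for a generic compressed internal format when uploading the texture; whether GLMark does exactly this is an assumption, the sketch only shows the common approach:

```cpp
#include <GL/gl.h>

// GL_COMPRESSED_RGBA_ARB comes from GL_ARB_texture_compression;
// older headers may not define the token, so it is given explicitly.
#ifndef GL_COMPRESSED_RGBA_ARB
#define GL_COMPRESSED_RGBA_ARB 0x84EE
#endif

// Upload a texture and let the driver compress it on the fly
// (typically to an S3TC/DXT format on GeForce and RADEON ICDs of that era).
void UploadAutoCompressed(const unsigned char* rgba, int width, int height)
{
    glTexImage2D(GL_TEXTURE_2D, 0,
                 GL_COMPRESSED_RGBA_ARB,   // ask for a driver-chosen compressed format
                 width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgba);
}
```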

GLMark can store the obtained results in a database so that you can always look them up later:

Unfortunately, the program chooses the storage folders itself, and things sometimes get messy: GLMark measures the CPU frequency on each run and puts that frequency into the folder name. Since the measurement has a tolerance (a 1000 MHz CPU may be reported as 999 MHz on one run and 1001 MHz on the next), results from the very same video card can end up in different folders. Moreover, the folders are named after the ICD OpenGL driver rather than after the video card, so, for example, results for GeForce2 GTS, Pro, and Ultra cards all land in one folder even though the cards are different.
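GLMark's exact measurement code is unknown, but the typical approach of that time, counting time-stamp-counter ticks over a short timed interval, naturally produces this kind of ±1 MHz jitter. A sketch under that assumption (the helper name is ours):

```cpp
#include <chrono>
#include <thread>
#include <cstdio>
#include <x86intrin.h>   // __rdtsc(); use <intrin.h> with MSVC

// Estimate the CPU clock by counting time-stamp-counter ticks over ~100 ms.
// Scheduling noise and integer rounding make the result drift by a MHz or so
// between runs, which is exactly why frequency-named folders diverge.
int MeasureCpuMHz()
{
    const auto t0 = std::chrono::steady_clock::now();
    const unsigned long long c0 = __rdtsc();
    std::this_thread::sleep_for(std::chrono::milliseconds(100));
    const unsigned long long c1 = __rdtsc();
    const auto t1 = std::chrono::steady_clock::now();

    const double seconds = std::chrono::duration<double>(t1 - t0).count();
    return static_cast<int>((c1 - c0) / seconds / 1e6 + 0.5);
}

int main()
{
    std::printf("CPU: %d MHz\n", MeasureCpuMHz());   // e.g. 999, 1000 or 1001
}
```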

Testing conditions

Testbed configuration:

  • CPU Intel Pentium III 1000 MHz;
  • Mainboard Chaintech 6OJV (i815);
  • RAM 256 MBytes PC133;
  • HDD IBM DPTA 20 GBytes;
  • OS Windows 98 SE;
  • Monitors: ViewSonic P810 (21") and ViewSonic P817 (21").

We examined the NVIDIA GeForce3 using the Leadtek WinFast GeForce3 card, and took the Leadtek WinFast GeForce2 Ultra and the ATI RADEON 64 MBytes DDR (183 MHz) for comparison. The NVIDIA-based cards were tested with version 10.70 drivers, and the RADEON with ATI's version 7.078 drivers.

Test results

Let's start with the overall performance in High Detail mode.

The absolute FPS values are rather low, which is no surprise given how complex the scenes are, especially in this mode. Overall the results match what we saw in our first GeForce3 review: the performance drop when moving from 16-bit to 32-bit color is insignificant, while in 16-bit color the GeForce3 slightly lags behind the GeForce2 Ultra. I included the ATI RADEON 64 MBytes DDR card because, apart from the GeForce3, it is the only one that supports new features such as bump mapping. Its performance, however, is very disappointing. It is better to set the RADEON results aside, since there is nothing to be proud of (the reason lies in GLMark's lack of support for a number of ATI ICD OpenGL extensions rather than in the RADEON's own capabilities).

What do the Low, Medium, and High Details settings mean? Look at the screenshots: Low Details mode is on the left and Medium on the right:

And here is the High Details mode:

Clearly, the modes differ in how many objects populate the scene, much like in 3DMark 2000. So how strongly does performance depend on scene complexity?

The results are impressive. Both cards, the GeForce3 and the GeForce2 Ultra, show sharp performance jumps as the number of objects in the scene decreases. Unfortunately, we do not know the polygon counts of the individual complexity levels, so it is hard to judge from GLMark how performance scales with polygon count. Just note that High Detail corresponds to the level of future games.

The diagrams shown at the beginning of the performance analysis reveal how much anti-aliasing costs. We have already discussed the new AA mode called Quincunx, which offers an excellent compromise between quality and speed.

And below is a shot with 4X AA (2x2) enabled.

There is no visible quality difference between the stronger 4X AA and Quincunx, yet the former causes a larger performance drop.

Now let's turn to anisotropic filtering on the GeForce3 and what it gives in GLMark. All of the following tests were carried out in 32-bit color.

Do not be upset by the performance drop when anisotropy is enabled, especially the heaviest mode with 32 texture samples. Let's check in practice what the different filtering modes actually give:

Here you can see 8-pixel anisotropy:

Anisotropic filtering is off:

Anisotropic filtering is on (8 samples):

And here is the 16-pixel anisotropy:

Anisotropic filtering is off:

Anisotropic filtering is on (16 samples):

And now the 32-pixel anisotropy:

Anisotropic filtering is off:

Anisotropic filtering is on (32 samples):

Clearly, you can tell one filtering mode from another only on a flat surface receding into the distance (a floor, for example). So why pay so much? In most cases, 8-sample anisotropic filtering is enough.
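For reference, this is how anisotropic filtering is normally enabled in OpenGL through the GL_EXT_texture_filter_anisotropic extension; how GLMark maps its 8/16/32 "sample" settings onto the anisotropy degree passed to the driver is not documented, so the sketch only shows the standard mechanism:

```cpp
#include <GL/gl.h>

// Tokens from GL_EXT_texture_filter_anisotropic; older headers
// may not define them, so the values are given explicitly.
#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
#define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
#endif

// Request a given degree of anisotropy (e.g. 8.0f) for the currently
// bound texture, clamped to whatever the driver reports as its maximum.
void SetAnisotropy(float requested)
{
    GLfloat maxSupported = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxSupported);
    if (requested > maxSupported)
        requested = maxSupported;
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, requested);
}
```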

Before turning to the image quality of all three cards, I want to draw your attention to S3TC. As I mentioned, GLMark implements texture autocompression when S3TC is activated. Quake3 already showed the drawbacks typical of GeForce-based cards with S3TC in 32-bit color. Let's see what happens in GLMark:

The first example:

S3TC is off:

S3TC is on:

The second example:

S3TC is off:

S3TC is on:

Well, yes, there are some imperfections, but remember that S3TC inherently implies some quality loss. The loss is minimal, and S3TC becomes a benefit rather than a liability, only when a game uses textures that were purposely prepared and compressed in advance. In our case the first diagram shows that enabling S3TC in GLMark yields little:
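The alternative mentioned above, textures compressed in advance by the artists, is uploaded through the dedicated entry point from GL_ARB_texture_compression instead of letting the driver compress on the fly. A sketch under that assumption (function pointer loading omitted for brevity):

```cpp
#include <GL/gl.h>

#ifndef APIENTRY
#define APIENTRY
#endif
#ifndef GL_COMPRESSED_RGBA_S3TC_DXT5_EXT
#define GL_COMPRESSED_RGBA_S3TC_DXT5_EXT 0x83F3
#endif

// Entry point defined by GL_ARB_texture_compression; on Windows it is
// fetched at run time, e.g. via wglGetProcAddress("glCompressedTexImage2DARB").
typedef void (APIENTRY *PFNCOMPRESSEDTEXIMAGE2D)(GLenum target, GLint level,
    GLenum internalFormat, GLsizei width, GLsizei height, GLint border,
    GLsizei imageSize, const void* data);
extern PFNCOMPRESSEDTEXIMAGE2D glCompressedTexImage2DARB;

// Upload a texture that was compressed to DXT5 offline: the artist-controlled
// path recommended above, as opposed to on-the-fly autocompression.
void UploadPrecompressedDXT5(const void* dxt5Data, int width, int height, int byteSize)
{
    glCompressedTexImage2DARB(GL_TEXTURE_2D, 0,
                              GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
                              width, height, 0, byteSize, dxt5Data);
}
```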

Features and quality

Here we will show what all three video cards are capable of, starting with the NVIDIA GeForce3.

The first example of bump mapping (EMBM)

ATI RADEON

NVIDIA GeForce2 Ultra

NVIDIA GeForce3

The second example of bump mapping (EMBM)

ATI RADEON

NVIDIA GeForce2 Ultra

NVIDIA GeForce3

Clearly, EMBM is implemented well in GLMark, and the NVIDIA GeForce3 supports this technique excellently. Keep in mind that the GeForce2 simply cannot do EMBM at all, so its dull water is no surprise. Why the RADEON failed to show its potential is less clear. There can be only two reasons:

  • The RADEON drivers fail to expose the capability (we have already seen this once with S3TC);
  • GLMark itself is at fault if it ties EMBM usage exclusively to the GeForce3 code path (see the sketch below).
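To illustrate the second hypothesis (this is only a sketch of how such a bias could arise, not GLMark's actual code): EMBM is exposed through different OpenGL extensions on the two architectures, GL_NV_texture_shader on the GeForce3 and GL_ATI_envmap_bumpmap on the RADEON, so an engine that checks only the NVIDIA extension would silently disable the effect on ATI hardware:

```cpp
#include <GL/gl.h>
#include <cstring>

// Same helper as in the earlier detection sketch.
static bool HasExtension(const char* name)
{
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return ext != nullptr && std::strstr(ext, name) != nullptr;
}

enum class EmbmPath { None, NvTextureShader, AtiEnvmapBumpmap };

// Hypothetical feature selection. If the second branch were missing
// (i.e. EMBM were enabled only together with the "GeForce3" option),
// the RADEON would render plain, non-bumped water exactly as observed.
EmbmPath ChooseEmbmPath()
{
    if (HasExtension("GL_NV_texture_shader"))
        return EmbmPath::NvTextureShader;
    if (HasExtension("GL_ATI_envmap_bumpmap"))
        return EmbmPath::AtiEnvmapBumpmap;
    return EmbmPath::None;   // e.g. GeForce2: no EMBM support at all
}
```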

Here are some more vivid examples:

ATI RADEON

NVIDIA GeForce2 Ultra

NVIDIA GeForce3

Again, the water is more realistic on the GeForce3. The RADEON also has obvious trouble rendering the lighting effects (and the NVIDIA GeForce2 Ultra could not reach the GeForce3's level either):

The first example

ATI RADEON

NVIDIA GeForce2 Ultra

NVIDIA GeForce3

The second example

ATI RADEON

NVIDIA GeForce2 Ultra

NVIDIA GeForce3

Conclusion

Now let's draw a conclusion. It is short and to the point:

  • The new Vulpine GLMark benchmark turned out to be a success. It is a really good product despite some downsides that can be attributed to the unfinished state of the program. GLMark can give testers valuable material for analysis and also serves as a good demo that shows the high competence of its designers and programmers. Unfortunately, a certain bias toward NVIDIA chipsets spoils the overall impression.
  • The NVIDIA GeForce3 showed top-notch quality, although the performance is not as high as one might wish. Still, the GeForce3 is the undisputed leader: on video cards of this level you can play games of even such high complexity at high resolutions with good playability.
  • Some of the ATI RADEON's problems can be explained by the reasons mentioned above, so we hope that future GLMark versions will bring the corresponding corrections and full support for more than just NVIDIA cards.


