iXBT Labs - Computer Hardware in Detail


Test Method for Evaluating PC Performance

Version 4, as of 2009.

August 3, 2009




Software groups and tests

First of all, it makes sense to list the software used in all tests:

  1. Microsoft Windows Vista Ultimate x64 + SP1
  2. NVIDIA Graphics Drivers x64 Vista 182.08

Plus the latest drivers for the chipsets, processors (if necessary), and other PC components.

Group 1: 3D visualization

It's a new group of tests, but it includes familiar software. We traditionally use three tests from three popular 3D suites: Autodesk 3ds max, Autodesk (aka Alias|Wavefront) Maya, and NewTek Lightwave.

Some of our readers note that the third package cannot compare with the first two in popularity. However, we believe that Lightwave has a small but stable share of the market (and it does not look like it will decrease in the future), as well as its own unique features, so removing this package from our test method would have narrowed its range of vision, so to speak, and disappointed Lightwave users. Neither outcome is any good. Moreover, our Lightwave test has become much more interesting in this version of the test method.

Our readers surely remember that results obtained in each of these programs fall into two types: the speed of interactive operation and the speed of rendering (Maya tests also produce separate results for CPU-intensive operations aside from rendering). The first type of load (interactivity) is sensitive to CPU and GPU speed (either Direct3D or OpenGL is used for video output), but it's quite indifferent to the number of cores (processors) in the system. On the other hand, rendering finished scenes is absolutely indifferent to the graphics system, but it's sensitive to CPU speed and the number of cores. Thus, we consider it conceptually correct to divide the interactive and render results of 3D modeling packages into two groups. The interactive score is calculated in the 3D Visualization group, while render speed results are published in a separate group: "3D Scene Rendering".

However, the visualization group should include not only results of interactive tests in 3D modeling packages, but also our other old acquaintances: the CAD/CAM suites SolidWorks, PTC Pro/ENGINEER, and UGS NX. Their tests contain a separate graphics component, which means essentially the same thing as the interactive score in 3D modeling tests: the speed at which a program running on certain hardware outputs 3D images to a display in real time using some graphics API (in our case all three CADs use OpenGL). Thus, this part of our test method includes seven packages: three 3D modeling suites, three CADs, and SPEC viewperf -- the tenth generation of the famous benchmark, which is also designed to evaluate the speed of real-time 3D visualization with OpenGL. Let's describe each test in more detail.

3ds max



Unfortunately, SPECapc still lacks a test for the current version of 3ds max. So we are forced to resort to more and more tricks to keep using an old benchmark (written for the 'grandfather' of the current suite) with each new version (3ds max 9 -- 3ds max 2008 -- 3ds max 2009). Alas, we don't know of an alternative benchmark that can compare with SPECapc for 3ds max 9. However, the situation worsens each year: the SPEC benchmark worked well with 3ds max 2008 (there were some minor issues, but all its features worked), but several fragments of its test script fail to work with 3ds max 2009: the program would either crash, or a garbled main window would testify to problems. As a result, we had to remove several scenes used in the original SPEC test and overhaul the summary table and the formulas used to calculate average scores (trying to preserve the original logic where possible). Thus, results obtained with this test now have nothing to do with results of the original SPEC test, and the two cannot be compared with each other.

Since we had to modify the test to make it work anyway, we took the opportunity to overhaul the SPEC benchmark a little and make its results even more interesting. We decided to modernize the rendering part: the modified version of the benchmark uses the popular professional V-Ray renderer for all scenes instead of the built-in 3ds max Scanline (which is of little interest to professionals). Besides, we now use more complex scenes: instead of the standard scenes that come with the test (quite easy for modern processors) -- Throne Shadowmap, CBALLS2, and Space FlyBy -- we 'fed' the renderer Scenes 01, 02, and 07 from Evermotion Archinteriors Vol 1. Our efforts hardly raised this benchmark to the level of a hypothetical SPECapc for 3ds max 2009. However, 'better a small fish than an empty dish.'

Maya



Even more time has passed since the release of SPECapc for Maya 6.5, so it's not a state-of-the-art benchmark either. However, as we already wrote in the article devoted to the previous version of the test method, a minor change in one scene file allows us to use it even with the latest Maya versions. In this form, the SPEC test contains only an interactive component and does not evaluate rendering speed. We make up for this shortcoming in the traditional way: we measure render times of our own Maya scene (with the MentalRay renderer). In other words, nothing has changed since the previous version of the test method except the program version.

Lightwave



SPEC has finally done it! They have released a test for Lightwave -- a sterling test for the latest version (including the 64-bit one). It has an interactive part and a rendering part, and it even measures speed in a multitasking environment. This is certainly much better than the single test (render time) used in our previous test methods -- in terms of both scope and developer authority. Unfortunately, we have no other comments: it's a new test, and we haven't gathered enough statistics yet. However, it meets our formal selection criteria: it supports modern software, it does not crash, and it provides test results.

SolidWorks



We've been using this test for several years already. Unfortunately, in all these years SPEC hasn't released a single update for its benchmark. So each time a new version of SolidWorks comes out, we hold our breath as we start SPECapc for SolidWorks 2007... and sigh with relief: we're lucky again, it works. The benchmark itself resembles the other SPECapc tests, including those for 3D modeling packages: it measures performance in three aspects -- processor-intensive operations, graphics, and I/O operations (the latter actually measures the speed of the disk system). Results represent the number of seconds it takes to execute the corresponding parts of the benchmark (the smaller the result, the better). Only one result is used in this group -- Graphics.

Pro/ENGINEER



We use the OCUS Benchmark for Pro/ENGINEER, as this test develops much more dynamically than SPECapc for Pro/ENGINEER. It has a 64-bit version that can use lots of memory (one of the main advantages of 64-bit systems), and it supports the latest version of Pro/ENGINEER Wildfire. So we have no reason to use the old test from SPEC. Even though the OCUS Benchmark is an independent project, it also provides three results of the same type: Graphics related tasks, CPU related tasks, and Disk related tasks. As with the previous test, each result is the number of seconds spent executing the tasks in a given group. Only Graphics related tasks are used for the visualization score.

UGS NX



It's another CAD/CAM package, and another old test from SPEC -- SPECapc for UGS NX 4 (we use the sixth version of the package in this test method). Fortunately, this test works well with the updated package. In fact, it looks very much like the previous two tests: three total scores (performance of graphics, processor, and I/O operations). In this group we are interested in the first component -- graphics.

SPEC viewperf 10



It's a famous test with a long and glorious history, developed to analyze graphics performance. This benchmark traditionally includes only applications that use OpenGL. In fact, viewperf is the most conventional tool for an integral evaluation of graphics card performance under the OpenGL API. It's an optional package in our test method: if we are interested in CPU performance, this benchmark makes no sense, as viewperf results do not depend on the CPU. But if we want a comprehensive comparison of two computers with different graphics cards, viewperf results included in the 3D Visualization group may be very useful. Version 10 of this test now supports multithreading. However, this support is quite primitive so far: the test can be executed in one, two, or four threads (there are no other options, such as three or eight threads). Besides, multithreading is used in the most primitive way: several instances of the test are started (that is, we see two or four identical windows with the same content).

Technical details

  1. Software: 3ds max 2009 x64 + SP1, V-Ray for 3ds max x64 1.50 SP2, SPECapc for 3ds max 9. Testing: install SPECapc for 3ds max 9, then download UIarrayAutomation.ms and copy it (or overwrite, if this file is already present) to \Program Files\Autodesk\3ds Max 2009\MAXTest\ScriptFiles. Go to \Program Files\Autodesk\3ds Max 2009\MAXTest\Scenes and replace the throne_shadowmap, cballs2, and space_flyby scenes with Scenes 01, 02, and 07 from Evermotion Archinteriors Vol 1. Copy the graphics files that come with the Evermotion Archinteriors scenes to \Program Files\Autodesk\3ds Max 2009\maps. Start the test: \Program Files\Autodesk\3ds Max 2009\MaxTestAuto.bat. The Graphics score is used in this part of the test method. The test is run five times, results obtained are averaged out.
  2. Software: Maya 2009 x64 + SP1, SPECapc for Maya 6.5. Testing: Install SPECapc for Maya 6.5, then replace \Program Files\SPECapc\MayaBenchmark\Insect\scenes\Insect.ma with the file downloaded from here. Then start the test according to its user's guide. GFX score is used here. The test is run five times, results obtained are averaged out.
  3. Software: Lightwave 3D x64 9.6. SPECapc for Lightwave 3D 9.6. Instructions on how to start the test come with SPECapc for Lightwave 3D 9.6, no changes are necessary. We use the Interactive score here: the lower the score, the better. The test is run five times, results obtained are averaged out.
  4. Software: SolidWorks 2009 x64 SP0.0, SPECapc for SolidWorks 2007. Testing: according to the SPECapc for SolidWorks 2007 guide, the error message should be ignored (just press OK). We use the Graphics score here: the lower the score, the better. By default, the test offers five iterations and averages the score on its own.
  5. Software: Pro/ENGINEER Wildfire x64 4.0 (M070), OCUS Benchmark v5.1. Testing: run the 64-bit version of the test according to the instructions. We use the Graphics related tasks score here: the lower the score, the better. The test is run five times, results obtained are averaged out.
  6. Software: UGS NX x64 6.0, SPECapc for UGS NX 4. Configuration: replace \Program Files\SPECapc\NX4mark\nx4mark.bat with the modified file. Testing: run the test according to the SPECapc for UGS NX 4 Guide. The Total Graphics score is used in this part of the test method. The test is run five times, results obtained are averaged out.
  7. Software: SPEC viewperf 10. Testing: start the single-, two-, and four-thread versions of the test; the resulting score for each subtest is the best of the three results. The test is run five times, results obtained are averaged out. The total score is calculated as a geomean of the results of all subtests.
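The viewperf scoring rule above (best of the three thread counts per subtest, then a geometric mean over all subtests) can be sketched as follows. This is only an illustration, not the actual SPEC tooling; the subtest names and numbers are made up:

```python
from math import prod

def viewperf_score(subtest_runs):
    """subtest_runs maps each subtest to its three scores obtained
    with 1, 2, and 4 threads (higher viewperf scores are better)."""
    # Take the best of the three thread configurations for each subtest...
    best = [max(scores) for scores in subtest_runs.values()]
    # ...then the geometric mean over all subtests.
    return prod(best) ** (1.0 / len(best))

# Hypothetical results for two subtests (1-, 2-, and 4-thread runs)
runs = {"catia-02": [20.0, 22.0, 21.0], "maya-02": [40.0, 44.0, 42.0]}
score = viewperf_score(runs)  # geomean of 22.0 and 44.0, about 31.1
```

The geometric mean is used instead of an arithmetic mean so that no single subtest dominates the total score.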

The total score in a group is calculated as a geomean of all test results (for tests where a lower value means a better result, that result enters the geomean as 1/x).
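This group-score rule can be sketched as a small function (an illustration only; the function name and example numbers are ours, not part of the test method). Results where a lower value is better, such as times in seconds, enter the geomean as 1/x, so a faster run always raises the group score:

```python
from math import prod

def group_score(results):
    """results is a list of (value, lower_is_better) pairs."""
    # Invert lower-is-better results so that "bigger" always means "better"
    adjusted = [1.0 / value if lower_is_better else value
                for value, lower_is_better in results]
    # Geometric mean of the adjusted values
    return prod(adjusted) ** (1.0 / len(adjusted))

# A time of 2 s (lower is better) and a score of 8 (higher is better):
# the geomean of 1/2 and 8 is 2.0
example = group_score([(2.0, True), (8.0, False)])
```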






Copyright © Byrds Research & Publishing, Ltd., 1997–2011. All rights reserved.