Test Method for Evaluating PC Performance




Group 10: HD video playback



The potential audience for this group of tests is anyone who watches high-definition video on a personal computer. We've developed a test to determine whether a given computer can cope with HD content (a qualitative rating: copes/doesn't cope) and how demanding this task is (a quantitative rating). The test analyzes CPU usage during HD video playback in one of the most popular formats: H.264. Unfortunately, the player we use for these tests (MPC-HC x64) has problems with DXVA support for the other popular format, VC-1, so we had to abandon the idea of using that format in our tests.

We use the software player Media Player Classic Home Cinema to play a 10-minute HD video clip and monitor CPU usage (all cores) with RivaTuner five times a second. Media Player Classic is configured to use EVR, the best option for this player in Microsoft Windows Vista. The player starts maximized, but not in fullscreen mode. The only information displayed is playback statistics. The video is played in two different scenarios:

  1. H.264 video, GPU assistance (DXVA) disabled;
  2. H.264 video, GPU assistance (DXVA) enabled.

The following results are calculated for each playback run using the RivaTuner logs:

  1. Average CPU usage during playback. This parameter corresponds to the average readings of Windows Task Manager.
  2. Minimum and maximum CPU usage registered during the test.
  3. In addition, a screenshot of Media Player Classic with playback statistics is taken at the end of each run. Here we are interested in the number of dropped frames: if it equals zero, no frames were dropped and the system copes with HD video playback; if it's above zero, some frames were dropped and the system cannot play HD video smoothly.

Just like SPECviewperf 10, this test hasn't been permanently assigned to any group: its results are published in a table, but they do not affect any total group score. Such optional tests are another novelty of our test method, and in our opinion they make it more flexible. Since their results do not affect total scores, they can be added to or removed from the method at any time. On the other hand, their results can still be analyzed, discussed, accumulated for statistics, and commented on by readers, which keeps things dynamic even after the test method is frozen for the next year and a half.

Technical details

Software: Media Player Classic Home Cinema x64 1.2.908.0, RivaTuner 2.24. Test conditions: RivaTuner is configured to monitor CPU usage (each core) five times a second and write the monitoring results to a log file. Start Media Player Classic Home Cinema and open the H.264 video fragment of Iron Man (30000 bitrate, 10 minutes long)*. Then analyze the log: first calculate instant CPU usage by summing the load of each processor core and dividing the result by the number of cores; then calculate the minimum, average, and maximum CPU usage. First run (Software Mode): the corresponding settings file is imported into the registry. Second run (DXVA Mode): the corresponding settings file is imported into the registry.

* We cannot upload this fragment, as that would be copyright infringement, so we provide an accurate description instead, allowing you to create the file on your own.
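
For illustration, here is a minimal Python sketch of the log analysis described above. The log layout is our assumption: we suppose the RivaTuner monitoring data has been exported to a CSV file with one column of CPU load (in percent) per core and one row per sample; RivaTuner's actual log format differs, so the parsing would need to be adapted.

    # cpu_log_stats.py -- a sketch of the CPU usage calculation described above.
    # Assumption: the log is a CSV file, one row per sample, one column per core.
    import csv

    def analyze_log(path):
        instant_usage = []
        with open(path, newline="") as f:
            for row in csv.reader(f):
                cores = [float(x) for x in row]  # per-core load, in percent
                # Instant CPU usage: sum of per-core loads divided by core count
                instant_usage.append(sum(cores) / len(cores))
        return (min(instant_usage),
                sum(instant_usage) / len(instant_usage),  # average over samples
                max(instant_usage))

    low, avg, high = analyze_log("rivatuner_log.csv")  # hypothetical file name
    print(f"CPU usage: min {low:.1f}%, avg {avg:.1f}%, max {high:.1f}%")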

Group 11: 3D games



The gaming part is traditional for our test method. However, significant changes have been made here (probably for the first time): this chapter is now divided into two large parts for different computer systems. The first part is quite traditional: it's designed for high-speed systems equipped with discrete graphics cards. This group uses modern games with high image-quality settings and a high resolution (1280x1024). We have finally abandoned the old popular idea of benchmarking CPUs with games at low resolution and low graphics quality. From our point of view, such manipulations render the results meaningless: they become fully synthetic and lose their connection with reality, because low-quality options simplify (or even disable) CPU computations. Thus, even within the same game engine, CPU load in LQ and HQ modes may differ both in volume and in the actual set of computations.

The second part consists of old games (three or four years old) with average graphics-quality settings and a low resolution (800x600). This group gives a complete picture of the gaming performance of computers with integrated graphics. Indeed, some integrated solutions still don't support DirectX 10, to say nothing of their speed in the tests from the first part: it doesn't matter whether such a computer demonstrates 3 or 4 fps in Crysis Warhead. Both are too slow, even though the latter has a formal 33% advantage over the former. So these tests are designed for weak computers, and they reflect what owners of such computers who like to play games usually do: install old hits and use low graphics options.

Technical details

Group 1: Game tests for PCs with discrete graphics cards.

  1. Crysis WARHEAD (Patch 1.1). We use an external benchmark from FRAMEBUFFER; benchmark settings are configured in the Benchmark Tool application. Save this configuration file into the folder \Users\<username>\Documents\My Games\Crysis_WARHEAD. The test is run five times, results obtained are averaged out.
  2. Devil May Cry 4. We use the benchmark built into the game (it provides four results, which are averaged into the final result). Save the configuration file in the folder \Users\<username>\AppData\Local\CAPCOM\DEVILMAYCRY4. This test is run only once, as its results are highly stable.
  3. Far Cry 2 (Patch 1.01). We use the benchmark built into the game. All options are specified in the command line. The test is run five times, results obtained are averaged out.
  4. Grand Theft Auto IV (Patch 1.2.1). We use the benchmark built into the game. Save the files with game settings (1, 2, 3) into \Users\<username>\AppData\Local\Rockstar Games\GTA IV\Settings. The test is run five times, results obtained are averaged out.
  5. Left 4 Dead. We use the recorded fragment of the game from Guru3D (instructions included). Save the configuration file into <folder with the installed game>\left4dead\cfg. Start the game with the -dev option, then open console (~), and type 'timedemo guru3d.dem'. The test is run five times, results obtained are averaged out.
  6. Lost Planet: Extreme Condition DX10 Demo. We use the benchmark built into the game. Save the configuration file into the folder \Users\<username>\AppData\Local\CAPCOM\lostplanetTrial. It's a continuous benchmark, so we stop it after 15 minutes. Test results are captured from the screen.
  7. S.T.A.L.K.E.R.: Clear Sky. DirectX 10 benchmark from the game developers. Import the test configuration file into the registry. The intermediate total is calculated as the arithmetic average of the daybenchmark, nightbenchmark, rainbenchmark, and sunshaftbenchmark results. This test is run five times, and the final result is calculated as the arithmetic average of the five intermediate results (see the sketch after this list).
  8. Unreal Tournament 3 (Patch 4) + Titan Pack. We use the integrated benchmark (imitation of the network game, when all players are controlled by the computer) with the CTF-FacingWorlds map (eight players, 20 minutes). Config files: unpack the first file into \Users\<username>\Documents\My Games\Unreal Tournament 3\UTGame\Config, the second file into \Program Files (x86)\Unreal Tournament 3\UTGame\Config, the third into \Program Files (x86)\Unreal Tournament 3\Engine\Config.
  9. World in Conflict (Patch 1.010). We use the benchmark built into the game. Save the configuration file into \Users\<username>\Documents\World in Conflict. The test is run five times, results obtained are averaged out.
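
As mentioned in item 7, here is a small Python sketch of the S.T.A.L.K.E.R.: Clear Sky scoring scheme. The fps numbers below are placeholders for illustration, not real results.

    # stalker_score.py -- the averaging scheme from item 7.
    # Each inner list holds the four sub-benchmark results (fps) of one run:
    # [daybenchmark, nightbenchmark, rainbenchmark, sunshaftbenchmark]
    runs = [  # placeholder numbers, not real measurements
        [52.1, 48.7, 44.3, 39.8],
        [51.8, 49.0, 44.1, 40.2],
        [52.4, 48.5, 44.6, 39.5],
        [51.9, 48.9, 44.2, 40.0],
        [52.0, 48.6, 44.4, 39.9],
    ]

    # Intermediate total of a run: arithmetic average of its four sub-benchmarks.
    intermediates = [sum(run) / len(run) for run in runs]

    # Final result: arithmetic average of the five intermediate totals.
    final_result = sum(intermediates) / len(intermediates)
    print(f"Final result: {final_result:.2f} fps")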

Group 2: Game tests for computers with integrated graphics cores.

  1. Call of Duty 2 (Patch 1.3). We use the recorded fragment of the game from techPowerUp.com (test instructions included). Save the configuration file into the folder \Program Files (x86)\Call of Duty 2\main\players\<username in the game>. The test is run five times, results obtained are averaged out.
  2. DOOM 3 (Patch 1.3.1). We use the integrated benchmark (demo1). Save the config file into the folder \Program Files (x86)\DOOM 3\base. Here is the command line to start the test: doom3.exe +set logfile 1 +set com_showFPS +set timescale 7 +set r_mode 4 +playdemo demo1 +wait 500 +playdemo demo1 +wait 500 +playdemo demo1 +wait 500 +timedemoquit demo1. The test is run five times, results obtained are averaged out.
  3. Far Cry (Patch 1.4). We use the integrated benchmark (Training level). Save the config file into the folder \Program Files (x86)\Far Cry. Here is the command line to start the test: FarCry.exe -DEVMODE "#demo_num_runs=3" "#demo_quit=1" "#r_Width=800" "#r_Height=600" "#s_DummySound=1" "map training" "demo training". The test is run five times; in each run the highest of the three demo results is used as an intermediate score, and the five intermediate scores are averaged out.
  4. Serious Sam 2 (Patch 2.070). We use the integrated benchmark (Demo_0001). Save the config file into the folder \Program Files (x86)\Serious Sam 2\Content\SeriousSam2. Here is the command line to start the test: Sam2.exe +demo content\serioussam2\demos\Demo_0001.dem +bmk_bAutoQuit 1 +bmk_bBenchmarkDemos 1 +sam_demo 1 +sam_bBootSequence 0 +fullscreen 1 +width 800 +height 600. The test is run five times, results obtained are averaged out.
  5. S.T.A.L.K.E.R.: Shadow of Chernobyl (Patch 1.6). We use the flyby benchmark buildings_timedemo (instructions included). Save the config file into the folder \Users\Public\Documents\STALKER-SHOC. The test is run five times, results obtained are averaged out.
  6. Unreal Tournament 2004 (Patch 3369). We use the integrated benchmark. Save the config file into the System folder of the installed game. Here is the command line to start the test: UT2004.exe BR-Colossus.ut2?spectatoronly=1?attractcam=1?quickstart=1?numbots=8 -benchmark -nosound -seconds=900 -800x600. The test is run five times, results obtained are averaged out.

The total score in each group is calculated as the geometric mean of the results demonstrated by all its tests:

  • Group 1 for computers with discrete graphics;
  • Group 2 for computers with integrated graphics.
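
For reference, here is a minimal Python sketch of this geometric-mean calculation; the per-test results are placeholders, not real data.

    # group_score.py -- geometric mean of a group's test results.
    import math

    results = [45.2, 61.0, 38.5, 72.3, 55.1]  # placeholder fps values

    # Geometric mean: the n-th root of the product of n results. Averaging
    # the logarithms avoids overflow when many results are multiplied.
    geomean = math.exp(sum(math.log(r) for r in results) / len(results))
    print(f"Total group score: {geomean:.2f}")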

Conclusions

This method for testing PC performance is evolutionary. It's based on the previous version, which was positioned as a CPU benchmarking method; both feature similar benchmarks and test groups. However, it's not a transitional product. This version of our test method reflects an already established trend: modern end users are interested in the speed of the entire computer in daily applications rather than in the performance of separate components.

From this point of view, this test method is practically a finished version of the old classical approach, having borrowed its best features (scope, traditions). On the other hand, it's an attempt to introduce as many new features into that approach as possible: a focus on real tasks, a retreat from benchmarking separate subsystems of a modern computer in favor of a complex approach, and finally, a transition from testing separate components to benchmarking complete solutions.

Nevertheless, we have also tried to meet the needs of users who advocate the classical approach to CPU benchmarking. Hopefully we've succeeded, so you can obtain maximum information in the classic format as well.

