iXBT Labs - Computer Hardware in Detail






Test Method for Evaluating PC Performance


Group 4: Bitmap processing

Here is a little secret: this part of the test method got such an ambitious title (instead of, say, 'Working with a digital camera') because our test lab, in cooperation with third-party experts, is preparing another part of the test method: Vector Graphics. Unfortunately, the development of those benchmarks is far from complete, so this version of the test method does not include that part yet. We hope to add it in the next version. And now let's get back to the present.

Adobe Photoshop's hegemony on the bitmap-editor market is an objective fact, which we have come to take for granted. Our first timid attempt to add something new here was made in the previous version of the test method (we only added a new 'parallel' group). Perhaps it was not very consistent (three viewers and one optimizer were doing the same job, batch-converting files, plus a deliberately simple graphics editor), but it was a start. Today the Bitmap Graphics group uses three full-fledged graphics editors: the latest version of Adobe Photoshop (certainly), Corel PhotoImpact X3 (formerly Ulead PhotoImpact), and Corel PaintShop Pro Photo X2 (formerly Jasc Paint Shop Pro). Besides, we still use a simple small graphics editor, Paint.NET (it's actually sufficient for most non-professional tasks), plus the most popular graphics viewer, ACDSee. The latter now performs a more complex task: it converts RAW files into JPEG. We convert NEF (Nikon) and CR2 (Canon) files (in equal parts) with a total size of about 1 GB.

We overhauled the Adobe Photoshop test to increase the load; that is, it has become a more complex task. However, the idea remains the same: the test script measures the time it takes to perform Actions grouped into Blur, Sharp, Light, Resize, Rotate, Convert (from one color model into another), Transform, and Filters. Each group has the same weight in the total score, and you can also look up the results of individual groups. We developed the Corel PhotoImpact X3 and PaintShop Pro Photo X2 tests in the same way: load a test image, process it with a script, and measure how long it takes (the shorter, the better). However, in these two benchmarks we don't divide operations into groups, so we measure the execution time of the entire script.
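Since each Action group carries equal weight, folding the eight group times into one score amounts to taking a geometric mean. Below is a minimal sketch of that scoring step (not the lab's actual script; the group times are made-up placeholders standing in for the averaged five-run measurements):

```python
from statistics import geometric_mean

# Hypothetical averaged execution times (seconds) for the eight Action
# groups named in the text; real values come from five timed runs each.
group_times = {
    "Blur": 42.0, "Sharp": 38.5, "Light": 27.3, "Resize": 19.8,
    "Rotate": 22.1, "Convert": 31.4, "Transform": 25.6, "Filters": 55.2,
}

# Equal weights: the total score is the geometric mean of all group
# times (shorter is better, just as with each individual group).
total_score = geometric_mean(group_times.values())
print(f"Total score: {total_score:.2f} s")
```

A geometric mean keeps one unusually long group (Filters here) from dominating the total the way an arithmetic mean would.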

Technical details

  1. Software: ACDSee Pro 2.5.363 (RAW Plugin 4.0.62). Configuration: a file to be imported into the registry. Testing: open the folder with test files in the RAW format*, then choose 'Select All' and 'Batch Convert File Format' (JPEG). This test measures conversion time. The test is run five times, results obtained are averaged out.
  2. Software: Paint.NET x64 3.36, PdnBench 3.20. Startup instructions come with the benchmark. The test is run five times, results obtained are averaged out.
  3. Software: Adobe Photoshop CS4 x64 (11.0.1). Configuration: unpack environment settings into \Users\<username>\AppData\Roaming\Adobe\Adobe Photoshop CS4\ (if this folder already contains files, they should be deleted prior to this step). Start Adobe Photoshop CS4 x64 and add Actions for the test. Open the test image. F2-F9 keys start corresponding groups of tests. Measure execution time of each group five times then average the results out. The total score is calculated as a geomean of results demonstrated by all groups.
  4. Software: Corel PhotoImpact X3. Configuration: unpack environment settings into \Users\<username>\AppData\Roaming\Ulead Systems\ (if this folder already contains the Ulead PhotoImpact subfolder, it should be deleted). Start Corel PhotoImpact X3 and open the test image. Run the iXBT PhotoImpact Test script from the standard Quick Command panel, and measure the execution time. Repeat it five times and average the results.
  5. Software: Corel PaintShop Pro Photo X2. Configuration: import this file into the registry, unpack environment settings to \Users\<username>\Documents\My PSP Files\ (delete all other files in this folder). Here is the command to start the test: "\Program Files (x86)\Corel\Corel Paint Shop Pro Photo X2\Corel Paint Shop Pro Photo.exe" "\Users\<username>\Documents\My PSP Files\Scripts-Trusted\test.pspScript" "<name of the test file with a full path>". Result is execution time of the test script. The test is run five times, results are averaged out.

* As the test set is very big, we'll email the link at your request.
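Every test in this group follows the same pattern: run a task five times and average the wall-clock times. A minimal harness for that pattern might look like the sketch below; the command here is a trivial placeholder, not the lab's actual launcher, and in a real run it would be replaced by the converter or editor invocation from the steps above:

```python
import subprocess
import sys
import time

def timed_runs(cmd, runs=5):
    """Run `cmd` the given number of times and return the average
    wall-clock execution time in seconds (shorter is better)."""
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, check=True)  # raises if the task fails
        times.append(time.perf_counter() - start)
    return sum(times) / len(times)

# Placeholder workload standing in for a real batch-conversion command.
average = timed_runs([sys.executable, "-c", "pass"])
print(f"Average over 5 runs: {average:.3f} s")
```

Averaging over five runs smooths out one-off disk-cache and background-activity noise, which matters once individual run times get short.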

Group 5: Data compression

Measuring the speed of lossless compression of heterogeneous data (i.e. archiving) is a traditional benchmark. Besides, the system parameters critical for most archivers have been known for a long time: first comes the computational power of the processor's integer units; then the total size and performance of all cache levels; the third parameter is memory performance, where both linear read/write rates and latencies matter. Processor bus throughput could also affect archiving speed, but that concerns the (now past) days when memory controllers were implemented in chipsets; today it practically never does.

Our benchmark traditionally consists of compressing the same file set with each archiver used in the method: BMP (uncompressed image), DBF (database), DLL (dynamic library), DOC (Microsoft Word 97/2003 document), PDF (PDF document), and TXT (plain text, Win-1251). Practically nothing has changed in this test; we were only forced to increase the size of the file set so that short archiving times would not affect the repeatability of results. It goes without saying that we also updated the software. Some programs were updated in a cardinal way: for the first time, WinRAR is represented by a 64-bit build. Unfortunately, a final 64-bit release was not available at the time we worked on the new test method, but its beta version was stable enough to be included in our tests. Archiving options haven't changed for many years: a solid archive, maximum compression, the number of threads equal to the number of processor cores. To be fair, it makes no sense to specify more than two threads for 7-Zip, as it does not support more. But we still specify the real number of cores in its options, just in case.

Some of you may ask: why don't we use the built-in benchmarks? Both 7-Zip and WinRAR have them. The answer is simple: unfortunately, the benchmark built into WinRAR proves that realistic results are not guaranteed even when a benchmark is written by the author of the program. For example, the WinRAR benchmark responds very well to an increasing number of processor cores (up to four). However, that does not happen when the archiver does real work: performance gains starting from the third core are significantly lower than in the benchmark. To all appearances, the benchmark imitates the archiver's program core in the ideal situation (for multithreading), hence the discrepancy with reality. So we decided that the good old method of timing real file archiving suits our purposes better: to provide users with information about the real performance of PC components using real user tasks. Besides, it would be silly to use two different methods in a group consisting of only two tests, so the question of the integrated 7-Zip benchmark was resolved in the same way.

Technical details

  1. Software: 7-Zip x64 4.65. Testing: archiving a file set*. Command line to start the test: 7z.exe a -r -mx=9 -mmt<the number of processors in a system> <archive name> <path to the file set>\*.*. This test measures archiving time. The test is run five times, results obtained are averaged out.
  2. Software: WinRAR x64 3.90 beta 3. Testing: archiving a file set*. Command line to start the test: rar.exe a -r -m5 -s -mdg -mt<the number of processors in a system> <archive name> <path to the file set>\*.*. This test measures archiving time. The test is run five times, results obtained are averaged out.
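The `-mmt`/`-mt` switches in the command lines above take the number of processors, so a launcher script can fill that in automatically. A hypothetical sketch of building those two command lines (archive names and file-set paths are placeholders, not the lab's actual paths):

```python
import os

# Number of logical processors in the system, as the text prescribes.
cores = os.cpu_count() or 1

# The 7-Zip and WinRAR command lines from the steps above, with the
# detected core count substituted in; paths here are placeholders.
sevenzip_cmd = f"7z.exe a -r -mx=9 -mmt{cores} test.7z fileset\\*.*"
winrar_cmd = f"rar.exe a -r -m5 -s -mdg -mt{cores} test.rar fileset\\*.*"

print(sevenzip_cmd)
print(winrar_cmd)
```

In practice the strings would then be passed to the timing harness and run five times, as described above.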

The total score in this group is calculated as a geomean of results demonstrated by all tests.

* As the archive is very big, we'll email the link at your request.


Copyright © Byrds Research & Publishing, Ltd., 1997–2011. All rights reserved.