
Testing method for HDD (IDE)



In this article we describe our testing technique for hard discs with the IDE interface. We pursue two aims:

  • First, to explain what we are doing, why, and what for.
  • Second, to ensure repeatability of the tests. Since all the tests used are freely available, you can use this method to thoroughly test your own disc. Of course, the exact figures will depend on the mainboard, the controller, the memory size, etc.

1. Test system

The test system consists of a hardware part and a software part.

Hardware

The biggest problem in choosing the hardware is to postpone its obsolescence as long as possible, since any upgrade leads to a loss of comparability of results. The i815E chipset was taken as the base because:

  • it is relatively new;
  • it is widely used;
  • it supports ATA/100 at the chipset level;
  • it supports a 133 MHz FSB and PC133 memory;
  • it has integrated video.

As for the mainboard, we chose the Iwill WO2-R because Iwill products have proved to work stably. Besides, the onboard ATA/100 RAID controller from American Megatrends also influenced our decision.

The Intel Pentium III 800EB processor was chosen because it is fast enough not to become a bottleneck for current hard discs, or for future ones.

The memory size was chosen to be 256 MB, which is quite enough for today.

The IBM DTLA 307015 serves as the system disc, which holds the operating system, the tests and their results. No problems were observed when working with the i815E chipset.

So, here is what we ended up with:

  • Iwill WO2-R (BIOS ver. 6.00PGN) mainboard;
  • Intel Pentium III 800EB processor;
  • 128 MBytes PC133 SDRAM memory;
  • IBM DTLA 307015 system disc.

Software

The software part consists of the operating system and device drivers.

We chose Microsoft Windows 2000 Professional with Service Pack 1: it lets us test discs with both NTFS and FAT32 file systems under a single OS. Besides, with the release of Service Pack 1 the system has become more stable.

As for video drivers, we installed only the standard VGA driver bundled with the OS, since the display serves a purely decorative function here. We did not install drivers for the integrated sound either. The only additional driver installed was that of the Ultra ATA controller, version 6.1.4.

2. Test set

Ziff-Davis WinBench

We use the Disk Inspection Tests and the Disk WinMark tests from Ziff-Davis WinBench 99 ver. 1.1. They work with logical discs.

The Disk Inspection Tests measure the physical characteristics of discs. They include:

  • Disk Transfer Rate. This test measures the linear read speed in thousands of bytes per second. It reports two values - the speed at the beginning and at the end of the disc - and, if required, a graph in BMP format or a table of values in CSV (comma-separated values) format.
  • Disk Access Time. This test measures the access time in ms. The final value equals the sum of the average latency and the average seek time (see the sketch after this list).
  • Disk CPU Utilization. This test shows the CPU utilization (in percent) during data exchange with the disc.
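
To make the access-time relation concrete, here is a minimal illustrative sketch (in Python; it is not part of WinBench), assuming a hypothetical 7200 rpm drive with an 8.5 ms average seek time - both figures are examples, not measurements.

```python
# Illustrative sketch (not WinBench code): access time as the sum of
# average rotational latency and average seek time.
# The drive parameters below are assumptions, not measured values.

def average_latency_ms(rpm: float) -> float:
    """Average rotational latency = half a revolution, in milliseconds."""
    return 60_000.0 / rpm / 2.0

def access_time_ms(rpm: float, avg_seek_ms: float) -> float:
    """Access time as reported by the test: average latency + average seek."""
    return average_latency_ms(rpm) + avg_seek_ms

# A hypothetical 7200 rpm drive with an 8.5 ms average seek:
print(access_time_ms(7200, 8.5))   # ~12.7 ms
```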

The Disk WinMarks include two tests - Business Disk WinMark and High-End Disk WinMark - each of which is, in turn, a compound test.

  • The Business Disk WinMark gives the average speed of disc operation with a set of business applications, in thousands of bytes per second.
  • The High-End Disk WinMark gives the average speed of disc operation with a set of high-end applications, plus a speed result for each application. The applications are the following:
    • AVS/Express 3.4 - a visualization program.
    • Microsoft FrontPage 98 - an HTML editor.
    • Bentley Systems MicroStation SE - a CAD program.
    • Adobe Photoshop 4.0 - an image editing program.
    • Adobe Premiere 4.2 - a video editing program.
    • Sonic Foundry Sound Forge 4.0 - a sound processing program.
    • Microsoft Visual C++ 5.0 - a C++ compiler.

    The programs were selected so that both kinds of workloads are represented: applications working with small files (FrontPage) and applications working with relatively large files (Photoshop). Note that the test does not always include the latest versions of the applications.

The benchmark tests can be downloaded.

Intel IOMeter

IOMeter, unlike WinBench, which is based on real applications, is a purely synthetic test. This gives flexibility but also creates a lot of problems for the tester when choosing settings: one can easily create a configuration that has nothing in common with reality. Additional difficulty comes from the fact that IOMeter can test not only a single disc on a uniprocessor machine (which is what we do), but also disc arrays in multiprocessor configurations and even computers over a network.

When choosing settings I used the technique developed by StorageReview; you can find it there as well (Operating Systems and Benchmarks - Part 4). Below is a short description of the test.

IOMeter works with so-called "workers". Intel recommends creating one worker per processor, so we have one worker. Each worker tests one or more targets, which are either an unpartitioned physical disc or one or several partitions on a disc. Each worker is then assigned an access pattern - a set of parameters according to which the worker accesses the target.

An access pattern contains the following variables:

  • Transfer Request Size - the size of the data block used by each request.
  • Percent Random/Sequential Distribution - the percentage of random requests; the rest are sequential.
  • Percent Read/Write Distribution - the percentage of read requests.

Another important variable, which is not formally part of the access pattern - # of Outstanding I/Os - defines the number of simultaneous I/O requests for the given worker and, correspondingly, the disc load.

So, by setting parameters arbitrarily we can get a wide range of incomparable results that have little practical value. The question arises: how do we set up an access pattern so that it models disc operation under real conditions? Here I used the technique developed by StorageReview.

So there are three access patterns - File Server (defined by Intel and supplied with IOMeter), Workstation and Database (defined by StorageReview). Below is a table of parameters for each pattern, taken from StorageReview (Operating Systems and Benchmarks - Part 5), where you can also read why these patterns were chosen.

Access Patterns

File Server Access Pattern (as defined by Intel)
  % of Access Specification   Transfer Request Size   % Reads   % Random
  10%                         0.5 KBytes              80%       100%
  5%                          1 KBytes                80%       100%
  5%                          2 KBytes                80%       100%
  60%                         4 KBytes                80%       100%
  2%                          8 KBytes                80%       100%
  4%                          16 KBytes               80%       100%
  4%                          32 KBytes               80%       100%
  10%                         64 KBytes               80%       100%

Workstation Access Pattern (as defined by StorageReview.com)
  % of Access Specification   Transfer Request Size   % Reads   % Random
  100%                        8 KBytes                80%       80%

Database Access Pattern (as defined by Intel/StorageReview.com)
  % of Access Specification   Transfer Request Size   % Reads   % Random
  100%                        8 KBytes                67%       100%
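
For convenience, the same patterns can be written down as plain data. The sketch below uses our own Python notation, not IOMeter's configuration format; as an example, it also computes the average request size of the File Server pattern.

```python
# The three access patterns from the table above, written as Python data.
# Each tuple: (share of requests, transfer size in KBytes, % reads, % random).
# This is an illustration only, not IOMeter's own configuration format.

FILE_SERVER = [
    (0.10, 0.5, 80, 100),
    (0.05, 1,   80, 100),
    (0.05, 2,   80, 100),
    (0.60, 4,   80, 100),
    (0.02, 8,   80, 100),
    (0.04, 16,  80, 100),
    (0.04, 32,  80, 100),
    (0.10, 64,  80, 100),
]
WORKSTATION = [(1.00, 8, 80, 80)]
DATABASE    = [(1.00, 8, 67, 100)]

def average_transfer_size_kb(pattern):
    """Weighted average request size for a pattern, in KBytes."""
    return sum(share * size for share, size, _, _ in pattern)

print(average_transfer_size_kb(FILE_SERVER))   # ~11.08 KBytes per request
```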

Now a few words about the # of Outstanding I/Os parameter. If it is set to 1, then with 100% Percent Random/Sequential Distribution we in fact measure the random access time. A value of 4 corresponds to the load of an elementary application such as Windows Calculator. According to StorageReview, in real applications this parameter averages 30-50, and values above 100 correspond to a heavy disc load (e.g. during defragmentation). Accordingly, they suggest using the following five values for this parameter:

Loads
  Linear       1 Outstanding I/O
  Very Light   4 Outstanding I/Os
  Light        16 Outstanding I/Os
  Moderate     64 Outstanding I/Os
  Heavy        256 Outstanding I/Os

Now for the main thing - what we get in the end. The following results are recorded:

  • IOps - Total I/Os Per Second - the average number of requests completed per second. A request consists of positioning plus reading/writing a block of the corresponding size. Read IOps and Write IOps are reported separately.
  • MBps - Total MBs Per Second - the same metric expressed differently. For patterns that use a single request size (Workstation and Database) it is simply Total I/Os Per Second multiplied by the request size. Read MBps and Write MBps are also reported.
  • Average I/O Response Time - under linear load (1 outstanding I/O) it carries the same information as Total I/Os Per Second (Total I/Os Per Second = 1000 milliseconds / Average I/O Response Time); see the sketch after this list. As the load grows, the response time rises, but not in direct proportion - the result depends on the optimization of the drive firmware, the bus and the OS. Average Read Response Time and Average Write Response Time are reported as well.
  • Maximum Response Time. Maximum Read Response Time and Maximum Write Response Time are reported in this case.
  • The total number of bytes read and written, as well as the number of read and write operations.
  • % CPU Utilization.
  • CPU Effectiveness - I/Os per % CPU Utilization.
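
The relation between response time and request rate mentioned above can be illustrated with a short sketch (Python, not IOMeter code); the 13 ms response time used in the example is a made-up figure.

```python
# Sketch of the relations described above (not IOMeter code).
# Under linear load (1 outstanding I/O) the average response time and the
# request rate are two views of the same measurement.

def iops_from_response_time(avg_response_ms: float) -> float:
    """Total I/Os Per Second = 1000 ms / Average I/O Response Time."""
    return 1000.0 / avg_response_ms

def mbps_from_iops(iops: float, request_size_kb: float) -> float:
    """For single-size patterns (Workstation, Database): MBps = IOps * size."""
    return iops * request_size_kb / 1024.0

# Example: a hypothetical 13 ms average response time at 1 outstanding I/O
# corresponds to about 77 requests per second; with 8 KB requests ~0.6 MB/s.
print(iops_from_response_time(13.0))                      # ~76.9 IOps
print(mbps_from_iops(iops_from_response_time(13.0), 8))   # ~0.60 MBps
```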

All results are output as tables in CSV format. The Trial Version of this test can be downloaded here.

3. Testing technique

The disc under test is unpacked and installed as master on the first channel of the mainboard's integrated IDE controller. The disc is left unpartitioned and unformatted.

The OS is booted from the system disc, after which 15 Intel IOMeter runs are carried out (5 load levels for each of the three access patterns). The test allows setting the run duration (in the Trial Version this can only be done manually, by pressing the STOP button) and the time from the start of the test to the start of measurements (ramp-up time). I set the run duration to 10 minutes and the ramp-up time to 30 seconds. All results are recorded onto the system disc.
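
For clarity, here is a small sketch of the resulting run matrix (a Python illustration of the schedule described above, not an IOMeter script).

```python
# Sketch of the IOMeter run matrix: 3 access patterns x 5 load levels = 15 runs,
# each 10 minutes long after a 30-second ramp-up.

PATTERNS = ["File Server", "Workstation", "Database"]
OUTSTANDING_IOS = [1, 4, 16, 64, 256]   # Linear ... Heavy

RUN_TIME_MIN = 10
RAMP_UP_SEC = 30

runs = [(p, q) for p in PATTERNS for q in OUTSTANDING_IOS]
assert len(runs) == 15

for pattern, queue_depth in runs:
    print(f"{pattern}: {queue_depth} outstanding I/Os, "
          f"{RAMP_UP_SEC} s ramp-up, {RUN_TIME_MIN} min run")
```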

OS rebooting.

Using standard OS tools, a single partition spanning the whole disc is created. The partition is formatted with the NTFS file system.

OS rebooting.

Ziff-Davis WinBench is started, the test set described above is assembled and run. This is done three times, rebooting the OS between runs. All results are recorded into the corresponding database. After this stage the partition is deleted.

OS rebooting.

Using standard OS tools, a single partition spanning the whole disc is created. The partition is formatted with the FAT32 file system.

OS rebooting.

Ziff-Davis WinBench is started, the test set described above is assembled and run. This is done three times, rebooting the OS between runs. All results are recorded into the corresponding database.

The final results for each file system are obtained by averaging the three runs.
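
The averaging itself is trivial; a sketch of it, with made-up numbers purely for illustration, could look like this.

```python
# Sketch: averaging the three WinBench runs for one file system.
# The metric name and values below are hypothetical examples, not measurements.

def average(runs):
    """Final result for one metric = arithmetic mean of the three runs."""
    return sum(runs) / len(runs)

# e.g. Business Disk WinMark (thousand bytes/s) measured three times under NTFS:
ntfs_business_winmark_runs = [6230, 6190, 6250]   # made-up numbers
print(average(ntfs_business_winmark_runs))        # ~6223
```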

Now it is up to you to analyze them. I recommend keeping the source test results in their original form: these files contain a lot of auxiliary data which can be useful for a detailed analysis.
