AIM Benchmarks

Mark Campbell campbell at sauron.UUCP
Tue Dec 10 00:43:56 AEST 1985


> 	I am interested in finding out what people
> think of the AIM benchmarks.  How useful are they
> to you?  How would you improve them?

[Preliminary Disclaimer: This article in no way reflects the judgement
or policy of NCR Corporation.  I alone am responsible for its
contents.]

There are two AIM benchmarks currently in use: AIM 1.5 and AIM 2.0.
AIM Technology has a pretty restrictive licensing policy with both.
Publishing the results of these benchmarks is subject to this
licensing policy.

AIM 1.5 consists of eight tests:
  - A C Compiler Test		- A Disk Write Throughput Test
  - An FP Test			- A Multi-User Edit Test
  - A Multi-User Sort Test	- A Million Operation Test
  - A Memory Throughput Test	- An Interprocessor Communication Test

AIM 2.0 consists of a large collection of generally smaller tests
and uses a linear modeling technique along with a user-specified
system mix ratio to approximate system performance.  The disk
throughput benchmark tests both disk reads and writes with a
large fixed buffer size.  There are no explicit multi-user tests.

Both are decent benchmarks for measuring uniprocessor system
performance; however, neither should be used to judge
multiprocessing systems.  The linear modeling scheme used to
determine system performance in AIM 2.0 is highly suspect.
Only the individual test results of either should be used.

AIM 1.5's disk write throughput test is a pretty interesting
concept...it can expose some true weaknesses in a given
implementation of a file system (hint: Let the buffer size
range up to 16K for fairness, and then test a BSD file system.
The results are pretty interesting.).

In order to improve these benchmarks, I'd probably try to
merge both of them together.  I would throw out the AIM 2.0
disk throughput tests in favor of those in AIM 1.5.  I'd
also make the multi-user benchmarks of AIM 1.5 a bit more
realistic.  Otherwise, I'd use the results of both.

I've seen better benchmarks, but they all take 2 hours or
more to execute.  If you want a simple benchmark that
executes pretty quickly, neither is a bad choice.

P.S. "A benchmark proves which system executes the benchmark
      the fastest...that's it."
					    --- Anonymous
-- 

Mark Campbell
Phone:  (803)-791-6697
E-Mail: {decvax!mcnc, ihnp4!msdc}!ncsu!ncrcae!sauron!campbell



More information about the Comp.unix.wizards mailing list