Looking for UNIX workloads

David Hinnant dfh at ecsvax.uncecs.edu
Fri Jul 22 02:34:23 AEST 1988



The Performance Metrics working group of the /usr/group Technical
Committee is involved in identifying/acquiring/developing UNIX workloads
that will be proposed as "standards" to be used by benchmark vendors and
users alike.  The workloads defined will be "functional workloads" only:
not actual workload instructions tied to a particular organization,
benchmarking tool, or vendor, but pseudo-English functional
descriptions, perhaps not unlike government benchmarking specifications
of the form "Insert Slot A into Tab B" (e.g., invoke the editor on the
file xyzzy; compile foo.c; invoke 'ls' on the 'src' directory; run SPICE
on the file workle.data; change spreadsheet cell A-3 to $63.97 and
recompute; etc.).
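To make the idea concrete, a functional description of this kind might
eventually be instantiated by a benchmarker as something like the
following shell sketch.  The file names, steps, and the use of sed in
place of an interactive editor are all placeholders of my own, not part
of any proposed standard:

```shell
#!/bin/sh
# Hypothetical sketch: turning pseudo-English functional workload steps
# into a runnable script.  All names here are illustrative placeholders.

mkdir -p src
printf 'hello\nworld\n' > src/xyzzy

# "Invoke 'ls' on the 'src' directory"
ls src

# "Invoke the editor on the file xyzzy" -- approximated here with a
# non-interactive sed edit; a real workload run would drive ed or vi
sed 's/hello/HELLO/' src/xyzzy > src/xyzzy.new
mv src/xyzzy.new src/xyzzy

# "Count the lines in the edited file"
wc -l < src/xyzzy
```

The functional workload itself would stay at the pseudo-English level;
the point is that each step is concrete enough that any vendor could
script it against their own tools and measure it.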

We're interested only in 'workloads' (actual user-level tasks), not
'benchmarks' (concise, variably accurate approximations of user tasks
or system functions - at least, that's my definition of 'benchmark' for
today...)

These workloads will cover most of the areas where UNIX machines are
in wide use, including:

   1) Scientific number crunching
   2) Office Automation:
	- Database
	- Spreadsheet
	- Text Processing
	- Presentation Graphics
	- etc.

   3) Software Engineering
   4) Hardware Engineering
   5) Transaction Processing
   6) Insert your favorite stereotypical UNIX user here

So as not to reinvent the wheel, we would like to
examine/include/incorporate/clone (as appropriate) workloads developed
by others.  There are several sources for existing workloads:

(1) Computer system vendors are thought to have the most experience in
developing and testing workloads, but this is a touchy situation.
Surely AT&T, DEC, IBM, and other large (in terms of $$$ to spend on
workload R&D) vendors have such workloads, but would they be willing to
let them out?  Vendor 'A' may not want to release its workload into the
public domain for fear that a competitor, Vendor 'B', may outperform it
on 'its own' workload.  Also, unless Vendor 'B' does well on Vendor
'A's workload, 'B' may denounce the workload as not being
representative of the given task.

(2) Software vendors of various packages may have benchmarks we could
examine.  For example, surely database and/or spreadsheet vendors have a
test suite to regression test the speed (or the absence thereof) of new
releases.

(3) Large end-user organizations (e.g. MIS shops, Engineering
Services/Support organizations) may also have substantial experience in
this area, but who are they, and how do we contact them?

So, USENET reader, we solicit your assistance.

If you have suggestions on workloads, or types of users for which a
workload needs to be developed, or information on where we can find
end-user organizations or vendors willing to discuss/disclose their
workloads and experiences, please send mail.

Also, if you're interested in the /usr/group Performance Metrics Working
Group, mail to me at: ...!mcnc!rti!ntirtp!dfh or to Ram Chelluri at:
...!ihnp4!cuae2!src.

   David F. Hinnant
   Co-Chair, /usr/group Performance Measurements Working Group


-- 

David Hinnant		UUCP: ...{decvax,akgua}!mcnc!rti!ntirtp!dfh
Northern Telecom Inc.	(919) 992-5000


