We assume that the amount of work a job must execute is drawn from a hypergeometric distribution with a given mean and coefficient of variation; coefficients of variation of 1, 5, and 30 are considered.
This is consistent with variations in service
demands used in previous studies
[5, 26]
and those observed at one supercomputer
installation
[9].
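The service-demand distribution above is characterized only by its mean and coefficient of variation. As an illustration only, the sketch below realizes such a high-variability demand with a two-stage hyperexponential fit with balanced means, a common stand-in in scheduling studies; the fitting formulas, the mean value of 1000 work units, and the use of a hyperexponential rather than the paper's exact distribution are all assumptions.

```python
import math
import random

def fit_h2_balanced(mean, cv):
    """Fit a two-stage hyperexponential (H2) with balanced means to a
    target mean and coefficient of variation cv >= 1 (an illustrative
    stand-in for the paper's service-demand distribution)."""
    c2 = cv * cv
    p = 0.5 * (1.0 + math.sqrt((c2 - 1.0) / (c2 + 1.0)))  # branch probability
    mu1 = 2.0 * p / mean           # exponential rate of branch 1
    mu2 = 2.0 * (1.0 - p) / mean   # exponential rate of branch 2
    return p, mu1, mu2

def sample_work(mean, cv, rng):
    """Draw one job's service demand from the fitted H2 distribution."""
    p, mu1, mu2 = fit_h2_balanced(mean, cv)
    rate = mu1 if rng.random() < p else mu2
    return rng.expovariate(rate)

# Verify the fit analytically via the first two moments of the H2.
for cv in (1.0, 5.0, 30.0):       # the coefficients of variation considered
    p, mu1, mu2 = fit_h2_balanced(1000.0, cv)
    m1 = p / mu1 + (1.0 - p) / mu2
    m2 = 2.0 * p / mu1 ** 2 + 2.0 * (1.0 - p) / mu2 ** 2
    fitted_cv = math.sqrt(m2 / m1 ** 2 - 1.0)
    print(cv, m1, fitted_cv)
```

With balanced means, each branch contributes exactly half the mean, so the fitted mean is exact and the fitted squared coefficient of variation works out to the target for any cv ≥ 1.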
We also assume that there is no correlation
between the amount of work a job executes
and the efficiency with which it is executed.
Although this might not be true for all
applications,
we want to separately examine the effects of
service demand and efficiency on the mean response
time obtained with different allocation
policies.
This is not possible if efficiency
and work are correlated.
We model the fact that jobs execute with different efficiencies by using the same execution rate function, F, for all jobs and choosing each job's efficiency parameter uniformly over a fixed interval. This distribution is similar to that used in previous studies [23, 30], except that we ensure that the parameter is in fact uniformly distributed over the intended interval. We believe that this is what was actually intended in those studies.
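The form of F and the interval from which its parameter is drawn are given symbolically in the original. Purely as an illustration, the sketch below assumes a linear rate function F(p) = φ·p with a per-job efficiency φ drawn uniformly from a hypothetical interval (0.2, 1.0], independent of the job's work, matching the no-correlation assumption above; both the linear form and the bounds are assumptions, not taken from this paper.

```python
import random

# Hypothetical efficiency interval; the paper's actual bounds are symbolic.
PHI_LO, PHI_HI = 0.2, 1.0

def draw_job(mean_work, rng):
    """Draw a job's (work, efficiency) pair independently, so that the
    amount of work and the efficiency are uncorrelated, as assumed."""
    work = rng.expovariate(1.0 / mean_work)   # stand-in service demand
    phi = rng.uniform(PHI_LO, PHI_HI)         # per-job efficiency parameter
    return work, phi

def execution_time(work, phi, processors):
    """Execution time under the assumed linear rate function F(p) = phi * p."""
    return work / (phi * processors)
```

Since φ ≤ 1 here, a job's execution time on p processors is never less than work/p, i.e., efficiency losses only slow jobs down.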
Each workload executes M jobs, whose arrivals follow a Poisson process. Each experiment is repeated several times with different random seeds to compute confidence intervals. The number of jobs and the number of repetitions for each experiment were chosen to achieve 90% confidence intervals that are within 5% of the mean. We use the bootstrap method to compute the confidence intervals, since it is robust both for small numbers of repetitions and for non-normal distributions of the observed means [34].
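The percentile-bootstrap interval used above can be sketched as follows; the resample count, seed, and example data are illustrative, not taken from the paper.

```python
import random

def bootstrap_ci(values, level=0.90, resamples=2000, seed=0):
    """Percentile-bootstrap confidence interval for the mean of `values`:
    resample with replacement, take the mean of each resample, and read
    the interval off the empirical quantiles of those resample means."""
    rng = random.Random(seed)
    n = len(values)
    means = sorted(
        sum(rng.choice(values) for _ in range(n)) / n
        for _ in range(resamples)
    )
    alpha = (1.0 - level) / 2.0
    lo = means[int(alpha * (resamples - 1))]
    hi = means[int((1.0 - alpha) * (resamples - 1))]
    return lo, hi
```

Applied to the per-repetition mean response times, an experiment would be accepted once the resulting 90% interval's half-width falls within 5% of the overall mean.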