Information formats and decision performance


Material Information

Title:
Information formats and decision performance: an experimental investigation
Physical Description:
x, 100 leaves : ; 28 cm.
Language:
English
Creator:
Amador, José Angel, 1947-
Publication Date:

Subjects

Subjects / Keywords:
Decision making   ( lcsh )
Information theory   ( lcsh )
Management thesis Ph. D   ( lcsh )
Dissertations, Academic -- Management -- UF   ( lcsh )
Genre:
bibliography   ( marcgt )
non-fiction   ( marcgt )

Notes

Thesis:
Thesis--University of Florida.
Bibliography:
Bibliography: leaves 66-68.
Statement of Responsibility:
by José A. Amador.
General Note:
Typescript.
General Note:
Vita.

Record Information

Source Institution:
University of Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
aleph - 000187443
oclc - 03403840
notis - AAV4043
System ID:
AA00011840:00001




Full Text











INFORMATION FORMATS AND DECISION PERFORMANCE:
AN EXPERIMENTAL INVESTIGATION










By


JOSE A. AMADOR


A DISSERTATION PRESENTED TO THE GRADUATE COUNCIL OF
THE UNIVERSITY OF FLORIDA IN PARTIAL
FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY


UNIVERSITY OF FLORIDA

1977






























Copyright By

Jose A. Amador

1977































To Magui and Provi













ACKNOWLEDGEMENTS


I wish to express my gratitude to the University of Puerto Rico

for financing my graduate program; to my dissertation adviser, Dr.

Richard A. Elnicki, for his recurrent insistence and corrections

throughout the course of this work; to my co-adviser, Dr. Jack M.

Feldman, for his invaluable comments and stimulus; to the rest of

my committee, Dr. Christopher B. Barry, Dr. Thom J. Hodgson, and

Dr. Richard R. Jesse, for their help and suggestions at the various

stages of the research; and to my colleague and friend, Jose F. Colon,

for his enlightening suggestions during the early part of the project.

I also want to express special appreciation to my "parents" in

Gainesville, Mr. and Mrs. Bruce Ruiz, for their long, long hours of

companionship and friendship during my stay at the University of

Florida.

Most important, I thank my wife, Magui, for pushing me through

while taking the worst part; my sons, for the time they would have

rather spent with me; and my parents, for their ever present support

and counsel.











TABLE OF CONTENTS

Page

ACKNOWLEDGEMENTS ........................................... iv

LIST OF TABLES ....................................... vii

LIST OF FIGURES ............................................ viii

ABSTRACT ........................................... .... ix


CHAPTER 1 RESEARCH BACKGROUND ............................ 1

1.1 Introduction ..................................... 1
1.2 Literature Review ................................. 2
1.2.1 The Minnesota Experiments ..................... 2
1.2.2 The Lucas Model ............................... 7
1.2.3 Other Related Literature ...................... 10
1.3 Organization of the Dissertation .................. 11

CHAPTER 2 THE PROBLEM .................................. 13

2.1 An Information-Decision Problem .................... 13
2.1.1 The Real-Life Problem ......................... 13
2.1.2 The Abstracted Problem ........................ 15
2.2 Information Content ............................... 16
2.3 The "How-Do-We-Present-the-Information?" Problem ... 17
2.3.1 Medium of Transmission ........................ 18
2.3.2 Format of Presentation ........................ 19
2.3.3 Level of Detail .............................. 23
2.4 Number of Decision Entities ....................... 26
2.5 Dependent Variables ............................... 27
2.6 Research Hypotheses ................................ 28
2.7 Basic Functional Model ............................. 31

CHAPTER 3 THE EXPERIMENT ............................... 33

3.1 Method ....................................... 33
3.1.1 Subjects .................................... 33
3.1.2 Design and Analysis ........................... 34
3.2 Experimental Task ................................ 37
3.3 Evaluation of Hypotheses .......................... 40








TABLE OF CONTENTS (continued)


Page


CHAPTER 4 EXPERIMENTAL RESULTS ............................... 44

4.1 Introduction ............................................ 44

4.2 Results .................................................... 44
4.2.1 Effect of Layout on Decision Time ................. 44
4.2.2 Influence of Format on Choice Behavior ............ 46
4.2.3 Joint Effect of Layout and Style on Decision Time .. 47
4.2.4 Effect of Probabilistic Detail on Cost Performance .. 47
4.2.5 Joint Effect of Format and Level of Detail
on Decision Time ................................. 48
4.2.6 Relation Between Number of Decision Entities
and Format ........................................ 49

CHAPTER 5 DISCUSSION OF RESULTS .............................. 51

5.1 Summary of Findings .................................... 51
5.2 The Multivariate Effects .............................. 51
5.3 The Univariate Effects ................................ 57
5.3.1 Effects Related to H1 and H3 ....................... 57
5.3.2 Effects Related to H2 ............................ 58
5.3.3 Effects Related to H5 ............................ 59
5.3.4 Effects Related to H6 ............................ 60

CHAPTER 6 SUMMARY AND POSSIBLE EXTENSIONS ..................... 63


BIBLIOGRAPHY ........................................ ......... 66

APPENDIX A EXPERIMENTAL TREATMENTS ........................... 69

APPENDIX B FORMAT OPINION QUESTIONNAIRE ....................... 78

APPENDIX C COMPUTER SIMULATION PROGRAM ....................... 80

APPENDIX D SUBJECT INSTRUCTIONS ............................. 89

BIOGRAPHICAL SKETCH ............................................. 100















LIST OF TABLES


Page
1.1 SOME STUDIES CONDUCTED UNDER THE CHERVANY
ET AL. FRAMEWORK ..................................... 4

3.1 FACTORIAL DISPLAY AND FACTOR LEVELS .................. 35

3.2 DATA FOR THE EXPERIMENT .............................. 39

3.3 EFFECTS PREDICTED BY THE RESEARCH HYPOTHESES ......... 41

4.1 CELL MEANS FOR THE SIXTEEN EXPERIMENTAL
CONDITIONS ........................................... 45

5.1 MAIN EFFECTS ......................................... 52

5.2 INTERACTION INVOLVING THE NUMBER-OF-DECISION
ENTITIES-VARIABLES ................................... 53

5.3 OTHER INTERACTION EFFECTS ........................... 55














LIST OF FIGURES


1.2 Different formats of presentation .................. 8

2.1 Three forms of presenting the expected
due-dates information ............................. 21

2.2 Graphical style with events in due-date order ....... 22

2.3 Three forms of presenting the interval
estimates information ............................... 25










Abstract of Dissertation Presented to the Graduate Council
of the University of Florida in Partial Fulfillment of the Requirements
for the Degree of Doctor of Philosophy


INFORMATION FORMATS AND DECISION PERFORMANCE:
AN EXPERIMENTAL INVESTIGATION

By

Jose A. Amador

June 1977

Chairman: Richard A. Elnicki
Major Department: Management

This study examines some implications of the relationship

between information format and decision performance. A real-life

information-decision problem was abstracted to create a simulated

decision environment in which alternative forms of presenting

information relevant to the problem were manipulated and adminis-

tered to 160 experimental subjects.

Multivariate and univariate analyses of the experimental data

indicated significant differences due to the experimental treatments.

Presentation style (tabular versus graphical) and information layout

(I.D. ordering versus due-date ordering) were found to have separate

and joint effects on decision performance. The style of presentation

had a strong influence on subject choice behavior. The level of

probabilistic information provided (point estimates versus interval

estimates) and the style of presentation had a joint effect on









decision time. Subjects with few decision entities on their reports

felt indifferent toward format, while subjects with many decision

entities indicated clear format preferences.

The implications of the findings for the Management Information

Systems researcher and practitioner are discussed. Suggestions are

given for further research.















CHAPTER 1

RESEARCH BACKGROUND

1.1 Introduction

The last decade has seen a significant increase in the use of

computer-based data systems to support decision making in organizations.

This marriage of computers and organizations has developed into the

rapidly growing field of Management Information Systems (MIS). In

general, MIS refers to the use of computer-based data systems for the

primary purpose of supporting management decisions. Since MIS exist

to support decision making, researchers in the area have suggested

that their effectiveness should be measured in terms of the effectiveness

of the decisions they support. In turn, it has been argued that the

effectiveness of decisions based on information will depend, among other

things, on the accuracy, relevancy, and timeliness of the information.

More recently, it has also been proposed that even when information

is adequate, its effective use can be influenced by the manner in which the

information is presented, in particular, by its format of presentation,

level of detail, and medium of transmission. This line of thought has led

researchers in the area to investigate how the physical form of presenting

the information can influence aspects of decision performance. That







relationship, in essence, is the object of this study. In the present

research, the influence of information format on decision performance will

be investigated in the context of a specific information-decision problem.

1.2 Literature Review

The increase in popularity of Management Information Systems

during the last decade has been accompanied by an awareness of the need
for improving the efficiency of the systems designed. The consensus

of the researchers in the area has been that there is a need for a
theory of MIS. Zannetos [31] states that a theory is needed to develop
objective criteria for determining the effectiveness of MIS efforts.

In response to the call for a theory, several research frameworks have
been proposed [10, 16, 21, 22].1

1.2.1 The Minnesota Experiments

The research framework proposed by Chervany et al. [10] has guided

the "Minnesota Experiments," a series of empirical studies that have

been conducted at the Management Information Systems Research Center,
University of Minnesota. The general purpose of these studies has been

to manipulate various MIS variables to investigate their impact on

decision performance. The Chervany et al. framework states that three
categories of variables affect decision performance, P, given a particu-
lar information system. These are the decision environment, DE, the




1These frameworks have not constituted theories, in the formal
sense, but rather pre-theoretical lists of variables.








decision maker, DM, and the characteristics of the information system,

CIS. In functional form,


P = f(DE,DM,CIS)                                        (1.1)


A number of experiments have been conducted under this framework.

They all appear to have followed Van Horn's [28] suggestion that

laboratory studies provide an effective means for MIS research. In

particular, the experimenters have drawn upon the technique of

"experimental gaming" to create artificial decision-making evironments

within which they have manipulated various aspects of the information

system. In support of this technique, Barrett et al. conclude:

    We have been unable, to date, to generate evidence which
    implies that the interpretation of decision performance
    results in terms of a treatment variable is likely to be
    confounded by the effects of the simulator and/or other
    aspects of the management game context [4, p. 11].

Dickson et al. say of experimental gaming:

    ...we are of the opinion that experimental gaming, despite
    its high cost, is an effective way of investigating this
    area. The major problems really are associated with the
    measurement of the variables included in the experiments.
    Somewhat surprisingly, problems of subject motivation and
    experimental control have been minor [12, p. 20].


Table 1.1 on pages 4-6 is a summary of some of the most
referenced studies that have been conducted under the Chervany et al.
framework. The variables that were experimentally controlled are shown
in each case before the "given" bar (|) in the functional models along

with a brief description of the measure used for each variable. Other













TABLE 1.1  SOME STUDIES CONDUCTED UNDER THE CHERVANY ET AL. FRAMEWORK

[The tabular content spanning pages 4-6 is not recoverable from the scan.]


items in the table are the nature of the simulated decision environ-

ment, the experimental subjects, and a summary of the results.
It is interesting to note that while the form of presenting the
information has been extensively considered in one form or another,

the "layout" or physical arrangement of the information reported has
not been manipulated as an experimental variable in any of the studies

reviewed. Figure 1.2, part A (p. 8) is an example of the type of "form
of presentation" treatment that has been manipulated in the reviewed

literature. Figure 1.2, part B is an example of what is meant here

by information layout. The influence of this variable on decision

performance will play an important role in this study.

1.2.2 The Lucas Model

The model proposed by Lucas [21] includes essentially the same
variables as the Chervany et al. framework, but it also takes into
account the interface between use of the information system and per-
formance. His descriptive model states that performance (P) is a
function of situational, personal, and decision style variables (the

DM group in the Chervany et al. model), the quality of the information
system (the CIS group in the Chervany et al. model) and the analysis
and actions taken by the users (similar to the DE group in the Chervany

et al. model). In addition, his model also states that the performance

of the information system is independently affected by the use of the
system, U. In functional form,


P = f(DE,DM,CIS,U)                                      (1.2)















Raw Data Treatment

                    FINISHED GOODS INVENTORY HISTORY

                 WEEK 1 OF MONTH 3           WEEK 2 OF MONTH 3

INVENTORY    Resinoid R-Forced Vitrifid   Resinoid R-Forced Vitrifid
LEVELS
  MONDAY          0      371        0          0      120      481
  TUESDAY        39      102       82          0      153      191
  WEDNESDAY       0        0      198          0      202        0
  THURSDAY       34       36      299         38      267        0
  FRIDAY         71       84      393         79      188       38

STOCKOUTS    Resinoid R-Forced Vitrifid   Resinoid R-Forced Vitrifid
  MONDAY        285        0       58        354        0        0
  TUESDAY         0        0        0        423        0        0
  WEDNESDAY     379      321        0        144        0      201
  THURSDAY        0        0        0          0        0      121
  FRIDAY          0        0        0          0        0        0


Statistically Summarized Treatment

                    FINISHED GOODS INVENTORY HISTORY
                     SUMMARY STATISTICS CALCULATED
                      FROM OPERATIONS FOR PERIOD
             WEEK 1 OF MONTH 3 THROUGH WEEK 4 OF MONTH 3

           Daily Inventory Levels
           (End of Day)                         Stockouts
          Resinoid R-Forced Vitrifid     Resinoid R-Forced Vitrifid
Mean         23.25   140.80    92.85       171.30    38.20   123.70
Coef Var      6.28     4.18     7.97         5.63    14.77     7.09
Maximum      79.00   371.00   481.00       427.00   392.00   484.00
Range        79.00   371.00   481.00       427.00   392.00   484.00

A. Abbreviated samples of two "form of presentation" treatments used by
   Chervany and Dickson [9, p. 1338].


                    FINISHED GOODS INVENTORY HISTORY
                     SUMMARY STATISTICS CALCULATED
                      FROM OPERATIONS FOR PERIOD
             WEEK 1 OF MONTH 3 THROUGH WEEK 4 OF MONTH 3

                            Mean   Coef Var   Maximum    Range
Daily Inventory Levels
(End of Day)
    Resinoid               23.25      6.28      79.00    79.00
    R-Forced              140.80      4.18     371.00   371.00
    Vitrifid               92.85      7.97     481.00   481.00
Stockouts
    Resinoid              171.30      5.63     427.00   427.00
    R-Forced               38.20     14.77     392.00   392.00
    Vitrifid              123.70      7.09     484.00   484.00

B. A different "layout" for the information in the second report above.


Figure 1.2  Different formats of presentation





In a field study [20] with an actual information system and data

from salesmen's performances, Lucas observed relationships among DE,

DM, CIS, and P that are congruent with those observed in the "Minnesota

Experiments." In addition he also noted the following relationship

between performance and information system use:


    ΔP/ΔU > 0   when relevant information is provided and used,

    ΔP/ΔU < 0   when the information provided is irrelevant to
                the decisions that must be made.

What he in effect noted is that only those information system designs

that promote effective use of the information will have a positive

effect on performance. One of the indications of his results was that

information structure elements such as the format of presentation, F,

and the level of detail, L, can be determinants of effective use, EU,

given other system characteristics, CIS', and a given set of DM and DE

variables. Although not explicitly stated in his paper, the results

of his study suggest that

    EU = f(F,L | DE,DM,CIS')                            (1.3)

and that

    P = f(EU | DE,DM,CIS')                              (1.4)

with ΔP/ΔEU > 0.

These relationships are inferred from the discussion part of his paper:

One of the most important implications of
the model and results is that different
personal, situational and decision style
variables appear to affect the use of
systems. These findings argue for more








flexible systems to support different
users' needs. For example, the
present sales information system could
be modified to provide different out-
put formats and levels of summarization
[20,p. 918].
Equations 1.3 and 1.4 are combined in the next chapter to produce

a model that will serve as a guide for evaluating a set of propositions

relating information format to decision performance.

1.2.3 Other Related Literature

In addition to the literature referenced above, other related

literature has influenced the formulation of the hypotheses evaluated

in the present study. Two textbooks on MIS, in particular, contain

some interesting but undocumented ideas which have shaped the latter.

Murdick and Ross [23] make such general statements as, "In general,

the format should be established to save the manager's time" [p. 326]

and, "Managers prefer graphic displays, which reduce large amounts of

information into easily understood pictorial form" [p. 263].

The second MIS text which makes similar suggestions is Voich et al.

[29]. They propose:

Format is important because it affects
the ease with which the report can be
read and assimilated. As the complexity
of a report increases, its likelihood
of extent of use falls [29, p. 229].

This writer feels that the authors are saying that more attention

should be given to the format of the report as the number of "entities"

in the report on which decisions are required increases.

Finally, a recent paper by Conrath [8] suggests:


In all the literature on decision making,
and in particular that on statistical de-








cision theory, little if anything has
been said about the form in which the
data should be presented to the de-
cision maker. Perhaps this is because
most theoreticians assume that as long
as the data unambiguously define the
distributions, the format of presenta-
tion should make no difference. This
brings up the question of whether data
can ever be unambiguously presented,
and perhaps more importantly, in whose
eyes? The only answer to the second
question is the user, but he has seldom
been asked [8, p. 878].

Conrath goes on to propose that the format in which probabilistic

data is presented as a basis for choice can influence choice.

The present study centers on the questions raised above as

they relate to a pragmatic "how-do-we-present-the-information?"

problem.

1.3 Organization of the Dissertation

In Chapter 2, a "real-life" information-decision problem is pre-

sented to provide a setting for the questions investigated in this study.

The nature of the problem is explained in Section 2.1. In Section 2.2,

the information needs of the manager in the problem are considered, and

it is assumed that these needs are relatively well defined and structured.

A number of questions related to the form in which information should be

presented to the manager are raised in Section 2.3. The results of

previous studies are revisited in an effort to provide orientation to

the present information format/decision performance questions. The

criteria used to measure decision performance are defined in Section 2.5,

and a set of research hypotheses relating these criteria to the experi-

mental format variables is presented in Section 2.6. In Section 2.7, a

general model is presented to serve as the guide for the experiment.







The nature and details of the experiment are the subject of

Chapter 3. The methodology is discussed in Section 3.1 and a full

description of the experimental task is given in Section 3.2. Section

3.3 discusses the experimental results that should be observed for the

research hypotheses to be supported, and a table is presented that

shows how each of the hypotheses is to be evaluated from the experi-

mental data.

The statistical results of the experiment are presented in Chapter

4. These are discussed in Chapter 5 from the point of view of their

implications for both the MIS researcher and practitioner. In Chapter

6, suggestions are given for new lines of research.













CHAPTER 2

THE PROBLEM

2.1 An Information-Decision Problem

The information-decision problem that provided the setting for the

current study is presented in this chapter. The situation studied pre-

sents several advantages from the point of view of empirical MIS re-

search. First, the situation is relatively simple, easy to characterize

and to model. Second, the problem points to clearly definable questions

of information structure, an area that has received increased attention

in the recent MIS literature [5, 9, 12, 18, 25, 26]. Finally, the situa-

tion may represent a new area for the application of MIS technology.

2.1.1 The Real-Life Problem

The particular decision situation to be outlined comes from the

field of agriculture and concerns the detection of estrus (heat) in

artificially inseminated dairy herds. The problem is that failure to

detect heat can result in lost breeding opportunities, lower milk pro-

duction, and subsequent capital losses. The following excerpts from the

dairy industry literature illustrate the problem:


Accurate estrus detection is a key to
efficient reproduction and high milk
production. .. Proper detection of
estrus is essential in any planned
breeding program using hand mating,
especially to capitalize on superior
sires available through artificial
insemination [14, p. 248].








...delayed conception means a cow must
stand dry and nonproductive when her
lactation ceases at a maintenance cost
of about $20 per month [19, p. 580].

Approximately 53% of heats are being
missed. Dairymen appear to be losing
twice as many days due to missed heat
periods as due to failure to conceive
[2, p. 247].


The literature includes much advice about methods for heat detec-
tion, most of it having to do with heat recognition in the field.

Even then, it has been suggested that close to 50% of all heats are not

detected [2, 3].

Dairymen using artificial insemination and keeping the appropriate
records have information that can help them in detecting heat [7]. The

information consists of the date of the last service (insemination) of

each cow and data on the average number of days between successive ser-

vices. It has been suggested [27] that a chart with "heat expectancy

dates" could be valuable for detecting heat, as it would enable the

dairyman to concentrate his observations on those cows expected to come
in heat.

The design of such a report motivated initial work on the problem.

A preliminary survey using an experimental report in an actual dairy
operation revealed that rather general agreement existed among the

prospective users as to the desired content of the report and how often



1Unpublished; conducted at the dairy farm of Mr. Herman Hernandez,
Isabela, Puerto Rico during January-May, 1976.






it should be produced. One issue that remained questionable was the

manner in which the information should be presented in the report. There

were several formats that appeared useful but each seemed to have its

own pros and cons from the point of view of ease of use. The problem

appeared to be sufficiently interesting and important to merit an

experimental evaluation of the various information format alternatives.

The problem discussed in the next section is the abstraction or

prototype designed to investigate this information structure problem

within a controlled laboratory setting.1 The questions of interest

were widened to include a set of propositions related to a more general

MIS framework and theory. In the problem to be outlined below, the

term "heat" is replaced with more general terminology.

2.1.2 The Abstracted Problem

Consider an organization that needs to keep records on a number

of random events that occur relatively infrequently but are important

to management. These events represent opportunities for management:

if one occurs and is not detected the organization suffers opportunity

costs.
Management knows that these events occur independently approxi-

mately once every 20 days, and that when they occur they are

"detectable" during a short period of time (approximately 24 hours).



The reasons for taking the research to the laboratory were two-
fold. First, resources were not available for conducting a reasonably
controlled field experiment. Second, the research interests of the
author were shifted from the operational considerations of the problem
to a more general set of research questions more amenable for resolution
in a laboratory setting.








It is assumed that each check made on an event to see whether it is

occurring has a fixed unit cost associated with it, independent of the

number of checks made on the same day. It can, therefore, be uneconom-

ical for management to check on these events too often. Management

is assumed to maintain a computer-based data bank with the following

data on the process:

(1) a three-digit identification (I.D.) number for
each event that is expected to occur during the
next twenty days,

(2) the date of the last observed occurrence of each
event, and

(3) data on past time intervals between successive
occurrences of each event.
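The three data items above can be pictured as one record per event. A minimal sketch of such a data bank follows; the field names and sample values are illustrative, not from the study:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class EventRecord:
    event_id: str            # (1) three-digit identification number
    last_occurrence: date    # (2) date of the last observed occurrence
    past_intervals: list = field(default_factory=list)  # (3) days between successive occurrences

# Illustrative data bank: events recur roughly every 20 days
data_bank = [
    EventRecord("101", date(1977, 3, 2), [19, 21, 20]),
    EventRecord("102", date(1977, 3, 9), [20, 18, 22]),
]

def mean_interval(rec: EventRecord) -> float:
    """Average of the observed intervals between successive occurrences."""
    return sum(rec.past_intervals) / len(rec.past_intervals)
```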

It is further assumed that management will use this data to pro-

duce a periodic report to aid them in deciding which events to check

at the beginning of each day.1 Their decision problem is relatively

well structured and straight-forward: they would like to detect as many

of these events as possible but face a trade-off between the costs

of "checking" and "missing" the events.

2.2 Information Content

Based on past experience, the managers in charge of checking the

events know that it is not cost-effective to check an event except on

those days when the event is more likely to occur, i.e., the days

around the date figured by adding 20 days to the last observed occur-

rence. They have suggested that a periodic chart with "event



1They will produce the report; they would rather have the data
reported in its worst possible form than no report at all.







expectancy dates" would be useful as it would permit management to con-

centrate their checks on those days when each event is expected to occur.

A dichotomy from economic models will help to clarify the type of

report that managers consider appropriate in the problem modeled.

Managerial reports can be descriptive or normative in nature. Purely

descriptive reports, as used here, are those limited to the presenta-

tion of factual information (e.g.: production history reports, financial

reports). Purely normative reports, as used here, explicitly indicate

courses of action to be followed by the user (e.g.: production

schedules). All managerial reports can be placed on this descriptive-

normative scale. A report providing demand forecasts and safety

stock sizes [11] is, for example, more normative than one providing a

detailed sales history but no forecasts. In this study, it is assumed

that managers want more than a descriptive report (for example, one

showing only the dates of the last observed occurrence of each event).

They want a report providing forecasts for the event occurrence dates.

They consider twenty days a reasonable time horizon for the report.

It is assumed that shorter horizons would make the report too costly

to produce and longer horizons would make the forecast data basis too

dated. In conclusion, the report that is assumed to be appropriate

for the problem modeled is a periodic chart containing event I.D.

numbers and "expected due-dates" for those events expected to occur

within the next twenty days.
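Producing such a chart from the data bank is a small computation. The following is a hedged sketch: the 20-day cycle and the 20-day horizon come from the text, while the event I.D.s and dates are illustrative:

```python
from datetime import date, timedelta

CYCLE_DAYS = 20  # expected days between successive occurrences (from the text)

def due_date_report(last_occurrences, report_date, horizon_days=20):
    """I.D. numbers and expected due-dates for events due within the horizon.

    last_occurrences maps each event I.D. to the date it was last observed.
    """
    rows = []
    for event_id, last_seen in last_occurrences.items():
        due = last_seen + timedelta(days=CYCLE_DAYS)
        if report_date <= due <= report_date + timedelta(days=horizon_days):
            rows.append((event_id, due))
    rows.sort(key=lambda row: row[1])  # due-date ordering, one of the layouts studied
    return rows

report = due_date_report(
    {"101": date(1977, 6, 1), "102": date(1977, 5, 25), "103": date(1977, 4, 1)},
    report_date=date(1977, 6, 10),
)
# event "103" fell due before the report date, so it is excluded
```

Sorting the rows by expected due-date rather than by I.D. number is exactly the "layout" distinction manipulated later in the experiment.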

2.3 The "Hao-Do-We-Present-the-Information?" Problem

The information content needs of management in the problem

characterized above are assumed to be relatively well structured and

defined. The issue that constitutes the main focus of this research







is the question of information structure, i.e., the physical manner in

which the information is presented to the user. Dickson et al. have
suggested three categories of information structure (enumerations
added by the author):


It is naive to assume that information system
requirements do not vary with the type of
decision being formulated. And, it is sub-
optimal to continue developing information
support systems without serious consideration
of (1) the form in which information is pro-
vided, (2) the level of detail incorporated
into ensuing reports, and (3) the media by
which the information is transmitted
[12, p. 3].

The medium of transmission, the format of presentation, and the

level of detail are discussed below in terms of their importance in

the defined decision problem. In each case, arguments are presented

to show why each category was included or excluded as an experimental
variable in the study. The dependent variables measuring decision

performance are then presented, and the questions raised about the
effects of the experimental treatments on performance are stated
as a set of testable hypotheses. In the final section, a functional

model is presented to serve as framework for testing the hypotheses.

2.3.1 Medium of Transmission

Two media are commonly used for reports generated from a
computer-based data bank: paper printout and cathode ray tube (CRT)
display. In the case of a report that is to be produced and released



1When reports are generated by a computer, the choice of transmission
medium is usually confined to these two media. Otherwise, the writer is
aware that other more "personalistic" modes of communication are also
available for displaying the information to the user [22]. Only computer







every twenty days, paper would appear to be the more appropriate

medium. A CRT could be a reasonable medium if the time interval be-

tween reports was shorter and if there was a need to reduce paper

clutter. Kozar [8] found that users of CRT's tend to be unhappy with

the lack of hard copy and that they take significantly more time to

arrive at decisions than hard copy users. The medium of transmission

was not considered a relevant design variable in the present study.

Conventional paper printout was used as the constant medium through-

out the experiment.

2.3.2 Format of Presentation

The format variable has been discussed more extensively in the

MIS literature than the medium variable [5,9,25,26]. The most common

format treatment has been summary versus raw data [9, 25,26]. This

treatment, however, has manipulated the data content more than its

format. Only one study has been concerned with format, if format is

considered to be related to the "style" of presentation.

Style of presentation. Benbasat and Schroeder [5] presented

daily production figures to experimental subjects in one of two styles:

tabular and graphical. The tabular style listed daily production figures

while the graphical style plotted the same daily figures versus time.

Their results indicated that subjects using the graphical reports had

lower costs, with no significant differences in decision time between

the two groups.1 These results suggest that the graphical format might



generated reports are considered here, however, mainly because of
the lack of resources for experimenting with other media.
1Murdick and Ross [23, p. 263] state that managers prefer graphical
displays, although they do not support their contention.








be a more appropriate style of presentation for the time-staged infor-

mation in our problem. Specifically, if the reports are to consist

only of event I.D. numbers and expected due-dates, the question of

interest is whether the formats shown in Figure 2.1 (p. 21) can in-

fluence aspects of decision performance. As discussed in section 2.5,

decision performance will be measured in this study in terms of time

performance (the time devoted to making the "check" decisions) and

cost performance (the total cost of checking and missing the events).

A priori, it would seem logical to expect the formats in Figure

2.1 to influence, if anything, time performance. The dates reported

are future dates and the information is going to be used chronologically.

Consequently, the time dimension added by the graphical style should

be helpful in that it orders the events chronologically from left

to right on the x-axis. In part C of Figure 2.1, for example, it is

seen that event "032" is expected to occur first (May 26), then event

"146" (May 28), and so on.

Information layout. A chronological ordering of the events can

also be achieved with the tabular style by arranging the events in

order of expected due-dates, as in part B of Figure 2.1. It is assumed,

however, that the ordering of events by ascending I.D. numbers is a

desirable condition in these reports because management frequently

needs to make quick reference to the due-dates of particular events.

The quickest way to make these references is when the events are arranged




1If these reports were intended for Chinese managers, an attempt
would be made to present the information from right to left.








A. Tabular style
with I.D. layout
EVENT EXPECTED DUE-DATE
IDENT. (MONTH-DAY)

004 6-07
009 5-29
017 6-11
024 6-04
032 5-26
038 6-04
051 5-31
070 6-01
076 6-10
078 5-30
082 6-03
035 6-10
097 6-05
110 6-11
121 6-05
128 6-01
142 6-09
146 5-28
155 6-12
163 6-09
168 6-07
171 5-31
173 6-06
177 6-06
186 5-31


B. Tabular style
with due-date layout
EVENT EXPECTED DUE-DATE
IDENT. (MONTH-DAY)

032 5-26
146 5-28
009 5-29
078 5-30
171 5-31
051 5-31
186 5-31
070 6-01
128 6-01
082 6-03
024 6-04
038 6-04
121 6-05
097 6-05
173 6-06
177 6-06
168 6-07
004 6-07
163 6-09
142 6-09
085 6-10
076 6-10
110 6-11
017 6-11
155 6-12


C. Graphical style with I.D. layout

EXPECTED DUE-DATES

EVENT MAY JUNE EVENT
IDENT. 25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13 IDENT.

004 H 004
009 H 009
017 H 017
024 H 024
032 H 032
038 H 038
051 H 051
070 H 070
076 H 076
078 H 078
082 H 082
085 H 085
097 H 097
110 H 110
121 H 121
128 H 128
142 H 142
146 H 146
155 H 155
163 H 163
168 H 168
171 H 171
173 H 173
177 H 177
186 H 186

25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13


Figure 2.1 Three forms of presenting the expected due-dates information








in ascending I.D. number order, especially when there are a large

number of events to be referenced. One solution to this dual need is

to produce two reports: one in order of expected due-dates to support

the daily checking of decisions, and another in ascending I.D. order for

quick references. But, if an experiment revealed no significant

difference in performance between the users of the graphical I.D.

ordered reports and the users of the due-date ordered reports, the im-

plication would be that there is no need for both reports. The graphical

report would provide the two desired features. Another explanation

for that result could be that the graphical style in this case has

a "calendar" resemblance and therefore presents a more familiar

picture to the user than a listing of numbers. If this were the case,

a graphical report that presented the events in due-date order might

also influence performance. Figure 2.2 below shows such a report.



EXPECTED DUE-DATES
------------------------------------------------------
EVENT MAY JUNE EVENT
IDENT. 25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13 IDENT.
------------------------------------------------------
032 H 032
146 H 146
009 H 009
078 H 078
171 H 171
051 H 051
186 H 186
070 H 070
128 H 128
082 H 082
024 H 024
038 H 038
121 H 121
097 H 097
173 H 173
177 H 177
168 H 168
004 H 004
163 H 163
142 H 142
085 H 085
076 H 076
110 H 110
017 H 017
155 H 155
25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13

Figure 2.2 Graphical style with events in due-date order








The only difference between the arrangement above and that in part C

of Figure 2.1 is the "layout" of the information in the report. The

term "layout" will be used here to refer strictly to the order in

which the information is arranged in the report. Two layout schemes

are considered: I.D. number ordering and expected due-date ordering.
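The two layout schemes amount to two sort orders over the same report rows. A minimal sketch, with illustrative row values taken from Figure 2.1 (dates held as month-day tuples so they sort chronologically):

```python
# Hypothetical report rows: (event I.D., expected due-date) pairs.
rows = [("032", (5, 26)), ("146", (5, 28)), ("009", (5, 29)),
        ("004", (6, 7)), ("017", (6, 11))]

# I.D. layout: ascending event I.D. number, for quick reference.
id_layout = sorted(rows, key=lambda r: r[0])

# Due-date layout: ascending expected due-date, for daily checking.
due_date_layout = sorted(rows, key=lambda r: r[1])
```

The same data thus supports both layouts; only the sort key differs.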

Questions of interest. The format alternatives considered above

appear to have pros and cons from the point of view of the ease with

which the report can be used. The format variables layout and style

will be experimentally manipulated in an attempt to address the following

questions:

(1) Can information layout by itself affect decision
performance?

(2) Can information layout interact with presentation
style to enhance or reduce separate performance
effects of either layout or style?

2.3.3 Level of Detail

Given the probabilistic nature of our data, a wide range of levels of

detail can be provided in the "expected due-dates" reports. These could

range from point estimates to complete probability distributions of the

event occurrence times. On this subject, Conrath [8] proposes that

decision makers are not likely to think in terms of probability distribu-

tions, and that they prefer to think in terms of, and use, point estimates.

His argument would suggest the use of point estimate forecasts as one level

of detail in this study. The question would remain, however, whether the users
in this decision context could benefit from additional information about the






probability distributions from which these estimates are drawn. This

additional information could be presented, for instance, in the form

of percentiles of the distribution (e.g.: the days lying above the

fifth percentile and below the ninety-fifth percentile of the distri-

bution). The latter would have the advantage of incorporating infor-

mation about the variability of the event occurrence times and, there-
fore, the risk involved in making the check decisions.

In the problem modeled, it is assumed that there is enough data
available on past intervals between events to permit estimates of the

mean, x̄i, and standard deviation, si, for each event i. Using this data,

and assuming normal and stable distributions, the intervals x̄i ± 2si

were used as interval estimates for the days during which each event i
is more likely to occur. In the case of the tabular style, the reports

with such "95% confidence intervals" could appear as in parts A and B of
Figure 2.3 (p.25). The issue of format takes new importance now

since it is possible that the graphical style (part C of Figure 2.3)

may have properties that make the checking choices easier for the user.
Specifically, the level of probabilistic detail (point estimates or

interval estimates) may interact with the style of presentation

(tabular or graphical) to affect the ability of the user to process
and effectively use the information.
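The interval construction described above can be sketched as follows; the function name, example dates, and rounding to whole days are assumptions for illustration, not details given in the text:

```python
from datetime import date, timedelta

def interval_estimate(last_occurrence, mean_days, sd_days):
    """Approximate 95% window for the next occurrence of an event:
    mean interval +/- 2 standard deviations past the last occurrence,
    rounded to whole days, as in the reports of Figure 2.3."""
    first = last_occurrence + timedelta(days=round(mean_days - 2 * sd_days))
    last = last_occurrence + timedelta(days=round(mean_days + 2 * sd_days))
    return first, last
```

An event last seen May 18 with an 8-day mean interval and a 1-day standard deviation would be reported with the window May 24 through May 28.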

Questions of interest. The two levels of probabilistic informa-
tion described above, point estimates and interval estimates, will be

experimentally manipulated in connection with the format variables to
address the following questions:







A. Tabular style
with I.D. layout
EVENT 95% CONFIDENCE INTERVAL
IDENT. (FIRST DAY, LAST DAY)
-------------------------------
004 6-06 6-09
009 5-27 5-31
017 6-10 6-12
024 6-01 6-07
032 5-25 5-28
038 6-02 6-05
051 5-29 6-03
070 5-31 6-02
076 6-09 6-12
078 5-27 6-02
082 6-01 6-05
085 6-09 6-14
097 6-04 6-07
110 6-10 6-12
121 6-03 6-07
128 5-30 6-04
142 6-06 6-12
146 5-26 5-30
155 6-11 6-13
163 6-07 6-12
168 6-04 6-10
171 5-28 6-03
173 6-05 6-07
177 6-05 6-08
186 5-29 6-03


B. Tabular style
with due-date layout
EVENT 95% CONFIDENCE INTERVAL
IDENT. (FIRST DAY, LAST DAY)

032 5-25 5-28
146 5-26 5-30
009 5-27 5-31
078 5-27 6-02
171 5-28 6-03
051 5-29 6-03
186 5-29 6-03
128 5-30 6-04
070 5-31 6-02
082 6-01 6-05
024 6-01 6-07
038 6-02 6-05
121 6-03 6-07
168 6-04 6-10
097 6-04 6-07
173 6-05 6 -07
177 6-05 6-08
004 6-06 6-09
142 6-06 6-12
163 6-07 6-12
085 6-09 6-14
076 6-09 6-12
110 6-10 6-12
017 6-10 6-12
155 6-11 6-13


C. Graphical style with I.D. layout

95% CONFIDENCE INTERVALS

EVENT MAY JUNE EVENT
IDENT. 25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13 IDENT.


[Graphical panel: each event's 95% confidence interval appears as a
horizontal bar of H's spanning the first through the last day of the
interval, with event I.D. numbers in both margins and the day scale
(May 25 through June 13) repeated below the chart.]


Figure 2.3 Three forms of presenting the interval estimates information







(1) Can users of probabilistic data make effective
use of information beyond point estimates?

(2) Can the format in which probabilistic data is
presented affect choice behavior?

(3) Can the level of probabilistic information in-
teract with format to affect user performance?

2.4 Number of Decision Entities

An important consideration is now introduced: the influence that

any report format will have is very likely to be related to what is

called here the "number of decision entities" on the report. The

term "decision entities" will be used to refer to the separate pieces

of information present on a report and on which decisions are required.

In the real problem this study is based on, there is little doubt that

the report users would be indifferent about format if their reports con-

tained information on only two or three events. This is not expected

to be the case, however, if the reports contain information on 200

events.1 The "number of decision entities" was included as an experi-

mental variable to empirically test the assertion that while report

users may feel indifferent about format when small amounts of infor-

mation must be processed, they will move toward preferred formats,

and their performance will be more sensitive to format as the amount

of information they must process increases.2



1A manager dealing with more than 200 decision events told the
writer he "could care less" about format if he did not have so many
events to look after.
2"Amount of information," as used here, must not be confused with
"information overload," a condition where the decision maker is given
too much, unnecessary information [1, 9, 15].








2.5 Dependent Variables

Time performance. Since time is a valuable managerial commodity,

decision time is commonly used as a decision performance criterion

[5,9,18,26]. Although not supported by any studies, Murdick and Ross
contend that "... format should be established to save the manager's

time" [23, p. 326]. Decision time will be used here as a proxy for the

value of managerial time to the organization. It will be measured by

the total time that the decision maker devotes to making the checking

decisions. It is expected that this measure will be correlated with

the cost measure, although the cost measure will not include the cost

of managerial time to avoid double counting. Chervany and Dickson [9]

found that some decision makers will take longer to arrive at their

decisions but will make lower cost decisions. The possible correla-

tion between the time and cost measures will be taken into account

through the use of multivariate statistical procedures (viz., MANOVA).

Cost performance. Cost is also commonly used as a performance
criterion when decision effectiveness is discussed [5,9,18,26]. Cost

performance will be measured here by the total of the "checking" and

"missing" costs. For each check made, the decision maker will in-

cur a fixed dollar cost. The checking cost, then, will be given by

the product of the total number of checks made times the fixed cost

per check. For each event that is not detected, the decision maker
will incur a fixed opportunity cost. The cost of missing is then

calculated as the product of the total number of misses times the fixed

cost per miss.
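The cost measure reduces to a one-line function. The $1 per check and $5 per miss figures are the charges used in the experiment of Chapter 3; they are parameters here, not part of the definition above:

```python
CHECK_COST = 1.0  # dollars per check (value used in the experiment)
MISS_COST = 5.0   # dollars per undetected event (value used in the experiment)

def total_cost(checks_made, events_missed):
    """Checking cost plus missing cost, as defined above."""
    return checks_made * CHECK_COST + events_missed * MISS_COST
```

A subject making 40 checks and missing 3 events would incur a total cost of $55.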

Choice behavior. Choice behavior was also included as a criterion

variable to test Conrath's [8] contention that the format in which






probabilistic data is presented influences the choice behavior of the

user. It was assumed this could be the case with the tabular versus

graphical formats in the current study. The graphical format appears

to "bring out" more vividly the information, especially in the case

of the interval estimates (see Figure 2.3, p. 25). The measure used

for choice behavior was the number of checks performed by the decision

maker, disregarding which were successful and which were not.

2.6 Research Hypotheses

The questions raised above are now presented as six testable
hypotheses. Three of the hypotheses relate to format, two to the

level of detail, and one to the number of decision entities in the

report. The hypotheses relating to format are presented first.

1. The layout or physical order of the information in a (H1)
report can reduce decision time. In this case, it is
expected that the users of the due-date ordered reports
will have shorter decision times than the users of the
I.D. ordered reports.

No treatment of this hypothesis was found in the MIS
literature, in either field or laboratory work. There are many

ways in which the same information can be arranged in a report. Even

though the "best" way may usually be considered "apparent" or the

issue simply "unimportant," this may not always be the case.

2. The format in which probabilistic data is presented as a (H2)
basis for choice can influence choice. In this case, it
is expected that the graphical report users will choose
to make more checks than the users of the tabular reports.
This hypothesis was suggested by Conrath [8] but not statis-
tically demonstrated in his paper. He states:

Apparently format has the characteristic
that it can focus one's attention on one







dimension of the choice space, and that
dimension becomes paramount in the de-
cision process. ...Whether the attention
focusing attributes of data format are
the keys to the influence that format has
on choice is a question not yet resolved.
But the question would appear to be
sufficiently important that it should no
longer be ignored [8, p. 880].

The format variable that is expected to have "attention focusing at-

tributes" in this case is the style variable (graphical versus tabular).

As such, the style factor will be the one analyzed in the evaluation

of this hypothesis.

3. Report layout and style can interact to enhance or reduce (H3)
the decision time effects of a particular layout or style.
In this case, it is expected that the users of the I.D.
ordered reports in graphical style will have shorter de-
cision times than the users of the same layout in tabular
style.

The objective in testing this proposition is to demonstrate the

existence of information format characteristics that may have joint

effects on decision performance. Here, the combination of the I.D.

ordering layout with the graphical style is expected to reduce the

long decision times associated with the absence of the convenient due-

date ordering.

The next two hypotheses relate to the level of detail of the

probabilistic information provided.

4. Users of probabilistic data can make effective use of (H4)
information beyond point estimates. In this case, it
is expected that the interval estimates users will make
more cost-effective decisions than the point estimates
users.

The interest in this hypothesis is twofold. First, its evaluation

should give an indication as to whether the users of this type of







report can make effective use of interval estimates. In the real

problem this study is based on, it is expected that interval estimates

can be useful. Second, this proposition provides a setting for testing

Conrath's [8] contention that decision makers are not likely to think

in terms of probability measures other than point estimates.

5. The time to process and effectively use probabilistic (H5)
information is related to the format in which the in-
formation is presented. In this case, it is expected
that the users receiving the interval estimates in the
graphical style will have shorter decision times than
those receiving the interval estimates in the tabular
style.
The difference between H1 and H5 is that H1 refers to the direct

(main effect) influence of layout on performance while H5 refers to
the interaction between a format variable (style) and the level of
probabilistic information provided (point estimates or interval

estimates). The purpose in testing this hypothesis is to show that

different levels of probabilistic information will be more easily

processed and used with different formats of presentation.

The hypothesis relating the number of decision entities to

format preference is:

6. Report users will move from format indifference to for- (H6)
mat preference and their performance will be more sensitive
to format as the number of decision entities on their
report increases. In this case, no significant format
opinion differences are expected among users of reports
with few decision entities in them, with the opposite
expected among users of reports containing many decision
entities. Differences in performance are also expected
to be larger among the users of the reports with many
decision entities.

The objective in testing this proposition is to demonstrate the
existence of a "number of decision entities" variable that should be







considered in MIS design. This variable will occur in most situations

where the number of physical phenomena on which decisions are required

is variable. A procurement manager, for instance, may be indifferent

about the format of his inventory status reports if he must place orders

for only five items. He would probably be concerned about format (and

his performance would be more influenced by format) if he has 250 items

on which to place orders.

2.7 Basic Functional Model

The following model presents, in equation form, the relationships

to be analyzed.

Given equations 1.3 and 1.4,

EU = f(F,L | DE,DM,CIS') (1.3)

P = f(EU | DE,DM,CIS') (1.4)

it follows that

P = f(F,L | DE,DM,CIS') (2.1)

where, P = decision performance

F = format of presentation

L = level of detail

DE = decision environment characteristics

DM = decision maker characteristics

CIS' = other characteristics of the information system.

In words, equation 2.1 states that the format of presentation

and the level of detail are determinants of decision performance







given a particular decision environment, decision maker, and other

characteristics of the information system. In the next chapter, a

table is presented that shows each of the research hypotheses expressed

as a variant of this basic model.














CHAPTER 3

THE EXPERIMENT

3.1 Method

This chapter presents the details of the experiment that was con-

ducted to evaluate the decision performance effects of four factors:

information layout, style of presentation, level of detail, and the

number of decision entities on the report. The material has been

arranged as follows. Section 3.1.1 discusses the nature of the experi-

mental subjects. The methods used for collecting and analyzing the

experimental data are presented in Section 3.1.2. A full description

of the experimental task is given in Section 3.2. Finally, the experi-

mental results that should be observed for the hypotheses to be

supported are discussed in Section 3.3.

3.1.1 Subjects

One-hundred sixty subjects participated in the experiment. The

subjects were undergraduate students in Business Administration who had

completed the first semester of introductory statistics at the Univer-

sity of Puerto Rico, Mayaguez Campus. They were invited to participate

through announcements placed on bulletin boards and read in classrooms.

No monetary incentives were offered but the rate of volunteering was

high: an initial "sign-up" list yielded more than 200 subjects.







Ten subjects were randomly assigned to each of the sixteen (four
factors each at two levels) experimental conditions. The assignments

were made with a random number generator that uniformly distributed

the numbers 1 through 160 among the sixteen conditions until a

schedule was formed with ten numbers assigned to each condition. As

subjects arrived to participate, their order of arrival was checked

against the schedule to determine their condition assignment. It

was felt that this assignment scheme avoided the problem of making

individual subject "appointments" at the same time that it provided

a means for stratifying the assignment of the experimental conditions

through the three months it took to complete the study.
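The assignment scheme can be sketched as follows; the function name, seed, and use of Python's generator are illustrative, since the dissertation does not specify how the random numbers were produced:

```python
import random

def build_schedule(n_subjects=160, n_conditions=16, seed=0):
    """Uniformly distribute arrival numbers 1..n_subjects among the
    experimental conditions, ten per condition, so that a subject's
    order of arrival determines his condition assignment."""
    slots = list(range(1, n_subjects + 1))
    random.Random(seed).shuffle(slots)
    per_cell = n_subjects // n_conditions
    return {c + 1: sorted(slots[c * per_cell:(c + 1) * per_cell])
            for c in range(n_conditions)}
```

Each condition receives exactly ten arrival numbers, and every number from 1 to 160 is assigned exactly once.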

3.1.2 Design and Analysis

Table 3.1 (p. 35) is a summary of the 2^4 factorial experimental
design. The two levels for the number-of-decision-entities variable

were achieved by dividing the experimental subjects into two groups.

One group was assigned to an experimental condition where only five

events of interest had to be checked, referred to below as the "few

decision entities" condition. The five events were selected to form

a stratified representation of the twenty-five events that would take

place in the "many decision entities" condition. User preference for

particular formats was measured with a questionnaire administered to the

subjects at the end of the simulation runs (see Appendix B, p. 78).

In this study, no significant differences in format preference ratings
are expected between the various format combinations among subjects

assigned to the few-decision-entities group. Significant opinion

differences, however, are expected among the ratings of the subjects

























[Table 3.1: Summary of the 2^4 factorial design. Sixteen experimental
conditions are formed by crossing the four factors, each at two levels:
information layout (I.D. vs. due-date order), style of presentation
(tabular vs. graphical), level of detail (point vs. interval estimates),
and number of decision entities (few vs. many).]






assigned to the many-decision-entities group. Differences in decision
performance are also expected to be larger for the many decision

entities group.

Each of the sixteen resulting cells contained observations of 10
subjects' decision time, total checks made, total cost, and the five

ratings to the format opinion questionnaire. The data was analyzed

using Multivariate Analysis of Variance (MANOVA).2 MANOVA procedures

have the advantage of considering correlation among the dependent vari-

ables [6]. Peter et al. [24], suggest that the technique should be
used, as opposed to the univariate ANOVA, whenever there is reason to

believe that multiple dependent variables might be correlated. Winer

[30, p. 232] points out that by considering possibly correlated depen-
dent variables in a series of independent univariate tests, one fails
to obtain information about the total effect of the experimental treat-
ments on all the criteria simultaneously. In the case of experimental

MIS research, a close correlation has been suggested between time and
cost, two of the criteria most frequently considered in the literature.

In none of the reviewed literature, however, was MANOVA used.

In the current study, separate ANOVA's will be conducted on
each of the dependent variables after overall significance is obtained



1Decision time was rounded to the nearest minute and did not in-
clude the time devoted to the post-experimental questionnaire. The
subjects were clocked as soon as their last "check" decision was made.
2BMD12V Multivariate Analysis of Variance and Covariance, Health
Sciences Computing Facility, Department of Biomathematics, School of
Medicine, University of California, Los Angeles, 1976, p. 751.






by MANOVA. This procedure is necessary in order to evaluate directional

hypotheses relating to specific dependent variables. Further investi-

gation into the directions of obtained differences will be conducted

using Scheffé's post hoc test for comparisons between means [17, pp.
483-486].


3.2 Experimental Task

The experimental subjects acted as the decision makers in the

problem described in Chapter 2. A computer simulation of the decision

environment was created which modelled the essential features1 of the

decision problem. Those features are:

1. The decision maker wants to detect a number of events that
occur at random with normally distributed intervals between
successive occurrences.

2. He has data on the random events and wants to use it to make
cost-effective decisions on when to check each event.

3. He incurs a fixed opportunity cost for each event that occurs
and goes undetected.

4. He incurs a fixed cost for each "check" that he makes on an event.

5. His objective is to minimize the total combined costs of
"checking" and "missing" the events.

These features were incorporated into the simulation model as
follows:

1. A hypothetical data set for the means, x̄i, and standard
deviations, si, for the between-occurrences interval of each
event i was used to generate "actual" occurrence



1Van Horn recommends that a good guide in developing an
effective prototype is "to restrict the prototype content to the
minimum set of features that are directly relevant to the problem
modeled" [28, p. 179]. His advice was followed here.







dates for a number of hypothetical events.1 Table 3.2 (p.39)
shows that data in two parts: the data used for generating
the 25 events in the "many decision entities" condition, and
the data used for generating the 5 events in the "few
decision entities" condition. The "generator" was validated
to verify that the events were generated according to a normal
distribution with the means and standard deviations indicated
in Table 3.2.

2. Based on the data in Table 3.2, forecasts for the event
occurrence dates were prepared and presented in report form.
These were the experimental reports (conditions) administered
to the subjects to help them in making their daily check
decisions. Several of these reports have already been shown
shown in Chapter 2. The rest are shown in Appendix A, p. 69.
3. The subjects were told that undetected events at the end
of the simulation would cost them $5 each.2 These would be
counted and multiplied by $5 to determine their total "missing"
cost.

4. Subjects were also told that each and every check made during
the run would be charged at $1 per check. These would be
counted at the end of the run to determine their total
"checking" cost.

5. Finally, subjects were instructed that their objective in the
game was to minimize their total cost figured as the sum of
their "checking" and "missing" costs. Subjects were told that
their run time was also being measured, but were given no
time limit or other time pressures.
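Feature 1 of the model can be sketched as follows. The function name, seed, and the clamping of each interval to at least one day are assumptions for illustration; the dissertation gives only the distributional requirement:

```python
import random

def generate_occurrences(mean_days, sd_days, horizon=20, seed=0):
    """Generate 'actual' occurrence days (1..horizon) for one event,
    with normally distributed intervals (mean_days, sd_days) between
    successive occurrences, starting from day 0."""
    rng = random.Random(seed)
    day, occurrences = 0, []
    while True:
        # Draw the next interval; keep it at least one whole day.
        day += max(1, round(rng.gauss(mean_days, sd_days)))
        if day > horizon:
            break
        occurrences.append(day)
    return occurrences
```

Running this for each event's (x̄i, si) pair yields the hidden occurrence dates against which subjects' checks are scored.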

Subjects interacted with the simulator through typewriter-type

computer terminals in deciding which events to check for at each of

twenty decision points (days). At each decision point, the subject

chose the I.D. numbers of the events he wanted to check and entered

them for processing by the simulator. The simulator reported whether

or not the events checked occurred on that day, i.e., whether the

checks were successful or not.
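The simulator's response at each decision point can be sketched as a lookup against the hidden occurrence dates; the data structure and names are illustrative, not taken from the dissertation's program:

```python
def process_checks(day, checked_ids, occurrences):
    """For each event I.D. checked on the given day, report whether
    the event occurred (and so was detected) that day. `occurrences`
    maps each event I.D. to the set of days on which it occurs."""
    return {eid: day in occurrences.get(eid, set()) for eid in checked_ids}
```

Each check is charged whether or not it succeeds; an event whose occurrence day is never checked counts as a miss at the end of the run.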



1Another Van Horn guide followed here in developing the present
prototype is "...to replace large actual data bases with small, care-
fully stratified representations." [28, p. 179].
2This and subsequent dollar figures were also simulated. No
monetary incentive scheme was used to make subjects do "their best."










TABLE 3.2

DATA FOR THE EXPERIMENT


Event Date of Last Number of Days between Occurrences
I.D. Occurrence Mean Standard Deviation
____


A. Data for the many-decision-entities condition


B. Data for the few-decision-entities condition


Event Date of Last Number of Days between Occurrences
I.D. Occurrence Mean Standard Deviation
(x̄i) (si)










Uniform instructions were administered to all subjects regarding

the general nature of the experiment and their participation (see

Appendix D, p. 89). Special care was given to ensure that subjects

understood their objective in the game. A familiarization session

consisting of five decision points (days) was conducted to acquaint

subjects with their decision environment. These sessions were con-

sidered to be long enough to check subject "learning" effects during

the experimental runs. In no case did data collection begin until

subjects indicated that they felt comfortable with the procedure and

ready to start.1

Three remote terminals were used for conducting the runs. Each

was on line with a master program that maintained a file with the results

of each run. The file included the three performance criteria and the

scores of the post-experimental questionnaire.

3.3 Evaluation of Hypotheses

Table 3.3 (p. 41) shows the main effects and interactions that

should be observed for the hypotheses to be supported. Each of these
effects is discussed next.

Hypothesis 1. Information layout (factor A in Table 3.1) should

affect decision time such that the subjects with due-date ordered reports

should have shorter decision times than those with I.D. ordered reports,

i.e., A2 < A1 : TIME. The due-date order should be more convenient

given the chronological way in which the information is going to be used.



1The average familiarization session took 15.2 minutes.


























42

Hypothesis 2. The total number of checks made should be affected

by the style treatment (factor B). It is expected that the number of

checks made by the graphical style users will be larger than that
made by the tabular style users, i.e., B2 > B1 : CHECKS, since the

graphical style appears to illustrate the "choice space" more clearly,
thus inviting more check decisions.

Hypothesis 3. Decision times of subjects receiving some com-

bination of layout and style (A,B) should be significantly different

from decision times of subjects receiving some other combination.

Specifically it is expected that subjects receiving I.D. ordered

reports and the graphical style will have shorter decision times than

those receiving the I.D. ordered report but not the graphical style,

i.e., A1B2 < A1B1 : TIME. The graphical style should reduce the

need for the time-convenient due-date ordering.

Hypothesis 4. Subjects receiving the interval estimates should

perform better than those receiving only point estimates, i.e., it is
expected that C2 < C1 : COST. The interval estimate subjects will
have more information on the random nature of the events.

Hypothesis 5. Decision times of subjects receiving some combina-
tion of style and level of detail (B,C) should be significantly

different from decision times of subjects receiving some other com-
bination. In particular, it is expected that the subjects receiving
interval estimates in the graphical style will have shorter decision

times than those receiving the interval estimates in the tabular
style, i.e., B2C2 < B1C2 : TIME. The graphical style should make

it easier for users to process the interval estimates information.





43

Hypothesis 6. Differences in format opinion should be observed
among the various layout and style treatments administered to the
many-decision-entities subjects (D1). Differences in both time and
cost performance should also be observed among this group. This
will indicate that report users have preference differences for
format and their performance is more sensitive to format when the
number of decision entities on their report is large. Non-significant
differences should result among the same layout and style combinations
administered to the few-decision-entities subjects (D2). In general,
it is expected that the opinion ratings will average higher for the
few-decision-entities group, i.e., D2 > D1 : RATINGS. Differences

in A1D1 versus A2D1, B1D1 versus B2D1, and A1B1D1 versus A1B2D1 are expected to be
significant for cost, time and the opinion ratings. The same com-
parisons with D2 instead of D1 are not expected to be significant
(the few-decision-entities case). The experimental results are pre-
sented in the next chapter.














CHAPTER 4

EXPERIMENTAL RESULTS

4.1 Introduction

The statistical results of the experiment are presented in this

chapter. Table 4.1 (p. 45) shows the cell means obtained for the three

performance variables and the five format opinion questions. The

results revealed significant differences among treatment means to

support five of the six research hypotheses.

4.2 Results

4.2.1 Effect of Layout on Decision Time

The first hypothesis, that information layout can reduce decision

time, was supported. As Table 4.1 shows, decision time was shorter for

the subjects receiving the due-date ordered reports, A2, than for

those receiving the I.D. ordered reports, A1 (13.8 versus 16.4 minutes).

A multivariate test on the three performance variables showed a sig-

nificant layout main effect (F = 7.41, p < .00001).1 A univariate

test on the decision time variable also revealed a significant dif-

ference between the two layout groups (A2 < A1, F = 13.35, p < .003).
A Scheffe post-hoc test revealed an even stronger relationship when



1All multivariate F's presented here are based on 3 and 142 degrees
of freedom. All the univariate F's are based on 1 and 144 degrees of
freedom.

















46
only the many-decision-entities subjects, D1, were considered. With-
in this group, the subjects receiving the due-date ordered reports

had significantly shorter decision times (A2D1 < A1D1, F = 26.76,
p < .00001). No significant interactions were observed within the
few-decision-entities group. In all the tests, decision time was

significantly shorter for the subjects receiving the due-date
ordered reports, thus supporting the hypothesis.
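As a rough sanity check on F statistics like those above (1 and 144 degrees of freedom for the univariate tests), the tail probability of an F variate can be estimated by simulation. The function below is an illustrative sketch added for this purpose, not part of the original analysis:

```python
import random

random.seed(0)

def f_tail_mc(f_obs, dfn, dfd, n=20000):
    """Monte Carlo estimate of P(F >= f_obs) for an F(dfn, dfd) variate,
    formed as the ratio of two scaled chi-square draws."""
    hits = 0
    for _ in range(n):
        num = sum(random.gauss(0, 1) ** 2 for _ in range(dfn)) / dfn
        den = sum(random.gauss(0, 1) ** 2 for _ in range(dfd)) / dfd
        if num / den >= f_obs:
            hits += 1
    return hits / n

# Layout main effect on decision time: F = 13.35 on 1 and 144 df.
print(f_tail_mc(13.35, 1, 144))  # a small tail probability, consistent
                                 # with the reported p < .003
```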

4.2.2 Influence of Format on Choice Behavior

The second hypothesis, that the format in which probabilistic
information is presented can influence choice behavior, was supported.
As Table 4.1 illustrates, the total number of checks made was higher
for the subjects using the graphical style, B2, as opposed to the
tabular style, B1. A multivariate test revealed a weak interaction
between style and number of decision entities (F = 2.60, p < .06).
Univariate tests on the number of checks variable showed a weaker style
main effect (B2 > B1, F = 3.38, p < .07) and a stronger style and
number of decision entities interaction (F = 4.75, p < .03). With-
in the many-decision-entities group, a Scheffe test revealed that
the average number of checks was significantly higher for the
graphical report users (73.5 versus 66.0, F = 8.07, p < .005). No
significant differences were found in the number of checks made within
the few-decision-entities group, and a comparison of the differences
in the number of checks between the two style subjects for the D1
and D2 groups was highly significant ([B2D1 - B1D1] > [B2D2 - B1D2],
F = 131.83, p < .00001). Presuming that the total number of checks
made, regardless of success, was a reasonable measure of choice






47
behavior in this problem, the results support Conrath's [8] contention
that presentation format influences choice behavior.

4.2.3 Joint Effect of Layout and Style on Decision Time

The third hypothesis, that information layout and style can inter-

act to reduce decision time, was supported. Table 4.1 shows that,

within the many decision entities group, subjects using the I.D.

ordered reports, Al, had shorter decision times when they also re-

ceived the graphical style, B2. A multivariate test revealed a mar-
ginal interaction between style and number of decision entities

(F = 2.43, p < .08). Univariate tests on the decision time variable
showed a stronger ABD interaction (F = 6.20, p < .02). A Scheffe

test revealed that the significance was due to the shorter decision
times of the subjects receiving the I.D. ordered reports in graphical

style (A1B2D1 < A1B1D1, F = 7.01, p < .009). The fact that a
significant layout and style interaction was observed only within the

many-decision-entities group also supports H6: that performance1

becomes more sensitive to format as the number of decision entities

on the report increases. This is also demonstrated by the fact that,

within the few-decision-entities group, both the layout main effect

(F ≈ 0, p ≈ 1) and the interaction between layout and style (F = .78,

p > .35) were not significant.

4.2.4 Effect of Probabilistic Detail on Cost Performance

The fourth hypothesis, that users of probabilistic data can make
cost-effective use of information beyond point estimates, was not


1Time performance in this case.






48
supported. Subjects receiving the interval estimates treatment, C2,

had lower costs than those receiving the point estimates, C1, but
the difference was not significant. The univariate level-of-detail

main effect, with cost as the dependent variable, had F = .34, and

the Scheffe test on C1D1 versus C2D1 was also non-significant (F = 2.42,
p > .10). Contrary to the author's expectation, these results do not

systematically support the hypothesis, though the directions are as

predicted, nor do they support Conrath's [8] argument that decision

makers do better with point estimates than with other probability
measures.

4.2.5 Joint Effect of Format and Level of Detail on Decision Time

Support of the fifth hypothesis was weak. The hypothesis is that
the format in which probabilistic data is presented interacts with the
level of detail to influence the time required to process and use the

information. The multivariate level-of-detail and style interaction
was not significant (F = 1.19, p > .25). There was a significant
univariate interaction between style, level of detail, and the number

of decision entities on the report (F = 3.70, p < .05). In particular,
the many-decision-entities subjects, D1, receiving interval estimates,

C2, in graphical style, B2, had significantly shorter decision times
than those receiving the same level of detail but in tabular style

(B2C2D1 < B1C2D1, F = 4.95, p < .03). This result suggests that
certain formats may be better for reporting certain levels of prob-
abilistic detail, but the absence of a significant multivariate
effect makes the inference rather weak.








4.2.6 Relation between Number of Decision Entities and Format

The sixth hypothesis, that report users' preference for and

sensitivity to format is related to the number of decision entities

on their reports, was supported. Table 4.1 shows that subjects with-

in the many-decision-entities group gave significantly different

ratings to the various layout and style combinations. Two of the five

opinion questions were used to verify that the subjects understood

and systematically answered the post-experimental questionnaire

(see Appendix B, p. 78). The validation consisted of checking that
the ratings for these two questions were consistent with performance
of subjects receiving the particular treatments mentioned in the

questions:

(1) Question 1 asked the subjects to rate the order of
the information in the reports. Table 4.1 shows
that the subjects receiving the due-date ordered
reports gave significantly higher ratings to this
item than those receiving the I.D. ordered reports
(A2 > A1, F = 17.33, p < .00004).

(2) Question 4 asked the subjects to rate the level of
probabilistic detail given. Table 4.1 illustrates
that the subjects receiving the interval estimates
consistently gave higher ratings to this item than
those receiving the point estimates treatment
(C2 > C1, F = 18.86, p < .00002).

In the case of Question 1, the many-decision-entities subjects

gave significantly higher ratings to the due-date ordered reports
(A2D1 > A1D1, F = 19.08, p < .12). In Question 5, where the subjects
were asked to give an over-all rating for the format of their reports,
there was a significant difference in ratings between the many and

few-decision-entities groups (D2 > D1, F = 5.23, p < .03).






50

With regard to the relationship between the number of decision

entities and the sensitivity of performance to format, the discussion

of the first five hypotheses has shown that the performance of the

many-decision-entities subjects was more sensitive to format than

that of the few-decision-entities subjects. In all the comparisons

the differences in performance were larger among the many-decision-

entities subjects than among the few-decision-entities subjects.














CHAPTER 5

DISCUSSION OF RESULTS

5.1 Summary of Findings

Tables 5.1, 5.2, and 5.3 (pp. 52-55) present a summary of the

experimental results that had a significance level of p <.10 or better.

The results have been grouped into main effects, interaction effects

involving the number-of-decision-entities variable, and other interaction

effects. In each case the actual significance figure has been given so

that the reader can make his own judgement on the significance of each

result. The hypotheses relating to each result are also shown in the

right margin, along with a line reference number, to facilitate the dis-

cussion in the following sections.

The results are discussed first for the multivariate (MANOVA) effects.

These do not relate to any hypothesis in particular, since the hypotheses

have been stated in terms of the effect of the experimental treatments on

specific criterion variables. They contain, however, important infor-

mation about the total effect of the treatments on decision performance

in general. The univariate effects are discussed next as they relate to

each hypothesis. In each case, the implications for both the MIS re-

searcher and practitioner are discussed.

5.2 The Multivariate Effects

Information layout (I.D. versus due-date ordering) was found to


























56

affect decision performance in general (5.1.1). Multivariate analysis

is called for here since the performance criteria measured (decision

time, cost, and number of checks made) are not independent. In the pre-

sent study, the simultaneous effect of layout on the three performance

criteria gives more value to the observed univariate effect on decision

time. If Hypothesis 1 (p. 28) had read "information layout can influence

decision performance," it would have been equally supported. The fact

that the effect observed was stronger among the many-decision-entities

subjects (5.2.1) also lends support to Hypothesis 6 (p. 30).

The style of presentation (tabular versus graphical) was also seen

to have a significant total effect on decision performance (5.2.3).

Although the univariate effect of style on cost was not significant, the

graphical style users within the many-decision-entities group had higher

costs than the tabular style users ($104.73 versus $100.95, p = .258).

This result contradicts that of Benbasat and Schroeder [5], who

observed that subjects with graphical reports had lower costs than those

with tabular listings. Neither result, however, is significant (theirs

had p = .148). This indicates that there is still no basis for predicting

the effect of presentation style on cost performance. Either result

could have been due to chance alone.

In terms of future MIS research, it is suggested here that MANOVA

should be used in the analysis of experimental data. Winer [30, p. 232]

explains that whenever there is reason to believe that dependent variables



1The number in parentheses is the reference to the related result in
the tables.






57
are correlated, these should be considered simultaneously to obtain infor-

mation about the "total" effect of the experimental variables. In the case

of experimental MIS research, there is reason to believe that commonly

used criteria, such as cost performance and decision time, are correlated

[5, 9].
From the point of view of the MIS practitioner, these multivariate

results point to one conclusion: the format in which information is pre-

sented can influence decision performance. The multivariate separate

and joint effects of the two format variables in this study, layout and

style, support this view. The specific directions of these effects are

discussed next.

5.3 The Univariate Effects

5.3.1 Effects Related to H1 and H3

The influence of information layout on decision time was found to be

strong (5.1.2), as predicted in Hypothesis 1 (p. 28). The shorter deci-

sion times for the subjects with due-date as opposed to I.D. ordered

reports were expected, since the due-date ordering was more convenient in

the present problem. This hypothesis, however, was evaluated for two

reasons. The first was to demonstrate the importance of arranging the

information in a manner consistent with the way information is used. This

seemingly obvious observation appears to have been ignored in many reports

this author has had to use. The second reason was to prepare a basis

for Hypothesis 3 (p.29). There, it is proposed that long decision times

due to an inconvenient information layout can be reduced by introducing

a second format element, namely, the graphical style. The time dimension

added by the graphical style had the effect of reducing the need for the






58
due-date ordering, while still maintaining a desirable feature of the

report (the I.D. ordering of the events). This is evidenced by the re-

sult in Table 5.2, line 6.

These results have other implications, besides supporting H1 and

H3. For the MIS researcher, they suggest the need for more investigation

on the layout variable. It might be revealing, for example, to look at

information layout schemes for information on events that are less time-

dependent in nature than the ones studied here.1 Maintenance data on

some mechanical process, for example, could provide a setting for an in-

teresting and practical experiment.

To the MIS practitioner, and in particular to the person in charge
of designing information formats, the results emphasize the importance

of reporting information in a manner consistent with the way recipients

use it. Also, the observed interaction between layout and style suggests

that practitioners should be on the alert for joint effects among format

elements that can work to their advantage in enhancing the readability

of the report.

5.3.2 Effects Related to H2

Perhaps most striking was the result that subjects with graphical re-
ports chose to make substantially more "checks" than subjects with tabular

reports (5.1.3). This supports Hypothesis 2 (p. 28), namely, that the for-

mat in which probabilistic information is presented can influence choice.

From a research point of view, a question that remains to be answered is



1All events in this world are probably time dependent, but their
occurrence may be more dependent on time for some types (e.g., biological)
than for others (e.g., electrical components).






59
whether the observed effect was related to the short duration of the

experiment, or whether the effect would have continued even if the sub-

jects had been given enough time to get fully acquainted with their re-

port style. In either case, the result observed here is an important

finding since many real-life managerial reports have short-term use, are
"one-shot-non-recurrent" reports, and, very frequently, contain informa-

tion of a probabilistic nature. Ergo, the information analyst who must

report probabilistic information as a basis for decisions appears to have

a delicate problem at hand: if the format in which he presents the in-

formation is going to bias the choice of the decision maker, he will

surely want that bias to be in the "correct" direction. This point is

also related to the issue of normative versus descriptive reports, and is

a point that should be further investigated elsewhere.

5.3.3 Effects Related to H5

A result closely related to the preceding discussion provided

support for Hypothesis 5 (p. 30; 5.2.7). Within the many-decision-entities

group, the subjects with interval estimates had shorter decision times

when the information was given to them in graphical as opposed to tabular

style (21.4 versus 24.6 minutes, p = .026). Point estimates users, how-

ever, did not experience the same benefits in moving from the tabular

to the graphical style. Their average decision time, in fact, was higher

with the graphical style than with the tabular style (22.1 versus 20.7

minutes, p = .32). These results suggest that different levels of

probabilistic information may be more appropriately reported using dif-

ferent presentation formats. In the present experiment, subjects with

the tabular style did as well as or better than subjects with the graphical








style when only point estimates were reported. The compact tabular for-

mat was inadequate, however, for processing the interval estimates in-

formation.

5.3.4 Effects Related to H6

All the effects that have been discussed thus far were found to be

more marked within the many-decision-entities group (5.2.2, 5.2.4, 5.2.6,

5.2.7). This supports one of the propositions in Hypothesis 6 (p. 30),

that user performance becomes more sensitive to report format as the

number of decision entities on the report increases. In the present ex-

periment, the subjects with five decision entities on their reports had

so little information to process that whether it was given in I.D. order,

due-date order, tabular style or graphical style did not make much

difference on their performance. Evidence of this is that, within the

five decision events group, there were no significant differences in

performance for any of the performance measures. The only effect that

approached significance in that group was an interaction between style

and level of detail with cost as the dependent variable (p < .10).

The other proposition in H6, that report users will move from in-

difference to preference for particular formats as the number of decision

entities increases, was also supported. The subjects in the few-decision-

entities group gave more or less constant high ratings to the various

format characteristics of their reports (5.2.8, 5.2.9). Within that

group, there were no significant differences in the ratings for the order

of the information in the reports (layout). For the over-all format

rating, only one difference was significant. Interval estimate subjects

gave significantly higher ratings than point estimate subjects (4.25

versus 3.80, p = .044). Among the subjects with twenty-five decision






61
entities, the story was quite different (5.2.8, 5.2.9, 5.3.3, 5.3.4).

There were significant differences between their ratings in the various

layout and style conditions.

For the MIS practitioner, these results suggest that he should give

careful attention to report format, especially when the report must grow.

The results obtained here give meaning to Voich et al.'s statement,

"As the complexity of a report increases, its likelihood or extent of use

falls" [28, p. 229]. When preparing a report for a procurement manager,

for example, a standard layout by major classes of items, code number,

etc., may be appropriate if the number of items that must be ordered each

time, and their frequencies of ordering, are small. If the number of

orders that must be placed were to increase considerably, it may be to

the manager's advantage to have the layout of his report revised. A more

favorable layout in that case could be, for example, to have the items

arranged according to the frequency with which they are ordered.

For format revisions or similar actions to occur, the channels of

communication between the information analyst and user must first be im-

proved. At the present, there appears to be a "tail versus dog" problem

between information users and providers when it comes to seemingly un-

important matters, such as designing a format for a report. Voich et al.

state:

Report formats are often not tailored
precisely to users' needs. One reason
for this is the programming costs asso-
ciated with special arrangements of in-
formation, especially if several dif-
ferent users each request a unique for-
mat. A second reason for finding formats
not tailored exactly to user's needs is
that report designs are often based on






62

the system analysts' or programmers' pre-
ferences for programming ease [28, p. 229].

It would appear that better communication channels between the

analyst and the user should, at least, help to alleviate the second

reason noted above.












CHAPTER 6

SUMMARY AND POSSIBLE EXTENSIONS

The present study has considered some of the implications of the

relationship between information format and decision performance. A

specific information-decision problem was abstracted to create a simu-

lated decision environment within which alternative forms of presenting

information relevant to the problem were experimentally manipulated.

Six hypotheses were tested in relation to the effects of the information

format treatments on subject performance. The experimental data supported

five of the six hypotheses. As is always the case with empirical re-

search, however, a number of questions can be raised in connection with

the observed results. Some questions result from inquiring into the

limitations of the present study. Others follow logically from the re-

sults.

Among the limitations, there is the problem of having used student

subjects as surrogates for managers [13]. The actual managers in the

real-life problem modeled could have served as subjects in a field study.

This, of course, may bring about other complications, in particular,

problems of experimental control. Van Horn states:

The unifying theme of field tests is sad
stories. In every one, operational con-
siderations (understandably) dominate
test conditions. As soon as a conflict
arises, the test yields. Even if a test
proceeds to completion, endless arguments
arise over interpretation of the results
[28, p. 175].









Going to the field also implies having to deal with uncooperative

mother nature, as opposed to pre-chosen probability distributions for

the events of interest. Notwithstanding this dismal picture, a field

experiment should be useful. By establishing the external validity of

the present study with respect to the subject population, decision-

making conditions, and other areas of interest, its benefit for the

actual population can be established and considered in the design of

a field study. A practical field study could be, for example, one in

which a more normative report is provided to the manager facing the

heat detection problem. Such a report could be based on some optimal

decision rule indicating to the dairyman the days in which he should

check for heat in particular cows. The decision rule would have to be

based on some cost estimates (costs of "checking," "missing," and

breeding after a successful "check"), and on some probability estimates

(probability of detecting heat on given "checks," probability of a

successful breeding once heat is detected, etc.).
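In its simplest myopic form, a decision rule of the kind sketched in this paragraph might look like the following. The dollar figures and probabilities are illustrative assumptions only, not estimates from the problem:

```python
# Myopic check rule: check a cow on a given day when the expected cost
# of skipping (probability of heat that day times the cost of a miss)
# exceeds the cost of a check. All figures are assumed for illustration.
CHECK_COST = 1.0   # assumed cost of one heat check
MISS_COST = 5.0    # assumed cost of missing a heat

def should_check(p_heat_today):
    return p_heat_today * MISS_COST > CHECK_COST

print(should_check(0.30))  # True:  expected miss cost 1.50 > 1.00
print(should_check(0.10))  # False: expected miss cost 0.50 < 1.00
```

A full rule would also fold in the value of breeding after a successful check and would look more than one day ahead.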

Although growing fast, "experimental work on MIS is still in its

infancy" [5, p. 17]. Many promising areas have not been investigated.

One line of research that follows from the present results is the re-

lationship between the number of decision entities in the report and

the sensitivity of performance to format variables. Various number-

of-decision-entities levels could be manipulated in a parametric study
to investigate such questions as, "Can a report user adapt to an

increasing number of decision entities in his report without a rapid

deterioration in his performance?" To investigate this question

the same subjects would have to be given increasing numbers of decision

entities, and this could bring up problems of subject "learning." If

properly controlled, however, an experiment along these lines could








shed light on such questions as, "At what point would it have been

appropriate to have a format revision?"

Another area that appears to need more consideration is the analysis

of interaction effects among information structure characteristics [5, 12].

In this study, an interaction was found between the level of detail and

format, suggesting that different levels of detail may be more easily

processed with different styles of presentation. The validity of findings

such as this one should be further investigated in other decision contexts.

Finally, research relating empirical MIS findings to current trends in the
theory of human information processing may be useful in providing a better

understanding of the results observed.














BIBLIOGRAPHY

1. Ackoff, Russell L., "Management Misinformation Systems," Manage-
ment Science, 14, pp. 147-156, 1967.

2. Barr, Harry L., "Influence of Estrus Detection on Days Open in
Dairy Herds," Journal of Dairy Science, 58, pp. 246-247, 1975.

3. Barr, Harry L., "Breed at Forty Days to Reduce Days Open,"
Hoard's Dairyman, 120, p. 1115, 1975.

4. Barrett, M. J., N. L. Chervany and G. W. Dickson, "On Some Aspects
of the Validity of an Experimental Simulator in MIS Research,"
Working Paper 72-02, MIS Research Center, University of Min-
nesota, January, 1973.

5. Benbasat, I. and R. Schroeder, "An Experimental Investigation of
Some MIS Design Variables," Working Paper 75-01, MIS Research
Center, University of Minnesota, September, 1974.

6. Bock, R. D. and E. A. Haggard, "The Use of Multivariate Analysis
of Variance in Behavioral Research," in D. K. Whitla (Ed.),
Handbook of Measurement and Assessment in Behavioral Sciences,
Addison-Wesley Publishing Co., Reading, Massachusetts, pp. 100-142,
1968.

7. Conlin, B. J., "Use of Records in Managing for Good Lactational
and Reproductive Performance," Journal of Dairy Science, 51, pp.
377-385, 1974.

8. Conrath, David W., "From Statistical Decision Theory to Practice:
Some Problems with the Transition," Management Science, 19, pp.
873-883, 1973.

9. Chervany, N. L. and G. W. Dickson, "An Experimental Evaluation of
Information Overload in a Production Environment," Management
Science, 20, pp. 1335-1344, 1974.

10. Chervany, N. L., G. W. Dickson and K. A. Kozar, "An Experimental
Gaming Framework for Investigating the Influence of Management
Information Systems on Decision Effectiveness," Working Paper
71-12, MIS Research Center, University of Minnesota, 1972.








11. Dancer, Robert E.,"An Empirical Evaluation of Constant and Adaptive
Computer Forecasting Models for Inventory Control," Decision
Sciences, 8, pp. 228-238, 1977.

12. Dickson, G. W., J. A. Senn and N. L. Chervany, "Research in Manage-
ment Information-Decision Systems: The Minnesota Experiments,"
Working Paper 75-08, MIS Research Center, University of Minnesota,
May, 1975.

13. Fleming, J. E., "Managers as Subjects in Business Decisions Re-
search," Academy of Management Journal, 12, pp. 59-66, 1969.

14. Foote, R. H., "Estrus Detection and Estrus Detection Aids,"
Journal of Dairy Science, 58, pp. 248-256, 1975.

15. Gehrlein, W. V. and P. C. Fishburn, "Information Overload in
Mechanical Processes," Management Science, 23, pp. 391-398, 1976.

16. Gorry, G. A. and M. S. Scott Morton, "A Framework for Management
Information Systems," Sloan Management Review, 13, pp. 55-70, 1971.

17. Hays, W. L., Statistics for Psychologists, Holt, Rinehart and
Winston, Inc., New York, New York, 1963.

18. Kozar, K. A., Decision Making in a Simulated Environment: A
Comparative Analysis of Computer Data Display Media, Ph.D. Thesis,
University of Minnesota, 1972.

19. Louca, A. and J. E. Legates, "Production Losses in Dairy Cattle Due
to Days Open," Journal of Dairy Science, 51, pp. 573-583, 1968.

20. Lucas, Henry C., "Performance and the Use of an Information System,"
Management Science, 21, pp. 908-919, 1975.

21. Lucas, Henry C., "A Descriptive Model of Information Systems in
the Context of the Organization," Data Base, 5, pp. 27-36, 1973.

22. Mason, R. O. and I. I. Mitroff, "A Program for Research on Manage-
ment Information Systems," Management Science, 19, pp. 475-487, 1973.

23. Murdick, R. G. and J. E. Ross, Information Systems for Modern
Management, Prentice-Hall, Inc., Englewood Cliffs, New Jersey,
1975.

24. Peter, J. P., M. J. Ryan and R. E. Hughes, "A MANOVA Approach to
Disentangling Correlated Dependent Variables in Organizational Re-
search," Academy of Management Journal, 18, pp. 904-911, 1975.

25. Schroeder, R. G. and I. Benbasat, "An Experimental Evaluation of the
Relationship of Uncertainty in the Environment to Information Used
by Decision Makers," Decision Sciences, 6, pp. 556-567, 1975.







26. Senn, J. A. and G. W. Dickson, "Information System Structure
and Purchasing Decision Effectiveness," Journal of Purchasing
and Materials Management, 10, No. 3, pp. 52-64, 1974.

27. Twenty-one Day Reproductive Calendar, Agricultural Extension
Service, Iowa State University, Ames, Iowa, 1970.

28. Van Horn, R. L., "Empirical Studies of Management Information
Systems," Data Base, 5, pp. 172-180, 1973.

29. Voich, D., H. J. Mottice and W. A. Shrode, Information Systems
for Operations and Management, South-Western Publishing Co.,
Cincinnati, Ohio, 1975.

30. Winer, B. J., Statistical Principles in Experimental Design,
McGraw-Hill Book Co., New York, New York, 1971.

31. Zannetos, Z. S., Discussion Comments for "An Overview of
Management Information Systems," Data Base, 5, pp. 13-14,
1973.













APPENDIX A

EXPERIMENTAL TREATMENTS

The sixteen reports that constituted the experimental treatments

are shown here at the same size at which they were administered to the

subjects. The only difference between these reports and those used

by the subjects is that the latter had horizontal green lines across

them to facilitate their use. Each report is labeled with the
"condition" numbers used in Table 3.1 (p. 35).
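The interval endpoints printed in the confidence-interval reports (Conditions 3, 4, 7, 8, 11, 12, 15, and 16) are consistent with the event's expected date plus or minus 1.96 standard deviations, rounded to the nearest whole day. A sketch of that construction follows, assuming the Table 3.2 parameters reproduced in Appendix C; the Python function is illustrative and is not part of the original BASIC program.

```python
def interval_row(d_last, mean_int, sd):
    """95% interval endpoints for an event last seen on May d_last,
    with mean recurrence interval mean_int days and std. dev. sd."""
    center = d_last + mean_int            # expected day, counted from May 1
    first = round(center - 1.96 * sd)
    last = round(center + 1.96 * sd)
    def fmt(day):                         # May has 31 days
        return "5-%02d" % day if day <= 31 else "6-%02d" % (day - 31)
    return fmt(first), fmt(last)

# Event 004 (last seen May 18, interval 20.5 days, sd 1.00):
# interval_row(18, 20.5, 1.00) reproduces the printed 6-06 to 6-09.
```

The same arithmetic reproduces, for example, event 032 (5-25 to 5-28) and event 009 (5-27 to 5-31) as printed in Condition 3.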

















EVENT EXPECTED DUE-DATE
IDENT. (MONTH-DAY)

004 6-07
009 5-29
017 6-11
024 6-04
032 5-26
038 6-04
051 5-31
070 6-01
076 6-10
078 5-30
082 6-03
085 6-10
097 6-05
110 6-11
121 6-05
128 6-01
142 6-09
146 5-28
155 6-12
163 6-09
168 6-07
171 5-31
173 6-06
177 6-06
186 5-31
Condition 1


EVENT EXPECTED DUE-DATE
IDENT. (MONTH-DAY)
----------------------..----
032 5-26
146 5-28
009 5-29
078 5-30
171 5-31
051 5-31
186 5-31
070 6-01
128 6-01
082 6-03
024 6-04
038 6-04
121 6-05
097 6-05
173 6-06
177 6-06
168 6-07
004 6-07
163 6-09
142 6-09
085 6-10
076 6-10
110 6-11
017 6-11
155 6-12

Condition 2

















EVENT 95% CONFIDENCE INTERVAL
IDENT. (FIRST DAY, LAST DAY)

004 6-06 6-09
009 5-27 5-31
017 6-10 6-12
024 6-01 6-07
032 5-25 5-28
038 6-02 6-06
051 5-29 6-03
070 5-31 6-02
076 6-09 6-12
078 5-27 6-02
082 6-01 6-05
085 6-09 6-14
097 6-04 6-07
110 6-10 6-12
121 6-03 6-07
128 5-30 6-04
142 6-06 6-12
146 5-26 5-30
155 6-11 6-13
163 6-07 6-12
168 6-04 6-10
171 5-28 6-03
173 6-05 6-07
177 6-05 6-08
186 5-29 6-03

Condition 3


EVENT 95% CONFIDENCE INTERVAL
IDENT. (FIRST DAY, LAST DAY)

032 5-25 5-28
146 5-26 5-30
009 5-27 5-31
078 5-27 6-02
171 5-28 6-03
051 5-29 6-03
186 5-29 6-03
128 5-30 6-04
070 5-31 6-02
082 6-01 6-05
024 6-01 6-07
038 6-02 6-06
121 6-03 6-07
168 6-04 6-10
097 6-04 6-07
173 6-05 6-07
177 6-05 6-08
004 6-06 6-09
142 6-06 6-12
163 6-07 6-12
085 6-09 6-14
076 6-09 6-12
110 6-10 6-12
017 6-10 6-12
155 6-11 6-13

Condition 4


















EXPECTED DUE-DATES
.........................................................................
EVENT MAY JUNE EVENT
IDENT. 25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13 IDENT.
--------------------------------------------------------------
004 H 004
009 H 009
017 H 017
024 H 024
032 H 032
038 H 038
051 H 051
070 H 070
076 H 076
078 H 078
082 H 082
085 H 085
097 H 097
110 H 110
121 H 121
128 H 128
142 H 142
146 H 146
155 H 155
163 H 163
168 H 168
171 H 171
173 H 173
177 H 177
186 H 186
--------------------------------------------------------------
25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13


Condition 5
















EXPECTED DUE-DATES
--------------------------------------------------------------
EVENT MAY JUNE EVENT
IDENT. 25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13 IDENT.

032 H 032
146 H 146
009 H 009
078 H 078
171 H 171
051 H 051
186 H 186
070 H 070
128 H 128
082 H 082
024 H 024
038 H 038
121 H 121
097 H 097
173 H 173
177 H 177
168 H 168
004 H 004
163 H 163
142 H 142
085 H 085
076 H 076
110 H 110
017 H 017
155 H 155
--------------------------------------------------------------
25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13


Condition 6
















95% CONFIDENCE INTERVALS
----------------------------------------------------------
EVENT MAY JUNE EVENT
IDENT. 25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13 IDENT.

[The rows of this report are not legible in this reproduction. It
presented the confidence intervals of Condition 3 in the chart format
of Condition 8, with events listed in order of identification number.]

25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13

Condition 7















95% CONFIDENCE INTERVALS
---------------------------------------------------------
EVENT MAY JUNE EVENT
IDENT. 25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13 IDENT.

032 H H H H 032
146 H H H H H 146
009 H H H H H 009
078 H H H H H H H 078
171 H H H H H H H 171
051 H H H H H H 051
186 H H H H H H 186
128 H H H H H H 128
070 H H H 070
082 H H H H H 082
024 H H H H H H H 024
038 H H H H H 038
121 H H H H H 121
097 H H H H 097
168 H H H H H H H 168
177 H H H H 177
173 H H H 173
004 H H H H 004
142 H H H H H H H 142
163 H H H H H H 163
076 H H H H 076
085 H H H H H 085
110 H H H 110
017 H H H 017
155 H H H 155
25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13


Condition 8














EVENT EXPECTED DUE-DATE
IDENT. (MONTH-DAY)

009 5-29
032 5-26
128 6-01
155 6-12
168 6-07
Condition 9




EVENT 95% CONFIDENCE INTERVAL
IDENT. (FIRST DAY, LAST DAY)
--------- ------ ------ ------
009 5-27 5-31
032 5-25 5-28
128 5-30 6-04
155 6-11 6-13
168 6-04 6-10
Condition 11


EVENT EXPECTED DUE-DATE
IDENT. (MONTH-DAY)

032 5-26
009 5-29
128 6-01
168 6-07
155 6-12
Condition 10




EVENT 95% CONFIDENCE INTERVAL
IDENT. (FIRST DAY, LAST DAY)

032 5-25 5-28
009 5-27 5-31
128 5-30 6-04
168 6-04 6-10
155 6-11 6-13
Condition 12


EXPECTED DUE-DATES

EVENT MAY JUNE EVENT
IDENT. 25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13 IDENT.

009 H 009
032 H 032
128 H 128
155 H 155
168 H 168

25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13


Condition 13













EXPECTED DUE-DATES

EVENT MAY JUNE EVENT
IDENT. 25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13 IDENT.
--------------------------------------------------------------
032 H 032
009 H 009
128 H 128
168 H 168
155 H 155
------------------------------------------------
25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13

Condition 14



95% CONFIDENCE INTERVALS
----------------------------------------------------------
EVENT MAY JUNE EVENT
IDENT. 25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13 IDENT.
------------------------------------------------
009 H H H H H 009
032 H H H H 032
128 H H H H H H 128
155 H H H 155
168 H H H H H H H 168
------------------------------------------------
25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13

Condition 15



95% CONFIDENCE INTERVALS
----------------------------------------------------------
EVENT MAY JUNE EVENT
IDENT. 25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13 IDENT.
----------------------------------------------------------
032 H H H H 032
009 H H H H H 009
128 H H H H H H 128
168 H H H H H H H 168
155 H H H 155
----------------------------------------------------------
25 26 27 28 29 30 31 01 02 03 04 05 06 07 08 09 10 11 12 13


Condition 16














APPENDIX B

FORMAT OPINION QUESTIONNAIRE

The post-experimental questionnaire is presented here in an

English version of the actual questions (shown toward the end of the

program in Appendix C). A connotative rather than literal translation

has been attempted to give the non-Spanish reader a more accurate

representation of the content.










Please answer the following questions by entering a 1, 2, 3, 4, or 5, and

pressing the "return" key. An entry close to 1 will indicate "very

little" and an entry close to 5 will indicate "very much," as follows:


VERY LITTLE : 1 : 2 : 3 : 4 : 5 : VERY MUCH

1. How appropriate did you consider the order of the
   information in the report, given the type of decisions
   that had to be made?

2. How appropriate did you consider the format of the report
   at the time of making the daily decisions?

3. How appropriate did you consider the format of the report
   at the time of recording the feedback on events checked
   and detected?

4. How appropriate did you consider the detail of the
   probabilistic information on the possible date of
   occurrence of each event?

5. All factors considered, how appropriate did you consider
   the format of your report?














APPENDIX C
COMPUTER SIMULATION PROGRAM

A.1 General

This program created the simulated decision environment for the

experimental runs. The program was written in BASIC, Version 17,

Digital Equipment Corporation System 10. It was run from a typewriter

terminal, model DEC 33 TELETYPE, on line with a PDP 10.

A.2 Input

The fixed input to the program was the data of Table 3.2 (p. 39)

arranged as follows:

E$(J) M D I S
004 5 18 20.5 1.00
009 5 9 20.0 1.25
017 5 22 20.0 0.75
024 5 15 20.0 1.75
032 5 6 20.5 1.00
038 5 15 20.0 1.25
051 5 11 20.5 1.50
070 5 12 20.0 0.75
076 5 21 20.5 1.00
078 5 10 20.0 1.75
082 5 14 20.0 1.25
085 5 21 20.5 1.50
097 5 16 20.5 1.00
110 5 22 20.0 0.75
121 5 16 20.0 1.25
128 5 12 20.5 1.50
142 5 20 20.0 1.75
146 5 8 20.0 1.25
155 5 23 20.0 0.75
163 5 20 20.5 1.50








E$(J) M D I S
168
171
173
177
186

[The M, D, I, and S values for events 168-186 are not legible in this
reproduction.]

where E$(J) = event I.D. number

      M = month of last occurrence

      D = day of last occurrence

      I = mean interval between occurrences

      S = standard deviation of interval between occurrences


All other input to the program was manually entered by the subjects.

It consisted of the I.D. numbers of the events they checked on each

successive day, and the answers to the post-experimental questionnaire.
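The date-generation logic of lines 210 through 330 of the listing (a standard normal deviate approximated by the sum of twelve uniform draws minus six, scaled and added to the mean interval, then wrapped from May into June) can be paraphrased in modern terms as follows. This Python sketch is provided for the reader and is not part of the original program.

```python
import random

def simulate_occurrence(d_last, mean_int, sd):
    """Draw the actual occurrence date (month, day) for one event,
    mirroring lines 210-330 of the BASIC listing."""
    # Sum of 12 uniforms minus 6 approximates a standard normal deviate.
    z = sum(random.random() for _ in range(12)) - 6.0
    interval = int(sd * z + mean_int + 0.5)   # INT(Y + .5): round to whole days
    day = d_last + interval
    if day > 31:                              # wrap past May 31 into June
        return (6, day - 31)
    return (5, day)
```

Averaged over many draws, the simulated dates center on the event's expected date (last occurrence plus mean interval), which is what makes the report's expected due-dates and confidence intervals informative to the subject.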

A.3 Output

A sample of the beginning of a run is illustrated in Figure A.1

(p. 82). It shows an identification of the experimental report used in

the run, the initial time clocked after the subject indicated he was

ready to start,* and some of the outcomes of the early part of the run.

Figure A.2 (p. 83) is a sample from the last part of the experiment.

It shows some of the last decisions made by the subject, the final

decision time clocked (26 minutes and 3 seconds in the sample shown), the

summarized results of the run, and the post-experimental questionnaire

with the subject's answers. The last two lines seen are part of the

file updated with the results of each run. The program is on pp. 84-88.


*A pre-experimental familiarization session had already been con-
ducted.








[Scanned terminal output; the text of this sample is not legible in
this reproduction.]

Figure A.1 Sample output at the beginning of a run









[Scanned terminal output; the text of this sample is not legible in
this reproduction.]

Figure A.2 Sample output at the end of a run











00005 REM ******************************************************
00010 REM *** INTERACTIVE SIMULATION OF THE EVENT CHECKING PROBLEM ***
00015 REM ******************************************************
00020 DIM A$(25),E$(25)
00030 FILES A1%,A2%,A3%,A4%,A5%,AO%,DATA,TALYA$
00035 DO=25
00040 PRINT "SO,R";
00050 INPUT SO,R
00060 IF R <= 8 THEN 70
00062 DO=5
00064 FOR J=1 TO 25
00066 INPUT #7, NO,E$(J),M,D,I,S
00068 NEXT J
00070 C1=1
00075 PRINT
00080 C2=5
00090 PRINT
00100 PRINT "SUBJECT IS USING REPORT ";STR$(R)+"."
00105 PRINT
00110 FOR J=1 TO DO
00125 RO=0
00130 INPUT #7, NO,E$(J),M,D,I,S
00140 S$=S$+E$(J)
00210 RANDOMIZE
00220 FOR N=1 TO 12
00230 RO=RO+RND
00240 NEXT N
00250 Y=S*(RO-6) + I
00260 X=INT(Y+.5)
00280 IF (D+X) > 31 GO TO 310
00290 D=D+X
00300 GO TO 330
00310 D=D+X-31
00320 M=M+1
00330 A$(J) = STR$(M)+"-"+STR$(D)
00340 NEXT J
00350 PRINT
00360 M=5
00370 D=24
00380 PRINT "Ti";
00381 INPUT Tl$
00389 PRINT
00390 FOR I=1 TO 20
00400 D=D+1
00410 IF D < 32 GO TO 440
00420 D=1
00430 M=M+1
00440 D$=STR$(M)+"-"+STR$(D)
00442 PRINT
00450 PRINT "TODAY IS ";D$
00460 PRINT "--------------"










00470 PRINT "CHECKS ?"
00480 INPUT E$
00482 IF E$="" THEN 630
00485 E$=E$+"AAAA."
00490 L=INSTR(E$,".")/4-1
00502 IF ABS(L-INT(L)) = 0 THEN 510
00504 GOSUB 2000
00506 GO TO 480
00510 FOR K=1 TO L
00520 Y=4*K-3
00525 Z=4*K-1
00530 K$=MID$(E$,Y,Z-Y+1)
00550 J=(INSTR(S$,K$)+2)/3
00560 C$=C$+E$(J)+" "
00570 C=C+1
00580 IF A$(J) <> D$ THEN 610
00590 O$=O$+E$(J)+" "
00600 O=O+1
00610 NEXT K
00620 IF L >= 1 THEN 660
00630 PRINT "CHECKED: NONE"
00640 PRINT
00650 GO TO 730
00660 PRINT "CHECKED: ";C$;
00670 IF LEN(O$) > 1 THEN 690
00680 O$="NONE"
00690 PRINT "DETECTED: ";O$
00700 PRINT
00710 C$=" "
00720 O$=" "
00730 NEXT I
00735 PRINT "***********"
00740 PRINT "******...FAVOR DE LLAMAR AL PROF. AMADOR....*******"
00751 INPUT T2$
00752 IF LEN(T2$) > 1 THEN 754
00753 T2$="0"+T2$
00754 PRINT
00756 PRINT
00760 PRINT "******* SUMMARY OF RESULTS FOR THE RUN *******"
00770 PRINT
00780 PRINT C;"CHECKS @ $";STR$(C1)+"/"+"CHECK = $";STR$(C*C1)
00782 C8$=STR$(C)
00783 IF LEN(C8$) > 2 THEN 790
00784 IF LEN(C8$) > 1 THEN 788
00785 C8$="00"+C8$
00786 GO TO 790
00788 C8$="0"+C8$
00790 PRINT DO-O;"MISSES @ $";STR$(C2)+"/"+"MISS = $";STR$((DO-O)*C2)
00795 PRINT
00800 PRINT "TOTAL COST = $"+STR$(C*C1+(DO-O)*C2)
00802 T8$=STR$(C*C1+(DO-O)*C2)










00804 IF LEN(T8$) > 2 THEN 870
00806 T8$="0"+T8$
00870 PRINT
00880 PRINT "ACTUAL EVENT DATES"
00890 PRINT "------------------"
00900 PRINT "EVENT ID. DATE"
00910 PRINT "--------- -----"
00920 FOR J=1 TO DO
00930 PRINT TAB(3);E$(J);TAB(11);A$(J)
00940 NEXT J
00950 PRINT
00960 PRINT "REVISE ESTE RESUMEN DE RESULTADOS PARA VERIFICAR"
00965 PRINT "QUE LA INFORMACION EN CUANTO A EVENTOS COTEJADOS"
00970 PRINT "Y DETECTADOS ES CORRECTA. UNA VEZ HECHO ESTO,"
00975 PRINT "OPRIMA 'RETURN'."
01010 INPUT R9$
01020 PRINT
01030 PRINT "FAVOR DE CONTESTAR LAS SIGUIENTES PREGUNTAS ESCRIBIENDO UN"
01040 PRINT "1, 2, 3, 4, O 5, Y OPRIMIENDO LA TECLA 'RETURN'. UNA"
01045 PRINT "RESPUESTA CERCA DE 1 INDICARA 'POCO APROPIADO'; UNA"
01050 PRINT "RESPUESTA CERCA DE 5 INDICARA 'MUY APROPIADO', COMO SIGUE:"
01070 PRINT
01080 PRINT
01090 PRINT "POCO APROPIADO : 1 : 2 : 3 : 4 : 5 : MUY APROPIADO"
01100 PRINT
01110 PRINT
01120 PRINT
01130 PRINT "1. CUAN APROPIADO ENCONTRO USTED EL ORDEN DE LA INFORMACION"
01140 PRINT "   EN EL REPORTE PARA EL TIPO DE DECISIONES QUE SE"
01145 PRINT "   DEBIAN TOMAR?"
01150 INPUT A1
01152 IF A1 > 5 THEN 1154
01153 IF A1 >= 1 THEN 1160
01154 GOSUB 2000
01156 GO TO 1150
01160 SET :1, R; :2, R; :3, R; :4, R; :5, R; :6, R
01162 INPUT :6, Q0
01164 Q0=Q0+1
01170 INPUT :1, C1
01180 C1=C1+A1
01210 PRINT
01220 PRINT "2. CUAN APROPIADO FUE EL FORMATO DE PRESENTACION DEL REPORTE"
01230 PRINT "   AL MOMENTO DE TOMAR LAS DECISIONES DIARIAS?"
01240 INPUT A2
01242 IF A2 > 5 THEN 1244
01243 IF A2 >= 1 THEN 1250
01244 GOSUB 2000
01246 GO TO 1240
01250 INPUT :2, C2
01260 C2=C2+A2
01290 PRINT










01300 PRINT "3. CUAN APROPIADO FUE EL FORMATO DEL REPORTE AL MOMENTO"
01305 PRINT "   DE ANOTAR LA INFORMACION SOBRE LOS EVENTOS COTEJADOS"
01310 PRINT "   Y DETECTADOS?"
01320 INPUT A3
01322 IF A3 > 5 THEN 1324
01323 IF A3 >= 1 THEN 1330
01324 GOSUB 2000
01326 GO TO 1320
01330 INPUT :3, C3
01340 C3=C3+A3
01370 PRINT
01380 PRINT "4. CUAN APROPIADO FUE EL DETALLE DE LA INFORMACION"
01385 PRINT "   PROBABILISTICA SOBRE LA POSIBLE FECHA DE OCURRENCIA"
01390 PRINT "   DE CADA EVENTO?"
01400 INPUT A4
01402 IF A4 > 5 THEN 1404
01403 IF A4 >= 1 THEN 1410
01404 GOSUB 2000
01406 GO TO 1400
01410 INPUT :4, C4
01420 C4=C4+A4
01450 PRINT
01460 PRINT "5. CONSIDERANDO TODOS LOS FACTORES, CUAN APROPIADO"
01465 PRINT "   ENCONTRO USTED EL FORMATO DE ESTE REPORTE?"
01480 INPUT A5
01482 IF A5 > 5 THEN 1484
01483 IF A5 >= 1 THEN 1490
01484 GOSUB 2000
01486 GO TO 1480
01490 INPUT :5, C5
01500 C5=C5+A5
01530 PRINT
01540 PRINT "......HEMOS CONCLUIDO EL EXPERIMENTO......"
01545 PRINT "......GRACIAS POR SU COOPERACION......"
01550 R8$=STR$(R)
01552 IF LEN(R8$) > 1 THEN 1560
01554 R8$="0"+R8$
01560 Q8$=" "+STR$(A1)+" "+STR$(A2)+" "+STR$(A3)+" "+STR$(A4)+" "+STR$(A5)
01561 SO$=STR$(SO)
01562 IF LEN(SO$) > 2 THEN 1570
01563 IF LEN(SO$) > 1 THEN 1566
01564 SO$="00"+SO$
01565 GO TO 1570
01566 SO$="0"+SO$
01570 TO$=SO$+" "+R8$+" "+T2$+" "+C8$+" "+T8$+Q8$
01600 SET :1,R; :2,R; :3,R; :4,R; :5,R; :6,R
01610 WRITE :1,C1
01620 WRITE :2,C2
01630 WRITE :3,C3
01640 WRITE :4,C4
01650 WRITE :5,C5









01660 WRITE :6,Q0
01670 PRINT C1;C2;C3;C4;C5;Q0
01672 PRINT
01674 PRINT TO$
01680 SET :8,SO
01685 WRITE :8,TO$
01690 GO TO 2050
02000 PRINT
02005 PRINT "? INPUT DATA NOT IN CORRECT FORM--PLEASE RETYPE"
02010 RETURN
02050 END














APPENDIX D

SUBJECT INSTRUCTIONS

The written instructions given to the experimental subjects are

shown here in their original version (in Spanish) and in an English

version. Again, an effort has been made to present a connotative

rather than literal translation so the non-Spanish reader can have

a more accurate picture of their content. The statement of consent

that was signed by each subject is also included.







Instrucciones

Introducción

Con este experimento se quiere medir la efectividad de un informe
gerencial. El informe estudiado ha sido diseñado para ayudar a un gerente
a "detectar" una serie de eventos de interés que han de ocurrir en el futuro
próximo. La razón que amerita el uso de un informe en este caso es que
estos eventos de interés ocurren al azar solamente durante el término de un
día y luego no vuelven a ocurrir hasta después de aproximadamente 20 días.
Es conveniente para la gerencia "detectar" el día en que estos eventos ocu-
rren ya que se incurre en un costo de oportunidad cada vez que uno de estos
eventos sucede y pasa sin ser detectado (la próxima oportunidad de observar
el evento no vuelve a ocurrir hasta después de aproximadamente 20 días). El
informe estudiado es preparado en base a estadísticas pasadas y consiste
precisamente de las fechas más probables de ocurrencia para cada uno de
una serie de estos eventos. El formato general de este informe es como
sigue:

Identificación del        Posible Fecha de
Evento                    Ocurrencia











En este experimento se quiere medir el efecto del formato de presenta-
ción de este informe sobre el uso efectivo que se le ha de dar al mismo.
Usted recibirá uno de varios formatos experimentales de este informe y du-
rante un período simulado de 20 días usted utilizará dicho informe para
tratar de "detectar" una serie de eventos que según su informe han sido
identificados como que han de ocurrir durante esos 20 días.

Reglas de la Simulación

Durante el experimento usted jugará el papel de un gerente que debe
decidir a diario cuántos y cuáles "eventos" cotejar para ver si están
"ocurriendo" en ese día o no. Las características del problema que usted
deberá mantener en mente son las siguientes:

1. Usted recibirá un informe experimental con los "eventos"
   que deben ser "cotejados" durante los próximos 20 días.
   Estos eventos estarán identificados al lado izquierdo del



