CHANGES IN VISUAL SEARCH PATTERNS AS AN INDICATION OF ATTENTIONAL NARROWING AND DISTRACTION DURING A SIMULATED HIGH-SPEED DRIVING TASK UNDER
INCREASING LEVELS OF ANXIETY
CHRISTOPHER M. JANELLE
A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY
UNIVERSITY OF FLORIDA
There are many people to whom I am indebted for their guidance and patience
throughout my doctoral education and especially during the dissertation process. I would like to begin by thanking my wife, Carol, and my little buddy, Matthew, for their inspiration and understanding over the past four years. Words cannot express the love and appreciation I have for you both. Similarly, I would like to express my gratitude to my parents, Jean and Fran Janelle, for their support and encouragement during my graduate education and throughout my life. The value system and work ethic they instilled in me are what made this possible.
A very special thank you goes to my mentor and dissertation chair, Dr. Robert N. Singer. His scholarly example and practical lessons have greatly enhanced my professional and personal development. He has truly embodied the term "mentor" by providing me with the tools and opportunities needed to develop into a young scholar while adding constructive criticism and a sincere pat on the back when needed. His influence will always be greatly appreciated.
I would like to express my sincere thanks to my committee members, Dr. James H. Cauraugh, Dr. Ira Fischler, Dr. Milledge Murphey, and Dr. L. Keith Tennant, for their support and helpful comments in the completion of this project. In addition to the dissertation experience, each has provided much in their own way to my development and
for that I am grateful. The many experiences I have shared with each of you, both academically and otherwise, will not be forgotten.
This study would not have been possible without the generosity of Dr. Mark Williams, who allowed me to use his eye-tracking equipment and provided interesting ideas and conceptual contributions during the formative stage of this project. Furthermore, I would like to acknowledge the technical assistance of Mark Tillman, Luis Maseda, and Dr. Jeff Bauer, who helped put everything in motion. Also, I am thankful to Beth Fallen, Wisug Ko, and Dr. Andrea Behrman for helping with data collection, reduction, and analysis.
TABLE OF CONTENTS
ACKNOWLEDGEMENTS ............................................................ ii
LIST OF TABLES ...................................................................... vii
LIST OF FIGURES ..................................................................... viii
ABSTRACT .............................................................................. ix
1 INTRODUCTION ..................................................... 1
Attentional Narrowing ................................................. 3
Distraction .............................................................. 12
Arousal and Anxiety ................................................... 15
A More Comprehensive Next Step ................................... 18
Statement of the Problem ............................................. 22
Hypotheses ............................................................. 23
Definitions of Terms ................................................... 29
Assumptions ............................................................ 32
Significance of the Study .............................................. 33
2 REVIEW OF LITERATURE ........................................ 37
Stress and Human Performance ...................................... 39
Anxiety, Arousal, and Attention ..................................... 63
Visual Attention ........................................................ 79
Visual Attention and Driving ......................................... 95
Visual Attention and Sport ........................................... 112
Summary and Future Directions ...................................... 115
Visual Search as an Indicator of Distraction
and/or Narrowing .................................................... 118
3 METHODS ............................................................. 121
Participants ............................................................. 121
Instruments and Tests ................................................. 122
Measurement Recording Devices .................................... 126
Procedure ................................................................ 130
Data Analysis ........................................................... 136
4 RESULTS ............................................................... 139
Anxiety and Arousal .................................................... 139
Performance Data ....................................................... 142
Visual Search Data ...................................................... 151
Multiple Regression Analyses ......................................... 157
Manipulation Checks ................................................... 160
5 DISCUSSION, SUMMARY, CONCLUSIONS,
AND IMPLICATIONS FOR FURTHER RESEARCH ......... 162
Discussion ................................................................ 164
Visual Search Data ...................................................... 175
Findings Which Contradict and Augment
Previous Research ..................................................... 180
Summary .................................................................. 193
Conclusions .............................................................. 195
Issues for Future Consideration ....................................... 196
A Final Comment ........................................................ 201
REFERENCES ............................................................................ 203
A COMPETITIVE STATE ANXIETY INVENTORY - 2
(CSAI-2) ................................................................. 223
B INFORMED CONSENT FORM .................................... 225
C PRE-RACE INSTRUCTIONS ........................................ 227
D FAMILIARIZATION SESSION INSTRUCTIONS ............... 230
E PRACTICE SESSION INSTRUCTIONS ........................... 233
F COMPETITION SESSION INSTRUCTIONS ..................... 235
G POST-EXPERIMENT COMMENTS ............................... 237
H PEARSON PRODUCT-MOMENT
CORRELATION COEFFICIENTS ............................... 239
BIOGRAPHICAL SKETCH ............................................................ 242
LIST OF TABLES
3.1 Experimental design ................................................................. 136
4.1 Cognitive Anxiety Levels for Each Group Across Sessions 1-3 ............. 140
4.2 Change from Baseline HR for Each Group Across Sessions 1-3 ............ 142
4.3 Driving Performance (Lap Speed) ................................................ 145
4.4 Number of Major Driving Errors ................................................. 147
4.5 Mean Response Time Across Sessions 1-3 ..................................... 149
4.6 Mean Number of Peripheral Light Misidentifications ....................... 151
4.7 Number of Saccades to Peripheral Stimuli .................................... 153
4.8 Number of Fixations to Peripheral Locations Across Sessions 1-3 ..................................................................... 155
4.9 Stepwise Multiple Regression Analysis Predicting Lap Speed with Activation Data Across Sessions 1-3 .................................. 158
4.10 Stepwise Multiple Regression Analysis Predicting Response Time with Activation Data Across Sessions 1-3 .................................. 159
4.11 Stepwise Multiple Regression Analysis Predicting Misidentifications of Peripheral Stimuli with Activation Data Across Sessions 1-3 ......... 159
4.12 Stepwise Multiple Regression Analysis Predicting Exogenous Saccades with Activation Data Across Sessions 1-3 ....................... 160
LIST OF FIGURES
4.1 Changes in cognitive anxiety for each group across sessions 1-3 .......... 141
4.2 Change in HR from baseline rates for each group during sessions 1-3 ..................................................................... 143
4.3 Lap speed for each group across sessions 1-3 .................................. 146
4.4 Number of major driving errors for each group across sessions 1-3 ........ 148
4.5 Mean response time across sessions 1-3 ....................................... 150
4.6 Mean number of peripheral light misidentifications ........................... 152
4.7 Number of saccades to peripheral stimuli across sessions 1-3 ............... 154
4.8 Number of fixations to peripheral locations across sessions 1-3 ............. 156
Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy
CHANGES IN VISUAL SEARCH PATTERNS AS AN INDICATION OF
ATTENTIONAL NARROWING AND DISTRACTION DURING A
SIMULATED HIGH-SPEED DRIVING TASK UNDER INCREASING LEVELS OF ANXIETY
By
Christopher M. Janelle
Chairperson: Robert N. Singer Major Department: Health and Human Performance
The purpose of this investigation was to examine the influence of distraction on the attentional narrowing construct in the context of a dual task driving simulation under varying levels of anxiety. Forty-eight women were randomly assigned to one of six experimental conditions: distraction control, distraction anxiety, relevant control, relevant anxiety, central control, and central anxiety. Those assigned to central conditions performed only a driving task, while the other four groups were required to identify peripheral lights in addition to driving. Those in anxiety conditions were exposed to increasing levels of anxiety, which were manipulated by instructional sets. All
participants completed three sessions consisting of 20 trials each during which measures of cognitive anxiety, arousal, visual search patterns, and performance were taken.
Data indicated that as those in dual task conditions reached higher levels of anxiety, their ability to identify peripheral lights became slower and less accurate. Furthermore, the ability to drive for those in the distraction and central groups was impaired at high levels of anxiety. The decrease in driving proficiency for those in the distraction anxiety condition was highly associated with changes in visual search patterns, which became more directed toward peripheral locations. In the central anxiety condition, driving proficiency was influenced by an increased tendency to make minor errors, which could be attributed to a more cautious driving style when highly activated. Overall, performance on both central and peripheral tasks was worse for those in the distraction anxiety condition during the period of highest anxiety. Furthermore, visual search patterns were more eccentric during this session for this group.
Results suggest that drivers who are highly anxious and aroused experience an altered ability to process peripheral information at the perceptual level, leading to a decrease in attention resources available for the processing of central information. In addition, it appears that this effect is amplified when distractors as well as relevant cues are present in peripheral areas. Implicated in the study is the role of visual search patterns and distractors in the dual task context. Suggestions are made to revise the current notion of attention narrowing to include the role of distraction as a contributor to performance variability.
Anyone associated with sport, whether as an athlete, coach, or spectator, can remember instances in which the pressure of competition transcended the typical commentary that it was "just a game". Sport is replete with occasions such as a crucial free throw, a clutch base hit, a game-winning field goal, or a breathtaking lap at the finish line, in which athletes either overcome the excessive demands of the moment and perform at their highest levels or choke under the extreme circumstances of the situation. More often than not, it is the ability to maintain concentration when faced with these stressors that determines the outcome of sport contests. However, even the greatest athletes occasionally succumb to these inordinate demands, causing sport psychologists to question why this occurs and what mechanisms contribute to diminished performance.
It has been suggested that the ability of athletes to execute effectively in
exceptionally stressful environments is related to the impact of arousal and anxiety on the capability to maintain concentration. Though a number of researchers have suggested that excessive stress influences information processing capabilities by overloading the limited attention resources available, much of the evidence provided to support this claim is anecdotal or observational in nature (Moran, 1996). Ignoring the underlying mechanisms
responsible for general changes in performance renders it impossible to prescribe competent interventions that specifically address the mechanisms which are being affected.
The paradigm shift in the study of cognitive psychology that occurred in the late 1950s and early 1960s brought with it a greater understanding of the specific processes that are involved with attending to and processing information. However, the research has been criticized due to its reductionistic nature. Ignored have been other relevant factors, particularly emotions that influence attentional processes and subsequent achievement (Kremer & Scully, 1994; Moran, 1996). By not studying the interaction of emotions, attention, and performance, the generalizability of research on attention has been somewhat limited. Thus, much still needs to be understood about dynamic sport settings in which attentional flexibility is crucial under conditions of severe time constraints and the stress associated with the competitive drive to win.
Of interest here is the peripheral (or attentional) narrowing phenomenon which has been reported to occur under high stress levels (Easterbrook, 1959). Though intriguing, and attracting much research interest to the present day, the underlying mechanisms responsible for the narrowing (or tunnel vision effect) which presumably occurs in stressful situations remain a mystery. Questions are still unanswered regarding the specific components of the stress response (i.e., cognitive or somatic anxiety, and/or arousal) that influence performance. Specifically, does narrowing occur due to heightened levels of cognitive anxiety, somatic anxiety, mere arousal, or some combination of these factors?
Another factor that has contributed to the confusion is that sport psychology researchers have been reluctant to give up the notion that the Inverted-U hypothesis
(Yerkes-Dodson, 1908) is the one and only description of the stress/performance relationship. However, contemporary models have been proposed that address the specific components of stress and prescribe testable hypotheses that are quite different from the very general Inverted-U description of the relationship of stress with performance.
Furthermore, the specific aspects of performance (i.e., stimulus detection and discrimination, response time, response accuracy, and others) that are influenced by changes in affective states have received relatively little empirical investigation due to the favoring of more easily understood global performance measures. As mentioned, by failing to address the specific parameters that are impacted by stressful stimuli, it is impossible to understand more precisely what is happening; and therefore, what to do about it.
Finally, many of the performance changes in stressful environments that have been attributed to attentional narrowing could possibly be explained in the context of distraction. In spite of their obvious application to understanding sport performance, the study of attentional narrowing and distraction in the context of dynamic sports is nonexistent.
As may be evident, advancement beyond current understandings of the
stress/performance relationship is warranted for both theoretical and practical reasons. Thus, my intent was to investigate specific affective factors that influence attentional parameters and, ultimately, performance in an ecologically valid dual task situation under stressful circumstances. To provide further description of the specific issues to be
addressed and to justify the intended experiment, background information on the topics of interest follows.
It has been suggested that the ability to attend to, select, and process the most critical cues in a situation is one of the most important skills required for high level performance in sport (e.g., Abernethy, 1993). In support of this idea, experts have consistently exhibited what has been called a "cognitive advantage" over less skilled participants, being able to process the same information in a more efficient and effective manner (Starkes & Allard, 1993). Though this is interesting and valuable information for both cognitive and sport psychology researchers, the ability to demonstrate this cognitive advantage has rarely been investigated under imposed stressful states in a realistic sport context or other meaningful situation. However, an early theory that directly addressed the ability to select cues and use them effectively under different emotional conditions is the cue-utilization hypothesis described by the concept of attentional narrowing.
Easterbrook (1959) produced the most influential article on the topic of cue utilization based on the findings of Bahrick, Fitts, and Rankin (1952) and others (e.g., Bruner, Matter, & Papanek, 1955; Callaway & Dembo, 1958; Callaway & Thompson, 1953; Eysenck, Granger, & Brengelman, 1957; Granger, 1953). Easterbrook's primary theoretical contribution was the notion that as level of arousal increased to a certain point, performance in a dual-task situation would be variable between the two tasks. Specifically, he suggested that with an increase in activation to moderate levels, central task achievement would be facilitated due to the blocking of irrelevant cues in the
periphery from being processed. Furthermore, he postulated that at this moderate level, performance in tasks requiring less of a central focus (i.e., a peripheral focus) would deteriorate due to a blocking of these cues. Finally, performance in central tasks would be expected to deteriorate if arousal level reached a heightened state in which the funneling effect prohibited attention to relevant cues that are integral to performance of the central task. In other words, Easterbrook (1959) suggested that the degree of facilitation or disruption caused by emotional arousal is dependent on the range of cues needed to perform a task effectively and how those cues are attenuated by emotional states.
Unfortunately, relatively few investigations have been undertaken in sport settings to examine the effects of peripheral narrowing, or if this phenomenon exists. This is surprising considering that typical sport situations, especially at higher levels of expertise, often occur in extremely stress-provoking environments. In one of the only studies done in the context of sport, Landers, Wang, and Courtet (1985) investigated peripheral narrowing with experienced and inexperienced rifle shooters. The central task was a target shooting task and the peripheral task was an auditory detection task. Although there were no differences found in secondary task performance between the experienced and inexperienced shooters, both groups shot worse under high stress conditions.
Also with relevance to sport, two studies were conducted by Williams, Tonymon, and Andersen (1990, 1991) that substantiated Andersen and Williams' (1988) model of athletic injury. In the model, Andersen and Williams (1988) indicate that a possible predisposition to athletic injuries may be precipitated by elevated levels of life stress that result in an inability to attend to threatening peripheral stimuli. Support for this possibility
was provided by Williams et al. (1990, 1991), who showed that decrements in the ability to detect peripheral cues occurred while individuals performed Stroop tasks under stressful conditions. Based on their conclusions, the researchers suggested that attention narrowing may be a dispositional factor that predicts athletic injuries because athletes are unable to notice potentially dangerous peripheral stimuli such as other players, dangerous terrain, and the like.
Though not directly sport-related, other perceptual-motor activities have been investigated with respect to the ability to attend to central and peripheral dual tasks. Of these, perhaps the most relevant to sport is driving an automobile (unfortunately, many highway drivers forget that it is not a sport!). While driving, there is a limited amount of attention resources that can be devoted to an almost infinite number of stimuli at any point in time. As the task of driving becomes more complex due to decreased visibility, bad weather, heavy traffic, mechanical malfunction, sudden unexpected obstacles, fatigue, and other factors, driving becomes less automatic and demands more attention resources (Shinar, 1978). In these conditions, drivers may experience information overload and may be more likely to place themselves in possibly risky situations.
During normal driving, the driver tends to focus on the central task of keeping the vehicle "on the straight and narrow" so to speak, maintaining control of the vehicle based on the constraints of the driving environment (e.g., speed limits and lane markers). However, when confronted with an object or event that is not in the central (or foveal) field of vision, the eyes are normally moved from the central task to focus more directly on
the information that has been attended to in the periphery. Based on the information provided by the newly attended stimulus, a decision must be made regarding whether or not to change driving behavior. These alterations occur both in serial and in parallel depending on the specific situation presented (Schneider & Shiffrin, 1977; Shiffrin & Schneider, 1977). To make matters more complicated, all of these processes are often limited by extremely restrictive temporal constraints (Shinar, 1978).
Recent research has been directed toward understanding, more fully, the ability of drivers to extract meaningful information from signals along the roadway. In particular, many studies have been done on the demands of the external environment while driving, such as the perception and processing of road signs.
Hughes and Cole (1988) investigated the effect of attention demands on eye movement behavior during simulated road driving. They attempted to assess how a driver's performance was affected by purposely directing attention to particular features of the road environment under single and dual task conditions. Results showed that across groups, 25% of the fixations were located at the actual focus of expansion while 80% of the remaining fixations were centered within 6 degrees of the focus of expansion. Therefore, results suggest that if road signs are located beyond the 6-degree point in the display, they will probably not be perceived. Also, increasing task specificity resulted in more fixations to the left part of the display (the area where most signs were posted) with a corresponding decrease in fixations to the center of the display. Furthermore, the addition of the dual task paradigm resulted in two predominant effects on eye movements. First,
eye fixations tended to move closer to the central region. Second, the distance of peripheral fixation also moved closer to the focus of expansion.
Therefore, it can be concluded that in the typical dual task condition which
requires increased attention resources, there are insufficient spare resources to perform the peripheral task without additional fixations. The additional demand of the secondary task not only necessitates more fixations to the region of the task, but also reduces the extent to which the rest of the visual display is searched. Though not suggested by the researchers, these results could be accounted for in the context of attention narrowing and/or distraction.
A similar study was conducted by Luoma (1988) to examine the types of roadway landmarks that are perceived and remembered better than others. As may be evident from the results of Hughes and Cole (1988), drivers do not perceive nearly all of the traffic signs that they encounter, even in situations where they have been precued to look for the signs. In situations imposing increasing demands and challenges to the driving task, the perception of signs is even less than in "normal" driving conditions.
Luoma (1988) tested the idea that the more casual the perception, or the larger the target signs, the more peripheral vision is used. However, an important function of peripheral vision is to identify targets of importance to the driving task and, if the situation warrants, direct focal vision to the sign. To investigate these ideas, participants actually drove a 50 km route while outfitted with eye movement monitoring equipment. Results indicated that correct perception only occurred, for the most part, when the target was fixated foveally. Also, whether the sign was perceived or not depended heavily upon
the relevance of the sign to the driving task. For example, 100% of all speed limit targets were perceived foveally and were recalled, while signs such as pedestrian crossings, roadside advertisements, and houses were perceived much less, if at all. In fact, no subjects recalled passing "pedestrian crossing" signs even though 25% of them fixated on them. It appears that the processing devoted toward identifying the signs was dependent upon the relevance of the sign to the actual driving task and its informativeness.
Perhaps the most relevant study reported to date to examine the processing of visual stimuli in both central and peripheral fields was conducted by Miura (1990). The primary purpose was to assess changes in the useful field of view (UFOV: the information gathering area of the display) under situations of varying task demands and to determine the corresponding variation in the acquisition of visual information that accompanied these changes. Mackworth (1976) has suggested that the UFOV will vary with changes in the situational characteristics or specific demands of the environment. The study was conducted under actual driving conditions in which the subject had to navigate along a roadway in daylight conditions.
Results showed that RT to peripheral lights increased as the situational demands increased. Furthermore, response eccentricity became shorter, suggesting that fixations had to occur closer to the actual target location to acquire the necessary information. In general, this suggests that peripheral visual performance is impeded by an increase in situational demands. Specifically, it appears that the UFOV narrowed at each fixation point, and the latency of each fixation lengthened. Also, the detection of targets required a greater number of eye movements in more demanding driving situations.
To explain these findings, Miura (1990) postulated that the depth of processing of an object in focus increases as the situational demands increase. Specifically, the latency period of the eye movements following fixation on a target lengthens as the demands increase. In more demanding situations, when a narrower UFOV exists, information pickup at the fixation point appears to be slower, causing a delay in the attentional switching capabilities of the driver. Other evidence (Miura, 1985) indicates that with lower demands, the fixation points shift to the inner area of the UFOV while during highly demanding situations, fixations shift toward the outer part of the UFOV. Thus, as a result of the deeper processing that occurs at each fixation point, participants attempt to acquire information more efficiently in the periphery while using a smaller UFOV. Another hypothesis is that as demands increase, drivers develop a stronger tendency to search for information in the periphery, a phenomenon referred to as "cognitive momentum", and a possible adaptation of the system to utilize attentional resources in the most efficient manner to deal with the increase in demands (Miura, 1986).
Though interesting and conceptually valuable, Miura's (1985, 1986, 1987, 1990) work fails to take into account what might be a primary influence on the decrement in peripheral performance and the apparent narrowing of the UFOV. Though not mentioned in any of his papers, a possible explanation for these findings can be attributed to the increase in arousal and anxiety that accompanies tasks that increase in complexity and demands (Easterbrook, 1959). Although eye movements have been recorded in a variety of real world and simulated driving situations, researchers have not attempted to examine other affective inputs to the system that may account for differences in performance.
Furthermore, in Miura's (1990) study, as well as others, performance in the central driving task was not recorded.
Like normal driving, the sport of auto racing demands the coordination of an
extensive repertoire of perceptual and motor skills. However, the performance difficulty of these skills is significantly compounded by the competitive nature of the sport. In addition to mastering typical driving skills, the sheer speed of the car requires split-second decision-making and intense concentration on the most relevant cues for the entire duration of the race. An ill-advised momentary attention shift or distraction can be (and often is) catastrophic under these circumstances. Unfortunately, virtually no attempt has been made to empirically assess these factors in auto racing.
As may be evident from the sparse research that has been conducted related to
sport and driving, no study has addressed the issue of attention narrowing in the context of dynamic, reactive sport environments. However, perhaps the theoretical mechanisms that underlie results from laboratory tasks and the few sport situations that have been studied are common to all dynamic sports as well.
The reduction in the range of cue utilization was originally explained in the context of both Hull's (1943) Drive theory and the Yerkes-Dodson (1908) Inverted-U hypothesis. However, the cue utilization hypothesis can be more accurately accommodated by more recent attention capacity (or resource) theories (Kahneman, 1973; Wickens, 1984), which propose a limit in the resources available to attain optimal attention. Proponents of this view (e.g., Landers, 1980) suggest that one primary feature of high arousal levels is a narrowing of attention because the allocation policy is likely to shift away from the
periphery and toward the central area of a visual display. This notion has been supported by studies indicating that the probability of cues in central areas of a display drawing more attention resources increases under stressful situations (e.g., Hockey, 1970).
To summarize the attention narrowing point of view, stress (either arousal or anxiety produced) tends to overload the system, narrowing the range of stimuli that are perceived. When this occurs, information processing capabilities appear to operate in a dysfunctional manner. At the initial stages of perception, various cues are possibly ignored, never reaching later stages of processing. On the other hand, the actual informational value of the stimulus may not be utilized effectively due to an inability to distinguish the stimulus as relevant or irrelevant and respond accordingly. Thus, narrowing could be due to a dysfunction at the perceptual stage of processing and/or at the short term memory stage. Quite possibly, impairment occurs at both stages of information processing (Bacon, 1974; Hockey, 1970). However, the exact location of information processing dysfunction has not been substantiated. Furthermore, an alternative explanation for what happens to performance and attention allocation under stressful conditions is plausible.
As described previously, the idea that consistently recurs as an explanation for performance changes in both central and peripheral tasks in stressful environments is a narrowing of the attention beam in which cues are somehow filtered from processing at either the perceptual or encoding stage of analysis. However, the influence of distractors in the context of peripheral narrowing has not been investigated, and the concept of
distraction has received very little attention from researchers. It seems logical, however, that the apparent narrowing of attention that occurs under stressful conditions could also be explained by the notion that anxious or aroused performers are more inclined to be distracted.
The lack of research directed toward understanding distraction is surprising
considering the need of people in many work, entertainment, sport, and other situations to ignore distractors and focus only on the most critical cues in order to perform the task effectively. Examples of athletes and other performers who have been victimized by distraction are numerous (Moran, 1996), prompting Orlick (1990) to suggest that the ability to avoid distraction is one of the most important mental skills required to be successful in sport.
Brown (1993) defines distraction as situations, events, and circumstances which divert one's mind from some intended train of thought or from some desired course of action. This definition is somewhat different from William James' (1890) original conceptualization of distraction which was directed toward the experience of distracting thoughts and being "scatter-brained". Each of these views of distraction can be more easily understood if categorized in the context of internal and external types of distractors (Moran, 1996). Internal distractors refer to mental processes that interfere with one's ability to maintain attention while external distractors are environmental or situational factors that divert attention from the task at hand. Wegner (1994) has postulated that because the mind tends to wander, an attempt is made to hold it in place by repeatedly checking to determine whether it has wandered or not. However, in this process, the mind
is inadvertently drawn to the exact thing that one is trying to ignore. He also suggests that when highly emotional, attentional resources are reduced, and the mind is inclined not only to wander away from where it should be attending, but is also diverted toward that which one is attempting to ignore.
The typical effect of distraction is a decrease in performance effectiveness. The most plausible explanation for this is that when one is distracted by either external or internal factors, there is a decrease in available attentional resources for the processing of relevant cues. Like attentional narrowing, this idea is consistent with the limited capacity models of attentional resources proposed in different forms by various attention theorists (e.g., Allport, 1989; Kahneman, 1973; Shiffrin & Schneider, 1977). Because attentional capacity is limited, resources directed toward the processing of distractors reduce available resources for the processing of task-relevant information. This idea is supported by studies which have shown that distraction effects increase for complex rather than simple tasks and are greater as the similarity of distractors to relevant cues increases (Graydon & Eysenck, 1989).
Though empirical evidence is scarce, many researchers have suggested that increases in emotionality (i.e., anxiety, worry, arousal) increase susceptibility to distraction. Numerous examples of evidence to support the notion that stress impedes performance due to distraction can be found in verbal accounts and behavioral observations of "choking" in competitive environments. Moran (1994, 1996) provides substantial anecdotal evidence that the impact of anxiety is the absorption of attentional resources which could otherwise be directed toward the relevant task. Similarly,
Baumeister and Showers (1986) suggest that increased worry causes attentional resources to be devoted to task-irrelevant cues. Furthermore, self-awareness theorists such as Masters (1992) suggest that under stress, not only is attention absorbed by irrelevant stimuli, but also the performance of normally automated skills becomes less automated as resources begin to be intentionally directed toward the process of the once-automated movement. Finally, Eysenck (1992) has provided empirical evidence that anxiety provokes people to detect stimuli which they fear, usually stimuli that divert them from attending to relevant information. Unfortunately, the specific components of stress that influence attentional parameters have also been largely ignored.
Arousal and Anxiety
Due to increasing dissatisfaction with the Inverted-U hypothesis and other
theories, researchers attempted to analyze the stress response in greater detail as to its various components and to re-examine the stress/performance relationship. Perhaps the first scholars to approach the possibility of dissecting the general anxiety response were Liebert and Morris (1967) who identified two primary contributing factors to anxiety: worry and emotionality. In Liebert and Morris's view, worry consisted of cognitive concerns about one's performance while emotionality referred to the autonomic reactions to the performance environment. This concept strongly influenced Davidson and Schwartz's (1976) multidimensional model of anxiety. They were the first to use the terms "cognitive" and "somatic" anxiety and formulated their theory in the context of clinical applications. Thus, worry has become synonymous with cognitive anxiety and emotionality has become synonymous with somatic anxiety. These general characteristics
of the components of anxiety have held up under empirical investigation and appear to be manipulable independently (e.g., Schwartz, Davidson, & Goleman, 1978). Also, it is important to distinguish both components of anxiety from arousal. Though similar to somatic anxiety, arousal refers to the natural physiological indices of activation that are present within an organism at any time (Sage, 1984). In contrast, somatic anxiety refers to the perception of physiological arousal.
One problem with multidimensional anxiety theory is the two-dimensional
approach used to explain the effects of somatic and cognitive anxiety on performance. Specifically, the two-dimensional approach in analyzing results tends to neglect the interaction of the components of stress, treating them independently rather than in combination (Hardy & Fazey, 1987). According to the viewpoint of Hardy and his colleagues, any relatively comprehensive treatment of these components must treat them in an interacting, three dimensional manner. To improve the predictability and structure of the model, therefore, Hardy and Fazey (1987) developed a catastrophe model of anxiety and performance.
In an effort to advance understanding beyond the multidimensional approach to the study of the effects of anxiety and arousal on performance, Fazey and Hardy (1988) proposed a three-dimensional model of the relationship. Borrowing heavily from Thom (1975) and Zeeman (1976), who originally conceptualized the idea of catastrophes and then applied them to the behavioral sciences, respectively, Fazey and Hardy's (1988) model is closest in form to the cusp catastrophe, one of the seven originally proposed
catastrophe models of Thom (1975). According to the cusp catastrophe model, changes in either cognitive anxiety or arousal, or both, will change performance in specific ways.
Hardy and Fazey (1987) state that of the two variables that determine behavior
(cognitive anxiety and arousal), cognitive anxiety is the "splitting factor", the variable that has the primary influence on performance level. The roles of cognitive anxiety and physiological arousal were chosen specifically to be able to evaluate testable hypotheses with respect to the anxiety/arousal/performance relationship. Specifically, when cognitive anxiety is low, the model predicts that physiological arousal will influence performance in an inverted-U fashion. However, when physiological arousal is high, high levels of cognitive anxiety will result in lower levels of performance. Finally, when physiological arousal is low, higher cognitive anxiety will lead to increases in performance.
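These qualitative predictions follow from the geometry of the standard cusp surface. The following is a sketch only, using Thom's (1975) canonical form rather than any parameterization given by Fazey and Hardy; the mapping of performance, arousal, and cognitive anxiety onto the canonical variables is an illustrative assumption:

```latex
% Canonical cusp catastrophe: x = performance,
% a = normal factor (physiological arousal),
% b = splitting factor (cognitive anxiety).
% Stable behavior corresponds to minima of the potential
V(x; a, b) = \tfrac{1}{4}x^{4} - \tfrac{1}{2}\,b\,x^{2} - a\,x,
% so the behavior surface is the set of equilibria
\frac{\partial V}{\partial x} = x^{3} - b\,x - a = 0.
% For b \le 0 (low cognitive anxiety) the cubic has a single real root,
% the surface is single-sheeted, and performance varies smoothly with
% arousal. For 4b^{3} > 27a^{2} (high cognitive anxiety) two stable
% sheets coexist, so a small further increase in arousal can carry the
% performer past the fold line 4b^{3} = 27a^{2}, producing a sudden,
% discontinuous drop in performance with hysteresis on the return path.
```

On this reading, the "catastrophe" is simply the jump between sheets when the fold lines are crossed, which is why the model predicts smooth, inverted-U-like effects of arousal under low cognitive anxiety but abrupt performance collapse under high cognitive anxiety.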
Usually the manipulation of anxiety and arousal is carried out through a time-to-event paradigm in which assessments are taken at specified times leading up to a competition setting (Hardy, Parfitt, & Pates, 1994). For instance, assessments will be taken one week prior, two days prior, and then one hour prior to the competition. In this way, the time course of anxiety and arousal can be assessed. In other instances, levels of anxiety and arousal are manipulated through ego-threatening or other anxiety-producing instructional sets and through exercise-induced arousal, respectively (Parfitt, Hardy, & Pates, 1995).
An obvious feature of the cusp catastrophe model of the anxiety/performance
relationship is the choice of physiological arousal rather than somatic anxiety as the normal factor. The primary reason for this choice is based on the notion that it is part of the
organism's natural physiological response to anxiety-producing situations (Hardy, 1996). This belief is sufficiently well-established to be spoken of in the context of a generalized response within the competition setting. In other words, in competitive environments, performers usually show one or more signs of physiological arousal. Though the physiological response may be reflected in self-reports of somatic anxiety, the purely physiological index can encompass the individual task requirements, different situations, and other combinations of factors that override reports of somatic anxiety. Furthermore, physiological arousal changes tend to be reflected in changes of somatic anxiety while the converse is not the case (Fazey & Hardy, 1988; Hardy, 1996; Hardy & Fazey, 1987). Substantial support has been shown for the cusp catastrophe model of the anxiety/performance relationship in seminal investigations of the model by Hardy and his colleagues (e.g., Hardy, Parfitt, & Pates, 1994).
One limitation, however, to the study of stress and performance in the context of any of the models described previously, is a lack of empirical explanation for the performance changes that are noticed in overly stressful situations. As mentioned, one specific cognitive mechanism that has been implicated, but has received limited empirical investigation in sport contexts, is the impact of anxiety and arousal on attention resources. Thus, a logical next step is to attempt to delineate these relationships in an effort to more thoroughly understand performance changes under stressful conditions.
A More Comprehensive Next Step
Though intriguing and receiving much anecdotal support in a variety of settings, the empirical interaction between the cognitive and emotional antecedents of the
stress/performance relationship remains largely unspecified. Furthermore, in light of recent dissatisfaction with the Inverted-U hypothesis of the anxiety/arousal/performance relationship, the underlying explanations originally forwarded by Easterbrook (1959) may be somewhat obsolete. Specifically, although studies in which anxiety or arousal have been manipulated have shown support for the attentional narrowing phenomenon, none have examined the interactive effects of these emotional antecedents, nor have they designated one or the other as the primary contributor to the relationship. Furthermore, the role of distraction has received little or no investigation in this context, and an understanding of it could contribute greatly to the understanding of performance changes.
Paradoxically, it appears that perhaps there are two equally attractive explanations for the decrease in performance that occurs under high levels of stress. On one hand, proponents of the attentional narrowing argument would suggest that under high stress levels (either anxiety or arousal induced) the attentional field narrows to block out irrelevant cues, and then narrows further, blocking the processing of relevant information as stress continues to increase. On the other hand, proponents of the distraction argument would suggest that actually a widening of the attentional field occurs such that irrelevant or distracting cues receive more attention than when under lower stress levels. Evidently, a controversy exists unless both mechanisms are in some way working at the same time. Perhaps an increase in anxiety and/or arousal results in a narrowing of the attentional field while at the same time, especially at higher levels of stress, it increases susceptibility to distraction. Many theories can account for how stress affects attention
and the eventual impact of attention variation on performance, but none address specifically why this phenomenon occurs.
As may be evident from the discussion of driving tasks, visual search has been used extensively to draw cognitive inferences regarding what information is being extracted and processed during eye fixations, a concept Viviani (1990) has termed the "central dogma" of visual search research. Though it is presently impossible to empirically prove the central dogma, most researchers agree that eye fixations do at least reflect cognitive processing. Assuming the dogma to be even partially true, if an attenuation of cues in the periphery is evident, the need to pick up crucial cues in the periphery during particular situations would necessitate an increase in scan path variability and fixation rate in order to compensate for peripheral narrowing. Furthermore, if distracting visual cues were actually introduced into the test environment, visual search strategies may be altered, resulting in increased fixation and processing of distracting stimuli and a reduction of attention resources available for central task performance.
Like normal driving, the sport of auto racing demands the coordination of an
extensive repertoire of perceptual and motor skills. However, the performance difficulty of these skills is significantly compounded by the competitive nature of the sport. In addition to mastering typical driving skills, the sheer speed of the car requires split-second decision-making and intense concentration on the most relevant cues. An ill-advised momentary attention shift or distraction can be (and often is) catastrophic under these circumstances. Thus the need to respond effectively in this type of a pressure-packed
activity is paramount. Unfortunately, no attempt has been made to empirically assess these factors in auto racing.
Viviani (1990) suggested that the central dogma of visual search and cognitive inference would be valid if evidence for serial search is provided in particular tasks. According to Kahneman (1973), as arousal increases, task difficulty also increases. Under these circumstances, parallel (relatively automatic) processes tend to be modified by the organism, becoming more serial and attentive in nature (Duncan & Humphreys, 1989; Shiffrin & Schneider, 1977). As mentioned, the auto-racing environment is one in which drivers experience extremely high levels of arousal and anxiety. In this case, the ability to relate eye fixations to cognitive information processing is more valid than when parallel processing is dominant.
As mentioned, very limited research has been done to investigate any psychological phenomena in auto racing, and none has investigated drivers' eye movements or other attention parameters that are critical to high performance in the fastest sport in the world. The selective and divided attention demands of race car driving render it an ideal task and environment in which to investigate attention mechanisms and the eye-movement parameters that underlie those mechanisms. Perhaps the first step that should be taken to better understand the attention capabilities necessary for effective race car operation is to evaluate the visual search patterns of drivers as they navigate the race course. By evaluating these parameters, it may be possible to assess whether the "software" advantages that appear to predispose athletes in other sports to reach higher levels of achievement are valid antecedents to high performance auto racing.
In light of these considerations, the primary objective of this study was to attempt to delineate the individual and interactive influence of arousal and cognitive anxiety on attention capabilities. In addition, it was anticipated that these attention alterations would result in behavioral changes that would, in turn, influence global performance indicators. Specifically, performance while undertaking (1) a central driving task and (2) a peripheral light identification task was investigated under various levels of cognitive anxiety. Furthermore, visual search patterns were assessed to ascertain whether perceptual factors (i.e., the search patterns themselves) contributed to the attention narrowing and/or distractibility phenomena.
In this manner, an attempt was made to isolate specific factors that might influence selective attention and the ability to divide attention between the central and peripheral tasks. Also, an attempt was made to determine whether visual search patterns were influenced by changes in both cognitive and physiological activation levels. By assessing specific dependent measures rather than simply global changes in affect, cognition, and performance, a clearer understanding of the interactive influence of these factors was acquired.
Statement of the Problem
In this experiment, a central driving task and a peripheral light detection task were used to assess the effects of anxiety (as manipulated by a time-to-event paradigm and anxiety-producing instructional sets) on performance over the course of familiarization, practice, and competition sessions. Performance-related variables included: (a) driving speed and accident propensity, (b) peripheral light detection speed and accuracy, (c) visual
search patterns, and (d) physiological arousal. It was determined whether any anxiety-induced changes in performance were due to a narrowing of the attentional field, increased distractibility, or both.
Hypotheses and Pilot Study Results
The following hypotheses were tested in this investigation. The first set of
hypotheses was directed toward the manipulation of anxiety and the expected result of this manipulation on arousal levels. Rationale for the hypotheses is offered after all are proposed.
1. The use of the time-to-event paradigm and instructional sets will produce higher cognitive anxiety levels during the practice and competition sessions in the experimental groups (anxiety) than in the control groups (no anxiety) as measured by the CSAI-2 (Martens et al., 1990). The instructional sets used will be similar to those employed by Hardy et al. (1994) and will be used to manipulate levels of cognitive anxiety independent of somatic anxiety. These manipulations have been shown to be valid in both sport-specific (Hardy et al., 1994) and other evaluative situations (e.g., Morris, Harris, & Rovins, 1981). Furthermore, the time-to-event paradigm has been a reliable means of investigating temporal changes in anxiety associated with impending competitions (Hardy et al., 1994).
2. The increase in anxiety levels exhibited in the experimental groups will be
mirrored by an increase in physiological arousal (as measured by an increase in heart rate and pupil diameter size) in the practice and competition sessions. In addition, it is
hypothesized that cognitive anxiety and arousal levels will be highest immediately prior to the competition session due to the time-to-event and instructional set manipulations.
According to Lacey and Lacey's (1958) autonomic response stereotype
hypothesis, the reaction to anxiety-producing thoughts and stimuli cannot be specified due to individual differences. However, if manifested in physiological changes, heart rate and pupil dilation measures are sensitive to increases in autonomic activity. In addition, heart rate has been used reliably in other tests of the catastrophe model of anxiety (e.g., Hardy et al., 1994). Furthermore, Abernethy (1993) has advocated the use of pupillometry as one of the most reliable measures of anxiety. Finally, because the test environment is static, such that the participant is not physically activated in any way, any changes in HR or pupil dilation across test conditions can be more readily attributable to emotional changes than if tested in a physically active situation.
The next set of hypotheses was directed toward the anticipated changes in performance that were expected to occur in the central and peripheral tasks.
1. For central task conditions (those in which only the central driving task is performed), driving performance (as measured by lap speed and the number of driving errors) will be similar for the control group and anxiety group in the familiarization session. However, during the second session, driving is hypothesized to be more proficient for the anxiety group than the control group. Finally, performance in the competition session will be better for those in the control group than those in the anxiety group.
2. Those in the relevant groups, in which the central driving task will be
performed concurrently with peripheral light identification of relevant stimuli, will exhibit similar proficiency on both tasks regardless of control or anxiety manipulations during the familiarization session. Driving skill during Session 2 (practice) is predicted to be facilitated for those in the anxiety group as opposed to the control group, but performance in the peripheral light detection task (as measured by reaction time and response accuracy) will be diminished due to a decrease in peripheral cue utilization. In the third test session (competition), those in the anxiety group will perform worse in both tasks due to a decrease in cue utilization.
3. For the dual task distraction conditions (those in which the central driving task will be completed concurrently with peripheral light detection of relevant stimuli while ignoring irrelevant peripheral lights), achievement in both tasks during the familiarization session will be similar for the anxiety and control groups. Central driving task proficiency during the second session will be facilitated for those in anxiety groups as opposed to control groups, but peripheral cue utilization changes will result in reduced performance on the peripheral light detection task during the same session for the anxiety group. In the third session, execution of both tasks will be worse for those in the anxiety condition as compared to control groups due to an increase in the narrowing of cue utilization as well as an increase in the distractibility of participants at high levels of anxiety.
4. Overall, achievement in the central driving task should be highest for the
central control group in the third test session due to no interference from anxiety or other attention-demanding stimuli (i.e., peripheral lights). The ability to detect peripheral lights
should be best for the relevant control group in the competition session due to the increased automation of the central task, no interference from distractors, and no interference from anxiety changes. Furthermore, reaction time and detection accuracy for relevant peripheral lights in the distraction condition is expected to be similar in the familiarization session for anxiety and control groups. However, detection speed and accuracy will decrease for those in the anxiety group in the competition session due to an increase in distractibility.
These hypotheses were forwarded on the basis of previous conclusions from
studies of the attentional narrowing phenomenon (e.g., Bruner, Matter, & Papanek, 1955; Callaway & Dembo, 1958; Callaway & Thompson, 1953; Eysenck, Granger, & Brengelman, 1957; Granger, 1953), as well as a variety of anxiety models that indicate a moderate increase in activation to be beneficial to performance but a high level of activation to result in diminished achievement (e.g., Hardy & Fazey, 1987; Yerkes-Dodson, 1908).
According to the attentional narrowing phenomenon, under moderate levels of
anxiety and arousal, the range of cues utilized will be decreased, blocking peripheral cues from being processed. Thus, central driving task proficiency will be facilitated by maintaining attentional focus on the most relevant cues while performance on the peripheral light detection task will be hindered (Easterbrook, 1959; Kahneman, 1973). However, as activation levels increase, a person is most likely susceptible to a further decrease in the range of cue utilization, blocking the processing of relevant cues (Easterbrook, 1959). Also, remaining attentional resources may be absorbed by the
increased propensity to be distracted by both internal factors (anxiety) and an increased propensity to process irrelevant external factors (distracting peripheral stimuli) (Moran, 1996; Wegner, 1994).
If activation levels reach extremes, this could eventually result in a catastrophic deterioration in effective execution (Hardy, 1996) of both central and peripheral tasks. Specifically, Hardy and Fazey's (1987) catastrophe model indicates that when a performer's cognitive anxiety and arousal reach high levels, performance will deteriorate in a dramatic fashion, not in a gradual manner as proposed by the Inverted-U hypothesis (Yerkes-Dodson, 1908).
The final set of hypotheses was directed toward the expected changes in visual search patterns that were expected to be exhibited by participants in response to changes in anxiety and arousal levels. Once again, at the completion of the proposed hypotheses, rationale will be presented.
1. Eye fixations for those in the central condition are expected to cluster closely around the point of expansion (within a 6° radius of the point of expansion) for both the control and anxiety groups.
2. In the relevant condition, fixations for the control groups should be focused more centrally (similar to the central condition) than for the anxiety group due to the ability of control participants to acquire peripheral stimuli information with peripheral vision. Correspondingly, those in the anxiety group will probably exhibit an increase in the number of fixations to the periphery in order to compensate for the reduction of peripheral vision due to anxiety.
3. In the distraction condition, similar to the relevant condition, fixations for the control groups are expected to remain more centrally located in Sessions 2 and 3 due to the ability to discriminate relevant from irrelevant peripheral light stimuli with peripheral vision. However, the number of fixations to the periphery for those in the anxiety group will increase in Session 2 and then even more in Session 3 due to a narrowing of cue utilization and an inability to acquire peripheral information with peripheral vision, as well as the increased susceptibility to focus on distracting stimuli.
These hypotheses are based on findings from general studies of driver fixation tendencies as well as the previously mentioned hypotheses with respect to attention narrowing and distraction. It has been repeatedly shown that 80-90% of drivers' fixations tend to cluster within 4-6° of the point of expansion in the visual display and that this tendency is enhanced under conditions of higher task complexity (Miura, 1985, 1990). These tendencies would be expected to hold for those in control groups that do not experience extremely high levels of anxiety and are not required to process peripheral input. However, under anxiety-producing conditions, the visual field is expected to narrow (Easterbrook, 1959), requiring an increased number of fixations to the periphery to acquire information that is normally acquired by peripheral vision.
Furthermore, it would appear that highly anxious and aroused participants will
increase the number of fixations to distracting stimuli. Miura (1986) has suggested that as driving demands increase, a stronger tendency to search for information in the periphery occurs. Accordingly, this is a possible adaptation of attention processing to deal with the increase in demands (Miura, 1987). In terms of distraction, resources (i.e., eye
fixations) directed toward the processing of distractors reduce available resources for the processing of task-relevant information. Graydon and Eysenck (1989) have shown that distraction effects increase for complex rather than simple tasks and are greater as the similarity of distractors to relevant cues increases. As the ability to distinguish relevant from irrelevant cues is diminished, the propensity to be distracted by irrelevant stimuli will likely increase along with the tendency to fixate on these stimuli.
Definitions of Terms
To standardize the terminology in this experiment, the following terms are defined:
Arousal is the process in the central nervous system that increases the activity in the brain from a lower level to a higher level, and maintains that higher level. The activation response is a general energy mobilizing response that provides the conditions for high performance, both physically and psychologically (Ursin, 1978).
Attention is "...the taking possession by the mind, in clear and vivid form, of one out of what seem several simultaneously possible objects or trains of thought. Focalization, concentration, of consciousness are of its essence. It implies withdrawal from some things in order to deal effectively with others" (James, 1890, pp. 403-404). Also, it has been described as a concentration of mental activity (Matlin, 1994; Moran, 1996).
Attentional narrowing refers to the phenomenon in which, under increasing levels of stress, the range of cues utilized by an organism is reduced, resulting in an initial filtering of irrelevant or peripheral cues from processing, an increase in performance of central tasks, and a decrease in performance of peripheral tasks. As stress levels continue
to increase, both task-irrelevant as well as relevant cues begin to be attenuated from processing until performance in both peripheral and central tasks is disrupted (Easterbrook, 1959).
Cognitive anxiety is characterized by worry or the awareness of unpleasant
feelings, concerns about performance, and the inability to concentrate (Rotella & Lerner, 1993).
Cusp catastrophe model is a three-dimensional model that describes how one dependent variable can demonstrate both continuous and discontinuous changes in response to changes in two independent variables. In the context of the catastrophe model of anxiety, the dependent variable is performance and the two independent variables are anxiety and arousal (Hardy, 1996).
Distraction refers to situations, events, thoughts, or circumstances that divert the mind from some intended train of thought and tend to disrupt performance (Brown, 1993; James, 1890; Moran, 1996).
Divided attention is characterized by the ability to attend to several simultaneously active messages or tasks, or to distribute attention effectively to simultaneous tasks that develops as a result of experience and practice (Eysenck & Keane, 1995; Hawkins & Presson, 1986).
Fixation refers to a pause in search during which the eye remains stationary for a period equal to or in excess of three video frames (120 ms) (Williams, Davids, Burwitz, & Williams, 1994).
Fixation location refers to the areas in the display in which the eye fixates during completion of a task (Williams, Davids, Burwitz, & Williams, 1994).
Point of expansion (POE) is the area where the two edge lines of the road appear to converge and the point at which the road appears to expand outward from the center (Rockwell, 1972).
Reaction time (RT) refers to the elapsed time between presentation of a particular stimulus and the initiation of a response to that stimulus (Schmidt, 1988).
Saccadic eye movements refer to movements of the eyes from one fixation point to another. A common saccade lasts for approximately 1/50th to 1/100th of a second depending on how far it is to the next fixation (Andreassi, 1989).
Search rate refers to a combination score representing the number of fixations and the duration of each fixation at particular locations (Williams, Davids, Burwitz, & Williams, 1994).
Selective attention refers to "the process of selecting part of simultaneous sources of information by enhancing aspects of some stimuli and suppressing information from others" (Theeuwes, 1994, p. 94).
Somatic anxiety refers to perceptions of physiological arousal such as shakiness, sweating, increased heart rate, rapid respiration, and "butterflies in the stomach" (Martens et al., 1990).
Stress is characterized by a combination of stimuli or a situation that is perceived as threatening and which causes anxiety and/or arousal (Hackfort & Schwenkmezger, 1993).
Useful field of view (UFOV) refers to the information gathering area of the visual display (Mackworth, 1976).
Visual search refers to the two-stage process in which visual information from
sensory receptors is held in a rapidly decaying visual sensory store and then selected items in the iconic store are subjected to a more detailed analysis (Jonides, 1981; Theeuwes, 1994).
For the purposes of this investigation, the following assumptions were made:
1. Participants received course credit for participation and therefore should have been
equally motivated to participate in the study.
2. The time-to-event paradigm and specific instructional sets which include possible ego
threats, monetary gain, and other incentives, were appropriate methods to manipulate
cognitive anxiety (Hardy, Parfitt, & Pates, 1994).
3. The CSAI-2 (Martens et al., 1990) was an appropriate measure of cognitive anxiety.
4. Heart rate and pupil diameter measures were accurate and appropriate indices of
arousal (Abernethy, 1993; Hardy, 1996).
5. The dependent measures used to assess central driving task performance (lap speed
and number of errors) and the peripheral tasks (RT and number of errors) were
appropriate measures of performance.
6. The central dogma that the line of sight will coincide with the direction of attention
(Viviani, 1990) was at least partially true in this case, and therefore, visual search
orientation was reflective of the participant's actual allocation of attention.
Significance of the Study
Most empirical research dealing with the interactive effects of arousal and/or anxiety on performance has been oriented in a very general fashion. This is exemplified by the global measures of both stress and performance that have been used (Jones, 1990). Therefore, very little is known regarding the specific components of the stress response (cognitive anxiety, arousal, or both) that influence performance variables such as attentional flexibility, speed of information processing, decision making, and other cognitive factors. With this in mind, the primary intention of the study was to contribute to and expand upon the established bodies of knowledge regarding the ability of participants in competitive sports and other stress-inducing activities to attend to and process the most relevant cues and make decisions appropriately. Though a driving task was used in the study, the implications of this research are intended to be generalizable, to a certain extent, to other achievement situations in which the stress response occurs. The driving simulation provided an ecologically valid, natural dual-task paradigm in which to investigate the phenomena of interest, due to the need to attend to and process cues from both central and peripheral locations while driving.
The investigation addressed five issues of theoretical importance. First, a greater understanding was provided of the decrement in performance that has been repeatedly shown while completing tasks under high levels of stress. Though the attentional narrowing phenomenon has received much empirical support as the underlying reason for a diminished ability to execute various tasks, other factors were suggested as possible contributors to these debilitative effects. Specifically, it was proposed that the influence of distractors, and the tendency to be distracted when faced with increased activation levels, may also contribute to performance decrements, but this possibility had not been addressed. Wegner (1994) and others have presented the notion that as stress levels increase, the propensity of the performer to be distracted is enhanced. Though empirical evidence does not exist to support this notion, anecdotal self-reports from athletes and other performers warranted investigation into this area (Moran, 1996). No research conducted to date in the context of peripheral narrowing had presented distractors to participants while they performed central and peripheral tasks.
Another issue of interest was whether the performance changes that were anticipated to occur under elevated levels of activation were due to changes in psychological affect (e.g., cognitive anxiety), an increase in arousal level, or some combination of both. By examining these variables in the context of the cusp catastrophe model (Hardy & Fazey, 1987), a clearer understanding of them and their effect on attentional processing was delineated.
Third, it was determined, to a certain extent, whether performance changes under higher levels of activation were due to perceptual alterations in visual selective attention (as indicated by changes in visual search patterns) or to other non-perceptual factors (i.e., encoding, response selection) during the information processing of relevant and irrelevant stimuli. As mentioned, one of the areas of controversy regarding the peripheral narrowing phenomenon concerned the mechanisms responsible for the lack of effective cue utilization. Indirect support has been provided both for a diminished ability to perceive relevant cues and for a decrease in the efficiency of later stages of information processing. Before this study was undertaken, no researchers had used eye movement information to clarify these issues. However, shifts in visual attention from central areas of a display to the periphery, and vice versa, were reflected in the visual search data obtained in this experiment. Furthermore, information gathered from the use of visual search monitoring equipment was used to shed some light on the question of distraction versus narrowing by indicating whether eye-movement patterns were altered to focus more on distractors while under high levels of stress. A fourth area of significance addressed in this experiment was the effect of elevated activation levels on specific performance variables. In particular, by evaluating performance in terms of a variety of accuracy, speed, and reaction time measures, a more complete understanding was ascertained of the separate elements of proficiency that are impaired or facilitated. As Jones and Hardy (1990) have suggested, the lack of attention to these specific performance variables has rendered it difficult, if not impossible, to prescribe interventions to enhance them.
Finally, an attempt was made to surmise whether skill execution was affected in a gradual or more dramatic fashion at higher levels of activation. Although the view of an inverted-U relationship between activation levels and performance is still the most popular conception of the relationship, this investigation provided evidence that perhaps more recent models (such as the cusp catastrophe model) are more accurate in their predictions.
From a more applied point of view, the results of this investigation are expected to benefit both drivers and sport performers. Though merely a simulation, the findings from this investigation give an indication of the manner in which excessive driving demands (such as heavy traffic, being "cut off," or near accidents) that increase the level of activation of drivers will affect their attentional abilities. Furthermore, the impact of attentional abilities on the central task of driving the car (accelerating, braking, and steering), as well as on the ability to detect and effectively process peripheral information, was elaborated.
It is anticipated that many of the results obtained from this study will be
generalizable to other dynamic and reactive sport activities that involve the coordination and flexibility of attentional processing between central and peripheral sources of information. By developing a clearer understanding of information processing abilities in these types of environments, it may be possible to derive training simulations to help athletes maintain focus on the most relevant cues in the performance situation. For instance, Singer, Cauraugh, Chen, Steinberg, Frehlich, and Wang (1994) have shown that it is possible to train attentional parameters to be more in line with expert strategies used in reactive tennis situations. Perhaps this will be possible in tasks in which an anxiety-producing situation is present, such as the high-speed driving context of interest in this study.
REVIEW OF LITERATURE
When considering the ability to attend to, process, and react to specific cues in dynamic, highly reactive sport situations in the most efficient and correct manner, issues arise concerning the various attention and information processing components that either facilitate or impede performance. Specific questions include: How do performers know which cues to attend to? What are the properties of particular cues that make them salient and informative to the participant? What information is extracted from cues as they are attended? What are the separate influences of arousal and anxiety levels on the ability to perform effectively by selecting and processing the most relevant cues at the right time? Do eye movements and other behaviors associated with visual attention processing change under stressful situations? If so, do changes in attention shifts and eye movements reflect detrimental or facilitative effects of performance? Are these effects due to a narrowing of the visual field and/or to changes in the ability to mediate the distracting properties of irrelevant stimuli? These are questions that have received little attention in the context of sport and other performance areas and will therefore be investigated in this project.
The influence of an organism's general level of activation is integral to the ability to respond to particular stimuli in an effective and timely manner. The level of activation
is usually described in terms of the performer's state of arousal, which has been defined by Abernethy (1993) as "a physiological state that reflects the energy level or degree of activation of the performer at any particular instant" (p. 129). Since the publication of the Yerkes-Dodson (1908) Inverted-U theory, much research has been devoted to understanding the influence of arousal states on the ability to attend to, discriminate, and process information in tasks ranging from simple laboratory reaction time tasks to more applied areas in sport, the military, and industry. Research in which the effects of stress on performance have been investigated have ranged along a continuum from assumed low levels of arousal in vigilance tasks to very high levels of arousal in quickly changing, interactive, dynamic environments or situations in which the perception of threat has been induced.
A concept that received a great deal of attention during the early 1950's was the narrowing of the attention field as arousal and/or anxiety increased, culminating in the publication of Easterbrook's (1959) article describing the phenomenon. The peripheral narrowing idea has been used extensively to explain changes in performance in a variety of laboratory tasks and has been generalized to other real-world applications. However, in the sport domain, empirical investigation of the peripheral narrowing phenomenon has been sparse. Furthermore, other factors such as the influence of distraction on decision making and information processing capabilities of athletes have been virtually ignored by sport psychology researchers. Similarly, no research has been directed toward assessing these various attention parameters in the sport of auto racing. However, due to its reliance on speedy decision making and attention shifts under extreme time constraints
and life-threatening circumstances, auto racing provides the ideal environment in which to assess these factors. Generalizations to such situations in other contexts and with other tasks can be made, which is the intent of the present study.
Accordingly, the focus of the following literature review is to critically evaluate the literature that led up to and continued beyond the publication of Easterbrook's (1959) influential work. Also, the separate components of stress will be compared and contrasted, and the interactive influence of these components on attention will be summarized. Furthermore, a justification for examining attention processing in stressful environments with respect to eye movement parameters will be provided. Finally, an empirical framework will be proposed to evaluate the influence of physiological and cognitive stress on attention capabilities in a simulated race car driving task.
Stress and Human Performance
Anxiety, arousal, fear, and a variety of other terms that fall under the guise of
stress have been studied extensively in terms of their influence on performance, individual responses to stressors, and methods of regulating the stress levels of sport performers. The very nature of sport, with its increasing public exposure, the pressures placed on athletes to win from coaches, other athletes, and themselves, the rewards for great performance, and the disappointment from losing, is full of stressful performance situations (Murphy, 1995). Athletes who are able to regulate the stress response and perform in competitive situations in spite of the surrounding pressures inherent in sport are those who will inevitably excel.
However, though the general topic of stress in sport has received much attention from sport psychology researchers, confusion has been proliferated by the fact that many researchers and practitioners use terms such as activation, stress, anxiety, and arousal interchangeably, treating a multidimensional construct in unidimensional ways. Accordingly, before addressing the specific issue of attentional narrowing as a result of stressful circumstances, the similarities and differences of these terms must be discussed. Also, a discussion of popular theories developed to describe how performers deal with stress and the theoretical basis for the present investigation will be provided in light of the recently proposed cusp catastrophe model of anxiety and arousal (Fazey & Hardy, 1987).
Stress is defined as a combination of stimuli or a situation that is perceived as threatening and which causes anxiety (Hackfort & Schwenkmezger, 1993). Various stressors include external threats, deprivation of primary needs, and performance pressures that can be characterized as both general and sport specific. Selye (1956) described stress based on the principle of equilibrium in which self-regulation is of primary importance. He differentiated stress (a condition to which we are always prone) from the inability to cope with the stress.
A popular cognitive view of anxiety that was heavily influenced by Selye's ideas was forwarded by Lazarus and his colleagues (e.g., Lazarus, 1966; Lazarus & Averill, 1972). Basically, Lazarus viewed anxiety as an emotion with a specific pattern of arousal that corresponds to it and that is influenced by the cognitive appraisal and perception of an
anxiety-producing event. According to this view, all facets of a situation tend to be classified with respect to their significance and their implications for the person's well-being. Therefore, it is the perception of the event and not the event itself that dictates emotions. Researchers have discovered that, contrary to the medical model of stress, many people view stress and anxiety as challenging, exciting, and beneficial (Lazarus & Folkman, 1984).
These findings prompted the formulation of Kobasa's (1989) Hardy Personality Theory which states that people who are psychologically hardy tend to view stressful situations in a positive way. The specific characteristics of psychologically hardy people are that they (1) are committed to the activity, (2) believe they can control or influence events, and (3) view demands or changes as exciting challenges. Similarly, Smith's (1980) mediational model suggests that the appraisal process creates the psychological reality based on what the individual tells himself or herself about the situation and the ability to cope with it.
Meichenbaum (1985) also suggests that the cognitive appraisal of the individual is what dictates the nature of the interaction with the environment. The meaning the person construes from the event is what shapes the emotional and behavioral response. Similarly, Mahoney and Meyers (1989) postulate that it is not stress that is central to performance but the athlete's expectations, efficacy beliefs, and use of arousal that will determine performance. Therefore, arousal, if perceived as natural, is positive, whereas anxiety (i.e., worry) is negative. Being aroused does not mean that one will become anxious. Rather, anxiety occurs due to (1) distrust of natural responses, (2) ineffective perceptions
due to previous exposure to modeling of arousal, (3) directly being taught that arousal is bad, and (4) early failure experience while aroused. Support for the notion that it is the perception of the stressful situation that dictates performance is provided by findings that athletes enjoy the "nervousness" associated with competition. Rotella, Lerner, Allyson, and Bean (1990) have shown that precompetitive feelings of high activation are helpful to performance if they are perceived to be natural and provide a sense of readiness rather than concern.
Unfortunately, all athletes, even those perceived as being the best in stressful situations, occasionally "choke" under pressure. Thus, the question remains: How do external and internal stressors manifest themselves in the stress response and how does the stress response affect performance? The rest of the review will be directed toward describing situations in which the performer fails to regulate the stress response appropriately. A justification for continued research in this area will be provided. From a cognitive perspective, then, questions arise concerning how the stress response influences the ability of performers to process information and allocate processing resources to coping with stressful stimuli as well as dealing with task demands and constraints.
Theories of the Stress Response
Controlling the stress response is critical to the ability to perform well. Whether or not cognitive appraisal reflects reality is not necessarily important in terms of the stress response for the simple reason that it only occurs in situations in which self-regulatory skills fail (Carver & Scheier, 1981; Cherry, 1978; Jones, 1990; Lazarus, 1966).
The analysis of stress has its roots in the psychoanalytic conceptualization of the construct. Specifically, Freud (1952) postulated that affect and neurosis are closely related to each other, with affect being related to exogenous arousal and neurosis being related to endogenous arousal. Though not the most popular view of stress today, this does provide a foundation for much of the work done in the psychoanalytic realm and provides the impetus for later cognitive and behavioral approaches to the study of stress.
Mowrer's (1960) learning theory approached stress from a behavioral learning viewpoint involving both classical conditioning and instrumental reinforcement. He suggested that in environments where specific stimuli result in stressful outcomes, the organism would eventually learn to associate the stimulus with the stressful outcome. For instance, if an athlete consistently performs poorly in a specific competition setting, eventually the mere thought of that setting will elicit an anxious response.
With the cognitive revolution in the late 50's and early 60's, stress (in particular, anxiety) was viewed as an emotion that is triggered by a person's "communicative relationship" with the environment and arose from expectations and appraisals of these situations (Festinger, 1954). Festinger suggested that anxiety control is based on decisions that lead to either direct actions to remove the anxiety-producing stimulus or to avoid it (the approach/avoidance distinction). Three assumptions that formed the basis of Festinger's theory were that: (1) a person who cannot account for arousal will look for something to attribute it to, (2) previous explanations do not cause a need for appraisal, and (3) a person with arousing thoughts but no physiological arousal will not show emotional response and therefore will not be stressed. According to this view, an athlete
who experiences physiological arousal will choose to engage cognitive processes to interpret it only if the arousal persists and is unaccounted for (Hackfort & Schwenkmezger, 1993).
As will be described in depth later, the specific reactions to stress are individually determined. Stress can be manifested in the form of cognitive and somatic anxiety, physiological arousal, loss of self-confidence, panic, and a variety of other forms. Obviously, each of these different responses will have an influence on performance if not regulated appropriately.
The Stress/Performance Relationship
One of the more popular early conceptualizations of the stress/performance
relationship was the Hull/Spence Drive Theory (Hull, 1952; Spence & Spence, 1966). According to the theory, level of activation is considered a function of the sum of all of the energetic components affecting an individual at the time of a particular behavior. Furthermore, drive strength is dependent on the emotional reaction that is caused by an aversive stimulus. Thus, people with increased drive levels perform better due to their greater effort, emotion, and motivational need to remove the aversive stimulus. Though an attractive early attempt to explain the stress/performance relationship, empirical testing has suggested that the theory is not generalizable to many situations, especially those requiring fine motor control.
Other popular theories that have attempted to relate stress to performance are the 'optimal zone' theories. Of these, Hanin's (1980) concept of an arousal zone of optimal functioning (ZOF) has received the majority of empirical investigation. Though initially
criticized as a reiteration of the Inverted-U hypothesis (Yerkes-Dodson, 1908) (which will be discussed at length in the next section), it is instead an interindividual account of how arousal affects performance. The attractiveness of the model rests in the fact that it accounts for individual differences, something the Inverted-U is unable to do. A similar theory is Martens' (1987) zone of optimal energy.
Csikszentmihalyi's (1975) concept of a less sport-specific optimal arousal state (or flow state) is another attempt to explain the activation of the organism at a level that is most conducive to performing well. The flow state is characterized by a variety of factors including (1) awareness, but not being aware of awareness, (2) focused attention, (3) loss of the ego and self-consciousness, (4) a feeling of being in control, and (5) intrinsic reward from performing well. Athletes often refer to the flow state in discussing their best performances, and continued research is being directed toward understanding the factors that allow athletes to enter this relaxed state of intense concentration and seemingly effortless ability to perform at the highest levels.
Another theory related to the 'optimal states' is Kerr's (1989) Reversal Theory. Based on Apter's (1982) phenomenological theory of motivation, emotion, personality, and psychopathology, Kerr's basic premise is that depending on the metamotivational state in which the athlete is currently involved, there is a combination of arousal and "hedonic tone" (feeling of pleasure) that dictates whether that state will be associated with anxiety, pleasurable excitement, boredom, or relaxation. A discussion of the intricacies of reversal theory is beyond the scope of the current review, but it does
provide a unique way to view the arousal/anxiety/performance relationship and warrants further investigation.
Of the theories that have been proposed to account for the relationship between stress and performance, perhaps the most influential and misunderstood is Yerkes-Dodson's (1908) Inverted-U hypothesis. The basic premise of the Inverted-U hypothesis (which was generated based on work with animals) is that as arousal increases so does performance until an optimal level is reached. At this point, any increase in arousal level will lead to a gradual deterioration of performance until arousal level is reduced to the optimal level (Yerkes-Dodson, 1908). Unfortunately, sport psychology researchers have been reluctant to abandon the rather shallow notion of the Inverted-U hypothesis due to the simplistic nature of the theory and its almost universal application. The myths and realities surrounding this controversial theory and the research that both supports and refutes it will be briefly reviewed in the following section.
It has been postulated that one mediator of the stress/performance relationship is the characteristics of the task. In regard to the influence of task characteristics on the stress/performance relationship (and assuming the Inverted-U relationship of stress to performance), Oxendine (1970, 1984) and Oxendine and Temple (1970) suggested that different types of tasks require different levels of arousal. According to Oxendine, a moderately above resting level of arousal is required for the successful execution of all motor tasks. Also, a low level of arousal is best for tasks involving complex movements,
very fine motor control, steadiness, and concentration. Finally, in gross movements requiring strength, endurance, and speed, a high level of arousal is most beneficial.
Though intuitively appealing, Oxendine's suggestions have been criticized due to their simplicity (Jones, 1990). Jones provides several examples of sport situations where Oxendine's hypotheses do not hold true and cites three primary reasons for their lack of value. First, only one of the three predictions has held up to empirical examination: that relatively lower levels of arousal are most advantageous for complex, highly specialized tasks. Also, his classification system is overly simplified in that entire sports, such as basketball, which require extremely diverse arousal states during the course of a game, can be categorized in one of the three levels. Finally, Jones (1990) suggests that Oxendine does not consider the cognitive requirements of the skills, focusing instead on the movement parameters in particular.
Another one of the primary criticisms of the Inverted-U hypothesis is its global nature. It seemingly relies on the notion of a general stress response that influences performance (e.g., Neiss, 1988). Levi (1972) made an early attempt at separating the different components of the stress response by suggesting that both high and low levels of arousal could be experienced as stressful. In this vein, he proposed that an increase in stress would result from the further deviation of the arousal state from the optimal level. However, these ideas have also been criticized and basically dismissed by the newer concepts of the interactionist approach to stress in which individual differences in the perception of the stress response are accounted for, not simply the fact that being underaroused or overaroused causes stress.
Another problem with the Inverted-U description of the stress/performance
relationship is that it is a description and nothing more (Jones, 1990). No explanation is offered for why performance is impaired when arousal deviates from the optimal level. Though factors such as attentional allocation of resources, attentional narrowing, and hyperdistractibility have been suggested and many have been investigated, the Inverted-U hypothesis specifies none of these as the primary contributor to the decline in performance as arousal deviates from optimal levels. More than likely, it is a combination of these factors that impacts on the ability of the performer to function efficiently and to process information effectively in the stressful environment.
Another criticism that has been levied against the Inverted-U hypothesis is that it does not address specifically how performance is influenced. Rather, the hypothesis merely states that overall capabilities, in a very general sense, are dependent on the level of stress. Obviously, this description is entirely too global and does not explain how such variables as speed of information processing, stimulus detection ability, and response accuracy are affected (Eysenck, 1984). Furthermore, as will be addressed later, the actual shape of the Inverted-U curve has been questioned by those who assert a more dramatic decrease in performance at high levels of anxiety/arousal with a more difficult recovery to high performance levels as anxiety/arousal decreases (Hardy & Fazey, 1987).
It has been suggested that there is virtually no sound evidence to support the
Inverted-U hypothesis (Hockey, Coles, & Gaillard, 1986; Naatanen, 1973; Neiss, 1988). Perhaps the most rabid of the Inverted-U's critics is Neiss (1988), who called the empirical evidence in favor of the Inverted-U "psychologically trivial." Other researchers
have been equally adamant regarding its lack of applicability, validity, and credibility, calling it a "catastrophe" and a "myth" (Hardy & Fazey, 1987; King, Stanley, & Burrows, 1987). The criticisms and negative connotations associated with the Inverted-U hypothesis prompted Neiss (1988, 1990) to suggest that the study of arousal in the context of the Inverted-U should be abandoned for the following reasons: (1) it cannot be falsified, (2) it cannot function as a causal hypothesis, (3) it has trivial value if true, and
(4) it hinders understanding of individual differences in regard to the stress response.
Others suggest that it merely needs to be reformulated to account for individual differences and to address the underlying mechanisms that specify the facilitative and/or detrimental effects of stress (Anderson, 1990; Hanin, 1980; Martens, 1987). Researchers have addressed such areas as the nature of the task (e.g., Weinberg, Gould, & Jackson, 1985), skill level (e.g., Cox, 1990), and individual differences (e.g., Ebbeck & Weiss, 1988; Hamilton, 1986; Spielberger, 1989) with respect to the Inverted-U hypothesis. However, the understanding of these specific components is only beginning to be surmised.
Perhaps much of the confusion, equivocality of empirical results, and lack of consistency in research findings associated with the Inverted-U can be attributed to the multitude of experimental methods that have been used to examine it and the lack of consistency in differentiating the various components that embody the term "stress". A discussion of the specific components that fall under the guise of "stress" will be presented in the following section.
Stress, Arousal, and Anxiety
As mentioned earlier in the review, stress is characterized by a combination of
stimuli, or a situation, that a person subjectively experiences as threatening and that causes anxiety (Hackfort & Schwenkmezger, 1993). According to this view, stress occurs when one is unable to cope with a particular situation, and it arises due to specific 'constellations' of threatening stimuli. Various stressors include internal and external threats, performance pressures, social threats, and sport-specific circumstances. One of the specific components of stress is anxiety.
Anxiety is an emotion characterized by uncertainty; a state of unoriented activation that is learned through the socialization process and direct exposure to anxiety-producing situations (Sage, 1984). Fear, on the other hand, though similar to anxiety, is characterized by the perception of danger in response to a known threat, is a reflex-like defense, and is logical, self-protective, and adaptive (Hackfort & Schwenkmezger, 1993). According to Cattell and Scheier (1961), fear is a specific reaction while anxiety is caused by anticipatory and imaginative processes. Thus, the two are distinguished by their degree of specificity and recognizability.
Spielberger (1966, 1972, 1983) defines stress as being closely related to state and trait anxiety. The trait component is exhibited as an acquired behavioral disposition, independent of time, causing the person to perceive a wide range of not very dangerous circumstances as threatening. Conversely, state anxiety refers to subjective, consciously perceived feelings of inadequacy and tension accompanied by an increase in arousal in the autonomic nervous system. These characteristics are influenced by both cognitive and
emotional components in which the person is preoccupied with irrelevant thoughts and eventual subjective excitement when the ego is threatened. Spielberger's Anxiety Theory (1966, 1972) states that those with higher trait anxiety tend to respond to stressful situations with even higher state anxiety. In accordance with this view, studies (e.g., Hackfort & Schwenkmezger, 1989) have indicated that those who exhibited higher trait anxiety reported anxiety as debilitating while those who were not trait anxious reported it as facilitative to performance. Similarly, Martens (1971, 1974) determined that highly anxious persons perform better on some tasks while less anxious persons do better on others, and that the state anxiety level at the beginning of the learning process depends on the trait anxiety level of the person. Furthermore, there appears to be an unexplored interaction between anxiety level, situation-specific stress stimuli, task difficulty, and situation-specific conditions of learning and performance.
Another important distinction must be made between cognitive and somatic anxiety. Cognitive anxiety is characterized by a state of worry, the awareness of unpleasant feelings, and concerns about ability to perform and concentrate in a particular environment. Worry is a cognitive process that takes place prior to, during, and after a task and is marked by decreases in faith in the performance, increased concern, social comparison, and fear of failure (Hackfort & Schwenkmezger, 1993). These characteristics of worry may represent cognitive, evaluative processes that are suitable for predicting performance, as high levels of worry tend to lead to lower levels of performance (Martens, Burton, Vealey, Bump, & Smith, 1990).
Conversely, somatic anxiety refers to the perceptions of physiological arousal such as shakiness, sweating, HR, respiration, and "butterflies" in the stomach. A synonymous term used to describe somatic anxiety is "emotionality", characterized by affective physiological system changes caused by an increase in arousal level (nervousness, increased HR, etc.) (Zaichkowski & Takenaka, 1993). Furthermore, cognitive and somatic anxiety appear to have different antecedents. Somatic anxiety is elicited by a conditioned response to competitive stimuli while cognitive anxiety is characterized by worry or negative expectations about an impending performance or event. A handful of studies have suggested that there tends to be a negative link between worry and motor performance while there appears to be a positive link between somatic anxiety and performance (e.g., Gould, Weiss, & Weinberg, 1981).
Due to the relevance of somatic anxiety to arousal, these terms are often used interchangeably. However, there is a clear distinction between the two terms. Somatic anxiety refers to perceptions of physiological states and is, therefore, a psychological characteristic. On the other hand, arousal reflects the natural activity of one's physiology and is therefore a purely physiological construct (Rotella & Lerner, 1993). In this respect, somatic anxiety is influenced by the subjective evaluation and interpretation of arousal. The specific physiological mechanisms that govern arousal level are thought to be regulated by the neurophysiology of the central nervous system. The four primary structures involved are the cerebral cortex, the reticular formation, the hypothalamus, and the limbic system. The cortex is responsible for cognitive appraisal of incoming stimuli, the reticular formation acts as an organizer with the other components,
the limbic system provides emotional input in the regulation of arousal, and the hypothalamus regulates sympathetic nervous system activity along with the pituitary gland (Zaichkowski & Takenaka, 1993). These upper-level control systems exert their influence on the sympathetic nervous system, which is primarily responsible for the psychophysiological changes in HR, pupil dilation, respiration rate, blood glucose levels, and other physiological responses.
Though the description of arousal appears straightforward, researchers have conceptualized it in various ways. For instance, Sage (1984) suggests that arousal is synonymous with activation level. Magill (1989) discusses it in a motivational context that serves as an energizing agent to direct behavior to a specific goal. Cox (1990) has defined arousal as alertness while Martens (1987) dislikes the term "arousal" altogether and prefers the term "psychic energy", which serves as the cornerstone of motivation. Taken collectively, these current views suggest that arousal is a multidimensional construct that serves an energizing function of the mind and body and varies along a continuum from sleep to extreme excitement. It contains a general physiological response in which several systems may be activated at once, including HR, sweat gland activity, pupil dilation, and electrical activity of the brain. It also includes behavioral responses (performance) and cognitive processes (appraisal of physiological arousal).
Therefore, in order to gain an accurate assessment of arousal, physiological,
behavioral, and cognitive components must be assessed (Borkovec, 1976). It should be emphasized that changes in physiological function are not necessarily indicative of arousal,
and therefore must be accompanied by other measures, because any of the physiological components can be altered without impacting the others (Lacey & Lacey, 1958). These issues will be addressed again later in the discussion of multidimensional anxiety theory.
Assessment of the Stress Response
As mentioned, due to the multidimensional nature of the stress response, multilevel assessment is absolutely necessary to gather a better understanding of the influence of the various components of stress on performance. Assessment effectiveness can be maximized through the combination of physiological, behavioral, and cognitive (self-report) measures. Physiological indices of arousal include such measures as skin resistance, pupil dilation, heart rate, electroencephalogram, electrocardiogram, electromyogram, and other biological measures. The advantage of physiological assessments is that they are not tied to verbal statements. Also, they can be used with all types of people and can assess changes in arousal continuously. However, the primary disadvantage is that physiological measures lack high correlations with one another, a condition Lacey and Lacey (1958) referred to as autonomic response stereotypy. Also, in most sport contexts, physiological measures will be confounded by other physiological changes due to exercise-induced responses.
Another level of assessment is behavioral. Observation of behavioral change (such as the presence of nervous twitches, vomiting, etc.) can provide an indication of the stress response. Unfortunately, often behavioral observations may be attributed to stress when the actual root of the behavior is not stress-produced. For instance, vomiting could be
due to the flu rather than competitive stress. Therefore, often self-statements are needed to interpret behavioral observations.
Assessment at the cognitive level is usually done through self-report measures.
Some of the more popular measures of anxiety include the State-Trait Anxiety Inventory (STAI: Spielberger, Gorsuch, & Lushene, 1970), the Sport Competition Anxiety Test (SCAT: Martens, 1977), and the Competitive State Anxiety Inventory-2 (CSAI-2: Martens, Burton, Vealey, Bump, & Smith, 1990). It should be mentioned that most cognitive measures of arousal that have been used are those that measure anxiety, not arousal. Though much time and effort has been devoted to the development of these self-report measures, Kleine (1990) conducted meta-analyses that indicated only a moderate relationship between various measures of anxiety and performance. Furthermore, his results suggested that the STAI (a non-sport-specific measurement tool) was as good as the SCAT (sport-specific) for predicting performance in sport. Further criticism has been directed toward the SCAT due to the unidimensionality of the instrument (assessing only the cognitive aspects of anxiety) and its bias toward assessment of the frequency of debilitating anxiety while ignoring possibly facilitative aspects.
As mentioned, one of the primary weaknesses of research on stress and more specifically, anxiety, is the lack of multidimensional assessment. The CSAI-2 is more multidimensional in nature as it separates measures of cognitive and somatic anxiety and has been used extensively in sport research. The reliability and validity of the instrument and its ability to measure the multidimensional nature of anxiety is laudable. The next section of the review addresses the multidimensional nature of anxiety and the importance
of better understanding the influence of specific components of anxiety on performance from both a basic and applied point of view.
Multidimensional Anxiety Theory
Due to recent concern with the lack of usefulness of the Inverted-U model of anxiety and/or arousal, theorists began to search for a better explanation of the stress/performance relationship. Researchers began to attempt to break down the stress response into its various components. These concerns eventually led to the formation of multidimensional anxiety theory, which has also spurred the development of other theories such as Hardy and Fazey's (1987) catastrophe theory. The generation and a general summary of multidimensional anxiety theory follows.
Perhaps the first to attempt a decomposition of the general stress response were Liebert and Morris (1967), who identified two primary contributing factors to anxiety: worry and emotionality. In Liebert and Morris's view, worry consisted of cognitive concerns about one's performance while emotionality referred to the autonomic reactions to the performance environment. This initial identification heavily influenced Davidson and Schwartz's (1976) multidimensional model of anxiety. They were the earliest to use the terms "cognitive" and "somatic" anxiety and formulated their theory in the context of clinical applications. Thus, worry has become synonymous with cognitive anxiety and emotionality has become synonymous with somatic anxiety. Cognitive anxiety is typified by the awareness of unpleasant feelings and concerns about ability to perform and to concentrate. Conversely, somatic anxiety is characterized by perceptions of physiological arousal such as shakiness, sweating, HR, respiration, and "butterflies in the stomach".
These general characteristics of the components of anxiety have held up under empirical investigation and appear to be manipulable independently (e.g., Schwartz, Davidson, & Goleman, 1978).
As mentioned, another characteristic of the multidimensional components of
anxiety is that they appear to have different antecedents. Somatic anxiety is elicited by a conditioned response to the competitive environment, while cognitive anxiety is characterized by worry or negative expectations. Researchers have consistently shown that somatic anxiety tends to build as the event (or competition) grows nearer and dissipates as performance begins, while cognitive anxiety continually fluctuates as the subjective probability of success varies (Jones & Hardy, 1990; Martens et al., 1990). Martens et al. (1990) found that cognitive anxiety remains stable and high during the period preceding an event while somatic anxiety peaks at the moment just before competition. Likewise, in an earlier study, Spiegler, Morris, and Liebert (1968) reported similar results in the context of test anxiety.
Another means in which cognitive and somatic anxiety differ is with respect to
their effects on performance. In accordance with differences in the time course of anxiety onset, somatic anxiety would be expected to have no influence on performance while cognitive anxiety would have a significant influence, due to the ever-changing subjective probability of success. Martens et al. (1990) found this to be the case. However, other studies have shown an Inverted-U relationship of somatic anxiety to performance (Burton, 1988). Furthermore, studies using the time-to-event paradigm have found that cognitive anxiety actually has a positive effect on
performance in the days leading up to a competition (Hardy, 1996). Thus, it appears that rather equivocal results exist on both sides of the argument. Jones and Hardy (1990) attribute the disparity and lack of consistency in findings to the multitude of different paradigms that have been devised and the abundance of analyses that have been applied to reduce the data.
Another problem that exists with respect to multidimensional anxiety theory is the two-dimensional approach used to explain the effects of somatic and cognitive anxiety on competition. Specifically, the two-dimensional approach in analyzing results tends to neglect the interaction of the components of anxiety, treating them independently rather than in combination (Hardy & Fazey, 1987). According to the viewpoint of Hardy and his colleagues, any relatively comprehensive treatment of these components must consider them in an interacting, three-dimensional manner. In an attempt to improve the predictability and structure of the model, therefore, Hardy and Fazey (1987) developed a catastrophe model of anxiety and performance.
A Catastrophe Model of Anxiety
In an effort to advance understanding beyond the multidimensional approach to the study of the effects of anxiety and arousal on performance, Hardy and Fazey (1987) formulated a three-dimensional model of the relationship. Borrowing heavily from Thom (1975) and Zeeman (1976) who originally devised the idea of catastrophes and then applied them to the behavioral sciences, respectively, Hardy and Fazey's (1987) model is closest in form to the cusp catastrophe, one of the seven originally proposed catastrophe models of Thom (1975). Zeeman (1976) borrowed Thom's original ideas and described
the theory by developing a machine to model it. When describing human behavior, however, events are less mechanistic and absolute, requiring the model to be revised such that changes in one variable (i.e., anxiety or arousal) increase the likelihood that the dependent variable (i.e., behavior) will change in a predicted direction.
Of the two independent variables in Hardy and Fazey's (1987) model, cognitive anxiety is the "splitting factor", the variable that determines performance levels and, ultimately, catastrophes. The roles of cognitive anxiety and physiological arousal were chosen specifically to be able to evaluate testable hypotheses with respect to the anxiety/arousal/performance relationship. Specifically, when cognitive anxiety is low, the model predicts that physiological arousal will influence performance in an inverted-U fashion. However, when physiological arousal is high, such as on the day of competition, high levels of cognitive anxiety will result in lower levels of performance. When physiological arousal is low, such as during the days leading up to competition, higher cognitive anxiety will lead to increases in performance. When cognitive anxiety is high, the effect of physiological arousal depends on how high cognitive anxiety is elevated. Usually the manipulation of anxiety and arousal is carried out through a time-to-event paradigm in which assessments are taken at specified times leading up to a competition setting. For instance, assessments will be taken one week prior, two days prior, and then one hour prior to the competition setting. In this way, the time course of anxiety and arousal can be assessed. In other instances, levels of anxiety and arousal are manipulated through the use of ego-threatening or other anxiety-producing instructional sets and through the use of exercise-induced arousal, respectively.
Testable hypotheses have been generated from the conceptual framework of the
original catastrophe model (Fazey & Hardy, 1988). According to the model, physiological arousal changes are not necessarily detrimental or facilitative to performance. However, if physiological arousal is high, it can have catastrophic effects on performance in situations where cognitive anxiety is also high. Another prediction is the hysteresis effect. Due to the splitting effect of cognitive anxiety, under high cognitive anxiety conditions, physiological arousal will have a differential effect on performance when it is increasing as opposed to when it is decreasing. A third prediction is that intermediate levels of achievement are most likely to occur under conditions where cognitive anxiety is high. Finally, Fazey and Hardy (1988) suggest that it is possible to fit statistical models to cusp catastrophes.
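Fazey and Hardy's (1988) suggestion that statistical models can be fitted to cusp catastrophes rests on the cusp having a standard mathematical form. As an illustrative sketch only (the notation below is the conventional catastrophe-theory parameterization, not Fazey and Hardy's own), the performance surface can be written as the set of equilibria of a potential function:

```latex
% Cusp catastrophe sketch (conventional notation; an assumption for illustration):
%   y = performance,
%   a = normal factor (physiological arousal),
%   b = splitting factor (cognitive anxiety).
V(y; a, b) = \tfrac{1}{4}y^{4} - \tfrac{1}{2}b\,y^{2} - a\,y
% The equilibrium (performance) surface is the set of points where
% \partial V / \partial y = 0:
y^{3} - b\,y - a = 0
```

For small values of the splitting factor b, this cubic has a single root, so performance changes smoothly as arousal (a) varies; for sufficiently large b the surface folds, two stable performance levels coexist, and sweeping arousal upward versus downward traces different paths, which corresponds to the discontinuous performance drops and the hysteresis effect the model predicts under high cognitive anxiety.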
One notion that may become obvious in the discussion of the differences in the
catastrophe model versus multidimensional anxiety theory is the suggestion that cognitive anxiety can facilitate performance at certain times, especially in the days leading up to competition. This is in direct contrast to most studies of cognitive anxiety that have demonstrated a negative relationship between it and skill execution. With further thought, however, it is obvious that the motivating effects of cognitive anxiety in the days leading up to competition could eventually facilitate achievement capabilities. Also, it should be emphasized that in many of those studies in which a negative relationship has been identified between cognitive anxiety and performance, assessment was made on the day of competition, when physiological arousal can be assumed to be relatively high (Hardy, 1996).
Another obvious feature of the cusp catastrophe model of the anxiety-performance relationship is the choice of physiological arousal rather than somatic anxiety as the normal factor. The primary reason for this choice was based on the notion that it is part of the organism's natural physiological response to anxiety-producing situations (Hardy, 1996). The physiological response to performance anxiety is sufficiently well-established to be considered in the context of a generalized response within the competition setting. However, though the physiological response may be reflected in self-reports of the presence of somatic anxiety, the purely physiological index can encompass the individual task requirements, different situations, and other combinations of factors that override reports of somatic anxiety. Furthermore, physiological arousal changes tend to be mirrored by changes of somatic anxiety while the converse is not the case (Fazey & Hardy, 1988; Hardy, 1996; Hardy & Fazey, 1988).
Substantial support has been shown for the cusp catastrophe model of the anxiety-performance relationship in seminal investigations of the model by Hardy and his colleagues. Hardy, Parfitt, and Pates (1994) and Parfitt, Hardy, and Pates (1995) conducted two studies to examine the relationship. In the first of their studies, the time-to-event paradigm was implemented to manipulate anxiety independently of physiological arousal in female basketball players and was primarily directed toward examining the hysteresis hypothesis. Physiological arousal was measured by a Polar heart rate monitor
(HRM), and cognitive and somatic anxiety were measured with the CSAI-2. The task was a basketball free throw that was performed after completing physiologically arousing exercise. Findings indicated that both cognitive and somatic anxiety were elevated on the
day before the tournament. This was a somewhat different finding as compared with previous studies in which somatic anxiety increases usually only occurred on the day of the significant event. The data with regard to the hysteresis hypothesis were supportive. In general, performance followed a different pathway with respect to heart rate when increasing as opposed to when it was decreasing in conditions of high cognitive anxiety but not in conditions of low cognitive anxiety.
In the second experiment, Parfitt, Hardy, and Pates (1995) examined the
generalizability of these findings with women basketball players to male crown green bowlers. The exception in this study was that cognitive anxiety was manipulated through the use of instructional sets rather than through the use of the time-to-event paradigm. The results of the first experiment were replicated in that the three-way interaction between cognitive anxiety, HR, and direction of heart rate change influenced performance in predictable directions.
Another interesting finding that provides support for the cusp catastrophe is a subprediction that performance will be most variable under the high and low cognitive anxiety conditions (Hardy, 1996). Specifically, according to the surface of the performance curve, it would be predicted that the highest levels of performance achieved in the high anxiety condition would be higher than the highest levels achieved in the low anxiety condition. Similarly, the lowest levels of performance in the high anxiety condition would be lower than the lowest levels of performance in the low anxiety condition. In fact, these hypotheses were supported in the second study, thereby providing evidence to support the cognitive anxiety component as the splitting factor on the performance surface (Parfitt, Hardy, & Pates, 1995). Though relatively little work has been done to examine the validity of the cusp catastrophe model, initial results support it, and further research in this area is warranted.
One limitation, however, to the study of anxiety in the context of both catastrophe theory and the other models mentioned above, is a lack of explanation for the performance changes that are noticed in overly stressful situations. One specific cognitive mechanism that has been implicated, but has received limited empirical investigation is the impact of anxiety and arousal on attentional resources. The following section will outline some of the research that has been directed toward examining this relationship.
Anxiety, Arousal, and Attention
One of the critical factors that could contribute to performance changes under
anxiety- or arousal-producing situations is the ability to allocate attentional resources to the appropriate areas and to process information gathered in these areas effectively (Kahneman, 1973; Landers, 1978; Nideffer, 1976, 1989). Evidence seems to suggest an arousal/performance relationship that is mediated by attentional factors. Support has been found for this notion in both anecdotal and empirical evidence (Nideffer, 1988).
Perhaps the most compelling evidence that favors the notion of a mediating role of attentional processes in the anxiety/arousal/performance relationship is the substantial support provided for the idea of attentional (or peripheral) narrowing. Research has indicated consistent changes in the peripheral acuity of subjects assessed in arousal- and/or anxiety-producing situations. Various studies have indicated a narrowing of attention that occurs in highly stressful environments, resulting in a tunneling effect where peripheral
cues are selectively attenuated from further processing. Using dual task paradigms, results have shown a facilitative effect in the performance of central tasks with a concomitant decrease in performance of peripheral tasks when performed under a state of increased arousal or anxiety. Literature relevant to the attention narrowing idea will be reviewed in the following section.
The first researchers to address the idea of peripheral narrowing in terms of cue utilization were Bahrick, Fitts, and Rankin (1952). Based on the assumptions that anything to which an organism responds is relevant to performance, and that continuously variable information is more important to interpreting a stimulus than are relatively constant sources, Bahrick et al. (1952) hypothesized that perceptual selectivity would be highly dependent upon cues available. They postulated that objects in the peripheral visual field (as well as those aspects of the central task that are relatively unimportant) would tend to be interpreted as less important than those in the central part of the field. Using a tracking task and several intermittent peripheral tasks, they found that when subjects were offered incentives, performance on the central task was superior to performance on peripheral ones. These results were interpreted as suggesting that performance was influenced by the degree of motivation manipulated by the incentives provided.
Easterbrook (1959) produced the most influential article on the topic of cue
utilization based on the findings of Bahrick et al. (1952), and others (e.g., Bruner, Matter, & Papanek, 1955; Callaway & Dernbo, 1958; Callaway & Thompson, 1953). Easterbrook indicated that as the level of arousal increased to a certain point, performance on the
central task was facilitated due to the blocking of irrelevant cues in the periphery from being processed. In contrast, as arousal increased, he suggested that performance on tasks requiring less of a central focus deteriorated due to the blocking of relevant cues. Furthermore, performance on central tasks deteriorated if arousal level reached a state in which the funneling effect prohibited attention to relevant cues that were integral to performance of the central task.
Easterbrook (1959) suggested that the degree of facilitation or disruption caused by emotional arousal is dependent on the range of cues required to perform a task effectively. These ideas were consistent with Woodworth's (1938) concept of a "receptor-effector span", an index of the range of cue utilization. The size of the receptor-effector span is related to the number of possible responses permitted following a stimulus, and the influence of warning time on the ability to prepare responses. Based on the work of Bartlett (1950) and Poulton (1957), Easterbrook suggested that in serial task performance, "the effect of increased foreknowledge is that responses can be made in larger units so that inter-response delay times become covert, inter-response junctions are smoothed, net speed increases, precision improves, and the performance may be better described as better integrated" (p. 186). Therefore, in tasks that require a large range of cue utilization (larger receptor-effector spans), performance will be facilitated with an increase in the amount of advanced preparation allowed. In relatively simpler tasks, however, requiring reduced cue utilization and attention, a surplus in capacity to attend to and process information exists, permitting the processing of (and distraction due to) irrelevant cues (e.g., Porteus, 1956). In accordance with this view, effective execution on
a variety of serial tasks including paced problem solving, mirror drawing, and tracking, has been shown to decrease in anxious subjects as compared to control groups.
Easterbrook (1959) was insistent on the interdependence of perception and
response, based on the premise that a response cannot be made without some type of perception. Similarly, in the absence of a response, it is virtually impossible to determine whether perception occurred. Through this conceptualization, he defined a "cue" as occurring when a singular related response has been made to a percept. Likewise, in highly variable situations containing many cues, a response to a particular cue can be said to have occurred when the response takes the form of the normal response in the absence of other cues. In light of this operationalization of cue meaning, several researchers during the 1950's found that a funneling or reduction of the perceptual field resulted from induced psychosomatic stress (e.g., Callaway & Thompson, 1953; Combs & Taylor, 1952). In most perceptual tasks administered, manipulations causing the range of cue utilization to fall below that required to complete the task resulted in relative decrements in achievement (Eysenck, Granger, & Brengelman, 1957; Granger, 1953). However, it is important to note that the degree of skill deterioration on tasks is highly relevant to task complexity. As Easterbrook wrote,
For any task, then, provided that initially a certain portion of the cues in use are irrelevant cues (that the task demands something less than the total capacity of the organism), the reduction in range will reduce the proportion of irrelevant cues employed and so improve performance. . . . When all relevant cues have been excluded, however, (so that now the task demands the total capacity of the subject), further reduction in the number of cues employed can only affect relevant cues, and proficiency will fall. (p. 193)
One may question the effect of learning on the ability to select and process only the most relevant cues in a display. Support for the idea that overlearning improves the ability to select appropriate cues has been provided by Bruner, Matter, and Papanek (1955). In regard to learning, the question of "What makes a cue relevant or irrelevant?" must be answered. In other words, how does a person know what cues to select and what information will be gained from selection of particular cues? It appears that cue relevance is specified by the amount of information obtained from a cue and the task requirements at hand. Thus, cue utilization is not merely a perceptual idea, but one that is mediated by the "cerebral competence of the subject" (Easterbrook, 1959, p. 196). Consequently, the ability to select and incorporate the most relevant cues while ignoring irrelevant cues is intricately tied to the intellectual competence of the person who must competently complete various tasks.
Easterbrook's (1959) conceptual contribution spurred much work to investigate the mediating factors that influence the degree of peripheral narrowing, and the related facilitation and inhibition in skill level resulting from this condition. The methodologies used and factors investigated are quite varied. As a result, this review, though somewhat comprehensive, cannot account for all studies that have been related to the concept of peripheral narrowing.
Studies concerning the cue utilization theory were prevalent in the 1960's and
1970's and lent credence to Easterbrook's (1959) ideas. Because much of Easterbrook's
theory is related to the conceptualization of arousal as a drive that directs behavior toward the reduction of that drive, many early researchers investigated the cue utilization theory with this underlying theoretical backdrop. For instance, Eysenck and Willett (1962) classified subjects into high- and low-drive categories based on whether or not they had passed their entrance examination into a training school. Those who had passed were classified as high-drive subjects, while those who had not were classified as low-drive subjects. Findings indicated that performance on the Tsai-Partington Numbers Test was significantly greater for subjects characterized by high drive than for those categorized in the low-drive condition. Though not a direct test of the cue utilization hypothesis, the results do suggest poorer performance on this highly visually dependent task by those at presumably lower drive levels.
A direct examination of the cue utilization theory was conducted by Agnew and Agnew (1963), who used two different tasks, the Porteus maze and the Stroop Color-Word Interference test. Investigated was whether tasks which demand differing levels of attentional span would be affected differentially by increasing and decreasing stress levels as manipulated through electric shock. Success in the Porteus maze task, one that requires a wide range of cue utilization, was detrimentally influenced by electric shock. However, proficiency in the Stroop Color-Word test, requiring a narrower range of cue utilization, was facilitated by increased levels of arousal. These results provided substantial evidence for the validity of the cue utilization hypothesis.
A similar study was conducted by Tecce and Happ (1964) in which performance on a card sorting task and the Stroop Color-Word Interference test was assessed while
stress levels were manipulated through electric shock. In this way, both relevant and irrelevant stimuli were presented that would be expected to impede performance of the central sorting task. Findings similar to those of Agnew and Agnew (1963) were obtained, in that the shocked subjects performed better on the card sorting task than did a no-shock control group.
Another early study in which the cue utilization hypothesis was examined
incorporated a state measure of anxiety as assessed by the Taylor Manifest Anxiety Scale (TMAS; Zaffy & Bruning, 1966). Those participants who scored in the upper and lower 20% of the distribution of TMAS scores were selected for the study. The task consisted of learning 19 multiple-choice items, each with 5 choices. With the presentation of each item, either a relevant cue, an irrelevant cue, or no cue was presented. Findings showed that the low-anxiety subjects performed worse than the high-anxiety subjects, responding to both relevant and irrelevant cues, while the high-anxiety subjects responded to only the relevant cues, ignoring the irrelevant ones. Follow-up experiments using the same task as Zaffy and Bruning (1966) but reducing the items from 19 to 15 and increasing the choices from 5 to 7 provided similar results (Bruning, Capage, Kozuh, Young, & Young, 1968).
In their first experiment, Bruning et al. (1968) manipulated anxiety through the
presence or absence of the test administrator, while in the second experiment, anxiety was manipulated by feedback regarding the subject's success or failure. Results in the first experiment replicated the findings of Zaffy and Bruning (1966). However, in Experiment 2, it was determined that the high-drive subjects were superior in the irrelevant cue condition
while the low-drive subjects were superior in the relevant cue condition. Because subjects responded differently based on their anxiety disposition, the results do provide some support for the attention narrowing idea.
Wachtel (1968) also conducted a study to examine the cue utilization hypothesis. However, his goal was to determine whether cue utilization tendencies could be altered by offering participants a means of coping with the anxiety. The tasks consisted of a central continuous tracking task performed while identifying a random presentation of peripheral lights. Performance was based on a combined score of accuracy on the pursuit rotor task and reaction time to the peripheral lights. Three groups were tested: a control group; a second group told that it would receive random shocks that were independent of performance; and a third group told that the longer it went without a shock, the stronger the shock would be, but that it would not be shocked as long as sufficient achievement was demonstrated. Results indicated that the second group, which had no means of avoiding the shock, reacted more slowly to the peripheral stimulus than groups 1 and 3, suggesting that proficiency was impaired under the threat of electric shock, but not if the subjects had a means of escaping it. Thus, once again, it appears that stress affected peripheral task performance while facilitating central task performance.
Hockey (1970) tested Easterbrook's ideas based on the notion that the differential selectivity effect observed between central and peripheral tasks is based not on the actual location of the stimuli but rather on the allocation of priorities to the two tasks. He postulated that the high subjective probability that relevant signals will occur in the central field predisposes subjects to focus attentional scanning on the primary task.
Using the manipulation of noise levels on central and peripheral tasks in which he ensured that all signals were detected (making objective and subjective probabilities identical), Hockey (1970) hypothesized that if a probability mechanism (priority allocation) was working, a greater facilitation of central detections in noise would occur only when the signal distribution was biased toward the center of the display. Attentional changes due to noise were inferred from the latency of response to central and peripheral locations. Support for the probability hypothesis was found: response latency was faster when signals were biased toward the center of the visual field, but not when the signal was equally likely to be presented centrally or peripherally. This suggests that the funneling which occurs is, in part, a function of the higher probability of relevant cues occurring in the central area, rather than a function of the spatial location of the signal.
Bacon (1974), using a signal detection approach (Green & Swets, 1966), assessed the nature of stimulus loss by hypothesizing that there is not necessarily a loss of perceptual sensitivity to peripheral or irrelevant stimuli, but rather a shift in the subjective decision criterion for responding to peripheral cues. Due to the inconsistencies reported regarding whether performance on central tasks is enhanced or diminished, Bacon suggested that cues that initially attract less attention will receive even less attention, while those that occupy the primary focus of attention will attract an even higher degree of attentional processing.
Using a dual task paradigm, Bacon's (1974) results supported Easterbrook's
(1959) hypothesis in that the increase in arousal (induced through electric shock) caused a funneling of attention toward central areas and away from the periphery. More pertinent
to the hypotheses tested, however, it was determined that the decrease in attention devoted to the periphery was, in contrast to the expected result, due to a decrease in sensitivity rather than a shift in the subject's criterion for responding. Furthermore, because subjects could not attend to both tasks as well in the aroused condition, the capacity limitation ideas of Easterbrook (1959) were also supported.
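The sensitivity-versus-criterion distinction at the heart of Bacon's analysis can be made concrete with the standard signal detection indices of Green and Swets (1966). The sketch below is illustrative only: the hit and false-alarm rates are hypothetical values, not data from Bacon (1974). A drop in d' with an unchanged criterion is the pattern Bacon ultimately observed.

```python
from statistics import NormalDist

def dprime_and_criterion(hit_rate, fa_rate):
    """Standard signal detection indices (Green & Swets, 1966):
    d' indexes perceptual sensitivity; c indexes the response criterion.
    A lower d' means peripheral signals are literally harder to detect;
    a higher c means the observer merely became more conservative."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# Hypothetical rates for illustration: under arousal, peripheral hits
# drop while false alarms stay constant -> sensitivity loss, not a
# criterion shift.
calm = dprime_and_criterion(0.90, 0.10)
aroused = dprime_and_criterion(0.70, 0.10)
print("calm d' = %.2f, c = %.2f" % calm)
print("aroused d' = %.2f, c = %.2f" % aroused)
```

With these hypothetical numbers, d' falls from about 2.56 to about 1.81 under arousal, illustrating the reported decrease in peripheral sensitivity.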
Though obviously laboratory-based and basic in nature, these early studies
established significant support for the attention narrowing idea. Eventually, these ideas were tested in more applied arenas. The less controlled studies and observations which will be summarized in the next section provided practical evidence for the viability of the attention narrowing idea in actual stress-producing environments.
Applications of Peripheral Narrowing Research
Baddeley (1972) reviewed both anecdotal and empirical evidence of peripheral narrowing in "dangerous environments". Citing examples from military combat observations, Baddeley (1972) provided substantial evidence of the impact of perceptual narrowing on real-world situations. For example, he found that in the heat of battle soldiers use their rifles much less efficiently than in training, the ratio of errors to hits in combat increases, and the tonnage of bombs needed to destroy a target increases. These anecdotal reports each indicate a deterioration of the ability to use the most relevant cues in dangerous or stressful environments.
Weltman and Egstrom (1966) and Weltman, Smith, and Egstrom (1971) applied the idea of peripheral narrowing to a deep-sea diving environment under differing conditions of stress. The experimental conditions consisted of surface testing, shallow
diving in an enclosed tank, diving at ocean depths of 20-25 feet (6-8 m), and simulated decompression dives in a pressure chamber. In general, they found that across conditions, performance was maintained on a centrally located monitoring task, but as stress level increased (i.e., the divers descended to more dangerous depths), attention to peripherally located light stimuli deteriorated. Though intriguing, these results may be contaminated by other extraneous factors such as the increase of nitrogen levels in the blood stream.
Surprisingly, relatively few investigations have been undertaken in sport settings to examine the effects of peripheral narrowing. Landers, Wang, and Courtet (1985) investigated peripheral narrowing with experienced and inexperienced rifle shooters. The central task was a target shooting task while the peripheral task was an auditory detection task. Although there were no differences found in secondary task performance between the experienced and inexperienced shooters, they did find that under high stress conditions, both groups shot worse.
As to other sport situations, two studies were conducted by Williams, Tonymon, and Andersen (1990, 1991) to help substantiate Andersen and Williams' (1988) model of athletic injury. In the model, Andersen and Williams (1988) indicate that a possible predisposition to athletic injuries may be precipitated by elevated levels of life stress, resulting in an inability to attend to peripheral stimuli. Support for this possibility was found in the two studies designed to test the model in which Williams, Tonymon, and Andersen (1990, 1991) found significant decrements in detection of peripheral cues while performing Stroop tasks under stressful conditions.
The work done with regard to attentional narrowing in the sport context was
reviewed by Landers (1980). In the review, the Inverted-U in sports was explained using Easterbrook's cue utilization hypothesis. In sport, as with other domains, Landers suggested that performance is proportional to the number of cues utilized. At low arousal levels, there is a surplus of cues, including irrelevant cues that must be dealt with. With increasing anxiety levels, irrelevant stimuli are eliminated before relevant ones. Therefore, according to Landers, perhaps there is a bi-directional, reciprocal causality between arousal and performance in sport. Other theoretical proposals have been forwarded to account for the narrowing phenomenon. These will be reviewed in the next section.
Theoretical Explanations for Peripheral Narrowing
Many theories have been forwarded to explain the consistent reduction in cue utilization during performance of tasks in stressful environments. Easterbrook (1959) proposes that if intensity cannot be discriminated between stimuli, a reduction in the employment of cues results. The reduction in the range of cue utilization can also be explained in the context of both Hull's (1943) Drive theory and the Yerkes-Dodson (1908) Inverted-U theory. In the Hullian sense, an increase in arousal (or drive) increases the stimulus generalization of a particular stimulus, resulting in the application of a trained response to stimuli other than the one of interest. In the Yerkes-Dodson argument, as arousal increases, some cues lose their ability to evoke the proper response, hence increased arousal, to a point, will be beneficial, after which decrements will result.
Easterbrook (1959) also implies that the cue utilization hypothesis fits nicely into Broadbent's (1957) idea of the single channel hypothesis of attentional capacity. Though
less popular than current theories of attention, Broadbent's notion that there exists a single limited-capacity channel whose use affects processing capabilities elsewhere in the system accommodated the cue utilization hypothesis effectively.
However, the idea can also be supported in the context of more recent
capacity/resource models such as those proposed by Kahneman (1973) or Wickens (1984). These theories, though they differ with regard to the number of resource pools available, suggest a limit in the resources accessible to attain optimal attention as determined by priorities. In line with this view, one primary feature of high arousal levels is a narrowing of attention, because the allocation policy is likely to shift away from the periphery and toward the central area. This allocation policy view is also consistent with the probability results obtained by Hockey (1970).
In summary, it appears as though arousal tends to overload the system, narrowing the range of stimuli that are processed by impairing the memory traces of stimuli of lesser importance, such that processing can continue to be devoted to the more central cues. Narrowing could be due to impairment both at the perceptual stage of processing and at the short-term memory stage. However, the exact locus of impairment has not been clearly identified.
An idea that consistently recurs as an explanation for performance changes in both central and peripheral tasks is a narrowing of the attention beam in which irrelevant cues are somehow filtered from processing, either in the perceptual or encoding stage of analysis. However, virtually no one has assessed the impact of distractors in this context,
and the concept of distraction has received very little attention from sport psychology and cognitive psychology researchers, in general. It seems logical, however, that the central task proficiency decrements that eventually occur as stress levels increase could also be explained in the context of distraction.
The lack of research directed toward understanding distraction is surprising
considering the imperative need to ignore distractors and focus only on the most critical cues in any performance situation. It is also surprising considering that the concept of distraction was actually addressed by William James as early as 1890. Though many of the ideas of James are being empirically investigated even at the end of the 20th century, distraction continues to be a virtually untapped area of research on attention. Meanwhile, examples of athletes and other performers who have been victimized by distraction are numerous (Moran, 1996). The need to avoid distraction has prompted leading sport psychologists such as Orlick (1990) to suggest that it is one of the most important mental skills required to be successful in sport. Perhaps this is why virtually all mental training skills programs developed by sport psychologists are directed toward maintaining concentration on the task and appropriate cues. Interfering thoughts need to be regulated and irrelevant stimuli ignored.
Brown (1993; as cited by Moran, 1996) defines distraction as situations, events, and circumstances which divert attention from some intended train of thought or from some desired course of action. This definition is somewhat different from James' (1890) original conceptualization of distraction which was more directed toward the description of distracting thoughts and being "scatter-brained". Each of these views of distraction can
be more easily understood if categorized in the context of internal and external types of distractors (Moran, 1996). Internal distractors refer to mental processes that interfere with the ability to maintain attention, while external distractors are environmental or situational factors that divert attention from the task at hand. Each of the two types of distraction leads to a wandering of attention, which Wegner (1994) has suggested is "not just the weakness of the will in the face of absorbing environmental stimulation ... but rather it is compelled somehow, even required, by the architecture of the mind" (p. 3). Wegner (1994) has postulated that because the mind tends to wander, there is an attempt to hold it in place by repeatedly checking to see whether it has wandered or not. Unfortunately, this results in a Catch-22, because in evaluating, attentional focus is inadvertently drawn to the exact thing that one is trying to ignore. He also suggests that when one is highly emotional, attentional resources are reduced and the mind is inclined not only to wander away from where it should be attending, but also toward that which one is attempting to ignore.
Effects of distraction. Obviously, the typical effect of distraction is a decrease in performance effectiveness. The most plausible explanation for the decrease in performance when distracted by either external or internal factors is the decrease in available attentional resources for processing relevant cues. This idea is consistent with the limited capacity models of attentional resources proposed in different forms by various attention theorists (e.g., Allport, 1989; Kahneman, 1973; Shiffrin & Schneider, 1977). Because attentional capacity is limited, resources directed toward the processing of distractors reduce the resources available for the processing of task-relevant information.
This idea is supported by studies which have shown that distraction effects are greater for complex than for simple tasks, and that distraction effects are greater as the similarity of distractors to relevant cues increases (Graydon & Eysenck, 1989). As tasks become more complex and distractor similarity increases, the attentional resources needed also increase due to a reduction in the automaticity of cue discrimination. Thus, any increase in distractibility will inevitably reduce the attentional capacity available for the primary task.
Distraction and stress. Though empirical evidence is scarce, many researchers have suggested that increases in emotionality as embodied by stress and the various components that make up stress (i.e., anxiety, worry, arousal) increase susceptibility to distraction. Emotional stress would be classified as an internal distractor as it does not exist except in the mind of the performer; but often internal distraction is caused by the erroneous perception of an external distractor (Anshel, 1995). Numerous examples to support the notion that stress impedes performance due to distraction can be found in verbal accounts and behavioral observations of "choking" in competitive environments. Moran (1994, 1996) provides substantial anecdotal evidence that the impact of anxiety is the absorption of attentional resources which could otherwise be directed toward the relevant task. Baumeister and Showers (1986) indicate that increased worry causes attentional resources to be devoted to task irrelevant cues while self-awareness theorists such as Masters (1992) suggest that under stress, not only is attention absorbed by irrelevant stimuli, but also the performance of normally automated skills becomes less automated as resources begin to be intentionally directed toward the process of the once-
automated movement. Self-awareness, then, interrupts the normally fluid mechanics of the movement and inevitably decreases performance. Finally, Eysenck (1992) has provided empirical evidence that anxiety provokes people to detect stimuli which they fear, usually stimuli that divert them from attending to relevant information.
Paradoxically, it appears that there are two equally attractive explanations for the decrease in performance that occurs under high levels of stress. On one hand, proponents of the attention narrowing argument would suggest that under high stress levels (whether anxiety or arousal induced), the visual field narrows to block out irrelevant information, and subsequently relevant information as stress continues to increase. On the other hand, proponents of the distraction argument would suggest that a widening of the attention field actually occurs, such that irrelevant or distracting cues receive more attention than when under lower stress levels. Evidently, a controversy exists unless, in some way, both mechanisms are working at the same time. Perhaps an increase in anxiety and/or arousal results in a narrowing of the attention field while at the same time, especially at higher levels of stress, increasing susceptibility to distraction. Thus, many theories can account for how stress affects attention and the eventual impact of attentional variation on performance, but none addresses specifically why this phenomenon occurs. By briefly examining research in visual attention, perhaps some clues as to what exactly is happening in these contexts may be surmised.
It has long been known that there is a direct relationship between human
performance capabilities and the informational load as well as the response demands
associated with a particular task (Fitts & Posner, 1967; Hick, 1952; Hyman, 1953). That is, as the level of response uncertainty (informational load) increases, so too does reaction time (RT). More importantly, laboratory research tends to indicate that RT to a single unanticipated visual stimulus is in the order of 180-220 ms, with this delay composed of latencies associated with stimulus detection, response preparation, and neural and muscular activity associated with a simple key press (e.g., Wood, 1983). Given these latencies, there is an apparent discrepancy between the obvious time constraints imposed by complex situations (those dominated by heightened levels of response uncertainty) and the ability of elite performers to routinely select and execute the most appropriate motor response.
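The relationship between informational load and RT described above is commonly summarized by the Hick-Hyman law, in which choice RT grows with the logarithm of the number of equally likely stimulus-response alternatives. The sketch below uses illustrative constants, not values fitted to any of the cited studies.

```python
import math

def predicted_rt(n_alternatives, a=0.2, b=0.15):
    """Hick-Hyman law: choice RT (seconds) grows with the log of the
    number of equally likely alternatives. The intercept a and slope b
    are illustrative placeholders; real values are fit per task."""
    return a + b * math.log2(n_alternatives)

# RT rises as response uncertainty (informational load) increases.
for n in (1, 2, 4, 8):
    print(f"{n} alternatives -> {predicted_rt(n) * 1000:.0f} ms")
```

Note how a simple, fully anticipated response (one alternative) falls near the 200 ms simple-RT figure cited above, while each doubling of the alternatives adds a constant increment, capturing the time pressure faced in complex, high-uncertainty situations.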
Hardware vs. Software Approaches
In an attempt to understand this paradox, researchers have forwarded two
competing explanations. The first approach posits that expert performers differ from novices in that they possess advanced psychophysical and mechanical properties of the central nervous system (Abernethy, 1991; Burke, 1972). That is, proponents of this theory believe that experts have much faster overall RTs (simple, choice, and correction times) than do novices, and also possess greater optometric (static, dynamic, and mesopic acuity) and perimetric (horizontal and vertical peripheral range) attributes. In accord with the notion that humans are somewhat genetically programmed to possess these qualities, this perspective has been termed the "hardware" approach to expertise.
Support for the hardware approach, however, has been very limited. Studies by Helsen (1994), McLeod (1987), Starkes (1987), and Starkes and Deakin (1984), in which
expert and novice athletes were compared on a number of laboratory tasks involving visual mechanisms (depth perception, static visual acuity) and processing abilities (simple and choice reaction time tasks) demonstrated no significant differences between the two groups. Thus, it appears as though expertise cannot be explained by a CNS advantage on the part of the expert.
In contrast to the hardware theory of expertise, proponents of the "software" approach argue that experts have a much greater knowledge base of information pertaining to their particular area of expertise. Differences between expert and novice performance are thought to be the result of a cognitive advantage rather than a physical advantage. For example, it is believed that expert athletes make faster and more appropriate decisions based on having acquired selective attention, anticipation, and pattern recognition strategies associated with their sport (Abernethy, 1991). That is, experts learn which cues to focus their attention on in their sport environment, and develop an understanding of the importance of these cues in predicting the nature of future sport-related stimuli.
Support for the software approach to expertise has been repeatedly demonstrated in studies assessing decision time and accuracy responses for sport-specific situations (Bard & Fleury, 1976; Starkes, 1987). The same is true for the recognition and recall of structured elements of game situations in sports such as baseball (Hyllegard, 1991; Shank & Haywood, 1987), basketball (Allard & Burnett, 1985; Bard & Fleury, 1981), field hockey (Starkes, 1987), and volleyball (Borgeaud & Abernethy, 1987). Given the vast support for the software approach, the rest of this section will describe the cognitive
elements of visual search that provide a better understanding of visual selective attention capabilities.
Visual Selective Attention
Theeuwes (1994) has defined selective attention as "the process of selecting part of simultaneous sources of information by enhancing aspects of some stimuli and suppressing information from others" (p. 94). Visual selective attention theorists are in agreement that selection is primarily a two-stage process: a preattentive stage and an attentive stage. The preattentive stage is thought to be unlimited in capacity and occurs in parallel across the visual display. Conversely, the attentive stage is capacity limited and serial in nature. Preattentive parallel search has been supported by the finding that in simple search tasks, a flat function exists relating RT to the number of non-target items in the display (e.g., Egeth, Jonides, & Wall, 1972; Neisser, Novick, & Lazar, 1963). This flat function has been regarded as a pop-out effect (i.e., the target pops out of the display regardless of the number of non-target items) and gives support to the notion that operations are carried out in a spatially parallel manner. Thus, the three properties of preattentive search are unlimited capacity, independence of strategic control (exogenous, stimulus driven), and spatial parallelism across locations. Attentive search is characterized by functions that show a linear increase in RT as the number of non-target items increases. It is serial in nature, usually found in tasks with specific arrangements and in conjunctive search, and is probably capacity limited.
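The flat versus linear search functions that distinguish the two stages can be sketched as idealized RT predictions. The intercept and slope values below are illustrative placeholders, not estimates from any of the cited studies.

```python
def mean_rt(set_size, preattentive, base=0.45, slope=0.04):
    """Idealized visual search functions (illustrative parameters).
    Preattentive ('pop-out') search: RT is flat across set size.
    Attentive (serial) search: RT grows linearly with each added
    non-target item, reflecting item-by-item inspection."""
    if preattentive:
        return base
    return base + slope * (set_size - 1)

# Compare predicted RTs (seconds) as display size grows.
for n in (2, 8, 16):
    print(f"set size {n}: pop-out {mean_rt(n, True):.2f}s, "
          f"serial {mean_rt(n, False):.2f}s")
```

The contrast between the constant pop-out RT and the steadily climbing serial RT is exactly the diagnostic used to classify a search task as preattentive or attentive.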
The specific nature of the attentive stage of visual search has been hotly debated by theorists who favor the concept of a late selection approach versus those who favor early
selection. In regard to the notion that the attentive stage is limited in capacity, disagreement exists over where the capacity is limited. Early selection theorists (e.g., Theeuwes, 1994; Treisman, 1988; Treisman & Gelade, 1980; Treisman & Sato, 1990) suggest that perceptual operations can be performed during the attentional stage that cannot be handled by the preattentive stage. Conversely, late selection theorists (e.g., Allport, 1980; Duncan, 1980; Duncan & Humphreys, 1989) say that during the attentive stage, no perceptual operations are completed. Rather, they propose that during the attentive stage, selection occurs among the competing response tendencies elicited by the multiple stimuli.
The idea of a limited spatial location property of attentive search has also been at issue. Specifically, early selection theorists have suggested that there is serial inspection of each item, a notion that is in line with several metaphors that have been forwarded to describe visual selective attention, such as the spotlight (Posner, 1980; Treisman, 1988) and the zoom lens (Treisman & Gormican, 1988), which will be described later. The late selection theorists, on the other hand, do not allocate a special role to spatial attention.
Different types of search tasks have been used in an attempt to better understand the covert processes that distinguish the two stages and elements of the stages. The most popular of these tasks have been those characterized as primitive features and conjunctive features. In searches involving primitive features, Treisman and her colleagues (e.g., Treisman, 1988; Treisman & Gelade, 1980) have provided an abundance of evidence that these tasks can be carried out preattentively, exhibiting flat search functions which are the result of the popping-out of the most significant features. In these types of tasks,
information does not need to be passed to the second stage because it is automatically selected and there are no attention limitations.
As mentioned, a special role for spatial attention has been advocated by those in
the early selection camp (e.g., Broadbent, 1982; Hoffman, 1986) to account for findings in which items with unique attributes have not been shown to pop out when they were irrelevant to the task. This notion also contradicts the idea that top-down control maintains gaze until it comes close to a conspicuous object, and then bottom-up control takes over (e.g., Engel, 1977). Thus, spatial attention may not strictly adhere to the constraints of other types of primitive search tasks. These concepts provide support for the zoom lens metaphor of spatial attention, in that people may intentionally vary the distribution of attention in the visual field (Eriksen & Yeh, 1985). In this case, search for the target proceeds serially, obviating the need for the preattentive stage. This is in line with a series of studies by Eriksen and his colleagues in which it was shown that non-target items may have a detrimental effect if they are spatially close to the target but have no effect when they are further away. As will be seen later, the idea that attentive search is serial is also an important factor in being able to infer that the line of sight coincides with attention.
Conjunctive feature search tends to show a linearly increasing relationship between the number of different features in the task, and whether the target is absent or present in the display. According to the early selection account, the reason this occurs is due to the need for serial search rather than parallel operations only. However, under certain circumstances such as relatively large displays or search for some particular attributes (depth, movement), search functions become relatively flat (Pashler, 1987; Wolfe, Cave, &
Franzel, 1989). These results can all be accounted for, however, by the revised feature integration theory (FIT), which incorporates some top-down mechanisms in conjunctive search so that non-targets (even though conjunctive) which are very dissimilar to the target do not have the same probability of entering the attentive stage as do those that are similar.
Stages of visual search. As mentioned, the visual search process consists of two distinct stages (Jonides, 1981). The first of these, the preattentive stage, involves unlimited capacity in which visual information from sensory receptors is held in a rapidly decaying visual sensory store. The literal representation of this briefly held information is labeled "the icon" (Neisser, 1967). This stage of visual search is thought to be automatic, with parallel processing of information, and demonstrates crude feature analysis or detection.
The second stage of visual search, termed the focal or attention-demanding stage, refers to the process through which selected items in the iconic store are subjected to a more detailed analysis (Jonides, 1981; Remington & Pierce, 1984; Yarbus, 1967). The concept of selective attention in this context focuses on the determination and passage of specific icons from the preattentive stage to the focal stage. It is in this focal stage that only those cues (icons) in the sport environment that are deemed pertinent will be attended to and used by the athlete.
The process of selecting and processing information from only specific aspects of an entire visual display entails both overt visual orienting and covert mechanisms that occur during eye fixations. Overt visual orienting includes the movement of the eyes and head to focus on a particular spatial location. Both top-down (cognitively driven) and
bottom-up (stimulus driven) processes control the 'macrostructure' of the scanpath (Levy-Schoen, 1981), or where the visual receptors are focused. Covert orienting mechanisms are unseen processes that occur within the attention-allocation resources of the brain and are also influenced by both top-down and bottom-up control (e.g., Posner & Cohen, 1984).
Temporal aspects. Though covert orienting mechanisms are, by definition, hidden, studies of covert measures of visual orienting have been reported for the past 20 years based on the cost-benefit paradigm developed by Posner and Snyder (1975) and Posner (1978, 1980) to investigate mental chronometry, the time course of information processing. Much work in this area led to the conclusion that reaction time decreases when a cue gives the perceiver a head start in shifting attention to the target's location. However, questions arose regarding whether the effect of location cueing reflects changes in perceptual sensitivity or changes in the observer's response criterion. Using signal detection theory (SDT) paradigms, results have indicated that the benefit occurs mainly through a change in perceptual sensitivity (e.g., Downing, 1988). These results have been further substantiated by overt measures of mental chronometry.
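The arithmetic of the cost-benefit paradigm can be summarized in a few lines. The mean RTs below are hypothetical, chosen only to illustrate how costs and benefits are computed relative to a neutral cue:

```python
def cost_benefit(valid_rt, neutral_rt, invalid_rt):
    """Benefit and cost (in ms) of a location cue relative to a neutral cue:
    benefit = neutral - valid; cost = invalid - neutral."""
    return {"benefit": neutral_rt - valid_rt, "cost": invalid_rt - neutral_rt}

# Hypothetical mean RTs (ms), for illustration only.
effects = cost_benefit(valid_rt=250, neutral_rt=280, invalid_rt=320)
# effects == {"benefit": 30, "cost": 40}
```

A benefit larger than zero indicates that a valid cue speeded responding, while a cost larger than zero indicates that an invalid cue slowed it; the debate noted above concerns whether these differences reflect sensitivity or criterion changes.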
Specifically, Saitoh and Okazaki (1990) examined the temporal structure of visual processing during a digit string search and matching task in an effort to decompose the stages of reaction time. The time used to encode and memorize the standard digit string increased linearly with each addition to the digit string. Also, it was found that overall visual search time and RT were associated more with the number of eye fixations than with the duration of the fixations. This provides support for the idea
that each shift of eye fixation is accompanied by a shift in visual attention, and that the chronometry of information processing can therefore be measured through the study of eye movements.
Though the results obtained by Saitoh and Okazaki (1990) are encouraging, many questions have been raised regarding the ability to infer visual attention shifts from eye movements (Klein, 1994; Viviani, 1990). Attempts to clarify this issue have typically involved determining whether attentional shifts can be made without concomitant eye movements to the attended location. As mentioned, when highly salient aspects of the display exist, stimulus driven (bottom-up) control takes over (Engel, 1971, 1974, 1977). Cognitive control (top-down) of the scanpath is most evident when a particular aspect of the display is of interest. Goal driven visual search strategies are produced on the basis of cognitive control, while stimulus driven responses appear to be elicited by the stimuli themselves and take on the properties of reflexive shifts to the visual field (Yantis & Jonides, 1984). Most research has indicated that while there appears to be a close relationship between stimulus driven saccades and attention shifts, less convincing evidence exists for the validity of inferring attention shifts from goal driven initiation.
Research indicates that in the case of stimulus driven saccades, the shift of
attention occurs before the initiation of the saccade (Wright & Ward, 1994). In their work on express saccades, Fischer and Weber (1993) showed that attention must first be disengaged from the fixation point at the origin prior to target onset. Posner accounted for these criticisms through an elaborated account of the disengage, shift, reengage sequence, which is probably mediated by activity in the posterior parietal cortex, the
superior colliculus, and the pulvinar region of the thalamus (Posner, Petersen, Fox, & Raichle, 1988). However, even in this account, most data were gathered from stimulus driven rather than goal driven attentional shifts. An in-depth discussion of the benefits and criticisms of inferring attentional processing from eye-movement recording devices will be provided in the following sections of the review.

Metaphors of Visual Attention
Though overt mechanisms of visual selective attention are relatively simple to
observe, covert attentional shifts are much more difficult to ascertain. As a result, much debate surrounds the ability to infer cognitive processing from overt observations. Because the association between line of fixation and attentional processing cannot be described precisely, several models have been proposed to account for the psychological mechanisms underlying attentional shifts. First, movement models suggest that the focus of attention is shifted from one location to another in an analog or discrete manner (the spotlight metaphor; e.g., Posner, 1980). Another popular metaphor is the focusing model, which suggests that attentional focus can change from a broader, more diffuse state to a finer, more concentrated state at the destination of the shift (the zoom lens idea; Eriksen & St. James, 1986). Finally, resource distribution models postulate an attentional alignment process that involves neither a movement nor a focusing component (LaBerge & Brown, 1989). Investigations of each of these models have provided supporting data. However, as will be addressed later, Wright and Ward (1994) suggest that many of the discrepancies stem from the use of a variety of experimental paradigms, tasks, and cueing mechanisms.
The primary question that arises from the debate is whether or not the line of sight is independent of selective attention shifts. The evidence described so far in reference to the stages of processing has been gleaned primarily from studies in which line of sight is inferred from RT and other indirect measures of fixation location. However, much research has been completed with eye-movement recording devices to determine precisely when and where attention shifts during information processing of visual stimuli.

Eye-Movement Recording
The ability to infer attention shifts from eye movements was first investigated by Helmholtz in the 19th century, when he discovered that he could shift his attention to illuminated letters before his point of gaze actually moved there (the latency of a normal saccade is approximately 220 ms; Fischer & Weber, 1993). James (1890) described attention shifts as being under involuntary or voluntary control, which was the genesis for the study of exogenous (bottom-up) versus endogenous (top-down) processing. However, much research in the area was not possible until the 1970s, with the advent of sophisticated eye monitoring equipment. Even with the additional data acquired through eye-movement recording devices, researchers have been unable to provide indisputable evidence for the notion that the line of sight coincides with the line of attention.
While the visual search paradigm would appear to be a fruitful means of assessing selective attention strategies, it is not without criticism. Before concluding this section on visual search, it is necessary to discuss some of the limitations and potential problems that currently exist in eye movement recording research. These concerns are reflected in both
the assumptions of selective visual attention theory and the eye movement recording techniques themselves (Abernethy, 1988; Viviani, 1990).
According to Abernethy (1988), the first major limitation of eye movement recording lies in the assumption that visual search orientation reflects the actual allocation of attention; that is, that visual fixation and attention are one and the same (where one looks is where one attends). This notion, however, has been refuted by Remington (1980) and Remington and Pierce (1984), who demonstrated that attention can be allocated to areas other than the foveal fixation point. Indeed, attention can be allocated to areas in peripheral vision, a mode that cannot be measured with current visual search equipment (Buckholz, Martinelli, & Hewey, 1993; Davids, 1987).
A second limitation of current visual search recording involves the high trial-to-trial variability that is evident in the literature (Abernethy, 1988). These variable patterns make reliable conclusions about the relevance of specific visual cues difficult. Related to this limitation is the fact that the majority of studies include relatively small samples (often n = 6 or 8), raising internal and external validity concerns.
A third, and perhaps most important, limitation of eye movement recording
focuses on the issue of visual orientation and information pick-up. As Abernethy (1988) notes, merely "looking" at visual information does not necessarily equate with "seeing" (or comprehending) that information. Thus, a person may fixate upon pertinent cues in the visual array, but there is no guarantee that he or she is actually attending to or utilizing those cues. To determine empirically whether one is actually "picking up" and using the cues available in the visual field, the technique of cue occlusion has been used.