Hemispheric Organization in the Deaf: A Comparative Study

by Hope Elaine Harris

Thesis (Ph.D.)--University of Florida, 1982.
Physical description: vi, 95 leaves : ill. ; 28 cm.
Bibliography: leaves 92-94.
Subjects/Keywords: Deafness; Brain -- Localization of functions; Dissertations, Academic -- Clinical Psychology -- UF.
Source institution: University of Florida.
Resource identifiers: aleph 000318247; oclc 08943565; notis ABU5080.








I would like to thank the administration, staff and students of Bradford High School and the Florida School for the Deaf and Blind, whose cooperation and effort made this research possible. In addition, I would like to express my sincere appreciation to the members of my committee, not only for their contributions to this research, but for their instruction and guidance throughout my graduate studies. I want to especially thank Dr. Eileen Fennell, both for her patience and for the insight and support she supplied at critical moments during this project. Special thanks go to the Biostatistics Department of the University of Florida for their statistical expertise; to Mary Ann Petska for her lifelike sketches of the ASL and non-ASL handshapes; and to Lois Rudloff for her excellent preparation of this manuscript. Particularly, I want to thank my sister, Laura Cocks, for the time and energy she invested by serving as experimenter for this research and for the love and confidence she freely gave me. Most of all, I want to express my love and gratitude to my mother, sisters, family and special friends for their unconditional love, encouragement, and support, providing me with an environment in which I was free to learn, to make mistakes, and to grow.




ABSTRACT
Lateralization in the Deaf: Literature Review
Current Research: Rationale and Objectives
Subjects
Stimuli
Apparatus and Procedure
Statistical Analysis
Experiment 1: Shapes
Experiment 2: Line Orientations
Experiment 3: ASL Handshapes
Experiment 4: Non-ASL Handshapes
Individual's Lateral Directional Preference
Group Effects
Hemispheric Organization Effects
Comparison of Present Findings with Past Research
Conclusion




Abstract of Dissertation Presented to the Graduate
Council of the University of Florida in Partial
Fulfillment of the Requirements for the Degree
of Doctor of Philosophy

HEMISPHERIC ORGANIZATION IN THE DEAF: A COMPARATIVE STUDY

By Hope Elaine Harris

May, 1982

Chairman: Eileen B. Fennell
Major Department: Clinical Psychology

Recent research has focused on the hemispheric organization of the deaf, and particularly their lateral organization for language, in hopes of understanding how laterality develops. This research, however, has been confounded by methodological and sampling errors and has failed to control for early language training. The present research compared the hemispheric organization of individuals in a Hearing group (10 males, 10 females), an Oral group (10 males, 10 females), and a Sign group (9 males, 9 females) on four separate tasks: tactile exploration of shapes and line orientations, and tachistoscopic presentation of American Sign Language (ASL) handshapes and non-ASL handshapes.

Results of the experiment demonstrated no predicted field x group interaction. Instead, all groups appeared to use similar processing strategies for the tasks. What was observed, however, were differential laterality effects dependent upon sex, side of stimulus input, and side of response output. Specifically, for lines and shapes, males responded initially with a right hemisphere, visuospatial strategy, but switched to a left hemisphere, verbal strategy with practice. Females, who initially preferred a verbal strategy and performed poorly with right hemisphere processing, improved with right hemisphere processing with practice. For the ASL handshapes, no visual half-field effects were found. The Sign group, however, was more accurate than the Oral group, which in turn was more accurate than the Hearing group. This finding both validates the composition of the deaf groups and suggests that early language training does affect the later accuracy with which the deaf groups process signs. Finally, all three groups demonstrated a right hemisphere advantage for the non-ASL handshapes, with both deaf groups being significantly more accurate than the Hearing group.

It appears that, in terms of the way the brain is organized for certain tasks, it matters more whether one is male or female than whether one is deaf or hearing. Also, no task is inherently a left or right hemisphere task, and hemispheric processing may be manipulated depending on the demands of the task. Finally, it is important to consider both hemispheric input and output when investigating differences in hemispheric organization, as these interact with gender to determine laterality differences.


One prominent focus of investigation within the past few years has been on understanding the presence and development of cerebral lateralization in man. It now appears that most normal, right-handed hearing individuals are lateralized for language in the left hemisphere (along with analytic and sequential processing in general), while the right hemisphere is reserved for tasks requiring spatial, holistic and parallel processing (Bever & Chiarello, 1974; Ross, Pergament, & Anisfeld, 1979). (Lateralization of left-handed individuals is a much different and more complicated story.) Before the use of tachistoscopes in visual half-field experiments, much of our knowledge of the laterality of language and other processes was based on work with patients suffering from unilateral brain lesions. With this population, it was difficult to determine whether impairments in ability were due to the absence of necessary cortical areas (which would document laterality) or to misconnections formed during the brain's recovery from the lesion. Use of the tachistoscope in visual half-field experiments has allowed us to investigate laterality in the normal, non-lesioned brain. This research in visual half-field processing has shown a consistent left hemisphere advantage (LHA) for language (printed words, letters) and a right hemisphere advantage (RHA) for spatial material (complex shapes, faces, line orientations). What this research has not shown, however, is why the human brain lateralizes as it does, i.e., how laterality develops. Different hypotheses have been proposed to account for the development of the lateral organization in the brain (McKeever, Hoemann, Florian, & Van Deventer, 1976; Scholes & Fischler, 1979), but the area remains controversial because of the issues of handedness (Levy, 1969; Levy, 1974) and processing strategies (Bever, 1975).


A processing strategy that has been thought by some to impact on the lateralization of the brain, particularly as it relates to language, is auditory processing. Researchers have suggested that the left hemisphere handles all auditory input, and thus language, while the right hemisphere is responsible for visual stimuli, thereby explaining the RHA that has been found for complex shapes, faces and line orientations. Another line of thought suggests that the left hemisphere is not specialized solely for language processing or auditory input, but more specifically for analytic processing. In turn, the right hemisphere deals most effectively with holistic processing regardless of the modality of the input (Bever, Hurtig, & Handel, 1976; Nebes, 1974). From this argument it would follow that it is not the auditory nature of language that places it in the left hemisphere, or the fact that it is language per se. Rather, the important variable then becomes the type of processing involved in the use of language.

One means of assessing the impact of auditory processing in the development of laterality of language is the study of cerebral lateralization in the congenitally deaf individual. This population allows the researcher to investigate the development of laterality in the absence of auditory input. In addition, many of these individuals utilize a sign language which is different from English. This sign language, known as American Sign Language (ASL), has its own distinct syntax, lexical structure and sublexical structure, which are different from the oral phonology of the English language (Woodward, 1977). It also differs from other types of manual communication, such as finger spelling or Signed English (a visual form of English), at the syntactic, morphological and phonological levels. Finger spelling assigns 26 distinct hand configurations to the separate letters of the English alphabet and is used to spell out English words and sentences in the air, letter by letter. Therefore, finger spelling does not differ in structure from English. In contrast, Signed English on the surface seems more similar to ASL in that it uses similar signs, but these signs are put in the order of English syntax. Both finger spelling and Signed English are difficult for the deaf person to understand (unless he/she is very familiar with English) and more likely represent a second language to the deaf person, distinct from ASL. Given these differences in manual communication, it is important to note that research in the hemispheric specialization of the deaf has generally failed to make any distinctions between finger spelling, visual English, and ASL when describing or investigating manual communication.


Before one can hypothesize differences in laterality between ASL and English, one needs to understand something of the structure of ASL. The syntax of ASL can best be described as telegraphic, often with action words and nouns occurring first, with few connecting units. Not only the signs themselves, but their movements and their speed of movement, along with the body gestures and facial expressions of the signer, are integral components of the syntax. In fact, it is difficult to give meaning to a sign in ASL without associated movements and gestures. Because of this, ASL would seem to lend itself best to a holistic, spatial processing which is different from the analytic processing required to handle English syntax, and ASL, therefore, may be handled most efficiently by the right hemisphere. As noted earlier, it has become a given in laterality research that language (in right-handed hearing individuals) is controlled by the left hemisphere. However, it may be that it is not language, but analytical processing, which the left hemisphere controls.

Some research that supports the idea that it is the type of cognitive processing involved that determines how a language is lateralized is the work done with the Japanese orthographies of Kana and Kanji (Bradshaw, 1980; Sasanuma, Itoh, Mori, & Kobayashi, 1977). Both Kana and Kanji are nonalphabetic symbols, with Kana being the phonetic symbols for syllables, while Kanji are the nonphonetic, logographic symbols representing lexical morphemes. In this sense, these characters parallel the distinctions between English and ASL, as English consists of phonetic symbols while ASL is concerned with morpheme structure. Early work with Japanese aphasic patients disclosed that the strategies used for decoding the two types of symbols were different, with Kanji utilizing a visual processing, in contrast to a phonological processing for Kana. Tachistoscopic experiments with Kana and Kanji words further supported these early findings by demonstrating a LHA for Kana and a (nonsignificant) RHA for Kanji. Sasanuma et al. (1977) suggest that while Kanji characters are fit candidates for a direct visual mapping or holistic type of processing under some conditions, the characters are also associated with one or more phonological representations which may call up left hemisphere processing, thus presenting a more mixed picture of lateralization for Kanji. Determining the laterality of language in deaf individuals who communicate with ASL could shed further light on the role of types of processing in the determination of laterality.


Research with deaf individuals has begun only recently, and many of the studies are characterized by methodological difficulties. There have been major difficulties with the choice of tasks and strategies. Use of English words, letters, or the manual alphabet does not ensure linguistic mediation by the deaf (they may approach these as spatial tasks instead). Multiple choice or matching responses also enable the tasks to be approached in a spatial, holistic manner. One difficulty in the choice of tasks has been the reliance on tachistoscopic presentation of line drawings of signs. These signs, without the accompanying movement that is a central and salient aspect of ASL, cannot be considered representative of the language of the deaf. Another major difficulty in research with deaf populations has been the failure to discriminate on the basis of early language training (oral vs. ASL) and the age at which language is acquired. Early language training may be crucial in the development of laterality in the deaf, as some suggest that laterality begins to develop at an early age (Lenneberg, 1967) and different types of processing strategies may affect this laterality. Children raised in an oral environment are exposed to the English language with its accompanying syntax through oral presentation of words. Manual communication in an oral environment is often used to supplement the oral presentation of language, but continues to follow English structure and syntax. In contrast, children raised with ASL as their first language (as is most typically the case for deaf children with deaf parents) experience an entirely different language structure which may require a type of cognitive processing separate from that required for English.

Lateralization in the Deaf:
Literature Review

McKeever, Hoemann, Florian, and Van Deventer (1976) were the first to investigate language laterality in the deaf. They presented four hemifield visual-recognition tasks to congenitally deaf signers and hearing subjects who knew ASL. Three of the tasks used English letters and words, while one used drawings of ASL signs and manual letters. Hearing subjects showed a LHA for two of the three English tasks and a RHA for the manual alphabet. The deaf subjects also showed a LHA on one of the three English tasks (no difference on the others), but only a trend toward a RHA for the alphabet and signs. The authors concluded that auditory experience is a major determinant of cerebral lateralization. But this conclusion seems unwarranted. First, their use of line drawings for the manual alphabet and static signs hardly seems comparable to ASL as the deaf use it and therefore does not appear to tap language. Second, they made no attempt to distinguish between the types of early language training the deaf subjects experienced. Finally, the deaf and hearing subjects used different response modes, so that their results are not directly comparable.

Manning, Goble, Markman, and La Breche (1977), in another visual half-field experiment, presented English words, line drawings of signs, and shapes to deaf subjects, and words and shapes to hearing subjects. They found a LHA for words for the hearing subjects and a trend in that direction for the deaf. No visual field advantage was shown by the deaf for signs or by either group for the shapes. In a second experiment (to increase the chances that the subjects dealt with the stimuli linguistically), written words, photographs of signs, and words plus signs were presented bilaterally to the deaf, while just words were presented to the hearing. Subjects had to match the words or signs to pictures or objects on a board. Hearing subjects showed a LHA for words, while deaf subjects showed a trend toward a LHA for words and a RHA for signs. Manning et al. (1977), however, do not mention the signing skill of their deaf group, nor the group's early language training and whether or not they were native signers. A final problem, mentioned by Poizner and Lane (1979), is that Manning et al. (1977) failed to use signs that were bilaterally symmetrical, so that signs in the right visual field (RVF) would have been seen more clearly than signs in the left visual field (LVF).

The only study to attempt to investigate differences in deaf populations based on early language training was done by Phippard (1979). In this study, hearing subjects and two groups of prelingually deaf subjects (those with strict oral training and those with oral and manual training) were presented tachistoscopically with English letter stimuli, line orientations, and human faces. In addition, the total communication group (those with early oral and manual training combined) viewed manual alphabet handshapes. The hearing group showed the predicted LHA to English letters and the RHA to the lines and faces. The orally trained deaf showed a RHA to both English letters and lines, while the total communication group showed no visual field differences on any of the four tasks. Phippard's use of a matching response limits her findings somewhat, in that this type of responding facilitates the use of a spatial matching (nonlinguistic) response. In addition, Phippard does not add to the knowledge on ASL processing, as she used only the manual alphabet rather than signs, and she does not say how fluent her total communication group was with ASL or whether its members were native signers.

Neville and Bellugi (1978) had deaf native signers identify line drawings of ASL signs that were presented either unilaterally or bilaterally through a tachistoscope. In addition, both the deaf signers and hearing subjects unfamiliar with sign were presented a task requiring them to point to a dot location in a response matrix. Deaf subjects showed a significant LHA for both unilaterally presented signs and dots, but no laterality differences for bilaterally presented signs or dots. Hearing subjects showed only the expected RHA in localizing unilaterally presented dots. Neville and Bellugi conclude that the left hemisphere of deaf signers may function both for linguistic and for certain nonlinguistic visual-spatial processing. Again, however, the use of line drawings of ASL restricts any inferences concerning language in the deaf. Also, their failure to find laterality effects for both hearing and deaf subjects when stimuli were presented bilaterally complicates the conclusions.

Lubert (reported in Poizner and Lane, 1979) asked subjects to match tachistoscopically presented stimuli to items on a board. Congenitally deaf and normal hearing subjects (sign language skills were not reported for either group) received English letters, photographs of ASL signs that could be identified without movement, photographs of manual alphabet handshapes, and a dot enumeration task. Both the deaf and hearing subjects showed a RHA to the ASL signs and no lateral differences on the other three tasks. The failure to find a LHA to English letters for the hearing subjects adds support to the hypothesis that matching tasks are not sufficiently complex to encourage verbal mediation, but instead result in spatial processing. Because of this, it is difficult to assess whether the deaf subjects' RHA to the ASL signs is due to language processing or purely spatial processing.


Poizner and Lane (1979) attempted to avoid the lack-of-movement confound in assessing dominance for ASL by using photographs of signs that do not incorporate movement (signs for numbers). Other stimuli included photographs of non-ASL handshapes, Arabic numbers, and random geometric shapes of low association value (non-linguistic spatial controls). Subjects were hearing adults unfamiliar with ASL and congenitally deaf adults (hereditary deafness, unaccompanied by neurological involvement) with ASL as a first language. Both deaf and hearing subjects showed a RHA to the signs and non-ASL handshapes. In addition, hearing subjects showed a LHA to Arabic numbers, while the deaf showed no reliable visual-field differences to this material. Neither the deaf nor the hearing subjects showed a reliable visual field advantage to the geometric forms. From an analysis of the response patterns, Poizner and Lane contended that the deaf subjects labeled the signs and processed the labels, while the hearing subjects relied exclusively on shape cues (as did both the deaf and hearing for the non-ASL hands). While this contention appeared justified, their conclusion seemed less warranted. Instead of concluding that the right hemisphere in deaf individuals may be responsible both for spatial processing and for language (ASL), they inferred that the spatial processing required by the signs predominated over language processing in determining the cerebral asymmetry of the deaf. If the deaf subjects did approach the signed numbers as lexical terms (as Poizner and Lane argue) and in a manner different from the non-ASL hands, and still obtained a RHA, then perhaps the right hemisphere is responsible for language in deaf native signers.

Scholes and Fischler (1979) investigated the relationship between linguistic competence in the deaf and laterality of language. Their subjects were grammatically skilled deaf adolescents (based on their ability to comprehend passive sentences), grammatically unskilled deaf adolescents, and hearing college students. Onset of deafness for the two adolescent groups was prelingual. It was hypothesized that linguistic competence in the deaf would be associated with normal and near-normal laterality (a LHA for analytic linguistic tasks). Subjects were shown a picture of a common object (e.g., a lamp), followed by a brief unilateral presentation of a manually signed or orthographic letter, and they had to indicate as quickly as possible whether the letter was present in the spelling of the object's label. Hearing subjects showed a marked LHA for this task (orthographic letter), but no superiority was shown by either the linguistically skilled or unskilled deaf groups. Skilled subjects did show more of a trend toward RHA asymmetries than did the unskilled subjects. The authors concluded that while hemispheric asymmetry did not develop normally in the deaf, the absence of this normal pattern did not preclude the development of the analytic skills necessary to deal with the structure of language. It should be added that these investigators were investigating laterality of the English language per se and not the possible laterality of ASL. It is not clear how this may have affected the laterality results.

Ross, Pergament and Anisfeld (1979) tested groups of congenitally deaf adults and hearing adults who had learned ASL as a native tongue at home. Three tasks were administered to the subjects tachistoscopically: a sign-word task (the signs were videotaped), a word-word task, and a letter task. In these tasks, the subjects had to respond as to whether the stimuli were the same or different. Hearing subjects showed the expected LHA for the letter task, while there was no difference for the deaf subjects. In the sign-word task, hearing subjects showed a LHA and deaf subjects a RHA. No significant visual-field effects were found for the word-word task. These results present the strongest evidence to date for possible right hemisphere asymmetries for language (ASL) in the deaf, and may be a function of the experimenters' ability to incorporate the movement of ASL through the use of videotapes. What Ross et al. (1979) did not show, however, is how a deaf-oral population with English as a first language would perform on this task, so it is not clear whether the differences are due to lack of auditory processing or to differences in the processing strategies for English and ASL.

In summary, investigations concerning language laterality in deaf populations suggest that these individuals are lateralized differently and in fact may process sign language in the right hemisphere. However, methodological difficulties involving the choice of stimuli (the use of line drawings of the manual alphabet and of static signs, which ignore sign language's most salient feature--movement) and the choice of response set (matching responses being too simple to encourage verbal mediation, thus encouraging spatial matching) make it difficult to draw definite conclusions from the literature. In addition, researchers have added to the confusion with sample populations which fail to make an important distinction between types of early language training of deaf individuals, and by seldom giving information regarding their subjects' fluency with sign language.


Current Research: Rationale and Objectives

The present study investigated cerebral laterality in deaf and hearing subjects through a tactile modality. While some researchers may argue that tactual representation is primarily a right hemisphere task (Witelson, 1974), recent research has suggested that it is not the nature of the stimuli per se (whether auditory, visual, or tactual), but the processing strategy required, that determines hemispheric specialization (Bever, 1975; Manning et al., 1977). The use of a tactile modality in assessing language laterality in the deaf is particularly useful, since it avoids the complications inherent in attempts to present signs visually and allows deaf subjects to utilize whatever language strategy (English or ASL) is most efficient for them. In addition, a tachistoscopic study with stimuli similar to those used by Poizner and Lane (1979) was also employed. This component served as a reference point, so that the tactile data from this study could be compared with the tachistoscopic research with deaf individuals.

Stimuli used in the tactile paradigm were line orientations and shapes. Line orientations, presented both unilaterally and bilaterally, in both the visual and tactual modes, have consistently yielded a RHA (Benton, Levin, & Varney, 1973; Benton, Varney, & Hamsher, 1978; Oscar-Berman, Rehbein, Porfert, & Goodglass, 1978). This RHA holds for lines presented visually to the deaf population as well (Phippard, 1979). Research on the hemisphere advantage for shapes is more complex. Initially thought to be a right hemisphere task because of the spatial nature of the task (Dodds, 1978; Witelson, 1974), it now appears to depend on the degree to which the shapes encourage verbal mediation. In fact, as Kimura (1966) and others have shown, shapes must be exceedingly complex before verbal mediation is discouraged.

Two measures were used to encourage verbal mediation of the shapes: bilateral presentation of the stimuli to increase the complexity of the task (if the task were too easy, verbal mediation would not be necessary), and the inclusion of a 10-second delay between presentation of stimuli and response to encourage verbal mediation through rehearsal (Oscar-Berman et al., 1978; Satz, Aschenbach, Pattishall & Fennell, 1965; Witelson, 1974). While these two strategies (bilateral presentation and delay) should have been sufficient to guarantee verbal mediation of the shapes, they were not expected to increase the complexity of the line task enough to require mediation (Harris, Bullard, Satz, Freund, Hutchinson, & Berg, 1980). Finally, the shapes used in the paradigm were ones considered to be more easily verbally mediated. These shapes were determined from a pilot study in which subjects were asked to label shapes after they felt them. Those shapes that lent themselves most readily to labeling were chosen for the study from a sample of shapes originally used by Witelson (1974).

Witelson had used these shapes and found that subjects were more accurate in recognizing shapes felt by the left hand (a possible RHA), but did not compare response hands (all responses were made with the left hand). Gardner, English, Flannery, Hartnett, McCormick and Wilhelmy (1976) used Witelson's shapes, controlling for response hand, and also found an overall advantage for shapes felt by the left hand, but this was complicated by a feeling hand x response hand interaction (which the authors discussed in terms of inter-hemispheric crossover and retrieval). Neither Witelson (1974) nor Gardner et al. (1976) utilized a delay procedure as did this study. Cranney and Ashton (1980) employed Witelson's dichhaptic task with deaf and hearing populations and did not find a significant hemispheric advantage, but failed to control for pointing-hand response, thereby possibly blurring laterality effects. La Breche, Manning, Goble and Markman (1977) also used Witelson's shapes with deaf and hearing children and found a significant right hand advantage for hearing children and a similar, though nonsignificant, trend for the deaf children, suggesting that it is possible to call forth a left hemisphere strategy for these shapes (again, however, the authors did not control for response hand).

A second important aspect of the study was to assess the differential responses of three samples on these tasks. These three samples were hearing adolescents (Hearing), congenitally deaf adolescents with ASL as their native language (Sign), and congenitally deaf adolescents raised in a strict oral tradition with English as a first language (Oral). In order to avoid the controversy regarding the age of establishment of cerebral dominance (see Krashen, 1975, for a review of this literature), only adolescents 13 years of age or older were used. In any attempt to form separate and distinct sample groups, the issues of which variables to control and match for are crucial. These variables became particularly salient in separating the deaf populations. In this study, the three groups were matched for age. Intellectual level was to be controlled statistically if it became a significant variable. Another issue of concern was the degree of hearing loss. Traditionally in research, the cut-off for deafness has been a loss of 85 dB or greater in the better ear. While this was the accepted cut-off for this investigation, it was recognized that the amount of residual hearing depends to a large degree on the type of loss (conductive vs. sensorineural) and the configuration of the loss (i.e., the amount of hearing at different frequencies, especially in the range of 250 Hz to 4 kHz).

Specific Hypotheses

The purpose of this study was to investigate the effect

of early language training on laterality in two different

deaf populations. Hearing and congenitally deaf adolescents

and young adults (with either English or ASL as a first lan-

guage) were presented with bilateral dichaptic stimulation of

two sets of stimuli, line orientations and shapes. In addi-

tion, they were exposed tachistoscopically to bilateral

presentation of .meaningful (ASL) and non-meaningful (non-

ASL) line drawings of handshapes similar to those used by

Poizner and Lane (1979). It was predicted that

(1) All three groups (Hearing, Oral and Sign subjects)

would have higher accuracy scores and faster reaction times

to lines presented to the left hand than to the ones pre-

sented to the right (demonstrating a RHA). If the shape

stimuli tapped language through verbal mediation, then

(2) The Hearing and Oral group would demonstrate

faster reaction times and obtain higher accuracy scores

when the shapes were presented to the right hand (LHA),

while the sign language groups would be more accurate and

perform faster when the shapes were presented to the left

hand. This was predicted because the Oral group, with its

early training in English syntax, should more closely re-

semble normal hearing individuals, while it was predicted

that deaf native signers, because of the visual-spatial na-

ture of the signs and holistic, idiographic processing re-

quired, would rely more on right hemisphere processing.

Finally, results of the tachistoscopic study were expected

to parallel Poizner and Lane's (1979) results, with

(3) All three groups having higher accuracy scores

and faster reaction times to both ASL and non-ASL handshape

stimuli presented to the left visual half-field (RHA), al-

though it was possible that the Oral group would demonstrate

a LHA for the ASL handshapes by performing best with these

stimuli presented to the right visual half-field.

Subjects



Group 1 (Hearing) consisted of 20 hearing adolescents

and young adults selected from Bradford High School in

Starke, Florida. Group 2 (Oral) comprised 18 congenitally

deaf adolescents and young adults raised in an oral environ-

ment with English as their first language. Group 3 (Sign)

comprised 20 congenitally deaf adolescents and young adults

with ASL as their first language. The deaf subjects were

chosen from the Florida School for the Deaf and Blind (FSDB)

in St. Augustine, Florida. School records, parental reports

and teachers' reports were used to establish early language

training with the deaf students. To be included in a group,

deaf subjects had to be exposed to the particular language

training through at least the first six years of life (with

many experiencing the same language environment for the first

12 years). By 12 years of age, many of the children had

transferred to FSDB and at the time of the study were all

fluent in ASL and used this mode when talking with peers,

regardless of early language training.

All subjects were white and were right handed as meas-

ured by at least a 70% right hand preference (including

writing), on the 10 unimanual tasks of the Harris Test of

Lateral Dominance (Harris, 1958). Each group consisted of

an equal number of males and females between the ages of 13

and 21 years.

Subjects in the three groups were matched for age. Mean

ages for the three groups were as follows: Hearing subjects

--16 years, 7 months; Oral subjects--16 years, 6 months; and

ASL subjects--16 years, 7 months. Only those adolescents

and young adults free from any neurological impairment or

handicap (other than deafness) were included. Amount of

hearing loss for the two deaf groups was set at 85 dB or

greater average loss in the better ear (mean loss for Oral sub-

jects--91 dB; mean loss for ASL subjects--93 dB). Intellectual

functioning (as measured by the Performance Scales of the

Wechsler Intelligence Scale for Children--Revised or the

Wechsler Adult Intelligence Scale) was assessed. Mean IQ

scores for the three groups were as follows: Hearing sub-

jects--116; Oral subjects--111; and ASL subjects--111. Sub-

ject characteristics by group and sex are summarized in

Table 1.

Stimuli


Four sets of stimuli were presented, two in the tactile

mode and two in the visual mode. The tactile stimuli were

shapes and lines. The visual stimuli were line drawings of

ASL handshapes and nonsense handshapes.




[Table 1: Subject characteristics by group and sex]

Shapes. The stimuli were 10 different wooden shapes,

each having four to eight sides and measuring 1 1/2 x 1 1/2 x 3/16

inches. These shapes were selected from Witelson's

shapes (1974) following a pilot study by the author which

suggested that these shapes were more readily verbally me-

diated. (Subjects responded with a verbal label more rapidly

to these shapes.) No one shape was a spatial reversal of an-

other. The 10 shapes were arranged in five pairs, such that

the general outline and the numbers of angles and curved

sides were similar within pairs of stimuli. Each pair of

stimuli was glued with a 4-inch horizontal separation between

them, to the central portion of an 8 x 11 inch board. A vis-

ual display of six shapes drawn to size on a card, consist-

ing of the two correct shapes, two other test stimuli, and

two distractors was used in the response portion of the test.

The six stimuli were arranged in a predetermined but random

circular arrangement in order to discourage left-right scan-

ning. The display stimuli were counterbalanced for display

position and frequency of occurrence. Ten different recog-

nition displays were used.

Lines. The stimuli were 10 different line orientations,

with angles of 15, 30, 45, 60, 75, 105, 120, 135, 150 and

165 degrees from the horizontal axis. These line orienta-

tions were formed by gluing wooden sticks 2 inches long onto the

central portion of an 8 x 11 inch board with a 4-inch

horizontal separation between them. The following angle de-

grees were paired and presented bilaterally: 15 and 30;

45 and 60; 75 and 105; 120 and 135; 150 and 165. Visual

displays of six line orientations were drawn onto cards, con-

sisting of two correct stimuli and four other test stimuli

and used for recognition of the correct response. The lines

were arranged in the same manner as the visual display of

shapes and 10 different line recognition displays were used.


ASL handshapes. These stimuli consisted of four ASL

signs that require no movement for recognition (although,

in common use, they often take on movement with added mean-

ing). These signs were the signs for the numbers six, seven,

eight and nine. The stimuli were 3 cm in width and 9.5 cm

in height and were presented tachistoscopically 3° to the

left or right of center fixation point to assure processing

by the specific visual half-field. Each ASL number was

paired with every other ASL number (six pairs) and then in-

terchanged as to left-right arrangement, for a total of 12

trials. Figure 1 presents the ASL handshape stimuli em-

ployed in the study.
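In modern terms, the pairing scheme just described amounts to taking every ordered pair of the four stimuli. The following sketch (hypothetical labels; not the original apparatus control) illustrates how the 12 trials arise:

```python
from itertools import permutations

# The four ASL number handshapes used as stimuli.
stimuli = ["six", "seven", "eight", "nine"]

# Six unordered pairs, each shown in both left-right
# arrangements, i.e., every ordered pair: 4 x 3 = 12 trials.
trials = list(permutations(stimuli, 2))

print(len(trials))  # 12
```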

Non-ASL hands. These stimuli consisted of four line

drawings of hand configurations that never occur in ASL.

The stimuli were 3 cm in width and 9.5 cm in height and were

presented 3° to the left or right of the center fixation point

to confine the processing to a specific visual half-field.

[Figure 1: ASL Handshapes]

Each stimulus was paired with every other stimulus and then

interchanged as to left-right arrangement, for a total of

12 trials. Figure 2 presents the non-ASL stimuli employed

in the study.

Apparatus and Procedure

Initially, a baseline for speed of response for right

and left hands was established for each individual. This

was done by having the subjects respond to the presentation

of a visual display of dots by touching the center dot on

the display. Response hand was signaled (left or right)

prior to the stimulus display by an experimenter touching

the corresponding shoulder of the response hand. If the

subject's right shoulder was touched, the subject responded

with his/her right hand. If the subject's left shoulder

was touched, he/she responded with his/her left hand. Left or

right response was randomized between hands over 10 reaction

time trials.

Each subject was given six pretest trials with feed-

back as to the correct answers (using nontest stimuli) for

both the tactile and visual paradigms. The purpose of the

pretest trials was to familiarize the subjects with the gen-

eral nature and procedures of the tests.

[Figure 2: Non-ASL Handshapes]

Tactile Presentation

The tactile tests consisted of 20 trials, each involv-

ing simultaneous presentation of a pair of different stim-

uli. Each of the five pairs was presented four times, with

the two stimuli interchanged as to the left-right arrange-

ment so that each shape was presented twice to the subject's

left hand and twice to his/her right hand. The stimuli were

presented to the subject with a partition positioned between

the subject and the experimenter, thus preventing subjects

from seeing the stimuli. The partition had an opening which

allowed for placement of the subject's hands so that he/she

did not have to search for the stimuli and so that each hand

only touched the left or right stimulus. At the start of

any trial, the subject placed his/her hands inside the two

openings. The subject's hands were positioned so that the

marks drawn on his/her wrists were in line with those on the

base of the entrance of the partition. A board with an at-

tached pair of stimuli was then slid into place under the

subject's hands. When the subject lowered his/her hands,

his/her fingers made contact with the stimuli. Each pair of

stimuli was presented for 10 seconds. Following this, a 10

second delay occurred before the presentation of the visual

display. With this display, the subject's response was then

made by pointing to the correct stimuli. The response hand

was indicated just prior to the presentation of the visual

display by an experimenter touching the corresponding shoul-

der of the response hand as in the reaction time trials. Re-

action time was defined as the time interval between visual

presentation of the response card and the manual response.

No time limit nor feedback for a response was given on the

test trials, but reaction time was measured. Subjects were

asked to select two stimuli on each trial, even if they had

to guess which stimuli they felt. Half of the subjects in

each group (controlled for sex) received the shape stimuli

first and the other half received the line orientation first.
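The tactile trial structure described above (five fixed pairs, each presented four times with left-right interchange) can be sketched as follows; the pair labels and the shuffling seed are hypothetical:

```python
import random

# Five fixed stimulus pairs, identified by hypothetical labels A-J.
pairs = [("A", "B"), ("C", "D"), ("E", "F"), ("G", "H"), ("I", "J")]

def build_trials(pairs, seed=0):
    """Each pair occurs four times, twice in each left-right
    arrangement, so every stimulus reaches each hand twice."""
    trials = []
    for left, right in pairs:
        trials += [(left, right), (right, left)] * 2
    random.Random(seed).shuffle(trials)
    return trials

schedule = build_trials(pairs)
print(len(schedule))  # 20
```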

Visual Presentation

A three-field tachistoscope (Gerbrands, Model T-3B2)

was employed to present the visual stimuli with luminances

of each field set equal. Subjects viewed the two sets of

stimuli, with the ASL and non-ASL handshapes intermixed

randomly, making a total of 24 trials. Three additional

trials were inserted at separate points during the stimuli

presentation to insure center fixation. These trials com-

prised small, three letter words flashed to the center of

the visual field and were read by the subjects. The subject

saw the following sequence of events. A fixation dot ap-

peared for four seconds at the beginning of each trial. Im-

mediately following the dot, the handshape pair was presented

bilaterally for 500 msec. There then followed a 10 second

delay before the presentation of the visual display response

sets. These visual displays were presented directly above

the tachistoscope viewer so that the subject had only to sit

back slightly to view the display. During the 10 second de-

lay, while the subjects tried to remember the handshapes

viewed, the experimenter signaled which hand (left or right)

should be used in making the response. This was done in the

same manner as in the reaction time trials. Each subject

was then required to select from two stimuli that were pre-

sented from a display of six handshapes. Reaction time for

each trial was measured from the time the experimenter pre-

sented the response display until the subject made his/her

first choice.

Statistical Analysis

Dependent measures for the four experiments were mean

percentage correct and mean reaction time to response. A

3x2x2x2x2 (group x sex x feeling hand x response hand x

time) repeated measures analysis of variance was performed

on the tactile experiments (shapes and lines) with group and

sex the between groups factors and feeling hand, response

hand and time the within group factors. Group refers to the

three subject samples, Hearing, Oral, and Sign and sex to

male and female. Feeling hand refers to which hand (left or

right) felt the stimulus, and response hand to which hand

(left or right) made the response. Time is concerned with

the grouping of trials with time 1 referring to trials 1-10

and time 2 to trials 11-20 (which are a repetition of the

first ten trials). For the visual experiments (ASL hand-

shapes and non-ASL handshapes), a 3x2x2x2 (group x sex x

visual half-field x response hand) repeated measures analysis

of variance was employed, with group

and sex the between groups factors and visual half-field

(left or right) and response hand the within group factors.

Initial analyses of variance were performed using the

P2V-Analysis of Variance and Covariance Including Repeated

Measures statistical package of the Biomedical Computer Pro-

grams-P Series Statistical Software (Dixon, 1981). When sig-

nificant interactions made further analysis necessary, the

General Linear Models Program from the Statistical Analysis

System User's Guide (Helwig & Council, 1979) was used to pro-

vide appropriate analyses of variance. The statistical anal-

yses were performed with the assistance of the Biostatistics

Department of the University of Florida.

Results


Separate analyses were performed for the four experi-

ments: Shapes, Lines, ASL handshapes and non-ASL handshapes.

Dependent variables for all experiments were mean percentage

of correct responses and mean reaction time to response.

Initially, individual reaction times for each subject were

corrected for differences in rapidity of right versus left

response hands by subtracting the average time to response

scores obtained for each hand at the beginning of the test-

ing session. The tactile experiments (Shapes and Lines)

were analyzed in a 3(group) x 2(sex) x 2(feeling hand) x

2(response hand)x 2(time) repeated measures design with

group and sex the between groups factors and feeling hand,

response hand and time the within group factors. The vis-

ual experiments (ASL handshapes and non-ASL handshapes) were

analyzed in a 3(group) x 2(sex) x 2(visual half-field) x 2

(response hand) repeated measures design with group and sex the

between groups factors and visual half-field and response

hand the within group factors. Preliminary analysis showed

no difference for sex, group, or sex x group for age or IQ;

therefore these variables were not considered in later anal-

yses. (See Tables 2 and 3 in Appendix for summary tables

of analyses of variance.)
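The baseline correction described above reduces to simple per-hand subtraction, as in this sketch (the numbers are hypothetical; the original timing was done with a stopwatch):

```python
def baseline_correct(trial_rts, baseline_left, baseline_right):
    """Subtract each hand's mean baseline reaction time (from the
    10 pretest pointing trials) from that hand's test-trial RTs."""
    base = {
        "left": sum(baseline_left) / len(baseline_left),
        "right": sum(baseline_right) / len(baseline_right),
    }
    return [(hand, rt - base[hand]) for hand, rt in trial_rts]

corrected = baseline_correct(
    [("left", 2.4), ("right", 2.1)],
    baseline_left=[0.40, 0.44],
    baseline_right=[0.30, 0.34],
)
# left: 2.4 - 0.42; right: 2.1 - 0.32
```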

Experiment 1: Shapes

Analysis of Percentage Correct Responses

Initially, the repeated measures analysis of variance

was performed on mean percentage correct for the shape stim-

uli. Main effects were found for sex (F(1,51)=6.11 p<.05),

time (F(1,51)=13.65 p<.001), and response hand (F(1,51)=8.09

p<.01). No main effects were found for group or feeling

hand and there was no group x feeling hand interaction. The

main effects that were found, however, can best be inter-

preted in the light of significant interactions between time,

feeling hand, and sex (F(1,51)=6.72 p<.05) and between time

and response hand (F(1,51)=24.80 p<.001). Separate analyses

for time 1 and time 2 demonstrated a significant effect for

sex (males > females) at time 1 (Reduced Model: F(1,18)=9.12

p<.01) and a sex x feeling hand interaction at time 2 (Re-

duced Model: F(1,18)=7.77 p<.05). In addition, an effect

for response hand (with the left hand being more accurate

than the right) was evident at time 1 (Reduced Model: F(1,

18)=44.68 p<.001), but with no difference by

time 2. (See Table 4 in Appendix for the analysis of variance summary.)


Specifically, at time 1 for the Shape stimuli, males

were more accurate than females regardless of feeling hand.

By time 2, females had improved sufficiently on shapes felt

with their left hand so that there was no longer a difference

between males and females. Females, however, remained sig-

nificantly less accurate on Shapes felt by their right hand.

These results are presented pictorially in Figure 3. Fi-

nally, subjects (regardless of sex) were more accurate at

time 1 when their response was made with the left hand. By

time 2, response with the right hand had improved signif-

icantly, so that there was no difference in accuracy between

response hands (Figure 4).

Analysis of Reaction Time Data

The repeated measures analysis of variance for mean re-

action time to response for shapes yielded a main effect for

group (F(2,51)=8.71 p<.001) and for time (F(1,51)=9.86

p<.01), as well as a time x group (F(2,51)=3.56 p<.05) and

a time x response hand (F(1,51)=4.46 p<.05) interaction.

Analyzing these interactions separately for time 1 and time

2, the Hearing group was significantly faster than the two

deaf groups at both time 1 and time 2 (Reduced Models:

F(2,25)=9.08 p<.001; F(2,26)=6.20 p<.01) and there was no

significant difference in speed of response between the deaf

groups at either time. However, the sign group did respond

significantly faster by time 2 as compared with their re-

sponses at time 1 (Duncan's Multiple Range/corrected for un-

equal n's p<.05). (Figure 5 presents these results.)

[Figure 3: Experiment 1: Shapes, Mean Percentage Correct, Time x Feeling Hand x Sex]

[Figure 4: Experiment 1: Shapes, Mean Percentage Correct, Time x Response Hand]

[Figure 5: Experiment 1: Shapes, Mean Reaction Time, Time x Group]

Paralleling the improved accuracy for the right pointing hand

found with Shapes, pointing with the right hand, as shown in

Figure 6, became faster with time and was significantly faster

than responses with the left hand at time 2 (F(1,26)=12.92

p<.001). Table 5 in the Appendix presents the summary of

analysis of variance.

Experiment 2: Line Orientations

Analysis of Percentage Correct Responses

Repeated measures analysis of variance for mean per-

centage correct yielded a main effect for feeling hand

(F(1,51)=17.81 p<.001). No other main effects were observed.

While the main effect for feeling hand was in the predicted

direction (with lines felt by the left hand being more ac-

curately perceived than those felt by the right), this ef-

fect was complicated by the following interactions: response

hand x time (F(1,51)=61.83 p<.001); response hand x sex

(F(1,51)=4.23 p<.05); feeling hand x time x sex (F(1,51)=11.19

p<.01); and feeling hand x time x response hand (F(1,51)=

22.96 p<.001). Table 6 in the Appendix contains a summary

of this analysis of variance.

More specifically, accuracy for line orientations var-

ied with hand of input, hand of response output, sex, and

time. Generally, across time, males were equally accurate

in responding with both hands and were as accurate as fe-

males with their right hand. Females, however, were signif-

icantly poorer when a left hand response was called for

(Duncan's Multiple Range/corrected for unequal n's p<.05

(DMR)).

[Figure 6: Experiment 1: Shapes, Mean Reaction Time, Time x Response Hand]

(Figure 7 depicts this interaction.) Looking fur-

ther at how males and females performed, according to which

hand felt the stimuli, it appears that at time 1, males were

significantly more accurate in feeling lines with the left

than the right hand. No differences between the hands were

found for the female subjects nor did performance by either

hand for the female subjects differ from the right hand

scores of the male subjects (DMR p<.05). By time 2, how-

ever, males had improved their performance when lines were

presented to the right hand and females had improved when

lines were presented to the left hand such that there was

no differences between males feeling with left hand, males

feeling with right hand and females feeling with left hand.

Females, however, remained significantly less accurate (DMR

p<.05) when lines were presented to their right hand. This

interaction can be seen most clearly by viewing Figure 8.

Finally, analyzing feeling hand x responding hand sep-

arately for time, a feeling x responding hand effect is evi-

dent (Reduced Model: F(1,18)=18.36 p<.001). At time 1,

feeling left-responding left, feeling left-responding right,

and feeling right-responding left are equally as accurate

and are more accurate than feeling right-responding right.

By time 2, accuracy remains virtually the same for either

responding hand for feeling left (with some improvement in

feel left-respond right), but for feel right, the response hand

accuracy scores have reversed so that feel right-respond

right is as accurate as either feel left score and is sig-

nificantly better than feel right-respond left (DMR p<.05).

Figure 9 presents this reversal in accuracy scores.

[Figure 7: Experiment 2: Line Orientations, Mean Percentage Correct, Sex x Response Hand]

[Figure 8: Experiment 2: Line Orientations, Mean Percentage Correct, Feeling Hand x Time x Sex]

Analysis of Reaction Time Data

Repeated measures analysis of variance for mean reac-

tion time to line orientations yields a main effect for time,

with all subjects performing faster at time 2 than time 1

(F(1,51)=4.24 p<.01) (Figure 10) and a main effect for group

(F(2,51)=84.76 p<.01). Use of Duncan's Multiple Range (cor-

rected for unequal n's) indicates that the Hearing group is

significantly faster than both deaf groups (p<.05) and that

there is no difference between the Oral group and Sign group

in terms of speed of response (see Figure 11). Summary of

the analysis of variance is included in Table 7 of the Appendix.


Experiment 3: ASL Handshapes

Analysis of Percentage Correct Responses

The repeated measures analysis of variance for the ASL

handshapes demonstrated a main effect for group (F(2,51)=

49.03 p<.001) with the Sign group perceiving the handshapes

more accurately than did the Oral group and with both groups

being more accurate than the Hearing subjects (DMR p<.05).

[Figure 9: Experiment 2: Line Orientations, Mean Percentage Correct, Feeling Hand x Response Hand x Time]

[Figure 10: Experiment 2: Line Orientations, Mean Response Time (Time 1 vs. Time 2)]

[Figure 11: Experiment 2: Line Orientations, Mean Reaction Time by Group (Hearing, Oral, Sign)]


This finding in effect validates the composition of the

three sample populations by demonstrating that even though

both groups were fluent in ASL, the Sign group's early train-

ing in ASL resulted in significantly higher accuracy scores

for the ASL handshapes than the Oral group. Figure 12 dem-

onstrates this effect for group. No main effect was found

for visual half-field and there was no visual half-field x

group interaction. However, a significant three-way inter-

action was found for visual half-field x response hand x

sex (F(1,51)=5.88 p<.05). Table 8 in the Appendix contains

the summary of the analysis of variance.

Separate analyses for sex reveal a significant interac-

tion between visual half-field and response hand for male

subjects (F(1,36)=5.35 p<.05), while this effect is not evi-

dent for the females. Specifically for males, subjects were

more accurate when input and output occurred within the same

hemisphere (i.e., left visual half-field-left response hand;

right visual half-field-right response hand) than when it

was necessary for cross-hemispheric processing to occur

(left visual half-field-right response hand; right visual

half-field-left response hand). This effect is seen in Figure 13.

[Figure 12: Experiment 3: ASL Handshapes, Mean Percentage Correct by Group (Hearing, Oral, Sign)]

[Figure 13]

Analysis of Reaction Time Data

Analysis of variance for mean reaction time to ASL

handshapes demonstrated no significant main effects for

group, sex, visual half-field or response hand and no sig-

nificant interactions. Table 9 in the Appendix contains

the analysis of variance summary.

Experiment 4: Non-ASL Handshapes

Analysis of Percentage Correct Responses

The analysis of variance with repeated measures for

the non-ASL handshapes revealed a main effect for response

hand, with subjects being more accurate when they responded

with their left hand (F(1,51)=20.24 p<.001). Again, how-

ever, this main effect was embedded in a visual half-field

x response hand x sex interaction (F(1,51)=8.61 p<.01).

Separating this interaction by left and right visual

half-fields and performing a reduced model analysis of vari-

ance yields an effect for sex (males>females) (F(1,18),

p<.05), response hand (left>right) (F(1,18)=14.09 p<.01), and

a sex x response hand interaction (F(1,18)=7.35 p<.05). No

effects were noted for sex or response hand when handshapes

were viewed in the right visual half-field. Analyzing the

sex x response hand interaction for the left visual half-field fur-

ther, it appears that in terms of mean percentage correct

for meaningless handshapes, there was no difference for

sex or response hand when handshapes were viewed in the

right visual half-field. However, when these non-ASL hand-

shapes were viewed in the left visual half-field, females

did significantly worse than males when they were required

to respond with their right hand. Females viewing these

handshapes in the left visual half-field and responding with

the left hand were equally as accurate as the males (regard-

less of which hand the males responded with). Figure 14

presents this interaction. A summary of the analysis of

variance is contained in Table 10 in the Appendix.

Analysis of Reaction Time Data

Repeated measures analysis of variance for mean reac-

tion time to the non-ASL handshapes yielded no significant

effects for group, sex, visual half-field or response hand

and no significant interactions. Table 11 in the Appendix

contains a summary of this analysis of variance.

Individuals' Lateral Directional Preference

Because no group effects were observed for hemispheric

laterality across the four tasks, individual subjects' di-

rectional preferences for laterality were examined post hoc

to determine if group averaging had obscured clear individ-

ual differences for laterality. These directional prefer-

ences were determined by assigning the number one to the con-

dition containing the highest percentage correct for each of

the four tasks. These conditions were input left field-out-

put left hand (LL), input right field-output right hand (RR),

and an averaged percentage combination of the two crossed

hemispheric conditions (X), i.e., input right field-output

left hand and input left field-output right hand.

[Figure 14: Experiment 4: Non-ASL Handshapes, Mean Percentage Correct, Visual Half-Field x Response Hand x Sex]

Ranks of

two and three were assigned to the second and third highest

accuracy performance respectively. Tied ranks were assigned

to individuals with equally good performance within condi-

tions by task.
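The ranking procedure can be sketched as follows (a hypothetical illustration; equal accuracies receive the same rank):

```python
def rank_conditions(acc):
    """Rank the LL, RR and averaged crossed (X) accuracy scores,
    1 = highest percentage correct; ties share a rank."""
    return {
        cond: 1 + sum(other > value for other in acc.values())
        for cond, value in acc.items()
    }

# One hypothetical subject on one task: the X score is the
# average of the two crossed-hemisphere conditions.
acc = {"LL": 80.0, "RR": 70.0, "X": (68.0 + 62.0) / 2}
print(rank_conditions(acc))  # {'LL': 1, 'RR': 2, 'X': 3}
```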

Assessing the performance of individual subjects across

the tasks, no subjects demonstrated the same directional

preference across all four tasks (i.e., no subjects received

a ranking of one in either the LL, RR, or X condition for

all four tasks). Of the total number of subjects, only four

subjects demonstrated the same directional preference for

three of the four tasks, while only 22 of the subjects

showed a similar directional preference on two of the four

tasks. Three separate Chi square analyses were then per-

formed on the number of subjects showing superior perfor-

mance on the LL, RR, and X conditions for each of the four

tasks to determine if the Hearing, Oral and Sign group dif-

fered in their lateral directional preference. No signif-

icant differences were observed. Specifically, for the LL

condition, X2 = 8.58, df = 6, p<.20. The RR superior

preference condition yielded a X2 = 8.84, df = 6, p<.20.

Finally, the crossed hemisphere condition yielded a X2 =

5.66, df = 6, p<.50.
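The statistic used in these post hoc comparisons can be computed from a groups x tasks count table as below (a generic Pearson chi-square sketch with hypothetical counts; the original analyses were run on the actual subject tallies):

```python
def chi_square(table):
    """Pearson chi-square for an observed contingency table
    (rows = groups; columns = tasks on which subjects showed
    superior performance in a given condition)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

# Hypothetical counts for three groups across four tasks.
stat, df = chi_square([[5, 6, 4, 5], [4, 5, 5, 4], [6, 4, 5, 5]])
print(df)  # 6
```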


Group Effects

The major purpose of this research was to investigate

the lateral organization of the brain in deaf individuals,

particularly as this organization related to language. It

was hoped that it might be possible to demonstrate differen-

tial laterality of processing language dependent on the deaf

individual's early language training. To this extent, this

research failed to find any significant differences in hem-

ispheric organization between different deaf populations or

between deaf and hearing populations. With the four sep-

arate experiments, there was never a group x field (either

feeling hand or visual half-field) interaction effect,

which would suggest differences in hemispheric organization.

Experiment 1, which consisted of shape stimuli, particu-

larly had been hypothesized to show differential effects of

processing. Because no such effects were found, one must

conclude that either there is no difference in the way the

brain is lateralized for this task between the various

groups, or that the task was not sensitive enough to en-

hance any difference that might exist. The same conclu-

sions generally hold for the line orientation, ASL and

non-ASL tasks as well. The performance of the three groups

on the tasks in the four experiments does not suggest that

a different organization exists for any of the groups. In-

stead, in these experiments, with the demands of the four

tasks, all groups were able to develop and use similar strat-

egies (see Figure 15).

In addition, when the data were analyzed for direction

of superior performance, i.e., better performance in the LL,

RR, or X conditions, again no significant differences were

obtained between the groups. Thus, whether one analyzed for

level or for direction of preference, no significant differ-

ences were obtained between the groups. This finding weak-

ens the argument that the groups showed differential lateral

organization for language as predicted in the hypothesis.

However, as noted earlier, it may be that the tasks them-

selves were insufficiently sensitive to demonstrate a lan-

guage organization effect. Specifically, on the shape task

in particular, allowing the subjects to wait 10 seconds be-

fore responding, which had been intended to pull for a ver-

bal strategy, may have been too extended to guarantee that

only a verbal strategy was employed. Further, by not spe-

cifically instructing the subjects to code the shapes ver-

bally, it was possible that the subjects could have used

something other than a verbal label to identify the shapes.

Group effects that did exist involved differential mean

reaction times on Experiments 1 and 2 and differential mean

accuracy scores on Experiments 3 and 4.

[Figure 15: Task x Group, Mean Percentage Correct (Shapes, Lines, ASL, Non-ASL; Hearing, Oral, Sign)]

For Experiments 1

and 2 (shapes and lines) both deaf groups were significantly

slower to respond than their hearing counterparts. No difference was evidenced between the reaction times of the three groups on the visual half-field tasks, and while the hearing subjects responded more rapidly to the shapes and lines than to the visual half-field tasks, the deaf groups did not differ in speed of response across the four tasks.

One possible explanation for this is that deaf individuals process information more slowly than hearing individuals and that, even when the stimuli were familiar (as were the handshapes), it was not possible for the deaf subjects to process the information any more quickly. A more parsimonious explanation

also exists, however, which avoids the necessity of invoking

differences in brain processes between the deaf and hearing.

This explanation would hold that differences in reaction times

between deaf and hearing subjects reflect a motivational variable. Observation of hearing subjects in the two tactile experiments made it apparent that they were aware of being timed. Not only did they attempt to respond as

rapidly as possible on these tasks, but they also evidenced

concern at any point when the experimenter failed to stop

the watch when they made their response (sometimes the ex-

perimenter would take a "split time" which allowed the

watch to keep ticking). Deaf subjects, in

contrast, could not hear the watch (and the watch was not

visible) so that they were generally less aware that they

were being timed and did not strive for rapid responding.

The behavior of the Hearing group was different for the vis-

ual half-field experiments. Here, because of the nature

of the equipment and the timing device used, the stopwatch

was not audible and subjects in all groups were generally

unaware of being timed. In Experiments 3 and 4, there was

no difference in reaction times for the three groups.

The other group effect occurred in Experiments 3 and 4

and involved differences in mean percentage correct for the

visual handshapes for the three groups. The pattern of re-

sponding of the Hearing group for the non-ASL handshapes

(response output from left hand superior to response output

with right hand) suggests that these individuals processed

these handshapes as "nonsense" visuospatial stimuli, evi-

dencing a right hemispheric strategy. This pattern of vis-

uospatial processing also existed for the deaf groups.

Their experience with handshapes did seem to enhance their

performance, however, such that they were generally more ac-

curate than the Hearing group for the non-ASL handshapes.

The two deaf groups did not differ from each other in accuracy for these stimuli (x Hearing = 63%, x Oral = 69%, x Sign = 72%). Reviewing results for the ASL

handshapes, the Hearing group again appeared to process

these handshapes as nonsense visuospatial stimuli with no

indications of differences in processing or accuracy (x

Hearing = 56%). In addition, both deaf groups were again

significantly more accurate than the Hearing group, but were

also significantly more accurate than they were with the

non-ASL handshapes, suggesting that the deaf groups viewed

these signs as meaningful stimuli and processed them as lan-

guage. Finally, not only were the two deaf groups more ac-

curate for the ASL handshapes than for the non-ASL hand-

shapes, but the Sign group was significantly more accurate

than the Oral group for the ASL stimuli (x Oral = 80%; x

Sign = 91%). The superior performance of the Sign group over

the Oral group for the ASL handshapes adds validation to this

research's separation of the deaf population according to

early language training. In addition, these results suggest

that early language training does affect the later accuracy

of processing sign language for deaf individuals.

Hemispheric Organization Effects

Generally, while no major effects were observed for in-

ferred hemispheric organization of language dependent on hear-

ing status, laterality effects, reflecting differential hem-

ispheric processing of different stimuli, were observed. The

extent to which these effects were observed, however, was dependent on the type and nature of the task, the response used to assess laterality, and the sex of the individual. Usually,

type of task has been thought to determine laterality, but

results of this research suggest that other parameters of

the task, such as the inclusion of a memory component by add-

ing a delay between stimuli presentation and response, may

alter the demands of the task, thereby affecting laterality.

Responses used to assess laterality, whether it be side of

stimulus input (feeling hand; visual half-field) or side of

response output (hand that makes the response) also affected

laterality effects. Few researchers have thought to look at

both stimulus input and response output, and therefore, their

results are limited and may be misleading. Finally, while

not predicted, sex was found to significantly interact with

the type of task and the response variable measured (whether

stimulus input or response output) to affect laterality results. Bullard, Satz, Harris, and Freund (1982) have reviewed the literature on sex effects in laterality research

and note this dependence of laterality effects on the inter-

action of sex with task and type of response measured.

Reaction Time

Generally, effects for the dependent variable mean re-

action time were limited to the tactile experiments (shapes

and lines) and did not evidence any laterality effects. Per-

haps, one reason for this was that reaction time measured

the time interval between stimulus presentation and first re-

sponse, which tended to be fairly consistent. In contrast,

variability in terms of time between first and second re-

sponse was much greater, often stretching out for a number

of seconds and sometimes as much as 10, 20 or more seconds.

The hesitancy in response for the second stimulus may reflect

difficulty in inter-hemispheric crossover and therefore may

be an important variable. One possible way to make reaction time a more sensitive measure of laterality effects, therefore, is to measure separately the time to respond to the first stimulus and the time to respond to the second stimulus.
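The proposed two-latency measure could be recorded as in the sketch below; the stimulus-presentation and response-collection functions are hypothetical stand-ins for the actual apparatus, not part of the original procedure.

```python
import time

def timed_bilateral_trial(present_stimuli, get_response):
    """Record both latencies in a bilateral trial: time from stimulus
    offset to the first choice, and time from the first choice to the
    second, where inter-hemispheric crossover may reveal itself."""
    present_stimuli()
    start = time.monotonic()
    first = get_response()
    t_first = time.monotonic() - start
    second = get_response()
    t_second = time.monotonic() - start - t_first
    return first, second, t_first, t_second

# Dummy stand-ins: stimuli do nothing, responses come from a fixed list.
responses = iter(["shape A", "shape B"])
result = timed_bilateral_trial(lambda: None, lambda: next(responses))
print(result[:2])  # ('shape A', 'shape B')
```

Comparing the two latencies per trial would separate initial decision time from the slower second choice noted above.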

Percentage Correct

Laterality effects for the dependent variable, mean per-

centage correct, were dependent on the type of task and will

be discussed for each experiment separately.

Experiment 1: shapes. Initially, males were better

than females at this task regardless of the side of input.

This is consistent with Kimura's finding (1966) that females

improve on visuospatial processing with time. In addition,

it appeared that with time (and therefore, practice) response

output with the right hand became more accurate. These ef-

fects demonstrate the importance of looking at both input

and output if one is to make conclusions regarding laterality

of hemispheres. If one had looked only at stimulus input for

this task, a left hemisphere strategy would seem to dominate.

Response output, in contrast, would argue for a right hemis-

phere strategy. Clearly, to understand what is occurring

in the brain, one must consider both these variables.

What appears to have happened in this experiment is

that shapes were initially (at time 1) handled using a visuo-

spatial, right hemisphere strategy, as indicated by the su-

perior performance of the left response hand, and that fe-

males improved with practice using this strategy. However,

the design of the experiment pulled for a verbal strategy

(by having the task be bilateral and including a memory com-

ponent) and with practice a left hemisphere strategy did ap-

pear to develop, as evidenced by more accurate performance

with the right response hand. Therefore, in this experiment,

by including parameters to encourage verbal mediation, it

was possible to override an initial right hemisphere strat-

egy. This would suggest that it is possible to experimen-

tally manipulate processing strategies, thereby altering

strategies that might more typically be used. These findings are consistent with Bullard et al. (1982), who manipulated processing time between stimulus presentation and response (zero or 10-second delay) for complex shapes presented by visual half-fields and found that, while males

used a right hemisphere strategy with no delay, they changed

to a verbal strategy when delay was introduced. The study by Gardner et al. (1977) is the only one reported to control for both stimulus input and response hand output for the Witelson shapes. They found that the left feeling hand was generally

superior, but that performances tended to be more accurate

and faster when feeling hand and response hand were the same.

The Gardner et al. results are compatible with the current findings; however, their experiment did not employ the memory component, which in the present study may have lessened cross-hemispheric effects by allowing more time for inter-hemispheric transfer of information.

Experiment 2: lines. Initially, tactile exploration

of line orientations appeared to be a right hemisphere task.

At time 1, feeling with the left hand was superior to feel-

ing with the right; responding with the left was superior to

responding with the right hand; and the feeling right-re-

sponding right condition (when input and output were both

from the nondominant hemisphere) was significantly worse

than all other conditions.

It was originally hypothesized that increasing the com-

plexity of the task through bilateral presentation and in-

voking a memory component through the use of a delay should

not override this right hemisphere strategy. However, with

practice, males appeared able to develop a verbal strategy

as they improved over time with input into the left hemis-

phere. In fact, while debriefing subjects after the exper-

iment, many indicated that they had used verbal strategies,

such as remembering the lines as the face of a clock, to fa-

cilitate memory. This is compatible with the Bullard et al.

(1982) finding that males are more likely to use a right

hemisphere strategy with visually presented line orientations

when there is no delay between stimulus presentation and

response, but that they change to a verbal strategy when a

delay is introduced.

Looking specifically at input to the hemisphere as sep-

arate from output, females showed significant improvement in

right hemisphere processing of lines with time (time 1--feel

left: x = 66%; time 2--feel left: x = 74%). This finding

replicates the shape data, which also indicated that females

improved in right hemisphere processing with time. For output with response hand, males appeared equally accurate when responses were made with either hand, and were as accurate as females responding with the right hand. Females, however, despite gains made in accuracy across time

with input to the right hemisphere, continued to do poorly

when output was required from the right hemisphere. Breaking

this sex x response hand interaction down across time, at

time 1 males were more accurate when response output was from

the right hemisphere (75% vs. 63%), as would be consistent

with a right hemisphere, visuospatial strategy. Males, however, became more accurate with output from the left hemisphere by time 2 as a verbal strategy developed (respond right:

77%, respond left: 68%). Apparently, this increase in accuracy of output from the left hemisphere compromised the right hemisphere strategy somewhat, as evidenced by the lowered response accuracy for the left response hand at time 2 (respond left time 1: 75%, time 2: 68%). Females, in turn, were equally accurate in output from either hemisphere at time

1 (respond left: 66%, respond right: 62%) and improved when

output was from the left hemisphere at time 2 (respond right:

75%; respond left: 58%). These results are compatible with

results for the shapes as graphs of the data for line orien-

tations (Figures 3 and 4) are essentially no different from

the graphs for Shapes (Figures 8 and 9).

Perhaps the major conclusion to be drawn from the re-

sults of this experiment pertains to the nature of the line

orientation task. These results indicate that one cannot

assume that line orientations are inherently a right hemi-

spheric task. Instead, it appears that subjects have the

capacity to invent and use a verbal strategy for line orien-

tations, and will do so or not, depending on the demands of

the task.

Experiment 3: ASL handshapes. No main effects for vis-

ual half-field were found for these stimuli. The group ef-

fects that were found (Sign>Oral Group) indicate that early

language training does affect how the groups process these

signs. The Hearing group treated these handshapes as non-

sense visuospatial stimuli. Both deaf groups processed the

ASL handshapes as meaningful, but the Sign group with early

language training in ASL was significantly more accurate in

processing the signs. With the exception of one study (Ross

et al., 1979) researchers investigating language in deaf in-

dividuals have not considered or controlled early language

training as a factor in the development of language for the

deaf. The differential responding in terms of accuracy of

processing the ASL handshapes for the Oral and Sign groups indicates that this factor should not be overlooked when investigating language in the deaf.

Experiment 4: Non-ASL handshapes. Inspection of the

significant effects and interactions for the non-ASL hand-

shapes suggests that all groups handled this as a primarily

visuospatial task with response output from the right hemi-

sphere being superior to response output from the left. In

addition, males were generally more accurate for this task

than were females. Poizner and Lane (1979), presenting

these same stimuli unilaterally and without a delay between

stimuli presentation and response, also found a right hemi-

sphere advantage for both hearing and deaf subjects. How-

ever, the differences for hemisphere and sex in the present

research (as shown by the visual half-field x sex x response

hand interaction) were primarily due to the poor performance

of the females when input was to the right hemisphere and

output was from the left hemisphere. Apparently, while this

task may be strongly lateralized to the right hemisphere be-

cause of its visuospatial nature, males could access this in-

formation fairly equally with either response hand (left:

72%; right: 68%) when stimuli were presented to the right

hemisphere. Females, however, were only able to output ef-

ficiently when output was directed from the processing hemi-

sphere. Crossed-hemisphere output was significantly poorer

for females (respond left: 75%; respond right: 46%). Gard-

ner et al. (1977) found in their study that cross-hemispheric

information retrieval decreased accuracy. These differences

in the present research were not observed for the nondominant

(left) hemisphere for the task. If females are less lateral-

ized for visuospatial tasks, it may be that they pay the

price in efficiency of output in a task that is strongly lateralized.


Comparison of Present Findings with Past Research

Direct comparison of the present findings with previous

research involving the hemispheric organization of deaf individuals is made difficult by the explicit purposes of the present research. Specifically, early research into the lateral

organization of language in deaf individuals limited the con-

clusions that might be drawn from this research by using

tasks that did not represent language for deaf individuals

(McKeever et al., 1976; Neville & Bellugi, 1978; Phippard,

1979); by choosing response sets that were too simple to invoke verbal mediation or that favored one particular hemisphere

over the other (Manning et al., 1977; Phippard, 1979; Poiz-

ner & Lane, 1979); and by failing to consider early language

training within their deaf population (Manning et al., 1976;

McKeever et al., 1977; Ross et al., 1979). The present re-

search was designed to minimize these difficulties by

including a tactile modality to utilize another approach to

attempt to tap language in the deaf; by increasing the com-

plexity of the tasks through bilateral presentation; and by

differentially assessing the performance of deaf individuals

with early language training in a primarily oral or signing

environment. Two additional factors, sex and response hand,

which had been demonstrated to be of primary significance in

work with hearing individuals (Bullard et al., 1982; Gardner

et al., 1977; Harris et al., 1980) were also assessed.

Given these changes in the design of the present study,

sex and the interaction of stimulus input and response out-

put were demonstrated to be the most significant factors af-

fecting laterality. In past research, sex of the individual

subject has typically either been ignored (Gardner et al.,

1977; Manning et al., 1977; Phippard, 1979), limited to one

sex (McKeever et al., 1976), or information was gathered in

such small or unequal proportions that the variable could not

be assessed accurately (Cranney & Ashton, 1980; Poizner &

Lane, 1979; Ross et al., 1979). Similarly, consideration of

response hand (side of response output) has generally either

been ignored (La Breche et al., 1977; Manning et al., 1977;

McKeever et al., 1976; Oscar-Berman et al., 1978; Phippard,

1979; Ross et al., 1979); confined to the left or right hand

(Cranney & Ashton, 1980; Witelson, 1974), or limited in the

conclusions that could be drawn by the failure to use a bi-

lateral task (Poizner & Lane, 1979).

Despite differences between the current study and those noted above, some general similarities in findings have emerged. For the Witelson shapes task, La Breche et al.

(1977) demonstrated a LHA for their hearing group and a trend

toward a LHA for their deaf group, as measured by the hand of

stimulus input. This is consistent with the present findings

of a LHA for all groups on shapes as measured by side of

stimulus input. As noted previously, however, response out-

put in the present study suggested a right hemisphere strat-

egy. Cranney and Ashton (1980), using a tactile paradigm

with the Witelson shapes, failed to find a hemispheric advan-

tage for either the deaf or hearing groups, but did not con-

trol for side of response output or for sex. The research

with the Witelson shapes that is most consistent with the

present findings is that of Gardner et al. (1977). Gardner

et al. demonstrated a RHA for the shapes, consistent with

the initial RHA for shapes in the present study. Because

Gardner et al. did not utilize a delay component, however,

they did not find, as did the present study, the emergence of

a left hemisphere, verbal strategy with continued exposure

to the task. Most significant, however, was the finding by

Gardner et al. of the importance of assessing both stimulus

input and response output as the two variables were found to

interact in determining laterality results. Furthermore,

the present study did replicate the Gardner et al. finding

that crossed hemispheric conditions were performed with less

accuracy than uncrossed conditions.
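The crossed-versus-uncrossed comparison amounts to classifying each trial by whether stimulus input and response output engage the same hemisphere, then comparing accuracies. A minimal sketch with hypothetical trial records (the field names are illustrative, not from the original protocol):

```python
def mean(xs):
    return sum(xs) / len(xs)

def crossed(trial):
    """A trial is 'crossed' when stimulus input and response output
    engage opposite hemispheres, i.e., different sides."""
    return trial["input_side"] != trial["response_side"]

# Hypothetical trial records: side of input, side of output, correct?
trials = [
    {"input_side": "L", "response_side": "L", "correct": 1},
    {"input_side": "L", "response_side": "R", "correct": 0},
    {"input_side": "R", "response_side": "R", "correct": 1},
    {"input_side": "R", "response_side": "L", "correct": 1},
]

uncrossed_acc = mean([t["correct"] for t in trials if not crossed(t)])
crossed_acc = mean([t["correct"] for t in trials if crossed(t)])
print(uncrossed_acc, crossed_acc)  # 1.0 0.5
```

With real data, a lower crossed mean would replicate the Gardner et al. pattern described above.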

The results of the line orientation task were most con-

sistent with the literature, suggesting an initial right

hemisphere strategy (Benton et al., 1973; Benton et al.,

1978; Phippard, 1979). With time, however, the added memory

component apparently facilitated the use of the left hemi-

sphere (verbal) strategy. Although this change in strategy

was not predicted, it was compatible with the results of Bul-

lard et al. (1982) and with the shape data from Experiment 1.

Thus, the present study demonstrated that initially, judgment

of line orientations by hearing handicapped children is prob-

ably performed best by the right hemisphere. Extended ex-

posure and familiarity with the task eventuates in the de-

velopment of a left hemisphere (i.e., verbal) strategy for

these children as it also does for normal hearing individ-

uals (Bullard et al., 1982).

The results from the visual half-field experiments with

the ASL and non-ASL handshapes were most compatible with the

research by Poizner and Lane (1979). Both the present study

and the work by Poizner and Lane demonstrated a clear RHA for

the non-ASL handshapes for both the deaf and hearing subjects.

Poizner and Lane, however, also demonstrated a RHA for all

subjects to the ASL handshapes whereas in the present study

no hemisphere advantage was observed for the ASL handshapes.

Several differences in the two studies might contribute to

these differential findings. First, Poizner and Lane uti-

lized unilateral input, while in this study bilateral (i.e.,

competing) stimuli were used which increased the complexity

of the task for the present subjects. Second, as discussed

previously, the response measure for the present research

may not have been sensitive enough to detect hemispheric differences. Poizner and Lane found that accuracy was not a sensitive measure of hemispheric differences. Instead, they relied on correct reaction time and combined this with a three-second limit on response

time. Had these, or similar measures, been employed in the

present study, a differential response by hemisphere might

have been observed.

While past research has suggested the possibility that

there may be a different organization of language for deaf in-

dividuals than for hearing, this was not observed in the pres-

ent study. It was hoped that the use of two different deaf

groups in comparison with a normal hearing control would demonstrate both the effect of the auditory system and the importance of early language training on the lateral organization of the brain; however, this was not demonstrated. While

it is still an intriguing idea, results from the present

study suggest several modifications that would be needed be-

fore any such differences might emerge. Included in these

changes would be the induction of a verbal strategy set in

all subjects on the shapes tasks by instructing the subjects

to verbally code and remember the shapes; a shortening of the

time to response or a measurement of time to both first and

second responses to limit or measure inter-hemispheric cross-

over; and the use of both a delay and no delay condition to

assess the effect of the memory component or processing

strategy. However, the present study also makes clear the

need for future research to take into account the importance

of the sex of the subject and the interaction of stimulus in-

put and response output in determining laterality, as fail-

ure to do so appears to lead to erroneous conclusions.


The major finding of the present research, and one that

was not predicted, was that differential effects for lateral-

ity exist dependent on sex. The predicted effect, differ-

ences in hemispheric processing of stimuli dependent on hear-

ing status and early language training, was not found. While

differential laterality effects for sex with different tasks

were known to exist in the literature (Harris et al., 1980),

it was thought that hearing status would overshadow the pos-

sible sex effects and therefore they were not predicted. In-

stead, what was observed was that sex effects interacting

with type of task, side of stimulus input, and side of response output overrode the consequences of being deaf. This

would suggest that in terms of the way the brain is organ-

ized for certain types of tasks, it appears that it matters

more whether you are male or female than whether you can hear

or not. In effect, the present study ended up investigating

differences between the male and female brains, rather than

between hearing and non-hearing individuals. Another major

finding of the present study reinforces the importance of considering both hemisphere input and hemisphere output when investigating differences in hemispheric organization of tasks.

Failure to do so (which has more often been the rule rather

than the exception with research in this area) could lead to

erroneous conclusions regarding superiority of hemispheric

output for tasks. Finally, this research supported the find-

ings of Bullard et al. (1982) who noted that laterality ef-

fects are dependent on both type of task and strategy em-

ployed in performing the task. One should not assume that

the presentation of certain stimuli alone ensures a particular

type of processing (i.e., verbal/nonverbal, analytic/visuo-

spatial). Rather, the particular strategy used in a given

task is a function of both the nature of the stimuli and the

demands of the task. In addition, within a given task, the

strategy employed may also vary as a function of exposure

to and practice with the stimuli presented.

Future research with these different language popula-

tions might yield clearer results if the dependent measure

used in the experiment is tightened. It was noted earlier

that a change in the method of measuring reaction time by

including a measure of the time to second stimulus chosen

might prove a more sensitive measure of inter-hemispheric

information retrieval. In addition, a similar effect might

be obtained for mean percentage correct by forcing the sub-

jects to make their two choices within a set limited amount

of time, thereby allowing less time for inter-hemispheric crossover.


To further highlight the differences in hemispheric

processing and the effect of processing strategy on lateral-

ity effects, a paradigm similar to the one used by Bullard

et al. (1982) might be used. In this research, the authors

not only compare side of stimulus input and side of response

output, but also manipulate and compare processing strategies

by investigating the effects of no delay versus a 10-second

delay between stimulus presentation and response.
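A trial list for such a paradigm, crossing each stimulus with a no-delay and a 10-second-delay condition, might be built as follows; this is a sketch with illustrative stimulus names and counts, not the Bullard et al. materials.

```python
import random

DELAYS = (0.0, 10.0)  # seconds: no-delay vs. delay condition

def build_trials(stimuli, reps_per_condition):
    """Cross every stimulus with both delay conditions, repeat, and
    shuffle, so processing strategy under immediate and delayed
    responding can be compared within the same subjects."""
    trials = [(stim, delay)
              for stim in stimuli
              for delay in DELAYS] * reps_per_condition
    random.shuffle(trials)
    return trials

trials = build_trials(["line 45", "line 90"], reps_per_condition=3)
print(len(trials))  # 12 trials: 2 stimuli x 2 delays x 3 repetitions
```

Contrasting accuracy (or laterality) between the two delay levels then isolates the contribution of the memory component.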

Finally, the inclusion of a verbal induction component

which would instruct subjects to utilize a verbal labeling

strategy in coding and remembering the shapes task might

serve to lessen the chance that either deaf or hearing subjects would use other than a verbal processing strategy, thereby making the shape task a clearer measure of language laterality.



Table 2. Summary Table for Hearing, Oral and Sign
Groups, Analysis of Variance for Mean Age.

Source df Mean Square F p

Sex 1 642,835 1.35 .25

Group 2 4,452 0.01 .99

SG 2 5,190 0.01 .99

Error 51 475,062

Table 3. Summary Table for Hearing, Oral and
Sign Groups, Analysis of Variance
for Mean Intelligence Quotient.

Source df Mean Square F p

Sex 1 21,807 0.21 .65

Group 2 174,607 1.66 .20

SG 2 96,419 0.92 .41

Table 4. Summary Table for Experiment 1: Shapes
Analysis of Variance for Mean Percentage Correct.

Source df Mean Square F

Feeling Hand
Response Hand
FTR 1 .099 1.99
FTRS 1 .011 0.22
FTRG 2 .054 1.09
FTRSG 2 .045 0.90
Error 51 .045

Table 5. Summary Table for Experiment 1: Shapes
Analysis of Variance for Mean Reaction Time.

Source df Mean Square F

Response Hand
FTR 1 .000 1.79
FTRS 1 .000 0.45
FTRG 2 .000 0.80
FTRSG 2 .000 1.43
Error 51 .000

Table 6. Summary Table for Experiment 2: Line
Orientations Analysis of Variance for
Mean Percentage Correct.

Source df Mean Square F

Feeling Hand
Response Hand
FTR 1 .917 22.96***
FTRS 1 .011 0.26
FTRG 2 .055 1.37
FTRSG 2 .001 0.04
Error 51 .040

Table 7. Summary Table for Experiment 2: Line
Orientations Analysis of Variance for
Mean Reaction Time.

Source df Mean Square F

Feeling Hand
Response Hand
FTR 1 .018 1.64
FTRS 1 .002 0.19
FTRG 2 .007 0.68
FTRSG 2 .017 1.58
Error 51 .011

Table 8. Summary Table for Experiment 3: ASL
Handshapes Analysis of Variance for
Mean Percentage Correct.

Source df Mean Square F

Sex 1 .012 0.24
Group 2 2.449 49.03***
SG 2 .086 1.71
Error 51 .050

Visual Half-Field 1 .013 0.18
VS 1 .001 0.01
VG 2 .007 0.10
VSG 2 .007 0.10
Error 51 .073

Response Hand 1 .020 1.06
RS 1 .004 0.23
RG 2 .038 1.99
RSG 2 .044 2.30
Error 51 .019

VR 1 .111 3.45
VRS 1 .190 5.88*
VRG 2 .044 1.37
VRSG 2 .021 1.65
Error 51 .032

Table 9. Summary Table for Experiment 3: ASL
Handshapes Analysis of Variance for
Mean Reaction Time.

Source df Mean Square F

Sex 1 5.783 1.97
Group 2 1.284 0.44
SG 2 3.479 1.19
Error 51 2.932

Visual Half-Field 1 .070 1.27
VS 1 .057 1.04
VG 2 .067 1.21
VSG 2 .054 0.99
Error 51 .055

Response Hand 1 4.595 3.43
RS 1 1.148 0.86
RG 2 .397 0.30
RSG 2 .411 0.31
Error 51 1.340

VR 1 .059 1.07
VRS 1 .059 1.07
VRG 2 .062 1.13
VRSG 2 .062 1.13
Error 51 .055


Table 10. Summary Table for Experiment 4: Non-ASL
Handshapes Analysis of Variance for Mean
Percentage Correct.

Source df Mean Square F

Sex 1 .064 0.68
Group 2 .154 1.64
SG 2 .118 1.24
Error 51 .094

Visual Half-Field 1 .136 3.50
VS 1 .120 3.10
VG 2 .056 1.44
VSG 2 .014 0.37
Error 51 .039

Response Hand 1 .675 20.24***
RS 1 .002 0.06
RG 2 .023 0.68
RSG 2 .008 0.25
Error 51 .033

VR 1 .153 3.91*
VRS 1 .337 8.61*
VRG 2 .009 0.22
VRSG 2 .023 0.60
Error 51 .039


Table 11. Summary Table for Experiment 4: Non-ASL
Handshapes Analysis of Variance for Mean
Reaction Time.

Source df Mean Square F

Sex 1 2.598 0.61
Group 2 4.281 1.00
SG 2 5.864 1.37
Error 51 4.267

Visual Half-Field 1 1.179 1.07
VS 1 1.179 1.07
VG 2 1.153 1.04
VSG 2 1.153 1.04
Error 51 1.104

Response Hand 1 .988 0.56
RS 1 .012 0.01
RG 2 1.147 0.66
RSG 2 4.487 2.56
Error 51 1.751

VR 1 1.312 1.20
VRS 1 1.312 1.20
VRG 2 1.285 1.18
VRSG 2 1.285 1.18
Error 51 1.089



Table 12. Individual Subjects' Rank-Ordered Performance Within Conditions by Task.

Shapes Lines ASL non-ASL
No. Sex3 Group4 LL5 X RR LL X RR LL X RR LL X RR

1 = highest percentage correct within a task
2 = 2nd highest percentage correct within a task
3 = 3rd highest percentage correct within a task

Tied ranks are equally good performance by condition.

M = males; F = females

0 = Oral; S = Sign; H = Hearing

LL = input left field-output left hand
X = combination of crossed hemispheric conditions (i.e.,
input left-output right and input right-output left).
RR = input right field-output right hand

Table 12. Continued.

[Per-subject rank orderings (1 = best) within the LL, X, and RR conditions for the Shapes, Lines, ASL, and non-ASL tasks.]


Benton, A.L., Levin, H.S., & Varney, N.R. Tactile perception of direction in normal subjects. Neurology, 1973, 23, 1248-1250.

Benton, A.L., Varney, N.R., & Hamsher, K.D. Lateral differ-
ences in tactile directional perception. Neuropsy-
chologia, 1978, 16, 109-114.

Bever, T.G. Cerebral asymmetries in humans are due to the differentiation of two incompatible processes: Holistic and analytic. Annals of the New York Academy of Sciences, 1975, 263, 251-262.

Bever, T.G., & Chiarello, R. Cerebral dominance in musicians
and non-musicians. Science, 1974, 185, 537-539.

Bever, T.G., Hurtig, R.R., & Handel, A.B. Analytic proces-
sing elicits right ear superiority in monaurally pre-
sented speech. Neuropsychologia, 1976, 14, 175-181.

Bradshaw, J.L. Right hemisphere language: Familial and
nonfamilial sinistrals, cognitive deficits and writing
hand position in sinistrals, and concrete-abstract,
imageable-nonimageable dimensions in word recognition.
A review of interrelated issues. Brain and Language,
1980, 10, 172-188.

Bullard, P.C., Satz, P., Harris, H.E., & Freund, A. Gender
differences in hemispheric organization: Fact or fan-
tasy? In preparation, 1982.

Cranney, J., & Ashton, R. Witelson's dichhaptic task as a measure of hemispheric asymmetry in deaf and hearing populations. Neuropsychologia, 1980, 18, 95-98.

Dixon, W.J. (Ed.). BMDP statistical software. Berkeley:
University of California Press, 1981.

Dodds, A.G. Hemispheric differences in tactual-spatial
processing. Neuropsychologia, 1978, 16, 247-254.

Gardner, E.B., English, A.G., Flannery, B.M., Hartnett,
M.B., McCormick, J.K., & Wihelmey, B.B. Shape-recogni-
tion accuracy and response latency in a bilateral tac-
tile task. Neuropsychologia, 1977, 15, 607-616.

Gibson, C., & Bryden, M.P. Cerebral lateralization in deaf
children using a dichhaptic task. Paper presented at
the International Neuropsychological Society in Pitts-
burgh, February, 1982.

Harris, A.J. Harris tests of lateral dominance (3rd Ed.).
New York: Psychological Corp., 1958.

Harris, H.E., Bullard, P.C., Satz, P., Freund, A., Hutchin-
son, S., & Berg, S. Gender differences in hemispheric
organization: Fact or fantasy? Paper presented at the
International Neuropsychological Society. San Fran-
cisco, January, 1980.

Helwig, J.T., & Council, K.A. (Eds.). Statistical Analysis
System User's Guide (1979 ed.). Raleigh, N.C.: Sta-
tistical Analysis Systems Institute, Inc., 1979.

Kimura, D. Dual functional asymmetry of the brain in visual
perception. Neuropsychologia, 1966, 4, 275-285.

Krashen, S.D. The major hemisphere. UCLA Educator, 1975,
17, 12-23.

La Breche, T.M., Manning, A.A., Goble, W., & Markham, R.
Hemispheric specialization for linguistic and nonlin-
guistic tactual perception in a congenitally deaf pop-
ulation. Cortex, 1977, 13, 184-194.

Lenneberg, E.H. Biological foundations of language. New
York: Wiley, 1967.

Levy, J. Possible basis for the evolution of lateral spe-
cialization of the human brain. Nature (London) 1969,
224, 614-615.

Levy, J. Psychological implications of bilateral asymmetry.
In S.J. Dimond, & T.G. Beaumont (Eds.), Hemispheric
function and the brain. London: Elek, 1974.

Manning, A.A., Goble, W., Markham, R., & La Breche, T. Lat-
eral cerebral differences in the deaf in response to
linguistic and nonlinguistic stimuli. Brain and Lan-
guage, 1977, 4, 309-321.

McKeever, W.F., Hoemann, H.W., Florian, V.A., & Van Deventer,
A.D. Evidence of minimal cerebral asymmetries for the
processing of English words and American sign language
in the congenitally deaf. Neuropsychologia, 1976, 14,

Nebes, R.D. Hemispheric specialization in commissurotomized
man. Psychological Bulletin, 1974, 81, 1-14.

Neville, H.J., & Bellugi, U. Patterns of cerebral special-
ization in congenitally deaf adults. In P. Siple (Ed.),
Understanding language through sign language research.
New York: Academic Press, 1978.

Oscar-Berman, M., Rehbein, L., Porfert, A., & Goodglass, H.
Dichaptic hand-order effects with verbal and nonverbal
tactile stimulation. Brain and Language, 1978, 6,

Phippard, D. Hemifield differences in visual perception in
deaf and hearing subjects. Neuropsychologia, 1977,
15, 555-561.

Poizner, H., & Lane, H. Cerebral asymmetry in the percep-
tion of American sign language. Brain and Language,
1979, 7, 210-226.

Ross, P., Pergament, L., & Anisfeld, M. Cerebral lateraliza-
tion of deaf and hearing individuals for linguistic com-
parison judgements. Brain and Language, 1979, 8, 69-80.

Sasanuma, S., Itoh, M., Mori, K., & Kobayashi, Y. Tachisto-
scopic recognition of Kana and Kanji words. Neuropsy-
chologia, 1977, 15, 547-553.

Satz, P., Achenbach, K., Pattishall, E., & Fennell, E.
Order of report, ear asymmetry and handedness in di-
chotic listening. Cortex, 1965, 1, 377-396.

Scholes, R.J., & Fischler, I. Hemispheric function and lin-
guistic skills in the deaf. Brain and Language, 1979,
7, 336-350.

Witelson, S. Hemispheric specialization for linguistic and
nonlinguistic tactual perception using a dichotomous
stimulation technique. Cortex, 1974, 10, 3-17.

Woodward, A. We can't ignore the signs any longer. Paper
presented at the Verbal Ape Symposium. Ohio State
University, Columbus, March, 1977.