Sex Bias in Interpreting Emotional States from Visual Cues

Permanent Link: http://ufdc.ufl.edu/UFE0021045/00001

Material Information

Title: Sex Bias in Interpreting Emotional States from Visual Cues
Physical Description: 1 online resource (82 p.)
Language: English
Creator: Stanley, Kevin E
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2007

Subjects

Subjects / Keywords: differences, emotion, expression, facial, sex
Psychology -- Dissertations, Academic -- UF
Genre: Counseling Psychology thesis, Ph.D.
bibliography   ( marcgt )
theses   ( marcgt )
government publication (state, provincial, territorial, dependent)   ( marcgt )
born-digital   ( sobekcm )
Electronic Thesis or Dissertation

Notes

Abstract: A disconnect exists between perceptions in the general public about sex differences in emotion and the findings in the scientific literature. Whereas popular books and portrayals in various media depict the sexes as inhabiting wholly different emotional realms, the scientific evidence reveals sex differences in emotion to be small and often situational. The source of this divergence between popular and scientific opinion is puzzling. The present study examined one possible element in the creation and maintenance of widely held beliefs about large sex differences in emotion: the decoding of facial and postural affect was examined for evidence of a bias based on the sex of the encoder. Apparently male, apparently female, and androgynous stimulus figures were created using computer graphics software. Stimulus figures were manipulated to display unambiguous expressions of stereotypically masculine emotions (pride and anger) and stereotypically feminine emotions (happiness and fear), as well as ambiguous, blended expressions (pride-happiness and anger-fear). Participants rated each expression for the degree to which anger, pride, fear, and happiness were perceived to be represented. It was hypothesized that unambiguous expressions would be interpreted similarly regardless of the apparent sex of the stimulus figure, or encoder. It was further hypothesized that ambiguous expressions portrayed by apparent females would be interpreted as more consistent with stereotypically feminine emotions, and ambiguous expressions portrayed by apparent males as more consistent with stereotypically masculine emotions. Finally, it was hypothesized that when the apparent sex of the encoder was ambiguous, sex-bias effects would be attenuated; that is, androgynous figures would be rated lower on stereotypically masculine emotions than would apparent males, and lower on stereotypically feminine emotions than would apparent females. Statistical analysis yielded partial support for the hypotheses. Unambiguous expressions were rated significantly higher on the intended emotion than on the same-valence alternative in 11 of 12 cases. Contrary to the hypothesis, both apparent males and apparent females displaying ambiguous expressions were rated higher on stereotypically feminine emotions than on stereotypically masculine emotions. However, follow-up analysis revealed that apparent males displaying ambiguous expressions were rated higher on stereotypically masculine emotions than were apparent females displaying ambiguous expressions, and apparent females displaying ambiguous expressions were rated higher on stereotypically feminine emotions than were apparent males displaying ambiguous expressions. Androgynous figures were rated lower on stereotypically masculine emotions than were apparent males, but they were not rated significantly lower on stereotypically feminine emotions than were apparent females. Manipulation-check data indicate that, in the majority of cases, participants did not interpret androgynous figures as female. Implications of the findings and directions for future research are discussed.
General Note: In the series University of Florida Digital Collections.
General Note: Includes vita.
Bibliography: Includes bibliographical references.
Source of Description: Description based on online resource; title from PDF title page.
Source of Description: This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Statement of Responsibility: by Kevin E Stanley.
Thesis: Thesis (Ph.D.)--University of Florida, 2007.
Local: Adviser: Heesacker, Martin.

Record Information

Source Institution: UFRGP
Rights Management: Applicable rights reserved.
Classification: lcc - LD1780 2007
System ID: UFE0021045:00001

d1cf757962a038bfb2b8caa151dd8534
26d4077cccf2bbc48a1d67ab24cbe27dd7234d29
2225 F20101113_AABGTR stanley_k_Page_70.txt
90e8140c6705f338042b77ea9177ea4c
b3d990befc30ea58a337204050e38a274a0c6e3b
52059 F20101113_AABHFM stanley_k_Page_59.pro
d4356c96fd303df19c0d677a57f10956
d84a793977f4dd0c3f10a15dc1702c0766547009
76464 F20101113_AABGYP stanley_k_Page_64.jpg
e71428988693987fbec83e1db40da4cb
3b20885bc563428984740f013dd25e2003f48b08
113752 F20101113_AABHAP stanley_k_Page_54.jp2
5829b400401322650e0f69f42a23b831
7f77de6ba40e2578622dc870f3e80f3b67783ecb
73328 F20101113_AABGTS stanley_k_Page_46.jpg
dbab2af2639c8e3dbbd98bfbecf12212
0158e84d7f95580fe2bf95e69ec5a5111f87cded
23759 F20101113_AABHKK stanley_k_Page_29.QC.jpg
d2be497689579942441c627ecf9253a4
a12c8df99f1b7824df56b81420a7e14b05f07067
34050 F20101113_AABHFN stanley_k_Page_60.pro
6ec5055da98d8009b900b073fac9e8c3
881b86c71fd6b831027069fdbf80e9d57d6163d7
74926 F20101113_AABGYQ stanley_k_Page_65.jpg
518ef45f490adde893b162325929363a
58d1ae2b9785eed1b2dbdf31f96c69fc533b9185
122302 F20101113_AABHAQ stanley_k_Page_55.jp2
4cecc5407d2f9f8864d4d3b8bd11bd95
7e3dadd2dfdd3ea48d12789881cb9f4237fa77af
F20101113_AABGTT stanley_k_Page_71.tif
092fe1e99fdb49a44075c0af60f4a582
3e548aa70fe66ae8a2bad4a916d91db864f82250
6839 F20101113_AABHKL stanley_k_Page_29thm.jpg
3e6c6f2c4470d61ba26aa7fd8d51fb30
9dd254161ab271498749044c0989382f8fb62e33
53189 F20101113_AABHFO stanley_k_Page_62.pro
23d111906c236c3140bf7246adfc91d2
41d82e89402670afe42657748f37afc7c86cd14f
79866 F20101113_AABGYR stanley_k_Page_66.jpg
861fca487da12d1a99d1c44209f6053c
963581fb013709ebe5026942e806f5a58d1d4fb6
121877 F20101113_AABHAR stanley_k_Page_56.jp2
43ae207803b3cc7294706dfb964e3064
8ea139fa14fe06eea5099ab1753198fdfd7ab2e8
24189 F20101113_AABHKM stanley_k_Page_30.QC.jpg
d1b0066706e336d1276ddfb78bfa9259
33ea433ddd6cb75e48864eb3f4f2eb610f700be8
54883 F20101113_AABHFP stanley_k_Page_63.pro
330acc712c998b4fad7d9a3cce7b0d02
70debb58e92e8999641636b52e7cf0f8af983382
78217 F20101113_AABGYS stanley_k_Page_67.jpg
e18849aa86717993a3bb3107e3d39774
d1dba8907c1b1c63d6169af881d9352c2e5c86ca
121957 F20101113_AABHAS stanley_k_Page_57.jp2
6dbd663b9d24a40bd230ed340d62b706
1e00766b064f0ef64c2724edac7ebfeefa90b685
37338 F20101113_AABGTU stanley_k_Page_04.jpg
2d2c55df84ce2dac769691b0cf80f513
123704095e0e8efea16631209966bf77c243add7
6550 F20101113_AABHKN stanley_k_Page_30thm.jpg
2dd98dfd3d796eb751884c2f3584916a
ff82996140a503bdad22d2032656c8b4496e1d89
53712 F20101113_AABHFQ stanley_k_Page_65.pro
e7eff6db0b295522fa050f4ffb5be4ec
eddbb99c5e4c5d25dcb1a84df512df17f84b74c0
15287 F20101113_AABGYT stanley_k_Page_68.jpg
e31a7030a32cf5f24dfd6b54af2882f1
ab353c645a3a95ffeac2146bf56684d25d93fd1b
125607 F20101113_AABHAT stanley_k_Page_58.jp2
ba2a2dd0ee1cdeae1dfbfea163884e78
44655a36b9d88d9f4c7fee93e72275419fb34949
18049 F20101113_AABGTV stanley_k_Page_05.QC.jpg
35799ccb70f740f6e8e07fd3059ea498
df6c04f340d637938efb04554f77e22915489742
23438 F20101113_AABHKO stanley_k_Page_31.QC.jpg
5aba91f39671cbab1dfb845ca4db3436
c38981ca185e42c894fe5ba17b7d58d6d3b4b5f7
39968 F20101113_AABGYU stanley_k_Page_69.jpg
03b91a7ae8dd50af2447d10ab78670e0
930039f2257a4da00677f691dfd5545308661b60
114796 F20101113_AABHAU stanley_k_Page_59.jp2
fa4b81740d5a2161a1f834298e0ea655
f72ab6d4af7847a63f499ddf4100305e29e8c1eb
6956 F20101113_AABGTW stanley_k_Page_32thm.jpg
d4de25d9572a5e4b098cf26c5e4f8262
b94000e5af08c46159e1b846263966a8f02a4018
56790 F20101113_AABHFR stanley_k_Page_66.pro
f5848fb682186ce70b68e0eb37e6f391
a01ee2bb3900b83e6279432f2d28c8666b0f95c6
25414 F20101113_AABHKP stanley_k_Page_33.QC.jpg
e3d6de87ca47a5b4970cb99d69b94d7c
1b93f0a7677a343a534ed441c6cd629d99653a14
59688 F20101113_AABGYV stanley_k_Page_70.jpg
244523a49f72ed3236b60d1c1bf97cd2
ec64d85be3b4ec7fd5db2c5d9187d639c3043168
78010 F20101113_AABHAV stanley_k_Page_60.jp2
b8297df79ec21d3e04e8b153723caf41
09377136ea0b88b77da1c93987b2d4a05e93cd93
81810 F20101113_AABGTX stanley_k_Page_58.jpg
cd5e50bba88aaec7b26aff48a0f362e1
5db513222e7cc451ae615dedc6674180a4015004
55116 F20101113_AABHFS stanley_k_Page_67.pro
16f84049d739e41fc548118c1cd91f1a
9dd70cf23d6f615eb6d2843a201f934204e17750
F20101113_AABHKQ stanley_k_Page_33thm.jpg
b1f1039616a6991a8eb8561cd6621f10
f74e720d76cd71e4812b994ea4f14defb68e068a
36037 F20101113_AABGYW stanley_k_Page_73.jpg
43a66388661c212a3eef56e44d1b551c
48edf699da3e21ea4f5033e6f7912768bf00b5cf
113838 F20101113_AABHAW stanley_k_Page_61.jp2
858ad9f8342e26b5036c8cfd8917d6a6
9e6f7ad77b01ce703d7bc1949c99942ce9f40b7d
F20101113_AABGRA stanley_k_Page_30.tif
00e26d0f1da9e163663a16f2bdbddf98
3201003a4977967d6586f18c310d2f10a8dad177
25338 F20101113_AABGTY stanley_k_Page_32.QC.jpg
316c0ece40e1872984b78558523fe135
c608e6e43fdd8d74e46520d1438cffcdb1c58071
5487 F20101113_AABHFT stanley_k_Page_68.pro
d42ca9fdc238422eb4cbb4d67815cfd9
86615016e76a21a9c13b9830502355424d673fdd
23237 F20101113_AABHKR stanley_k_Page_34.QC.jpg
c6f79696a577c006d1723265adfc4bd2
34811c83326a142f68f6e832f8b37792efddad47
22516 F20101113_AABGYX stanley_k_Page_74.jpg
d206c71e502ec52e8c30a5be1e3a4b77
327a6715ef0fe526717404b59668f4c395ac5e1c
114297 F20101113_AABHAX stanley_k_Page_62.jp2
49325563ca6751d697059976b19d79db
6e52fdb7e34dcc2c6156f6da3d60d8f0abd3de03
26037 F20101113_AABGRB stanley_k_Page_55.QC.jpg
c0ae7cbd2d0846e231df8ad5b65e5093
2eb9cb0301f01c01a181e9735df3e1ecbd8967be
25075 F20101113_AABGTZ stanley_k_Page_51.QC.jpg
f0324edfcedca1fd652d540f0628f8d8
8991305e483269e226de7bdc30fc80efaec67487
4783 F20101113_AABHFU stanley_k_Page_69.pro
4104970b13a150e53c6518a05d311194
a5086672a289e7768c7a719e188cc82bdf91abf2
6552 F20101113_AABHKS stanley_k_Page_34thm.jpg
851fc7e91c75940e8b79a18c5ff6ddad
2298c6891172a7835b575e00c68a7d3b76e2e34a
46683 F20101113_AABGYY stanley_k_Page_75.jpg
cad4b32d3dd25c7aa860eee2654b67a8
b964c3eca1525d14d5543c5a532679725dbeafb9
118420 F20101113_AABHAY stanley_k_Page_63.jp2
33d8c20a11a291ab9a183ffd7fe5ba5a
865a12c0e8ebb3828337dd57140bd43b920fdd82
5614 F20101113_AABGRC stanley_k_Page_70thm.jpg
121d71e36809e45a3b8783657a8e592e
4b5b032dcccc616eea9071ba702f11bd30538147
52613 F20101113_AABHFV stanley_k_Page_70.pro
3e53057f2291a4c79ba26ec9e36717e3
8cd54f47d5f35a7234a9308079fdff9b6e7c0f7d
24646 F20101113_AABHKT stanley_k_Page_35.QC.jpg
274a3d41538b8c94b7ebfab0c2aedf64
4257a4aa67435376dcec494d3d201a6f0823fbd2
116246 F20101113_AABHAZ stanley_k_Page_64.jp2
f1c95373f9d2c1f3228fabc08f24ccec
a731284397068758b567f721911d1f12b20c3bcd
680 F20101113_AABGRD stanley_k_Page_72.txt
2f22c0949fd5f060280f40a5cb76b407
adc1dcb19664aaa652abe907d3edc4acb05cc084
17231 F20101113_AABHFW stanley_k_Page_71.pro
7e9f6da96ef6211f2fd8f24d855426d4
342a8bd8e5bb0d3c72114c6902b5ab8f48e31f78
27953 F20101113_AABGWA stanley_k_Page_71.jpg
9185e600eb1494defe9a33db23b82c38
ca99e930afb78de6b2172010806b380128f08c9f
6480 F20101113_AABHKU stanley_k_Page_35thm.jpg
dc58e4f16c2312f1c6cbe6fae103b2a9
ec6e72aeba5140ee5c4fdd0d12a428f10d937a3b
82104 F20101113_AABGYZ stanley_k_Page_76.jpg
8fd33b5298bf849c0bed5a7b4e02fc9a
dd682e3e01f14b3347744db71501cfe0f92292c7
23112 F20101113_AABGRE stanley_k_Page_07.QC.jpg
d1e058f6146b3d05d34aa2fb89362efe
32ae7adb385881ba885faef05fbda5bf7f148dd3
12199 F20101113_AABHFX stanley_k_Page_72.pro
4ec8e37ef5724240eaf05f5eb107c395
75b917d7c0aa6fe13ed4c8e847f86f2bf87b019a
54038 F20101113_AABGWB stanley_k_Page_21.pro
c583ed1db4530d53eb168b35325f54fa
49685866c1d140a27eb7fc8f0e8410a40f0731e9
24667 F20101113_AABHKV stanley_k_Page_36.QC.jpg
e59154730f0d1febe6fccdfbf4c85180
dcaed6b93ab1e9429f529e5e385294386fe6f468
26431 F20101113_AABGRF stanley_k_Page_66.QC.jpg
889b23c00c7c97dd248dc5540ddbee1f
1cf7a2cfe75f9d10244136af72d4f3f72aa78198
F20101113_AABHDA stanley_k_Page_47.tif
adae85754df0a034129fec572f458252
526e82ce438bd9652461744e7b13751de4f0b9e2
14312 F20101113_AABHFY stanley_k_Page_73.pro
8de763d24a8c3989c28685ede4a0490e
82fa71854bb652da2053ca66e2ec9ebc48dc19ae
51114 F20101113_AABGWC stanley_k_Page_10.pro
ed6d58bd2bc8d357badb34fc18b6a41e
da34dbb8241ee44d79e2073a2b264ce3c8a4e2c4
25136 F20101113_AABHKW stanley_k_Page_37.QC.jpg
c274dc96fdc6c66e22dcfcc82153e596
82476ff1347c80b759d9796536056fd7866625a8
F20101113_AABGRG stanley_k_Page_02.tif
7a816b12c9bbdc2faa25541448ba4555
930a9af1f9158bf4caff1278f87bdde6b2e3dcde
F20101113_AABHDB stanley_k_Page_48.tif
305789cc111dd83fe58b236a8b32daeb
9f23408cdfb72d17f76f715ebe10d103b055b93b
5954 F20101113_AABHFZ stanley_k_Page_74.pro
095cf06e6d6872e829cd986afc712fc6
229d4e3381fba36b328fba9064f40e6226531958
119492 F20101113_AABGWD stanley_k_Page_37.jp2
5919251f98a8b42155f9b60dccbef049
7904743cdf4786513130d56abd095c21f6f9adec
6777 F20101113_AABHKX stanley_k_Page_37thm.jpg
b34b1971b931f682f84cce21f6d6b9fa
feb94b07c4906a6a869039694b2fe30ee2e6f3e9
F20101113_AABGRH stanley_k_Page_58.tif
9fd3392c09e35b026c1792594a4287b2
2370aca677fd635907419560cfc650759cf0ce42
F20101113_AABHDC stanley_k_Page_50.tif
f11fb59268731c1ddb815db8fb99a50d
00be407d2133cfb80d021a06752da3ccb3ad9492
52500 F20101113_AABGWE stanley_k_Page_30.pro
9ce186355f46d93c3bbb9e6a83b09480
34fa301addcb897b89422316bf592dca54492fc2
24351 F20101113_AABHKY stanley_k_Page_38.QC.jpg
bd0be3e5b870177b864f9f3519cf3d64
2150ef14f7858dd54855700f3c58958e1b34ec7d
2232 F20101113_AABHIA stanley_k_Page_57.txt
7a9945f2b26b980c3de95116ce15c301
4ffeac9748afe3e36846b14e3fbc0e6390318bcf
2077 F20101113_AABGRI stanley_k_Page_45.txt
fb4496bbd3062ff5a2ff257ec6d87374
942cb347754030383b7ba62a059d568fbb3b1132
F20101113_AABGWF stanley_k_Page_65.tif
116ae9be4b231ca22003302caf1a0a13
d698917453bc85dabcff9143a84184d547eee504
24831 F20101113_AABHKZ stanley_k_Page_39.QC.jpg
ff7390f11391832c317998de2752538f
9cb103b1d561c31da7f948c3849bb6c3af3163e5
2132 F20101113_AABHIB stanley_k_Page_59.txt
da5447b1d855649fc876c4f193337015
1deca85d170cd7579f81eef889f75810bb3a7db4
78053 F20101113_AABGRJ stanley_k_Page_50.jpg
5043d159cbbf97393abfcda3b5e944e3
5b85176e0a24392ba16f2b836f209d3d25f69261
F20101113_AABHDD stanley_k_Page_53.tif
81e162b416735dec89fad0ccdcc4a567
3e2e9790b13f698ade3259b9a983e7641413c37a
6829 F20101113_AABGWG stanley_k_Page_36thm.jpg
d40113bfb0c951a2974f39d400f2375e
3817d4d213d0118009187eabc6a0eb610ff25ba4
2207 F20101113_AABHIC stanley_k_Page_61.txt
686c7846654ecef540fd1e64a73cc31f
f8bff438e4267f2dc36891b5f2561bb29da58c7d
121603 F20101113_AABGRK stanley_k_Page_43.jp2
3b114e29fdf64cac0f9973b42f81bf89
e3cc665b000e63b9b8c107df0b4b11b1771aeb4a
F20101113_AABHDE stanley_k_Page_54.tif
e0f7303139a55b9c2cf0f5a258f2efad
086d70b5a004415e69d5b7ce4175bcab5b0bdd63
6978 F20101113_AABGWH stanley_k_Page_78thm.jpg
4f7a58f901abbad49cc01d78fe68525f
ff1ee47f09b9a6adace5b0027026522cddc795c2
7130 F20101113_AABHNA stanley_k_Page_79thm.jpg
f8c4fb729bf4207bcaa8c06b528e5325
fe7bb913f5e87c2d1638cba1ebb1b608431582a2
2096 F20101113_AABHID stanley_k_Page_62.txt
1a1ae1024a39697b9b898f3b5138191d
c85928a7ab2bdc6747e7b7ed44333c41b3fd5bab
114381 F20101113_AABGRL stanley_k_Page_28.jp2
a2ccece199579a405e533a3da11cb180
0f969aa4f0543561fc6284fb4d731da8cbcd274a
F20101113_AABHDF stanley_k_Page_57.tif
5e04a36b1073403457db60f82fb9496c
53579ce8142c1ebb98107b843c7b9c3582139788
2308 F20101113_AABGWI stanley_k_Page_58.txt
fb585cc0e3145f441e3e59f2c805b5b8
1137406eaefa0e4627c196ecb513ea2866aa749f
27513 F20101113_AABHNB stanley_k_Page_80.QC.jpg
4d13358184d155373950ab81b7ca2c1d
b6f015462236e82843f17f8a7b0e340aee3b1593
2174 F20101113_AABHIE stanley_k_Page_63.txt
960af8522a783ef257f230f23fe0fbaa
530c826927d4e6432b89e9d9b86c1883c9658970
F20101113_AABGRM stanley_k_Page_27.tif
7682c630a066e00ee4837d891e46b317
fb4b5e9762fc54e852f525ce033dcef511cc2b52
F20101113_AABHDG stanley_k_Page_60.tif
77c6ced40a2a503611c91806561221d7
d76eaa6ffa026ed96c7b5e593efaa409e1bc78e2
6847 F20101113_AABGWJ stanley_k_Page_64thm.jpg
1d73a391abd57cd8e6af38e255be2514
00b061669f8afdc54fe4b6d55313a72ba2d9ef98
7298 F20101113_AABHNC stanley_k_Page_80thm.jpg
9d0bb3d3f2bd78563f20cfa67ac1d18c
4cab8e465e5982f7d01cff5fd109af3c6755fd41
F20101113_AABHIF stanley_k_Page_64.txt
e8c97a063f7af8ae49151b2a15ec5298
f986f0f6dfd85307487a45088db2a2de7a0ed502
123786 F20101113_AABGRN stanley_k_Page_41.jp2
2019774d278cdfc7b2696c3d2e1fb0f1
1d4f94d36b15c5421566e0c84713f7eec5f3c247
F20101113_AABHDH stanley_k_Page_64.tif
e1b307eaf2cb3c7aff8488db51a94fd2
4b3ef0e35add9cb901340f17ecfe3e83d235a81e
5038 F20101113_AABGWK stanley_k_Page_02.jp2
9c1edb984e06dce28361c5e7b0f1d8b0
0935de08a694b2b42366c7359ddbb4ae7b4e4e0c
4889 F20101113_AABHND stanley_k_Page_81.QC.jpg
9184b9aa325ef4549359cec2714d2245
9771c055289d2494876ecbb63ea3b520ebc73d2d
2159 F20101113_AABHIG stanley_k_Page_65.txt
65875988d20ce73fb35c76a9cb0fee27
75ae1752fdfaf0513dd3ff55007a87c09ee72a00
215 F20101113_AABGRO stanley_k_Page_81.txt
6fb3978cf2222ce42b3b6159c1306101
ec0735359f9cc67ad28a2a7c26ca2b1898053311
F20101113_AABHDI stanley_k_Page_66.tif
83bbb75594f5b26e48ff719f0e9de8b2
e6870c2acbf836594712968890a545b495e58f28
66657 F20101113_AABGWL stanley_k_Page_19.jpg
f9a88c47cdb061f5247f08a5ed8c1254
e755e0ba9b6eadaf6c578333fe7ddc8ad5bcaacc
9149 F20101113_AABHNE stanley_k_Page_82.QC.jpg
3ec95513b9dca53827890a105b33182d
4ccf903670950b1053c1fb9b38eefb76abf0dcf4
2260 F20101113_AABHIH stanley_k_Page_66.txt
8c5aca59cc91a7fa7c1fb18e57cacfb0
5e234c755e715b04d4e523e80f040da1698217b6
6262 F20101113_AABGRP stanley_k_Page_49thm.jpg
32748c413dea980203d4815d6a4aae74
e228e25ebdf99fb1d365ed8ded69bca0c1157cdd
F20101113_AABHDJ stanley_k_Page_67.tif
5b535482fa6c2fdb5906d60fe5952e7c
402dfb28b87baa95daf18770fe3fb3fdfec287e7
55900 F20101113_AABGWM stanley_k_Page_43.pro
a059925d3fadf9d4cbad871e5dce6ded
5965ff55de317a18cae92bf94f564c1fe1449a38
96386 F20101113_AABHNF UFE0021045_00001.mets FULL
757e5a47f88a92f594cc21afec524ebb
fcd8f502550fbae1a2168fc965bbd05c79f43fe7
18815 F20101113_AABGRQ stanley_k_Page_70.QC.jpg
bcb44cfb8bfc7582e4112087235e8548
342edf3c1729c41ce190f3e2af9c70ce48e95617
F20101113_AABHDK stanley_k_Page_68.tif
d111016790293f41512a042fd33654d2
75cbc10aa290e42413f204f359817e4c33503555
F20101113_AABGWN stanley_k_Page_52.tif
096c78f37831ed0e78ff8794999f007b
f0437d150c1718dffe92003cb5330f94ffe74654
2217 F20101113_AABHII stanley_k_Page_67.txt
4a5df38aad81fb75f39fb733a5566a42
0a47ac6a1ebad0b5cca54b43461d8c576734499b
25715 F20101113_AABGRR stanley_k_Page_75.pro
264fb70657e581bec95bce7ff9cd4bad
78e2ed6329ccbf667b35e3e106884de0a2fb45fa
F20101113_AABHDL stanley_k_Page_69.tif
907174c0755c6b37cc4f743251505c93
8f4ef5f8e80706fdae4997090d5c986528b73b0c
113310 F20101113_AABGWO stanley_k_Page_44.jp2
b8c537ff4914b8b3a1455ebe60670447
f17057ea1b0a37649d7d18b48c23d6e24add1e13
224 F20101113_AABHIJ stanley_k_Page_68.txt
897b866efbd13827956bf2b298786642
f967b5559099fbe982ad55add41643b4607a04a2
F20101113_AABHDM stanley_k_Page_72.tif
014e6d4c91228b475ffd204abfad6749
3fd1f39d0a3ed6ff1a1f09810f7a5852731b84da
F20101113_AABGWP stanley_k_Page_55.tif
3e0658499a796c84ac94f893d31dd3cb
d911b63aa469055c197a0fdc8ec4245036b44761
691 F20101113_AABHIK stanley_k_Page_71.txt
0f166357d8daba27598d8fd7fb2eb012
37c6b95f2ad5616acbade798d7bd7a0154afe11e
118697 F20101113_AABGRS stanley_k_Page_14.jp2
1f49eba67902db831eafd5e71523a997
bd82e97ef3b47324f53ce657705bde002df75a7d
F20101113_AABHDN stanley_k_Page_73.tif
3e306b54433a716d2d13509e5f1b987d
43884648ec7d20a9f3a597b6650c395076576dc5
2520 F20101113_AABGWQ stanley_k_Page_76.txt
e5996f031646266d540765d6d2e77994
e8282bee32f3b89251c1d467095edb8862c782da
277 F20101113_AABHIL stanley_k_Page_74.txt
19562fadc28d418d860183a705eb1c95
a7d12df90d94710ad2b9244e6928b116f20bbe72
F20101113_AABGRT stanley_k_Page_59.tif
004daaa4af9bfeaff29023d5ed03de03
8913fab4f30b0c3f27dd3c35aea3445f99fe23b3
F20101113_AABHDO stanley_k_Page_74.tif
71e52ebf911d0af03ec07329d8aebf1a
e11750d96a192596da960b6265687afc660bd8e5
124536 F20101113_AABGWR UFE0021045_00001.xml
374a92cb06c7d7163a5a80e12fa17c0c
6b68af565e9bc3c2156f707738c17dc824d816b5
2467 F20101113_AABHIM stanley_k_Page_77.txt
5ee4c56b9ce864635244ecb61f894c93
2aea1b95ca465aae89ce619aef930110b1ca9ca3
26809 F20101113_AABGRU stanley_k_Page_78.QC.jpg
7ff82c5d64c720697d1ed25b05919dea
ea75d9141f4e8d8f6a961b53ceabc00218c184b6
F20101113_AABHDP stanley_k_Page_75.tif
2aeae6f0a07d76264a98945d1351de51
502b71a2eb895cf927d1893d64079b9b94658c36
2521 F20101113_AABHIN stanley_k_Page_78.txt
dae3757d0630da922931f9795fe4f5c4
18f84ff9f7533bdc9319a51f777a8a2c333acecf
7077 F20101113_AABGRV stanley_k_Page_50thm.jpg
388d531ec8449f5aef094c1535ff34a9
6046bb1ae3fdcf1de90d8055ebfa7a310103b4d4
F20101113_AABHDQ stanley_k_Page_76.tif
b173fe2f71d72caea2e905395dfefb74
0faf49622d25173c4b6f421674cde4a529cefd2e
2469 F20101113_AABHIO stanley_k_Page_80.txt
faed39342d37f422720eea6e05441b02
b7f4d488b42671ae1ab201f6527050c9938e58d2
55403 F20101113_AABGRW stanley_k_Page_50.pro
35bc4198df959b4ff018f153c1040fa5
f0a392f0c9fef568aba07c3205fe8985e7e091b8
F20101113_AABHDR stanley_k_Page_77.tif
885e576d87914838d4d19362cd7f023e
fc65b934f41fdb713baa7912859e7cd17b8ce841
21633 F20101113_AABGWU stanley_k_Page_01.jpg
8e1f8903f63da32c1aa4fd428af50824
c54d8aea07448a0727af9e5556a28d04fafb0015
609 F20101113_AABHIP stanley_k_Page_82.txt
2889ec5ac6f32bb064e0ff050f01f310
1695b636e53bae3e53a38490fee1132451b1c42a
1656 F20101113_AABGRX stanley_k_Page_81thm.jpg
08920dc830f88ff8341fce2febcbcce9
64c895972b95d3eb4e688ea242df7a95b2ba41d1
F20101113_AABHDS stanley_k_Page_79.tif
8a25319d8e5551d7e0c264dcfb4fc1f9
9a4befdc7effdd8888ab10dabcbdebb45302140c
9889 F20101113_AABGWV stanley_k_Page_02.jpg
a87e118c560c8741bf2248aab24bb809
b0093019262c21dbedfa099cd33b178ea05b3c51
265497 F20101113_AABHIQ stanley_k.pdf
837504712e852a66a805a19eb107dd10
08e5553be6fe96bcdfe2788cfae1a1058e65c837
112850 F20101113_AABGRY stanley_k_Page_65.jp2
3bf6e0640a545b276e61dd58876ef31f
c03ec288797a62172350b1951ba07b9ea85e6db0
F20101113_AABHDT stanley_k_Page_80.tif
7409eb82e5e5cebc089772bd97ac9b24
2fb6a8b8af0467d9249f8b8cb009623ed09356e6
9847 F20101113_AABGWW stanley_k_Page_03.jpg
c2474971aea030fb8dadcddaae74d0e0
c516a18aadf672bb0b1f1ff3388e50345bbf91f9
2277 F20101113_AABHIR stanley_k_Page_01thm.jpg
25557aa0fd692e83f6c0f7d148b90d2e
88aee213b61043ba5191b9c3deecd12f45ff9183
F20101113_AABHDU stanley_k_Page_81.tif
745b7fe92dbb926a153ac01fb9179d88
17aaa6479ba841ad51989854bcf39f2a7cdca216
3993 F20101113_AABGRZ stanley_k_Page_75thm.jpg
b8d98e4d808adc9f90b00ce30f36491f
652d2aeed8477d5ae018640e338b48f13ee0c7f5







SEX BIAS IN INTERPRETING EMOTIONAL STATES FROM VISUAL CUES


By

KEVIN E. STANLEY

A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2007

© 2007 Kevin E. Stanley

To Jennifer and Simon

ACKNOWLEDGMENTS

I would like to thank all the people who have contributed to this dissertation through their

support, encouragement, help, feedback, and inspiration. There are too many to list them all, but

a few stand out for specific mention. I thank my parents, Lloyd and Rosa Stanley, who have been

there for me from the beginning. I thank Drs. Constance Shehan, Franz Epting, and Ira Fischler,

who have been helpful and kind in their service on my dissertation committee. I thank Drs. Paul

Schauble and Rafael Harris, who went beyond the requirements of their roles as my clinical

supervisors to become mentors and dissertation coaches. I thank Dr. Martin Heesacker, my

dissertation chair, editor, mentor, and friend. And most of all I thank my wife Jennifer and my

son Simon, who give me love and a reason to succeed.

TABLE OF CONTENTS


ACKNOWLEDGMENTS

ABSTRACT

CHAPTER

1 INTRODUCTION

    Overview
    Portrayal of Sex Differences in Popular Culture
    Sex Differences in Research
    Factors in Perception of Sex Differences
    Facial Expressions as Displays of Emotion
    Hypotheses

2 LITERATURE REVIEW

    Overview
    Method
    Chronological Review of Literature
    Conclusion

3 METHOD

    Participants
    Materials
    Procedure
    Evaluation of Hypotheses

4 RESULTS

    Hypothesis One
    Hypothesis Two
    Additional Analyses
    Apparent Sex Manipulation Check
    Summary

5 DISCUSSION

    Tests of Hypotheses
    Sex Bias in Interpreting Affect as a Reinforcer of Cultural Stereotypes
    Limitations of this Investigation
    Implications for Future Research
    Conclusion

APPENDIX

A EXAMPLE STIMULI

B INFORMED CONSENT STATEMENT

C DEMOGRAPHIC QUESTIONNAIRE

D EXAMPLE ITEMS

E INSTRUCTIONS FOR RECEIPT OF CREDIT AND THANK-YOU MESSAGE

LIST OF REFERENCES

BIOGRAPHICAL SKETCH

Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy

SEX BIAS IN INTERPRETING EMOTIONAL STATES FROM VISUAL CUES
By

Kevin E. Stanley

August 2007

Chair: Martin Heesacker
Major: Counseling Psychology

A disconnect exists between perceptions in the general public about sex differences in

emotion and the findings in the scientific literature. Whereas popular books and portrayals in

various media depict the sexes as inhabiting wholly different emotional realms, the scientific

evidence reveals sex differences in emotion to be small and often situational. The source of this

divergence of popular and scientific opinion is puzzling. The present

study examined one possible element in the creation and maintenance of widely-held beliefs

about large sex differences in emotion.

In the present study, the decoding of facial and postural affect was examined for evidence

of a bias based on the sex of the encoder. Apparently male, apparently female, and androgynous

stimulus figures were created using computer graphics software. Stimulus figures were

manipulated to display unambiguous expressions of stereotypically masculine emotions (pride

and anger), and stereotypically feminine emotions (happiness and fear), as well as ambiguous

expressions of emotions (pride-happiness blended expression, anger-fear blended expression).

Participants rated each expression for the degree to which anger, pride, fear, and happiness were

perceived to be represented. It was hypothesized that unambiguous expressions would be

interpreted similarly regardless of the apparent sex of the stimulus figure, or encoder. It was

hypothesized that ambiguous expressions portrayed by apparent females would be interpreted as

more consistent with stereotypically feminine emotions, and ambiguous expressions portrayed

by apparent males would be interpreted as more consistent with stereotypically masculine

emotions. It was further hypothesized that when the apparent sex of the encoder was ambiguous,

sex bias effects would be attenuated. That is, androgynous figures would be rated lower on

stereotypically masculine emotions than would apparent males, and lower on stereotypically

feminine emotions than would apparent females.

Statistical analysis of the results yielded partial support for the hypotheses. Unambiguous

expressions were rated significantly higher on the intended emotion than on the same-valence

alternative in 11 of 12 cases. Both apparent males and apparent females displaying ambiguous

expressions were rated higher on stereotypically feminine emotions than on stereotypically

masculine emotions, which was inconsistent with the hypothesis. However, follow-up analysis

revealed that apparent males displaying ambiguous expressions were rated higher on

stereotypically masculine emotions than were apparent females displaying ambiguous emotions,

and apparent females displaying ambiguous expressions were rated higher on stereotypically

feminine emotions than were apparent males displaying ambiguous emotions. Androgynous

figures were rated lower on stereotypically masculine emotions than were apparent males, but

they were not rated significantly lower on stereotypically feminine emotions than were apparent

females. Manipulation check data indicates that participants did not interpret androgynous

figures to be female in the majority of cases. Implications of the findings and directions for

future research are discussed.

CHAPTER 1
INTRODUCTION

Overview

Our study is an investigation of one mechanism that may serve to generate and perpetuate

overestimations of sex differences in emotion. A sex bias in the interpretation of visual displays

of affect could amplify any real differences in emotional expression between the sexes and/or

create the impression of differences where none exist. In our study the interpretation of visual

affect is examined utilizing recent advances in readily accessible computer graphics software that

allow the manipulation of apparent sex and facial expression of the encoder at a previously

unavailable level of control.

Portrayal of Sex Differences in Popular Culture

The idea that women and men experience emotion in starkly different ways seems to be

quite popular. The prototypical example is probably Gray's Men Are From Mars, Women Are

From Venus (1992). This and his other books on the topic have sold over 14 million copies

worldwide, and have been translated into 40 different languages (Gray, 2006). Another classic in

the popular literature, Tannen's (1990) You Just Don't Understand: Women and Men in

Conversation, contrasts men's fact-based, instrumental "report talk" with women's emotion-

based, relational "rapport talk" approaches to communication.

Many other successful books published in recent years also make extensive use of the

premise that large sex differences exist in emotion. For example, two books by Pease and Pease

(2001, 2004) became number one bestsellers on the International Bestsellers list. The authors of

these and other popular books assert that women and men exhibit significant, sex-based

emotional differences. Many further argue that those differences are innate and unchangeable,

citing research from evolutionary psychology, comparative psychology, and biological

psychology to support those claims (e.g. Baron-Cohen, 2003; Moir & Moir, 2000; Rhodes,

2004).

The sex-differences theme can be found in other popular media, such as the theatre. The

record for the longest-running solo play on Broadway is held by Rob Becker's Defending the

Caveman (1991), which makes extensive use of the premise that women and men are

psychologically very different (Theater Mogul, 2006). Becker's play has been "recommended

by thousands of psychologists and counselors" and he was invited to perform at the 1999

convention of the American Association of Marriage and Family Therapists (Epinions, 2006).

The message that the sexes inhabit separate emotional worlds is also conveyed through

television and film. For example Seidman (1992) found pervasive sex-stereotyping on many

dimensions, including affective expression, in music videos. Seidman's analysis of sixty hours of

music videos shown on MTV revealed that women were portrayed as affectionate, dependent, or

fearful more often than were men, whereas men were portrayed as adventuresome, aggressive, or

domineering more often than were women. Similar findings have been reported for televised

sports coverage (e.g. Billings, Angelini, & Eastman, 2005; Fink & Kensicki, 2002), prime-time

shows (e.g. Aubrey & Harrison, 2004; Signorielli, 1989), and commercials (e.g. Stern & Mastro,

2004).

The portrayal of separate emotional realms for males and females can also be seen in

children's entertainment. Thompson and Zerbinos (1995) studied 175 episodes of 41 different

children's cartoons, and found sex-stereotypic portrayals in numerous domains, including

affective behaviors. For example, males displayed pride or anger more often than did females,

whereas females displayed virtually all other emotions more often than did males, especially

affection. Dundes (2001) observed that Disney's animated films have been widely criticized as

promoting gender stereotypes, and went on to argue that the film Pocahontas, although often

held up as a counterexample, in fact continued the trend by reinforcing "stereotypes of girls

whose identity is determined first by romantic relationships and later by their role as selfless

nurturer" (p. 354).

Sex Differences in Research

It is clear from the portrayal of his-and-hers emotional worlds in best-selling books, on

stage, in movie theaters, and in television shows marketed to adults, adolescents, and young

children that the idea of sex-segregated emotions is a very popular one. However, a growing

body of psychological and sociological research seems to indicate that men and women are

actually much more alike than different in their experience of emotion. Canary & Emmers-

Sommer (1997) used an extensive review of the then-existing research literature to argue that

traditional stereotypes about sex differences in emotion usually fail to predict people's behavior.

They wrote that there seems to be more overlap than separation in the sexes' experience of

emotion, and explicitly rejected John Gray's analogy of separate planets of origin. In a review of

published literature reviews on the topic, Wester, Vogel, Pressly, and Heesacker (2002) came to

a similar conclusion, stating that "sex differences are small, inconsistent, or limited to the

influence of specific situational demands" (p. 639).

In a 2005 meta-analysis of studies dealing with all types of sex differences, Janet Shibley

Hyde concluded that only a very few large differences exist; men's physical upper body strength

is reliably greater than that of women, for example. On most studied dimensions, however,

reliable sex differences were found to be small or non-existent. Hyde found the results striking

enough to entitle her article "The Gender Similarities Hypothesis," and she reiterated a theme

found in Canary & Emmers-Sommer (1997) and Wester et al. (2002): on a given dimension,

variation within each sex often eclipses the average difference between them.

The sociology literature also contains the gender similarities theme. In the American

Journal of Sociology, Simon and Nath (2004) reported that men and women in the U.S. are broadly

similar in their self-report of their emotional experiences. Upon review of data from the emotions

module of the 1996 General Social Survey (GSS), the investigators seemed rather surprised to

conclude that "there is little correspondence between men's and women's feelings and

expressive behavior and gender-linked cultural beliefs about emotion" (2004, p. 1166; italics in

original). After examining the same data using a variety of theoretical and statistical models,

Lively and Heise reported that "sex accounts for less than 1% of the variance on any of these

emotionality dimensions" (2004, p. 1120).

In the cases where emotional differences between the sexes have been observed, there are

often qualifying factors to be considered. For example, differences are sometimes found in the

ways men and women express emotions but this has been convincingly explained in terms of

culture-bound display rules as opposed to differences in the experience of emotion (e.g. Brody,

2000; Hall, Carney, & Murphy 2002; O'Kearney & Dadds, 2004; Simpson & Stroh, 2004). Sex-

stereotypical patterns of emotional expression can be elicited in men by manipulating the social

context to make them emotionally vulnerable, suggesting perhaps that these behaviors represent

a defensive strategy of adhering to low-risk normative expectations rather than a genuine

expression of their inner experiences (Vogel, Wester, Heesacker, & Madon, 2003).

Additionally, observed differences are most often small relative to the within-sex

variation on the examined dimension (e.g., Hyde, 2005; LaFrance, Hecht, & Paluck, 2003). So

while studies do frequently describe statistically significant average differences in women's and

men's emotional behaviors, these studies can be seen in a broader context to reveal more

convergence than divergence.

Factors in Perception of Sex Differences

To summarize, much of the available research indicates that emotional experience is

fundamentally similar for men and women, with most differences being small, situational, or

otherwise qualified. This is clearly at odds with the position taken by Gray, Tannen, and other

authors, and portrayed in various entertainment media. Nevertheless, in light of the fact that self-

reported data, such as the GSS, show that American men's and women's subjective experiences

are so alike, it seems odd that Men are from Mars, Women are from Venus and other works from

this perspective should be so widely embraced. How can this disjunction between perception and

empirical findings be explained?

One potential contributing factor is that while peer-reviewed journal articles usually

contain the context and qualifiers necessary to put results in the proper perspective, in the

popular media this information may be jettisoned to tell a more easily understood, if somewhat

misleading, story. Another potential contributing factor lies in the fact that the concept of sex

differences is a pervasive one, embedded in western culture. Therefore, popular opinion may

have remained at odds with scientific findings because those findings are seen as counterintuitive

- just as when the concept of a spheroid Earth met with much resistance because people's senses

seemed to indicate otherwise. Such a scenario can certainly be imagined.

Gestalt theories of perception argue that relatively frequent instances of similarity will

provide a less attended-to background against which relatively infrequent instances of difference

stand out starkly. The large similarities may therefore simply be neglected, becoming the

virtually invisible background against which the relatively small differences stand out, thereby

receiving the larger share of conscious consideration.

Whatever the origin of the notion that there are large sex differences in emotion, once

such a belief has formed, it could presumably sustain itself by biasing the person's attention,

provided there is enough ambiguity in observed expressive displays. Ambiguous events could be

interpreted within the framework of the belief, seemingly providing ongoing reinforcing

evidence for the belief. So, it seems possible that even if the emotionally expressive behaviors of

men and women aren't systematically different, observers might unknowingly apply a bias as

they encode ambiguous expressive behaviors.

This reasoning constitutes the basis for the current research. A sex bias towards the

interpretation of visual emotional cues according to traditional sex-role stereotypes is proposed.

The goal of this project is to investigate this proposal experimentally. Whereas the research

discussed so far deals with individuals' experiences or displays of emotion, the proposed project

will focus on the receipt and encoding of emotional signals by an observer, and particularly on

possible bias introduced at encoding.

Facial Expressions as Displays of Emotion

In a variety of social contexts, facial expressions are an important source of information

regarding the emotional states of the participants (e.g. Ekman, 1993; Ekman & Friesen, 1987;

Keltner, 1995). Nonverbal cues about emotional states play a vital role in effective

communication in day-to-day interactions as well as in more constrained and goal-directed

interactions, such as teaching, sales, and psychotherapy (Philippot, Feldman, & Coats, 2003).

However, emotional communication is susceptible to distortion from various sources of bias.

Of particular interest for this proposed dissertation is people's tendency to view certain

emotions as inherently masculine or feminine, as this might lead people to interpret nonverbal

emotional communications in a manner consistent with this sex-based categorization of

emotions. U.S. culture has long included a widespread belief that women are more "emotional"

than men in general, and that many emotions are regarded as especially feminine, while a few are

seen as at least relatively masculine (Plant, Hyde, Keltner, & Devine, 2000). More recent data

show that the categorizing of emotions by gender is still alive and well. For example, in a study

of the relationships between gender, job status, and the interpretation of emotional signals,

Algoe, Buswell, and DeLamater (2000) found that their participants rated anger and disgust as

relatively masculine emotions, and fear as relatively feminine. The participants in the Algoe et

al. study also rated anger and disgust as more instrumental and fear as more expressive; traits

which are themselves strongly associated with masculinity and femininity respectively (Major,

Carnevale, & Deaux, 1981).

Plant, Hyde, Keltner, and Devine (2000) asked 117 undergraduates to estimate the

frequency with which men and women experience and express 19 emotions--12 were regarded

as being experienced and expressed significantly more often by women, and only 2 as more

typical of men. Plant et al. also tested participants' ratings of facial expressions of emotions, and

found that pictures depicting blends of sadness and anger (upper and lower portions of the face

mismatched) were rated in a way consistent with gender stereotypes.

Other recent research has also shown that people do at times display a sex-stereotypical

bias in their interpretations of facial expressions. Plant, Kling, and Smith (2004) morphed

together photographs of men and women posing facial expressions of anger and sadness. Male-

typical and female-typical haircuts and clothing were added to the resulting blends to manipulate

gender. Figures in the images perceived by participants to be male were rated as more angry than

those perceived to be female, and figures perceived to be female were rated as sadder than those

perceived to be male.

Hess, Adams, and Kleck (2004) used a similar methodology, using drawings of facial

expressions differing only by hair and clothing for one study, and photographs of people rated as

androgynous in their facial appearances with different hairstyles and clothes added using a

computer program in another. Hess et al. found that using this method, the sex-stereotype effect

was eliminated in some instances and even reversed in others. The authors offered the rationale

that certain aspects of facial appearance, such as thickness of eyebrows and width of jaw, convey

dominance or affiliation cues. They posited that these aspects of appearance, rather than gender

per se, yield the cues that trigger the stereotyped interpretations. However, it also seems possible

that Hess et al. were over-dichotomizing gender, and that there is a broader range of possible

gender associations than they seemed to expect. Their drawings of lantern-jawed women and

photos depicting slender-faced men with barely visible eyebrows may in fact have triggered

mixed or ambivalent gender associations, rather than associations to only male or only female

construct categories.

Hypotheses

The purpose of our study was to test two hypotheses regarding conditions that might

foster a bias in interpreting the emotional state of another using visual information. Computer

graphics modeling software was used to generate stimuli that facilitated an examination of

conditions under which the bias towards sex-stereotyped interpretations of facial expressions was

likely to manifest.

The first hypothesis was that bias would be evident when the target's expression was

ambiguous, but would decrease or disappear when the target's expression was unambiguous. The

second hypothesis was that bias would be less evident when the sex of the target was ambiguous,

but more evident when the sex of the target was unambiguous.

As a basis for the first hypothesis, the expression of a single basic emotion is presumed to

demand a particular interpretation, thus leaving little room for participants to project any sex-

biased expectations they may hold onto the target as they form ideas about what subjective

emotional experience would give rise to such an expression. This assumption is consistent with

Ekman (1993), who asserted that certain expressions are universally identifiable as representing

the corresponding discrete emotions. In contrast, blended expressions were presumed to be more

difficult to categorize, triggering participants to use information such as context (when available)

or associations to help resolve the ambiguity. Associations with the target's sex were expected to

decisively inform a guess about the subjective experience behind the target's expression only

when more directly applicable data are unavailable or problematic. This idea is consistent with

the findings of Plant et al. (2000) and others, which indicated that participants do sometimes

interpret facial expressions in a biased way that is often consistent with gender stereotypes. In the

current experiment, expressions closely conforming to those described by Ekman (1993) as

representative of basic emotions were defined as unambiguous expressions, and those that are

blends of two or more basic emotions as ambiguous expressions.

Whereas the first hypothesis dealt with ambiguity in visible affect, the second dealt with

ambiguity of apparent sex. Modeling software allows the manipulation of the secondary sex

characteristics and gender cues of realistic human figures without changing other aspects of the

figures' appearance. This opened the possibility of varying the apparent maleness and

femaleness of the target faces, including the creation of blended targets of ambiguous apparent

sex, while holding facial and postural affect cues constant.
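
The logic of this manipulation can be illustrated with a brief programming sketch. The sketch
below is purely illustrative: it assumes that a figure is driven by numeric morph weights that can
be interpolated, and the names used (StimulusFigure, sex_weight, blend) are hypothetical
placeholders rather than commands from the graphics software actually used to build the stimuli.

    from dataclasses import dataclass
    from typing import Dict


    @dataclass
    class StimulusFigure:
        """A stimulus figure defined by two independent sets of blend weights.

        sex_weight: 0.0 = fully female-typed morph, 1.0 = fully male-typed,
            0.5 = androgynous blend.
        expression: strengths of basic-emotion morphs; an ambiguous expression
            blends two same-valence emotions (e.g., anger and fear).
        """
        sex_weight: float
        expression: Dict[str, float]


    def blend(a: Dict[str, float], b: Dict[str, float], t: float) -> Dict[str, float]:
        """Linearly interpolate two sets of morph weights, with 0 <= t <= 1."""
        keys = set(a) | set(b)
        return {k: (1.0 - t) * a.get(k, 0.0) + t * b.get(k, 0.0) for k in keys}


    # Unambiguous expressions: a single basic emotion at full strength.
    ANGER = {"anger": 1.0}
    FEAR = {"fear": 1.0}

    # An ambiguous expression: an even anger-fear blend.
    anger_fear = blend(ANGER, FEAR, 0.5)

    # The same affect cues can be paired with any apparent sex, so apparent sex
    # varies while facial and postural affect cues are held constant.
    apparent_male = StimulusFigure(sex_weight=1.0, expression=anger_fear)
    apparent_female = StimulusFigure(sex_weight=0.0, expression=anger_fear)
    androgynous = StimulusFigure(sex_weight=0.5, expression=anger_fear)

    for figure in (apparent_male, apparent_female, androgynous):
        print(figure)

In the actual stimuli, of course, the blending was carried out by the modeling software on
three-dimensional figures rather than on abstract weight dictionaries.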

This manipulation allowed a test of the second hypothesis: that as the ambiguity of the

sex of the stimulus face increased, the sex bias in participants' ratings would decrease. When

people are unable to categorize a face as male or female, they should also be unable to apply a

sex-biased interpretation. Furthermore, when the stimulus face is categorized in spite of some

features that are ambiguous or more representative of the other sex, sex-based bias could be

attenuated. For example: a participant is shown two faces that he or she interprets as male. If one

'male' face displays male characteristics very strongly, it might then strongly activate

assumptions about men's emotions. However, if the other 'male' face has features that the

participant interprets as feminine, it might then activate the participant's beliefs about both men's

and--although probably to a lesser extent--women's emotions.

It is worth noting that associations with categories other than apparent sex, such as race,

age, or attractiveness, could also be activated under these conditions. A thorough exploration of

all these possible sources of bias was beyond the scope of our current study, and we must be

content for the present with simply having controlled these other triggers to the extent it was

feasible.

CHAPTER 2
LITERATURE REVIEW

Overview

In this chapter, literature on the interpretation of facial expression of emotion dealing

specifically with the effects of the sex of the stimulus person, or "encoder," is reviewed. This

review supports an investigation of sex of encoder effects on the interpretation of facial and

postural emotional expressions using novel techniques of stimulus development. It is organized

chronologically.

Method

This literature review is based on articles retrieved using the PsycINFO online database.

The initial search was done using the key terms "facial expression," "sex," and "emotion" in

PsycINFO. Abstracts from the items in the results list were examined for any mention of

analysis by sex of encoder, which was expressed using a variety of terms including but not

limited to: "sex" and "gender" interchangeably to indicate apparent sex of the real or simulated

people in the images; and "encoder," "sender," "stimulus person," "model," and "poser"

interchangeably to indicate the real or simulated people encoding facial expressions. Items found

to include a mention of analysis by sex of encoder were selected, then sources listed as

references on these items and, when available, records retrieved by PsycINFO using the "Times

cited in this database" tool were screened for inclusion using the same criteria. Items for the

period October 1974 to February 2007 were included in this review.

It should be noted that because abstracts are probably more likely to list hypothesis

confirming results than other results, it is possible that in using this method, studies were missed

that analyzed for sex of encoder effects but found none. This review should therefore be viewed

as an examination of sex of encoder effects when they are observed, and not as an examination

of the absolute prevalence of such effects.

Chronological Review of Literature

Sex of encoder effects have been interpreted in various ways. In some older studies it was

speculated that the observed effect might result from women being superior encoders relative to

men. However, some studies produced results not easily explained in this way, for example that

certain emotions were more easily recognized when portrayed by male encoders. In some

studies, the sex of the encoder was manipulated without changing the expression of emotion,

often by pasting different hair and clothes onto a photograph. In these studies also, the

explanation that women were better at encoding emotion facially was inadequate to explain the

differences. It is studies such as these that give rise to the hypothesis that a sex bias drives sex of

encoder differences observed in studies dealing with the interpretation of visual displays of

affect. Some recent studies advance the theory that at least some of the observed differences are

attributable to the effects of certain facial features that people associate with personality

characteristics of dominance and affiliation. These are usually confounded with sex of encoder,

such that facial features associated with dominance are most often found on males and facial

features associated with affiliation are most often found on females.

Buck, Miller, and Caul (1974) were among the first to study the impact on interpretation

of visual affect of the sex of the encoder (i.e., the person displaying an emotion). In this study,

one group of participants functioned as encoders, another as "observers." The encoders viewed

slides meant to evoke emotional responses, while the observers watched the encoders' faces

through a closed-circuit television system. The encoders were unaware that they were being

observed. Based on the encoders' facial expressions, the observers were asked to guess what

category of slide the encoders were seeing: sexual, scenic, pleasant people, unpleasant, or

unusual. Observers of both sexes were more accurate in categorizing the slides when responding

to female encoders. The authors concluded that females have "greater facial responsiveness"

than males (p. 593). The possibility that the results were impacted by observers' expectations

regarding females' expressivity as compared with that of males, i.e. the influence of a sex bias, is

not discussed.

Zuckerman, Lipets, Koivumaki, and Rosenthal (1975) had 40 students (termed

"encoders") pose in expressions of anger, happiness, sadness, fear, disgust, surprise,

bewilderment, suffering and determination. Photos were made of these expressions, and later a

group of 102 students, including 30 of the original encoders, viewed the slides and chose from a

list of emotions the one they thought best fit each slide. It was found that people were more

accurate in decoding emotions from opposite-sex faces than from same-sex faces, and that

female faces were more accurately decoded on the whole than male faces. Results examining any

interaction between sex of encoder and the emotion being posed were not reported.

Perhaps the first study to manipulate the apparent sex of a target person in order to detect

sex bias in the interpretation of emotional expressions was Condry and Condry's (1976) seminal

investigation of people's interpretations of an infant's display of emotions. In this case, apparent

sex was manipulated by simply telling half the participants that the infant in the film they were

watching was a boy, and telling the other half of the participants that it was a girl. In the film, the

9-month-old infant is exposed repeatedly to four emotionally evocative stimuli: a teddy bear, a

doll, a buzzer, and a jack-in-the-box. Participants were instructed to rate the infant' s expressions

by type (pleasure, anger, and fear) and intensity.

The investigators found that participants were likely to interpret the infant's emotional

expression in sex-biased ways--for example the "boy" as more angry and the "girl" as more

fearful--but only in some instances. They wrote that "it appears to us that the more 'ambiguous'

the situation, the more of a difference subjects report between the sexes" (p. 816). The infant's

response to the buzzer, for example, was rated as relatively pure fear whether the infant was

labeled a boy or a girl. On the other hand, the infant's response to the jack-in-the-box was

interpreted as more angry if the infant was labeled male, and more fearful if the infant was

labeled female.

Eiland and Richardson (1976) created a large set of photographs depicting various

expressions of emotion using male and female encoders from two age groups (adults and

children) and two race groups (black and white). Their participants were demographically similar

to their encoders. They were male and female, black and white, 2nd graders and college students.

The participants sorted the pictures into boxes, each labeled with an emotion. The investigators

found that the sex, race, and age of the participant did not affect interpretation of the emotions

depicted in the photographs. However, the sex, race, and age of the encoders each impacted the

interpretation. The investigators did not designate particular responses as right or wrong, so there

were no accuracy data. In fact they did not characterize the differences they found at all beyond

simply observing that people "...do not interpret 'messages' sent by black faces (whether young,

old, male, or female) the same as 'messages' sent by white faces. Similarly, we do not interpret

'messages' sent by male faces (whether white, black, young, or old) the same as 'messages' sent

by female faces" (p. 174-175).

In a study published in 1983, Felleman, Barden, Carlson, Rosenberg, and Masters

examined children's and adults' recognition of the emotional expressions of children. The

researchers took photographs of children displaying happiness, sadness, anger, and neutrality.

Posed expressions as well as expressions spontaneously generated in reaction to emotion-

eliciting stimuli were used. Children more quickly identified the emotional content of the

expressions of same-sex children. However, the sexes of the children in the photographs had

more of an impact on adults' interpretations than on children's interpretations. The authors

speculated that this might be caused by adults' more developed stereotyped beliefs, for example

that boys are more angry or aggressive.

In another study published in 1983, Knudsen and Muzekari reported more evidence that

the sex of the encoder can affect interpretations of facial affect. The investigators used four male

and four female encoders to pose expressions of fear, anger, sadness, and happiness. These

photographs were shown to 98 undergraduate students, along with, in some cases, verbal

statements manipulating the context in which the expressions were supposed to have occurred.

Participants rated the emotions they perceived to be present by choosing from a list of six

emotions (fear, anger, sadness, happiness, surprise, and disgust) and/or by writing in a response.

Female encoders were rated as sadder than males in conditions where verbal context was

provided. Male encoders were interpreted as being more fearful than female encoders in

conditions where verbal context was not provided. The authors refrained from trying to provide a

rationale for these differences, and simply noted that the sex of the encoder appears to affect

interpretation. The finding regarding interpretation of female encoders' expressions as sadder is

consistent with stereotypes about masculinity and femininity of particular emotions, but the

finding that males' expressions were seen as more fearful is not. As noted by Condry and Condry

(1976) however, stereotyped interpretations are more likely to be observed in conditions of

ambiguity, and the visual stimuli used in this investigation were unambiguous expressions of a

single basic emotion. This limitation crops up in much of the literature on the subject.









In a study designed to examine brain lateralization in processing faces expressing

emotion, Thompson (1983) did not find differences based on whether faces were presented in the

left or right visual field, but he did find differences based on whether the encoder was male or

female. Participants were shown pictures of faces with happy, sad, or neutral expressions for

either 30ms or 200ms, then shown another picture and asked whether the two pictures matched.

Participants were more accurate in judgments involving the male face. Thompson cautioned

against drawing firm conclusions about this, however, because only one encoder of each sex was

used. Small numbers of encoders, and the attendant possibility of artifacts relating to features of

specific encoders, is another problem frequently encountered in the literature.

Noting that the literature on decoding facial expressions to date was concerned mostly

with static images, and often with posed expressions, Wagner, MacDonald, and Manstead

(1986) investigated whether dynamic, spontaneous facial expressions could be correctly

interpreted. The researchers filmed one set of participants' faces as the participants viewed

emotionally loaded slides, and asked these participants to identify what emotions they were

experiencing at different points in the film. They then showed the films to another set of

participants, and asked them to identify which emotions were being expressed. They found that

participants were more accurate in interpreting the expressions of females, and concluded that

females are better encoders than males, particularly of neutral and surprised expressions. Males

and females performed similarly as interpreters or "receivers." Here again, the possibility of

stereotyped expectations was not examined.

Rotter and Rotter (1988) studied the encoding and decoding of facial expressions using

methods similar to prior studies, but introduced hypotheses making different predictions for

different emotions. Specifically, Rotter and Rotter predicted that females would be better









encoders and decoders of disgust, fear, and sadness, whereas males would be better encoders and

decoders of anger. This prediction was based on the idea that people would best pose and detect

the emotions they were most likely to express, and some prior research had suggested that

women suppressed aggression but were more expressive than men regarding other emotions,

whereas men tended to suppress most emotions but were more expressive of anger than women.

The researchers photographed students, staff, and faculty members in posed expressions

of anger, disgust, fear, and sadness. They recruited 10 judges to select photographs that were

perceived to express the target emotion particularly well, ending up with 30 pictures of each

pose, with 39 different female encoders and 15 different male encoders represented. Participants

were asked to categorize each photograph as representing anger, disgust, fear, or sadness.

Women performed more accurately in the categorization on the whole, and photographs of

females were more accurately categorized for all emotions except anger. Male decoders were

better identifiers of male-encoded anger than female decoders. These results were interpreted as

supporting the concept of differentiated sex roles, caused by "socialization which encourages

females to be more expressive than males," and socialization of males to be both more

aggressive and more attuned to aggressiveness from other males (p. 146-147).

Walbott (1988) tested whether facial expressions carry sufficient information to

categorize emotions without context, by using clips from movies. Short clips in which

professional actors displayed joy, sadness, fear, or anger (according to judges familiar with the

films) were shown to participants, who rated the expressions for nine component emotions--

happy, sad, surprised, fearful, angry, thoughtful, in despair, full of contempt, and full of guilt--

on five-point scales. Participants identified joy with high accuracy for encoders of both sexes.

They identified fear and sadness more accurately for female encoders, and anger more accurately










for male encoders. The investigator interpreted these results as art imitating life, citing prior

research describing socialized display rules (Ekman, 1972) requiring men to suppress feelings of

sadness and fear, and requiring women to suppress anger.

Erwin, Gur, Gur, Skolnic et al. (1992) developed a set of facial emotion stimuli for an

instrument for use with various clinical populations, and tested it initially on a non-clinical

sample. In the first of the two experiments, sex of encoder effects were not examined. In the

second experiment, pictures of male and female encoders posing expressions of happiness,

expressions of sadness, or neutral expressions were shown to participants, who were asked to

rate the perceived emotion on a seven-point scale from very happy to very sad. Interactions

between participant sex, encoder sex, and posed emotion were observed. Female participants

were more accurate with male encoders generally, and were more accurate especially in

identifying happiness for male encoders than for female encoders. Male participants identified

happiness similarly for male and female encoders, but were less accurate in identifying sadness

for female encoders.

These findings are difficult to rationalize in terms of sex-biased interpretations, which

would presumably lead people of both sexes to identify these stereotypically feminine emotions

more readily in females. However, as with many studies that examine sex of encoder effects, the

stimuli are limited in that they do not include ambiguous expressions, which is where bias effects

would be most likely to manifest. Nor are expressions of stereotypically masculine emotions

included, ratings of which could be directly compared to ratings of stereotypically feminine

emotions for each stimulus face.

Keltner (1995) observed that prior research literature on facial expressions was largely

focused on just 7-10 emotions, fewer than the total number identified by lay people and emotion









theorists. He set out to determine whether a distinct display of something like embarrassment,

guilt, or shame could be identified. He theorized that this type of emotion should have a distinct

display because it served a useful social function of appeasement when norms had been violated.

He further theorized that this type of emotion should be more easily recognized when displayed

by individuals from low social status groups.

In a series of five experiments, Keltner set out to describe an expression of

embarrassment and then test whether it could be distinguished from other expressions of

emotion. In the first, he elicited embarrassment by having participants perform a task that had

been identified as embarrassing in previous research. Participants' nonverbal behavior was

observed, and they were asked to report on their experiences. From this information, components

of a tentative expression of embarrassment were identified and differentiated from amusement,

an expression that shared several components with the expression of embarrassment being

described. In the next four experiments, participants viewed short films of people making

expressions intended to convey embarrassment and other emotions, and tried to identify the

emotions being displayed using a variety of response formats. Expressions of embarrassment

were correctly identified and distinguished from other emotions. Embarrassment displays from

women and African-American targets were more easily identified and judged to be more intense

than embarrassment displays from male and Caucasian targets.

Keltner's (1995) study is of particular significance in that an "eye of the beholder" effect

similar to that observed by Condry and Condry (1976) was considered in the interpretation of the

results. Keltner posited that observers' perceptions of targets' social status influenced the

observers' judgments about the expressed emotion. In the majority of the previous literature,

Condry and Condry being the notable exception, females were regarded as being better encoders










of emotion, but the possibility of observers applying different standards when interpreting

expressions on female faces as opposed to expressions on male faces was not considered.

Baron-Cohen, Wheelwright, and Jollife (1997) showed participants pictures of various

expressions of basic and complex emotions, using whole face images, eyes-only images, and

mouth-only images. Pictures made with a female encoder and pictures made with a male encoder

were used in separate studies. The same pattern of results was found with female and male

encoders, namely that whole-face pictures yielded the most accurate judgments for basic

emotions, that accuracy was as good for the eyes-only pictures as for the whole-face pictures for

complex emotions, and that whole-face and eyes-only pictures yielded better accuracy than

mouth-only pictures. No direct comparison was made between ratings for the male and female

faces.

The use of computer morphing programs to manipulate facial expression stimuli was

introduced in a study by Hess, Blairy, and Kleck (1997). Morphing is a process in which one

image is gradually deformed until it matches another. Intermediate images can thereby be created

that combine aspects of the two endpoint images. The investigators chose neutral and emotional

expressions from a pre-existing set of stimulus faces and used a morphing program to create

varying levels of intensity for each expression. The target emotions were anger, disgust, sadness,

and happiness. Photographs of two Caucasian male encoders and two Caucasian female encoders

were used.
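The logic of generating graded intensities can be sketched in a few lines of Python. The sketch below simply cross-dissolves a neutral and an emotional image at several weights; the array sizes and pixel values are placeholders, and true morphing software additionally warps facial geometry, which this sketch omits.

import numpy as np

def intensity_series(neutral, emotional, weights=(0.2, 0.4, 0.6, 0.8, 1.0)):
    """Return images blending a neutral and an emotional expression.

    A weight of 0.0 reproduces the neutral image and 1.0 the full
    expression; intermediate weights approximate graded intensities.
    (Real morphing also warps feature geometry, omitted here.)
    """
    return [np.clip((1 - w) * neutral + w * emotional, 0, 255).astype(np.uint8)
            for w in weights]

# Placeholder arrays standing in for grayscale photographs of one encoder:
neutral_img = np.full((256, 256), 120, dtype=float)
angry_img = np.full((256, 256), 80, dtype=float)
levels = intensity_series(neutral_img, angry_img)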

Participants viewed each stimulus picture on a computer screen and rated the perceived

intensity of anger, contempt, disgust, fear, happiness, sadness, and surprise along a continuous

scale. For each rated emotion, participants used a computer mouse to click a point along a line,

anchored at one end with the phrase "not at all," and at the other end with the phrase "very










intensely." Accuracy in identifying the portrayed emotion and rated intensity of that emotion

varied linearly with portrayed intensity for most images, which was interpreted as a validation of

the manipulation technique.

The investigators observed main effects of sex of encoder, qualified by some interactions.

For happy and sad expressions, low-intensity expressions were more accurately identified for

images of male encoders than for images of female encoders. At higher intensities, ratings of

these emotions were similar for male and female encoders. Also, male raters were more accurate

in evaluating male encoders' expressions of disgust than were female raters.

After reanalyzing the data by including the perceived intensity of the expression as a

covariate, and thereby controlling for actual differences in intensity between the expressions

made by the four encoders, sex of encoder effects remained. Female encoders' expressions of joy

were more accurately rated, and male encoders' expressions of sadness were more accurately

rated. The authors interpreted this finding as evidence for "a decoding bias suggesting that

observers decode women's and men's low to mid-intensity emotional facial expressions

differently" (p. 255). They refrained from speculation regarding the source or exact nature of this

bias.

Algoe, Buswell, and DeLamater (2000) showed participants slides of male and female

encoders, or "focal people" as they put it, posing one of three expressions: anger, disgust, or fear.

The investigators put forth two competing hypotheses. Theorizing from the "universality"

perspective they first hypothesized that the expressions being posed by the focal people should

be correctly identified regardless of any contextual cues. Theorizing from the "context-specific"

perspective, their second hypothesis was that participants would adjust their interpretation of the










focal person's expression based on contextual information such as the focal person's gender and

job status.

The researchers found that the gender of the focal person did influence participants'

interpretations in some circumstances. Males posed in expressions of anger were seen as more

angry and less fearful than women posed in expressions of anger. Across posed expressions, men

were seen on average as expressing more contempt than women, and women were seen on

average as expressing more fear than men. These results are consistent with sex-stereotyped

interpretation of the expressions.

Dimitrovsky, Spector, and Levy-Shiff (2000) studied the ability of learning-disabled

(LD) and non-disabled children to recognize facial expressions of emotion that varied in their

ease of identification. Photographs from a preexisting stimulus set were chosen for relatively

high and relatively low inter-rater agreement. Portrayals of happiness, sadness, anger, surprise,

fear, disgust, and neutral expressions were used, with four male and four female encoders.

Participants from both the LD and non-LD groups more accurately identified emotions

from the female faces. This effect increased with difficulty of identification. That is, for

emotions with lower inter-rater agreement, there was a larger difference between the accuracy

rating for female faces and the accuracy rating for male faces than for emotions with higher

inter-rater agreement. This was interpreted by the authors as evidence of women's superior facial

emotional expressivity as compared with men. The authors concluded that "the present results

can be viewed within the wider context of women's greater emotionality" (p. 414). The authors

did not appear to entertain the possibility of sex bias in their interpretation.

Hess, Blairy, and Kleck (2000) conducted a study to investigate the impact of facial

expressions of emotion, sex of encoder, and ethnicity of encoder on participants' perceptions of









the encoders' levels of dominance and affiliation. Images of male and female Caucasian and

Japanese people displaying high and low intensities of happiness, anger, disgust, sadness, and

fear were presented to participants.

A main effect of sex of encoder was observed. However, sex of encoder interacted in

complex ways with the other independent variables, and the magnitude of the effect of facial

expression of emotion dwarfed the effects of the other variables. All this led the authors to

conclude that "observers interpret the information regarding behavioral intentions provided by

affect displays in similar ways regardless of the ethnic group membership or the sex of the

expressor," but that sex of the expressor has subtle effects on the observer' s interpretations (p.

281).

Plant, Hyde, Keltner and Devine (2000) addressed the connection between gender

stereotypes and facial expressions of emotion in a series of three studies. The first study

established which emotions are currently considered to be stereotypically masculine or feminine

in the US. In the other two studies, participants' interpretations of emotional expressions were

solicited and discussed in light of the stereotype information garnered in the first study.

In the first study, participants responded to two questionnaires. The first questionnaire

required participants to indicate the frequency with which men and women experience and

express 19 emotions according to US cultural stereotypes as the participants perceived them. The

second required them to rate the frequencies according to their personal beliefs, regardless of

what they perceived the cultural stereotypes to be. Eleven of the 19 emotions were rated as being

experienced and expressed more by women than men on both questionnaires. Two emotions,

anger and pride, were rated as being experienced and expressed more by men than by women on

both questionnaires.









In the second study, the investigators created photographs of two men and two women

posing facial expressions of anger, sadness, and blends of the two, using Ekman and Friesen's

Facial Action Coding System to pose the expressions (FACS; Ekman & Friesen, 1976). The

blended expressions were created by posing the upper half of the face in one expression and the

lower half in the other expression. Participants viewed the photographs and rated the degree to

which they perceived each to express four emotions. The investigators found that participants

rated the blended expressions in a stereotype-consistent manner. That is, they rated men's

blended expressions as angrier than women's, and they rated women's blended expressions as

sadder than men's.

In the third study, participants interpreted an infant's display of emotion in a

methodology similar to that of Condry and Condry (1976). Participants, who were tested

regarding their own endorsement of sex stereotypes, viewed a videotape of an infant and rated

the infant's behavior on several emotions. Half were told that the child was a boy; half were told

that the child was a girl. The participants' beliefs about the sex of the baby did not influence their

interpretations of emotion except in the case of high-stereotyped men rating anger. In this case,

the men rated ostensibly male infants as angrier than ostensibly female infants.

The majority of the literature available in English dealing with the effects of encoder sex

on interpretations of facial expressions of emotion describes experiments done in English-

speaking countries. A study performed by Thayer and Johnsen (2000) in Norway provides an

exception. In this case participants rated their own experience of happiness, sadness, anger, fear,

disgust, surprise, interest, pleasantness, activation, calmness, arousal, and liking for the stimulus

in response to viewing slides showing facial expressions of emotion. The slides depicted one

male encoder and one female encoder displaying expressions of neutrality, disgust, fear,










happiness, surprise, sadness, and anger. Responses were considered to be correct classifications

when the participant reported an elicited emotion that matched the emotion displayed.

Female participants' responses included more correct classifications and fewer

misclassifications than male participants' responses, and did not vary as a function of encoder

sex. Male participants performed at chance levels in differentiating female encoders' expressions

of anger and fear. In the discussion, the authors framed this difference in terms of females'

presumed superiority in decoding emotion and greater sensitivity in experiencing elicited

emotions. However, it seems possible that emotions elicited through viewing the emotional

display of another might not always be congruent with that emotional display, and that this might

have played a role in males' reactions to seeing the female encoder displaying negative

emotions. This possibility was not discussed in the article.

In an effort to develop stimuli for future use in evaluating populations of neurologically

impaired people, Pell (2002) created facial and vocal stimuli depicting six target emotions:

neutrality, happiness, pleasant surprise, disgust, anger, and sadness. The facial stimuli consisted

of pictures of 4 male and 4 female encoders posing facial expressions of each target emotion.

The investigator tested the stimuli with non-impaired participants in order to establish

baseline parameters for the stimulus set. In doing so, he found that the sex of the encoder

influenced the interpretation of the displayed emotion in some cases. Specifically, participants

correctly identified expressions of neutrality on male faces more accurately than on female faces,

and correctly identified disgust on female faces more accurately than on male faces.

In the discussion of this finding, Pell did not provide a rationale as to why this specific

pattern may have been manifested. Rather, he wrote that the observed effect of encoder gender

might reflect "systematic properties of how these emotions are decoded and labeled," or it might










reflect an artifact of specific properties of some or all of the eight encoders used in this particular

case (p. 504). The apparent tendency not to label female faces as neutral seems consistent with

stereotypes regarding women's emotionality, but the author did not engage in this level of

speculation.

Widen and Russell (2002) examined the effect of the apparent sex of the encoder on

preschoolers' interpretations of facial expressions of emotion. Participants, who were 4 or 5

years of age, were shown pictures of what appeared to be a male and a female child of around 12

or 13 years of age displaying facial expressions of happiness, sadness, anger, fear, and disgust. In

reality, these pictures were created from photographs of a 13-year-old girl and a 12-year-old boy

in posed expressions. Pictures of the boy and the girl displaying the same expression were

morphed together using computer software, and hairstyles typical of boys and girls were

electronically placed onto the resulting blended-sex faces to create a set of apparent males and

apparent females. Pairs displaying each emotion had exactly the same face--only the hair

differed.

The apparent sex of the encoder impacted participants' ratings of emotions. Male

participants labeled the male figure as disgusted more often than they labeled the female figure

as disgusted, and female participants labeled the female figure as fearful marginally more often

than they labeled the male figure as fearful. The authors discussed the results in terms of the

presumed influence of gender stereotypes of emotion. They noted, however, that participants'

ratings of anger were the same for apparently male and apparently female encoders, whereas the

theory of gender stereotyping of emotions would lead one to predict that ratings of anger in

particular should yield stereotypical interpretations.










Mignault and Chaudhuri (2003) used high-resolution 3-D models of stimulus faces in an

examination of the impact of head tilt on participants' interpretations of perceived dominance

and emotional content. Apparently male and apparently female stimulus faces displaying neutral

expressions were presented on a computer screen at different angles. In addition to rating the

perceived dominance, participants were asked to give a one-word answer to the question "what is

the main emotion expressed in this picture?" (p. 117). Responses were categorized as anger, fear,

happiness, sadness, neutral, and other.

Participants rated apparently male faces as angry more often than they rated apparently

female faces as angry. Apparently female faces were more often rated as happy compared with

apparently male faces. Apparent sex had no detectable effect on ratings of fear, sadness, or

neutrality. The authors interpret these results as being consistent both with theories of "social

stereotyping based on women's unequal status" and with "an evolutionary explanation based on

greater innate aggressiveness in males" (p. 128).

Hess, Adams and Kleck (2004) tested the theory that facial features conveying

dominance and affiliativeness actually drive effects identified elsewhere in the literature as

evidence of sex bias. Because the features that they assert are cues for dominance or

affiliativeness--eyebrow thickness, height of forehead, jaw form, and facial rounding--are

confounded with sex, they reasoned that effects of these features may easily be misinterpreted as

effects of encoder sex. They employed two studies with different types of stimuli to test this

theory.

In their first study, black-and-white drawings of the center of faces (as opposed to the

outer edge of faces) displaying anger, sadness, happiness, disgust, and a neutral expression were

created. (Interiors of faces convey relatively little information about the sex of the person, but a










lot of emotional information, whereas the outer edge of faces conveys relatively little emotional

information, but a lot of sex-cue information.) Various levels of intensity were generated by

morphing emotional expressions with the neutral expression, and apparent encoder sex was

manipulated by adding masculine and feminine hairstyles to the drawings. Participants rated the

perceived intensity of anger, contempt, disgust, fear, happiness, sadness, and surprise for each

stimulus face. The investigators hypothesized that because apparently male and apparently

female faces were exactly the same except for hairstyle, and therefore shared all identified

dominance and affiliativeness cues, the often-observed effect of encoder sex should not be

observed.

Results of the first study were mixed. Ratings of disgust were, as predicted, equivalent

for apparently male and apparently female faces. Ratings of sadness were higher for apparent

females than for apparent males, consistent with a theory of sex bias and inconsistent with the

hypothesis. Interestingly, the typical effect of encoder sex was reversed for expressions of anger

and happiness--apparent females were perceived as angrier than apparent males, and apparent

males were perceived as happier than apparent females. The investigators tentatively conclude

that facial features rather than perceived sex of the encoder may be responsible for effects

commonly attributed to sex bias in interpreting facial expressions of emotion, but caution that it

is possible that the drawings used introduced an artifact.

In the second study, a largely similar methodology was employed using photos of

androgynous faces for the interior parts of stimulus faces, and once again using different

hairstyles to manipulate apparent sex. Intensity of expressions was not manipulated in the second

study. Similar results to those from the first study were obtained for ratings of anger and

happiness, again reversing the pattern predicted by the sex bias hypothesis.









Hess et al. were faced with trying to explain the fact that they seemed to have observed a

sex bias in the opposite of the usual direction for expressions of anger and happiness. To do so,

they ended up invoking a version of the sex bias theory, by speculating that participants carried

expectations that women should appear less angry and men less happy, and when those

expectations were violated the female faces' anger and the male faces' happiness stood out all

the more starkly.

Palermo and Coltheart (2004) observed that much of the prior research on facial

expressions relied on a few databases of stimulus faces. In an effort to expand the available pool

of facial expressions of emotion stimuli, they gathered photographs of 50 individuals displaying

expressions of happiness, sadness, anger, fear, disgust, surprise, and neutrality. To test the utility

of these photographs, the researchers asked a group of 24 participants to view the images and

select which of the seven target expressions they perceived each image to portray. They found a

main effect of encoder sex in that expressions posed by females were more often accurately

identified than expressions posed by males. Anger and sadness especially were correctly

recognized more often when posed by female encoders as opposed to male encoders.

The investigators observed that other studies have yielded similar findings, i.e. that

expressions posed by female encoders are often recognized at higher rates than corresponding

expressions posed by male encoders. The authors did not speculate as to why that might be. In

this case, the finding that anger was more often identified when displayed by female encoders

does not seem to be consistent with sex bias theory. The finding that sadness was more readily

recognized on female faces than male faces, however, does seem consistent with sex bias theory.

Plant, Kling, and Smith (2004) used stimuli similar to those used by Hess et al. (2004) to

investigate the effect of encoder sex on the interpretation of facial expressions, but produced










different results. Plant et al. created stimulus faces by morphing together photos of males and

females posing expressions, then adding gender-typical hairstyles to manipulate apparent sex.

The expressions were ambiguous, being constructed either from an anger expression in the upper

half of the face and a sadness expression in the lower half of the face, or vice-versa. Participants

were asked to rate the perceived intensity of two stereotypically feminine emotions, sadness and

sympathy, and two stereotypically masculine emotions, anger and contempt.

Apparently female encoders' expressions were rated as sadder than those of apparently

male encoders, and apparently male encoders' expressions were rated as angrier than those of

apparently female encoders. Apparently female encoders' expressions were also rated as more

sympathetic than those of apparent males. As in Hess et al. (2004), faces that were exactly the

same except in hairstyle and clothing were interpreted in different ways. However, whereas Hess

et al. observed a partial reversal of stereotype-consistent interpretations using this approach, the

findings of Plant et al. were consistent with a sex bias in the interpretation of facial expressions.

Rahman, Wilson, and Abrahams (2004) measured accuracy and reaction time as

participants categorized happy, sad, and neutral facial expressions. The stimuli were pictures of

four male and four female encoders posing the expressions, presented on a computer. Sex of

encoder interacted with sex of participant in that female participants were more accurate in

categorizing male faces, whereas sex of encoder did not impact accuracy for male participants.

Sex of encoder interacted with facial expression in that sadness was more accurately identified

on male faces than on female faces, and responses were faster to happy and sad male faces than

to happy and sad female faces. The authors conclude that males' facial expressions may be easier

to read.










The finding that sadness was more accurately identified on male faces than on female

faces runs counter to what theories of sex bias in interpreting facial expression would appear to

predict. Recognition of stereotypically feminine emotions such as sadness on male faces should

be hampered by the bias. However, given that there were no stereotypically male emotions as

response options, the methodology used does not lend itself well to examination of sex bias

questions.

In a study designed to examine the impact of encoder sex on emotion classification, as

well as the impact of displayed emotion on judgments of encoder sex, Atkinson, Tipples, Burt,

and Young (2005) found evidence that variations in sex of encoder significantly influenced

decisions about what emotion is being portrayed. First, the researchers showed participants

pictures depicting facial affect in blocks with all male encoders, all female encoders, or mixed,

and asked participants to make a rapid judgment as to whether fear or happiness was being

portrayed. In the mixed sex-of-encoder blocks, performance was significantly slower than in

blocks with all male or all female encoders. Next, participants completed a similar task requiring

them to judge quickly the sex of the person in the picture while expression of emotion was held

constant or varied. The speed with which the participants made judgments about sex was not

significantly different in the varying conditions.

The results of the Atkinson et al. study do not directly indicate evidence for or against a

sex bias in interpreting emotional expressions. The authors of this study did not report on the

reaction times for male encoders versus female encoders, but only for blocks of homogeneous

encoder sex versus blocks of heterogeneous encoder sexes. However, the results of this study do

help establish the stage of processing at which such a bias would take place, as they interpret

their results as supporting a model in which information about the sex of a face is processed









faster than information about affect. Therefore, any interpretation of affect is conducted within a

context where information about sex has already been processed.

A series of three studies by Hess, Adams, and Kleck (2005) continued their investigation of

perceived dominance and affiliation as mediators of the sex-stereotypical processing of facial

affect that is frequently observed. In the first, photographs of male and female faces displaying

neutral affect were shown to three groups of participants. One group rated how likely they

thought the people in the pictures were to show anger, fear, contempt, sadness, disgust,

happiness, and surprise. Another group rated each picture for how dominant the people appeared

to be, and a third group rated each picture for how affiliative each person appeared to be.

A mediational analysis showed that the sex of the encoder contributed strongly to his or

her perceived dominance and affiliation, and to predictions about what emotions the encoders

were likely to show. Additionally, dominance and affiliation contributed to predictions regarding

shown emotion after controlling for sex of the encoder. Males were judged to be more likely to

show the stereotypically masculine emotions of anger, contempt, and disgust, and less likely to

show the stereotypically feminine emotions of fear, sadness, happiness, and surprise. This pattern

was reversed for females. After factoring out the effect of sex of the encoder, perceived

dominance was positively correlated with the stereotypically masculine emotions studied, and

negatively correlated with two of the four stereotypically feminine emotions, fear and sadness.

Affiliation, after factoring out sex, was negatively correlated with the masculine emotions and

positively correlated with three of the four stereotypically feminine emotions, namely fear,

happiness, and surprise.

In the other two studies, participants viewed pictures of encoders previously rated as high

or low dominance (Study 2) and affiliation (Study 3), along with vignettes describing the









encoders in situations likely to evoke a variety of emotions. Participants were asked to indicate

which of a series of schematic drawings depicting facial expressions of emotion they believed

the encoder would show in response to the situation described. In the dominance study, male

encoders and high dominance encoders of both sexes were judged more likely to display angry

facial expressions, and female encoders were judged more likely to display expressions of

sadness. In the affiliation study, high affiliation encoders were judged more likely to display

happiness in the happy vignette condition than were low affiliation encoders, and the effect was

stronger for male encoders than for female encoders. In the angry and neutral vignette

conditions, male encoders were rated as more likely to show anger regardless of affiliation level,

and female encoders were rated as less likely to show anger regardless of affiliation level.

The authors interpret these results as supporting both an effect of sex bias and effects of

perceived dominance and affiliation. They observed that these variables were confounded,

because facial features associated with dominance are more typical of males and facial features

associated with affiliation are more typical of females. However, the authors concluded that their

findings "show that sex-based stereotypical expectations can be partially overruled by

expectations based on our perceptions of the dominance and affiliativeness of a person" (p. 534).

Hugenberg and Scezny (2006) examined the impact of the sex of the encoder on the

happy face advantage, or HFA, which refers to the fact that happy expressions are categorized

more quickly than other expressions in speeded response studies. Participants viewed images on

a computer monitor of encoders displaying a negative emotion--anger in one version, sadness in

another--or happiness, and were asked to categorize the emotion as quickly as possible.

The authors presented two rationales, both of which lead to predictions that the happy

face advantage would be stronger for female encoders than for male encoders. One rationale was









sex bias in interpreting expressions. Because happiness is stereotypically more closely associated

with women than with men, it was argued that the expectation of seeing happiness on female

faces would lead to a stronger HFA for women. The second rationale was based on the valence

of women compared with men as a stimulus category, and effects of emotional congruence. The

authors cited evidence of the so-called "women are wonderful" effect, i.e. that women are

generally regarded more positively than are men, and argued that valence-congruent processing

would lead to a stronger HFA for women. In the happiness versus sadness trials, it was argued

that these two rationales lead to different predictions. They stated that the stereotype-based

expectancies should not lead to a stronger HFA for women because sadness and happiness

should be equally expected on female faces. On the other hand, they argued that the congruent

valence rationale would still predict a stronger HFA.

As predicted, on the whole the happy face advantage was present for all encoders but

more pronounced for female encoders. In the happiness versus sadness trials, a larger HFA for

female encoders was observed, which the authors interpreted as stronger support for the valence-

congruence model than for the stereotype-based expectancy model. They commented that this

finding does not detract from the utility of sex-stereotyped interpretations of affect in explaining

effects other than the HFA, particularly because this and other HFA studies use unambiguous

expression stimuli, and stereotypes are more likely to affect interpretations of ambiguous stimuli.

The most recently published investigation uncovered in this review that addressed the

effects of encoder sex on interpretations of emotion expressions provides an in-depth,

multifaceted examination of perceptions of happiness and anger as a function of the perceived

sex of the encoder. Becker, Kenrick, Neuberg, Blackwell, and Smith (2007) considered the

related phenomena of (a) anger being more quickly and accurately identified on male faces and










(b) happiness being more quickly and accurately identified on female faces. They conducted a

series of seven studies to compare the utility of two theoretical explanations for these effects: the

theory of bias arising from sex stereotypes rooted in social learning, and the theory of bias

arising from evolved tuning of human perceptual systems to avoid threats and approach

opportunities. The authors started with a hypothesis that could be formed from either theoretical

perspective: "judgments and speeded decisions about expression would be dependent on the sex

of the displayer of the emotion, revealing correlations of maleness with anger and femaleness

with happiness" (p. 181). They then went on to examine the issue using multiple methodologies

and tried to evaluate the hypothesis and also to search for factors supporting or undermining each

of the theoretical perspectives under consideration.

In the first study, participants were asked to imagine a face. Half the participants were

instructed to imagine a happy face, and half were instructed to imagine an angry face. They then

provided details pertaining to the face they imagined by responding to items on a questionnaire.

Among other things, they were asked whether they had imagined a male or female face. Most

participants of both sexes who were asked to imagine an angry face imagined the face to be

male. A significant majority of males who imagined a happy face imagined it to be female.

Marginally more females also imagined happy faces as female. The authors noted that this

procedure tapped participants' associations of these emotions to sex, and was able to do so

without cuing sex explicitly, but that it revealed little about the source of those associations.

In the second study, participants viewed a series of photos of encoders displaying angry

and happy expressions on a computer and were instructed to categorize each as quickly as

possible. Afterwards, they completed an implicit association task to assess any automatic

associations of male or female names with synonyms for happiness or anger. Happy faces were










judged more quickly than angry faces, and the quicker reaction to happy faces was more

pronounced for female faces than for male faces. Angry male faces were categorized more

quickly than angry female faces, and happy female faces were categorized more quickly than

happy male faces. Accuracy for categorization of angry expressions was better for male encoders

than for female encoders, and accuracy for happy expressions was better for female faces than

for male faces. On the implicit associations measure, the overall pattern was for participants to

associate males with anger and females with happiness.

Categorization and reaction time data were reanalyzed for a subset of participants whose

associations were in the opposite directions from the overall averages. This was done to

investigate the possibility that the previously observed patterns would be reversed in this subset,

as one might expect if the observed effects were caused by automatic associations between the

sexes and the emotions in question. However, some aspects of the patterns persisted. This subset

of participants was also faster and more accurate in categorizing angry male faces as compared

with happy male faces, and they were faster in categorizing happy female faces compared with

angry female faces. The authors conclude that the overall results for the categorization and

response time task support the initial hypothesis and are consistent with both the social learning

rationale and the perceptual mechanism rationale, and the results for the subset of participants

with the less common pattern of associations are somewhat more compatible with the perceptual

mechanism rationale.

In the third study participants viewed the same images as were used in the second study,

but were asked to determine quickly the sex of the encoder instead of the emotion being

displayed. Participants categorized male faces more accurately when they had an angry










expression, and they categorized female faces more accurately and quickly when they had a

happy expression.

The stimuli for study four consisted of computer-generated faces created to simulate men

and women expressing anger and happiness. This was done in order to control for the possibility

that men actually portray expressions of anger better than do women, and the possibility that

women actually portray expressions of happiness better than do men. The methodology of the

second study was repeated with the computer-generated faces. Anger was categorized more

quickly on apparently male faces than on apparently female faces, and happiness was categorized

more quickly on apparently female faces than on apparently male faces. Participants were more accurate

in identifying anger on apparently male faces than on apparently female faces, and more accurate

in identifying happiness on apparently female faces than on apparently male faces. The authors

interpreted the results as supporting the primary hypothesis.

For the fifth study, photographs of angry, happy, neutral and fearful faces were presented

for very short time intervals. Participants were asked to identify the emotions they saw. Neutral

male faces were misidentified as angry more often than were neutral female faces. Happy female

faces were correctly identified more often than were happy male faces. Accuracy rates were the

same for angry male faces and angry female faces. These results were considered partially

supportive of the original hypothesis, regarding the association of maleness with anger and

femaleness with happiness. In addition, fearful female faces were more accurately categorized

than fearful male faces.

For the sixth study, computer graphics software was again used, this time to generate nine

androgynous faces with neutral expressions. From these, nine pairs of faces were created by

making a slightly feminized and a slightly masculinized version of each. Four of these pairs were









used with the neutral expressions. The remaining five pairs were given emotional expressions of

happiness or anger. Each member of a pair had an almost identical expression, with the

feminized version being slightly modified to be either less happy or angrier than the

masculinized version.

Participants viewed each pair, and made judgments either as to which one of the two was

more masculine, or which one appeared angrier. Participants' judgments of masculinity and

femininity aligned with the ways the investigators made the faces, i.e. faces that were

masculinized were judged to be male, and faces that were feminized were judged to be female.

Despite the fact that the emotional expressions of the pairs had in every case either been left

neutral or changed to make the feminized face angrier or less happy, the masculinized faces were

always rated the angrier of the two on average. The authors interpreted this as a "natural

confound between sex and facial expression" (p. 187).

In the seventh study, six androgynous faces were generated using computer software,

then modified in each of six ways: a body with traditionally masculine or feminine clothing was

added, the jaw was made squarer or was made rounder and narrower, and the brow ridge was

raised or lowered. Each original face and its six variants were presented as stimuli. Half the

participants were told the stimuli had been modified to look slightly angry or slightly happy.

These participants rated each stimulus on a nine-point scale from "slightly angry" to "slightly

happy." The other half were told the stimuli had been modified to look slightly masculine or

slightly feminine, and asked to rate each stimulus on a nine-point scale from "slightly masculine"

to "slightly feminine."

Masculine clothing caused the faces to be rated as more masculine compared with the

original versions, but did not cause them to be rated as angrier, as would be predicted from sex-









stereotype theory. Feminine clothing caused faces to be rated as more feminine compared with

originals, but did not cause them to be rated as happier. Faces with lower brow ridges were seen

as more masculine and angrier. Faces with higher brow ridges were seen as more feminine, but

higher brow ridges did not cause faces to be rated as happier. Making the jaw more square did

not result in higher ratings of masculinity as expected by the investigators, but did cause faces to

be rated as more angry. Similarly, the rounding and narrowing of the jaw did not result in higher

ratings of femininity, but did result in higher ratings of happiness. The investigators regarded

these results as being inconsistent with the social learning hypothesis.

Becker et al. interpreted their results as a whole to be more consistent with the theory that

human perceptual mechanisms are tuned to associate anger with males and happiness with

females, rather than with the theory that social learning leads to stereotyped beliefs about gender

and emotion that in turn bias the interpretation of affect. They speculated that certain facial

features that are associated with human sexual dimorphism, but that are not always or necessarily

associated with concepts of masculinity or femininity, may be perceived as conveying anger and

happiness. They did not dismiss social learning as a factor, however, emphasizing that both

sources of variance may be at play in a given situation. Situations involving ambiguous and

complex emotional expressions might give stereotypical interpretations the opportunity to

emerge, as Condry and Condry (1976) observed three decades earlier. Such expressions were not

studied in the Becker et al. investigation.

Conclusion

The impact of encoder sex on the interpretation of emotion expressions has been

observed several times in the scientific literature. The exact nature (or natures) of this effect has

not been firmly established. The theory that females are more skilled at encoding emotions has

been offered and may be correct. However, this theory fails to explain findings that certain









emotions may be ascribed to males more quickly and/or accurately, or at a higher level of intensity

as compared with females. Studies in which faces are kept constant across conditions while sex

of encoder is manipulated using peripheral cues like clothing and hairstyle also reveal the

inadequacy of such an explanation. An interpretive bias based on the sex-stereotyping of

emotions has been offered and may also be correct. Many of the studies designed with the

intention of studying sex of encoder effects, as opposed to those revealing such effects more or

less incidentally, support this theory. But not every study's results fit readily into such a model,

and interesting alternative or complementary models are beginning to arise, such as those tying

observed effects to particular facial features, for example Hess et al. (2005) and Becker et al.

(2007).

Lingering questions in this area may soon be answered, though doubtlessly new questions

will arise in the process. New techniques are being developed, such as the use of computers to

perform tasks such as combining images, adding or removing cues such as hairstyle and clothing,

and even generating very realistic, highly manipulable synthetic encoders. The literature already

reflects some of the innovative methodologies and superior controls these techniques make

possible, and more will surely come. These better investigative tools challenge researchers to

examine old issues in new ways, and to ask new questions that require shifts in one's

assumptions, similar to Condry and Condry's innovative manipulation of the apparent sex of the

encoder--something that was conventionally assumed to be fixed. The interplay between the

application of developing technologies and the creative formulation of research questions will

likely soon shed considerable light on the interpretation of emotional states and on all the

processes by which we humans understand each other.









CHAPTER 3
METHOD

Participants

Participants were 163 University of Florida students recruited from undergraduate

courses. Of these participants, 124 were women and 39 were men; 63.8% indicated that they were European-

American or White, 17.8% were African-American or Black, 11.7% were Hispanic, 11% were

Asian or Asian-American, and 1.2% were Pacific Islander. Participants were permitted to

indicate more than one racial/ethnic group, and 10 participants did so. There were 79 1st-year

students, 40 2nd-year students, 34 3rd-year students, and 10 4th-year students. Students

participated in return for extra credit in the class from which they were recruited, or in the case

of introductory psychology students, in exchange for credit towards the research participation

requirement in the course. All students in the classes used for recruitment were eligible to

participate.

Materials

The independent variables in our study were (a) the visual display of affect as determined

by facial expression and body positioning of encoders, and (b) the apparent sex of the encoders.

For the test of the first hypothesis, apparent sex was presented at two levels: male and female.

Visual display of affect was presented at three levels: stereotypically feminine discrete emotion,

stereotypically masculine discrete emotion, and ambiguous (blended) emotion. Positively

valenced emotions (happiness, pride, and a blend of the two) and negatively valenced emotions

(fear, anger, and a blend of the two) were examined separately. For the test of the second

hypothesis, all expressions were of ambiguous emotional content, while apparent sex was

presented at three levels: male, female, and ambiguous. Positively and negatively valenced

emotions were examined separately.









The stimuli were a series of synthesized human faces and upper bodies generated using

Curious Labs' Poser 6 software (Weinberg et al., 2005). The software allows the creation of

realistic three-dimensional models of human figures, which can vary on a number of user-

definable parameters such as sex, race, age, and facial expression. Facial expression in particular

may be finely controlled in specific regions of the face. More than 50 parameters may be

adjusted to specify the behavior of areas such as the right or left forehead, right or left eyebrow,

eyelids, around the right or left eye, and around the mouth. The lips and adjacent areas have 20

available movements, each with numerous possible gradations. The high degree of control

afforded by the software allowed the accurate reproduction of empirically investigated and

rigorously described affective expressions without relying on live encoders.

For our study, six encoders were created from two starting encoders with randomly

generated features. Each starting encoder was made into a male version, a female version, and an

androgynous version, for a total of two apparent males, two apparent females, and two

androgynous figures. Secondary sex characteristics such as certain aspects of bone structure, skin

texture, and the presence and degree of facial hair shading were adjusted to manipulate apparent

sex of encoders, as were gender cues such as haircut and clothing.

Anger, fear, and happiness expressions as described by Ekman et al. (FACS; 2002) and

the expression of pride described by Tracy and Robins (2004) were created and applied to each

figure, generating 24 stimulus images. Ambiguous expressions were then created by using the

software to interpolate, or morph, between same-valence expressions of basic emotions. This

process generates a series of images as one endpoint gradually transitions to the other. The

middle image in each series was selected as a mathematical halfway point between the basic

emotion expressions. This yielded 6 expressions in total (anger, fear, and a blend of the two;










happiness, pride, and a blend of the two) that were applied to the six encoders for a total of 36

stimulus images. See Appendix A for a representative set of stimulus images.
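The halfway-point blend can be illustrated with the following Python sketch, which linearly interpolates two sets of expression parameters. The parameter names and values are hypothetical and are not Poser's actual morph targets.

def blend_expressions(expr_a, expr_b, t=0.5):
    """Linearly interpolate two expression parameter sets.

    With t = 0.5 the result is the mathematical halfway point between the
    two basic-emotion expressions, i.e., the ambiguous (blended) expression.
    """
    keys = set(expr_a) | set(expr_b)
    return {k: (1 - t) * expr_a.get(k, 0.0) + t * expr_b.get(k, 0.0) for k in keys}

# Hypothetical parameter settings (not Poser's actual morph-target names):
anger = {"brow_lower": 1.0, "lid_tighten": 0.8, "lip_press": 0.7}
fear = {"brow_raise_inner": 1.0, "upper_lid_raise": 0.9, "mouth_stretch": 0.6}

anger_fear_blend = blend_expressions(anger, fear)  # ambiguous negative-valence expression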

The dependent variables used in the tests of both hypotheses were ratings on a four-point

scale of the degree to which four emotions (anger, fear, happiness, and pride) were judged to be

present. After viewing each encoder, participants rated the extent to which they believed the figure

was expressing each emotion from 1 (not at all) to 4 (very much), anger and pride being

stereotypically masculine, and fear and happiness being stereotypically feminine.

A set of manipulation-check questions was included after the main questionnaire. The

stimuli were a subset of the images used for the hypothesis tests. After each image, participants

were prompted to indicate if the person depicted was "male," "female," or "not sure," and

whether or not the person's expression was ambiguous.

Procedure

The questionnaires used to collect data resided on the internet. The questionnaires were in

the form of interactive web pages, which presented the stimuli, received the participants'

responses, and wrote the response data to a digital file for analysis. Participants signed up for the

experiment by following a hypertext link on the web page of the course from which they were recruited, by following a hypertext link from the University of Florida Psychology Department's online experiment interface, or by entering a URL supplied by their instructor into a web browser. The link led them to a page containing an informed consent statement and contact

information for the primary investigator (see Appendix B). After the informed consent

statement, participants advanced to an interactive page beginning with a section where they

entered demographic information (see Appendix C). They then continued on to the main body of

the questionnaire. See Appendix D for examples of regular and manipulation check items.
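As a rough illustration of the response-logging step only, the sketch below shows how a web form's ratings might be received and appended to a data file. It uses the Flask library and hypothetical field names (participant_id, figure_id, and the four emotion ratings); it is not the implementation actually used for the study's web pages.

    import csv
    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/submit", methods=["POST"])
    def submit():
        # Append one participant's ratings for one stimulus to a CSV file.
        row = [
            request.form.get("participant_id", ""),
            request.form.get("figure_id", ""),
            request.form.get("anger", ""),
            request.form.get("fear", ""),
            request.form.get("happiness", ""),
            request.form.get("pride", ""),
        ]
        with open("responses.csv", "a", newline="") as f:
            csv.writer(f).writerow(row)
        return "Response recorded."

    if __name__ == "__main__":
        app.run()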










Participants viewed each item in succession, in one of six semi-random orders. For

counterbalancing, each of the 36 stimulus images was assigned a number between 1 and 36. Six

randomly ordered lists of the numbers 1-36 were generated, to create six orders of presentation.

The orders of presentation were then adjusted so that each of the six encoders appeared first on

one list and last on another, and each of the six emotion expressions appeared first on one list

and last on another. The 36 manipulation check items were divided into six groups of six and one

group was appended to the end of each presentation order.
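One hypothetical way to generate presentation orders that satisfy the stated constraints (each encoder and each expression appearing first on one list and last on another) is sketched below; the encoder and expression labels are illustrative, and the study's actual adjustment of the random lists was not necessarily done this way.

    import random

    encoders = ["male 1", "male 2", "female 1", "female 2", "androgynous 1", "androgynous 2"]
    expressions = ["anger", "fear", "anger-fear", "pride", "happiness", "pride-happiness"]
    stimuli = [(enc, exp) for enc in encoders for exp in expressions]  # 36 items

    orders = []
    for i in range(6):
        first = (encoders[i], expressions[i])                      # encoder i and expression i open list i
        last = (encoders[(i + 3) % 6], expressions[(i + 3) % 6])   # a different encoder/expression close it
        middle = [s for s in stimuli if s not in (first, last)]
        random.shuffle(middle)
        orders.append([first] + middle + [last])

    # Each encoder and each expression now appears first on exactly one
    # list and last on exactly one (different) list.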

For the regular items, a series of questions below each stimulus elicited ratings from the

participant on each of the four emotions, on a scale of 1-4. For the manipulation check items, the

two manipulation check questions followed each item (see Appendix D). After entering their

responses, participants used the mouse to click the word "next," and the next item was presented.

After responding to the last item, participants clicked a box labeled "submit" and their responses

were written to a data file. They were then directed to pages thanking them and providing

instructions for receiving credit for their participation (see Appendix E). Participants who began

the study but then decided not to continue participating were able to receive credit by scrolling to

the end of the questionnaire, clicking the submit button, and following the subsequent

instructions.

Evaluation of Hypotheses

The criteria for support for the first hypothesis were as follows: if participants' ratings of

emotions for figures in the ambiguous expression condition were sex-stereotypical, and their

ratings of the emotions of figures in the unambiguous expression conditions were consistent with

the intended interpretation of the expression regardless of target sex, then the first hypothesis

was supported. These criteria were analyzed using two series of Bonferroni-corrected t-tests.










Four t-tests were employed to evaluate whether or not ratings of emotions for figures in

the ambiguous expression condition were sex-stereotypical. For apparently male and apparently

female figures displaying positive-valence and negative-valence blended expressions, ratings of

stereotypically masculine emotions were compared to ratings of stereotypically feminine

emotions. Twelve t-tests were employed to evaluate whether or not ratings of the emotions of

figures in the unambiguous expression conditions were consistent with the intended

interpretation of the expression regardless of encoder sex. For apparently male, apparently

female, and androgynous figures displaying each of the four unambiguous expressions, ratings of

the emotion matching the intended interpretation of the expression were compared with ratings

of the other same-valence emotion.

The criteria for support of the second hypothesis were as follows: in ambiguous

expression conditions, if participant ratings of stereotypically masculine emotions were higher

for cells with unambiguously male figures than for cells in which the figure's apparent sex was ambiguous, and if participant ratings of stereotypically feminine emotions were higher for cells with unambiguously female figures than for cells in which the figure's apparent sex was ambiguous, then the second hypothesis was supported. A series of four Bonferroni-corrected t-

tests was used to evaluate the second hypothesis. Ratings of stereotypically masculine emotions

were compared for apparently male and ambiguously sexed figures displaying positive-valence

and negative-valence blended expressions. Ratings of stereotypically feminine emotions were

compared for apparently female and ambiguously sexed figures displaying positive-valence and

negative-valence blended expressions.
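For readers unfamiliar with the analysis, the sketch below shows the form of one planned comparison: a paired t-test on two emotion ratings, evaluated against a Bonferroni-corrected alpha. The rating arrays are placeholders for illustration, not the study's data.

    import numpy as np
    from scipy import stats

    # Placeholder rating vectors (1-4 scale), one pair of values per participant.
    anger_ratings = np.array([2, 3, 1, 2, 4, 2, 3, 1])
    fear_ratings = np.array([3, 3, 2, 3, 4, 3, 4, 2])

    t, p = stats.ttest_rel(anger_ratings, fear_ratings)   # paired t-test

    n_tests = 4                        # comparisons in the family
    alpha_corrected = 0.05 / n_tests   # Bonferroni-corrected threshold (.0125)
    print(t, p, p < alpha_corrected)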









CHAPTER 4
RESULTS

Data from 8 participants were discarded because the participants completed less than

70% of the items. Data from 163 participants were analyzed. Reliability for facial expression

subscales was assessed by calculating Cronbach's alpha for each of the 6 facial expressions.

Cronbach's alphas were all above .7, indicating adequate reliability.
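The reliability check follows the standard Cronbach's alpha formula, sketched below with an illustrative participants-by-items matrix rather than the study's actual ratings.

    import numpy as np

    def cronbach_alpha(ratings):
        """ratings: 2-D array, rows = participants, columns = items."""
        ratings = np.asarray(ratings, dtype=float)
        k = ratings.shape[1]
        item_variances = ratings.var(axis=0, ddof=1)
        total_variance = ratings.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

    # Illustrative matrix: 5 participants rating 3 items on a 1-4 scale.
    example = np.array([[3, 4, 3], [2, 2, 1], [4, 4, 4], [1, 2, 2], [3, 3, 4]])
    print(round(cronbach_alpha(example), 2))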

Hypothesis One

The first hypothesis predicted that participants' ratings of emotions for ambiguous

expressions would be sex-stereotypical. That is, apparently male figures displaying ambiguous

emotions were predicted to be rated higher on stereotypically masculine emotions than on

stereotypically feminine emotions, and apparently female figures were predicted to be rated

higher on stereotypically feminine emotions than on stereotypically masculine emotions. It was

also predicted that participants' ratings of emotions for unambiguous expressions would be

consistent with the intended interpretations of the expressions regardless of the apparent sex of

the target figures. Expressions constructed using facial action units for one emotion, i.e.

unblended expressions, were considered to be unambiguous expressions for the purpose of

hypothesis testing. Blended expressions were constructed by morphing one unambiguous

expression into another, generating a series of intermediate images. The middle image in the

series was selected for use. These blended expressions were considered to be ambiguous.

The first part of the first hypothesis, regarding ambiguous expressions, was partially

supported. Both apparently male and apparently female figures displaying blended expressions

were rated higher on stereotypically feminine emotions, whereas the hypotheses predicted that

apparently male figures displaying blended expressions would be rated higher on stereotypically

masculine emotions. The following four planned t-tests were employed to evaluate the first part









of the hypothesis: ratings of anger (stereotypically masculine) and fear (stereotypically feminine)

were compared for anger-fear blended expressions on apparently male figures. Anger was rated

significantly lower than fear (t = -2.376, p = .019), which was not consistent with the hypothesis.

Ratings of anger and fear were compared for anger-fear blended expressions on apparently

female figures. Fear was rated significantly higher than anger (t = 16.225, p < .001), which was

consistent with the hypothesis. Ratings of pride (stereotypically masculine) and happiness

(stereotypically feminine) were compared for pride-happiness blended expressions on apparently

male figures. Pride was rated significantly lower than happiness (t = -6.091, p < .001), which was

not consistent with the hypothesis. Ratings of pride (stereotypically masculine) and happiness

(stereotypically feminine) were compared for pride-happiness blended expressions on apparently

female figures. Happiness was rated significantly higher than pride (t = -12.290, p < .001), which

was consistent with the hypothesis.

The second part of the first hypothesis, regarding the correct identification of

unambiguous emotions, was supported except in the case of apparently female figures displaying

unblended pride, in which case participants' ratings of pride and happiness were not significantly

different. Twelve planned t-tests were used to evaluate the second part of the first hypothesis.

The first four t-tests dealt with apparently male figures: For apparently male figures

displaying unblended anger, ratings of anger (stereotypically masculine) were compared with

ratings of fear (stereotypically feminine). Anger was rated significantly higher than fear (t =

43.362, p < .001), which was consistent with the hypothesis. For apparently male figures

displaying unblended fear, ratings of anger were compared with ratings of fear. Anger was rated

significantly lower than fear (t = -27.333, p < .001), which was consistent with the hypothesis.

For apparently male figures displaying unblended pride, ratings of pride (stereotypically









masculine) were compared with ratings of happiness (stereotypically feminine). Pride was rated

significantly higher than happiness (t = 8.628, p < .001), which was consistent with the

hypothesis. For apparently male figures displaying unblended happiness, ratings of pride were

compared with ratings of happiness. Pride was rated significantly lower than happiness (t = -

9.700, p < .001), which was consistent with the hypothesis.

A corresponding set of four t-tests was used to evaluate participants' identification of

unambiguous emotions of apparently female figures. For apparently female figures displaying

unblended anger, ratings of anger (stereotypically masculine) were compared with ratings of fear

(stereotypically feminine). Anger was rated significantly higher than fear (t = 41.117, p < .001),

which was consistent with the hypothesis. For apparently female figures displaying unblended

fear, ratings of anger were compared with ratings of fear. Anger was rated significantly lower

than fear (t = -33.185, p < .001), which was consistent with the hypothesis. For apparently

female figures displaying unblended pride, ratings of pride (stereotypically masculine) were

compared with ratings of happiness (stereotypically feminine). Ratings of pride were not

significantly different than ratings of happiness (t = -.009, p = .993), which was not consistent

with the hypothesis. For apparently female figures displaying unblended happiness, ratings of

pride were compared with ratings of happiness. Pride was rated significantly lower than

happiness (t = -13.009, p < .001), which was consistent with the hypothesis.

A third set of four t-tests was used to evaluate participants' identification of unambiguous

emotions of figures with ambiguous apparent sex. For ambiguously sexed figures displaying

unblended anger, ratings of anger (stereotypically masculine) were compared with ratings of fear

(stereotypically feminine). Anger was rated significantly higher than fear (t = 29.894, p < .001),

which was consistent with the hypothesis. For ambiguously sexed figures displaying unblended









fear, ratings of anger were compared with ratings of fear. Anger was rated significantly lower

than fear (t = -27.751, p < .001), which was consistent with the hypothesis. For ambiguously

sexed figures displaying unblended pride, ratings of pride (stereotypically masculine) were

compared with ratings of happiness (stereotypically feminine). Pride was rated significantly

higher than happiness (t = 8.411, p < .001), which was consistent with the hypothesis. For

ambiguously sexed figures displaying unblended happiness, ratings of pride were compared with

ratings of happiness. Pride was rated significantly lower than happiness (t = -15.632, p < .001),

which was consistent with the hypothesis.

Hypothesis Two

In the second hypothesis it was predicted that when the target figure's sex was

ambiguous and the target figure was displaying an ambiguous expression of emotion,

participants would assign lower ratings of stereotypically masculine emotions than they would to

apparently male target figures, and that they would assign lower ratings of stereotypically

feminine emotions than they would to apparently female target figures. The second hypothesis

was partially supported. Participants rated ambiguously sexed figures lower on stereotypically

masculine emotions than they did apparently male figures. Comparisons of participants' ratings

of stereotypically feminine emotions for ambiguously sexed figures versus apparently female

figures yielded results that did not reach the Bonferroni-corrected threshold for significance (p < .013), but would have reached significance without the correction.

Four planned t-tests were used to evaluate the second hypothesis. Ratings of anger for

apparently male figures displaying blends of anger and fear were compared to ratings of anger

for ambiguously sexed figures displaying blends of anger and fear. Ratings of anger were higher

for apparently male figures than for ambiguously sexed figures (t = 9.467, p < .001), which was

consistent with the hypothesis. Ratings of fear for apparently female figures displaying blends of









anger and fear were compared to ratings of fear for ambiguously sexed figures displaying blends of anger and fear. Ratings of fear were not significantly higher for apparently female figures than for ambiguously sexed figures after Bonferroni correction of alpha to account for the four t-tests (t = 1.81, p = .036). This was not consistent with the hypothesis. Ratings of pride for apparently male figures displaying blends of pride and happiness were compared to ratings of pride for ambiguously sexed figures displaying blends of pride and happiness. Ratings of pride were higher for apparently male figures than for ambiguously sexed figures (t = 2.34, p = .011), which was consistent with the hypothesis. Ratings of happiness for apparently female figures displaying blends of pride and happiness were compared to ratings of happiness for ambiguously sexed figures displaying blends of pride and happiness. Ratings of happiness were not significantly higher for apparently female figures than for ambiguously sexed figures after Bonferroni correction of alpha (t = 2.117, p = .018), which was not consistent with the hypothesis.

Additional Analyses

An additional set of t-tests was employed to examine the premise of the first part of the

first hypothesis, i.e. that participants would exhibit sex bias in the interpretation of ambiguous

expressions of emotion on apparently male and apparently female target figures. In the additional

analysis, apparently male and apparently female figures were compared on ratings of

stereotypically masculine emotions, and on ratings of stereotypically feminine emotions.

Four t-tests were used for the additional analysis. Ratings of anger were compared for

apparently male figures displaying blends of anger (stereotypically masculine) and fear

(stereotypically feminine) and for apparently female figures displaying blends of anger and fear.

Apparently male figures were rated as significantly angrier than were apparently female figures

(t = 3.686, p < .001). Ratings of fear were compared for apparently male figures displaying

blends of anger and fear and for apparently female figures displaying blends of anger and fear.










Apparently male figures were rated as being significantly less fearful than apparently female

figures (t = -12.041, p < .001). Ratings of pride were compared for apparently male figures

displaying blends of pride (stereotypically masculine) and happiness (stereotypically feminine)

and for apparently female figures displaying blends of pride and happiness. Apparently male

figures were rated as significantly prouder than were apparently female figures (t = 4.269, p <

.001). Ratings of happiness were compared for apparently male figures displaying blends of

pride and happiness and for apparently female figures displaying blends of pride and happiness.

Apparently male figures were rated as significantly less happy than apparently female figures (t

= -3.023, p = .003). These results are all consistent with sex-biased interpretation of ambiguous

expressions of emotion.

Apparent Sex Manipulation Check

Figures intended to be unambiguously male were consistently rated as such. The first

male figure was rated as male by 159 of 163 participants, and the second was rated as male by

160 of 163 participants. Figures intended to be unambiguously female were also consistently

rated as such. Both female figures were rated as female by 161 of 163 participants. As expected,

there was less agreement regarding the figures whose sex was intended to be ambiguous. The

first ambiguous figure was rated as male by 101 participants, as female by 36 participants, and

26 participants chose the response "don't know." The second ambiguous figure was rated as

male by 68 participants, as female by 50 participants, and 44 participants chose the response

"don't know."

Summary

The first hypothesis of our study was partially supported by the results. The first part of

the first hypothesis was not supported in the primary analysis, because participants rated blended

expressions significantly higher on stereotypically feminine emotions regardless of the apparent









sex of the target figure. However, the additional analysis did reveal a pattern of sex-stereotypical

ratings of ambiguous emotions in that apparently male figures displaying blended expressions

were rated higher on stereotypically masculine emotions than were apparently female figures

displaying blended emotions, and apparently female figures displaying blended expressions were

rated higher on stereotypically feminine emotions than were apparently male figures displaying

blended emotions. The second part of the first hypothesis was mostly supported in that in 11 of

12 t-tests, results were consistent with participants assigning the highest ratings to the intended

emotions regardless of apparent sex of target and whether the intended expression portrayed a

stereotypically masculine or stereotypically feminine emotion.

The second hypothesis of our study was also partially supported by the results.

Apparently male figures displaying blended emotions were rated higher on stereotypically

masculine emotions than were ambiguously sexed figures displaying blended expressions, as

predicted. However, apparently female figures displaying blended expressions were not rated

higher on stereotypically feminine emotions than were ambiguously sexed figures displaying

blended expressions.









CHAPTER 5
DISCUSSION

The theory underlying the hypotheses was generally supported. Some elements of the

hypotheses were not fully supported, but explanations that preserve the essential ideas readily

present themselves. In one case there seems to have been a problem with the calibration of the

stimuli, and additional analyses correcting for this reveal the predicted effect. In another, the test

seems to have been very slightly underpowered for use with the conservative Bonferroni

correction to avoid family-wise error. If these rationales can be accepted, then the results provide

evidence for a sex bias in the interpretation of visual expressions of emotion. This may partially

explain the persistent exaggeration of sex differences in emotion in the popular imagination, as

compared with the scientific literature.

Tests of Hypotheses

Testing of the first hypothesis yielded mixed results. The first part of the first hypothesis,

i.e. that people would interpret ambiguous expressions in a biased way according to the apparent

sex of the target, was only partially supported. For blended expressions, participants rated the

stereotypically feminine emotions higher for both apparently male and apparently female faces.

However, the anticipated sex-bias was observed in the follow-up analyses. In blended expression

conditions, ratings of stereotypically masculine emotions were significantly higher on apparently

male faces as compared to apparently female faces, and ratings of stereotypically feminine

emotions were significantly higher on apparently female faces than on apparently male faces.

The concept underlying the hypothesis appears to have garnered support even though not

all parts of the hypothesis did. The failure to fully support the hypothesis may have resulted from

the blended expressions not being perceived as sufficiently ambiguous, or perhaps they did not

fall at the psychological midpoint between the pairs of unblended expressions from which they









were made. Put another way, the psychological midpoint between two expressions, upon which

the hypothesis was based, may be different from the mathematical midpoint, which is what was

used for our study. An investigation into where that midpoint actually lies, and whether it is best

approximated by morphing unambiguous expressions together or by some other method of

combining elements of unambiguous expressions, could be a fruitful area for further research.

The second part of the first hypothesis was supported in 11 of the 12 comparisons made.

Participants correctly identified unblended emotions except in the case of female faces

displaying unblended pride. There was no difference in their ratings of perceived pride and

perceived happiness in this condition. In fact, all figures in the study displaying positive affect

were rated relatively highly on both pride and happiness. In contrast, expressions of

unambiguous fear received comparatively low ratings on anger, and expressions of unambiguous

anger received similarly low ratings on fear.

The reasons for this difference are unclear, but perhaps worth speculating about. It could

be that happiness and pride were simply not adequately differentiated in the creation of the

stimuli. It is also possible that emotions of positive valence are not psychologically differentiated

to the same extent as emotions of negative valence, or at least that "happiness" is a somewhat

broad or undifferentiated emotion. For example, pride, love and pleasant surprise may all be

described as "happy" emotions. For emotions of negative valence, the term "unhappy" could

perform a similar role in categorizing them. The important difference for the purpose of this

work is that happiness is described in the facial expression literature as a recognizable basic

emotion in its own right, whereas there is no described expression that is simply "unhappy"

without also being something else, such as sad, angry, or fearful. It might also be a useful area of









further investigation to explore how differentiated expressions of positively valenced emotions

are as compared with negatively valenced emotions.

Testing of the second hypothesis also yielded mixed results. In blended expression

conditions, figures of ambiguous apparent sex were rated lower on stereotypically masculine emotions than were apparently male figures. This was what was predicted: if the effect being examined is truly a sex bias, the effect should disappear or be attenuated when the sex of the target is unclear. However, figures of ambiguous sex were not rated significantly lower on stereotypically feminine emotions than were apparently female figures. This probably indicates that the test was under-powered: the results comparing ambiguously sexed figures with apparent females approached significance in the predicted direction. It could in theory also indicate poor manipulation of apparent sex, with participants tending to believe that figures intended to be

ambiguous were in fact females. Examination of the manipulation check data contradicts this,

however. In fact, if poor manipulation were a problem, one would expect the ratings for

ambiguous figures to have been closer to those for apparent males rather than apparent females

as observed, since both ambiguously sexed figures were rated as male more often than female.

Sex Bias in Interpreting Affect as a Reinforcer of Cultural Stereotypes

In the introduction, it was observed that there is a popular perception that men's and

women's emotional experiences are very different, while the research literature indicates that the

emotional worlds of men and women are much more alike than different. The purpose of our

study was to examine one possible explanation for this disjunction between popular perception

and empirical reality, namely, that there is a sex bias in play when people decode the emotional

expressions of others. Previous studies had revealed some evidence to that effect, but were

limited by having to trade off realism against control in the creation of stimuli.









Our study brought new tools to the task. Realistic computer-generated faces with finely

controllable emotional expressions made it possible for apparent males and females to be shown

displaying exactly the same expression, as opposed to the best approximation of a human trying to

pose the expression. The same tools made it possible to create encoders of ambiguous apparent

sex, and again to show them displaying precisely the same expressions of emotion as the

apparently male and female figures, in order to see if the hypothesized sex bias was eliminated or attenuated when the target's sex was not clear.

The results gave partial support to the hypotheses as they were formulated, but appear to

support the basic idea of a sex bias in the decoding of expressions of emotion. Apparently male

figures and apparently female figures displaying precisely the same facial expressions and

postures were rated differently, and the difference in the ratings reflected stereotypes about

which emotions are considered masculine and which are considered feminine. This indicates sex

bias. When a figure was neither clearly male nor clearly female, it was rated lower on stereotypically masculine emotions than were apparently male figures. There was a similar trend in comparisons between ambiguous figures and apparently female figures, which fell short of statistical significance by a

slim margin. Firm conclusions can't be drawn about trends that fall short of significance, but the

pattern here is tantalizingly close to evidence that figures that are obviously male or obviously

female receive an extra boost in ratings of stereotypically masculine and stereotypically feminine

emotions respectively, and that figures that are not obviously male or female receive no such

boost. If this were definitively shown to be the case, this would also indicate sex bias. The results

regarding the ambiguously sexed figures are consistent with the presence of sex bias in the

interpretation of expressions of emotion, though they do not demonstrate it clearly.









As with many other types of bias, the practical implications of research indicating a sex

bias in the interpretation of emotions relate to the fact that when people are aware of a bias, they

have an opportunity to correct for it. In social life as well as in any professional arena in which

the accurate understanding of the emotional signals of others is important, knowledge of this

particular bias may create a chance for improved communication. Mental health practitioners

may be particularly impacted, because they deal so directly with interpreting the emotional states

of clients. Another possible implication for counseling is that clients themselves may be made

aware of the sex bias in interpreting the emotional states of others, and thereby be given an

opportunity to try to counteract it in their own inferences. This could be particularly useful in

couples counseling.

Limitations of this Investigation

Our study had a few limitations. The significant time involved in learning to use the

Poser 6 software to create encoders, combined with the desirability of keeping the questionnaire

short enough to avoid participant fatigue, led to the inclusion of only six encoders in the study.

The inclusion of more figures would have diluted any unintended effects of the figures

themselves as opposed to the desired effects of the expressions and the apparent sexes of the

figures.

The use of a larger number of participants might have helped the results pertaining to the

second hypothesis be more conclusive. All of the observed results were in the predicted

direction, but after Bonferroni correction to avoid inflating family-wise error, some were not quite

statistically significant. Inadequate power seems a likely cause of this, although it is certainly

possible that the predicted effect simply is not there.
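As an aside on the power issue, a prospective sample-size estimate for a paired t-test could be sketched as follows; the effect size, power target, and corrected alpha are assumptions chosen for illustration, not values taken from the study.

    from statsmodels.stats.power import TTestPower

    # Solve for the number of participants a paired t-test would need to
    # detect an assumed small effect (Cohen's d = 0.2) with 80% power at
    # the Bonferroni-corrected alpha of .05 / 4.
    n_needed = TTestPower().solve_power(effect_size=0.2,
                                        alpha=0.05 / 4,
                                        power=0.80,
                                        alternative="larger")
    print(round(n_needed))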

The use of a wider variety of blends of emotions in the stimuli might have added some

clarity to the testing of the first hypothesis. The test of the first hypothesis confounded the effect









of sex bias with the effect of a less relevant aspect of the blended expression stimuli. The

blended stimuli were perceived as being more representative of stereotypically feminine

emotions, despite being midpoints between stereotypically masculine and feminine emotions

from a mathematical point of view. A variety of blended expressions made from different ratios

of the constituent emotions might have made it possible to find a blend that functioned better as a

midpoint in the way it was perceived by participants.

Implications for Future Research

Several possibilities for further investigation are suggested by the outcome of our study.

For example, the development of stimuli could be taken further. By creating stimuli showing a

variety of blended emotional expressions, made with various ratios of the constituent emotions

and perhaps different methods of combining expressions, it may be possible to gain considerable

insight into the way people decode ambiguous, ambivalent, or complex emotional expressions.

Would blends of emotional expressions other than the ones examined here behave the same way,

i.e. would the midpoint generated by morphing two together be interpreted as representing one of

the constituent emotions more than the other? If so, what would characterize the emotion

perceived to be dominant? Might it be, as it was here, the more stereotypically feminine of the

two? Also, what ratio or method of combination would maximize ambiguity? The answers to

these questions would be interesting in themselves, and could also lead to the creation of better

tools for the exploration of other questions about the decoding of facial expressions.

In our study, pleasant and unpleasant emotional expressions seem to have behaved

differently. The emotion ratings for pride and happiness were both rather high not only for

expressions blending happiness and pride, but also for expressions of supposedly pure happiness,

or pure pride. Expressions of anger and fear were rated much more monolithically. Further

investigation could clarify the source of this difference. Perhaps in our study the manipulation of









this pair of expressions was inadequate. On the other hand, perhaps happiness and pride are

poorly differentiated in people's minds. Perhaps positively valenced emotions in general are less

differentiated than negatively valenced emotions. Evolutionary psychology might provide a

rationale for hypothesizing that negative emotions are interpreted at a higher resolution or in a

more differentiated way. It could be argued that in the more dangerous environments faced by

our distant ancestors, being wrong about whether another person was angry or afraid would be

more likely to preclude successful procreation (by leading to death, for example) than would

being wrong about whether another was proud or amused.

Another possible avenue for future investigation would be to look at the influence of

contextual factors on the observed sex bias in interpreting expressions of emotion. This could

take any number of forms. Participants could be given information about the figures, such as

some indication of their personalities, social status, or sexual orientations. Alternatively, the

participants' mindsets could be manipulated, for example by having experimental groups read

excerpts from John Gray's or Janet Shibley Hyde's work. It would be interesting to see how

responsive the bias is to changes in awareness on the part of the decoders.

It is possible that different groups may exhibit different levels of bias in interpreting

emotional expressions. Future research could reveal variations in the amount of bias among such

groups as mental health professionals versus lay people, experienced versus inexperienced

mental health professionals, younger versus older people, and so on.

Conclusion

Our study brought new tools and finer control to the study of sex of encoder effects on

the interpretation of visual displays of affect. With some qualifications, evidence of a bias

consistent with cultural stereotypes of men's and women's emotions was found. This bias may

contribute to the observed disjunction between popular perceptions about men and women being










from different emotional planets and the scientific literature indicating that sex differences in

emotion tend to be small, are often situational, and are dwarfed in comparison with the

similarities.










APPENDIX A
EXAMPLE STIMULI


Male 1, anger    Male 2, pride
Androgynous 1, fear-anger blend    Androgynous 2, happiness-pride blend
Female 1, fear    Female 2, happiness











APPENDIX B
INFORMED CONSENT STATEMENT

Informed Consent
Protocol Title: Interpreting Expressions of Emotion

Please read this consent document carefully before you decide to participate in this study.

Purpose of the research study:

The purpose of this study is to better understand how people interpret facial expressions of emotion.

What you will be asked to do in the study:

You will fill out a brief demographic questionnaire, then you will see a series of images and answer a few questions
about each image.

Time required:

About 20-30 minutes.

Risks and Benefits:

We do not anticipate that you will benefit directly or be harmed in any way by participating in this experiment.

Compensation:

If you are a UF general psychology student in the participant pool, you will receive one research credit in return for
your participation. After submitting your answers to all the questions, you will be asked to enter your Gatorlink ID
in order to receive your credit.

If you are not part of the psychology participant pool, your instructor will determine and announce in class the
amount of extra credit you will receive for your participation. If your instructor decides not to award extra credit,
you will not be compensated for your participation. After you complete the survey, you must print out the "Thank
You" page and give it to your instructor in order to receive credit.

Confidentiality:

Your identity will be kept confidential to the extent provided by law. Your responses will be saved without any
identifying information. If you provide your Gatorlink ID, it will be stored in a separate file and used only for the
purpose of assigning credit. Neither your name nor your ID number can be connected with your responses to the
survey.

Voluntary participation:

Your participation in this study is completely voluntary. There is no penalty for not participating.

Right to withdraw from the study:

You have the right to withdraw from the study at any time without consequence.











Whom to contact if you have questions about the study:

Primary Investigator: Kevin Stanley, M.S., Graduate Student, Psychology Department, (352) 379-7918,
stanleyk~,counsel.ufl.edu

Faculty Supervisor: Martin Heesacker, Ph.D., Chair, Psychology Department, PSY 144A, (352) 392-0601 x 200,
heesackiiufl.edu

Whom to contact about your rights as a research participant in the study:

UFIRB Office, Box 112250, University of Florida, Gainesville, FL 32611-2250, (352) 392-0433, irb2@ufl.edu

Agreement:

I have read the procedure described above. By clicking the link below I agree to voluntarily participate in this
research study.

Click here to enter.









APPENDIX C
DEMOGRAPHIC QUESTIONNAIRE

Please enter the following demographic information about yourself:


Sex: Male Female


Age:

Education:
Undergraduate:

1st year 2nd year 3rd year 4th year
Graduate or Professional

Race/Ethnicity (select all that apply):
American Indian or Alaska Native
Asian
Black or African American
Native Hawaiian or Other Pacific Islander
White

Hispanic or Latino

If you are an international student, what is your home country?









APPENDIX D
EXAMPLE ITEMS

Example 1: regular questionnaire item:

Please look at this picture, and then answer the questions below it:


1. On a scale of 1-4, how angry does this person look?
   1 Not at all    2 Slightly    3 Some    4 Very much

2. On a scale of 1-4, how afraid does this person look?
   1 Not at all    2 Slightly    3 Some    4 Very much

3. On a scale of 1-4, how happy does this person look?
   1 Not at all    2 Slightly    3 Some    4 Very much

4. On a scale of 1-4, how proud does this person look?
   1 Not at all    2 Slightly    3 Some    4 Very much










Example 2: manipulation check item:

Please look at this picture, and then answer the questions below it:


1. This person is...
1 Male    2 Female    3 Not sure

2. Is this person's expression ambiguous?
1 Yes 2 No









APPENDIX E
INSTRUCTIONS FOR RECEIPT OF CREDIT AND THANK-YOU MESSAGE

Instructions for receipt of credit:


If you are in the UF general psychology participant pool, please enter your Gatorlink ID in the
box below, then click the submit button. This will allow us to assign you your research credit.

If you are not in the UF general psychology participant pool, click submit without entering
anything in the Gatorlink ID box. You will print out the following page and submit it to your
instructor to receive credit for your participation.

Gatorlink ID:

Submit



Thank you/debriefing message:

Thank you!

Your participation in this research project is appreciated. Your responses will help improve our
understanding of how people interpret the emotional states of others from facial expressions and
body postures.

If you are not in the UF general psychology participant pool, please print this page and submit it
to your instructor with your name printed legibly at the top.









LIST OF REFERENCES


Algoe, S. B., Buswell, B. N., & DeLamater, J. D. (2000). Gender and job status as contextual
cues for the interpretation of facial expression of emotion. Sex Roles, 42(3-4), 183-208.

Atkinson, A. P., Tipples, J., & Burt, D. M. (2005). Asymmetric interference between sex and
emotion in face perception. Perception & Psychophysics, 67(7), 1199-1213.

Aubrey, J. S., & Harrison, K. (2004). The gender-role content of children's favorite television
programs and its link to their gender perceptions. Media Psychology, 6(2), 111-146.

Baron-Cohen, S., Wheelwright, S., & Jolliffe, T. (1997). Is there a 'language of the eyes'?
Evidence from normal adults, and adults with autism or Asperger syndrome. Visual
Cognition, 4(3), 311-331.

Baron-Cohen, S. (2003). The essential difference: The truth about the male and female brain. New York: Perseus Books Group.

Becker, D. V., Kenrick, D. T., Neuberg, S. L., Blackwell, K. C., & Smith, D. M. (2007) The
confounded nature of angry men and happy women. Journal of Personality and Social Psychology, 92(2), 179-190.

Becker, R. (Writer). (1991). Defending the caveman [Broadway play]. United States: Theater
Mogul NA, Inc.

Billings, A. C., Angelini, J. R., & Eastman, S. T. (2005). Diverging discourses: Gender
differences in televised golf announcing. Mass Communication and Society, 8(2), 155-171.

Brody, L. R. (2000). The socialization of gender differences in emotional expression: display
rules, infant temperament, and differentiation. In A. H. Fischer (Ed.), Gender and emotion:
Social psychological perspectives. (pp. 24-47). New York: Cambridge University Press.

Buck, R., Miller, R. E., & Caul, W. F. (1974). Sex, personality, and physiological variables in
the communication of affect via facial expression. Journal of Personality and Social
Psychology, 30(4), 587-596.

Canary, D. J., & Emmers-Sommer, T. M. (with Faulkner, S.) (1997). Sex and gender differences
in personal relationships. New York: Guilford Press.

Condry, J., & Condry, S. (1976). Sex differences: A study of the eye of the beholder. Child
Development, 47(3), 812-819.

Dimitrovsky, L., Spector, H., & Levy-Shiff, R. (2000). Stimulus gender and emotional difficulty
level: Their effect on recognition of facial expressions of affect in children with and
without LD. Journal of Learning Disabilities, 33(5), 410-416.










Dundes, L. (2001). Disney's modern heroine Pocahontas: Revealing age-old gender stereotypes
and role discontinuity under a facade of liberation. The Social Science Journal, 38(3), 353-
365.

Eiland, R., & Richardson, D. (1976). The influence of race, sex, and age on judgments of
emotion portrayed in photographs. Communication Monographs, 43(3), 167-175.

Ekman, P. (1993). Facial expression and emotion. American Psychologist, 48(4), 384-392.

Ekman, P., & Friesen, W. V. (1976). Measuring facial movement. Environmental Psychology &
Nonverbal Behavior, 1(1), 56-75.

Ekman, P., & Friesen, W. V. (1987). Universals and cultural differences in the judgments of
facial expressions of emotion. Journal of Personality and Social Psychology, 53(4), 712-
717.

Ekman, P., Friesen, W. V., & Ellsworth, P. (1972) Emotion in the human face: Guidelines for
research and an integration of findings. Oxford: Pergamon Press.

Ekman, P., Friesen, W. V., & Hager, J. C. (2002). Facial Action Coding System [Computer
software]. Salt Lake City, Utah: A Human Face.

Epinions (2000). Epinions.com Defending the caveman [web site]. Shopping.com, Inc.: http://www.epinions.com/trvl-review-201D-4562E89D-3A4BAE9C-prod3 [accessed March 2006, April 2007].

Erwin, R. J., Gur, R. C., Gur, R. E., Skolnick, B., Mawhinney-Hee, M., & Samalis, J. (1992).
Facial emotion discrimination: I. Task construction and behavioral findings in normal
subjects. Psychiatry Research, 42(3), 231-240.

Felleman, E. S., Barden, R. C., Carlson, C. R., Rosenberg, L., & Masters, J. C. (1983).
Children's and adults' recognition of spontaneous and posed emotional expressions in
young children. Developmental Psychology, 19(3), 405-413.

Fink, J. S., & Kensicki, L. J. (2002). An imperceptible difference: Visual and textual
constructions of femininity in Sports Illustrated and Sports Illustrated for Women. Mass Communication and Society, 5(3), 317-339.

Gray, J. (1992). Men are from Mars, women are from Venus: A practical guide for improving communication and getting what you want in your relationships. New York: HarperCollins.

Gray, J. (2006). About John Gray: Men are from Mars, women are from Venus [web site].
MarsVenus.com: http://www.marsvenus.com/JohnGrayProfile.php. [accessed March
2006, April 2007].










Hall, J. A., Carney, D. R., & Murphy, N. A. (2002) Gender differences in smiling. In M. H. Abel
(Ed.), An empirical reflection on the smile. (pp. 155-185). Lewiston, NY: Edwin Mellen
Press.

Hess, U., Adams, R. B. Jr., & Kleck, R. E. (2004). Facial appearance, gender, and emotion
expression. Emotion 4(4), 378-388.

Hess, U., Adams, R. B. Jr., & Kleck, R. E. (2005). Who may frown and who should smile?
Dominance, affiliation, and the display of happiness and anger. Cognition & Emotion,
19(4), 515-536.

Hess, U., Blairy, S., Kleck, R. E. (1997). The intensity of emotional facial expressions and
decoding accuracy. Journal of Nonverbal Behavior, 21(4), 241-257.

Hess, U., Blairy, S., Kleck, R. E. (2000). The influence of facial emotion displays, gender, and
ethnicity on judgments of dominance and affiliation. Journal of Nonverbal Behavior,
24(4), 265-283.

Hugenberg, K., & Sczesny, S. (2006). On wonderful women and seeing smiles: Social
categorization moderates the happy face response latency advantage. Social Cognition,
24(5), 516-539.

Hyde, J. S. (2005). The gender similarities hypothesis. American Psychologist, 60(6), 581-592.

Keltner, D. (1995). Signs of appeasement: Evidence for the distinct displays of embarrassment,
amusement, and shame. Journal of Personality and Social Psychology, 68(3), 441-454.

Knudsen, H. R., & Muzekari, L. H. (1983) The effects of verbal statements of context on facial
expressions of emotion. Journal of Nonverbal Behavior, 7(4), 202-212.

LaFrance, M. and Banaji, M. (1992). Toward a reconsideration of the gender-emotion
relationship. In Clark, M. S. (Ed.), Emotion and social behavior (pp. 178-201). Thousand Oaks,
CA: Sage.

LaFrance, M., Hecht, M. A., & Paluck, E.L. (2003). The contingent smile: A meta-analysis of
sex differences in smiling. Psychological Bulletin, 129(2), 305-334.

Lively, K. J., & Heise, D. R. (2004). Sociological realms of emotional experience. American
Journal of Sociology 109(5), 1109-1136.

Major, B., Carnevale, P. J., & Deaux, K. (1981). A different perspective on androgyny:
Evaluations of masculine and feminine personality characteristics. Journal of Personality
and' Social Psychology, 41(5), 988-1001.

Mignault, A., & Chaudhuri, A. (2003). The many faces of a neutral face: Head tilt and perception
of dominance and emotion. Journal of Nonverbal Behavior, 27(2), 111-132.










Moir, A. & Moir, W. (2003). Why men don't iron: The fascinating and unalterable differences
between men and women. New York: Citadel Press.

O'Kearney, R., & Dadds, M. (2004). Developmental and gender differences in the language for
emotions across the adolescent years. Cognition & Emotion, 18(7), 913-938.

Palermo, R., & Coltheart, M. (2004). Photographs of facial expression: Accuracy, response
times, and ratings of intensity. Behavior Research Methods, Instruments, & Computers,
36(4), 634-638.

Pease, A. & Pease, B. (2001). Why men don't listen and women can't read maps: How we're
different and what to do about it. New York: Broadway Books.

Pease, B. & Pease, A. (2004). Why men don't have a clue and women always need more shoes:
The ultimate guide to the opposite sex. New York: Broadway Books.

Pell, M. D. (2002). Evaluation of nonverbal emotion in face and voice: Some preliminary
findings on a new battery of tests. Brain and Cognition, 48(2-3), 499-504.

Philippot, P., Feldman, R. S., & Coats, E. J. (2003). The role of nonverbal behavior in clinical
settings: Introduction and overview. In P. Phillipot, R. S. Feldman, & E. J. Coats (Eds.),
Nonverbal behavior in clinical settings. New York: Oxford University Press.

Plant, E. A., Hyde, J. S., Keltner, D., & Devine, P. G. (2000). The gender stereotyping of
emotions. Psychology of Women Quarterly, 24(1), 81-92.

Plant, E. A., Kling, K. C., & Smith, G. L. (2004). The influence of gender and social role on the
interpretation of facial expressions. Sex Roles, 51(3-4), 187-196.

Rahman, Q., Wilson, G. D., & Abrahams, S. (2004). Sex, sexual orientation, and identification of
positive and negative facial affect. Brain and Cognition, 54(3), 179-185.

Rhodes, S. E. (2004). Taking sex differences seriously. San Francisco: Encounter Books.

Rotter, N. G., & Rotter, G. S. (1988). Sex differences in the encoding and decoding of negative
facial emotions. Journal ofNonverbal Behavior, 12(2), 139-148.

Seidman, S. A. (1992). An investigation of sex-role stereotyping in music videos. Journal of
Broadcasting and Electronic Media, 36(2), 209-216.

Signorielli, N. (1989). Television and conceptions about sex roles: Maintaining conventionality
and the status quo. Sex Roles 21(5-6), 341-360.

Simon, R. W., & Nath, L. E. (2004). Gender and Emotion in the United States: Do men and
women differ in self-reports of feelings and expressive behavior? American Journal of
Sociology 109(5), 1137-1176.










Simpson, P. A., & Stroh, L. K. (2004). Gender differences: Emotional expression and feelings of
personal inauthenticity. Journal of Applied Psychology, 89(4), 715-721.

Stern, S. R., & Mastro, D. E. (2004). Gender portrayals across the life span: A content analytic
look at broadcast commercials. Mass Communication and Society, 7(2), 215-236.

Tannen, D. (1990). You just don't understand: Women and men in conversation. New York:
William Morrow & Co.

Thayer, J. F., & Johnsen, B. H. (2000). Sex differences in judgment of facial affect: A
multivariate analysis of recognition errors. Scandinavian Journal of Psychology, 41(3),
243-246.

Theater Mogul (2006). Defending the caveman: About Rob Becker [web site]. Theater Mogul
NA, Inc.: http://www.cavemania.com/05-about-rob.html. [accessed March 2006, April
2007].

Thompson, J. K. (1983). Visual field, exposure duration, and sex as factors in the perception of
emotional facial expressions. Cortex, 19(3), 293-308.

Thompson, T. L., & Zerbinos, E. (1995). Gender roles in animated cartoons: Has the picture
changed in 20 years? Sex Roles, 32, 651-673.

Tracy, J. L., & Robins, R. W. (2004). Show your pride: Evidence for a discrete emotion
expression. Psychological Science, 15(3), 194-197.

Vogel, D. L., Wester, S. R., Heesacker, M., & Madon, S. (2003). Confirming gender stereotypes:
A social role perspective. Sex Roles 48(11-12), 519-528.

Wagner, H. L., MacDonald, C. J., & Manstead, A. S. (1986). Communication of individual
emotions by spontaneous facial expressions. Journal of Personality and Social
Psychology, 50(4), 737-743.

Wallbott, H. G. (1988). Big girls don't frown, big boys don't cry--Gender differences of
professional actors in communicating emotion via facial expression. Journal of Nonverbal
Behavior, 12(2), 98-106.

Wester, S. R., Vogel, D. L., Pressley, P. K., & Heesacker, M. (2002). Sex differences in emotion:
A critical review of the literature and implications for counseling psychology. The
Counseling Psychologist, 30(4), 630-652.

Weinberg, L., Shen, J., Walther, J., Bryant, A., Werner, S., Mack, R., et al. (2005). Poser
(Version 6) [Computer software]. Santa Cruz, CA: E frontier, Inc.

Widen, S. C., & Russell, J. A. (2002). Gender and preschoolers' perception of emotion. Merrill-
Palmer Quarterly, 48(3), 248-262.










Zuckerman, M., Lipets, M. S., Koivumaki, J. H., & Rosenthal, R. (1975). Encoding and
decoding nonverbal cues of emotion. Journal of Personality and Social Psychology, 32(6),
1068-1076.









BIOGRAPHICAL SKETCH

Kevin Stanley received his Bachelor of Science degree in May of 1996 from the University

of Florida in Gainesville, Florida. He majored in Psychology. Stanley entered the doctoral

program in Counseling Psychology at the University of Florida in August of 1996. Stanley

received his Master of Science degree from the Counseling Psychology program at the

University of Florida in August of 2001. Upon completion of his Ph.D., Stanley plans to embark

on a career as a Counseling Psychologist in a direct-service setting.






Abstract of Dissertation Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy

SEX BIAS IN INTERPRETING EMOTIONAL STATES FROM VISUAL CUES

By

Kevin E. Stanley

August 2007

Chair: Martin Heesacker
Major: Counseling Psychology

CHAPTER 1
INTRODUCTION

Overview

Our study is an investigation of one mechanism that may serve to generate and perpetuate overestimations of sex differences in emotion. A sex bias in the interpretation of visual displays of affect could amplify any real differences in emotional expression between the sexes and/or create the impression of differences where none exist. In our study the interpretation of visual affect is examined utilizing recent advances in readily accessible computer graphics software that allow the manipulation of apparent sex and facial expression of the encoder at a previously unavailable level of control.

Portrayal of Sex Differences in Popular Culture

The idea that women and men experience emotion in starkly different ways seems to be quite popular. The prototypical example is probably Gray's Men Are From Mars, Women Are From Venus (1992). This and his other books on the topic have sold over 14 million copies worldwide and have been translated into 40 different languages (Gray, 2006). Another classic in the popular literature, Tannen's (1990) You Just Don't Understand: Women and Men in Conversation, contrasts men's fact-based, instrumental "report talk" with women's emotion-based, relational "rapport talk" approaches to communication.

Many other successful books published in recent years also make extensive use of the premise that large sex differences exist in emotion. For example, two books by Pease and Pease (2001, 2004) became number-one bestsellers on the International Bestsellers list. The authors of these and other popular books assert that women and men exhibit significant, sex-based emotional differences. Many further argue that those differences are innate and unchangeable, citing research from evolutionary psychology, comparative psychology, and biological psychology to support those claims (e.g., Baron-Cohen, 2003; Moir & Moir, 2000; Rhodes, 2004).

The sex-differences theme can be found in other popular media, such as the theatre. The record for the longest-running solo play on Broadway is held by Rob Becker's Defending the Caveman (1991), which makes extensive use of the premise that women and men are psychologically very different (Theater Mogul, 2006). Becker's play has been recommended by thousands of psychologists and counselors, and he was invited to perform at the 1999 convention of the American Association of Marriage and Family Therapists (Epinions, 2006).

The message that the sexes inhabit separate emotional worlds is also conveyed through television and film. For example, Seidman (1992) found pervasive sex-stereotyping on many dimensions, including affective expression, in music videos. Seidman's analysis of sixty hours of music videos shown on MTV revealed that women were portrayed as affectionate, dependent, or fearful more often than were men, whereas men were portrayed as adventuresome, aggressive, or domineering more often than were women. Similar findings have been reported for televised sports coverage (e.g., Billings, Angelini, & Eastman, 2005; Fink & Kensicki, 2002), prime-time shows (e.g., Aubrey & Harrison, 2004; Signorielli, 1989), and commercials (e.g., Stern & Mastro, 2004).

The portrayal of separate emotional realms for males and females can also be seen in children's entertainment. Thompson and Zerbinos (1995) studied 175 episodes of 41 different children's cartoons and found sex-stereotypic portrayals in numerous domains, including affective behaviors. For example, males displayed pride or anger more often than did females, whereas females displayed virtually all other emotions more often than did males, especially affection. Dundes (2001) observed that Disney's animated films have been widely criticized as promoting gender stereotypes, and went on to argue that the film Pocahontas, although often held up as a counterexample, in fact continued the trend by reinforcing stereotypes of girls "whose identity is determined first by romantic relationships and later by their role as selfless nurturer" (p. 354).

Sex Differences in Research

It is clear from the portrayal of his-and-hers emotional worlds in best-selling books, on stage, in movie theaters, and in television shows marketed to adults, adolescents, and young children that the idea of sex-segregated emotions is a very popular one. However, a growing body of psychological and sociological research seems to indicate that men and women are actually much more alike than different in their experience of emotion. Canary & Emmers-Sommer (1997) used an extensive review of the then-existing research literature to argue that traditional stereotypes about sex differences in emotion usually fail to predict people's behavior. They wrote that there seems to be more overlap than separation in the sexes' experience of emotion, and explicitly rejected John Gray's analogy of separate planets of origin. In a review of published literature reviews on the topic, Wester, Vogel, Pressly, and Heesacker (2002) came to a similar conclusion, stating that sex differences are "small, inconsistent, or limited to the influence of specific situational demands" (p. 639).

In a 2005 meta-analysis of studies dealing with all types of sex differences, Janet Shibley Hyde concluded that only a very few large differences exist; men's physical upper body strength is reliably greater than that of women, for example. On most studied dimensions, however, reliable sex differences were found to be small or non-existent. Hyde found the results striking enough to entitle her article "The Gender Similarities Hypothesis," and she reiterated a theme found in Canary & Emmers-Sommer (1997) and Wester et al. (2002): on a given dimension, variation within each sex often eclipses the average difference between them.

The sociology literature also contains the gender similarities theme. In the American Journal of Sociology, Simon and Nath reported that men and women in the U.S. are broadly similar in their self-report of their emotional experiences. Upon review of data from the emotions module of the 1996 General Social Survey (GSS), the investigators seemed rather surprised to conclude that "there is little correspondence between men's and women's feelings and expressive behavior and gender-linked cultural beliefs about emotion" (2004, p. 1166; italics in original). After examining the same data using a variety of theoretical and statistical models, Lively and Heise reported that "sex accounts for less than 1% of the variance on any of these emotionality dimensions" (2004, p. 1120).

In the cases where emotional differences between the sexes have been observed, there are often qualifying factors to be considered. For example, differences are sometimes found in the ways men and women express emotions, but this has been convincingly explained in terms of culture-bound display rules as opposed to differences in the experience of emotion (e.g., Brody, 2000; Hall, Carney, & Murphy, 2002; O'Kearney & Dadds, 2004; Simpson & Stroh, 2004). Sex-stereotypical patterns of emotional expression can be elicited in men by manipulating the social context to make them emotionally vulnerable, suggesting perhaps that these behaviors represent a defensive strategy of adhering to low-risk normative expectations rather than a genuine expression of their inner experiences (Vogel, Wester, Heesacker, & Madon, 2003).

Additionally, observed differences are most often small relative to the within-sex variation on the examined dimension (e.g., Hyde, 2005; LaFrance, Hecht, & Paluck, 2003). So while studies do frequently describe statistically significant average differences in women's and men's emotional behaviors, these studies can be seen in a broader context to reveal more convergence than divergence.
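
The variance-accounted-for framing used by Lively and Heise can be made concrete with a standard effect-size ratio. As a minimal sketch in conventional analysis-of-variance notation (the symbols here are generic, not taken from their models), their figure corresponds to

\eta^{2} = SS_{\text{between-sex}} / SS_{\text{total}} < .01,

that is, when the total variability on an emotionality dimension is partitioned, the portion attributable to the difference between men's and women's means is under one percent, with the remainder lying within the sexes.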

Factors in Perception of Sex Differences

To summarize, much of the available research indicates that emotional experience is fundamentally similar for men and women, with most differences being small, situational, or otherwise qualified. This is clearly at odds with the position taken by Gray, Tannen, and other authors, and portrayed in various entertainment media. Nevertheless, in light of the fact that self-reported data, such as the GSS, show that American men's and women's subjective experiences are so alike, it seems odd that Men are from Mars, Women are from Venus and other works from this perspective should be so widely embraced. How can this disjunction between perception and empirical findings be explained?

One potential contributing factor is that while peer-reviewed journal articles usually contain the context and qualifiers necessary to put results in the proper perspective, in the popular media this information may be jettisoned to tell a more easily understood, if somewhat misleading, story. Another potential contributing factor lies in the fact that the concept of sex differences is a pervasive one, embedded in western culture. Therefore, popular opinion may have remained at odds with scientific findings because those findings are seen as counterintuitive, just as the concept of a spheroid Earth met with much resistance because people's senses seemed to indicate otherwise.

Such a scenario can certainly be imagined. Gestalt theories of perception argue that relatively frequent instances of similarity will provide a less attended-to background against which relatively infrequent instances of difference stand out starkly. The large similarities may therefore simply be neglected, becoming the virtually invisible background against which the relatively small differences stand out, thereby receiving the larger share of conscious consideration.

Whatever the origin of the notion that there are large sex differences in emotion, once such a belief has formed it could presumably sustain itself by biasing the person's attention, provided observed expressive displays contain enough ambiguity. Ambiguous events could be interpreted within the framework of the belief, seemingly providing ongoing reinforcing evidence for the belief. So, it seems possible that even if the emotionally expressive behaviors of men and women aren't systematically different, observers might unknowingly apply a bias as they encode ambiguous expressive behaviors.

This reasoning constitutes the basis for the current research. A sex bias towards the interpretation of visual emotional cues according to traditional sex-role stereotypes is proposed. The goal of this project is to investigate this proposal experimentally. Whereas the research discussed so far deals with individuals' experiences or displays of emotion, the present project will focus on the receipt and encoding of emotional signals by an observer, and particularly on possible bias introduced at encoding.

Facial Expressions as Displays of Emotion

In a variety of social contexts, facial expressions are an important source of information regarding the emotional states of the participants (e.g., Ekman, 1993; Ekman & Friesen, 1987; Keltner, 1995). Nonverbal cues about emotional states play a vital role in effective communication in day-to-day interactions as well as in more constrained and goal-directed interactions, such as teaching, sales, and psychotherapy (Philippot, Feldman, & Coats, 2003). However, emotional communication is susceptible to distortion from various sources of bias.

Of particular interest for this dissertation is people's tendency to view certain emotions as inherently masculine or feminine, as this might lead people to interpret nonverbal emotional communications in a manner consistent with this sex-based categorization of emotions. U.S. culture has long included a widespread belief that women are more emotional than men in general; many emotions are regarded as especially feminine, while a few are seen as at least relatively masculine (Plant, Hyde, Keltner, & Devine, 2000). More recent data show that the categorizing of emotions by gender is still alive and well. For example, in a study of the relationships between gender, job status, and the interpretation of emotional signals, Algoe, Buswell, and DeLamater (2000) found that their participants rated anger and disgust as relatively masculine emotions, and fear as relatively feminine. The participants in the Algoe et al. study also rated anger and disgust as more instrumental and fear as more expressive, traits which are themselves strongly associated with masculinity and femininity respectively (Major, Carnevale, & Deaux, 1981).

Plant, Hyde, Keltner, and Devine (2000) asked 117 undergraduates to estimate the frequency with which men and women experience and express 19 emotions. Most of the 19 emotions were regarded as being experienced and expressed significantly more often by women, and only 2 as more typical of men. Plant et al. also tested participants' ratings of facial expressions of emotions, and found that pictures depicting blends of sadness and anger (upper and lower portions of the face mismatched) were rated in a way consistent with gender stereotypes.

Other recent research has also shown that people do at times display a sex-stereotypical bias in their interpretations of facial expressions. Plant, Kling, and Smith (2004) morphed together photographs of men and women posing facial expressions of anger and sadness. Male-typical and female-typical haircuts and clothing were added to the resulting blends to manipulate gender. Figures in the images perceived by participants to be male were rated as angrier than those perceived to be female, and figures perceived to be female were rated as sadder than those perceived to be male.

Hess, Adams, and Kleck (2004) used a similar methodology, using drawings of facial expressions differing only by hair and clothing for one study, and photographs of people rated as androgynous in their facial appearance, with different hairstyles and clothes added using a computer program, in another. Hess et al. found that using this method, the sex-stereotype effect was eliminated in some instances and even reversed in others. The authors offered the rationale that certain aspects of facial appearance, such as thickness of eyebrows and width of jaw, convey dominance or affiliation cues. They posited that these aspects of appearance, rather than gender per se, yield the cues that trigger the stereotyped interpretations. However, it also seems possible that Hess et al. were over-dichotomizing gender, and that there is a broader range of possible gender associations than they seemed to expect. Their drawings of lantern-jawed women and photos depicting slender-faced men with barely visible eyebrows may in fact have triggered mixed or ambivalent gender associations, rather than associations to only-male or only-female construct categories.

Hypotheses

The purpose of our study was to test two hypotheses regarding conditions that might foster a bias in interpreting the emotional state of another using visual information. Computer graphics modeling software was used to generate stimuli that facilitated an examination of conditions under which the bias towards sex-stereotyped interpretations of facial expressions was likely to manifest. The first hypothesis was that bias would be evident when the target's expression was ambiguous, but would decrease or disappear when the target's expression was unambiguous. The second hypothesis was that bias would be less evident when the sex of the target was ambiguous, but more evident when the sex of the target was unambiguous.

As a basis for the first hypothesis, the expression of a single basic emotion is presumed to demand a particular interpretation, thus leaving little room for participants to project any sex-biased expectations they may hold onto the target as they form ideas about what subjective emotional experience would give rise to such an expression. This assumption is consistent with Ekman (1993), who asserted that certain expressions are universally identifiable as representing the corresponding discrete emotions. In contrast, blended expressions were presumed to be more difficult to categorize, triggering participants to use information such as context (when available) or associations to help resolve the ambiguity. Associations with the target's sex were expected to decisively inform a guess about the subjective experience behind the target's expression only when more directly applicable data are unavailable or problematic. This idea is consistent with the findings of Plant et al. (2000) and others, which indicated that participants do sometimes interpret facial expressions in a biased way that is often consistent with gender stereotypes. In the current experiment, expressions closely conforming to those described by Ekman (1993) as representative of basic emotions were defined as unambiguous expressions, and those that are blends of two or more basic emotions as ambiguous expressions.

Whereas the first hypothesis dealt with ambiguity in visible affect, the second dealt with ambiguity of apparent sex. Modeling software allows the manipulation of the secondary sex characteristics and gender cues of realistic human figures without changing other aspects of the figures' appearance. This opened the possibility of varying the apparent maleness and femaleness of the target faces, including the creation of blended targets of ambiguous apparent sex, while holding facial and postural affect cues constant.

This manipulation allowed a test of the second hypothesis: that as the ambiguity of the sex of the stimulus face increased, the sex bias in participants' ratings would decrease. When people are unable to categorize a face as male or female, they should also be unable to apply a sex-biased interpretation. Furthermore, when the stimulus face is categorized in spite of some features that are ambiguous or more representative of the other sex, sex-based bias could be attenuated. For example: a participant is shown two faces that he or she interprets as male. If one male face displays male characteristics very strongly, it might then strongly activate assumptions about men's emotions. However, if the other male face has features that the participant interprets as feminine, it might then activate the participant's beliefs about both men's and, although probably to a lesser extent, women's emotions.

It is worth noting that associations with categories other than apparent sex, such as race, age, or attractiveness, could also be activated under these conditions. A thorough exploration of all these possible sources of bias was beyond the scope of our current study, and we must be content for the present with simply having controlled these other triggers to the extent it was feasible.
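
The stimulus-generation logic described in this chapter can be pictured in morph-target (blend-shape) terms, in which a figure is a base shape plus weighted offsets for apparent sex and for each expression. The sketch below is purely illustrative: the vertex-array representation, the function, and the parameter names are assumptions introduced for exposition, not the actual software or procedure used to build the stimuli.

import numpy as np

def blend_figure(base, female_offset, male_offset, expression_offsets,
                 sex=0.5, expression_weights=None):
    """Illustrative morph-target blend; every array is a hypothetical (n_vertices, 3) mesh.

    sex: 0.0 = fully apparent female, 1.0 = fully apparent male, 0.5 = androgynous.
    expression_weights: e.g., {"anger": 1.0} for an unambiguous expression,
    or {"anger": 0.5, "fear": 0.5} for an ambiguous blend.
    """
    mesh = base + sex * male_offset + (1.0 - sex) * female_offset
    for name, weight in (expression_weights or {}).items():
        # Expression offsets are applied independently of the sex parameter,
        # so affect cues stay constant while apparent sex varies.
        mesh = mesh + weight * expression_offsets[name]
    return mesh

# Hypothetical usage: an androgynous figure posed in a 50/50 anger-fear blend.
# ambiguous_stimulus = blend_figure(base, female_d, male_d,
#                                   {"anger": anger_d, "fear": fear_d},
#                                   sex=0.5, expression_weights={"anger": 0.5, "fear": 0.5})

The design point that matters for the hypotheses is the independence of the two sets of weights: the same expression offsets can be applied to an apparently male, apparently female, or androgynous base figure.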

CHAPTER 2
LITERATURE REVIEW

Overview

In this chapter, literature on the interpretation of facial expression of emotion dealing specifically with the effects of the sex of the stimulus person, or encoder, is reviewed. This review supports an investigation of sex of encoder effects on the interpretation of facial and postural emotional expressions using novel techniques of stimulus development. It is organized chronologically.

Method

This literature review is based on articles retrieved using the PsycINFO online database. The initial search was done using the key terms "facial expression," "sex," and "emotion" in PsycINFO. Abstracts from the items in the results list were examined for any mention of analysis by sex of encoder, which was expressed using a variety of terms, including but not limited to: "sex" and "gender" interchangeably to indicate apparent sex of the real or simulated people in the images; and "encoder," "sender," "stimulus person," "model," and "poser" interchangeably to indicate the real or simulated people encoding facial expressions. Items found to include a mention of analysis by sex of encoder were selected; then sources listed as references on these items and, when available, records retrieved by PsycINFO using the "Times cited in this database" tool were screened for inclusion using the same criteria. Items for the period October 1974 to February 2007 were included in this review.

It should be noted that because abstracts are probably more likely to list hypothesis-confirming results than other results, it is possible that in using this method, studies were missed that analyzed for sex of encoder effects but found none. This review should therefore be viewed as an examination of sex of encoder effects when they are observed, and not as an examination of the absolute prevalence of such effects.

Chronological Review of Literature

Sex of encoder effects have been interpreted in various ways. In some older studies it was speculated that the observed effect might result from women being superior encoders relative to men. However, some studies produced results not easily explained in this way, for example that certain emotions were more easily recognized when portrayed by male encoders. In some studies, the sex of the encoder was manipulated without changing the expression of emotion, often by pasting different hair and clothes onto a photograph. In these studies also, the explanation that women were better at encoding emotion facially was inadequate to explain the differences. It is studies such as these that give rise to the hypothesis that a sex bias drives sex of encoder differences observed in studies dealing with the interpretation of visual displays of affect.

Some recent studies advance the theory that at least some of the observed differences are attributable to the effects of certain facial features that people associate with the personality characteristics of dominance and affiliation. These are usually confounded with sex of encoder, such that facial features associated with dominance are most often found on males and facial features associated with affiliation are most often found on females.

Buck, Miller, and Caul (1974) were among the first to study the impact of the sex of the encoder (i.e., the person displaying an emotion) on the interpretation of visual affect. In this study, one group of participants functioned as encoders, another as observers. The encoders viewed slides meant to evoke emotional responses, while the observers watched the encoders' faces through a closed-circuit television system. The encoders were unaware that they were being observed. Based on the encoders' facial expressions, the observers were asked to guess what category of slide the encoders were seeing: sexual, scenic, pleasant people, unpleasant, or unusual. Observers of both sexes were more accurate in categorizing the slides when responding to female encoders. The authors concluded that "females have greater facial responsiveness than males" (p. 593). The possibility that the results were impacted by observers' expectations regarding females' expressivity as compared with that of males, i.e., the influence of a sex bias, is not discussed.

Zuckerman, Lipets, Koivumaki, and Rosenthal (1975) had 40 students (termed "encoders") pose in expressions of anger, happiness, sadness, fear, disgust, surprise, bewilderment, suffering, and determination. Photos were made of these expressions, and later a group of 102 students, including 30 of the original encoders, viewed the slides and chose from a list of emotions the one they thought best fit each slide. It was found that people were more accurate in decoding emotions from opposite-sex faces than from same-sex faces, and that female faces were more accurately decoded on the whole than male faces. Results examining any interaction between sex of encoder and the emotion being posed were not reported.

Perhaps the first study to manipulate the apparent sex of a target person in order to detect sex bias in the interpretation of emotional expressions was Condry and Condry's (1976) seminal investigation of people's interpretations of an infant's display of emotions. In this case, apparent sex was manipulated by simply telling half the participants that the infant in the film they were watching was a boy, and telling the other half of the participants that it was a girl. In the film, the 9-month-old infant is exposed repeatedly to four emotionally evocative stimuli: a teddy bear, a doll, a buzzer, and a jack-in-the-box. Participants were instructed to rate the infant's expressions by type (pleasure, anger, and fear) and intensity.

The investigators found that participants were likely to interpret the infant's emotional expression in sex-biased ways (for example, the boy as more angry and the girl as more fearful), but only in some instances. They wrote that "it appears to us that the more ambiguous the situation, the more of a difference subjects report between the sexes" (p. 816). The infant's response to the buzzer, for example, was rated as relatively pure fear whether the infant was labeled a boy or a girl. On the other hand, the infant's response to the jack-in-the-box was interpreted as more angry if the infant was labeled male, and more fearful if the infant was labeled female.

Eiland and Richardson (1976) created a large set of photographs depicting various expressions of emotion using male and female encoders from two age groups (adults and children) and two race groups (black and white). Their participants were demographically similar to their encoders: they were male and female, black and white, 2nd graders and college students. The participants sorted the pictures into boxes, each labeled with an emotion. The investigators found that the sex, race, and age of the participant did not affect interpretation of the emotions depicted in the photographs. However, the sex, race, and age of the encoders each impacted the interpretation. The investigators did not designate particular responses as right or wrong, so there were no accuracy data. In fact, they did not characterize the differences they found at all beyond simply observing that people "do not interpret messages sent by black faces (whether young, old, male, or female) the same as messages sent by white faces. Similarly, we do not interpret messages sent by male faces (whether white, black, young, or old) the same as messages sent by female faces" (pp. 174-175).

In a study published in 1983, Felleman, Barden, Carlson, Rosenberg, and Masters examined children's and adults' recognition of the emotional expressions of children. The researchers took photographs of children displaying happiness, sadness, anger, and neutrality. Posed expressions as well as expressions spontaneously generated in reaction to emotion-eliciting stimuli were used. Children more quickly identified the emotional content of the expressions of same-sex children. However, the sexes of the children in the photographs had more of an impact on adults' interpretations than on children's interpretations. The authors speculated that this might be caused by adults' more developed stereotyped beliefs, for example that boys are more angry or aggressive.

In another study published in 1983, Knudsen and Muzekari reported more evidence that the sex of the encoder can affect interpretations of facial affect. The investigators used four male and four female encoders to pose expressions of fear, anger, sadness, and happiness. These photographs were shown to 98 undergraduate students, along with, in some cases, verbal statements manipulating the context in which the expressions were supposed to have occurred. Participants rated the emotions they perceived to be present by choosing from a list of six emotions (fear, anger, sadness, happiness, surprise, and disgust) and/or by writing in a response. Female encoders were rated as sadder than males in conditions where verbal context was provided. Male encoders were interpreted as being more fearful than female encoders in conditions where verbal context was not provided. The authors refrained from trying to provide a rationale for these differences, and simply noted that the sex of the encoder appears to affect interpretation. The finding regarding interpretation of female encoders' expressions as sadder is consistent with stereotypes about the masculinity and femininity of particular emotions, but the finding that males' expressions were seen as more fearful is not. As noted by Condry and Condry (1976), however, stereotyped interpretations are more likely to be observed in conditions of ambiguity, and the visual stimuli used in this investigation were unambiguous expressions of a single basic emotion. This limitation crops up in much of the literature on the subject.

In a study designed to examine brain lateralization in processing faces expressing emotion, Thompson (1983) did not find differences based on whether faces were presented in the left or right visual field, but he did find differences based on whether the encoder was male or female. Participants were shown pictures of faces with happy, sad, or neutral expressions for either 30 ms or 200 ms, then shown another picture and asked whether the two pictures matched. Participants were more accurate in judgments involving the male face. Thompson cautioned against drawing firm conclusions about this, however, because only one encoder of each sex was used. Small numbers of encoders, and the attendant possibility of artifacts relating to features of specific encoders, is another problem frequently encountered in the literature.

Noting that the literature on decoding facial expressions to date was concerned mostly with static images, and often with posed expressions, Wagner, MacDonald, and Manstead (1986) investigated whether dynamic, spontaneous facial expressions could be correctly interpreted. The researchers filmed one set of participants' faces as the participants viewed emotionally loaded slides, and asked these participants to identify what emotions they were experiencing at different points in the film. They then showed the films to another set of participants, and asked them to identify which emotions were being expressed. They found that participants were more accurate in interpreting the expressions of females, and concluded that females are better encoders than males, particularly of neutral and surprised expressions. Males and females performed similarly as interpreters or receivers. Here again, the possibility of stereotyped expectations was not examined.

Rotter and Rotter (1988) studied the encoding and decoding of facial expressions using methods similar to prior studies, but introduced hypotheses making different predictions for different emotions. Specifically, Rotter and Rotter predicted that females would be better encoders and decoders of disgust, fear, and sadness, whereas males would be better encoders and decoders of anger. This prediction was based on the idea that people would best pose and detect the emotions they were most likely to express, and some prior research had suggested that women suppressed aggression but were more expressive than men regarding other emotions, whereas men tended to suppress most emotions but were more expressive of anger than women. The researchers photographed students, staff, and faculty members in posed expressions of anger, disgust, fear, and sadness. They recruited 10 judges to select photographs that were perceived to express the target emotion particularly well, ending up with 30 pictures of each pose, with 39 different female encoders and 15 different male encoders represented. Participants were asked to categorize each photograph as representing anger, disgust, fear, or sadness. Women performed more accurately in the categorization on the whole, and photographs of females were more accurately categorized for all emotions except anger. Male decoders were better identifiers of male-encoded anger than female decoders. These results were interpreted as supporting the concept of differentiated sex roles, caused by "socialization which encourages females to be more expressive than males, and socialization of males to be both more aggressive and more attuned to aggressiveness from other males" (pp. 146-147).

Walbott (1988) tested whether facial expressions carry sufficient information to categorize emotions without context, by using clips from movies. Short clips in which professional actors displayed joy, sadness, fear, or anger (according to judges familiar with the films) were shown to participants, who rated the expressions for nine component emotions (happy, sad, surprised, fearful, angry, thoughtful, in despair, full of contempt, and full of guilt) on five-point scales. Participants identified joy with high accuracy for encoders of both sexes. They identified fear and sadness more accurately for female encoders, and anger more accurately for male encoders. The investigators interpreted these results as art imitating life, citing prior research describing socialized display rules (Ekman, 1972) requiring men to suppress feelings of sadness and fear, and requiring women to suppress anger.

Erwin, Gur, Gur, Skolnic, et al. (1992) developed a set of facial emotion stimuli for an instrument for use with various clinical populations, and tested it initially on a non-clinical sample. In the first of the two experiments, sex of encoder effects were not examined. In the second experiment, pictures of male and female encoders posing expressions of happiness, expressions of sadness, or neutral expressions were shown to participants, who were asked to rate the perceived emotion on a seven-point scale from very happy to very sad. Interactions between participant sex, encoder sex, and posed emotion were observed. Female participants were more accurate with male encoders generally, and were especially more accurate in identifying happiness for male encoders than for female encoders. Male participants identified happiness similarly for male and female encoders but were less accurate in identifying sadness for female encoders.

These findings are difficult to rationalize in terms of sex-biased interpretations, which would presumably lead people of both sexes to identify these stereotypically feminine emotions more readily in females. However, as with many studies that examine sex of encoder effects, the stimuli are limited in that they do not include ambiguous expressions, which is where bias effects would be most likely to manifest. Nor are expressions of stereotypically masculine emotions included, ratings of which could be directly compared to ratings of stereotypically feminine emotions for each stimulus face.

Keltner (1995) observed that the prior research literature on facial expressions was largely focused on just 7-10 emotions, fewer than the total number identified by lay people and emotion theorists. He set out to determine whether a distinct display of something like embarrassment, guilt, or shame could be identified. He theorized that this type of emotion should have a distinct display because it served a useful social function of appeasement when norms had been violated. He further theorized that this type of emotion should be more easily recognized when displayed by individuals from low social status groups.

In a series of five experiments, Keltner set out to describe an expression of embarrassment and then test whether it could be distinguished from other expressions of emotion. In the first, he elicited embarrassment by having participants perform a task that had been identified as embarrassing in previous research. Participants' nonverbal behavior was observed, and they were asked to report on their experiences. From this information, components of a tentative expression of embarrassment were identified and differentiated from amusement, an expression that shared several components with the expression of embarrassment being described. In the next four experiments, participants viewed short films of people making expressions intended to convey embarrassment and other emotions, and tried to identify the emotions being displayed using a variety of response formats. Expressions of embarrassment were correctly identified and distinguished from other emotions. Embarrassment displays from women and African-American targets were more easily identified and judged to be more intense than embarrassment displays from male and Caucasian targets.

Keltner's (1995) study is of particular significance in that an eye-of-the-beholder effect similar to that observed by Condry and Condry (1976) was considered in the interpretation of the results. Keltner posited that observers' perceptions of targets' social status influenced the observers' judgments about the expressed emotion. In the majority of the previous literature, Condry and Condry being the notable exception, females were regarded as being better encoders of emotion, but the possibility of observers applying different standards when interpreting expressions on female faces as opposed to expressions on male faces was not considered.

Baron-Cohen, Wheelwright, and Jolliffe (1997) showed participants pictures of various expressions of basic and complex emotions, using whole-face images, eyes-only images, and mouth-only images. Pictures made with a female encoder and pictures made with a male encoder were used in separate studies. The same pattern of results was found with female and male encoders, namely that whole-face pictures yielded the most accurate judgments for basic emotions, that accuracy was as good for the eyes-only pictures as for the whole-face pictures for complex emotions, and that whole-face and eyes-only pictures yielded better accuracy than mouth-only pictures. No direct comparison was made between ratings for the male and female faces.

The use of computer morphing programs to manipulate facial expression stimuli was introduced in a study by Hess, Blairy, and Kleck (1997). Morphing is a process in which one image is gradually deformed until it matches another; intermediate images can thereby be created that combine aspects of the two endpoint images. The investigators chose neutral and emotional expressions from a pre-existing set of stimulus faces and used a morphing program to create varying levels of intensity for each expression. The target emotions were anger, disgust, sadness, and happiness. Photographs of two Caucasian male encoders and two Caucasian female encoders were used.

Participants viewed each stimulus picture on a computer screen and rated the perceived intensity of anger, contempt, disgust, fear, happiness, sadness, and surprise along a continuous scale. For each rated emotion, participants used a computer mouse to click a point along a line, anchored at one end with the phrase "not at all" and at the other end with the phrase "very intensely." Accuracy in identifying the portrayed emotion and rated intensity of that emotion varied linearly with portrayed intensity for most images, which was interpreted as a validation of the manipulation technique.

The investigators observed main effects of sex of encoder, qualified by some interactions. For happy and sad expressions, low-intensity expressions were more accurately identified for images of male encoders than for images of female encoders. At higher intensities, ratings of these emotions were similar for male and female encoders. Also, male raters were more accurate in evaluating male encoders' expressions of disgust than female encoders' expressions of disgust. After reanalyzing the data by including the perceived intensity of the expression as a covariate, and thereby controlling for actual differences in intensity between the expressions made by the four encoders, sex of encoder effects remained. Female encoders' expressions of joy were more accurately rated, and male encoders' expressions of sadness were more accurately rated. The authors interpreted this finding as evidence for a decoding bias, suggesting that observers "decode women's and men's low to mid-intensity emotional facial expressions differently" (p. 255). They refrained from speculation regarding the source or exact nature of this bias.
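
To make the intensity manipulation in Hess, Blairy, and Kleck concrete, morphing a neutral photograph toward a full-intensity expression can be approximated, in the simplest case, by a pixel-wise cross-dissolve between pre-aligned images. The sketch below is a simplified stand-in under that assumption; the helper and file names are hypothetical, and production morphing software additionally warps facial geometry between corresponding landmarks, which this example omits.

import numpy as np
from PIL import Image

def expression_intensity(neutral_path, expression_path, alpha):
    """Blend a neutral face toward a full-intensity expression.

    alpha = 0.0 reproduces the neutral image, 1.0 the full expression;
    intermediate values approximate graded intensities. Assumes the two
    photographs are already aligned and identically sized.
    """
    neutral = np.asarray(Image.open(neutral_path).convert("RGB"), dtype=np.float32)
    emotion = np.asarray(Image.open(expression_path).convert("RGB"), dtype=np.float32)
    blended = (1.0 - alpha) * neutral + alpha * emotion  # pixel-wise cross-dissolve
    return Image.fromarray(np.clip(blended, 0, 255).astype(np.uint8))

# Hypothetical usage: a graded anger series for one encoder's photographs.
# for a in (0.2, 0.5, 0.8):
#     expression_intensity("encoder1_neutral.png", "encoder1_anger.png", a).save(f"anger_{int(a * 100)}.png")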

Algoe, Buswell, and DeLamater (2000) showed participants slides of male and female encoders, or "focal people" as they put it, posing one of three expressions: anger, disgust, or fear. The investigators put forth two competing hypotheses. Theorizing from the universality perspective, they first hypothesized that the expressions being posed by the focal people should be correctly identified regardless of any contextual cues. Theorizing from the context-specific perspective, their second hypothesis was that participants would adjust their interpretation of the focal person's expression based on contextual information such as the focal person's gender and job status. The researchers found that the gender of the focal person did influence participants' interpretations in some circumstances. Males posed in expressions of anger were seen as more angry and less fearful than women posed in expressions of anger. Across posed expressions, men were seen on average as expressing more contempt than women, and women were seen on average as expressing more fear than men. These results are consistent with sex-stereotyped interpretation of the expressions.

Dimitrovsky, Spector, and Levy-Shiff (2000) studied the ability of learning-disabled (LD) and non-disabled children to recognize facial expressions of emotion that varied in their ease of identification. Photographs from a preexisting stimulus set were chosen for relatively high and relatively low inter-rater agreement. Portrayals of happiness, sadness, anger, surprise, fear, disgust, and neutral expressions were used, with four male and four female encoders. Participants from both the LD and non-LD groups more accurately identified emotions from the female faces. This effect increased with difficulty of identification. That is, for emotions with lower inter-rater agreement, there was a larger difference between the accuracy rating for female faces and the accuracy rating for male faces than for emotions with higher inter-rater agreement. This was interpreted by the authors as evidence of women's superior facial emotional expressivity as compared with men. The authors concluded that "the present results can be viewed within the wider context of women's greater emotionality" (p. 414). The authors did not appear to entertain the possibility of sex bias in their interpretation.

Hess, Blairy, and Kleck (2000) conducted a study to investigate the impact of facial expressions of emotion, sex of encoder, and ethnicity of encoder on participants' perceptions of the encoders' levels of dominance and affiliation. Images of male and female Caucasian and Japanese people displaying high and low intensities of happiness, anger, disgust, sadness, and fear were presented to participants. A main effect of sex of encoder was observed. However, sex of encoder interacted in complex ways with the other independent variables, and the magnitude of the effect of facial expression of emotion dwarfed the effects of the other variables. All this led the authors to conclude that observers interpret the information regarding behavioral intentions provided by affect displays in similar ways regardless of the ethnic group membership or the sex of the expressor, but that sex of the expressor has subtle effects on the observers' interpretations (p. 281).

Plant, Hyde, Keltner, and Devine (2000) addressed the connection between gender stereotypes and facial expressions of emotion in a series of three studies. The first study established which emotions are currently considered to be stereotypically masculine or feminine in the US. In the other two studies, participants' interpretations of emotional expressions were solicited and discussed in light of the stereotype information garnered in the first study.

In the first study, participants responded to two questionnaires. The first questionnaire required participants to indicate the frequency with which men and women experience and express 19 emotions according to US cultural stereotypes as the participants perceived them. The second required them to rate the frequencies according to their personal beliefs, regardless of what they perceived the cultural stereotypes to be. Eleven of the 19 emotions were rated as being experienced and expressed more by women than men on both questionnaires. Two emotions, anger and pride, were rated as being experienced and expressed more by men than by women on both questionnaires.

In the second study, the investigators created photographs of two men and two women posing facial expressions of anger, sadness, and blends of the two, using Ekman and Friesen's Facial Action Coding System (FACS; Ekman & Friesen, 1976) to pose the expressions. The blended expressions were created by posing the upper half of the face in one expression and the lower half in the other expression. Participants viewed the photographs and rated the degree to which they perceived each to express four emotions. The investigators found that participants rated the blended expressions in a stereotype-consistent manner. That is, they rated men's blended expressions as angrier than women's, and they rated women's blended expressions as sadder than men's.

In the third study, participants interpreted an infant's display of emotion in a methodology similar to that of Condry and Condry (1976). Participants, who were tested regarding their own endorsement of sex stereotypes, viewed a videotape of an infant and rated the infant's behavior on several emotions. Half were told that the child was a boy; half were told that the child was a girl. The participants' beliefs about the sex of the baby did not influence their interpretations of emotion except in the case of high-stereotyped men rating anger. In this case, the men rated ostensibly male infants as angrier than ostensibly female infants.

The majority of the literature available in English dealing with the effects of encoder sex on interpretations of facial expressions of emotion describes experiments done in English-speaking countries. A study performed by Thayer and Johnsen (2000) in Norway provides an exception. In this case participants rated their own experience of happiness, sadness, anger, fear, disgust, surprise, interest, pleasantness, activation, calmness, arousal, and liking for the stimulus in response to viewing slides showing facial expressions of emotion. The slides depicted one male encoder and one female encoder displaying expressions of neutrality, disgust, fear, happiness, surprise, sadness, and anger. Responses were considered to be correct classifications when the participant reported an elicited emotion that matched the emotion displayed. Female participants' responses included more correct classifications and fewer misclassifications than male participants' responses, and did not vary as a function of encoder sex. Male participants performed at chance levels in differentiating female encoders' expressions of anger and fear. In the discussion, the authors framed this difference in terms of females' presumed superiority in decoding emotion and greater sensitivity in experiencing elicited emotions. However, it seems possible that emotions elicited through viewing the emotional display of another might not always be congruent with that emotional display, and that this might have played a role in males' reactions to seeing the female encoder displaying negative emotions. This possibility was not discussed in the article.

In an effort to develop stimuli for future use in evaluating populations of neurologically impaired people, Pell (2002) created facial and vocal stimuli depicting six target emotions: neutrality, happiness, pleasant surprise, disgust, anger, and sadness. The facial stimuli consisted of pictures of 4 male and 4 female encoders posing facial expressions of each target emotion. The investigator tested the stimuli with non-impaired participants in order to establish baseline parameters for the stimulus set. In doing so, he found that the sex of the encoder influenced the interpretation of the displayed emotion in some cases. Specifically, participants correctly identified expressions of neutrality on male faces more accurately than on female faces, and correctly identified disgust on female faces more accurately than on male faces. In the discussion of this finding, Pell did not provide a rationale as to why this specific pattern may have been manifested. Rather, he wrote that the observed effect of encoder gender might reflect systematic properties of how these emotions are decoded and labeled, or it might reflect an artifact of specific properties of some or all of the eight encoders used in this particular case (p. 504). The apparent tendency not to label female faces as neutral seems consistent with stereotypes regarding women's emotionality, but the author did not engage in this level of speculation.

Widen and Russell (2002) examined the effect of the apparent sex of the encoder on preschoolers' interpretations of facial expressions of emotion. Participants, who were 4 or 5 years of age, were shown pictures of what appeared to be a male and a female child of around 12 or 13 years of age displaying facial expressions of happiness, sadness, anger, fear, and disgust. In reality, these pictures were created from photographs of a 13-year-old girl and a 12-year-old boy in posed expressions. Pictures of the boy and the girl displaying the same expression were morphed together using computer software, and hairstyles typical of boys and girls were electronically placed onto the resulting blended-sex faces to create a set of apparent males and apparent females. Pairs displaying each emotion had exactly the same face; only the hair differed.

The apparent sex of the encoder impacted participants' ratings of emotions. Male participants labeled the male figure as disgusted more often than they labeled the female figure as disgusted, and female participants labeled the female figure as fearful marginally more often than they labeled the male figure as fearful. The authors discussed the results in terms of the presumed influence of gender stereotypes of emotion. They noted, however, that participants' ratings of anger were the same for apparently male and apparently female encoders, whereas the theory of gender stereotyping of emotions would lead one to predict that ratings of anger in particular should yield stereotypical interpretations.

Mignault and Chaudhuri (2003) used high-resolution 3-D models of stimulus faces in an examination of the impact of head tilt on participants' interpretations of perceived dominance and emotional content. Apparently male and apparently female stimulus faces displaying neutral expressions were presented on a computer screen at different angles. In addition to rating the perceived dominance, participants were asked to give a one-word answer to the question "what is the main emotion expressed in this picture?" (p. 117). Responses were categorized as anger, fear, happiness, sadness, neutral, and other. Participants rated apparently male faces as angry more often than they rated apparently female faces as angry. Apparently female faces were more often rated as happy compared with apparently male faces. Apparent sex had no detectable effect on ratings of fear, sadness, or neutrality. The authors interpreted these results as being consistent both with theories of social stereotyping based on women's unequal status and with an evolutionary explanation based on greater innate aggressiveness in males (p. 128).

Hess, Adams, and Kleck (2004) tested the theory that facial features conveying dominance and affiliativeness actually drive effects identified elsewhere in the literature as evidence of sex bias. Because the features that they assert are cues for dominance or affiliativeness (eyebrow thickness, height of forehead, jaw form, and facial rounding) are confounded with sex, they reasoned that effects of these features may easily be misinterpreted as effects of encoder sex. They employed two studies with different types of stimuli to test this theory.

In their first study, black-and-white drawings of the center of faces (as opposed to the outer edge of faces) displaying anger, sadness, happiness, disgust, and a neutral expression were created. (Interiors of faces convey relatively little information about the sex of the person, but a lot of emotional information, whereas the outer edge of faces conveys relatively little emotional information, but a lot of sex-cue information.) Various levels of intensity were generated by morphing emotional expressions with the neutral expression, and apparent encoder sex was manipulated by adding masculine and feminine hairstyles to the drawings. Participants rated the perceived intensity of anger, contempt, disgust, fear, happiness, sadness, and surprise for each stimulus face. The investigators hypothesized that because apparently male and apparently female faces were exactly the same except for hairstyle, and therefore shared all identified dominance and affiliativeness cues, the often-observed effect of encoder sex should not be observed.

Results of the first study were mixed. Ratings of disgust were, as predicted, equivalent for apparently male and apparently female faces. Ratings of sadness were higher for apparent females than for apparent males, consistent with a theory of sex bias and inconsistent with the hypothesis. Interestingly, the typical effect of encoder sex was reversed for expressions of anger and happiness: apparent females were perceived as angrier than apparent males, and apparent males were perceived as happier than apparent females. The investigators tentatively concluded that facial features rather than perceived sex of the encoder may be responsible for effects commonly attributed to sex bias in interpreting facial expressions of emotion, but cautioned that it is possible that the drawings used introduced an artifact.

In the second study, a largely similar methodology was employed using photos of androgynous faces for the interior parts of stimulus faces, and once again using different hairstyles to manipulate apparent sex. Intensity of expressions was not manipulated in the second study. Similar results to those from the first study were obtained for ratings of anger and happiness, again reversing the pattern predicted by the sex bias hypothesis.

PAGE 37

Hess et al. were faced with trying to explain the fact that they seemed to have observed a sex bias in the opposite of the usual direction for expressions of anger and happiness. To do so, they ended up invoking a version of the sex bias theory, by speculating that participants carried expectations that women should appear less angry and men less happy, and when those expectations were violated the female faces' anger and the male faces' happiness stood out all the more starkly.

Palermo and Coltheart (2004) observed that much of the prior research on facial expressions relied on a few databases of stimulus faces. In an effort to expand the available pool of facial expressions of emotion stimuli, they gathered photographs of 50 individuals displaying expressions of happiness, sadness, anger, fear, disgust, surprise, and neutrality. To test the utility of these photographs, the researchers asked a group of 24 participants to view the images and select which of the seven target expressions they perceived each image to portray. They found a main effect of encoder sex in that expressions posed by females were more often accurately identified than expressions posed by males. Anger and sadness especially were correctly recognized more often when posed by female encoders as opposed to male encoders.

The investigators observed that other studies have yielded similar findings, i.e. that expressions posed by female encoders are often recognized at higher rates than corresponding expressions posed by male encoders. The authors did not speculate as to why that might be. In this case, the finding that anger was more often identified when displayed by female encoders does not seem to be consistent with sex bias theory. The finding that sadness was more readily recognized on female faces than male faces, however, does seem consistent with sex bias theory.

Plant, Kling, and Smith (2004) used stimuli similar to those used by Hess et al. (2004) to investigate the effect of encoder sex on the interpretation of facial expressions, but produced
different results. Plant et al. created stimulus faces by morphing together photos of males and females posing expressions, then adding gender-typical hairstyles to manipulate apparent sex. The expressions were ambiguous, being constructed either from an anger expression in the upper half of the face and a sadness expression in the lower half of the face, or vice versa. Participants were asked to rate the perceived intensity of two stereotypically feminine emotions, sadness and sympathy, and two stereotypically masculine emotions, anger and contempt.

Apparently female encoders' expressions were rated as sadder than those of apparently male encoders, and apparently male encoders' expressions were rated as angrier than those of apparently female encoders. Apparently female encoders' expressions were also rated as more sympathetic than those of apparent males. As in Hess et al. (2004), faces that were exactly the same except in hairstyle and clothing were interpreted in different ways. However, whereas Hess et al. observed a partial reversal of stereotype-consistent interpretations using this approach, the findings of Plant et al. were consistent with a sex bias in the interpretation of facial expressions.

Rahman, Wilson, and Abrahams (2004) measured accuracy and reaction time as participants categorized happy, sad, and neutral facial expressions. The stimuli were pictures of four male and four female encoders posing the expressions, presented on a computer. Sex of encoder interacted with sex of participant in that female participants were more accurate in categorizing male faces, whereas sex of encoder did not impact accuracy for male participants. Sex of encoder interacted with facial expression in that sadness was more accurately identified on male faces than on female faces, and responses were faster to happy and sad male faces than to happy and sad female faces. The authors conclude that males' facial expressions may be easier to read.

The finding that sadness was more accurately identified on male faces than on female faces runs counter to what theories of sex bias in interpreting facial expression would appear to predict. Recognition of stereotypically feminine emotions such as sadness on male faces should be hampered by the bias. However, given that there were no stereotypically male emotions as response options, the methodology used does not lend itself well to examination of sex bias questions.

In a study designed to examine the impact of encoder sex on emotion classification, as well as the impact of displayed emotion on judgments of encoder sex, Atkinson, Tipples, Burt, and Young (2005) found evidence that variations in sex of encoder significantly influenced decisions about what emotion was being portrayed. First, the researchers showed participants pictures depicting facial affect in blocks with all male encoders, all female encoders, or mixed, and asked participants to make a rapid judgment as to whether fear or happiness was being portrayed. In the mixed sex-of-encoder blocks, performance was significantly slower than in blocks with all male or all female encoders. Next, participants completed a similar task requiring them to judge quickly the sex of the person in the picture while expression of emotion was held constant or varied. The speed with which the participants made judgments about sex was not significantly different in the varying conditions.

The results of the Atkinson et al. study do not directly indicate evidence for or against a sex bias in interpreting emotional expressions. The authors of this study did not report on the reaction times for male encoders versus female encoders, but only for blocks of homogeneous encoder sex versus blocks of heterogeneous encoder sexes. However, the results of this study do help establish the stage of processing at which such a bias would take place, as they interpret their results as supporting a model in which information about the sex of a face is processed
faster than information about affect. Therefore, any interpretation of affect is conducted within a context where information about sex has already been processed.

A series of three studies by Hess, Adams, and Kleck (2005) continued their investigation of perceived dominance and affiliation as mediators of the sex-stereotypical processing of facial affect that is frequently observed. In the first, photographs of male and female faces displaying neutral affect were shown to three groups of participants. One group rated how likely they thought the people in the pictures were to show anger, fear, contempt, sadness, disgust, happiness, and surprise. Another group rated each picture for how dominant the people appeared to be, and a third group rated each picture for how affiliative each person appeared to be.

A mediational analysis showed that the sex of the encoder contributed strongly to his or her perceived dominance and affiliation, and to predictions about what emotions the encoders were likely to show. Additionally, dominance and affiliation contributed to predictions regarding shown emotion after controlling for sex of the encoder. Males were judged to be more likely to show the stereotypically masculine emotions of anger, contempt, and disgust, and less likely to show the stereotypically feminine emotions of fear, sadness, happiness, and surprise. This pattern was reversed for females. After factoring out the effect of sex of the encoder, perceived dominance was positively correlated with the stereotypically masculine emotions studied, and negatively correlated with two of the four stereotypically feminine emotions, fear and sadness. Affiliation, after factoring out sex, was negatively correlated with the masculine emotions and positively correlated with three of the four stereotypically feminine emotions, namely fear, happiness, and surprise.

In the other two studies, participants viewed pictures of encoders previously rated as high or low dominance (Study 2) and affiliation (Study 3), along with vignettes describing the
encoders in situations likely to evoke a variety of emotions. Participants were asked to indicate which of a series of schematic drawings depicting facial expressions of emotion they believed the encoder would show in response to the situation described. In the dominance study, male encoders and high dominance encoders of both sexes were judged more likely to display angry facial expressions, and female encoders were judged more likely to display expressions of sadness. In the affiliation study, high affiliation encoders were judged more likely to display happiness in the happy vignette condition than were low affiliation encoders, and the effect was stronger for male encoders than for female encoders. In the angry and neutral vignette conditions, male encoders were rated as more likely to show anger regardless of affiliation level, and female encoders were rated as less likely to show anger regardless of affiliation level.

The authors interpret these results as supporting both an effect of sex bias and effects of perceived dominance and affiliation. They observed that these variables were confounded, because facial features associated with dominance are more typical of males and facial features associated with affiliation are more typical of females. However, the authors concluded that their findings show that sex-based stereotypical expectations can be partially overruled by expectations based on our perceptions of the dominance and affiliativeness of a person (p. 534).

Hugenberg and Sczesny (2006) examined the impact of the sex of the encoder on the happy face advantage, or HFA, which refers to the fact that happy expressions are categorized more quickly than other expressions in speeded response studies. Participants viewed images on a computer monitor of encoders displaying a negative emotion (anger in one version, sadness in another) or happiness, and were asked to categorize the emotion as quickly as possible.

The authors presented two rationales, both of which led to predictions that the happy face advantage would be stronger for female encoders than for male encoders. One rationale was
sex bias in interpreting expressions. Because happiness is stereotypically more closely associated with women than with men, it was argued that the expectation of seeing happiness on female faces would lead to a stronger HFA for women. The second rationale was based on the valence of women compared with men as a stimulus category, and effects of emotional congruence. The authors cited evidence of the so-called "women are wonderful" effect, i.e. that women are generally regarded more positively than are men, and argued that valence-congruent processing would lead to a stronger HFA for women. In the happiness versus sadness trials, it was argued that these two rationales lead to different predictions. They stated that the stereotype-based expectancies should not lead to a stronger HFA for women because sadness and happiness should be equally expected on female faces. On the other hand, they argued that the congruent valence rationale would still predict a stronger HFA.

As predicted, on the whole the happy face advantage was present for all encoders but more pronounced for female encoders. In the happiness versus sadness trials, a larger HFA for female encoders was observed, which the authors interpreted as stronger support for the valence-congruence model than for the stereotype-based expectancy model. They commented that this finding does not detract from the utility of sex-stereotyped interpretations of affect in explaining effects other than the HFA, particularly because this and other HFA studies use unambiguous expression stimuli, and stereotypes are more likely to affect interpretations of ambiguous stimuli.

The most recently published investigation uncovered in this review that addressed the effects of encoder sex on interpretations of emotion expressions provides an in-depth, multifaceted examination of perceptions of happiness and anger as a function of the perceived sex of the encoder. Becker, Kenrick, Neuberg, Blackwell, and Smith (2007) considered the related phenomena of (a) anger being more quickly and accurately identified on male faces and
(b) happiness being more quickly and accurately identified on female faces. They conducted a series of seven studies to compare the utility of two theoretical explanations for these effects: the theory of bias arising from sex stereotypes rooted in social learning, and the theory of bias arising from evolved tuning of human perceptual systems to avoid threats and approach opportunities.

The authors started with a hypothesis that could be formed from either theoretical perspective: "judgments and speeded decisions about expression would be dependent on the sex of the displayer of the emotion, revealing correlations of maleness with anger and femaleness with happiness" (p. 181). They then went on to examine the issue using multiple methodologies and tried to evaluate the hypothesis and also to search for factors supporting or undermining each of the theoretical perspectives under consideration.

In the first study, participants were asked to imagine a face. Half the participants were instructed to imagine a happy face, and half were instructed to imagine an angry face. They then provided details pertaining to the face they imagined by responding to items on a questionnaire. Among other things, they were asked whether they had imagined a male or female face. Most participants of both sexes who were asked to imagine an angry face imagined the face to be male. A significant majority of males who imagined a happy face imagined it to be female. Marginally more females also imagined happy faces as female. The authors noted that this procedure tapped participants' associations of these emotions to sex, and was able to do so without cuing sex explicitly, but that it revealed little about the source of those associations.

In the second study, participants viewed a series of photos of encoders displaying angry and happy expressions on a computer and were instructed to categorize each as quickly as possible. Afterwards, they completed an implicit association task to assess any automatic associations of male or female names with synonyms for happiness or anger. Happy faces were
judged more quickly than angry faces, and the quicker reaction to happy faces was more pronounced for female faces than for male faces. Angry male faces were categorized more quickly than angry female faces, and happy female faces were categorized more quickly than happy male faces. Accuracy for categorization of angry expressions was better for male encoders than for female encoders, and accuracy for happy expressions was better for female faces than for male faces. On the implicit associations measure, the overall pattern was for participants to associate males with anger and females with happiness.

Categorization and reaction time data were reanalyzed for a subset of participants whose associations were in the opposite directions from the overall averages. This was done to investigate the possibility that the previously observed patterns would be reversed in this subset, as one might expect if the observed effects were caused by automatic associations between the sexes and the emotions in question. However, some aspects of the patterns persisted. This subset of participants was also faster and more accurate in categorizing angry male faces as compared with happy male faces, and they were faster in categorizing happy female faces compared with angry female faces. The authors conclude that the overall results for the categorization and response time task support the initial hypothesis and are consistent with both the social learning rationale and the perceptual mechanism rationale, and that the results for the subset of participants with the less common pattern of associations are somewhat more compatible with the perceptual mechanism rationale.

In the third study, participants viewed the same images as were used in the second study, but were asked to determine quickly the sex of the encoder instead of the emotion being displayed. Participants categorized male faces more accurately when they had an angry
expression, and they categorized female faces more accurately and quickly when they had a happy expression.

The stimuli for study four consisted of computer-generated faces created to simulate men and women expressing anger and happiness. This was done in order to control for the possibility that men actually portray expressions of anger better than do women, and the possibility that women actually portray expressions of happiness better than do men. The methodology of the second study was repeated with the computer-generated faces. Anger was categorized more quickly on apparently male faces than on apparently female faces, and happiness was categorized more quickly on apparently female faces than on apparently male faces. Participants were more accurate in identifying anger on apparently male faces than on apparently female faces, and more accurate in identifying happiness on apparently female faces than on apparently male faces. The authors interpreted the results as supporting the primary hypothesis.

For the fifth study, photographs of angry, happy, neutral, and fearful faces were presented for very short time intervals. Participants were asked to identify the emotions they saw. Neutral male faces were misidentified as angry more often than were neutral female faces. Happy female faces were correctly identified more often than were happy male faces. Accuracy rates were the same for angry male faces and angry female faces. These results were considered partially supportive of the original hypothesis, regarding the association of maleness with anger and femaleness with happiness. In addition, fearful female faces were more accurately categorized than fearful male faces.

For the sixth study, computer graphics software was again used, this time to generate nine androgynous faces with neutral expressions. From these, nine pairs of faces were created by making a slightly feminized and a slightly masculinized version of each. Four of these pairs were
used with the neutral expressions. The remaining five pairs were given emotional expressions of happiness or anger. Each member of a pair had an almost identical expression, with the feminized version being slightly modified to be either less happy or angrier than the masculinized version. Participants viewed each pair, and made judgments either as to which one of the two was more masculine, or which one appeared angrier. Participants' judgments of masculinity and femininity aligned with the ways the investigators made the faces, i.e. faces that were masculinized were judged to be male, and faces that were feminized were judged to be female. Despite the fact that the emotional expressions of the pairs had in every case either been left neutral or changed to make the feminized face angrier or less happy, the masculinized faces were always rated the angrier of the two on average. The authors interpreted this as "a natural confound between sex and facial expression" (p. 187).

In the seventh study, six androgynous faces were generated using computer software, then modified in each of six ways: a body with traditionally masculine or feminine clothing was added, the jaw was made squarer or was made rounder and narrower, and the brow ridge was raised or lowered. Each original face and its six variants were presented as stimuli. Half the participants were told the stimuli had been modified to look slightly angry or slightly happy. These participants rated each stimulus on a nine-point scale from "slightly angry" to "slightly happy." The other half were told the stimuli had been modified to look slightly masculine or slightly feminine, and asked to rate each stimulus on a nine-point scale from "slightly masculine" to "slightly feminine."

Masculine clothing caused the faces to be rated as more masculine compared with the original versions, but did not cause them to be rated as angrier, as would be predicted from
sex-stereotype theory. Feminine clothing caused faces to be rated as more feminine compared with originals, but did not cause them to be rated as happier. Faces with lower brow ridges were seen as more masculine and angrier. Faces with higher brow ridges were seen as more feminine, but higher brow ridges did not cause faces to be rated as happier. Making the jaw more square did not result in higher ratings of masculinity as expected by the investigators, but did cause faces to be rated as more angry. Similarly, the rounding and narrowing of the jaw did not result in higher ratings of femininity, but did result in higher ratings of happiness. The investigators regarded these results as being inconsistent with the social learning hypothesis.

Becker et al. interpreted their results as a whole to be more consistent with the theory that human perceptual mechanisms are tuned to associate anger with males and happiness with females, rather than with the theory that social learning leads to stereotyped beliefs about gender and emotion that in turn bias the interpretation of affect. They speculated that certain facial features that are associated with human sexual dimorphism, but that are not always or necessarily associated with concepts of masculinity or femininity, may be perceived as conveying anger and happiness. They did not dismiss social learning as a factor, however, emphasizing that both sources of variance may be at play in a given situation. Situations involving ambiguous and complex emotional expressions might give stereotypical interpretations the opportunity to emerge, as Condry and Condry (1976) observed three decades earlier. Such expressions were not studied in the Becker et al. investigation.

Conclusion

The impact of encoder sex on the interpretation of emotion expressions has been observed several times in the scientific literature. The exact nature (or natures) of this effect has not been firmly established. The theory that females are more skilled at encoding emotions has been offered and may be correct. However, this theory fails to explain findings that certain
emotions may be ascribed to males more quickly and/or accurately, or at a higher level of intensity as compared with females. Studies in which faces are kept constant across conditions while sex of encoder is manipulated using peripheral cues like clothing and hair style also reveal the inadequacy of such an explanation. An interpretive bias based on the sex-stereotyping of emotions has been offered and may also be correct. Many of the studies designed with the intention of studying sex of encoder effects, as opposed to those revealing such effects more or less incidentally, support this theory. But not every study's results fit readily into such a model, and interesting alternative or complementary models are beginning to arise, such as those tying observed effects to particular facial features, for example Hess et al. (2005) and Becker et al. (2007).

Lingering questions in this area may soon be answered, though doubtlessly new questions will arise in the process. New techniques are being developed, such as the use of computers to perform tasks such as combining images, adding or removing cues such as hairstyle and clothing, and even generating very realistic, highly manipulable synthetic encoders. The literature already reflects some of the innovative methodologies and superior controls these techniques make possible, and more will surely come. These better investigative tools challenge researchers to examine old issues in new ways, and to ask new questions that require shifts in one's assumptions, similar to Condry and Condry's innovative manipulation of the apparent sex of the encoder, something that was conventionally assumed to be fixed. The interplay between the application of developing technologies and the creative formulation of research questions will likely soon shed considerable light on the interpretation of emotional states and on all the processes by which we humans understand each other.

CHAPTER 3
METHOD

Participants

Participants were 163 University of Florida students recruited from undergraduate courses. Of these, 124 were women and 39 were men. 63.8% indicated that they were European-American or White, 17.8% were African-American or Black, 11.7% were Hispanic, 11% were Asian or Asian-American, and 1.2% were Pacific Islander. Participants were permitted to indicate more than one racial/ethnic group, and 10 participants did so. There were 79 1st-year students, 40 2nd-year students, 34 3rd-year students, and 10 4th-year students. Students participated in return for extra credit in the class from which they were recruited, or in the case of introductory psychology students, in exchange for credit towards the research participation requirement in the course. All students in the classes used for recruitment were eligible to participate.

Materials

The independent variables in our study were (a) the visual display of affect as determined by facial expression and body positioning of encoders, and (b) the apparent sex of the encoders. For the test of the first hypothesis, apparent sex was presented at two levels: male and female. Visual display of affect was presented at three levels: stereotypically feminine discrete emotion, stereotypically masculine discrete emotion, and ambiguous (blended) emotion. Positively valenced emotions (happiness, pride, and a blend of the two) and negatively valenced emotions (fear, anger, and a blend of the two) were examined separately. For the test of the second hypothesis, all expressions were of ambiguous emotional content, while apparent sex was presented at three levels: male, female, and ambiguous. Positively and negatively valenced emotions were examined separately.

The stimuli were a series of synthesized human faces and upper bodies generated using Curious Labs Poser 6 software (Weinberg et al., 2005). The software allows the creation of realistic three-dimensional models of human figures, which can vary on a number of user-definable parameters such as sex, race, age, and facial expression. Facial expression in particular may be finely controlled in specific regions of the face. More than 50 parameters may be adjusted to specify the behavior of areas such as the right or left forehead, right or left eyebrow, eyelids, around the right or left eye, and around the mouth. The lips and adjacent areas have 20 available movements, each with numerous possible gradations. The high degree of control afforded by the software allowed the accurate reproduction of empirically investigated and rigorously described affective expressions without relying on live encoders.

For our study, six encoders were created from two starting encoders with randomly generated features. Each starting encoder was made into a male version, a female version, and an androgynous version, for a total of two apparent males, two apparent females, and two androgynous figures. Secondary sex characteristics such as certain aspects of bone structure, skin texture, and the presence and degree of facial hair shading were adjusted to manipulate apparent sex of encoders, as were gender cues such as haircut and clothing.

Anger, fear, and happiness expressions as described by Ekman et al. (FACS; 2002) and the expression of pride described by Tracy and Robins (2004) were created and applied to each figure, generating 24 stimulus images. Ambiguous expressions were then created by using the software to interpolate, or morph, between same-valence expressions of basic emotions. This process generates a series of images as one endpoint gradually transitions to the other. The middle image in each series was selected as a mathematical halfway point between the basic emotion expressions.
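In conceptual terms, this morphing step is a parameter-wise linear interpolation between two expression settings, with the midpoint taken at an interpolation weight of 0.5. The sketch below illustrates the idea in Python; the parameter names and the dictionary representation are hypothetical stand-ins for illustration only, not Poser's actual scripting interface.

    # Minimal sketch of selecting the midpoint of a morph between two
    # expressions, assuming each expression is a dict of facial-control
    # parameters (names are hypothetical, not taken from Poser).

    def blend_expressions(expr_a, expr_b, t):
        """Linearly interpolate two expression parameter sets (same keys) at weight t (0 to 1)."""
        return {name: (1 - t) * expr_a[name] + t * expr_b[name] for name in expr_a}

    anger = {"brow_lower": 1.0, "lid_tighten": 0.8, "lip_press": 0.7}
    fear = {"brow_lower": -0.6, "lid_tighten": 0.2, "lip_press": 0.1}

    # t = 0.5 corresponds to the mathematical halfway point used in this study.
    anger_fear_blend = blend_expressions(anger, fear, 0.5)
    print(anger_fear_blend)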

This yielded six expressions in total (anger, fear, and a blend of the two; happiness, pride, and a blend of the two) that were applied to the six encoders for a total of 36 stimulus images. See Appendix A for a representative set of stimulus images.

The dependent variables used in the tests of both hypotheses were ratings on a four-point scale of the degree to which four emotions (anger, fear, happiness, and pride) were judged to be present. After viewing each encoder, participants rated the extent to which they believed the figure was expressing each emotion, from 1 (not at all) to 4 (very much), anger and pride being stereotypically masculine, and fear and happiness being stereotypically feminine.

A set of manipulation-check questions was included after the main questionnaire. The stimuli were a subset of the images used for the hypothesis tests. After each image, participants were prompted to indicate if the person depicted was male, female, or not sure, and whether or not the person's expression was ambiguous.

Procedure

The questionnaires used to collect data resided on the internet. The questionnaires were in the form of interactive web pages, which presented the stimuli, received the participants' responses, and wrote the response data to a digital file for analysis. Participants signed up for the experiment by following a hypertext link on the web page of the course from which they were recruited, by following a hypertext link from the University of Florida Psychology Department's online experiment interface, or by entering a URL supplied by their instructor into a web browser. The link led them to a page containing an informed consent statement and contact information for the primary investigator (see Appendix B). After the informed consent statement, participants advanced to an interactive page beginning with a section where they entered demographic information (see Appendix C). They then continued on to the main body of the questionnaire. See Appendix D for examples of regular and manipulation check items.

Participants viewed each item in succession, in one of six semi-random orders. For counterbalancing, each of the 36 stimulus images was assigned a number between 1 and 36. Six randomly ordered lists of the numbers 1-36 were generated, to create six orders of presentation. The orders of presentation were then adjusted so that each of the six encoders appeared first on one list and last on another, and each of the six emotion expressions appeared first on one list and last on another. The 36 manipulation check items were divided into six groups of six, and one group was appended to the end of each presentation order.

For the regular items, a series of questions below each stimulus elicited ratings from the participant on each of the four emotions, on a scale of 1-4. For the manipulation check items, the two manipulation check questions followed each item (see Appendix D). After entering their responses, participants used the mouse to click the word "next," and the next item was presented. After responding to the last item, participants clicked a box labeled "submit" and their responses were written to a data file. They were then directed to pages thanking them and providing instructions for receiving credit for their participation (see Appendix E). Participants who began the study but then decided not to continue participating were able to receive credit by scrolling to the end of the questionnaire, clicking the submit button, and following the subsequent instructions.

Evaluation of Hypotheses

The criteria for support for the first hypothesis were as follows: if participants' ratings of emotions for figures in the ambiguous expression condition were sex-stereotypical, and their ratings of the emotions of figures in the unambiguous expression conditions were consistent with the intended interpretation of the expression regardless of target sex, then the first hypothesis was supported. These criteria were analyzed using two series of Bonferroni-corrected t-tests.

Four t-tests were employed to evaluate whether or not ratings of emotions for figures in the ambiguous expression condition were sex-stereotypical. For apparently male and apparently female figures displaying positive-valence and negative-valence blended expressions, ratings of stereotypically masculine emotions were compared to ratings of stereotypically feminine emotions. Twelve t-tests were employed to evaluate whether or not ratings of the emotions of figures in the unambiguous expression conditions were consistent with the intended interpretation of the expression regardless of encoder sex. For apparently male, apparently female, and androgynous figures displaying each of the four unambiguous expressions, ratings of the emotion matching the intended interpretation of the expression were compared with ratings of the other same-valence emotion.

The criteria for support of the second hypothesis were as follows: in ambiguous expression conditions, if participant ratings of stereotypically masculine emotions were higher for cells with unambiguously male figures than for cells in which the figures' apparent sex was ambiguous, and if participant ratings for stereotypically feminine emotions were larger for cells with unambiguously female figures than for cells in which the figures' apparent sex was ambiguous, then the second hypothesis was supported. A series of four Bonferroni-corrected t-tests was used to evaluate the second hypothesis. Ratings of stereotypically masculine emotions were compared for apparently male and ambiguously sexed figures displaying positive-valence and negative-valence blended expressions. Ratings of stereotypically feminine emotions were compared for apparently female and ambiguously sexed figures displaying positive-valence and negative-valence blended expressions.
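As a concrete illustration of how each family of planned comparisons was evaluated, the sketch below runs one Bonferroni-corrected paired t-test in Python with SciPy. The arrays are hypothetical stand-ins for per-participant mean ratings, not the study's actual data; with four tests in a family, the corrected per-test threshold is .05 / 4 = .0125, which corresponds to the p < .013 criterion referred to in the results.

    # Minimal sketch of a single Bonferroni-corrected paired comparison,
    # assuming per-participant mean ratings in two NumPy arrays (hypothetical data).
    import numpy as np
    from scipy import stats

    n_tests = 4                        # four planned comparisons in the family
    alpha_corrected = 0.05 / n_tests   # 0.0125

    # Hypothetical means: anger vs. fear ratings for apparently male figures
    # displaying the anger-fear blended expression.
    anger_ratings = np.array([2.1, 1.8, 2.4, 2.0, 1.9])
    fear_ratings = np.array([2.5, 2.2, 2.6, 2.3, 2.4])

    t_stat, p_value = stats.ttest_rel(anger_ratings, fear_ratings)
    print(f"t = {t_stat:.3f}, p = {p_value:.3f}, "
          f"significant at the corrected alpha: {p_value < alpha_corrected}")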

CHAPTER 4
RESULTS

Data from 8 participants were discarded because those participants completed less than 70% of the items. Data from 163 participants were analyzed. Reliability for facial expression subscales was assessed by calculating Cronbach's alpha for each of the 6 facial expressions. Cronbach's alphas were all above .7, indicating adequate reliability.
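For reference, Cronbach's alpha for a subscale can be computed from a participants-by-items matrix of ratings as in the short Python sketch below; the example matrix is hypothetical and is included only to make the function runnable.

    # Minimal sketch of Cronbach's alpha, assuming rows = participants and
    # columns = the items making up one expression subscale (hypothetical data).
    import numpy as np

    def cronbach_alpha(ratings):
        ratings = np.asarray(ratings, dtype=float)
        k = ratings.shape[1]                               # number of items
        item_variances = ratings.var(axis=0, ddof=1)       # variance of each item
        total_variance = ratings.sum(axis=1).var(ddof=1)   # variance of summed scores
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

    example = np.array([[3, 4, 3], [2, 2, 3], [4, 4, 4], [1, 2, 1]])
    print(round(cronbach_alpha(example), 3))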

Hypothesis One

The first hypothesis predicted that participants' ratings of emotions for ambiguous expressions would be sex-stereotypical. That is, apparently male figures displaying ambiguous emotions were predicted to be rated higher on stereotypically masculine emotions than on stereotypically feminine emotions, and apparently female figures were predicted to be rated higher on stereotypically feminine emotions than on stereotypically masculine emotions. It was also predicted that participants' ratings of emotions for unambiguous expressions would be consistent with the intended interpretations of the expressions regardless of the apparent sex of the target figures. Expressions constructed using facial action units for one emotion, i.e. unblended expressions, were considered to be unambiguous expressions for the purpose of hypothesis testing. Blended expressions were constructed by morphing one unambiguous expression into another, generating a series of intermediate images. The middle image in the series was selected for use. These blended expressions were considered to be ambiguous.

The first part of the first hypothesis, regarding ambiguous expressions, was partially supported. Both apparently male and apparently female figures displaying blended expressions were rated higher on stereotypically feminine emotions, whereas the hypotheses predicted that apparently male figures displaying blended expressions would be rated higher on stereotypically masculine emotions. The following four planned t-tests were employed to evaluate the first part of the hypothesis: ratings of anger (stereotypically masculine) and fear (stereotypically feminine) were compared for anger-fear blended expressions on apparently male figures. Anger was rated significantly lower than fear (t = -2.376, p = .019), which was not consistent with the hypothesis. Ratings of anger and fear were compared for anger-fear blended expressions on apparently female figures. Fear was rated significantly higher than anger (t = 16.225, p < .001), which was consistent with the hypothesis. Ratings of pride (stereotypically masculine) and happiness (stereotypically feminine) were compared for pride-happiness blended expressions on apparently male figures. Pride was rated significantly lower than happiness (t = -6.091, p < .001), which was not consistent with the hypothesis. Ratings of pride (stereotypically masculine) and happiness (stereotypically feminine) were compared for pride-happiness blended expressions on apparently female figures. Happiness was rated significantly higher than pride (t = -12.290, p < .001), which was consistent with the hypothesis.

The second part of the first hypothesis, regarding the correct identification of unambiguous emotions, was supported except in the case of apparently female figures displaying unblended pride, in which case participants' ratings of pride and happiness were not significantly different. Twelve planned t-tests were used to evaluate the second part of the first hypothesis. The first four t-tests dealt with apparently male figures: for apparently male figures displaying unblended anger, ratings of anger (stereotypically masculine) were compared with ratings of fear (stereotypically feminine). Anger was rated significantly higher than fear (t = 43.362, p < .001), which was consistent with the hypothesis. For apparently male figures displaying unblended fear, ratings of anger were compared with ratings of fear. Anger was rated significantly lower than fear (t = -27.333, p < .001), which was consistent with the hypothesis. For apparently male figures displaying unblended pride, ratings of pride (stereotypically
masculine) were compared with ratings of happiness (stereotypically feminine). Pride was rated significantly higher than happiness (t = 8.628, p < .001), which was consistent with the hypothesis. For apparently male figures displaying unblended happiness, ratings of pride were compared with ratings of happiness. Pride was rated significantly lower than happiness (t = 9.700, p < .001), which was consistent with the hypothesis.

A corresponding set of four t-tests was used to evaluate participants' identification of unambiguous emotions of apparently female figures. For apparently female figures displaying unblended anger, ratings of anger (stereotypically masculine) were compared with ratings of fear (stereotypically feminine). Anger was rated significantly higher than fear (t = 41.117, p < .001), which was consistent with the hypothesis. For apparently female figures displaying unblended fear, ratings of anger were compared with ratings of fear. Anger was rated significantly lower than fear (t = -33.185, p < .001), which was consistent with the hypothesis. For apparently female figures displaying unblended pride, ratings of pride (stereotypically masculine) were compared with ratings of happiness (stereotypically feminine). Ratings of pride were not significantly different than ratings of happiness (t = -.009, p = .993), which was not consistent with the hypothesis. For apparently female figures displaying unblended happiness, ratings of pride were compared with ratings of happiness. Pride was rated significantly lower than happiness (t = -13.009, p < .001), which was consistent with the hypothesis.

A third set of four t-tests was used to evaluate participants' identification of unambiguous emotions of figures with ambiguous apparent sex. For ambiguously sexed figures displaying unblended anger, ratings of anger (stereotypically masculine) were compared with ratings of fear (stereotypically feminine). Anger was rated significantly higher than fear (t = 29.894, p < .001), which was consistent with the hypothesis. For ambiguously sexed figures displaying unblended
fear, ratings of anger were compared with ratings of fear. Anger was rated significantly lower than fear (t = -27.751, p < .001), which was consistent with the hypothesis. For ambiguously sexed figures displaying unblended pride, ratings of pride (stereotypically masculine) were compared with ratings of happiness (stereotypically feminine). Pride was rated significantly higher than happiness (t = 8.411, p < .001), which was consistent with the hypothesis. For ambiguously sexed figures displaying unblended happiness, ratings of pride were compared with ratings of happiness. Pride was rated significantly lower than happiness (t = -15.632, p < .001), which was consistent with the hypothesis.

Hypothesis Two

In the second hypothesis it was predicted that when the target figure's sex was ambiguous and the target figure was displaying an ambiguous expression of emotion, participants would assign lower ratings of stereotypically masculine emotions than they would to apparently male target figures, and that they would assign lower ratings of stereotypically feminine emotions than they would to apparently female target figures. The second hypothesis was partially supported. Participants rated ambiguously sexed figures lower on stereotypically masculine emotions than they did apparently male figures. Comparisons of participants' ratings of stereotypically feminine emotions for ambiguously sexed figures versus apparently female figures yielded results that did not reach the required p-value for significance when the Bonferroni correction was applied (p < .013), but did without the Bonferroni correction.

Four planned t-tests were used to evaluate the second hypothesis. Ratings of anger for apparently male figures displaying blends of anger and fear were compared to ratings of anger for ambiguously sexed figures displaying blends of anger and fear. Ratings of anger were higher for apparently male figures than for ambiguously sexed figures (t = 9.467, p < .001), which was consistent with the hypothesis. Ratings of fear for apparently female figures displaying blends of
anger and fear were compared to ratings of fear for ambiguously sexed figures displaying blends of anger and fear. Ratings of fear were not significantly higher for apparently female figures than for ambiguously sexed figures after Bonferroni correction of alpha to account for the four t-tests (t = 1.81, p = .036). This was not consistent with the hypothesis. Ratings of pride for apparently male figures displaying blends of pride and happiness were compared to ratings of pride for ambiguously sexed figures displaying blends of pride and happiness. Ratings of pride were higher for apparently male figures than for ambiguously sexed figures (t = 2.34, p = .011), which was consistent with the hypothesis. Ratings of happiness for apparently female figures displaying blends of pride and happiness were compared to ratings of happiness for ambiguously sexed figures displaying blends of pride and happiness. Ratings of happiness were not significantly higher for apparently female figures than for ambiguously sexed figures after Bonferroni correction of alpha (t = 2.117, p = .018), which was not consistent with the hypothesis.

Additional Analyses

An additional set of t-tests was employed to examine the premise of the first part of the first hypothesis, i.e. that participants would exhibit sex bias in the interpretation of ambiguous expressions of emotion on apparently male and apparently female target figures. In the additional analysis, apparently male and apparently female figures were compared on ratings of stereotypically masculine emotions, and on ratings of stereotypically feminine emotions.

Four t-tests were used for the additional analysis. Ratings of anger were compared for apparently male figures displaying blends of anger (stereotypically masculine) and fear (stereotypically feminine) and for apparently female figures displaying blends of anger and fear. Apparently male figures were rated as significantly angrier than were apparently female figures (t = 3.686, p < .001). Ratings of fear were compared for apparently male figures displaying blends of anger and fear and for apparently female figures displaying blends of anger and fear.
Apparently male figures were rated as being significantly less fearful than apparently female figures (t = -12.041, p < .001). Ratings of pride were compared for apparently male figures displaying blends of pride (stereotypically masculine) and happiness (stereotypically feminine) and for apparently female figures displaying blends of pride and happiness. Apparently male figures were rated as significantly prouder than were apparently female figures (t = 4.269, p < .001). Ratings of happiness were compared for apparently male figures displaying blends of pride and happiness and for apparently female figures displaying blends of pride and happiness. Apparently male figures were rated as significantly less happy than apparently female figures (t = -3.023, p = .003). These results are all consistent with sex-biased interpretation of ambiguous expressions of emotion.

Apparent Sex Manipulation Check

Figures intended to be unambiguously male were consistently rated as such. The first male figure was rated as male by 159 of 163 participants, and the second was rated as male by 160 of 163 participants. Figures intended to be unambiguously female were also consistently rated as such. Both female figures were rated as female by 161 of 163 participants. As expected, there was less agreement regarding the figures whose sex was intended to be ambiguous. The first ambiguous figure was rated as male by 101 participants, as female by 36 participants, and 26 participants chose the response "don't know." The second ambiguous figure was rated as male by 68 participants, as female by 50 participants, and 44 participants chose the response "don't know."

Summary

The first hypothesis of our study was partially supported by the results. The first part of the first hypothesis was not supported in the primary analysis, because participants rated blended expressions significantly higher on stereotypically feminine emotions regardless of the apparent
sex of the target figure. However, the additional analysis did reveal a pattern of sex-stereotypical ratings of ambiguous emotions in that apparently male figures displaying blended expressions were rated higher on stereotypically masculine emotions than were apparently female figures displaying blended emotions, and apparently female figures displaying blended expressions were rated higher on stereotypically feminine emotions than were apparently male figures displaying blended emotions. The second part of the first hypothesis was mostly supported in that in 11 of 12 t-tests, results were consistent with participants assigning the highest ratings to the intended emotions regardless of apparent sex of target and whether the intended expression portrayed a stereotypically masculine or stereotypically feminine emotion.

The second hypothesis of our study was also partially supported by the results. Apparently male figures displaying blended emotions were rated higher on stereotypically masculine emotions than were ambiguously sexed figures displaying blended expressions, as predicted. However, apparently female figures displaying blended expressions were not rated higher on stereotypically feminine emotions than were ambiguously sexed figures displaying blended expressions.

CHAPTER 5
DISCUSSION

The theory underlying the hypotheses was generally supported. Some elements of the hypotheses were not fully supported, but explanations that preserve the essential ideas readily present themselves. In one case there seems to have been a problem with the calibration of the stimuli, and additional analyses correcting for this reveal the predicted effect. In another, the test seems to have been very slightly underpowered for use with the conservative Bonferroni correction to avoid family-wise error. If these rationales can be accepted, then the results provide evidence for a sex bias in the interpretation of visual expressions of emotion. This may partially explain the persistent exaggeration of sex differences in emotion in the popular imagination, as compared with the scientific literature.

Tests of Hypotheses

Testing of the first hypothesis yielded mixed results. The first part of the first hypothesis, i.e. that people would interpret ambiguous expressions in a biased way according to the apparent sex of the target, was only partially supported. For blended expressions, participants rated the stereotypically female emotions higher for both apparently male and apparently female faces. However, the anticipated sex bias was observed in the follow-up analyses. In blended expression conditions, ratings of stereotypically masculine emotions were significantly higher on apparently male faces as compared to apparently female faces, and ratings of stereotypically feminine emotions were significantly higher on apparently female faces than on apparently male faces.

The concept underlying the hypothesis appears to have garnered support even though not all parts of the hypothesis did. The failure to fully support the hypothesis may have resulted from the blended expressions not being perceived as sufficiently ambiguous, or perhaps they did not fall at the psychological midpoint between the pairs of unblended expressions from which they
were made. Put another way, the psychological midpoint between two expressions, upon which the hypothesis was based, may be different from the mathematical midpoint, which is what was used for our study. An investigation into where that midpoint actually lies, and whether it is best approximated by morphing unambiguous expressions together or by some other method of combining elements of unambiguous expressions, could be a fruitful area for further research.

The second part of the first hypothesis was supported in 11 of the 12 comparisons made. Participants correctly identified unblended emotions except in the case of female faces displaying unblended pride. There was no difference in their ratings of perceived pride and perceived happiness in this condition. In fact, all figures in the study displaying positive affect were rated relatively highly on both pride and happiness. In contrast, expressions of unambiguous fear received comparatively low ratings on anger, and expressions of unambiguous anger received similarly low ratings on fear.

The reasons for this difference are unclear, but perhaps worth speculating about. It could be that happiness and pride were simply not adequately differentiated in the creation of the stimuli. It is also possible that emotions of positive valence are not psychologically differentiated to the same extent as emotions of negative valence, or at least that happiness is a somewhat broad or undifferentiated emotion. For example, pride, love, and pleasant surprise may all be described as "happy" emotions. For emotions of negative valence, the term "unhappy" could perform a similar role in categorizing them. The important difference for the purpose of this work is that happiness is described in the facial expression literature as a recognizable basic emotion in its own right, whereas there is no described expression that is simply unhappy without also being something else, such as sad, angry, or fearful. It might also be a useful area of
further investigation to explore how differentiated expressions of positively valenced emotions are as compared with negatively valenced emotions.

Testing of the second hypothesis also yielded mixed results. In blended expression conditions, figures of ambiguous apparent sex were rated lower on stereotypically masculine emotions than were apparently male figures. This was what was predicted: if the effect being examined is truly a sex bias, the effect should disappear or be attenuated when the sex of the target is unclear. However, figures of ambiguous sex were not rated significantly lower on stereotypically feminine emotions than were apparently female figures. This probably indicates that the test was under-powered: the results comparing ambiguously sexed figures with apparent females approached significance in the predicted direction. It could in theory also indicate poor manipulation of apparent sex, with participants tending to believe that figures intended to be ambiguous were in fact females. Examination of the manipulation check data contradicts this, however. In fact, if poor manipulation were a problem, one would expect the ratings for ambiguous figures to have been closer to those for apparent males rather than apparent females as observed, since both ambiguously sexed figures were rated as male more often than female.

Sex Bias in Interpreting Affect as a Reinforcer of Cultural Stereotypes

In the introduction, it was observed that there is a popular perception that men's and women's emotional experiences are very different, while the research literature indicates that the emotional worlds of men and women are much more alike than different. The purpose of our study was to examine one possible explanation for this disjunction between popular perception and empirical reality: namely, that there is a sex bias in play when people decode the emotional expressions of others. Previous studies had revealed some evidence to that effect, but were limited by having to trade between realism and control in the creation of stimuli.

Our study brought new tools to the task. Realistic computer-generated faces with finely controllable emotional expressions made it possible for apparent males and females to be shown displaying exactly the same expression, as opposed to the best approximation of a human trying to pose the expression. The same tools made it possible to create encoders of ambiguous apparent sex, and again to show them displaying precisely the same expressions of emotion as the apparently male and female figures in order to see if the hypothesized sex bias was eliminated or attenuated when the target's sex was not clear.

The results gave partial support to the hypotheses as they were formulated, but appear to support the basic idea of a sex bias in the decoding of expressions of emotion. Apparently male figures and apparently female figures displaying precisely the same facial expressions and postures were rated differently, and the difference in the ratings reflected stereotypes about which emotions are considered masculine and which are considered feminine. This indicates sex bias. When the figure was neither clearly male nor clearly female, it was rated lower on stereotypically masculine emotions than were male figures. There was a similar trend in comparisons between ambiguous figures and female figures, which fell short of statistical significance by a slim margin. Firm conclusions cannot be drawn about trends that fall short of significance, but the pattern here is tantalizingly close to evidence that figures that are obviously male or obviously female receive an extra boost in ratings of stereotypically masculine and stereotypically feminine emotions respectively, and that figures that are not obviously male or female receive no such boost. If this were definitively shown to be the case, this would also indicate sex bias. The results regarding the ambiguously sexed figures are consistent with the presence of sex bias in the interpretation of expressions of emotion, though they do not demonstrate it clearly.

65 As with many other types of bias, the practic al implications of research indicating a sex bias in the interpretation of emotions relate to th e fact that when people are aware of a bias, they have on opportunity to correct for it. In social li fe as well as in any pr ofessional arena in which the accurate understanding of the emotional signals of others is important, knowledge of this particular bias may create a chance for improve d communication. Mental health practitioners may be particularly impacted, because they deal so directly with interpreting the emotional states of clients. Another possible implication for couns eling is that clients themselves may be made aware of the sex bias in interpreting the emoti onal states of others, and thereby be given an opportunity to try to counteract it in their own inferences. This c ould be particularly useful in couples counseling. Limitations of this Investigation Our study had a few limitations. The significant time involved in learning to use the Poser 6 software to create encoders, combined w ith the desirability of keeping the questionnaire short enough to avoid participant fatigue, led to the inclusion of only si x encoders in the study. The inclusion of more figures would have d iluted any unintended effects of the figures themselves as opposed to the desired effects of the expressions and the apparent sexes of the figures. The use of a larger number of participants mi ght have helped the results pertaining to the second hypothesis be more conclusive. All of the observed results were in the predicted direction, but after Bonferoni corr ection to avoid inflating family-w ise error, some were not quite statistically significant. Inadequate power seem s a likely cause of this, although it is certainly possible that the predicted eff ect simply is not there. The use of a wider variety of blends of emo tions in the stimuli might have added some clarity to the testing of the fi rst hypothesis. The test of the fi rst hypothesis confounded the effect

The use of a wider variety of blends of emotions in the stimuli might have added some clarity to the testing of the first hypothesis. The test of the first hypothesis confounded the effect of sex bias with the effect of a less relevant aspect of the blended expression stimuli. The blended stimuli were perceived as being more representative of stereotypically feminine emotions, despite being midpoints between stereotypically masculine and feminine emotions from a mathematical point of view. A variety of blended expressions made from different ratios of the constituent emotions might have made it possible to find a blend that functioned better as a midpoint in the way it was perceived by participants.

Implications for Future Research

Several possibilities for further investigation are suggested by the outcome of our study. For example, the development of stimuli could be taken further. By creating stimuli showing a variety of blended emotional expressions, made with various ratios of the constituent emotions and perhaps different methods of combining expressions, it may be possible to gain considerable insight into the way people decode ambiguous, ambivalent, or complex emotional expressions. Would blends of emotional expressions other than the ones examined here behave the same way; that is, would the midpoint generated by morphing two together be interpreted as representing one of the constituent emotions more than the other? If so, what would characterize the emotion perceived to be dominant? Might it be, as it was here, the more stereotypically feminine of the two? Also, what ratio or method of combination would maximize ambiguity? The answers to these questions would be interesting in themselves and could also lead to the creation of better tools for the exploration of other questions about the decoding of facial expressions.
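As a sketch of what such graded blends might look like, expressions can be thought of as vectors of expression-parameter intensities mixed at several ratios rather than only at the 50/50 midpoint. The parameter names and the simple linear weighting below are illustrative assumptions for this sketch, not the Poser 6 settings used to create the present stimuli.

# Hypothetical illustration: blend two expression "parameter vectors" at
# several ratios. Parameter names and values are invented for this sketch.
anger = {"brow_lower": 0.9, "lip_press": 0.8, "smile": 0.0}
fear = {"brow_raise": 0.9, "eyes_widen": 0.8, "smile": 0.0}

def blend(expr_a, expr_b, weight_b):
    """Linearly interpolate two expressions; weight_b = 0.5 is the mathematical midpoint."""
    keys = set(expr_a) | set(expr_b)
    return {k: (1 - weight_b) * expr_a.get(k, 0.0) + weight_b * expr_b.get(k, 0.0)
            for k in keys}

# A series of graded blends, from mostly-anger to mostly-fear, could be rated
# by participants to locate the ratio that is perceived as a midpoint.
for w in (0.25, 0.5, 0.75):
    print(w, blend(anger, fear, w))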

In our study, pleasant and unpleasant emotional expressions seem to have behaved differently. The emotion ratings for pride and happiness were both rather high not only for expressions blending happiness and pride, but also for expressions of supposedly pure happiness or pure pride. Expressions of anger and fear were rated much more monolithically. Further investigation could clarify the source of this difference. Perhaps in our study the manipulation of this pair of expressions was inadequate. On the other hand, perhaps happiness and pride are poorly differentiated in people's minds. Perhaps positively valenced emotions in general are less differentiated than negatively valenced emotions. Evolutionary psychology might provide a rationale for hypothesizing that negative emotions are interpreted at a higher resolution, or in a more differentiated way. It could be argued that in the more dangerous environments faced by our distant ancestors, being wrong about whether another person was angry or afraid would be more likely to preclude successful procreation (by leading to death, for example) than would being wrong about whether another was proud or amused.

Another possible avenue for future investigation would be to look at the influence of contextual factors on the observed sex bias in interpreting expressions of emotion. This could take any number of forms. Participants could be given information about the figures, such as some indication of their personalities, social status, or sexual orientations. Alternatively, the participants' mindsets could be manipulated, for example by having experimental groups read excerpts from John Gray's or Janet Shibley Hyde's work. It would be interesting to see how responsive the bias is to changes in awareness on the part of the decoders.

It is possible that different groups may exhibit different levels of bias in interpreting emotional expressions. Future research could reveal variations in the amount of bias among such groups as mental health professionals versus lay people, experienced versus inexperienced mental health professionals, younger versus older people, and so on.

Conclusion

Our study brought new tools and finer control to the study of sex-of-encoder effects on the interpretation of visual displays of affect. With some qualifications, evidence of a bias consistent with cultural stereotypes of men's and women's emotions was found.

This bias may contribute to the observed disjunction between popular perceptions about men and women being from different emotional planets and the scientific literature indicating that sex differences in emotion tend to be small, are often situational, and are dwarfed in comparison with the similarities.

APPENDIX A
EXAMPLE STIMULI

[Images of six example stimulus figures, labeled as follows:]
Male 1, anger
Male 2, pride
Androgynous 1, fear-anger blend
Androgynous 2, happiness-pride blend
Female 1, fear
Female 2, happiness

APPENDIX B
INFORMED CONSENT STATEMENT

Informed Consent

Protocol Title: Interpreting Expressions of Emotion

Please read this consent document carefully before you decide to participate in this study.

Purpose of the research study: The purpose of this study is to better understand how people interpret facial expressions of emotion.

What you will be asked to do in the study: You will fill out a brief demographic questionnaire, then you will see a series of images and answer a few questions about each image.

Time required: About 20-30 minutes.

Risks and Benefits: We do not anticipate that you will benefit directly or be harmed in any way by participating in this experiment.

Compensation: If you are a UF general psychology student in the participant pool, you will receive one research credit in return for your participation. After submitting your answers to all the questions, you will be asked to enter your Gatorlink ID in order to receive your credit. If you are not part of the psychology participant pool, your instructor will determine and announce in class the amount of extra credit you will receive for your participation. If your instructor decides not to award extra credit, you will not be compensated for your participation. After you complete the survey, you must print out the Thank You page and give it to your instructor in order to receive credit.

Confidentiality: Your identity will be kept confidential to the extent provided by law. Your responses will be saved without any identifying information. If you provide your Gatorlink ID, it will be stored in a separate file and used only for the purpose of assigning credit. Neither your name nor your ID number can be connected with your responses to the survey.

Voluntary participation: Your participation in this study is completely voluntary. There is no penalty for not participating.

Right to withdraw from the study: You have the right to withdraw from the study at any time without consequence.

Whom to contact if you have questions about the study:
Primary Investigator: Kevin Stanley, M.S., Graduate Student, Psychology Department, (352) 379-7918, stanleyk@counsel.ufl.edu
Faculty Supervisor: Martin Heesacker, Ph.D., Chair, Psychology Department, PSY 144A, (352) 392-0601 x 200, heesack@ufl.edu

Whom to contact about your rights as a research participant in the study:
UFIRB Office, Box 112250, University of Florida, Gainesville, FL 32611-2250, (352) 392-0433, irb2@ufl.edu

Agreement: I have read the procedure described above. By clicking the link below I agree to voluntarily participate in this research study.

Click here to enter.

APPENDIX C
DEMOGRAPHIC QUESTIONNAIRE

Please enter the following demographic information about yourself:

Sex: Male / Female

Age:

Education: Undergraduate (1st year, 2nd year, 3rd year, 4th year) or Graduate or Professional

Race/Ethnicity (select all that apply): American Indian or Alaska Native; Asian; Black or African American; Native Hawaiian or Other Pacific Islander; White; Hispanic or Latino

If you are an international student, what is your home country?

APPENDIX D
EXAMPLE ITEMS

Example 1: regular questionnaire item

Please look at this picture, and then answer the questions below it:

1. On a scale of 1-4, how angry does this person look?
   1 Not at all   2 Slightly   3 Some   4 Very much

2. On a scale of 1-4, how afraid does this person look?
   1 Not at all   2 Slightly   3 Some   4 Very much

3. On a scale of 1-4, how happy does this person look?
   1 Not at all   2 Slightly   3 Some   4 Very much

4. On a scale of 1-4, how proud does this person look?
   1 Not at all   2 Slightly   3 Some   4 Very much

Example 2: manipulation check item

Please look at this picture, and then answer the questions below it:

1. This person is...
   1 Male   2 Female   3 Not sure

2. Is this person's expression ambiguous?
   1 Yes   2 No

APPENDIX E
INSTRUCTIONS FOR RECEIPT OF CREDIT AND THANK-YOU MESSAGE

Instructions for receipt of credit:

If you are in the UF general psychology participant pool, please enter your Gatorlink ID in the box below, then click the submit button. This will allow us to assign you your research credit.

If you are not in the UF general psychology participant pool, click submit without entering anything in the Gatorlink ID box. You will print out the following page and submit it to your instructor to receive credit for your participation.

Gatorlink ID: [Submit]

Thank you/debriefing message:

Thank you! Your participation in this research project is appreciated. Your responses will help improve our understanding of how people interpret the emotional states of others from facial expressions and body postures.

If you are not in the UF general psychology participant pool, please print this page and submit it to your instructor with your name printed legibly at the top.

LIST OF REFERENCES

Algoe, S. B., Buswell, B. N., & DeLamater, J. D. (2000). Gender and job status as contextual cues for the interpretation of facial expression of emotion. Sex Roles, 42(3-4), 183-208.

Atkinson, A. P., Tipples, J., & Burt, D. M. (2005). Asymmetric interference between sex and emotion in face perception. Perception & Psychophysics, 67(7), 1199-1213.

Aubrey, J. S., & Harrison, K. (2004). The gender-role content of children's favorite television programs and its link to their gender perceptions. Media Psychology, 6(2), 111-146.

Baron-Cohen, S., Wheelwright, S., & Jolliffe, T. (1997). Is there a language of the eyes? Evidence from normal adults, and adults with autism or Asperger syndrome. Visual Cognition, 4(3), 311-331.

Baron-Cohen, S. (2003). The essential difference: The truth about the male and female brain. New York: Perseus Books Group.

Becker, D. V., Kenrick, D. T., Neuberg, S. L., Blackwell, K. C., & Smith, D. M. (2007). The confounded nature of angry men and happy women. Journal of Personality and Social Psychology, 92(2), 179-190.

Becker, R. (Writer). (1991). Defending the caveman [Broadway play]. United States: Theater Mogul NA, Inc.

Billings, A. C., Angelini, J. R., & Eastman, S. T. (2005). Diverging discourses: Gender differences in televised golf announcing. Mass Communication and Society, 8(2), 155-171.

Brody, L. R. (2000). The socialization of gender differences in emotional expression: Display rules, infant temperament, and differentiation. In A. H. Fischer (Ed.), Gender and emotion: Social psychological perspectives (pp. 24-47). New York: Cambridge University Press.

Buck, R., Miller, R. E., & Caul, W. F. (1974). Sex, personality, and physiological variables in the communication of affect via facial expression. Journal of Personality and Social Psychology, 30(4), 587-596.

Canary, D. J., & Emmers-Sommer, T. M. (with Faulkner, S.) (1997). Sex and gender differences in personal relationships. New York: Guilford Press.

Condry, J., & Condry, S. (1976). Sex differences: A study of the eye of the beholder. Child Development, 47(3), 812-819.

Dimitrovsky, L., Spector, H., & Levy-Shiff, R. (2000). Stimulus gender and emotional difficulty level: Their effect on recognition of facial expressions of affect in children with and without LD. Journal of Learning Disabilities, 33(5), 410-416.

Dundes, L. (2001). Disney's modern heroine Pocahontas: Revealing age-old gender stereotypes and role discontinuity under a façade of liberation. The Social Science Journal, 38(3), 353-365.

Eiland, R., & Richardson, D. (1976). The influence of race, sex, and age on judgments of emotion portrayed in photographs. Communication Monographs, 43(3), 167-175.

Ekman, P. (1993). Facial expression and emotion. American Psychologist, 48(4), 384-392.

Ekman, P., & Friesen, W. V. (1976). Measuring facial movement. Environmental Psychology & Nonverbal Behavior, 1(1), 56-75.

Ekman, P., & Friesen, W. V. (1987). Universals and cultural differences in the judgments of facial expressions of emotion. Journal of Personality and Social Psychology, 53(4), 712-717.

Ekman, P., Friesen, W. V., & Ellsworth, P. (1972). Emotion in the human face: Guidelines for research and an integration of findings. Oxford: Pergamon Press.

Ekman, P., Friesen, W. V., & Hager, J. C. (2002). Facial Action Coding System [Computer software]. Salt Lake City, Utah: A Human Face.

Epinions (2000). Epinions.com: Defending the caveman [web site]. Shopping.com, Inc.: http://www.epinions.com/trvl-review-201D-4562E89D-3A4BAE9C-prod3 [accessed March 2006, April 2007].

Erwin, R. J., Gur, R. C., Gur, R. E., Skolnick, B., Mawhinney-Hee, M., & Samalis, J. (1992). Facial emotion discrimination: I. Task construction and behavioral findings in normal subjects. Psychiatry Research, 42(3), 231-240.

Felleman, E. S., Barden, R. C., Carlson, C. R., Rosenberg, L., & Masters, J. C. (1983). Children's and adults' recognition of spontaneous and posed emotional expressions in young children. Developmental Psychology, 19(3), 405-413.

Fink, J. S., & Kensicki, L. J. (2002). An imperceptible difference: Visual and textual constructions of femininity in Sports Illustrated and Sports Illustrated for Women. Mass Communication and Society, 5(3), 317-339.

Gray, J. (1992). Men are from Mars, Women are from Venus: A practical guide for improving communication and getting what you want in your relationships. New York: HarperCollins.

Gray, J. (2006). About John Gray: Men are from Mars, women are from Venus [web site]. MarsVenus.com: http://www.marsvenus.com/JohnGrayProfile.php [accessed March 2006, April 2007].

Hall, J. A., Carney, D. R., & Murphy, N. A. (2002). Gender differences in smiling. In M. H. Abel (Ed.), An empirical reflection on the smile (pp. 155-185). Lewiston, NY: Edwin Mellen Press.

Hess, U., Adams, R. B., Jr., & Kleck, R. E. (2004). Facial appearance, gender, and emotion expression. Emotion, 4(4), 378-388.

Hess, U., Adams, R. B., Jr., & Kleck, R. E. (2005). Who may frown and who should smile? Dominance, affiliation, and the display of happiness and anger. Cognition & Emotion, 19(4), 515-536.

Hess, U., Blairy, S., & Kleck, R. E. (1997). The intensity of emotional facial expressions and decoding accuracy. Journal of Nonverbal Behavior, 21(4), 241-257.

Hess, U., Blairy, S., & Kleck, R. E. (2000). The influence of facial emotion displays, gender, and ethnicity on judgments of dominance and affiliation. Journal of Nonverbal Behavior, 24(4), 265-283.

Hugenberg, K., & Sczesny, S. (2006). On wonderful women and seeing smiles: Social categorization moderates the happy face response latency advantage. Social Cognition, 24(5), 516-539.

Hyde, J. S. (2005). The gender similarities hypothesis. American Psychologist, 60(6), 581-592.

Keltner, D. (1995). Signs of appeasement: Evidence for the distinct displays of embarrassment, amusement, and shame. Journal of Personality and Social Psychology, 68(3), 441-454.

Knudsen, H. R., & Muzekari, L. H. (1983). The effects of verbal statements of context on facial expressions of emotion. Journal of Nonverbal Behavior, 7(4), 202-212.

LaFrance, M., & Banaji, M. (1992). Toward a reconsideration of the gender-emotion relationship. In M. S. Clark (Ed.), Emotion and social behavior (pp. 178-201). Thousand Oaks, CA: Sage.

LaFrance, M., Hecht, M. A., & Paluck, E. L. (2003). The contingent smile: A meta-analysis of sex differences in smiling. Psychological Bulletin, 129(2), 305-334.

Lively, K. J., & Heise, D. R. (2004). Sociological realms of emotional experience. American Journal of Sociology, 109(5), 1109-1136.

Major, B., Carnevale, P. J., & Deaux, K. (1981). A different perspective on androgyny: Evaluations of masculine and feminine personality characteristics. Journal of Personality and Social Psychology, 41(5), 988-1001.

Mignault, A., & Chaudhuri, A. (2003). The many faces of a neutral face: Head tilt and perception of dominance and emotion. Journal of Nonverbal Behavior, 27(2), 111-132.

Moir, A., & Moir, W. (2003). Why men don't iron: The fascinating and unalterable differences between men and women. New York: Citadel Press.

O'Kearney, R., & Dadds, M. (2004). Developmental and gender differences in the language for emotions across the adolescent years. Cognition & Emotion, 18(7), 913-938.

Palermo, R., & Coltheart, M. (2004). Photographs of facial expression: Accuracy, response times, and ratings of intensity. Behavior Research Methods, Instruments, & Computers, 36(4), 634-638.

Pease, A., & Pease, B. (2001). Why men don't listen and women can't read maps: How we're different and what to do about it. New York: Broadway Books.

Pease, B., & Pease, A. (2004). Why men don't have a clue and women always need more shoes: The ultimate guide to the opposite sex. New York: Broadway Books.

Pell, M. D. (2002). Evaluation of nonverbal emotion in face and voice: Some preliminary findings on a new battery of tests. Brain and Cognition, 48(2-3), 499-504.

Philippot, P., Feldman, R. S., & Coats, E. J. (2003). The role of nonverbal behavior in clinical settings: Introduction and overview. In P. Philippot, R. S. Feldman, & E. J. Coats (Eds.), Nonverbal behavior in clinical settings. New York: Oxford University Press.

Plant, E. A., Hyde, J. S., Keltner, D., & Devine, P. G. (2000). The gender stereotyping of emotions. Psychology of Women Quarterly, 24(1), 81-92.

Plant, E. A., Kling, K. C., & Smith, G. L. (2004). The influence of gender and social role on the interpretation of facial expressions. Sex Roles, 51(3-4), 187-196.

Rahman, Q., Wilson, G. D., & Abrahams, S. (2004). Sex, sexual orientation, and identification of positive and negative facial affect. Brain and Cognition, 54(3), 179-185.

Rhodes, S. E. (2004). Taking sex differences seriously. San Francisco: Encounter Books.

Rotter, N. G., & Rotter, G. S. (1988). Sex differences in the encoding and decoding of negative facial emotions. Journal of Nonverbal Behavior, 12(2), 139-148.

Seidman, S. A. (1992). An investigation of sex-role stereotyping in music videos. Journal of Broadcasting and Electronic Media, 36(2), 209-216.

Signorielli, N. (1989). Television and conceptions about sex roles: Maintaining conventionality and the status quo. Sex Roles, 21(5-6), 341-360.

Simon, R. W., & Nath, L. E. (2004). Gender and emotion in the United States: Do men and women differ in self-reports of feelings and expressive behavior? American Journal of Sociology, 109(5), 1137-1176.

Simpson, P. A., & Stroh, L. K. (2004). Gender differences: Emotional expression and feelings of personal inauthenticity. Journal of Applied Psychology, 89(4), 715-721.

Stern, S. R., & Mastro, D. E. (2004). Gender portrayals across the life span: A content analytic look at broadcast commercials. Mass Communication and Society, 7(2), 215-236.

Tannen, D. (1990). You just don't understand: Women and men in conversation. New York: William Morrow & Co.

Thayer, J. F., & Johnsen, B. H. (2000). Sex differences in judgment of facial affect: A multivariate analysis of recognition errors. Scandinavian Journal of Psychology, 41(3), 243-246.

Theater Mogul (2006). Defending the caveman: About Rob Becker [web site]. Theater Mogul NA, Inc.: http://www.cavemania.com/05-about-rob.html [accessed March 2006, April 2007].

Thompson, J. K. (1983). Visual field, exposure duration, and sex as factors in the perception of emotional facial expressions. Cortex, 19(3), 293-308.

Thompson, T. L., & Zerbinos, E. (1995). Gender roles in animated cartoons: Has the picture changed in 20 years? Sex Roles, 32, 651-673.

Tracey, J. L., & Robins, R. W. (2004). Show your pride: Evidence for a discrete emotion expression. Psychological Science, 15(3), 194-197.

Vogel, D. L., Wester, S. R., Heesacker, M., & Madon, S. (2003). Confirming gender stereotypes: A social role perspective. Sex Roles, 48(11-12), 519-528.

Wagner, H. L., MacDonald, C. J., & Manstead, A. S. (1986). Communication of individual emotions by spontaneous facial expressions. Journal of Personality and Social Psychology, 50(4), 737-743.

Wallbott, H. G. (1988). Big girls don't frown, big boys don't cry: Gender differences of professional actors in communicating emotion via facial expression. Journal of Nonverbal Behavior, 12(2), 98-106.

Wester, S. R., Vogel, D. L., Pressley, P. K., & Heesacker, M. (2002). Sex differences in emotion: A critical review of the literature and implications for counseling psychology. The Counseling Psychologist, 30(4), 630-652.

Weinberg, L., Shen, J., Walther, J., Bryant, A., Werner, S., Mack, R., et al. (2005). Poser (Version 6) [Computer software]. Santa Cruz, CA: e frontier, Inc.

Widen, S. C., & Russell, J. A. (2002). Gender and preschoolers' perception of emotion. Merrill-Palmer Quarterly, 48(3), 248-262.

Zuckerman, M., Lipets, M. S., Koivumaki, J. H., & Rosenthal, R. (1975). Encoding and decoding nonverbal cues of emotion. Journal of Personality and Social Psychology, 32(6), 1068-1076.

BIOGRAPHICAL SKETCH

Kevin Stanley received his Bachelor of Science degree in May of 1996 from the University of Florida in Gainesville, Florida. He majored in Psychology. Stanley entered the doctoral program in Counseling Psychology at the University of Florida in August of 1996. Stanley received his Master of Science degree from the Counseling Psychology program at the University of Florida in August of 2001. Upon completion of his Ph.D., Stanley plans to embark on a career as a counseling psychologist in a direct-service setting.