
Distributed Virtual Rehearsals





DISTRIBUTED VIRTUAL REHEARSALS

By

GEORGE MORA

A THESIS PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE

UNIVERSITY OF FLORIDA

2004


Copyright 2004 by George Mora


To my wife, Maria,
my parents, Jorge and Johanna Mora,
and my family and friends,
for their constant support and encouragement


ACKNOWLEDGMENTS

I would like to thank my thesis committee chairman, Dr. Benjamin C. Lok, for his enthusiasm and interest in this project, as well as for keeping me motivated and on track. I would also like to thank James C. Oliverio for being on my committee and for the constant support, advice, and opportunities he has provided for me. I also give much thanks to Dr. Jorg Peters for supporting both my undergraduate and graduate final projects.

This thesis was completed with the help of several people. My gratitude goes out to Jonathan Jackson, Kai Bernal, Bob Dubois, Kyle Johnsen, Cyrus Harrison, Andy Quay, and Lauren Vogelbaum. This thesis would not have been possible without their help.

I would like to thank my parents, Jorge and Johanna Mora, for always encouraging me to grow both intellectually and creatively. Finally, I would like to thank my wife, Maria Mora, for her unending love, support, and understanding.


TABLE OF CONTENTS

                                                                        page

ACKNOWLEDGMENTS ........ iv
LIST OF FIGURES ........ vii
ABSTRACT ........ viii

CHAPTER

1 INTRODUCTION ........ 1
    1.1 Motivation ........ 1
    1.2 Challenges ........ 4
    1.3 Project Goals ........ 4
    1.4 Organization of Thesis ........ 4
    1.5 Thesis Statement ........ 5
    1.6 Approach ........ 5

2 PREVIOUS WORK ........ 7
    2.1 Distributed Performance ........ 7
    2.2 Virtual Reality ........ 8
    2.3 Digital Characters and Avatars ........ 10

3 APPLICATION ........ 13
    3.1 Scene Design and Experience Development ........ 13
    3.2 Tracking the Actors ........ 15
    3.3 Putting It All Together ........ 17
    3.4 Final Software and Hardware Setup

4 RESULTS
    4.1 Description of Studies ........ 22
    4.2 Reaction from Actors
    4.3 Results ........ 24
        4.3.1 Virtual Reality Used for Successful Rehearsals ........ 24
        4.3.2 Lack of Presence Distracted the Actors ........ 26


        4.3.3 Improvements That Should Be Made to the System ........ 27

5 CONCLUSION ........ 31
    5.1 Usefulness to Acting Community ........ 31
    5.2 Future Work ........ 31
    5.3 Future Applications ........ 33

APPENDIX: STUDY QUESTIONNAIRES ........ 34
    A.1 Co-presence Questionnaire ........ 34
    A.2 Presence Questionnaire ........ 37

LIST OF REFERENCES ........ 39
BIOGRAPHICAL SKETCH ........ 42


LIST OF FIGURES

Figure                                                                  page

1-1. Two actors rehearsing in a virtual environment ........ 2
3-1. A participant wearing the colored felt straps ........ 16
3-2. A participant testing out the system ........ 19
3-3. Sample screenshot demonstrating the virtual script system ........ 19
3-4. Data flow for both rendering systems ........ 20
3-5. Hardware setup for each location ........ 21
4-1. The location of each actor on the University of Florida campus ........ 23
4-2. Results of the co-presence questionnaire administered during the first study ........ 28
4-3. Results of the presence and co-presence questionnaires, second study ........ 28
4-4. Results of the presence and co-presence questionnaires, third study ........ 29
4-5. Comparison between question averages for the presence questionnaire
4-6. Comparison between question averages for the co-presence questionnaire ........ 30


Abstract of Thesis Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Master of Science

DISTRIBUTED VIRTUAL REHEARSALS

By

George Mora

December 2004

Chair: Benjamin C. Lok
Major Department: Computer and Information Science and Engineering

Acting rehearsals with multiple actors are limited by many factors. Physical presence is the most obvious, especially in a conversation between two or more characters. Cost is an obstacle that primarily affects actors who are in different locations; it consists of travel and living expenses. Preparation time is another hindrance, especially for performances involving elaborate costumes and intricate makeup: many recent high-budget motion pictures require that key actors go through several hours of makeup application to complete their characters' look.

Virtual reality can bring actors together to rehearse their scene in a shared environment. Since virtual reality elicits emotions and a sense of perceived presence from its users, actors should be able to rehearse successfully in a virtual environment. This environment can range from an empty space to a fully realized set, depending on the director's imagination and the project's scope.


Each actor's movements will be tracked and applied to a digital character, creating a virtual representation of the character. The digital character will resemble the actor in full costume and makeup. In the virtual environment, each actor will see, in real time, the character being controlled by their acting partner.

The goal is to show that multiple actors can use a shared virtual environment as an effective acting rehearsal tool. This project will also demonstrate that actors can hone their skills from remote locations through virtual reality, and it will serve as a foundation for future applications that enhance the virtual acting paradigm.


CHAPTER 1
INTRODUCTION

1.1 Motivation

Acting rehearsal is the process by which actors refine their acting skills and practice scenes for future public performances. These rehearsals traditionally occur on a stage with the principal actors and the director physically present. Although costumes and makeup are not essential until the final few rehearsals (called dress rehearsals), a functional set is important for determining when and where to move (known as movement blocking).

There are several variations on the standard rehearsal. During the pre-production stage, a read-through (or reading) is scheduled to familiarize the actors with the script and each other. Typically, the actors are physically present in a conference room, although a reading can also be accomplished through a video or telephone conference. After the reading, a blocking rehearsal helps choreograph the actors' movements. Blocking rehearsals usually take place on a stage or practice set, since its dimensions affect the choreography of a production. Polishing and building rehearsals take up the majority of the total rehearsal time; during these rehearsals, actors perfect their performance and work out any major problems. The final rehearsals (dress and technical rehearsals) involve practicing the performance in full costume and makeup, with complete lighting, sound, and props, on a finished set.


Currently, a reading is the only rehearsal method that does not need an actor's physical presence. The reading requires neither costume and makeup nor movement on an assembled stage, so it could be performed over the telephone. One could argue that distributed rehearsals could easily be achieved through video conferencing. However, the cost and availability of a system that could deliver satisfying results in terms of video/audio quality, bandwidth, and robustness make video conferencing a poor choice for effective distributed rehearsals.

Allowing digital characters to represent actors in a shared immersive virtual environment increases the number of conditions under which an acting rehearsal can occur. Physical presence, preparation time, and cost would no longer limit rehearsals. This would allow multiple actors from anywhere in the world to meet and rehearse a scene before there are costumes or constructed sets.

Figure 1-1. Two actors rehearsing in a virtual environment. Actor 1 controls the movements of Character 1 (Morpheus), while Actor 2 controls the movements of Character 2 (Neo).

Allowing actors to meet in a virtual space has the added advantage of virtual reality interaction, which includes stereoscopic vision, gaze tracking, and easy prop and set maintenance. Stereoscopic vision allows the actor to see the acting partner, set, and props in three dimensions.


Gaze tracking changes the display based on the location of the actor's head and the direction he or she looks. Prop and set maintenance allows one to move, rotate, and replace any prop or piece of scenery.

Consideration must be given to acting theory, since the actor's expressions will be conveyed through the avatar. The form of expression this thesis focuses on is gestures. Kinesics encompasses all non-verbal forms of communication, including gestures, body language, and facial expressions. There are several categories of kinesics:

- Emblems: non-verbal messages with a verbal counterpart.
- Illustrators: gestures associated with verbal messages.
- Affective displays: gestures or expressions conveying emotion.
- Regulators: non-verbal signs that maintain the flow of a conversation.
- Adaptors: small changes in composure that subconsciously convey mood [1].

Communication on stage mimics communication in real life, so the relationship of kinesics to acting is direct. Actors must pay special attention to their movement, gestures, facial expressions, and body language in relation to what they are saying or doing. Ryan reaffirms the connection between kinesics and acting: "The informal spatial code relates to the movement of the body on stage including facial expression, gestures, formations of bodies (i.e. patterns, shapes), and sometimes includes moving set and pieces of furniture." Ryan also lists several properties of the use of kinesics on stage:

- Gestures can't stand alone.
- Gestures can't be separated from the general continuum of motion.
- Gesture is the primary mode of ostending (i.e. showing) the body on stage [2].


1.2 Challenges

Actors are accustomed to being physically present with other actors on real sets. For a virtual environment to be effective in improving acting skills, the actor must experience a strong sense of being in the same space as the other actor (referred to as a sense of co-presence). Challenges faced when trying to achieve this sense of co-presence include:

- Keeping the actor comfortable and as unaware as possible that their movements are being tracked.
- Ensuring high audio quality, to simulate hearing the other actor's voice in the same room.
- Placing the cameras, projector, and screen so that the actor has a clear view of the screen while still being accurately tracked.
- Providing realistic models and textures for the virtual environment.
- Having the character exhibit human-like behavior and expressions.
- Ensuring high-speed data transmission between systems.

1.3 Project Goals

This project seeks to enhance the fields of virtual environments research and acting theory in the following ways:

- Demonstrate that digital characters in virtual environments allow for effective distributed rehearsals.
- Provide a prototype system that allows actors to interact with each other virtually, in full costume and makeup, across long distances.

1.4 Organization of Thesis

This thesis is organized into the following sections:

- Introduction. Specifies the need for this research, the obstacles in completing the project, and the ultimate goals for this thesis.


- Previous Work. Describes the research and work that served as both inspiration and a foundation for this project.
- Application. Details the process of creating the application that demonstrates the ideas presented in this thesis.
- Results. Discusses the results of a survey of several actors who tested the system to rehearse a simple scene.
- Conclusion. Summarizes the results and lists future work and applications of the research.

1.5 Thesis Statement

Distributed Virtual Rehearsals can bring actors together from different locations to successfully rehearse a scene.

1.6 Approach

My approach is to build a system that will allow two or more actors to rehearse a scene in a virtual world. Digital characters that resemble the actors in full costume and makeup will represent them. Each actor's movements will be tracked and will directly drive the movements of the digital character, and each actor will see the digital characters controlled by the other actors in remote locations.

Since the data required for rendering the characters and props exists on the local machines, the only information that needs to be sent is the tracking data for each actor's movements. The tracking data for each body part is contained in a text message composed of several floating point numbers (three, when only position data is sent). This information can be transmitted efficiently, which reduces lag; a sketch of such a message format follows the equipment list below.

The system for this rehearsal setup comprises, per actor:

- A projector-based display system with a large projection screen.
- A well-lit room large enough for the actor to perform the scene.


- Two web cameras connected to two PCs.
- One rendering PC.
- A high-speed network connecting each system.
- Several different colored straps attached to the actor's body.
- A headset with a built-in microphone (wireless if extensive body movement is required).
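The thesis specifies only that tracking updates are plain-text messages of floating point numbers; everything else in the sketch below (the field names, the port, and the use of UDP) is an assumption added for illustration. It shows how small a per-body-part position message can be, which is why transmission stays cheap and lag stays low.

```python
import socket

# Hypothetical sketch of the tracking message described above: one short
# text message per tracked body part, carrying three floats (x, y, z).
# The body-part names, address, and port are illustrative assumptions.
PEER = ("127.0.0.1", 9000)  # placeholder for the remote actor's machine

def encode_part(name: str, x: float, y: float, z: float) -> bytes:
    """Serialize one body part's position as a plain-text message."""
    return f"{name} {x:.4f} {y:.4f} {z:.4f}".encode("ascii")

def send_tracking(sock: socket.socket, parts: dict) -> None:
    """Send each tracked part as its own small UDP datagram.

    UDP is assumed here because a lost sample is simply superseded by
    the next frame's data, which suits low-latency avatar updates."""
    for name, pos in parts.items():
        sock.sendto(encode_part(name, *pos), PEER)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_tracking(sock, {"HEAD": (0.12, 1.65, 0.30), "LEFT_LEG": (-0.21, 0.45, 0.10)})
```

Each message is only a few dozen bytes, so per-frame updates for every tracked strap amount to a trivial fraction of the high-speed link the setup assumes.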


CHAPTER 2
PREVIOUS WORK

2.1 Distributed Performance

Distributed performance refers to media-centered events and actions that affect each other yet occur in different locations. This could range from a simple telephone call to a complex massively multiplayer online role-playing game (MMORPG).

In the study "Acting in Virtual Reality," distributed performance brings several actors and directors together to rehearse a short play. Each participant interacts with the others through networked computers and his or her own non-immersive display. Semi-realistic avatars represent the actors, while the director is only heard over the sound system. The study proved successful in allowing actors to rehearse in a shared virtual environment: "A performance level was reached in the virtual rehearsal which formed the basis of a successful live performance, one that could not have been achieved by learning of lines or video conferencing" [3].

Networked virtual environments are also used in the Collaboration in Tele-Immersive Environments project at the University of London and the University of North Carolina at Chapel Hill. This project investigated whether virtual environments could facilitate completing a collaborative task. The task involved two people carrying a stretcher along a path and into a building. Their results indicated that realistic interaction in a virtual environment over a high-speed network, while possible, still suffers from tracking delays, packet losses, and difficulty sharing control of objects.


The data suggests that "in order to have a sense of being with another person, it is vital that the system works in the sense that people have an impression of being able to actually do what they wish to do" [4].

Dancing Beyond Boundaries used video conferencing over a high-speed network as a method of distributed performance. This piece used Access Grid technology and an Internet2 connection to allow dancers and musicians at four different locations across North and South America to interact with each other: "Thus the combination of multi-modal information from the four nodes created a virtual studio that could also be termed a distributed virtual environment, though perhaps not in the usual sense" [5].

An important aspect of distributed performance is the state of co-presence that is achieved. Co-presence is the sense of being in the same space with other people. Distributed collaboration has been shown to be successful when it achieves a high degree of co-presence. It has been shown that the level of immersion is related to a user's sense of presence and co-presence; the level of immersion also relates to leadership roles in group settings [6].

2.2 Virtual Reality

Frederick Brooks Jr. defines a virtual reality experience as "any in which the user is effectively immersed in a responsive virtual world. This implies user dynamic control of viewpoint." Effective immersion has been achieved through the use of novel display and interaction systems [7].

Immersive displays create a sense of presence through multi-sensory stimulation.


Several of the goals which inspired the creation of the CAVE system can be applied to most other immersive display systems:

* The desire for higher-resolution color images and a large field of view without geometric distortion.
* The ability to mix VR imagery with real devices (like one's hand, for instance).
* The opportunity to use virtual environments to guide and teach others [8].

Effective interaction is as important as a novel display system in creating an immersive virtual environment. Successful interaction involves allowing the user to control the view of the environment or objects inside the environment. The degree to which presence is experienced depends on how well the interface imitates real world interaction. A defining feature of virtual reality (VR) is the ability to manipulate virtual objects interactively, rather than simply viewing a passive environment [9].

Motion tracking provides a realistic means of interacting with a virtual environment. It can adjust the view of the environment, manipulate objects in the environment, and trigger visual and aural cues based on gaze and gesture recognition. Motion tracking is often used in head-mounted display systems, where the user's position and orientation affect what the user sees and hears in the environment. "Although stereo presentation is important to the three-dimensional illusion, it is less important than the change that takes place in the image when the observer moves his head. The image presented by the three-dimensional display must change in exactly the way that the image of a real object would change for similar motions of the user's head" [10].


There are many commercial motion tracking devices:

* Polhemus FASTRAK: uses several electromagnetic coils per tracker to transmit position and orientation information to a receiver.
* InterSense InertiaCube: a small orientation tracker that provides 3 degrees of freedom (yaw, pitch, and roll) and allows for full 360-degree rotation about each axis.
* HiBall-3100: a wide-area position and orientation tracker that uses hundreds of infrared beacons on the ceiling and 6 lenses on the HiBall Sensor to track the user.

These commercial solutions are highly accurate and produce very low latency. However, they are expensive and oftentimes encumber the user [11, 12, 13].

Motion tracking has been applied to avatar movement by tracking colored straps. In Straps: A Simple Method for Placing Dynamic Avatars in an Immersive Virtual Environment, colored straps are attached to the user's legs to accurately represent their movement through an avatar in the virtual environment. The straps system has two major advantages over other tracking systems:

* Freedom: there are no encumbering cables, which reduces system complexity, accident hazards, and equipment failure.
* Simplicity: the colored straps are cheap, easy to create, and contain no moving parts or electronics [14].

2.3 Digital Characters and Avatars

Inserting digital characters into virtual environments can make the experience much more realistic to the user. The users' more natural perception of each other (and of autonomous actors) increases their sense of being together, and thus the overall sense of shared presence in the environment [15].


An avatar is the representation of oneself in a virtual environment. In an ideally tracked environment, the avatar would follow the user's movements exactly. Slater and Usoh discuss the influence of virtual bodies on their human counterpart: "The essence of Virtual Reality is that we (individual, group, simultaneously, asynchronously) are transported bodily to a computer generated environment. We recognize habitation of others through the representation of their own bodies. This way of thinking can result in quite revolutionary forms of virtual communication" [16].

Digital character realism affects the amount of immersion experienced by the user in a virtual environment. This realism is manifested visually, behaviorally, and audibly. A break in presence (losing the feeling of presence) can occur if someone in a virtual environment is distracted by a lack of realism in an avatar. In The Impact of Avatar Realism on Perceived Quality of Communication in a Shared Immersive Virtual Environment, avatar realism and a conversation-sensitive eye motion model are tested to determine their effect on presence: "We conclude that independent of head-tracking, inferred eye animations can have a significant positive effect on participants' responses to an immersive interaction. The caveat is that they must have a certain degree of visual realism" [17].

Even without realistic avatars, users can still be greatly affected by other users' digital characters, as well as their own avatar. Establishing a sense of presence increases the chances of a participant becoming attached to avatars in the virtual space. Emotional attachment to avatars was a surprising result of the study Small Group Behavior in a Virtual and Real Environment: A Comparative Study. Although, except by inference, the individuals were not aware of the appearance of their own body, they seemed to generally respect the avatars of others, trying to avoid passing through them, and sometimes apologizing when they did so. The avatars used in the study were simple models associated with a unique color [18].


Digital characters have been successfully integrated into real environments using computer vision and camera tracking techniques. These characters are used partly as virtual teachers that train factory workers to operate some of the machinery. The virtual humans pass on knowledge to participants using an augmented reality system. Although the characters are automated, a training specialist can control them from a different location via a networked application [19].

Digital character realism has been integrated into the character rendering system created by Haptek. This system can integrate multiple highly realistic and customizable characters into a virtual environment. These characters also act realistically (i.e., blinking, looking around, and shifting weight). The Haptek system allows these characters to be used as avatars, or as autonomous virtual humans [20].

There are many other commercially available character-rendering systems:

* UGS Corp. Jack: usability, performance, and comfort evaluation using digital characters that are incorporated into virtual environments.
* Boston Dynamics DI-Guy: real-time human simulation used in military simulations created by all branches of the United States Armed Forces.
* VQ Interactive BOTizen: online customer support conducted by digital characters. Characters respond to queries using a text-to-speech engine [21, 22, 23].


CHAPTER 3
APPLICATION

3.1 Scene Design and Experience Development

The first step in creating the distributed rehearsal system was to choose a sample scene. This scene would determine character design, set design, and the dialogue. Several factors influenced the decision:

* Familiarity: Since the actors would be testing the system without reading the script beforehand, the scene needed to be immediately accessible to most actors.
* Ease of tracking: The system is a prototype; therefore extensive tracking would be beyond its scope. Additionally, the acting focuses on kinesics, so gesture tracking is the only requirement. The scene would involve characters that stay relatively still, acting primarily with gestures.
* Interesting set and characters: Presence is one of the main factors measured when evaluating virtual environment systems. Incorporating stimulating digital characters and sets into the environment can achieve presence.

Several scenes were evaluated using the above criteria. The scenes included the balcony scene from Romeo and Juliet, the heads and tails scene from Rosencrantz and Guildenstern are Dead, and the red pill, blue pill scene from The Matrix. The red pill, blue pill scene was chosen because it is a very familiar scene that few actors would have previously rehearsed.

Once the scene was selected, the characters and set needed to be constructed. The modeling software 3D Studio Max was used to create the set. The set consisted of a dark room with a fireplace, two red chairs, and a small white table.


The two main characters, Neo and Morpheus, were created using the base Haptek male character model and adjusting the texture accordingly. The characters and environments, after being fully modeled and textured, were then exported into a format that could be incorporated into a graphics engine. The 3DS file format was chosen and then read into an OpenGL application along with each character. Lighting was set up to reflect the locations of the physical lights in the scene.

It was important to be able to manipulate the character's skeleton in real-time. Therefore, each joint in the character needed to be explicitly controlled. Haptek uses a separate joint for each degree of freedom that exists in a joint. The shoulders, elbows, and neck each have three joints. For simplicity, two joints were used for the neck, two for each shoulder, and one for each elbow (a minimal sketch of this reduced joint set closes this section).

Special attention went toward developing aspects of the system that would enhance the user's experience. The actor sat on a cushioned chair in front of a large surface onto which the scene is projected. The setup was designed to physically immerse the actors in an environment similar to the one used in the designated scene. Each rendering system had the option to allow each actor to use his/her head movement to affect his/her camera's viewpoint. This simulates an actor looking through a large hole in the wall at the other actor: if the actors tilt their heads to the side, their viewpoints rotate slightly and allow them to see more of the room on the other side of the wall. The experience began with a clip from The Matrix that leads into the scene. These were efforts to increase the sense of presence each actor experiences.
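As an illustration of this reduced joint set, the following sketch shows one plausible pose structure holding the angles the system animates. The names and units are assumptions for illustration; this is not the Haptek API itself.

    // Illustrative sketch of the reduced joint set described above: two
    // angles for the neck, two per shoulder, and one per elbow.
    struct CharacterPose {
        // Neck: two degrees of freedom (e.g., pitch and yaw), in degrees.
        float neckPitch, neckYaw;
        // Each shoulder: vertical swing and horizontal turn, in degrees.
        float leftShoulderSwing, leftShoulderTurn;
        float rightShoulderSwing, rightShoulderTurn;
        // Each elbow: a single bend angle, in degrees.
        float leftElbowBend, rightElbowBend;
        // Whether this actor currently holds the speaking state.
        bool speaking;
    };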


3.2 Tracking the Actors

Setting up the tracking system required two cameras, two PCs (one for each camera), colored paper or cloth for the straps, and sufficient USB or Firewire cable to accommodate the setup. The tracking system worked under different lighting conditions provided adequate training was performed on each camera. Training consisted of acquiring many pictures of each strap and the background and determining the range of the colors that belong to each strap. An application provided with the tracking system accomplished most of this process.

Two sets of straps were created for the system. The first set of straps consisted of colored pieces of paper fastened with tape. These straps were used in the first study, and the participants suggested using a different material because the paper was uncomfortable and sometimes distracting. The second set was constructed with colored pieces of felt to increase the comfort level of each participant. These straps were fastened with small strips of Velcro. Figure 3-1 shows the second set of straps attached to a participant.

The tracking system on each PC then transmitted the two-dimensional coordinates of each strap to a client computer. Tsai's camera calibration algorithm is used to calibrate each camera and recover extrinsic and intrinsic camera parameters [24]. Calibration was achieved by explicitly mapping two-dimensional sample picture coordinates to their three-dimensional equivalents. This provided a configuration file for use in the client computer.
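The following is a hypothetical sketch of the kind of 2D-to-3D correspondence set such a calibration step consumes; Tsai's method fits the camera parameters to samples like these. The struct layout, names, and values are assumptions for illustration.

    // Hypothetical example of the explicit 2D-to-3D correspondences used
    // to calibrate each camera with Tsai's algorithm, as described above.
    #include <vector>

    struct CalibrationPoint {
        float u, v;        // 2D picture coordinates (pixels)
        float x, y, z;     // known 3D world coordinates of the same point
    };

    // A calibration set: several sample points spread through the acting
    // volume. Tsai's method fits intrinsic and extrinsic parameters to
    // these correspondences; the fitted parameters are then written to
    // the configuration file read by the client computer.
    std::vector<CalibrationPoint> samples = {
        {312.0f, 240.5f,  0.0f, 0.0f, 0.0f},
        {415.2f, 238.9f, 10.0f, 0.0f, 0.0f},
        // ... more measured correspondences ...
    };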


Once the rendering system was receiving correct tracking values from the straps system, these values needed to be appropriately mapped to the digital character's movements. The system first saved the fixed locations of each shoulder. Then, after instructing the user to place their hands directly in front of them with their elbows fully extended, the system determined their arm length and head height (distance from their neck to their forehead). The system used the actor's arm length, head height, and shoulder width constants to appropriately displace the digital character's hands and head.

Figure 3-1. A participant wearing the colored felt straps.

Forward shoulder joint animation (vertical movement) was accomplished by determining the angle of displacement that a line passing from the shoulder to the hand would create from a central horizontal line. The distance from the shoulder to the hand determines the amount of elbow bend that is required. For instance, if the hand is arm's length away from the shoulder, the elbow wouldn't be bent at all. Conversely, if the hand was located adjacent to the shoulder, the elbow would be fully bent. Finally, shoulder turn (horizontal) was calculated by determining the angle of displacement the hand would make from a central vertical line. A minimal sketch of this mapping follows.
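The sketch below assumes the linear elbow-bend model implied above (no bend at full arm's length, full bend at zero distance); the coordinate conventions and names are illustrative assumptions rather than the system's actual code.

    // A minimal sketch of the strap-to-joint mapping described above.
    #include <cmath>

    struct Vec3 { float x, y, z; };

    // Vertical shoulder angle (degrees) from the central horizontal line.
    float shoulderSwing(const Vec3& shoulder, const Vec3& hand) {
        float dy = hand.y - shoulder.y;                     // vertical offset
        float dh = std::hypot(hand.x - shoulder.x, hand.z - shoulder.z);
        return std::atan2(dy, dh) * 180.0f / 3.14159265f;   // radians -> degrees
    }

    // Horizontal shoulder turn (degrees) from the central vertical line.
    float shoulderTurn(const Vec3& shoulder, const Vec3& hand) {
        return std::atan2(hand.x - shoulder.x, hand.z - shoulder.z)
               * 180.0f / 3.14159265f;
    }

    // Elbow bend (degrees): 0 at full arm's length, 180 at zero distance.
    float elbowBend(const Vec3& shoulder, const Vec3& hand, float armLength) {
        float dx = hand.x - shoulder.x;
        float dy = hand.y - shoulder.y;
        float dz = hand.z - shoulder.z;
        float reach = std::sqrt(dx * dx + dy * dy + dz * dz) / armLength;
        if (reach > 1.0f) reach = 1.0f;                     // clamp noisy input
        return (1.0f - reach) * 180.0f;
    }

Clamping the reach ratio guards against tracking noise that would otherwise report a hand farther away than the measured arm length.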


3.3 Putting It All Together

VRPN was used to connect and transfer text messages between the tracking system and the rendering system, as well as between each rendering system. The tracking system sends a text message containing the two-dimensional coordinates for each color detected along with the width and height of the image in pixels. The rendering system receives these values and, combined with the values from the second tracking system, uses Tsai's algorithm for recovering the three-dimensional coordinates [24].

Once calibration has finished and the actors are accustomed to using the system, the tracked data is shared between rendering systems via VRPN text messages. The text message contains the two angles for each shoulder and for the neck, the bend angle for the elbow, and the speaking state. The speaking state determines which actor is currently speaking; this is used with the lip-syncing and virtual script systems. A hedged sketch of such a message is shown below.
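The following sketch shows one plausible way to serialize and parse such a pose message, reusing the CharacterPose structure sketched in Section 3.1. The field order and formatting are assumptions, not the system's actual message layout.

    // Hedged sketch of the pose message exchanged between rendering
    // systems: two neck angles, two angles per shoulder, one elbow bend
    // per arm, and the speaking state, as a single text string (the form
    // VRPN text messages carry).
    #include <cstdio>
    #include <string>

    std::string packPose(const CharacterPose& p) {
        char buf[192];
        std::snprintf(buf, sizeof(buf),
                      "%.2f %.2f %.2f %.2f %.2f %.2f %.2f %.2f %d",
                      p.neckPitch, p.neckYaw,
                      p.leftShoulderSwing, p.leftShoulderTurn,
                      p.rightShoulderSwing, p.rightShoulderTurn,
                      p.leftElbowBend, p.rightElbowBend,
                      p.speaking ? 1 : 0);
        return std::string(buf);
    }

    bool unpackPose(const std::string& msg, CharacterPose* p) {
        int speaking = 0;
        int n = std::sscanf(msg.c_str(),
                            "%f %f %f %f %f %f %f %f %d",
                            &p->neckPitch, &p->neckYaw,
                            &p->leftShoulderSwing, &p->leftShoulderTurn,
                            &p->rightShoulderSwing, &p->rightShoulderTurn,
                            &p->leftElbowBend, &p->rightElbowBend, &speaking);
        p->speaking = (speaking != 0);
        return n == 9;   // all nine fields must parse
    }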


Voice acting is an important aspect of rehearsal. Therefore it was necessary to implement a system that allowed the actors to transmit their voice to their partner. Headsets with built-in microphones were used. The headsets had a behind-the-neck design so they would not interfere with the forehead strap. Voice was transmitted using DirectPlay.

Instead of using a traditional physical script, a virtual script system allowed the actors to read their lines without having to look away from the display. This system displayed the actor's current line on the bottom of the screen when their speaking state is true. Incorporating the virtual script system introduced the problem of determining when to proceed with the next line. Originally, the actor would use their foot to press a key that would trigger the speak state to false and send a message to the remote rendering system to change its speak state to true. However, this hindered presence and lowered the comfort level. It was decided to have the system operator, who calibrated the system and trained the actor to use the system, manually switch the speak state to false when a line was finished.

3.4 Final Software and Hardware Setup

The final hardware setup used to test the system was composed of the following (for each location):

* 3 Dell PCs
* 2 OrangeMicro USB 2.0 web-cameras
* Sony projector
* Colored felt straps with Velcro attachments
* GE stereo PC headset
* Cushioned chair

The participant sat facing a large projection screen. The Sony projector was placed under each participant's seat. The two web-cameras were each attached to a Dell PC running the Straps software. These PCs had the VRPN and Imaging Control libraries installed. The rendering PC connected to the projector ran the rehearsal software. This PC had the VRPN and Haptek libraries installed. Figure 3-5 shows a diagram of the final hardware setup used for each study.


Figure 3-2. A participant testing out the system.

Figure 3-3. Sample screenshot demonstrating the virtual script system.


Figure 3-4. Data flow for both rendering systems.


Figure 3-5. Hardware setup for each location.


CHAPTER 4
RESULTS

4.1 Description of Studies

The system was evaluated using three studies, each with two actors. The first study was conducted before the system was fully operational. The second and third studies were conducted using the complete system. The aspects of the system that weren't incorporated into the first study included the introductory movie, head-controlled viewpoint, and accurate hand tracking.

The participants (4 females and 2 males) ranged in age from 18 to 20. They had significant acting experience. Before each study, each participant was given a small tutorial on how the system worked, a brief overview of its limitations, and some time to see his/her character being manipulated by his/her movements. The participants then watched the introductory movie and rehearsed the scene provided for them. When the scripted lines ended, the participants were given time to ad-lib. Each study concluded by having participants fill out a presence and co-presence questionnaire.

All three sets of participants were given the co-presence questionnaire used in Collaboration in Tele-Immersive Environments. This questionnaire gauges the degree to which each participant felt they were present in the virtual environment with the other participant. The last two sets of participants were also given the Slater, Usoh and Steed (SUS) Presence questionnaire. The SUS questionnaire is commonly used to assess virtual environment immersion.


Along with each questionnaire, participants were asked to specify their level of computer literacy and their level of experience with virtual reality. The Appendix contains both questionnaires. An informal verbal debriefing followed the questionnaires.

Figure 4-1. The location of each actor on the University of Florida campus.

4.2 Reaction from Actors

The participants from the first study appeared to be initially frustrated with the inaccurate hand tracking, although with some practice they compensated for it. One participant used exaggerated gestures to counteract the limited forward/backward character movement. During the ad-lib portion the participants spontaneously began a slapping duel with each other that consisted of one person moving their arm in a slapping motion and the other moving their head when hit, and vice versa.

The second and third set of study participants quickly became adept at using the system. They seemed very comfortable working through the scene despite having little or no virtual reality experience.


The introductory movie did not appear to significantly affect the participants' experience. The ad-lib session flowed seamlessly from the scripted section. The participants seemed to be highly engrossed in the experience, evidenced by the fact that all four participants prolonged the ad-lib session for more than 5 minutes.

4.3 Results

The results from the questionnaires and the debriefing can be organized into the following three categories:

* Virtual reality can be used to bring actors together for a successful rehearsal.
* Lack of presence distracted the actors.
* Improvements should be made to the system.

4.3.1 Virtual Reality Can Be Used to Bring Actors Together for a Successful Rehearsal

The results of the questionnaires proved that the study was effective in achieving successful rehearsals. The participants on average felt a stronger sense of co-presence than standard presence. This is understandable considering the participants had limited control over their own environment while still having significant interaction with their partner. The average responses to the co-presence questionnaire were low for the first study (only 26% of the responses were above 4.5) yet moderately high for the second and third studies (60% and 66% of the responses were above 4.5, respectively). There was an average increase of 0.81 in the responses from the first study to the second and third. This demonstrates that the increased interactivity included in the system for the second and third studies positively influenced each actor's experience. The high responses for the second and third studies also indicate that the participants felt that they could effectively communicate both verbally and gesturally.


The following responses to the debriefing session reaffirm these findings:

"Ultimately, I had fun. There were a few synch issues but we found out ways to interact with the limited tools at our disposal. I felt very connected to the other person and I felt that the acting experience was very worthwhile. I actually felt like if we were rehearsing a scene we would have gotten someplace with our exercise."

"It was very easy to feel like you're with another person in the space. One, because you were talking to them. And two, because you're always conscious of the fact that they're going to be looking at what you're at least partially doing."

"I started to think of the images on the screen as being another person in the room with me; it very much seemed like a human figure and I felt as though I were in their presence."

Several items on the co-presence questionnaire generated interesting results. Question 4, which asked, "To what extent did you feel embarrassed with respect to what you believed the other person might be thinking of you?" generated an average score of 1.25 (on a scale of 1 [not at all] to 7 [a great deal]). Questions 6 and 7, which determined the degree to which each participant felt their progress was hindered by their partner and vice versa, generated average scores of 1.5 and 1.75, respectively. These low results are likely a result of the participants having previously worked with each other. This co-presence questionnaire uses the participants' unfamiliarity with their partner to gauge co-presence by showing the existence of social phenomena such as shyness and awkward interaction with the aforementioned questions. Thus, participants familiar with each other, or those who have acted together before, would probably get low scores on those questions.

Question 14, which measured the degree to which each participant had a sense that there was another human being interacting with them (as opposed to just a machine), generated an average score of 6.


This score further supports the system's effectiveness. Question 15, which determined how happy the participant thought their partner felt, generated an average score of 7. This question assumes that the participants are strangers (similar to questions 4, 6, and 7). All of the participants showed obvious signs of enjoying the experience, as evidenced by the average score of 7 (the maximum score) for this question. Figures 4-2 to 4-6 detail the results of each questionnaire arranged by study.

4.3.2 Lack of Presence Distracted the Actors

The results of the presence questionnaire that was given to the second and third study participants were average. Typical presence scoring involves adding a point for each response of 6 or 7 (a minimal sketch of this scoring rule follows this passage); however, that would give only one participant (ID number 3) a score above 0. According to Figure 4-5, the average of the responses for both studies also generates a score of 0. Since the average responses were all between 3 and 5, it can be said that the participants were only moderately engrossed in the environment.
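For concreteness, here is a minimal sketch of that scoring rule, assuming the six SUS responses are stored in a fixed-size array; the layout is an assumption for illustration.

    // A minimal sketch of the presence scoring rule described above:
    // one point per response of 6 or 7 on the six SUS questions.
    #include <array>

    int susPresenceScore(const std::array<int, 6>& responses) {
        int score = 0;
        for (int r : responses) {
            if (r >= 6) ++score;   // count only high (6 or 7) responses
        }
        return score;
    }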


This moderate engrossment affected the experience by distracting the participants. Several participants mentioned the experience would have been enhanced if they could see a representation of their hands on the screen. Had each participant's sense of presence been higher, they might have accepted the reality of acting with the character on the screen, as opposed to feeling that they were physically controlling a character that is acting with the character on the screen. The following responses from the debriefing session were the basis for these conclusions:

"It's kind of like a first-person shooter sort of game where you don't really see any of yourself; you just see what's going on. It's a little bit disorienting."

"I would've really liked to see my character's hands on the screen so I know what they're doing."

"It was kind of skeletal but the way it works right now is really good for where it is."

"There was a little sense that you were really there (in the virtual environment) like when you move your head and the camera pans back and forth."

4.3.3 Improvements Should Be Made to the System

The participants suggested a number of areas for improvement. Nearly all suggested that more body parts be tracked and that interactive facial expressions be added. One participant from the first study suggested abandoning the gesture tracking for a system that would aid only in the blocking of a scene. The following are the debriefing responses that dealt with system improvements and the overall idea of the system:

"For practice out of rehearsal this could work. It all depends on the level of sophistication. It needs to incorporate more color straps to include the whole body and hopefully, facial expressions."

"I like the idea of the opposite image being that of the character instead of the other actor. I would add lots more tracking spots to allow for full body and maybe facial movements."

"There's a lot more that goes into acting than just moving your arms. To make it more of an acting experience there would have to be more mobility and expression."


FIRST STUDY: Co-presence Questionnaire, Part 1

ID Number  Literacy  Experience  Q1   Q2   Q3   Q4   Q5   Q6   Q7
1          3         1           4    5    4    1    4    2    2
2          4         1           3    3    3    1    2    1    1
Average:                         3.5  4    3.5  1    3    1.5  1.5

FIRST STUDY: Co-presence Questionnaire, Part 2

ID Number  Q8   Q9   Q10  Q11  Q12  Q13  Q14  Q15
1          2    5    1    3    1    5    6    7
2          4    5    4    3    1    6    5    7
Average:   3    5    2.5  3    1    5.5  5.5  7

Figure 4-2. Results of the co-presence questionnaire administered during the first study.

SECOND STUDY: Presence Questionnaire

ID Number  Q1   Q2   Q3   Q4   Q5   Q6
3          6    4    2    1    6    6
4          5    4    5    5    4    5
Average:   5.5  4    3.5  3    5    5.5

SECOND STUDY: Co-presence Questionnaire, Part 1

ID Number  Literacy  Experience  Q1   Q2   Q3   Q4   Q5   Q6   Q7
3          7         2           6    6    5    2    6    1    2
4          4         1           5    5    5    1    5    3    3
Average:                         5.5  5.5  5    1.5  5.5  2    2.5

SECOND STUDY: Co-presence Questionnaire, Part 2

ID Number  Q8   Q9   Q10  Q11  Q12  Q13  Q14  Q15
3          5    3    2    3    2    7    7    7
4          7    6    5    6    3    4    5    7
Average:   6    4.5  3.5  4.5  2.5  5.5  6    7

Figure 4-3. Results of the presence and co-presence questionnaires administered during the second study.


THIRD STUDY: Presence Questionnaire

ID Number  Q1   Q2   Q3   Q4   Q5   Q6
5          4    3    5    4    4    5
6          5    4    3    4    3    5
Average:   4.5  3.5  4    4    3.5  5

THIRD STUDY: Co-presence Questionnaire, Part 1

ID Number  Literacy  Experience  Q1   Q2   Q3   Q4   Q5   Q6   Q7
5          6         1           4    4    6    1    3    1    1
6          7         2           5    5    5    1    4    1    1
Average:                         4.5  4.5  5.5  1    3.5  1    1

THIRD STUDY: Co-presence Questionnaire, Part 2

ID Number  Q8   Q9   Q10  Q11  Q12  Q13  Q14  Q15
5          5    3    3    4    2    4    6    7
6          5    6    5    5    6    5    6    7
Average:   5    4.5  4    4.5  4    4.5  6    7

Figure 4-4. Results of the presence and co-presence questionnaires administered during the third study.

Presence Questionnaire Summary

Question  2nd Study Average  3rd Study Average  Total Average
1         5.5                4.5                5
2         4                  3.5                3.75
3         3.5                4                  3.75
4         3                  4                  3.5
5         5                  3.5                4.25
6         5.5                5                  5.25

Figure 4-5. Comparison between question averages for the presence questionnaire.


Co-presence Questionnaire Summary

Question  1st Study Average  2nd Study Average  3rd Study Average  Total Average (2nd & 3rd Studies)
1         3.5                5.5                4.5                5
2         4                  5.5                4.5                5
3         3.5                5                  5.5                5.25
4         1                  1.5                1                  1.25
5         3                  5.5                3.5                4.5
6         1.5                2                  1                  1.5
7         1.5                2.5                1                  1.75
8         3                  6                  5                  5.5
9         5                  4.5                4.5                4.5
10        2.5                3.5                4                  3.75
11        3                  4.5                4.5                4.5
12        1                  2.5                4                  3.25
13        5.5                5.5                4.5                5
14        5.5                6                  6                  6
15        7                  7                  7                  7

Figure 4-6. Comparison between question averages for the co-presence questionnaire showing improvement from the first study to the second and third studies.


CHAPTER 5
CONCLUSION

5.1 Usefulness to the Acting Community

It has been shown that virtual environments allow multiple actors to successfully rehearse scenes without the need to be in makeup or costume. The true usefulness of this system to the acting community lies in the fact that it can bring actors together from two remote locations for an engaging acting experience. A fully developed virtual rehearsal system could save actors a significant amount of time and money. The system, however, is far from being fully developed.

5.2 Future Work

The distributed virtual rehearsal system has many areas that can be improved. The depth of an actor's experience in a virtual rehearsal is greatly affected by how realistic their interaction is. Realistic interaction is achieved by making the digital character's movements as life-like as possible.

One main complaint from the study participants was that the character they were facing lacked expression. Implementing interactive facial expressions would be costly but would dramatically increase the realism of the experience. In Acting in Virtual Reality, simple mouse strokes were used to change the character's expression [3]; however, that solution isn't plausible if the actor is to remain wireless (as they are in the virtual rehearsal system). Another solution would be to incorporate a third web-camera into the system that would provide images of the actor's face to a PC that could detect changes in facial expressions.


The third and easiest solution would be to give the system operator control over the character's facial expressions. The drawbacks to this solution are operator subjectivity and that the operator would have to be within visual range of the actor.

The other main complaint from the study participants was the limited number of tracked body parts. More tracked areas would have increased realism, although only 3 tracked areas were needed for the scene. The shoulder straps were used during system calibration but weren't actively tracked during the rehearsal. Adding shoulder tracking could have allowed for torso manipulation, which would have been especially useful when the actors wanted to lean forward.

Orientation tracking, while not specifically mentioned by the study participants, would have greatly affected character realism. This would allow the characters to look left and right as well as rotate their hands. Head orientation tracking could be approximated by using two colored straps and determining the direction of the vector that passes through both straps (a small sketch of this idea closes this section). Hand orientation would be much more difficult since there are several axes of rotation.

Automated accurate lip-synching is another aspect that would have a significant effect on the user's sense of presence. For this to work, the actor's audio stream would need to be analyzed in real-time. This would be difficult to implement and computationally expensive.

The ideal system would not only track gestures and facial expressions, but allow the actor to move freely around the stage. This could be achieved using a modified CAVE system or a head-mounted display.
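A small sketch of the suggested two-strap approximation follows, assuming one strap at the back of the head and one at the forehead; the vector between them yields a yaw and pitch. All names and conventions are illustrative assumptions.

    // Hedged sketch of approximating head orientation from two tracked
    // straps: the vector through both strap positions gives a facing
    // direction, from which yaw and pitch can be derived.
    #include <cmath>

    struct Point3 { float x, y, z; };

    // Yaw and pitch (degrees) of the vector from strap A to strap B.
    void headOrientation(const Point3& a, const Point3& b,
                         float* yawDeg, float* pitchDeg) {
        float dx = b.x - a.x, dy = b.y - a.y, dz = b.z - a.z;
        *yawDeg   = std::atan2(dx, dz) * 180.0f / 3.14159265f;
        *pitchDeg = std::atan2(dy, std::hypot(dx, dz)) * 180.0f / 3.14159265f;
    }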


5.3 Future Applications

Motion capture systems are typically used to capture an actor's movements and later add them to a digital character. Virtual rehearsals could be modified to record the actors' movements as they rehearse their scene, essentially becoming a real-time motion capture system (a speculative sketch of this idea closes this section). The recorded movements could then be played back for the actor to review, or they could be sent directly to an animation package for the purpose of rendering a digitally animated movie.

A virtual film director system could also be added to the virtual rehearsal system. The virtual director could plan out camera angles, arrange and modify props, start and stop the action, and direct the actors' movements. The director could be represented by a digital character or simply watch the action from a monitor, speaking through a virtual speaker.

Distributed virtual performances are another plausible extension of the virtual rehearsal. This would introduce audience systems into the distributed virtual rehearsal paradigm. While several actors perform the scene from separate locations, an audience can watch the action unfold from a third-person point of view. Allowing a director to control the camera angles would further enhance the experience by providing the audience with cinematic visuals.
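The following speculative sketch shows what such a recorder might look like, assuming the nine pose values described in Chapter 3 are captured each update; the names and layout are illustrative assumptions.

    // Speculative sketch of the motion recording idea described above:
    // timestamped pose frames captured during rehearsal, which could be
    // replayed later or exported to an animation package.
    #include <vector>

    struct PoseFrame {
        double timeSeconds;   // capture time relative to scene start
        float  joints[9];     // the nine pose values sent each update
    };

    class RehearsalRecorder {
    public:
        void record(double t, const float (&joints)[9]) {
            PoseFrame f;
            f.timeSeconds = t;
            for (int i = 0; i < 9; ++i) f.joints[i] = joints[i];
            frames_.push_back(f);
        }
        const std::vector<PoseFrame>& frames() const { return frames_; }
    private:
        std::vector<PoseFrame> frames_;   // full take, in capture order
    };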


APPENDIX
STUDY QUESTIONNAIRES

A.1 Co-presence Questionnaire

Part A: Personal Information

Your Given ID number
Your Age
Your Gender: Male / Female
Occupational status: Undergraduate Student / Masters Student / PhD Student / Research Assistant/Fellow / Staff (systems, technical) / Faculty / Administrative Staff / Other

Please state your level of computer literacy on a scale of 1 to 7.
(never used before) 1 2 3 4 5 6 7 (a great deal)

Have you ever experienced virtual reality before?
(never used before) 1 2 3 4 5 6 7 (a great deal)

Part B: Virtual Reality Experience

Please give your assessment as to how well you contributed to the successful performance of the task.
My contribution to the successful performance of the task was
(not at all) 1 2 3 4 5 6 7 (a great deal)

Please give your assessment as to how well the other person contributed to the successful performance of the task.
The other person's contribution to the task was
(not at all) 1 2 3 4 5 6 7 (a great deal)


To what extent were you and the other person in harmony during the course of the experience?
We were in harmony
(not at all) 1 2 3 4 5 6 7 (a great deal)

To what extent did you feel embarrassed with respect to what you believed the other person might be thinking about you?
I felt embarrassed
(not at all) 1 2 3 4 5 6 7 (a great deal)

Think about a previous time when you co-operatively worked together with another person in order to achieve something similar to what you were trying to achieve here. To what extent was your experience in working with the other person on this task today like the real experience, with regard to your sense of doing something together?
This was like working together with another person in the real world
(not at all) 1 2 3 4 5 6 7 (a great deal)

To what extent, if at all, did the other person hinder you from carrying out the task?
The other person hindered me from carrying out this task
(not at all) 1 2 3 4 5 6 7 (a great deal)

To what extent, if at all, did you hinder the other person from carrying out the task?
I hindered the other person from carrying out this task
(not at all) 1 2 3 4 5 6 7 (a great deal)

Part C: Virtual Reality Experience Continued

Please give your assessment of how well you and the other person together performed the task.
We performed the task successfully
(not at all) 1 2 3 4 5 6 7 (a great deal)


To what extent, if at all, did you have a sense of being with the other person?
I had a sense of being with the other person
(not at all) 1 2 3 4 5 6 7 (a great deal)

To what extent were there times, if at all, during which the computer interface seemed to vanish, and you were directly working with the other person?
There were times during which I had a sense of working with the other person
(not at all) 1 2 3 4 5 6 7 (a great deal)

When you think back about your experience, do you remember this as more like just interacting with a computer or working with another person?
The experience seems to me more like interacting with a person
(not at all) 1 2 3 4 5 6 7 (a great deal)

To what extent did you forget about the other person, and concentrate only on doing the task as if you were the only one involved?
I forgot about the other person
(not at all) 1 2 3 4 5 6 7 (a great deal)

During the time of the experience did you think to yourself that you were just manipulating some screen images with a mouse-like device, or did you have a sense of being with another person?
During the experience I often thought that I was really manipulating some screen images
(not at all) 1 2 3 4 5 6 7 (a great deal)

Overall, rate the degree to which you had a sense that there was another human being interacting with you, rather than just a machine.
My sense of there being another person was
(not at all) 1 2 3 4 5 6 7 (a great deal)


If you had a chance, would you like to meet the other person?
(not at all) 1 2 3 4 5 6 7 (a great deal)

Assess the mood of the other person on the basis of very depressed to very happy.
The mood of the other person seemed to be happy
(not at all) 1 2 3 4 5 6 7 (a great deal)

Please write any additional comments here. Things you could consider are: things that hindered you or the other person from carrying out the task; what you think of the person you worked with; any other comments about the experience and your sense of being there with another person; and what things made you pull out and become more aware of the computer.

A.2 Presence Questionnaire

1. Please rate your sense of being in the environment, on the following scale from 1 to 7, where 7 represents your normal experience of being in a place.
I had a sense of being there in the environment
(not at all) 1 2 3 4 5 6 7 (a great deal)

2. To what extent were there times during the experience when the environment was the reality for you?
There were times during the experience when the environment was the reality for me
(at no time) 1 2 3 4 5 6 7 (almost all the time)

3. When you think back about your experience, do you think of the environment more as images that you saw, or more as somewhere that you visited?
The environment seems to me to be more like
(images that I saw) 1 2 3 4 5 6 7 (somewhere I visited)


4. During the time of the experience, which was the strongest on the whole: your sense of being in the environment, or of being elsewhere?
I had a stronger sense of
(being elsewhere) 1 2 3 4 5 6 7 (being in the environment)

5. Consider your memory of being in the environment. How similar in terms of the structure of the memory is this to the structure of the memory of other places you have been today? By structure of the memory consider things like the extent to which you have a visual memory of the environment, whether that memory is in color, the extent to which the memory seems vivid or realistic, its size, location in your imagination, the extent to which it is panoramic in your imagination, and other such structural elements.
I think of the environment as a place in a way similar to other places that I've been today
(not at all) 1 2 3 4 5 6 7 (very much so)

6. During the time of the experience, did you often think to yourself that you were actually in the environment?
During the experience I often thought that I was really existing in the environment
(not very often) 1 2 3 4 5 6 7 (very much so)


LIST OF REFERENCES

1. Dahl, S., Kinesics, Business School, Middlesex University, 2004. Retrieved 14 Mar. 2004 <http://stephan.dahl.at/nonverbal/kinesics.html>.

2. Ryan, D., Semiotics, School of Arts and Sciences, Australian Catholic University, 2003. Retrieved 14 Mar. 2004 <http://www.mcauley.acu.edu.au/staff/delyse/semiotic.htm>.

3. Slater, M., Howell, J., Steed, A., Pertaub, D-P., Garau, M. and Springel, S., Acting in Virtual Reality, ACM Collaborative Virtual Environments, CVE, 2000.

4. Mortensen, J., Vinayagamoorthy, V., Slater, M., Steed, A., Lok, B. and Whitton, M.C., Collaboration in Tele-Immersive Environments, Proceedings of the Eighth Eurographics Workshop on Virtual Environments, 2002.

5. Oliverio, J., Quay, A. and Walz, J., Facilitating Real-time Intercontinental Collaboration with Emergent Grid Technologies: Dancing Beyond Boundaries, Paper from the Digital Worlds Institute, 2001. Retrieved 9 Aug. 2004 <http://www.dwi.ufl.edu/projects/dbb/media/VSMM_DigitalWorlds.pdf>.

6. Steed, A., Slater, M., Sadagic, A., Tromp, J. and Bullock, A., Leadership and Collaboration in Virtual Environments, IEEE Virtual Reality, Houston, March 1999, 112-115.

7. Brooks, F.P., What's Real about Virtual Reality? IEEE Computer Graphics and Applications, Nov./Dec. 1999.

8. Cruz-Neira, C., Sandin, D.J. and DeFanti, T.A., Surround-Screen Projection-Based Virtual Reality: The Design and Implementation of the CAVE, Computer Graphics (SIGGRAPH) Proceedings, Annual Conference Series, 1993.

9. Bowman, D.A. and Hodges, L.F., An Evaluation of Techniques for Grabbing and Manipulating Remote Objects in Immersive Virtual Environments, Symposium on Interactive 3D Graphics, Apr. 1997.

10. Sutherland, I.E., A Head-mounted Three Dimensional Display, Proceedings of the AFIPS Fall Joint Computer Conference, Vol. 33, 757-764, 1968.

11. Polhemus, FASTRAK: The Fast and Easy Digital Tracker, Colchester, VT, 2004. Retrieved Apr. 2004 <http://www.polhemus.com/FASTRAK/Fastrak Brochure.pdf>.


12. InterSense, InterSense InertiaCube2, Bedford, MA, 2004. Retrieved Apr. 2004 <http://www.isense.com/products/prec/ic2/InertiaCube2.pdf>.

13. 3rdTech, Inc., HiBall-3000 Wide Area Tracker and 3D Digitizer, Chapel Hill, NC, 2004. Retrieved Apr. 2004 <http://www.3rdtech.com/images/hiballdatasheet02v5forweb2.PDF>.

14. Jackson, J., Lok, B., Kim, J., Xiao, D., Hodges, L. and Shin, M., Straps: A Simple Method for Placing Dynamic Avatars in an Immersive Virtual Environment, Future Computing Lab Tech Report FCL-01-2004, Department of Computer Science, University of North Carolina at Charlotte, 2004.

15. Thalmann, D., The Role of Virtual Humans in Virtual Environment Technology and Interfaces, in Frontiers of Human-Centered Computing, Online Communities and Virtual Environments, Springer, London, 2001, 27-38.

16. Slater, M. and Usoh, M., Body Centered Interaction in Immersive Virtual Environments, in N. Magnenat Thalmann and D. Thalmann (eds.), Artificial Life and Virtual Reality, John Wiley and Sons, New York, 1994, 125-148.

17. Garau, M., Vinayagamoorthy, V., Slater, M., Steed, A. and Brogni, A., The Impact of Avatar Realism on Perceived Quality of Communication in a Shared Immersive Virtual Environment, Equator Annual Conference, 2002.

18. Slater, M., Sadagic, A., Usoh, M. and Schroeder, R., Small Group Behavior in a Virtual and Real Environment: A Comparative Study, presented at the BT Workshop on Presence in Shared Virtual Environments, June 1998.

19. Vacchetti, L., Lepetit, V., Papagiannakis, G., Ponder, M., Fua, P., Magnenat-Thalmann, N. and Thalmann, D., Stable Real-Time Interaction Between Virtual Humans and Real Scenes, Proceedings of 3DIM 2003 Conference, 2003.

20. Haptek Inc., Santa Cruz, California, Sept. 2003. Retrieved 9 Aug. 2004 <http://www.haptek.com/>.

21. UGS Corporation, E-Factory: Jack, Plano, TX, 2004. Retrieved Apr. 2004 <http://www.ugs.com/products/efactory/jack/>.

22. Boston Dynamics, DI-Guy: The Industry Standard in Real-Time Human Simulation, Cambridge, MA, 2004. Retrieved Apr. 2004 <http://www.bdi.com/content/sec.php?section=diguy>.

23. VQ Interactive, Inc., BOTizen: The Power of Interactivity, Selangor, Malaysia, 2003. Retrieved Apr. 2004 <http://www.botizen.com/>.


24. Tsai, R., A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses, IEEE Journal of Robotics and Automation, Aug. 1987, 323-344.


BIOGRAPHICAL SKETCH

George Victor Mora was born in Miami, Florida, on March 29, 1980. He spent the first 18 years of his life in South Florida. His obsession with art and technology began at an early age. During high school, he focused his attention on art and computer science classes. Upon completing high school, he moved to Gainesville, Florida, to attend the University of Florida. In August of 2002, George finished his undergraduate degree in computer science. He returned to the University of Florida the following semester as a graduate student in the newly formed digital arts and sciences program in the College of Engineering. For the next two years, George focused on virtual environments and digital media both through his school work and as an employee of the Digital Worlds Institute. In December of 2004 George will receive his Master of Science degree in digital arts and sciences.


DISTRIBUTED VIRTUAL REHEARSALS

By

GEORGE MORA

A THESIS PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
MASTER OF SCIENCE

UNIVERSITY OF FLORIDA

2004

Copyright 2004
by
George Mora

To my wife, Maria,
My parents, Jorge and Johanna Mora
And my family and friends
For their constant support and encouragement

ACKNOWLEDGMENTS

I would like to thank my thesis committee chairman, Dr. Benjamin C. Lok, for his enthusiasm and interest in this project, as well as for keeping me motivated and on track. I would also like to thank James C. Oliverio for being on my committee and for the constant support, advice, and opportunities he has provided for me. I also give much thanks to Dr. Jorg Peters for supporting both my undergraduate and graduate final projects.

This thesis was completed with the help of several people. My gratitude goes out to Jonathan Jackson, Kai Bernal, Bob Dubois, Kyle Johnsen, Cyrus Harrison, Andy Quay, and Lauren Vogelbaum. This thesis would not have been possible without their help.

I would like to thank my parents, Jorge and Johanna Mora, for always encouraging me to grow both intellectually and creatively. Finally, I would like to thank my wife, Maria Mora, for her unending love, support, and understanding.



















TABLE OF CONTENTS

ACKNOWLEDGMENTS

LIST OF FIGURES

ABSTRACT

CHAPTER

1 INTRODUCTION
  1.1 Motivation
  1.2 Challenges
  1.3 Project Goals
  1.4 Organization of Thesis
  1.5 Thesis Statement
  1.6 Approach

2 PREVIOUS WORK
  2.1 Distributed Performance
  2.2 Virtual Reality
  2.3 Digital Characters and Avatars

3 APPLICATION
  3.1 Scene Design and Experience Development
  3.2 Tracking the Actors
  3.3 Putting It All Together
  3.4 Final Software and Hardware Setup

4 RESULTS
  4.1 Description of Studies
  4.2 Reaction from Actors
  4.3 Results
    4.3.1 Virtual Reality Used for Successful Rehearsals
    4.3.2 Lack of Presence Distracted the Actors
    4.3.3 Improvements That Should Be Made to the System

5 CONCLUSION
  5.1 Usefulness to the Acting Community
  5.2 Future Work
  5.3 Future Applications

APPENDIX: STUDY QUESTIONNAIRES
  A.1 Co-presence Questionnaire
  A.2 Presence Questionnaire

LIST OF REFERENCES

BIOGRAPHICAL SKETCH


LIST OF FIGURES


1-1. Two actors rehearsing in a virtual environment
3-1. A participant wearing the colored felt straps
3-2. A participant testing out the system
3-3. Sample screenshot demonstrating the virtual script system
3-4. Data flow for both rendering systems
3-5. Hardware setup for each location
4-1. The location of each actor on the University of Florida campus
4-2. Results of the co-presence questionnaire administered during the first study
4-3. Results of the presence and co-presence questionnaires, second study
4-4. Results of the presence and co-presence questionnaires, third study
4-5. Comparison between question averages for the presence questionnaire
4-6. Comparison between question averages for the co-presence questionnaire


Abstract of Thesis Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Master of Science

DISTRIBUTED VIRTUAL REHEARSALS

By

George Mora

December, 2004

Chair: Benjamin C. Lok
Major Department: Computer and Information Science and Engineering

Acting rehearsals with multiple actors are limited by many factors. Physical presence is the most obvious, especially in a conversation between two or more characters. Cost is an obstacle that primarily affects actors who are in different locations. This cost consists of travel and living expenses. Preparation time is another hindrance, especially for performances involving elaborate costumes and intricate makeup. Many recent high-budget motion pictures require that key actors go through several hours of makeup application to complete their character's look.

Virtual reality can bring actors together to rehearse their scene in a shared environment. Since virtual reality elicits emotions and a sense of perceived presence from its users, actors should be able to successfully rehearse in a virtual environment. This environment can range from an empty space to a fully realized set depending on the director's imagination and the project's scope.

Actors' movements will be tracked and applied to a digital character, creating a virtual representation of the character. The digital character will resemble the actor, in full costume and makeup. In the virtual environment, each actor will see (in real-time) the character being controlled by their acting partner.

The goal is to show that multiple actors can use a shared virtual environment as an effective acting rehearsal tool. This project will also demonstrate that actors can hone their skills from remote locations through virtual reality, and serve as a foundation for future applications that enhance the virtual acting paradigm.

CHAPTER 1
INTRODUCTION

1.1 Motivation

Acting rehearsal is the process by which actors refine their acting skills and practice scenes for future public performances. These rehearsals traditionally occur on a stage with the principal actors and the director physically present. Although costumes and makeup are not essential until the final few rehearsals (called dress rehearsals), a functional set is important for determining when and where to move (known as movement blocking).

There are several variations on the standard rehearsal. During the pre-production stage, a "read through" or "reading" is scheduled to familiarize the actors with the script and each other. Typically, actors are physically present in a conference room, although this can be accomplished through using a video or telephone conference. After the "reading", a blocking rehearsal will help choreograph the actors' movements. "Blocking" rehearsals usually take place on a stage or practice set, since its dimensions affect the choreography of a production. "Polishing and Building" rehearsals take up the majority of the total rehearsal time. During these rehearsals, actors perfect their performance and work out any major problems. The final rehearsals (dress and technical rehearsals) involve practicing the performance in full costume and makeup with complete lighting, sound, and props on a finished set.

Currently, a "reading" is the only rehearsal method which does not need an actor's physical presence. The "reading" does not require that actors wear costume/makeup or move on an assembled stage. Therefore it could be performed over the telephone. One could argue that distributed rehearsals could be easily achieved through video conferencing. However, the cost and availability of a system which could deliver satisfying results in terms of video/audio quality, bandwidth, and robustness make video conferencing a poor choice for effective distributed rehearsals.

Allowing digital characters to represent an actor in a shared immersive virtual environment increases the number of conditions under which an acting rehearsal can occur. Physical presence, preparation time, and cost would no longer limit rehearsals. This would allow multiple actors from anywhere in the world to meet and rehearse a scene before there are costumes or constructed sets.

Figure 1-1. Two actors rehearsing in a virtual environment. Actor 1 controls the movements of Character 1 (Morpheus), while Actor 2 controls the movements of Character 2 (Neo).

By allowing actors to meet in a virtual space, there is an added advantage of virtual reality interaction. Such interaction includes stereoscopic vision, gaze tracking, and easy prop and set maintenance. Stereoscopic vision allows the actor to see the acting partner, set, and props in three dimensions. Gaze tracking changes the display based on the location of the actor's head and the direction he or she looks. Prop and set maintenance allow one to move, rotate, and replace any prop or piece of scenery.

Consideration must be given to acting theory since the actor's expressions will be expressed through the avatar. The form of expression this thesis focuses on is gestures. Kinesics encompasses all non-verbal forms of communicating. These include gestures, body language, and facial expressions. There are several categories of kinesics:

* Emblems: non-verbal messages with a verbal counterpart.
* Illustrators: gestures associated with verbal messages.
* Affective Displays: gestures or expressions conveying emotion.
* Regulators: non-verbal signs that maintain the flow of a conversation.
* Adaptors: small changes in composure that subconsciously convey mood [1].

Communication on stage mimics communication in real life. Therefore, the relationship of kinesics to acting is obvious. Actors must pay special attention to their movement, gestures, facial expressions, and body language in relation to what they are saying or doing. Ryan reaffirms the connection between kinesics and acting: "The informal spatial code relates to the movement of the body on stage including facial expression, gestures, formations of bodies (i.e. patterns, shapes), and sometimes includes moving set and pieces of furniture."

Ryan also lists several properties of kinesics use on stage:

* Gestures can't stand alone.
* Gestures can't be separated from the general continuum of motion.
* Gesture is the primary mode of ostending (i.e. showing) the body on stage [2].

1.2 Challenges

Actors are accustomed to being physically present with other actors on real sets. For a virtual environment to be effective in improving acting skills, the actor must experience a strong sense of being in the same space as the other actor (referred to as a sense of co-presence).

Some challenges faced when trying to achieve this sense of co-presence include:

* Keeping the actor comfortable and as unaware as possible that their movements are being tracked.
* Ensuring high audio quality to simulate hearing the other voice in the same room.
* Placing the cameras, projector, and screen in such a way that the actor has a clear view of the screen while still being accurately tracked.
* Providing realistic models and textures for the virtual environment.
* Having the character exhibit human-like behavior and expressions.
* Ensuring high-speed data transmission between systems.

1.3 Project Goals

This project seeks to enhance the fields of virtual environments research and acting theory in the following ways:

* Demonstrate that digital characters in virtual environments allow for effective distributed rehearsals.
* Provide a prototype system to allow actors to interact with each other virtually in full costume/makeup across long distances.

1.4 Organization of Thesis

This thesis is organized into the following sections:

Introduction. Specifies the need for this research, obstacles in completing the
project, and the ultimate goals for this thesis.










Previous Work. Describes the research and work that served as both inspiration
and a foundation for this project.

Application. Details the process of creating the application that demonstrates the
ideas presented in this thesis.

Results. Discusses the results of a survey of several actors who tested the system
to rehearse a simple scene.

Conclusion. Summarizes results and lists future work and applications of the
research.

1.5 Thesis Statement

Distributed Virtual Rehearsals can bring actors together from different locations

to successfully rehearse a scene.

1.6 Approach

My approach is to build a system that will allow two or more actors to rehearse a

scene in a virtual world. Digital characters that resemble the actor in full costume and

makeup will represent the actor. The actor's movements will be tracked and will directly

affect the movements of the digital character. The actor will see digital characters

controlled by other actors in remote locations.

Since the data required for rendering the characters and props exists on local

machines, the only information that needs to be sent is the tracking data for each actor's

movements. The tracking data for each body part is contained in a text message

composed of several (three for position data only) floating point numbers. The

information can be transmitted efficiently, which reduces lag.
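
For illustration, one such position message might be packed as sketched below. The body-part tag and the space-separated layout are assumptions, since the exact format is not specified here.

    // Sketch: packing one body part's position into a text message.
    // The tag name and field order are illustrative assumptions.
    #include <cstdio>

    int packPosition(char* buf, std::size_t n, const char* bodyPart,
                     float x, float y, float z) {
        // e.g. produces "HEAD 0.120000 1.530000 0.870000"
        return std::snprintf(buf, n, "%s %f %f %f", bodyPart, x, y, z);
    }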

The rehearsal setup requires the following components (per actor):

Projector-based display system with a large projection screen.

A well-lit room large enough for the actor to perform the scene.










Two web cameras connected to two PCs.

One rendering PC.

High-speed network connecting each system.

Several different colored straps attached to the actor's body.

A headset with built-in microphone (wireless if extensive body movement is
required).















CHAPTER 2
PREVIOUS WORK

2.1 Distributed Performance

Distributed performance refers to media-centered events and actions that affect

each other yet occur in different locations. This could range from a simple telephone

call to a complex massively multiplayer online role-playing game (MMORPG).

In the study Acting in Virtual Reality, distributed performance brings several

actors and directors together to rehearse a short play. Each participant interacts with the

others through networked computers and his/her own non-immersive display. Semi-

realistic avatars represent the actors while the director is only heard over the sound

system. The study proved successful in allowing actors to rehearse in a shared virtual

environment. "A performance level was reached in the virtual rehearsal which formed

the basis of a successful live performance, one that could not have been achieved by

learning of lines or video conferencing" [3].

Networked virtual environments are also used in the Collaboration in Tele-

Immersive Environments project at the University of London and the University of North

Carolina at Chapel Hill. This project investigated whether virtual environments could

facilitate the completion of a collaborative task in virtual reality. The task involved two people carrying a

stretcher along a path and into a building. Their result indicated that realistic interaction

in a virtual environment over a high-speed network, while possible, still suffers from

tracking delays, packet losses, and difficulty sharing










control of objects. "The data suggests that in order to have a sense of being with another

person, it is vital that the system 'works' in the sense that people have an impression of

being able to actually do what they wish to do" [4].

Dancing Beyond Boundaries involves the use of video conferencing over a high-

speed network as a method of distributed performance. This piece used Access Grid

technology and an Internet2 connection to allow dancers and musicians from four

different locations across North and South America to interact with each other. "Thus the

combination of multi-modal information from the four nodes created a 'virtual studio'

that could also be termed a distributed virtual environment, though perhaps not in the

usual sense" [5].

An important aspect of distributed performance is the state of co-presence that is

achieved. Co-presence is the sense of being in the same space with other people.

Distributed collaboration has been shown to be successful when it achieves a high degree of

co-presence. It has been shown that the level of immersion is related to a user's sense of

presence and co-presence. The level of immersion also relates to leadership roles in

group settings [6].

2.2 Virtual Reality

Frederick P. Brooks Jr. defines a virtual reality experience as "any in which the user

is effectively immersed in a responsive virtual world. This implies user dynamic control

of viewpoint." Effective immersion has been achieved through the use of novel display

and interaction systems [7].

Immersive displays create a sense of presence through multi-sensory stimulation.

Previous examples of these systems include head-mounted displays, CAVE (Cave-like









Automatic Virtual Environment) systems, projector-based displays, and computer

monitors. Several of the goals which inspired the creation of the CAVE system can be

applied to most other immersive display systems:

The desire for higher-resolution color images and a large field of view without
geometric distortion.

The ability to mix VR imagery with real devices (like one's hand, for instance).

The opportunity to use virtual environments to guide and teach others [8].

Effective interaction is as important as a novel display system in creating an

immersive virtual environment. Successful interaction involves allowing the user to

control the view of the environment or objects inside the environment. The degree to

which presence is experienced depends on how well the interface imitates "real world"

interaction. "A defining feature of virtual reality (VR) is the ability to manipulate virtual

objects interactively, rather than simply viewing a passive environment" [9].

Motion tracking provides a realistic means of interacting with a virtual

environment. It can adjust the view of the environment, manipulate objects in the

environment, and trigger visual and aural cues based on gaze and gesture recognition.

Motion tracking is often used in head-mounted display systems where the user's position

and orientation affect what the user sees and hears in the environment. "Although stereo

presentation is important to the three-dimensional illusion, it is less important than the

change that takes place in the image when the observer moves his head. The image

presented by the three-dimensional display must change in exactly the way that the image

of a real object would change for similar motions of the user's head" [10].












There are many commercial motion tracking devices:

Polhemus FASTRAK: uses several electromagnetic coils per tracker to transmit
position and orientation information to a receiver.

InterSense InertiaCube: a small orientation tracker that provides 3 degrees of
freedom (yaw, pitch, and roll) and allows for full 360° rotation about each axis.

HiBall-3100: a wide-area position and orientation tracker that uses hundreds of
infrared beacons on the ceiling and 6 lenses on the HiBall Sensor to track the user.

These commercial solutions are highly accurate and produce very low latency.

However, they are expensive and oftentimes encumber the user [11, 12, 13].

Motion tracking has been applied to avatar movement by tracking colored

straps. In Straps: A Simple Method for Placing Dynamic Avatars in an Immersive Virtual

Environment, colored straps are attached to the user's legs to accurately represent their

movement through an avatar in the virtual environment. The straps system has two major

advantages over other tracking systems:

Freedom: there are no encumbering cables, which reduces system complexity,
accident hazards, and equipment failure.

Simplicity: the colored straps are cheap, easy to create, and contain no moving
parts or electronics [14].

2.3 Digital Characters and Avatars

Inserting digital characters into virtual environments can make the experience

much more realistic to the user. "The user's more natural perception of each other (and

of autonomous actors) increases their sense of being together, and thus the overall sense

of shared presence in the environment" [15].









An avatar is the representation of oneself in a virtual environment. In an ideally

tracked environment, the avatar would follow the user's movements exactly. Slater and

Usoh discuss the influence of virtual bodies on their human counterpart:

The essence of Virtual Reality is that we (individual, group, simultaneously,

asynchronously) are transported bodily to a computer generated environment. We

recognize habitation of others through the representation of their own bodies.

This way of thinking can result in quite revolutionary forms of virtual

communication [16].

Digital character realism affects the amount of immersion experienced by the user

in a virtual environment. This realism is manifested visually, behaviorally, and audibly.

A break in presence (losing the feeling of presence) can occur if someone in a virtual

environment is distracted by a lack of realism in an avatar. In The Impact of Avatar

Realism on Perceived Quality of Communication in a Shared Immersive Virtual

Environment, avatar realism and a conversation-sensitive eye motion model are tested to

determine their effect on presence. "We conclude that independent of head-tracking,

inferred eye animations can have a significant positive effect on participants' responses to

an immersive interaction. The caveat is that they must have a certain degree of visual

realism" [17].

Even without realistic avatars, users can still be greatly affected by other users'

digital characters, as well as their own avatar. Establishing a sense of presence increases

the chances of a participant becoming attached to avatars in the virtual space. Emotional

attachment to avatars was a surprising result of the study Small Group Behavior in a

Virtual and Real Environment: A Comparative Study. "Although, except by inference, the









individuals were not aware of the appearance of their own body, they seemed to generally

respect the avatars of others, trying to avoid passing through them, and sometimes

apologizing when they did so." The avatars used in the study were simple models

associated with a unique color [18].

Digital characters have been successfully integrated into real environments using

computer vision and camera tracking techniques. These characters are used partly as

"virtual teachers" that train factory workers to operate some of the machinery. The

virtual humans pass on knowledge to participants using an augmented reality system.

Although the characters are automated, a training specialist can control them from a

different location via a networked application [19].

Digital character realism has been integrated into the character rendering system

created by Haptek. This system can integrate multiple highly realistic and customizable

characters into a virtual environment. These characters also act realistically (i.e. blinking,

looking around, and shifting weight). The Haptek system allows these characters to be

used as avatars, or as autonomous virtual humans [20].

There are many other commercially available character-rendering systems:

UGS Corp. Jack: usability, performance, and comfort evaluation using digital
characters that are incorporated into virtual environments.

Boston Dynamics DI-Guy: real-time human simulation used in military
simulations created by all branches of the United States Armed Forces.

VQ Interactive BOTizen: online customer support conducted by digital
characters. Characters respond to queries using a text-to-speech engine [21, 22,
23].















CHAPTER 3
APPLICATION

3.1 Scene Design and Experience Development

The first step in creating the distributed rehearsal system was to choose a sample

scene. This scene would determine character design, set design, and the dialogue.

Several factors influenced the decision:

Familiarity: Since the actors would be testing the system without reading the
script beforehand, the scene needed to be immediately accessible to most actors.

Ease of tracking: The system is a prototype; therefore extensive tracking would
be beyond its scope. Additionally, the acting focuses on kinesics, so gesture
tracking is the only requirement. The scene would involve characters that stay
relatively still, acting primarily with gestures.

Interesting set and characters: Presence is one of the main factors measured
when evaluating virtual environment systems. Incorporating stimulating digital
characters and sets into the environment can help achieve presence.

Several scenes were evaluated using the above criteria. The scenes included the

"balcony scene" from Romeo and Juliet, the "heads and tails" scene from Rosencrantz

and Guildenstern are Dead, and the "red pill, blue pill" scene from The Matrix. The "red

pill, blue pill" scene was chosen because it is a very familiar scene that few actors would

have previously rehearsed.

Once the scene was selected, the characters and set needed to be constructed. The

modeling software 3D Studio Max was used to create the set. The set consisted of a dark









room with a fireplace, two red chairs and a small white table. The two main characters,

Neo and Morpheus, were created using the base Haptek male character model and

adjusting the texture accordingly.

The characters and environments, after being fully modeled and textured, were

then exported into a format that could be incorporated into a graphics engine. The 3DS

file format was chosen and then read into an OpenGL application along with each

character. Lighting was set up to reflect the locations of the physical lights in the scene.

It was important to be able to manipulate the character's skeleton in real-time.

Therefore, each joint in the character needed to be explicitly controlled. Haptek uses a

separate joint for each degree of freedom that exists in a joint. The shoulders, elbows,

and neck each have three joints. For simplicity, two joints were used for the neck, two

for each shoulder, and one for each elbow.

Special attention went toward developing aspects of the system that would

enhance the user's experience. The actor sat on a cushioned chair in front of a large

surface onto which the scene is projected. The setup was designed to physically immerse

the actors in an environment similar to the one used in the designated scene. Each rendering

system had the option to allow each actor to use his/her head movement to affect his/her

camera's viewpoint. This simulates an actor looking through a large hole in the wall at

the other actor: if the actors tilt their heads to the side, their viewpoints rotate slightly

and allow them to see more of the room on the other side of the wall. The experience

began with a clip from The Matrix that led into the scene. These were efforts to

increase the sense of presence each actor experiences.









3.2 Tracking the Actors

Setting up the tracking system required two cameras, two PCs (one for each

camera), colored paper or cloth for the straps, and sufficient USB or Firewire cable to

accommodate the setup. The tracking system worked under different lighting conditions

provided that adequate training was performed on each camera. Training consisted of acquiring

many pictures of each strap and the background and determining the range of the colors

that belong to each strap. An application provided with the tracking system

accomplished most of this process.
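
The kind of per-pixel test that this training enables can be sketched as follows. The RGB range representation and the centroid computation are assumptions for illustration; the actual tracker may classify colors differently or work in a different color space.

    // Sketch of strap detection using trained color ranges (illustrative only;
    // the tracker's actual representation and color space are assumptions).
    #include <vector>

    struct Pixel { unsigned char r, g, b; };
    struct ColorRange { unsigned char lo[3], hi[3]; };  // learned during training

    // Finds the centroid of all pixels that fall inside one strap's color range.
    bool findStrap(const std::vector<Pixel>& image, int width, int height,
                   const ColorRange& strap, float& cx, float& cy) {
        long sumX = 0, sumY = 0, count = 0;
        for (int y = 0; y < height; ++y) {
            for (int x = 0; x < width; ++x) {
                const Pixel& p = image[y * width + x];
                if (p.r >= strap.lo[0] && p.r <= strap.hi[0] &&
                    p.g >= strap.lo[1] && p.g <= strap.hi[1] &&
                    p.b >= strap.lo[2] && p.b <= strap.hi[2]) {
                    sumX += x; sumY += y; ++count;
                }
            }
        }
        if (count == 0) return false;       // strap not visible in this frame
        cx = static_cast<float>(sumX) / count;
        cy = static_cast<float>(sumY) / count;
        return true;
    }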

Two sets of straps were created for the system. The first set of straps consisted of

colored pieces of paper fastened with tape. These straps were used in the first study and

the participants suggested using a different material because the paper was

"uncomfortable and sometimes distracting." The second set was constructed with

colored pieces of felt to increase the comfort level of each participant. These straps were

fastened with small strips of Velcro. Figure 3-1 shows the second set of straps attached

to a participant.

The tracking system on each PC then transmitted the two-dimensional coordinates

of each strap to a client computer. Tsai's camera calibration algorithm was used to

calibrate each camera and recover extrinsic and intrinsic camera parameters [24].

Calibration was achieved by explicitly mapping two-dimensional sample picture

coordinates to their three-dimensional equivalents. This provided a configuration file for

use in the client computer.

Once the rendering system was receiving correct tracking values from the straps

system, these values needed to be appropriately mapped to the digital character's









movements. The system first saved the fixed locations of each shoulder. Then, after

instructing the user to place their hands directly in front of them with their elbows fully

extended, the system determined their arm length and head height (distance from their

neck to their forehead). The system used the actor's arm length, head height, and

shoulder width constants to appropriately displace the digital character's hands and head.


Figure 3-1. A participant wearing the colored felt straps.

Forward shoulder joint animation (vertical movement) was accomplished by

determining the angle of displacement that a line passing from the shoulder to the hand

would create from a central horizontal line. The distance from the shoulder to the hand

determines the amount of elbow bend that is required. For instance, if the hand is arm's

length away from the shoulder, the elbow wouldn't be bent at all. Conversely, if the hand

was located adjacent to the shoulder, the elbow would be fully bent. Finally, shoulder









turn (horizontal) was calculated by determining the angle of displacement the hand would

make from a central vertical line.
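
A minimal sketch of this mapping is given below. The vector math follows the description above, but the variable names, coordinate conventions, and the linear elbow-bend interpolation are assumptions rather than the system's actual code.

    // Sketch of the arm-angle mapping described above. Coordinate conventions,
    // names, and the linear elbow interpolation are assumptions for illustration.
    #include <algorithm>
    #include <cmath>

    struct Vec3 { float x, y, z; };
    struct ArmPose { float shoulderPitch, shoulderYaw, elbowBend; };  // radians

    ArmPose mapArm(const Vec3& shoulder, const Vec3& hand, float armLength) {
        float dx = hand.x - shoulder.x;                // horizontal offset
        float dy = hand.y - shoulder.y;                // vertical offset
        float dz = hand.z - shoulder.z;                // depth offset
        float dist = std::sqrt(dx*dx + dy*dy + dz*dz);

        ArmPose pose;
        // Forward shoulder joint: angle of the shoulder-to-hand line above or
        // below a central horizontal line.
        pose.shoulderPitch = std::atan2(dy, std::sqrt(dx*dx + dz*dz));
        // Shoulder turn: angle of the hand away from a central vertical line.
        pose.shoulderYaw = std::atan2(dx, dz);
        // Elbow bend: straight at arm's length, fully bent when the hand sits
        // next to the shoulder (a linear mapping is assumed here).
        float t = 1.0f - std::min(dist / armLength, 1.0f);
        pose.elbowBend = t * 3.14159265f;              // 0 (straight) .. pi (bent)
        return pose;
    }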

3.3 Putting It All Together

VRPN was used to connect and transfer text messages between the tracking

system and the rendering system as well as between each rendering system. The tracking

system sends a text message containing the two-dimensional coordinates for each color

detected along with the width and height of the image in pixels. The rendering system

receives these values and, combined with the values from the second tracking system,

uses Tsai's algorithm for recovering the three-dimensional coordinates [24].
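
Tsai's algorithm itself recovers each camera's intrinsic and extrinsic parameters and is not reproduced here. The sketch below illustrates only the reconstruction idea, under the assumption that each calibrated camera converts its 2D strap coordinate into a 3D ray: the strap's position can then be estimated as the midpoint of the two rays' closest approach.

    // Sketch: estimating a strap's 3D position from two calibrated camera rays
    // (an illustrative stand-in for the reconstruction step; not Tsai's code).
    struct Vec3 { float x, y, z; };
    static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    // Rays: p = o1 + t*d1 and q = o2 + s*d2, one per camera.
    Vec3 closestApproachMidpoint(Vec3 o1, Vec3 d1, Vec3 o2, Vec3 d2) {
        Vec3 w = sub(o1, o2);
        float a = dot(d1, d1), b = dot(d1, d2), c = dot(d2, d2);
        float d = dot(d1, w),  e = dot(d2, w);
        float den = a*c - b*b;               // near zero for parallel rays
        float t = (b*e - c*d) / den;
        float s = (a*e - b*d) / den;
        Vec3 p{o1.x + t*d1.x, o1.y + t*d1.y, o1.z + t*d1.z};
        Vec3 q{o2.x + s*d2.x, o2.y + s*d2.y, o2.z + s*d2.z};
        return {(p.x + q.x) * 0.5f, (p.y + q.y) * 0.5f, (p.z + q.z) * 0.5f};
    }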

Once calibration has finished and the actors are accustomed to using the system,

the tracked data is shared between rendering systems via VRPN text messages. The text

message contains the two angles for each shoulder and for the neck, the bend angle for

the elbow, and the speaking state. The speaking state determines which actor is currently

speaking; this is used with the lip-syncing and virtual script systems.
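
A sketch of what such a message might look like follows. The text above specifies only the message contents, so the field order and the printf-style layout are assumptions.

    // Sketch of the per-frame pose message exchanged between rendering systems.
    // Field order and layout are assumptions; only the contents are documented.
    #include <cstdio>

    struct PoseMessage {
        float lShoulderPitch, lShoulderYaw;   // two angles per shoulder
        float rShoulderPitch, rShoulderYaw;
        float neckPitch, neckYaw;             // two angles for the neck
        float lElbowBend, rElbowBend;         // bend angle per elbow
        int   speaking;                       // 1 while this actor has the line
    };

    int packPose(char* buf, std::size_t n, const PoseMessage& m) {
        return std::snprintf(buf, n, "%f %f %f %f %f %f %f %f %d",
                             m.lShoulderPitch, m.lShoulderYaw,
                             m.rShoulderPitch, m.rShoulderYaw,
                             m.neckPitch, m.neckYaw,
                             m.lElbowBend, m.rElbowBend, m.speaking);
    }

    bool parsePose(const char* buf, PoseMessage& m) {
        return std::sscanf(buf, "%f %f %f %f %f %f %f %f %d",
                           &m.lShoulderPitch, &m.lShoulderYaw,
                           &m.rShoulderPitch, &m.rShoulderYaw,
                           &m.neckPitch, &m.neckYaw,
                           &m.lElbowBend, &m.rElbowBend, &m.speaking) == 9;
    }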

Voice acting is an important aspect of rehearsal. Therefore it was necessary to

implement a system that allowed the actors to transmit their voice to their partner.

Headsets with built-in microphones were used. The headsets had a behind-the-neck

design so they would not interfere with the forehead strap. Voice was transmitted using

DirectPlay.

Instead of using a traditional physical script, a virtual script system allowed the

actors to read their lines without having to look away from the display. This system

displayed the actor's current line at the bottom of the screen when their speaking state was

true. Incorporating the virtual script system introduced the problem of determining when










to proceed with the next line. Originally, the actor would use their foot to press a key that

would set the speaking state to false and send a message to the remote rendering system

to set its speaking state to true. However, this hindered presence and lowered the

comfort level. It was decided to have the system operator, who calibrated the system

and trained the actor to use the system, manually set the speaking state to false when a

line was finished.
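
A minimal sketch of this operator-driven turn-taking appears below. The message string and function names are hypothetical, and a stub stands in for the VRPN text-message send used by the real system.

    // Sketch of the operator-driven turn-taking described above. Names are
    // hypothetical; a stub stands in for the VRPN text-message transport.
    #include <cstdio>
    #include <cstring>

    static bool localSpeaking = true;     // one actor starts with the first line

    static void sendTextToRemote(const char* msg) {   // stand-in for VRPN send
        std::printf("send: %s\n", msg);
    }

    // Called by the system operator when the local actor finishes a line.
    void onLineFinished() {
        localSpeaking = false;            // hide this actor's virtual script line
        sendTextToRemote("SPEAK");        // hand the line to the remote actor
    }

    // Called when a text message arrives from the remote rendering system.
    void onRemoteMessage(const char* msg) {
        if (std::strcmp(msg, "SPEAK") == 0)
            localSpeaking = true;         // show this actor's next line
    }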

3.4 Final Software and Hardware Setup

The final hardware setup used to test the system was composed of the following (for

each location):

3 Dell PCs

2 OrangeMicro USB 2.0 web-cameras

Sony projector

Colored felt straps with Velcro attachments

GE stereo PC headset

Cushioned chair

The participant sat facing a large projection screen. The Sony projector was

placed under each participant's seat. The two web-cameras were each attached to a Dell

PC running the Straps software. These PCs had the VRPN and Imaging Control libraries

installed. The rendering PC connected to the projector ran the rehearsal software. This

PC had the VRPN and Haptek libraries installed. Figure 3-5 shows a diagram of the final

hardware setup used for each location.



































Figure 3-2. A participant testing out the system.


Figure 3-3. Sample screenshot demonstrating the virtual script system.




















































Figure 3-4. Data flow for both rendering systems.





Figure 3-5. Hardware setup for each location (left and right webcams, left and right
tracking PCs, the rendering PC, and the actor).















CHAPTER 4
RESULTS

4.1 Description of Studies

The system was evaluated using three studies, each with two actors. The first

study was conducted before the system was fully operational. The second and third

studies were conducted using the complete system. The aspects of the system that

weren't incorporated into the first study included the introductory movie, head-controlled

viewpoint, and accurate hand tracking.

The participants (4 females and 2 males) ranged in age from 18 to 20. They had

significant acting experience. Before each study, each participant was given a small

tutorial on how the system worked, a brief overview of its limitations, and some time to

see his/her character being manipulated by his/her movements. The participants then

watched the introductory movie and rehearsed the scene provided for them. When the

scripted lines ended, the participants were given time to ad-lib. Each study concluded by

having participants fill out a presence and co-presence questionnaire.

All three sets of participants were given the co-presence questionnaire used in

Collaboration in Tele-Immersive Environments. This questionnaire gauges the degree to

which each participant felt they were "present" in the virtual environment with the other

participant. The last two sets of participants were also given the Slater, Usoh and Steed

(SUS) Presence questionnaire. The SUS questionnaire is commonly used to assess










virtual environment immersion. Along with each questionnaire, participants were asked

to specify their level of computer literacy and their level of experience with virtual

reality. The Appendix contains both questionnaires. An informal verbal debriefing

followed the questionnaires.


Figure 4-1. The location of each actor on the University of Florida campus. Actor 1 was
located at the lab in the CISE Building; Actor 2 was located at the REVE Polymodal
Immersive Theater in the Norman Gym.

4.2 Reaction from Actors

The participants from the first study appeared to be initially frustrated with the

inaccurate hand tracking, although with some practice they compensated for it. One

participant used exaggerated gestures to counteract the limited forward/backward

character movement. During the ad-lib portion, the participants spontaneously began a

slapping duel with each other that consisted of one person moving their arm in a slapping

motion and the other moving their head when hit, and vice versa.

The second and third set of study participants quickly became adept at using the

system. They seemed very comfortable working through the scene despite having little










or no virtual reality experience. The introductory movie did not appear to significantly

affect the participants' experience. The ad-lib session flowed seamlessly from the scripted

section. The participants seemed to be highly engrossed in the experience, as evidenced by

the fact that all four participants prolonged the ad-lib session for more than 5 minutes.

4.3 Results

The results from the questionnaires and the debriefing can be organized into the

following three categories:

Virtual Reality can be used to bring actors together for a successful rehearsal.

Lack of presence distracted the actors.

Improvements should be made to the system.

4.3.1 Virtual Reality Can Be Used to Bring Actors Together for a Successful
Rehearsal

The results of the questionnaires showed that the study was effective in achieving

successful rehearsals. The participants on average felt a stronger sense of co-presence

than standard presence. This is understandable considering the participants had limited

control over their own environment while still having significant interaction with their

partner.

The average responses to the co-presence questionnaire were low for the first

study (only 26% of the responses were above 4.5) yet moderately high for the second and

third studies (60% and 66% of the responses were above 4.5 for the second and third

studies, respectively). There was an average increase of 0.81 in the responses from the

first study to the second and third. This demonstrates that the increased interactivity

included in the system for the second and third studies positively influenced each actor's

experience. The high responses for the second and third studies also indicate that the










participants felt that they could effectively communicate both verbally and gesturally.

The following responses to the debriefing session reaffirm these findings:

"Ultimately, I had fun. There were a few synch issues but we found out ways to
interact with the limited tools at our disposal."

"I felt very connected to the other person and I felt that the acting experience was
very worthwhile. I actually felt like if we were rehearsing a scene we would have
gotten someplace with our exercise."

"It was very easy to feel like you're with another person in the space. One,
because you were talking to them. And two, because you're always conscious of
the fact that they're going to be looking at what you're at least partially doing."

"I started to think of the images on the screen as being another person in the room
with me; it very much seemed like a human figure and I felt as though I were in
their presence."

Several items on the co-presence questionnaire generated interesting results.

Question 4, which asked, "To what extent did you feel embarrassed with respect to what

you believed the other person might be thinking of you?" generated an average score of

1.25 (on a scale of 1 [not at all] to 7 [a great deal]). Questions 6 and 7, which determined

the degree to which each participant felt their progress was hindered by their partner and

vice versa, generated an average score of 1.5 and 1.75, respectively. These low results

are likely a result of the participants having previously worked with each other. This co-

presence questionnaire uses the participant's unfamiliarity with their partner to gauge co-

presence by showing the existence of social phenomena such as shyness and awkward

interaction with the aforementioned questions. Thus, participants familiar with each

other, or those who have acted together before, would probably get low scores on those

questions.

Question 14, which measured the degree to which each participant had a sense

that there was another human being interacting with them (as opposed to just a machine),










generated an average score of 6. This score further supports the system's effectiveness.

Question 15, which determined how happy the participant thought their partner felt,

generated an average score of 7. This question assumes that the participants are strangers

(similar to questions 4, 6, and 7). All of the participants showed obvious signs of

enjoying the experience as evidenced by the average score of 7 (the maximum score) for

this question. Figures 4-2 to 4-6 detail the results of each questionnaire arranged by

study.

4.3.2 Lack of Presence Distracted the Actors

The results of the presence questionnaire that was given to the second and third

study participants were average. Typical presence scoring involves adding a point for

each response of 6 or 7; however, that would give only one participant (ID number 3) a

score above 0. According to Figure 4-5, the average of the responses for both studies

also generates a score of 0. Since the average responses were all between 3 and 5, it can

be said that the participants were only moderately engrossed in the environment. This

affected the experience by distracting the participant. Several participants mentioned the

experience would have been enhanced if they could see a representation of their hands on

the screen. Had each participant's sense of presence been higher, they might have

accepted the reality of acting with the character on the screen as opposed to feeling that

they were physically controlling a character that is acting with the character on the

screen. The following responses from the debriefing session were the basis for these

conclusions:

*"It's kind of like a first-person shooter sort of game where you don't really see
any of yourself; you just see what' s going on. It' s a little bit disorienting."










"I would've really liked to see my character's hands on the screen--so I know
what they're doing."

"It was kind of skeletal but the way it works right now is really good for where it
1S."

"There was a little sense that you were really there (in the virtual environment)
like when you move your head and the camera pans back and forth."

4.3.3 Improvements Should Be Made to the System

The participants suggested a number of areas for improvement. Nearly all

suggested that more body parts be tracked and that interactive facial expressions be

added. One participant from the first study suggested abandoning the gesture tracking for

a system that would aid only in the "blocking" of a scene. The following are the

debriefing responses that dealt with system improvements and the overall idea of the

sy stem :

"For practice out of rehearsal this could work. It all depends on the level of
sophi sti cati on."

"It needs to incorporate more color straps to include the whole body and
hopefully, facial expressions. I like the idea of the opposite image being that of
the character instead of the other actor."

"I would add lots more tracking spots to allow for full body and maybe facial
movements."

"There's a lot more that goes into acting that just moving your arms. To make it
more of an acting experience there would have to be more mobility and
expression."











FIRST STUDY - Co-presence Questionnaire Part 1
ID Number Literacy Experience 1 2 3 4 5 6 7
1 3 1 4 5 4 1 4 2 2
2 4 1 3 3 3 1 2 1 1
Average: 3.5 4 3.5 1 3 1.5 1.5

FIRST STUDY - Co-presence Questionnaire Part 2
ID Number 8 9 10 11 12 13 14 15
1 2 5 1 3 1 5 6 7
2 4 5 4 3 1 6 5 7
Average: 3 5 2.5 3 1 5.5 5.5 7

Figure 4-2. Results of the co-presence questionnaire administered during the first study.

SECOND STUDY - Presence Questionnaire
ID Number 1 2 3 4 5 6
3 6 4 2 1 6 6
4 5 4 5 5 4 5
Average: 5.5 4 3.5 3 5 5.5

SECOND STUDY - Co-presence Questionnaire Part 1
ID Number Literacy Experience 1 2 3 4 5 6 7
3 7 2 6 6 5 2 6 1 2
4 4 1 5 5 5 1 5 3 3
Average: 5.5 5.5 5 1.5 5.5 2 2.5

SECOND STUDY - Co-presence Questionnaire Part 2
ID Number 8 9 10 11 12 13 14 15
3 5 3 2 3 2 7 7 7
4 7 6 5 6 3 4 5 7
Average: 6 4.5 3.5 4.5 2.5 5.5 6 7

Figure 4-3. Results of the presence and co-presence questionnaires administered during
the second study.
















THIRD STUDY - Presence Questionnaire
ID Number 1 2 3 4 5 6
5 4 3 5 4 4 5
6 5 4 3 4 3 5
Average: 4.5 3.5 4 4 3.5 5

THIRD STUDY - Co-presence Questionnaire Part 1
ID Number Literacy Experience 1 2 3 4 5 6 7
5 6 1 4 4 6 1 3 1 1
6 7 2 5 5 5 1 4 1 1
Average: 4.5 4.5 5.5 1 3.5 1 1

THIRD STUDY - Co-presence Questionnaire Part 2
ID Number 8 9 10 11 12 13 14 15
5 5 3 3 4 2 4 6 7
6 5 6 5 5 6 5 6 7
Average: 5 4.5 4 4.5 4 4.5 6 7

Figure 4-4. Results of the presence and co-presence questionnaires administered during
the third study.

Presence Questionnaire Summary
Question 2nd Study Average 3rd Study Average Total Average
1 5.5 4.5 5
2 4 3.5 3.75
3 3.5 4 3.75
4 3 4 3.5
5 5 3.5 4.25
6 5.5 5 5.25

Figure 4-5. Comparison between question averages for the presence questionnaire.











Co-presence Questionnaire Summary
Question 1st Study Average 2nd Study Average 3rd Study Average Total Average (2nd & 3rd Studies)
1 3.5 5.5 4.5 5
2 4 5.5 4.5 5
3 3.5 5 5.5 5.25
4 1 1.5 1 1.25
5 3 5.5 3.5 4.5
6 1.5 2 1 1.5
7 1.5 2.5 1 1.75
8 3 6 5 5.5
9 5 4.5 4.5 4.5
10 2.5 3.5 4 3.75
11 3 4.5 4.5 4.5
12 1 2.5 4 3.25
13 5.5 5.5 4.5 5
14 5.5 6 6 6
15 7 7 7 7

Figure 4-6. Comparison between question averages for the co-presence questionnaire
showing improvement from the first study to the second and third studies.















CHAPTER 5
CONCLUSION

5.1 Usefulness to the Acting Community

It has been shown that virtual environments allow multiple actors to successfully

rehearse scenes without the need to be in makeup or costume. The true usefulness of this

system to the acting community lies in the fact that it can bring actors together from two

remote locations for an engaging acting experience. A fully developed virtual rehearsal

system could save actors a significant amount of time and money. The system, however,

is far from being fully developed.

5.2 Future Work

The distributed virtual rehearsal system has many areas that can be improved.

The depth of an actor's experience in a virtual rehearsal is greatly affected by how

realistic their interaction is. Realistic interaction is achieved by making the digital

characters' movements as life-like as possible.

One main complaint from the study participants was that the character they were

facing lacked expression. Implementing interactive facial expressions would be costly

but would dramatically increase the realism of the experience. In Acting in Virtual

Reality, simple mouse strokes were used to change the character's expression [3],

however that solution isn't plausible if the actor is to remain wireless (as they are in the

virtual rehearsal system). Another solution would be to incorporate a third web-camera

into the system that would provide images of the actor's face to a PC that could detect










changes in facial expressions. The third and easiest solution would be to give the system

operator control over the character's facial expressions. The drawbacks to this solution

are operator subjectivity and that the operator would have to be within visual range of the

actor.

The other main complaint from the study participants was the limited number of

tracked body parts. More tracked areas would have increased realism, although only 3

tracked areas were needed for the scene. The shoulder straps were used during system

calibration but weren't actively tracked during the rehearsal. Adding shoulder tracking

could have allowed for torso manipulation, which would have been especially useful

when the actors wanted to lean forward.

Orientation tracking, while not specifically mentioned by the study participants,

would have greatly affected character realism. This would allow the characters to look

left and right as well as rotate their hands. Using two colored straps to determine the

direction of the vector that passes through both straps could approximate head orientation

tracking, as sketched below. Hand orientation would be much more difficult since there are several axes of

rotation.
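
A minimal sketch of the head-orientation idea, assuming one strap on the forehead and one at the back of the head (the strap placement and axis conventions are assumptions):

    // Sketch of head-orientation estimation from two straps. Strap placement
    // (forehead and back of head) and axis conventions are assumptions.
    #include <cmath>

    struct Vec3 { float x, y, z; };

    // Yaw is rotation about the vertical axis; pitch is up/down tilt (radians).
    void headOrientation(const Vec3& frontStrap, const Vec3& backStrap,
                         float& yaw, float& pitch) {
        float dx = frontStrap.x - backStrap.x;
        float dy = frontStrap.y - backStrap.y;
        float dz = frontStrap.z - backStrap.z;
        yaw   = std::atan2(dx, dz);
        pitch = std::atan2(dy, std::sqrt(dx*dx + dz*dz));
    }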

Automated accurate lip-synching is another aspect that would have a significant

effect on the user's sense of presence. For this to work, the actor's audio stream would

need to be analyzed in real-time. This would be difficult to implement and

computationally expensive.

The ideal system would not only track gestures and facial expressions, but allow

the actor to move freely around the stage. This could be achieved using a modified

CAVE system or a head-mounted display.










5.3 Future Applications

Motion capture systems are typically used to capture an actor's movements and

later add them to a digital character. Virtual rehearsals could be modified to record the

actors' movements as they rehearse their scene. It would then essentially be a real-time

motion capture system. The recorded movements could then be played back for the actor

to review or they could be sent directly to an animation package for the purpose of

rendering a digitally animated movie.

A virtual film director system could also be added to the virtual rehearsal system.

The virtual director could plan out camera angles, arrange and modify props, start and

stop the action, and direct the actors' movements. The director could be represented by a

digital character or simply watch the action from a monitor, speaking through a virtual

speaker.

Distributed virtual performances are another plausible extension of the virtual

rehearsal. This would introduce audience systems into the distributed virtual rehearsal

paradigm. While several actors perform the scene from separate locations, an audience

can watch the action unfold from a third-person point of view. Allowing a director to

control the camera angles would further enhance the experience by providing the

audience with cinematic visuals.























APPENDIX
STUDY QUESTIONNAIRES

A.1 Co-presence Questionnaire

Part A: Personal Information

Your Given ID number
Your Age
Your Gender O Male O Female
Undergraduate Student O
Masters Student O
PhD Student O
Occupational status Research Assistant/Fellow O
Staff (systems, technical) O
Faculty O
Administrative Staff O
Other O
Please state your level of computer literacy on a scale of (1...7)
(never used before) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (a great deal)
Have you ever experienced 'virtual reality' before?
(never used before) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (a great deal)

Part B: Virtual Reality Experience

Please give your assessment as to how well you contributed to the successful
performance of the task.

My contribution to the successful performance of the task was...

(not at all) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (a great deal)

Please give your assessment as to how well the other person contributed to the
successful performance of the task.

The other person 's contribution to the task was...

(not at all) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (a great deal)












To what extent were you and the other person in harmony during the course of the
experience.

We were in harmony...

(not at all) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (a great deal)

To what extent did you feel embarrassed with respect to what you believed the other
person might be thinking about you?

I felt embarrassed...

(not at all) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (a great deal)

Think about a previous time when you co-operatively worked together with another
person in order to achieve something similar to what you were trying to achieve
here. To what extent was your experience in working with the other person on this
task today like the real experience, with regard to your sense of doing something
together?

This was like working together with another person in the real world...

(not at all) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (a great deal)

To what extent, if at all, did the other person hinder you from carrying out the task?

The other person hindered me from carrying out this task...

(not at all) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (a great deal)

To what extent, if at all, did you hinder the other person from carrying out the task?

I hindered the other person from carrying out this task...

(not at all) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (a great deal)


Part C: Virtual Reality Experience Continued

Please give your assessment of how well you and the other person together performed
the task.

We performed the task successfully...


(not at all) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (a great deal)











To what extent, if at all, did you have a sense of being with the other person?

I had a sense of being with the other person...

(not at all) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (a great deal)

To what extent were there times, if at all, during which the computer interface
seemed to vanish, and you were directly working with the other person?

There were times during which I had a sense of working with the other person...

(not at all) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (a great deal)

When you think back about your experience, do you remember this as more like
just interacting with a computer or working with another person?

The experience seems to me more like interacting with a person...

(not at all) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (a great deal)

To what extent did you forget about the other person, and concentrate only on doing
the task as if you were the only one involved?

I forgot about the other person...

(not at all) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (a great deal)

During the time of the experience did you think to yourself that you were just
manipulating some screen images with a mouse-like device, or did you have a sense
of being with another person?

During the experience I often thought that I was really manipulating some screen
images ...

(not at all) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (a great deal)

Overall rate the degree to which you had a sense that there was another human
being interacting with you, rather than just a machine.

My sense of there being another person was...

(not at all) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (a great deal)





If you had a chance, would you like to meet the other person?

(not at all) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (a great deal)

Assess the mood of the other person on the basis of very depressed to very happy.

The mood of the other person seemed to be happy...

(not at all) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (a great deal)

Please write any additional comments here. Things you could consider are:

Things that hindered you or the other person from carrying out the task; what you
think of the person you worked with; and any other comments about the experience
and your sense of being there with another person. What things made you "pull
out" and more aware of the computer...


A.2 Presence Questionnaire

1. Please rate your sense of being in the environment, on the following scale from 1
to 7, where 7 represents your normal experience of being in a place.

I had a sense of "being there" in the environment...

(not at all) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (a great deal)

2. To what extent were there times during the experience when the environment
was the reality for you?

There were times during the experience when the environment was the reality for me...

(at no time) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (Almost all the time)

3. When you think back about your experience, do you think of the environment
more as images that you saw, or more as somewhere that you visited?

The environment seems to me to be more like...

(Images that I saw) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (Somewhere I visited)










4. During the time of the experience, which was the strongest on the whole, your
sense of being in the environment, or of being elsewhere?

I had a stronger sense of...

(Being elsewhere) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (Being in the environment)

5. Consider your memory of being in the environment. How similar in terms of the
structure of the memory is this to the structure of the memory of other places you
have been today? By 'structure of the memory' consider things like the extent to
which you have a visual memory of the environment, whether that memory is in
color, the extent to which the memory seems vivid or realistic, its size, location in
your imagination, the extent to which it is panoramic in your imagination, and other
such structural elements.

I think of the environment as a place in a way similar to other places that I've been
today...

(not at all) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (very much so)

6. During the time of the experience, did you often think to yourself that you were
actually in the environment?

During the experience I often thought that I was really existing in the environment...

(not very often) 1 O 2 O 3 O 4 O 5 O 6 O 7 O (very much so)














LIST OF REFERENCES


1. Dahl, S., "Kinesics," Business School, Middlesex University, 2004. Retrieved 14
Mar. 2004.

2. Ryan, D., "Semiotics," School of Arts and Sciences, Australian Catholic University,
2003. Retrieved 14 Mar. 2004.

3. Slater, M., Howell, J., Steed, A., Pertaub, D-P., Garau, M. and Springel, S., "Acting
in Virtual Reality," ACM Collaborative Virtual Environments, CVE'2000, 2000.

4. Mortensen, J., Vinayagamoorthy, V., Slater, M., Steed, A., Lok, B. and Whitton,
M.C., "Collaboration in Tele-Immersive Environments," Proceedings of the Eighth
Eurographics Workshop on Virtual Environments, 2002.

5. Oliverio, J., Quay, A. and Walz, J., "Facilitating Real-time Intercontinental
Collaboration with Emergent Grid Technologies: Dancing Beyond Boundaries,"
Paper from the Digital Worlds Institute, 2001. Retrieved 9 Aug. 2004.

6. Steed, A., Slater, M., Sadagic, A., Tromp, J. and Bullock, A., "Leadership and
Collaboration in Virtual Environments," IEEE Virtual Reality, Houston, March
1999, 112-115.

7. Brooks, F.P., "What's Real about Virtual Reality?" IEEE Computer Graphics and
Applications, Nov./Dec. 1999.

8. Cruz-Neira, C., Sandin, D.J. and DeFanti, T.A., "Surround-Screen Proj ection-Based
Virtual Reality: The Design and Implementation of the CAVE," Computer
Graphics (SIGGRAPH) Proceedings, Annual Conference Series, 1993.

9. Bowman, D.A. and Hodges, L.F., "An Evaluation of Techniques for Grabbing and
Manipulating Remote Objects in Immersive Virtual Environments," Symposium on
Interactive 3D Graphics, Apr. 1997.

10. Sutherland, I.E., "A Head-mounted Three Dimensional Display," Proceedings of
the AFIPS Fall Joint Computer Conference, Vol. 33, 757-764, 1968.

11. Polhemus, "FASTRAK: The Fast and Easy Digital Tracker," Colchester, VT, 2004.
Retrieved Apr. 2004.










12. InterSense, "InterSense InertiaCube2," Bedford, MA, 2004. Retrieved Apr. 2004


13. 3rdTech, Inc., "HiBall-3000 Wide Area Tracker and 3D Digitizer," Chapel Hill,
NC, 2004. Retrieved Apr. 2004.

14. Jackson, J., Lok, B., Kim, J. Xiao, D., Hodges, L. and Shin, M., "Straps: A Simple
Method for Placing Dynamic Avatars in a Immersive Virtual Environment," Future
Computing Lab Tech Report FCL-0 1-2004, Department of Computer Science,
University of North Carolina at Charlotte, 2004.

15. Thalmann, D., "The Role of Virtual Humans in Virtual Environment Technology
and Interfaces," in Frontiers of Human-Centered Computing, Online Communities
and Virtual Environments, Springer, London, 2001, 27-38.

16. Slater, M. and Usoh, M., "Body Centered Interaction in Immersive Virtual
Environments," in N. Magnenat Thalmann and D. Thalmann (eds.) Artificial Life
and Virtual Reality, John Wiley and Sons, New York, 1994, 125-148.

17. Garau, M., Vinayagamoorthy, V., Slater, M., Steed, A. and Brogni, A., "The Impact
of Avatar Realism on Perceived Quality of Communication in a Shared Immersive
Virtual Environment," Equator Annual Conference, 2002.

18. Slater, M., Sadagic, A., Usoh, M. and Schroeder, R., "Small Group Behavior in a
Virtual and Real Environment: A Comparative Study," presented at the BT
Workshop on Presence in Shared Virtual Environments, June 1998.

19. Vacchetti, L., Lepetit, V., Papagiannakis, G., Ponder, M., Fua, P., Magnenat-
Thalmann, N. and Thalmann, D., "Stable Real-Time Interaction Between Virtual
Humans and Real Scenes," Proceedings of 3DIM 2003 Conference, 2003.

20. Haptek Inc., Santa Cruz, California, Sept. 2003. Retrieved 9 Aug. 2004.

21. UGS Corporation, "E-Factory: Jake," Plano, TX, 2004. Retrieved Apr. 2004
.

22. Boston Dynamics, "DI-Guy: The Industry Standard in Real-Time Human
Simulation," Cambridge, MA, 2004. Retrieved Apr. 2004


23. VQ Interactive, Inc., "BOTizen: The Power of Interactivity," Selangor
Malaysia, 2003. Retrieved Apr. 2004 .










24. Tsai, R., "A Versatile Camera Calibration Technique for High-Accuracy 3D
Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses," IEEE
Journal of Robotics and Automation, Aug. 1987, 323-344.















BIOGRAPHICAL SKETCH

George Victor Mora was born in Miami, Florida, on March 29th, 1980. He spent

the first 18 years of his life in South Florida. His obsession with art and technology

began at an early age. During high school, he focused his attention on art and computer

science classes. Upon completing high school, he moved to Gainesville, Florida, to

attend the University of Florida.

In August of 2002, George finished his undergraduate degree in computer

science. He returned to the University of Florida the following semester as a graduate

student in the newly formed digital arts and sciences program in the College of

Engineering. For the next two years, George focused on virtual environments and digital

media both through his school work and as an employee of the Digital Worlds Institute.

In December of 2004 George will receive his Master of Science degree in digital arts and

sciences.