
On the Formulation of Inertial Attitude Estimation Using Point Correspondences and Differential Carrier Phase GPS Positi...

DAITSS ingest report: package UFE0004565_00001, IEID E20110113_AAAABF, ingest time 2011-01-13T09:33:33Z (account UF, project UFDC).
72928e79dd928414c793cd13c2829a4d27efda3c
932920 F20110113_AAASHF rosengren_s_Page_042.jp2
b03ccaa54e8dd84504290663899cd46a
917b721d05e665ea755824903c98d518a6fa197a
1051982 F20110113_AAASGR rosengren_s_Page_024.jp2
128ad75c6679b5c80ed7199f4dcac6b5
0742b7d8119b2d3e7b790779b0468b28556c8629
1004273 F20110113_AAASHG rosengren_s_Page_043.jp2
4c21402575ff08ed676c1a6745da40cd
990996ef1f7679187d9019b02550c560eebead3e
F20110113_AAASGS rosengren_s_Page_025.jp2
410940f7e53025cc6695fb8d464a4ebd
ee1106a53d61ad08c9b97689eda10b1ef66ae2b8
882055 F20110113_AAASHH rosengren_s_Page_044.jp2
bef5408815977c30dcba6d40a19236e4
01992448eb031f8eff1866849990c453269112e5
1051983 F20110113_AAASGT rosengren_s_Page_026.jp2
c5fd74db9eb8c8ae460fc373b50fe44e
8d05ceab03e05f7d8e6d42531e27b1ee8c8a6046
671482 F20110113_AAASHI rosengren_s_Page_045.jp2
e7884e7add68061184fbc9083c617fe9
d8bfb74d0695061a50dd4eeda81410b8fb776f44
832031 F20110113_AAASGU rosengren_s_Page_027.jp2
fc7c629ac65c500db497c7765597829e
6e75258ffbde03f3e435942427ddd5c92b5c211a
985003 F20110113_AAASHJ rosengren_s_Page_046.jp2
3b5807135e17bb7a725ccf1592728e8f
1000f4988118a2e97aebf01cb6016767377b7c45
1051932 F20110113_AAASGV rosengren_s_Page_028.jp2
fc871e10c257f6c055aa81cc4bee9bb0
da9db04a29b382106b3d17c74307191657ef4a1a
655916 F20110113_AAASHK rosengren_s_Page_047.jp2
b778cf2b8cffa7c72447999013f441f6
cdc70dc71db89d9b91a27b97189eea359e150b65
1051944 F20110113_AAASGW rosengren_s_Page_029.jp2
fde4263e03de6d3c6445b07fde713558
4f2ea08501c0116430fcf0204e7744639cb53f46
770846 F20110113_AAASHL rosengren_s_Page_048.jp2
a067fbee5e780216d3c9f69e9fdfb3e1
a3e2521e8dce3e6b819f18ecce571265b360eb78
1051914 F20110113_AAASGX rosengren_s_Page_031.jp2
9861012a5b8d01e4a323d4c38af8603a
3074e22dbf25f681565cc102ff0c37117f7ee667
55508 F20110113_AAASIA rosengren_s_Page_065.jp2
99031efe71468e66cae61bbd99adeedd
aad3108339f096f9319a12f57d15cf08969bf269
55109 F20110113_AAASHM rosengren_s_Page_050.jp2
f352f72ba57e7da4672f924896909204
22e3e79b7edf5fde98930c2e2d35e35e851d4140
733026 F20110113_AAASGY rosengren_s_Page_032.jp2
17dd78d933d5345463c2e3d61d723f9b
9067064abb23d05b9877dbbcb0bf4dc702996f56
941291 F20110113_AAASIB rosengren_s_Page_066.jp2
0e562cbc2308bba594de6e0a8d355f01
ba9f946309f6e3558bc917aeffa7e4e9ecce15a8
582201 F20110113_AAASHN rosengren_s_Page_051.jp2
be33381b222ba62519c84f94d65bf167
78d26c0baaede89a36beae013d5a96cda469d6ae
730332 F20110113_AAASGZ rosengren_s_Page_033.jp2
0cabd7b958be267b3d4b2707498b2e26
fab15106bbb29dbbe7600e2762c8ca1990267228
459491 F20110113_AAASIC rosengren_s_Page_067.jp2
da4f76cefe25616a90521192487fea75
244574e0c34a18364f42a97657cbcd71dbc0f89f
640849 F20110113_AAASHO rosengren_s_Page_052.jp2
9c7cdd1c93eb7179a799245c95f195a3
d72d8b96aef23c8423d76b9b8758e3ccb55a5931
582983 F20110113_AAASID rosengren_s_Page_068.jp2
cf7f92ec8a422e2ee7420937562a579f
9c02e1f8131b65deb4a8c1e3e4c529286cdb9b6b
F20110113_AAASHP rosengren_s_Page_053.jp2
e49388efe5a78317bf6092c363598afd
2c0363a8f1f5fe766868c1254a27a074a5d7c82e
672303 F20110113_AAASIE rosengren_s_Page_069.jp2
911e23d71fc2e7461cb15c5a2f926cf3
0b67c7844f9d415f507d7ec2a6a32dc3a0e7a9f3
863555 F20110113_AAASHQ rosengren_s_Page_054.jp2
8899ed8d811caad671428ea3e0e7bb69
d47b7f38299719f98d4f3c8ec1b1f3777d561b60
902454 F20110113_AAASIF rosengren_s_Page_070.jp2
5eb6dc4c4bd0bafeafe7f42e4b35c832
23c968b721b78e92543ecbbc3b1b4a015f66bccf
1036592 F20110113_AAASHR rosengren_s_Page_056.jp2
b7e1358f1a6dceb70868651ae1f039ad
c0bcf6412866c54a731454ef359c6569676838cd
57111 F20110113_AAASIG rosengren_s_Page_071.jp2
3aa79470145f6bd981d469a7b3dc41d8
8698de911c34798c9c46afb82bc06c922297f54a
F20110113_AAASHS rosengren_s_Page_057.jp2
1234a1e4aef1e06aa2a010026296669f
80eecc66e49491340bdfd69ff692d0425ef930c1
842498 F20110113_AAASIH rosengren_s_Page_072.jp2
4cdfb436d1e34a100b914714ac892e07
216225ad38b88444ffd01fee8a45eb0185d4bfb9
906860 F20110113_AAASHT rosengren_s_Page_058.jp2
d7e1138ba9ef10e9049909bd66b5aa1d
afd8cda49dd0580f3ef386df8ebf90643045d061
436140 F20110113_AAASII rosengren_s_Page_073.jp2
a3936a4ca3f3fd2a054fbce15434308b
ac9d6c77b3dd67689aa7e050ec5e450b3c66dc7f
1051571 F20110113_AAASHU rosengren_s_Page_059.jp2
9a3b3661c6b4a55c8aa15e743c636f18
14bd6b62999395c14538bc480be4ff9627de87c0
515545 F20110113_AAASIJ rosengren_s_Page_074.jp2
fc898b70343108f59b50f609487e936d
60340944141c50ee528aad25562bb397d741c4b7
55024 F20110113_AAASHV rosengren_s_Page_060.jp2
d35a5106767e0d3670bd8e3948556e6c
c7db16b8cbf0c40af9a8569418abbe755a6b0a2f
621625 F20110113_AAASIK rosengren_s_Page_075.jp2
19b24242dac37a601ce450baa7285791
2574efa707f0cfe6db9266ded81fed108dda4795
862546 F20110113_AAASHW rosengren_s_Page_061.jp2
7bd17c4701823511b090c3ae6daaf0f3
f907296cc8a1fe5527af982e43d8d7d1a10c8f04
675234 F20110113_AAASIL rosengren_s_Page_076.jp2
996f180585fdb2c50d81735a2ac82abc
540eff8f872291b390a15cb6481545b7a024f525
801461 F20110113_AAASHX rosengren_s_Page_062.jp2
637ba615a38d2bcc591b6d36510003bb
fe9f6dbfcbcd17ca1f3a105ef09031aa6421a92d
653344 F20110113_AAASIM rosengren_s_Page_077.jp2
de2a37d0c5b5be654a82282aeff5b817
e19dd715015fab23c33051ad49d1702918847ff6
576167 F20110113_AAASHY rosengren_s_Page_063.jp2
65ce40d2f7b77aaa5d30a6ea3088234f
25631dc94e5a4bdb3ffd2f7d8f39d4fe03770d0d
55069 F20110113_AAASJA rosengren_s_Page_091.jp2
9745c339663a12ecd53b0676236f429f
369c558b6ccec6436e879f574cf027af75acb7dd
432357 F20110113_AAASIN rosengren_s_Page_078.jp2
1675e3930d99003e6194b63530c126c7
a890f0e330e2118662ec044ca87339d126ce3655
509239 F20110113_AAASHZ rosengren_s_Page_064.jp2
684e540b62fa3349c1b075a169fe21a3
5c25c006002529da4a0d0dfb43dbf37c37ef21bb
1029472 F20110113_AAASJB rosengren_s_Page_092.jp2
359bb74699e068c47a53c092ad7e8631
a5fd68e3c807ca9f7d0ea3dc8033533646c8199d
415547 F20110113_AAASIO rosengren_s_Page_079.jp2
7ce766c44ab883eb84f029086d642c03
84491e48c7a559fe2d159bd4f7f653af8fb915b7
556765 F20110113_AAASJC rosengren_s_Page_093.jp2
efbafc982fadac590250ea8b0deb2614
d86618bfe93450a4c178bb3ac567b279fe293687
55891 F20110113_AAASIP rosengren_s_Page_080.jp2
d634942e124f1bec878b72fb45f37fe8
3dc47485ee639373abe22a9b367eac2cb4f7d566
55515 F20110113_AAASJD rosengren_s_Page_094.jp2
69c2b687183d6c87484aa38acbc51754
12db7e71508041227fec0b099a856b8fecf4cd90
785322 F20110113_AAASIQ rosengren_s_Page_081.jp2
9580fbff50583b2cdaf38ccf94a64797
23f2b1d5bb1cc3b8541e6853a1e53eb1102401e8
538496 F20110113_AAASJE rosengren_s_Page_095.jp2
fc058e4b67ce23248dc77764cfd68554
16d0f982e9f97d87184c98634be1e6ce3f186e4d
830089 F20110113_AAASIR rosengren_s_Page_082.jp2
29732393eba33733fb9a2829e7eeaaf2
660d02fef14b00436339e6db23cb6ef77dbaf11b
466648 F20110113_AAASJF rosengren_s_Page_096.jp2
05d1058c36548fe86f225e6df9be5289
b3723eb5b86cfd1b8487f3cef0a859387606a43f
651178 F20110113_AAASIS rosengren_s_Page_083.jp2
d5886c072a4d5b2002118381608d64a8
a3c65dddbbaf5cc0d6e11a77508840e5643789ed
1051918 F20110113_AAASJG rosengren_s_Page_097.jp2
32de1ff4dcbdc9d58fbf2bf99c89155e
7c2bd3eee9a39f8aea0c458f8fabcfe41fe4953a
1051986 F20110113_AAASIT rosengren_s_Page_084.jp2
e188ae4ae9c57a05b7fcefbfda2b6f2f
fbe7000621025af314598a17e374f69357b14492
1051954 F20110113_AAASJH rosengren_s_Page_098.jp2
a823dd2b25cb2c0dbfc6dcedbbc8038c
e17f30abe0342e2c3ff259bca268de2e0f8769e3
411732 F20110113_AAASIU rosengren_s_Page_085.jp2
a9d483072b174cc4a21095931b2661a8
80fdfbbe513f1d58daa4792acf7a1ae16fcbd646
527423 F20110113_AAASJI rosengren_s_Page_099.jp2
03f89c9883e831f85280deb39a07255d
ebdccf37ef3f23e0b798e3770b5faba2f457611a
467710 F20110113_AAASIV rosengren_s_Page_086.jp2
3379b69b328983f81bac37dfc0447ae9
61a0ed8cec3302bf186c6dac34c2e528bad2339b
63412 F20110113_AAASJJ rosengren_s_Page_100.jp2
d2e44da193966911ed4fd32649b92bc8
11c91de74122ad391902c9bd43d8fe87dd42b723
F20110113_AAASIW rosengren_s_Page_087.jp2
fe1004bd061c63fd8d96e3609b7954bf
314efcca61cca134c709e2127b90be548d8ff80f
121547 F20110113_AAASJK rosengren_s_Page_101.jp2
f28643e33bd091ffec131f2f1657afb7
e2fcbaa3ae54894d8aceaa6a8c90934212f4c752
662467 F20110113_AAASIX rosengren_s_Page_088.jp2
e8a94d05b04dffe3683ed70bc6e6cbfb
5b9647bc66216bd235c2004e30f8c4660966b15f
13738 F20110113_AAASJL rosengren_s_Page_102.jp2
2f991894cb69ae7d04513721b69eca00
09c5d66e541c56b059ef50e9cb1a4886e956c2c0
658772 F20110113_AAASIY rosengren_s_Page_089.jp2
521ed24c4f90d87a2b52eb113d39687c
4e4364b69c4718090779d86ca799ae9998aeb02f
F20110113_AAASKA rosengren_s_Page_017.tif
171eb3336d164149831c84ea7e49e446
addb2feb1cb690b698d134b6c7b2903ad85874b9
111301 F20110113_AAASJM rosengren_s_Page_103.jp2
7c7231ce1e7cdccd57435afc3f1c197d
f5141a3d0271c82e15bfeebb2c3d52220e95ab7e
360694 F20110113_AAASIZ rosengren_s_Page_090.jp2
ac673bd8877ecc6a5fac27eb33fb0490
30e830bc0a880328cea6221d310b92f77b1e8997
F20110113_AAASKB rosengren_s_Page_018.tif
4ee1e86c327a81a33cce9d51ceec0ffe
6b40d19b64f18f66c3eef4369a22338eec4fae42
F20110113_AAASJN rosengren_s_Page_001.tif
e542e7b62f8242045ea8b2a9b384b634
730b896d32dfa5cf46d0e2bf238c0b03da1acc97
F20110113_AAASKC rosengren_s_Page_019.tif
e2c031250744c928815cb8498cd203b3
a0a8ac37ee5d6118451e171312da61c5f3d7ecdb
F20110113_AAASJO rosengren_s_Page_002.tif
01d458420797b8199090cd3b0e04b84e
99fd5e505c075668563ffd336e388aa59007628f
F20110113_AAASKD rosengren_s_Page_021.tif
9a1db22a7b3c0deaf7f37abcaf785069
7601b3878f7814683de385aaea3e0021b5bbec80
F20110113_AAASJP rosengren_s_Page_003.tif
f7b5756c11677a0874062c8b242f8289
54b6c55173a1693487530276733402dcf6b024c3
F20110113_AAASKE rosengren_s_Page_023.tif
1ec16223a81ddc3623ce5a5dc404bb88
0fb0622573ea2cc86ec4fa481efb5c074acf7e45
F20110113_AAASJQ rosengren_s_Page_004.tif
20692291f61f04972ce371ab5dfc5897
d1194c3a5282238ecbad025de3ae688ce6dfef1b
F20110113_AAASKF rosengren_s_Page_025.tif
29eef9f9c3fa96cd15d827abd8cc84cc
3607bec6e976068617192012bbfaaba92494fd00
F20110113_AAASJR rosengren_s_Page_005.tif
8ee7cbc9c67b794f6613a6aea4393268
960ed772eb758c16897a6bb7c3bac43d0890122a
F20110113_AAASKG rosengren_s_Page_026.tif
ffd6925b1a785e2ce0277763513812de
eafc30c6a39b8289481663519a56536da7f002b9
F20110113_AAASJS rosengren_s_Page_006.tif
9ed4588f687c17139c23732aa631a7c0
1d00b002d39d72645cfc12eb218edc3d47eabfbe
F20110113_AAASKH rosengren_s_Page_027.tif
c40c02c0397cb06cb229464951cb21bf
a43dac822a9d71f4a62e639c7922e3acb508825f
F20110113_AAASJT rosengren_s_Page_007.tif
99d1fbb8a8456f42926b04877308db48
8284bad654581f9ff459ab52cb752c65b40f7a34
F20110113_AAASKI rosengren_s_Page_028.tif
eda980da5bc21a46d17841c0a22aefda
95d1089615f32912ceb9423c9fd4569827fd7505
F20110113_AAASJU rosengren_s_Page_009.tif
9abc20db2799b23e1943caaef1a99c00
2ebd39bed65af0f5e25439f77b8c99267ed50bb8
F20110113_AAASKJ rosengren_s_Page_029.tif
da626ca4e10544405c2e30fdea57c68f
beeeae3e14173e8109e77010194f2aee9d21fa13
F20110113_AAASJV rosengren_s_Page_010.tif
9fba61f2f57f3760a5b891c9cd415626
e0822de880b8a912a2d6da6470946ebd738c8f73
F20110113_AAASKK rosengren_s_Page_030.tif
97048719eae31431deef6fe70ad3ea26
2feeca571536af74ac0a292e19c103c9bdd97165
F20110113_AAASJW rosengren_s_Page_011.tif
110b22920eba1f19c99d7264b8d9dc9a
ac0a26a34bf9b7cf9f67e78cc154fcbc8e36d0e2
F20110113_AAASLA rosengren_s_Page_049.tif
390b94747c13397e6fddd8e132c55271
cea83bc031776152c08f3fa8c58c57f4db059ca5
F20110113_AAASKL rosengren_s_Page_032.tif
4a09ca25f277a2287fcc3e9594fd596c
d8336259575d6b2e7a8d6cce9a6f41d8c3b224c3
F20110113_AAASJX rosengren_s_Page_014.tif
943b636f2eb59731af3f19abd98898dd
6dd56f4cf45a3948ed7e4caecede9f6010874928
F20110113_AAASKM rosengren_s_Page_033.tif
eff65c532e6488c711e5a478b532670a
252a1588fd084f4e2aafa16487380235dde4ef20
F20110113_AAASJY rosengren_s_Page_015.tif
e4f7fb16e42f88266df82cf7f1c5ed89
28b32a4bd38f1655b67fc1f3560598b1b2d33de4
F20110113_AAASKN rosengren_s_Page_034.tif
ed09ced046d04d2977a7fd48c0371572
86b32859fbfdf02e59a5bb29835887cd509b81b0
F20110113_AAASJZ rosengren_s_Page_016.tif
251406b51174f6dc1b22bf14e2d662e1
0fcae8b1c385d0cbb4fb8737998f0b9b5121fe3c
F20110113_AAASLB rosengren_s_Page_051.tif
69782e644bd1e8787378b5f5b54edb86
33b832f719dd534efd9d3ee260d89971e88202ac
F20110113_AAASKO rosengren_s_Page_036.tif
f997ba7b445db67860c73ad32ec177d6
6f6c334367b61109cb46c8e3085bab9a53c48e20
F20110113_AAASLC rosengren_s_Page_052.tif
46d51fa61cde1bfcc08b657b001876b6
1547d3b5523059cd7fb4582595583ff8b020d33f
F20110113_AAASKP rosengren_s_Page_037.tif
84aec2b59301cc2f7e24480586d55b53
047fe8644e55c3f6fae9dace45c6765f49be4c4b
F20110113_AAASLD rosengren_s_Page_053.tif
edd6ab3a13492ffd85b7c12f56cb98b1
c1878bfc4c965bcc0a9526386fe75bace7788283
F20110113_AAASKQ rosengren_s_Page_038.tif
a17326613e54fbedee51e1d3c5fc1af6
33946ca193cf73555e888d9c592c26b69c35e09c
F20110113_AAASLE rosengren_s_Page_054.tif
ea192b5f07e972d256e1c2261953e942
1432f4d765c82a62fe284c1debaf655ce55c2459
F20110113_AAASKR rosengren_s_Page_039.tif
ea8fe435ae192816aae73e2dc72c3373
7d5634f7d8472bf0cd468a4e236ed77be7082d00
F20110113_AAASLF rosengren_s_Page_055.tif
1f75ba39124dea83c85af7928fcbdfeb
f8bbdd7d7ce23b2037d785f7fc5f97cf49776569
F20110113_AAASKS rosengren_s_Page_040.tif
62de2f471f2c0be4639efb43143b86f3
00f61fec98fa12b2fcaa9e70f4fcab97220430fc
F20110113_AAASLG rosengren_s_Page_056.tif
83764917de3722f3f3a305175df2bfc1
48049fac98cb2b0c873fd152348c94cc29b58d58
F20110113_AAASKT rosengren_s_Page_041.tif
c08801e1f1afc7df881012b2a9c28111
5886f887bf398a74f9185fa02c54727fa361b4e7
F20110113_AAASLH rosengren_s_Page_057.tif
46fabd6fef9a56929a0459bbc4bb4e53
697c8484005f4756da2fd4e0294f2ef4d1afbda5
F20110113_AAASKU rosengren_s_Page_042.tif
2c5bf178205354deb163c1ac49e757ce
d422758ac3b59e8927fe6537565be3c43632141b
F20110113_AAASLI rosengren_s_Page_059.tif
9369aa4461edfcb0039377459a56ca4b
6425ddae29309172049f17d5f58a5660ae33c0a7
F20110113_AAASKV rosengren_s_Page_043.tif
5b0e2f21a14a125db0d698500967ab9e
8a2ec32966d839ad4f167dcd6748b8f4a6db3958
F20110113_AAASLJ rosengren_s_Page_061.tif
71316661a8c181b6afd8c5ff18b734f4
cf3a33baae9d980e6866d37a6d68866dfa2e0278
F20110113_AAASKW rosengren_s_Page_044.tif
d0f5ab942311ee559d3a029b099cb9d9
74b5c11d743f02c5b38e7beb14257ea172c5b3e8
F20110113_AAASLK rosengren_s_Page_062.tif
7bd7f4d2e27594e48ed71bebbdddf08f
c707fc95273ff16c9e220d38dfd2deff5bfe616a
F20110113_AAASKX rosengren_s_Page_045.tif
0d8e55748b4fb72a651da89c7d150829
9188289e5e00afaae4d64588bf56636e9467aa27
F20110113_AAASMA rosengren_s_Page_078.tif
109643b4fcff7278e7e3c1cd6f6d82e8
789e22a2b9ff1b486211304ce0a40884f408ca42
F20110113_AAASLL rosengren_s_Page_063.tif
a3bda8aaa742c7b1982f4eb361c45008
5a61ce9f6787325bfc88f527990dcb5fd52f8d10
F20110113_AAASKY rosengren_s_Page_047.tif
4867fd63411dcb2e7669e9448fa2dcde
a8dc1e2ba68a439d07a0c92820b8f301f0daac69
F20110113_AAASMB rosengren_s_Page_079.tif
d58983720b5feb944863c800c7bda0df
1a4d9434fe77dd44602143a8c2b767aa1dc052d5
F20110113_AAASLM rosengren_s_Page_064.tif
db7249f92d14b696969cde3a1da77e5b
733a1e3396a99f1c0b5056b5212d745560718c1b
F20110113_AAASKZ rosengren_s_Page_048.tif
adb21f5d657f75544471aeccc50eb803
4c1ac8be49d849548730eeb06b098633b1337a1f
F20110113_AAASLN rosengren_s_Page_065.tif
7f688e20b22e5024fded3a837058eeec
7a7e31559734ea0cfa6f8f87692c2586da4a6a99
F20110113_AAASMC rosengren_s_Page_080.tif
69d51c61c2c19092d691aab5628a95fe
e791dbefbcab9bccfed9913972b76f6011baaa94
F20110113_AAASLO rosengren_s_Page_066.tif
1f2d8f1e67584c089328632040b85596
ee429001d1faa265ac7539df8ff7ee6e203351bd
F20110113_AAASMD rosengren_s_Page_081.tif
a6a0ef5aa40b7c67eb961045029b2114
c17f51a072ca00bb4e377a1ba98e97bbebb0b5ed
F20110113_AAASLP rosengren_s_Page_067.tif
94fcbac32ad6e372fdf51db81da440e9
814f4cd4973a28a8c341dd9575fa258ede7280a3
F20110113_AAASME rosengren_s_Page_082.tif
7a5d28a3ec3983b80ecea8e22d92db61
f7c395afc54c690e0de5a9b115b8424bbed0ad6d
F20110113_AAASLQ rosengren_s_Page_068.tif
eedf35793eceeb4b6c034579b3205553
d2d2d11d6fc584d5c9b431d9696ba00f0dfc616e
F20110113_AAASMF rosengren_s_Page_083.tif
be20c8bf33833c9fbae371cf529958a6
31b5adca3d8765571358ff9c3672bfe4a8179e57
F20110113_AAASLR rosengren_s_Page_069.tif
68a7dca11ecbd879f66690295cd37b57
5e4be20163917474b458eba36129d2cda2f2cb80
F20110113_AAASMG rosengren_s_Page_084.tif
2b26521589b574ae97b4b00497906afb
0ef8898de1ee469cf080ce4fabf3c3a805d0234a
F20110113_AAASLS rosengren_s_Page_070.tif
91e8df9eb76e9adc9360d7932c79a3ae
07af0e4f3c7086d8312c3205489d72961d298bdb
F20110113_AAASMH rosengren_s_Page_085.tif
29ebc9b8013c9917347d08ac64a378e6
af58a01b183346bb2fcfce286c785c53f78b57f5
F20110113_AAASLT rosengren_s_Page_071.tif
b2b3c6b6e541d7e1ae52daa18b303def
e6bcdb86a28ab97124a1f06e2f3727d0fa09b1e9
F20110113_AAASMI rosengren_s_Page_086.tif
91862eba0b709a674d8cbc20c04c11f5
1cf04659279783669c25a282e56e72d4250af529
F20110113_AAASLU rosengren_s_Page_072.tif
f2473b05675503b6c330be2542878134
c2f44b7df84ea85f05120bbda58d7fcb1d052f1f
F20110113_AAASMJ rosengren_s_Page_087.tif
fa957a341ba1b0fb8871d34f6b416e88
c5127c7a50dbea48ce02d4a36e005637c2d28877
F20110113_AAASLV rosengren_s_Page_073.tif
1ebbbc52ee9316dac3460ff88a96acc4
567bfd0596d03595c3b1b336e67487e86b0dfea2
F20110113_AAASMK rosengren_s_Page_088.tif
de5000dc448ad575b1ed8fd0e7c1a256
e2015752cadd01bc12a861f0338280d47e2f9f6b
F20110113_AAASLW rosengren_s_Page_074.tif
2b7ea63836df4cd4843b3c50ac2763af
c60100ad0250d85311862c9acde3fecc9852d288
79602 F20110113_AAASNA rosengren_s_Page_005.pro
9aab00e670d7e8d9827f04b92e345495
52e1204a088cbdd8bd64aaa4aabdccadbc4b18e7
F20110113_AAASML rosengren_s_Page_089.tif
8d1aba42edd20c964af0a9801587a894
7e4efd7456619e27f7c25bc773bd129bc1e5d23a
F20110113_AAASLX rosengren_s_Page_075.tif
86234c38ce10f8a676211609a045adb0
d0f560f24a291c6d85c1cdbada1fef3599316c9b
44667 F20110113_AAASNB rosengren_s_Page_006.pro
c2cfc71fb6ca4734d96fc8bb025e2eff
75737c60e9752f835f207d881294d98b29a7e6f9
F20110113_AAASMM rosengren_s_Page_090.tif
4ceaa45aa3fa8ca853aae97e70c52147
03a0efd16094bd4d1562fa227f949bc7509c83b1
F20110113_AAASLY rosengren_s_Page_076.tif
7971c6c8e7433205deeac8af9a2089ad
2cad30bf67452282b9a1f02f3a47363093aec0be
30017 F20110113_AAASNC rosengren_s_Page_007.pro
cb56d643cf1d7c0206090110e0124688
27be47b0e99126afc7244bebc058a10ce67e119e
F20110113_AAASMN rosengren_s_Page_091.tif
76e1d489d0f35fc2d8f0ee8ac2355f73
f86ece8b6ab151244fdd98d71ad000518ce7564f
F20110113_AAASLZ rosengren_s_Page_077.tif
b81cd9aaca3df46ba9ae704eaa5f2922
b08caf8df4be9f50d679cb5c0dca064ab9a5fda7
F20110113_AAASMO rosengren_s_Page_093.tif
887507a28dc7f70a4e68a96a67e225d7
893fde4d9445bd5f0df8153df147da093ad8a9d9
74036 F20110113_AAASND rosengren_s_Page_008.pro
7d0ac5f6e1cd5163297cca884282ac61
59f3f196888dfbf2c0c31f112ac9e72d7ace2c2b
F20110113_AAASMP rosengren_s_Page_094.tif
0e600cd71cbfc03521508a5b78e59303
7ddc1ce0a6b10f07a316a34117179cc22dbb27f6
88491 F20110113_AAASNE rosengren_s_Page_009.pro
9ef1df2a1dda7752e82e26598ef44b12
6b0afd288c8014c2c0b5d46b2f7da757aa611e27
F20110113_AAASMQ rosengren_s_Page_095.tif
2d13d5643a39c4f16029d57109cbf68e
fa15043178a2c380bf6213d5172676714706083f
64842 F20110113_AAASNF rosengren_s_Page_010.pro
6e353ef381afb3a5ca178ed5a76bbe07
fdb1cb567a2cd613b943b8431225bb23d5f8fc5e
F20110113_AAASMR rosengren_s_Page_096.tif
445974cb810dcc8ffb3f05d9989fa986
c2eaa60419f00531037a29df0fcdaffd702c1c97
46958 F20110113_AAASNG rosengren_s_Page_011.pro
d79e0417640ce608f957b94ca576ef43
87686c4db06a152db277b9525c8851ed0ad846a3
F20110113_AAASMS rosengren_s_Page_099.tif
5508c345786d85ad26276884ec9f1b08
8c693ece33e64fcda17a734aeb47340d5e18746c
65720 F20110113_AAASNH rosengren_s_Page_013.pro
be4af8130d0d724a70a10bfec891ab62
de01711fa3d669127af913dfba7459aef0db0434
F20110113_AAASMT rosengren_s_Page_100.tif
6bb2890523c1bf42f002751ff2ed9008
55e97b20911cdedcb8b26e1e28b258c4b106a4f0
60708 F20110113_AAASNI rosengren_s_Page_014.pro
8ed7e753062a5d9c8cc1ccffe4d07927
d0da3ecd7559033ebfe583210d61acb4163685ec
F20110113_AAASMU rosengren_s_Page_101.tif
9f4a73790a4ba68c46f1f030a330e543
85ea73b010493f129ef5e590ef3b8fb872d08bba
39951 F20110113_AAASNJ rosengren_s_Page_016.pro
19eef4285590e10bb4251e084d143013
6750a59bf718e421abc2f2fbf2e631a04ca016b6
F20110113_AAASMV rosengren_s_Page_102.tif
c827b9d8e5767b1a6d51ba1ed8b01f5c
b7a7c32c40bb926ebc91edc2425d6700f6d1c5c1
49876 F20110113_AAASNK rosengren_s_Page_017.pro
6fb7e332d4f62a0543e16ecc9771d7e3
8bd61e7b7fb290b9a14ee4c304c5af9a7708c8fb
F20110113_AAASMW rosengren_s_Page_103.tif
e54276f1a29c2c1b704558fc0829a043
4fe9bbba02da4602887dbe77e3ba22fba0e39499
25781 F20110113_AAASNL rosengren_s_Page_018.pro
470eddf067be4654fb781d6575c5dfb4
2b5bed4c32abbdf1f39eb9d8b136622e23105927
10424 F20110113_AAASMX rosengren_s_Page_001.pro
c240b397726dee38a9ff67070120196d
17e69ed9b490fd45f5c51d6ab1e40f108e0a4f72
38703 F20110113_AAASOA rosengren_s_Page_036.pro
69b5f5d4df87b7ac744082ff85dbf696
43ffeaf8ca8935f88bb824e149938a846b7aaa80
62583 F20110113_AAASNM rosengren_s_Page_019.pro
4f75c68c279d0b34dd9e056ed301e3b3
42434b787c194095198bdc0137b503f168b5c81d
1298 F20110113_AAASMY rosengren_s_Page_002.pro
cc27ebc137ddea8311e2861a2dd1a911
bb5c7416e1522d6e1973762c775f74070ecc84ea
47025 F20110113_AAASOB rosengren_s_Page_037.pro
9d398732c15d7a0600b0126f82fb7285
4e60b532127cd2c2dbce0f18975f994a9413da09
68792 F20110113_AAASNN rosengren_s_Page_020.pro
badd494dc904fc624fbb5591855d7627
0b2ea164724aa4babc008f5a82e844c2ca4b621d
1664 F20110113_AAASMZ rosengren_s_Page_003.pro
815eb192b8f03618f3bccbdb481ae24d
3cd2f537e4f360f56070c8df20f002f03fd8ffce
45340 F20110113_AAASOC rosengren_s_Page_038.pro
f4f772a64ed8c9eec658451de33fdec1
0b129354fe9a54fb36cd296da1867d81a59e5c53
3141 F20110113_AAASNO rosengren_s_Page_021.pro
acdbb5687507886494833b6bcf4dbaa6
fc760f0a74e47f7f335b8b11ab06102b1e4158d3
57537 F20110113_AAASOD rosengren_s_Page_039.pro
b3209fa3799d73d6b89296966051cf07
94a065d51d70a4962088ea67a14028e8ef8a9b61
61405 F20110113_AAASNP rosengren_s_Page_022.pro
2b6905f9248edcdb24f98f39714a7ebc
71203772872a8fdf654c461155395e9ad49bf906
57539 F20110113_AAASNQ rosengren_s_Page_023.pro
e6515c00438fb9996acd4f7773c29228
00176baf003084985b93c39cc5a492978d5249d1
24602 F20110113_AAASOE rosengren_s_Page_041.pro
07158c582ea959889ffb98ba6e622927
c71321084220a8723cfbe6015f428aa2fe053baf
62334 F20110113_AAASNR rosengren_s_Page_024.pro
f1357c2f0c375aa791edbdbbb4c0c952
26d8f8971ba33f76aa91b1b2875936f24c765496
54091 F20110113_AAASOF rosengren_s_Page_043.pro
504fc95a70afbe0ada854d0c422ecb26
f10ab693a5a61d07953e3845d2ba8a5c6cb23dae
62608 F20110113_AAASNS rosengren_s_Page_025.pro
82f74a7742553ed2b3d0b126fd47403b
104c3bbc4564a6c489d78b2152d6617f050ad0df
51229 F20110113_AAASOG rosengren_s_Page_044.pro
11e03fb1fdbdfbc25c5d5901b477973e
e95cc89bc8f1cb034bf2d33ccec7928985cedaec
65751 F20110113_AAASNT rosengren_s_Page_026.pro
70d6f028d065cc44ae72000ad1f127a0
f88491cd6e64984155fdfec3c767795ad8eff8f7
36247 F20110113_AAASOH rosengren_s_Page_045.pro
55c094d3d7f2fba36eed10a26a7c3ecc
b1d926d96c65db15dce8e7b837c776a15da19041
61495 F20110113_AAASNU rosengren_s_Page_028.pro
2398a60c6cce94b07a9a3ff43870f106
dc95fb27633f69035496bd897f4f097887dac32c
55666 F20110113_AAASOI rosengren_s_Page_046.pro
8bdd4ffc6f3dff22650bd524621d2522
7cf725486a7ae336831306c3f099367f54b94137
67184 F20110113_AAASNV rosengren_s_Page_029.pro
48c63fe375c5a632b4b85b7c266c602d
c3e48bd301ae743b4aa051c3efae43c9a427f08a
30764 F20110113_AAASOJ rosengren_s_Page_047.pro
9194c327b3e0090641a4c6f84faa5929
940559d331faaee097752003dd2bf81edf1a4f4e
61077 F20110113_AAASNW rosengren_s_Page_030.pro
c1fa54a658be804f097b41c784bd909c
89c0bcf4a83821690329a04a673cad87269daf26
51384 F20110113_AAASOK rosengren_s_Page_048.pro
b485fd52c8023a949425928cad2affe1
7317c9c1be64bd976a09f5e1f40596c1563fc280
58847 F20110113_AAASNX rosengren_s_Page_031.pro
1cb92ac779e4747bca2d6487f79fd63f
e2b6dc88bbd69ffd886348ef7ad6899218cc2411
20082 F20110113_AAASPA rosengren_s_Page_064.pro
e3a3e2715da9a8b5fb7bbeebef796945
1d47fbdf8b2b60cbf85bc0796e0406638ed726cd
53504 F20110113_AAASOL rosengren_s_Page_049.pro
300b41f169f272e7baaddd40204ec1a9
1f53a05ccc93849999bce192edd70d3a01a216b4
36858 F20110113_AAASNY rosengren_s_Page_032.pro
2632ffabb1d07a85a0931a0da642d4db
ef82f0284c1624a4e9de8258711f523ebfec5247
28432 F20110113_AAASPB rosengren_s_Page_065.pro
d1a529c20c21489f2d38321b50168477
841ee1dfadb1be1cc415806db268e893d64d7039
30261 F20110113_AAASOM rosengren_s_Page_050.pro
dfa9dd19db6c3ec0a532fd0edb82d707
6763727da65ee1e5697fecd9f95666f03dbbc9dd
35914 F20110113_AAASNZ rosengren_s_Page_033.pro
e425a3020025ba76158f2df7f9e35b02
51e0ffbc16cb6f8f6b9da6d75d827dcd68501f31
48003 F20110113_AAASPC rosengren_s_Page_066.pro
935bc7338fde650c61d3e3414742dc14
46bfa0a0a2a1f7c932e7e85f873eb9fe98dcd6e5
38130 F20110113_AAASON rosengren_s_Page_051.pro
ce9fb0232269daf75d0c3d4d423ee937
3c8f88c812fc6e4e57369902c566df25b57a1243
13229 F20110113_AAASPD rosengren_s_Page_067.pro
9dd44d09841bb4fd451e93854ff09108
0290adce706570b28505a2487a8de003362bff84
37513 F20110113_AAASOO rosengren_s_Page_052.pro
e184f4b61387c68822471b5fd90cf772
0b62fd8f091791cbbc9e432131df88a2edd85f89
20094 F20110113_AAASPE rosengren_s_Page_068.pro
9c000b7fb105bf98a0c646e939205c18
805a724c126e8750bcb6e4c16e0ec60880479a32
54396 F20110113_AAASOP rosengren_s_Page_053.pro
f8c5a6e9c0aa442f9086e17c17b964c5
dfd7d764ffcfaf888661d41fbb263260ca977dfa
45188 F20110113_AAASOQ rosengren_s_Page_054.pro
fd8d8f8a69f33962496e68288f1fee3f
bed72aa75dfd9a8a32ed8e99c0d4a427023aa752
29258 F20110113_AAASPF rosengren_s_Page_069.pro
de7e15ec2ea7e5230c2d15c42944157c
08afb5c3d1e3fffc65d4ccd57d53fa6285fe761c
46434 F20110113_AAASOR rosengren_s_Page_055.pro
458840e6b41ef66333dadc76e9cfc443
66259d693374f8612c7db9ddfe0753bfa85d6c53
41843 F20110113_AAASPG rosengren_s_Page_072.pro
3e7ded682dfa857a87792a0e9635ea78
d4f2fcfc1fbfb8627156078bb020b84524872190
64123 F20110113_AAASOS rosengren_s_Page_056.pro
ea0f415840959c25887d46a53fbf7502
a2604bf25166d1b88249e9bf18dcc2e082be0100
14556 F20110113_AAASPH rosengren_s_Page_073.pro
c2323d21bdff3f3c73f263ae32079718
ffb4e331138539dbeb4cdc9ebe835c9f28c2fc0b
64146 F20110113_AAASOT rosengren_s_Page_057.pro
6c6ea2ece2566085739b8cbee7bb29a9
10654747970aa5f5595c150769c0bcd5a7b18a6d
13754 F20110113_AAASPI rosengren_s_Page_074.pro
1a1c548e2a5ac4bbd3254211eb3e7cf4
a5f7a4e1c5be7e29ff98f6ffc8d6b87d7b4cb47e
51447 F20110113_AAASOU rosengren_s_Page_058.pro
23d743de1f6a0d743df03161f58f5ff0
98b919b6877940ef55feffe6a1cf732ddf1014b3
20878 F20110113_AAASPJ rosengren_s_Page_075.pro
49038bd9bd91cea9c66957f602f70b75
56b8cddf2d425cca6763b17eb544062079ddb967
7852 F20110113_AAASOV rosengren_s_Page_059.pro
de08262e9be5aaaa004ac0d21779874a
3b4011c4efc44db70a8a7f42ee49683ad24eb8ce
29354 F20110113_AAASPK rosengren_s_Page_076.pro
4c09f69198a5d1163af2b5f757784ae4
18336a9efbf0752493828453de0807b967179a02
28242 F20110113_AAASOW rosengren_s_Page_060.pro
792c549c3557bff18a10d4e328017d21
ff3e385604ffb5d4b6fec9998686d6f847d90d95
26277 F20110113_AAASPL rosengren_s_Page_077.pro
23d660d9b366190c32723c8c7f5d92f0
f74b0e20806290a7fd2ea875bd3487a2d8f35027
45453 F20110113_AAASOX rosengren_s_Page_061.pro
29b33f12752a6d61274d50cfe8048894
dfc6b8a04a2869c3d01894bbacc8b94baf1a4a25
22191 F20110113_AAASQA rosengren_s_Page_095.pro
130319453a0b471fa10078eb779294ff
0e374d6abcb229289380be9d85d1df372e949de9
20893 F20110113_AAASPM rosengren_s_Page_079.pro
7c2083055ad0683a64f54dc65b9a8761
8f0196ec910318bfda4fed9ddc274299076184d1
40600 F20110113_AAASOY rosengren_s_Page_062.pro
136a76374187bbd194bb63a56e115c0e
b20835cda78e567a59cde875937e8d5a84a7b22f
18323 F20110113_AAASQB rosengren_s_Page_096.pro
37eea009868188d30a5dfcaf0dcc62de
41801c31df7bf15cba3a0d5b27b0b6dbf40ecc6a
28918 F20110113_AAASPN rosengren_s_Page_080.pro
0585cf2f0bdf506c9cbf910b1fbda24a
682253133a5669b31da182c650e83dc97b8d2ecf
17493 F20110113_AAASOZ rosengren_s_Page_063.pro
b1b7407fb99aabcc041743ec3e52ea3d
2f65932fc30775404b655f22084e487f8177d8b0
59442 F20110113_AAASQC rosengren_s_Page_097.pro
b357c575de2a7ab048c446ef92068324
45d38b7cf259fa08619a8d494f944786f302c625
43892 F20110113_AAASPO rosengren_s_Page_081.pro
9251c22cab67188467f42d3c075b8927
7abca44d0903fed1330ee028e995c385fb48b123
59871 F20110113_AAASQD rosengren_s_Page_098.pro
68e84d3ae8b44a861b10db40fc00534a
3e4e66f92ebc3ae4038bbdf097a2a279409cb4aa
14391 F20110113_AAASPP rosengren_s_Page_082.pro
ba66ec59b31e1bb8450f9ca7c895ded8
3d391d0afc8fae603678b03e9b45274d2dac828b
28305 F20110113_AAASQE rosengren_s_Page_099.pro
2232817edceab55710e1eda22c619d58
1c1208a7a55bd4b1d274612bcec458aedb704d86
24599 F20110113_AAASPQ rosengren_s_Page_083.pro
c4e15ffeefb5d2ed686c4de089227e68
bd5ad18da63016d03b009359daca4d3ad309695c
37657 F20110113_AAASQF rosengren_s_Page_100.pro
510be225110e23d4022b8f7b16c08f6d
ce7365c40422054b8ff73a49819e503c8db7a8e3



ON THE FORMULATION OF INERTIAL ATTITUDE ESTIMATION USING POINT CORRESPONDENCES AND DIFFERENTIAL CARRIER PHASE GPS POSITIONING: AN APPLICATION IN STRUCTURE FROM MOTION

By

SCOTT CLARK ROSENGREN

A THESIS PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE

UNIVERSITY OF FLORIDA

2004

Copyright 2004 by Scott Clark Rosengren

This thesis is dedicated to BG. I will always love you.

ACKNOWLEDGMENTS

First, I give all the glory and honor to my LORD and Saviour Jesus Christ for allowing and enabling me to complete this work. I also thank my wife for putting up with countless late nights and early mornings, reading each copy of the manuscript, watching our daughter when she was done, and all the precious time that she sacrificed for me to work two jobs for the last four years. She has spent at least as much effort as I did in completing this thesis and all the coursework. I thank her for her never-ending love and support. My love will be forever hers.

I wish to thank my parents for supporting me, and convincing me that I can do anything I set my mind to accomplish. Without the support and direction from Professor Eric Sutton, this project would not have even begun. I thank him for the time and energy he spent tutoring and mentoring me over the last two years. My gratitude is extended to Ron Smith for developing the ufthesis document class for the LaTeX word processing program in which this document was prepared. Finally, I wish to thank my company, Dynetics, Inc. in Shalimar, FL, for allowing me the time and resources to complete my coursework and thesis.

TABLE OF CONTENTS

ACKNOWLEDGMENTS
LIST OF TABLES
LIST OF FIGURES
ABSTRACT

1  INTRODUCTION

2  OVERVIEW OF EXPLOITATION OF POINT CORRESPONDENCES IN MULTIPLE IMAGES OF THE SAME SCENE
   2.1  General Structure from Motion (SfM) Problem Setup and Notation for Euclidean Framework
        2.1.1  General Mathematical Notations
        2.1.2  Imaging System
        2.1.3  Presentation of the Essential Matrix to De-Couple Structure from Motion
        2.1.4  Note About Solution Ambiguities in the Euclidean Framework
   2.2  Problem Categorization and Refinement

3  BENEFITS GAINED FROM GLOBAL POSITIONING SYSTEM (GPS) DIFFERENTIAL CARRIER PHASE POSITIONING
   3.1  General Discussion of GPS Positioning
        3.1.1  Positioning with Code Delay Measurements
        3.1.2  Positioning with Carrier Phase Measurements
        3.1.3  Positioning with Carrier Phase: Absolute and Differential Position with Respect to an External Reference System
        3.1.4  Determination of Differential Position from Carrier Phase Measurements
   3.2  Displacement Vector for One Vehicle from Carrier Phase: Integer Ambiguity Not a Problem
   3.3  Displacement Vector for Two Vehicles from Carrier Phase: Integer Ambiguity is a Problem
        3.3.1  Real Time Kinematic (RTK) with Roving Reference Receiver
        3.3.2  Wide Laning Using Dual Frequency
        3.3.3  Kalman Filter

4  PROBLEM DEFINITION AND SOLUTION
   4.1  Problem Notes and Assumptions
        4.1.1  Rotation Ambiguity about Translational Axis
        4.1.2  No GPS Attitude Determination Available
        4.1.3  Specification of Control Points
        4.1.4  GPS Differential Carrier Phase Positioning Data Used as Truth
   4.2  Specific SfM Problem Setup to Find Vehicle Inertial Attitude
   4.3  Singular Value Decomposition (SVD) Solution
        4.3.1  Brief Overview of Linear Algebra Techniques used in SVD Solution
        4.3.2  Detailed Look at SVD Solution Algorithm
   4.4  Error Minimization Solution Approach
        4.4.1  Brief Overview of Rotations using Quaternions
        4.4.2  Descent Algorithms using Coplanarity Constraint
        4.4.3  Descent Algorithm Using Linearized Measurement Model

5  ERROR ANALYSIS
   5.1  Generalized Linearized Error Model
        5.1.1  Implementation of the Generalized Linearized Error Model Analysis (GLEMA) Matlab Program
   5.2  Parameter Study
        5.2.1  Varying Camera Focal Length
        5.2.2  Varying Camera Depression Angle
        5.2.3  Varying Displacement Vector Magnitude Between Imaging Systems
        5.2.4  Varying Number of Control Points
        5.2.5  Varying Location of Last Control Point
        5.2.6  Varying Control Point Measurement Error

6  CONCLUSIONS
   6.1  Optimal Camera Parameters for Heading Estimation
   6.2  Worst Camera Parameter Values for Attitude Estimation
   6.3  Future Areas to Explore

APPENDIX  CENTER (c) COORDINATE FRAME DISCUSSION
REFERENCES
BIOGRAPHICAL SKETCH

LIST OF TABLES

5-1   Coordinate system definitions
5-2   Description of variables in the generalized error model
5-3   Correspondence between driver.m and plot_results.m files for the GLEMA program
5-4   Summary of linearized independent variables contained in the system error vector v with their corresponding inputs in the GLEMA program
5-5   Step one verification results for the GLEMA program
5-6   Step two verification results for the GLEMA program
5-7   Input parameters for varying camera focal length study
5-8   Input parameters for varying camera depression angle study
5-9   Input parameters for varying displacement vector magnitude between imaging systems
5-10  Input parameters for varying the number of control points study
5-11  Input parameters for varying location of last control point
5-12  Input parameters for varying control point measurement error
6-1   Optimal parameter values for heading estimate

LIST OF FIGURES

2-1   General SfM problem notation for Euclidean framework
2-2   Geometry for SfM 2D-to-2D point correspondence problem
3-1   Geometry for determination of differential position from carrier phase measurements
4-1   Imaging system notation used in finding vehicle's inertial attitude
4-2   Single 2D-to-2D point correspondence used in finding vehicle's inertial attitude
5-1   Error analysis scenario
5-2   Overview of Matlab implementation of the generalized linearized error model analysis (GLEMA) program
5-3   Graphical overview of verification of the GLEMA program
5-4   Graphical overview of the parameter study process using the GLEMA program
5-5   Allowed location of control points within each grid box of the focal plane
5-6   Control point locations on a focal plane with a 3 row by 3 column grid designation
5-7   Varying FOV vs. focal length
5-8   Control point locations on the focal planes of both cameras for the first iteration for varying camera FOV
5-9   Control point locations on the focal planes of both cameras for the last iteration for varying camera FOV
5-10  Three-dimensional view of the scene for the first iteration for a varying focal length
5-11  Three-dimensional view of the scene for the last iteration for a varying focal length
5-12  Varying camera focal length results - ned to c2
5-13  Varying camera focal length results - c1 to c2
5-14  Control point locations on focal plane for a depression angle of 45 degrees for the varying camera depression angle study
5-15  Control point locations on focal plane for a depression angle of 90 degrees for the varying camera depression angle study
5-16  Control point locations on focal plane for a depression angle of 135 degrees for the varying camera depression angle study
5-17  Three-dimensional view of the scene for a depression angle of 45 degrees for the varying camera depression angle study
5-18  Three-dimensional view of the scene for a depression angle of 90 degrees for the varying camera depression angle study
5-19  Three-dimensional view of the scene for a depression angle of 135 degrees for the varying camera depression angle study
5-20  Varying camera depression angle results - ned to c2
5-21  Varying camera depression angle results - c1 to c2
5-22  Control point locations on focal plane for a = 1.0 meters for the varying displacement vector magnitude study
5-23  Control point locations on focal plane for a = 5.9 meters for the varying displacement vector magnitude study
5-24  Control point locations on focal plane for a = 10.9 meters for the varying displacement vector magnitude study
5-25  Control point locations on focal plane for a = 15.9 meters for the varying displacement vector magnitude study
5-26  Control point locations on focal plane for a = 20.9 meters for the varying displacement vector magnitude study
5-27  Three-dimensional view of the scene for a = 1.0 meter for the varying displacement vector magnitude between imaging systems study
5-28  Three-dimensional view of the scene for a = 5.9 meter for the varying displacement vector magnitude between imaging systems study
5-29  Three-dimensional view of the scene for a = 10.9 meter for the varying displacement vector magnitude between imaging systems study
5-30  Three-dimensional view of the scene for a = 15.9 meter for the varying displacement vector magnitude between imaging systems study
5-31  Three-dimensional view of the scene for a = 20.9 meter for the varying displacement vector magnitude between imaging systems study
5-32  Varying displacement vector magnitude between imaging systems results for a = 1.0 to 20.9 meters - ned to c2
5-33  Varying displacement vector magnitude between imaging systems results for a = 5.9 to 20.9 meters - ned to c2
5-34  Varying displacement vector magnitude between imaging systems results for a = 1.0 to 20.9 meters - c1 to c2
5-35  Varying displacement vector magnitude between imaging systems results for a = 5.9 to 20.9 meters - c1 to c2
5-36  Control point locations on focal plane for a column count of 1 and a row count of 6
5-37  Control point locations on focal plane for a column count of 10 and a row count of 6
5-38  Control point locations on focal plane for a column count of 20 and a row count of 6
5-39  Three-dimensional view of the scene for a column count of 1 and a row count of 6 for the varying number of control points study
5-40  Three-dimensional view of the scene for a column count of 10 and a row count of 6 for the varying number of control points study
5-41  Three-dimensional view of the scene for a column count of 20 and a row count of 6 for the varying number of control points study
5-42  Varying number of control points study results for column counts 1-20 and a row count of 6 - ned to c2
5-43  Varying number of control points study results for column counts 1-20 and a row count of 6 - c1 to c2
5-44  Control point locations on focal plane for a row count of 1 and a column count of 6
5-45  Control point locations on focal plane for a row count of 10 and a column count of 6
5-46  Control point locations on focal plane for a row count of 20 and a column count of 6
5-47  Three-dimensional view of the scene for a row count of 1 and a column count of 6 for the varying number of control points study
5-48  Three-dimensional view of the scene for a row count of 10 and a column count of 6 for the varying number of control points study
5-49  Three-dimensional view of the scene for a row count of 20 and a column count of 6 for the varying number of control points study
5-50  Varying number of control points study results for row counts 2-20 and a column count of 6 - ned to c2
5-51  Varying number of control points study results for row counts 2-20 and a column count of 6 - c1 to c2
5-52  Original focal plane view of control point locations for the varying last control point study
5-53  Inertial heading estimate errors for varying last control point study
5-54  The calculated terrain mapped onto the focal plane for varying last control point study
5-55  Control point locations on focal plane for the varying pixel error study
5-56  Three-dimensional view of the scene for the varying control point measurement error study
5-57  Varying control point measurement errors results for 1.0 to 2.99 pixels - ned to c2
5-58  Varying control point measurement errors results for 1.0 to 2.99 pixels - c1 to c2

Abstract of Thesis Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Master of Science

ON THE FORMULATION OF INERTIAL ATTITUDE ESTIMATION USING POINT CORRESPONDENCES AND DIFFERENTIAL CARRIER PHASE GPS POSITIONING: AN APPLICATION IN STRUCTURE FROM MOTION

By Scott Clark Rosengren

May 2004

Chair: Eric Sutton
Major Department: Electrical and Computer Engineering

Combining the results from Structure from Motion (SfM) processing of two overlapping images with precise GPS differential carrier phase positioning yields an essentially "free" attitude estimate for the platform aircraft. This technique is "free" because it uses sensor systems already required on the platform aircraft for other reasons. The SfM processing results in a displacement vector in the body coordinate system between the two camera locations. The GPS differential carrier phase positioning is used to produce an extremely accurate displacement vector in a local level coordinate system. Once two vectors are found that are expressed in both coordinate systems, then the transformation between the two coordinate systems can be found. This corresponds directly to the platform vehicle's attitude. A detailed error analysis on a linearized model of the system shows that errors on the attitude estimate are on the order of tens of mrad for a measurement error standard deviation of one pixel.

Background information in both the SfM and GPS fields is given. Then, the problem is described, all assumptions are stated, and solution techniques are presented. Finally, a detailed error analysis is performed on a linearized model of the system. The key contribution of this thesis is the detailed error analysis of the attitude estimate.

CHAPTER 1
INTRODUCTION

Much research over the last 20 years in the Structure from Motion (SfM) field has focused on using information from multiple 2-D images of the same scene, taken from different camera locations, to calculate a 3-D rendering of the scene and the displacement of the camera, commonly called the reconstruction problem. This thesis presents an in-depth study of a new application for this field of study. By utilizing techniques from SfM, differential positioning information from GPS sensors can be combined with vision-system information to yield an estimate of vehicle heading.

Increasing the accuracy and robustness of airborne vehicle navigation is the objective driving the formulation of the problem presented in this thesis. The primary focus of this work is to examine an algorithm that uses data that will be available on the vehicle for other reasons. It is assumed that the airborne vehicle is small, low cost, mostly autonomous, and used for attack and/or intelligence gathering. This vehicle is envisioned to be able to fly as part of multiple or single vehicle missions. Each vehicle or agent will almost certainly require the following four sensor systems: (1) a low grade inertial measurement unit (IMU), (2) a GPS receiver, (3) a vision system, and (4) an inter-agent communication system if part of a multi-agent mission. The IMU and GPS combination provides accurate navigation, and the IMU provides attitude sensors for vehicle control. The vision system is necessary for intelligence gathering or target acquisition. An inter-agent communication system is necessary for distributed intelligence and will provide secure communication with jamming resistance using technology such as spread spectrum or ultra wideband. Both of these technologies can be used to measure precise distances between transmitter and receiver, in addition to the communication functions.

The problem detailed in this thesis uses vision correspondences from multiple reference systems, combined with the corresponding inertial displacement vectors between the systems, to obtain the inertial attitude of one of the systems. The vision correspondences of the reference systems could be from different vehicles viewing parts of the same scene at the same time, or from one moving vehicle that takes multiple images. The only constraint is that the point correspondences remain constant in all images. As far as the formulation of the problem goes, it makes no difference whether it is thought of as either of these two cases.

As an example (originally given by E. Sutton in a 2001 white paper originating from the University of Florida Graduate Engineering and Research Center in Shalimar, FL), suppose vehicle one is directly behind vehicle two, and that vehicle one can sense, using its vision system, the angular location of vehicle two with respect to itself. If the relative position of vehicle two with respect to vehicle one is known using GPS, then the pitch and heading of vehicle one can be determined very precisely. This simplified scenario clearly imposes some unacceptable operational constraints, but these constraints can be removed using more sophisticated algorithms to process the sensor information. If both vehicles have downward-looking vision systems and the two fields of view overlap, then it should be possible to obtain the same information as would be obtained if vehicle two were directly visible to vehicle one. In addition, even if the two fields of view do not simultaneously overlap, if their point correspondences remain fixed in both images, that should provide enough information to calculate the attitude of one of the vehicles. The baseline between vehicles will be long enough to provide an extremely accurate pointing direction; the accuracy of this technique will depend primarily on the vision-system geometry. The primary objective of this study is a thorough error analysis of this technique.

This thesis is organized in the following manner. A brief overview of the Structure from Motion field that uses point correspondences to calculate spatial and orientation characteristics is given, along with a mathematical overview of the reconstruction problem given in the Euclidean framework. A chapter on the benefits gained from GPS differential carrier phase positioning is presented. Next, the problem of interest is mathematically defined, all simplifying assumptions are given, and solution techniques are presented. Then, an error analysis chapter develops a linearized model of the system and provides a detailed look at how various parameters as well as the overall scene geometry affect the heading estimate. Results for the other error terms are also shown. Finally, a conclusion section presents what the author believes are the important findings contained in this thesis, and further areas to explore.

CHAPTER 2
OVERVIEW OF EXPLOITATION OF POINT CORRESPONDENCES IN MULTIPLE IMAGES OF THE SAME SCENE

Using point correspondences between two different 2-D views of the same scene to derive 3-D information about that scene is not a new idea. As early as the mid-nineteenth century, the field of photogrammetry used point correspondences between two images of the same scene from different camera locations to aid in the production of topographical maps to assist the explorers of that generation [1]. With the advent of increased computing power in the 1980s, the techniques of photogrammetry have found application in robotics and computer vision. Research in robotics has emphasized vision as an aid to navigation, and research in computer vision has emphasized the reconstruction of 3-D scenes from 2-D data. The latter comprises the field known as Structure from Motion (SfM).

The original formulation of this problem used a technique called projective geometry to describe the relationships between image correspondences [2, 3]. Projective geometry uses ray tracing through the optical center of the camera system and epipolar transformations to describe the reconstruction of the 3-D scene. More recently, SfM researchers have also developed and quantified the 3-D reconstruction problem using Euclidean geometry [1, 4, 5, 6]. The Euclidean framework simplifies the problem to that of finding a solution to a set of nonlinear equations, and lends itself to an analytical solution. Maybank [1] attributes the fundamental difference between these approaches to their different views of geometry: synthetic versus analytic. Projective geometry uses a synthetic approach that gives interpretations to the equations describing the scene based on the geometric shapes being seen (lines, planes, conics, etc.). The drawback to the synthetic approach is that it is difficult to make it completely rigorous. The analytic approach, used in the Euclidean framework, leads to the introduction of an essential matrix that concisely and robustly describes the motion of the camera. Maybank [1] gives a thorough background and outlines the mathematical complexities of both techniques. Before exploring some of the relevant research in SfM, a basic overview of the problem in the Euclidean framework is given.

2.1 General Structure from Motion (SfM) Problem Setup and Notation for Euclidean Framework

Before going into further findings about 2D-to-2D point correspondences from SfM research, a general mathematical overview of the classical SfM problem needs to be given. General notation used throughout the community is not standardized, so the notation used in this thesis is given an elementary treatment.

2.1.1 General Mathematical Notations

Vectors and matrices. A boldfaced lowercase letter denotes a vector. The superscript denotes the coordinate system in which the components of the vector are expressed. For example, p^{c1} represents the vector p expressed in the c1 3-D coordinate system. A unit vector is denoted by a carat on top of the boldfaced lowercase letter. For example, a unit vector in the same direction as the vector given above would be denoted as \hat{p}^{c1}. A single capital letter is used to designate a matrix.

Rotation matrices. R^{c2}_{c1} represents an orthonormal rotation matrix where the subscript c1 represents the coordinate system that the rotation will go from and the superscript c2 represents the coordinate system that the rotation will go to. The orthonormal rotation matrix can be expressed as the product of successive rotations about the three body axes that are fixed to the system undergoing the rotation. The three angles associated with the three successive rotations are called Euler angles. For this thesis, a 3-2-1 sequence of rotations is used in reference to Euler angles, where 3 stands for the z-axis, 2 for the resulting y-axis, and 1 for the resulting x-axis. All rotations are right-handed. That is, the system will first yaw about the z-axis (\theta_3), pitch about the resulting y-axis (\theta_2), and then roll about the resulting x-axis (\theta_1). This is represented mathematically as

R_1^2 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\theta_1 & \sin\theta_1 \\ 0 & -\sin\theta_1 & \cos\theta_1 \end{bmatrix} \begin{bmatrix} \cos\theta_2 & 0 & -\sin\theta_2 \\ 0 & 1 & 0 \\ \sin\theta_2 & 0 & \cos\theta_2 \end{bmatrix} \begin{bmatrix} \cos\theta_3 & \sin\theta_3 & 0 \\ -\sin\theta_3 & \cos\theta_3 & 0 \\ 0 & 0 & 1 \end{bmatrix}    (2-1)

Also, the inverse of an orthonormal rotation matrix is equal to its transposition:

R^{-1} = R^T    (2-2)

So, for a rotation matrix that goes from system 1 to system 2,

R_1^2 = (R_2^1)^T.    (2-3)
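A minimal NumPy sketch of the 3-2-1 convention defined above: it builds R_1^2 exactly as in Equation 2-1 and checks the transpose property of Equations 2-2 and 2-3. The function name rotation_321 and the sample angles are assumptions made for this illustration only; they are not notation from the thesis.

import numpy as np

def rotation_321(yaw, pitch, roll):
    """R_1^2 for a right-handed 3-2-1 (yaw, pitch, roll) Euler sequence,
    matching Equation 2-1: R = R_x(roll) @ R_y(pitch) @ R_z(yaw)."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[ cz,  sz, 0.0],
                   [-sz,  cz, 0.0],
                   [0.0, 0.0, 1.0]])   # yaw about z (theta_3)
    Ry = np.array([[ cy, 0.0, -sy],
                   [0.0, 1.0, 0.0],
                   [ sy, 0.0,  cy]])   # pitch about the resulting y (theta_2)
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0,  cx,  sx],
                   [0.0, -sx,  cx]])   # roll about the resulting x (theta_1)
    return Rx @ Ry @ Rz

R = rotation_321(np.radians(30.0), np.radians(10.0), np.radians(-5.0))
# Equation 2-2: the inverse of an orthonormal rotation matrix is its transpose.
assert np.allclose(R.T @ R, np.eye(3))
assert np.isclose(np.linalg.det(R), 1.0)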

[Figure 2-1: General SfM problem notation for Euclidean framework. The original figure shows the focal length f, the focal-plane axes \hat{x} = \hat{c} and \hat{y} = \hat{r}, the boresight axis \hat{z}, a scene point (x, y, z), and its image point (c, r).]

2.1.2 Imaging System

Let an imaging system have length f from the optical focus point to the focal plane. This is commonly referred to as the focal length. Image coordinates c and r are the 2-D coordinates on the focal plane used to represent the projection of a point in the three-dimensional (3-D) scene given by (x, y, z). The origin of the 3-D scene is collocated with the origin of the imaging system's focal plane. The x-axis and y-axis of the 3-D scene correspond to the c and r axes of the 2-D focal plane coordinates. The z-axis of the 3-D scene is orthogonal to the focal plane (i.e., directly in the line of sight or pointing direction). By using analytical geometrical relationships, the imaging system coordinates are related to the 3-D scene coordinates by Equations 2-4 and 2-5, as shown in Figure 2-1:

c = f \frac{x}{z}    (2-4)

r = f \frac{y}{z}    (2-5)
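Equations 2-4 and 2-5 amount to a pinhole projection. The sketch below, with an assumed focal length and an assumed test point, applies them and also shows that scaling a point along its line of sight leaves its image unchanged, which foreshadows the scale ambiguity discussed in Section 2.1.4.

import numpy as np

def project(point_3d, f):
    """Project a 3-D scene point (x, y, z) onto focal-plane coordinates (c, r)
    using c = f*x/z and r = f*y/z (Equations 2-4 and 2-5)."""
    x, y, z = point_3d
    if z <= 0:
        raise ValueError("point must lie in front of the camera (z > 0)")
    return np.array([f * x / z, f * y / z])

f = 0.05                                 # assumed 50 mm focal length
p = np.array([2.0, -1.0, 20.0])          # assumed scene point in the camera frame
c, r = project(p, f)
# Scaling the point along its line of sight leaves (c, r) unchanged.
assert np.allclose(project(3.0 * p, f), (c, r))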

For the 2D-to-2D solution, two different views or images of the same scene with n correspondences are used to reconstruct the 3-D scene in either one of the camera 3-D coordinate systems. Let the origin of the first camera system be given by o^{c1} and the origin of the second camera system be given by o^{c2}; both of their values are (0, 0, 0) in their respective 3-D coordinate systems. The displacement between the camera systems is given by a translation a^{c1} and a rotation R^{c2}_{c1}. Note that it is arbitrary which 3-D camera system is called c1 and which is called c2 when using this notation. This detail is pointed out because the solution technique of Chapter 4 and the error analysis technique of Chapter 5 solve for the attitude of c1 and c2, respectively, as given in Figure 2-2. For this particular example, if the translation between the two camera coordinate systems is zero:

q^{c2} = R^{c2}_{c1} p^{c1}    (2-6)

So, adding in a non-zero translation gives the coordinate system conversion as:

q^{c2} = R^{c2}_{c1} (p^{c1} - a^{c1})    (2-7)

The lines p^{c1} and q^{c2} represent the 3-D scene coordinates of a point correspondence expressed in the first and second camera systems, respectively. The geometry using two cameras is shown in Figure 2-2.

[Figure 2-2: Geometry for SfM 2D-to-2D point correspondence problem. The original figure shows the two focal planes, the camera axes (\hat{x}^{c1}, \hat{y}^{c1}, \hat{z}^{c1}) and (\hat{x}^{c2}, \hat{y}^{c2}, \hat{z}^{c2}) with \hat{x} = \hat{c} and \hat{y} = \hat{r} on each focal plane, the focal lengths f, the translation a^{c1}, the rays p^{c1} and q^{c2}, and a common control point.]

If the magnitude of p^{c1} is taken to be p and the unit vector is denoted as \hat{p}^{c1}, and likewise for the second coordinate system, then

q \hat{q}^{c2} = R^{c2}_{c1} (p \hat{p}^{c1} - a^{c1})    (2-8)

Equation 2-8 forms the basis for the Euclidean reconstruction problem used in computer vision; it is equivalent to Equation 2-1 in Maybank [1]. The reconstruction problem can then be rewritten in a Euclidean framework as such: for n point correspondences, find the scalar values p_i and q_i along with the camera displacement described by R^{c2}_{c1} and a^{c1} that is valid for each (\hat{p}^{c1}_i, \hat{q}^{c2}_i) point correspondence pair.
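To make Equations 2-7 and 2-8 concrete, the following minimal sketch assumes a made-up displacement (a yaw-only R^{c2}_{c1} and a short translation a^{c1}), maps one scene point from the c1 frame to the c2 frame, and forms the ranges and unit vectors that appear in Equation 2-8. The numerical values are illustrative only and do not come from the thesis.

import numpy as np

# Assumed (illustrative) displacement between the two camera frames.
psi = np.radians(12.0)                        # yaw of c2 relative to c1
R_c2_c1 = np.array([[ np.cos(psi), np.sin(psi), 0.0],
                    [-np.sin(psi), np.cos(psi), 0.0],
                    [         0.0,         0.0, 1.0]])
a_c1 = np.array([5.0, 0.5, -0.2])             # translation expressed in c1

p_c1 = np.array([12.0, -3.0, 40.0])           # one control point expressed in c1
q_c2 = R_c2_c1 @ (p_c1 - a_c1)                # Equation 2-7

# Ranges p, q and unit vectors p_hat, q_hat used in Equation 2-8.
p, q = np.linalg.norm(p_c1), np.linalg.norm(q_c2)
p_hat, q_hat = p_c1 / p, q_c2 / q
assert np.allclose(q * q_hat, R_c2_c1 @ (p * p_hat - a_c1))   # Equation 2-8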


Figure 2-2: Geometry for the SfM 2D-to-2D point correspondence problem (two camera focal planes, displacement a^{c1}, and control-point vectors p^{c1} and q^{c2}).

This is referred to as the coplanar constraint and states in mathematical terms that two lines intersecting the same point in space must have a common plane with both lines being a member. If Equation 2-10 holds and \hat{q}^{c2} × R^{c2}_{c1}\hat{p}^{c1} ≠ 0, then the depth, or p and q, can be found via Equation 2-8. If \hat{q}^{c2} × R^{c2}_{c1}\hat{p}^{c1} = 0, then \hat{q}^{c2} = R^{c2}_{c1}\hat{p}^{c1} and the depth is infinite.

- Write Equation 2-10 using the antisymmetric matrix T(a^{c1}) involving the vector components of a^{c1}, where

T(a^{c1}) \triangleq \begin{bmatrix} 0 & a_3 & -a_2 \\ -a_3 & 0 & a_1 \\ a_2 & -a_1 & 0 \end{bmatrix}.   (2-11)

Noting that T(a^{c1})\hat{p}^{c1} = \hat{p}^{c1} × a^{c1} gives an alternate representation for Equation 2-10 as

(\hat{q}^{c2})^T R^{c2}_{c1} T(a^{c1}) \hat{p}^{c1} = 0.   (2-12)

An essential matrix is defined as a 3×3 matrix which is the product of an orthogonal rotation matrix and a non-zero antisymmetric matrix. Letting the essential matrix E be defined as E = R^{c2}_{c1} T(a^{c1}) gives

(\hat{q}^{c2})^T E \hat{p}^{c1} = 0.   (2-13)
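To make the de-coupling concrete, the sketch below (NumPy, illustrative names; the sign convention follows Equation 2-11, where T(a) p = p × a) builds T(a^{c1}) and E = R^{c2}_{c1} T(a^{c1}) and checks the coplanarity constraint of Equation 2-13 on a synthetic point correspondence.

```python
import numpy as np

def T(a):
    """Antisymmetric matrix of Equation 2-11, so that T(a) @ p = np.cross(p, a)."""
    return np.array([[ 0.0,   a[2], -a[1]],
                     [-a[2],  0.0,   a[0]],
                     [ a[1], -a[0],  0.0]])

# Synthetic camera displacement (a small yaw rotation R and a translation a)
psi = np.radians(5.0)
R = np.array([[ np.cos(psi), np.sin(psi), 0.0],
              [-np.sin(psi), np.cos(psi), 0.0],
              [ 0.0,         0.0,         1.0]])
a = np.array([1.0, 0.2, 0.0])

E = R @ T(a)                                   # essential matrix (Equation 2-13)

p = np.array([3.0, -1.0, 20.0])                # scene point in the first camera frame
q = R @ (p - a)                                # same point in the second frame (Equation 2-7)
p_hat, q_hat = p / np.linalg.norm(p), q / np.linalg.norm(q)

assert abs(q_hat @ E @ p_hat) < 1e-12          # coplanarity constraint holds
```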


The most important property of essential matrices is that they can be defined as the zeros of a set of nine homogeneous polynomial equations of degree three in the coefficients of 3×3 matrices [1]. Another key property of essential matrices is that they are of rank two.²

A third statement of the problem, based on the essential matrix formulation in the Euclidean framework, is as follows: for n point correspondences, find the essential matrix E that is valid for all (\hat{p}^{c1}_i, \hat{q}^{c2}_i) point correspondences. The advantage of this formulation is that motion and structure are now decoupled in the solution. That is, the camera displacement originally given by R^{c2}_{c1} and a^{c1} is summarized succinctly in E. Then, given the solution for E, the structure components of the solution given by p_i and q_i can be found by the relationships given in Equation 2-8. This approach lends itself to an algorithm that begins by solving for camera displacement independent of structure, which will be important later.

2.1.4 Note About Solution Ambiguities in the Euclidean Framework

For any solution to the reconstruction problem utilizing the Euclidean framework there are two ambiguities that will always be present: a scaling ambiguity and a twisted pair of solutions [1]. The scaling ambiguity is inherent because the absolute size of any object in a 2-D image is not known. That is, a larger object farther away may appear to be the same size as a smaller object that is nearer. The scaling ambiguity can be resolved by placing arbitrary constraints on the translation vector a^{c1} [1] (e.g., ||a^{c1}|| = 1). The twisted pair solution arises by rotating the camera 180 degrees around the axis of translation. After rotation around the translation axis, it can be shown that there is a second set of camera displacement terms, S^{c2}_{c1} and b^{c1}, that is valid for each (\hat{p}^{c1}_i, \hat{q}^{c2}_i) point correspondence pair [1]. That is,

E = R^{c2}_{c1} T(a^{c1}) = S^{c2}_{c1} T(b^{c1}).   (2-14)

A single solution in the Euclidean framework is taken to be the sum of solutions over the scaling and twisted-pair ambiguities.

2.2 Problem Categorization and Refinement

There are different solution approaches based upon what information is available to the many applications of the SfM problem. Huang and Netravali [6] have placed these solution approaches into three categories:

² For further details on essential matrices the reader can look in Section 2.2 of Maybank's text [1] and other SfM literature [4, 5, 6].


- Three-dimensional (3D) to three-dimensional (3D) feature correspondences
- Two-dimensional (2D) to three-dimensional (3D) feature correspondences
- Two-dimensional (2D) to two-dimensional (2D) feature correspondences

This thesis explores the third of Huang's categories, specifically using point correspondences in a Euclidean framework. According to Huang [6], the applications that 2D-to-2D feature correspondences are used to solve are:

- Finding relative attitudes of two cameras observing the same scene
- Estimating motion and structure of objects moving relative to a camera
- Passive navigation (i.e., finding the relative attitude of a vehicle at two different time instants)
- Efficient coding and noise reduction of image sequences by estimating motion of objects.

The objective of this thesis corresponds to the first and third applications given above. As shown in Section 2.1, it makes no difference mathematically whether the problem is stated as finding the relative attitude of two cameras simultaneously viewing the same scene or a single camera viewing an overlapping scene at two different time instances.

An attempt to form a general framework for comparing the various flavors of solution algorithms is given by Soatto and Perona [5, 7]. They give five instances of their general model. One of these five is the Essential Model, which uses the coplanarity constraint introduced by Longuet-Higgins [8]. This formulation is essentially the Euclidean framework described earlier. Another of the five instances of their general model is the Subspace model. It consists of the subspace constraint introduced by Heeger [9], which interprets the model as a dynamical system rather than an algebraic constraint. This has the same advantage as the Euclidean framework in that it decouples the motion and structure components in the solution, but has an additional advantage of further decoupling the rotational velocity from the translational velocity. The other three instances of their model involve fixation of a various number of features on the focal plane and are not relevant to this thesis.

Soatto and Perona's research into generalizing the many SfM applications into a common framework looks promising because they present the idea of integrating information over time. If integration of information can be accomplished, then the effective baseline between images will be increased. A longer baseline should result in increased accuracy of the reconstruction. Soatto and Perona's work has the implication that any solution technique can be conformed to the general model and thus have the integration benefit provided therein.


This capability has the potential to increase the accuracy of any application that is currently based on an SfM algorithm to perform reconstruction.


CHAPTER 3
BENEFITS GAINED FROM GLOBAL POSITIONING SYSTEM (GPS) DIFFERENTIAL CARRIER PHASE POSITIONING

The Global Positioning System (GPS) is a technology consisting of a constellation of satellites in prescribed orbits transmitting known signals to be used for precise positioning via sophisticated triangulation techniques. Each satellite transmits pseudo-random codes at known frequencies. There are two ways to determine position from the GPS constellation: code delay measurements and carrier phase measurements. Typical user range errors (UREs) using code delay measurements are on the order of 5-10 meters, and there is no ambiguity in the solution. Typical differential position errors using carrier phase measurements are two orders of magnitude less than the code delay UREs. However, there is an integer ambiguity inherent in using carrier phase measurements to determine differential position that must be resolved [10].

The problem examined in this thesis, detailed in Chapter 4, involves using a displacement vector between two airborne vehicles that is calculated in two coordinate systems to determine the rotation between those coordinate systems. One displacement vector will be calculated in the body coordinate system using SfM techniques. The effect of measurement error on this displacement vector in the calculation of the rotation between the two coordinate systems is the focus of Chapter 5. The other coordinate system where the displacement vector is calculated is an inertial system. GPS differential carrier phase positioning will be used to calculate this displacement vector and will be treated as truth data in the error analysis given in Chapter 5. This chapter is written to show the validity of using GPS differential carrier phase positioning as truth data.

This chapter begins with a general discussion of determining position from the GPS constellation using code delay and then using carrier phase measurements. Next, the process for determining the relative displacement vector for a single moving vehicle is discussed. Finally, the details of how the relative displacement vector between two vehicles is calculated using carrier phase measurements are presented. All material contained in this chapter was originally presented in or derived from material given in Chapters 4 and 6 of an excellent GPS text written by Misra and Enge [10].


3.1 General Discussion of GPS Positioning

There are two types of measurements that can be made to use GPS to determine position. The original design of the GPS system was to estimate the range to at least four satellites using measurements of the pseudo-random code to unambiguously determine the receiver position. The pseudo-random code generated by the satellite is compared to the receiver-generated pseudo-random code to produce an estimated transit time. This estimated transit time multiplied by the speed of light in a vacuum determines the range to the satellite that transmitted the signal. The measurements from four satellites are used to determine the 3-D coordinates of the receiver and the receiver clock bias.

It was shown in the late 1970s that a second type of measurement could also be used to determine receiver position [11, 12]. The phase of the carrier signal transmitted by the satellite can be measured relative to a receiver-generated carrier signal. The phase difference plus an unknown number of whole cycles gives another estimate of the range to the satellite. Measurements from at least four satellites must still be used to determine receiver position, but now the receiver position contains an integer ambiguity that cannot be resolved in a direct manner. Positioning using both measurement types is discussed in the next two sections.

3.1.1 Positioning with Code Delay Measurements

Positioning with code delay measurements relies on the satellite and receiver being able to create identical signals in time. The receiver-generated signal will then be a delayed version of the transmitted signal. The signal delay is proportional to the distance to the satellite and will be called the transit time. However, in order to compare the signals there must be a common time reference, t. GPS time (GPST) is used as the common time reference. Using the notation given in Misra and Enge [10], let the transit time be denoted as \tau. Both the receiver and satellite clocks are not precisely aligned with GPST. Let the bias terms relative to GPST for the satellite and receiver be denoted as \delta t^s and \delta t_r respectively. Then, noting that t stands for GPST,

t^s(t - \tau) = (t - \tau) + \delta t^s(t - \tau)   (3-1)
t_r(t) = t + \delta t_r(t).   (3-2)


So, the measured range to the satellite, or pseudo-range, is given by

\rho(t) = c[t_r(t) - t^s(t - \tau)] = c\tau + c[\delta t_r(t) - \delta t^s(t - \tau)] + \epsilon_\rho,   (3-3)

where \epsilon_\rho is used to denote the random pseudo-range estimation error. The transmission speed of the signal will be significantly less than the speed of light once it enters the earth's atmosphere. The transmission speed can be modeled to take into account the delays resulting from the ionosphere and troposphere by

c\tau = r(t, t - \tau) + I(t) + T(t),   (3-4)

where I(t) and T(t) represent the ionospheric and tropospheric transmission delays and r(t, t - \tau) represents the true range from the receiver to the satellite. Dropping the reference to a specific time in GPST, t, gives the measurement equation for the pseudo-range from the receiver to the satellite as

\rho = r + c[\delta t_r - \delta t^s] + I + T + \epsilon_\rho.   (3-5)

The accuracy of Equation 3-5 is directly related to how well the bias and error terms are modeled or taken into account. The satellite clock bias at the signal generation time, \delta t^s(t - \tau), is calculated by the GPS ground stations and contained in the navigation message that is sent with the pseudo-random signal. The receiver clock bias, \delta t_r(t), varies for each receiver and can cause significant errors. Satellite ranges are on the order of 20,000-26,000 km, which correspond to transmission times of 70 to 90 ms [10]. So, measurement of an 80 ms transmission time with 1% error, or 0.8 ms, would result in approximately 240 km of range error. Equivalently, in order to have less than 20 m of range error, the receiver clock bias with respect to GPST must be accurate to within 66.7 ns.

3.1.2 Positioning with Carrier Phase Measurements

A more accurate way of determining the range from a receiver to a satellite is to measure the phase of the carrier signal that carries the pseudo-random code generated from the satellite with respect to the carrier signal that the receiver generates. The phase of the carrier signal can be converted to transit time of the signal by adding the fractional cycle that is given by the phase plus an unknown number of whole cycles. This technique is ambiguous in the number of whole cycles that it takes to determine the range to the satellite.

The cycle, or wavelength, of the carrier signal is 19 cm for the L1 GPS signal and 24 cm for the L2 signal. By comparison, the length of one code chip of the C/A code that is transmitted on the L1 frequency is 300 m.


The different cycle lengths of the carrier and code signals result in different resolutions in the distances that can be calculated from their respective measurements. The accuracy, or resolution, difference between these two types of measurements can be compared to the accuracy difference between making measurements with a ruler that has tick marks every half-centimeter and one that has tick marks every half-meter. In this analogy, the carrier phase measurements would be the ruler with tick marks every half-centimeter and the code phase measurements would be the ruler with tick marks every half-meter.

In an idealized case, the carrier phase measurement in units of cycles from a receiver to a satellite can be represented as

\phi(t) = \phi_r(t) - \phi^s(t - \tau) + N,   (3-6)

where \tau represents the transit time of the signal, N is the integer ambiguity, \phi_r(t) represents the phase of the receiver-generated signal, and \phi^s(t - \tau) represents the phase of the carrier signal generated by the satellite at time (t - \tau) that is received by the receiver at time t. Simplifying Equation 3-6 by writing phase as the product of frequency and time gives

\phi(t) = f\tau + N = c\tau/\lambda + N = r(t, t - \tau)/\lambda + N,   (3-7)

where f and \lambda are the frequency and wavelength of the carrier signal, c is the speed of light in a vacuum, and r(t, t - \tau) is the geometric or true range from the receiver to the satellite (same notation as Section 3.1.1). Accounting for the error and bias terms inherent in the measurement equation and dropping the reference to a specific time gives Equation 3-7 as

\phi = [r + I + T]/\lambda + c(\delta t_r - \delta t^s)/\lambda + N + \epsilon_\phi,   (3-8)

where I and T are the ionospheric and tropospheric delays in meters, \delta t_r and \delta t^s account for the clock biases and initial phase offsets of the receiver and satellite clocks respectively, and \epsilon_\phi accounts for all other modeling and measurement errors. Note that Equation 3-8 is in units of cycles.

3.1.3 Positioning with Carrier Phase: Absolute and Differential Position with Respect to an External Reference System

Sections 3.1.1 and 3.1.2 described how to use the GPS system to solve for absolute position with respect to an external reference system. Absolute position is the determination of a single receiver's location with respect to an external reference system.


With the knowledge of the absolute position of two receivers, the differential or relative position between those two receivers can be found. However, absolute position is not necessary to determine relative, or differential, position when using carrier phase measurements. Only differential position is needed for the application in this thesis. This simplifies the processing of the carrier phase measurements.

The number of whole cycles between the receiver and the satellite must be determined in order to calculate the receiver position when solving for absolute position using carrier phase measurements. This is normally accomplished by relying on geometric changes in the satellite constellation to rule out all the erroneous solutions to the integer ambiguity until there is only one valid solution. However, when only the differential position between two receivers is needed, as is the case for this application, the number of whole cycles corresponding to what is referred to as the delta pseudo-range can be used [10]. The delta pseudo-range is determined by the change in carrier phase measurements over a time interval:

\Delta\phi = \phi(t_1) - \phi(t_0).   (3-9)

If the baseline between the two measurements is small,¹ then the delay due to ionospheric and tropospheric effects on both measurements will be very similar and will essentially cancel out. Also, the satellite clock bias terms, \delta t^s, will cancel out between the two measurements. This leaves the difference in the two receiver clock bias terms, the true range difference, and the number of whole cycles between the two receivers as the only three unknowns left from Equation 3-8. That is,

\Delta\phi = \Delta r/\lambda + c(\delta t_{r2} - \delta t_{r1})/\lambda + N' + \tilde{\epsilon}_\phi,   (3-10)

where \Delta r is the difference in the ranges to the satellite between the two receivers, \delta t_{r2} and \delta t_{r1} represent the clock bias terms for receivers r2 and r1 respectively, N' is the number of whole cycles between the two measurements, and \tilde{\epsilon}_\phi represents the measurement error. Note that N from Equation 3-8 is much greater than N' from Equation 3-10, as shown in Figure 3-1. Also, note that \epsilon_\phi will be greater than \tilde{\epsilon}_\phi when the baseline between the receivers is very short, so differential position estimates from carrier phase measurements will be even more accurate than absolute position estimates using carrier phase measurements.

¹ Ranges less than 10 km can be classified as small for ionospheric and tropospheric purposes [10]. So this assumption is especially true for the application in this thesis, since baselines will be on the order of tens of meters.


Figure 3-1: Geometry for determination of differential position from carrier phase measurements (satellite SA_i, unit line-of-sight vector \hat{l}_i, receivers r1 and r2, baseline a, and whole-cycle counts N_i and N'_i).

3.1.4 Determination of Differential Position from Carrier Phase Measurements

All the previous discussions in this chapter dealt with a single satellite and one or two receivers. However, in order to determine differential position from carrier phase measurements, the two receivers must track at least four satellites to determine the three position unknowns (x, y, z) in the external reference system and the clock bias term (\delta t_{r2} - \delta t_{r1}) for the two receivers. Subscripts will now be used to denote satellite number in all respective symbols. Referring to Figure 3-1: \hat{l}_i represents the unit line-of-sight (ULOS) vector in the external reference system that points in the direction from receiver r1 to the satellite denoted by SA_i, a represents the vector from receiver r1 to receiver r2, N'_i represents the number of whole cycles of the carrier signal transmitted by SA_i between the two receivers, and N_i represents the number of whole cycles between receiver r1 and SA_i. Using this notation,

\Delta r = \hat{l}_i \cdot a,   (3-11)

where \Delta r is the difference in the ranges to the satellite between the two receivers, and the ULOS vector \hat{l}_i is a known quantity because the receiver has knowledge of the i-th satellite location in the external coordinate system and it can sense the azimuth and elevation angles to that satellite via its signal. Let n = [N'_1 N'_2 ... N'_M]^T, where M represents the number of satellites that are tracked by both r1 and r2, and let \Delta t = \delta t_{r2} - \delta t_{r1}. This lets the relationship for determining the differential position between receivers r1 and r2 using carrier phase measurements from M satellites be written as


\begin{bmatrix} \lambda\Delta\phi_1 \\ \lambda\Delta\phi_2 \\ \vdots \\ \lambda\Delta\phi_M \end{bmatrix}_{M\times 1} = \begin{bmatrix} \hat{l}_1^T & 1 \\ \hat{l}_2^T & 1 \\ \vdots & \vdots \\ \hat{l}_M^T & 1 \end{bmatrix}_{M\times 4} \begin{bmatrix} a \\ c\Delta t \end{bmatrix}_{4\times 1} + \lambda\, n_{M\times 1}.   (3-12)

Notice that there are, in general, M + 4 unknowns in Equation 3-12, so this problem cannot be directly solved as written. Other knowledge of the situation must be used to account for the integer ambiguity and solve for the position and clock bias terms.

3.2 Displacement Vector for One Vehicle from Carrier Phase: Integer Ambiguity Not a Problem

The differential position between a single moving receiver at two different times is simply the displacement vector of the receiver. If the receiver can continuously track a minimum of four satellites for a given duration and count up the number of whole cycles occurring in that duration for each satellite, then Equation 3-12 will have four equations and four unknowns. Therefore, it can be used to solve unambiguously for the differential position and clock bias terms, [a^T  c\Delta t]^T.

If the baseline between the two measurements is on the order of tens of meters, then the ionospheric and tropospheric delays will be practically identical, and the accuracy of the differential position vector a will be given by the accuracy of the phase measurements. The carrier phase can typically be measured with an accuracy of 0.01-0.05 cycles (roughly 2 mm-1 cm) [10]. This means that by continuously measuring the phase of a minimum of four satellites for a given duration, differential position (i.e., the a vector) in an external reference system can be estimated with millimeter-to-centimeter accuracy. The simple method which achieves this accuracy is quite remarkable!
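To make Equation 3-12 concrete for the single-vehicle case of Section 3.2, the following minimal sketch (NumPy, with synthetic line-of-sight vectors and illustrative names, not values from the thesis) stacks the four rows [\hat{l}_i^T, 1] and solves the resulting 4×4 system for the displacement a and the clock-bias term cΔt, assuming the whole-cycle counts have been tracked and moved to the left-hand side.

```python
import numpy as np

lam = 0.19                                    # L1 wavelength, approximately 19 cm
l = np.array([[ 0.3,  0.4, 0.866],            # unit line-of-sight vectors to 4 satellites
              [-0.5,  0.2, 0.843],            # (synthetic values for illustration)
              [ 0.1, -0.7, 0.707],
              [ 0.6,  0.1, 0.794]])
l = l / np.linalg.norm(l, axis=1, keepdims=True)

a_true, cdt_true = np.array([12.0, -3.0, 0.5]), 2.0   # truth: displacement [m], clock term [m]
N = np.array([4.0, -7.0, 1.0, 3.0])                   # tracked whole-cycle counts

# Simulated noise-free carrier phase observations per Equation 3-12
dphi = (l @ a_true + cdt_true) / lam + N

# Move the known integer term to the left and solve the 4x4 system
G = np.hstack([l, np.ones((4, 1))])
x = np.linalg.solve(G, lam * dphi - lam * N)
a_est, cdt_est = x[:3], x[3]
assert np.allclose(a_est, a_true) and np.isclose(cdt_est, cdt_true)
```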


3.3 Displacement Vector for Two Vehicles from Carrier Phase: Integer Ambiguity Is a Problem

In order to use carrier phase measurements to calculate the displacement vector between two different receivers, the integer ambiguity must be resolved. The time it takes to resolve the integer ambiguity, or initialization time, is what needs to be reduced in order to use carrier phase measurements for airborne vehicle navigation. There are several methods for resolving the integer ambiguity in differential carrier phase positioning available in present technology. Three methods are discussed in the following sections: real-time kinematic (RTK), wide laning, and Kalman filtering. However, once the integer ambiguity is resolved correctly, the differential position still has the millimeter-to-centimeter accuracy of the single-vehicle case. The only difference is that the two-vehicle case requires an initialization time to resolve the integer ambiguity problem.

3.3.1 Real Time Kinematic (RTK) with Roving Reference Receiver

Differential GPS (DGPS) is a proven technique that takes advantage of the slowly changing nature of the standard GPS errors. If two receivers are within a reasonable distance of each other, differencing the correlated errors can reduce their effects and result in a more accurate position estimate. A mode available from many receiver manufacturers is called Real Time Kinematic (RTK). RTK mode combines DGPS and carrier phase measurements to produce precise relative positioning. RTK mode uses reference and rover receivers, a communication link, and software to precisely determine the rover receiver's position. The initialization time for integer ambiguity resolution for baselines of several kilometers, with 2001 technology, is 30-60 s [10]. Once the initialization time is complete, the two receivers must continue to track the same set of satellites to retain knowledge of the integer ambiguity. There is no reason why both the reference and rover receivers cannot be moving.

3.3.2 Wide Laning Using Dual Frequency

It can be shown that the total number of integer values which satisfy the integer ambiguity decreases as the carrier wavelength increases [10]. There are two related effects which cause this to occur: (1) increased wavelength decreases the number of nodes in the search space of possible solutions, and (2) increased wavelength increases the distance between adjacent nodes in the search space. The GPS satellites are currently transmitting signals at two frequencies: L1 and L2 (one cycle = 19 cm at L1 and 24 cm at L2). So, in order to improve integer estimation, the wide laning method combines both the L1 and L2 frequencies to create a new signal with a longer wavelength. The new signal, L12, results in a wavelength of 0.862 m [10]. This results in a more accurate estimate of the integer ambiguities, but sacrifices accuracy in the estimation of receiver position.

3.3.3 Kalman Filter

A Kalman filter approach can also be used to estimate the integer ambiguities. Both code and carrier measurements are used in this method. The coarse code measurements are used to give a guess at the initial position. Then, the integer ambiguities are treated as floating point states in a Kalman filter. Whenever the "integers" converge (i.e., the filter approaches steady state), they are rounded to the nearest integer to determine the solution.


CHAPTER 4
PROBLEM DEFINITION AND SOLUTION

The usual solution to the reconstruction problem contains both motion and structure components. However, for the purpose of vehicle attitude determination, the structure component is not required. The motion solution component, specifically the translation vector between two camera systems, a^{c1}, when combined with the GPS differential carrier phase positioning between the two systems, a^{LL}, gives all the information needed to know the inertial attitude of the vehicle up to rotation about the translation axis (this limitation is explained further in the next section). Stating the problem in a more formal form: for n point correspondences between two camera systems, c1 and c2, and an inertial translational vector a^{LL}, find the inertial orthonormal rotation matrix R^{c1}_{LL} (or equivalently R^{c2}_{LL}).

Two solution approaches are given based on Section 6.1 of Maybank's text [1]: an SVD-based approach and an error minimization approach. The SVD-based approach is given a complete solution, while the error minimization approach is dealt with in general terms.

4.1 Problem Notes and Assumptions

4.1.1 Rotation Ambiguity about the Translational Axis

The inertial rotation around the translational axis between the two camera systems cannot be found from the information contained in the problem statement. This can be illustrated by looking at a trivial example: if camera one and camera two were both pointed due north and their displacement was also due north, then using a 1-2-3 (or roll-pitch-yaw) Euler angle representation of the orthonormal rotation matrix would mean that the roll is completely ambiguous. Likewise, if the pointing and displacement vector were in any other direction comprised of more than one vector component, then the roll-pitch-yaw angles would be somehow interrelated. In other words, using only two vision systems in the formulation of this problem allows solution for only two degrees of freedom. A third vision system would need to be added to allow solution for a third degree of freedom.

4.1.2 No GPS Attitude Determination Available

GPS attitude determination will not be available on the vehicle of interest. This assumption is reasonable because the size of the vehicle being considered in this application does not support


a long enough baseline to provide accurate measurements. A baseline of 0.5-5 meters is needed to provide useful results from GPS attitude determination [13].

A second approach to GPS attitude determination is to use successive positional measurements and make the assumption that the vehicle attitude corresponds directly to the direction of motion. This would be possible with a single GPS receiver and could use the accurate results associated with the differential carrier phase measurements; however, as soon as the vehicle's flight path contains any rotational velocity, or any non-zero side-slip or angle-of-attack, this approach produces fundamentally erroneous results.

4.1.3 Specification of Control Points

Control points can be specified or tracked between images such that the projections of the point correspondences onto the focal plane are known. How the control points are identified is outside the scope of this thesis. This is a classic recognition or tracking problem in image processing, and all that is assumed in this thesis is that the control point values (i.e., c and r) can be measured with known error statistics.

4.1.4 GPS Differential Carrier Phase Positioning Data Used as Truth

The errors associated with the GPS differential carrier phase positioning are considered negligible compared to other system errors, and these position estimates are therefore used as truth data. This is reasonable since the baseline used in this problem is 10 meters and the errors associated with the GPS measurements are on the order of centimeters, thus corresponding to angular errors on the order of 5 mrad (e.g., tan^{-1}(5 cm / 10 m)). See Chapter 3 for a detailed discussion of this assumption.

4.2 Specific SfM Problem Setup to Find Vehicle Inertial Attitude

The same imaging system is used as given in Chapter 2, noting that the 3-D scene coordinate system used here is commonly called the camera or sensor system in navigational terms. Figure 4-1 is a reproduction of the corresponding figure in Chapter 2, where the following equations remain valid:

c = f x / z,   (4-1)
r = f y / z.   (4-2)

For the 2D-to-2D SfM solution, two different views or images of the same scene with n correspondences are used to reconstruct the 3-D scene in either one of the camera 3-D coordinate systems.


Figure 4-1: Imaging system notation used in finding the vehicle's inertial attitude (focal length f, focal-plane coordinates (c, r), and scene coordinates (x, y, z)).

Using the notation given earlier, the translation and rotation between the two camera systems are a^{c1} and R^{c2}_{c1} respectively. The origins of the two camera systems are given by o_1^{LL} and o_2^{LL}, where LL stands for a local level coordinate system such as North-East-Down (NED) or East-North-Up (ENU). These values cannot be obtained from the imaging system, but are supplied via the GPS sensor and used as truth data.¹ The lines p^{c1} and q^{c2} represent the 3-D scene coordinates of one correspondence expressed in the first and second camera systems respectively. This is shown in Figure 4-2. Note that it is arbitrary which 3-D camera system is called c1 and which is called c2 when solving this problem.

4.3 Singular Value Decomposition (SVD) Solution

The SVD solution approach to the vehicle attitude determination from point correspondences problem is adapted from Section 6.1.1 of Maybank's text [1]. The reader is referred to this text if there are any areas of this algorithm that are not presented to the level of detail that is desired. Let the point correspondence pairs \hat{p}^{c1}_i and \hat{q}^{c2}_i, where i = 1...n such that n ≥ 9, be described by the Euclidean framework notation utilizing the essential matrix E described earlier, such that

E = R^{c2}_{c1} T(a^{c1}),   (4-3)

¹ See Section 4.1.4 for details.


Figure 4-2: Single 2D-to-2D point correspondence used in finding the vehicle's inertial attitude (camera frames c1 and c2, local level frame LL, camera origins o_1^{LL} and o_2^{LL}, displacement a^{c1} (a^{LL}), and control-point vectors p^{c1} and q^{c2}).

where the translation and rotation between the camera systems are given by a^{c1} and R^{c2}_{c1} respectively. An error function V(E) is defined on the space of 3×3 matrices such that

V(E) = \sum_{i=1}^{n} ((\hat{q}^{c2}_i)^T E \hat{p}^{c1}_i)^2.   (4-4)

Find the 3×3 matrix \widehat{E} with unit Frobenius norm that minimizes V(\widehat{E}), and then find the nearest essential matrix to \widehat{E} (the unit Frobenius norm constraint is imposed to exclude the trivial solution \widehat{E} = 0).²

An outline of the SVD solution is given below, with details following background sections on some of the mathematical details needed for the solution.

- Form the n×9 matrix A from the n point correspondence pairs.
- Perform SVD on A via A = U Σ V^T.
- Set \widehat{e} = last column of V.
- Reshape \widehat{e} into the estimate of the essential matrix, \widehat{E}.
- Perform SVD on \widehat{E} via \widehat{E} = U' Σ' V'^T.
- Set \widehat{\hat{a}}^{c1} = last column of V'.
- Define a^{LL} = o_2^{LL} - o_1^{LL}.

² Note that the wide hat accent, \widehat{E}, is used to denote an estimate of a variable. This is not to be confused with the hat accent symbol used to denote a unit vector, \hat{p}. With this nomenclature, \widehat{\hat{a}}^{c} indicates an estimate of the unit vector \hat{a}^{c}.


- Determine a second linearly independent vector that is expressed in both the c1 and LL coordinate systems, \widehat{\hat{b}}^{c1} and \hat{b}^{LL}.
- Calculate a third linearly independent vector that is expressed in both the c1 and LL coordinate systems by taking the cross product between \widehat{\hat{b}}^{c1} and \widehat{\hat{a}}^{c1}, and between \hat{b}^{LL} and \hat{a}^{LL}.
- Use the three vectors expressed in both the c1 and LL coordinate systems to compute R^{c1}_{LL}.

The following subsections give a brief background in the linear algebraic details needed before describing the outlined steps given above.

4.3.1 Brief Overview of Linear Algebra Techniques Used in the SVD Solution

Singular value decomposition. Any m×n matrix A can be written as the product of an m×n column-orthogonal matrix U, an n×n diagonal matrix Σ with positive or zero elements, and the transpose of an n×n orthogonal matrix V [14]:

A = U Σ V^T,   (4-5)

where

Σ = diag(σ_1, σ_2, ..., σ_{n-1}, σ_n) for σ_1 ≥ σ_2 ≥ ... ≥ σ_{n-1} ≥ σ_n ≥ 0   (4-6)

and

U^T U = I,   (4-7)
V^T V = I.   (4-8)

Matrix norm calculations. There are two matrix norm calculations used in this thesis. As given in Maybank's text [1], the Euclidean norm, ||·||, is subordinate to the Euclidean vector norm and is defined as

||A|| = sup{ ||Ax|| : ||x|| = 1 },   (4-9)

where A is an m×n matrix. The Frobenius norm, ||·||_f, is defined as

||A||_f^2 = \sum_{i=1, j=1}^{m, n} A_{ij}^2.   (4-10)

Performing the SVD in each of these matrix norms results in [1]

||A|| = ||U Σ V^T|| = ||Σ|| = σ_1,   (4-11)


||A||_f^2 = ||U Σ V^T||_f^2 = ||Σ||_f^2 = \sum_{i} \sigma_i^2.   (4-12)

4.3.2 Detailed Look at the SVD Solution Algorithm

Form the n×9 A matrix. Looking at a single point correspondence pair, \hat{p}^{c1} and \hat{q}^{c2}, and writing out the essential matrix formulation in its component form gives

\begin{bmatrix} \hat{q}^{c2}_x & \hat{q}^{c2}_y & \hat{q}^{c2}_z \end{bmatrix} \begin{bmatrix} E_{11} & E_{12} & E_{13} \\ E_{21} & E_{22} & E_{23} \\ E_{31} & E_{32} & E_{33} \end{bmatrix} \begin{bmatrix} \hat{p}^{c1}_x \\ \hat{p}^{c1}_y \\ \hat{p}^{c1}_z \end{bmatrix} = 0.   (4-13)

Writing Equation 4-13 in scalar form gives

\hat{q}^{c2}_x\hat{p}^{c1}_x E_{11} + \hat{q}^{c2}_y\hat{p}^{c1}_x E_{21} + \hat{q}^{c2}_z\hat{p}^{c1}_x E_{31} + \hat{q}^{c2}_x\hat{p}^{c1}_y E_{12} + \hat{q}^{c2}_y\hat{p}^{c1}_y E_{22} + \hat{q}^{c2}_z\hat{p}^{c1}_y E_{32} + \hat{q}^{c2}_x\hat{p}^{c1}_z E_{13} + \hat{q}^{c2}_y\hat{p}^{c1}_z E_{23} + \hat{q}^{c2}_z\hat{p}^{c1}_z E_{33} = 0.   (4-14)

Let the \hat{p}^{c1} and \hat{q}^{c2} terms form a row of the n×9 A matrix, ordered to match the row-stacked vector e defined in Equation 4-16. The i-th row of A is then given by

A_i \triangleq [\hat{q}^{c2}_{ix}\hat{p}^{c1}_{ix}  \hat{q}^{c2}_{ix}\hat{p}^{c1}_{iy}  \hat{q}^{c2}_{ix}\hat{p}^{c1}_{iz}  \hat{q}^{c2}_{iy}\hat{p}^{c1}_{ix}  \hat{q}^{c2}_{iy}\hat{p}^{c1}_{iy}  \hat{q}^{c2}_{iy}\hat{p}^{c1}_{iz}  \hat{q}^{c2}_{iz}\hat{p}^{c1}_{ix}  \hat{q}^{c2}_{iz}\hat{p}^{c1}_{iy}  \hat{q}^{c2}_{iz}\hat{p}^{c1}_{iz}].   (4-15)

Rewrite the E terms as a 9×1 column vector as

e \triangleq [E_{11}  E_{12}  E_{13}  E_{21}  E_{22}  E_{23}  E_{31}  E_{32}  E_{33}]^T.   (4-16)

So for nine point correspondences the corresponding matrix equation is


\begin{bmatrix} A_1 \\ A_2 \\ \vdots \\ A_9 \end{bmatrix} \begin{bmatrix} E_{11} \\ E_{12} \\ E_{13} \\ E_{21} \\ E_{22} \\ E_{23} \\ E_{31} \\ E_{32} \\ E_{33} \end{bmatrix} = 0,   (4-17)

where each row A_i is built from the i-th point correspondence pair as in Equation 4-15. Equation 4-17 can be represented compactly for the set of n point correspondence pairs by

A_{n×9} e_{9×1} = 0_{n×1}.   (4-18)

Note that the constraint ||e|| = 1 is placed on e to exclude the trivial solution.

Perform SVD on A. From the above reformulation of the reconstruction problem, it follows that

((\hat{q}^{c2}_i)^T E \hat{p}^{c1}_i)^2 = (A_i e)^2   (4-19)

and also that

V(E) = ||A e||^2.   (4-20)

Then, performing the singular value decomposition on A gives

V(E) = ||U Σ V^T e||^2 = ||Σ (V^T e)||^2 ≥ σ_9^2 ||V^T e||^2 = σ_9^2 ||e||^2 = σ_9^2,   (4-21)

noting that the constraint ||e|| = 1 is invoked in the last step.

Set \widehat{e} = last column of V. So, the 3×3 matrix that minimizes the error function described by Equation 4-4 can be found from the 9×1 column vector

\widehat{e} = V f_9,   (4-22)

where f_9 is the nine-dimensional vector defined as (0, 0, ..., 0, 1)^T. In the absence of noise, e = \widehat{e}. The consequence of this result is shown later to be that V(\widehat{E}) = V(E) = 0.
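The A-matrix construction and SVD step above can be prototyped in a few lines; the sketch below (NumPy, illustrative names, noise-free synthetic data) stacks the rows of Equation 4-15, takes the last right-singular vector of A as \widehat{e}, and applies the row-major reshape described in the next step. The synthetic displacement and scene points are assumptions made only for this check.

```python
import numpy as np

def build_A(p_hats, q_hats):
    """One row per correspondence, ordered so that A @ e reproduces
    q_hat_i^T E p_hat_i with e row-stacked (Equations 4-15 and 4-16)."""
    return np.array([np.kron(q, p) for p, q in zip(p_hats, q_hats)])

def estimate_E(p_hats, q_hats):
    A = build_A(p_hats, q_hats)
    _, _, Vt = np.linalg.svd(A)          # A = U S V^T
    e_hat = Vt[-1]                       # last column of V, already unit norm
    return e_hat.reshape(3, 3)           # row-major reshape into E_hat

# Synthetic check: correspondences from a known displacement, then verify q^T E p ≈ 0
rng = np.random.default_rng(0)
psi = np.radians(4.0)
R = np.array([[ np.cos(psi), np.sin(psi), 0.0],
              [-np.sin(psi), np.cos(psi), 0.0],
              [ 0.0,         0.0,         1.0]])
a = np.array([1.0, 0.3, 0.1])
P = rng.uniform(-5, 5, (12, 3)) + np.array([0.0, 0.0, 30.0])   # scene points in front of c1
Q = (P - a) @ R.T                                               # same points in c2 (Equation 2-7)
p_hats = P / np.linalg.norm(P, axis=1, keepdims=True)
q_hats = Q / np.linalg.norm(Q, axis=1, keepdims=True)

E_hat = estimate_E(p_hats, q_hats)
assert max(abs(q @ E_hat @ p) for p, q in zip(p_hats, q_hats)) < 1e-10
```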


Reshape \widehat{e} into the essential matrix \widehat{E}. \widehat{E} is found from \widehat{e} via

\widehat{E} = \begin{bmatrix} \widehat{e}_1 & \widehat{e}_2 & \widehat{e}_3 \\ \widehat{e}_4 & \widehat{e}_5 & \widehat{e}_6 \\ \widehat{e}_7 & \widehat{e}_8 & \widehat{e}_9 \end{bmatrix},   (4-23)

where V(\widehat{E}) = σ_9^2, which was shown before to be the minimum value. If σ_9 = 0, then V(\widehat{E}) = V(E) = 0 and therefore \widehat{E} = E.

Perform SVD on \widehat{E}. At this point in the algorithm, Maybank [1] discusses another minimization function using the least eigenvalue to derive rotation and translation terms from the essential matrix. The problem described in this thesis requires only the translational term, a^{c1}, to form a solution for the vehicle attitude. That is,

E a^{c1} = R^{c2}_{c1} T(a^{c1}) a^{c1} = R^{c2}_{c1} (a^{c1} × a^{c1}) = R^{c2}_{c1} 0 = 0.   (4-24)

Therefore, a^{c1} is in the null space of E. This results in the need to perform the singular value decomposition on \widehat{E} such that

\widehat{E} = U' Σ' V'^T.   (4-25)

Set \widehat{\hat{a}}^{c1} = last column of V'. Using the same logic as finding \widehat{e},

\widehat{\hat{a}}^{c1} = V' s_3,   (4-26)

where s_3 is defined as the three-dimensional vector (0, 0, 1)^T. Note that in the absence of noise, \widehat{\hat{a}}^{c1} = \hat{a}^{c1}, as would be expected. This concludes the SfM contribution to this algorithm. The remaining steps are a consequence of using the available information from the GPS sensor and matrix manipulation via linear algebra.

Define a^{LL} = o_2^{LL} - o_1^{LL}. Use GPS differential carrier phase positioning to mark the origin of each camera system, c1 and c2. The translation displacement vector can then be obtained via

a^{LL} = o_2^{LL} - o_1^{LL}.   (4-27)
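The translation direction can be pulled out of the essential matrix exactly as Equations 4-25 and 4-26 describe: it is the right singular vector associated with the smallest singular value (recovered up to an overall sign). A self-contained sketch (NumPy, illustrative names, synthetic displacement) is given below.

```python
import numpy as np

def translation_direction(E):
    """Unit vector spanning the null space of an essential matrix (Equations 4-25, 4-26)."""
    _, _, Vt = np.linalg.svd(E)
    return Vt[-1]                            # last column of V', i.e. V' @ (0, 0, 1)^T

# Check with a known displacement: E = R T(a) has null vector a/||a|| (Equation 4-24)
a = np.array([2.0, -1.0, 0.5])
T = np.array([[ 0.0,   a[2], -a[1]],
              [-a[2],  0.0,   a[0]],
              [ a[1], -a[0],  0.0]])
psi = np.radians(7.0)
R = np.array([[ np.cos(psi), np.sin(psi), 0.0],
              [-np.sin(psi), np.cos(psi), 0.0],
              [ 0.0,         0.0,         1.0]])
a_dir = translation_direction(R @ T)
# parallel to the true direction (sign is not observable)
assert np.allclose(np.cross(a_dir, a / np.linalg.norm(a)), 0.0, atol=1e-12)
```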


Determine a second linearly independent vector that is expressed in both the c1 and LL coordinate systems, \widehat{\hat{b}}^{c1} and \hat{b}^{LL}. There are at least two ways to find a second linearly independent vector that is expressed in both the c1 and LL coordinate systems. The first way is to repeat the steps above for a second camera system pair (c1 and c3) to find \widehat{\hat{b}}^{c1} and \hat{b}^{LL}. The second possibility is to use the down vector obtained from the onboard inertial navigation system (INS) as the second vector expressed in both coordinate systems. This would take advantage of the down vector in LL being (0, 0, 1) (NED) or (0, 0, -1) (ENU). Whichever of the two methods is a more convenient choice should be used.

Derive a third linearly independent vector that is expressed in both the c1 and LL coordinate systems by taking the cross product between \widehat{\hat{b}}^{c1} and \widehat{\hat{a}}^{c1}, and between \hat{b}^{LL} and \hat{a}^{LL}. Consider the two vectors used in the cross product as inputs and the resulting vector as output. If the two input vectors are linearly independent, then from the definition of a cross product, the output vector is linearly independent of both input vectors. This result can be exploited to reduce the number of calculated vectors expressed in both the c1 and LL coordinate systems from three to two. The third linearly independent vector can be derived from the first two vectors via

\widehat{\hat{c}}^{c1} = \widehat{\hat{b}}^{c1} × \widehat{\hat{a}}^{c1},   (4-28)
\hat{c}^{LL} = \hat{b}^{LL} × \hat{a}^{LL}.   (4-29)

Use the three vectors expressed in both the c1 and LL coordinate systems to compute R^{c1}_{LL}. Let each column in the 3×3 matrix C be defined as a linearly independent vector expressed in the c1 coordinate system. From the results obtained earlier, C is comprised as

C \triangleq \begin{bmatrix} \widehat{\hat{c}}^{c1}_x & \widehat{\hat{b}}^{c1}_x & \widehat{\hat{a}}^{c1}_x \\ \widehat{\hat{c}}^{c1}_y & \widehat{\hat{b}}^{c1}_y & \widehat{\hat{a}}^{c1}_y \\ \widehat{\hat{c}}^{c1}_z & \widehat{\hat{b}}^{c1}_z & \widehat{\hat{a}}^{c1}_z \end{bmatrix}.   (4-30)

And similarly, let each column of the 3×3 matrix L be defined as the corresponding vectors used to populate C, expressed in the LL coordinate system:

L \triangleq \begin{bmatrix} \hat{c}^{LL}_x & \hat{b}^{LL}_x & \hat{a}^{LL}_x \\ \hat{c}^{LL}_y & \hat{b}^{LL}_y & \hat{a}^{LL}_y \\ \hat{c}^{LL}_z & \hat{b}^{LL}_z & \hat{a}^{LL}_z \end{bmatrix}.   (4-31)

Then, the rotation matrix R^{c1}_{LL} can be determined via

R^{c1}_{LL} = C L^{-1}.   (4-32)
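The three-vector construction of Equations 4-28 through 4-32 is a two-observation attitude fix in the spirit of the TRIAD method; the sketch below (NumPy, illustrative names, with a synthetic attitude and the NED down vector as the second observation) builds C and L and recovers the rotation it was generated with.

```python
import numpy as np

def attitude_from_two_vectors(a_c1, b_c1, a_ll, b_ll):
    """Recover R^{c1}_{LL} from two non-parallel unit vectors known in both frames
    (Equations 4-28 to 4-32)."""
    c_c1, c_ll = np.cross(b_c1, a_c1), np.cross(b_ll, a_ll)
    C = np.column_stack([c_c1, b_c1, a_c1])
    L = np.column_stack([c_ll, b_ll, a_ll])
    return C @ np.linalg.inv(L)

# Synthetic truth: a known attitude, a baseline direction, and the local down vector
psi, theta = np.radians(25.0), np.radians(5.0)
Rz = np.array([[ np.cos(psi), np.sin(psi), 0], [-np.sin(psi), np.cos(psi), 0], [0, 0, 1]])
Ry = np.array([[ np.cos(theta), 0, -np.sin(theta)], [0, 1, 0], [np.sin(theta), 0, np.cos(theta)]])
R_c1_LL_true = Ry @ Rz

a_ll = np.array([1.0, 0.0, 0.0])            # baseline direction from GPS (unit vector)
b_ll = np.array([0.0, 0.0, 1.0])            # NED down vector
a_c1 = R_c1_LL_true @ a_ll                  # the same vectors as seen in the camera frame
b_c1 = R_c1_LL_true @ b_ll

assert np.allclose(attitude_from_two_vectors(a_c1, b_c1, a_ll, b_ll), R_c1_LL_true)
```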


This solution for R^{c1}_{LL} will be unique if the three vectors comprising C and L are linearly independent. Stated another way, if both L and C are of full rank, then R^{c1}_{LL} will be a unique solution.

4.4 Error Minimization Solution Approach

Another approach for finding the inertial attitude of a vehicle utilizing Euclidean SfM techniques is to find the minimum of an error equation using a descent or adaptive algorithm. The descent algorithm is an iterative approach to a solution. It searches the neighborhood around an initial point to determine a new point where the value of the error function is less than at the initial point. The error function is approximated by a Taylor series expansion to simplify the search algorithm. This process repeats until a minimum of the error function is found.

Three solution approaches based on descent algorithms are presented. The first two approaches use the coplanarity constraint to obtain an error equation that can be minimized. The third approach minimizes the system error vector from the linearized measurement model derived in the next chapter. All three solutions are discussed in general terms. The two approaches utilizing the coplanarity constraint were originally given by Horn [15] and are also discussed by Maybank [1]. Development of the third approach, which minimizes the state error vector, is left as a future area of study. All three approaches are discussed in a more general manner than the SVD solution given in the previous section.

4.4.1 Brief Overview of Rotations Using Quaternions

The descent solution algorithms given in Section 4.4.2 make the subtle change of using quaternions to represent the rotation from the inertial to body coordinate system. This section gives a brief overview of the use of quaternions to represent rotations. The following information is derived from lecture notes given by E. Sutton at the University of Florida Graduate Engineering and Research Center in Shalimar, FL. Further information on quaternions can be found in Fraleigh [16] and Zipfel [17].

Quaternions. Quaternions can be thought of as a four-element vector or as a sum:

q = \begin{bmatrix} q_1 \\ q_2 \\ q_3 \\ q_4 \end{bmatrix} = q_1\hat{i} + q_2\hat{j} + q_3\hat{k} + q_4,   (4-33)


where \hat{i}, \hat{j}, and \hat{k} obey the following relationships: \hat{i}\hat{j} = \hat{k}, \hat{j}\hat{k} = \hat{i}, \hat{k}\hat{i} = \hat{j}, \hat{j}\hat{i} = -\hat{k}, \hat{k}\hat{j} = -\hat{i}, and \hat{i}\hat{k} = -\hat{j}. Quaternion multiplication, or composition, is defined as follows:

ab = (a_1\hat{i} + a_2\hat{j} + a_3\hat{k} + a_4)(b_1\hat{i} + b_2\hat{j} + b_3\hat{k} + b_4).   (4-34)

Expanding the terms on the right side of Equation 4-34 and grouping by the \hat{i}, \hat{j}, \hat{k}, and scalar parts gives

ab = (a_1 b_4 + a_2 b_3 - a_3 b_2 + a_4 b_1)\hat{i} + (-a_1 b_3 + a_2 b_4 + a_3 b_1 + a_4 b_2)\hat{j} + (a_1 b_2 - a_2 b_1 + a_3 b_4 + a_4 b_3)\hat{k} + (-a_1 b_1 - a_2 b_2 - a_3 b_3 + a_4 b_4).   (4-35)

Quaternion conjugation is defined by

q^* = \begin{bmatrix} -q_1 \\ -q_2 \\ -q_3 \\ q_4 \end{bmatrix} = -q_1\hat{i} - q_2\hat{j} - q_3\hat{k} + q_4.   (4-36)

The multiplicative inverse is given by

q^{-1} = q^* / ||q||^2,   (4-37)

where

||q|| = \sqrt{q_1^2 + q_2^2 + q_3^2 + q_4^2}.   (4-38)

Quaternion multiplication can also be performed using two equivalent matrix-vector forms:

ab = \begin{bmatrix} a_4 & -a_3 & a_2 & a_1 \\ a_3 & a_4 & -a_1 & a_2 \\ -a_2 & a_1 & a_4 & a_3 \\ -a_1 & -a_2 & -a_3 & a_4 \end{bmatrix} \begin{bmatrix} b_1 \\ b_2 \\ b_3 \\ b_4 \end{bmatrix} = \begin{bmatrix} b_4 & b_3 & -b_2 & b_1 \\ -b_3 & b_4 & b_1 & b_2 \\ b_2 & -b_1 & b_4 & b_3 \\ -b_1 & -b_2 & -b_3 & b_4 \end{bmatrix} \begin{bmatrix} a_1 \\ a_2 \\ a_3 \\ a_4 \end{bmatrix}.   (4-39)

Quaternions obey the following properties:

(q^*)^* = q   (4-40)
||q||^2 = q q^* = q^* q   (4-41)
(pq)^* = q^* p^*   (4-42)


||pq|| = ||p|| ||q||   (4-43)
(pq)^{-1} = q^{-1} p^{-1}   (4-44)

Quaternions representing rotations. Quaternions can be used to represent rotations:

q = \begin{bmatrix} e \sin(\theta/2) \\ \cos(\theta/2) \end{bmatrix} = \begin{bmatrix} e_x \sin(\theta/2) \\ e_y \sin(\theta/2) \\ e_z \sin(\theta/2) \\ \cos(\theta/2) \end{bmatrix} = \begin{bmatrix} q_1 \\ q_2 \\ q_3 \\ q_4 \end{bmatrix},   (4-45)

where e = (e_x, e_y, e_z) is the axis of rotation and \theta is the angle of rotation. A quaternion that represents a rotation must have unit length, ||q|| = 1. Also, note that -q represents the same rotation as q. A rotation from coordinate system x to coordinate system y is accomplished as follows:

r^y = q^y_x r^x (q^y_x)^*.   (4-46)

Expanding into matrix notation gives

\begin{bmatrix} y_1 \\ y_2 \\ y_3 \\ 0 \end{bmatrix} = \begin{bmatrix} q_4 & -q_3 & q_2 & q_1 \\ q_3 & q_4 & -q_1 & q_2 \\ -q_2 & q_1 & q_4 & q_3 \\ -q_1 & -q_2 & -q_3 & q_4 \end{bmatrix} \begin{bmatrix} q_4 & -q_3 & q_2 & -q_1 \\ q_3 & q_4 & -q_1 & -q_2 \\ -q_2 & q_1 & q_4 & -q_3 \\ q_1 & q_2 & q_3 & q_4 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ 0 \end{bmatrix}
= \begin{bmatrix} q_1^2 - q_2^2 - q_3^2 + q_4^2 & 2(q_1 q_2 - q_3 q_4) & 2(q_1 q_3 + q_2 q_4) & 0 \\ 2(q_1 q_2 + q_3 q_4) & -q_1^2 + q_2^2 - q_3^2 + q_4^2 & 2(-q_1 q_4 + q_2 q_3) & 0 \\ 2(q_1 q_3 - q_2 q_4) & 2(q_1 q_4 + q_2 q_3) & -q_1^2 - q_2^2 + q_3^2 + q_4^2 & 0 \\ 0 & 0 & 0 & q_1^2 + q_2^2 + q_3^2 + q_4^2 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ 0 \end{bmatrix},   (4-47)

noting that any three-dimensional vector can be rotated using quaternions by adding a zero for the scalar component.
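The quaternion algebra above is easy to sanity-check numerically; the sketch below (NumPy, illustrative names, scalar-last components as in Equation 4-33) implements the product of Equation 4-35, the conjugate, and the rotation of Equation 4-46, and confirms that rotating a vector with q r q^* agrees with the upper-left 3×3 block of Equation 4-47.

```python
import numpy as np

def qmul(a, b):
    """Quaternion product with scalar-last components (Equation 4-35)."""
    a1, a2, a3, a4 = a
    b1, b2, b3, b4 = b
    return np.array([ a1*b4 + a2*b3 - a3*b2 + a4*b1,
                     -a1*b3 + a2*b4 + a3*b1 + a4*b2,
                      a1*b2 - a2*b1 + a3*b4 + a4*b3,
                     -a1*b1 - a2*b2 - a3*b3 + a4*b4])

def qconj(q):
    return np.array([-q[0], -q[1], -q[2], q[3]])

def qrotate(q, v):
    """Rotate a 3-vector by appending a zero scalar part (Equation 4-46)."""
    v4 = np.array([v[0], v[1], v[2], 0.0])
    return qmul(qmul(q, v4), qconj(q))[:3]

def dcm(q):
    """Upper-left 3x3 block of Equation 4-47 (Equation 4-48)."""
    q1, q2, q3, q4 = q
    return np.array([
        [q1*q1 - q2*q2 - q3*q3 + q4*q4, 2*(q1*q2 - q3*q4),              2*(q1*q3 + q2*q4)],
        [2*(q1*q2 + q3*q4),             -q1*q1 + q2*q2 - q3*q3 + q4*q4, 2*(-q1*q4 + q2*q3)],
        [2*(q1*q3 - q2*q4),             2*(q1*q4 + q2*q3),              -q1*q1 - q2*q2 + q3*q3 + q4*q4]])

theta, e = np.radians(40.0), np.array([0.0, 0.0, 1.0])       # rotation about the z-axis
q = np.append(e * np.sin(theta / 2), np.cos(theta / 2))      # Equation 4-45
v = np.array([1.0, 2.0, 3.0])
assert np.allclose(qrotate(q, v), dcm(q) @ v)
```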


The rotation matrix C^y_x equivalent to q^y_x is the upper-left 3×3 block from Equation 4-47:

C^y_x = \begin{bmatrix} q_1^2 - q_2^2 - q_3^2 + q_4^2 & 2(q_1 q_2 - q_3 q_4) & 2(q_1 q_3 + q_2 q_4) \\ 2(q_1 q_2 + q_3 q_4) & -q_1^2 + q_2^2 - q_3^2 + q_4^2 & 2(-q_1 q_4 + q_2 q_3) \\ 2(q_1 q_3 - q_2 q_4) & 2(q_1 q_4 + q_2 q_3) & -q_1^2 - q_2^2 + q_3^2 + q_4^2 \end{bmatrix}.   (4-48)

If q^y_x represents a rotation from x to y, and q^z_y represents a rotation from y to z, then the rotation from x to z is given by

q^z_x = q^z_y q^y_x.   (4-49)

4.4.2 Descent Algorithms Using the Coplanarity Constraint

Overview. Section 6.1.2 of Maybank's text [1] shows first and second order descent algorithms which could be used to calculate an estimate of the displacement vector in the body coordinate system, \hat{a}^{c1}. GPS differential carrier-phase positioning would then give a corresponding displacement vector in a local level coordinate system, a^{LL}. Another vector expressed in both coordinate systems would then need to be found. Once two vectors expressed in both coordinate systems are found, the solution for the vehicle attitude can be found in the same manner outlined in the SVD solution.

Calculation of the displacement vector in the body coordinate system. Both descent algorithms given in Maybank's text [1] use the error function V(E) described by Equation 4-4 with two subtle changes: expanding the essential matrix E to its implicit components of the camera displacement {R^{c2}_{c1}, \hat{a}^{c1}}, and using a unit quaternion z^{c2}_{c1} to represent the rotation between the camera systems instead of the orthonormal rotation matrix R^{c2}_{c1}. Making the above two changes to Equation 4-4 results in [1]

V(E) = V(R^{c2}_{c1}, \hat{a}^{c1}) = V(z^{c2}_{c1}, \hat{a}^{c1}) = \sum_{i=1}^{n} [ (z^{c2}_{c1} \hat{q}^{c2}_i (z^{c2}_{c1})^{-1}) \cdot (\hat{a}^{c1} \times \hat{p}^{c1}_i) ]^2,   (4-50)

where Equation 4-50 utilizes the coplanarity constraint. Note that the hat symbol on \hat{a}^{c1}, \hat{q}^{c2}_i, and \hat{p}^{c1}_i in Equation 4-50 is used to denote unit vectors.


The descent algorithm searches the neighborhood around (z^{c2}_{c1}, \hat{a}^{c1}) and determines the small perturbations \delta z^{c2}_{c1} and \delta a^{c1} which produce the greatest decrease in V for the given constraints

||z^{c2}_{c1} + \delta z^{c2}_{c1}|| = ||z^{c2}_{c1}|| = 1,   (4-51)
||\hat{a}^{c1} + \delta a^{c1}|| = ||\hat{a}^{c1}|| = 1.   (4-52)

The Taylor series expansion of Equation 4-50 about the point (z^{c2}_{c1}, \hat{a}^{c1}) is

V(z^{c2}_{c1} + \delta z^{c2}_{c1}, \hat{a}^{c1} + \delta a^{c1}) = V(z^{c2}_{c1}, \hat{a}^{c1}) + l \cdot \delta z^{c2}_{c1} + m \cdot \delta a^{c1} + (\delta z^{c2}_{c1})^T L (\delta z^{c2}_{c1}) + 2 (\delta z)^T M (\delta a^{c1}) + (\delta a^{c1})^T N (\delta a^{c1}) + H.O.T.,   (4-53)

where l and m are vectors and L, M, and N are matrices which are functions of z^{c2}_{c1}, \hat{a}^{c1}, \hat{q}^{c2}, and \hat{p}^{c1}. Equation 4-53 is used to calculate V in Maybank's descent algorithms [1].³

First order descent. The first order descent algorithm uses the first three terms of Equation 4-53 to calculate \delta z and \delta a. The values of \delta z and \delta a are sought which make l \cdot \delta z + m \cdot \delta a the most negative, subject to the constraints given by Equations 4-51 and 4-52. Also, a step size h needs to be defined to constrain ||\delta z|| and ||\delta a|| to be a small fixed value:

||\delta z|| = ||\delta a|| = h.   (4-54)

This constraint is imposed to ensure that the first order approximation is valid in the region around V(z, a) that is searched [1].

The two delta terms, \delta z and \delta a, are found independently of each other. If the starting point of the descent algorithm is given as (z_1, a_1), then the value of \delta z is found via the error function

W(\lambda_1, \lambda_2, \delta z) = l \cdot \delta z + \lambda_1 (||\delta z||^2 - h^2) + \lambda_2 (||z_1 + \delta z||^2 - 1),   (4-55)

where \lambda_1 and \lambda_2 are Lagrange multipliers used to enforce the constraints listed in Equations 4-51 and 4-54. Differentiating Equation 4-55 with respect to \lambda_1, \lambda_2, and \delta z, and setting the respective results to zero, gives three equations and three unknowns. These three equations can be used to solve for the value of \delta z which results in the greatest decrease in V. A similar method can be used to find the value of \delta a which produces the greatest decrease in the value of V [1].

³ For ease of readability, the unit vector hat symbol, superscripts, and subscripts will be dropped on the references to z^{c2}_{c1}, \delta z^{c2}_{c1}, \hat{a}^{c1}, and \delta a^{c1} for the remainder of the discussion of Maybank's descent algorithms.
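The first-order idea can be prototyped without the explicit Lagrange-multiplier algebra by stepping against a numerical gradient and re-normalizing to respect the unit-norm constraints. The sketch below (NumPy, illustrative names) is this simplified stand-in, not Maybank's exact update; the error function V passed in would be Equation 4-50 in practice.

```python
import numpy as np

def descend(V, z0, a0, h=1e-2, shrink=0.5, h_min=1e-6, tol=1e-12):
    """Simplified first-order descent on V(z, a) with ||z|| = ||a|| = 1 enforced
    by re-normalizing after each step (a stand-in for Equations 4-51 to 4-55)."""
    z, a = z0 / np.linalg.norm(z0), a0 / np.linalg.norm(a0)

    def grad(f, x, eps=1e-7):
        g = np.zeros_like(x)
        for i in range(x.size):
            dx = np.zeros_like(x); dx[i] = eps
            g[i] = (f(x + dx) - f(x - dx)) / (2 * eps)
        return g

    while h > h_min:
        gz = grad(lambda z_: V(z_, a), z)
        ga = grad(lambda a_: V(z, a_), a)
        z_new = z - h * gz / (np.linalg.norm(gz) + 1e-15)
        a_new = a - h * ga / (np.linalg.norm(ga) + 1e-15)
        z_new, a_new = z_new / np.linalg.norm(z_new), a_new / np.linalg.norm(a_new)
        if V(z_new, a_new) < V(z, a) - tol:
            z, a = z_new, a_new            # accept the step
        else:
            h *= shrink                    # otherwise retry with a smaller step size
    return z, a
```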


The updated values (z_2, a_2) are defined as

(z_2, a_2) = (z_1 + \delta z, a_1 + \delta a).   (4-56)

If the value of V(z_2, a_2) is significantly smaller than the value of V(z_1, a_1), then the descent algorithm is applied to (z_2, a_2) in place of (z_1, a_1). If V(z_2, a_2) is not less than a defined delta from V(z_1, a_1), then the algorithm is repeated on (z_1, a_1) with a smaller step size, h, to determine \delta z and \delta a. If the step size reaches its smallest allowable value, then the algorithm terminates and returns the current value of (z, a) as an estimate of the global minimum of the error function V.

Second order descent. The second order descent algorithm can be applied if the first order descent algorithm does not produce a reasonable value for the global minimum of V. The second order algorithm uses all terms of Equation 4-53 to calculate \delta z and \delta a, with the additional constraints added:

z_1 \cdot \delta z = 0,   (4-57)
a_1 \cdot \delta a = 0.   (4-58)

The error function U is defined by

U(\lambda_1, \lambda_2, \delta z, \delta a) = l \cdot \delta z + m \cdot \delta a + (\delta z)^T L (\delta z) + 2 (\delta z)^T M (\delta a) + (\delta a)^T N (\delta a) + \lambda_1 (z_1 \cdot \delta z) + \lambda_2 (a_1 \cdot \delta a),   (4-59)

where the Lagrange multipliers \lambda_1 and \lambda_2 are used to enforce the constraints given in Equations 4-57 and 4-58. Differentiating Equation 4-59 with respect to \delta z and \delta a and setting the results to zero gives two matrix equations with four unknowns. The two constraint equations, z_1 \cdot \delta z = 0 and a_1 \cdot \delta a = 0, can then be applied to solve for \lambda_1 and \lambda_2. Then, substituting the values of \lambda_1 and \lambda_2 into the two matrix equations found via setting the partial derivatives of Equation 4-59 to zero allows the two equations to be solved for \delta z and \delta a [1].

The updated values (z_2, a_2) are defined as

(z_2, a_2) = (z_1 + \delta z, a_1 + \delta a).   (4-60)

If the value of V(z_2, a_2) is significantly smaller than the value of V(z_1, a_1), then the descent algorithm is applied to (z_2, a_2) in place of (z_1, a_1). If V(z_2, a_2) is not less than a defined delta


from V(z_1, a_1), then the algorithm terminates and returns the current value of (z, a) as an estimate of the global minimum of the error function V.⁴

4.4.3 Descent Algorithm Using the Linearized Measurement Model

The descent, or error minimization, approach is less efficient than the SVD solution. But the efficiency of the SVD approach can be lost if the incorporation of statistics into the algorithm is desired. The descent algorithm allows the incorporation of statistics, with the cost being a more complicated error equation. A linearized model of the system which incorporates focal plane measurement error, Equation 5-33, is derived in the next chapter. The state error vector, v_i, of independent variables is a function of the point correspondence measurements (\hat{q}^{c2}_i, \hat{p}^{c1}_i) and the measurement errors (\tilde{c}_i, \tilde{r}_i). An error equation based on minimizing the state error vector can be written in the form

J(v_i) = \sum_{i=1}^{n} v_i^T W v_i,   (4-61)

where W is added as a weight matrix because the state error vector v is a combination of angular and range errors. Equation 4-61 can be used to form a descent solution algorithm similar to the ones described in Section 6.1.2 of Maybank's text [1]. Development and implementation of this solution technique is left for a future area of study.

⁴ Further details of the first and second order descent solution algorithms are given in Section 6.1.2 of Maybank's text [1].


CHAPTER 5
ERROR ANALYSIS

The accuracy of the inertial attitude estimates is the litmus test for the viability of this application. This chapter starts with a presentation of the error model scenario. All assumptions and information considered as constants are outlined. Next, the derivation of a generalized system model is presented. Then, an algorithm simulating the generalized model is described and verification results are presented. Finally, the results of parameter studies utilizing this algorithm are presented for camera focal length/field of view (FOV), camera depression angle, displacement vector magnitude between imaging systems, number of control points, focal plane location of control points, and control point measurement error.

5.1 Generalized Linearized Error Model

To quantify the accuracy of this model, consider an aerial vehicle, or agent, that contains an imaging system, GPS sensor, and INS system. The imaging system consists of a camera with a given focal length, physical size, pixel count, and depression angle. Suppose the agent is flying horizontally so that the translation vector a^{LL} does not have any component in the up, or \hat{z}^{LL}, direction.¹ While moving horizontally, the agent takes two images such that they have an overlapping FOV. A minimum of five image correspondences must be present in the two images. An example scenario is shown in Figure 5-1. The measurement model equation is given by

q^{c2} = R^{c2}_{c1} p^{c1} - R^{c2}_{LL} a^{LL},   (5-1)

where p^{c1} is the vector from the first camera position to the control point, q^{c2} is the vector from the second camera position to the control point, and a^{LL} is the vector from the first camera position to the second camera position in the LL coordinate system. As a reminder, the notation q^x denotes a vector q expressed in coordinate system x, and R^y_x is the rotation matrix from coordinate system x to coordinate system y.

¹ The horizontal constraint is imposed to cancel out the translational axis ambiguity. The c, or center, coordinate system will be introduced later to explain the reasoning for this seemingly arbitrary constraint.


The coordinate systems used in this model are described in Table 5-1.

Figure 5-1: Error analysis scenario (two camera positions c1 and c2 with overlapping fields of view, displacement a^{LL} = [a_x, a_y, 0], and control-point vectors p^{c1} and q^{c2}).

Table 5-1: Coordinate system definitions

c1 — Camera coordinates for the camera in the first position; the x-y plane is the focal plane of the camera, and the z axis is along the camera boresight.
c2 — Camera coordinates for the camera in the second position; same convention as above.
c — Center coordinate system, where the \hat{x}^c axis points along vector a^{LL} and the \hat{z}^c axis is as close to vertical as possible (see the appendix for a discussion of the c coordinate system).
LL — Local level coordinates; north-east-down or east-north-up.

We will assume that the direction of p^{c1} is a perfect measurement and all errors in direction are associated with the second measurement q^{c2}. However, the range to the control point is unknown and must be calculated from the measurements, so let

p^{c1} = \widehat{p}^{c1} (1 + \delta p),   (5-2)

where \widehat{p}^{c1} is an estimate of p^{c1}, and \delta p is the proportion of error in the length of \widehat{p}^{c1}.


In order to overcome the translation axis ambiguity, the horizontal constraint on the displacement vector's direction is used to define the coordinate system c such that the \hat{x}^c axis points along vector a^{LL} and the \hat{z}^c axis is as close to vertical as possible. As discussed in the appendix, the matrix R^c_{LL} is given by

R^c_{LL} = \frac{1}{\|a^{LL}\|} \begin{bmatrix} a^{LL}_x & a^{LL}_y & a^{LL}_z \\ \frac{-a^{LL}_y \|a\|}{\sqrt{(a^{LL}_x)^2 + (a^{LL}_y)^2}} & \frac{a^{LL}_x \|a\|}{\sqrt{(a^{LL}_x)^2 + (a^{LL}_y)^2}} & 0 \\ \frac{-a^{LL}_x a^{LL}_z}{\sqrt{(a^{LL}_x)^2 + (a^{LL}_y)^2}} & \frac{-a^{LL}_y a^{LL}_z}{\sqrt{(a^{LL}_x)^2 + (a^{LL}_y)^2}} & \sqrt{(a^{LL}_x)^2 + (a^{LL}_y)^2} \end{bmatrix}.   (5-3)

Note that when the rotation R^c_{LL} is applied to a^{LL}, the following result is obtained:

\begin{bmatrix} a \\ 0 \\ 0 \end{bmatrix} = R^c_{LL} \begin{bmatrix} a^{LL}_x \\ a^{LL}_y \\ a^{LL}_z \end{bmatrix},   (5-4)

where a = ||a^{LL}||. Now, let

R^{c2}_{c1} = R_1 \widehat{R}^{c2}_{c1},   (5-5)

where \widehat{R}^{c2}_{c1} is an estimate of R^{c2}_{c1} and R_1 is the misalignment in the estimate. Also, let

R^{c2}_{LL} = \widehat{R}^{c2}_{LL} R^{LL}_{c} R_2 R^{c}_{LL},   (5-6)

where \widehat{R}^{c2}_{LL} is an estimate of R^{c2}_{LL} and R_2 is the misalignment in the estimate expressed in the c coordinate system. When R_2 is expressed in the c coordinate system, one of the three misalignment angles is unobservable and will drop out of the calculations. This removes the translational axis ambiguity from this problem. The measurement equation for the generalized error model, Equation 5-1, now becomes

q^{c2} = R_1 \widehat{R}^{c2}_{c1} \widehat{p}^{c1} (1 + \delta p) - \widehat{R}^{c2}_{LL} R^{LL}_{c} R_2 R^{c}_{LL} a^{LL}.   (5-7)

To simplify Equation 5-7, combine some constant factors:

\widehat{p}^{c2} = \widehat{R}^{c2}_{c1} \widehat{p}^{c1},   (5-8)
\widehat{R}^{c2}_{c} = \widehat{R}^{c2}_{LL} R^{LL}_{c},   (5-9)
a^{c} = R^{c}_{LL} a^{LL}.   (5-10)

Then the measurement model equation becomes

q^{c2} = R_1 \widehat{p}^{c2} (1 + \delta p) - \widehat{R}^{c2}_{c} R_2 a^{c}.   (5-11)
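The center-frame rotation of Equation 5-3 is straightforward to implement and test; the sketch below (NumPy, illustrative names, arbitrary synthetic displacement) builds R^c_{LL} and checks both its orthonormality and the property of Equation 5-4 that it maps a^{LL} onto the \hat{x}^c axis.

```python
import numpy as np

def R_c_LL(a_ll):
    """Rotation from the local-level frame to the center frame c of Equation 5-3:
    x^c points along a^LL and z^c is as close to vertical as possible."""
    ax, ay, az = a_ll
    norm_a = np.linalg.norm(a_ll)
    horiz = np.hypot(ax, ay)                      # sqrt(ax^2 + ay^2)
    return (1.0 / norm_a) * np.array([
        [ax,                     ay,                     az],
        [-ay * norm_a / horiz,    ax * norm_a / horiz,    0.0],
        [-ax * az / horiz,       -ay * az / horiz,        horiz]])

a_ll = np.array([8.0, 5.0, 1.5])
R = R_c_LL(a_ll)
assert np.allclose(R @ R.T, np.eye(3))                                # orthonormal
assert np.allclose(R @ a_ll, [np.linalg.norm(a_ll), 0.0, 0.0])        # Equation 5-4
```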


Equation 5-11 written out with all vectors and matrices expanded is

\begin{bmatrix} q^{c2}_x \\ q^{c2}_y \\ q^{c2}_z \end{bmatrix} = \begin{bmatrix} 1 & \tilde\psi_1 & -\tilde\theta_1 \\ -\tilde\psi_1 & 1 & \tilde\phi_1 \\ \tilde\theta_1 & -\tilde\phi_1 & 1 \end{bmatrix} \begin{bmatrix} \widehat{p}^{c2}_x \\ \widehat{p}^{c2}_y \\ \widehat{p}^{c2}_z \end{bmatrix} (1 + \delta p) - \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix} \begin{bmatrix} a \\ -\tilde\psi_2 a \\ \tilde\theta_2 a \end{bmatrix}.   (5-12)

The solutions for the (c, r) coordinates on the focal plane of the second imaging system (c2), given by Equations 4-1 and 4-2, are

c = f q^{c2}_x / q^{c2}_z = f [ (\widehat{p}^{c2}_x + \tilde\psi_1\widehat{p}^{c2}_y - \tilde\theta_1\widehat{p}^{c2}_z)(1 + \delta p) - (r_{11}a - r_{12}\tilde\psi_2 a + r_{13}\tilde\theta_2 a) ] / [ (\tilde\theta_1\widehat{p}^{c2}_x - \tilde\phi_1\widehat{p}^{c2}_y + \widehat{p}^{c2}_z)(1 + \delta p) - (r_{31}a - r_{32}\tilde\psi_2 a + r_{33}\tilde\theta_2 a) ],   (5-13)

r = f q^{c2}_y / q^{c2}_z = f [ (-\tilde\psi_1\widehat{p}^{c2}_x + \widehat{p}^{c2}_y + \tilde\phi_1\widehat{p}^{c2}_z)(1 + \delta p) - (r_{21}a - r_{22}\tilde\psi_2 a + r_{23}\tilde\theta_2 a) ] / [ (\tilde\theta_1\widehat{p}^{c2}_x - \tilde\phi_1\widehat{p}^{c2}_y + \widehat{p}^{c2}_z)(1 + \delta p) - (r_{31}a - r_{32}\tilde\psi_2 a + r_{33}\tilde\theta_2 a) ].   (5-14)

A summary of the above variables is given in Table 5-2, where \widehat{p}^{c2}_x, \widehat{p}^{c2}_y, \widehat{p}^{c2}_z, a, and {r_{ij}} are treated as constants; c and r are dependent variables, or measurements subject to measurement errors; and the state vector of independent variables is (\tilde\phi_1, \tilde\theta_1, \tilde\psi_1, \tilde\theta_2, \tilde\psi_2, \delta p). Note that \tilde\phi_2, which represents the roll about the generalized translational axis \hat{x}^c, does not appear in Equations 5-13 and 5-14. This resolves the translational axis ambiguity that results from only using two images to obtain the attitude estimate.

To linearize the measurement model, we calculate the partial derivatives of each of the measurements with respect to the independent variables about a nominal point. Let the notation

f = f_0 + \frac{\partial f}{\partial b}\Big|_0 \Delta b + \frac{\partial f}{\partial c}\Big|_0 \Delta c + higher order terms   (5-15)

indicate a first order Taylor series expansion on the independent variables a, b, and c about a nominal point, where the variables are [a b c]^T and the nominal point is [a b_0 c_0]^T. The nominal point consists of a constant value a and nominal values for b and c. Extending this notation to the measurement equation gives the generalized linearized measurement model as

c = c_0 + \frac{\partial c}{\partial \tilde\phi_1}\Big|_0 \Delta\tilde\phi_1 + \frac{\partial c}{\partial \tilde\theta_1}\Big|_0 \Delta\tilde\theta_1 + \frac{\partial c}{\partial \tilde\psi_1}\Big|_0 \Delta\tilde\psi_1 + \frac{\partial c}{\partial \tilde\theta_2}\Big|_0 \Delta\tilde\theta_2 + \frac{\partial c}{\partial \tilde\psi_2}\Big|_0 \Delta\tilde\psi_2 + \frac{\partial c}{\partial \delta p}\Big|_0 \Delta\delta p,   (5-16)

r = r_0 + \frac{\partial r}{\partial \tilde\phi_1}\Big|_0 \Delta\tilde\phi_1 + \frac{\partial r}{\partial \tilde\theta_1}\Big|_0 \Delta\tilde\theta_1 + \frac{\partial r}{\partial \tilde\psi_1}\Big|_0 \Delta\tilde\psi_1 + \frac{\partial r}{\partial \tilde\theta_2}\Big|_0 \Delta\tilde\theta_2 + \frac{\partial r}{\partial \tilde\psi_2}\Big|_0 \Delta\tilde\psi_2 + \frac{\partial r}{\partial \delta p}\Big|_0 \Delta\delta p,   (5-17)


Table 5-2: Description of variables in the generalized error model

\widehat{p}^{c2}_x, \widehat{p}^{c2}_y, \widehat{p}^{c2}_z — The position of the control point designated from the first camera position with the range estimated, transformed to the second camera system, c2, via \widehat{R}^{c2}_{c1}.
a — Distance between the first camera position and the second camera position; the direction of displacement between camera positions appears in the measurement equations implicitly.
{r_{ij}} — Elements of matrix \widehat{R}^{c2}_{c}; a function of estimated or known rotations.
r, c — Measurement of the control point position from the second camera position.
\tilde\phi_1, \tilde\theta_1, \tilde\psi_1 — Misalignment angles in the estimate of the rotation of the camera from the first position to the second position.
\tilde\theta_2, \tilde\psi_2 — Misalignment angles in the estimate of the pitch and yaw 3-2-1 Euler angles from the c coordinate system to the c2 system. Note that if the vehicle is flying horizontally (i.e., no \hat{z}^{LL} component to a^{LL}), these angles correspond directly to the inertial pitch and heading attitude errors of the vehicle in the second position.
\delta p — Proportion of error in the estimate of range to the control point.


where the variables in the expansion are [\widehat{q}^{c2}, \widehat{p}^{c1}, r_{ij}, a, \tilde\phi_1, \tilde\theta_1, \tilde\psi_1, \tilde\theta_2, \tilde\psi_2, \delta p]^T and the nominal point is [\widehat{q}^{c2}, \widehat{p}^{c1}, r_{ij}, a, \tilde\phi_{1,0}, \tilde\theta_{1,0}, \tilde\psi_{1,0}, \tilde\theta_{2,0}, \tilde\psi_{2,0}, \delta p_0]^T. Note that \widehat{q}^{c2}, \widehat{p}^{c1}, r_{ij}, and a are treated as constants in the Taylor series expansion of the measurement equations for c and r. Calculating the partial derivatives for each of the terms in Equations 5-16 and 5-17, and letting all the nominal values be zero, gives

\partial c/\partial\tilde\phi_1 = f (\widehat{p}^{c2}_x - r_{11}a)\widehat{p}^{c2}_y / (\widehat{p}^{c2}_z - r_{31}a)^2   (5-18)
\partial c/\partial\tilde\theta_1 = f [-(\widehat{p}^{c2}_z - r_{31}a)\widehat{p}^{c2}_z - (\widehat{p}^{c2}_x - r_{11}a)\widehat{p}^{c2}_x] / (\widehat{p}^{c2}_z - r_{31}a)^2   (5-19)
\partial c/\partial\tilde\psi_1 = f (\widehat{p}^{c2}_z - r_{31}a)\widehat{p}^{c2}_y / (\widehat{p}^{c2}_z - r_{31}a)^2   (5-20)
\partial c/\partial\tilde\theta_2 = f [-(\widehat{p}^{c2}_z - r_{31}a)r_{13}a + (\widehat{p}^{c2}_x - r_{11}a)r_{33}a] / (\widehat{p}^{c2}_z - r_{31}a)^2   (5-21)
\partial c/\partial\tilde\psi_2 = f [(\widehat{p}^{c2}_z - r_{31}a)r_{12}a - (\widehat{p}^{c2}_x - r_{11}a)r_{32}a] / (\widehat{p}^{c2}_z - r_{31}a)^2   (5-22)
\partial c/\partial\delta p = f [-r_{31}a\,\widehat{p}^{c2}_x + r_{11}a\,\widehat{p}^{c2}_z] / (\widehat{p}^{c2}_z - r_{31}a)^2   (5-23)
\partial r/\partial\tilde\phi_1 = f [(\widehat{p}^{c2}_z - r_{31}a)\widehat{p}^{c2}_z + (\widehat{p}^{c2}_y - r_{21}a)\widehat{p}^{c2}_y] / (\widehat{p}^{c2}_z - r_{31}a)^2   (5-24)
\partial r/\partial\tilde\theta_1 = f [-(\widehat{p}^{c2}_y - r_{21}a)\widehat{p}^{c2}_x] / (\widehat{p}^{c2}_z - r_{31}a)^2   (5-25)
\partial r/\partial\tilde\psi_1 = f [-(\widehat{p}^{c2}_z - r_{31}a)\widehat{p}^{c2}_x] / (\widehat{p}^{c2}_z - r_{31}a)^2   (5-26)
\partial r/\partial\tilde\theta_2 = f [-(\widehat{p}^{c2}_z - r_{31}a)r_{23}a + (\widehat{p}^{c2}_y - r_{21}a)r_{33}a] / (\widehat{p}^{c2}_z - r_{31}a)^2   (5-27)
\partial r/\partial\tilde\psi_2 = f [(\widehat{p}^{c2}_z - r_{31}a)r_{22}a - (\widehat{p}^{c2}_y - r_{21}a)r_{32}a] / (\widehat{p}^{c2}_z - r_{31}a)^2   (5-28)
\partial r/\partial\delta p = f [-r_{31}a\,\widehat{p}^{c2}_y + r_{21}a\,\widehat{p}^{c2}_z] / (\widehat{p}^{c2}_z - r_{31}a)^2   (5-29)

Define the error in the c and r measurements as

\begin{bmatrix} \tilde{c} \\ \tilde{r} \end{bmatrix} \triangleq \begin{bmatrix} c \\ r \end{bmatrix} - \begin{bmatrix} c_0 \\ r_0 \end{bmatrix}.   (5-30)


The complete linearized measurement equation is then given by

\begin{bmatrix} \tilde{c} \\ \tilde{r} \end{bmatrix} = \begin{bmatrix} \partial c/\partial\tilde\phi_1 & \partial c/\partial\tilde\theta_1 & \partial c/\partial\tilde\psi_1 & \partial c/\partial\tilde\theta_2 & \partial c/\partial\tilde\psi_2 & \partial c/\partial\delta p \\ \partial r/\partial\tilde\phi_1 & \partial r/\partial\tilde\theta_1 & \partial r/\partial\tilde\psi_1 & \partial r/\partial\tilde\theta_2 & \partial r/\partial\tilde\psi_2 & \partial r/\partial\delta p \end{bmatrix} \begin{bmatrix} \Delta\tilde\phi_1 \\ \Delta\tilde\theta_1 \\ \Delta\tilde\psi_1 \\ \Delta\tilde\theta_2 \\ \Delta\tilde\psi_2 \\ \Delta\delta p \end{bmatrix},   (5-31)

where the entries of the 2×6 matrix are the partial derivatives given in Equations 5-18 through 5-29 (all of which share the common factor f/(\widehat{p}^{c2}_z - r_{31}a)^2).

Let C represent the submatrix consisting of the first five columns of the matrix in Equation 5-31, and let c represent the last column of the matrix in Equation 5-31. So, C is a 2×5 submatrix and c is a 2×1 submatrix, or column vector. Note that there are six unknowns and two measurements, so more measurements are needed to uniquely determine the set of independent variables. But each measurement will introduce a different range error, \delta p_i. So, the minimum number of control points needed to uniquely determine the independent variables is five. For five control points, the system is represented by:

\begin{bmatrix} \tilde{c}_1 \\ \tilde{r}_1 \\ \tilde{c}_2 \\ \tilde{r}_2 \\ \tilde{c}_3 \\ \tilde{r}_3 \\ \tilde{c}_4 \\ \tilde{r}_4 \\ \tilde{c}_5 \\ \tilde{r}_5 \end{bmatrix}_{10\times 1} = \begin{bmatrix} C_1 & c_1 & 0 & 0 & 0 & 0 \\ C_2 & 0 & c_2 & 0 & 0 & 0 \\ C_3 & 0 & 0 & c_3 & 0 & 0 \\ C_4 & 0 & 0 & 0 & c_4 & 0 \\ C_5 & 0 & 0 & 0 & 0 & c_5 \end{bmatrix}_{10\times 10} \begin{bmatrix} \Delta\tilde\phi_1 \\ \Delta\tilde\theta_1 \\ \Delta\tilde\psi_1 \\ \Delta\tilde\theta_2 \\ \Delta\tilde\psi_2 \\ \Delta\delta p_1 \\ \Delta\delta p_2 \\ \Delta\delta p_3 \\ \Delta\delta p_4 \\ \Delta\delta p_5 \end{bmatrix}_{10\times 1},   (5-32)

where the subscripts denote the matrix sizes in Equation 5-32. Let the n control point measurements be denoted as the 2n×1 column vector m, let the independent variables, or state error vector, be denoted as the (n+5)×1 column vector v, and let the 2n×(n+5) matrix of constant terms be G. Then, Equation 5-32 can be concisely written as

m_{2n×1} = G_{2n×(n+5)} v_{(n+5)×1},   (5-33)

where v will be referred to as the state error vector hereafter. Adding more than five measurements allows Equation 5-33 to be solved by a least squares type approach.
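A least squares solve of Equation 5-33 is only a few lines once the G matrix is assembled; the sketch below (NumPy, illustrative names) shows the block layout of Equation 5-32 for n control points, with randomly generated C_i and c_i blocks standing in for the 2×5 and 2×1 partial-derivative blocks of Equation 5-31, and recovers the state error vector v from the measurement vector m.

```python
import numpy as np

def assemble_G(C_blocks, c_blocks):
    """Stack per-control-point blocks into the 2n x (n+5) matrix of Equation 5-32."""
    n = len(C_blocks)
    G = np.zeros((2 * n, n + 5))
    for i, (Ci, ci) in enumerate(zip(C_blocks, c_blocks)):
        G[2*i:2*i+2, :5] = Ci                  # shared misalignment-angle columns
        G[2*i:2*i+2, 5 + i] = ci.ravel()       # per-point range-error column
    return G

rng = np.random.default_rng(1)
n = 7                                           # more than five points -> least squares
C_blocks = [rng.normal(size=(2, 5)) for _ in range(n)]   # stand-ins for Equation 5-31 blocks
c_blocks = [rng.normal(size=(2, 1)) for _ in range(n)]
G = assemble_G(C_blocks, c_blocks)

v_true = rng.normal(size=n + 5)                 # synthetic state error vector
m = G @ v_true                                  # noise-free measurements (Equation 5-33)
v_est, *_ = np.linalg.lstsq(G, m, rcond=None)
assert np.allclose(v_est, v_true)
```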


Figure 5-2: Overview of Matlab© implementation of the generalized linearized error model analysis (GLEMA) program.

where the subscripts denote the matrix sizes in Equation 5-32. Let the n control point measurements be denoted as the 2n x 1 column vector m, the independent variables, or state error vector, be denoted as the (n+5) x 1 column vector v, and the 2n x (n+5) matrix of constant terms be G. Then, Equation 5-32 can be concisely written as

\(m_{2n\times1} = G_{2n\times(n+5)}\, v_{(n+5)\times1}\)   (5-33)

where v will be referred to as the state error vector hereafter. Adding more than five measurements allows Equation 5-33 to be solved by a least-squares-type approach.

5.1.1 Implementation of the Generalized Linearized Error Model Analysis (GLEMA) Matlab© Program

An analysis program for the generalized linearized error model, GLEMA, was created in the Matlab© programming language. The implementation contains three inter-dependent components: (1) initialization, (2) loop structure, and (3) display of results. The initialization component defines the fixed parameters, sets up the parameter that will be looped over (i.e., focal length, camera depression angle, displacement vector, etc.), and sets up and maintains the output containers that hold the analysis results. The looping component computes the error variances for the independent variables given in the state error vector v for all the geometries of interest. The variances are then plotted versus the parameter of interest in the plot-results component. GLEMA consists of a combination of Matlab© functions and scripts.² The labels driver.m and plot_results.m in Figure 5-2 are used to illustrate the one-to-one correspondence between a driver file and a plotting file. This correspondence is shown in Table 5-3.

²The difference between a function and a script is in the scope of the variables. A script sends the variables to the workspace and a function does not. Both files end in ".m" and are commonly called m-files.
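To make the least-squares remark above concrete, the sketch below assembles the block-structured G of Equation 5-32 for n control points from per-point blocks C_i (2x5) and c_i (2x1) and solves Equation 5-33 for the state error vector. This is an illustrative fragment, not code from GLEMA; the function name buildAndSolveExample and its inputs are hypothetical.

    function v = buildAndSolveExample(Cblocks, cblocks, m)
    % Cblocks: 1-by-n cell array of 2x5 attitude-sensitivity blocks (C_i)
    % cblocks: 1-by-n cell array of 2x1 range-sensitivity columns (c_i)
    % m:       2n-by-1 vector of focal-plane measurement errors (c and r errors)
    % v:       (n+5)-by-1 state error vector [attitude errors; range errors]
    n = numel(Cblocks);
    G = zeros(2*n, n+5);
    for i = 1:n
        rows = 2*i-1:2*i;
        G(rows, 1:5) = Cblocks{i};    % shared attitude-error columns
        G(rows, 5+i) = cblocks{i};    % per-point range-error column
    end
    v = (G.'*G) \ (G.'*m);            % least-squares solution of m = G*v
    end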


The list below describes each step of the implementation as it corresponds to the three components shown in Figure 5-2, and lists the m-files that correspond to each step.

initialize1 (driver.m) Set up fixed camera parameters, control point c/r coordinates with LL heights, true attitude, range and attitude errors, and the displacement vector between the cameras.

initialize2 (driver.m) Set up the parameter of interest and output containers to capture the data from the upcoming loop.

initialize3 (driver.m, errorAnalysis.m) Loop over the parameter of interest.

loop1 (errorAnalysis.m, inputGeometry_truth.m) Get the geometry of the truth system in LL.

loop2 (errorAnalysis.m, checkIfControlPointInFOV.m) Ensure that there are a minimum of five points that are in the FOV of both cameras.

loop3 (errorAnalysis.m, inputGeometry_truth.m, crh2enu.m, computeControlPoint.m) Calculate the true control point locations in the c1 and c2 coordinate systems.

loop4 (errorAnalysis.m, inputGeometry_est.m) Add in attitude and range errors to get the geometry in the estimated system in LL.

loop5 (errorAnalysis.m, calc_cr_truth.m) Calculate the true control point c/r coordinates on the second camera's focal plane.

loop6 (errorAnalysis.m, calc_cr_est.m) Calculate the estimated control point c/r coordinates on the second camera's focal plane.

loop7 (errorAnalysis.m) Calculate the c/r coordinate error values on the second camera's focal plane.

loop8 (errorAnalysis.m, reshape_c_and_r_err_into_M.m) Form the m (2n x 1) vector.

loop9 (errorAnalysis.m, formG.m, form_c_row_for_G.m, form_r_row_for_G.m) Form the G (2n x (n+5)) matrix.

loop10 (errorAnalysis.m) Compute the system error vector, v ((n+5) x 1), of independent state variables from m and G.

loop11 (errorAnalysis.m) Compute the (n+5) x (n+5) matrix via (GᵀG)⁻¹.


Figure 5-3: Graphical overview of verification of the GLEMA program.

loop12 (driver.m, errorAnalysis.m) Read off the geometry terms for each independent state variable, given by the diagonal terms of (GᵀG)⁻¹.

loop13 (driver.m) Determine a pixel-based error corresponding to the c/r coordinate measurements made on the second camera system's focal plane. Convert the pixel-based error to an area on the focal plane.

loop14 (driver.m) Calculate the error variances for each desired independent variable based on its geometry term and the given focal plane error.

plotresults1 (driver.m, plot_results.m) Plot the error variances versus the parameter of interest.

Table 5-3: Correspondence between driver.m and plot_results.m files for the GLEMA program.

driver.m                          plot_results.m
varying_beta.m                    plot_results_b.m
varying_displacement_vector.m     plot_results_a.m
varying_focal_length.m            plot_results_f.m
varying_last_CP_location.m        plot_results_cp.m
varying_num_cp_cols.m             plot_results_num_cp_cols.m
varying_num_cp_rows.m             plot_results_num_cp_rows.m
varying_pixel_error.m             plot_results_pixel_error.m
verification.m                    Matlab© command window

Verification results. The goal of verifying the error model is to show that the known errors that are input to the system can be successfully calculated using the GLEMA program. To accomplish this goal, the driver file verification.m was created. A graphical overview of the verification process is shown in Figure 5-3.


Verifying the accuracy of the GLEMA program was accomplished in two steps. The first step was to verify the linearization of each independent variable in the system error vector, v, with all coordinate systems aligned. A summary of the variables contained in the system error vector with their corresponding inputs is given in Table 5-4. This was verified by setting all input errors to zero except the error of interest, then calculating the system error vector, v, and visually verifying that the output to the Matlab© command window corresponds to the input error. For example, to verify the successful linearization of the inertial heading estimate ψ̃₂, all input error terms were zero except ψ₂,LL,³ and the displacement vector was aligned such that no side-slip or angle-of-attack was present. A value of 0.01 radians for ψ₂,LL produced a value of 0.0099 rad for ψ̃₂ and values less than 10⁻¹⁴ for the rest of the values in v. A complete list of results comparing the input errors to the resulting terms of the system error vector is given in Table 5-5.

Table 5-4: Summary of linearized independent variables contained in the system error vector v with their corresponding inputs in the GLEMA program.

Variable    GLEMA Input
φ̃₁          errors.phi_c1toc2
θ̃₁          errors.theta_c1toc2
ψ̃₁          errors.psi_c1toc2
θ̃₂          errors.theta_ned2c2
ψ̃₂          errors.psi_ned2c2
ρ̃ᵢ          errors.r[i]

Table 5-5: Step one verification results for the GLEMA program.

          φ̃₁     θ̃₁      ψ̃₁    θ̃₂      ψ̃₂    ρ̃₁      ρ̃₂      ρ̃₃      ρ̃₄    ρ̃₅      ρ̃₆
input     1e-2   0       0     0       0     0       0       0       0     0       0
output    1e-2   0       0     -1.1e-3 0     7e-4    9e-4    9e-4    4e-4  4e-4    4e-4
input     0      1e-2    0     0       0     0       0       0       0     0       0
output    0      1.02e-2 0     7e-4    -1e-4 1.6e-3  2.5e-3  1.4e-3  2e-3  2.4e-3  2e-3
input     0      0       1e-2  0       0     0       0       0       0     0       0
output    -1e-4  0       1e-2  6e-4    7e-4  -3e-4   3e-4    5e-4    0     2e-4    3e-4
input     0      0       0     1e-2    0     0       0       0       0     0       0
output    0      0       0     1e-2    0     0       0       -1e-4   0     0       -1e-4
input     0      0       0     0       1e-2  0       0       0       0     0       0
output    0      0       0     0       1e-2  0       0       0       0     0       0
input     0      0       0     0       0     0       0       1e-2    0     0       0
output    0      0       0     0       0     0       0       1e-2    0     0       0

³For convenience's sake, the rotation matrix notation is used to denote the coordinate system that the input error term acts upon.
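The step-one check can be pictured with the minimal Matlab© sketch below, which injects a single 0.01 rad heading error, forms measurements from a linearized model, and recovers the state error vector by least squares. Because the measurements here are generated from the linear model itself, the recovery is exact; in GLEMA the measurements come from the full nonlinear truth geometry, which is why small residuals such as those in Table 5-5 remain. The variable names are hypothetical and the random G is only a placeholder, not the matrix formed by formG.m.

    % Minimal step-one check: inject one input error, recover it by least squares.
    n     = 5;                      % number of control points
    G     = rand(2*n, n+5);         % placeholder for the linearized model
    vTrue = zeros(n+5, 1);
    vTrue(5) = 1e-2;                % 0.01 rad on the inertial heading term only
    m     = G * vTrue;              % measurement errors implied by the linear model
    vCalc = (G.'*G) \ (G.'*m);      % recovered state error vector
    disp(max(abs(vCalc - vTrue)))   % near machine precision for this linear test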


The second step of the verification process involves slewing the heading and velocity/displacement vector around 360°. If the coordinate system rotations and translations are correct, then there should be no change in the system error vector. The inertial heading is given a 0.01 rad error and the velocity and inertial heading are slewed around in 45° increments. The comparison results are shown in Table 5-6. Corresponding comparisons for the remaining independent variables are not shown, since step one produced a valid linearization for all independent variables.

Table 5-6: Step two verification results for the GLEMA program. For each heading ψ = 45°, 90°, 135°, 180°, 225°, 270°, 315°, and 360°, the input over (φ̃₁, θ̃₁, ψ̃₁, θ̃₂, ψ̃₂, ρ̃₁ ... ρ̃₆) was (0, 0, 0, 0, 1e-2, 0, 0, 0, 0, 0, 0) and the output was identical to the input.

The results presented in Tables 5-5 and 5-6 show that the GLEMA program is a reasonable representation of the error model scenario given in Figure 5-1. Specifically, the G matrix is shown to be a valid model of the linearized system.

5.2 Parameter Study

The parameter study section uses the G matrix to transform measurement errors into solution, or angular, errors. The G matrix is formed using truth data. The given measurement errors m are then used to calculate the errors that result from the geometry of the scene (e.g., v from Equation 5-33). Figure 5-4 shows a graphical representation of this process. This study assumes that all measurement errors are uncorrelated and can be specified by a value for the pixel variance, denoted as σ²_p. Note that the units for σ²_p are [pixels²]. The G matrix, camera size, and number of pixels are used to transform σ²_p into angular errors by

\(p_{area}\,[\mathrm{m^2/pixels^2}] = \left(\frac{\text{camera size [m]}}{\text{num pixels [pixel]}}\right)^2\)   (5-34)


Figure 5-4: Graphical overview of the parameter study process using the GLEMA program.

\(\sigma^2_{soln} = (G^T G)^{-1}\,\sigma^2_p\,p_{area}\)   (5-35)

where the units for σ_soln are radians for φ̃₁, θ̃₁, ψ̃₁, θ̃₂, ψ̃₂, and proportion of error in the estimate of range to the i-th control point for ρ̃ᵢ. For all the parameter study results, a one-pixel standard deviation is assumed. This simplifies Equation 5-35 to

\(\sigma^2_{soln} = (G^T G)^{-1}\,p_{area} \quad \text{for } \sigma_p = 1\)   (5-36)

The ρ̃ results are not displayed in any of the parameter study results. This is the sacrifice that comes from ensuring that all control points are visible to both camera systems. That is, if a point is designated in the first camera system and cannot be seen by the second camera system, then it is not used in the formulation of the G matrix. In the following sections, the results for the angular terms φ̃₁, θ̃₁, ψ̃₁, θ̃₂, ψ̃₂ are displayed in units of mrad. The following equation was used to convert each angular term of σ²_soln to mrad:

\(\sigma_{soln,i}\,[\mathrm{mrad}] = 1000\,\tfrac{\mathrm{mrad}}{\mathrm{rad}}\,\left(\sigma^2_{soln,i}\right)^{1/2}\)   (5-37)

The control points are designated in a pseudo-random fashion on the first camera's focal plane. The focal plane is divided up into an n-row by m-column grid. The control points are randomly placed within each grid box, with the constraint that they cannot be within 25% of the grid boundaries. The allowed location of a control point within a single grid box is shown in Figure 5-5. An example focal plane with a 3-row by 3-column grid is shown in Figure 5-6. The control point designation in the GLEMA program is performed in the markControlPoints.m m-file.
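The conversion chain of Equations 5-34 through 5-37 amounts to only a few lines of Matlab©. The fragment below is illustrative rather than taken from GLEMA: the random G is only a placeholder for the matrix formed from the scene geometry, and averaging the two pixel pitches into a single p_area value is an assumption made here for the rectangular pixel.

    % Convert a pixel-level measurement error into angular errors in mrad
    % (Equations 5-34 through 5-37) for a 5 x 7 control point grid (n = 35).
    G          = rand(2*35, 35+5);                % placeholder for the linearized model
    ccd_size   = [0.0042 0.0032];                 % ccd array size in meters
    ccd_pixels = [384 288];                       % number of pixels
    sigma_p    = 1;                               % pixel standard deviation
    p_area     = mean(ccd_size ./ ccd_pixels)^2;  % Eq. 5-34 (mean pixel pitch assumed)
    geom       = diag(inv(G.'*G));                % geometry terms: diagonal of (G'*G)^-1
    var_soln   = geom * sigma_p^2 * p_area;       % Eq. 5-35 / 5-36
    sigma_mrad = 1000 * sqrt(var_soln(1:5));      % angular terms in mrad (Eq. 5-37)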


Figure 5-5: Allowed location of control points within each grid box of the focal plane.

Figure 5-6: Control point locations on a focal plane with a 3-row by 3-column grid designation.
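A sketch of the pseudo-random designation just described (one point per grid box, kept out of the outer 25% of the box, as in Figures 5-5 and 5-6) is given below. It is a stand-in for the behavior of markControlPoints.m, not its contents, and the function name placeControlPointsExample is hypothetical.

    function cp = placeControlPointsExample(nRows, nCols, ccdPixels)
    % Place one control point per grid box on the first camera's focal plane,
    % keeping each point at least 25% of the box dimension away from the box edges.
    % ccdPixels = [numCols numRows] of the ccd array; cp is (nRows*nCols)-by-2 in pixels.
    boxW = ccdPixels(1) / nCols;
    boxH = ccdPixels(2) / nRows;
    cp = zeros(nRows*nCols, 2);
    k = 0;
    for r = 1:nRows
        for c = 1:nCols
            k = k + 1;
            cp(k, 1) = (c-1)*boxW + boxW*(0.25 + 0.5*rand);   % column coordinate
            cp(k, 2) = (r-1)*boxH + boxH*(0.25 + 0.5*rand);   % row coordinate
        end
    end
    end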


5.2.1 Varying Camera Focal Length

The effect of varying the camera's focal length on the attitude estimates is shown in this section. The input parameters used for this study are shown in Table 5-7. The GLEMA driver file used in this study was varying_focal_length.m.

Table 5-7: Input parameters for varying camera focal length study.

Parameter                    Value            Description
β                            π/4              Camera depression angle in radians.
f                            varied           Camera focal length in meters.
ccd_pixels                   384 x 288        Number of column and row pixels in the ccd array camera.
ccd_size                     0.0042 x 0.0032  Size of the ccd array camera in meters.
cp_cols                      5                Number of control point columns on the ccd array camera.
cp_rows                      7                Number of control point rows on the ccd array camera.
φ_c2c1, θ_c2c1, ψ_c2c1       0, 0, 0          Roll, pitch, yaw of the relative attitude between c1 and c2.
φ_c2ned, θ_c2ned, ψ_c2ned    0, 0, 0          Roll, pitch, yaw of the inertial attitude of the second camera system (c2).
φ̃_c2c1, θ̃_c2c1, ψ̃_c2c1     0, 0, 0          Roll, pitch, yaw misalignment errors in the relative attitude between c1 and c2.
φ̃_c2c, θ̃_c2c, ψ̃_c2c        0, 0, 0          Roll, pitch, yaw misalignment errors between the c (center) coordinate system and the second camera system (c2).
||a||                        10               Displacement vector magnitude in meters.
v_enu                        (0, 1, 0)        Direction of the displacement vector, or velocity vector.
pixel error                  1                Measurement error of the control points in pixels.


Figure 5-7: Varying FOV vs. focal length.

The focal length of the camera system was varied from 0.0015 to 0.0164 meters. The lower limit, 0.0015 meters, corresponds to an unrealistic FOV region for the given camera size. The data generated in this realm can only be used for theoretical arguments. The camera FOV is related to the focal length of the camera by

\(\mathrm{FOV\,[radians]} = 2\arctan\!\left(\frac{\text{camera size [m]}}{2 f\,[\mathrm{m}]}\right)\)   (5-38)

The resulting data points used in this analysis are shown in Figure 5-7. The first iteration of the loop corresponds to the largest FOV and the last iteration corresponds to the smallest FOV. The vertical FOV has a maximum value of 108.92 degrees and a minimum value of 14.59 degrees. The horizontal FOV has a maximum value of 93.70 degrees and a minimum value of 11.14 degrees.
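Equation 5-38 and the FOV limits quoted above can be checked with a short Matlab© fragment; this is an illustrative calculation, not part of the varying_focal_length.m driver.

    % Field of view versus focal length, Equation 5-38, for the 0.0042 x 0.0032 m ccd.
    f        = [0.0015 0.0164];              % largest-FOV and smallest-FOV focal lengths (m)
    fov_vert = 2*atand(0.0042 ./ (2*f));     % approximately 108.92 and 14.59 degrees
    fov_horz = 2*atand(0.0032 ./ (2*f));     % approximately 93.70 and 11.14 degrees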


Figure 5-8: Control point locations on the focal planes of both cameras for the first iteration for varying camera FOV.

The control point locations on the focal planes of both cameras for the first and last iterations of the analysis loop are shown in Figures 5-8 and 5-9. The "x" symbols indicate the designated control point locations on the first camera's focal plane, and the hexagon symbols indicate the calculated control point locations on the second camera's focal plane. A line is drawn to connect the corresponding control point locations in the first and second camera systems. Notice that the last two rows of the control points (where the first row is located on the bottom of the focal plane) disappear in the focal plane of the last iteration. This is due to the control points going out of the second camera's FOV.

The 3-D view of the scene for the first and last iterations is shown in Figures 5-10 and 5-11. The control points are marked by plus symbols. The first camera position is marked by a square symbol. The second camera position is marked by a triangle.

The results for the computed attitude standard deviations are shown in Figures 5-12 and 5-13. The discontinuities in the various plots are due to control points going out of the second camera's FOV.

5.2.2 Varying Camera Depression Angle

The effect of varying the camera's depression angle on the attitude estimates is shown in this section. The input parameters used for this study are shown in Table 5-8. The GLEMA driver file used in this study was varying_beta.m.

The camera depression angle was varied from 0 to 179.9 degrees. The first iteration corresponds to a camera looking directly ahead of the vehicle. The last iteration corresponds to a camera looking almost directly behind the vehicle. Since the projections onto the x-y plane of the velocity and camera line-of-sight vectors are identical in this study, a forward-looking camera gets closer to the points while a rear-looking camera gets farther away. The camera was allowed to be rear-looking to show this difference.


Figure 5-9: Control point locations on the focal planes of both cameras for the last iteration for varying camera FOV.

Figure 5-10: Three-dimensional view of the scene for the first iteration for a varying focal length.


Figure 5-11: Three-dimensional view of the scene for the last iteration for a varying focal length.

Figure 5-12: Varying camera focal length results - ned to c2.


Table 5-8: Input parameters for varying camera depression angle (β) study.

Parameter                    Value            Description
β                            varied           Camera depression angle in radians.
f                            0.005            Camera focal length in meters.
ccd_pixels                   384 x 288        Number of column and row pixels in the ccd array camera.
ccd_size                     0.0042 x 0.0032  Size of the ccd array camera in meters.
cp_cols                      5                Number of control point columns on the ccd array camera.
cp_rows                      7                Number of control point rows on the ccd array camera.
φ_c2c1, θ_c2c1, ψ_c2c1       0, 0, 0          Roll, pitch, yaw of the relative attitude between c1 and c2.
φ_c2ned, θ_c2ned, ψ_c2ned    0, 0, 0          Roll, pitch, yaw of the inertial attitude of the second camera system (c2).
φ̃_c2c1, θ̃_c2c1, ψ̃_c2c1     0, 0, 0          Roll, pitch, yaw misalignment errors in the relative attitude between c1 and c2.
φ̃_c2c, θ̃_c2c, ψ̃_c2c        0, 0, 0          Roll, pitch, yaw misalignment errors between the c (center) coordinate system and the second camera system (c2).
||a||                        10               Displacement vector magnitude in meters.
v_enu                        (0, 1, 0)        Direction of the displacement vector, or velocity vector.
pixel error                  1                Measurement error of the control points in pixels.


Figure 5-13: Varying camera focal length results - c1 to c2.

The control point locations on the focal planes of both cameras for β equaling 45, 90, and 135 degrees are shown in Figures 5-14 through 5-16. The "x" symbols indicate the designated control point locations on the first camera's focal plane, and the hexagon symbols indicate the calculated control point locations on the second camera's focal plane. A line is drawn to connect the corresponding control point locations in the first and second camera systems. Notice that the last two rows of the control points (where the first row is located on the bottom of the focal plane) disappear in the focal plane of the last iteration. This is due to the control points going out of the second camera's FOV.

The 3-D view of the scene for β equaling 45, 90, and 135 degrees is shown in Figures 5-17 through 5-19. The control points are marked by plus symbols. The first camera position is marked by a square symbol. The second camera position is marked by a triangle. The location of the control points moves as expected from in front, to directly below, to behind the camera systems when β equals 45, 90, and 135 degrees, respectively.

The results for the computed attitude standard deviations are shown in Figures 5-20 and 5-21. The discontinuities in the various plots are due to control points going out of the second camera's FOV. The non-symmetric behavior about the 90° camera depression angle is due to the different control point locations which overlap in the two FOVs, and the difference between a forward- and rear-looking camera (i.e., approaching and receding from the control points).


Figure 5-14: Control point locations on focal plane for β = 45 degrees for the varying camera depression angle study.

Figure 5-15: Control point locations on focal plane for β = 90 degrees for the varying camera depression angle study.


Figure 5-16: Control point locations on focal plane for β = 135 degrees for the varying camera depression angle study.

Figure 5-17: Three-dimensional view of the scene for β = 45 degrees for the varying camera depression angle study.


Figure 5-18: Three-dimensional view of the scene for β = 90 degrees for the varying camera depression angle study.

Figure 5-19: Three-dimensional view of the scene for β = 135 degrees for the varying camera depression angle study.


Figure 5-20: Varying camera depression angle results - ned to c2.

5.2.3 Varying Displacement Vector Magnitude Between Imaging Systems

The effect of varying the displacement vector magnitude between camera systems on the attitude estimates is shown in this section. The input parameters used for this study are shown in Table 5-9. The GLEMA driver file used in this study was varying_displacement_vector.m.

The magnitude of the displacement vector between camera systems, ||a||, was varied from 1.0 to 20.9 meters. The first iteration corresponds to ||a|| = 1.0 and the last iteration corresponds to ||a|| = 20.9. The control point locations on the focal planes of both cameras for ||a|| = 1.0, 5.9, 10.9, 15.9, and 20.9 meters are shown in Figures 5-22 through 5-26. The "x" symbols indicate the designated control point locations on the first camera's focal plane, and the hexagon symbols indicate the calculated control point locations on the second camera's focal plane. A line is drawn to connect the corresponding control point locations in the first and second camera systems. Notice that the last two rows of the control points (where the first row is located on the bottom of the focal plane) disappear by the time ||a|| = 20.9. This is due to the control points going out of the second camera's FOV.


Table 5-9: Input parameters for varying displacement vector magnitude between imaging systems.

Parameter                    Value            Description
β                            π/3              Camera depression angle in radians.
f                            0.005            Camera focal length in meters.
ccd_pixels                   384 x 288        Number of column and row pixels in the ccd array camera.
ccd_size                     0.0042 x 0.0032  Size of the ccd array camera in meters.
cp_cols                      5                Number of control point columns on the ccd array camera.
cp_rows                      7                Number of control point rows on the ccd array camera.
φ_c2c1, θ_c2c1, ψ_c2c1       0, 0, 0          Roll, pitch, yaw of the relative attitude between c1 and c2.
φ_c2ned, θ_c2ned, ψ_c2ned    0, 0, 0          Roll, pitch, yaw of the inertial attitude of the second camera system (c2).
φ̃_c2c1, θ̃_c2c1, ψ̃_c2c1     0, 0, 0          Roll, pitch, yaw misalignment errors in the relative attitude between c1 and c2.
φ̃_c2c, θ̃_c2c, ψ̃_c2c        0, 0, 0          Roll, pitch, yaw misalignment errors between the c (center) coordinate system and the second camera system (c2).
||a||                        varied           Displacement vector magnitude in meters.
v_enu                        (0, 1, 0)        Direction of the displacement vector, or velocity vector.
pixel error                  1                Measurement error of the control points in pixels.


Figure 5-21: Varying camera depression angle results - c1 to c2.

The 3-D view of the scene for ||a|| = 1.0, 5.9, 10.9, 15.9, and 20.9 meters is shown in Figures 5-27 through 5-31. The control points are marked by plus symbols. The first camera position is marked by a square symbol. The second camera position is marked by a triangle. The second camera moves from right to left as the displacement vector is increased from 1 to 20.9 meters. Referring to the top portion of each figure, notice that the control points corresponding to the disappearing rows are seen to gradually disappear from the left side of the control point grouping.

The results for the computed attitude standard deviations are shown in Figures 5-32 through 5-35. The discontinuities in the various plots are due to control points going out of the second camera's FOV.

5.2.4 Varying Number of Control Points

The effect of varying the number of control points marked on the focal plane on the attitude estimates is shown in this section. The input parameters used for this study are shown in Table 5-10. The GLEMA driver files used in this study were varying_num_cp_cols.m and varying_num_cp_rows.m.


Figure 5-22: Control point locations on focal plane for a = 1.0 meters for the varying displacement vector magnitude study.

Figure 5-23: Control point locations on focal plane for a = 5.9 meters for the varying displacement vector magnitude study.


Figure 5-24: Control point locations on focal plane for a = 10.9 meters for the varying displacement vector magnitude study.

Figure 5-25: Control point locations on focal plane for a = 15.9 meters for the varying displacement vector magnitude study.


Figure 5-26: Control point locations on focal plane for a = 20.9 meters for the varying displacement vector magnitude study.

Figure 5-27: Three-dimensional view of the scene for a = 1.0 meter for the varying displacement vector magnitude between imaging systems study.


Figure 5-28: Three-dimensional view of the scene for a = 5.9 meters for the varying displacement vector magnitude between imaging systems study.

Figure 5-29: Three-dimensional view of the scene for a = 10.9 meters for the varying displacement vector magnitude between imaging systems study.


Figure 5-30: Three-dimensional view of the scene for a = 15.9 meters for the varying displacement vector magnitude between imaging systems study.

Figure 5-31: Three-dimensional view of the scene for a = 20.9 meters for the varying displacement vector magnitude between imaging systems study.


Figure 5-32: Varying displacement vector magnitude between imaging systems results for a = 1.0 to 20.9 meters - ned to c2.

Figure 5-33: Varying displacement vector magnitude between imaging systems results for a = 5.9 to 20.9 meters - ned to c2.


Figure 5-34: Varying displacement vector magnitude between imaging systems results for a = 1.0 to 20.9 meters - c1 to c2.

Figure 5-35: Varying displacement vector magnitude between imaging systems results for a = 5.9 to 20.9 meters - c1 to c2.


Table 5-10: Input parameters for varying the number of control points study.

Parameter                    Value            Description
β                            π/3              Camera depression angle in radians.
f                            0.005            Camera focal length in meters.
ccd_pixels                   384 x 288        Number of column and row pixels in the ccd array camera.
ccd_size                     0.0042 x 0.0032  Size of the ccd array camera in meters.
cp_cols                      varied           Number of control point columns on the ccd array camera.
cp_rows                      varied           Number of control point rows on the ccd array camera.
φ_c2c1, θ_c2c1, ψ_c2c1       0, 0, 0          Roll, pitch, yaw of the relative attitude between c1 and c2.
φ_c2ned, θ_c2ned, ψ_c2ned    0, 0, 0          Roll, pitch, yaw of the inertial attitude of the second camera system (c2).
φ̃_c2c1, θ̃_c2c1, ψ̃_c2c1     0, 0, 0          Roll, pitch, yaw misalignment errors in the relative attitude between c1 and c2.
φ̃_c2c, θ̃_c2c, ψ̃_c2c        0, 0, 0          Roll, pitch, yaw misalignment errors between the c (center) coordinate system and the second camera system (c2).
||a||                        10               Displacement vector magnitude in meters.
v_enu                        (0, 1, 0)        Direction of the displacement vector, or velocity vector.
pixel error                  1                Measurement error of the control points in pixels.


Figure 5-36: Control point locations on focal plane for a column count of 1 and a row count of 6.

The number of rows and the number of columns of control points were varied separately to show the impact that each had on the accuracy of the attitude estimate.

Varying number of control point column count. The number of control point columns was varied from 1 to 20 while the number of rows was fixed at 6. The first iteration corresponds to a column count of 1 and the last iteration corresponds to a column count of 20. The GLEMA driver file used for this part of the study was varying_num_cp_cols.m.

The control point locations on the focal planes of both cameras for the column counts of 1, 10, and 20 are shown in Figures 5-36 through 5-38. The "x" symbols indicate the designated control point locations on the first camera's focal plane, and the hexagon symbols indicate the calculated control point locations on the second camera's focal plane. A line is drawn to connect the corresponding control point locations in the first and second camera systems. Notice that the last column in all three of the focal plane views is missing. This is due to the control points going out of the second camera's FOV.

The 3-D view of the scene for column counts of 1, 10, and 20 and a row count of 6 is shown in Figures 5-39 through 5-41. The control points are marked by plus symbols. The first camera position is marked by a square symbol. The second camera position is marked by a triangle.


Figure 5-37: Control point locations on focal plane for a column count of 10 and a row count of 6.

Figure 5-38: Control point locations on focal plane for a column count of 20 and a row count of 6.


Figure 5-39: Three-dimensional view of the scene for a column count of 1 and a row count of 6 for the varying number of control points study.

Figure 5-40: Three-dimensional view of the scene for a column count of 10 and a row count of 6 for the varying number of control points study.


Figure 5-41: Three-dimensional view of the scene for a column count of 20 and a row count of 6 for the varying number of control points study.

The results for the computed attitude standard deviations are shown in Figures 5-42 and 5-43. As expected, increasing the number of control points decreases the error values.

Varying number of control point row count. The number of control point rows was varied from 1 to 20 while the number of columns was fixed at 6. The first iteration corresponds to a row count of 1 and the last iteration corresponds to a row count of 20. The GLEMA driver file used for this part of the study was varying_num_cp_rows.m.

The control point locations on the focal planes of both cameras for the row counts of 1, 10, and 20 are shown in Figures 5-44 through 5-46. The "x" symbols indicate the designated control point locations on the first camera's focal plane, and the hexagon symbols indicate the calculated control point locations on the second camera's focal plane. A line is drawn to connect the corresponding control point locations in the first and second camera systems. Notice that the last column in all three of the focal plane views is missing. This is due to the control points going out of the second camera's FOV.

The 3-D view of the scene for row counts of 1, 10, and 20 and a column count of 6 is shown in Figures 5-47 through 5-49. The control points are marked by plus symbols. The first camera position is marked by a square symbol. The second camera position is marked by a triangle. Note that there are only 5 points in Figure 5-44, due to the last control point going out of the second camera's FOV.


Figure 5-42: Varying number of control points study results for column counts 1-20 and a row count of 6 - ned to c2.

Figure 5-43: Varying number of control points study results for column counts 1-20 and a row count of 6 - c1 to c2.


Figure 5-44: Control point locations on focal plane for a row count of 1 and a column count of 6.

Figure 5-45: Control point locations on focal plane for a row count of 10 and a column count of 6.


Figure 5-46: Control point locations on focal plane for a row count of 20 and a column count of 6.

The results for the computed attitude standard deviations are shown in Figures 5-50 and 5-51.

5.2.5 Varying Location of Last Control Point

The effect of varying the location of the last control point on the attitude estimates is shown in this section. The input parameters used for this study are shown in Table 5-11. The GLEMA driver file varying_last_CP_location.m was used for this part of the study.

The original 3 x 3 control point configuration is shown in Figure 5-52, with the control point that was varied designated by a hexagram. The varying control point's original location corresponds to the upper right grid location on the focal planes of Figures 5-53 and 5-54; its location was varied from the lower left-hand corner to the upper right-hand corner of the focal plane for those two figures. The eight stationary control points are designated by a diamond in Figures 5-53 and 5-54. For the 384 x 288 pixel array camera, that corresponds to 111,592 iterations. The inertial heading estimate error for each location is displayed in Figure 5-53. The terrain used for the attitude variance calculation was formed using the computeLastCPRange_better.m m-file from the GLEMA program. It used all the ranges of the designated control points within a 200-pixel threshold to interpolate the range at each pixel location on the focal plane. The corresponding terrain from each calculation is shown mapped onto the focal plane in Figure 5-54.


Figure 5-47: Three-dimensional view of the scene for a row count of 1 and a column count of 6 for the varying number of control points study.

Figure 5-48: Three-dimensional view of the scene for a row count of 10 and a column count of 6 for the varying number of control points study.


Figure 5-49: Three-dimensional view of the scene for a row count of 20 and a column count of 6 for the varying number of control points study.

Figure 5-50: Varying number of control points study results for row counts 2-20 and a column count of 6 - ned to c2.


Figure 5-51: Varying number of control points study results for row counts 2-20 and a column count of 6 - c1 to c2.

Figure 5-52: Original focal plane view of control point locations for the varying last control point study.


Table 5-11: Input parameters for varying location of last control point.

Parameter                    Value            Description
β                            π/3              Camera depression angle in radians.
f                            0.005            Camera focal length in meters.
ccd_pixels                   384 x 288        Number of column and row pixels in the ccd array camera.
ccd_size                     0.0042 x 0.0032  Size of the ccd array camera in meters.
cp_cols                      3                Number of control point columns on the ccd array camera.
cp_rows                      3                Number of control point rows on the ccd array camera.
φ_c2c1, θ_c2c1, ψ_c2c1       0, 0, 0          Roll, pitch, yaw of the relative attitude between c1 and c2.
φ_c2ned, θ_c2ned, ψ_c2ned    0, 0, 0          Roll, pitch, yaw of the inertial attitude of the second camera system (c2).
φ̃_c2c1, θ̃_c2c1, ψ̃_c2c1     0, 0, 0          Roll, pitch, yaw misalignment errors in the relative attitude between c1 and c2.
φ̃_c2c, θ̃_c2c, ψ̃_c2c        0, 0, 0          Roll, pitch, yaw misalignment errors between the c (center) coordinate system and the second camera system (c2).
||a||                        10               Displacement vector magnitude in meters.
v_enu                        (0, 1, 0)        Direction of the displacement vector, or velocity vector.
pixel error                  1                Measurement error of the control points in pixels.


Figure 5-53: Inertial heading estimate errors for varying last control point study.

5.2.6 Varying Control Point Measurement Error

The effect of varying the control point's measurement error on the attitude estimates is shown in this section. The input parameters used for this study are shown in Table 5-12. The GLEMA driver file used in this study was varying_pixel_error.m.

The measurement error of the control points was varied from 1.0 to 2.99 pixels. The first iteration corresponds to a measurement error of 1.0 pixels and the last iteration corresponds to a measurement error of 2.99 pixels. The control point locations on the focal planes of both cameras were not varied for this study and are shown in Figure 5-55. The "x" symbols indicate the designated control point locations on the first camera's focal plane, and the hexagon symbols indicate the calculated control point locations on the second camera's focal plane. A line is drawn to connect the corresponding control point locations in the first and second camera systems.


Figure 5-54: The calculated terrain mapped onto the focal plane for varying last control point study.

Figure 5-55: Control point locations on focal plane for the varying pixel error study.


Table 5-12: Input parameters for varying control point measurement error.

Parameter                    Value            Description
β                            π/3              Camera depression angle in radians.
f                            0.005            Camera focal length in meters.
ccd_pixels                   384 x 288        Number of column and row pixels in the ccd array camera.
ccd_size                     0.0042 x 0.0032  Size of the ccd array camera in meters.
cp_cols                      5                Number of control point columns on the ccd array camera.
cp_rows                      7                Number of control point rows on the ccd array camera.
φ_c2c1, θ_c2c1, ψ_c2c1       0, 0, 0          Roll, pitch, yaw of the relative attitude between c1 and c2.
φ_c2ned, θ_c2ned, ψ_c2ned    0, 0, 0          Roll, pitch, yaw of the inertial attitude of the second camera system (c2).
φ̃_c2c1, θ̃_c2c1, ψ̃_c2c1     0, 0, 0          Roll, pitch, yaw misalignment errors in the relative attitude between c1 and c2.
φ̃_c2c, θ̃_c2c, ψ̃_c2c        0, 0, 0          Roll, pitch, yaw misalignment errors between the c (center) coordinate system and the second camera system (c2).
||a||                        10               Displacement vector magnitude in meters.
v_enu                        (0, 1, 0)        Direction of the displacement vector, or velocity vector.
pixel error                  varied           Measurement error of the control points in pixels.


Figure 5-56: Three-dimensional view of the scene for the varying control point measurement error study.

The 3-D view of the scene did not change for this study either. It is shown in Figure 5-56. The control points are marked by plus symbols. The first camera position is marked by a square symbol. The second camera position is marked by a triangle.

The results for the computed attitude standard deviations are shown in Figures 5-57 and 5-58. The attitude errors are shown to be directly proportional to the measurement errors, as expected.


Figure 5-57: Varying control point measurement error results for 1.0 to 2.99 pixels - ned to c2.

Figure 5-58: Varying control point measurement error results for 1.0 to 2.99 pixels - c1 to c2.


CHAPTER 6
CONCLUSIONS

It has been shown in this thesis that a "free" attitude estimate is available for (1) a vehicle that contains a downward-looking vision system and a GPS receiver capable of differential carrier-phase positioning, and (2) two or more vehicles containing an overlapping FOV and GPS receivers capable of differential carrier-phase positioning. SfM techniques on two images with point correspondences produce a displacement vector between the camera locations in the body coordinate system. GPS differential carrier-phase positioning produces the displacement vector between the two cameras in an inertial reference frame. Once two vectors are found that are expressed in both coordinate systems, the rotation between the two coordinate systems can be found. This corresponds to the vehicle attitude.

This thesis began with a literature review of the SfM field and detailed how to set up the reconstruction problem using Euclidean geometry techniques. Background in GPS technology showed the accuracy and ambiguity differences between positioning with code measurements and positioning with carrier-phase measurements. Also, it was shown that when using carrier-phase measurements for positioning it is less ambiguous and more accurate to compute differential positions between two receivers than the absolute position of each receiver. Two solution approaches for finding an attitude estimate were presented: an SVD approach and a descent approach. Finally, a linearized model of the system was developed, on which a detailed error analysis was performed. The key contribution that this thesis presents is the detailed error analysis of the attitude estimate.

The error analysis showed how the accuracy of the attitude estimate depends on the geometry of the scene, camera parameters, displacement between camera locations, number and location of control points specified on the focal plane, and measurement error associated with specifying the control points. The accuracy of the attitude estimate was shown to be on the order of tens of mrad.
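The alignment idea summarized above, recovering the rotation between two frames from vectors known in both, can be illustrated with a generic SVD-based solution of the Wahba/orthogonal-Procrustes problem for two or more non-collinear vector pairs. The Matlab© sketch below shows that general technique only; it is not a reproduction of the specific SVD algorithm developed in chapter 4, and the function name is hypothetical.

    function R = rotationFromVectorPairsExample(Vbody, Vinertial)
    % Vbody, Vinertial: 3-by-k matrices whose columns are the same k >= 2
    % non-collinear vectors expressed in the body and inertial frames.
    % Returns the rotation R such that Vinertial is approximately R*Vbody.
    H = Vbody * Vinertial.';             % correlation of the two vector sets
    [U, ~, V] = svd(H);
    D = diag([1 1 det(V*U.')]);          % guard against a reflection solution
    R = V * D * U.';
    end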


6.1 Optimal Camera Parameters for Heading Estimation

Table 6-1 shows the optimal values for computing the inertial heading estimate, based on the error analysis parameter study of chapter 5. Based on these findings, the following values were chosen for the optimum heading estimate: a 20° FOV, a 20° camera depression angle, a ten-meter camera displacement vector magnitude, six columns by ten rows of control points, a measurement error of one pixel, and a 384 x 288 ccd array camera¹ with a 0.0042 x 0.0032 meter focal plane. Inputting this configuration into the GLEMA Matlab© program resulted in a heading estimate accuracy of 8.6836 mrad, or approximately half a degree. The driver file used to generate this result was optimal_heading_estimate.m.

Table 6-1: Optimal parameter values for heading estimate.

Optimal Parameter    Value            Description/Notes
FOV                  20° or 70°       Camera FOV in degrees (note that for the pitch attitude estimates larger FOV values are desired).
β                    20° or 160°      Camera depression angle in degrees.
||a||                10               Displacement vector magnitude in meters.
cp_cols              6                Number of control point columns on the ccd array camera.
cp_rows              10               Number of control point rows on the ccd array camera.
pixel error          2                Measurement error of the control points in pixels.

6.2 Worst Camera Parameter Values for Attitude Estimation

The error analysis presented in chapter 5 also showed that there are worst-case values for the parameters in regard to formulating the "free" attitude estimate. A focal length corresponding to a 45° FOV produced the largest error in the heading estimate. A camera depression angle of 90° (pointing straight down) produced the largest error in the heading estimate, and a value of approximately 65° produced the largest pitch error value. A displacement vector magnitude of less than 5 meters produced errors in both pitch and heading estimates greater than 100 mrad and increasing exponentially. Finally, fewer than 12 control points also produced significant errors in both the heading and pitch estimates. Inputting the configuration for the worst heading estimate into the GLEMA Matlab© program resulted in an accuracy of 405.8933 mrad, or approximately 23°. The driver file used to generate this result was worst_heading_estimate.m.

¹The ccd array size and number of pixels were not part of the parameter study. If a finer resolution on the focal plane can be achieved (i.e., an increased number of pixels per focal plane area), then that would obviously decrease the magnitude of the heading estimation error.


6.3 Future Areas to Explore

There are several areas throughout this thesis that were noted as future areas to explore. Those areas, as well as some others not previously mentioned, are listed below.

- Implementation and characterization (i.e., via Monte Carlo analysis) of an algorithm for the SVD solution.
- Implementation and characterization of algorithms for the first- and second-order coplanarity-constraint-based descent algorithms.
- Development, implementation, and characterization of a descent algorithm using the generalized linearized error model presented in chapter 5.
- Incorporation of range information into the error analysis from laser radar or synthetic aperture radar (SAR) visioning systems.
- Experimental verification of error analysis results.
- Expansion of application to tactical far target identification devices.
- Derivation of a fundamental equation that expresses the accuracy of the attitude estimate as a function of the number of control points (i.e., is there a theoretical limit at which adding more control points has no positive effect on the accuracy of the attitude estimate).


APPENDIX
CENTER (c) COORDINATE FRAME DISCUSSION

We need a transformation that maps a^LL to the x-axis of the c coordinate frame, and keeps the z-axis of the c coordinate frame as close to the z-axis of the LL coordinate frame as possible. Assume this form:

\(\begin{bmatrix} a\\ 0\\ 0 \end{bmatrix} = \begin{bmatrix} r_{11} & r_{12} & r_{13}\\ r_{21} & r_{22} & r_{23}\\ r_{31} & r_{32} & r_{33} \end{bmatrix} \begin{bmatrix} a_x\\ a_y\\ a_z \end{bmatrix} = \begin{bmatrix} \mathbf{r}_1^T\\ \mathbf{r}_2^T\\ \mathbf{r}_3^T \end{bmatrix} \begin{bmatrix} a_x\\ a_y\\ a_z \end{bmatrix}\),   (A-1)

where a = ||a||. The row vectors r₁ᵀ, r₂ᵀ, and r₃ᵀ are the unit vectors of the axes of the c coordinate frame expressed in the LL coordinate frame. It immediately follows that

\(\mathbf{r}_1^T = \frac{\mathbf{a}^T}{\lVert\mathbf{a}\rVert}\).   (A-2)

The requirement that the z-axis of the c coordinate frame be as close as possible to the z-axis of the LL coordinate frame is satisfied if the y-axis of the c coordinate frame lies in the xy-plane of the LL coordinate frame. It immediately follows that r₂₃ = 0. For r₂, we need a vector in the xy-plane of the LL coordinate frame that is orthogonal to a. There are two possibilities: [−a_y  a_x  0] and [a_y  −a_x  0]. The first is the correct choice, since a^LL = [a 0 0]ᵀ should lead to R_cLL = I. Therefore,

\(\mathbf{r}_2^T = \frac{1}{\sqrt{a_x^2 + a_y^2}}\begin{bmatrix} -a_y & a_x & 0 \end{bmatrix}\).   (A-3)

Finally, r₃ = r₁ x r₂, so

\(\mathbf{r}_3^T = \frac{1}{\lVert\mathbf{a}\rVert\sqrt{a_x^2 + a_y^2}}\begin{bmatrix} -a_x a_z & -a_y a_z & a_x^2 + a_y^2 \end{bmatrix}\).   (A-4)

Putting it all together:

\(R_{cLL} = \frac{1}{\lVert\mathbf{a}\rVert\sqrt{a_x^2 + a_y^2}} \begin{bmatrix} a_x\sqrt{a_x^2+a_y^2} & a_y\sqrt{a_x^2+a_y^2} & a_z\sqrt{a_x^2+a_y^2}\\ -a_y\lVert\mathbf{a}\rVert & a_x\lVert\mathbf{a}\rVert & 0\\ -a_x a_z & -a_y a_z & a_x^2+a_y^2 \end{bmatrix}\)   (A-5)
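A short Matlab© rendering of the construction in Equations A-2 through A-5 is given below as an illustration; the function name is hypothetical, and the code simply stacks the three unit row vectors derived above.

    function R = centerFrameRotationExample(a)
    % a: 3-by-1 displacement vector in the LL frame, with a_x^2 + a_y^2 > 0.
    % R: rotation whose rows are the c-frame axes expressed in LL (Equation A-5),
    %    so that R*a = [norm(a); 0; 0].
    ax = a(1); ay = a(2);
    r1 = a.' / norm(a);                         % Equation A-2
    r2 = [-ay, ax, 0] / sqrt(ax^2 + ay^2);      % Equation A-3
    r3 = cross(r1, r2);                         % Equation A-4
    R  = [r1; r2; r3];
    end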






BIOGRAPHICAL SKETCH

Mr. Rosengren was born the son of an Air Force officer in Camp Springs, MD, in 1975. He graduated from A. C. Mosley High School in Panama City, FL, in 1994. After spending the next 3 years studying space physics at the United States Air Force Academy, Mr. Rosengren finished his undergraduate studies at the University of Florida. He graduated in the Summer of 1998, earning his B.S. in physics with high honors. Upon graduation, Mr. Rosengren spent 2 years as a member of the 28th Communications Squadron at Ellsworth AFB, working as a 3C0x1 Computer Operator. In the Fall of 2000, Mr. Rosengren began working for Dynetics, Inc. in Shalimar, FL. He works as a Systems Analyst, supporting tasks such as the development of the Multiple Ordinance Air Burst (MOAB) GPS-guided munition, the Exploitation of 3-D Data (E3D) Defense Advanced Research Projects Agency (DARPA) program, the Post Mission Miss Distance Scoring System (PMMDSS) for the Open Air Range (OAR), the Target Acquisition and Tracking System (TATS) Upgrade (also for the Open Air Range), and modeling and simulation of a tactical unmanned air vehicle (TUAV) and an organic air vehicle (OAV) for use in the DARPA Jigsaw sensor development program.

While in Gainesville finishing his undergraduate studies, Mr. Rosengren met the love of his life, the former Dawn Tammy Fries of Floral City, FL. Somehow, Mr. Rosengren managed to convince the lovely Dawn to marry him. They were married in the Summer of 1999 and, following an all too short honeymoon in the Bahamas, Mrs. Rosengren moved up to South Dakota to join her husband. A precious little girl, Miss Gabriella Nicole, was given to Mr. and Mrs. Rosengren on Feb 4, 2002. They also have a scruffy Maltese named Daisy Mae and a floppy-eared bunny named Flopsy. Mr. Rosengren enjoys playing tennis, volleyball, and golf in his spare time. He also enjoys playing jazz and classical trumpet, and serves as a member of the Rocky Bayou Baptist Church Orchestra.

Mr. Rosengren began his graduate studies at the University of Florida Graduate Engineering and Research Center in the Spring of 2001. He is scheduled to complete his Master of Science degree in electrical engineering in the Spring of 2004. Much celebration from the Rosengren family will commence at that time.


Permanent Link: http://ufdc.ufl.edu/UFE0004565/00001

Material Information

Title: On the Formulation of Inertial Attitude Estimation Using Point Correspondences and Differential Carrier Phase GPS Positioning: An Application in Structure from Motion
Physical Description: Mixed Material
Copyright Date: 2008

Record Information

Source Institution: University of Florida
Holding Location: University of Florida
Rights Management: All rights reserved by the source institution and holding location.
System ID: UFE0004565:00001
















ON THE FORMULATION OF INERTIAL ATTITUDE ESTIMATION USING POINT
CORRESPONDENCES AND DIFFERENTIAL CARRIER PHASE GPS POSITIONING:
AN APPLICATION IN STRUCTURE FROM MOTION




















By

SCOTT CLARK ROSENGREN




















A THESIS PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
MASTER OF SCIENCE

UNIVERSITY OF FLORIDA


2004








































Copyright 2004

by

Scott Clark Rosengren


















This thesis is dedicated to BG. I will always love you.

















ACKNOWLEDGMENTS

First, I give all the glory and honor to my LORD and Saviour Jesus Christ for allowing and

enabling me to complete this work.

I also thank my wife for putting up with countless late nights and early mornings, reading

each copy of the manuscript, watching our daughter when she was done, and all the precious time

that she sacrificed for me to work two jobs for the last four years. She has spent at least as much

effort as I did in completing this thesis and all the coursework. I thank her for her never-ending

love and support. My love will be forever hers.

I wish to thank my parents for supporting me, and convincing me that I can do anything I

set my mind to accomplish.

Without the support and direction from Professor Eric Sutton, this project would not have

even begun. I thank him for the time and energy he spent tutoring and mentoring me over the

last two years.

My gratitude is extended to Ron Smith for developing the ufthesis document class for the

LaTeX word processing program in which this document was prepared.

Finally, I wish to thank my company Dynetics, Inc. in Shalimar, FL for allowing me the time

and resources to complete my coursework and thesis.


















TABLE OF CONTENTS
Page

ACKNOWLEDGMENTS ................... ................... iv

LIST OF TABLES . ............ ................... .... vii

LIST OF FIGURES . ........... ................... ... viii

ABSTRACT ..................................... ....... xi

1 INTRODUCTION ..... .................................. 1

2 OVERVIEW OF EXPLOITATION OF POINT CORRESPONDENCES IN MULTI-
PLE IMAGES OF THE SAME SCENE ................... ........ 3

2.1 General Structure from Motion (SfM) Problem Setup and Notation for Euclidean
Framework ....................... ..... ............. ..... 4
2.1.1 General Mathematical Notations ................. .... 4
2.1.2 Imaging System .................. ............. 5
2.1.3 Presentation of the Essential Matrix to De-Couple Structure from Motion 6
2.1.4 Note About Solution Ambiguities in the Euclidean Framework ...... 8
2.2 Problem Categorization and Refinement ................... .... 8

3 BENEFITS GAINED FROM GLOBAL POSITIONING SYSTEM (GPS) DIFFEREN-
TIAL CARRIER PHASE POSITIONING. ................. ........ 11

3.1 General Discussion of GPS Positioning ..... ........... ..... 12
3.1.1 Positioning with Code Delay Measurements ................. 12
3.1.2 Positioning with Carrier Phase Measurements ............... 13
3.1.3 Positioning with Carrier Phase Absolute and Differential Position with
Respect to an External Reference System . . . ...... 14
3.1.4 Determination of Differential Position from Carrier Phase Measurements 16
3.2 Displacement Vector for One Vehicle from Carrier Phase: Integer Ambiguity Not
a Problem ...... ......... ..... .. ...... ... 17
3.3 Displacement Vector for Two Vehicles from Carrier Phase: Integer Ambiguity is
a Problem .... . . . ... ... ............... ... .. 17
3.3.1 Real Time Kinematic (RTK) with Roving Reference Receiver . 18
3.3.2 Wide Laning Using Dual Frequency . . . . . 18
3.3.3 Kalman Filter .. . . . .. . . . ..... 18

4 PROBLEM DEFINITION AND SOLUTION . . . ... . 19

4.1 Problem Notes and Assumptions .................. . .... 19
4.1.1 Rotation Ambiguity about Translational Axis . . . ..... 19
4.1.2 No GPS Attitude Determination Available . . . ...... 19
4.1.3 Specification of Control Points .. . . . .... 20
4.1.4 GPS Differential Carrier Phase Positioning Data Used as Truth . 20
4.2 Specific SfM Problem Setup to Find Vehicle Inertial Attitude . 20
4.3 Singular Value Decomposition (SVD) Solution . . . . 21
4.3.1 Brief Overview of Linear Algebra Techniques used in SVD Solution . 23
4.3.2 Detailed Look at SVD Solution Algorithm . . . .... 24











4.4 Error Minimization Solution Approach . . . . . . 28
4.4.1 Brief Overview of Rotations using Quaternions . . ...... 28
4.4.2 Descent Algorithms using Coplanarity Constraint . . . 31
4.4.3 Descent Algorithm Using Linearized Measurement Model . .... 34

5 ERROR ANALYSIS .. . . . .. . . . . .... 35

5.1 Generalized Linearized Error Model . . . . .... . 35
5.1.1 Implementation of the Generalized Linearized Error Model Analysis (GLEMA)
Matlab© Program    42
5.2 Parameter Study ................. . . . . 46
5.2.1 Varying Camera Focal Length .. . . . . .... 47
5.2.2 Varying Camera Depression Angle . . . ....... 51
5.2.3 Varying Displacement Vector Magnitude Between Imaging Systems . 59
5.2.4 Varying Number of Control Points . . . . . 61
5.2.5 Varying Location of Last Control Point . . . . .... 76
5.2.6 Varying Control Point Measurement Error . . . .... 81

6 CONCLUSIONS .. . . . . . . . . . 86

6.1 Optimal Camera Parameters for Heading Estimation . . . ..... 86
6.2 Worst Camera Parameter Values for Attitude Estimation . . . 87
6.3 Future Areas to Explore .. . . . .. . . .... 88

APPENDIX CENTER (c) COORDINATE FRAME DISCUSSION . . 89

REFERENCES . . . . . . ......... . .... 90

BIOGRAPHICAL SKETCH .. . . . .. . . . .... 92



















LIST OF TABLES


Table                                                                                   Page

5-1  Coordinate System Definitions                                                       36
5-2  Description of variables in the generalized error model                             39
5-3  Correspondence between driver.m and plot_results.m files for the GLEMA program      44
5-4  Summary of linearized independent variables contained in the system error vector v
     with their corresponding inputs in the GLEMA program                                 45
5-5  Step one verification results for the GLEMA program                                 45
5-6  Step two verification results for the GLEMA program                                 46
5-7  Input parameters for varying camera focal length study                              49
5-8  Input parameters for varying camera depression angle (β) study                      54
5-9  Input parameters for varying displacement vector magnitude between imaging systems  60
5-10 Input parameters for varying the number of control points study                     69
5-11 Input parameters for varying location of last control point                         80
5-12 Input parameters for varying control point measurement error                        83
6-1  Optimal parameter values for heading estimate                                       87


















LIST OF FIGURES
Figure Page

2-1 General SfM problem notation for Euclidean Framework . . . ... 5

2-2 Geometry for SfM 2D-to-2D point correspondence problem. . . . 7

3-1 Geometry for determination of differential position from carrier phase measurements. 16

4-1 Imaging system notation used in finding vehicle's inertial attitude. . . 21

4-2 Single 2D-to-2D point correspondence used in finding vehicle's inertial attitude. 22

5-1 Error analysis scenario. . . . . . . . ...... 36

5-2 Overview of Matlab© implementation of the generalized linearized error model analysis (GLEMA) program. 42

5-3 Graphical overview of verification of the GLEMA program . . . 44

5-4 Graphical overview of the parameter study process using the GLEMA program 47

5-5 Allowed location of control points within each grid box of the focal plane. . 48

5-6 Control point locations on a focal plane with a 3 row by 3 column grid designation. 48

5-7 Varying FOV vs. focal length .. . . . ... . . ... 50

5-8 Control point locations on the focal planes of both cameras for the first iteration for
varying camera FOV. . . . . . . . ...... 51

5-9 Control point locations on the focal planes of both cameras for the last iteration for
varying camera FOV. . . . . . . . ...... 52

5-10 Three-dimensional view of the scene for the first iteration for a varying focal length. 52

5-11 Three-dimensional view of the scene for the last iteration for a varying focal length. 53

5-12 Varying camera focal length results ned to c2 . . . . ... 53

5-13 Varying camera focal length results cl to c2 . . . . .. .. 55

5-14 Control point locations on focal plane for it 3 = 45 degrees for the varying camera
depression angle study .. . . . .. . . . ... 56

5-15 Control point locations on focal plane for it 3 = 90 degrees for the varying camera
depression angle study . . . . . . . . 56

5-16 Control point locations on focal plane for it 3 = 135 degrees for the varying camera
depression angle study .. . . . .. . . . ... 57

5-17 Three-dimensional view of the scene for 3 = 45 degrees for the varying camera de-
pression angle study. . . . . . . . ....... 57

5-18 Three-dimensional view of the scene for 3 = 90 degrees for the varying camera de-
pression angle study. . . . . . . . ....... 58











5-19 Three-dimensional view of the scene for 3 = 135 degrees for the varying camera de-
pression angle study. . . . . . . . ....... 58

5-20 Varying camera depression angle results ned to c2 . . . ....... 59

5-21 Varying camera depression angle results cl to c2 . . . ....... 61

5-22 Control point locations on focal plane for a = 1.0 meters for the varying displace-
ment vector magnitude study. . . . . . . ...... 62

5-23 Control point locations on focal plane for a = 5.9 meters for the varying displace-
ment vector magnitude study. . . . . . . ...... 62

5-24 Control point locations on focal plane for a = 10.9 meters for the varying displace-
ment vector magnitude study. . . . . . . ...... 63

5-25 Control point locations on focal plane for a = 15.9 meters for the varying displace-
ment vector magnitude study. . . . . . . ...... 63

5-26 Control point locations on focal plane for a = 20.9 meters for the varying displace-
ment vector magnitude study. . . . . . . ...... 64

5-27 Three-dimensional view of the scene for a = 1.0 meter for the varying displacement
vector magnitude between imaging systems study. . . . ..... 64

5-28 Three-dimensional view of the scene for a = 5.9 meter for the varying displacement
vector magnitude between imaging systems study. . . . ..... 65

5-29 Three-dimensional view of the scene for a = 10.9 meter for the varying displace-
ment vector magnitude between imaging systems study. . . ....... 65

5-30 Three-dimensional view of the scene for a = 15.9 meter for the varying displace-
ment vector magnitude between imaging systems study. . . ....... 66

5-31 Three-dimensional view of the scene for a = 20.9 meter for the varying displace-
ment vector magnitude between imaging systems study. . . ....... 66

5-32 Varying displacement vector magnitude between imaging systems results for a = 1.0
to 20.9 meters ned to c2 .. . . . .. . . .... 67

5-33 Varying displacement vector magnitude between imaging systems results for a = 5.9
to 20.9 meters ned to c2 .. . . . .. . . ... 67

5-34 Varying displacement vector magnitude between imaging systems results for a = 1.0
to 20.9 meters cl to c2 .. . . . . . . ... 68

5-35 Varying displacement vector magnitude between imaging systems results for a = 5.9
to 20.9 meters cl to c2 .. . . . . . . .... 68

5-36 Control point locations on focal plane for a column count of 1 and a row count of 6. 70

5-37 Control point locations on focal plane for column count of 10 and a row count of 6. 71

5-38 Control point locations on focal plane for column count of 20 and a row count of 6. 71

5-39 Three-dimensional view of the scene for a column count of 1 and a row count of 6
for the varying number of control points study. . . . . ... 72

5-40 Three-dimensional view of the scene for a column count of 10 and a row count of 6
for the varying number of control points study. . . . . ... 72











5-41 Three-dimensional view of the scene for a column count of 20 and a row count of 6
for the varying number of control points study. . . . . ... 73

5-42 Varying number of control points study results for column counts 1-20 and a row
count of 6 ned to c2 .. . . .. . . . . 74

5-43 Varying number of control points study results for column counts 1-20 and a row
count of 6 cl to c2 .. . . .. . . . . 74

5-44 Control point locations on focal plane for row count of 1 and a column count of 6. 75

5-45 Control point locations on focal plane for row count of 10 and a column count of 6. 75

5-46 Control point locations on focal plane for row count of 20 and a column count of 6. 76

5-47 Three-dimensional view of the scene for a row count of 1 and a column count of 6
for the varying number of control points study. . . . . ... 77

5-48 Three-dimensional view of the scene for a row counts of 10 and a column count of 6
for the varying number of control points study. . . . . ... 77

5-49 Three-dimensional view of the scene for a row count of 20 and a column count of 6
for the varying number of control points study. . . . . ... 78

5-50 Varying number of control points study results for row counts 2-20 and a column
count of 6 ned to c2 .. . . .. . . . . 78

5-51 Varying number of control points study results for row counts 2-20 and a column
count of 6 cl to c2 .. . . .. . . . . 79

5-52 Original focal plane view of control point locations for the varying last control point
study. . . . . . . . . . . 79

5-53 Inertial heading estimate errors for varying last control point study. . ..... 81

5-54 The calculated terrain mapped onto the focal plane for varying last control point
study. ... . . . . . . .... ....... 82

5-55 Control point locations on focal plane for the varying pixel error study. . . 82

5-56 Three-dimensional view of the scene for the varying control point measurement er-
ror study. . . . . . . . . . . 84

5-57 Varying control point measurement errors results for 1.0 to 2.99 pixels ned to c2 85

5-58 Varying control point measurement errors results for 1.0 to 2.99 pixels cl to c2 85

















Abstract of Thesis Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Master of Science

ON THE FORMULATION OF INERTIAL ATTITUDE ESTIMATION USING POINT
CORRESPONDENCES AND DIFFERENTIAL CARRIER PHASE GPS POSITIONING:
AN APPLICATION IN STRUCTURE FROM MOTION

By

Scott Clark Rosengren

May 2004

Chair: Eric Sutton
Major Department: Electrical and Computer Engineering

Combining the results from Structure from Motion (SfM) processing of two overlapping

images with precise GPS differential carrier phase positioning yields an essentially "free" attitude

estimate for the platform aircraft. This technique is essentially "free" because it uses sensor systems

already required on the platform aircraft for other reasons. The SfM processing results in a

displacement vector in the body coordinate system between the two camera locations. The GPS

differential carrier phase positioning is used to produce an extremely accurate displacement

vector in a local level coordinate system. Once two vectors are found that are expressed in

both coordinate systems, then the transformation between the two coordinate systems can be

found. This corresponds directly to the platform vehicle's attitude. A detailed error analysis on a

linearized model of the system shows that errors on the attitude estimate are on the order of tens

of mrad for a measurement error standard deviation of one pixel.

Background information in both the SfM and GPS fields is given. Then, the problem is

described, all assumptions are stated, and solution techniques are presented. Finally, a detailed

error analysis is performed on a linearized model of the system. The key contribution of this

thesis is the detailed error analysis of the attitude estimate.

















CHAPTER 1
INTRODUCTION

Much research over the last 20 years in the Structure from Motion (SfM) field has focused

on using information from multiple 2-D images of the same scene taken from different camera

locations to calculate a 3-D rendering of the scene and the displacement of the camera, commonly

called the reconstruction problem. This thesis presents an in-depth study of a new application

for this field of study. By utilizing techniques from SfM, differential positioning information from

GPS sensors can be combined with vision-system information to yield an estimate of vehicle

heading.

Increasing the accuracy and robustness of airborne vehicle navigation is the objective driving

the formulation of the problem presented in this thesis. The primary focus of this work is to

examine an algorithm that uses data that will be available on the vehicle for other reasons. It

is assumed that the airborne vehicle is small, low cost, mostly autonomous, and used for

attack and/or intelligence gathering. This vehicle is envisioned to be able to fly as part of multiple

or single vehicle missions. Each vehicle or agent will almost certainly require the following four

sensor systems: (1) a low grade inertial measurement unit (IMU), (2) a GPS receiver, (3) a vision

system, and (4) an inter-agent communication system (if part of multi-agent mission). The IMU

and GPS combination provides accurate navigation, and the IMU provides attitude sensors for

vehicle control. The vision system is necessary for intelligence gathering or target acquisition. An

inter-agent communication system is necessary for distributed intelligence and will provide secure

communication with jamming resistance using technology such as spread spectrum or ultra wide

band. Both of these technologies can be used to measure precise distances between transmitter

and receiver, in addition to the communication functions.

The problem detailed in this thesis uses vision correspondences from multiple reference

systems, combined with the corresponding inertial displacement vectors between the systems to

obtain the inertial attitude of one of the systems. The vision correspondences of the reference

systems could be from (1) different vehicles viewing parts of the same scene at the same time

or (2) one moving vehicle that takes multiple images. The only constraint is that the point










correspondences remain constant in all images. As far as the formulation of the problem is concerned, it makes

no difference whether it is thought of as (1) or (2) above.

As an example,1 suppose vehicle one is directly behind vehicle two; and that vehicle one

can sense, using its vision system, the angular location of vehicle two with respect to itself. If

the relative position of vehicle two with respect to vehicle one is known using GPS, then the

pitch and heading of vehicle one can be determined very precisely. This simplified scenario

clearly imposes some unacceptable operational constraints, but these constraints can be removed

using more sophisticated algorithms to process the sensor information. If both vehicles have

downward-looking vision systems and the two fields of view overlap, then it should be possible to

obtain the same information as would be obtained if vehicle two were directly visible to vehicle

one. In addition, even if the two fields of view do not simultaneously overlap, if their point

correspondences remain fixed in both images, that should provide enough information to calculate

the attitude of one of the vehicles. The baseline between vehicles will be long enough to provide

an extremely accurate pointing direction; the accuracy of this technique will depend primarily on

the vision-system geometry. The primary objective of this study is a thorough error analysis of

this technique.
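As a rough numerical illustration of the simplified scenario above (a minimal Python/NumPy sketch added for this rewrite, not part of the original work; the displacement values are hypothetical), the pointing direction implied by a known relative displacement in a North-East-Down frame can be converted to heading and pitch as follows.

    import numpy as np

    # Hypothetical GPS-derived displacement from vehicle one to vehicle two,
    # expressed in a North-East-Down (NED) local level frame, in meters.
    a_ned = np.array([9.8, 1.5, -0.7])   # north, east, down

    # If vehicle one's vision system sees vehicle two along its boresight,
    # the displacement direction gives vehicle one's heading and pitch.
    north, east, down = a_ned
    heading = np.arctan2(east, north)                  # rotation about the down axis
    pitch = np.arctan2(-down, np.hypot(north, east))   # elevation above the local horizontal

    print(np.degrees(heading), np.degrees(pitch))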

This thesis is organized in the following manner. A brief overview of the Structure from

Motion field that uses point correspondences to calculate spatial and orientation characteristics is

given, along with a mathematical overview of the reconstruction problem given in the Euclidean

framework. A chapter on the benefits gained from GPS differential carrier phase positioning is

presented. Next, the problem of interest is mathematically defined, all simplifying assumptions

are given, and solution techniques are presented. Then, an error analysis chapter develops a

linearized model of the system and provides a detailed look at how various parameters as well as

the overall scene geometry affect the heading estimate. Results for the other error terms are also

shown. Finally, a conclusion section presents what the author believes are the important findings

contained in this thesis, and further areas to explore.











1 Originally given by E. Sutton in a 2001 white paper originating from the University of Florida
Graduate Engineering and Research Center in Shalimar, FL.

















CHAPTER 2
OVERVIEW OF EXPLOITATION OF POINT CORRESPONDENCES IN MULTIPLE IMAGES
OF THE SAME SCENE

Using point correspondences between two different 2-D views of the same scene to derive

3-D information about that scene is not a new idea. As early as the mid-nineteenth century, the

field of photogrammetry used point correspondences between two images of the same scene from

different camera locations to aid in the production of topographical maps to assist the explorers

of that generation [1]. With the advent of increased computing power in the 1980s, the techniques

of photogrammetry have found application in robotics and computer vision. Research in robotics

has emphasized vision as an aid to navigation, and research in computer vision has emphasized

the reconstruction of 3-D scenes from 2-D data. The latter comprises the field known as Structure

from Motion (SfM).

The original formulation of this problem used a technique called projective geometry to

describe the relationships between image correspondences [2, 3]. Projective geometry uses ray

tracing through the optical center of the camera system and epipolar transformations to describe

the reconstruction of the 3-D scene. More recently, SfM researchers have also developed and

quantified the 3-D reconstruction problem using Euclidean geometry [1, 4, 5, 6]. The Euclidean

framework simplifies the problem to that of finding a solution to a set of nonlinear equations, and

lends itself to an analytical solution. Maybank [1] gives the fundamental difference between these

approaches on their different views of geometry: synthetic versus analytic. Projective geometry

uses a synthetic approach that gives interpretations to the equations describing the scene based

on the geometric shapes being seen (lines, planes, conics, etc.). The drawback to the synthetic

approach is that it is difficult to make it completely rigorous. The analytic approach, used in the

Euclidean framework, leads to the introduction of an essential matrix that concisely and robustly

describes the motion of the camera. Maybank [1] gives a thorough background and outlines the

mathematical complexities of both techniques. Before exploring some of the relevant research in

SfM, a basic overview of the problem in the Euclidean framework is given.










2.1 General Structure from Motion (SfM) Problem Setup and Notation for
Euclidean Framework

Before going into further findings about 2D-to-2D point correspondences from SfM research,

a general mathematical overview of the classical SfM problem needs to be given. General notation

used throughout the community is not standardized, so the notation used in this thesis is given an

elementary treatment.

2.1.1 General Mathematical Notations

Vectors and matrices. The notation of a boldfaced lowercase letter denotes a vector. The

superscript denotes the coordinate system in which the components of the vector are expressed.

For example, p^{c1} represents the vector p expressed in the (c1) 3-D coordinate system. A unit vector is denoted by a caret on top of the boldfaced lowercase letter. For example, a unit vector in the same direction as the vector given above would be denoted as \hat{p}^{c1}. A single capital letter is

used to designate a matrix.

Rotation matrices. R_{c1}^{c2} represents an orthonormal rotation matrix where the subscript (c1) represents the coordinate system that the rotation will go from and the superscript (c2) represents the coordinate system that the rotation will go to. The orthonormal rotation matrix can be expressed as the product of successive rotations about the three body axes that are fixed to the system undergoing the rotation. The three angles associated with the three successive rotations are called Euler angles. For this thesis, a 3-2-1 sequence of rotations is used in reference to Euler angles, where 3 stands for the z-axis, 2 for the resulting y-axis, and 1 for the resulting x-axis. All rotations are right-handed. That is, the system will first yaw (\psi) about the z-axis (3), pitch (\theta) about the resulting y-axis (2), and then roll (\phi) about the resulting x-axis (1). This is represented mathematically as

\begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\phi & \sin\phi \\ 0 & -\sin\phi & \cos\phi \end{bmatrix} \begin{bmatrix} \cos\theta & 0 & -\sin\theta \\ 0 & 1 & 0 \\ \sin\theta & 0 & \cos\theta \end{bmatrix} \begin{bmatrix} \cos\psi & \sin\psi & 0 \\ -\sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{bmatrix}    (2-1)

Also, the inverse of an orthonormal rotation matrix is equal to its transpose,

(R)^{-1} = R^{T}    (2-2)

So, for a rotation matrix that goes from system 1 to system 2,

R_{1}^{2} = (R_{2}^{1})^{T}    (2-3)















[Figure: scene point (x, y, z), focal length f, and focal plane axes c and r.]

Figure 2-1: General SfM problem notation for the Euclidean framework.


2.1.2 Imaging System

Let an imaging system have length f from the optical focus point to the focal plane. This is

commonly referred to as the focal length. Image coordinates c and r are the 2-D coordinates on

the focal plane used to represent the projection of a point in the three-dimensional (3-D) scene

given by (x,y,z). The origin of the 3-D scene is collocated with the origin of the imaging system's

focal plane. The x-axis and y-axis of the 3-D scene correspond to the c and r axes of 2-D focal

plane coordinates. The z-axis of the 3-D scene is orthogonal to the focal plane (i.e., directly in

the line of sight or pointing direction). By using analytical geometrical relationships, the imaging

system coordinates are related to the 3-D scene coordinates by Equations 2-4 and 2-5, as shown in

Figure 2-1.

c = f \frac{x}{z}    (2-4)

r = f \frac{y}{z}    (2-5)

For the 2D-to-2D solution, two different views or images of the same scene with n correspon-

dences are used to reconstruct the 3-D scene in either one of the camera 3-D coordinate systems.

Let the origin of the first camera system be given by o1 and the origin of the second camera system be given by o2, where both have the value (0, 0, 0) in their respective 3-D coordinate systems. The displacement between the camera systems is given by a translation a^{c1} and a rotation R_{c1}^{c2}. Note that it is arbitrary which 3-D camera system is called c1 and which is called c2 when using this notation. This detail is pointed out because the solution technique of chapter 4 and the error analysis technique of chapter 5 solve for the attitude of c1 and c2 respectively, as given in Figure 2-2.

For this particular example, if the translation between the two camera coordinate systems is zero:

q^{c2} = R_{c1}^{c2} p^{c1}    (2-6)

So, adding in a non-zero translation gives the coordinate system conversion as:

q^{c2} = R_{c1}^{c2} (p^{c1} - a^{c1})    (2-7)

The lines p^{c1} and q^{c2} represent the 3-D scene coordinates of a point correspondence expressed in the first and second camera system respectively. The geometry using two cameras is shown in Figure 2-2. If the magnitude of p^{c1} is taken to be p and the unit vector is denoted as \hat{p}^{c1}, and likewise for the second coordinate system, then

q \hat{q}^{c2} = R_{c1}^{c2} (p \hat{p}^{c1} - a^{c1})    (2-8)

Equation 2-8 forms the basis for the Euclidean reconstruction problem used in computer vision.1 The reconstruction problem can then be rewritten in a Euclidean framework as such: for n point correspondences, find the scalar values p_i and q_i along with the camera displacement described by R_{c1}^{c2} and a^{c1} that is valid for each \hat{p}_i^{c1}, \hat{q}_i^{c2} point correspondence pair.
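To make the notation concrete, the short NumPy sketch below (an illustration added for this rewrite, not from the thesis; the focal length, scene point, and displacement are arbitrary assumed values) projects a scene point onto the focal plane with Equations 2-4 and 2-5 and maps it between the two camera frames with Equation 2-7.

    import numpy as np

    f = 0.05                               # focal length (hypothetical, meters)
    p_c1 = np.array([2.0, -1.0, 40.0])     # scene point expressed in camera 1 (x, y, z)

    # Focal plane coordinates in camera 1 (Equations 2-4 and 2-5).
    c1_col = f * p_c1[0] / p_c1[2]
    c1_row = f * p_c1[1] / p_c1[2]

    # Hypothetical displacement between the cameras: translation a^c1 and a
    # rotation R_c1^c2 (here a small yaw about the camera z-axis).
    a_c1 = np.array([10.0, 0.0, 0.0])
    psi = np.radians(5.0)
    R_c1_c2 = np.array([[ np.cos(psi), np.sin(psi), 0.0],
                        [-np.sin(psi), np.cos(psi), 0.0],
                        [ 0.0,         0.0,         1.0]])

    # Same point expressed in camera 2 (Equation 2-7), then projected.
    q_c2 = R_c1_c2 @ (p_c1 - a_c1)
    c2_col = f * q_c2[0] / q_c2[2]
    c2_row = f * q_c2[1] / q_c2[2]
    print(c1_col, c1_row, c2_col, c2_row)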

2.1.3 Presentation of the Essential Matrix to De-Couple Structure from Motion

Equation 2-8 takes into account the translation and rotation between the first and second

camera coordinate systems when viewing a common point in real space. The following logic can

be applied to separate the structure and motion components of the Euclidean SfM problem [1]:

* Take the limit as |p^{c1}| \to \infty, |q^{c2}| \to \infty, and p/q \to 1. This gives the relationship between \hat{p}^{c1} and \hat{q}^{c2} as

\hat{q}^{c2} = R_{c1}^{c2} \hat{p}^{c1}    (2-9)

The dot product between \hat{q}^{c2} and R_{c1}^{c2}\hat{p}^{c1} \times R_{c1}^{c2} a^{c1} yields

(R_{c1}^{c2}\hat{p}^{c1} \times R_{c1}^{c2} a^{c1}) \cdot \hat{q}^{c2} = 0    (2-10)


1 Equivalent to Equation 2-1 in Maybank. [1]













[Figure: the c1 and c2 focal planes viewing a common scene point.]

Figure 2-2: Geometry for SfM 2D-to-2D point correspondence problem.


This is referred to as the coplanar constraint and states in mathematical terms that two

lines intersecting the same point in space must have a common plane with both lines being

a member. If Equation 2-10 holds and \hat{q}^{c2} \times R_{c1}^{c2}\hat{p}^{c1} \neq 0, then the depths p and q can be found via Equation 2-8. If \hat{q}^{c2} \times R_{c1}^{c2}\hat{p}^{c1} = 0, then \hat{q}^{c2} = R_{c1}^{c2}\hat{p}^{c1} and the depth is infinite.

* Write Equation 2-10 using the antisymmetric matrix T_{a^{c1}} involving the vector components of a^{c1}, where

T_{a^{c1}} = \begin{bmatrix} 0 & a_3 & -a_2 \\ -a_3 & 0 & a_1 \\ a_2 & -a_1 & 0 \end{bmatrix}    (2-11)

* Noting that T_{a^{c1}} \hat{p}^{c1} = \hat{p}^{c1} \times a^{c1} gives an alternate representation for Equation 2-10 as

(\hat{q}^{c2})^{T} R_{c1}^{c2} T_{a^{c1}} \hat{p}^{c1} = 0    (2-12)


* An essential matrix is defined as a 3 x 3 matrix which is the product of an orthogonal rotation matrix and a non-zero antisymmetric matrix. Letting the essential matrix E be defined as E = R_{c1}^{c2} T_{a^{c1}} gives

(\hat{q}^{c2})^{T} E \hat{p}^{c1} = 0    (2-13)










The most important property of essential matrices is that they can be defined as the zeros

of a set of nine homogeneous polynomial equations of degree three in the coefficients of 3 x 3

matrices [1]. Another key property of essential matrices is that they are of rank two.2

A third statement of the problem based on the essential matrix formulation in the Euclidean framework is such: for n point correspondences, find the essential matrix E that is valid for all \hat{p}_i^{c1}, \hat{q}_i^{c2} point correspondences. The advantage of this formulation is that motion and structure are now decoupled in the solution. That is, the camera displacement originally given by R_{c1}^{c2} and a^{c1} is summarized succinctly in E. Then, given the solution for E, the structure components of the solution given by p_i and q_i can be found by the relationships given in Equation 2-8. This approach lends itself to an algorithm that begins by solving for camera displacement independent of structure, which will be important later.
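The decoupling can be checked numerically. The sketch below (an illustrative NumPy fragment under assumed values, not part of the thesis) builds T_{a^{c1}} as in Equation 2-11, forms E = R_{c1}^{c2} T_{a^{c1}}, and verifies that the constraint of Equation 2-13 holds for a synthetic correspondence and that E has rank two.

    import numpy as np

    def skew(a):
        # Antisymmetric matrix T_a of Equation 2-11, so that T_a @ p = p x a.
        return np.array([[ 0.0,   a[2], -a[1]],
                         [-a[2],  0.0,   a[0]],
                         [ a[1], -a[0],  0.0]])

    # Hypothetical camera displacement: rotation R_c1^c2 and translation a^c1.
    psi = np.radians(10.0)
    R = np.array([[ np.cos(psi), np.sin(psi), 0.0],
                  [-np.sin(psi), np.cos(psi), 0.0],
                  [ 0.0,         0.0,         1.0]])
    a = np.array([1.0, 0.2, 0.1])

    E = R @ skew(a)                       # essential matrix, E = R T_a

    # Synthetic correspondence: a scene point seen from both cameras.
    p_c1 = np.array([3.0, -2.0, 30.0])
    q_c2 = R @ (p_c1 - a)
    p_hat = p_c1 / np.linalg.norm(p_c1)
    q_hat = q_c2 / np.linalg.norm(q_c2)

    print(q_hat @ E @ p_hat)              # ~0: the coplanarity constraint of Eq. 2-13
    print(np.linalg.matrix_rank(E))       # 2: essential matrices have rank two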

2.1.4 Note About Solution Ambiguities in the Euclidean Framework

For any solution to the reconstruction problem utilizing the Euclidean framework there

are two ambiguities that will always be present: (1) a scaling ambiguity and (2) a twisted pair

of solutions [1]. The scaling ambiguity is inherent because the absolute size of any object in a

2-D image is not known. That is, a larger object farther away may appear to be the same size

as a smaller object that is nearer. The scaling ambiguity can be resolved by placing arbitrary

constraints on the translation vector a^{c1} [1] (e.g., ||a^{c1}|| = 1).

The twisted pair solution arises by rotating the camera 180 degrees around the axis of translation. After rotation around the translation axis, it can be shown that there is a second set of camera displacement terms, S_{c1}^{c2} and b^{c1}, that is valid for each \hat{p}_i^{c1}, \hat{q}_i^{c2} point correspondence pair [1]. That is,

E = R_{c1}^{c2} T_{a^{c1}} = S_{c1}^{c2} T_{b^{c1}}.    (2-14)

A single solution in the Euclidean framework is taken to be the sum of solutions over the scaling

and twisted-pair ambiguities.

2.2 Problem Categorization and Refinement

There are different solution approaches based upon what information is available to the many

applications of the SfM problem. Huang and Netravali [6] have placed these solution approaches

into three categories:



2 For further details on essential matrices the reader can look in Section 2.2 of Maybank's text
[1] and other SfM literature [4, 5, 6].










Three-dimensional (3D) to three-dimensional (3D) feature correspondences

Two-dimensional (2D) to three-dimensional (3D) feature correspondences

Two-dimensional (2D) to two-dimensional (2D) feature correspondences

This thesis explores the third of Huang's categories, specifically using point correspondences in a

Euclidean framework. According to Huang [6], the applications that 2D-to-2D feature

correspondences are used to solve are

Finding relative attitudes of two cameras observing the same scene

Estimating motion and structure of objects moving relative to a camera

Passive navigation (i.e., finding the relative attitude of a vehicle at two different time

instants)

Efficient coding and noise reduction of image sequences by estimating motion of objects.

The objective of this thesis corresponds to the first and third applications given above. As shown

in section 2.1, it makes no difference mathematically whether the problem is stated as finding

relative attitude of two cameras simultaneously viewing the same scene or a single camera viewing

an overlapping scene at two different time instances.

An attempt to form a general framework for comparing the various flavors of solution

algorithms is given by Soatto and Perona [5, 7]. They give five instances of their general model.

One of these five is the Essential Model, which uses the coplanarity constraint introduced by

Longuet-Higgins [8]. This formulation is essentially the Euclidean framework described earlier.

Another of the five instances of their general model is the Subspace model. It consists of the

subspace constraint introduced by Heeger [9] which interprets the model as a dynamical system

rather than an algebraic constraint. This has the same advantage of the Euclidean framework

in that it decouples the motion and structure components in the solution, but has an additional

advantage of further decoupling the rotational velocity from the translational velocity. The other

three instances of their model involve fixation of a various number of features on the focal plane

and are not relevant to this thesis.

Soatto and Perona's research into generalizing the many SfM applications into a common

framework looks promising because they present the idea of integrating information over time. If

integration of information can be accomplished, then the effective baseline between images will be

increased. A longer baseline should result in increased accuracy of the reconstruction. Soatto and

Perona's work has the implication that any solution technique can be conformed to the general

model and thus have the integration benefit provided therein. This has the potential to










increase the accuracy of any application that is currently based on a SfM algorithm to perform

reconstruction.

















CHAPTER 3
BENEFITS GAINED FROM GLOBAL POSITIONING SYSTEM (GPS) DIFFERENTIAL
CARRIER PHASE POSITIONING

The Global Positioning System (GPS) is a technology consisting of a constellation of satel-

lites in prescribed orbits transmitting known signals to be used for precise positioning via so-

phisticated triangulation techniques. Each satellite transmits pseudo-random codes at known

frequencies. There are two ways to determine position from the GPS constellation: (1) code delay

measurements and (2) carrier phase measurements. Typical user range errors (UREs) using code

delay measurements are on the order of 5-10 meters and there is no ambiguity in the solution.

Typical differential position errors using carrier phase measurements are two orders of magnitude

less than the code delay UREs. However, there is an integer ambiguity inherent in using carrier

phase measurements to determine differential position that must be resolved. [10]

The problem examined in this thesis, detailed in chapter 4, involves using a displacement

vector between two airborne vehicles that is calculated in two coordinate systems to determine

the rotation between those coordinate systems. One displacement vector will be calculated in

the body coordinate system using SfM techniques. The effects of measurement error on this

displacement vector in the calculation of the rotation between the two coordinate systems is

the focus of chapter 5. The other coordinate system where the displacement vector is calculated

is an inertial system. GPS differential carrier phase positioning will be used to calculate this

displacement vector and will be treated as truth data in the error analysis given in chapter 5.

This chapter is written to show the validity of using GPS differential carrier phase positioning as

truth data.

This chapter begins with a general discussion of determining position from the GPS con-

stellation using code delay and then using carrier phase measurements. Next, the process for

determining the relative displacement vector for a single moving vehicle is discussed. Finally, the

details of how the relative displacement vector between two vehicles is calculated using carrier

phase measurements are presented. All material contained in this chapter was originally presented

or derived from material given in chapters four and six of an excellent GPS text written by Misra

and Enge [10].










3.1 General Discussion of GPS Positioning

There are two types of measurements that can be made to use GPS to determine position.

The original design of the GPS system was to estimate the range to at least four satellites using

measurements of the pseudo-random code to unambiguously determine the receiver position. The

pseudo-random code generated by the satellite is compared to the receiver generated pseudo-

random code to produce an estimated transit time. This estimated transit time multiplied by the

speed of light in a vacuum determines the range to the satellite that transmitted the signal. The

measurements from four satellites are used to determine the 3-D coordinates of the receiver and

the receiver clock bias.

It was shown in the late 1970s that a second type of measurement could also be used to de-

termine receiver position [11, 12]. The phase of the carrier signal transmitted by the satellite can

be measured relative to a receiver generated carrier signal. The phase difference plus an unknown

number of whole cycles gives another estimate of the range to the satellite. Measurements from at

least four satellites must still be used to determine receiver position, but now the receiver position

contains an integer ambiguity that cannot be resolved in a direct manner. Positioning using both

measurement types is discussed in the next two sections.

3.1.1 Positioning with Code Delay Measurements

Positioning with code delay measurements relies on the satellite and receiver being able to

create identical signals in time. The receiver-generated signal will then be a delayed version of the

transmitted signal. The signal delay is proportional to the distance to the satellite and will be

called the transit time. However, in order to compare the signals there must be a common time

reference, t. GPS time (GPST) is used as the common time reference. Using the notation as given

in Misra and Enge [10], let the transit time be denoted as \tau.

Neither the receiver clock nor the satellite clock is precisely aligned with GPST. Let the bias terms relative to GPST for the satellite and receiver be denoted as \delta t^{s} and \delta t_{r}, respectively. Then, noting that t stands for GPST,

t^{s}(t - \tau) = (t - \tau) + \delta t^{s}(t - \tau)    (3-1)

t_{r}(t) = t + \delta t_{r}(t)    (3-2)










So, the measured range to the satellite, or pseudo-range, is given by

\rho(t) = c[t_{r}(t) - t^{s}(t - \tau)] = c\tau + c[\delta t_{r}(t) - \delta t^{s}(t - \tau)] + \epsilon_{\rho},    (3-3)

where \epsilon_{\rho} is used to denote the random pseudo-range estimation error.

The transmission speed of the signal will be significantly less than the speed of light once it enters the earth's atmosphere. The transmission speed can be modeled to take into account the delays resulting from the ionosphere and troposphere by

c\tau = r(t, t - \tau) + I_{\rho}(t) + T_{\rho}(t),    (3-4)

where I_{\rho}(t) and T_{\rho}(t) represent the ionospheric and tropospheric transmission delays and r(t, t - \tau) represents the true range from the receiver to the satellite. Dropping the reference to a specific time in GPST, t, gives the measurement equation for the pseudo-range from the receiver to the satellite as

\rho = r + c[\delta t_{r} - \delta t^{s}] + I_{\rho} + T_{\rho} + \epsilon_{\rho}    (3-5)

The accuracy of Equation 3-5 is directly related to how well the bias and error terms are modeled or taken into account. The satellite clock bias at the signal generation time, \delta t^{s}(t - \tau), is calculated by the GPS ground stations and contained in the navigation message that is sent with the pseudo-random signal. The receiver clock bias, \delta t_{r}(t), varies for each receiver and can cause significant errors. Satellite ranges are on the order of 20,000-26,000 km, which correspond to transmission times of 70 to 90 ms [10]. So, measurement of an 80 ms transmission time with 1% error, or 0.8 ms, would result in approximately 240 km of range error. Or, in order to have less than 20 m of range error, the receiver clock bias with respect to GPST must be accurate to within 66.7 ns.
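These figures follow directly from the speed of light; the short fragment below (a simple Python check added for illustration, not from the text) reproduces them.

    c = 299_792_458.0                           # speed of light, m/s

    transit_time = 80e-3                        # ~80 ms transit time for a 20,000-26,000 km range
    timing_error = 0.01 * transit_time          # 1% error, i.e. 0.8 ms
    print(c * timing_error / 1e3)               # ~240 km of range error

    allowed_range_error = 20.0                  # meters
    print(allowed_range_error / c * 1e9)        # ~66.7 ns of allowed receiver clock bias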

3.1.2 Positioning with Carrier Phase Measurements

A more accurate way of determining the range from a receiver to a satellite is to measure the

phase of the carrier signal that carries the pseudo-random code generated from the satellite with

respect to the carrier signal that the receiver generates. The phase of the carrier signal can be

converted to transit time of the signal by adding the fractional cycle that is given by the phase

plus an unknown number of whole cycles. This technique is ambiguous in the number of whole

cycles that it takes to determine the range to the satellite.

The cycle, or wavelength, of the carrier signal is 19 cm for the L1 GPS signal and 24 cm for

the L2 signal. By comparison, the length of one code chip of the C/A code that is transmitted











on the L1 frequency is 300 m. The different cycles of the carrier and code signals result in

different resolutions in the distances that can be calculated from their respective measurements.

The accuracy, or resolution, difference between these two types of measurements can be compared

to the accuracy difference between making measurements with a ruler that has tick marks every

half-centimeter to one that has tick marks every half-meter. In this analogy, the carrier phase

measurements would be the ruler with tick marks every half-centimeter and the code phase

measurements would be the ruler with tick marks every half-meter.

In an idealized case, the carrier phase measurement (in units of cycles) from a receiver to a

satellite can be represented as


\phi(t) = \phi_{r}(t) - \phi^{s}(t - \tau) + N,    (3-6)

where \tau represents the transit time of the signal, N is the integer ambiguity, \phi_{r}(t) represents the phase of the receiver generated signal, and \phi^{s}(t - \tau) represents the phase of the carrier signal generated by the satellite at time (t - \tau) that is received by the receiver at time t. Simplifying Equation 3-6 by writing phase as the product of frequency and time gives

\phi(t) = f\tau + N = \frac{c\tau}{\lambda} + N = \frac{r(t, t - \tau)}{\lambda} + N,    (3-7)

where f and \lambda are the frequency and wavelength of the carrier signal, c is the speed of light in a vacuum, and r(t, t - \tau) is the geometric (or true) range from the receiver to the satellite (same notation as section 3.1.1).

Accounting for the error and bias terms inherent in the measurement equation and dropping the reference to a specific time gives Equation 3-7 to be

\phi = \frac{r + I_{\phi} + T_{\phi}}{\lambda} + \frac{c(\delta t_{r} - \delta t^{s})}{\lambda} + N + \epsilon_{\phi},    (3-8)

where I_{\phi} and T_{\phi} are the ionospheric and tropospheric delays in meters, \delta t_{r} and \delta t^{s} account for the clock biases and initial phase offsets of the receiver and satellite clocks respectively, and \epsilon_{\phi} accounts for all other modeling and measurement errors. Note that Equation 3-8 is in units of cycles.

3.1.3 Positioning with Carrier Phase: Absolute and Differential Position with Respect to an External Reference System

Sections 3.1.1 and 3.1.2 described how to use the GPS system to solve for absolute position

with respect to an external reference system. Absolute position is the determination of a single

receiver's location with respect to an external reference system. With the knowledge of the










absolute position of two receivers, the differential or relative position between those two receivers

can be found. However, absolute position is not necessary to determine relative, or differential,

position when using carrier phase measurements.

Only differential position is needed for the application in this thesis. This simplifies the

processing of the carrier phase measurements. The number of whole cycles between the receiver

and the satellite must be determined in order to calculate the receiver position when solving for

absolute position using carrier phase measurements. This is normally accomplished by relying

on geometric changes in the satellite constellation to rule out all the erroneous solutions to the

integer ambiguity until there is only one valid solution. However, when only the differential

position between two receivers is needed, as is the case for this application, the number of whole

cycles corresponding to what is referred to as the delta pseudo-range can be used [10].

The delta pseudo-range is determined by the change in carrier phase measurements over a

time interval.

\Delta\phi = \phi(t_{1}) - \phi(t_{0})    (3-9)

If the baseline between the two measurements is small,1 then the delay due to ionospheric and

tropospheric effects on both measurements will be very similar and will essentially cancel out.

Also, the satellite clock bias terms, \delta t^{s}, will cancel out between the two measurements. This

leaves only differences in the two receiver clock bias terms, the true range difference, and the

number of whole cycles between the two receivers as the only three unknowns left from Equation

3-8. That is,

\Delta\phi = \frac{\Delta r}{\lambda} + \frac{c(\delta t_{r2} - \delta t_{r1})}{\lambda} + N' + \epsilon_{\Delta\phi},    (3-10)

where \Delta r is the difference in the ranges to the satellite between the two receivers, \delta t_{r2} and \delta t_{r1} represent the clock bias terms for receivers r2 and r1 respectively, N' is the number of whole cycles between the two measurements, and \epsilon_{\Delta\phi} represents the measurement error. Note that N from Equation 3-8 is much greater than N' from Equation 3-10, as shown in Figure 3-1. Also, note that \epsilon_{\phi} will be greater than \epsilon_{\Delta\phi} when the baseline between the receivers is very short, so differential position estimates from carrier phase measurements will be even more accurate than absolute position estimates using carrier phase measurements.



1 Ranges less than 10 km can be classified as small with respect to ionospheric and tropospheric effects [10]. So this assumption is especially true for the application in this thesis since baselines will be on the order of tens of meters.













[Figure: satellite SA_i with line-of-sight paths to receivers r1 and r2; the whole-cycle counts N_i and N_i' are indicated along the two paths.]

Figure 3-1: Geometry for determination of differential position from carrier phase measurements.


3.1.4 Determination of Differential Position from Carrier Phase Measurements

All the previous discussions in this chapter dealt with a single satellite and one or two

receivers. However, in order to determine differential position from carrier phase measurements,

the two receivers must track at least four satellites to determine the three position unknowns

(x, y, z) in the external reference system and the clock bias term (\delta t_{r2} - \delta t_{r1}) for the two receivers.

Subscripts will now be used to denote satellite number in all respective symbols.

Referring to Figure 3-1: \hat{1}_{i} represents the unit line-of-sight (ULOS) vector in the external reference system that points in the direction from receiver r1 to the satellite denoted by SA_{i}, a represents the vector from receiver r1 to receiver r2, N_{i}' represents the number of whole cycles of the carrier signal transmitted by SA_{i} between the two receivers, and N_{i} represents the number of whole cycles between receiver r1 and SA_{i}. Using this notation,

\Delta r_{i} = \hat{1}_{i} \cdot a,    (3-11)

where \Delta r_{i} is the difference in the ranges to the satellite between the two receivers, and the ULOS vector, \hat{1}_{i}, is a known quantity because the receiver has knowledge of the i-th satellite location in the external coordinate system and it can sense the azimuth and elevation angles to that satellite via its signal. Let n = [N_{1}' N_{2}' ... N_{M}']^{T}, where M represents the number of satellites that are tracked by both r1 and r2, and \delta t = (\delta t_{r2} - \delta t_{r1}). This gives the relationship for determining the differential position between receivers r1 and r2 using carrier phase measurements from M











satellites as

\begin{bmatrix} \Delta\phi_{1} \\ \vdots \\ \Delta\phi_{M} \end{bmatrix}_{M \times 1} = \frac{1}{\lambda} \begin{bmatrix} \hat{1}_{1}^{T} & 1 \\ \vdots & \vdots \\ \hat{1}_{M}^{T} & 1 \end{bmatrix}_{M \times 4} \begin{bmatrix} a \\ c\,\delta t \end{bmatrix}_{4 \times 1} + \begin{bmatrix} N_{1}' \\ \vdots \\ N_{M}' \end{bmatrix}_{M \times 1}    (3-12)
Notice that there are, in general, M + 4 unknowns in Equation 3-12, so this problem cannot

be directly solved as written. Other knowledge of the situation must be used to account for the

integer ambiguity and solve for the position and clock bias terms.

3.2 Displacement Vector for One Vehicle from Carrier Phase: Integer Ambiguity
Not a Problem

The differential position between a single moving receiver at two different times is simply

the displacement vector of the receiver. If the receiver can continuously track a minimum of four

satellites for a given duration and count up the number of whole cycles occurring in that duration

for each satellite, then Equation 3-12 will have four equations and four unknowns. Therefore, it

can be used to solve unambiguously for the differential position and clock bias terms, [a^{T}\ c\delta t]^{T}.

If the baseline between the two measurements is on the order of tens of meters, then the

ionospheric and tropospheric delays will be practically identical and the accuracy of the differ-

ential position vector a will be given by the accuracy of the phase measurements. The carrier

phase can typically be measured with an accuracy of 0.01-0.05 cycles (2 mm to 1 cm) [10]. This

means that by continuously measuring the phase of a minimum of four satellites for a given du-

ration, differential position (i.e., the a vector in an external reference system) can be estimated

with millimeter-to-centimeter accuracy. The simple method which achieves this accuracy is quite

remarkable!
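A minimal numerical sketch of this single-vehicle case is given below (illustrative NumPy code with fabricated satellite geometry and noise levels; it is not from the thesis). With the whole-cycle counts already tracked by the receiver, Equation 3-12 reduces to four equations in the four unknowns [a^T c\delta t]^T.

    import numpy as np

    lam = 0.19                      # L1 carrier wavelength, ~19 cm
    c = 299_792_458.0

    # Hypothetical unit line-of-sight vectors to four tracked satellites.
    los = np.array([[ 0.3,  0.4, 0.866],
                    [-0.5,  0.2, 0.843],
                    [ 0.1, -0.7, 0.707],
                    [ 0.6,  0.1, 0.794]])
    los /= np.linalg.norm(los, axis=1, keepdims=True)

    # True displacement over the interval and receiver clock term (unknowns).
    a_true = np.array([12.0, -3.0, 0.5])          # meters
    cdt_true = 2.0                                # meters of clock term

    # Simulated delta carrier phase (cycles): Eq. 3-12 with the whole cycles
    # n already counted by the receiver, plus a small measurement error.
    n_counted = np.array([7.0, -3.0, 11.0, 2.0])
    noise = np.array([0.01, -0.02, 0.015, 0.005])
    dphi = (los @ a_true + cdt_true) / lam + n_counted + noise

    # Four equations, four unknowns: solve for [a; c*dt].
    H = np.hstack([los, np.ones((4, 1))])
    x = np.linalg.solve(H, lam * (dphi - n_counted))
    print(x[:3], x[3])     # recovered displacement (cm-level error) and clock term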

3.3 Displacement Vector for Two Vehicles from Carrier Phase: Integer Ambiguity is
a Problem

In order to use carrier phase measurements to calculate the displacement vector between

two different receivers, the integer ambiguity must be resolved. The time it takes to resolve the

integer ambiguity, or initialization time, is what needs to be reduced in order to use carrier phase

measurements for airborne vehicle navigation. There are several methods for resolving the integer

ambiguity in differential carrier phase positioning available in present technology. Three methods

are discussed in the following sections: Real time kinematic (RTK), Wide laning, and Kalman

filtering. However, once the integer ambiguity is resolved correctly, the differential position is still











at the millimeter-to-centimeter accuracy level, just as in the single vehicle case. The only difference is that

the two vehicle case requires an initialization time to resolve the integer ambiguity problem.

3.3.1 Real Time Kinematic (RTK) with Roving Reference Receiver

Differential GPS (DGPS) is a proven technique that takes advantage of the slowly changing

nature of the standard GPS errors. If two receivers are within a reasonable distance from each

other, differencing the correlated errors can reduce their effects and result in a more accurate

position estimate. A mode available from many receiver manufacturers is called Real Time

Kinematic (RTK). RTK mode combines DGPS and carrier phase measurements to produce

precise relative positioning. RTK mode uses reference and rover receivers, a communication

link, and software to precisely determine the rover receiver's position. The initialization time for

integer ambiguity resolution for baselines of several kilometers (for 2001 technology) is 30-60 s

[10]. Once the initialization time is complete, the two receivers must continue to track the same

set of satellites to have knowledge of the integer ambiguity. There is no reason why both the

reference and rover receivers cannot be moving.

3.3.2 Wide Laning Using Dual Frequency

It can be shown that the total number of integer values which satisfy the integer ambiguity

decreases as the carrier wavelength increases [10]. There are two related effects which cause this

to occur: 1) increased wavelength decreases the number of nodes in the search space of possible

solutions, and 2) increased wavelength increases the distance between adjacent nodes in the

search space. The GPS satellites are currently transmitting signals at two frequencies: L1 and L2 (1 cycle = 19 cm at L1 and 24 cm at L2). So, in order to improve integer estimation, the wide laning method combines both the L1 and L2 frequencies to create a new signal with a longer wavelength. The new signal, L12, results in a wavelength of 0.862 m [10]. This results in a more

accurate estimate of the integer ambiguities, but sacrifices accuracy in the estimation of receiver

position.

3.3.3 Kalman Filter

A Kalman filter approach can also be used to estimate the integer ambiguities. Both code

and carrier measurements are used in this method. The coarse code measurements are used to give a guess at the initial position. Then, the integer ambiguities are treated as floating point states in a Kalman filter. Whenever the integer estimates converge (i.e., the filter approaches steady state), they are rounded to the nearest integer to determine the solution.

















CHAPTER 4
PROBLEM DEFINITION AND SOLUTION

The usual solution to the reconstruction problem contains both motion and structure

components. However, for the purpose of vehicle attitude determination, the structure component

is not required. The motion solution component, specifically the translation vector between two

camera systems, a^{c1}, when combined with the GPS differential carrier phase positioning between the two systems, a^{LL}, gives all the information needed to know the inertial attitude of the vehicle up to a rotation about the translation axis (this limitation is explained further in the next section). Stating the problem in a more formal form: for n point correspondences between two camera systems, c1 and c2, and an inertial translational vector a^{LL}, find the inertial orthonormal rotation matrix R_{c1}^{LL}, or equivalently R_{c2}^{LL}. Two solution approaches are given based on section 6.1 of

Maybank's text [1]: an SVD based approach and an error minimization approach. The SVD based

approach is given a complete solution, while the error minimization approach is dealt with in

general terms.

4.1 Problem Notes and Assumptions

4.1.1 Rotation Ambiguity about Translational Axis

The inertial rotation around the translational axis between the two camera systems cannot

be found from the information contained in the problem statement. This can be illustrated by

looking at a trivial example: If camera one and camera two were both pointed due north and their

displacement was also due north, then using a 1-2-3 (or roll-pitch-yaw) Euler angle representation

of the orthonormal rotation matrix would mean that the roll is completely ambiguous. Likewise,

if the pointing and displacement vector were in any other direction comprised of more than one

vector component, then the roll-pitch-yaw angles would be somehow interrelated. In other words,

using only two vision systems in the formulation of this problem allows solution for only two

degrees of freedom. A third vision system would need to be added to allow solution for a third

degree of freedom.
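The ambiguity can be seen directly: any additional rotation about the displacement direction leaves the single measured vector pair unchanged. The sketch below (an illustrative NumPy check with assumed values, not from the thesis) constructs two different attitude matrices that map the same a^{c1} to the same a^{LL}.

    import numpy as np

    def axis_angle(axis, angle):
        # Rotation matrix for a rotation by `angle` about the unit vector `axis`
        # (Rodrigues' formula).
        axis = axis / np.linalg.norm(axis)
        K = np.array([[0, -axis[2], axis[1]],
                      [axis[2], 0, -axis[0]],
                      [-axis[1], axis[0], 0]])
        return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

    a_c1 = np.array([1.0, 0.2, -0.1])                     # displacement in the body frame
    R_c1_LL = axis_angle(np.array([0.0, 0.0, 1.0]), 0.4)  # some candidate attitude
    a_LL = R_c1_LL @ a_c1                                 # the GPS-derived vector it implies

    # Add an extra rotation about the translation axis itself (in the LL frame):
    extra = axis_angle(a_LL, 0.7)
    R_alt = extra @ R_c1_LL

    print(np.allclose(R_alt @ a_c1, a_LL))   # True: both attitudes explain the same data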

4.1.2 No GPS Attitude Determination Available

GPS attitude determination will not be available on the vehicle of interest. This assumption

is reasonable because the size of the vehicle being considered in this application does not support











a long enough baseline to provide accurate measurements. A baseline of 0.5-5 meters is needed to

provide useful results from GPS attitude determination [13].

A second approach to GPS attitude determination is to use successive positional measure-

ments and make the assumption that the vehicle attitude corresponds directly to the direction

of motion. This would be possible with a single GPS receiver and could use the accurate results

associated with the differential carrier phase measurements; however, as soon as the vehicle's flight

path contained any rotational velocity, or any non-zero side-slip or angle-of-attack, this approach

produces fundamentally erroneous results.

4.1.3 Specification of Control Points

Control points can be specified or tracked between images such that the projections of the

point correspondences onto the focal plane are known. How the control points are identified

is outside the scope of this thesis. This is a classic recognition or tracking problem in image

processing and all that is assumed in this thesis is that the control point values (i.e., c and r) can

be measured with known error statistics.

4.1.4 GPS Differential Carrier Phase Positioning Data Used as Truth

The errors associated with the GPS differential carrier phase positioning are considered

negligible compared to other system errors and these position estimates are therefore used as

truth data. This is reasonable since the baseline used in this problem is 10 meters and the errors

associated with the GPS measurements are on the order of centimeters, thus corresponding to

angular errors on the order of 5 mrad (e.g., tan^{-1}(0.05/10) \approx 5 mrad). See chapter 3 for detailed discussion of

this assumption.

4.2 Specific SfM Problem Setup to Find Vehicle Inertial Attitude

The same imaging system is used as given in Chapter 2, noting that the 3-D scene coordinate

system used here is commonly called the camera or sensor system in navigational terms. Figure

4-1 is a reproduction of the corresponding figure in Chapter 2, where the following equations

remain valid:

c = f \frac{x}{z},    (4-1)

r = f \frac{y}{z}    (4-2)



For the 2D-to-2D SfM solution, two different views or images of the same scene with n

correspondences are used to reconstruct the 3-D scene in either one of the camera 3-D coordinate
















[Figure: scene point (x, y, z), focal length f, and focal plane axes c and r.]

Figure 4-1: Imaging system notation used in finding vehicle's inertial attitude.


systems. Using the notation given earlier, the translation and rotation between the two camera

systems are a^{c1} and R_{c1}^{c2} respectively. The origins of the two camera systems are given by o1^{LL} and o2^{LL}, where LL stands for a local level coordinate system such as North-East-Down (NED) or East-North-Up (ENU). These values cannot be obtained from the imaging system, but are supplied via the GPS sensor and used as truth data.1 The lines p^{c1} and q^{c2} represent the 3-D scene coordinates of one correspondence expressed in the first and second camera system respectively. This is shown in Figure 4-2. Note that it is arbitrary which 3-D camera system is called c1 and which is called c2 when solving this problem.

4.3 Singular Value Decomposition (SVD) Solution

The SVD solution approach to the vehicle attitude determination from point correspondences

problem is adapted from section 6.1.1 of Maybank's text [1]. The reader is referred to this text if

there are any areas of this algorithm that are not presented to the level of detail that is desired.

Let the point correspondence pairs, \hat{p}_i^{c1} and \hat{q}_i^{c2} where i = 1...n such that n \geq 9, be described by the Euclidean framework notation utilizing the essential matrix E described earlier such that

E = R_{c1}^{c2} T_{a^{c1}},    (4-3)


1 see section 4.1.4 for details.












[Figure: the c1 and c2 focal planes, their origins o1^{LL} and o2^{LL}, the displacement a^{LL}, and the lines p^{c1} and q^{c2} to a common control point.]

Figure 4-2: Single 2D-to-2D point correspondence used in finding vehicle's inertial attitude.


where the translation and rotation between the camera systems are given by ac1 and RI

respectively. An error function V(E) is defined on the space of 3 x 3 matrices such that

V(E) ((qf Ep1)2 (4-4)
i=1

Find the 3 x 3 matrix E with unit Frobenius norm that minimizes V(E), and then find the

nearest essential matrix to E (the unit Frobenius norm constraint is imposed to exclude the

trivial solution E = 0).2 An outline of the SVD solution is given below, with details following

background sections on some of the mathematical details needed for the solution.

Form the n x 9 matrix A from the n point correspondence pairs.

Perform SVD on A via A = UyVT

Set e = last column of V

Reshape e into the estimate of the essential matrix E.

Perform SVD on E via E = U''V'T
--cl
Set a = last column of V'

Define aLL 02LL 01LL




2 Note that the wide hat accent, E, is used to denote an estimate of a variable. This is not to
be confused with the hat accent symbol used to denote a unit vector, p. With this nomenclature,
a indicates an estimate of the unit vector AC











Determine a second linearly independent vector that is expressed in both the cl and LL
-cl
coordinate systems ( b and bLL ).

Calculate a third linearly independent vector that is expressed in both the cl and LL
-cl -cl
coordinate systems by taking the cross product between (1) b and A and (2) bLL and

aLL.

Use the three vectors expressed in both the cl and LL coordinate systems to compute RL.

The following subsections give a brief background in the linear algebraic mathematical details

needed before describing the outlined steps given above.

4.3.1 Brief Overview of Linear Algebra Techniques used in SVD Solution

Singular value decomposition. Any m x n matrix A can be written as the product of an

m x n column-orthogonal matrix U, an n x n diagonal matrix \Sigma with positive or zero elements, and the transpose of an n x n orthogonal matrix V [14]

A = U \Sigma V^{T}    (4-5)

where

\Sigma = diag(\sigma_1, \sigma_2, ..., \sigma_{n-1}, \sigma_n) for \sigma_1 \geq \sigma_2 \geq ... \geq \sigma_{n-1} \geq \sigma_n \geq 0    (4-6)

and

U^{T} U = I    (4-7)

V^{T} V = I    (4-8)

Matrix norm calculations. There are two matrix norm calculations used in this thesis. As given in Maybank's text [1], the Euclidean norm, ||\cdot||, is subordinate to the Euclidean vector norm and is defined as

||A|| = \sup\{ ||Ax|| : ||x|| = 1 \},    (4-9)

where A is an m x n matrix. The Frobenius norm, ||\cdot||_F, is defined as

||A||_F^2 = \sum_{i=1, j=1}^{m, n} A_{ij}^2    (4-10)

Performing the SVD in each of these matrix norms results in [1]

||A|| = ||U \Sigma V^{T}|| = ||\Sigma|| = \sigma_1    (4-11)

||A||_F^2 = ||U \Sigma V^{T}||_F^2 = ||\Sigma||_F^2 = \sum_{i=1}^{k} \sigma_i^2    (4-12)
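A quick numerical confirmation of Equations 4-11 and 4-12 (an illustrative NumPy fragment added here, not from the thesis):

    import numpy as np

    A = np.random.default_rng(0).normal(size=(12, 9))
    sigma = np.linalg.svd(A, compute_uv=False)           # singular values, descending

    print(np.linalg.norm(A, 2), sigma[0])                 # Euclidean norm equals sigma_1
    print(np.linalg.norm(A, 'fro')**2, np.sum(sigma**2))  # Frobenius norm^2 equals sum of sigma_i^2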

4.3.2 Detailed Look at SVD Solution Algorithm

Form the n x 9 A matrix. Looking at a single point correspondence pair, \hat{p}^{c1} and \hat{q}^{c2}, and writing out the essential matrix formulation in its component form gives

\begin{bmatrix} \hat{q}_1 & \hat{q}_2 & \hat{q}_3 \end{bmatrix} \begin{bmatrix} E_{11} & E_{12} & E_{13} \\ E_{21} & E_{22} & E_{23} \\ E_{31} & E_{32} & E_{33} \end{bmatrix} \begin{bmatrix} \hat{p}_1 \\ \hat{p}_2 \\ \hat{p}_3 \end{bmatrix} = 0    (4-13)

Writing Equation 4-13 in scalar form gives:

\hat{q}_1\hat{p}_1 E_{11} + \hat{q}_1\hat{p}_2 E_{12} + \hat{q}_1\hat{p}_3 E_{13} + \hat{q}_2\hat{p}_1 E_{21} + \hat{q}_2\hat{p}_2 E_{22} + \hat{q}_2\hat{p}_3 E_{23} + \hat{q}_3\hat{p}_1 E_{31} + \hat{q}_3\hat{p}_2 E_{32} + \hat{q}_3\hat{p}_3 E_{33} = 0.    (4-14)

The \hat{p}_i^{c1} and \hat{q}_i^{c2} terms form a row of the n x 9 A matrix. The i-th row of A is then given by

A_i = \begin{bmatrix} \hat{q}_{i1}\hat{p}_{i1} & \hat{q}_{i1}\hat{p}_{i2} & \hat{q}_{i1}\hat{p}_{i3} & \hat{q}_{i2}\hat{p}_{i1} & \hat{q}_{i2}\hat{p}_{i2} & \hat{q}_{i2}\hat{p}_{i3} & \hat{q}_{i3}\hat{p}_{i1} & \hat{q}_{i3}\hat{p}_{i2} & \hat{q}_{i3}\hat{p}_{i3} \end{bmatrix}    (4-15)

Rewrite the E terms as a 9 x 1 column vector as

e \equiv \begin{bmatrix} E_{11} & E_{12} & E_{13} & E_{21} & E_{22} & E_{23} & E_{31} & E_{32} & E_{33} \end{bmatrix}^{T}    (4-16)

So for nine point correspondences the corresponding matrix equation is

\begin{bmatrix} A_1 \\ A_2 \\ \vdots \\ A_9 \end{bmatrix} e = 0    (4-17)

Equation 4-17 can be represented compactly for the set of n point correspondence pairs by

A_{n \times 9}\, e_{9 \times 1} = 0_{n \times 1}    (4-18)


Note that the ||e|| = 1 constraint is placed on e to exclude the trivial solution.

Perform SVD on A. From the above reformulation of the reconstruction problem, it follows that

\left( (\hat{q}_i^{c2})^{T} E \hat{p}_i^{c1} \right)^{2} = (A_i e)^{2}    (4-19)

and also that

V(E) = ||A e||^{2}    (4-20)

Then, performing the singular value decomposition on A gives

V(E) = ||U \Sigma V^{T} e||^{2} = ||\Sigma V^{T} e||^{2} \geq \sigma_9^2 ||e||^{2} = \sigma_9^2,    (4-21)

noting that the constraint ||e|| = 1 is invoked in the last step.

Set \hat{e} = last column of V. So, the 3 x 3 matrix that minimizes the error function described by Equation 4-4 can be found from the 9 x 1 column vector

\hat{e} = V f_9,    (4-22)

where f_9 is the nine dimensional vector defined as (0, 0, ..., 1)^{T}. In the absence of noise, \hat{e} = e. The consequence of this result is shown later to be that V(\widehat{E}) = V(E) = 0.











Reshape \hat{e} into the essential matrix \widehat{E}. \widehat{E} is found from \hat{e} via

\widehat{E} = \begin{bmatrix} \hat{e}_1 & \hat{e}_2 & \hat{e}_3 \\ \hat{e}_4 & \hat{e}_5 & \hat{e}_6 \\ \hat{e}_7 & \hat{e}_8 & \hat{e}_9 \end{bmatrix},    (4-23)

where V(\widehat{E}) = \sigma_9^2, which was shown before to be the minimum value. If \sigma_9 = 0, then V(\widehat{E}) = V(E) = 0 and therefore \widehat{E} = E.

Perform SVD on \widehat{E}. At this point in the algorithm, Maybank [1] discusses another minimization function using the least eigenvalue to derive rotation and translation terms from the essential matrix. The problem described in this thesis requires only the translational term, a^{c1}, to form a solution for the vehicle attitude. That is,

E a^{c1} = R_{c1}^{c2} T_{a^{c1}} a^{c1} = R_{c1}^{c2} (a^{c1} \times a^{c1}) = R_{c1}^{c2}\, 0 = 0.    (4-24)

Therefore, a^{c1} is in the null space of E. This results in the need to perform the singular value decomposition on \widehat{E} such that

\widehat{E} = U' \Sigma' V'^{T}.    (4-25)

Set \widehat{\hat{a}}^{c1} = last column of V'. Using the same logic as finding \hat{e},

\widehat{\hat{a}}^{c1} = V' f_3,    (4-26)

where f_3 is defined as the three-dimensional vector (0, 0, 1)^{T}. Note that in the absence of noise, \widehat{\hat{a}}^{c1} = \hat{a}^{c1}, as would be expected.

This concludes the SfM contribution to this algorithm. The remaining steps are a conse-

quence of using the available information from the GPS sensor and matrix manipulation via linear

algebra.

Define a^{LL} = o2^{LL} - o1^{LL}. Use GPS differential carrier phase positioning to mark the origin of each camera system, c1 and c2. The translation displacement vector can then be obtained via

a^{LL} = o2^{LL} - o1^{LL}.    (4-27)

Determine a second linearly independent vector that is expressed in both the c1 and LL coordinate systems (\hat{b}^{c1} and \hat{b}^{LL}). There are at least two ways to find a second linearly independent vector that is expressed in both the c1 and LL coordinate systems. The first way is to repeat steps (1)-(7) above for a second camera system pair c1 and c3 to find \hat{b}^{c1} and \hat{b}^{LL}. The second possibility is to use the down vector obtained from the onboard inertial navigation system (INS) as the second vector expressed in both coordinate systems. This would take advantage of the down vector in LL being (0, 0, 1)_{NED} or (0, 0, -1)_{ENU}. Whichever of the two methods is the more convenient choice should be used.

Derive a third linearly independent vector that is expressed in both the c1 and LL coordinate systems by taking the cross product between (1) \hat{b}^{c1} and \hat{a}^{c1} and (2) \hat{b}^{LL} and \hat{a}^{LL}. Consider the two vectors used in the cross product as inputs and the resulting vector as output. If the two input vectors are linearly independent, then from the definition of a cross product, the output vector is linearly independent of both input vectors.

This result can be exploited to reduce the number of calculated vectors expressed in both the c1 and LL coordinate systems from three to two. The third linearly independent vector can be derived from the first two vectors via

\hat{c}^{c1} = \hat{b}^{c1} \times \hat{a}^{c1}    (4-28)

\hat{c}^{LL} = \hat{b}^{LL} \times \hat{a}^{LL}.    (4-29)


Use the three vectors expressed in both the c1 and LL coordinate systems to compute R_{c1}^{LL}. Let each column in the 3 x 3 matrix C be defined as a linearly independent vector expressed in the c1 coordinate system. From the results obtained earlier, C is comprised as

C = \begin{bmatrix} \hat{c}_x^{c1} & \hat{b}_x^{c1} & \hat{a}_x^{c1} \\ \hat{c}_y^{c1} & \hat{b}_y^{c1} & \hat{a}_y^{c1} \\ \hat{c}_z^{c1} & \hat{b}_z^{c1} & \hat{a}_z^{c1} \end{bmatrix}    (4-30)

And similarly, let each column of the 3 x 3 matrix L be defined as the corresponding vectors used to populate C, expressed in the LL coordinate system.

L = \begin{bmatrix} \hat{c}_x^{LL} & \hat{b}_x^{LL} & \hat{a}_x^{LL} \\ \hat{c}_y^{LL} & \hat{b}_y^{LL} & \hat{a}_y^{LL} \\ \hat{c}_z^{LL} & \hat{b}_z^{LL} & \hat{a}_z^{LL} \end{bmatrix}    (4-31)

Since each column of L is the same physical vector as the corresponding column of C expressed in the LL frame, L = R_{c1}^{LL} C, and the rotation matrix R_{c1}^{LL} can be determined via

R_{c1}^{LL} = L C^{-1}.    (4-32)

This solution for R_{c1}^{LL} will be unique if the three vectors comprising C and L are linearly independent. Stated another way, if both L and C are of full rank, then R_{c1}^{LL} will be a unique solution.
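A compact end-to-end sketch of the outlined algorithm is given below (illustrative NumPy code under simplifying assumptions: noise-free synthetic correspondences, the known down direction used as the second common vector, and hypothetical geometry; it is not the thesis implementation). It follows steps 1-10 and Equation 4-32.

    import numpy as np

    rng = np.random.default_rng(1)

    # --- Synthetic truth (assumed values): camera displacement and inertial attitude.
    psi = np.radians(12.0)
    R_c1_c2 = np.array([[ np.cos(psi), np.sin(psi), 0.0],
                        [-np.sin(psi), np.cos(psi), 0.0],
                        [ 0.0,         0.0,         1.0]])
    a_c1 = np.array([8.0, 3.0, 1.0])                 # translation in camera-1 axes
    th = np.radians(35.0)
    R_c1_LL = np.array([[np.cos(th), -np.sin(th), 0.0],   # true attitude (unknown to the algorithm)
                        [np.sin(th),  np.cos(th), 0.0],
                        [0.0,         0.0,        1.0]])
    a_LL = R_c1_LL @ a_c1                            # what differential carrier phase GPS would report

    # --- Point correspondences (unit LOS vectors in each camera frame), n >= 9.
    pts = rng.uniform([-20, -20, 60], [20, 20, 120], size=(12, 3))
    p_hat = pts / np.linalg.norm(pts, axis=1, keepdims=True)
    q = (pts - a_c1) @ R_c1_c2.T
    q_hat = q / np.linalg.norm(q, axis=1, keepdims=True)

    # --- Steps 1-3: build the n x 9 matrix A and take e = last column of V.
    A = np.array([np.outer(qi, pi).ravel() for qi, pi in zip(q_hat, p_hat)])
    _, _, Vt = np.linalg.svd(A)
    e = Vt[-1]

    # --- Steps 4-6: reshape into E_hat; a_hat^c1 spans the null space of E_hat.
    E_hat = e.reshape(3, 3)
    _, _, Vt2 = np.linalg.svd(E_hat)
    a_hat_c1 = Vt2[-1]
    if np.dot(a_hat_c1, a_c1) < 0:      # the SVD fixes a^c1 only up to sign; for this
        a_hat_c1 = -a_hat_c1            # demo the true vector resolves it (cf. Sec. 2.1.4)

    # --- Steps 7-10: pair with the GPS vector and a second common vector (here the
    # down direction, assumed known in both frames), then apply Eq. 4-32.
    a_hat_LL = a_LL / np.linalg.norm(a_LL)
    b_LL = np.array([0.0, 0.0, 1.0])
    b_c1 = R_c1_LL.T @ b_LL                          # down in the body frame (e.g., from the IMU)
    C = np.column_stack([np.cross(b_c1, a_hat_c1), b_c1, a_hat_c1])
    L = np.column_stack([np.cross(b_LL, a_hat_LL), b_LL, a_hat_LL])
    R_est = L @ np.linalg.inv(C)

    print(np.max(np.abs(R_est - R_c1_LL)))           # ~0 for noise-free correspondences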

4.4 Error Minimization Solution Approach

Another approach for finding the inertial attitude of a vehicle utilizing Euclidean SfM

techniques is to find the minimum of an error equation using a descent or adaptive algorithm.

The descent algorithm is an iterative approach to a solution. It searches the neighborhood around

an initial point to determine a new point where the value of the error function is less than the

initial point. The error function is approximated by a Taylor series expansion to simplify the

search algorithm. This process repeats until a minimum of the error function is found.

Three solution approaches based on descent algorithms are presented. The first two ap-

proaches use the coplanarity constraint to obtain an error equation that can be minimized. The

third approach minimizes the system error vector from the linearized measurement model derived

in the next chapter. All three solutions are discussed in general terms. The two approaches utiliz-

ing the coplanarity constraint were originally given by Horn [15] and also discussed by Maybank

[1]. Development of the third approach which minimizes the state error vector is left as a future

area of study. All three approaches are discussed in a more general manner than the SVD solution

given in the previous section.

4.4.1 Brief Overview of Rotations using Quaternions

The descent solution algorithms given in the section 4.4.2 make the subtle change of using

quaternions to represent the rotation from the inertial to body coordinate system. This section

gives a brief overview of the use of quaternions to represent rotations. The following information

is derived from lecture notes given by E. Sutton at the University of Florida Graduate Engineer-

ing and Research Center in Shalimar, FL. Further information on quaternions can be found in

Fraleigh [16] and Zipfel [17].

Quaternions. Quaternions can be thought of as a four element vector or as a sum:

q = \begin{bmatrix} q_1 \\ q_2 \\ q_3 \\ q_4 \end{bmatrix} = q_1 i + q_2 j + q_3 k + q_4    (4-33)

where i, j, and k obey the following relationships: ij = k, jk = i, ki = j, ji = -k, kj = -i, and ik = -j. Quaternion multiplication (or composition) is defined as follows:

a \cdot b = (a_1 i + a_2 j + a_3 k + a_4)(b_1 i + b_2 j + b_3 k + b_4)    (4-34)

Expanding the terms on the right side of Equation 4-34 and grouping by the i, j, k, and scalar parts gives

a \cdot b = (a_1 b_4 + a_2 b_3 - a_3 b_2 + a_4 b_1) i + (-a_1 b_3 + a_2 b_4 + a_3 b_1 + a_4 b_2) j + (a_1 b_2 - a_2 b_1 + a_3 b_4 + a_4 b_3) k + (-a_1 b_1 - a_2 b_2 - a_3 b_3 + a_4 b_4)    (4-35)


Quaternion conjugation is defined by

q^{*} = \begin{bmatrix} -q_1 \\ -q_2 \\ -q_3 \\ q_4 \end{bmatrix} = -q_1 i - q_2 j - q_3 k + q_4    (4-36)

The multiplicative inverse is given by

q^{-1} = \frac{q^{*}}{||q||^2},    (4-37)

where

||q|| = \sqrt{q_1^2 + q_2^2 + q_3^2 + q_4^2}    (4-38)

Quaternion multiplication can also be performed using two equivalent matrix-vector forms:

a \cdot b = \begin{bmatrix} a_4 & -a_3 & a_2 & a_1 \\ a_3 & a_4 & -a_1 & a_2 \\ -a_2 & a_1 & a_4 & a_3 \\ -a_1 & -a_2 & -a_3 & a_4 \end{bmatrix} \begin{bmatrix} b_1 \\ b_2 \\ b_3 \\ b_4 \end{bmatrix} = \begin{bmatrix} b_4 & b_3 & -b_2 & b_1 \\ -b_3 & b_4 & b_1 & b_2 \\ b_2 & -b_1 & b_4 & b_3 \\ -b_1 & -b_2 & -b_3 & b_4 \end{bmatrix} \begin{bmatrix} a_1 \\ a_2 \\ a_3 \\ a_4 \end{bmatrix}    (4-39)

Quaternions obey the following properties:

||q||^2 = q^{*} \cdot q = q \cdot q^{*}    (4-40)

(p \cdot q)^{*} = q^{*} \cdot p^{*}    (4-41)

||p \cdot q|| = ||p||\,||q||    (4-42)

(p \cdot q)^{-1} = q^{-1} \cdot p^{-1}    (4-43)


Quaternions representing rotations. Quaternions can be used to represent rotations:

$$q = \begin{bmatrix} e_x \sin\frac{\beta}{2} \\ e_y \sin\frac{\beta}{2} \\ e_z \sin\frac{\beta}{2} \\ \cos\frac{\beta}{2} \end{bmatrix} = \begin{bmatrix} q_1 \\ q_2 \\ q_3 \\ q_4 \end{bmatrix} \qquad (4\text{-}44)$$

where $e = (e_x, e_y, e_z)$ is the axis of rotation and $\beta$ is the angle of rotation. A quaternion that represents a rotation must have unit length ($\|q\| = 1$). Also, note that $-q$ represents the same rotation as $q$.

A rotation from coordinate system x to coordinate system y is accomplished as follows:

$$r^y = q^y_x \cdot r^x \cdot (q^y_x)^* \qquad (4\text{-}45)$$

Expanding into matrix notation gives

$$\begin{bmatrix} r^y \\ 0 \end{bmatrix} = \begin{bmatrix} q_4 & -q_3 & q_2 & q_1 \\ q_3 & q_4 & -q_1 & q_2 \\ -q_2 & q_1 & q_4 & q_3 \\ -q_1 & -q_2 & -q_3 & q_4 \end{bmatrix} \begin{bmatrix} q_4 & -q_3 & q_2 & -q_1 \\ q_3 & q_4 & -q_1 & -q_2 \\ -q_2 & q_1 & q_4 & -q_3 \\ q_1 & q_2 & q_3 & q_4 \end{bmatrix} \begin{bmatrix} r^x_1 \\ r^x_2 \\ r^x_3 \\ 0 \end{bmatrix} \qquad (4\text{-}46)$$

$$\begin{bmatrix} r^y \\ 0 \end{bmatrix} = \begin{bmatrix} q_1^2 - q_2^2 - q_3^2 + q_4^2 & 2(q_1 q_2 - q_3 q_4) & 2(q_1 q_3 + q_2 q_4) & 0 \\ 2(q_1 q_2 + q_3 q_4) & -q_1^2 + q_2^2 - q_3^2 + q_4^2 & 2(-q_1 q_4 + q_2 q_3) & 0 \\ 2(q_1 q_3 - q_2 q_4) & 2(q_1 q_4 + q_2 q_3) & -q_1^2 - q_2^2 + q_3^2 + q_4^2 & 0 \\ 0 & 0 & 0 & q_1^2 + q_2^2 + q_3^2 + q_4^2 \end{bmatrix} \begin{bmatrix} r^x_1 \\ r^x_2 \\ r^x_3 \\ 0 \end{bmatrix} \qquad (4\text{-}47)$$










noting that any three-dimensional vector can be rotated using quaternions by adding a zero for the scalar component. The rotation matrix $C$ equivalent to $q^y_x$ is the upper left 3 × 3 block from Equation 4-47:

$$C = \begin{bmatrix} q_1^2 - q_2^2 - q_3^2 + q_4^2 & 2(q_1 q_2 - q_3 q_4) & 2(q_1 q_3 + q_2 q_4) \\ 2(q_1 q_2 + q_3 q_4) & -q_1^2 + q_2^2 - q_3^2 + q_4^2 & 2(-q_1 q_4 + q_2 q_3) \\ 2(q_1 q_3 - q_2 q_4) & 2(q_1 q_4 + q_2 q_3) & -q_1^2 - q_2^2 + q_3^2 + q_4^2 \end{bmatrix} \qquad (4\text{-}48)$$

If $q^y_x$ represents a rotation from x to y, and $q^z_y$ represents a rotation from y to z, then the rotation from x to z is given by

$$q^z_x = q^z_y \cdot q^y_x. \qquad (4\text{-}49)$$
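The following is a minimal MATLAB sketch of the scalar-last quaternion conventions above; the axis, angle, and test vector are illustrative. It rotates a vector both through Equation 4-45 (using the left/right multiplication matrices of Equation 4-39) and through the equivalent rotation matrix of Equation 4-48, which agree up to round-off.

    e    = [0; 0; 1];  beta = pi/3;            % assumed axis and angle
    q    = [e*sin(beta/2); cos(beta/2)];       % Equation 4-44 (unit quaternion)
    rx   = [1; 0; 0];                          % vector to rotate

    % Rotation matrix equivalent to q (Equation 4-48)
    C = [ q(1)^2-q(2)^2-q(3)^2+q(4)^2,  2*(q(1)*q(2)-q(3)*q(4)),      2*(q(1)*q(3)+q(2)*q(4));
          2*(q(1)*q(2)+q(3)*q(4)),     -q(1)^2+q(2)^2-q(3)^2+q(4)^2,  2*(-q(1)*q(4)+q(2)*q(3));
          2*(q(1)*q(3)-q(2)*q(4)),      2*(q(1)*q(4)+q(2)*q(3)),     -q(1)^2-q(2)^2+q(3)^2+q(4)^2 ];

    % Quaternion rotation r^y = q * [r^x; 0] * q^* (Equations 4-45 and 4-46)
    Lq  = [ q(4) -q(3)  q(2)  q(1);
            q(3)  q(4) -q(1)  q(2);
           -q(2)  q(1)  q(4)  q(3);
           -q(1) -q(2) -q(3)  q(4) ];          % left multiplication by q
    qc  = [-q(1); -q(2); -q(3); q(4)];         % q^* (Equation 4-36)
    Rqc = [ qc(4)  qc(3) -qc(2)  qc(1);
           -qc(3)  qc(4)  qc(1)  qc(2);
            qc(2) -qc(1)  qc(4)  qc(3);
           -qc(1) -qc(2) -qc(3)  qc(4) ];      % right multiplication by q^*
    ry  = Rqc * (Lq * [rx; 0]);                % [r^y; 0]
    % ry(1:3) matches C*rx; ry(4) is zero.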

4.4.2 Descent Algorithms using Coplanarity Constraint

Overview. Section 6.1.2 of Maybank's text [1] presents first- and second-order descent algorithms which could be used to calculate an estimate of the displacement vector in the body coordinate system, $a^{c1}$. GPS differential carrier-phase positioning would then give a corresponding displacement vector in a local level coordinate system, $a^{LL}$. Another vector expressed in both coordinate systems would then need to be found. Once two vectors expressed in both coordinate systems are found, the solution for the vehicle attitude can be found in the same manner outlined in the SVD solution.

Calculation of displacement vector in body coordinate system. Both descent algorithms given in Maybank's text [1] use the error function $V(E)$ described by Equation 4-4 with two subtle changes: (1) expanding the essential matrix $E$ into its implicit components of the camera rotation and displacement $\{R^{c1}_{c2},\ \hat{a}^{c1}\}$, and (2) using a unit quaternion $z$ to represent the rotation between the camera systems instead of the orthonormal rotation matrix $R^{c1}_{c2}$. Making the above two changes to Equation 4-4 results in [1]

$$V(E) = V(R, a) = V(z, \hat{a}^{c1}) = \sum_i \left[ \left( z \cdot \hat{q}_2 \cdot z^* \right) \cdot \left( \hat{a}^{c1} \times \hat{p}_1 \right) \right]^2 \qquad (4\text{-}50)$$

where Equation 4-50 utilizes the coplanarity constraint. Note that the hat symbol on $\hat{a}^{c1}$, $\hat{q}_2$, and $\hat{p}_1$ in Equation 4-50 denotes unit vectors.










The descent algorithm searches the neighborhood around $(z, \hat{a}^{c1})$ and determines the small perturbations $\Delta z$ and $\Delta a^{c1}$ which produce the greatest decrease in $V$ subject to the constraints

$$\|z + \Delta z\| = 1 \qquad (4\text{-}51)$$

$$\|\hat{a}^{c1} + \Delta a^{c1}\| = 1 \qquad (4\text{-}52)$$

The Taylor series expansion of Equation 4-50 about the point $(z, \hat{a}^{c1})$ is

$$V(z + \Delta z,\ \hat{a}^{c1} + \Delta a^{c1}) = V(z, \hat{a}^{c1}) + \mathbf{l} \cdot \Delta z + \mathbf{m} \cdot \Delta a^{c1} + (\Delta z)^T L\, \Delta z + 2(\Delta z)^T M\, \Delta a^{c1} + (\Delta a^{c1})^T N\, \Delta a^{c1} + \text{H.O.T.} \qquad (4\text{-}53)$$

where $\mathbf{l}$ and $\mathbf{m}$ are vectors and $L$, $M$, and $N$ are matrices which are functions of $z$, $\hat{a}^{c1}$, $\hat{q}_2$, and $\hat{p}_1$. Equation 4-53 is used to calculate $V$ in Maybank's descent algorithms.³ [1]

First order descent. The first-order descent algorithm uses the first three terms of Equation 4-53 to calculate $\Delta z$ and $\Delta a$. The values of $\Delta z$ and $\Delta a$ are sought which make $\mathbf{l} \cdot \Delta z + \mathbf{m} \cdot \Delta a$ the most negative, subject to the constraints given by Equations 4-51 and 4-52. Also, a step size $h$ needs to be defined to constrain $\|\Delta z\|$ and $\|\Delta a\|$ to a small fixed value:

$$\|\Delta z\| = \|\Delta a\| = h \qquad (4\text{-}54)$$

This constraint is imposed to ensure that the first-order approximation is valid in the region around $V(z, a)$ that is searched. [1]

The two delta terms, $\Delta z$ and $\Delta a$, are found independently of each other. If the starting point of the descent algorithm is given as $(z_1, a_1)$, then the value of $\Delta z$ is found via the error function

$$W(\lambda_1, \lambda_2, \Delta z) = \mathbf{l} \cdot \Delta z + \lambda_1\left(\|\Delta z\|^2 - h^2\right) + \lambda_2\left(\|z_1 + \Delta z\|^2 - 1\right) \qquad (4\text{-}55)$$

where $\lambda_1$ and $\lambda_2$ are Lagrange multipliers used to enforce the constraints listed in Equations 4-51 and 4-54. Differentiating Equation 4-55 with respect to $\lambda_1$, $\lambda_2$, and $\Delta z$, and setting the respective results to zero gives three equations in three unknowns. These three equations can be used to solve for the value of $\Delta z$ which produces the greatest decrease in $V$. A similar method can be used to find the value of $\Delta a$ which produces the greatest decrease in the value of $V$. [1]



³ For ease of readability, the unit vector hat symbol, superscripts, and subscripts will be dropped from the references to $z$, $\Delta z$, $a^{c1}$, and $\Delta a^{c1}$ for the remainder of the discussion of Maybank's descent algorithms.










The updated values $(z_2, a_2)$ are defined as

$$(z_2, a_2) = (z_1 + \Delta z,\ a_1 + \Delta a) \qquad (4\text{-}56)$$

If the value of $V(z_2, a_2)$ is significantly smaller than the value of $V(z_1, a_1)$, then the descent algorithm is applied to $(z_2, a_2)$ in place of $(z_1, a_1)$. If $V(z_2, a_2)$ is not smaller than $V(z_1, a_1)$ by a defined amount, then the algorithm is repeated on $(z_1, a_1)$ with a smaller step size $h$ to determine $\Delta z$ and $\Delta a$. If the step size reaches its smallest allowable value, then the algorithm terminates and returns the current value of $(z, a)$ as an estimate of the global minimum of the error function $V$.
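A minimal MATLAB sketch of a single first-order iteration is given below. For brevity it replaces the explicit Lagrange-multiplier solution with a tangent-plane projection of a numerical gradient followed by renormalization, and it uses a stand-in error function V with the same call signature as Equation 4-50; all numeric values are illustrative.

    % Stand-in error function of a 4-vector z and a 3-vector a (assumed placeholder)
    V = @(z, a) sum((z - [0;0;0;1]).^2) + sum((a - [1;0;0]).^2);

    z = [0.1; -0.2; 0.3; 0.9];  z = z/norm(z);   % current unit quaternion estimate
    a = [0.7; 0.5; 0.2];        a = a/norm(a);   % current unit displacement estimate
    h = 1e-2;                                    % step size (Equation 4-54)

    % Numerical gradients l = dV/dz and m = dV/da at (z, a)
    ep = 1e-6;  l = zeros(4,1);  m = zeros(3,1);
    for k = 1:4
        dz = zeros(4,1); dz(k) = ep;
        l(k) = (V(z+dz, a) - V(z-dz, a)) / (2*ep);
    end
    for k = 1:3
        da = zeros(3,1); da(k) = ep;
        m(k) = (V(z, a+da) - V(z, a-da)) / (2*ep);
    end

    % Steepest-descent step of length h restricted to the tangent planes
    lt = l - (l'*z)*z;   dz_step = -h * lt / norm(lt);
    mt = m - (m'*a)*a;   da_step = -h * mt / norm(mt);

    z_new = (z + dz_step) / norm(z + dz_step);   % re-impose ||z|| = 1 (Eq. 4-51)
    a_new = (a + da_step) / norm(a + da_step);   % re-impose ||a|| = 1 (Eq. 4-52)
    % Accept (z_new, a_new) if V decreases; otherwise shrink h and retry.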

Second order descent. The second-order descent algorithm can be applied if the first-order descent algorithm does not produce a reasonable value for the global minimum of $V$. The second-order algorithm uses all terms of Equation 4-53 to calculate $\Delta z$ and $\Delta a$, with the additional constraints

$$z_1 \cdot \Delta z = 0 \qquad (4\text{-}57)$$

$$a_1 \cdot \Delta a = 0 \qquad (4\text{-}58)$$

The error function $U$ is defined by

$$U(\lambda_1, \lambda_2, \Delta z, \Delta a) = \mathbf{l} \cdot \Delta z + \mathbf{m} \cdot \Delta a + (\Delta z)^T L\, \Delta z + 2(\Delta z)^T M\, \Delta a + (\Delta a)^T N\, \Delta a + \lambda_1\, z_1 \cdot \Delta z + \lambda_2\, a_1 \cdot \Delta a \qquad (4\text{-}59)$$

where the Lagrange multipliers $\lambda_1$ and $\lambda_2$ are used to enforce the constraints given in Equations 4-57 and 4-58. Differentiating Equation 4-59 with respect to $\Delta z$ and $\Delta a$ and setting the results to zero gives two matrix equations with four unknowns. The two constraint equations, $z_1 \cdot \Delta z = 0$ and $a_1 \cdot \Delta a = 0$, can then be applied to solve for $\lambda_1$ and $\lambda_2$. Substituting the values of $\lambda_1$ and $\lambda_2$ back into the two matrix equations obtained by setting the partial derivatives of Equation 4-59 to zero then allows the two equations to be solved for $\Delta z$ and $\Delta a$. [1]

The updated values $(z_2, a_2)$ are defined as

$$(z_2, a_2) = (z_1 + \Delta z,\ a_1 + \Delta a) \qquad (4\text{-}60)$$

If the value of $V(z_2, a_2)$ is significantly smaller than the value of $V(z_1, a_1)$, then the descent algorithm is applied to $(z_2, a_2)$ in place of $(z_1, a_1)$. If $V(z_2, a_2)$ is not smaller than $V(z_1, a_1)$ by a defined amount, then the algorithm terminates and returns the current value of $(z, a)$ as an estimate of the global minimum of the error function $V$.⁴

4.4.3 Descent Algorithm Using Linearized Measurement Model

The descent, or error minimization, approach is less efficient than the SVD solution, but the efficiency of the SVD approach can be lost if the incorporation of statistics into the algorithm is desired. The descent algorithm allows the incorporation of statistics, with the cost being a more complicated error equation. A linearized model of the system which incorporates focal plane measurement error, Equation 5-33, is derived in the next chapter. The state error vector, $v_i$, of independent variables is a function of the point correspondence measurements $(q_2, p_1)$ and the measurement errors $(c_i, r_i)$. An error equation based on minimizing the state error vector can be written in the form

$$J(v) = \sum_{i=1}^{n} v_i^T\, W\, v_i \qquad (4\text{-}61)$$

where $W$ is added as a weight matrix because the state error vector $v$ is a combination of angular and range errors. Equation 4-61 can be used to form a descent solution algorithm similar to the ones described in section 6.1.2 of Maybank's text [1]. Development and implementation of this solution technique is left as a future area of study.































4 Further details of the first and second order descent solution algorithms are given in section
6.1.2 of Maybank's text [1].

















CHAPTER 5
ERROR ANALYSIS

The accuracy of the inertial attitude estimates is the litmus test for the viability of this

application. This chapter starts with a presentation of the error model scenario. All assumptions

and information considered as constants are outlined. Next, the derivation of a generalized

system model is presented. Then, an algorithm simulating the generalized model is described and

verification results are presented. Finally, the results of parameter studies utilizing this algorithm

are presented for (1) camera focal length/field of view (FOV), (2) camera depression angle, (3)

displacement vector magnitude between imaging systems, (4) number of control points, (5) focal

plane location of control points, and (6) control point measurement error.

5.1 Generalized Linearized Error Model

To quantify the accuracy of this model, consider an aerial vehicle, or agent, that contains an imaging system, a GPS sensor, and an INS. The imaging system consists of a camera with a given focal length, physical size, pixel count, and depression angle. Suppose the agent is flying horizontally so that the translation vector $a^{LL}$ does not have any component in the up, or $z^{LL}$, direction.¹ While moving horizontally, the agent takes two images such that they have an overlapping FOV. A minimum of five image correspondences must be present in the two images.

An example scenario is shown in Figure 5-1.

The measurement model equation is given by

$$q^{c2} = R^{c2}_{c1}\, p^{c1} - R^{c2}_{LL}\, a^{LL}, \qquad (5\text{-}1)$$

where $p^{c1}$ is the vector from the first camera position to the control point, $q^{c2}$ is the vector from the second camera position to the control point, and $a^{LL}$ is the vector from the first camera position to the second camera position in the LL coordinate system. As a reminder, the notation $q^x$ denotes a vector $q$ expressed in coordinate system $x$, and $R^y_x$ is the rotation matrix from


¹ The horizontal constraint is imposed to cancel out the translational axis ambiguity. The c, or center, coordinate system will be introduced later to explain the reasoning for this seemingly arbitrary constraint.













[Figure: the two camera positions with overlapping c1 and c2 fields of view, the camera focal planes, and the horizontal displacement vector $a^{LL} = [a_x\ a_y\ 0]^T$.]

Figure 5-1: Error analysis scenario.


coordinate system x to coordinate system y. The coordinate systems used in this model are

described in Table 5-1.

Table 5-1: Coordinate System Definitions

System   Definition

c1       Camera coordinates for the camera in the first position; the x-y plane is the
         focal plane of the camera, and the z axis is along the camera boresight.

c2       Camera coordinates for the camera in the second position; same convention as above.

c        Center coordinate system whose $x^c$ axis points along vector $a^{LL}$ and whose
         $z^c$ axis is as close to vertical as possible (see appendix for discussion of the
         c coordinate system).

LL       Local level coordinates; north-east-down or east-north-up.




We will assume that the direction of $p^{c1}$ is a perfect measurement and all errors in direction are associated with the second measurement $q^{c2}$. However, the range to the control point is unknown and must be calculated from the measurements, so let

$$p^{c1} = \hat{p}^{c1}(1 + \delta p), \qquad (5\text{-}2)$$

where $\hat{p}^{c1}$ is an estimate of $p^{c1}$, and $\delta p$ is the proportion of error in the length of $p^{c1}$.











In order to overcome the translation axis ambiguity, the horizontal constraint on the displacement vector's direction is used to define the coordinate system c such that the $x^c$ axis points along vector $a^{LL}$, and the $z^c$ axis is as close to vertical as possible. As discussed in the appendix, the matrix $R^{c}_{LL}$ is given by

$$R^{c}_{LL} = \begin{bmatrix} \dfrac{a^{LL}_x}{a} & \dfrac{a^{LL}_y}{a} & \dfrac{a^{LL}_z}{a} \\[2ex] \dfrac{-a^{LL}_y}{\sqrt{(a^{LL}_x)^2 + (a^{LL}_y)^2}} & \dfrac{a^{LL}_x}{\sqrt{(a^{LL}_x)^2 + (a^{LL}_y)^2}} & 0 \\[2ex] \dfrac{-a^{LL}_x a^{LL}_z}{a\sqrt{(a^{LL}_x)^2 + (a^{LL}_y)^2}} & \dfrac{-a^{LL}_y a^{LL}_z}{a\sqrt{(a^{LL}_x)^2 + (a^{LL}_y)^2}} & \dfrac{(a^{LL}_x)^2 + (a^{LL}_y)^2}{a\sqrt{(a^{LL}_x)^2 + (a^{LL}_y)^2}} \end{bmatrix} \qquad (5\text{-}3)$$

Note that when the rotation $R^{c}_{LL}$ is applied to $a^{LL}$, the following result is obtained:

$$R^{c}_{LL}\, a^{LL} = \begin{bmatrix} a \\ 0 \\ 0 \end{bmatrix} \qquad (5\text{-}4)$$

where $a = \|a^{LL}\|$.
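The following is a minimal MATLAB sketch of this construction, assuming an LL frame whose third axis is vertical; the displacement vector is illustrative.

    a_LL = [3; 4; 0];                      % assumed displacement in LL (horizontal here)
    a    = norm(a_LL);

    xc = a_LL / a;                         % x^c along the displacement vector
    zv = [0; 0; 1];                        % LL vertical axis
    yc = cross(zv, xc);  yc = yc / norm(yc);
    zc = cross(xc, yc);                    % z^c as close to vertical as possible

    R_c_from_LL = [xc'; yc'; zc'];         % rows are the c axes expressed in LL (Eq. 5-3)
    R_c_from_LL * a_LL                     % Equation 5-4: returns [a; 0; 0] = [5; 0; 0]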

Now, let

$$R^{c2}_{c1} = \delta R_1\, \hat{R}^{c2}_{c1} \qquad (5\text{-}5)$$

where $\hat{R}^{c2}_{c1}$ is an estimate of $R^{c2}_{c1}$, and $\delta R_1$ is the misalignment in the estimate. Also, let

$$R^{c2}_{LL} = \hat{R}^{c2}_{LL}\, R^{LL}_{c}\, \delta R_2\, R^{c}_{LL}, \qquad (5\text{-}6)$$

where $\hat{R}^{c2}_{LL}$ is an estimate of $R^{c2}_{LL}$, and $\delta R_2$ is the misalignment in the estimate expressed in the c coordinate system. When $\delta R_2$ is expressed in the c coordinate system, one of the three misalignment angles is unobservable and will drop out of the calculations. This removes the translational axis ambiguity from this problem.

The measurement equation for the generalized error model, Equation 5-1, now becomes

$$q^{c2} = \delta R_1\, \hat{R}^{c2}_{c1}\, \hat{p}^{c1}(1 + \delta p) - \hat{R}^{c2}_{LL}\, R^{LL}_{c}\, \delta R_2\, R^{c}_{LL}\, a^{LL}. \qquad (5\text{-}7)$$

To simplify Equation 5-7, combine some constant factors:

$$\hat{p}^{c2} = \hat{R}^{c2}_{c1}\, \hat{p}^{c1} \qquad (5\text{-}8)$$

$$\hat{R}^{c2}_{c} = \hat{R}^{c2}_{LL}\, R^{LL}_{c} \qquad (5\text{-}9)$$

$$a^{c} = R^{c}_{LL}\, a^{LL} \qquad (5\text{-}10)$$

Then the measurement model equation becomes

$$q^{c2} = \delta R_1\, \hat{p}^{c2}(1 + \delta p) - \hat{R}^{c2}_{c}\, \delta R_2\, a^{c}. \qquad (5\text{-}11)$$










Equation 5-11 written out with all vectors and matrices expanded is

$$\begin{bmatrix} q^{c2}_x \\ q^{c2}_y \\ q^{c2}_z \end{bmatrix} = \begin{bmatrix} 1 & \psi_1 & -\theta_1 \\ -\psi_1 & 1 & \phi_1 \\ \theta_1 & -\phi_1 & 1 \end{bmatrix} \begin{bmatrix} \hat{p}^{c2}_x \\ \hat{p}^{c2}_y \\ \hat{p}^{c2}_z \end{bmatrix} (1 + \delta p) - \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix} \begin{bmatrix} a \\ -\psi_2\, a \\ \theta_2\, a \end{bmatrix} \qquad (5\text{-}12)$$

The solutions for the cr-coordinates on the focal plane of the second imaging system c2, given by Equations 4-1 and 4-2, are

$$c = f\,\frac{q^{c2}_x}{q^{c2}_z} = f\,\frac{(\hat{p}^{c2}_x + \psi_1 \hat{p}^{c2}_y - \theta_1 \hat{p}^{c2}_z)(1 + \delta p) - (r_{11} a - r_{12}\psi_2 a + r_{13}\theta_2 a)}{(\theta_1 \hat{p}^{c2}_x - \phi_1 \hat{p}^{c2}_y + \hat{p}^{c2}_z)(1 + \delta p) - (r_{31} a - r_{32}\psi_2 a + r_{33}\theta_2 a)} \qquad (5\text{-}13)$$

$$r = f\,\frac{q^{c2}_y}{q^{c2}_z} = f\,\frac{(-\psi_1 \hat{p}^{c2}_x + \hat{p}^{c2}_y + \phi_1 \hat{p}^{c2}_z)(1 + \delta p) - (r_{21} a - r_{22}\psi_2 a + r_{23}\theta_2 a)}{(\theta_1 \hat{p}^{c2}_x - \phi_1 \hat{p}^{c2}_y + \hat{p}^{c2}_z)(1 + \delta p) - (r_{31} a - r_{32}\psi_2 a + r_{33}\theta_2 a)} \qquad (5\text{-}14)$$

A summary of the above variables is given in Table 5-2, where $(\hat{p}^{c2}_x, \hat{p}^{c2}_y, \hat{p}^{c2}_z)$, $a$, and $\{r_{ij}\}$ are treated as constants; $(c, r)$ are dependent variables, or measurements subject to measurement errors; and the state vector of independent variables is $(\phi_1, \theta_1, \psi_1, \theta_2, \psi_2, \delta p)$. Note that $\phi_2$, which represents the roll about the generalized translational axis $x^c$, does not appear in Equations 5-13 and 5-14. This resolves the translational axis ambiguity that results from using only two images to obtain the attitude estimate.
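As a numerical illustration of Equations 5-13 and 5-14, the MATLAB sketch below evaluates the focal-plane coordinates for assumed values of the constants and small assumed misalignment and range errors; $\hat{R}^{c2}_{c}$ is taken as the identity purely for the example.

    f     = 0.005;                      % focal length, m
    p_hat = [20; -15; 300];             % assumed \hat{p}^{c2} components, m
    a     = 10;                         % assumed displacement magnitude, m
    rij   = eye(3);                     % assumed {r_ij} (estimate of rotation from c to c2)

    % Assumed small misalignment angles and range error proportion
    phi1 = 1e-3; th1 = -2e-3; psi1 = 5e-4; th2 = 1e-3; psi2 = -1e-3; dp = 2e-3;

    num_c = (p_hat(1) + psi1*p_hat(2) - th1*p_hat(3))*(1+dp) ...
            - (rij(1,1)*a - rij(1,2)*psi2*a + rij(1,3)*th2*a);
    num_r = (-psi1*p_hat(1) + p_hat(2) + phi1*p_hat(3))*(1+dp) ...
            - (rij(2,1)*a - rij(2,2)*psi2*a + rij(2,3)*th2*a);
    den   = (th1*p_hat(1) - phi1*p_hat(2) + p_hat(3))*(1+dp) ...
            - (rij(3,1)*a - rij(3,2)*psi2*a + rij(3,3)*th2*a);

    c = f * num_c / den;                % Equation 5-13
    r = f * num_r / den;                % Equation 5-14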

To linearize the measurement model, we calculate the partial derivatives of each of the measurements with respect to the independent variables about a nominal point. Let the vector notation

$$f(v) = f(v_0) + \left.\frac{\partial f}{\partial b}\right|_{v_0}\!\Delta b + \left.\frac{\partial f}{\partial c}\right|_{v_0}\!\Delta c + \text{higher order terms} \qquad (5\text{-}15)$$

indicate a first-order Taylor series expansion on the independent variables $a$, $b$, and $c$ about a nominal point $v_0$, where $v = [a\ b\ c]^T$ and $v_0 = [a\ b_0\ c_0]^T$. The nominal point $v_0$ consists of a constant value $a$ and nominal values for $b$ and $c$. Extending this notation to the measurement equations gives the generalized linearized measurement model as

$$c(v) = c(v_0) + \left.\frac{\partial c}{\partial \phi_1}\right|_{v_0}\!\Delta\phi_1 + \left.\frac{\partial c}{\partial \theta_1}\right|_{v_0}\!\Delta\theta_1 + \left.\frac{\partial c}{\partial \psi_1}\right|_{v_0}\!\Delta\psi_1 + \left.\frac{\partial c}{\partial \theta_2}\right|_{v_0}\!\Delta\theta_2 + \left.\frac{\partial c}{\partial \psi_2}\right|_{v_0}\!\Delta\psi_2 + \left.\frac{\partial c}{\partial \delta p}\right|_{v_0}\!\Delta\delta p \qquad (5\text{-}16)$$

$$r(v) = r(v_0) + \left.\frac{\partial r}{\partial \phi_1}\right|_{v_0}\!\Delta\phi_1 + \left.\frac{\partial r}{\partial \theta_1}\right|_{v_0}\!\Delta\theta_1 + \left.\frac{\partial r}{\partial \psi_1}\right|_{v_0}\!\Delta\psi_1 + \left.\frac{\partial r}{\partial \theta_2}\right|_{v_0}\!\Delta\theta_2 + \left.\frac{\partial r}{\partial \psi_2}\right|_{v_0}\!\Delta\psi_2 + \left.\frac{\partial r}{\partial \delta p}\right|_{v_0}\!\Delta\delta p \qquad (5\text{-}17)$$

























Table 5-2: Description of variables in the generalized error model

Variable(s) | Description
$(\hat{p}^{c2}_x, \hat{p}^{c2}_y, \hat{p}^{c2}_z)$ | The position of the control point designated from the first camera position, with the range estimated, transformed to the second camera system c2 via $\hat{R}^{c2}_{c1}$.
$a$ | Distance between the first camera position and the second camera position; the direction of displacement between camera positions appears in the measurement equations implicitly.
$\{r_{ij}\}$ | Elements of matrix $\hat{R}^{c2}_{c}$; a function of estimated or known rotations.
$(c, r)$ | Measurement of the control point position from the second camera position.
$(\phi_1, \theta_1, \psi_1)$ | Misalignment angles in the estimate of the rotation of the camera from the first position to the second position.
$(\theta_2, \psi_2)$ | Misalignment angles in the estimate of the pitch and yaw 3-2-1 Euler angles from the c coordinate system to the c2 system. Note that if the vehicle is flying horizontally (i.e., no $z^{LL}$ component to $a^{LL}$), these angles correspond directly to the inertial pitch and heading attitude errors of the vehicle in the second position.
$\delta p$ | Proportion of error in the estimate of range to the control point.










where $v = [\,q^{c2}\ \ \hat{p}^{c1}\ \ r_{ij}\ \ a\ \ \phi_1\ \ \theta_1\ \ \psi_1\ \ \theta_2\ \ \psi_2\ \ \delta p\,]^T$ and $v_0 = [\,q^{c2}\ \ \hat{p}^{c1}\ \ r_{ij}\ \ a\ \ \phi_{1,0}\ \ \theta_{1,0}\ \ \psi_{1,0}\ \ \theta_{2,0}\ \ \psi_{2,0}\ \ \delta p_0\,]^T$. Note that $q^{c2}$, $\hat{p}^{c1}$, $r_{ij}$, and $a$ are treated as constants in the Taylor series expansion of the measurement equations for $c(v)$ and $r(v)$.

Calculating the partial derivatives for each of the terms in Equations 5-16 and 5-17, and letting all the nominal values be zero, gives

$$\frac{\partial c}{\partial \phi_1} = f\left[\frac{(\hat{p}^{c2}_x - r_{11}a)\,\hat{p}^{c2}_y}{(\hat{p}^{c2}_z - r_{31}a)^2}\right] \qquad (5\text{-}18)$$

$$\frac{\partial c}{\partial \theta_1} = f\left[\frac{-(\hat{p}^{c2}_z - r_{31}a)\,\hat{p}^{c2}_z - (\hat{p}^{c2}_x - r_{11}a)\,\hat{p}^{c2}_x}{(\hat{p}^{c2}_z - r_{31}a)^2}\right] \qquad (5\text{-}19)$$

$$\frac{\partial c}{\partial \psi_1} = f\left[\frac{(\hat{p}^{c2}_z - r_{31}a)\,\hat{p}^{c2}_y}{(\hat{p}^{c2}_z - r_{31}a)^2}\right] \qquad (5\text{-}20)$$

$$\frac{\partial c}{\partial \theta_2} = f\left[\frac{-(\hat{p}^{c2}_z - r_{31}a)\,r_{13}a + (\hat{p}^{c2}_x - r_{11}a)\,r_{33}a}{(\hat{p}^{c2}_z - r_{31}a)^2}\right] \qquad (5\text{-}21)$$

$$\frac{\partial c}{\partial \psi_2} = f\left[\frac{(\hat{p}^{c2}_z - r_{31}a)\,r_{12}a - (\hat{p}^{c2}_x - r_{11}a)\,r_{32}a}{(\hat{p}^{c2}_z - r_{31}a)^2}\right] \qquad (5\text{-}22)$$

$$\frac{\partial c}{\partial \delta p} = f\left[\frac{-r_{31}a\,\hat{p}^{c2}_x + r_{11}a\,\hat{p}^{c2}_z}{(\hat{p}^{c2}_z - r_{31}a)^2}\right] \qquad (5\text{-}23)$$

$$\frac{\partial r}{\partial \phi_1} = f\left[\frac{(\hat{p}^{c2}_z - r_{31}a)\,\hat{p}^{c2}_z + (\hat{p}^{c2}_y - r_{21}a)\,\hat{p}^{c2}_y}{(\hat{p}^{c2}_z - r_{31}a)^2}\right] \qquad (5\text{-}24)$$

$$\frac{\partial r}{\partial \theta_1} = f\left[\frac{-(\hat{p}^{c2}_y - r_{21}a)\,\hat{p}^{c2}_x}{(\hat{p}^{c2}_z - r_{31}a)^2}\right] \qquad (5\text{-}25)$$

$$\frac{\partial r}{\partial \psi_1} = f\left[\frac{-(\hat{p}^{c2}_z - r_{31}a)\,\hat{p}^{c2}_x}{(\hat{p}^{c2}_z - r_{31}a)^2}\right] \qquad (5\text{-}26)$$

$$\frac{\partial r}{\partial \theta_2} = f\left[\frac{-(\hat{p}^{c2}_z - r_{31}a)\,r_{23}a + (\hat{p}^{c2}_y - r_{21}a)\,r_{33}a}{(\hat{p}^{c2}_z - r_{31}a)^2}\right] \qquad (5\text{-}27)$$

$$\frac{\partial r}{\partial \psi_2} = f\left[\frac{(\hat{p}^{c2}_z - r_{31}a)\,r_{22}a - (\hat{p}^{c2}_y - r_{21}a)\,r_{32}a}{(\hat{p}^{c2}_z - r_{31}a)^2}\right] \qquad (5\text{-}28)$$

$$\frac{\partial r}{\partial \delta p} = f\left[\frac{-r_{31}a\,\hat{p}^{c2}_y + r_{21}a\,\hat{p}^{c2}_z}{(\hat{p}^{c2}_z - r_{31}a)^2}\right] \qquad (5\text{-}29)$$
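As a check on these expressions, consider the $\delta p$ term. Applying the quotient rule to Equation 5-13 and evaluating at the zero nominal point, where the numerator and denominator reduce to $\hat{p}^{c2}_x - r_{11}a$ and $\hat{p}^{c2}_z - r_{31}a$, gives

$$\left.\frac{\partial c}{\partial \delta p}\right|_{v_0} = f\,\frac{\hat{p}^{c2}_x(\hat{p}^{c2}_z - r_{31}a) - (\hat{p}^{c2}_x - r_{11}a)\,\hat{p}^{c2}_z}{(\hat{p}^{c2}_z - r_{31}a)^2} = f\,\frac{-r_{31}a\,\hat{p}^{c2}_x + r_{11}a\,\hat{p}^{c2}_z}{(\hat{p}^{c2}_z - r_{31}a)^2},$$

which is Equation 5-23; the remaining partial derivatives follow in the same manner.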

Define the error in the c and r measurements as

$$\begin{bmatrix} \delta c \\ \delta r \end{bmatrix} = \begin{bmatrix} c(v) - c(v_0) \\ r(v) - r(v_0) \end{bmatrix} \qquad (5\text{-}30)$$










The complete linearized measurement equation is then given by

$$\begin{bmatrix} \delta c \\ \delta r \end{bmatrix} = \begin{bmatrix} \dfrac{\partial c}{\partial \phi_1} & \dfrac{\partial c}{\partial \theta_1} & \dfrac{\partial c}{\partial \psi_1} & \dfrac{\partial c}{\partial \theta_2} & \dfrac{\partial c}{\partial \psi_2} & \dfrac{\partial c}{\partial \delta p} \\[1ex] \dfrac{\partial r}{\partial \phi_1} & \dfrac{\partial r}{\partial \theta_1} & \dfrac{\partial r}{\partial \psi_1} & \dfrac{\partial r}{\partial \theta_2} & \dfrac{\partial r}{\partial \psi_2} & \dfrac{\partial r}{\partial \delta p} \end{bmatrix} \begin{bmatrix} \Delta\phi_1 \\ \Delta\theta_1 \\ \Delta\psi_1 \\ \Delta\theta_2 \\ \Delta\psi_2 \\ \Delta\delta p \end{bmatrix} \qquad (5\text{-}31)$$

where the entries of the 2 × 6 matrix are the partial derivatives given in Equations 5-18 through 5-29, evaluated at the nominal point.

Let $C$ represent the submatrix consisting of the first five columns of the matrix in Equation 5-31, and let $c$ represent the last column of the matrix in Equation 5-31. So, $C$ is a 2 × 5 submatrix and $c$ is a 2 × 1 submatrix, or column vector. Note that there are six unknowns and two measurements, so more measurements are needed to uniquely determine the set of independent variables. But each additional measurement introduces a different range error, $\delta p_i$. The minimum number of control points needed to uniquely determine the independent variables is therefore five. For five control points, the system is represented by:

$$\begin{bmatrix} \delta c_1 \\ \delta r_1 \\ \delta c_2 \\ \delta r_2 \\ \delta c_3 \\ \delta r_3 \\ \delta c_4 \\ \delta r_4 \\ \delta c_5 \\ \delta r_5 \end{bmatrix}_{10\times 1} = \begin{bmatrix} C_1 & c_1 & 0 & 0 & 0 & 0 \\ C_2 & 0 & c_2 & 0 & 0 & 0 \\ C_3 & 0 & 0 & c_3 & 0 & 0 \\ C_4 & 0 & 0 & 0 & c_4 & 0 \\ C_5 & 0 & 0 & 0 & 0 & c_5 \end{bmatrix}_{10\times 10} \begin{bmatrix} \Delta\phi_1 \\ \Delta\theta_1 \\ \Delta\psi_1 \\ \Delta\theta_2 \\ \Delta\psi_2 \\ \Delta\delta p_1 \\ \Delta\delta p_2 \\ \Delta\delta p_3 \\ \Delta\delta p_4 \\ \Delta\delta p_5 \end{bmatrix}_{10\times 1} \qquad (5\text{-}32)$$

where $C_i$ and $c_i$ are the 2 × 5 and 2 × 1 blocks formed from the measurements of the i-th control point,











[Figure: block diagram showing the driver.m initialization step, the error model m-files, and the plot_results.m plotting step.]

Figure 5-2: Overview of Matlab implementation of the generalized linearized error model analysis (GLEMA) program.


where the subscripts denote the matrix sizes in Equation 5-32. Let the n control point measurements be denoted as the 2n × 1 column vector $m$, let the independent variables, or state error vector, be denoted as the (5 + n) × 1 column vector $v$, and let the 2n × (5 + n) matrix of constant terms be $G$. Then, Equation 5-32 can be concisely written as

$$m_{2n\times 1} = G_{2n\times(n+5)}\, v_{(n+5)\times 1} \qquad (5\text{-}33)$$

where $v$ will be referred to as the state error vector hereafter. Adding more than five measurements allows Equation 5-33 to be solved by a least-squares type approach.
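The following is a minimal MATLAB sketch of assembling G with the block structure of Equation 5-32 and solving Equation 5-33 by least squares; the per-point blocks $C_i$ and $c_i$ are random stand-ins here, whereas in GLEMA they come from the partial derivatives of Equations 5-18 through 5-29.

    n = 6;                               % number of control points (n >= 5)
    rng(0);                              % repeatable example
    G = zeros(2*n, 5 + n);
    for i = 1:n
        Ci = randn(2, 5);                % stand-in 2x5 angular-term block for point i
        ci = randn(2, 1);                % stand-in 2x1 range-error column for point i
        G(2*i-1:2*i, 1:5)   = Ci;
        G(2*i-1:2*i, 5 + i) = ci;
    end

    v_true = 1e-2 * randn(5 + n, 1);     % assumed "true" state error vector
    m = G * v_true;                      % measurement error vector (Equation 5-33)

    v_ls = G \ m;                        % least-squares solution for the state errors
    geom = diag(inv(G'*G));              % geometry terms used later in the parameter study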

5.1.1 Implementation of the Generalized Linearized Error Model Analysis (GLEMA)
Matlab Program

An analysis program for the generalized linearized error model, GLEMA, was created in the Matlab programming language. The implementation contains three inter-dependent components: (1) initialization, (2) loop structure, and (3) display of results. The initialization component defines the fixed parameters, sets up the parameter that will be looped over (i.e., focal length, camera depression angle, displacement vector, etc.), and sets up and maintains the output containers that hold the analysis results. The looping component computes the error variances for the independent variables (given in the state error vector $v$) for all the geometries of interest. The variances are then plotted versus the parameter of interest in the plot results component.

GLEMA consists of a combination of Matlab functions and scripts.² The labels driver.m and plot_results.m in Figure 5-2 are used to illustrate the one-to-one correspondence between a driver file and a plotting file. This correspondence is shown in Table 5-3. The list below describes




2 The difference between a function and a script is in the scope of the variables. A script sends
the variables to the workspace and a function does not. Both files end in ".m" and are commonly
called m-files.











each step of the implementation as it corresponds to the three components shown in Figure 5-2

and lists the m-files that correspond to each step.

initialize 1 driver.m

Set up fixed camera parameters, control points (cr-coordinates with LL heights), true

attitude, range and attitude errors, and displacement vector between the cameras.

initialize 2 driver.m

Set up the parameter of interest and output container(s) to capture the data from the

upcoming loop.

initialize 3 driver.m, errorAnalysis.m

Loop over parameter of interest.

loop 1 errorAnalysis.m, inputGeometry_truth.m

Get the geometry for the truth system in LL.

loop 2 errorAnalysis.m, checkIfControlPointInFOV.m

Ensure that there are a minimum of five points that are in the FOV of both cameras.

loop 3 errorAnalysis.m, inputGeometry_truth.m, crh2enu.m, computeControlPoint.m

Calculate the true control point locations in the cl and c2 coordinate systems.

loop 4 errorAnalysis.m, inputGeometry_est.m

Add in attitude and range errors to get the geometry of the estimated system in LL.

loop 5 errorAnalysis.m, calc_cr_truth.m

Calculate the true control point cr-coordinates on the second camera's focal plane.

loop 6 errorAnalysis.m, calc_cr_est.m

Calculate the estimated control point cr-coordinates on the second camera's focal plane.

loop 7 errorAnalysis.m

Calculate the cr-coordinate error values on the second camera's focal plane.

loop 8 errorAnalysis.m, reshape_c_and_r_err_into_M.m

Form the $m_{2n\times 1}$ vector.

loop 9 errorAnalysis.m, formG.m, form_c_row_for_G.m, form_r_row_for_G.m

Form the $G_{2n\times(n+5)}$ matrix.

loop 10 errorAnalysis.m

Compute the system error vector, $v_{(n+5)\times 1}$, of independent state variables from $m$ and $G$.

loop 11 errorAnalysis.m

Compute the (n + 5) × (n + 5) matrix via $(G^T G)^{-1}$.











[Figure: flow diagram of the verification process; the true and estimated states are used to calculate m, the linearized system is inverted to obtain calculated differences between the true and estimated states, and these are compared with the actual differences.]

Figure 5-3: Graphical overview of verification of the GLEMA program


loop 12 driver.m, errorAnalysis.m

Read off the geometry terms for each independent state variable given by the diagonal terms of $(G^T G)^{-1}$.

loop 13 driver.m

Determine a pixel-based error corresponding to the cr-coordinate measurements made on the second camera system's focal plane. Convert the pixel-based error to an area on the focal plane.

loop 14 driver.m

Calculate the error variances for each desired independent variable based on its geometry

term and given focal plane error.

plot results 1 driver.m, plot_results.m

Plot the error variances versus the parameter of interest.

Table 5-3: Correspondence between driver.m and plot_results.m files for the GLEMA program.

driver.m                        | plot_results.m
varying_beta.m                  | plot_results_b.m
varying_displacement_vector.m   | plot_results_a.m
varying_focal_length.m          | plot_results_f.m
varying_last_CP_location.m      | plot_results_cp.m
varying_num_cp_cols.m           | plot_results_num_cp_cols.m
varying_num_cp_rows.m           | plot_results_num_cp_rows.m
varying_pixel_error.m           | plot_results_pixel_error.m
verification.m                  | Matlab command window


Verification results. The goal of verifying the error model is to show that the known

errors that are input to the system can be successfully calculated using the GLEMA program.

To accomplish this goal, the driver file verification.m was created. A graphical overview of the

verification process is shown in Figure 5-3.













Verifying the accuracy of the GLEMA program was accomplished in two steps. The first step was to verify the linearization of each independent variable in the system error vector, $v$, with all coordinate systems aligned. A summary of the variables contained in the system error vector with their corresponding inputs is given in Table 5-4. This was verified by setting all input errors to zero except the error of interest, calculating the system error vector $v$, and visually verifying that the output printed to the Matlab command window corresponds to the input error. For example, to verify the successful linearization of the inertial heading estimate error $\psi_2$, all input error terms were set to zero except the $\psi$ error from ned to c2,³ and the displacement vector was aligned such that no side-slip or angle-of-attack was present. A value of 0.01 radians for this input error produced a value of 0.0099 rad for $\psi_2$ and values less than $10^{-14}$ for the rest of the entries of $v$. A complete list of results comparing the input errors to the resulting terms of the system error vector is given in Table 5-5.

Table 5-4: Summary of linearized independent variables contained in the system error vector v with their corresponding inputs in the GLEMA program.

Variable      | GLEMA Input
$\phi_1$      | errors.phi_c1toc2
$\theta_1$    | errors.theta_c1toc2
$\psi_1$      | errors.psi_c1toc2
$\theta_2$    | errors.theta_ned2c2
$\psi_2$      | errors.psi_ned2c2
$\delta p_i$  | errors.r[i]


Table 5-5: Step one verification results for the GLEMA program

         Independent Variables
         phi_1   theta_1  psi_1  theta_2  psi_2  dp_1    dp_2    dp_3    dp_4   dp_5    dp_6
input    1e-2    0        0      0        0      0       0       0       0      0       0
output   1e-2    0        0      -1.1e-3  0      7e-4    9e-4    9e-4    4e-4   4e-4    4e-4
input    0       1e-2     0      0        0      0       0       0       0      0       0
output   0       1.02e-2  0      7e-4     -1e-4  1.6e-3  2.5e-3  1.4e-3  2e-3   2.4e-3  2e-3
input    0       0        1e-2   0        0      0       0       0       0      0       0
output   -1e-4   0        1e-2   6e-4     7e-4   -3e-4   3e-4    5e-4    0      2e-4    3e-4
input    0       0        0      1e-2     0      0       0       0       0      0       0
output   0       0        0      1e-2     0      0       0       -1e-4   0      0       -1e-4
input    0       0        0      0        1e-2   0       0       0       0      0       0
output   0       0        0      0        1e-2   0       0       0       0      0       0
input    0       0        0      0        0      0       0       1e-2    0      0       0
output   0       0        0      0        0      0       0       1e-2    0      0       0


3 For convenience sake, the rotation matrix notation is used to denote the coordinate system
that the input error term acts upon.











The second step of the verification process involves slewing the heading and velocity/displacement vector around 360°. If the coordinate system rotations and translations are correct, then there should be no change in the system error vector. The inertial heading is given a 0.01 rad error, and the velocity and inertial heading are slewed around in 45° increments. The comparison results are shown in Table 5-6. Corresponding comparisons for the remaining independent variables are not shown, since step one produced a valid linearization for all independent variables.

Table 5-6: Step two verification results for the GLEMA program

                         Independent Variables
                         phi_1  theta_1  psi_1  theta_2  psi_2  dp_1  dp_2  dp_3  dp_4  dp_5  dp_6
psi = 45°    input       0      0        0      0        1e-2   0     0     0     0     0     0
             output      0      0        0      0        1e-2   0     0     0     0     0     0
psi = 90°    input       0      0        0      0        1e-2   0     0     0     0     0     0
             output      0      0        0      0        1e-2   0     0     0     0     0     0
psi = 135°   input       0      0        0      0        1e-2   0     0     0     0     0     0
             output      0      0        0      0        1e-2   0     0     0     0     0     0
psi = 180°   input       0      0        0      0        1e-2   0     0     0     0     0     0
             output      0      0        0      0        1e-2   0     0     0     0     0     0
psi = 225°   input       0      0        0      0        1e-2   0     0     0     0     0     0
             output      0      0        0      0        1e-2   0     0     0     0     0     0
psi = 270°   input       0      0        0      0        1e-2   0     0     0     0     0     0
             output      0      0        0      0        1e-2   0     0     0     0     0     0
psi = 315°   input       0      0        0      0        1e-2   0     0     0     0     0     0
             output      0      0        0      0        1e-2   0     0     0     0     0     0
psi = 360°   input       0      0        0      0        1e-2   0     0     0     0     0     0
             output      0      0        0      0        1e-2   0     0     0     0     0     0


The results presented in Tables 5-5 and 5-6 show that the GLEMA program is a reasonable

representation of the error model scenario given in Figure 5-1. Specifically, the G matrix is shown

to be a valid model of the linearized system.

5.2 Parameter Study

The parameter study section uses the $G$ matrix to transform measurement errors into solution, or angular, errors. The $G$ matrix is formed using truth data. Given measurement errors $m$ are then used to calculate the errors that result from the geometry of the scene (i.e., $v$) from Equation 5-33. Figure 5-4 shows a graphical representation of this process. This study assumes that all measurement errors are uncorrelated and can be specified by a value for the pixel variance, denoted as $\sigma_p^2$. Note that the units for $\sigma_p^2$ are [pixels²]. The $G$ matrix, camera size, and number of pixels are used to transform $\sigma_p^2$ into angular errors by

$$\sigma^2_{\text{area}}\,[\text{m}^2] = \sigma_p^2\,[\text{pixels}^2] \left( \frac{\text{camera size [m]}}{\text{num pixels [pixels]}} \right)^2 \qquad (5\text{-}34)$$










[Figure: measurement noise is mapped through $(G^T G)^{-1}$, formed from the true values, into errors in the solution.]

Figure 5-4: Graphical overview of the parameter study process using the GLEMA program




$$\sigma^2_{\text{soln}} = \left(G^T G\right)^{-1} \sigma^2_{\text{area}} \qquad (5\text{-}35)$$

where the units for $\sigma_{\text{soln}}$ are radians for $(\phi_1, \theta_1, \psi_1, \theta_2, \psi_2)$ and proportion of error in the estimate of range to the i-th control point for $\delta p_i$. For all the parameter study results, a one-pixel standard deviation is assumed. This simplifies Equation 5-35 to

$$\sigma^2_{\text{soln}} = \left(G^T G\right)^{-1} \sigma^2_{\text{area}} \quad \text{for } \sigma_p = 1 \qquad (5\text{-}36)$$

The $\delta p$ results are not displayed in any of the parameter study results. This is the sacrifice that comes from ensuring that all control points are visible to both camera systems. That is, if a point is designated in the first camera system and cannot be seen by the second camera system, then it is not used in the formulation of the $G$ matrix.

In the following sections, the results for the angular terms $(\phi_1, \theta_1, \psi_1, \theta_2, \psi_2)$ are displayed in units of mrad. The following equation was used to convert each angular term of $\sigma^2_{\text{soln}}$ to mrad:

$$\sigma_{\text{soln}}\,[\text{mrad}] = 1000\left[\frac{\text{mrad}}{\text{rad}}\right]\left(\sigma^2_{\text{soln}}\right)^{1/2} \qquad (5\text{-}37)$$
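A minimal MATLAB sketch of this conversion is shown below, using the camera parameters of Table 5-7 and a random stand-in G matrix; in GLEMA, G is formed from the true scene geometry.

    sigma_p2    = 1;                     % pixel variance (sigma_p = 1 pixel)
    camera_size = 0.0042;                % focal plane size along one axis, m
    num_pixels  = 384;                   % pixel count along that axis
    sigma_area2 = sigma_p2 * (camera_size/num_pixels)^2;   % Equation 5-34, m^2

    n = 6;  rng(1);
    G = randn(2*n, 5 + n);               % stand-in for the linearized model matrix
    sigma_soln2 = diag(inv(G'*G)) * sigma_area2;            % Equations 5-35 and 5-36

    % Angular terms (phi1, theta1, psi1, theta2, psi2) converted to mrad (Eq. 5-37)
    sigma_soln_mrad = 1000 * sqrt(sigma_soln2(1:5));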

The control points are designated in a pseudo-random fashion on the first camera's focal plane. The focal plane is divided into an n-row by m-column grid. The control points are randomly placed within each grid box with the constraint that they cannot lie within a fixed percentage of the grid boundaries. The allowed location of a control point within a single grid box is shown in Figure 5-5. An example focal plane with a 3 row by 3 column grid is shown in Figure 5-6. The control point designation in the GLEMA program is performed in the markControlPoints.m m-file.
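A minimal MATLAB sketch of this designation scheme is given below; the 10% keep-out margin is an assumption for illustration and is not necessarily the value used in markControlPoints.m.

    rows = 3;  cols = 3;                 % grid designation
    ccd_rows = 288;  ccd_cols = 384;     % pixel counts (Table 5-7)
    margin = 0.10;                       % assumed keep-out fraction per box edge

    box_h = ccd_rows / rows;  box_w = ccd_cols / cols;
    cp = zeros(rows*cols, 2);            % [row, col] pixel coordinates (centered)
    k = 0;
    for i = 1:rows
        for j = 1:cols
            k = k + 1;
            r0 = (i-1)*box_h;  c0 = (j-1)*box_w;
            cp(k,1) = r0 + box_h*(margin + (1-2*margin)*rand) - ccd_rows/2;
            cp(k,2) = c0 + box_w*(margin + (1-2*margin)*rand) - ccd_cols/2;
        end
    end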

5.2.1 Varying Camera Focal Length

The effect of varying the camera's focal length on the attitude estimates is shown in this

section. The input parameters used for this study are shown in Table 5-7. The GLEMA driver

file used in this study was varying_focal_length.m.





































Figure 5-5: Allowed location of control points within each grid box of the focal plane.


Figure 5-6: Control point locations on a focal plane with a 3 row by 3 column grid designation.























Table 5-7: Input parameters for varying camera focal length study

Parameter | Value | Description
$\beta$ | $\pi/4$ | Camera depression angle in radians.
$f$ | varied | Camera focal length in meters.
ccd_pixels | 384 x 288 | Number of column and row pixels in the ccd array camera.
ccd_size | 0.0042 x 0.0032 | Size of the ccd array camera in meters.
cp cols | 5 | Number of control point columns on the ccd array camera.
cp rows | 7 | Number of control point rows on the ccd array camera.
$(\phi^{c2}_{c1}, \theta^{c2}_{c1}, \psi^{c2}_{c1})$ | (0,0,0) | Roll, pitch, yaw of the relative attitude between c1 and c2.
$(\phi^{c2}_{ned}, \theta^{c2}_{ned}, \psi^{c2}_{ned})$ | (0,0,0) | Roll, pitch, yaw of the inertial attitude of the second camera system c2.
$(\phi_1, \theta_1, \psi_1)$ | (0,0,0) | Roll, pitch, yaw misalignment errors in the relative attitude between c1 and c2.
$(\phi_2, \theta_2, \psi_2)$ | (0,0,0) | Roll, pitch, yaw misalignment errors between the c (center) coordinate system and second camera system c2.
$\|a\|$ | 10 | Displacement vector magnitude in meters.
$v^{enu}$ | (0,1,0) | Direction of the displacement vector, or velocity vector.
pixel error | 1 | Measurement error of the control points in pixels.











[Plot: vertical and horizontal FOV in degrees for the focal-length data points used in the study.]

Figure 5-7: Varying FOV vs. focal length


The focal length of the camera system was varied from 0.0015 to 0.0164 meters. The lower limit, 0.0015 meters, corresponds to an unrealistic FOV region for the given camera size; the data generated in this realm can only be used for theoretical arguments. The camera FOV is related to the focal length of the camera by

$$\text{FOV [radians]} = 2\arctan\!\left(\frac{\text{camera size [m]}}{2f}\right) \qquad (5\text{-}38)$$

The resulting data points used in this analysis are shown in Figure 5-7. The first iteration of the loop corresponds to the largest FOV and the last iteration corresponds to the smallest FOV. The vertical FOV has a maximum value of 108.92 degrees and a minimum value of 14.59 degrees. The horizontal FOV has a maximum value of 93.70 degrees and a minimum value of 11.14 degrees.
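For reference, a short MATLAB check of Equation 5-38 with the camera of Table 5-7 reproduces these limits:

    ccd_size = [0.0042, 0.0032];         % focal plane dimensions, m
    f = 0.0015;                          % focal length, m (first iteration)
    fov_deg = 2 * atan(ccd_size ./ (2*f)) * 180/pi;
    % fov_deg is approximately [108.92, 93.70] degrees; f = 0.0164 gives [14.59, 11.14].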

The control point locations on the focal planes of both cameras for the first and last it-

erations of the analysis loop are shown in Figures 5-8 and 5-9. The "x" symbols indicate the

designated control point location on the first camera's focal plane, and the hexagon symbols

indicate the calculated control point locations on the second camera's focal plane. A line is drawn

to connect the corresponding control point locations in the first and second camera system. Notice

that the last two rows of the control points (where the first row is located on the bottom of the











Figure 5-8: Control point locations on the focal planes of both cameras for the first iteration for varying camera FOV.


focal plane) disappear in the focal plane of the last iteration. This is due to the control points going

out of the second camera's FOV.

The 3-D view of the scene for the first and last iterations is shown in Figures 5-10 and 5-11.

The control points are marked by plus symbols. The first camera position is marked by a square

symbol. The second camera position is marked by a triangle.

The results for the computed attitude standard deviations are shown in Figures 5-12 and

5-13. The discontinuities in the various plots are due to control points going out of the second

camera's FOV.

5.2.2 Varying Camera Depression Angle

The effect of varying the camera's depression angle B on the attitude estimates is shown in

this section. The input parameters used for this study are shown in Table 5-8. The GLEMA

driver file used in this study was varying_beta.m.

The camera depression angle was varied from 0 to 179.9 degrees. The first iteration cor-

responds to a camera looking directly ahead of the vehicle. The last iteration corresponds to a

camera looking almost directly behind the vehicle. Since the projection onto the x y plane of the

velocity and camera line of sight vectors are identical in this study, a forward looking camera gets






















Figure 5-9: Control point locations on the focal planes of both cameras for the last iteration for varying camera FOV.


Figure 5-10: Three-dimensional view of the scene for the first iteration for a varying focal length.

















Figure 5-11: Three-dimensional view of the scene for the last iteration for a varying focal length.








Figure 5-12: Varying camera focal length results ned to c2























Table 5-8: Input parameters for varying camera depression angle β study

Parameter | Value | Description
$\beta$ | varied | Camera depression angle in radians.
$f$ | 0.005 | Camera focal length in meters.
ccd_pixels | 384 x 288 | Number of column and row pixels in the ccd array camera.
ccd_size | 0.0042 x 0.0032 | Size of the ccd array camera in meters.
cp cols | 5 | Number of control point columns on the ccd array camera.
cp rows | 7 | Number of control point rows on the ccd array camera.
$(\phi^{c2}_{c1}, \theta^{c2}_{c1}, \psi^{c2}_{c1})$ | (0,0,0) | Roll, pitch, yaw of the relative attitude between c1 and c2.
$(\phi^{c2}_{ned}, \theta^{c2}_{ned}, \psi^{c2}_{ned})$ | (0,0,0) | Roll, pitch, yaw of the inertial attitude of the second camera system c2.
$(\phi_1, \theta_1, \psi_1)$ | (0,0,0) | Roll, pitch, yaw misalignment errors in the relative attitude between c1 and c2.
$(\phi_2, \theta_2, \psi_2)$ | (0,0,0) | Roll, pitch, yaw misalignment errors between the c (center) coordinate system and second camera system c2.
$\|a\|$ | 10 | Displacement vector magnitude in meters.
$v^{enu}$ | (0,1,0) | Direction of the displacement vector, or velocity vector.
pixel error | 1 | Measurement error of the control points in pixels.















Figure 5-13: Varying camera focal length results c1 to c2


closer to the points while a rear-looking camera gets farther away. The camera was allowed to be

rear looking to show this difference.

The control point locations on the focal planes of both cameras for β equal to 45, 90,

and 135 degrees of the analysis loop are shown in Figures 5-14 through 5-16. The "x" symbols

indicate the designated control point location on the first camera's focal plane, and the hexagon

symbols indicate the calculated control point locations on the second camera's focal plane. A

line is drawn to connect the corresponding control point locations in the first and second camera

system. Notice that the last two rows of the control points (where the first row is located on the

bottom of the focal plane) disappear in the focal plane of the last iteration. This is due to the

control points going out of the second camera's FOV.

The 3-D view of the scene for β equal to 45, 90, and 135 degrees is shown in Figures 5-

17 through 5-19. The control points are marked by plus symbols. The first camera position is

marked by a square symbol. The second camera position is marked by a triangle. The location

of the control points move as expected from (1) in front to (2) directly below to (3) behind the

camera systems when β equals 45, 90, and 135 degrees, respectively.

The results for the computed attitude standard deviations are shown in Figures 5-20 and

5-21. The discontinuities in the various plots are due to control points going out of the second
















Figure 5-14: Control point locations on focal plane for β = 45 degrees for the varying camera depression angle study.

Figure 5-15: Control point locations on focal plane for β = 90 degrees for the varying camera depression angle study.

Figure 5-16: Control point locations on focal plane for β = 135 degrees for the varying camera depression angle study.


Figure 5-17: Three-dimensional view of the scene for β = 45 degrees for the varying camera depression angle study.

Figure 5-18: Three-dimensional view of the scene for β = 90 degrees for the varying camera depression angle study.

Figure 5-19: Three-dimensional view of the scene for β = 135 degrees for the varying camera depression angle study.













Figure 5-20: Varying camera depression angle results ned to c2


camera's FOV. The non-symmetric behavior about the 90° camera depression angle is due to (1) the different control point locations which overlap in the two FOVs, and (2) the difference between a forward- and a rear-looking camera (i.e., approaching and receding from control points).

5.2.3 Varying Displacement Vector Magnitude Between Imaging Systems

The effect of varying the displacement vector magnitude between camera systems on the

attitude estimates is shown in this section. The input parameters used for this study are shown in

Table 5-9. The GLEMA driver file used in this study was varying_displacement_vector.m.

The magnitude of the displacement vector between camera systems ||a|| was varied from

1.0 to 20.9 meters. The first iteration corresponds to ||a|| = 1.0 and the last iteration

corresponds to ||a|| = 20.9.

The control point locations on the focal planes of both cameras for the ||a|| = 1.0, 5.9, 10.9,

15.9 and 20.9 meters are shown in Figures 5-22 through 5-26. The "x" symbols indicate the

designated control point location on the first camera's focal plane, and the hexagon symbols

indicate the calculated control point locations on the second camera's focal plane. A line is drawn

to connect the corresponding control point locations in the first and second camera system. Notice

that the last two rows of the control points (where the first row is located on the bottom of the






















Table 5-9: Input parameters for varying displacement vector magnitude between imaging systems.

Parameter | Value | Description
$\beta$ | $\pi/3$ | Camera depression angle in radians.
$f$ | 0.005 | Camera focal length in meters.
ccd_pixels | 384 x 288 | Number of column and row pixels in the ccd array camera.
ccd_size | 0.0042 x 0.0032 | Size of the ccd array camera in meters.
cp cols | 5 | Number of control point columns on the ccd array camera.
cp rows | 7 | Number of control point rows on the ccd array camera.
$(\phi^{c2}_{c1}, \theta^{c2}_{c1}, \psi^{c2}_{c1})$ | (0,0,0) | Roll, pitch, yaw of the relative attitude between c1 and c2.
$(\phi^{c2}_{ned}, \theta^{c2}_{ned}, \psi^{c2}_{ned})$ | (0,0,0) | Roll, pitch, yaw of the inertial attitude of the second camera system c2.
$(\phi_1, \theta_1, \psi_1)$ | (0,0,0) | Roll, pitch, yaw misalignment errors in the relative attitude between c1 and c2.
$(\phi_2, \theta_2, \psi_2)$ | (0,0,0) | Roll, pitch, yaw misalignment errors between the c (center) coordinate system and second camera system c2.
$\|a\|$ | varied | Displacement vector magnitude in meters.
$v^{enu}$ | (0,1,0) | Direction of the displacement vector, or velocity vector.
pixel error | 1 | Measurement error of the control points in pixels.












Figure 5-21: Varying camera depression angle results c1 to c2


focal plane) disappear by the time ||a|| = 20.9. This is due to the control points going out of the

second camera's FOV.

The 3-D view of the scene for ||a|| = 1.0, 5.9, 10.9, 15.9 and 20.9 meters is shown in Figures

5-27 through 5-31. The control points are marked by plus symbols. The first camera position

is marked by a square symbol. The second camera position is marked by a triangle. The second

camera moves from right to left as the displacement vector is increased from 1 to 20.9 meters.

Referring to the top portion of each figure, notice that the control points corresponding to the

disappearing rows are seen to gradually disappear from the left side of the control point grouping.

The results for the computed attitude standard deviations are shown in Figures 5-32 through

5-35. The discontinuities in the various plots are due to control points going out of the second

camera's FOV.

5.2.4 Varying Number of Control Points

The effect of varying the number of control points marked on the focal plane on the atti-

tude estimates is shown in this section. The input parameters used for this study are shown

in Table 5-10. The GLEMA driver files used in this study were varying_num_cp_cols.m and

varying_num_cp_rows.m.















Figure 5-22: Control point locations on focal plane for a = 1.0 meters for the varying displacement vector magnitude study.

Figure 5-23: Control point locations on focal plane for a = 5.9 meters for the varying displacement vector magnitude study.

Figure 5-24: Control point locations on focal plane for a = 10.9 meters for the varying displacement vector magnitude study.

Figure 5-25: Control point locations on focal plane for a = 15.9 meters for the varying displacement vector magnitude study.

Figure 5-26: Control point locations on focal plane for a = 20.9 meters for the varying displacement vector magnitude study.

Figure 5-27: Three-dimensional view of the scene for a = 1.0 meters for the varying displacement vector magnitude between imaging systems study.

Figure 5-28: Three-dimensional view of the scene for a = 5.9 meters for the varying displacement vector magnitude between imaging systems study.

Figure 5-29: Three-dimensional view of the scene for a = 10.9 meters for the varying displacement vector magnitude between imaging systems study.

Figure 5-30: Three-dimensional view of the scene for a = 15.9 meters for the varying displacement vector magnitude between imaging systems study.

Figure 5-31: Three-dimensional view of the scene for a = 20.9 meters for the varying displacement vector magnitude between imaging systems study.

Figure 5-32: Varying displacement vector magnitude between imaging systems results for a = 1.0 to 20.9 meters, ned to c2.

Figure 5-33: Varying displacement vector magnitude between imaging systems results for a = 5.9 to 20.9 meters, ned to c2.

Figure 5-34: Varying displacement vector magnitude between imaging systems results for a = 1.0 to 20.9 meters, c1 to c2.

Figure 5-35: Varying displacement vector magnitude between imaging systems results for a = 5.9 to 20.9 meters, c1 to c2.























Table 5-10: Input parameters for varying the number of control points study.

Parameter | Value | Description
$\beta$ | $\pi/3$ | Camera depression angle in radians.
$f$ | 0.005 | Camera focal length in meters.
ccd_pixels | 384 x 288 | Number of column and row pixels in the ccd array camera.
ccd_size | 0.0042 x 0.0032 | Size of the ccd array camera in meters.
cp cols | varied | Number of control point columns on the ccd array camera.
cp rows | varied | Number of control point rows on the ccd array camera.
$(\phi^{c2}_{c1}, \theta^{c2}_{c1}, \psi^{c2}_{c1})$ | (0,0,0) | Roll, pitch, yaw of the relative attitude between c1 and c2.
$(\phi^{c2}_{ned}, \theta^{c2}_{ned}, \psi^{c2}_{ned})$ | (0,0,0) | Roll, pitch, yaw of the inertial attitude of the second camera system c2.
$(\phi_1, \theta_1, \psi_1)$ | (0,0,0) | Roll, pitch, yaw misalignment errors in the relative attitude between c1 and c2.
$(\phi_2, \theta_2, \psi_2)$ | (0,0,0) | Roll, pitch, yaw misalignment errors between the c (center) coordinate system and second camera system c2.
$\|a\|$ | 10 | Displacement vector magnitude in meters.
$v^{enu}$ | (0,1,0) | Direction of the displacement vector, or velocity vector.
pixel error | 1 | Measurement error of the control points in pixels.











Figure 5-36: Control point locations on focal plane for a column count of 1 and a row count of 6.


The number of rows and number of columns of control points were varied separately to show

the impact that each had on the accuracy of the attitude estimate.

Varying number of control point column count

The number of control point column counts was varied from 1 to 20 while the number of

rows was fixed at 6. The first iteration corresponds to a column count of 1 and the last iteration

corresponds to a column count of 20. The GLEMA driver file used for this part of the study was

varying_num_cp_cols.m.

The control point locations on the focal planes of both cameras for the column count of 1,

10, and 20 are shown in Figures 5-36 through 5-38. The "x" symbols indicate the designated

control point location on the first camera's focal plane, and the hexagon symbols indicate the

calculated control point locations on the second camera's focal plane. A line is drawn to connect

the corresponding control point locations in the first and second camera system. Notice that the

last column in all three of the focal plane views is missing. This is due to the control points going

out of the second camera's FOV.

The 3-D view of the scene for column counts of 1, 10, and 20 and a row count of 6 is shown

in Figures 5-39 through 5-41. The control points are marked by plus symbols. The first camera

position is marked by a square symbol. The second camera position is marked by a triangle.










Figure 5-37: Control point locations on focal plane for a column count of 10 and a row count of 6.

Figure 5-38: Control point locations on focal plane for a column count of 20 and a row count of 6.

Figure 5-39: Three-dimensional view of the scene for a column count of 1 and a row count of 6 for the varying number of control points study.

Figure 5-40: Three-dimensional view of the scene for a column count of 10 and a row count of 6 for the varying number of control points study.


Figure 5-41: Three-dimensional view of the scene for a column count of 20 and a row count of 6 for the varying number of control points study.


The results for the computed attitude standard deviations are shown in Figures 5-42 and

5-43. As expected, increasing the number of control points decreases the error values.

Varying number of control point row count

The number of control point row count was varied from 1 to 20 while the number of columns

was fixed at 6. The first iteration corresponds to a row count of 1 and the last iteration cor-

responds to a row count of 20. The GLEMA driver file used for this part of the study was

varying_num_cp_rows.m.

The control point locations on the focal planes of both cameras for the row counts of 1,

10, and 20 are shown in Figures 5-44 through 5-46. The "x" symbols indicate the designated

control point location on the first camera's focal plane, and the hexagon symbols indicate the

calculated control point locations on the second camera's focal plane. A line is drawn to connect

the corresponding control point locations in the first and second camera system. Notice that the

last column in all three of the focal plane views is missing. This is due to the control points going

out of the second camera's FOV.

The 3-D view of the scene for row counts of 1, 10, and 20 and a column count of 6 is shown

in Figures 5-47 through 5-49. The control points are marked by plus symbols. The first camera

position is marked by a square symbol. The second camera position is marked by a triangle. Note





























Figure 5-42: Varying number of control points study results for column counts 1-20 and a row count of 6 ned to c2


Figure 5-43: Varying number of control points study results for column counts 1-20 and a row count of 6 c1 to c2


Figure 5-44: Control point locations on focal plane for row count of 1 and a column count of 6.





Figure 5-45: Control point locations on focal plane for row count of 10 and a column count of 6.











Figure 5-46: Control point locations on focal plane for row count of 20 and a column count of 6.


that there are only 5 points in Figure 5-44 due to the last control point going out of the second camera's FOV.

The results for the computed attitude standard deviations are shown in Figures 5-50 and 5-51.

5.2.5 Varying Location of Last Control Point

The effect of varying the location of the last control point on the attitude estimates is shown

in this section. The input parameters used for this study are shown in Table 5-11. The GLEMA

driver file varyinglastCP_location.m was used for this part of the study.

The original 3 x 3 control point configuration is shown in Figure 5-52, with the control point that was varied designated by a hexagram. The varied control point's original location corresponds to the upper right grid location on the focal planes of Figures 5-53 and 5-54; its location was varied from the lower left hand corner to the upper right hand corner of the focal plane for those two figures. The eight stationary control points are designated by a diamond in Figures 5-53 and 5-54. For the 384 x 288 pixel array camera, that corresponds to 110,592 iterations. The inertial heading estimate error for each location is displayed in Figure 5-53.
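A sketch of how such a sweep over every pixel might be organized is given below. It is not the actual driver file; heading_error is a hypothetical stand-in for the GLEMA computation of the inertial heading estimate error with the last control point placed at a given pixel.

% Minimal sketch of sweeping the last control point over every pixel of the
% 384 x 288 array.  heading_error(col, row) is a hypothetical helper returning
% the inertial heading estimate error for that placement.
ncols = 384; nrows = 288;                 % 110,592 pixel locations in total
err = zeros(nrows, ncols);
for r = 1:nrows
    for c = 1:ncols
        err(r, c) = heading_error(c, r);
    end
end
imagesc(err); xlabel('col'); ylabel('row'); colorbar;   % compare Figure 5-53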

The terrain used for the attitude variance calculation was formed using the

























Figure 5-47: Three-dimensional view of the scene for a row count of 1 and a column count of 6 for the varying number of control points study.



Figure 5-48: Three-dimensional view of the scene for a row count of 10 and a column count of 6 for the varying number of control points study.




Figure 5-49: Three-dimensional view of the scene for a row count of 20 and a column count of 6 for the varying number of control points study.



Figure 5-50: Varying number of control points study results for row counts 2-20 and a column count of 6 (ned to c2).




Figure 5-51: Varying number of control points study results for row counts 2-20 and a column count of 6 (cl to c2).





Figure 5-52: Original focal plane view of control point locations for the varying last control point
study.






















Table 5-11: Input parameters for varying location of last control point.


Parameter               Value             Description
β                       π/3               Camera depression angle in radians.
f                       0.005             Camera focal length in meters.
ccd_pixels              384 x 288         Number of column and row pixels in the ccd array camera.
ccd_size                0.0042 x 0.0032   Size of the ccd array camera in meters.
cp cols                 3                 Number of control point columns on the ccd array camera.
cp rows                 3                 Number of control point rows on the ccd array camera.
(φ, θ, ψ) cl to c2      (0,0,0)           Roll, pitch, yaw of the relative attitude between cl and c2.
(φ, θ, ψ) ned to c2     (0,0,0)           Roll, pitch, yaw of the inertial attitude of the second
                                          camera system c2.
(δφ, δθ, δψ) cl to c2   (0,0,0)           Roll, pitch, yaw misalignment errors in the relative
                                          attitude between cl and c2.
(δφ, δθ, δψ) c to c2    (0,0,0)           Roll, pitch, yaw misalignment errors between the c (center)
                                          coordinate system and second camera system c2.
||a||                   10                Displacement vector magnitude in meters.
Venu                                      Direction of the displacement vector, or velocity vector.
pixel error             1                 Measurement error of the control points in pixels.






















Figure 5-53: Inertial heading estimate errors for varying last control point study.


computeLastCPRange_better.m m-file from the GLEMA program. It used all the ranges of the

designated control points within a 200-pixel threshold to interpolate the range at each pixel

location on the focal plane. The corresponding terrain from each calculation is shown mapped

onto the focal plane in Figure 5-54.
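The interpolation step can be pictured with a short sketch. The helper below uses inverse-distance weighting over the control points within the 200-pixel threshold; this particular weighting scheme is an assumption made for illustration and is not necessarily what computeLastCPRange_better.m implements.

% Minimal sketch of interpolating the terrain range at a focal plane pixel from
% the designated control point ranges within a 200-pixel threshold.
% cp_col, cp_row, cp_range: vectors of control point pixel locations and ranges.
function rng = interp_range(col, row, cp_col, cp_row, cp_range)
    threshold = 200;                              % pixels
    d   = hypot(cp_col - col, cp_row - row);      % distance to each control point
    use = d <= threshold;                         % keep nearby control points only
    w   = 1 ./ max(d(use), eps);                  % inverse-distance weights
    rng = sum(w .* cp_range(use)) / sum(w);       % weighted average of ranges
end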

5.2.6 Varying Control Point Measurement Error

The effect of varying the control point's measurement error on the attitude estimates is

shown in this section. The input parameters used for this study are shown in Table 5-12. The

GLEMA driver file used in this study was varying_pixel_error.m.

The measurement error of the control points was varied from 1.0 to 2.99 pixels. The first

iteration corresponds to a measurement error of 1.0 pixels and the last iteration corresponds to a

measurement error of 2.99 pixels.

The control point locations on the focal planes of both cameras were not varied for this study

and are shown in Figure 5-55. The "x" symbols indicate the designated control point location

on the first camera's focal plane, and the hexagon symbols indicate the calculated control point

locations on the second camera's focal plane. A line is drawn to connect the corresponding control

point locations in the first and second camera systems.






















Figure 5-54: The calculated terrain mapped onto the focal plane for varying last control point
study.


Figure 5-55: Control point locations on focal plane for the varying pixel error study.























Table 5-12: Input parameters for varying control point measurement error.


Parameter               Value             Description
β                       π/3               Camera depression angle in radians.
f                       0.005             Camera focal length in meters.
ccd_pixels              384 x 288         Number of column and row pixels in the ccd array camera.
ccd_size                0.0042 x 0.0032   Size of the ccd array camera in meters.
cp cols                 5                 Number of control point columns on the ccd array camera.
cp rows                 7                 Number of control point rows on the ccd array camera.
(φ, θ, ψ) cl to c2      (0,0,0)           Roll, pitch, yaw of the relative attitude between cl and c2.
(φ, θ, ψ) ned to c2     (0,0,0)           Roll, pitch, yaw of the inertial attitude of the second
                                          camera system c2.
(δφ, δθ, δψ) cl to c2   (0,0,0)           Roll, pitch, yaw misalignment errors in the relative
                                          attitude between cl and c2.
(δφ, δθ, δψ) c to c2    (0,0,0)           Roll, pitch, yaw misalignment errors between the c (center)
                                          coordinate system and second camera system c2.
||a||                   10                Displacement vector magnitude in meters.
Venu                    (0,1,0)           Direction of the displacement vector, or velocity vector.
pixel error             varied            Measurement error of the control points in pixels.



















Figure 5-56: Three-dimensional view of the scene for the varying control point measurement error study.

The 3-D view of the scene also did not change for this study; it is shown in Figure 5-56.

The control points are marked by plus symbols. The first camera position is marked by a square

symbol. The second camera position is marked by a triangle.

The results for the computed attitude standard deviations are shown in Figures 5-57 and

5-58. The attitude errors are shown to be directly proportional to the measurement errors as

expected.


Figure 5-57: Varying control point measurement error results for 1.0 to 2.99 pixels (ned to c2).


Figure 5-58: Varying control point measurement error results for 1.0 to 2.99 pixels (cl to c2).

















CHAPTER 6
CONCLUSIONS

It has been shown in this thesis that a "free" attitude estimate is available for (1) a vehicle

that contains a downward-looking vision system and a GPS receiver capable of differential

carrier-phase positioning, and (2) two or more vehicles with overlapping FOVs and GPS

receivers capable of differential carrier-phase positioning. SfM techniques on two images with

point correspondences produce a displacement vector between the camera locations in the body

coordinate system. GPS differential carrier-phase positioning produces the displacement vector

between the two cameras in an inertial reference frame. Once two vectors are found that are

expressed in both coordinate systems, the rotation between the two coordinate systems can be

found. This corresponds to the vehicle attitude.
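The SVD solution referred to above can be sketched in a few lines of MATLAB. The fragment below is illustrative only and is not the GLEMA implementation: it builds two unit vectors known in both the body frame and the inertial frame from an assumed true rotation, then recovers that rotation with the SVD.

% Minimal sketch of the SVD attitude solution: recover the rotation taking
% body-frame vectors into the inertial frame.  All values and names are
% illustrative assumptions.
yaw    = 0.3;                                        % example true rotation (rad)
R_true = [cos(yaw) -sin(yaw) 0; sin(yaw) cos(yaw) 0; 0 0 1];

B = [1 0; 0 1; 0.2 -0.1];                            % two vectors in the body frame
B = B ./ vecnorm(B);                                 % normalize each column
N = R_true * B;                                      % same vectors in the inertial frame

H = B * N';                                          % attitude profile matrix
[U, ~, V] = svd(H);
R_est = V * diag([1 1 det(V*U')]) * U';              % proper rotation, R_est*B ~ N

disp(norm(R_est - R_true))                           % ~0 for noise-free vectors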

This thesis began with a literature review of the SfM field and detailed how to set up the

reconstruction problem using Euclidean geometry techniques. Background in GPS technology

showed the accuracy and ambiguity differences between positioning with code measurements and

positioning with carrier-phase measurements. Also, it was shown that when using carrier-phase

measurements for positioning it is less ambiguous and more accurate to compute differential

positions between two receivers than the absolute position of each receiver. Two solution

approaches for finding an attitude estimate were presented: (1) an SVD approach and (2) a

descent approach. Finally, a linearized model of the system was developed in which a detailed

error analysis was performed. The key contribution that this thesis presents is the detailed error

analysis of the attitude estimate.

The error analysis showed how the accuracy of the attitude estimate depends on the geome-

try of the scene, camera parameters, displacement between camera locations, number and location

of control points specified on the focal plane, and measurement error associated with specifying

the control points. The accuracy of the attitude estimate was shown to be on the order of tens of

mrad.

6.1 Optimal Camera Parameters for Heading Estimation

Table 6-1 shows the optimal values for computing the inertial heading estimate based on

the error analysis parameter study of chapter 5. Based on these findings the following values











were chosen for the optimum heading estimate: 20° FOV, 20° camera depression angle, ten meter

camera displacement vector magnitude, six columns by ten rows of control points, a measurement

error of one pixel, and a 384 x 288 ccd array camera1 with a 0.0042 x 0.0032 meter focal plane.

Inputting this configuration into the GLEMA Matlab program resulted in a heading estimate

accuracy of 8.6836 mrad or approximately half a degree. The driver file used to generate this

result was optimal_heading_estimate.m.

Table 6-1: Optimal parameter values for heading estimate.

Optimal Parameter Value        Description/Notes

FOV < 20 or FOV > 70           Camera FOV in degrees (note that for the pitch
                               attitude estimates larger FOV values are desired).

β < 20 or β > 160              Camera depression angle in degrees.

||a|| > 10                     Displacement vector magnitude in meters.

cp cols > 6                    Number of control point columns on the ccd array camera.

cp rows > 10                   Number of control point rows on the ccd array camera.

pixel error < 2                Measurement error of the control points in pixels.



6.2 Worst Camera Parameter Values for Attitude Estimation

The error analysis presented in chapter 5 also showed that there are worst case values for

the parameters in regards to formulating the "free" attitude estimate. A focal length of 45

produced the largest error in the heading estimate. A camera depression angle of 90° (pointing straight down) produced the largest error in the heading estimate, and a value of approximately 65° produced the largest pitch error value. A displacement vector magnitude of less than 5

meters produced errors in both pitch and heading estimates greater than 100 mrad and increasing

exponentially. Finally, less than 12 control points also produced significant errors in both the

heading and pitch estimates. Inputting the configuration for the worst heading estimate into the

GLEMA Matlab program resulted in an accuracy of 405.8933 mrad or approximately 23°. The

driver file used to generate this result was worst_heading_estimate.m.



1 The ccd array size and number of pixels were not part of the parameter study. If a finer resolu-
tion on the focal plane can be achieved (i.e., increased number of pixels per focal plane area), then
that would obviously decrease the magnitude of the heading estimation error.











6.3 Future Areas to Explore

There are several areas throughout this thesis that were noted as future areas to explore.

Those areas, as well as some others not previously mentioned, are listed below.

- Implementation and characterization (i.e., via Monte Carlo analysis) of an algorithm for the
  SVD solution.

- Implementation and characterization of the first and second order coplanarity constraint
  based descent algorithms.

- Development, implementation, and characterization of a descent algorithm using the
  generalized linearized error model presented in chapter 5.

- Incorporation of range information into the error analysis from laser radar or synthetic
  aperture radar (SAR) visioning systems.

- Experimental verification of error analysis results.

- Expansion of application to tactical far target identification devices.

- Derivation of a fundamental equation that expresses the accuracy of the attitude estimate as a
  function of the number of control points (i.e., is there a theoretical limit at which adding
  more control points has no positive effect on the accuracy of the attitude estimate).


















APPENDIX
CENTER (c) COORDINATE FRAME DISCUSSION

We need a transformation that maps a^LL to the x-axis of the c coordinate frame, and keeps

the z-axis of the c coordinate frame as close to the z-axis of the LL coordinate frame as possible.

Assume this form:

\[
\begin{bmatrix} a \\ 0 \\ 0 \end{bmatrix}
=
\begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix}
\begin{bmatrix} a_x \\ a_y \\ a_z \end{bmatrix}
= R_{LL}^{c}\, a^{LL}
\tag{A-1}
\]

where a = ||a||.

The row vectors r_1, r_2, and r_3 are the unit vectors of the axes of the c coordinate frame expressed in the LL coordinate frame. It immediately follows that

\[
r_1 = \frac{1}{\|a\|}\begin{bmatrix} a_x & a_y & a_z \end{bmatrix}.
\tag{A-2}
\]

The requirement that the z-axis of the c coordinate frame be as close as possible to the z-axis of

the LL coordinate frame is satisfied if the y-axis of the c coordinate frame lies in the xy-plane of

the LL coordinate frame. It immediately follows that r_{23} = 0.

For r_2, we need a vector in the xy-plane of the LL coordinate frame that is orthogonal to a. There are two possibilities: [-a_y, a_x, 0] and [a_y, -a_x, 0]. The first is the correct choice, since a^LL = [a 0 0]^T should lead to R_LL^c = I. Therefore,

\[
r_2 = \frac{1}{\sqrt{a_x^2 + a_y^2}}\begin{bmatrix} -a_y & a_x & 0 \end{bmatrix}.
\tag{A-3}
\]


Finally, r_3 = r_1 x r_2, so

\[
r_3 = \frac{1}{\|a\|\sqrt{a_x^2 + a_y^2}}\begin{bmatrix} -a_x a_z & -a_y a_z & a_x^2 + a_y^2 \end{bmatrix}.
\tag{A-4}
\]


Putting it all together:

\[
R_{LL}^{c} =
\begin{bmatrix}
\dfrac{a_x}{\|a\|} & \dfrac{a_y}{\|a\|} & \dfrac{a_z}{\|a\|} \\[1.5ex]
\dfrac{-a_y}{\sqrt{a_x^2 + a_y^2}} & \dfrac{a_x}{\sqrt{a_x^2 + a_y^2}} & 0 \\[1.5ex]
\dfrac{-a_x a_z}{\|a\|\sqrt{a_x^2 + a_y^2}} & \dfrac{-a_y a_z}{\|a\|\sqrt{a_x^2 + a_y^2}} & \dfrac{\sqrt{a_x^2 + a_y^2}}{\|a\|}
\end{bmatrix}.
\tag{A-5}
\]
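As a quick numerical check of this construction, the following MATLAB fragment builds R_LL^c from an arbitrary displacement vector and confirms that it is orthonormal and maps a^LL onto the x-axis. The numerical values are illustrative; this is a verification sketch, not part of the GLEMA code.

% Minimal sketch verifying the construction in (A-1)-(A-5).
a   = [3; -2; 1];                           % a^LL, expressed in the LL frame
na  = norm(a);
nxy = hypot(a(1), a(2));

r1 = a' / na;                                            % maps a to the x-axis
r2 = [-a(2)  a(1)  0] / nxy;                             % y-axis kept in the LL xy-plane
r3 = [-a(1)*a(3)  -a(2)*a(3)  a(1)^2+a(2)^2] / (na*nxy); % r1 x r2

R_LL_c = [r1; r2; r3];

disp(R_LL_c * a)                            % should print [||a||; 0; 0]
disp(R_LL_c * R_LL_c')                      % should print the 3 x 3 identity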