
Autopilot Development for a Micro Air Vehicle Using Vision-Based Attitude Estimation

51640677a4eeddc204bb477beae64fd2
fe0c3a7686a4a43379a5ae57669cb3937b6caa76
4756 F20110116_AAAAJY kehoe_j_Page_69thm.jpg
4cace873e3793f938df2c4bc179eb5ba
50799bd95ec8a4b5e2bee640a4f42c009fb0a414
1991 F20110116_AAAAHA kehoe_j_Page_68.txt
05602d770afa9ce7165bd4c59d6c8703
ef4b98b905c43cf438503b190e273104629a61a6
54986 F20110115_AADVUV kehoe_j_Page_60.jpg
33ee15cee75cd9d68f91593b3e4e8e77
cfa0cc11a71a8d331220bc2491c4ffc4b0e578f9
56353 F20110116_AAAACD kehoe_j_Page_17.pro
95ec096263486207ee20df452b99a35c
265790f115ffb00db23b2e50fc917402fa76bb03
F20110115_AADVZS kehoe_j_Page_35.tif
f714be14b62aa969c39422d901aa4035
571c2f6800137d0f8499a1df856f7569b0a65d9a
23928 F20110116_AAAAJZ kehoe_j_Page_32.QC.jpg
74174b0efda9b7646efe7a6480a87e35
ab0fb4442fa42163ebb374127a221d41eee2b60c
1466 F20110116_AAAAHB kehoe_j_Page_69.txt
fc756721a7f3916cc1f8b6be96d18843
425ade82df386ae6584580e5f1fa87b79fc6dd6e
62713 F20110115_AADVUW kehoe_j_Page_61.jpg
d2fb183e56fcc592701be91c6b60589b
62b9436db1deef6aa6684b44e0cf8a9aa179367b
3662 F20110116_AAAACE kehoe_j_Page_18.pro
e570f57cfb1292bb6a5ef83e7988d82f
aa76542ef0daf52c8df071a890afa3a0d4cd99b7
F20110115_AADVZT kehoe_j_Page_36.tif
ff7d96609ec68745da86a5cd2361442a
15b371fdcc7f22de4628617efb62b91c26b9b85c
2058 F20110116_AAAAHC kehoe_j_Page_70.txt
42e0c118810415351adc0a2bd72e1637
316c3c0c574518d65ebd1baf1fa8c827faac0c07
68674 F20110115_AADVUX kehoe_j_Page_62.jpg
8d62c23f1c23443987f6fadf38ce3cca
7805980e63296410a627d27d6fdadf490229d139
47721 F20110116_AAAACF kehoe_j_Page_19.pro
93ec25c9b786ed9cdda861d239385328
8c55e798d5d1e0df6c9aad8eeab053793c384f3d
F20110115_AADVZU kehoe_j_Page_37.tif
9c00202077a65f8cb096f61294f0f504
409aa8f21c450ddf4ee71fb85b7882d927ad34a3
9242 F20110116_AAAAMA kehoe_j_Page_11.QC.jpg
42c3065b4a178d8e1ddd78b3eea9f6d2
8975b7898f9cdbd4d8844dbef264604d8b06942f
1671 F20110116_AAAAHD kehoe_j_Page_71.txt
589b4d28f70fddcd083b8722e986a7ae
3b71112f50c23f6d51c541b3100570dab997b08f
F20110115_AADVSA kehoe_j_Page_72.tif
4be243a457c85bcbbf1298dd751f7b55
35789b868834682cf63c97f1793be98f2cde7f99
51449 F20110115_AADVUY kehoe_j_Page_63.jpg
54e8efd6677cc2530e3845aa0b78efe5
f4ef5448d845baa543b7c6b47baf89d649b8f0db
32600 F20110116_AAAACG kehoe_j_Page_20.pro
f1c2de603cee03fec16c2734e593228e
38e36850f3dd8ff1435a43b759c4417e82b8a35f
F20110115_AADVZV kehoe_j_Page_38.tif
158dd633a2abfe7dd815aeb98e5188c1
f308dd765c2d1d89dc65841583f95532b1dfc9c3
120639 F20110116_AAAAMB UFE0008943_00001.xml
d8a28aedcdc654ced35b5af0d245c467
13fd9821c5c03e3faae65076a34e6289ffd9339b
2311 F20110116_AAAAHE kehoe_j_Page_72.txt
57d0bc39130cc15efc3b9d1a6584eafa
df388bffae146cdeed04a53cc40e3092f7efdc70
1051976 F20110115_AADVSB kehoe_j_Page_55.jp2
1f423d0556a912f505d27d0233a2bbdd
71cdc51fb609fa3739b37c2be7951218db5e7933
70402 F20110115_AADVUZ kehoe_j_Page_64.jpg
0c942d31399c66e4238890540f599831
7316d8b726542b3c2538f04b3f98f2d53881552c
34978 F20110116_AAAACH kehoe_j_Page_21.pro
92ecd3d3313db92b4e23b02a0ec47311
9578e15a8ef9ef7ee270fbac750ac3af2cd33f11
3621 F20110116_AAAAMC kehoe_j_Page_02.QC.jpg
5c841b3e85ed8a8ad02d932e394f6eaf
aa550b41b95d236c2469d127c6f627ab67f37b3f
665 F20110116_AAAAHF kehoe_j_Page_73.txt
58417b24acd015dcc9d5cb44dce77249
eb7b68974388aca47f2a64132a1987e67cfedb5c
19331 F20110115_AADVSC kehoe_j_Page_58.QC.jpg
2204c4ed2c9f79c684fe655c0305c5ef
e8cb22ac7478767e6751dc8afbb690383b9ded12
48014 F20110116_AAAACI kehoe_j_Page_22.pro
bf0e5fdadb2f4279ebbfdcd299f45dcc
d7a851fd297f3bfd7f43f5b2a284df2f65abf2d5
19822 F20110116_AAAAMD kehoe_j_Page_06.QC.jpg
f5087c41fe2132362cf059288da7fd98
039aedc5e75364df3690e9c94c4878593811b605
1031 F20110116_AAAAHG kehoe_j_Page_74.txt
de0515d7bcf24400af76852ab3c90f60
ecb920393f8cbb2b64de5c8ac761d77f85223aae
28094 F20110115_AADVSD kehoe_j_Page_78.QC.jpg
2db7c377d01440665e383cba60024484
10068af69dc841573d454b9ee852fc0427ce2d5c
54010 F20110116_AAAACJ kehoe_j_Page_23.pro
4779c208cfee39dc8118b7651007af71
705e917b65969624e16fa4cb89dcd924284a40de
287577 F20110115_AADVXA kehoe_j_Page_40.jp2
ca1728e8c2c5f1812f299b2ab2c13210
b3def57915ec3ab9a701620d71bd521e5e1fb19d
4957 F20110116_AAAAME kehoe_j_Page_08thm.jpg
b9caa4376e7854bfa92b163c4fa79c8f
b6f720ba40c50c159d92e4b32a8b211ee3c412e0
2236 F20110116_AAAAHH kehoe_j_Page_75.txt
dcf2d119b29183f6019d7016557199fd
9391fd1d4cd6d640d0f75f106da2b632c9000dda
19335 F20110115_AADVSE kehoe_j_Page_45.QC.jpg
47c144033c856e27902af4f403590640
a94273fd854111cbd8a0f79125d1b8d3626db559
40082 F20110116_AAAACK kehoe_j_Page_24.pro
5acd05ee0627ec09ca593fa41531dff1
702da636862ad2e71e1a6e881af84a165d695e8d
661873 F20110115_AADVXB kehoe_j_Page_41.jp2
215c896eef085749f5a253f25b869633
1f2e3ea4cb6a765e47e197109aace897db2ee06c
26758 F20110116_AAAAMF kehoe_j_Page_10.QC.jpg
902ea7a81edd9442e7e5ef8e6165a767
b68a08222a5121c2453f4724b8225949fc54aa4f
2868 F20110116_AAAAHI kehoe_j_Page_76.txt
51a8e19439bf8febf51bc602c1740246
d053b0fb209347f56cec87369c5dd8c5a95c11e8
29088 F20110115_AADVSF kehoe_j_Page_77.QC.jpg
268f6410937c9fc27840e4e5da518cad
d987bc3f345cb93543bad6e28a5505f48b967c56
41107 F20110116_AAAACL kehoe_j_Page_25.pro
df1785b4325395d73378c2e60d638f93
1c7235db39e07d19a13f9d192808fdf56188a157
F20110115_AADVXC kehoe_j_Page_42.jp2
fef086007d4ef1e1748978facdc0df91
ef1a6909f8b0e5ae8365de9fcd5a642334ae7b8d
2729 F20110116_AAAAMG kehoe_j_Page_11thm.jpg
63d35228a88fd315b1a1dbfbf20d1151
a4bc708f8bf9adcf1713f6a8dc5c3c748649e47a
2743 F20110116_AAAAHJ kehoe_j_Page_77.txt
1efd801cc25596a2c45f0dd201ae32e6
61304e518b87e80835a7ad27489b39abe9d70a5f
38512 F20110116_AAAACM kehoe_j_Page_26.pro
f7be29bd48ec2da5e7b9ce70cb2c70bf
f778b51918aff3a4027432748bbccffd5e8fea0b
842092 F20110115_AADVXD kehoe_j_Page_45.jp2
77fca178ea4cf9103301552fbef61949
978ec5d001afd01a369b10faa518e047669d6db0
89878 F20110115_AADVSG kehoe_j_Page_43.jp2
fd714a6a138e401b10469a6b73252b45
0f895d6b7f9c0fa11a8123840a2c14630ec8c801



PAGE 1

AUTOPILOT DEVELOPMENT FOR A MICRO AIR VEHICLE USING VISION-BASED ATTITUDE ESTIMATION

By

JOSEPH JOHN KEHOE

A THESIS PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE

UNIVERSITY OF FLORIDA

2004

Dedicated to Sarah for her patience, understanding, caring, and Love.

ACKNOWLEDGMENTS

I would like to acknowledge AFRL-MN and the management at Eglin Air Force Base, because none of this work would have been possible without their continued financial support. I would also like to extend thanks to Dr. Peter Ifju and his students Kyu Ho Lee and Sewoong Jung, along with the entire MAV Lab, for the design, development, and fabrication of the aircraft. Heartfelt thanks go out to the researchers at the Machine Intelligence Laboratory, especially Dr. Michael Nechyba, Jason Grzywna, and Jason Plew, for the development of the electronics hardware, the system software, and the horizon detection algorithm, and most importantly for their generous help in troubleshooting the various quirks that cropped up throughout the process. This effort also would not have been possible without the help of the researchers in the Flight Controls Lab, including Ryan Causey for laying the groundwork and feeling out many of the issues, Kristin Fitzpatrick and Jason Jackowski for providing field support, and Mujahid Abdulrahim for his piloting skills, observations on various topics, and never-waning desire to go "AVCAAFing." I would like to express deep and sincere thanks to Dr. Rick Lind for his time, guidance, support, and motivation. My parents, Joseph and Linda Kehoe, deserve much credit for keeping me pointed in the right direction over the years. Lastly, and certainly not least, I would like to thank Kenny Boothe for locating the airplane one hot afternoon in May and, well, just for being Kenny B.

TABLE OF CONTENTS

ACKNOWLEDGMENTS
LIST OF FIGURES
ABSTRACT

CHAPTER

1 INTRODUCTION
  1.1 Motivation
  1.2 Overview
2 MICRO AIR VEHICLES
  2.1 Background
    2.1.1 Characterization and Applications
    2.1.2 Current Technology
  2.2 Control Issues
3 VISION-BASED ATTITUDE ESTIMATION
  3.1 Image-Based Features
  3.2 Vision for Flight Control
  3.3 Horizon Detection
    3.3.1 Interpretation
    3.3.2 Algorithm
  3.4 State Estimation
    3.4.1 Roll
    3.4.2 Pitch
  3.5 Vision Issues
4 INNER-LOOP CONTROL
  4.1 Lateral Stability Augmentation
    4.1.1 Methods
    4.1.2 Control Architecture
  4.2 Longitudinal Stability Augmentation System
    4.2.1 Methods
    4.2.2 Control Architecture
5 GUIDANCE AND NAVIGATION
  5.1 Directional Controller
    5.1.1 Methods
    5.1.2 Control Architecture
    5.1.3 State Estimation
  5.2 Altitude Control
6 FLIGHT DEMONSTRATION
  6.1 Aircraft
  6.2 Controller
  6.3 Tracking
  6.4 Waypoint Navigation
  6.5 Controller Performance Analysis
7 FUTURE DIRECTIONS
8 CONCLUSION

REFERENCES
BIOGRAPHICAL SKETCH

LIST OF FIGURES

2–1 Micro Air Vehicles
2–2 Various MAVs studied at the University of Florida
2–3 A Biologically Inspired Control Approach
3–1 Mapping a Physical Feature to the Image Plane
3–2 MAV Kinematics
3–3 Estimated Horizon shown Relative to Physical Horizon and Local Horizontal
3–4 Vision Estimated Horizon
3–5 Error in Horizon Estimation
3–6 Effect of Roll on Pitch Percentage
3–7 Lens Curvature for a Horizon
4–1 Lateral Stability Augmentation System
4–2 Longitudinal Stability Augmentation System
5–1 Backwards Difference Relationship for Heading Calculation
5–2 Closed-Loop Directional Control System
5–3 Measured heading significantly lags actual heading at each GPS datapoint
5–4 Vector Diagram of GPS Sensor Measurement
5–5 Heading/Position Estimate Relationship
5–6 Closed-Loop Directional Control System
5–7 Closed-Loop Altitude Control System
6–1 MAV
6–2 MAV Pusher-Prop Configuration
6–3 Mounting Configuration of On-Board Electronics
6–4 Approximation of the Relationship Between Turn Rate and Bank Angle
6–5 Selection of MAV Velocity Data for Heading Estimation
6–6 Waypoint Threshold
6–7 Lateral Inner-Loop Performance
6–8 Longitudinal Inner-Loop Performance
6–9 Waypoint Navigation with Kψ = 0.4
6–10 Waypoint Navigation with Kψ = 1.0
6–11 Waypoint Navigation with Kψ = 1.0 and Heading Estimator
6–12 Comparison of Waypoint Navigation Flight Paths
6–13 Wind relative to Flight Path
6–14 Roll Angle
6–15 Altimeter
6–16 Pitch Percentage
6–17 Altimeter Relationship to Heading
6–18 Longitudinal Control Toggle
7–1 Technology Roadmap for Vision-Based Flight Control

Abstract of Thesis Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Master of Science

AUTOPILOT DEVELOPMENT FOR A MICRO AIR VEHICLE USING VISION-BASED ATTITUDE ESTIMATION

By

Joseph John Kehoe

December 2004

Chair: Richard C. Lind, Jr.
Major Department: Mechanical and Aerospace Engineering

Missions envisioned for micro air vehicles may require a high degree of autonomy to operate in unknown environments. These missions require advanced sensor technology for control tasks such as obstacle detection and avoidance. A video camera provides a small, lightweight, and inexpensive alternative to infrared and ultrasonic sensors. As such, vision is a critical technology for mission capability.

This thesis discusses an autopilot that uses vision coupled with GPS and altitude sensors for waypoint navigation. The vision processing analyzes a horizon to estimate roll and pitch information. The GPS and altitude sensors are then analyzed to command values of roll and pitch for navigation between waypoints. A flight test of a MAV using this autopilot demonstrates that the resulting closed-loop system is able to autonomously reach several waypoints. Several issues, such as estimating heading to account for slow updates, are investigated to increase performance.

CHAPTER 1
INTRODUCTION

1.1 Motivation

A variety of mission scenarios are envisioned for which a micro air vehicle (MAV) would be particularly attractive. A MAV would be ideal to fly around a city looking for biological agents. A MAV would be ideal as a hand-launched platform to surveil over a ridge of a battlefield. A MAV would also be ideal for monitoring and tracking targets in an urban environment. These mission scenarios require the vehicle to operate outside the visual range of a ground operator; consequently, autonomy is a crucial technology for enabling mission capability.

Many organizations around the world have MAV programs, which are resulting in a wide range of flying aircraft. Autopilots have even been developed for many vehicles, which allow basic autonomy. These autopilots are able to stabilize the vehicle and traverse along a flight path determined by a set of waypoints. The majority of these autopilots have not yet been documented in archival literature; however, several case studies can be found on the internet.

The use of vision is being explored to provide sensing for an autopilot on a MAV. Vision becomes a critical technology when considering mission profiles, such as urban environments, for which a MAV is ideally suited. The existing autopilots almost exclusively rely on gyros and accelerometers along with GPS and altimeters for controlling the vehicle. In actuality, vision may be a primary sensor needed for determining flight path, and potentially even attitude, during highly agile and aggressive maneuvering to explore within a city.

Vision has been used for several control purposes. Many of the studies performed for vision-based vehicle control have involved autonomous ground vehicles using vision sensors for road following [8, 11, 20, 23, 27, 41]. Several papers have used similar techniques for flight vehicles such as dirigibles and helicopters [7, 31, 33]. Two things are common to the vehicles in many of these studies: (1) they all have the capability to maintain a fixed position for indefinite periods of time to process information, and (2) they all use vision for outer-loop tracking purposes. Limited studies make use of vision for control of fixed-wing aircraft [9, 10, 12, 14]. The same is true for the use of vision for inner-loop vehicle stabilization [9, 12, 24].

1.2 Overview

This thesis discusses a method of waypoint navigation that uses vision-based attitude estimation. The control methodology builds upon a previously developed vision system. This system uses a statistical technique to determine an optimal estimate of the horizon. The relative orientation of the estimated horizon is then used to estimate body-axis roll angle and an indication of longitudinal attitude denoted pitch percentage. These estimates are used by an inner-loop controller for stability and tracking. An outer-loop controller determines a set of roll and pitch commands, which are tracked by the inner-loop controller to proceed along a pre-determined set of waypoints.

A flight test is performed to demonstrate the methodology and performance of the resulting controller. The MAV is equipped only with a single GPS receiver, an altimeter, and a forward-pointing camera embedded in the nose of the aircraft. As such, the vehicle must rely entirely on the vision system for estimating attitude. The lack of gyros and accelerometers may seem unrealistic and limits the controller to flight in areas where the horizon is obvious; however, the test is indicative of a controller which uses a feature, such as the horizon, for feedback.

Several issues related to vision-based control are discussed in the context of the flight test. One issue is related to the coarse resolution in the measurement of vision-estimated roll angle. Also, the issue of interpreting vision-based parameters as physical information is highlighted. This issue becomes particularly important to the relationship between pitch percentage and aircraft pitch attitude.

Issues presented by the MAV performance characteristics are also discussed in the context of the flight test. Specifically, it is seen that the integration of a 1 Hz GPS system with the significantly faster dynamics of the aircraft requires the estimation of some parameters for effective waypoint navigation. Also, the highly variable flight conditions of an aircraft on the MAV scale are seen to affect controller performance.

CHAPTER 2
MICRO AIR VEHICLES

2.1 Background

2.1.1 Characterization and Applications

Micro air vehicles, or MAVs, are typically defined as a class of aircraft having a wingspan of less than 6 inches and capable of operating at airspeeds of 25 mph or less [18]. This definition is not strictly applied, as others have been used in the literature for various purposes [9]. The general concept is that the class of MAVs consists of small, expendable aircraft that are practical for mission scenarios ill-suited to conventional aircraft.

MAVs have been seen to exhibit highly agile and maneuverable behavior in flight [1, 2, 12, 43]. These abilities likely result from the fact that the small, lightweight vehicles possess small moments of inertia while exhibiting a disproportionately large amount of control authority. These characteristics result in interesting flight performance properties, including the ability to change direction drastically and almost instantaneously, evidence of non-traditional dynamic modes, and high-bandwidth response characteristics. Such responsive behavior can cause difficulty for a human pilot, but also displays potential utility for possible mission scenarios.

The small size and highly agile dynamics of MAVs stand to benefit a variety of applications. When coupled with an advanced control system, autonomous flight in an urban environment becomes a distinct possibility. Such capability could prove invaluable for surveillance and tracking missions in either a military or law-enforcement setting. For example, a MAV could be used for immediate bomb damage assessment if deployed with/from the ordnance. Perhaps a soldier or police officer equipped with an appropriately instrumented MAV capable of hand launch could extend awareness of the ambient surroundings to a wider proximity, including areas dense with obstacles and obstructions. In addition, the relatively insignificant presence of the MAV provides a stealthy edge to such surveillance missions.

Similarly, the use of sophisticated sensor packages could transform a MAV into a very useful mapping tool. Unknown battlefield environments could be efficiently traversed and documented for strategic advantage. In the event of a radiological, biological, or chemical disaster, an autonomous MAV could identify and map affected regions. These mission scenarios all take advantage of the unique flight regime of the MAV for increased mission efficacy and safety of the parties involved.

Also, the size, structural properties, and general expendability make MAVs a useful testbed for various research applications. Aerodynamic forces and moments applied to a MAV in flight are small in magnitude relative to larger vehicles, thus allowing lightweight and flexible materials to be used for construction. This fact has allowed researchers in the field of wing morphing for flight control to sidestep the complicated issues of materials selection and control actuation. MAV structural properties allow simplistic morphing actuation mechanisms to be applied so that control issues can be directly addressed with in-flight demonstration [1, 2, 3, 15]. This thesis in fact takes advantage of MAV durability and expendability to examine vision-based flight control strategies. Crash survivability and repeatability play an enormous role in experimental control design.

2.1.2 Current Technology

In recent years, many strides have been made to advance the technologies required for more effective MAVs. The use of advanced materials has allowed increasingly lightweight airframe construction, such that smaller vehicles have seen increasing payload capacity. Likewise, the miniaturization of electronics and communications hardware has decreased the payload requirements for instrumentation and actuator components. The result of these advances is smaller vehicles that can carry a substantial payload, and therefore display the potential for autonomy at a level useful for a variety of mission applications.

One of the earliest examples of a MAV that exhibited autonomous mission capability was the Black Widow, developed by AeroVironment under funding from DARPA [16]. The Black Widow had a wingspan of 6 inches and weighed about 100 grams. The configuration, shown in Figure 2–1(a), was a lifting-body design constructed using expanded polystyrene (EPS) foam, balsa wood, and Kevlar. This extremely small aircraft was able to carry an on-board electronics package that included a custom-made video camera system, a 16-channel data logger, and an autopilot system. The autopilot performed altitude, airspeed, and heading hold operations and also included a yaw damper. The electrically powered vehicle was shown to reach airspeeds of about 30 mph and exhibited an endurance of about 30 minutes.

Another early MAV design example is the Trochoid, depicted in Figure 2–1(b). The Trochoid was developed by Steve Morris of MLB Company using Multidisciplinary Design Optimization (MDO) techniques, and was 6.9 inches in maximum dimension. A 7.9 inch version of the Trochoid boasted an endurance of up to 20 minutes, flight speeds ranging from 10-60 mph, and the ability to perform turns with radii as small as 15 feet. The vehicle was also equipped with a video downlink and a stability augmentation system.

Figure 2–1: Micro Air Vehicles. (a) The Black Widow [16]. (b) The Trochoid [28].

A great deal of work has been done at the University of Florida in the area of MAV development. Dr. Peter Ifju has been working with flexible-wing designs ranging in wingspan from as small as 4 inches to as large as 24 inches, as seen in Figure 2–2. A flexible membrane wing has been seen to aid in the rejection of wind gusts, a problem which plagues vehicles that operate in the extremely low Reynolds number flight regime common to MAV designs [18]. Dr. Ifju's MAVs are typically constructed from carbon fiber and are powered by an electric motor. In the 2004 Micro Air Vehicle Competition, which was held in Tucson, Arizona, Dr. Ifju's team won awards in several categories, adding to a long list of previous citations for work in the field of MAVs.

Figure 2–2: Various MAVs studied at the University of Florida

It should be noted that MAV configurations other than fixed-wing designs are currently under investigation throughout the research community. Some take a biologically inspired approach to flight by implementing flapping-wing configurations [30, 32]. Designs in this category pose enormously difficult problems in the fluid mechanics and control categories. Another approach has realized MAVs as small lifting fans using vectored thrust and steering vanes for control [28]. Despite the many interesting problems introduced by the numerous approaches to MAV design, this thesis focuses primarily on issues related to fixed-wing MAVs.

2.2 Control Issues

The flight regime in which MAVs typically operate may require the application of some non-traditional control techniques for effective performance. Typical operating Reynolds numbers are near the bottom of the spectrum, between 80,000 and 150,000 [19]. A marked drop in lift-to-drag performance has been observed in this range. While birds have adapted their control mechanisms to this regime quite well, man has yet to master nature's secret to this engineering problem. Some have attempted to mimic biological systems, as seen in Figure 2–3 [3]. Regardless, issues are presented that are not normally dealt with in the field of flight control.

Figure 2–3: A Biologically Inspired Control Approach

Attempts have been made recently to model the aerodynamic characteristics of MAVs. Computational efforts have examined the effect of tip vortices on aerodynamic performance [42]. Vortex-flowfield interactions have been seen to play a significant role in MAV aerodynamics due to the typically low aspect ratio of the lifting surface. Several computational studies have also investigated the properties of flexible membrane wings, such as those used at the University of Florida [25, 46]. Wings in this class have been seen to alleviate some of the effects due to unsteady wind gusts. Flight characteristics have been examined from an experimental standpoint as well, with one wind tunnel study resulting in the development of a full model and simulation [4, 43, 44].

Several control challenges are presented by the aerodynamics and flight mechanics inherent to aircraft on the MAV scale. Very small mass moments of inertia contribute to diminished stability and control characteristics of the vehicle [1, 19]. Also, wind gusts and turbulence effects occur on a scale comparable to a MAV's normal operating airspeed. Suddenly, the typical controls problem of disturbance rejection becomes one more of a complete change in operating condition because of the wide variation of wind speeds over the wings. These factors can make control difficult for either a remote pilot or an automated controller. It is apparent that the problem of MAV flight control must be approached with care.

The use of innovative control effectors has been an area of study facilitated by MAVs. The use of technologies such as morphing also provides an approach to more effective MAV control. As discussed previously, an asset of MAV construction techniques is that simple morphing strategies can be easily implemented. The use of wing twisting and wing curling has been researched as an effective method for lateral control [1, 2, 15]. From a mathematical standpoint, it has been proposed that morphing strategies can be approximately treated using a special form of linear parameter-varying (LPV) control coined linear input-varying (LIV) control [5, 13]. The concept is that morphing inputs can be treated as an LPV problem where the input is the varied parameter.

The consideration of autonomy implies the need for a useful payload capacity that may be used to carry the necessary electronics and sensors. Computer vision presents itself as an attractive technology for automatic MAV control systems. Use of a small video camera as a sensor provides many benefits due to its light weight, high bandwidth, and the notion that a wide variety and quantity of information can be assimilated through the use of a single sensor. Thus a camera can be used not only for attitude estimation, but also for guidance and navigation problems such as obstacle avoidance and target tracking.

Image processing and vision-based control are enabling technologies that are quickly growing in application and capability. A simple approach has been shown to be useful for stabilization of a MAV [12]. A classification algorithm is used to detect the perceived horizon from a forward-pointing camera mounted on a MAV. Information about the attitude of the aircraft can be inferred from the position and orientation of the perceived horizon in the image. Prior efforts have used this fact to augment the stability and controllability of a MAV [12]. This thesis takes the idea a step further and proposes to use the concept of vision-based attitude estimation for augmented stability as the fundamental element of a completely autonomous system for control of a MAV.

CHAPTER 3
VISION-BASED ATTITUDE ESTIMATION

3.1 Image-Based Features

Computer vision is a valuable sensor technology in that it can provide a large quantity and variety of information about the ambient environment at a relatively high frequency with a single sensor. Image processing technologies continue to allow the extraction of increasing amounts of useful information from two-dimensional images, despite the fact that depth information is lost in the process of mapping the environment to the image plane. It is through techniques such as pattern recognition and feature extraction that image-plane information is correlated to the physical environment. This correlation enables image-based parameters to be used for control purposes relative to the physical space.

The three-dimensional physical environment is mapped to the two-dimensional image plane by projection through a camera lens. Each point on the image plane corresponds to a ray in 3D space, and as such, features of varying depth can be mapped to a single location on the image plane. Thus, depth information is lost [17]. Techniques exist to indirectly reclaim depth information; examples include the use of multiple cameras, the use of a single camera in multiple positions, and knowledge of the physical space.

The mapping of a given physical feature to the image plane can be described by a perspective camera model [9, 17]. If the camera basis is fixed at the camera lens, as depicted in Figure 3–1, the image plane is located at a distance f from the camera basis origin. The position vector of a physical feature relative to the camera basis is expressed as h. This vector determines a position in the image plane described by the coordinates (μ, ν), as related in Equations 3.1.

Figure 3–1: Mapping a Physical Feature to the Image Plane

    \mu = f \, \frac{h_1}{h_3}    (3.1a)

    \nu = f \, \frac{h_2}{h_3}    (3.1b)

3.2 Vision for Flight Control

If the camera frame can be related to a basis of some system requiring stabilization or control, feature parameters based in the image plane can be interpreted for control use. Previously, the coupled equations of motion for an arbitrary camera fixed relative to the body-fixed basis of an aircraft have been developed by Causey [9]. The formulation presented by Causey describes a detailed interrelationship between a physical feature mapped into the image plane and the aircraft dynamics. This formulation provides a foundation for flight control relative to image-based features.

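As a concrete illustration of the perspective mapping in Equations 3.1, the following minimal sketch projects a feature position, expressed in the camera basis, onto the image-plane coordinates (μ, ν). The function name, the unit choices, and the sample numbers are illustrative assumptions rather than anything specified in the thesis.

```python
def project_to_image_plane(h, f):
    """Map a feature position h = (h1, h2, h3), expressed in the camera
    basis with h3 along the optical axis, to image-plane coordinates
    (mu, nu) per Equations 3.1."""
    h1, h2, h3 = h
    if h3 <= 0.0:
        raise ValueError("feature must lie in front of the camera (h3 > 0)")
    mu = f * h1 / h3   # Equation 3.1a
    nu = f * h2 / h3   # Equation 3.1b
    return mu, nu

# Illustrative numbers: a feature 50 m ahead, offset 2 m and 1 m laterally,
# viewed through a lens with a 4 mm focal length.
print(project_to_image_plane((2.0, 1.0, 50.0), f=0.004))
```

Note how the image-plane position scales inversely with the depth component h3; this is the distance nonlinearity revisited in Section 3.5.
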
Figure 3–2 depicts the situation of a MAV in flight relative to some identified physical feature, specifically the corner of a building. Also shown is the image resulting from the mapping of the environment within the field of view to the image plane. The problem formulation holds the assumption that the vision processing technology necessary for identification, recognition, and tracking of such a feature is in place. Such technology is rapidly maturing given recent developments in the vision processing community by Soatto and others [6, 21, 22, 26, 34, 35, 36, 37, 38, 39, 45].

Figure 3–2: MAV Kinematics

The position of the aircraft relative to a locally suitable inertial reference frame is given by the vector R_c. The position of the feature of interest relative to the same inertial basis is given by x. If a camera is fixed to the aircraft at some location D relative to the origin of the body basis, it follows that the position of the feature relative to the camera can be expressed by Equation 3.2. Inspection of Figure 3–2 verifies this result.

    \vec{h} = \vec{x} - \vec{R}_c - \vec{D}    (3.2)

The orientation of the aircraft relative to the inertial frame can be described by a typical Euler transformation, which is performed through multiplication by the time-varying transformation matrix [φ][θ][ψ], where the Euler rotation angles are given by φ, θ, and ψ. Similarly, the orientation of the camera relative to the aircraft body basis is found through a transformation [φ_c][θ_c][ψ_c], where φ_c, θ_c, and ψ_c represent the rotation angles for the sequence between the body basis and the camera basis. The latter transformation may or may not be a function of time, depending on the freedom with which the camera is allowed to move relative to the body basis.

The vector h is of most use when expressed relative to the camera basis, as in Equation 3.3.

    \vec{h} = [\phi_c][\theta_c][\psi_c]\,[\phi][\theta][\psi]\,(\vec{x} - \vec{R}_c) - [\phi_c][\theta_c][\psi_c]\,\vec{D}    (3.3)

The time derivative of the above equation can be directly used to extend the conventional state-space formulation of the aircraft equations of motion to include the position of an arbitrary number of vision-identified feature points relative to the aircraft. Given that the angular velocity of the body basis relative to the inertial frame is ᴱω^B and that the angular velocity of the camera basis relative to the body basis is ᴮω^C, the additional state equations for the new system are given by Equation 3.4. If the position and velocity of a given feature in the image plane can be determined and tracked to some level of precision, it becomes possible to derive useful error signals for control use through the relationship described by Equations 3.4 and 3.1.

    \dot{\vec{h}} = -\dot{\vec{R}}_c - {}^{E}\vec{\omega}^{B} \times \vec{D} - \left({}^{E}\vec{\omega}^{B} + {}^{B}\vec{\omega}^{C}\right) \times \vec{h}    (3.4)

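To make the coordinate bookkeeping of Equations 3.2 and 3.3 concrete, the sketch below composes the two Euler transformations to express a feature position in the camera basis. The single-axis rotation convention (a 3-2-1 sequence built from the elementary rotations below) and all sample values are assumptions for illustration; they are not taken from the thesis or from Causey's formulation.

```python
import numpy as np

def R1(a):  # elementary rotation about the first axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, s], [0, -s, c]])

def R2(a):  # elementary rotation about the second axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, -s], [0, 1, 0], [s, 0, c]])

def R3(a):  # elementary rotation about the third axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, s, 0], [-s, c, 0], [0, 0, 1]])

def feature_in_camera_basis(x, R_c, D, euler_body, euler_cam):
    """Equation 3.3: the feature position expressed in the camera basis.

    x, R_c     -- feature and aircraft positions in the inertial basis
    D          -- camera location expressed in the body basis
    euler_body -- (phi, theta, psi), inertial-to-body rotation angles
    euler_cam  -- (phi_c, theta_c, psi_c), body-to-camera rotation angles
    """
    phi, theta, psi = euler_body
    phi_c, theta_c, psi_c = euler_cam
    L_ib = R1(phi) @ R2(theta) @ R3(psi)        # inertial -> body
    L_bc = R1(phi_c) @ R2(theta_c) @ R3(psi_c)  # body -> camera
    return L_bc @ L_ib @ (x - R_c) - L_bc @ D   # Equation 3.2 mapped per 3.3

# Illustrative numbers: a feature 100 m ahead of and (in a north-east-down
# frame) 30 m above the aircraft, seen by a nose camera offset 0.2 m forward.
print(feature_in_camera_basis(np.array([100.0, 0.0, -30.0]), np.zeros(3),
                              np.array([0.2, 0.0, 0.0]),
                              euler_body=(0.0, 0.1, 0.0),
                              euler_cam=(0.0, 0.0, 0.0)))
```
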
3.3 Horizon Detection

Evaluation of the horizon is the fundamental technique used in this thesis for vision-based attitude estimation. Control relative to the estimated horizon can be thought of as control relative to an image-based feature for the special case in which the horizon is the feature. This technique clearly has limitations due to visibility and ground features; however, the horizon is potentially valuable information and is often available for certain mission scenarios. Human pilots often prefer to use visual interpretation of the environment external to the aircraft as a means for stable flight. This thesis builds upon an existing technique for horizon detection which was previously used for attitude estimation to stabilize a MAV [12].

3.3.1 Interpretation

For the purposes of this thesis, the physical horizon is defined as the plane tangent to the surface of the earth at the point of an observer situated in an earth-fixed position. If this observer were a camera looking in a direction parallel to this plane, the image plane would be located some distance, f, from the lens and would be oriented perpendicular to the camera axis and thus to the horizon plane. The distance f is defined as the focal length of the camera lens. In this case the horizon plane appears as a line in the image plane. Conceptually, this line appears to separate the ground and the sky in the field of view.

The vision-estimated horizon is only an approximation to the physical horizon for the case of a forward-pointing camera carried by a MAV in straight and level flight. The horizontal plane local to the camera basis is, in this case, a plane that is parallel to the physical horizon and translated in the vertical direction by the altitude of the aircraft. If this plane were a discernible feature, it would appear in the image plane as a horizontal line some distance above the line where the ground is perceived to meet the sky. The detection algorithm utilized for this thesis detects this perceived edge along which the sky and ground meet. Therefore, in straight and level flight the vision-estimated horizon plane must be slightly askew of the local horizontal plane. Further, the horizon estimate is slightly askew of the physical horizon due to the fact that the local horizontal plane is oriented parallel to the physical horizon.

This plane will then intersect the physical horizon plane at some forward distance from the observer. If the "physical" location of where the ground is perceived to meet the sky is thought of as being at a distance equal to the range of visibility along the observed horizon plane, the estimated horizon will intersect the physical horizon plane at this location. This situation is depicted in Figure 3–3, where the altitude of the observer (and the local horizontal plane) is given by h, while the skew angle between the estimated horizon and the local horizontal plane is given by ε₀.

Figure 3–3: Estimated Horizon shown Relative to Physical Horizon and Local Horizontal

Some level of error will be introduced into calculations based on the assumption that the estimated horizon coincides with either the physical horizon or the local horizontal plane. This error becomes increasingly negligible as the skew angle ε₀ grows smaller in magnitude. If it can be guaranteed that the range of visibility will always be much larger than the altitude of the observer, ε₀ can be assumed to be of negligible consequence. This condition is met given that visibility conditions are usually on the order of tens of miles while MAV altitudes are typically less than 1000 ft. The fact that visibility usually improves with altitude aids the argument as well. It is important to note, however, that level flight is assumed. While the error introduced by these effects proves to be negligible for this case, the introduction of a nonzero pitch attitude affects the relative angular displacement of the horizon planes directly and is discussed in more detail in Section 3.4.2.

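To put the negligibility claim in numbers, the short calculation below evaluates the skew angle from the flat geometry of Figure 3–3, where tan ε₀ equals the observer altitude divided by the distance to the perceived sky/ground meeting point. The specific altitude and visibility values are illustrative assumptions.

```python
import math

def horizon_skew_deg(altitude_ft, visibility_miles):
    """Skew angle between the estimated horizon and the local horizontal,
    assuming the flat geometry of Figure 3-3: tan(eps0) = altitude / range."""
    range_ft = visibility_miles * 5280.0
    return math.degrees(math.atan2(altitude_ft, range_ft))

# A MAV at 300 ft with 10 miles of visibility:
print(horizon_skew_deg(300.0, 10.0))   # roughly 0.3 deg, small enough to neglect
```
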
3.3.2 Algorithm

The algorithm used for horizon detection uses a statistical measure based on colors. Essentially, statistical modeling is used to classify each pixel by color according to known distribution models which are based on training images. A set of proposed horizons is then evaluated to determine which results in the most accurate split of the distribution of pixels classified as either "ground" or "sky". The resulting horizon, as shown in Figure 3–4, reflects the optimal estimate [12].

Figure 3–4: Vision Estimated Horizon

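The sketch below illustrates the structure of such a search: a finite grid of candidate horizon lines, each parameterized by a roll angle and a pitch percentage, is scored by how cleanly it splits the pixels into two color groups. The scoring function here (within-class color variance) is only a stand-in for the thesis's trained statistical models, and the candidate grid, sign conventions, and parameter names are assumptions for illustration.

```python
import numpy as np

def split_score(sky_pixels, ground_pixels):
    """Stand-in statistical measure: reward splits in which each group is
    tightly clustered in color space (low within-class variance)."""
    if len(sky_pixels) < 10 or len(ground_pixels) < 10:
        return -np.inf
    return -(sky_pixels.var(axis=0).sum() + ground_pixels.var(axis=0).sum())

def detect_horizon(img, n_roll=41, n_pitch=21):
    """Evaluate a finite grid of candidate horizon lines, parameterized by
    roll angle and pitch percentage, and return the best-scoring pair."""
    rows, cols, _ = img.shape
    v, u = np.mgrid[0:rows, 0:cols]                  # pixel row/column indices
    flat = img.reshape(-1, img.shape[2]).astype(float)
    best = (0.0, 0.5, -np.inf)
    for roll in np.linspace(-0.5 * np.pi, 0.5 * np.pi, n_roll):
        for pitch_pct in np.linspace(0.05, 0.95, n_pitch):
            v0 = (1.0 - pitch_pct) * rows            # row where the line crosses mid-image
            sky = (np.cos(roll) * (v0 - v) + np.sin(roll) * (u - cols / 2)) > 0
            s = split_score(flat[sky.ravel()], flat[~sky.ravel()])
            if s > best[2]:
                best = (roll, pitch_pct, s)
    return best   # (roll estimate in radians, pitch percentage, score)

# Synthetic test frame: blue-ish "sky" over green-ish "ground".
img = np.zeros((60, 80, 3))
img[:30] = (80, 120, 230)
img[30:] = (60, 150, 70)
print(detect_horizon(img))
```

Because the candidate set is finite and fixed, the search can only resolve the roll estimate to the spacing of that grid, which is the resolution limitation discussed in Section 3.5.
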
The horizon detection is predicated on several assumptions. The horizon is assumed to appear as a straight line in the image. Also, the horizon is assumed to separate the image into two distinct color groups. Such assumptions obviously limit the usefulness of this approach.

While the line dividing the ground and the sky in the image is ideally a straight line, this may not be the case for real situations where trees, mountains, or other obstructions cause an irregular dividing line, as shown by Figure 3–5(a), which is a representative case of an irregular distribution. The mountains cause the pixel classification to be skewed, which is reflected by the optimal estimate. Intuitively, the horizon estimate should appear more like the image in Figure 3–5(b), regardless of the presence of mountains or other obstructions. State estimation or other calculations based on the horizon estimate shown in Figure 3–5(a) will contain some error if they rely on the assumption that the estimated horizon is a good approximation of the physical horizon as defined in Section 3.3.1. Thus, it can be seen that the use of color presents some limitations to the algorithm. Specifically, the algorithm will exhibit accurate estimations only for environments that are very similar to the training distributions. Despite these limitations, the horizon detection algorithm has been repeatedly shown to be effective in open areas.

Figure 3–5: Error in Horizon Estimation

3.4 State Estimation

Information about the attitude of the aircraft can be inferred from the horizon in the image coupled with knowledge of the camera orientation relative to the airframe. Clearly, a complete set of states cannot be estimated based purely on the horizon without additional information. In this thesis, only information related to the roll angle and pitch angle is estimated.

3.4.1 Roll

Consider a MAV with a fixed, forward-pointing camera mounted in the nose such that the camera axis and the body x-axis coincide. If the error due to the horizon skew angle (Figure 3–3), ε₀, can be considered negligible, then an angular displacement about the camera axis corresponds directly to the resulting angular displacement of the perceived line in the image plane. This statement assumes zero pitch angle, such that error is only introduced by ε₀. The angle between the image horizontal and the horizon line can be interpreted as the roll orientation of the camera, which directly correlates to the roll orientation of an object fixed relative to the camera. Because the camera axis coincides with the body x-axis, this angle can be directly interpreted as the body-axis roll angle of the aircraft.

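A minimal sketch of that geometric interpretation is shown below: given two points on the vision-estimated horizon line in pixel coordinates, the bank angle is simply the angle the line makes with the image horizontal. The pixel-coordinate and sign conventions are assumptions, and the routine ignores the ε₀ and pitch effects discussed next.

```python
import math

def roll_from_horizon(u1, v1, u2, v2):
    """Estimated body-axis roll angle (radians) from two points on the
    vision-estimated horizon line, with u measured to the right and v
    measured downward in the image."""
    return math.atan2(v2 - v1, u2 - u1)

# A horizon line that drops 40 pixels across a 320-pixel-wide image:
print(math.degrees(roll_from_horizon(0, 120, 320, 160)))   # about 7.1 deg
```
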
If ε₀ does not approach zero, angular displacements about the camera axis can no longer be exactly interpreted as angular displacements of the estimated horizon in the image. The error in roll angle for this case is related by the cosine of ε₀. Such a situation might arise in flight due to the presence of a finite distance to the sky/ground dividing line, as might occur in urban flight or flight near a line of trees. For flight at zero pitch angle, ε₀ can be obtained through the equations relating the focal coordinates, as shown by Equation 3.5. This relationship derives from the fact that the vertical displacement of the horizon at zero pitch, μ_H0, should be zero for level flight. It is assumed that μ_H is measured at the midpoint of the estimated horizon line.

    \tan\epsilon_0 = \frac{h_1(\mu_{H,0})}{h_3(\mu_{H,0})} = \frac{\mu_{H,0}}{f}    (3.5)

It can be assumed for most cases that the line at which the ground is perceived to meet the sky is distant enough that ε₀ will be sufficiently small. The situation is different for the case of flight at any non-zero pitch angle, θ. The angle responsible for error in the horizon estimate is directly influenced by rotation about the pitch axis, as seen in Equation 3.6.

    \epsilon_{TOT} = \theta + \epsilon_0    (3.6)

For cases where ε₀ can be considered negligible, ε_TOT ≈ θ. This basically states that for zero pitch angle, the estimated horizon is an accurate approximation of the local horizontal plane. Based on the definition of the horizon in Section 3.3.1, the horizon line can be thought of as being situated in the vertical inertial plane that is located by the heading angle of the aircraft and the current range of visibility.

Aircraft body-axis roll is defined about the body x-axis. As discussed above, this axis coincides with the camera z-axis, so pure body-axis roll will correspond to pure rotation in the image plane. The vision-estimated roll angle is measured based on the orientation of the physical horizon, which is out of plane by an angle equal to θ.

The error associated with nonzero pitch angles can be related through consideration of a vector of length r extending from the CG (the camera basis origin) in the direction of the body y-axis. If the aircraft has a nonzero pitch angle, θ, and a nonzero roll angle, φ, an observer situated at the horizon line in the inertial frame will see this vector as related by Equation 3.7. The vision-estimated roll angle is then obtained using the components of the vector projected into the vertical inertial plane, as seen in Equation 3.8.

    r\,\hat{b}_2 = \left[\hat{e}_1 \;\; \hat{e}_2 \;\; \hat{e}_3\right]
                   \begin{bmatrix} r\sin\phi\sin\theta \\ r\cos\phi \\ r\sin\phi\cos\theta \end{bmatrix}
                 = r\sin\phi\sin\theta\,\hat{e}_1 + r\cos\phi\,\hat{e}_2 + r\sin\phi\cos\theta\,\hat{e}_3    (3.7)

    \tan\phi_E = \frac{r\sin\phi\cos\theta}{r\cos\phi} = \cos\theta\,\tan\phi    (3.8)

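The practical consequence of Equation 3.8 is that the vision-estimated roll φ_E only differs noticeably from the true roll φ when the pitch attitude is large. The short check below evaluates that difference; the sample angles are arbitrary illustrative values.

```python
import math

def vision_roll_error_deg(phi_deg, theta_deg):
    """Difference between the vision-estimated roll of Equation 3.8,
    phi_E = atan(cos(theta) * tan(phi)), and the true body-axis roll."""
    phi, theta = math.radians(phi_deg), math.radians(theta_deg)
    phi_e = math.atan(math.cos(theta) * math.tan(phi))
    return math.degrees(phi_e) - phi_deg

# A 30 deg bank held at a 10 deg pitch attitude is under-read by under half a degree:
print(vision_roll_error_deg(30.0, 10.0))   # about -0.4 deg
```
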
3.4.2 Pitch

The distribution of sky and ground classification regions is used to determine a parameter denoted pitch percentage. Specifically, pitch percentage is a measure of the percentage of the image below the vision-estimated horizon and can therefore yield information about the longitudinal attitude of the aircraft. It is important to note that this parameter is not concretely tied to any physical coordinate frame. It depends entirely on color and shading in a two-dimensional image.

Although a change in pitch attitude will likely result in a change in pitch percentage, a change in pitch percentage does not necessarily indicate a change in pitch attitude. Pitch percentage is also affected by many other factors that are independent of pitch attitude. The angle ε is affected directly by changes in both altitude and physical distance from the perceived split between ground and sky. As discussed previously in brief, this distance could vary greatly in a cluttered or urban environment because objects at varying distances from the camera would obstruct the view to the "obstacle-free" perceived meeting point. For example, a building that is 100 ft away could represent the perceived meeting point, while this distance would be on the order of miles if the building were not present. Situations could arise where such an obstruction comes into or moves out of the field of view during a constant-attitude maneuver. This presents an example of a case for which pitch percentage might change drastically while there is no actual change in pitch attitude.

Another factor independent of pitch angle that could affect the pitch percentage is the body-axis roll angle. For a rectangular image such as that generated by standard camera hardware, there is a limiting value of roll angle such that values less than this limit will not introduce undesirable pitch percentage effects. Roll angles greater than this value in magnitude cause a change in pitch percentage that is unrelated to pitch attitude. This effect is seen in Figure 3–6.

Figure 3–6: Effect of Roll on Pitch Percentage

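Computing the parameter itself is straightforward once each pixel has been classified; the sketch below counts the fraction of ground-classified pixels in a frame. The mask representation and array sizes are assumptions for illustration, not details of the flight software.

```python
import numpy as np

def pitch_percentage(ground_mask):
    """Pitch percentage: the fraction of image pixels classified as lying
    below the vision-estimated horizon (i.e., classified as ground)."""
    return float(np.count_nonzero(ground_mask)) / ground_mask.size

# A 120 x 160 frame whose lower 54 rows were classified as ground:
mask = np.zeros((120, 160), dtype=bool)
mask[66:, :] = True
print(pitch_percentage(mask))   # 0.45
```
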
The top-left image of Figure 3–6 has the aircraft at some constant pitch attitude with zero bank angle. The pitch attitude in this case passes through the center of the image, such that pitch percentage has a value of 0.5. The other three images depict a pure rotation about the camera axis, which passes through the center of the image. It can be shown that the images depicted at the top right and bottom left of Figure 3–6 do not change the value of pitch percentage. Once the rotation exceeds the angle to the diagonal, more ground will appear in the image than sky, as seen in the image at the bottom right. This critical angle of rotation will vary over values of pitch percentage. Hence, pitch percentage is affected by roll angle independently of pitch attitude.

There are limited cases for which some restrictive assumptions can be made such that the pitch percentage can be tied to the pitch attitude. First of all, roll must be limited to values that do not affect the pitch percentage as depicted in Figure 3–6. In an obstacle-free environment for which the detected horizon can be assumed to be infinitely far away from the camera in comparison to the altitude, the level-horizon displacement angle ε₀ is seen to approach zero, as stated in Section 3.3.1. In this situation, zero pitch angle will be reflected by a pitch percentage value of 0.5, which places the horizon in the middle of the image. Because the focal-coordinate position is proportional to the tangent of the angle locating the target line of sight, the small-angle approximation can be applied to use the vertical position of the center of the horizon, μ_H, as a measure of the pitch angle. The small-angle approximation is seen to hold for the tangent function over a domain extending up to about 15 deg. Unfortunately, this case is very limiting, and conditions facilitating this development are not likely to be common in practice.

In general, pitch angle cannot be uniquely determined from the pitch percentage due to the many factors affecting the parameter. Further, the fact that pitch percentage is determined from the color distribution of a two-dimensional image, as opposed to kinematic relations in a physical reference frame, limits its use for the unique determination of aircraft body-axis pitch angle.

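Under those restrictive assumptions, the relationship reduces to θ ≈ arctan(μ_H / f) ≈ μ_H / f, which the short example below evaluates. The pixel-based focal length, the sign convention, and the sample offset are illustrative assumptions.

```python
import math

def pitch_from_horizon_offset(mu_H, f):
    """Approximate pitch angle from the vertical image-plane position of the
    horizon midpoint; mu_H and f must share units (e.g., pixels)."""
    return math.atan2(mu_H, f)

# Horizon midpoint 40 pixels from the image center, 400-pixel focal length:
theta = pitch_from_horizon_offset(40.0, 400.0)
print(math.degrees(theta), 40.0 / 400.0)   # ~5.7 deg vs. the small-angle value of 0.1 rad
```
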
3.5 Vision Issues

Computer vision presents many advantages for control purposes; however, it presents many challenges as well. Inherent to the physical system are nonlinearities associated with the camera, along with the high risk of interference and noise associated with wireless video transmission [9]. The software poses some issues to address as well, due to deviations from the known distribution models in the training images. Most importantly, vision-based attitude estimation using horizon detection requires that the physical horizon be retained in the image at all times.

A nonlinearity that is inherent to any vision system results from distance. Specifically, the position and size of a feature in the image plane vary inversely with the distance to that feature, as shown by Equation 3.1. This dependence on distance will affect the horizon, and consequently attitude estimation, if the perceived horizon depends on features which are close to the aircraft. Obviously, flight amongst trees and mountains will result in features of varying distances and horizons of varying orientation.

Another nonlinearity is associated with lens distortion. The curvature of the lens causes the physical space to be mapped nonlinearly into the image plane in such a way that the image is distorted axisymmetrically. The level of convex and concave distortion increases with distance from the image center, as seen in Figure 3–7. At the edge of the image, the mapping of the horizon no longer complies with the straight-line assumption of the horizon detection algorithm, so some level of error in detection is introduced. This phenomenon has been seen to manifest itself as the horizon "twitching" as the algorithm jumps back and forth between straight-line best guesses along the curved mapping.

Figure 3–7: Lens Curvature for a Horizon

The computational draw of the image processing algorithm introduces a tradeoff between run-time and sensor resolution. Run-time is seen as a critical factor due to the fact that vision is the primary means for aircraft stabilization.

The fast MAV dynamics require a fast-acting control system to achieve stable flight. Therefore, the finite set of proposed horizons evaluated at each computation is only capable of resolving the vision-estimated roll angle to 4.45 deg. This limitation poses a new set of challenges to the control design and can factor into the twitching-horizon effect if the body-axis roll angle lies between two discrete measurement levels.

25 of attitude change due to the horizon will be in error This limitation means that the aircraft must be maneuv ered non-aggressi v ely enough that the horizon is al w ays within the camera' s eld of vie w Finally interference and noise are inherent to an y wireless telemetry system. The interference is often especially pre v alent o v er lar ge distances and for positions near the edge of the recei ving antenna' s beam width. A lar ge amount of noise distortion in an image will surely ha v e an ef fect on the color distrib ution and, therefore, the estimated horizon.
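To make the lens-distortion effect described above concrete, the following is a minimal numerical sketch. It assumes a simple first-order radial distortion model with an illustrative coefficient k1; the thesis does not give a calibrated camera model, so none of these values are flight parameters. The sketch shows that a level horizon line is bent increasingly toward the image edges, which is where a straight-line horizon detector begins to disagree with the imaged horizon.

    # Minimal sketch (not the thesis's camera model): first-order radial lens
    # distortion applied to a straight horizon line.  The coefficient k1 is
    # illustrative; a real lens would be calibrated.
    import numpy as np

    def radial_distort(x, y, k1=-0.25):
        """Map ideal image coordinates to distorted ones: p_d = p * (1 + k1*r^2)."""
        r2 = x**2 + y**2
        return x * (1.0 + k1 * r2), y * (1.0 + k1 * r2)

    # A level horizon at 30% image height, in normalized coordinates [-1, 1].
    x = np.linspace(-1.0, 1.0, 11)
    y = np.full_like(x, 0.3)
    xd, yd = radial_distort(x, y)

    # The distorted "horizon" is no longer a straight line; its vertical deviation
    # grows toward the image edges, where detection error is introduced.
    print("vertical deviation across the image:", np.round(yd - 0.3, 3))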

CHAPTER 4
INNER-LOOP CONTROL

4.1 Lateral Stability Augmentation

4.1.1 Methods

A stability augmentation system is incorporated into the autopilot to affect the lateral-directional dynamics. Specifically, this system is designed to stabilize the vehicle and provide tracking of roll commands. Tracking performance is achieved using a proportional-plus-integral controller acting on an error signal. This error signal derives from subtracting the commanded roll angle from the vision-based roll estimate. The proportional-plus-integral control law provides an actuator command to the differential elevator, which deflects antisymmetrically to generate an appropriate rolling moment.

The proportional-plus-integral combination was chosen for specific purposes. Proportional control elements are essential in any tracking controller, as they provide a good "directional sense" to the control law. In principle, proportional control laws generate an actuator command that is directly proportional to the error signal; consequently, large deviations from the desired response result in relatively large corrections by the actuator.

The use of integral control for lateral inner-loop stabilization is a somewhat unusual choice for flight stability systems, but it turns out to be a necessary element for the case examined. An integral control law generates actuator commands that are proportional to the integral of the error signal. In this sense, integral control provides desirable steady-state error qualities. If there is a nonzero error at steady state, the integral of the error builds and the actuator command ramps up until the error is driven to zero. When there is zero error, the integral of error remains constant and the integral control law commands a constant actuator input. This result means that an integral controller can maintain zero error at steady state even if a nonzero actuator command is required. A proportional controller cannot provide a nonzero command when there is zero error, as its output is proportional to the value of the error signal.

The integrator is needed in the case of the lateral dynamics of a MAV for several reasons. First, the lateral dynamics of an aircraft have been seen to exhibit a type-0 system or, in other words, a system with no free integrators [40]. Such a system will exhibit some level of steady-state error if proportional control is used alone. This error can be reduced by increasing the proportional gain, but that can lead to a response that is oscillatory to an undesirable extent. In particular, the highly agile nature of micro air vehicles, resulting from the combination of low inertia moments and a high level of relative control authority, has been described as having erratic controllability qualities, such that high-gain feedback could result in system instability [1, 19]. Therefore, a relatively low proportional gain is necessary, with the integrator taking up the slack by providing the minimal control authority needed to maintain the commanded roll angle.

A second reason to include integral control in the system is to account for a trim condition that may vary drastically during flight. Because wind speeds can vary on the same order of magnitude as the airspeed of the MAV in its normal flight regime, flying at different headings on a windy day could require large and wide-ranging control inputs at steady state to maintain trim conditions. Similarly, asymmetries in construction can require a nonzero command at zero error to maintain trim. Proportional control alone cannot provide nonzero inputs when there is zero error. An integral control element is related to the error accumulated over time rather than to the instantaneous value and can therefore maintain a nonzero command during periods of zero error.

4.1.2 Control Architecture

As discussed in Section 4.1.1, the error between the vision-estimated roll angle, φ_E, and the filtered roll command (the commanded roll angle φ_c passed through the feedforward filter D) is minimized through the use of a proportional-plus-integral controller that uses differential elevator to provide the necessary rolling moment. The architecture for the stability augmentation and roll tracking system is shown in Figure 4–1 [9].

Figure 4–1: Lateral Stability Augmentation System

The C element represents the camera, whose output is image-plane information. This output is a function of the aircraft position and orientation as well as the physical environment within the field of view of the camera, as discussed in Chapter 3. Each point in the physical space is mapped to the image plane according to Equation 3.1. The C block includes this mapping process, along with any nonlinear or distortion effects introduced by the hardware.

Image processing operations required for lateral control are represented by the V element in Figure 4–1. Specifically, the horizon is estimated using the horizon detection algorithm discussed in Section 3.3.2, and a vision-estimated roll angle is computed from the angular displacement of the estimated horizon. A low-pass filter is also included in the V block to attenuate undesirable high-frequency anomalies in the vision-estimated roll angle. These anomalies are associated with instances of poor horizon estimation, which occur for varying reasons, including abnormal image coloration due to excessive light variations as well as image degradation due to interference in the wireless video transmission. These instances have been seen to result in the horizon estimate "jumping" to an arbitrary position and orientation within the image. In many cases this effect is only momentary, so the filter can attenuate the high frequencies associated with these spikes.

The error signal results from subtracting a filtered version of the commanded roll angle from the vision-based estimate. The filter, represented as D, is required to act on the commanded roll angle prior to computation of the error signal in order to avoid quantization errors. These errors are introduced by the limited resolution of the vision-based roll estimate, an effect related to the computational draw of the horizon detection algorithm as discussed in Section 3.5. The vision-based attitude estimation is only capable of resolving the estimated roll angle in discrete steps of 4.45 deg. Commanding a roll angle at any value other than an integer multiple of 4.45 deg would result in a residual error, because the sensor system could never match the resolution of the command. This residual error would produce actuator motion and a roll response. If the command were to remain unchanged, the response would pass through the commanded value to the next discrete measurable value, resulting in a residual error signal in the opposing direction. Without a change in the commanded value, this amounts to a limit-cycle oscillation in the roll response. Therefore, the feedforward filter D quantizes the commanded roll angle from a continuously varying signal to match the discrete estimated roll resolution of 4.45 deg.

The command to the differential elevator actuator is finally computed by scaling the filtered error by the controller gains, as shown in Equation 4.1. These gains are denoted K_φ for the proportional gain and K_Iφ for the integral gain. The error is scaled directly by K_φ. A separate state, ĥ, which represents the accumulation of error, is scaled by K_Iφ. The sum of the proportional and integral elements is the actuator input.

    u = K_φ e + K_Iφ ĥ    (4.1)
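The combined effect of the feedforward filter D and Equation 4.1 can be summarized in a short discrete-time sketch. This is an illustrative implementation rather than the flight code: the sample time is taken as the 50 Hz control rate quoted in Chapter 6, the gain values follow Table 6–1, and the sign convention (error equals the vision estimate minus the filtered command) follows the description above.

    # Illustrative discrete-time sketch of the lateral law in Equation 4.1, with the
    # feedforward filter D realized as quantization of the roll command to the
    # 4.45 deg resolution of the vision estimate.  Not the actual flight code.
    RES_DEG = 4.45            # resolution of the vision-estimated roll angle (deg)
    K_PHI, K_IPHI = 0.85, 0.04
    DT = 1.0 / 50.0           # control-loop period (s)

    def quantize_command(phi_c_deg):
        """Filter D: snap the commanded roll angle to the nearest measurable level."""
        return RES_DEG * round(phi_c_deg / RES_DEG)

    class LateralPI:
        def __init__(self):
            self.h_hat = 0.0                     # accumulated error state, h-hat in Eq. 4.1

        def update(self, phi_c_deg, phi_vision_deg):
            e = phi_vision_deg - quantize_command(phi_c_deg)   # error per Section 4.1.2
            self.h_hat += e * DT
            return K_PHI * e + K_IPHI * self.h_hat             # differential-elevator command

    # A 20 deg command is quantized to 17.8 deg, a level the sensor can actually report,
    # so the steady-state error can be driven exactly to zero.
    ctrl = LateralPI()
    print(quantize_command(20.0), ctrl.update(20.0, phi_vision_deg=13.35))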

4.2 Longitudinal Stability Augmentation System

4.2.1 Methods

The autopilot also includes a stability augmentation system to affect the longitudinal dynamics. The basic function of this controller is to stabilize the attitude of the vehicle and to track commands in pitch percentage. The relationship of pitch percentage to the longitudinal attitude of the aircraft is discussed in detail in Section 3.4.2. Although there are many reasons to avoid using the pitch percentage parameter for longitudinal control, it reasonably approximates pitch attitude in some instances. In other cases, pitch percentage has at least some dependency on the relative pitch attitude of the MAV and can therefore yield generic information concerning attitude changes.

Pitch percentage tracking is achieved through a proportional controller acting on the error between a commanded value of pitch percentage and the current measured value. Integral control is not included in the longitudinal inner loop because zero steady-state error is not a requirement. A command to the elevator actuator is determined by scaling the error signal by a proportional gain.

The longitudinal attitude control system presented here deviates significantly from traditional pitch-attitude hold systems. Specifically, it is common to implement a proportional-plus-derivative approach in which the pitch angle and pitch rate are used as feedback signals. The pitch angle feedback provides tracking performance, while the rate feedback provides additional damping of the short-period mode. Even when these parameters are available, control systems of this nature are typically used only for wings-level flight [40].

The use of pitch percentage has been shown to maintain a non-unique relationship to the pitch attitude of the aircraft. Further limitations that arise from the nature of vision as a sensor technology, such as field-of-view constraints and lens distortion effects, add uncertainty to the estimation. The controller could be useful for situations, such as a horizon at infinite distance, when pitch percentage can be loosely interpreted in terms of pitch. In general, the lack of correlation between pitch attitude and pitch percentage reduces the benefit of tracking commands to pitch percentage in itself. However, it will be shown later that the methodology proposed here is an important element for limiting commands to retain the horizon in the field of view during higher-level control activities.

4.2.2 Control Architecture

As discussed in Section 4.2.1, the error between the commanded value of pitch percentage, σ_c, and the current measured pitch percentage, σ, is minimized through the use of a proportional control law to generate a command to the elevator. The resulting architecture is shown in Figure 4–2.

Figure 4–2: Longitudinal Stability Augmentation System

As in the lateral stability augmentation system, the C element represents the camera, which takes the aircraft states and the environment as inputs and produces an image as the output. Image processing operations are again represented by the V element. The horizon is evaluated and assigned a statistical estimate per the algorithm discussed in Section 3.3.2. The pitch percentage is determined from the ratio of the two classified regions of the pixel distribution of the image [12]. Pitch percentage is therefore limited to values between zero and one.

The error signal is computed directly as the difference between the commanded pitch percentage and the measured pitch percentage. The commanded value is limited to the range between zero and one to avoid exceeding the sensor limitations. The proportional control law is then applied to generate an input to the symmetric elevator actuator, as expressed in Equation 4.2. The input, u, is computed by scaling the error by the gain K_σ.

    u = K_σ e = K_σ (σ_c − σ)    (4.2)
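As a sketch of how Equation 4.2 might be evaluated from the classifier output, the snippet below treats pitch percentage as the ground-pixel fraction of the classified image. That reading of "the ratio of the two classified regions" is an assumption, chosen to be consistent with a value of 0.0 when the horizon sits at the bottom of the image and 1.0 when it sits at the top; the gain magnitude is the one listed later in Table 6–1.

    # Illustrative evaluation of Equation 4.2.  The pixel counts would come from the
    # horizon-detection classifier (not implemented here); sigma as the ground-pixel
    # fraction is an assumption, and the gain value is the one listed in Table 6-1.
    K_SIGMA = -300.0

    def pitch_percentage(n_ground_px, n_sky_px):
        return n_ground_px / float(n_ground_px + n_sky_px)

    def elevator_command(sigma_c, sigma):
        sigma_c = min(max(sigma_c, 0.0), 1.0)   # command limited to the sensor range [0, 1]
        return K_SIGMA * (sigma_c - sigma)      # Equation 4.2

    sigma = pitch_percentage(n_ground_px=90_000, n_sky_px=140_000)
    print(round(sigma, 3), round(elevator_command(0.5, sigma), 1))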

CHAPTER 5
GUIDANCE AND NAVIGATION

5.1 Directional Controller

5.1.1 Methods

The directional controller is implemented as an outer loop around the vision-based roll command and stabilization system. Heading calculations are performed using the relative displacement of GPS coordinates. These coordinates are expressed in terms of latitude, λ, and longitude, χ. The current heading, ψ, of the aircraft is determined from the current position and the position from one time step prior using a tangent relationship, as represented by Figure 5–1 and Equation 5.1.

Figure 5–1: Backwards Difference Relationship for Heading Calculation

    tan ψ = (χ_2 − χ_1) / (λ_2 − λ_1)    (5.1)

For the purposes of this thesis, ψ is taken to be the lateral flight-path angle rather than the body-axis heading angle. As a result, this parameter is a measure of the trajectory of the MAV center of gravity (CG) in the local horizontal x-y plane.
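A sketch of this backwards-difference heading computation (Equation 5.1) is given below. The use of atan2 to resolve the quadrant and the flat treatment of the coordinate differences are choices made here for illustration, and the sample coordinates are placeholders rather than flight data.

    import math

    def heading_from_fixes(lat_prev, lon_prev, lat_curr, lon_curr):
        """Lateral flight-path angle, in degrees measured from north, from two GPS fixes."""
        d_lat = lat_curr - lat_prev          # delta-lambda
        d_lon = lon_curr - lon_prev          # delta-chi
        # Following Equation 5.1, no cos(latitude) scaling of longitude is applied here.
        return math.degrees(math.atan2(d_lon, d_lat))

    # Two fixes one second apart while drifting roughly north-east:
    print(round(heading_from_fixes(29.5520, -82.4810, 29.5521, -82.4809), 1))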

This definition is contrary to that of the body-axis heading angle, which measures the directional orientation of the aircraft. The directional controller concerns the direction of inertial motion rather than the details of the specific aircraft orientation. The flight path therefore proves to be more valuable information for the case of a MAV with highly variable flight conditions and limited sensing capability.

The directional controller forgoes target detection and identification and assumes the location of a desired waypoint or target is a known point in space. Given waypoint or target coordinates, the relationship above can be used to compute the bearing required to hit the target. This bearing is regarded as the commanded heading, ψ_c. The measured heading can be used to generate an error signal directly in conjunction with the commanded heading because they are both measures of CG trajectory. This error signal is computed by taking the difference between ψ and ψ_c.

To track ψ_c, the error is driven to zero by commanding heading changes through a nonzero turn rate. For the MAV of interest, turn rates are achieved primarily by varying bank angle. It is assumed that the relationship between bank angle and turn rate is proportional in nature, although in practice it is affected by many factors. This assumption is therefore somewhat restrictive; however, it makes proportional control a convenient technique for generating a command to the inner-loop lateral controller. The inner-loop lateral controller then affects the bank angle to achieve the turn rates necessary to minimize the heading error signal.

5.1.2 Control Architecture

As discussed in Section 5.1.1, the error between the current aircraft heading and the commanded heading (the heading to the target) is minimized by commanding a nonzero turn rate through a commanded roll angle. This commanded roll angle is computed by multiplying the heading error by a proportional gain, K_ψ, and is limited to ±35.6 degrees. This limit results in a constant command for the large heading errors corresponding to the gross acquisition phase of navigation and a varying command for the smaller error values corresponding to the fine tracking phase. The roll stability augmentation system stabilizes the aircraft about this commanded roll angle.

Figure 5–2: Closed-Loop Directional Control System

5.1.3 State Estimation

5.1.3.1 Directional controller issues

The system as described to this point would have serious difficulty navigating with any degree of accuracy or aggressiveness. Differences in time scale between the MAV dynamics and the sensor dynamics impose heavy limits on the bandwidth of the directional controller. Specifically, the use of GPS to determine position and heading results in undesirable directional controller performance. Typical GPS sampling rates are significantly slower than the aircraft dynamics associated with position and heading changes; therefore, the sensor cannot convey all of the information necessary for good controller performance.

This issue requires the control design to sacrifice aggressiveness for accuracy. An aggressive controller gain, K_ψ, gives the controller the authority to quickly drive a heading error toward zero. Unfortunately, turn rates resulting from aggressive commands are likely to exhibit significant overshoot before the GPS sensor provides an update to alert the system that it has surpassed the commanded value. This behavior results from the MAV state changing at a higher rate than the sensor can accurately capture, which can result in instability. A convergent response can be achieved by limiting the aggressiveness of the controller such that commanded turn rates never result in significant overshoot of the heading command. Such a limitation in aggressiveness severely restricts the allowable configuration and spacing of waypoints that can feasibly be hit in succession.

Another issue stems from the inherent lag introduced into the system by the backwards-difference approximation used in Equation 5.1 to compute the current heading. Using a backwards difference effectively measures the heading at the previous time step, so that "current" heading readings are delayed by one time sample. This lag is depicted in Figure 5–3, which shows a representative flight path along with the position and heading as would be measured by the sensor block, S, in Figure 5–2. The 1 Hz sampling rate of the GPS data guarantees that this lag will be at least 1 second. The rapid dynamics associated with the control of a MAV require real-time knowledge, and information outdated by more than a second can become almost useless.

Figure 5–3: Measured heading significantly lags actual heading at each GPS datapoint

Besides the inability of the 1 Hz GPS sensor to accurately capture the rapid changes in position and heading, the slow sampling frequency generally causes some level of error in the instantaneous computation of heading at each GPS update. Even if the lag effect were not present, the accuracy of the method used to compute instantaneous heading relies on the assumption that the flight path between samples can be approximated by a straight line.

This assumption requires that either the sampling rate is very fast or the flight path is in actuality very close to a straight line. This requirement is depicted in Figure 5–4.

Figure 5–4: Vector Diagram of GPS Sensor Measurement

Figure 5–4 shows a representative plot of two successive position measurements relative to some inertial reference frame, where the measurements are separated by a time delay of Δt. The heading computation relies on the assumption that the direction of the tangential unit vector at both data points coincides with the direction of Δr. The angle θ must be minimized for this assumption to hold. To maintain generality, this must be achieved by minimizing the arc length and, in effect, Δt. This effect can also be seen by observing the difference between the representative flight paths in Figure 5–3.

5.1.3.2 Technique/Implementation

To alleviate the issues discussed in the previous subsection, position and heading estimates are introduced to fill the void between known values. Given a known current position, (λ, χ), heading, ψ, velocity, V, and turn rate, ψ̇, extrapolation of the flight path to some future time can be performed readily. This is depicted in Figure 5–5 and described by Equations 5.2.

Figure 5–5: Heading/Position Estimate Relationship

    λ(k+1) = λ(k) + V Δt cos(ψ(k) + ψ̇(k) Δt)    (5.2a)
    χ(k+1) = χ(k) + V Δt sin(ψ(k) + ψ̇(k) Δt)    (5.2b)

An estimate of position is first generated relative to the known current GPS position. Future position estimates are computed relative to the estimated position at the previous time step. Estimated heading is computed using Equation 5.1 with the estimated positions. This process continues until the GPS sensor updates and provides a "check" for the position estimate. In the event of an update, the estimate is replaced by the measured position.
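A dead-reckoning sketch of Equations 5.2, run at the estimator rate between 1 Hz GPS updates, is shown below. The velocity and turn rate are treated as known constants in the spirit of Section 5.1.3; the feet-to-minutes-of-latitude conversion and the sample values are illustrative assumptions, not values taken from the thesis.

    import math

    FT_PER_MIN_LAT = 6076.0 / 60.0    # rough feet per minute of latitude (assumed conversion)

    def extrapolate(lat_min, lon_min, psi_deg, psi_dot_dps, V_fps, dt):
        """One step of Equations 5.2: returns (lat_min, lon_min, psi_deg) at t + dt."""
        heading = math.radians(psi_deg + psi_dot_dps * dt)
        d_north_ft = V_fps * dt * math.cos(heading)
        d_east_ft = V_fps * dt * math.sin(heading)
        lat_min += d_north_ft / FT_PER_MIN_LAT
        lon_min += d_east_ft / FT_PER_MIN_LAT      # cos(latitude) scaling ignored in this sketch
        return lat_min, lon_min, psi_deg + psi_dot_dps * dt

    # Fill the 1 s gap between GPS fixes with 50 estimates at 50 Hz:
    state = (31.10, 33.25, 90.0)                   # (lat min, lon min, heading deg); placeholders
    for _ in range(50):
        state = extrapolate(*state, psi_dot_dps=20.0, V_fps=44.0, dt=0.02)
    print(tuple(round(x, 4) for x in state))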

Measured heading, ψ(k), is available but proves to be problematic based on the previous discussion. Use of such data would result in large estimation errors beginning at the first calculation. Two problems require solution to obtain more accurate heading information, as discussed in the previous subsection. The first is the 1 s lag due to the backwards-difference method, while the second is the error in the computation of instantaneous heading due to the slow sampling time. Both errors have a diminished effect beyond the initial calculation at the known GPS data point; the estimation code runs fast enough that both the lag and the instantaneous-computation errors can be regarded as negligible beyond this initial step. However, both errors have a significant effect on the first position estimate at each GPS update, and they would carry through to the next update because future position estimates are computed from the initial position estimate. Therefore, it becomes necessary to address both issues for the initial position estimate at each GPS update.

The two identified problems with the heading measurement are solved in an ad hoc manner. The lag issue is addressed using the assumption of a constant-rate turn between two measured positions. Essentially, a guess of the heading at the next time step is made by assuming that the turn rate of the MAV is quasi-constant over the past two time steps. Knowledge of the measured heading from the two prior time steps allows a linear extrapolation to the current heading, using the relationship in Equation 5.3.

    ψ(k) = ψ(k−1) + (ψ(k−1) − ψ(k−2))    (5.3)

The issue of instantaneous heading computation is addressed in a similarly ad hoc fashion. Figure 5–3 shows that in a steady turn the measured current heading undershoots the actual heading due to lag, while the removal of lag results in a value that overshoots the actual heading due to the inability of the sensor to fully capture the smoothness of the flight path. The heading value after removal of the system lag is described by Equation 5.3, and therefore both heading values are known. One value is known to undershoot, while the other is known to overshoot in the case where the constant-turn assumption is valid. Therefore, the difference is split between these heading values, which are non-optimal on opposite ends of the spectrum, to approximate the actual value of the current heading. This action is equivalent to reducing the efficacy of the lag-removal method and is described by Equation 5.4. Again, this step is only performed in the event of a GPS update. Beyond this point, estimates are considered accurate and are used in conjunction with the previous backwards-difference approach (Equation 5.1) to calculate heading until the next update is received.

    ψ(k) = ψ(k−1) + (1/2)(ψ(k−1) − ψ(k−2))    (5.4)

Selection of turn rate data for the estimation also presents problems, as any on-line data would have to come from the GPS sensor and would therefore exhibit the same lag seen in the position and heading measurements. However, it can be assumed that the turn rate, ψ̇, at any given time is related to the current bank angle, φ. Recall the assumption in Section 5.1.1 that this relationship can be roughly approximated as linear, with the turn rate taken to be proportional to the current bank angle. This relationship is shown by Equation 5.5, where the proportional gain, K_ψ̇, is estimated from previous flight data.

    ψ̇ = K_ψ̇ φ    (5.5)

The value of velocity, V, is also based on previous flight data. Constant ground speed is assumed and is approximated from averaged data. Although this may be a poor approximation for the highly variable flight conditions of a MAV, the alternative would be to take velocity data from the slow GPS sensor. The constant-velocity assumption should closely approximate the vehicle velocity for cases of little or no wind, whereas using on-line GPS data for velocity information would always be subject to errors associated with the time-scale differences between the sensor and the MAV dynamics.

With the estimator implemented, the closed-loop lateral-directional controller takes on the form shown in Figure 5–6, where E represents the estimation process.
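The update-time corrections of Equations 5.3 through 5.5 can be condensed into a short sketch. The numerical value of the turn-rate constant below is an assumed placeholder (the thesis fits it from the flight data of Figure 6–4), and the variable names are illustrative.

    K_PSI_DOT = 1.4    # deg/s of turn rate per degree of bank angle (assumed placeholder)

    def heading_at_update(psi_km1, psi_km2):
        """Heading used for the first estimate after a GPS update (Equations 5.3 and 5.4)."""
        step = psi_km1 - psi_km2                 # quasi-constant turn over the last two fixes
        psi_lag_free = psi_km1 + step            # Equation 5.3: removes the one-sample lag (overshoots)
        return 0.5 * (psi_km1 + psi_lag_free)    # Equation 5.4: split the difference

    def turn_rate_from_bank(phi_deg):
        """Equation 5.5: turn rate assumed proportional to the current bank angle."""
        return K_PSI_DOT * phi_deg

    print(heading_at_update(psi_km1=95.0, psi_km2=75.0), turn_rate_from_bank(30.0))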

Figure 5–6: Closed-Loop Directional Control System

5.2 Altitude Control

An altitude controller is also included in the autopilot to ensure that the MAV reaches the correct altitude associated with each waypoint. This controller commands altitudes of the vehicle throughout the flight path during both straight and turning maneuvers. The architecture for the altitude control is shown in Figure 5–7.

Figure 5–7: Closed-Loop Altitude Control System

The main feature of the altitude controller is the switch element. Essentially, two values of elevator deflection are computed using separate approaches, but only one is actually commanded to the servo actuator. The upper path to the switch is simply the inner-loop controller from Figure 4–2 that tracks a pitch percentage. The lower path to the switch is a proportional-plus-integral structure that directly considers an error in altitude.

The upper path is really an augmented version of the tracking controller for pitch percentage. The augmentation accounts for the need to maintain a visible horizon by including a lim block. This block chooses a command to pitch percentage at its maximum limits. The pitch percentage ranges from 0.0 when the horizon is at the bottom of the image to 1.0 when the horizon is at the top of the image. The lim block commands a value of 0.2 for a climb or 0.8 for a dive, so the vehicle is always commanded to its limit. These limits are chosen so that the horizon is never lost. In this way, the controller commands a limiting value of pitch percentage regardless of the actual pitch angle.

The lower path is a standard proportional-plus-integral controller. The error signal on which this controller operates is the difference between commanded and measured altitude. Such a controller recognizes the lack of a pitch measurement, so, unlike traditional controllers, the altitude error is converted directly to an elevator command [40].

The switch element chooses the smallest magnitude from the upper and lower paths. Such a choice limits the performance of the vehicle in tracking altitude; however, it also minimizes the chance of losing visibility of the horizon.

Finally, a gain is included to couple the longitudinal dynamics with the lateral-directional dynamics. This gain, K_φδe, generates a command to symmetric elevator based on a measurement of roll angle. The coupling gain is used to improve turn performance by both maintaining altitude and increasing the rate of change of heading during a turn.
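A compact sketch of the switch logic described above follows. The gains follow Table 6–1, but the structure is simplified (no integrator anti-windup, and units and signs are left loose), and the use of the roll-angle magnitude in the coupling term is an assumption; the thesis states only that the coupling command is based on a measurement of roll angle.

    # Illustrative sketch of the switch logic in Figure 5-7, not the flight code.
    K_SIGMA, K_H, K_IH, K_PHI_DE = -300.0, -0.25, -0.006, 0.06

    class AltitudeController:
        def __init__(self):
            self.alt_err_int = 0.0      # accumulated altitude error for the lower path

        def update(self, h_c, h, sigma, phi_deg, dt):
            # Upper path: the lim block picks the limiting pitch-percentage command,
            # 0.2 for a climb or 0.8 for a dive, then applies the proportional law of Eq. 4.2.
            sigma_c = 0.2 if h_c > h else 0.8
            u_vision = K_SIGMA * (sigma_c - sigma)

            # Lower path: proportional-plus-integral on the altitude error.
            e_h = h_c - h
            self.alt_err_int += e_h * dt
            u_alt = K_H * e_h + K_IH * self.alt_err_int

            # Switch: keep the smaller-magnitude (less aggressive) command, then add
            # the roll-to-elevator coupling term (abs() is an assumption here).
            u = u_vision if abs(u_vision) < abs(u_alt) else u_alt
            return u + K_PHI_DE * abs(phi_deg)

    print(round(AltitudeController().update(h_c=120.0, h=100.0, sigma=0.45, phi_deg=30.0, dt=0.02), 2))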

CHAPTER 6
FLIGHT DEMONSTRATION

6.1 Aircraft

A flight test was performed to demonstrate waypoint navigation using the vision-based attitude estimation. This flight test used the MAV shown in Figure 6–1. The vehicle is a variant of a flexible-wing MAV designed at the University of Florida [18]. In this case, the MAV is 21 inches in length and 24 inches in wingspan. The total weight of the vehicle, including all instrumentation, is approximately 540 grams.

Figure 6–1: MAV

The airframe is constructed almost entirely of composite and nylon. The fuselage is built from layers of woven carbon fiber which are cured to form a rigid structure. The thin, under-cambered wing consists of a carbon-fiber spar-and-batten skeleton covered with a nylon wing skin. A tail empennage, also constructed of composite and nylon, is connected to the fuselage by a carbon-fiber boom that runs concentrically through the pusher-prop disc.

Control is accomplished using a set of control surfaces on the tail. Specifically, a rudder along with a pair of independent elevators can be actuated by commands to separate servos. Thrust is provided by a 6 inch propeller driven by a brushless electric motor; the prop setup is shown in Figure 6–2. Power is varied through the use of an off-the-shelf speed controller commonly used for radio-controlled model airplanes.

Figure 6–2: MAV Pusher-Prop Configuration

The on-board sensors consist of a GPS unit, an altimeter, and a camera. The GPS unit provides position at a rate of 1 Hz and is mounted horizontally on the top of the nose hatch. The altimeter, which provides a measure of static pressure at a rate of 30 Hz, is mounted inside the fuselage under the nose hatch. The GPS and altimeter both communicate with a 900 MHz on-board radio transceiver developed specifically for this application at the University of Florida Machine Intelligence Laboratory. The transceiver sends data to an off-board ground station and receives control commands from an off-the-shelf Futaba radio transmitter. The video camera is fixed to point directly out the nose of the aircraft. Video data is streamed to a Sony Video Walkman at 2.4 GHz via a wireless video transmitter. All on-board components are powered by an 11.1 V lithium-polymer battery. The mounting setup of the on-board electronics is shown in Figure 6–3.

Figure 6–3: Mounting Configuration of On-Board Electronics (data processor/transceiver, motor, altimeter, GPS unit, battery pack, camera)

The autopilot operates on an off-board ground station. This ground station essentially consists of a laptop with communication links, a Sony Video Walkman, and a Futaba radio transmitter.

Separate streams for video and inertial measurements are sent using transceivers on the aircraft. The video image is interpreted by the Sony Video Walkman before it is sent to the laptop via a FireWire connection. The image processing and controller analyze these streams and send commands to the transmitter. Vision-estimated parameters are provided by the image processing at a rate of 30 Hz, while the control loop runs at 50 Hz. The transmitter is configured to transmit computer commands directly to the aircraft, where they are passed on to the servo actuators. A human pilot is held on standby in the event that the controller fails and intervention is needed. This is an essential precaution to ensure safety and survivability, and it also allows some freedom in the gain-tuning process.

6.2 Controller

The gains for the autopilot were designed based on observing flight properties. Unfortunately, a mathematical model describing the flight dynamics of the MAV did not exist; consequently, a trial-and-error approach was adopted. A reasonable set of initial gains could actually be derived based on intuition and on evaluating the aircraft response to human commands. These gains were then refined to increase performance both in terms of tracking and navigating.

Difficulty was encountered in several areas of the gain-tuning process. Varying one element of the controller often affected several other elements, and thus the process was somewhat iterative. In some cases many iterations were required to obtain desirable response characteristics. The controller performance was also continually affected by the highly variable flight conditions of the MAV. As discussed in Section 2.2, wind gusts on the same order of magnitude as the airspeed of the vehicle become more an issue of operating-condition changes than of disturbance rejection as far as the control design is concerned.

Vision issues also played a large role in hindering the progress of the autopilot design. Several signals in the system derive from a pure vision component, so some undesirable effects are introduced into the controller error signals. Many cases were encountered for which it was not clear whether certain aircraft behavior resulted from controller performance or from vision-system artifacts such as those discussed in Section 3.5.

The lateral and longitudinal control systems were configured independently of each other. For this to be possible, the Futaba transmitter was programmed such that the pilot could remain in the loop to control specific surfaces while passing complete computer control to others. In this way, the dynamic coupling effects of the lateral and longitudinal dynamics could be attenuated manually during certain phases of the design process. The goal was to isolate the response as each degree of freedom was varied in the controller. When the independent components demonstrated satisfactory performance, coupling effects were reintroduced so that proper fine adjustments could be made.

The first control element addressed was the inner-loop roll tracking system. The human pilot retained the ability to command symmetric elevator deflection during this process so that altitude could be maintained throughout the duration of each test. To begin the process for a given test maneuver, the proportional roll gain was adjusted to achieve general tracking performance.

Once mild oscillatory behavior was observed, the gain was decreased slightly before the integral gain was varied. This step was necessary because the addition of the integral control law tended to adversely affect overshoot and to add some long-period oscillatory behavior to the system. The integral control would easily make up for the control authority lost in the proportional gain decrease.

The procedure to tune the lateral inner-loop controller gains consisted of a systematic progression of maneuvers. Each maneuver consisted of a computer-generated command followed by 10-30 s of aircraft response. The response time was limited by the need to keep the aircraft within the vicinity of the ground station for the tuning process. The gains were first tuned to maintain wings-level flight. This intermediate configuration was then tested for recovery to wings-level flight from varying initial bank angles, and the required adjustments were made. Tracking performance was then tested by commanding various bank angles from wings-level flight. Several iterations were necessary before the controller exhibited satisfactory performance across the entire range of maneuvers.

The longitudinal controller was examined next. The inner-loop roll tracker was capable of maintaining wings-level flight at this point, so the pilot was not required in the loop. The pitch percentage tracking system would be essential to keeping the horizon in the image and would also place an upper bound on the proportional altitude controller; therefore, this system was the first to be configured. High gain values were required due to the low amplitude range of the signal, which varies between zero and one. The dynamics also proved to be heavily damped, which allowed for generous application of control authority without a great deal of oscillation and overshoot in the response. Little fine tuning was necessary due to this lack of sensitivity in the response.

A similar range of flight maneuvers was performed to tune the pitch percentage tracking controller. Gains were initially selected such that the controller could maintain the initial value of pitch percentage. It was then necessary to identify values of pitch percentage, σ, near the limits of its range for which a moderate bank angle could still be resolved. Values that represented very little sky or very little ground in the image often resulted in significant horizon estimation errors in the event of a significant bank angle. Therefore, the values chosen as commands for the pitch percentage attitude limiter were σ = 0.2 for the nose-high attitude limit and σ = 0.8 for the nose-down attitude limit. Gains were then tuned to achieve these limits from a neutral initial condition. After adjustments were made, tracking ability was tested for more drastic initial conditions.

Completion of the pitch percentage tracker provided the necessary constraints to close the altitude loop without risking the integrity of the vision-based error signal. Several interesting performance issues surfaced in this step. First, the MAV had very different climb and dive characteristics, in part due to a sub-optimal power configuration. Some gain configurations would drastically overshoot a lower commanded altitude, while the response to a commanded climb would not even reach the desired value. Another observed property was that overly aggressive commands would cause more of a change in trim condition than an altitude-tracking response. If a command caused a sudden and drastic change in pitch angle, too much airspeed would bleed off for the aircraft to sustain climbing flight, and the dynamics were seen to settle at the new airspeed and attitude. Therefore, caution was employed in gain selection in order to minimize these undesirable effects. The result was diminished controller aggressiveness.

In a procedure similar to the design of the roll tracker, the proportional altitude control element was addressed before the integrator in order to achieve tracking behavior. The initial test maneuver required the MAV to simply maintain altitude. Integral gain increases were applied as necessary. The next step involved mild altitude changes in both directions, followed by the necessary adjustments.

The effectiveness of the pitch percentage tracker as a command limiter was then tested by commanding drastic changes in altitude. Test maneuvers typically lasted 20-30 s and were again limited by the need to keep the aircraft in the vicinity of the ground station.

Design of the proportional gain that coupled altitude to bank angle was simply a trial-and-error exercise. It was required to choose a value large enough to account for the dramatic loss of lift in steep-bank situations, but not so large as to excite an altitude oscillation at small bank angles. These oscillations resulted from the fact that excessive application of elevator at small bank angles would cause a slight gain in altitude, which would then be fought by the altitude loop. Most cases of a required decrease in altitude resulted in some level of overshoot and oscillation.

The outer heading control loop was the last system designed. The process was again trial and error in nature. Two configurations of differing aggressiveness were initially examined. Performance issues resulted from the slow dynamics of the GPS sensor. In an attempt to alleviate these issues, a nonlinear gain schedule was proposed but proved to be ineffective. Eventually an estimator was incorporated into the system to resolve problems with directional tracking, as discussed in Section 5.1.3.

Implementation of the estimator required the approximation of several parameters from previous flight test data. Specifically, a value determining the proportional relationship between bank angle and turn rate was needed, as well as a value to approximate constant-velocity flight. The data used to determine the turn rate relationship are shown in Figure 6–4, which is a plot of turn rate as a function of bank angle. The roughness of the linear approximation is immediately apparent. The turn rate data were computed using successive GPS measurements, which were assumed to update at 1 Hz. This assumption is one possible cause of some of the abnormal data points seen in Figure 6–4.

Figure 6–4: Approximation of the Relationship Between Turn Rate and Bank Angle

The value of velocity, V, was assumed to be constant for the purposes of heading estimation in Section 5.1.3. This assumption is made because the MAV operates under constant power when in autonomous flight. Wind and other factors will affect the instantaneous ground speed, but these effects are impossible to predict given the limited sensor package used for this demonstration. The assumption of constant-velocity flight was therefore made in lieu of using on-line difference calculations involving the slow-rate GPS sensor. The data set and approximation used to determine the average velocity are shown in Figure 6–5. Successive GPS measurements were used to determine velocity and again give rise to some data anomalies. This fact was taken into account with regard to the selection of a velocity value.

Figure 6–5: Selection of MAV Velocity Data for Heading Estimation
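As an illustration of how these two constants might be extracted from logged GPS data, the sketch below fits a through-the-origin line of turn rate against bank angle and averages the measured ground speed. The arrays are placeholders standing in for the data of Figures 6–4 and 6–5, not the actual flight records.

    import numpy as np

    # Placeholder logs: bank angle (deg), GPS-derived turn rate (deg/s), ground speed (ft/s).
    bank_deg = np.array([-60.0, -40.0, -20.0, 0.0, 20.0, 40.0, 60.0])
    turn_rate_dps = np.array([-75.0, -55.0, -30.0, 2.0, 28.0, 52.0, 80.0])
    ground_speed_fps = np.array([38.0, 47.0, 41.0, 52.0, 44.0, 43.0])

    # Least-squares slope through the origin, matching the proportional model of Equation 5.5.
    k_psi_dot = float(np.linalg.lstsq(bank_deg[:, None], turn_rate_dps, rcond=None)[0][0])
    V_avg = float(ground_speed_fps.mean())
    print(round(k_psi_dot, 3), round(V_avg, 1))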

The final controller gains are shown in Table 6–1. These values correspond to the respective gains shown in the block diagrams of Figure 5–6 and Figure 5–7. The responsive nature of the lateral dynamics allowed for a relatively high proportional gain and a fast integrator. Alternatively, the slower nature of the altitude response required a limited proportional gain and a relatively slow-acting integrator. Also, two values of K_ψ are presented to reflect autopilot configurations of differing aggressiveness. It is seen in the next section that the inclusion of an estimator for heading angle allowed the more aggressive of the two to be used in the final design.

Table 6–1: Controller Gains

    Gain      Value
    K_φ       0.85
    K_Iφ      0.04
    K_σ       -300
    K_h       -0.25
    K_Ih      -0.006
    K_φδe     0.06
    K_ψ       0.4, 1.05

The final gains were based on the ability to continuously attain three preprogrammed waypoints using a relatively smooth flight path. Essentially, the aircraft needed to autonomously reach the waypoints using the shortest possible flight path. The lateral-directional performance was evaluated using a threshold of 0.01 minutes of latitude/longitude, which corresponds to roughly 16.5 ft around the waypoint for latitude and longitude. The longitudinal performance was not considered critical because of the strict limitations imposed by the vision system and the MAV performance properties. Relatively poor performance was anticipated, so the threshold around the waypoint was increased to roughly 40 ft in altitude. The resulting threshold is depicted in Figure 6–6. Flight paths that passed within these thresholds were considered to have reached the waypoint, and the autopilot then directed the MAV to the next waypoint.

Figure 6–6: Waypoint Threshold
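The waypoint-attainment test described above can be written as a simple box check. The sketch below uses the stated 0.01 min lateral threshold and a 40 ft altitude band, with waypoint coordinates that are placeholders rather than the actual test points.

    LATLON_THRESH_MIN = 0.01     # minutes of latitude/longitude
    ALT_THRESH_FT = 40.0

    def reached(wpt, pos):
        """wpt and pos are (lat_min, lon_min, alt_ft); lat/lon given as minute offsets."""
        return (abs(pos[0] - wpt[0]) <= LATLON_THRESH_MIN
                and abs(pos[1] - wpt[1]) <= LATLON_THRESH_MIN
                and abs(pos[2] - wpt[2]) <= ALT_THRESH_FT)

    waypoints = [(33.25, 31.05, 150.0), (33.40, 31.20, 150.0), (33.20, 31.20, 150.0)]
    print(reached(waypoints[0], (33.245, 31.056, 170.0)))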

6.3 Tracking

The performance of the autopilot for tracking is an important evaluation. The final objective of the autopilot is waypoint navigation; however, this objective is only reached by the aircraft following commands such as roll and pitch. The ability to quickly follow, and maintain, a required change in roll will obviously affect the ability to reach waypoints, so the tracking performance must be investigated.

Figure 6–7 shows a representative plot of the roll response demonstrating the lateral stability augmentation system. Overall, the response tracks reasonably well. Some undesirable dynamic coupling is expected because differential elevator introduces some longitudinal dynamics and because no rudder was used for turn coordination. A short delay time and rise time are exhibited, with overshoot on the order of 2-3 bins of resolution (about 4.45 deg each). Also, the oscillations seen between 20-25 s are examples of the horizon "twitching" between two vision-estimated roll angles, an issue associated with the poor sensor resolution as well as with distortions due to lens curvature, as discussed in Section 3.5. Good steady-state tracking is seen otherwise.

Figure 6–7: Lateral Inner-Loop Performance

Figure 6–8 shows a pitch percentage plot demonstrating the longitudinal stability augmentation system, which uses proportional control to track pitch percentage. Although this controller is used only in specific circumstances as a command limiter, it is important that good general tracking properties are exhibited. The response for this non-aggressive command exhibits reasonably good steady-state tracking. When the controller is implemented as a command limiter, some steady-state error can be expected, as the limiting commands correspond to more aggressive attitude changes. These more aggressive values are accompanied by associated changes in airspeed, which could alter the response characteristics undesirably.

Figure 6–8: Longitudinal Inner-Loop Performance

6.4 Waypoint Navigation

The flight test demonstrated the ability of the autopilot to reach a set of waypoints. In this case, a set of 3 waypoints was chosen. The waypoints were separated by distances that could easily be covered by the aircraft within a minute but required some aggressive turns. The altitude of each waypoint was actually chosen to be equal because the climb performance of the vehicle was considerably worse than the turn performance.

The performance of an initial configuration, K_ψ = 0.4, with no heading estimator is shown in Figure 6–9.

The plot depicts a GPS map of the waypoint configuration, with each of the waypoints shown as a solid dot and the distance threshold as the surrounding dashed square. The aircraft clearly attains the waypoints successively; however, the gross acquisition and fine tracking phases of navigation are quite non-aggressive. The acquisition turns are relatively wide and, while the controller is capable of guiding the MAV to within each waypoint threshold, the fine tracking performance does not exhibit enough authority to consistently hit the waypoints with precision. As a result, this autopilot would likely perform poorly and cause excessive circling for waypoints that require aggressive maneuvering.

Figure 6–9: Waypoint Navigation with K_ψ = 0.4

The autopilot was adjusted to provide more aggressive maneuvering with respect to turns. Specifically, the heading gain was increased to K_ψ = 1.0 to cause tighter turns in response to heading error. The resulting flight path is shown in Figure 6–10. The increased gain resulted in sharp turns but also excessive overshoot. As a result, the aircraft frequently missed the waypoint, and the flight path displays a non-uniform circling behavior.

Figure 6–10: Waypoint Navigation with K_ψ = 1.0

The autopilot is challenged by an issue related to time scales; namely, the time scale of the GPS measurements is considerably greater than the time scale of the flight dynamics. Essentially, the MAV can turn and cause considerable changes in flight path much faster than the 1 Hz updates from the GPS sensor. As a result, the aggressive controller commanded a fast turn, but the nose moved too far before the controller received feedback about the new heading. The flight path in Figure 6–10 shows the overshoots that result from a fast rate of turn but a slow rate of heading measurement.

The estimator for heading was then incorporated into the autopilot. The resulting flight path is shown in Figure 6–11. The estimator is able to predict the heading angle during turns at 50 Hz to provide feedback for the autopilot. Consequently, the autopilot can command an aggressive turn for an amount of time suitable to the flight dynamics, regardless of the GPS rate.

Figure 6–11: Waypoint Navigation with K_ψ = 1.0 and Heading Estimator

Clearly, the estimator significantly improved the performance of the autopilot for waypoint navigation. The large heading changes required during gross acquisition immediately after hitting a waypoint are observably more aggressive in Figure 6–11 than in Figure 6–9. Conversely, the fine tracking phase of navigation in Figure 6–11 exhibits little of the overshoot seen in Figure 6–10. A representative flight path through the waypoints selected for the case shown in Figure 6–9 measured out to 17017 ft, while a representative track with the estimator running covered only 15537 ft of ground. The discrepancy between trials represents a 9% reduction in flight-path distance with the estimator running.

The estimator also affects several features of the flight data. One effect is that a more uniform flight path from pass to pass is maintained when estimated data are used. Another effect is that the vehicle consistently tracks to within a short distance of the actual target at the center of the waypoint threshold. Thus, using rough estimates at 50 Hz is shown to increase both aggressiveness and precision. The three cases described are plotted against each other in Figure 6–12.

Figure 6–12: Comparison of Waypoint Navigation Flight Paths

The performance of the altitude controller was reasonable but experienced some adverse effects due to the flight performance properties of the MAV. The results are therefore more appropriately presented in the next section, during the discussion of the major effects that the MAV performance characteristics and flight regime have on the performance of the controller.

6.5 Controller Performance Analysis

The data in Figure 6–11 indicate reasonable performance of the closed-loop system for waypoint navigation; however, several issues associated with the flight path must be considered along with simply attaining the waypoints. In particular, the performance with respect to altitude hold and gust rejection needs to be investigated. Additional sensor measurements must thus be analyzed to fully evaluate the controller.

The wind during the flight testing was recorded at 7.5 ft/s from the west-southwest. The effect of this wind, whose direction is shown in Figure 6–13, is clearly seen in this representative segment of the flight path. The aircraft, which nominally flies at about 44 ft/s, experiences a variation in airspeed of about 34% from upwind to downwind conditions. The direction of the wind is such that the MAV is blown into the turn when acquiring waypoint-2 but blown out of the turn when acquiring waypoint-3. Consequently, the MAV is somewhat blown downwind during the flight.

Figure 6–13: Wind Relative to Flight Path

The wind also has a noticeable effect on the roll tracking in addition to the turn radius. Essentially, the autopilot is designed for a single airspeed, but the vehicle operates over a wide range of airspeeds throughout the flight. The inner-loop controller which provides roll tracking, shown in Figure 4–1, is thus required to provide sufficient performance despite this variation in airspeed and the associated aircraft dynamics.

Figure 6–14 demonstrates the effect of wind on the inner-loop roll tracker. The vehicle typically iterates between an 8 s turn as it acquires waypoint-2, then an 8 s turn as it acquires waypoint-3, then a pair of 1 s turns to acquire waypoint-1. The turns starting at approximately 3 s, 42 s, and 84 s are the turns into the wind and show roll angles exceeding the commanded values, including an overshoot. Conversely, the turns starting at approximately 19 s, 58 s, and 100 s are turns out of the wind and show roll angles smaller than the commanded values. These results are directly caused by the variation in control-surface effectiveness resulting from the wind.

Figure 6–14: Roll Angle

The altitude, especially during these turns, is another issue of interest because the vehicle is somewhat underpowered. The altitude tracking is shown in Figure 6–15 to be reasonable for straight flight but quite poor for turns. The vehicle consistently loses a great deal of altitude during the turns into the wind but considerably less during the turns out of the wind.

Figure 6–15: Altimeter

The pitch percentage shown in Figure 6–16 indicates, as expected, a dramatic increase during times associated with large roll commands. Such increases are indicative of a large nose-down pitching moment. The differential elevators are a major contributor to this moment when steeply banked because of their location aft of the center of gravity. This coupling effect is more evident when turning into the wind because of the increased elevator effectiveness associated with the change in airspeed. The loss of altitude is abated to some extent by the gain added through K_φδe; however, the altitude loss was still significant for the turn into the wind.

Figure 6–16: Pitch Percentage

The dependence of flight performance on wind conditions is further demonstrated in Figure 6–17.

The turn to acquire waypoint-2 corresponds to headings from 270° to 90°, whereas the turn to acquire waypoint-3 corresponds to headings from 100° to 310°, with the wind originating from about 250°. Both acquisition turns exhibit a loss of altitude; however, the loss is clearly greatest during the downwind turn to acquire waypoint-2, resulting in the inverted peak seen in Figure 6–17 near 100°. The vehicle recovers altitude during the two segments flown at approximately constant headings of 90° and 270°. Significant overshoot in the climb is seen in these cases, which is likely due to off-design controller performance resulting from wind effects.

Figure 6–17: Altimeter Relationship to Heading

Finally, some discussion is warranted on the function of the switching mechanism in the longitudinal controller shown in Figure 5–7. That controller actually has two separate elements, one tracking pitch percentage and one tracking altitude. The switch attempts to retain the horizon in the image, and consequently ensure an accurate vision-based attitude estimation, by choosing the less aggressive of the two elements. In this case, the controller will only track pitch percentage when the aircraft has a large error in altitude and the altitude tracker commands an elevator deflection that would pitch the camera image above or below the horizon. The gain, K_σ, is chosen exceedingly large to maintain aggressive tracking near the extreme limits of pitch percentage. It is critical that the controller has the capability to achieve these limiting values of pitch percentage when called upon. Poor tracking could cause the horizon to move out of view of the camera, with potentially disastrous results.

Figure 6–18 displays the instances during the flight test for which the pitch percentage tracking command was used in place of the altitude tracking command. These instances are denoted by values of one in the data set. The controller mostly tracked altitude commands because the flight path rarely required a pitch attitude large enough to lose the horizon. The few instances when the switch activated actually correlate to situations where the error in pitch percentage was coincidentally low, so the less aggressive maneuver was to track pitch percentage. The situation might be different for waypoint configurations that require drastic changes in altitude. The only changes required in this case were the result of the altitude controller making up for poor performance. An environment with obstacles might require the vehicle to make large and sudden changes in altitude. An interesting follow-up experiment would involve putting the controller in such a situation to better determine the effectiveness of the methodology used for feature retention.

Figure 6–18: Longitudinal Control Toggle


CHAPTER 7
FUTURE DIRECTIONS

Mission capability for any autonomous aircraft, including a MAV, requires a control system that can perform a diverse set of tasks. Attitude stability and tracking represent only a small subset of the control tasks that might be considered essential to fully autonomous guidance and navigation. Some of these tasks include trajectory tracking, path planning, obstacle avoidance, and target recognition. The successful completion of these tasks depends in large part on the information provided by on-board sensors. In the case of a MAV, the available sensors are limited by size and weight constraints. A small video camera can provide a wide variety of information, and in many cases this data can be used for control purposes. It therefore becomes desirable to extend the technologies developed in the preceding chapters of this thesis to address some of the higher-level outer-loop control problems.

It is apparent that the future direction of vision-based flight control will be heavily dependent on advances in image processing technology. Knowledge of the nature of information provided by a vision sensor is necessary to begin study of various interpretations and implementations in control laws. Advances in feature detection, classification, and tracking are likely to continue. The assumption is made that advanced image processing algorithms will be developed to yield increasing amounts of useful information from the focal plane. These assumptions allow anticipative investigation of how one would take advantage of such information if and when it becomes available.

The broad goal for the direction of vision-based flight control, particularly for MAVs, can be illustrated by Figure 7-1. Typical mission profiles might require a MAV to autonomously navigate an unknown urban environment.


Inertial sensors may be sufficient for certain segments of flight, while other segments could render readings from an inertial measurement unit (IMU) unusable. In the latter case, a controller might have to rely on computer vision alone. The possibility also exists that some control tasks would require the blending of inertial sensors with vision-based information. For full autonomy, a system capable of achieving mission profiles like that described by Figure 7-1 requires the capability to switch modes of control in real time.

Figure 7-1: Technology Roadmap for Vision-Based Flight Control

This discussion presents several directions for future investigation. First, it is necessary to identify possible control tasks that a MAV might be required to perform during autonomous flight in an unknown environment. For example, obstacle detection and avoidance should certainly be included in this set. A variety of control approaches might be appropriate for each task. For the example of avoidance, different actions might be required depending on the nature of the obstacle encountered and the time period between detection and imminent collision. Advanced techniques for aggressive maneuvering may be necessary. Following task identification, the sensor measurements required to perform specific tasks must then be examined. Certain control tasks may require the blending of inertial and vision information on various levels. It will be necessary to develop approaches to address the variety of uncertainties that are presented through the combination of different sensor technologies.


The interpretation of two fairly uncertain signals as one signal of greater certainty is a critical capability in such cases. Eventually, the various control approaches will begin to form well-defined modes of control, each requiring unique use of available sensor information. The higher-level decisions regarding the selection of the appropriate approach to use in any given situation must be investigated. For the avoidance example, the system must be capable of deciding whether to use "mild maneuver avoid" or "aggressive maneuver avoid." These modes may break down further into numerous sub-modes. The controller must have the ability to assess a given situation and decide upon proper action. Sensor uncertainty analysis factors heavily into these decisions, as the controller must know which measurements to trust and which to ignore in choosing a control action.
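As a minimal sketch of what blending two uncertain signals into one of greater certainty can mean, consider an inverse-variance weighted average. Neither the function nor the numbers below come from this thesis; they simply illustrate the idea under the assumption that each sensor's error variance is known.

    def fuse(x_vision, var_vision, x_inertial, var_inertial):
        """Inverse-variance weighted blend of two noisy estimates of the same
        quantity (for example, roll angle from vision and from an IMU). The
        fused variance is never larger than either input variance."""
        w_v = 1.0 / var_vision
        w_i = 1.0 / var_inertial
        x_fused = (w_v * x_vision + w_i * x_inertial) / (w_v + w_i)
        var_fused = 1.0 / (w_v + w_i)
        return x_fused, var_fused

    # Example: a 20 deg roll estimate with variance 4 deg^2 blended with a
    # 24 deg estimate with variance 1 deg^2 yields 23.2 deg with variance 0.8 deg^2.
    print(fuse(20.0, 4.0, 24.0, 1.0))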


The actual process of switching from one control mode to another will introduce a new set of challenges. Smooth transition from one action to the next is critical in a cluttered flight environment. The controller must anticipate and take proper action to ensure efficient transitions. This capability may factor into other decision-making processes. For example, the controller may be faced with an avoidance situation for which "avoid left" and "avoid right" are both viable options. The scenario following "avoid left" may require some other action that could result in a stall if it immediately follows "avoid left." If the MAV can anticipate this transition issue through sensor measurements and proper interpretation, the decision to switch to "avoid right" mode can be made.

Finally, the controller might be required to reconfigure the mission profile given the introduction of unexpected course deviations. For example, the controller may be required to switch from a navigation mode to an obstacle avoidance mode, then exhibit on-line path-planning capabilities to resume the original mission goal from the current location and trajectory. Likewise, a tracking scenario might involve a human target climbing into a vehicle. The controller must reconfigure the mission to track the vehicle instead of the human. These scenarios both require the controller to make high-level decisions concerning its own mission requirements.

This discussion shows that the work presented in this thesis is only a small step in the development of advanced mission capability using computer vision as a sensor. Several examples of future directions have been identified. These examples show that computer vision represents an integral part of both advanced intelligent flight control techniques and autonomous vehicle mission planning.


CHAPTER 8
CONCLUSION

This thesis has demonstrated that waypoint navigation can be accomplished using GPS and altitude sensing coupled with vision-based attitude estimation. The estimated states are actually only roll angle and pitch percentage; however, these parameters provide sufficient information for basic stability augmentation. A flight test of the resulting autopilot clearly indicates its performance. In this case, the autopilot is able to generate a short flight path that successively reaches a set of waypoints. The current approach is limited to flight operations in which the horizon is visible; however, the fundamental concept relates to using an autopilot with feedback of a vision-based feature. As such, the autopilot can be extended using this concept to consider other vision-based features, such as building corners or roads, for guidance and navigation.


REFERENCES

[1] Abdulrahim, M., Garcia, H., and Lind, R., "Flight Characteristics of Wing Shaping for a Micro Air Vehicle with Membrane Wings," in press, AIAA Journal of Aircraft.
[2] Abdulrahim, M., Garcia, H., and Lind, R., "Flight Testing a Micro Air Vehicle Using Morphing for Aeroservoelastic Control," Proceedings of the AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference, AIAA-2004-1674, Palm Springs, CA, April 2004.
[3] Abdulrahim, M., and Lind, R., "Flight Testing and Response Characteristics of a Variable Gull-Wing Morphing Aircraft," Proceedings of the AIAA Guidance, Navigation, and Control Conference and Exhibit, AIAA-2004-5113, Providence, RI, August 2004.
[4] Albertani, R., Hubner, P., Ifju, P.G., Lind, R., and Jackowski, J., "Experimental Aerodynamics of Micro Air Vehicles," Proceedings of the SAE World Aviation Congress, 2004-01-3090, Reno, NV, November 2004.
[5] Boothe, K., Fitzpatrick, K., and Lind, R., "Controllers for Disturbance Rejection for a Linear Input-Varying Class of Morphing Aircraft," submitted to the AIAA/ASME/ASCE/AHS Structures, Structural Dynamics, and Materials Conference, Austin, TX, April 2005.
[6] Burt, P.J., Bergen, J.R., Hingorani, R., Kolczynski, R., Lee, W.A., Leung, A., Lubin, J., and Shvayster, H., "Object Tracking with a Moving Camera: An Application of Dynamic Motion Analysis," IEEE CH2716-9/89/0000/0002, 1989.
[7] Campos, M.F.M., and Coelho, L. de S., "Autonomous Dirigible Navigation Using Visual Tracking and Pose Estimation," Proceedings of the IEEE International Conference on Robotics and Automation, Vol. 4, Detroit, MI, May 1999, pp. 2584-2589.
[8] Castro, A.P.A., Silva, J.D., and Simoni, P.O., "Image Based Autonomous Navigation with Fuzzy Logic Control," Proceedings of the International Joint Conference on Neural Networks, Vol. 3, 15-19 July 2001, pp. 2200-2205.
[9] Causey, R., A Lateral Vision-Based Control Autopilot for Micro Air Vehicles Using a Horizon Detection Approach, M.S. Thesis, Department of Mechanical and Aerospace Engineering, University of Florida, December 2003.


[10] Chatterji, G.B., Menon, P.K., and Sridhar, B., "Vision-Based Position and Attitude Determination for Aircraft Night Landing," Journal of Guidance, Control, and Dynamics, Vol. 21, No. 1, January-February 1998, pp. 84-92.
[11] Choi, J.Y., Kim, C.S., Hong, S., Lee, M.H., Bae, J.I., and Harashima, F., "Vision Based Lateral Control by Yaw Rate Feedback," Proceedings of the 27th Annual Conference of the IEEE Industrial Electronics Society, Vol. 3, 29 November - 3 December 2001, pp. 2135-2138.
[12] Ettinger, S.M., Nechyba, M.C., Ifju, P.G., and Waszak, M.R., "Vision-Guided Flight Stability and Control for Micro Air Vehicles," Proceedings of the IEEE International Conference on Intelligent Robots and Systems, Vol. 3, Lausanne, Switzerland, October 2002, pp. 2134-2140.
[13] Fitzpatrick, K., "Applications of Linear Parameter-Varying Control for Aerospace Systems," Master's Thesis, University of Florida, 2003.
[14] Frew, E., McGee, T., Kim, Z., Xiao, X., Jackson, S., Morimoto, M., Rathinam, S., Padial, J., and Sengupta, R., "Vision-Based Road-Following Using a Small Autonomous Aircraft," Proceedings of the 2004 IEEE Aerospace Conference, 0-7803-8155-6, IEEEAC paper #1479, Big Sky, MT, March 2004.
[15] Garcia, H., Abdulrahim, M., and Lind, R., "Roll Control for a Micro Air Vehicle Using Active Wing Morphing," Proceedings of the 2003 AIAA Guidance, Navigation, and Control Conference, AIAA-2003-5347, Austin, TX, August 2003.
[16] Grasmeyer, J.M., and Keennon, M.T., "Development of the Black Widow Micro Air Vehicle," Proceedings of the AIAA Aerospace Sciences Meeting and Exhibit, AIAA-2001-0127, Reno, NV, January 2001.
[17] Hutchinson, S., Hager, G., and Corke, P., "A Tutorial on Visual Servo Control," IEEE Transactions on Robotics and Automation, Vol. 12, pp. 651-670, October 1996.
[18] Ifju, P.G., Jenkins, D.A., Ettinger, S., Lian, Y., Shyy, W., and Waszak, M.R., "Flexible-Wing-Based Micro Air Vehicles," Proceedings of the AIAA Aerospace Sciences Meeting and Exhibit, AIAA-2002-0705, Reno, NV, January 2002.
[19] Ifju, P.G., Ettinger, S., Jenkins, D.A., and Martinez, L., "Composite Materials for Micro Air Vehicles," Society for the Advancement of Materials and Process Engineering Annual Conference, Paper #46-162, Long Beach, CA, May 2001.
[20] Juberts, M., and Raviv, D., "Vision-Based Vehicle Control for AVCS," Intelligent Vehicles Symposium, National Institute of Standards and Technology, Gaithersburg, MD, 14-16 July 1993, pp. 195-200.
[21] Kasetkasem, T., and Varshney, P.K., "An Image Detection Algorithm Based on Markov Random Field Models," IEEE Transactions on Geoscience and Remote Sensing, Vol. 40, No. 8, August 2002, pp. 1815-1823.


[22] Kato, T., Ninomiya, Y., and Masaki, I., "An Obstacle Detection Method by Fusion of Radar and Motion Stereo," IEEE Transactions on Intelligent Transportation Systems, Vol. 3, No. 3, September 2002, pp. 182-188.
[23] Kosecka, J., Blasi, R., Taylor, C.J., and Malik, J., "A Comparative Study of Vision-Based Control Strategies for Autonomous Highway Driving," Proceedings of the IEEE International Conference on Robotics & Automation, Leuven, Belgium, May 1998, pp. 1903-1908.
[24] Kwolek, B., Kapuscinski, T., and Wysocki, M., "Vision-Based Implementation of Feedback Control of Unicycle Robots," Robot Motion and Control Workshop, 28-29 June 1999, pp. 101-106.
[25] Lian, Y., and Shyy, W., "Three-Dimensional Fluid-Structure Interactions of a Membrane Wing for Micro Air Vehicle Applications," Proceedings of the AIAA/ASME/ASCE/AHS Structures, Structural Dynamics, and Materials Conference, AIAA-2003-1726, Norfolk, VA, April 2003.
[26] Mandelbaum, R., McDowell, L., Bogoni, L., and Hanson, M., "Real-Time Stereo Processing, Obstacle Detection, and Terrain Estimation from Vehicle-Mounted Stereo Cameras," Proceedings of the Fourth IEEE Workshop on Applications of Computer Vision, Princeton, NJ, 19-21 October 1998, pp. 288-289.
[27] Manigel, J., and Leonhard, W., "Vehicle Control by Computer Vision," IEEE Transactions on Industrial Electronics, Vol. 39, No. 3, June 1992, pp. 181-188.
[28] Morris, S., and Holden, M., "Design of Micro Air Vehicles and Flight Test Validation," Proceedings of the Conference on Fixed, Flapping and Rotary Wing Vehicles at Very Low Reynolds Numbers, South Bend, IN, June 2000, pp. 153-176.
[29] Nguyen, Ogburn, Gilbert, Kibler, Brown, and Deal, "Simulator Study of Stall/Post-Stall Characteristics of a Fighter Airplane with Relaxed Longitudinal Static Stability," NASA Technical Paper 1538, December 1979.
[30] Raney, D.L., and Slominsky, E.C., "Mechanization and Control Concepts for Biologically Inspired Micro Aerial Vehicles," Proceedings of the 2003 AIAA Guidance, Navigation, and Control Conference and Exhibit, AIAA-2003-5345, Austin, TX, August 2003.
[31] Rock, S.M., Frew, E.W., Hank, J., LeMaster, E.A., and Woodley, B.R., "Combined CDGPS and Vision-Based Control of a Small Autonomous Helicopter," Proceedings of the American Control Conference, 0-7803-4530-4/98, Philadelphia, PA, June 1998, pp. 694-698.
[32] Shyy, W., Berg, M., and Ljungqvist, D., "Flapping and Flexible Wings for Biological and Micro Air Vehicles," Progress in Aerospace Sciences, Vol. 35, No. 5, 1999, pp. 455-506.


[33] Silveira, G.F., Carvalho, J.R.H., Madrid, M.K., Rives, P., and Bueno, S.S., "A Fast Vision-Based Road Following Strategy Applied to the Control of Aerial Robots," Brazilian Symposium on Computer Graphics and Image Processing, Florianopolis, Brazil, 15-18 October 2001, pp. 226-231.
[34] Soatto, S., Perona, P., Frezza, R., and Picci, G., "Motion Estimation via Dynamic Vision," Proceedings of the 33rd IEEE Conference on Decision and Control, Lake Buena Vista, FL, December 1994, pp. 3253-3258.
[35] Soatto, S., and Perona, P., "Dynamic Visual Motion Estimation from Subspace Constraints," Proceedings of the IEEE International Image Processing Conference, 0-8186-6950-0/94, Austin, TX, November 1994, pp. 333-337.
[36] Soatto, S., and Perona, P., "Reducing 'Structure from Motion'," Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 1063-6919/96, San Francisco, CA, June 1996, pp. 825-832.
[37] Soatto, S., and Perona, P., "Visual Motion Estimation from Point Features: Unified View," Proceedings of the IEEE International Image Processing Conference, 0-8186-7310-9/95, Washington, DC, October 1995, pp. 21-24.
[38] Soatto, S., and Perona, P., "Reducing 'Structure from Motion': A General Framework for Dynamic Vision, Part 1: Modeling," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 20, No. 9, September 1998, pp. 933-942.
[39] Soatto, S., and Perona, P., "Reducing 'Structure from Motion': A General Framework for Dynamic Vision, Part 2: Implementation and Experimental Assessment," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 20, No. 9, September 1998, pp. 943-960.
[40] Stevens, B., and Lewis, F., Aircraft Control and Simulation, Wiley-Interscience, New York, 1992.
[41] Thomas, B.T., Lotufo, R.A., Morgan, A.D., Milford, D.J., and Dagless, E.L., "Real-Time Image Analysis for Vision Guided Vehicle Control," UK IT 1990 Conference, Southampton, UK, 19-22 March 1990, pp. 66-70.
[42] Viieru, D., Lian, Y., Shyy, W., and Ifju, P.G., "Investigation of Tip Vortex on Aerodynamic Performance of a Micro Air Vehicle," Proceedings of the 33rd AIAA Fluid Dynamics Conference and Exhibit, AIAA-2003-3597, Orlando, FL, June 2003.
[43] Waszak, M.R., Jenkins, L.N., and Ifju, P., "Stability and Control Properties of an Aeroelastic Fixed Wing Micro Air Vehicle," Proceedings of the AIAA Atmospheric Flight Mechanics Conference, AIAA-2001-4005, Montreal, Canada, August 2001.


[44] Waszak, M.R., Davidson, J.B., and Ifju, P.G., "Simulation and Flight Control of an Aeroelastic Fixed Wing Micro Air Vehicle," Proceedings of the AIAA Atmospheric Flight Mechanics Conference, AIAA-2002-4875, Monterey, CA, August 2002.
[45] Yao, Y., and Chellappa, R., "Dynamic Feature Point Tracking in an Image Sequence," Proceedings of the 12th IAPR International Conference on Pattern Recognition, IEEE 1051-4651/94, Jerusalem, Israel, October 1994, pp. 654-657.
[46] Zhang, B., Lian, Y., and Shyy, W., "Proper Orthogonal Decomposition for Three-Dimensional Membrane Wing Aerodynamics," Proceedings of the 33rd AIAA Fluid Dynamics Conference and Exhibit, AIAA-2003-3917, Orlando, FL, June 2003.


BIOGRAPHICAL SKETCH

Joseph John Kehoe was born in Cooperstown, NY, on July 16, 1980. He grew up in Oneonta, NY, and graduated from Oneonta High School in June of 1998. He then attended Virginia Polytechnic Institute and State University, where he received a Bachelor of Science degree in aerospace engineering in May of 2002. During his senior year at Virginia Tech, Joseph was part of an interdisciplinary/international aircraft design team whose efforts resulted in Ikelos, a unique general aviation aircraft designed to fit into NASA's Small Aircraft Transportation Systems (SATS) infrastructure. The design won second place in the 2002 student competition out of NASA Langley and was featured at the EAA's AirVenture in Oshkosh, WI, as well as in the October 2002 issue of Popular Science. He worked two internships with NLX Corporation in Sterling, VA, where he helped with the development and qualification processes of several FAA Level D full-flight simulator programs. Joseph is currently a second-year graduate student in the Department of Mechanical and Aerospace Engineering at the University of Florida. He is studying under Dr. Rick Lind, and his research involves the development of vision-based flight control methodologies.













AUTOPILOT DEVELOPMENT FOR A MICRO AIR VEHICLE USING
VISION-BASED ATTITUDE ESTIMATION
















By

JOSEPH JOHN KEHOE


A THESIS PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
MASTER OF SCIENCE

UNIVERSITY OF FLORIDA


2004
















Dedicated to Sarah, for her

patience,

understanding,

caring,

Love.















ACKNOWLEDGMENTS

I would like to acknowledge AFRL-MN and the management at Eglin Air Force

Base because none of this work would have been possible without their continued

financial support. I would also like to extend thanks to Dr. Peter Ifju and his students

Kyu Ho Lee and Sewoong Jung, along with the entire MAV Lab for the design,

development, and fabrication of the aircraft. Heartfelt thanks go out to the researchers

at the Machine Intelligence Laboratory, especially Dr. Michael Nechyba, Jason

Grzywna, and Jason Plew, for the development of the electronics hardware, the system

software, the horizon detection algorithm, and most importantly for their generous help

in troubleshooting the various quirks that cropped up throughout the process. This

effort also would not have been possible without the help of the researchers in the

Flight Controls Lab, including Ryan Causey for laying the groundwork and feeling out

many of the issues, Kristin Fitzpatrick and Jason Jackowski for providing field support,

and Mujahid Abdulrahim for his piloting skills, observations on various topics, and

never-waning desire to go "AVCAAFing." I would like to express deep and sincere

thanks to Dr. Rick Lind for his time, guidance, support, and motivation. My parents,

Joseph and Linda Kehoe, deserve much credit for keeping me pointed in the right

direction over the years. Lastly, and certainly not least, I would like to thank Kenny

Boothe for locating the airplane one hot afternoon in May and well, just for being

Kenny B.
















TABLE OF CONTENTS


page


ACKNOWLEDGMENTS

LIST OF FIGURES .................................................... vi

ABSTRACT

CHAPTER

1 INTRODUCTION

  1.1 Motivation
  1.2 Overview

2 MICRO AIR VEHICLES

  2.1 Background
      2.1.1 Characterization and Applications
      2.1.2 Current Technology
  2.2 Control Issues

3 VISION-BASED ATTITUDE ESTIMATION

  3.1 Image-Based Features
  3.2 Vision for Flight Control
  3.3 Horizon Detection
      3.3.1 Interpretation
      3.3.2 Algorithm
  3.4 State Estimation
      3.4.1 Roll
      3.4.2 Pitch
  3.5 Vision Issues

4 INNER-LOOP CONTROL

  4.1 Lateral Stability Augmentation
      4.1.1 Methods
      4.1.2 Control Architecture
  4.2 Longitudinal Stability Augmentation System
      4.2.1 Methods
      4.2.2 Control Architecture

5 GUIDANCE AND NAVIGATION .......................................... 33

  5.1 Directional Controller ........................................ 33
      5.1.1 Methods ................................................. 33
      5.1.2 Control Architecture .................................... 34
      5.1.3 State Estimation ........................................ 35
  5.2 Altitude Control .............................................. 41

6 FLIGHT DEMONSTRATION ............................................. 43

  6.1 Aircraft ...................................................... 43
  6.2 Controller .................................................... 45
  6.3 Tracking ...................................................... 51
  6.4 Waypoint Navigation ........................................... 53
  6.5 Controller Performance Analysis ............................... 57

7 FUTURE DIRECTIONS ................................................ 62

8 CONCLUSION ....................................................... 66

REFERENCES ......................................................... 67

BIOGRAPHICAL SKETCH ................................................ 72















LIST OF FIGURES
Figure page

2-1  Micro Air Vehicles ............................................. 6
2-2  Various MAVs studied at the University of Florida .............. 7
2-3  A Biologically Inspired Control Approach ....................... 8
3-1  Mapping a Physical Feature to the Image Plane .................. 12
3-2  MAV Kinematics ................................................. 13
3-3  Estimated Horizon shown Relative to Physical Horizon and Local Horizontal ... 16
3-4  Vision Estimated Horizon ....................................... 17
3-5  Error in Horizon Estimation .................................... 18
3-6  Effect of Roll on Pitch Percentage ............................. 21
3-7  Lens Curvature for a Horizon ................................... 24
4-1  Lateral Stability Augmentation System .......................... 28
4-2  Longitudinal Stability Augmentation System ..................... 31
5-1  Backwards Difference Relationship for Heading Calculation ...... 33
5-2  Closed-Loop Directional Control System ......................... 35
5-3  Measured heading significantly lags actual heading at each GPS datapoint ... 36
5-4  Vector Diagram of GPS Sensor Measurement ....................... 37
5-5  Heading/Position Estimate Relationship ......................... 38
5-6  Closed-Loop Directional Control System ......................... 41
5-7  Closed-Loop Altitude Control System ............................ 41
6-1  MAV ............................................................ 43
6-2  MAV Pusher-Prop Configuration .................................. 44
6-3  Mounting Configuration of On-Board Electronics ................. 45
6-4  Approximation of the Relationship Between Turn Rate and Bank Angle ... 50
6-5  Selection of MAV Velocity Data for Heading Estimation .......... 50
6-6  Waypoint Threshold ............................................. 52
6-7  Lateral Inner-Loop Performance ................................. 52
6-8  Longitudinal Inner-Loop Performance ............................ 53
6-9  Waypoint Navigation with Kψ = 0.4 .............................. 54
6-10 Waypoint Navigation with Kψ = 1.0 .............................. 55
6-11 Waypoint Navigation with Kψ = 1.0 and Heading Estimator ........ 55
6-12 Comparison of Waypoint Navigation Flight Paths ................. 56
6-13 Wind relative to Flight Path ................................... 57
6-14 Roll Angle ..................................................... 58
6-15 Altimeter ...................................................... 59
6-16 Pitch Percentage ............................................... 59
6-17 Altimeter Relationship to Heading .............................. 60
6-18 Longitudinal Control Toggle .................................... 61
7-1  Technology Roadmap for Vision-Based Flight Control ............. 63















Abstract of Thesis Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Master of Science

AUTOPILOT DEVELOPMENT FOR A MICRO AIR VEHICLE USING
VISION-BASED ATTITUDE ESTIMATION

By

Joseph John Kehoe

December 2004

Chair: Richard C. Lind, Jr.
Major Department: Mechanical and Aerospace Engineering

Missions envisioned for micro air vehicles may require a high degree of auton-

omy to operate in unknown environments. These missions require advanced sensor

technology for control tasks such as obstacle detection and avoidance. A video camera

provides a small, lightweight, and inexpensive alternative to infrared and ultrasonic

sensors. As such, vision is a critical technology for mission capability. This thesis

discusses an autopilot that uses vision coupled with GPS and altitude sensors for

waypoint navigation. The vision processing analyzes a horizon to estimate roll and

pitch information. The GPS and altitude sensors are then analyzed to command values

to roll and pitch for navigation between waypoints. A flight test of a MAV using this

autopilot demonstrates the resulting closed-loop system is able to autonomously reach

several waypoints. Several issues, such as estimating heading to account for slow

updates, are investigated to increase performance.















CHAPTER 1
INTRODUCTION

1.1 Motivation

A variety of mission scenarios are envisioned for which a micro air vehicle (MAV)

would be particularly attractive. A MAV would be ideal to fly around a city looking

for biological agents. A MAV would be ideal as a hand-launched platform to surveil

over a ridge of a battlefield. A MAV would also be ideal for monitoring and tracking

targets in an urban environment. These mission scenarios require the vehicle to operate

outside the visual range of a ground operator; consequently, autonomy is a crucial

technology for enabling mission capability.

Many organizations around the world have MAV programs which are resulting in

a wide range of flying aircraft. Autopilots have even been developed for many vehicles

which allow basic autonomy. These autopilots are able to stabilize the vehicle and

traverse along a flight path determined by a set of waypoints. The majority of these

autopilots have not been documented in archival literature yet; however, several case

studies can be found on the internet.

The use of vision is being explored to provide sensing for an autopilot on a

MAV. Vision becomes a critical technology when considering mission profiles, such

as urban environments, for which a MAV is ideally suited. The existing autopilots

almost exclusively rely on gyros and accelerometers along with GPS and altimeters

for controlling the vehicle. In actuality, vision may be a primary sensor needed for

determining flight path, and potentially even attitude, during highly agile and aggressive

maneuvering to explore within a city.

Vision has been used for several control purposes. Many of the studies performed

for vision-based vehicle control have involved autonomous ground vehicles using vision









sensors for road following [8, 11, 20, 23, 27, 41]. Several papers have used similar

techniques for flight vehicles such as dirigibles and helicopters [7, 31, 33]. Two things

common to the vehicles in many of these studies are (1) they all have the capability to

maintain a fixed position for indefinite periods of time to process information, and (2)

they all use vision for outer-loop tracking purposes. Limited studies make use of vision

for control of fixed-wing aircraft [9, 10, 12, 14]. The same is true for the use of vision

for inner-loop vehicle stabilization [9, 12, 24].

1.2 Overview

This thesis discusses a method of waypoint navigation that uses vision-based

attitude estimation. The control methodology builds upon a previously developed

vision system. This system uses a statistical technique to determine an optimal

estimate of the horizon. The relative orientation of the estimated horizon is then used

to estimate body-axis roll angle and an indication of longitudinal attitude denoted

pitch percentage. These estimates are used by an inner-loop controller for stability

and tracking. An outer-loop controller determines a set of roll and pitch commands,

which are tracked by the inner-loop controller, to proceed along a pre-determined set of

waypoints.

A flight test is performed to demonstrate the methodology and performance of

the resulting controller. The MAV is equipped only with a single GPS receiver, an

altimeter, and a forward-pointing camera embedded in the nose of the aircraft. As

such, the vehicle must rely entirely on the vision system for estimating attitude. The

lack of gyros and accelerometers may seem unrealistic and limits the controller to flight

in areas where the horizon is obvious; however, the test is indicative of a controller

which uses a feature, such as the horizon, for feedback.

Several issues related to vision-based control are discussed in the context of the

flight test. One issue is related to coarse resolution in the measurement of vision-

estimated roll angle. Also, the issue of interpreting vision-based parameters as








physical information is highlighted. This issue becomes particularly important to the

relationship between pitch percentage and aircraft pitch attitude.

Issues presented by the MAV performance characteristics are also discussed in

the context of the flight test. Specifically, it is seen that the integration of a 1 Hz GPS

system with the significantly faster dynamics of the aircraft requires the estimation

of some parameters for effective waypoint navigation. Also, the highly variable flight

conditions of an aircraft on the MAV scale are seen to affect controller performance.















CHAPTER 2
MICRO AIR VEHICLES

2.1 Background

2.1.1 Characterization and Applications

Micro air vehicles, or MAVs, are typically defined as a class of aircraft having

a wingspan of less than 6 inches and which is capable of operating at airspeeds of

25 mph or less [18]. This definition is not strictly applied, as others have been used

in literature for various purposes [9]. The general concept is that the class of MAVs

consists of small, expendable aircraft that are practical for mission scenarios ill-suited

for conventional aircraft.

MAVs have been seen to exhibit highly agile and maneuverable behavior in

flight [1,2, 12,43]. These abilities likely result from the fact that the small, lightweight

vehicles possess small moments of inertia while they exhibit a disproportionately

large amount of control authority. These characteristics result in interesting flight

performance properties, including the ability to drastically change direction almost

instantaneously, evidence of non-traditional dynamic modes, and high-bandwidth

response characteristics. Such responsive behavior can cause difficulty for a human

pilot, but also displays potential utility for possible mission scenarios.

The small size and highly agile dynamics of MAVs stand to benefit a variety of

applications. When coupled with an advanced control system, autonomous flight in

an urban environment becomes a distinct possibility. Such capability could prove in-

valuable for surveillance and tracking missions in either a military or law enforcement

setting. For example, a MAV could be used for immediate bomb damage assessment if

deployed with/from the ordnance. Perhaps a soldier or police officer equipped with an

appropriately instrumented MAV capable of hand launch could extend awareness of the









ambient surroundings to a wider proximity, including areas dense with obstacles and

obstructions. In addition, the relatively insignificant presence of the MAV provides a

stealthy edge to such surveillance missions.

Similarly, the use of sophisticated sensor packages could transform a MAV into

a very useful mapping tool. Unknown battlefield environments could be efficiently

traversed and documented for strategic advantage. In the event of a radiological,

biological, or chemical disaster, an autonomous MAV could identify and map affected

regions. These mission scenarios all take advantage of the unique flight regime of the

MAV for increased mission efficacy and safety of parties involved.

Also, the size, structural properties, and general expendability make MAVs a

useful testbed for various research applications. Aerodynamic forces and moments

applied to a MAV in flight are small in magnitude relative to larger vehicles, thus

allowing lightweight and flexible materials to be used for construction. This fact

has allowed researchers in the field of wing morphing for flight control to sidestep

the complicated issues of materials selection and control actuation. MAV structural

properties allow simplistic morphing actuation mechanisms to be applied so that control

issues can be directly addressed with in-flight demonstration [1,2, 3, 15]. This thesis

in fact takes advantage of MAV durability and expendability to examine vision-based

flight control strategies. Crash survivability and repeatability play an enormous role in

experimental control design.

2.1.2 Current Technology

In recent years, many strides have been made to advance the technologies required

for more effective MAVs. The use of advanced materials has allowed increasingly

lightweight airframe construction such that smaller vehicles have seen increasing

payload capacity. Likewise, the miniaturization of electronics and communications

hardware has decreased the payload requirements for instrumentation and actuator

components. The result of these advances is smaller vehicles that can carry a









substantial payload, and therefore display the potential for autonomy on a useful

enough level for a variety of mission applications.

One of the earliest examples of a MAV that exhibited autonomous mission

capability was the Black Widow, developed by Aerovironment under funding from

DARPA [16]. The Black Widow had a wingspan of 6 inches and weighed about

100 grams. The configuration, which is shown in Figure 2-1(a), was a lifting-body

design constructed using expanded polystyrene (EPS) foam, balsa wood, and Kevlar.

This extremely small aircraft was able to carry an on-board electronics package that

included a custom-made video camera system, a 16-channel data logger, and autopilot

system. The autopilot performed altitude, airspeed, and heading hold operations and

also included a yaw damper. The electrically powered vehicle was shown to reach

airspeeds of about 30 mph and exhibited endurance of about 30 minutes.

Another early MAV design example is the Trochoid, depicted in Figure 2-1(b).

The Trochoid was developed by Steve Morris of MLB Company using Multidisci-

plinary Design Optimization (MDO) techniques, and was 6.9 inches in maximum

dimension. A 7.9 inch version of the Trochoid boasted an endurance of up to 20 min-

utes, flight speeds ranging from 10-60 mph, and was able to perform turns with radii as

small as 15 feet. The vehicle was also equipped with a video downlink and a stability

augmentation system.











(a) The Black Widow [16] (b) The Trochoid [28]


Figure 2-1: Micro Air Vehicles









A great deal of work has been done at the University of Florida in the area of

MAV development. Dr. Peter Ifju has been working with flexible wing designs ranging

in wingspan from as small as 4 inches to as large as 24 inches, as seen in Figure 2-2.

A flexible membrane wing has been seen to aid in rejection of wind gusts; a problem

which plagues vehicles that operate in the extremely low Reynolds' number flight

regime common to MAV designs [18]. Dr. Ifju's MAVs are typically constructed from

carbon fiber and are powered by an electric motor. In the 2004 Micro Air Vehicle

Competition, which was held in Tucson, Arizona, Dr. Ifju's team won awards in

several categories, adding to a long list of previous citations for work in the field of

MAVs.








Figure 2-2: Various MAVs studied at the University of Florida


The fact should be noted that MAV configurations other than fixed-wing designs

are currently under investigation throughout the research community. Some take a

biologically-inspired approach to flight by implementing flapping wing configura-

tions [30, 32]. Designs in this category pose enormously difficult problems in the fluid

mechanics and control categories. Another approach has realized MAVs as small lifting

fans using vectored-thrust and steering vanes for control capability [28]. Despite the

many interesting problems introduced by the numerous approaches to MAV design, this

thesis will focus primarily on issues related to fixed-wing MAVs.

2.2 Control Issues

The flight regime in which MAVs typically operate may require the applica-

tion of some non-traditional control techniques for effective performance. Typical

operating Reynolds' numbers are near the bottom of the spectrum between 80,000









and 150,000 [19]. A marked drop in lift to drag performance has been observed in

this range. While birds have adapted their control mechanisms to this regime quite

well, man has yet to master nature's secret to this engineering problem. Some have

attempted to mimic biological systems, as seen in Figure 2-3 [3]. Regardless, issues

are presented that are not normally dealt with in the field of flight control.








Figure 2-3: A Biologically Inspired Control Approach


Attempts have been made recently to model the aerodynamic characteristics

of MAVs. Computational efforts have examined the effect of tip vortices on the

aerodynamic performance [42]. Vortex flowfield interactions have been seen to play

a significant role in MAV aerodynamics due to the typically low aspect ratio of the

lifting surface. Several computational studies have also investigated the properties

of flexible membrane wings, such as those used at the University of Florida [25, 46].

Wings in this class have been seen to alleviate some of the effects due to unsteady

wind gusts. Flight characteristics have been examined from an experimental standpoint

as well, with one wind tunnel study resulting in the development of a full model and

simulation [4,43,44].

Several control challenges are presented by the aerodynamics and flight mechanics

inherent to aircraft on the MAV scale. Very small mass moments of inertia contribute

to diminished stability and control characteristics of the vehicle [1, 19]. Also, wind

gusts and turbulence effects occur on a comparable scale to a MAV's normal operating

airspeed. Suddenly the typical controls problem of disturbance rejection becomes one

more of complete change in operating condition because of the wide variation of wind

speeds over the wings. These factors can make control difficult for either a remote









pilot or an automated controller. It is apparent that the problem of MAV flight control

must be approached with care.

The use of innovative control effectors has been an area of study facilitated by

MAVs. Use of technologies such as morphing also provide an approach to more

effective MAV control. As discussed previously, an asset of MAV construction

techniques is that simple morphing strategies can be easily implemented. The use of

wing twisting and wing curling has been researched as an effective method for lateral

control [1,2, 15]. From a mathematical standpoint it has been proposed that morphing

strategies can be approximately treated using a special form of linear parameter-varying

(LPV) control coined linear input-varying (LIV) control [5, 13]. The concept is that

morphing inputs can be treated as an LPV problem where the input is the varied

parameter.

The consideration of autonomy implies the need for a useful payload capacity that

may be used to carry necessary electronics and sensors. Computer vision presents itself

as an attractive technology for automatic MAV control systems. Use of a small video

camera as a sensor provides many benefits due to light weight, high bandwidth, and

the notion that a wide variety and quantity of information can be assimilated through

the use of a single sensor. Thus a camera can be used not only for attitude estimation,

but also for guidance and navigation problems such as obstacle avoidance and target

tracking. Image processing and vision-based control are enabling technologies that are

quickly growing in application and capability.

A simple approach has been shown useful for stabilization of a MAV [12]. A

classification algorithm is used to detect the perceived horizon from a forward pointing

camera mounted on a MAV. Information about the attitude of the aircraft can be

inferred from the position and orientation of the perceived horizon in the image. Prior

efforts have used this fact to augment the stability and controllability of a MAV [12].

This thesis takes the idea a step further and proposes to use the concept of vision-based








attitude estimation for augmented stability as the fundamental element of a completely

autonomous system for control of a MAV.















CHAPTER 3
VISION-BASED ATTITUDE ESTIMATION

3.1 Image-Based Features

Computer vision is a valuable sensor technology in that it can provide a large

quantity/variety of information about the ambient environment at a relatively high

frequency with a single sensor. Image processing technologies continue to allow

the extraction of increasing amounts of useful information from two-dimensional

images, despite the fact that depth information is lost in the process of mapping the

environment to the image-plane. It is through techniques such as pattern recognition

and feature extraction that image-plane information is correlated to the physical

environment. This correlation enables the use of image-based parameters to be used for

control purposes relative to the physical space.

The three-dimensional physical environment is mapped to the two-dimensional

image plane by projection through a camera lens. Each point on the image plane

corresponds to a ray in 3D space, and as such, features of varying depth can be

mapped to a single location on the image-plane. Thus, depth information is lost [17].

Techniques exist to indirectly reclaim depth information. Examples of such techniques

include the use of multiple cameras, the use of a single camera in multiple positions,

and knowledge of the physical space.

The mapping of a given physical feature to the image plane can be described by

a perspective camera model [9, 17]. If the camera basis is fixed at the camera lens,

as depicted in Figure 3-1, the image plane is located at a distance f from the camera

basis origin. The position vector of a physical feature relative to the camera basis is

expressed as η. This vector determines a position in the image plane described by the

coordinates (μ, ν), related in Equations 3.1.




















Figure 3-1: Mapping a Physical Feature to the Image Plane


\mu = f\,\frac{\eta_1}{\eta_3}    (3.1a)

\nu = f\,\frac{\eta_2}{\eta_3}    (3.1b)
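
A minimal numerical sketch of Equations 3.1 follows; the function name, the focal length, and the sample feature position are illustrative assumptions rather than values from the thesis.

    import numpy as np

    def project_to_image(eta, f):
        """Perspective projection of a feature position expressed in the camera
        frame onto the image plane (Equations 3.1)."""
        eta1, eta2, eta3 = eta
        mu = f * eta1 / eta3    # Equation 3.1a
        nu = f * eta2 / eta3    # Equation 3.1b
        return mu, nu

    # Example: a feature offset 2 m and 5 m in the two image directions and
    # 50 m along the camera axis, viewed through a 4 mm lens (illustrative only).
    print(project_to_image(np.array([2.0, 5.0, 50.0]), f=0.004))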


3.2 Vision for Flight Control

If the camera frame can be related to a basis of some system requiring stabiliza-

tion or control, feature parameters based in the image plane can be interpreted for

control use. Previously, the coupled equations of motion for an arbitrary camera fixed

relative to the body-fixed basis of an aircraft have been developed by Causey [9]. The

formulation presented by Causey describes a detailed interrelationship between a phys-

ical feature mapped into the image plane with the aircraft dynamics. This formulation

provides a foundation for flight control relative to image-based features.

Figure 3-2 depicts the situation of a MAV in flight relative to some identified

physical feature; specifically, the corner of a building. Also shown is the image

resulting from the mapping of the environment within the field of view to the image

plane. The problem formulation holds the assumption that the vision processing

technology necessary for identification, recognition, and tracking of such a feature is









in place. Such technology is rapidly maturing given recent developments in the vision

processing community by Soatto and others [6,21,22,26,34,35,36,37,38,39,45].

















Figure 3-2: MAV Kinematics


The position of the aircraft relative to a locally suitable inertial reference frame

is given by the vector R_c. The position of the feature of interest relative to the same

inertial basis is given by ξ. If a camera is fixed to the aircraft at some location Δ

relative to the origin of the body basis, it follows that the position of the feature

relative to the camera can be expressed by Equation 3.2. Inspection of Figure 3-2

verifies this result.

\eta = \xi - \left( R_c + \Delta \right)    (3.2)

The orientation of the aircraft relative to the inertial frame can be described by a

typical Euler transformation, which is performed through multiplication by the time-

varying transformation matrix [L(φ, θ, ψ)], where the Euler rotation angles are given

by φ, θ, and ψ. Similarly, the orientation of the camera relative to the aircraft body

basis is found through a transformation [L(φ_c, θ_c, ψ_c)], where φ_c, θ_c, and ψ_c represent

the rotation angles for the sequence between the body basis and the camera basis. The

latter transformation may or may not be a function of time depending on the freedom









with which the camera is allowed to move relative to the body basis. The vector η is

of most use when expressed relative to the camera basis, as in Equation 3.3.


\eta = \left[ L(\phi_c, \theta_c, \psi_c) \right] \left[ L(\phi, \theta, \psi) \right] \left( \xi - R_c \right) - \left[ L(\phi_c, \theta_c, \psi_c) \right] \Delta    (3.3)

The time derivative of the above equation can be directly used to extend the

conventional state-space formulation of the aircraft equations of motion to include

the relative position of an arbitrary number of vision-identified feature points to the

aircraft. Given that the angular velocity of the body basis relative to the inertial frame

is E 6B and that the angular velocity of the camera basis relative to the body basis is
Bfc, the additional state equations for the new system are given by Equation 3.4. If

the position and velocity of a given feature in the image plane can be determined and

tracked to some level of precision, it becomes possible to derive useful error signals for

control use through the relationship described by Equations 3.4 and 3.1.


\dot{\eta} = -\dot{R}_c - \omega_{EB} \times \Delta - \left( \omega_{EB} + \omega_{BC} \right) \times \eta    (3.4)
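
A sketch of Equations 3.3 and 3.4 in code is given below. The Euler-angle convention (a 3-2-1 sequence), the function names, and the assumption that all vectors in Equation 3.4 are expressed in a common frame are simplifications introduced here; they are not prescribed by the thesis.

    import numpy as np

    def euler_to_dcm(phi, theta, psi):
        """Rotation matrix L(phi, theta, psi) for a 3-2-1 Euler sequence,
        taking components from the reference frame into the rotated frame."""
        c, s = np.cos, np.sin
        L1 = np.array([[1, 0, 0], [0, c(phi), s(phi)], [0, -s(phi), c(phi)]])
        L2 = np.array([[c(theta), 0, -s(theta)], [0, 1, 0], [s(theta), 0, c(theta)]])
        L3 = np.array([[c(psi), s(psi), 0], [-s(psi), c(psi), 0], [0, 0, 1]])
        return L1 @ L2 @ L3

    def feature_position(xi, R_c, Delta, att, cam_att):
        """Feature position relative to the camera, expressed in the camera
        frame (Equation 3.3), for a camera offset Delta fixed in the body frame."""
        L_b = euler_to_dcm(*att)       # inertial -> body
        L_c = euler_to_dcm(*cam_att)   # body -> camera
        return L_c @ L_b @ (xi - R_c) - L_c @ Delta

    def feature_rate(eta, R_c_dot, Delta, omega_EB, omega_BC):
        """Relative-position kinematics of Equation 3.4 (all vectors assumed
        expressed in a common frame for this sketch)."""
        return -R_c_dot - np.cross(omega_EB, Delta) - np.cross(omega_EB + omega_BC, eta)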


3.3 Horizon Detection

Evaluation of the horizon is the fundamental technique used in this thesis for

vision-based attitude estimation. Control relative to the estimated horizon can be

thought of as control relative to an image-based feature for the special case of which

the horizon is used as a feature. This technique clearly has limitations due to visibility

and ground features; however, the horizon is potentially valuable information, and

is often available for certain mission scenarios. Human pilots often prefer to use

visual interpretation of their environment external to the aircraft as a means for stable

flight. This thesis builds upon an existing technique for horizon detection which was

previously used for attitude estimation to stabilize a MAV [12].









3.3.1 Interpretation

For the purposes of this thesis, the physical horizon is defined as the plane

tangent to the surface of the earth at the point of an observer situated in an earth-fixed

position. If this observer were a camera looking in a direction parallel to this plane, the

image plane would be located some distance, f, from the lens and would be oriented

perpendicular to the camera axis and thus the horizon plane. The distance, f, is defined

as the focal length of the camera lens. In this case the horizon plane appears as a line

in the image plane. Conceptually, this line appears to separate the ground and the sky

in the field of view.

The vision-estimated horizon is only an approximation to the physical horizon for

the case of a forward-pointing camera carried by a MAV in straight and level flight.

The horizontal plane local to the camera basis is, in this case, a plane that is parallel

to the physical horizon and translated in the vertical direction by the altitude of the

aircraft. If this plane were a discernible feature, it would appear in the image plane as

a horizontal line some distance above the line where the ground is perceived to meet

the sky.

The detection algorithm utilized for this thesis detects this perceived edge along

which the sky and ground meet. Therefore in straight and level flight the vision

estimated horizon plane must be slightly askew of the local horizontal plane. Further,

the horizon estimate is slightly askew of the physical horizon due to the fact that the

local horizontal plane is oriented parallel to the physical horizon. This plane will then

intersect the physical horizon plane at some forward distance from the observer.

If the "physical" location of where the ground is perceived to meet the sky is

thought of as being a distance equal to the range of visibility along the observed

horizon plane, the estimated horizon will intersect the physical horizon plane at this

location. This situation is depicted in Figure 3-3, where the altitude of the observer









(and the local horizontal plane) is given by h, while the skew angle between the

estimated horizon and the local horizontal plane is given by σ₀.















Figure 3-3: Estimated Horizon shown Relative to Physical Horizon and Local Horizontal


Some level of error will be introduced into calculations based on the assumption

that the estimated horizon coincides with either the physical horizon or the local

horizontal plane. This error becomes increasingly negligible as the skew angle, σ₀,

grows smaller in magnitude. If it can be guaranteed that the range of visibility will

always be much larger than the altitude of the observer, σ₀ can be assumed to be

of negligible consequence. This condition is met given that visibility conditions are

usually on the order of tens of miles while MAV altitudes are typically less than

1000 ft. The fact that visibility usually improves with altitude aides the argument

as well. It is important to note however that level flight is assumed. While the error

introduced by these effects proves to be negligible for this case, introduction of a non-

zero pitch attitude will affect the relative angular displacement of the horizon planes

directly, and is discussed in more detail in Section 3.4.2.
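
As a rough worked example (the numbers are illustrative and not taken from the flight test), an observer at h = 1000 ft with a visibility range of d = 10 mi = 52,800 ft sees a skew angle of

\sigma_0 = \tan^{-1}\left(\frac{h}{d}\right) = \tan^{-1}\left(\frac{1000}{52800}\right) \approx 1.1^\circ,

so treating the estimated horizon as the local horizontal introduces only about a one-degree bias in level flight.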

3.3.2 Algorithm

The algorithm used for horizon detection uses a statistical measure based on

colors. Essentially, statistical modeling is used to classify each pixel by color according

to known distribution models which are based on training images. A set of proposed









horizons are evaluated to determine which results in the most accurate split of the

distribution of pixels classified as either "ground" or "sky". The resulting horizon, as

shown in Figure 3-4, reflects the optimal estimate [12].
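
The search over candidate horizons can be sketched as follows. This is a simplified stand-in for the statistical criterion of [12]: the score below uses only the determinants of the class color covariances, and the grid of candidate bank angles and offsets is an assumption made for illustration.

    import numpy as np

    def horizon_score(img, bank, offset):
        """Score a candidate horizon (bank angle in radians, vertical offset in
        pixels) by how tightly it groups pixel colors into sky and ground classes;
        lower within-class color scatter indicates a better split."""
        h, w, _ = img.shape
        cols, rows = np.meshgrid(np.arange(w), np.arange(h))
        sky_mask = rows < np.tan(bank) * (cols - w / 2.0) + offset
        sky = img[sky_mask].reshape(-1, 3).astype(float)
        ground = img[~sky_mask].reshape(-1, 3).astype(float)
        if len(sky) < 2 or len(ground) < 2:
            return np.inf
        return np.linalg.det(np.cov(sky.T)) + np.linalg.det(np.cov(ground.T))

    def detect_horizon(img, n_banks=36, n_offsets=20):
        """Exhaustive search over a coarse grid of candidate horizon lines."""
        h = img.shape[0]
        candidates = [(b, o)
                      for b in np.linspace(-np.pi / 3, np.pi / 3, n_banks)
                      for o in np.linspace(0.1 * h, 0.9 * h, n_offsets)]
        return min(candidates, key=lambda c: horizon_score(img, *c))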













Figure 3-4: Vision Estimated Horizon


The horizon detection is predicated on several assumptions. The horizon is

assumed to appear as a straight line in the image. Also, the horizon is assumed to

separate the image into two distinct color groups. Such assumptions obviously limit

the usefulness of this approach. While the line dividing the ground and the sky

in the image is ideally a straight line, this may not be the case for real situations

where trees, mountains, or other obstructions may cause an irregular dividing line as

shown by Figure 3-5(a), which is a representative case of an irregular distribution.

The mountains cause the pixel classification to be skewed which is reflected by the

optimal estimate. Intuitively, the horizon estimate should appear more like the image

in Figure 3-5(b) regardless of the presence of mountains or other obstructions. State

estimation or other calculations based on the horizon estimate shown in Figure 3-5(a)

will contain some error if they rely on the assumption that the estimated horizon is a

good approximation for the physical horizon as defined in Section 3.3.1. Thus, it can

be seen that the use of color presents some limitations to the algorithm. Specifically,

the algorithm will produce accurate estimates only for environments that are









very similar to the training distributions. Despite these limitations, the horizon

detection algorithm has been repeatedly shown to be effective in open areas.












(a) (b)

Figure 3-5: Error in Horizon Estimation


3.4 State Estimation

Information about the attitude of the aircraft can be inferred based on the horizon

in the image coupled with knowledge of the camera relative to the airframe. Clearly

a complete set of states can not be estimated based purely on the horizon without

additional information. In this thesis, only information related to the roll angle and

pitch angle are estimated.

3.4.1 Roll

Consider a MAV with a fixed, forward-pointing camera mounted in the nose such

that the camera axis and the body x-axis coincide. If the error due to the horizon skew

angle (Figure 3-3), σ₀, can be considered negligible, then an angular displacement

about the camera axis would correspond directly to the resulting angular displacement

of the perceived line in the image plane. This statement assumes zero pitch angle, such

that error is only introduced by σ₀. The angle between the horizontal and the horizon

line could be interpreted as the roll orientation of the camera, which directly correlates to

the roll orientation of an object fixed relative to the camera. Because the camera axis

coincides with the body x-axis, this angle can be directly interpreted as the body axis

roll angle of the aircraft.









If σ₀ does not approach zero, angular displacements about the camera axis can

no longer be exactly interpreted as angular displacements of the estimated horizon

in the image. The error in roll angle for this case is related by the cosine of σ₀.

Such a situation might arise in flight due to the presence of a finite distance to the

sky/ground dividing line as might occur in urban flight or flight near a line of trees.

For flight at zero pitch angle, co can be obtained through the equations relating the

focal coordinates, as shown by Equation 3.5. This relationship derives from the fact

that the vertical displacement of the horizon at zero pitch, PH0, should be zero for level

flight. It is assumed that PH is measured at the midpoint of the estimated horizon line.

\tan\varepsilon_0 = \frac{P_{H0}}{f}                                                            (3.5)

It can be assumed for most cases that the line at which the ground is perceived to

meet the sky is distant enough such that ε₀ will be sufficiently small. The situation is

different for the case of flight at any non-zero pitch angle, θ. The angle responsible for

error in the horizon estimate is directly influenced by rotation about the pitch axis, as

seen by Equation 3.6.

\varepsilon_{Tot} = \theta + \varepsilon_0                                                            (3.6)

For cases where ε₀ can be considered negligible, ε_Tot ≈ θ. This basically states

that for zero pitch angle, the estimated horizon is an accurate approximation for the

local horizontal plane. Based on the definition of horizon in Section 3.3.1, the horizon

line can be thought of as being situated in the vertical inertial plane that is located by

the heading angle of the aircraft and the current range of visibility.

Aircraft body-axis roll is defined about the body x-axis. As discussed above, this

axis coincides with the camera z-axis so pure body-axis roll will correspond to pure

rotation in the image plane. The vision-estimated roll angle is measured based on the

orientation of the physical horizon, which is out of plane by an angle equal to θ. The









error associated with nonzero pitch angles can be related through consideration of a

vector of length r extending from the CG (camera basis origin) in the direction of the

body y-axis. If the aircraft has a nonzero pitch angle, θ, and a nonzero roll angle, φ, an

observer situated at the horizon line in the inertial frame will see this vector as related

by Equation 3.7. The vision-estimated roll angle is then obtained using the components

of the vector projected into the vertical inertial plane, as seen by Equation 3.8.

\begin{aligned}
r\,\hat{b}_2 &= \begin{Bmatrix} 0 & r & 0 \end{Bmatrix}
  \left( \left[ R_2(-\theta) \right] \left[ R_1(-\phi) \right] \right)^{T}
  \begin{Bmatrix} \hat{e}_1 \\ \hat{e}_2 \\ \hat{e}_3 \end{Bmatrix}                                   (3.7) \\
 &= (r \sin\phi \sin\theta)\,\hat{e}_1 + (r \cos\phi)\,\hat{e}_2 + (r \sin\phi \cos\theta)\,\hat{e}_3
\end{aligned}

\tan\phi_E = \frac{r \sin\phi \cos\theta}{r \cos\phi} = \cos\theta \tan\phi                                   (3.8)
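The coupling in Equation 3.8 is straightforward to evaluate numerically. The following minimal Python sketch, which is purely illustrative and not part of the flight software, computes the vision-estimated roll angle for a given attitude and inverts the relation for the case in which the pitch angle happens to be known; the function names and example angles are assumptions introduced only for illustration.

import math

def vision_roll_from_attitude(phi, theta):
    """Apparent (vision-estimated) roll angle of the horizon, per Eq. 3.8:
    tan(phi_E) = cos(theta) * tan(phi).  All angles in radians."""
    return math.atan(math.cos(theta) * math.tan(phi))

def roll_from_vision_estimate(phi_E, theta):
    """Invert Eq. 3.8 for the body-axis roll angle when pitch is known."""
    return math.atan(math.tan(phi_E) / math.cos(theta))

if __name__ == "__main__":
    phi = math.radians(30.0)                  # true body-axis roll
    for theta_deg in (0.0, 10.0, 20.0):
        theta = math.radians(theta_deg)
        phi_E = vision_roll_from_attitude(phi, theta)
        print(f"theta = {theta_deg:4.1f} deg -> vision roll = "
              f"{math.degrees(phi_E):5.2f} deg")

For a 30 deg roll, the apparent roll shrinks by roughly 0.4 deg at 10 deg of pitch and about 1.5 deg at 20 deg of pitch, which quantifies the cosine dependence discussed above.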


3.4.2 Pitch

The distribution of sky and ground classification regions is used to determine a

parameter denoted pitch percentage. Specifically, pitch percentage is a measure of the

percentage of the image below the vision-estimated horizon and can therefore yield

information about the longitudinal attitude of the aircraft. It is important to note this

parameter is not concretely tied to any physical coordinate frame. It depends entirely

on color and shading in a two-dimensional image.

Although a change in pitch attitude will likely result in a change in pitch per-

centage, a change in pitch percentage does not necessarily indicate a change in pitch

attitude. Pitch percentage is also affected by many other factors that are independent

of pitch attitude. The angle ε₀ is affected directly by changes in both altitude and

physical distance from the perceived split between ground and sky. As discussed









previously in brief, this distance could vary greatly in a cluttered or urban environment

because objects of varying distances from the camera would obstruct the view to the

"obstacle-free" perceived meeting point. For example, a building that is 100 ft away

could represent the perceived meeting point, while this distance would be on the order

of miles if the building were not present. Situations could arise where such an ob-

struction comes into or moves out of the field of view during a constant-attitude maneuver.

This presents an example of a case for which pitch percentage might change drastically

while there is no actual change in pitch attitude.

Another factor independent of pitch angle that could affect the pitch percentage is

the body-axis roll angle. For a rectangular image that might be generated by standard

camera hardware, there is a limiting value of roll angle such that values less than this

limit will not introduce undesirable pitch percentage effects. Roll angles greater than

this value in magnitude cause a change in pitch percentage that is unrelated to pitch

attitude. This effect is seen in Figure 3-6.

















Figure 3-6: Effect of Roll on Pitch Percentage


The top-left image of Figure 3-6 has the aircraft at some constant pitch attitude

with zero bank angle. The pitch attitude in this case passes through the center of the

image, such that pitch percentage has a value of 0.5. The other three images depict a









pure rotation about the camera axis, which passes through the center of the image. It

can be shown that the images depicted in the top-right and bottom-left of Figure 3-6

do not change the value of pitch percentage. Once rotation exceeds the angle to the

diagonal, more ground will appear in the image than sky, as seen by the image at

bottom right. This critical angle of rotation will vary over values of pitch percentage.

Hence pitch percentage is affected by roll angle independently of pitch attitude.

There are limited cases for which some restrictive assumptions can be made such

that the pitch percentage can be tied to the pitch attitude. First of all, roll must be

limited to cases that do not affect the pitch percentage as depicted in Figure 3-6. In

an obstacle free environment for which the detected horizon can be assumed to be

infinitely far away from the camera in comparison to the altitude, the level horizon

displacement angle, co, is seen to approach zero as stated in Section 3.3.1. In this

situation, zero pitch angle will be reflected by a pitch percentage value of 0.5, which

has the horizon in the middle of the image. Because the focal coordinate position

is proportional to the tangent of the angle relating the target line of sight, the small

angle approximation can be applied to estimate the vertical position of the center of the

horizon, PH, as a measure of the pitch angle. The small angle approximation is seen to

hold for the tangent function over a domain varying up to about 15 deg.
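Under these restrictive assumptions, the conversion from pitch percentage to an approximate pitch angle is simple enough to sketch. The Python fragment below is only an illustration of the geometry: it assumes an ideal pinhole camera, zero roll, and an obstacle-free horizon at effectively infinite distance, and the vertical field-of-view value is a hypothetical parameter rather than a property of the camera used in this work.

import math

def pitch_from_pitch_percentage(sigma, vertical_fov_deg=40.0):
    """Approximate pitch angle (deg) from pitch percentage sigma in [0, 1].
    sigma = 0.5 places the horizon at the image center (zero pitch)."""
    half_fov = math.radians(vertical_fov_deg) / 2.0
    offset = 0.5 - sigma   # horizon offset from center as a fraction of image height (positive nose-up)
    return math.degrees(math.atan(2.0 * offset * math.tan(half_fov)))

if __name__ == "__main__":
    for sigma in (0.2, 0.5, 0.8):
        print(sigma, round(pitch_from_pitch_percentage(sigma), 2))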

Unfortunately this case is very limiting, and conditions facilitating this devel-

opment are not likely to be common in practice. In general, pitch angle can not be

uniquely determined from the pitch percentage due to the many factors affecting the

parameter. Further, the fact that pitch percentage is determined from the color distri-

bution of a two-dimensional image as opposed to kinematical relations in a physical

reference frame limits its use for the unique determination of aircraft body-axis pitch

angle.









3.5 Vision Issues

Computer vision presents many advantages for control purposes; however, it

presents many challenges as well. Inherent to the physical system are nonlinearities

associated with the camera along with the high risk for interference and noise associ-

ated with wireless video transmission [9]. The software poses some issues to address

as well due to deviations from the known distribution models in trainer images. Most

importantly, vision-based attitude estimation using horizon detection requires that the

physical horizon be retained in the image at all times.

A nonlinearity that is inherent to any vision system results from distance. Specif-

ically, the position and size of a feature in the image plane varies inversely with the

distance to that feature as shown by Equation 3.1 This dependence on distance will

affect the horizon, and consequently attitude estimation, if the perceived horizon

depends on features which are close to the aircraft. Obviously flight amongst trees

and mountains will result in features of varying distances and horizons of varying

orientation.

Another nonlinearity is associated with lens distortion. The curvature of the lens

causes the physical space to be mapped nonlinearly into the image plane in such a

way that the image is distorted axisymmetrically. The level of convex and concave

distortions increases with distance from the image center, as seen in Figure 3-7. At

the edge of the image, the mapping of the horizon no longer complies with the straight

line assumption of the horizon detection algorithm so some level of error in detection

is introduced. This phenomenon has been seen to manifest itself as the horizon

"twitching" as the algorithm jumps back and forth between straight-line best-guesses

along the curved mapping.


The computational cost of the image processing algorithm introduces a tradeoff

issue between run-time and sensor resolution. Run-time is seen as a critical factor due

to the fact that vision is the primary means for aircraft stabilization. The fast MAV



















Figure 3-7: Lens Curvature for a Horizon


dynamics require a fast-acting control system to achieve stable flight. Therefore the

finite set of proposed horizons at each computation is only capable of resolving the

vision estimated roll angle to 4.45 deg. This limitation poses a new set of challenges to

the control design and can factor into the twitching horizon effect if the body-axis roll

angle is between two discrete measurement levels.

Issues are also presented by ambient conditions causing deviations from the trainer

images. If color anomalies in the image cause a large enough number of sky or ground

pixels to be incorrectly classified, then the distribution split will no longer coincide

with the physical horizon. For example, if the algorithm was trained on images of

a clear and bright day, then dark clouds introduce sky pixels that do not fit into the

known distribution model. These dark pixels may then be erroneously classified as

ground pixels and the distribution split no longer lies between the sky and the ground.

Other examples of anomalies include bright sunshine associated with sunrise/sunset,

dark ground shadows, and the shifted sky colors associated with scattering at dusk. Gross

misclassification of pixels can cause large erroneous vision-estimated roll angles and

pitch percentages.

The most important limitation associated with vision-based attitude estimation

using horizon detection is that the horizon must remain in the image at all times. If

no sky or equivalently no ground is visible in the image, then some large number of

pixels will be misclassified and the estimated horizon will be arbitrary. In this case,

the estimated parameters will have no basis in the physical space and any indication








of attitude change due to the horizon will be in error. This limitation means that the

aircraft must be maneuvered non-aggressively enough that the horizon is always within

the camera's field of view.

Finally, interference and noise are inherent to any wireless telemetry system. The

interference is often especially prevalent over large distances and for positions near the

edge of the receiving antenna's beam width. A large amount of noise distortion in an

image will surely have an effect on the color distribution and, therefore, the estimated

horizon.















CHAPTER 4
INNER-LOOP CONTROL

4.1 Lateral Stability Augmentation

4.1.1 Methods

A stability augmentation system is incorporated into the autopilot to affect the

lateral-directional dynamics. Specifically, this system is designed to stabilize the

vehicle and provide tracking of roll commands. Tracking performance is achieved

using a proportional plus integral controller on an error signal. This error signal is derived

by subtracting the commanded roll angle from the vision-based roll estimation. The

proportional plus integral control law provides an actuator command to the differential

elevator, which deflects antisymmetrically to generate an appropriate rolling moment.

The proportional plus integral control combination was chosen for specific

purposes. Proportional control elements are essential in any tracking controller, as they

provide a good "directional sense" to the control law. In principle, proportional control

laws generate an actuator command that is directly proportional to the error signal;

consequently, large deviations from the desired response will result in relatively large

corrections by the actuator.

The use of integral control for lateral inner-loop stabilization is a somewhat

unusual choice for flight stability systems, but turns out to be a necessary element

for the case examined. An integral control law generates actuator commands that are

proportional to the integral of the error signal. In this sense, integral control provides

desirable steady-state error qualities. If there is a nonzero error at steady state, the

integral of the error will build and the actuator will ramp up until the error is driven to

zero. When there is zero error, the integral of error remains constant and the integral

control law commands a constant actuator input. This result means that an integral










controller can maintain zero error at steady state, even if a nonzero actuator command

is required. A proportional controller cannot provide a nonzero command when there is

zero error, as it is proportional to the value of the error signal.

The integrator is needed in the case of the lateral dynamics of a MAV for

several reasons. First, the lateral dynamics of an aircraft have been seen to exhibit

a type-0 system or, in other words, a system with no free integrators [40]. Such a

system will exhibit some level of steady-state error if proportional control is used

alone. This error can be reduced by increasing the proportional gain, but this action

can lead to a response that is oscillatory to an undesirable extent. In particular, the

highly agile nature of micro air vehicles resulting from the combination of low inertia

moments and a high level of relative control authority has been described as having

erratic controllability qualities, such that high gain feedback could result in system

instability [1, 19]. Therefore, a relatively low proportional gain is necessary, with the

integrator taking up the slack by providing the minimal control authority to maintain

the commanded roll angle.

A second reason to include integral control in the system is to account for a

trim condition that may vary drastically during flight. Because wind speeds can

vary on the same order of magnitude as the airspeed of the MAV in its normal

flight regime, flying at different headings on a windy day could require large and

wide ranging control inputs at steady state to maintain trim conditions. Similarly,

asymmetries in construction can require a nonzero command at zero error to maintain

trim. Proportional control alone cannot provide nonzero inputs when there is zero error.

An integral control element is related to accumulated error over time instead of the

instantaneous value and can therefore maintain a nonzero command during periods of

zero error.









4.1.2 Control Architecture

As discussed in Section 4.1.1, the error between the vision-estimated roll angle,

φ_E, and the filtered roll command, φ_c, is minimized through the use of a proportional

plus integral controller that uses differential elevator to provide the necessary rolling

moment. The architecture for the stability augmentation and roll tracking system is

shown in Figure 4-1 [9].






Figure 4-1: Lateral Stability Augmentation System


The C element represents the camera whose output is image-plane information.

This output is a function of the aircraft position and orientation as well as the physical

environment within the field of view of the camera as discussed in Chapter 3. Each

point in the physical space is mapped to the image plane according to Equations 3.1.

The C block includes this mapping process, along with the application of any nonlinear

or distortional effects introduced by the hardware.

Image processing operations required for lateral control are represented by the

V element in Figure 4-1. Specifically, the horizon is estimated using the horizon

detection algorithm discussed in Section 3.3.2. A vision-estimated roll angle is

computed using the angular displacement of the estimated horizon.

A low-pass filter is also included in the V block to attenuate undesirable high-

frequency data anomalies in the vision-estimated roll angle. These anomalies are

associated with instances involving a poor horizon estimation, which are seen to

occur for varying reasons including abnormal image coloration due to excessive









light variations as well as image degradation due to interference in wireless video

transmission. These instances have been seen in the past to result in the horizon

estimate "jumping" to an arbitrary position and orientation within the image. In many

cases this effect is only momentary so the filter can attenuate the high frequencies

associated with these spikes.

The error signal results from the subtraction of a filtered version of the com-

manded roll angle from the vision-based estimate. This filter, represented as A, is

required to act on the commanded roll angle prior to computation of the error signal

to avoid quantization errors. These errors are introduced by the limited resolution of

the vision-based roll estimation, which is an effect related to the computational cost of

the horizon detection algorithm as discussed in Section 3.5. The vision-based attitude

estimation system is only capable of resolving the estimated roll angle to discrete steps

of 4.45 deg. Commanding a roll angle at any value other than an integer multiple

of 4.45 deg would result in residual error because the sensor system would never

be able to match the resolution of the command. This residual error would result in

actuator motion causing a roll response. If the command were to remain unchanged,

the response would pass through the commanded value to the next discrete-measurable

value, resulting in a residual error signal in the opposing direction. Without a change

in the commanded value, this amounts to a limit cycle oscillation in the roll response.

Therefore the feedforward filter, A, quantizes the commanded roll angle from a signal

that varies continuously with amplitude to match the discrete estimated roll resolution

of 4.45 deg.
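A minimal sketch of this quantization, assuming a nearest-level rounding rule (the exact rule is not specified in this chapter), is given below for illustration.

RESOLUTION_DEG = 4.45   # discrete roll-measurement step discussed in Section 3.5

def quantize_roll_command(phi_cmd_deg, step=RESOLUTION_DEG):
    """Feedforward filter A: snap a continuously varying roll command to the
    nearest level the vision sensor can resolve, so the error can actually be
    driven to zero and no limit cycle is excited."""
    return step * round(phi_cmd_deg / step)

# Example: a 10 deg command is issued as 8.9 deg (2 x 4.45 deg).
print(quantize_roll_command(10.0))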

The command to the differential elevator actuator is finally computed by scaling

the filtered error by the controller gains, as shown in Equation 4.1. These gains are

denoted by K_φ as the proportional gain and K_Iφ as the integral gain. The error is

scaled directly by K_φ. A separate state, e_I, which represents the accumulated error,

is scaled by K_Iφ. The sum of the proportional and integral elements represents the









actuator input.

u = K_\phi\, e + K_{I\phi}\, e_I                                                            (4.1)
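A discrete-time sketch of this control law is given below. The 50 Hz loop period and the explicit integrator update are assumptions introduced for illustration, the gain values follow Table 6-1, and the error is formed as the vision estimate minus the quantized command per Section 4.1.1, with the actuator sign convention absorbing the polarity.

class RollPI:
    """Minimal proportional-plus-integral roll tracker (Eq. 4.1)."""

    def __init__(self, k_p=0.85, k_i=0.04, dt=1.0 / 50.0):
        self.k_p = k_p      # proportional gain K_phi
        self.k_i = k_i      # integral gain K_I_phi
        self.dt = dt        # control loop period (50 Hz in Chapter 6)
        self.e_int = 0.0    # accumulated-error state e_I

    def update(self, phi_est, phi_cmd_quantized):
        e = phi_est - phi_cmd_quantized     # error: estimate minus filtered command
        self.e_int += e * self.dt           # integrate the error
        return self.k_p * e + self.k_i * self.e_int   # differential-elevator command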

4.2 Longitudinal Stability Augmentation System

4.2.1 Methods

The autopilot also includes a stability augmentation system to affect the longitu-

dinal dynamics. The basic function of this controller is to stabilize the attitude of the

vehicle and track commands to pitch percentage. The relationship of pitch percent-

age to the longitudinal attitude of the aircraft is discussed in detail in Section 3.4.2.

Although there are many reasons to avoid using the pitch percentage parameter for

longitudinal control it reasonably approximates pitch attitude in some instances. In

other cases, pitch percentage has at least some dependency on the relative pitch attitude

of the MAV and therefore can yield generic information concerning attitude changes.

Pitch percentage tracking is achieved through the use of a proportional controller

which acts on the error between some commanded value of pitch percentage and

the current measured value of pitch percentage. Integral control is not included in

the longitudinal inner-loop because zero error at steady-state is not a requirement.

A command to the elevator actuator is determined by scaling the error signal by a

proportional gain.

The longitudinal attitude control system presented here deviates significantly

from traditional pitch attitude hold systems. Specifically, it is common to implement

a proportional plus derivative approach where the pitch angle and pitch rate are used

as feedback signals. The pitch angle feedback provides performance in the sense of

tracking, while the rate feedback provides additional damping of the short period mode.

Even when these parameters are available, control systems of this nature are typically

used only for wings-level flight [40].

The use of pitch percentage has been shown to maintain a non-unique relationship

to the pitch attitude of the aircraft. Further, limitations that arise due to the nature









of vision as a sensor technology such as field of view constraints and lens distortion

effects add uncertainty to the estimation. The controller could be useful for situations,

such as a horizon at infinite distance, when pitch percentage can be loosely interpreted

in terms of pitch. In general, the lack of correlation between the pitch attitude and

pitch percentage reduces the benefit of tracking commands to pitch percentage in itself.

However, it will be shown later that the methodology proposed here will prove to be

an important element for limiting commands to retain the horizon in the field of view

during higher level control activities.

4.2.2 Control Architecture

As discussed in Section 4.2.1, the error between the commanded value of pitch

percentage, Oc, and current measured pitch percentage, o, is minimized through the

use of a proportional control law to generate a command to the elevator. The resulting

architecture is shown in Figure 4-2.



Figure 4-2: Longitudinal Stability Augmentation System


As is the case with the lateral stability augmentation system, the C element

represents the camera which takes as inputs aircraft states along with the environment

and produces an image as the output.

Also, image processing operations are represented by the V element. The horizon

is evaluated and assigned a statistical estimate per the algorithm discussed in Sec-

tion 3.3.2. The pitch percentage is determined using the ratio of the two classified

regions of the pixel distribution of the image [12]. Pitch percentage is therefore limited

to values between zero and one.








The error signal is computed directly as the difference between the commanded

pitch percentage and the measured pitch percentage. The commanded value is limited

to values in the range between zero and one to avoid exceeding the sensor limitations.

The proportional control law is then applied to generate an input to the symmetric

elevator actuator, as expressed in Equation 4.2. The input, u, is computed by scaling

the error by the gain, K_σ.

u = K_\sigma\, e = K_\sigma\,(\sigma_c - \sigma)                                                            (4.2)
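The complete longitudinal inner loop therefore reduces to a clamped proportional law. The sketch below is illustrative only: the clamp reflects the sensor limits discussed above, the gain value follows Table 6-1, and the units of the returned actuator signal depend on the servo interface and are left unspecified.

def elevator_command(sigma_cmd, sigma_meas, k_sigma=-300.0):
    """Proportional pitch-percentage tracker (Eq. 4.2)."""
    sigma_cmd = min(max(sigma_cmd, 0.0), 1.0)   # keep the command within [0, 1]
    return k_sigma * (sigma_cmd - sigma_meas)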















CHAPTER 5
GUIDANCE AND NAVIGATION

5.1 Directional Controller

5.1.1 Methods

The directional controller is implemented as an outer loop to the vision-based

roll command and stabilization system. Heading calculations are performed using the

relative displacement of GPS coordinates. These coordinates are expressed in terms

of latitude, λ, and longitude, χ. The current heading, ψ, of the aircraft is determined

from the current position and the position from one time step prior using a tangent

relationship. This is represented by Figure 5-1 and Equation 5.1.




Figure 5-1: Backwards Difference Relationship for Heading Calculation


\psi = \tan^{-1}\!\left( \frac{\chi_2 - \chi_1}{\lambda_2 - \lambda_1} \right)                                                            (5.1)
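A sketch of this backwards-difference heading computation is given below. It treats the coordinates as planar over the short distance between 1 Hz fixes and uses a four-quadrant arctangent in place of the bare tangent; any scaling between latitude and longitude increments is ignored, as in the relation above.

import math

def heading_from_fixes(lat_prev, lon_prev, lat_curr, lon_curr):
    """Ground-track heading from two successive GPS fixes (Eq. 5.1),
    measured from the latitude (north) axis toward increasing longitude."""
    d_lat = lat_curr - lat_prev
    d_lon = lon_curr - lon_prev
    return math.degrees(math.atan2(d_lon, d_lat))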

For the purposes of this thesis, ψ is taken to be the lateral flight-path angle rather

than the body-axis heading angle. As a result, this parameter is a measure of the

trajectory of the MAV center of gravity (CG) in the local horizontal x-y plane. This

definition is contrary to that of body-axis heading angle, which measures the directional









orientation of the aircraft. The directional controller concerns the direction of inertial

motion rather than details about specific aircraft orientation. Therefore the flight path

proves to be more valuable information for the case of a MAV with highly variable

flight conditions and limited sensing capability.

The directional controller forgoes target detection/identification and assumes the

location of a desired waypoint or target is a known point in space. Given waypoint

or target coordinates, the relationship above can be used to compute the bearing

required to hit the target. This bearing is regarded as the commanded heading, ψ_c. The

measured heading can be used to generate an error signal directly in conjunction with

the commanded heading because they are both measures of CG trajectory. This error

signal is computed by taking the difference between ψ and ψ_c.

To track ψ_c, the error is driven to zero by commanding heading changes through

a non-zero turn rate. For the MAV of interest, turn rates are achieved primarily

through varying bank angle. It is assumed that the relationship between bank angle

and turn rate is proportional in nature, although in practice it will be affected by

many factors. Therefore this assumption is somewhat restrictive; however, it makes

proportional control a convenient technique to generate a command to the inner-loop

lateral controller. The inner-loop lateral controller affects the bank angle to achieve the

turn rates necessary for minimization of the heading error signal.

5.1.2 Control Architecture

As discussed in Section 5.1.1, the error between current aircraft heading and

commanded heading (heading to target) is minimized by commanding a non-zero turn

rate through a commanded roll angle. This commanded roll angle is computed by mul-

tiplying the heading error by a proportional gain, K_ψ, and is limited to 35.6 degrees.

This limit results in a constant command for the large heading errors corresponding to

the gross acquisition phase of navigation and a varying command for the smaller error









values corresponding to the fine tracking phase. The roll stability augmentation system

stabilizes the aircraft about this commanded roll angle.













Figure 5-2: Closed-Loop Directional Control System


5.1.3 State Estimation

5.1.3.1 Directional controller issues

The system as described to this point would have serious difficulty navigating

with any degree of accuracy or aggressiveness. Differences in time scale between the

MAV dynamics and the sensor dynamics impose heavy limits on the bandwidth of the

directional controller. Specifically, the use of GPS to determine position and heading

results in undesirable directional controller performance. Typical GPS sampling rates

are significantly slower than the aircraft dynamics associated with position and heading

changes. Therefore the sensor cannot convey all of the information necessary for good

controller performance.

This issue requires the control design to sacrifice aggressiveness for accuracy.

An aggressive controller gain, K_ψ, gives the controller the authority to quickly drive

a heading error toward zero. Unfortunately, turn rates resulting from aggressive

commands are likely to exhibit significant overshoot before the GPS sensor provides an

update to alert the system that it has surpassed the commanded value. This behavior

results from the MAV state changing at a higher rate than the sensor can accurately

capture, which can result in instability. A convergent response can be achieved by









limiting the aggressiveness of the controller such that commanded turn rates never result

in significant overshoot of the heading command. Such a limitation in aggressiveness

severely limits the allowable configuration and spacing of waypoints for which it is

feasible to hit successively.

Another issue deals with the inherent lag introduced into the system by the

backwards difference approximation used in Equation 5.1 to compute the current

heading. Using a backwards difference approximation effectively measures the heading

at the previous time step such that "current" heading readings are delayed by one time

sample. This lag is depicted in Figure 5-3, which shows a representative flight path

along with the position and heading as would be measured by the sensor block, S, in

Figure 5-2. The 1 Hz sampling rate of GPS data guarantees this lag will be at least 1

second. The rapid dynamics associated with the control of a MAV require real-time

knowledge, and information outdated by more than a second can become almost

useless.













Figure 5-3: Measured heading significantly lags actual heading at each GPS datapoint


Besides the inability of the 1 Hz GPS sensor to accurately capture the rapid

changes in position and heading, the slow sampling frequency generally causes

some level of error in the instantaneous computation of heading at each GPS update.

Even if the lag effect were not present, the accuracy of the method used to compute

instantaneous heading relies on the assumption that the flight path between samples can









be approximated by a straight line. This assumption requires that either the sampling

rate is very fast or that the flight path is in actuality very close to a straight line. This

requirement is depicted in Figure 5-4.


Figure 5-4: Vector Diagram of GPS Sensor Measurement


Figure 5-4 shows a representative plot of two successive position measurements

relative to some inertial reference frame, where the measurements are separated by a

time delay of ΔT. The heading computation relies on the assumption that the direction

of the tangential unit vector at both data points coincides with the direction of the chord Δr. The

angle θ between them must be minimized for this assumption to hold. To maintain generality, this

must be achieved by minimizing the arclength and in effect ΔT. This effect can also be

seen by observing the difference between the representative flight paths in Figure 5-3.

5.1.3.2 Technique/Implementation

To alleviate the issues discussed in the previous Subsection, position and heading

estimates are introduced to fill the void between known values. Given a known current

position, (λ, χ), heading, ψ, velocity, ‖V‖, and turn rate, ψ̇, extrapolation of the flight

path to some future time can be performed readily. This is depicted in Figure 5-5, and

described by Equations 5.2.











Figure 5-5: Heading/Position Estimate Relationship

\lambda[k+1] = \lambda[k] + \lVert V \rVert\, T \cos\!\left( \psi[k] + \dot{\psi}[k]\, T \right)                                                            (5.2a)

\chi[k+1] = \chi[k] - \lVert V \rVert\, T \sin\!\left( \psi[k] + \dot{\psi}[k]\, T \right)                                                            (5.2b)
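One extrapolation step of Equations 5.2 can be sketched as follows. The ground speed must already be expressed in the same angular units per second as the coordinates, a conversion that is omitted here, and the sign on the longitude term simply follows Equation 5.2b as written; both are assumptions of this illustration.

import math

def extrapolate_position(lat, lon, psi_deg, psi_dot_dps, speed, dt):
    """One dead-reckoning step between GPS fixes (Eqs. 5.2a and 5.2b),
    assuming constant ground speed and a turn rate held over the step."""
    psi_mid = math.radians(psi_deg + psi_dot_dps * dt)
    lat_next = lat + speed * dt * math.cos(psi_mid)
    lon_next = lon - speed * dt * math.sin(psi_mid)   # sign per Eq. 5.2b
    return lat_next, lon_next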

An estimate of position is first generated relative to the known current GPS

position. Future position estimates are computed relative to the estimated position

at the previous time step. Estimated heading is computed using Equation 5.1 with

estimated position. This process continues until the GPS sensor updates and provides a

"check" for the position estimate. In the event of an update the estimate is replaced by

the measured position.

Measured heading, ψ[k], is available but proves to be problematic based on the

previous discussion. Use of such data would result in large estimation errors beginning

at the first calculation. Two problems require solution to obtain more accurate heading

information, as discussed in the previous Subsection. The first is the issue of 1 sec lag

due to the backwards difference method, while the second is error in computation of









instantaneous heading due to the slow sampling time. Both errors will have diminished

effect beyond the initial calculation at the known GPS data point. The estimation code

runs fast enough to regard both the lag and instantaneous computation errors to be

negligible beyond this initial step. However, both errors will have significant effect on

the first position estimate at each GPS update. The errors would carry through to the

next update because future position estimates are computed based on the initial position

estimate. Therefore it becomes necessary to address both issues for the initial position

estimate at each GPS update.

The two identified problems with heading measurement are solved in an ad

hoc manner. The lag issue is addressed using the assumption of a constant rate turn

between two measured positions. Essentially a guess of the current heading is

made using the assumption that the turn rate of the MAV is quasi-constant

over the past two time steps. Knowledge of the measured heading from the two time

steps prior allows the linear extrapolation to current heading, using the relationship in

Equation 5.3.

\hat{\psi}[k] = \psi[k-1] + \left( \psi[k-1] - \psi[k-2] \right)                                                            (5.3)

The issue of instantaneous heading computation is addressed in a similarly ad

hoc fashion. Figure 5-3 shows that in a steady turn, the measured current heading

undershoots the actual heading due to lag, while the removal of lag results in a value

that overshoots the actual heading due to the inability of the sensor to fully capture

the smoothness of the flight path. The heading value after removal of system lag is

described by Equation 5.3, and therefore both heading values are known. One value

is known to undershoot, while the other is known to overshoot in the case where

the constant-turn assumption is valid. Therefore the difference is split between these

heading values that are non-optimal on opposite ends of the spectrum to approximate

the actual value of current heading. This action is equivalent to reducing the efficacy

of the lag removal method, and is described by Equation 5.4. Again, this step is only









performed in the event of a GPS update. Beyond this point, estimates are considered

accurate and are used in conjunction with the previous backwards difference approach

(Equation 5.1) to calculate heading until the next update is received.


\hat{\psi}[k] = \psi[k-1] + \tfrac{1}{2}\left( \psi[k-1] - \psi[k-2] \right)                                                            (5.4)
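The two corrections can be collected into a single initialization routine, sketched below; the one-half factor implements the splitting of the difference described above, and angle wrap-around is ignored for brevity.

def heading_at_gps_update(psi_meas_prev, psi_meas_prev2):
    """Ad hoc heading initialization at a GPS update (Eqs. 5.3 and 5.4).
    Arguments are the backwards-difference headings from one and two
    updates prior; Eq. 5.3 alone would return psi_meas_prev + delta."""
    delta = psi_meas_prev - psi_meas_prev2
    return psi_meas_prev + 0.5 * delta   # Eq. 5.4: split the difference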


Selection of turn rate data for the estimation is also seen to present problems,

as any on-line data would have to come from the GPS sensor and would therefore

exhibit the same lag seen in the position and heading measurements. However, it

can be assumed that the turn rate, ψ̇, at any given time is related to the current bank

angle, φ. Recall the assumption in Section 5.1.1 that this relationship can be roughly

approximated as linear, where turn rate is taken to be proportional to the current bank

angle. This relationship is shown by Equation 5.5. The proportional gain, K_ψ̇, is

estimated from previous flight data.


\dot{\psi} = K_{\dot{\psi}}\,\phi                                                            (5.5)


The value of velocity, V, is also based on previous flight data. Constant ground

speed is assumed and is approximated from averaged data. Although this may be a

poor approximation for the highly variable flight conditions of a MAV, the alternative

would be to take velocity data from the slow GPS sensor. The constant velocity

assumption should closely approximate the vehicle velocity for cases of little or no

wind. Using on-line GPS data for velocity information would always be subject

to errors associated with the time scale differences between the sensor and MAV

dynamics.

With the estimator implemented, the closed loop lateral-directional controller takes

on the form shown in Figure 5-6, where E represents the estimation process.











Figure 5-6: Closed-Loop Directional Control System


5.2 Altitude Control

An altitude controller is also included in the autopilot to ensure the MAV reaches

the correct altitude associated with each waypoint. This controller commands altitudes

of the vehicle throughout the flight path during both straight and turning maneuvers.

The architecture for the altitude control is shown in Figure 5-7.


Figure 5-7: Closed-Loop Altitude Control System


The main feature of the altitude controller is the switch element. Essentially, two

values of elevator deflection are computed using separate approaches but only one is

actually commanded to the servo actuator. The upper path to the switch is simply the

inner-loop controller from Figure 4-2 that tracks a pitch percentage. The lower path

to the switch is a proportional plus integral structure that directly considers an error in

altitude.









The upper path is really an augmented version of the tracking controller for pitch

percentage. The augmentation accounts for the need to maintain a visible horizon

by including a lim block. This block chooses a command to pitch percentage at its

maximum limits. The pitch percentage ranges from 0.0 when the horizon is at the

bottom of the image to 1.0 when the horizon is at the top of the image. The lim

block commands a value of 0.2 for a climb or 0.8 for a dive so the vehicle is always

commanded to its limit. These limits are chosen so the horizon is never lost. In

this way, the controller commands a limiting value of pitch percentage regardless of the

actual pitch angle.

The lower path is a standard proportional plus integral controller. The error

signal upon which this controller operates is the difference between commanded and

measured altitude. Such a controller recognizes the lack of a pitch measurement so,

unlike traditional controllers, the altitude error is directly converted to an elevator

command [40].

The switch element chooses the smallest magnitude from the upper and lower

paths. Such a choice will limit the performance of the vehicle in tracking altitude;

however, this choice will also minimize the chance of losing visibility of the horizon.

Finally, a gain is included to couple the longitudinal dynamics with the lateral-

directional dynamics. This gain, K_φθ, generates a command to symmetric elevator

based on a measurement of roll angle. The coupling gain is used to improve turn

performance by both maintaining altitude and increasing rate of change of heading

during a turn.
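The logic of Figure 5-7 can be summarized by the sketch below. The gain values follow Table 6-1, while the exact signal routing, the sign conventions, and the use of the roll-angle magnitude in the coupling term are assumptions made for illustration.

def altitude_elevator_command(sigma_meas, h_cmd, h_meas, h_err_int, phi_meas,
                              k_sigma=-300.0, k_h=-0.25, k_ih=-0.006,
                              k_phi_theta=0.06):
    """One pass through the altitude controller of Figure 5-7 (sketch only).
    h_err_int is the externally integrated altitude error."""
    # Upper path: pitch-percentage limiter (0.2 for a climb, 0.8 for a dive)
    # so the horizon is never driven out of the image.
    sigma_cmd = 0.2 if h_cmd > h_meas else 0.8
    u_limit = k_sigma * (sigma_cmd - sigma_meas)

    # Lower path: proportional-plus-integral control on the altitude error.
    u_altitude = k_h * (h_cmd - h_meas) + k_ih * h_err_int

    # Switch: the smaller-magnitude command is sent to the elevator, then the
    # roll coupling is added; |phi| is used so either turn direction receives
    # the same pitch compensation (an assumption).
    u = u_limit if abs(u_limit) < abs(u_altitude) else u_altitude
    return u + k_phi_theta * abs(phi_meas)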















CHAPTER 6
FLIGHT DEMONSTRATION

6.1 Aircraft

A flight test was performed to demonstrate waypoint navigation using the vision-

based attitude estimation. This flight test used the MAV shown in Figure 6-1. The

vehicle is a variant of a flexible-wing MAV which has been designed at the Univer-

sity of Florida [18]. In this case, the MAV is 21 inches in length and 24 inches in

wingspan. The total weight of the vehicle, including all instrumentation, is approxi-

mately 540 grams.













Figure 6-1: MAV


The airframe is constructed almost entirely of composite and nylon. The fuselage

is constructed from layers of woven carbon fiber which are cured to form a rigid struc-

ture. The thin, under-cambered wing consists of a carbon fiber spar-and-batten skeleton

that is then covered with a nylon wing skin. An empennage, also constructed of

composite and nylon, is connected to the fuselage by a carbon-fiber boom that runs

concentrically through the pusher-prop disc.

Control is accomplished using a set of control surfaces on the tail. Specifically,

a rudder along with a pair of independent elevators can be actuated by commands









to separate servos. Thrust results from a 6 inch prop which is driven by a brushless

electric motor. The prop setup is shown in Figure 6-2. Power is varied through the

use of an off the shelf speed controller commonly used for radio-controlled model

airplanes.














Figure 6-2: MAV Pusher-Prop Configuration


The on-board sensors consist of a GPS unit, an altimeter, and a camera. The GPS

unit provides position at a rate of 1 Hz and is mounted horizontally on the top of the

nose hatch. The altimeter, which provides a measure of static pressure at a rate of

30 Hz, is mounted inside the fuselage under the nose hatch. The GPS and altimeter

both communicate with a 900 MHz on-board radio transceiver, which was developed

specifically for this application at the University of Florida Machine Intelligence

Laboratory. The transceiver sends data to an off-board ground station, and receives

control commands from an off the shelf Futaba radio transmitter. The video camera

is fixed to point directly out the nose of the aircraft. Video data is streamed to a Sony

Video Walkman at 2.4 GHz via a wireless video transmitter. All on-board components

are powered by an 11.1 V lithium-polymer battery. The mounting setup of the on-board

electronics is shown in Figure 6-3.

The autopilot operates on an off-board ground station. This ground station

essentially consists of a laptop with communication links, a Sony Video Walkman, and a

Futaba radio transmitter. Separate streams for video and inertial measurements are sent









(Labeled components: data processor/transceiver, motor (hidden), altimeter, battery pack, GPS unit, camera)








Figure 6-3: Mounting Configuration of On-Board Electronics


using transceivers on the aircraft. The video image is interpreted by the Sony Video

Walkman before it is sent to the laptop via Firewire connection. The image processing

and controller analyze these streams and send commands to the transmitter. Vision-

estimated parameters are provided by the image-processing at a rate of 30 Hz, while

the control loop runs at 50 Hz. The transmitter is configured such that it transmits

computer commands directly to the aircraft, where they are passed on to the servo

actuators.

A human pilot is held on standby in the event that the controller fails and

intervention is needed. This is an essential precaution to ensure safety and survivability

which also allows some freedom in the gain tuning process.

6.2 Controller

The gains for the autopilot were designed based on observing flight properties.

Unfortunately, a mathematical model describing the flight dynamics of the MAV did

not exist; consequently, a trial and error approach was adopted. A reasonable set

of initial gains could actually be derived based on intuition and evaluating aircraft

response to human commands. These gains were refined to increase performance both

in terms of tracking and navigating.









Difficulty was encountered in several areas of the gain-tuning process. Varying

one element of the controller often affected several other elements, and thus the process

was somewhat iterative. In some cases many iterations were required to obtain desir-

able response characteristics. The controller performance was also continually affected

by the highly variable flight conditions of the MAV. As discussed in Section 2.2, wind

gusts on the same order of magnitude as the airspeed of the vehicle become more an

issue of changing operating conditions than of disturbance rejection as far

as control design is concerned. Vision issues also played a large role in hindering

the progress of the autopilot design. Several signals in the system derive from a pure

vision component, thus some undesirable effects are introduced into controller error

signals. Many cases were encountered for which it was not clear whether certain

aircraft behavior resulted from controller performance or vision system artifacts such as

those discussed in Section 3.5.

The lateral and longitudinal control systems were configured independently of

each other. For this to be possible, the Futaba transmitter was programmed such that

the pilot could remain in the loop to control specific surfaces while passing complete

computer control to others. This way the dynamic coupling effects of the lateral and

longitudinal dynamics could be attenuated manually during certain phases of the

design process. The goal was to isolate the response when each degree of freedom

was varied in the controller. When independent components demonstrated satisfactory

performance, coupling effects were reintroduced so that proper fine adjustments could

be made.

The first control element addressed was the inner-loop roll tracking system. The

human pilot retained the ability to command symmetric elevator deflection during this

process so that altitude could be maintained throughout the duration of each test. To

begin the process for a given test maneuver, the proportional roll gain was adjusted to

achieve general tracking performance. Once mild oscillatory behavior was observed,









the gain was decreased slightly before the integral gain was varied. This step was

necessary because addition of the integral control law had the tendency to adversely

affect overshoot and add some long period oscillatory behavior to the system. The

integral control would easily make up for the control authority lost in the proportional

gain decrease.

The procedure to tune the lateral inner-loop controller gains consisted of a sys-

tematic progression of maneuvers. Each maneuver consisted of a computer generated

command followed by 10-30 sec of aircraft response. The response time was limited

by the need to keep the aircraft within the vicinity of the ground station for the tuning

process. The gains were first tuned to maintain wings-level flight. This intermediate

configuration was then tested for recovery to wings-level flight from varying initial

bank angles and required adjustments were made. Tracking performance was then

tested by commanding various bank angles from wings-level flight. Several iterations

were necessary before the controller exhibited satisfactory performance across the

entire range of maneuvers.

The longitudinal controller was examined next. The inner-loop roll tracker was

capable of maintaining wings-level flight at this point, so the pilot was not required

in the loop. The pitch percentage tracking system would be essential to keeping the

horizon in the image and would also place an upper bound on the proportional altitude

controller; therefore, this system was the first to be configured. High gain values were

required due to the low amplitude range of the signal, which varied between zero and

one. The dynamics also proved to be heavily damped which allowed for generous

application of control authority without a great deal of oscillation and overshoot

observed in the response. Little fine tuning was necessary due to this exhibited lack of

sensitivity in the response.

A similar range of flight maneuvers was performed to tune the pitch percentage

tracking controller. Gains were initially selected such that the controller could maintain









the initial value of pitch percentage. It was then necessary to identify values of pitch

percentage, σ, near the limits of its range for which a moderate bank angle could still

be resolved. Values that represented very little sky or very little ground in the image

often resulted in significant horizon estimation errors in the event of a significant bank

angle. Therefore the values chosen as commands for the pitch percentage attitude

limiter were σ = 0.2 for the nose-high attitude limit and σ = 0.8 for the nose-down

attitude limit. Gains were then tuned to achieve these limits from a neutral initial

condition. After adjustments were made, tracking ability was tested for more drastic

initial conditions.

Completion of the pitch percentage tracker provided the necessary constraints to

close the altitude loop without the risk of jeopardizing the integrity of the vision-based

error signal. Several interesting performance issues surfaced in this step. First, the

MAV had very different climb and dive characteristics in part due to a sub-optimal

power configuration. Some gain configurations would drastically overshoot a lower

commanded altitude while the response to a commanded climb would not even reach

the desired value. Another observed property was that overly aggressive commands

would cause more of a change in trim condition than an altitude tracking response. If a

command caused a sudden and drastic change in pitch angle, too much airspeed would

bleed off for the aircraft to sustain climbing flight, and the dynamics were seen to settle

at the new airspeed and attitude. Therefore caution was employed in gain selection

in order to minimize these undesirable effects. The result was diminished controller

aggressiveness.

In a procedure similar to the design of the roll tracker, the proportional altitude

control element was addressed before the integrator in order to achieve tracking

behavior. The initial test maneuver required the MAV to simply maintain altitude.

Integral gain increases were applied as necessary. The next step involved mild altitude

changes in both directions, followed by necessary adjustments. The effectiveness of the









pitch percentage tracker as a command limiter was then tested by commanding drastic

changes in altitude. Test maneuvers typically lasted 20-30 sec, and were again limited

by the need to keep the aircraft in the vicinity of the ground station.

Design of the proportional gain that coupled altitude to bank angle was simply

a trial and error exercise. It was required to choose a value large enough to account

for dramatic loss of lift in steep bank situations, but not large enough to excite an

altitude oscillation for small bank angles. These oscillations resulted from the fact

that excessive application of elevator at small bank angles would cause a slight gain in

altitude, which would be fought by the altitude loop. Most cases of a required decrease

in altitude resulted in some level of overshoot and oscillation.

The outer loop heading control loop was the last system designed. The process

was again trial and error in nature. Two configurations of differing aggressiveness were

initially examined. Performance issues resulted from the slow dynamics of the GPS

sensor. In an attempt to alleviate these issues, a nonlinear gain schedule was proposed,

but proved to be ineffective. Eventually an estimator was incorporated into the system

to resolve problems with directional tracking, as discussed in Section 5.1.3.

Implementation of the estimator required the approximation of several parame-

ters from previous flight test data. Specifically, a value determining the proportional

relationship between bank angle and turn rate was needed, as well as a value to ap-

proximate constant velocity flight. The data used to determine a turn rate relationship

is shown in Figure 6-4, which is a plot of turn rate as a function of bank angle. The

roughness of the linear approximation is immediately apparent. The turn rate data was

computed using successive GPS measurements, which were assumed to update at 1 Hz.

This assumption is one possible cause for some of the abnormal data points seen in

Figure 6-4.

The value of velocity, V, was assumed to be constant for the purposes of heading

estimation in Section 5.1.3. This assumption is made because the MAV operates under













Figure 6-4: Approximation of the Relationship Between Turn Rate and Bank Angle


constant power when in autonomous flight. Wind and other factors will affect the

instantaneous ground speed, but these effects are impossible to predict given the limited

sensor package used for this demonstration. The assumption of constant velocity

flight was therefore made in lieu of using on-line difference calculations involving

the slow-rate GPS sensor. The data set and approximation used to determine the

average velocity is shown in Figure 6-5. Successive GPS measurements were used to

determine velocity, and again give rise to some data anomalies. This fact was taken

into account with regard to selection of a velocity value.



Figure 6-5: Selection of MAV Velocity Data for Heading Estimation



The final controller gains are shown in Table 6-1. These values correspond to the

respective gains shown by the block diagrams in Figure 5-6 and Figure 5-7. The

responsive nature of the lateral dynamics allowed for relatively high proportional gain









and a fast integrator. Alternatively, the slower nature of altitude response required a

limited proportional gain and a relatively slow-acting integrator. Also, two values of

Ky are presented to reflect the autopilot at configurations of differing aggressiveness. It

is seen in the next section that the inclusion of an estimator for heading angle allowed

the more aggressive of the two to be used in the final design.


Table 6-1: Controller Gains

Gain      Value
K_φ       0.85
K_Iφ      0.04
K_σ       -300
K_h       -0.25
K_Ih      -0.006
K_φθ      0.06
K_ψ       { 0.4, 1.0 }


The final gains were based on the ability to continuously attain three pre-

programmed waypoints using a relatively smooth flight path. Essentially, the aircraft

needed to autonomously reach the waypoints using the shortest possible flight path.

The lateral-directional performance was evaluated using a threshold of 0.01 minutes

latitude/longitude, which corresponds to roughly 16.5 ft, around each waypoint. The

longitudinal performance was not considered critical because

of the strict limitations imposed by the vision system and MAV performance properties.

Relatively poor performance was anticipated, so the threshold around the waypoint was

increased to roughly 40 ft in altitude. The resulting threshold is depicted in Figure 6-6.

Flight paths that passed within these thresholds were considered to have reached the

waypoint so the autopilot then directed the MAV to reach the next waypoint.
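The waypoint-capture logic can be sketched as follows; interpreting the quoted thresholds as the half-widths of the box in Figure 6-6 and cycling repeatedly through the three waypoints are assumptions of this illustration.

def reached_waypoint(lat, lon, alt, wp_lat, wp_lon, wp_alt,
                     latlon_tol_min=0.01, alt_tol_ft=40.0):
    """Box-shaped waypoint threshold: lat/lon tolerance in minutes, altitude
    tolerance in feet, mirroring the thresholds described in Section 6.2."""
    return (abs(lat - wp_lat) <= latlon_tol_min and
            abs(lon - wp_lon) <= latlon_tol_min and
            abs(alt - wp_alt) <= alt_tol_ft)

def next_waypoint_index(current_index, waypoints, lat, lon, alt):
    """Advance to the next waypoint (wrapping around the list) once the
    current one has been reached; waypoints holds (lat, lon, alt) tuples."""
    if reached_waypoint(lat, lon, alt, *waypoints[current_index]):
        return (current_index + 1) % len(waypoints)
    return current_index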

6.3 Tracking

The performance of the autopilot for tracking is an important evaluation. The

final objective of the autopilot is waypoint navigation; however, this final objective is

only reached by the aircraft following commands such as roll and pitch. The ability to










(Threshold box dimensions: 33 ft laterally by 80 ft vertically)

Figure 6-6: Waypoint Threshold


quickly follow, and maintain, a required change in roll will obviously affect the ability

to reach waypoints so the tracking performance must be investigated.

Figure 6-7 shows a representative plot of roll response demonstrating the lateral

stability augmentation system. Overall, the response tracks reasonably well. Some

undesirable dynamic coupling is expected because differential elevator introduces

some longitudinal dynamics and because no rudder was used for turn coordination. A

short delay time and rise time are exhibited with overshoot on the order of 2-3 bins of

resolution (4.45 deg each). Also, oscillations seen between 20-25 sec are examples

of the horizon "twitching" about two vision-estimated roll angles, which is an issue

associated with poor sensor resolution as well as distortions due to lens curvature,

as discussed in Section 3.5. Good steady-state tracking is seen otherwise.


Figure 6-7: Lateral Inner-Loop Performance










Figure 6-8 shows a pitch percentage plot demonstrating the longitudinal stability

augmentation system, which uses proportional control to track pitch percentage.

Although this controller is used only in specific circumstances as a command limiter,

it is important that good general tracking properties are exhibited. The response for

this non-aggressive command exhibits reasonably good steady-state tracking. When

implemented as a command limiter, some steady-state error can be expected, as the

limiting commands correspond to more aggressive attitude changes. These more

aggressive values are complemented by associated changes in airspeed which could

alter response characteristics undesirably.


Figure 6-8: Longitudinal Inner-Loop Performance


6.4 Waypoint Navigation

The flight test demonstrated the ability of the autopilot to reach a set of waypoints.

In this case, a set of 3 waypoints were chosen. Each waypoint was separated by

distances that could easily be reached by the aircraft within a minute but required

some aggressive turns. The altitude of each waypoint was actually chosen to be equal

because the climb performance of the vehicle was considerably worse than the turn

performance.

The performance of an initial configuration, K_ψ = 0.4 with no heading estimator, is

shown in Figure 6-9. The plot depicts a GPS map of the waypoint configuration with










each of the waypoints shown as a solid dot and the distance threshold as the surrounding

dashed square. The aircraft clearly attains the waypoints successively; however, the

gross acquisition and fine tracking phases of navigation are quite non-aggressive. The

acquisition turns are relatively wide and, while the controller is capable of guiding the

MAV to within each waypoint threshold, the fine tracking performance does not exhibit

enough authority to consistently hit the waypoints with precision. As a result, this

autopilot would likely perform poorly and cause excessive circling for waypoints that

require aggressive maneuvering.


Figure 6-9: Waypoint Navigation with K_ψ = 0.4


The autopilot was adjusted to provide more aggressive maneuvering in turns.
Specifically, the heading gain was increased to Kψ = 1.0 to command tighter turns in
response to heading error. The resulting flight path is shown in Figure 6-10. The
increased gain resulted in sharp turns but also excessive overshoot. As a result, the
aircraft frequently missed the waypoints, so the flight path displays a non-uniform
circling behavior.
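
The heading-error-to-roll mapping that this gain implements can be sketched as below.
The gain values 0.4 and 1.0 are the ones flown; the roll-command limit and the sign
convention are illustrative assumptions.

    def roll_command_from_heading(psi_cmd_deg, psi_meas_deg,
                                  k_psi=1.0, roll_limit_deg=45.0):
        """Proportional heading-to-roll command with heading-error wrapping."""
        err_deg = (psi_cmd_deg - psi_meas_deg + 180.0) % 360.0 - 180.0
        phi_cmd_deg = k_psi * err_deg
        return max(-roll_limit_deg, min(roll_limit_deg, phi_cmd_deg))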

The autopilot is challenged by an issue related to time scales; namely, the time scale
of the GPS measurements is considerably longer than the time scale of the flight
dynamics. Essentially, the MAV can turn, and thereby change its flight path
considerably, faster than the 1 Hz updates from the GPS sensor.










Figure 6-10: Waypoint Navigation with Kψ = 1.0 (GPS ground track; Longitude from 82 W (min))


As a result, the aggressive controller commanded a fast turn, but the nose moved too
far before the controller received feedback about the new heading. The flight path in
Figure 6-10 shows the overshoots that result from a fast rate of turn coupled with a
slow rate of heading measurement.

The estimator for heading is incorporated into the autopilot. The resulting flight

path is shown in Figure 6-11. The estimator is able to predict the heading angle during

turns at 50 Hz to provide feedback for the autopilot. Consequently, the autopilot can

command an aggressive turn for an amount of time suitable to the flight dynamics

regardless of the GPS rate.
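
A minimal sketch of a predictor of this type is shown below. It propagates heading at
50 Hz from the vision-estimated roll angle using the coordinated-turn relation
psi_dot = (g / V) tan(phi) and resets whenever a 1 Hz GPS fix arrives. The propagation
model and the nominal 44 ft/s airspeed are assumptions made for illustration; the
estimator implemented for the flight may differ in detail.

    import math

    G_FT_PER_S2 = 32.174   # gravitational acceleration

    class HeadingEstimator:
        """Propagate heading between GPS fixes using the vision-estimated roll."""

        def __init__(self, psi0_deg=0.0, rate_hz=50.0):
            self.psi_deg = psi0_deg
            self.dt = 1.0 / rate_hz

        def propagate(self, roll_deg, airspeed_ft_s=44.0):
            """Advance heading one 50 Hz step (coordinated-turn assumption)."""
            psi_dot = math.degrees((G_FT_PER_S2 / airspeed_ft_s) *
                                   math.tan(math.radians(roll_deg)))
            self.psi_deg = (self.psi_deg + psi_dot * self.dt) % 360.0
            return self.psi_deg

        def correct(self, gps_heading_deg):
            """Reset the estimate whenever a new GPS heading arrives."""
            self.psi_deg = gps_heading_deg % 360.0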

Figure 6-11: Waypoint Navigation with Kψ = 1.0 and Heading Estimator (GPS ground track; Longitude from 82 W (min))









Clearly the estimator significantly improved performance of the autopilot for

waypoint navigation. The large heading changes required during gross acquisition

immediately after hitting a waypoint are observably more aggressive in Figure 6-11

than in Figure 6-9. Conversely, the fine tracking phase of navigation in Figure 6-11

exhibits little of the overshoot seen in Figure 6-10. A representative flight path through

the waypoints was selected for the case shown in Figure 6-9 and measured out to

1701.7 ft while a representative track with the estimator running only covered 1553.7 ft

of ground. The discrepancy between trials represents a 9% reduction in flight path

distance with the estimator running.
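
The quoted reduction follows directly from the two path lengths:

    baseline_ft = 1701.7    # representative track without the estimator
    estimator_ft = 1553.7   # representative track with the estimator running
    reduction = (baseline_ft - estimator_ft) / baseline_ft
    print(f"{reduction:.1%}")   # 8.7%, i.e. the roughly 9% cited above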

The estimator also affects several features of the flight data. One effect is that a

more uniform flight path from pass to pass is maintained when estimated data is used.

Another effect is that the vehicle consistently tracks to within a short distance of the actual

target at the center of the waypoint threshold. Thus, using rough estimates at 50 Hz

is shown to increase both aggressiveness and precision. The three cases described are

plotted against each other in Figure 6-12.

Figure 6-12: Comparison of Waypoint Navigation Flight Paths (GPS ground tracks; Longitude from 82 W (min); legend: Est Off, Non-Aggressive; Est Off, Aggressive; Est On, Aggressive)










The performance of the altitude controller was reasonable but suffered some adverse
effects due to the flight performance properties of the MAV. Those results are
therefore presented in the next section, as part of the discussion of the major effects
that the MAV performance characteristics and flight regime have on the performance
of the controller.

6.5 Controller Performance Analysis

The data in Figure 6-11 indicate reasonable performance of the closed-loop

system for waypoint navigation; however, several issues associated with the flight

path must be considered along with simply attaining the waypoints. In particular, the

performance with respect to altitude hold and gust rejection needs to be investigated.

Additional sensor measurements must thus be analyzed to fully evaluate the controller.

The wind during the flight testing was recorded at 7.5 ft/s from the west-southwest.
The effect of this wind, whose direction is shown in Figure 6-13, is clearly seen in this
representative segment of the flight path. The aircraft, which nominally flies at about
44 ft/s, experiences a variation in airspeed of roughly 34% between upwind and
downwind conditions. The direction of the wind is such that the MAV is blown into
the turn when acquiring waypoint-2 but blown out of the turn when acquiring
waypoint-3. Consequently, the MAV is somewhat blown downwind during the flight.
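
The 34% figure is consistent with the full ±7.5 ft/s wind component taken about the
44 ft/s nominal speed; the check below treats the wind as aligned with the flight path,
an assumption made only for this arithmetic.

    wind_ft_s = 7.5
    nominal_ft_s = 44.0
    spread = 2.0 * wind_ft_s / nominal_ft_s   # upwind-to-downwind spread
    print(f"{spread:.0%}")                    # 34%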

Figure 6-13: Wind relative to Flight Path (GPS ground track with Wpt 1, Wpt 2, Wpt 3 and the wind direction marked; Longitude from 82 W (min))











The wind also has a noticeable effect on the roll tracking in addition to the turn

radius. Essentially, the autopilot is designed for a single airspeed, but the vehicle
operates over a wide range of airspeeds throughout the flight. The inner-loop controller
which provides roll tracking, shown in Figure 4.1.2, is thus required to provide
sufficient performance despite this variation in airspeed and the associated aircraft
dynamics.

Figure 6-14 demonstrates the effect of wind on the inner-loop roll tracker. The

vehicle typically cycles through an 8 s turn as it acquires waypoint-2, then an 8 s turn
as it acquires waypoint-3, then a pair of turns to acquire waypoint-1. The turns
starting at approximately 3 s, 42 s, and 84 s are the turns into the wind; they show roll
angles that exceed the commanded values, including an overshoot. Conversely, the
turns starting at approximately 19 s, 58 s, and 100 s are turns out of the wind and
show roll angles smaller than the commanded values. These results are directly caused
by the variation in control-surface effectiveness with the wind.
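
A first-order explanation is that control effectiveness scales with dynamic pressure,
q = 0.5 * rho * V^2. The sketch below assumes the airspeed on the upwind and
downwind legs differs from the 44 ft/s nominal value by the recorded 7.5 ft/s wind;
both the model and those airspeeds are illustrative assumptions.

    RHO_SLUG_FT3 = 0.0023769   # sea-level air density

    def control_effectiveness_ratio(airspeed_ft_s, nominal_ft_s=44.0):
        """Ratio of dynamic pressure to its value at the design airspeed."""
        q = 0.5 * RHO_SLUG_FT3 * airspeed_ft_s ** 2
        q_nom = 0.5 * RHO_SLUG_FT3 * nominal_ft_s ** 2
        return q / q_nom

    print(round(control_effectiveness_ratio(51.5), 2))   # ~1.37: extra authority, overshoot
    print(round(control_effectiveness_ratio(36.5), 2))   # ~0.69: reduced authority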

Figure 6-14: Roll Angle (Vision-Estimated and Commanded traces versus Time (sec))



The altitude, especially during these turns, is another issue of interest because the

vehicle is somewhat underpowered. The altitude tracking is shown in Figure 6-15 to

be reasonable for straight flight but quite poor for turns. The vehicle consistently loses

a great deal of altitude during the turns into the wind but considerably less during the

turns out of the wind.













Figure 6-15: Altimeter (Measured and Commanded altitude versus Time (sec))


The pitch percentage shown in Figure 6-16 indicates, as expected, a dramatic increase
during times associated with large roll commands. Such increases are indicative of a
large nose-down pitch moment. Differential elevator is a major contributor to this
moment when the aircraft is steeply banked because the elevator surfaces are located
aft of the center of gravity. This coupling effect is more evident when turning into the
wind because of the increased elevator effectiveness associated with the change in
airspeed. The loss of altitude is abated to some extent by the gain added through Kθ;
however, the altitude loss was still significant for the turns into the wind.



Figure 6-16: Pitch Percentage (versus Time (sec))


The dependence of flight performance on wind conditions is further demonstrated in
Figure 6-17. The turn to acquire waypoint-2 corresponds to headings from 270 deg to
90 deg, whereas the turn to acquire waypoint-3 corresponds to headings from 100 deg
to 310 deg, with the wind originating from about 250 deg. Both acquisition turns
exhibit loss of altitude; however, the loss is clearly greatest during the downwind turn
to acquire waypoint-2, resulting in the inverted peak seen in Figure 6-17 near 100 deg.
The vehicle recovers altitude during the two segments flown at approximately constant
headings of 90 deg and 270 deg. Significant overshoot in climb is seen in these cases,
which is likely due to off-design controller performance resulting from wind effects.

Figure 6-17: Altimeter Relationship to Heading (altitude versus Heading (deg))


Finally, some discussion is warranted on the function of the switching mechanism

in the longitudinal controller shown in Figure 5-7. That controller has two separate
elements: one tracks pitch percentage and the other tracks altitude. The switch
attempts to retain the horizon in the image, and consequently ensure an accurate
vision-based attitude estimate, by choosing the less aggressive of the two elements. In
this case, the controller tracks pitch percentage only when the aircraft has a large error
in altitude and the altitude tracker commands an elevator deflection that would pitch
the camera image above or below the horizon. The gain, Kc, is deliberately chosen to
be large to maintain aggressive tracking near the extreme limits of pitch percentage. It
is critical

that the controller has the capability to achieve these limiting values of pitch percentage

when called upon. Poor tracking could cause the horizon to move out of view of the

camera with potentially disastrous results.










Figure 6-18 displays instances during the flight test for which the pitch percentage

tracking command was used in place of the altitude tracking command. These

instances are denoted by values of one in the data set. The controller mostly tracked

altitude commands because the flight path rarely required a pitch attitude large
enough to lose the horizon. The few instances when the switch activated correlate to
situations where the error in pitch percentage was coincidentally low, so the less
aggressive maneuver was to track pitch percentage.
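
A sketch of this selection logic is given below. Only the decision structure follows the
description above; the horizon-retention limits and the 0/1 toggle encoding matching
Figure 6-18 are illustrative assumptions.

    PITCH_PCT_MIN = 0.05   # assumed horizon-retention limits, not flight values
    PITCH_PCT_MAX = 0.95

    def select_elevator_command(altitude_cmd_elev, pitch_pct_cmd_elev,
                                predicted_pitch_pct, toggle_log):
        """Choose the less aggressive longitudinal command.

        Falls back to the pitch-percentage tracker whenever the altitude command
        is predicted to push the horizon out of the image; appends 1 to the log
        when the pitch-percentage command is used (cf. Figure 6-18), else 0.
        """
        if PITCH_PCT_MIN <= predicted_pitch_pct <= PITCH_PCT_MAX:
            toggle_log.append(0)
            return altitude_cmd_elev
        toggle_log.append(1)
        return pitch_pct_cmd_elev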

The situation might be different for waypoint configurations that require drastic

changes in altitude. The only changes required in this case were the result of the

altitude controller making up for poor performance. An environment with obstacles

might require the vehicle to make large and sudden changes in altitude. An interesting

follow-up experiment would involve putting the controller in such a situation to better

determine the effectiveness of the methodology used for feature retention.

Figure 6-18: Longitudinal Control Toggle (toggle state versus Time (sec))















CHAPTER 7
FUTURE DIRECTIONS

Mission capability for any autonomous aircraft, including a MAV, requires a

control system that can perform a diverse set of tasks. Attitude stability and tracking

represent only a small subset of control tasks that might be considered essential to fully

autonomous guidance and navigation. Some of these tasks include trajectory tracking,

path planning, obstacle avoidance, and target recognition. The successful completion

of these tasks depends in large part on the information provided by on-board sensors.

In the case of a MAV, the available sensors are limited by size and weight constraints.

A small video camera can provide a wide variety of information and in many cases

this data can be used for control purposes. It therefore becomes desirable to extend the

technologies developed in the preceding chapters of this thesis to address some of the

higher level outer-loop control problems.

It is apparent that the future direction of vision-based flight control will be heavily
dependent on advances in image processing technology. Knowledge of the nature of
the information provided by a vision sensor is necessary to begin studying various
interpretations and implementations in control laws. Current capabilities in feature
detection, classification, and tracking are likely to continue to improve. The
assumption is made that advanced image processing algorithms will be developed to
yield increasing amounts of useful information from the focal plane. This assumption
allows anticipative investigation of how one would take advantage of such information
if and when it becomes available.

The broad goal for the direction of vision-based flight control, particularly for

MAVs, can be illustrated by Figure 7-1. Typical mission profiles might require a

MAV to autonomously navigate an unknown urban environment. Inertial sensors
may be sufficient for certain segments of flight, while other segments could render

readings from an inertial measurement unit (IMU) unusable. In the latter case, a
controller might have to rely on computer vision alone. The possibility also exists

that some control tasks would require the blending of inertial sensors with vision-based

information. For full autonomy, a system capable of achieving mission profiles like that

described by Figure 7-1 requires the capability to switch modes of control in real-time.
















Figure 7-1: Technology Roadmap for Vision-Based Flight Control


This discussion presents several directions for future investigation. First, it is

necessary to identify possible control tasks that a MAV might be required to perform

during autonomous flight in an unknown environment. For example, obstacle detection

and avoidance should certainly be included in this set. A variety of control approaches

might be appropriate for each task. For the example of avoidance, different actions

might be required depending on the nature of the obstacle encountered and the time

period between detection and imminent collision. Advanced techniques for aggressive

maneuvering may be necessary.

Following task identification, the sensor measurements required to perform specific

tasks must then be examined. Certain control tasks may require the blending of inertial

and vision information on various levels. It will be necessary to develop approaches

to address the variety of uncertainties that are presented through the combination of
different sensor technologies. The interpretation of two fairly uncertain signals as one

signal of greater certainty is a critical capability in such cases.
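
As one simple illustration of such blending, a complementary filter combines a
gyro-propagated attitude, trusted over short time scales, with a vision-based estimate,
trusted over long time scales. The sketch below is an example of the idea rather than
a method implemented in this thesis; the blend weight is an assumption, and using the
body roll rate directly as the roll-angle rate is a small-angle simplification.

    class ComplementaryRollFilter:
        """Blend gyro propagation (short-term) with a vision roll estimate."""

        def __init__(self, blend=0.98, roll0_deg=0.0):
            self.blend = blend        # weight on the gyro-propagated estimate
            self.roll_deg = roll0_deg

        def update(self, gyro_p_deg_s, vision_roll_deg, dt_s):
            propagated = self.roll_deg + gyro_p_deg_s * dt_s
            self.roll_deg = (self.blend * propagated +
                             (1.0 - self.blend) * vision_roll_deg)
            return self.roll_deg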

Eventually, the various control approaches will begin to form well-defined modes of
control, each requiring a unique use of available sensor information. The higher

level decisions regarding the selection of the appropriate approach to use in any given

situation must be investigated. For the avoidance example, the system must be capable

of making the decision whether to use "mild maneuver avoid" or "aggressive maneuver

avoid." These modes may break down further into numerous sub-modes. The controller

must have the ability to assess a given situation and decide upon proper action. Sensor

uncertainty analysis factors heavily into these decisions, as the controller must know

which measurements to trust and which to ignore in choosing a control action.

The actual process of switching from one control mode to another will introduce

a new set of challenges. Smooth transition from one action to the next is critical in

a cluttered flight environment. The controller must anticipate and take proper action

to ensure efficient transitions. This capability may factor into other decision-making

processes. For example, the controller may be faced with an avoidance situation for

which "avoid left" and "avoid right" are both viable options. The scenario following

"avoid left" may require some other action that could result in a stall if it immediately

follows "avoid left." If the MAV can anticipate this transition issue through sensor

measurements and proper interpretation, the decision to switch to "avoid right" mode

can be made.

Finally, the controller might be required to reconfigure the mission profile given

the introduction of unexpected course deviations. For example, the controller may

be required to switch from a navigation mode to an obstacle avoidance mode, then

exhibit on-line path-planning capabilities to resume the original mission goal from the

current location and trajectory. Likewise, a tracking scenario might involve a human

target climbing into a vehicle. The controller must reconfigure the mission to track the
vehicle instead of the human. These scenarios both require the controller to make high

level decisions concerning its own mission requirements.

This discussion shows that the work presented in this thesis is only a small step

in the development of advanced mission capability using computer vision as a sensor.

Several examples of future directions have been identified. These examples show that

computer vision represents an integral part of both advanced intelligent flight control
techniques and autonomous vehicle mission planning.















CHAPTER 8
CONCLUSION

This thesis has demonstrated that waypoint navigation can be accomplished

using GPS and altitude sensing coupled with vision-based attitude estimation. The

estimated states are actually only roll angle and pitch percentage; however, these

parameters provide sufficient information for basic stability augmentation. A flight

test of the resulting autopilot clearly demonstrates this performance. In this case, the
autopilot is able to generate a short flight path that successively reaches a set of
waypoints. The current approach is limited to flight operations in which the horizon is
visible; however, the fundamental concept is the use of an autopilot with feedback of a
vision-based feature. As such, the autopilot can be extended, using this concept, to
other vision-based features, such as building corners or roads, for guidance and
navigation.






























BIOGRAPHICAL SKETCH

Joseph John Kehoe was born in Cooperstown, NY, on July 16, 1980. He grew

up in Oneonta, NY, and graduated from Oneonta High School in June of 1998. He

then attended Virginia Polytechnic Institute and State University, where he received

a Bachelor of Science degree in aerospace engineering in May of 2002. During his

senior year at Virginia Tech, Joseph was part of an interdisciplinary/international

aircraft design team whose efforts resulted in Ikelos, which is a unique general aviation

aircraft designed to fit into NASA's Small Aircraft Transportation Systems (SATS)

infrastructure. The design won second place in the 2002 student competition out of

NASA Langley and was featured at the EAA's Airventure in Oshkosh, WI, as well as

in the October 2002 issue of Popular Science. He worked two internships with NLX

Corporation in Sterling, VA, where he helped with the development and qualification

processes of several FAA level D full-flight simulator programs. Joseph is currently

a second-year graduate student in the Department of Mechanical and Aerospace

Engineering at the University of Florida. He is studying under Dr. Rick Lind, and his

research involves the development of vision-based flight control methodologies.