
A Flight Testbed with Virtual Environment Capabilities for Developing Autonomous Micro Air Vehicles

[DAITSS ingest report — package UFE0008441_00001, account UF, project UFDC, ingested 2011-01-15T18:31:56Z.]
e64ea20bf3812ec437e1e6419235eedb61ea4d46
71805 F20110115_AACBHL grzywna_j_Page_32.QC.jpg
4e5168a02d0036a953cfe7a28c6740ea
6dbb91b3a95ca81fc4166b8cb2cb89bf29a72cc1
133406 F20110115_AACBCN grzywna_j_Page_16.jpg
afda9e1eaed82f54377541b391689603
f041ccb69eaac3d131712aeb50ed65bb1eede5ab
59444 F20110115_AACBHM grzywna_j_Page_33.QC.jpg
582db537fafe9723ffb1c16fb6c7f9b2
5e0aaf92f74067d446f71000fdb0c3846dc9b200
193596 F20110115_AACBCO grzywna_j_Page_19.jpg
dd776d2bb94a3946420d90d4d95c3e63
a58e0c1fd63602bee0e87e025d803241f258b711
17562 F20110115_AACBHN grzywna_j_Page_34thm.jpg
c724ba2c05243837efb98ee24c704b49
641d3657fb27855911aafde8505a638ccf2da3d9
166828 F20110115_AACBCP grzywna_j_Page_21.jpg
80d00ba4d8cca32d1233b1d6e57f7131
1b4dbc5851cab848776601ce7f486fc74a0c81f3
59176 F20110115_AACBHO grzywna_j_Page_34.QC.jpg
fe6eebb0994de80a70ef9363e33c9557
1994c23c0f5f1d49b6a84866ce6db51b41cb16cd
180343 F20110115_AACBCQ grzywna_j_Page_24.jpg
c715df275aca0aa8fc338af462ce7826
8f71087b09e65d89addfeb9e64e45028ea819e39
46095 F20110115_AACBHP grzywna_j_Page_35thm.jpg
7b39d4e90b695a5fea4df364ec0ef0ef
ab108b5e68ca85d3cb016f10aedc467da9cba332
163193 F20110115_AACBCR grzywna_j_Page_25.jpg
339ca85cebe68ea5031a5de384a77369
0adc439a140c9223cca586a546533697a31a07e3
46057 F20110115_AACBHQ grzywna_j_Page_36thm.jpg
006f90a4a4003d0e8cf875f0272cdaaf
87733d16a723314ea2edcaa83cacf05d8f25233d
191748 F20110115_AACBCS grzywna_j_Page_32.jpg
015cc210abb1c1ba4afa236684bcdf09
f8f7bfd1c9eb221b963dca2e1fbd19e6ca80f827
93424 F20110115_AACBHR grzywna_j_Page_36.QC.jpg
93602072e6c79f2d3a8cdb77e2f3b17b
bbd990efde3086f351d777726a3d593257667efe
163833 F20110115_AACBCT grzywna_j_Page_33.jpg
8b0ce7c58a3fac014950975119c0872d
81060e35924b37e78b370195ccc5a9bd4e9ad409
207285 F20110115_AACBCU grzywna_j_Page_35.jpg
d18afb2d1d5a2f4256a0a8ea6ed5a710
2457ae191afdf433805467fdfe1cab669cb758de
83696 F20110115_AACBHS grzywna_j_Page_37.QC.jpg
cac83e79691f860ec28e1b9d5730c9fd
70d22cc8614c81ce1a1fad563d6f96cf2716400b
231246 F20110115_AACBCV grzywna_j_Page_38.jpg
447fadaf3c62ab7e2c148895c6ade059
8edf2205d9e8e88ce3a43ad4a4aee8efbb95bbe6
46839 F20110115_AACBHT grzywna_j_Page_38thm.jpg
cff7271685017de3c7019c5bcf42315d
9b5a8cb447b1b311360367f3711c3b0fa9930905
119501 F20110115_AACBCW grzywna_j_Page_43.jpg
cebd623f859b06f7e620b3ff19ff5212
bf1f1aa77598abc6fb3f7f3bf5a14e771dc96130
37434 F20110115_AACBHU grzywna_j_Page_39thm.jpg
738ab3fe9f436de5ebfac6d9b5d1c2c9
814bb04ddd458f397c0638eccab88207d3a931fc
80551 F20110115_AACBCX grzywna_j_Page_44.jpg
70407b88b02fb0b0617623f7c4ac15e6
7a42c25cb85e2ffe5dbf74f8b113198f87e274ab
12292 F20110115_AACBHV grzywna_j_Page_40thm.jpg
678ebe8609b9d710680710829cd6142f
3dfe01e680f28ca69b988876ac70355ccb390bfd
F20110115_AACAYA grzywna_j_Page_40.tif
42765b93231f79627fe4b35b4c089519
6e978b12ff695f8b13c70ea0b852f2aa8ddc7f87
12445 F20110115_AACBAA grzywna_j_Page_43thm.jpg
96d92dafd3657c315159f5c07e5a8819
e308801a1bd746d7791261029ea3bf67b7223a6a
430649 F20110115_AACBCY grzywna_j_Page_04.jp2
a8c85b33c3b1d31d76b879e5d4d373b2
dc0b32b2dc8cb266e807be498c0204e7218ec807
61251 F20110115_AACBHW grzywna_j_Page_41.QC.jpg
eb93b58a185dc04bd5631499ea89bc05
ca039a5695c830e85f06c8e460dd208b72a14b47
228251 F20110115_AACAYB grzywna_j_Page_42.jpg
ffb181c36957adae87742e1a2c7afc60
ec1601bbc29cf8637d8a94165cf75bcfaf372730
12095766 F20110115_AACBAB grzywna_j.pdf
9279a7b19172cc6c30e2a8762c692fee
0bfa8a6a08c6776174e2ad172451546204aae7be
148271 F20110115_AACBCZ grzywna_j_Page_06.jp2
8af8d09a01b1a3773392d28a4721fffe
1634c7701b2b0951e4d3d2d495141a2531119d52



PAGE 1

A FLIGHT TESTBED WITH VIRTUAL ENVIRONMENT CAPABILITIES FOR DEVELOPING AUTONOMOUS MICRO AIR VEHICLES

By

JASON WESLEY GRZYWNA

A THESIS PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE

UNIVERSITY OF FLORIDA

2004


ACKNOWLEDGMENTS

I would like to thank Dr. Michael Nechyba for his guidance and support of my research for this thesis. As my advisor, Dr. Nechyba has motivated me through his leadership and his ability to cultivate a synergistic work environment. I would also like to thank Dr. A. Antonio Arroyo for his passion for education and his belief in me. He invited me into his lab and pushed me to reach the next level. Dr. Eric Schwartz taught the classes that led me into robotics. He is a great friend and an honest mentor. I would also like to thank Dr. Peter Ifju and his students. They build the platforms which enable the work that I do.

Special thanks go to Jason Plew, an invaluable research partner; Ashish Jain, a friend wise beyond his years; Uriel Rodriguez, a friend who always had a way; Sinisa Todorovic, a mentor who provided invaluable insight; Shalom Darmanjian, a friend who always made me laugh; and Mujahid Abdulrahim, a friend with great ideas.

Finally, I would like to thank my family for their unending support of my work and belief that I would always succeed. I especially want to thank Jennifer, the girl who owns my heart. She is my best friend and my inspiration.


TABLE OF CONTENTS

ACKNOWLEDGMENTS
LIST OF FIGURES
ABSTRACT

1 INTRODUCTION
  1.1 Micro Air Vehicles
    1.1.1 Challenges in Developing Vision-based Autonomy
    1.1.2 Utilizing a Virtual Environment
  1.2 Overview of the Proposed MAV Testbed
  1.3 Overview of the Thesis

2 MICRO AIR VEHICLE PLATFORM
  2.1 Advantages and Limitations
  2.2 Construction Techniques
  2.3 Propulsion System Design
  2.4 Integrating Vision

3 VISION-BASED CONTROL
  3.1 Flight Stability
  3.2 Object Tracking
  3.3 Controller

4 TESTBED IMPLEMENTATION
  4.1 Architecture of the System
  4.2 Virtual Environment Simulation
  4.3 Testbed Hardware

5 EXPERIMENTAL RESULTS
  5.1 Flight Testing Procedures


  5.2 Simple Stabilization Experiment
  5.3 Object Tracking
  5.4 Autonomous Landing: Virtual Environment
  5.5 Autonomous Landing: Real-flight Experiments

6 CONCLUSION

REFERENCES
BIOGRAPHICAL SKETCH


LIST OF FIGURES

1.1 UF HILS facility currently under construction: concept diagram
1.2 Try-by-flying approach: feedback from the flight test
1.3 Testbed architecture overview
2.1 Adaptive washout in action
2.2 MAV platform
3.1 Horizon tracking: (a) original image; (b) optimization criterion J as a function of bank angle and pitch percentage; (c) resulting classification of sky and ground pixels in RGB space
3.2 In object tracking, the search region for the next frame is a function of the object location in the current frame
3.3 Controller for vision-based stabilization and object tracking
4.1 Testbed system overview
4.2 Some sample virtual scenes: (a) field, trees and mountains; (b) simple urban; (c) urban with features; and (d) complex urban
5.1 Stabilization results: (a) direct RC-piloted flight, and (b) horizon-stabilized (human-directed) flight. Maneuvers for flight trajectory (b) were executed to mimic flight trajectory (a) as closely as possible
5.2 Object tracking: (a) virtual testbed, and (b) real flight image sequence
5.3 Autonomous landing in a virtual environment: four sample frames
5.4 Roll, pitch and tracking command for virtual autonomous landing in Figure 5.3
5.5 Real-flight autonomous landing in field testing: four sample frames


5.6 Roll, pitch and tracking command for real-flight autonomous landing in Figure 5.5


Abstract of Thesis Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Master of Science

A FLIGHT TESTBED WITH VIRTUAL ENVIRONMENT CAPABILITIES FOR DEVELOPING AUTONOMOUS MICRO AIR VEHICLES

By

Jason Wesley Grzywna

December 2004

Chair: A. Antonio Arroyo
Major Department: Electrical and Computer Engineering

We seek to develop vision-based autonomy for small-scale aircraft known as Micro Air Vehicles (MAVs). Development of such autonomy presents significant challenges, in no small measure because of the inherent instability of these flight vehicles and the try-by-flying practices in use today. Therefore, in this thesis, we propose a flight testbed system that seeks to mitigate these challenges by facilitating the rapid development of new vision-based control algorithms that would have been, in the testbed's absence, substantially more difficult to transition to successful flight testing. The proposed testbed system provides a complete architecture, built from custom-designed hardware and software, for developing autonomous behaviors for MAVs using a camera as the primary sensor. This system bridges the gap between theory and flight testing through the integration of a new virtual testing environment. This virtual environment allows the system to be tailored to a number of different mission profiles through its ability to


perform test flights in a multitude of virtual locations. The virtual environment presented in this thesis is a precursor to a more complex Hardware-in-the-Loop Simulation (HILS) facility currently being constructed at the University of Florida. HILS systems allow us to experiment with vision-based algorithms in controlled laboratory settings, thereby minimizing loss-of-vehicle risks associated with actual flight testing. Along with a virtual testing environment, the proposed system optionally allows a human in the control loop. In this thesis, we first discuss the background work done with MAVs and give an overview of the testbed system architecture. Second, we present our vision-based approaches to MAV stabilization, object tracking, and autonomous landing. Third, we present details of the proposed system and show how the work done mitigates the problems and challenges of implementing vision-based flight controllers. Finally, we report experimental flight results and discuss how the presented system facilitates the development of autonomous-flight MAVs.


CHAPTER 1
INTRODUCTION

Over the past several years, Unmanned Air Vehicles (UAVs) have begun to take on missions that had previously been reserved exclusively for manned aircraft, as evidenced in part by the much-publicized deployment of the Global Hawk and Predator UAVs in the recent Afghan and Iraqi conflicts. While these vehicles demonstrate remarkable advances in UAV technology, their deployment is largely limited to high-altitude surveillance and munitions deployment, due to their size and limited autonomous capabilities. Moreover, while such UAV missions can prevent unnecessary loss of human life, at costs of $70 million and $4.5 million for the Global Hawk and Predator, respectively [1], these UAVs cannot be considered expendable.

1.1 Micro Air Vehicles

Interest has grown in a different class of small-scale UAVs, known as Micro Air Vehicles (MAVs), that overcome the limitations of larger and more expensive UAVs. At the University of Florida, our on-going research efforts have led to the development of a large number of MAV platforms, ranging in maximum dimension from 5 to 24 inches [2,3].¹ Given their small size, weight, and cost (approximately

¹ Recent development of bendable wings allows even larger MAVs to fit inside containers with diameters as small as 4 inches.


$1,000/vehicle), MAVs allow for missions that are not possible for larger UAVs. For example, such small-scale aircraft could safely be deployed at low altitudes in complex urban environments [4], and could be carried and deployed by individual soldiers for remote surveillance and reconnaissance of potentially hostile areas in their path.

While MAVs present great possibilities, they also present great challenges beyond those of larger UAVs. First, even basic flight stability and control present unique challenges. The low moments of inertia of MAVs make them vulnerable to rapid angular accelerations, a problem further complicated by the fact that aerodynamic damping of angular rates decreases with a reduction in wingspan. Another potential source of instability for MAVs is the relative magnitude of wind gusts, which is much higher at the MAV scale than for larger aircraft. In fact, wind gusts can typically be equal to or greater than the forward airspeed of the MAV itself. Thus, an average wind gust can immediately effect a dramatic change in the vehicle's flight path.

Second, MAVs, due to severe weight restrictions, cannot necessarily make use of the same sensor suite as larger UAVs. While some MAVs recently developed have seen the incorporation of miniature on-board INS and GPS [5,6], such sensors may not be the best allocation of payload capacity. For many potential MAV missions, vision is the only practical sensor that can achieve required and/or desirable autonomous behaviors, as is the case, for example, for flight in urban environments below roof-top altitudes [7]. Furthermore, given that surveillance has been identified as one of their primary missions, MAVs must necessarily be equipped with on-board imaging sensors, such as cameras or infrared arrays. Thus,


computer-vision techniques can exploit already present sensors, rich in information content, to significantly extend the capabilities of MAVs, without increasing their required payload.

When additional sensors are present that don't compromise weight and size constraints, more state information can be derived from the system and fused with the data extracted with computer-vision techniques for an overall more robust system. In this thesis we do not rule out the use of additional sensors; we just treat vision as the primary, and the only necessary, sensor for autonomous flight.

1.1.1 Challenges in Developing Vision-based Autonomy

In this thesis, we seek to build on our previous success in vision-based flight stability and control [8,9] on the MAV scale, to achieve more complex vision-based autonomous behaviors, such as urban environment survival. Development of such behaviors does, however, present some difficult challenges. First, dedicated flight test locations typically do not exhibit the type of scene diversity likely to be encountered in deployment scenarios.² Second, closed-loop, vision-based approaches must operate within a tight computational budget for real-time performance, and require extensive flight testing for robust performance in many different scenarios. Because of the complexity involved, simple errors in software development can often lead to critical failures that result in crashes and loss of the MAV airframe and payload. This in turn introduces substantial delays in the

² Our typical flight test location would be a featureless open field. This is a sharp contrast to a deployment scene consisting of structures and other vertical obstacles.


development cycle for intelligent, autonomous MAVs. It is also apparent that having a human-control capability in the control loop would be advantageous to mitigate scenarios where the airframe is in peril.

1.1.2 Utilizing a Virtual Environment

To address the challenges discussed above, we are currently constructing a Hardware-In-the-Loop Simulation (HILS) facility, expected to be completed by the spring of 2005, that will enable testing and debugging of complex vision-based behaviors without risking destruction of the MAV flight vehicles. As conceived and depicted in Figure 1.1, the HILS facility will simulate the flight of a single MAV through diverse photo-realistic virtual worlds (e.g., urban environments), by measuring and modeling aerodynamic flight characteristics in a wind tunnel in real time. The virtual display will render the correct perspective of the virtual world as the MAV's trajectory is computed from its dynamic model.

Figure 1.1: UF HILS facility currently under construction: concept diagram.


1.2 Overview of the Proposed MAV Testbed

In this thesis, we present a flight testbed system that allows for rapid development of vision-based autonomous MAVs. The proposed testbed system provides a complete architecture, built from custom-designed hardware and software, for developing autonomous behaviors for MAVs using a camera as the primary sensor. This system bridges the gap between theory and flight testing through the integration of a new virtual testing environment.

The virtual environment simulation component serves as a precursor to the HILS facility being constructed. This simulated environment provides (1) a diverse scenery set as well as vehicle models, including a realistic physics engine; (2) the ability to define additional scenery and models externally; (3) full support for collision detection and simulation of partial vehicle damage; and (4) environmental factors such as wind or radio noise. These features are enough to perform precursory experiments in a virtual environment, but are only a subset of what the full facility would offer.

Employing vision-based stability and navigation algorithms for UAV control is an emerging science. There are systems that utilize vision on larger UAV platforms [10,11], but none that allow for the safe and rapid development of vision-based control on the scale of a MAV. Traditional MAV development involves a try-by-flying approach, shown in Figure 1.2, since the aircraft are small and easy to repair in most cases. Try-by-flying works for simple tasks (e.g., PID loop tuning), but a more sophisticated process is needed for tuning complicated vision algorithms. Larger aircraft (e.g., F-16s) use rigorous hardware-in-the-loop and wind-tunnel facilities for complete system verification before the aircraft leaves


the ground. We do not have the time to test our algorithms as rigorously in a similar manner. Therefore, we need to develop hardware that can be used both in a testing situation and in a real flight. In addition, we need that system to provide at least some level of hardware-in-the-loop verification.

Figure 1.2: Try-by-flying approach: feedback from the flight test.

This thesis proposes such a system, shown in Figure 1.3. The testbed is divided into the on-board components (carried in the airframe), a virtual environment simulation (for laboratory verification and testing), and the off-board components, located on the ground (the interface to the flight vehicle). The ground station interface to the "flight vehicle" does not change. That is, the ground station is completely interchangeable between the real flight vehicle and the virtual-environment flight vehicle, so that code, controllers, and hardware developed in one environment are immediately transferable to the other.

Instead of developing a control algorithm and going directly to flight testing, as done in the past, we will develop that algorithm under the framework of the presented testbed, which includes utilizing the virtual environment simulation. Once the virtual environment testing has been completed and the algorithm has been verified in a wide range of environmental conditions, we can then deploy that technology to a real flight test with little risk to the aircraft.


Figure 1.3: Testbed architecture overview.

1.3 Overview of the Thesis

In the following chapters we discuss the main components of our flight testbed system, shown in Figure 1.3.

First, in Chapter 2, we discuss the MAV platform and the integration of vision. Next, in Chapter 3, we present our vision-based approaches to MAV stabilization, object tracking, and autonomous landing. Then, in Chapter 4, we discuss the testbed architecture in detail, including the virtual environment simulation and the hardware. Next, in Chapter 5, we report experimental flight results both for the virtual environment and for flight tests in the field, and discuss how algorithms developed in the virtual environment were seamlessly transitioned to real flight testing. Finally, in Chapter 6, we give our conclusions.
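The interchangeability described in Section 1.2, one ground station driving either the real vehicle or the virtual-environment vehicle, can be pictured as a thin interface layer. The sketch below is purely illustrative (the actual testbed is custom hardware and software; every class and method name here is a hypothetical stand-in):

```python
from abc import ABC, abstractmethod

class FlightVehicle(ABC):
    """Interface the ground station codes against. Both the real MAV
    link and the virtual-environment simulation would implement it, so
    controllers developed in one transfer unchanged to the other."""

    @abstractmethod
    def get_frame(self):
        """Return the latest camera frame (real video or rendered scene)."""

    @abstractmethod
    def send_controls(self, d1, d2):
        """Deliver the two control-surface commands to the vehicle."""

class SimulatedMAV(FlightVehicle):
    """Stand-in for the virtual-environment flight vehicle."""

    def __init__(self):
        self.last_controls = None

    def get_frame(self):
        return "frame rendered from the virtual scene"  # placeholder

    def send_controls(self, d1, d2):
        self.last_controls = (d1, d2)  # would feed the physics engine

class GroundStation:
    """Depends only on the FlightVehicle interface, never on which
    implementation is behind it."""

    def __init__(self, vehicle):
        self.vehicle = vehicle

    def step(self, controller):
        frame = self.vehicle.get_frame()
        d1, d2 = controller(frame)       # any vision-based controller
        self.vehicle.send_controls(d1, d2)
```

Because `GroundStation` sees only the interface, a controller exercised against `SimulatedMAV` could later be pointed at the real radio link unchanged, which is the transferability property the text describes.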


CHAPTER 2
MICRO AIR VEHICLE PLATFORM

2.1 Advantages and Limitations

There are numerous challenges that prevent technology developed for larger vehicles from being applied directly to MAVs [12]. This section will discuss some of these issues. On the MAV scale, there is a severe Reynolds-number-dependent degradation in aerodynamic efficiency. This degradation requires that MAVs fly at much lower wing loading, thus placing a premium on vehicle weight. Traditional airframe design has limited applicability to MAVs. Control is more difficult since the small mass moment of inertia requires increased control input bandwidth. Disturbances (e.g., wind gusts) have an exaggerated effect on the flight path since the vehicle speed is on the same order as the disturbance. Additionally, off-the-shelf components (e.g., servos, electronics, and video cameras) are not specifically designed for MAVs. Finally, supplying reliable and efficient propulsion is a serious challenge.

Given these inherent technical obstacles, a series of MAVs and small UAVs that incorporate a number of advances have been produced at the University of


Florida.¹ A unique, thin, under-cambered, flexible wing that is more aerodynamically efficient than traditional airfoils has been developed [2]. The airframes are made from carbon fiber, durable plastic films, and latex rubber, giving them high specific strength [3].

The flexible wing, shown in Figure 2.1, exhibits advantages over traditional rigid wings in gusty wind conditions. When a traditional aircraft encounters a wind gust, the airspeed increases (head-on gust) and, subsequently, the wing lift increases. With vehicles of low inertia, such as MAVs, there is an almost immediate altitude change. In erratic conditions (e.g., frequent gusts), the aircraft becomes extremely difficult to control. The flexible wing on our MAVs incorporates a passive mechanism, called "adaptive washout," that is designed to produce smoother flight. The wing deforms with the increase in air pressure associated with a gust, creating near-constant lift [13,14,15]. In erratic conditions, these vehicles fly smoothly, making them easier to control and excellent camera platforms.

The overall MAV platform design is biologically inspired by small flying creatures, such as birds and bats [16]. These animals have thin, flexible wings and virtually silent flight mechanisms. MAVs are designed to mimic these creatures. They benefit from a similar visual likeness due to their small size and dark carbon-fiber fuselages. MAVs also use electric motors, which are much less noisy than combustion engines and are nearly silent at a distance. These characteristics allow a MAV to operate with a high degree of stealth, making it difficult to detect.

¹ These include airframes that range in size from a 4.5-inch maximum diameter to small UAVs with a 24-inch maximum dimension.


Figure 2.1: Adaptive washout in action.

2.2 Construction Techniques

The airframe is constructed from layers of bidirectional carbon fiber. The composite is formed to a foam mold and cured in an autoclave to form a rigid structure. Because the aircraft is designed without landing gear, an additional layer, composed of Kevlar, is interwoven into the bottom half of the airframe to add strength.

The thin, under-cambered wing consists of a carbon-fiber skeleton that is then covered with a wing skin.² The leading edge of the wing is made thicker to maintain the integrity of the airfoil by supplying additional reinforcement. The tail empennage, also constructed from carbon fiber, and sometimes fiberglass, is connected to the fuselage by a carbon-fiber boom that runs concentrically through the pusher-prop assembly. Tails on non-pusher-prop designs are molded into the fuselage.

² The wing skin is typically made from polystyrene or parachute material.


2.3 Propulsion System Design

Typical small-scale aircraft have their drive systems mounted in the nose of the aircraft. In this configuration, the forward view, along the center-line of the airframe, is obscured by the propeller when spinning. This propeller interference, known as prop wash, forces any cameras to be placed off-center, typically on a wing, to avoid the aliasing effects that arise when capturing images through a propeller. Consequently, mounting the camera on the wing introduces a significant amount of geometric complexity, because the center-of-mass view would then need to be recovered mathematically. To simplify the camera geometry, the new versions of our test platform are being designed with a rear-mounted drive system, as shown in Figure 2.2. This allows a forward-looking camera to be placed directly on the center-line of the airframe. Not only does the pusher-prop system allow for a clear line of sight from the front of the aircraft, it also increases lift on the wing by reducing skin friction and drag, and provides channeled airflow over the tail of the aircraft.

The conventional pusher-prop configuration has many advantages, but it also has disadvantages. Overall, it increases the envelope size of the airplane and creates issues with propeller clearance during flight. These issues were initially dealt with by utilizing a gearing system and a foldable propeller to reduce the overall size of the drive system. That configuration was complicated by the need to mount and maintain correct alignment of the gears. New aircraft, using a direct-drive system and a foldable prop, are now being developed. Their overall envelope is slightly larger than their geared counterpart; however, the trade-off for simplicity is

PAGE 20

invaluable. Additionally, the reduction in moving parts makes the aircraft quieter and easier to repair.

Figure 2.2: MAV platform.

2.4 Integrating Vision

For many potential MAV missions, vision is the only practical sensor that can achieve required and/or desirable autonomous behaviors, as is the case when flying in urban environments below roof-top altitudes. Furthermore, given that surveillance has been identified as one of their primary missions, MAVs must necessarily be equipped with on-board imaging sensors, such as cameras or infrared arrays. Thus, computer-vision techniques can exploit already present sensors, rich in information content, to significantly extend the capabilities of MAVs, without increasing their required payload.


Vision is the most desirable sensor because it is very versatile. Traditional aircraft sensors, like accelerometers and gyros, are limited to measuring only the current state of the system, while vision measures information about the environment. This information can be used to make the system react to its surrounding environment in an anticipatory manner, as in object tracking and path planning. Another advantage of vision is that it can also be used to measure the vehicle's current state by analyzing the aircraft's motion and location in the environment. Using optical-flow techniques and 3D vision, the position, orientation, and trajectory of the aircraft can be estimated over time [17,18]. Although these estimates alone could potentially be used to replace traditional aircraft sensors, a more reasonable approach would be to correlate the traditional sensors with the information extracted through vision. Many techniques have been developed to enable data from many different sources to be utilized together to make very accurate estimates about the state of the aircraft [19,20].

Placing imaging sensors on-board the aircraft is cost-effective in both payload and time. Processing the data they are capable of gathering, however, is very computationally expensive and non-trivial to implement on-board a MAV-size platform. To address this issue, a transmitter is installed along with the camera. This transmitter allows the video signal to be broadcast to the ground station, where a more powerful computer can perform the computer-vision calculations.
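One of the ground-station calculations alluded to here is the template-matching step used for object tracking in Chapter 3. A minimal sketch of sum-of-squared-differences (SSD) matching over a constrained search window may help fix ideas; this is an illustrative NumPy sketch under my own naming and windowing conventions, not the lab's actual implementation:

```python
import numpy as np

def ssd_match(frame, template, center, n):
    """Find the best match for a square RGB template within an n x n
    search neighborhood of `center`, minimizing the sum of squared
    differences over all three color channels.

    frame:    (H, W, 3) array.
    template: (m, m, 3) array with m odd.
    center:   (x, y) pixel around which to search.
    """
    m = template.shape[0]
    h, w, _ = frame.shape
    cx, cy = center
    best, best_ssd = center, np.inf
    # Only candidate centers whose m x m patch fits inside the frame
    # and lies inside the n x n neighborhood are considered.
    for y in range(max(m // 2, cy - n // 2), min(h - m // 2, cy + n // 2 + 1)):
        for x in range(max(m // 2, cx - n // 2), min(w - m // 2, cx + n // 2 + 1)):
            patch = frame[y - m // 2:y + m // 2 + 1,
                          x - m // 2:x + m // 2 + 1]
            ssd = float(((patch - template) ** 2).sum())
            if ssd < best_ssd:
                best, best_ssd = (x, y), ssd
    return best
```

Restricting the scan to an n x n neighborhood, rather than the whole frame, is what keeps a search like this cheap enough to run per video frame on the ground-station computer.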


CHAPTER 3
VISION-BASED CONTROL

3.1 Flight Stability

Fundamentally, flight stability and control require measurement of the MAV's angular orientation. The two degrees of freedom critical for stability (i.e., the bank angle φ and the pitch angle θ¹) can be derived from a line corresponding to the horizon as seen from a forward-facing camera on the aircraft. Below, we briefly summarize the horizon-detection algorithm used in our experiments (further details can be found in [9,21]).

For a given hypothesized horizon line dividing the current flight image into a sky and a ground region, we define the following optimization criterion J:

    J = (μ_s − μ_g)ᵀ (Σ_s + Σ_g)⁻¹ (μ_s − μ_g)    (3.1)

where μ_s and μ_g denote the mean vectors, and Σ_s and Σ_g denote the covariance matrices, in RGB color space, of all the pixels in the sky and ground regions, respectively. Since J represents the Mahalanobis distance between the color distributions of the two regions, the true horizon should yield the maximum value of J, as is illustrated for a sample flight image in Figure 3.1.

¹ Instead of the pitch angle θ, we actually recover the closely related pitch percentage σ, which measures the percentage of the image below the horizon line.
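To make Eq. (3.1) concrete, the criterion and a brute-force search over horizon hypotheses can be sketched in a few lines. This is an illustrative sketch only: NumPy, the helper names, the image-coordinate conventions, and the small ridge term added for numerical invertibility are my assumptions, not part of the thesis.

```python
import numpy as np

def sky_mask(h, w, bank, pitch_frac):
    """Label as sky every pixel above a hypothesized horizon line.
    pitch_frac is the fraction of the image below the horizon; y grows
    downward in image coordinates, and bank is the line's tilt."""
    ys, xs = np.mgrid[0:h, 0:w]
    y_line = (1.0 - pitch_frac) * h + np.tan(bank) * (xs - w / 2.0)
    return ys < y_line

def horizon_criterion(image, mask):
    """Eq. (3.1): Mahalanobis-style distance between the RGB color
    distributions of the hypothesized sky and ground regions."""
    sky, ground = image[mask], image[~mask]
    d = sky.mean(axis=0) - ground.mean(axis=0)
    # Sum of the two class covariances; a tiny ridge keeps the matrix
    # invertible when one region is nearly uniform in color.
    S = np.cov(sky, rowvar=False) + np.cov(ground, rowvar=False)
    S = S + 1e-6 * np.eye(3)
    return float(d @ np.linalg.solve(S, d))

def detect_horizon(image, n=12):
    """Coarse exhaustive search for the (bank, pitch) maximizing J."""
    h, w, _ = image.shape
    best, best_j = (0.0, 0.5), -np.inf
    for i in range(1, n):          # bank strictly inside (-pi/2, pi/2)
        for j in range(1, n):      # pitch fraction strictly inside (0, 1)
            bank, sigma = i * np.pi / n - np.pi / 2, j / n
            m = sky_mask(h, w, bank, sigma)
            if m.sum() < 2 or (~m).sum() < 2:
                continue           # need pixels in both regions
            jval = horizon_criterion(image, m)
            if jval > best_j:
                best, best_j = (bank, sigma), jval
    return best
```

With well-separated sky and ground colors, the correct partition drives both covariances toward zero and J toward its maximum, which is exactly the behavior the text describes for the true horizon.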


Figure 3.1: Horizon tracking: (a) original image; (b) optimization criterion J as a function of bank angle and pitch percentage; (c) resulting classification of sky and ground pixels in RGB space.

Given J, horizon detection proceeds as follows for a video frame at X_H × Y_H resolution:

1. Down-sample the image to X_L × Y_L, where X_L ≪ X_H, Y_L ≪ Y_H.
2. Evaluate J on the down-sampled image for line parameters (φ_i, σ_j), where (φ_i, σ_j) = (iπ/n − π/2, 100j/n), 0 ≤ i ≤ n, 0 ≤ j ≤ n.
3. Select (φ*, σ*) such that J(φ*, σ*) ≥ J(φ_i, σ_j) for all i, j.
4. Perform a bisection search on the high-resolution image to fine-tune the values of (φ*, σ*).

For the experiments reported in this thesis, we use the following parameters: X_H × Y_H = 320 × 240, X_L × Y_L = 20 × 15, and n = 60. Also, the precise value of the pitch percentage σ that results in level flight (i.e., no change in altitude) is dependent on the trim settings for a particular aircraft. For our experiments, we assume a


perfectly aligned forward-looking camera (see Figure 2.2), such that a σ value of 0.5 corresponds to level flight.

3.2 Object Tracking

Object tracking is a well-studied problem in computer vision [22,23]; our intent here is to use object tracking to allow a user to easily control the flight vehicle's heading (instead of, for example, GPS).² We specifically do not perform autonomous target recognition, since we want to be able to dynamically change what ground region the MAV tracks. As such, a user can select which ground region (i.e., object) to track by clicking on the live video with a mouse. This action selects an M × M region to track, centered at the (x, y) coordinates of the mouse click. For the experiments reported in Chapter 5, we set M = 15 for video resolutions of X_H × Y_H.

We employ template matching in RGB color space for our object tracking over successive video frames. Our criterion is the sum of squared differences (SSD), a widely used correlation technique in stereo vision, structure from motion, and egomotion estimation. Our approach differs from some of that work in that we compute the SSD for RGB instead of intensity, since tracking results are much better with full color information than with intensity alone. To deal with varying image intensities as environmental factors (e.g., clouds) or the MAV's attitude with respect to the sun change, we also update the M × M template to be the matched

² The object tracking algorithm described in this section was developed by Ashish Jain at the Machine Intelligence Lab during the Spring semester of 2004.


region for the current frame prior to searching for a new match in subsequent video frames. Furthermore, since ground objects move relatively slowly in the image plane from one frame to the next, due to the MAV's altitude above the ground, we constrain the search region for subsequent frames to be in an N × N neighborhood (N = 25 ≪ X_H, Y_H) centered around the current ground-object location (x, y), as illustrated in Figure 3.2. This reduces the computational complexity from O(M²X_H Y_H) to O(M²N²), and allows us to perform both horizon tracking for stabilization and object tracking for heading control in real time (30 frames/sec). In fact, with the PowerPC G4 AltiVec unit, we are able to dramatically reduce CPU loads to as little as 35% with both vision-processing algorithms running simultaneously.

Below, we briefly summarize the object-tracking algorithm:

1. The user selects the image location (x, y) to be tracked for frame t.
2. The template T is set to correspond to the M × M square centered at (x, y) for frame t.
3. The search region R for frame t+1 is set to the N × N square centered at (x, y).
4. The location (x, y) of the object for frame t+1 is computed as the minimum SSD between T and the image frame within search region R.
5. Go to step 2.

3.3 Controller

A controller is necessary to generate actuator movements based on feedback to perform the mission at hand. Here, we describe the controller architecture that


Figure 3.2: In object tracking, the search region for the next frame is a function of the object location in the current frame.

takes the information extracted from horizon and object tracking and converts it to control surface commands to direct the flight path of the aircraft. This control architecture is shown in Figure 3.3.

There are two possible inputs to the system from a ground-station user: (1) a human-directed input that commands a desired bank angle and pitch percentage and (2) the desired location x_des of the ground object to be tracked. In the absence of object tracking, the human-directed input serves as the primary heading control; with object tracking, the human-directed input is typically not engaged, such that the trim settings of zero bank angle and 0.5 pitch percentage are active. The two outputs of the controller are the commands to the differential elevator surfaces, controlled by two independent servos.

The bank angle and pitch percentage are treated as independent from one another, and for both parameters we implement a PD (proportional-derivative) controller. The gains K_p and K_d were determined experimentally in virtual environment trials. Because of the differential elevator configuration, the control signals to the two elevator surfaces will obviously be coupled. For tracking, a P (proportional) controller is used. When engaged (on activation of object tracking), the controller adjusts the bank angle proportional to the distance between the center of the


Figure 3.3: Controller for vision-based stabilization and object tracking.

tracked target and the center of the current field-of-view. As before, the gain (K_p) is also determined experimentally in the virtual environment.

Thus, there are two possible modes of supervised control: (1) direct heading control through a human-directed input or (2) indirect heading control through object tracking. The first case allows users who are not experienced in flying RC aircraft to stably command the trajectory of the flight vehicle. This is especially critical for MAVs, because it is substantially more difficult to learn direct RC control of MAVs than of larger, more stable RC model airplanes. In the second case, commanding trajectories for the MAV is even simpler and reduces to point-and-click targeting on the flight video ground display. Either way, the controller will not permit "unsafe" flight trajectories that could potentially lead to a crash.
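The control laws above are simple enough to sketch directly. The following Python is our own illustration, not code from the testbed; the gain values and the 45° safety limit are placeholder assumptions. It shows the PD law applied independently to bank angle and pitch percentage, the P law that banks toward the tracked target, and a clamp standing in for the controller's refusal of unsafe commands.

```python
def pd_command(error, prev_error, kp, kd, dt=1.0 / 30):
    """PD control law, applied independently to bank angle and pitch
    percentage (dt defaults to one 30 frames/sec video period)."""
    return kp * error + kd * (error - prev_error) / dt

def tracking_bank_command(target_x, image_width, kp_track):
    """P control for heading: bank proportional to the horizontal offset
    of the tracked target from the center of the field of view."""
    return kp_track * (target_x - image_width / 2.0)

def safe_bank_command(raw_bank, limit=45.0):
    """Clamp the commanded bank angle so the controller cannot request an
    attitude that could lead to a crash (the limit here is illustrative)."""
    return max(-limit, min(limit, raw_bank))
```

In the actual architecture the resulting bank and pitch commands are then mixed into the two coupled differential-elevator signals.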


CHAPTER 4
TESTBED IMPLEMENTATION

The paramount goal of this research is to develop vision-based autonomy for MAVs. Development of such autonomy presents significant challenges, in no small measure, because of the inherent instability of these flight vehicles. In this section we present the details of a flight testbed system that seeks to mitigate these challenges by facilitating the rapid development of new vision-based control algorithms in two ways: (1) through the use of a virtual environment simulation and (2) through custom-designed flight hardware that is unified between the virtual and real flight-test configurations.

The proposed testbed system provides a complete architecture, built from custom-designed hardware and software, for developing autonomous behaviors for MAVs using a camera as the primary sensor. Thus, the presented testbed effectively bridges the gap between designing vision-based algorithms for MAVs and deploying them in the real world. The virtual environment allows the system to be tailored to a number of different mission profiles through its ability to perform flight tests in a multitude of virtual locations. Once the algorithm has been tuned in the virtual environment, the unified hardware architecture is interchangeable from that environment to a real-world deployment situation. That is, the ground station, which performs the computation and control, is


completely interchangeable, so that code, controllers, and hardware developed in one environment are immediately transferable to the other.

4.1 Architecture of the System

Instead of developing a control algorithm and going directly to flight testing, we will now develop that algorithm under the framework of the presented testbed, which includes utilizing a virtual environment simulation. Once the virtual environment testing has been completed and the algorithm has been verified in a wide range of environmental conditions, we can then deploy that technology to a real flight test with little risk to the aircraft. Experiments are shown in Chapter 5 where vision-based algorithms are prototyped using the virtual environment and flown unmodified in a real test flight.

The complete testbed system is shown in Figure 4.1. The testbed architecture is divided into three categories: (1) the on-board components (carried in the airframe), (2) a virtual environment simulation (for laboratory verification and testing), and (3) the off-board components, located on the ground (the interface to the flight vehicle). The on-board components include a camera and a microprocessor-controlled multi-rate sensor board that includes inertial sensors, a GPS, an altimeter, and an airspeed sensor. Also, a transceiver is placed on-board for interaction with the off-board components. The ground station components, consisting of a laptop, a transceiver, and an optional human-operable remote control, supply machine-vision and control-processing capabilities not possible on-board the aircraft. A virtual environment simulator was constructed using flight-trainer software and a projection screen. The ground station was interfaced to that environment, and the aircraft, with its forward-mounted camera, was positioned to observe the


Figure 4.1: Testbed system overview.

visual output of the simulator. Altogether, this complete flight testbed system allows flight in either a real or a simulated environment. In the following sections we will discuss each of the categories of the testbed architecture in detail.

4.2 Virtual Environment Simulation

The virtual environment simulation component serves as a precursor to the full HILS facility being constructed at the University of Florida. This facility will include a wind tunnel and a photo-realistic world. Our current virtual testbed offers only a subset of the features that the full facility would offer, focusing mainly on the visualization aspect of the simulation and the position of the aircraft. The features that the current system offers are enough to perform precursory experiments and are discussed below. The virtual environment simulator is based


Figure 4.2: Some sample virtual scenes: (a) field, trees and mountains, (b) simple urban, (c) urban with features, and (d) complex urban.

on an off-the-shelf remote-control airplane simulation package. The advantages of this software are: (1) it contains a diverse set of scenery as well as vehicle models, including a realistic physics engine; (2) additional scenery and vehicle models can be defined externally; (3) it supports full collision detection and simulation of partial vehicle damage (e.g., loss of a wing); and, finally, (4) environmental factors, such as wind or radio noise, can also be incorporated. Figure 4.2 illustrates a few examples of the type of scenery supported by the software package; note that the types of scenery available are significantly more diverse than what is easily accessible for real test flights of our MAVs.

The only additional hardware required for the virtual testbed (as opposed to the real flight vehicle) is a small interface board that converts control outputs from the ground station into simulator-specific syntax. As such, the ground station does not distinguish between virtual and real-flight experiments, since its inputs and outputs remain the same in both environments.

Following the development of the virtual testbed, virtual flight experiments proceed as follows. First, the flight simulator displays a high-resolution image which reflects the current field-of-view of the simulated aircraft at a particular position, orientation, and altitude. Then, a video camera, which is identical to


the one mounted on the actual MAV, is fixed in front of the display to record that image. The resulting signal from this video camera is then processed on the ground-station laptop. Next, the extracted information from the vision algorithms being tested is passed to the controller, which generates control commands to maintain flight-vehicle stability and user-desired heading (depending, for example, on ground-object tracking). These control commands are digitized and fed into the flight simulator. Finally, the simulator updates the current position, orientation, and altitude of the aircraft, and a new image is displayed for image capture and processing.

Note that this system allows us to experiment with vision algorithms in a stable laboratory environment prior to actual flight testing. This means that we can not only develop and debug algorithms without risking loss of the flight vehicle, but we can also experiment with complex 3D environments well before risking collisions of MAVs with real buildings in field testing. While the scenes in our current prototype system are not as photo-realistic as desirable, even with this limitation, we were able to develop significant vision-based autonomous capabilities in real flight tests without a single crash (Chapter 5). Moreover, our larger-scale HILS facility will have substantially more computing power for rendering photo-realistic views of complex natural and urban settings.

4.3 Testbed Hardware

The on-board components of the testbed (top right in Figure 4.1) include a camera and a microprocessor-controlled multi-rate sensor board. The camera, a color CMOS array, is mounted in the nose of the airframe, along the centerline. A 2.4 GHz transmitter is used to broadcast the video stream to the ground


station. The sensor board, still under development during our experiments, includes inertial sensors, a GPS, an altimeter, and an airspeed sensor. It also contains a transceiver for interaction with the off-board components. The details of the sensor board are beyond the scope of this thesis, as we have developed it fully in other works [24, 25].

The ground station (bottom center in Figure 4.1) consists of: (1) a 2.4 GHz video-patch antenna (not pictured), (2) a video-capture device from The Imaging Source (formerly a Sony Video Walkman) for NTSC-to-FireWire video conversion, (3) a 12" G4 laptop (1 GB/1 GHz), (4) a custom-designed Futaba-compatible signal generator for converting computer-generated control commands to PWM Futaba-readable signals, and (5) a standard Futaba RC controller. Video is input to the computer in uncompressed YUV format, then converted to RGB for subsequent processing. The Futaba transmitter, the traditional remote-control mechanism for piloting RC aircraft, is interfaced to the laptop computer through a Keyspan serial-to-USB adapter and has a pass-through trainer switch that allows commands from another transmitter to be selectively relayed to the aircraft. Our custom-designed Futaba-compatible signal generator lets the laptop emulate that other transmitter and, therefore, allows for instantaneous switching between computer control and human-piloted remote control of the flight vehicle during testing.
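Since the capture device delivers uncompressed YUV that is converted to RGB before processing, a per-pixel conversion illustrates the step. This is a generic full-range BT.601 conversion of our own choosing as a sketch; the thesis does not specify which coefficients its pipeline used.

```python
def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YUV sample (0-255 per channel) to RGB."""
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda x: max(0, min(255, int(round(x))))  # keep channels in 0-255
    return clamp(r), clamp(g), clamp(b)
```

In practice the whole frame would be converted at once (e.g., with a vectorized matrix multiply) rather than pixel by pixel.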


CHAPTER 5
EXPERIMENTAL RESULTS

5.1 Flight Testing Procedures

In this section we describe several experiments conducted using the proposed testbed system. First, we contrast direct RC control with horizon-stabilized human-directed control and illustrate object tracking on some sample image sequences. Then, we apply the object-tracking framework to develop autonomous landing capabilities, first in the virtual environment simulator and then in field testing. The principal difference in testing procedures between the virtual and real-flight configurations occurs at take-off. In the virtual environment, the aircraft takes off from a simulated runway, while in field testing, our MAVs are hand-launched. After take-off, however, testing is essentially the same for both environments. Initially, the aircraft is under direct RC control from a human pilot until a safe altitude is reached. Once the desired altitude has been attained, the controller is enabled. Throughout our test flights, both virtual and real, throttle control is typically set to a constant level of 80%.

5.2 Simple Stabilization Experiment

Here we illustrate simple horizon-based stabilization and contrast it to direct RC control in the virtual testbed; similar experiments have previously been carried out in field testing [8, 9]. Figure 5.1 illustrates some simple rolling and pitch trials for: (a) direct RC-piloted and (b) horizon-stabilized (human-directed) flight


Figure 5.1: Stabilization results: (a) direct RC-piloted flight, and (b) horizon-stabilized (human-directed) flight. Maneuvers for flight trajectory (b) were executed to mimic flight trajectory (a) as closely as possible.

trajectories. As can be observed from Figure 5.1, horizon-stabilized control tends to do a better job of maintaining steady roll and pitch than direct RC flight; this phenomenon has previously been observed in field testing. Not only does horizon stabilization lead to smoother flights, but no special training is required to command the flight vehicle when horizon stabilization is engaged.

5.3 Object Tracking

Here we report results on ground-object tracking for some sample flight sequences from both virtual and real-flight videos. Figure 5.2 shows sample frames that illustrate typical tracking results for: (a) a virtual sequence and (b) a real-flight sequence; complete videos are available at http://mil.ufl.edu/~number9/mav_visualization.

Once we had determined that tracking was sufficiently robust for both virtual and real-flight videos, we proceeded to engage the tracking controller in the virtual testbed and verified that the aircraft was correctly turning toward the user-selected targets. This led us to formulate autonomous landing as a ground object-tracking


Figure 5.2: Object tracking: (a) virtual testbed, and (b) real-flight image sequence.

problem, where the "object" to be tracked is the landing zone. We first developed and verified autonomous landing in the virtual environment simulation and then, without any modifications of the developed code, successfully executed several autonomous landings in real-flight field testing. We describe our experiments in autonomous landing in further detail in the next section.

5.4 Autonomous Landing: Virtual Environment

An aircraft without a power source is basically a glider, as long as roll and pitch stability are maintained. It will land somewhere, but, without any heading control, yaw drift can make the landing location very unpredictable. However, using our object-tracking technique, we are able to exercise heading control and execute a predictable landing. Landing at a specified location requires knowledge of the glide slope (i.e., the altitude and distance to the landing location). Since we currently do not have access to this data in our virtual environment simulation, we assume that we can visually approximate these values. Although somewhat crude, this method works well in practice and is replicable.

We proceed as follows. First, the horizon-stabilized aircraft is oriented so that the runway (or landing site) is within the field of view. The user then selects a


Figure 5.3: Autonomous landing in a virtual environment: four sample frames.

location on the runway to be tracked, and the throttle is disengaged. Once tracking is activated, the plane glides downward, adjusting its heading while maintaining level flight. In our virtual environment, mountains are visible, introducing some error in horizon estimates at low altitudes. As the plane nears ground level during its descent, these errors become increasingly pronounced, causing slight roll and pitch anomalies to occur. Nevertheless, the aircraft continues to glide forward, successfully landing on the runway in repeated trials. Sample frames from one autonomous landing are shown in Figure 5.3, while the roll, pitch and tracking command for that landing are plotted in Figure 5.4. (As before, complete videos are available at http://mil.ufl.edu/~number9/mav_visualization.)

Figure 5.4: Roll, pitch and tracking command for virtual autonomous landing in Figure 5.3.
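The landing procedure above reduces to a short fixed sequence. The sketch below is our own paraphrase with hypothetical callback names, not code from the testbed:

```python
def autonomous_landing(orient_to_site, select_target, set_throttle, enable_tracking):
    """Landing as described above: aim the horizon-stabilized aircraft at the
    site, pick a point to track, cut power, and let the tracking controller
    steer the unpowered glide toward the target."""
    orient_to_site()       # horizon-stabilized aircraft faces the runway
    select_target()        # user clicks the landing location in the video
    set_throttle(0.0)      # throttle disengaged: the MAV becomes a glider
    enable_tracking()      # heading control now follows the tracked point
```

From this point on, the ordinary stabilization and tracking loops do all the work; no landing-specific control law is needed.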


Figure 5.5: Real-flight autonomous landing in field testing: four sample frames.

5.5 Autonomous Landing: Real-flight Experiments

In real-flight testing of autonomous landing, we did not have access to the same ground feature (i.e., a runway) as in the virtual environment. Our MAVs do not have landing gear and do not typically land on a runway. Instead, they are typically landed in large grass fields. As such, we sought to first identify ground features in our test field that would be robustly trackable. We settled on a gated area near a fence where the ground consisted mostly of sandy dirt, which provided a good contrast to the surrounding field and good features for tracking.

During the flight testing, the horizon-stabilized MAV is oriented such that the sandy area is within the field of view. The user then selects a location at the edge of the sandy area to be tracked, and the throttle is disengaged. As in the virtual environment, the MAV glides downward toward the target, adjusting its heading to keep the target in the center of the image while maintaining level flight. When the aircraft approaches ground level, the target being tracked may fall out of view. However, if the target is lost at this point, the plane will still land successfully. This occurs because the maximum allowable turn command


Figure 5.6: Roll, pitch and tracking command for real-flight autonomous landing in Figure 5.5.

generated by the object-tracking controller, at that speed, will not cause the plane to roll significantly. Once on the ground, the MAV skids to a halt on its smooth underbelly. In several repeated trials, we landed the MAV within 10 meters of the target location. Sample frames from one of those autonomous landings are shown in Figure 5.5 (along with ground views of the MAV during landing). Figure 5.6 depicts the roll, pitch, and tracking commands for that landing. (As before, complete videos are available at http://mil.ufl.edu/~number9/mav_visualization.)


CHAPTER 6
CONCLUSION

Flight testing of MAVs is difficult in general because of the inherent instability of these flight vehicles, and even more so when implementing complex vision-based behaviors. Over the years, many planes have been destroyed in crashes due to relatively simple errors in coding or algorithmic weaknesses. The testbed system described in this thesis was developed, in large measure, to deal with these problems and to investigate potential uses of the full-scale UF HILS facility currently under construction. It is virtually inconceivable that we could have developed object tracking and autonomous landing without any crashes in the absence of the virtual testbed. In the coming months, we plan to extend the use of the virtual testbed facility to more complex vision problems, such as, for example, 3D scene estimation within complex urban environments, a problem which we are now actively investigating.


REFERENCES

[1] GlobalSecurity.org, "RQ-4A Global Hawk (Tier II+ HAE UAV)," World Wide Web, http://www.globalsecurity.org/intell/systems/global_hawk.htm, March 2004.

[2] P. G. Ifju, S. Ettinger, D. A. Jenkins, Y. Lian, W. Shyy, and M. R. Waszak, "Flexible-wing-based Micro Air Vehicles," in Proc. 40th AIAA Aerospace Sciences Meeting, Reno, Nevada, January 2002, paper no. 2002-0705.

[3] P. G. Ifju, S. Ettinger, D. A. Jenkins, and L. Martinez, "Composite materials for Micro Air Vehicles," in Presentation at SAMPE Conference, Long Beach, California, May 2001.

[4] J. M. McMichael and Col. M. S. Francis, "Micro Air Vehicles: Toward a new dimension in flight," World Wide Web, http://www.darpa.mil/tto/mav/mav_auvsi.html, December 1997.

[5] J. W. Grzywna, J. Plew, M. C. Nechyba, and P. Ifju, "Enabling autonomous flight," in Proc. Florida Conference on Recent Advances in Robotics, Miami, Florida, April 2003, vol. 16, sec. TA3, pp. 1-3.

[6] J. M. Grasmeyer and M. T. Keennon, "Development of the Black Widow Micro Air Vehicle," in Proc. 39th AIAA Aerospace Sciences Meeting, Reno, Nevada, January 2001, paper no. 2001-0127.

[7] A. Kurdila, M. C. Nechyba, R. Lind, P. Ifju, W. Dahmen, R. DeVore, and R. Sharpley, "Vision-based control of Micro Air Vehicles: Progress and problems in estimation," in Presentation at IEEE Int. Conference on Decision and Control, Nassau, Bahamas, December 2004.

[8] S. M. Ettinger, M. C. Nechyba, P. G. Ifju, and M. Waszak, "Vision-guided flight stability and control for Micro Air Vehicles," in Proc. IEEE Int. Conference on Intelligent Robots and Systems, Lausanne, October 2002, vol. 3, pp. 2134-40.


[9] S. Ettinger, M. C. Nechyba, P. G. Ifju, and M. Waszak, "Vision-guided flight stability and control for Micro Air Vehicles," in Journal of Advanced Robotics, 2003, vol. 17, no. 3, pp. 617-40.

[10] C. S. Sharp, O. Shakernia, and S. Sastry, "A vision system for landing an Unmanned Aerial Vehicle," in Proc. IEEE Int'l Conf. on Robotics and Automation, Seoul, Korea, May 2001, pp. 1720-27.

[11] B. Sinopoli, M. Micheli, G. Donato, and T. J. Koo, "Vision based navigation for an unmanned aerial vehicle," in Proc. IEEE Int'l Conf. on Robotics and Automation, Seoul, Korea, May 2001, pp. 1757-65.

[12] T. J. Mueller, "The influence of laminar separation and transition on low Reynolds number airfoil hysteresis," in Journal of Aircraft, 1985, vol. 22, pp. 763-70.

[13] D. A. Jenkins, P. G. Ifju, M. Abdulrahim, and S. Olipra, "Assessment of controllability of Micro Air Vehicles," in Presentation at 16th Intl. Conf. Unmanned Air Vehicle Systems, Bristol, United Kingdom, April 2001.

[14] W. Shyy, D. A. Jenkins, and R. W. Smith, "Study of adaptive shape airfoils at low Reynolds number in oscillatory flows," in AIAA Journal, 1997, vol. 35, pp. 1545-48.

[15] R. W. Smith and W. Shyy, "Computation of aerodynamic coefficients for a flexible membrane airfoil in turbulent flow: A comparison with classical theory," in Phys. Fluids, 1996, vol. 8, no. 12, pp. 3346-53.

[16] P. R. Ehrlich, D. S. Dobkin, and D. Wheye, "Adaptions for flight," World Wide Web, http://www.stanfordalumni.org/birdsite/text/essays/Adaptions.html, June 2001.

[17] B. Lucas and T. Kanade, "An iterative image registration technique with an application to stereo vision," in Proc. 7th Int'l Joint Conf. on Artificial Intelligence, 1981, pp. 674-79.

[18] T. Kanade, "Recovery of the three-dimensional shape of an object from a single view," in Journal of Artificial Intelligence, 1981, vol. 17, pp. 409-60.

[19] L. Armesto, S. Chroust, M. Vincze, and J. Tornero, "Multi-rate fusion with vision and inertial sensors," in Proc. IEEE Int'l Conf. on Robotics and Automation, April 2004, vol. 1, pp. 193-99.


[20] R. Meier, T. Fong, C. Thorpe, and C. Baur, "Sensor fusion based user interface for vehicle teleoperation," in Presentation at Int'l Conf. on Field and Service Robotics, August 1999.

[21] S. M. Ettinger, "Design and implementation of autonomous vision-guided Micro Air Vehicles," M.S. thesis, University of Florida, May 2001.

[22] J. Shi and C. Tomasi, "Good features to track," in Proc. IEEE Int'l Conf. on Computer Vision and Pattern Recognition, Seattle, Washington, June 1994, pp. 593-600.

[23] L. G. Brown, "A survey of image registration techniques," in ACM Computing Surveys, December 1992, vol. 24, no. 4, pp. 325-76.

[24] J. Plew, J. W. Grzywna, M. C. Nechyba, and P. Ifju, "Recent progress in the development of on-board electronics for Micro Air Vehicles," in Proc. Florida Conference on Recent Advances in Robotics, Orlando, Florida, April 2004, vol. 17, sec. FP3, pp. 1-6.

[25] J. Plew, "Development of a flight avionics system for autonomous MAV control," M.S. thesis, University of Florida, December 2004.


BIOGRAPHICAL SKETCH

In 2002, Jason W. Grzywna graduated from the University of Florida with dual Bachelor of Science degrees in Electrical and Computer Engineering. Continuing his education, Jason was admitted to the master's degree program at the University of Florida in the summer of 2002. His main fields of interest throughout his graduate studies were developing intelligent systems for autonomous vehicles and robotics. During his time at the University of Florida, Jason was part of many other MAV research projects, including immediate bomb damage assessment, the small folding-wing Pocket MAV with inertial and vision stabilization, and MAV deployment from a Pointer.


Permanent Link: http://ufdc.ufl.edu/UFE0008441/00001

Material Information

Title: A Flight Testbed with Virtual Environment Capabilities for Developing Autonomous Micro Air Vehicles
Physical Description: Mixed Material
Copyright Date: 2008

Record Information

Source Institution: University of Florida
Holding Location: University of Florida
Rights Management: All rights reserved by the source institution and holding location.
System ID: UFE0008441:00001














A FLIGHT TESTBED WITH VIRTUAL ENVIRONMENT CAPABILITIES FOR
DEVELOPING AUTONOMOUS MICRO AIR VEHICLES
















By

JASON WESLEY GRZYWNA


A THESIS PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
MASTER OF SCIENCE

UNIVERSITY OF FLORIDA


2004















ACKNOWLEDGMENTS


I would like to thank Dr. Michael Nechyba for his guidance and support of my

research for this thesis. As my advisor, Dr. Nechyba has motivated me through his

leadership and his ability to cultivate a synergistic work environment. I would also

like to thank Dr. A. Antonio Arroyo for his passion for education and his belief in

me. He invited me into his lab and pushed me to reach the next level. Dr. Eric

Schwartz taught the classes that led me into robotics. He is a great friend and an

honest mentor. I would also like to thank Dr. Peter Ifju and his students. They

build the platforms which enable the work that I do.

Special thanks go to Jason Plew, an invaluable research partner, Ashish Jain,

a friend wise beyond his years, Uriel Rodriguez, a friend who always had a way,

Sinisa Todorovic, a mentor who provided invaluable insight, Shalom Darmanjian,

a friend who always made me laugh, and Mujahid Abdulrahim, a friend with great

ideas.

Finally, I would like to thank my family for their unending support of my work

and belief that I would always succeed. I especially want to thank Jennifer, the girl

who owns my heart. She is my best friend and my inspiration.















TABLE OF CONTENTS


ACKNOWLEDGMENTS

LIST OF FIGURES

ABSTRACT

1 INTRODUCTION

    1.1 Micro Air Vehicles
        1.1.1 Challenges in Developing Vision-based Autonomy
        1.1.2 Utilizing a Virtual Environment
    1.2 Overview of the Proposed MAV Testbed
    1.3 Overview of the Thesis

2 MICRO AIR VEHICLE PLATFORM

    2.1 Advantages and Limitations
    2.2 Construction Techniques
    2.3 Propulsion System Design
    2.4 Integrating Vision

3 VISION-BASED CONTROL

    3.1 Flight Stability
    3.2 Object Tracking
    3.3 Controller

4 TESTBED IMPLEMENTATION

    4.1 Architecture of the System
    4.2 Virtual Environment Simulation
    4.3 Testbed Hardware

5 EXPERIMENTAL RESULTS

    5.1 Flight Testing Procedures
    5.2 Simple Stabilization Experiment
    5.3 Object Tracking
    5.4 Autonomous Landing: Virtual Environment
    5.5 Autonomous Landing: Real-flight Experiments

6 CONCLUSION

REFERENCES

BIOGRAPHICAL SKETCH















LIST OF FIGURES


Figure page

1.1 UF HILS facility currently under construction: concept diagram. 4

1.2 Try-by-flying approach: Feedback from the flight test. . . 6

1.3 Testbed architecture overview. . . . . .... 7

2.1 Adaptive washout in action. . . . . . 10

2.2 MAV platform . . . . . . . . . 12

3.1 Horizon tracking: (a) original image; (b) optimization criterion J as
    a function of bank angle and pitch percentage; (c) resulting
    classification of sky and ground pixels in RGB space. . . 15

3.2 In object tracking, the search region for the next frame is a function
of the object location in the current frame. .... . . 18

3.3 Controller for vision-based stabilization and object tracking. . 19

4.1 Testbed system overview . . . . . . 22

4.2 Some sample virtual scenes: (a) field, trees and mountains, (b) sim-
ple urban, (c) urban with features, and (d) complex urban. . 23

5.1 Stabilization results: (a) Direct RC-piloted flight, and (b) horizon-
stabilized (human-directed) flight. Maneuvers for flight trajectory
(b) were executed to mimic flight trajectory (a) as closely as possible. 27

5.2 Object tracking: (a) virtual testbed, and (b) real flight image sequence. 28

5.3 Autonomous landing in a virtual environment: four sample frames.. 29

5.4 Roll, pitch and tracking command for virtual autonomous landing in
Figure 5.3 . . . . . . . . 29

5.5 Real-flight autonomous landing in field testing: four sample frames. 30










5.6 Roll, pitch and tracking command for real-flight autonomous landing
in Figure 5.5 . . . . . . . 31















Abstract of Thesis Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Master of Science

A FLIGHT TESTBED WITH VIRTUAL ENVIRONMENT CAPABILITIES FOR
DEVELOPING AUTONOMOUS MICRO AIR VEHICLES

By

Jason Wesley Grzywna

December 2004

Chair: A. Antonio Arroyo
Major Department: Electrical and Computer Engineering

We seek to develop vision-based autonomy for small-scale aircraft known as

Micro Air Vehicles (MAVs). Development of such autonomy presents significant

challenges, in no small measure because of the inherent instability of these flight

vehicles and the try-by-flying practices in use today. Therefore, in this thesis,

we propose a flight testbed system that seeks to mitigate these challenges by

facilitating the rapid development of new vision-based control algorithms that

would have been, in the testbed's absence, substantially more difficult to transition

to successful flight testing. The proposed testbed system provides a complete

architecture, built from custom-designed hardware and software, for developing

autonomous behaviors for MAVs using a camera as the primary sensor. This

system bridges the gap between theory and flight testing through the integration

of a new virtual testing environment. This virtual environment allows the system

to be tailored to a number of different mission profiles through its ability to










perform test flights in a multitude of virtual locations. The virtual environment

presented in this thesis is a precursor to a more complex Hardware-in-the-Loop

Simulation (HILS) facility currently being constructed at the University of Florida.

HILS systems allow us to experiment with vision-based algorithms in controlled

laboratory settings, thereby minimizing loss-of-vehicle risks associated with actual

flight testing. Along with a virtual testing environment, the proposed system

optionally allows a human in the control loop. In this thesis, we first discuss the

background work done with MAVs and give an overview of the testbed system

architecture. Second, we present our vision-based approaches to MAV stabilization,

object tracking, and autonomous landing. Third, we present details of the proposed

system and show how the work done mitigates the problems and challenges of

implementing vision-based flight controllers. Finally, we report experimental

flight results and discuss how the presented system facilitates the development of

autonomous flight MAVs.















CHAPTER 1
INTRODUCTION


Over the past several years, Unmanned Air Vehicles (UAVs) have begun to

take on missions that had previously been reserved exclusively for manned aircraft,

as evidenced in part by the much publicized deployment of the Global Hawk and

Predator UAVs in the recent Afghan and Iraqi conflicts. While these vehicles

demonstrate remarkable advances in UAV technology, their deployment is largely

limited to high-altitude surveillance and munitions deployment, due to their size

and limited autonomous capabilities. Moreover, while such UAV missions can

prevent unnecessary loss of human life, at costs of $70 million and $4.5 million for

the Global Hawk and Predator, respectively [1], these UAVs cannot be considered

expendable.

1.1 Micro Air Vehicles

Interest has grown in a different class of small-scale UAVs, known as Micro

Air Vehicles (MAVs), that overcome the limitations of larger and more expensive

UAVs. At the University of Florida, our on-going research efforts have led to the

development of a large number of MAV platforms, ranging in maximum dimension

from 5 to 24 inches [2, 3].1 Given their small size, weight, and cost (approximately



1 Recent development of bendable wings allows even larger MAVs to fit inside
containers with diameters as small as 4 inches.










$1,000/vehicle), MAVs allow for missions that are not possible for larger UAVs.

For example, such small-scale aircraft could safely be deployed at low altitudes in

complex urban environments [4], and could be carried and deployed by individual

soldiers for remote surveillance and reconnaissance of potentially hostile areas in

their path.

While MAVs present great possibilities, they also present great challenges

beyond those of larger UAVs. First, even basic flight stability and control present

unique challenges. The low moments of inertia of MAVs make them vulnerable

to rapid angular accelerations, a problem further complicated by the fact that

aerodynamic damping of angular rates decreases with a reduction in wingspan.

Another potential source of instability for MAVs is the relative magnitudes of wind

gusts, which are much higher at the MAV scale than for larger aircraft. In fact,

wind gusts can typically be equal to or greater than the forward airspeed of the

MAV itself. Thus, an average wind gust can immediately effect a dramatic change

in the vehicle's flight path.

Second, MAVs, due to severe weight restrictions, cannot necessarily make use

of the same sensor suite as larger UAVs. While some MAVs recently developed

have seen the incorporation of miniature on-board INS and GPS [5, 6], such sensors

may not be the best allocation of payload capacity. For many potential MAV

missions, vision is the only practical sensor that can achieve required and/or

desirable autonomous behaviors, as is the case, for example, for flight in urban

environments below roof-top altitudes [7]. Furthermore, given that surveillance

has been identified as one of their primary missions, MAVs must necessarily be

equipped with on-board imaging sensors, such as cameras or infrared arrays. Thus,










computer-vision techniques can exploit already present sensors, rich in information

content, to significantly extend the capabilities of MAVs, without increasing their

required payload.

When additional sensors are present that do not compromise weight and size

constraints, more state information can be derived from the system and fused with

the data extracted with computer vision techniques for an overall more robust

system. In this thesis we do not rule out the use of additional sensors; we simply treat

vision as the primary, and the only necessary, sensor for autonomous flight.

1.1.1 Challenges in Developing Vision-based Autonomy

In this thesis, we seek to build on our previous success in vision-based flight

stability and control [8, 9], on the MAV scale, to achieve more complex vision-

based autonomous behaviors, such as urban environment survival. Development

of such behaviors does, however, present some difficult challenges. First, dedicated

flight test locations typically do not exhibit the type of scene diversity likely to

be encountered in deployment scenarios. 2 Second, closed-loop, vision-based

approaches must operate within a tight computational budget for real-time

performance, and require extensive flight testing for robust performance in many

different scenarios. Because of the complexity involved, simple errors in software

development can often lead to critical failures that result in crashes and loss of

the MAV airframe and payload. This in turn introduces substantial delay in the



2 Our typical flight test location would be a featureless open field. This is a
sharp contrast to a deployment scene consisting of structures and other vertical
obstacles.









development cycle for intelligent, autonomous MAVs. It is also apparent that

having a human-control capability in the loop would be advantageous for

mitigating scenarios in which the airframe is in peril.

1.1.2 Utilizing a Virtual Environment

To address the challenges discussed above, we are currently constructing a

Hardware-In-the-Loop Simulation (HILS) facility, expected to be completed by

the spring of 2005, that will enable testing and debugging of complex vision-based

behaviors without risking destruction of the MAV flight vehicles. As conceived

and depicted in Figure 1.1, the HILS facility will simulate the flight of a single

MAV through diverse photo-realistic virtual worlds (e.g., urban environments), by

measuring and modeling aerodynamic flight characteristics in a wind tunnel in real

time. The virtual display will render the correct perspective of the virtual world as

the MAV's trajectory is computed from its dynamic model.


Figure 1.1: UF HILS facility currently under construction: concept diagram.








1.2 Overview of the Proposed MAV Testbed

In this thesis, we present a flight testbed system that allows for rapid develop-

ment of vision-based autonomous MAVs. The proposed testbed system provides a

complete architecture, built from custom-designed hardware and software, for devel-

oping autonomous behaviors for MAVs using a camera as the primary sensor. This

system bridges the gap between theory and flight testing through the integration of

a new virtual testing environment.

The virtual environment simulation component serves as a precursor to the

HILS facility being constructed. This simulated environment provides (1) a diverse

scenery set as well as vehicle models, including a realistic physics engine; (2) the

ability to define additional scenery and models externally; (3) full support for

collision detection and simulation of partial vehicle damage; and (4) environmental

factors such as wind or radio noise. These features are enough to perform precur-

sory experiments in a virtual environment but are only a subset of what the full

facility would offer.

Employing vision-based stability and navigation algorithms for UAV control is

an emerging science. There are systems that exist that utilize vision on larger UAV

platforms [10, 11], but none that allow for the safe and rapid development of vision-

based control on the scale of a MAV. Traditional MAV development approaches

involve a try-by-flying approach, shown in Figure 1.2, since the aircraft are small

and easy to repair in most cases. Try-by-flying works for simple tasks (e.g., PID

loop tuning), but a more sophisticated system is needed for tuning complicated

vision algorithms. Larger aircraft (e.g., F-16s) use rigorous hardware-in-the-loop

and wind tunnel facilities for complete system verification before the aircraft leaves










the ground. We do not have the time to rigorously test our algorithms in a similar

manner. Therefore, we need to develop hardware that can be used in both a testing

situation and in a real flight. In addition, we need that system to provide at least

some level of hardware-in-the-loop verification.






Figure 1.2: Try-by-flying approach: Feedback from the flight test.


This thesis proposes such a system, shown in Figure 1.3. The testbed is

divided into the on-board components, (carried in the airframe), a virtual envi-

ronment simulation, (for laboratory verification and testing), and the off-board

components, located on the ground, (the interface to the flight vehicle). The ground

station interface to the flight vehicle does not change. That is, the ground sta-

tion is completely interchangeable between the real flight vehicle and the virtual

environment flight vehicle, so that code, controllers, and hardware developed in one

environment are immediately transferable to the other.

Instead of developing a control algorithm and going directly to flight testing,

as done in the past, we will develop that algorithm under the framework of the

presented testbed, which includes utilizing the virtual environment simulation.

Once the virtual environment testing has been completed and the algorithm has

been verified in a wide range of environmental conditions, we can then deploy that

technology to a real flight test with little risk to the aircraft.























Figure 1.3: Testbed architecture overview.


1.3 Overview of the Thesis

In the following chapters we discuss the main components of our flight testbed

system, shown in Figure 1.3.

First, in Chapter 2, we discuss the MAV platform and the integration of vision.

Next, in Chapter 3, we present our vision-based approaches to MAV stabilization,

object tracking, and autonomous landing. Then, in Chapter 4, we discuss the

testbed architecture in detail, including the virtual environment simulation and

the hardware. Next, in Chapter 5, we report experimental flight results for both

the virtual environment, as well as for flight tests in the field, and discuss how

algorithms developed in the virtual environment were seamlessly transitioned to

real flight testing. Finally, in Chapter 6, we give our conclusions.















CHAPTER 2
MICRO AIR VEHICLE PLATFORM


2.1 Advantages and Limitations

There are numerous challenges that prevent technology developed for larger

vehicles from being applied directly to MAVs [12]. This section

will discuss some of these issues. On the MAV scale, there is a severe Reynolds

number dependent degradation in aerodynamic efficiency. This degradation requires

that MAVs fly at much lower wing loading, thus placing a premium on vehicle

weight. Traditional airframe design has limited applicability to MAVs. Control is

more difficult since the small mass moment of inertia requires increased control

input bandwidth. Disturbances (e.g., wind gusts) have an exaggerated effect on

the flight path since the vehicle speed is on the same order as the disturbance.

Additionally, off-the-shelf components (e.g., servos, electronics, and video cameras)

are not specifically designed for MAVs. Finally, supplying reliable and efficient

propulsion is a serious challenge.

Given these inherent technical obstacles, a series of MAVs and small UAVs,

that incorporate a number of advances, have been produced at the University of









Florida.1 A unique, thin, undercambered, flexible wing that is more aerodynam-

ically efficient than traditional airfoils has been developed [2]. The airframes are

made from carbon fiber, durable plastic films, and latex rubber giving them high

specific strength [3].

The flexible wing, shown in Figure 2.1, exhibits advantages over traditional

rigid wings in gusty wind conditions. When a traditional aircraft encounters a

wind gust, the airspeed increases (head on gust) and, subsequently, the wing

lift increases. With vehicles of low inertia, such as MAVs, there is an almost

immediate altitude change. In erratic conditions (e.g., frequent gusts), the aircraft

becomes extremely difficult to control. The flexible wing on our MAVs incorporates

a passive mechanism, called "adaptive washout," that is designed to produce

smoother flight. The wing deforms with the increase in air pressure associated with

a gust, creating near-constant lift [13, 14, 15]. In erratic conditions, these vehicles

fly smoothly, making them easier to control and excellent camera platforms.

The overall MAV platform design is biologically inspired by small flying

creatures, such as birds and bats [16]. These animals have thin, flexible wings and

virtually silent flight mechanisms. MAVs are designed to mimic these creatures.

They benefit from a similar visual likeness due to their small size and dark carbon

fiber fuselages. MAVs also use electric motors, which are much less noisy than

combustion engines, and are nearly silent at a distance. These characteristics allow

a MAV to operate with a high degree of stealth, making them difficult to detect.



1 These include airframes that range in size from a 4.5 inch maximum diameter
to small UAVs with a 24 inch maximum dimension.



















Figure 2.1: Adaptive washout in action.

2.2 Construction Techniques

The airframe is constructed from layers of bidirectional carbon-fiber. The

composite is formed to a foam mold and cured in an autoclave to form a rigid

structure. Because the aircraft is designed without landing gear, an

additional layer, composed of Kevlar, is interwoven into the bottom half of the

airframe to add strength.

The thin, under-cambered wing consists of a carbon-fiber skeleton that is

then covered with a wing skin.2 The leading edge of the wing is made thicker

to maintain the integrity of the airfoil by supplying additional reinforcement. The

tail empennage, also constructed from carbon-fiber, and sometimes fiberglass, is

connected to the fuselage by a carbon-fiber boom that runs concentrically through

the pusher-prop assembly. Tails on non-pusher prop designs are molded into the

fuselage.


2 The wing skin is typically made from polystyrene or parachute material.










2.3 Propulsion System Design

Typical small scale aircraft have their drive systems mounted in the nose of

the aircraft. In this configuration, the forward view, along the center-line of the

airframe, is obscured by the propellor when spinning. This propellor interference,

known as prop wash, forces any cameras to be placed off-center, typically on a

wing, to avoid the aliasing effects that arise when capturing images through a

propellor. Consequently, mounting the camera on the wing introduces a significant

amount of geometric complexity, because the center-of-mass view would need to

be recovered mathematically. To simplify the camera geometry,

the new versions of our test platform are being designed with a rear-mounted drive

system, as shown in Figure 2.2. This allows a forward-looking camera to be placed

directly on the center-line of the airframe. Not only does the pusher-prop system

allow for a clear line-of-sight from the front of the aircraft, it increases lift on the

wing by reducing skin-friction drag, and provides channeled airflow over the tail of

the aircraft.

The conventional pusher-prop configuration has many advantages, but it also

has disadvantages. Overall, it increases the envelope size of the airplane and creates

issues with propellor clearance during flight. These issues were initially dealt with

by utilizing a gearing system and a foldable propellor to reduce the overall size of

the drive system. That configuration was complicated due to the need to mount

and maintain correct alignment of the gears. New aircraft, using a direct-drive

system and a foldable prop, are now being developed. Their overall envelope is

slightly larger than their geared counterpart; however, the trade-off for simplicity is




























Figure 2.2: MAV platform.

invaluable. Additionally, the reduction in moving parts makes the aircraft quieter

and easier to repair.

2.4 Integrating Vision

For many potential MAV missions, vision is the only practical sensor that

can achieve required and/or desirable autonomous behaviors, as is the case when

flying in urban environments below roof-top altitudes. Furthermore, given that

surveillance has been identified as one of their primary missions, MAVs must

necessarily be equipped with on-board imaging sensors, such as cameras or infrared

arrays. Thus, computer-vision techniques can exploit already present sensors, rich

in information content, to significantly extend the capabilities of MAVs, without

increasing their required payload.












Vision is the most desirable sensor because it is very versatile. Traditional

aircraft sensors, like accelerometers and gyros, are limited to measuring only

the current state of the system, while vision measures information about the

environment. This information can be used to make the system react to its

surrounding environment in an anticipatory manner, such as object tracking and

path planning. Another advantage of vision is that it can also be used to measure

the vehicle's current state by analyzing the aircraft's motion and location in the

environment. Using optical flow techniques and 3D vision, the position, orientation,

and trajectory of the aircraft can be estimated over time [17, 18]. Although these

estimates alone could potentially be used to replace traditional aircraft sensors,

a more reasonable approach would be to correlate the traditional sensors with

the information extracted through vision. Many techniques have been developed

to enable data from many different sources to be utilized together to make very

accurate estimates about the state of the aircraft [19, 20].

Placing imaging sensors on-board the aircraft is cost effective in both payload

and time. Processing the data they are capable of gathering is very computation-

ally expensive and non-trivial to implement on-board a MAV size platform. To

address this issue, a transmitter is installed along with the camera. This trans-

mitter allows the video signal to be broadcast to the ground station where a more

powerful computer can perform the computer vision calculations.














CHAPTER 3
VISION-BASED CONTROL

3.1 Flight Stability

Fundamentally, flight stability and control requires measurement of the MAV's

angular orientation. The two degrees of freedom critical for stability (i.e., the bank

angle φ, and the pitch angle θ 1 ) can be derived from a line corresponding to

the horizon as seen from a forward facing camera on the aircraft. Below, we briefly

summarize the horizon-detection algorithm used in our experiments (further details

can be found in [9, 21]).

For a given hypothesized horizon line dividing the current flight image into a

sky and a ground region, we define the following optimization criterion J:


J = (μs − μg)ᵀ (Σs + Σg)⁻¹ (μs − μg) (3.1)

where μs and μg denote the mean vectors, and Σs and Σg denote the covariance

matrices in RGB color space of all the pixels in the sky and ground regions,

respectively. Since J represents the Mahalanobis distance between the color

distributions of the two regions, the true horizon should yield the maximum value

of J, as is illustrated for a sample flight image in Figure 3.1.




1 Instead of the pitch angle θ, we actually recover the closely related pitch per-
centage σ, which measures the percentage of the image below the horizon line.























Figure 3.1: Horizon tracking: (a) original image; (b) optimization criterion J as a
function of bank angle and pitch percentage; (c) resulting classification of sky and
ground pixels in RGB space.


Given J, horizon detection proceeds as follows for a video frame at XH x YH

resolution:

1. Down-sample the image to XL x YL, where XL ≪ XH and YL ≪ YH.

2. Evaluate J on the down-sampled image for n x n evenly spaced combinations

of the bank angle φ and pitch percentage σ.

3. Select (φ*, σ*) such that J is maximized over all evaluated combinations.

4. Perform a bisection search on the high-resolution image to fine-tune the

values of (φ*, σ*).

For experiments reported in this paper, we use the following parameters: XH x

YH = 320 x 240, XL x YL = 20 x 15, and n = 60. Also, the precise value of the pitch

percentage (σ) that results in level flight (i.e., no change in altitude) is dependent

on the trim settings for a particular aircraft. For our experiments, we assume a










perfectly aligned forward-looking camera (see Figure 2.2), such that a σ value of 0.5

corresponds to level flight.
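As a concrete illustration, the criterion J can be sketched in a few lines of Python. This is a simplified reconstruction under stated assumptions, not the thesis implementation: the horizon-line parameterization (a line at height (1 − σ)·H at the image center, tilted by the bank angle φ) and the NumPy-based statistics are our own choices.

```python
import numpy as np

def horizon_J(img, phi, sigma):
    """Optimization criterion J (Eq. 3.1) for a hypothesized horizon line.

    img   : H x W x 3 RGB image (float array)
    phi   : hypothesized bank angle in radians
    sigma : hypothesized pitch percentage (fraction of image below the line)
    """
    H, W, _ = img.shape
    ys, xs = np.mgrid[0:H, 0:W]
    y0 = (1.0 - sigma) * H                      # line height at image center
    line_y = y0 + np.tan(phi) * (xs - W / 2.0)  # horizon y at each column
    sky = ys < line_y                           # pixels above the line
    s = img[sky].reshape(-1, 3)
    g = img[~sky].reshape(-1, 3)
    if len(s) < 4 or len(g) < 4:                # degenerate split
        return 0.0
    mu = s.mean(axis=0) - g.mean(axis=0)
    cov_sum = np.cov(s.T) + np.cov(g.T)
    # Mahalanobis distance between the sky and ground color distributions
    return float(mu @ np.linalg.solve(cov_sum, mu))
```

The detection procedure then evaluates this criterion over the coarse grid of (φ, σ) hypotheses on the down-sampled image and keeps the maximizer for refinement.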

3.2 Object Tracking

Object tracking is a well-studied problem in computer vision [22, 23]; our

intent here is to use object tracking to allow a user to easily control the flight

vehicle's heading (instead of, for example, GPS).2 We specifically do not perform

autonomous target recognition, since we want to be able to dynamically change

what ground region the MAV tracks. As such, a user can select which ground

region (i.e., object) to track by clicking on the live video with a mouse. This

action selects an M x M region to track, centered at the (x, y) coordinates of the

mouse click. For the experiments reported in Chapter 5, we set M = 15 for video

resolutions of XH x YH.

We employ template matching in RGB color space for our object tracking

over successive video frames. Our criterion is the sum of square differences (SSD),

a widely used correlation technique in stereo vision, structure from motion, and

egomotion estimation. Our approach differs from some of that work in that we

compute the SSD for RGB instead of intensity, since tracking results are much

better with full color information than intensity alone. To deal with varying image

intensities as environmental factors (e.g., clouds) or the MAV's attitude with

respect to the sun changes, we also update the M x M template to be the matched



2 The object tracking algorithm described in this section was developed by
Ashish Jain at the Machine Intelligence Lab during the Spring semester of 2004.









region for the current frame prior to searching for a new match in subsequent video

frames. Furthermore, since ground objects move relatively slowly in the image

plane from one frame to the next, due to the MAV's altitude above the ground, we

constrain the search region for subsequent frames to be in an N x N neighborhood

(N = 25 ≪ XH, YH) centered around the current ground-object location (x, y),

as illustrated in Figure 3.2. This reduces the computational complexity from

O(M²XHYH) to O(M²N²), and allows us to perform both horizon tracking for

stabilization and object tracking for heading control in real time (30 frames/sec).

In fact, with the PowerPC G4 Altivec Unit, we are able to dramatically reduce

CPU loads to as little as 35% with both vision-processing algorithms running

simultaneously.

Below, we briefly summarize the object-tracking algorithm:

1. User selects the image location (x, y) to be tracked for frame t.

2. The template T is set to correspond to the M x M square centered at (x, y)

for frame t.

3. The search region R for frame t + 1 is set to the N x N square centered at (x, y).


4. The location (x, y) of the object for frame t + 1 is computed as the minimum

SSD between T and the image frame within search region R.

5. Go to step 2.
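The steps above can be sketched as a single tracking update in Python. This is an illustrative reconstruction, not the thesis code: the array layout, border handling, and function name are our assumptions.

```python
import numpy as np

def track_object(frame, template, center, N=25):
    """One step of SSD template tracking in RGB (Section 3.2 sketch).

    frame    : H x W x 3 image for frame t+1
    template : M x M x 3 template taken from frame t
    center   : (x, y) object location in frame t
    N        : side of the N x N search neighborhood
    Returns the new (x, y) location and the updated template.
    """
    M = template.shape[0]
    m, n = M // 2, N // 2
    x0, y0 = center
    H, W, _ = frame.shape
    best, best_xy = np.inf, center
    # Exhaustive SSD search restricted to the N x N neighborhood
    for y in range(max(m, y0 - n), min(H - m, y0 + n + 1)):
        for x in range(max(m, x0 - n), min(W - m, x0 + n + 1)):
            patch = frame[y - m:y + m + 1, x - m:x + m + 1]
            ssd = np.sum((patch - template) ** 2)   # summed over R, G, B
            if ssd < best:
                best, best_xy = ssd, (x, y)
    x, y = best_xy
    # Update the template to the matched region, tracking appearance changes
    new_template = frame[y - m:y + m + 1, x - m:x + m + 1].copy()
    return best_xy, new_template
```

Restricting the search to the N x N neighborhood is what yields the O(M²N²) cost noted above, in place of a full-frame search.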


3.3 Controller

A controller is necessary to generate actuator movements based on feedback

to perform the mission at hand. Here, we describe the controller architecture that

















Figure 3.2: In object tracking, the search region for the next frame is a function of
the object location in the current frame.

takes the information extracted from horizon and object tracking and converts it

to control surface command to direct the flight path of the aircraft. This control

architecture is shown in Figure 3.3.

There are two possible inputs to the system from a ground-station user:

(1) a human-directed input that commands a desired bank angle (φ) and pitch

percentage (σ), and (2) the desired location xdes of the ground object to be tracked.

In the absence of object tracking, the human-directed input serves as the primary

heading control; with object tracking, the human-directed input is typically not

engaged, such that the trim settings (φ, σ)des = (0, 0.5) are active. The two outputs

of the controller are δ1 and δ2, corresponding to the differential elevator surfaces

controlled by two independent servos.

The bank angle φ and pitch percentage σ are treated as independent from one

another, and for both parameters we implement a PD (proportional-derivative)

controller. The gains Kp and Kd were determined experimentally in virtual

environment trials. Because of the differential elevator configuration, the control

signals δ1 and δ2 will obviously be coupled. For tracking, a P (proportional)

controller is used. When engaged (on activation of object tracking), the controller

adjusts the bank angle (φ) proportional to the distance between the center of the




















Figure 3.3: Controller for vision-based stabilization and object tracking.

tracked target and the center of the current field-of-view. As before, the gain

(Kp) is also determined experimentally in the virtual environment.
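A minimal sketch of this control structure in Python follows. The gain values, sign conventions, and the exact elevator mixing are illustrative assumptions; the thesis determines its gains experimentally in the virtual environment.

```python
def pd_step(error, prev_error, dt, kp, kd):
    """One PD update: u = Kp*e + Kd*(de/dt), with a finite-difference derivative."""
    return kp * error + kd * (error - prev_error) / dt

def tracking_to_bank(x_des, x_meas, kp_track):
    """P controller for object tracking: command a bank angle proportional to
    the offset of the tracked object from its desired image location."""
    return kp_track * (x_des - x_meas)

def mix_elevators(u_phi, u_sigma):
    """Couple the bank and pitch commands onto the two independent elevator
    servos: common deflection pitches the aircraft, differential deflection
    rolls it. (The sign convention here is illustrative.)"""
    delta1 = u_sigma + u_phi
    delta2 = u_sigma - u_phi
    return delta1, delta2
```

In the absence of object tracking, the trim settings (φ, σ)des = (0, 0.5) feed the two PD loops directly; with tracking engaged, a function like tracking_to_bank supplies the desired bank angle instead.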

Thus, there are two possible modes of supervised control: (1) direct heading

control through a human-directed input or (2) indirect heading control through

object tracking. The first case allows users who are not experienced in flying RC

aircraft to stably command the trajectory of the flight vehicle. This is especially

critical for MAVs, because it is substantially more difficult to learn direct RC

control of MAVs than larger, more stable RC model airplanes. In the second case,

commanding trajectories for the MAV is even simpler and reduces to point-and-

click targeting on the flight video ground display. Either way, the controller will not

permit unsafe flight trajectories that could potentially lead to a crash.















CHAPTER 4
TESTBED IMPLEMENTATION


The paramount goal of this research is to develop vision-based autonomy

for MAVs. Development of such autonomy presents significant challenges, in no

small measure, because of the inherent instability of these flight vehicles. In this

section we present the details of a flight testbed system that seeks to mitigate

these challenges by facilitating the rapid development of new vision-based control

algorithms in two ways: (1) through the use of a virtual environment simulation

and (2) custom-designed flight hardware that is unified between the virtual and real

flight-test configurations.

The proposed testbed system provides a complete architecture, built from

custom-designed hardware and software, for developing autonomous behaviors

for MAVs using a camera as the primary sensor. Thus, the presented testbed

effectively bridges the gap between designing vision-based algorithms for MAVs

and deploying them in the real world. The virtual environment allows the system

to be tailored to a number of different mission profiles through its ability to

perform flight tests in a multitude of virtual locations. Once the algorithm has

been tuned in the virtual environment, the unified hardware architecture is

interchangeable from that environment to a real-world deployment situation.

That is, the ground station, which performs the computation and control, is










completely interchangeable, so that code, controllers, and hardware developed in

one environment are immediately transferable to the other.

4.1 Architecture of the System

Instead of developing a control algorithm and going directly to flight testing,

we will now develop that algorithm under the framework of the presented testbed,

which includes utilizing a virtual environment simulation. Once the virtual en-

vironment testing has been completed and the algorithm has been verified in a

wide range of environmental conditions, we can then deploy that technology to a

real flight test with little risk to the aircraft. Experiments are shown in Chapter 5

where vision-based algorithms are prototyped using the virtual environment and

flown unmodified in a real test flight.

The complete testbed system is shown in Figure 4.1. The testbed architec-

ture is divided into three categories: (1) the on-board components (carried in the

airframe), (2) a virtual environment simulation (for laboratory verification and

testing), and (3) the off-board components, located on the ground (the interface

to the flight vehicle). The on-board components include a camera and a micropro-

cessor controlled multi-rate sensor board that includes inertial sensors, a GPS, an

altimeter, and an airspeed sensor. Also, a transceiver is placed on-board for inter-

action with the off-board components. The ground station components, consisting

of a laptop, a transceiver, and an optional human operable remote control, supply

machine-vision and control-processing capabilities not possible on-board the air-

craft. A virtual environment simulator was constructed using flight-trainer software

and a projection screen. The ground station was interfaced to that environment

and the aircraft, with its forward mounted camera, was positioned to observe the













Figure 4.1: Testbed system overview.


visual output of the simulator. Altogether, this complete flight testbed system

allows flight in either a real or a simulated environment. In the following sections

we will discuss each of the categories of the testbed architecture in detail.


4.2 Virtual Environment Simulation

The virtual environment simulation component serves as a precursor to the

full HILS facility being constructed at the University of Florida. This facility will

include a wind tunnel and a photo-realistic world. Our current virtual testbed

offers only a subset of the features that the full facility would offer, focusing mainly

on the visualization aspect of the simulation and the position of the aircraft.

The features that the current system offers are enough to perform precursory

experiments and are discussed below. The virtual environment simulator is based














(a) (b) (c) (d)
Figure 4.2: Some sample virtual scenes: (a) field, trees and mountains, (b) simple
urban, (c) urban with features, and (d) complex urban.

on an off-the-shelf remote-control airplane simulation package. The advantages of

this software are: (1) it contains a diverse set of scenery as well as vehicle models,

including a realistic-physics engine; (2) additional scenery and vehicle models can

be defined externally; (3) it supports full collision detection and simulation of

partial vehicle damage (e.g. loss of a wing); and, finally, (4) environmental factors

such as wind or radio noise, for example, can also be incorporated. Figure 4.2

illustrates a few examples of the type of scenery supported by the software package;

note that the types of scenery available are significantly more diverse than what is

easily accessible for real test flights of our MAVs.

The only additional hardware required for the virtual testbed (as opposed to

the real flight vehicle) is a small interface board that converts control outputs from

the ground station into simulator-specific syntax. As such, the ground station does

not distinguish between virtual and real-flight experiments, since the inputs and

outputs to it remain the same in both environments.

Following the development of the virtual testbed, virtual flight experiments

proceed as follows. First, the flight simulator displays a high-resolution image

which reflects the current field-of-view of the simulated aircraft at a particular

position, orientation, and altitude. Then, a video camera, which is identical to










the one mounted on the actual MAV, is fixed in front of the display to record

that image. The resulting signal from this video camera is then processed on the

ground-station laptop. Next, the extracted information from the vision algorithms

being tested is passed to the controller, which generates control commands to

maintain flight-vehicle stability and user-desired heading (depending, for example,

on ground-object tracking). These control commands are digitized and fed into the

flight simulator. Finally, the simulator updates the current position, orientation,

and altitude of the aircraft, and a new image is displayed for image capture and

processing.
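One iteration of this closed loop can be summarized in Python. The object interfaces here are hypothetical; in the actual testbed the loop is closed physically, through the projected display, a real camera, and the serial interface board, rather than through function calls.

```python
def run_virtual_flight(simulator, camera, vision, controller, n_steps=1000):
    """Closed-loop virtual flight test (sketch with hypothetical interfaces)."""
    for _ in range(n_steps):
        simulator.render()                   # display the current field of view
        frame = camera.capture()             # same camera model as on the MAV
        phi, sigma = vision.horizon(frame)   # estimate attitude from video
        d1, d2 = controller.update(phi, sigma)
        simulator.apply_controls(d1, d2)     # digitized commands into the sim
```

Because the ground station sees only the video input and the control output, exactly the same loop body applies to a real flight test, with the simulator replaced by the aircraft itself.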

Note that this system allows us to experiment with vision algorithms in a

stable laboratory environment prior to actual flight testing. This means that we

can not only develop and debug algorithms without risking loss of the flight vehicle,

but we can also experiment with complex 3D environments well before risking

collisions of MAVs with real buildings in field testing. While the scenes in our

current prototype system are not as photo-realistic as desirable, even with this

limitation, we were able to develop significant vision-based autonomous capabilities

in real flight tests without a single crash (Chapter 5). Moreover, our larger-scale

HILS facility will have substantially more computing power for rendering photo-

realistic views of complex natural and urban settings.

4.3 Testbed Hardware

The on-board components of the testbed (top right in Figure 4.1) include

a camera and a microprocessor controlled multi-rate sensor board. The camera,

a color C\ I OS array, is mounted in the nose of the airframe, along the center-

line. A 2.4GHz transmitter is used to broadcast the video stream to the ground

station. The sensor board, still under development during our experiments, includes

inertial sensors, a GPS, an altimeter, and an airspeed sensor. It also contains a

transceiver for interaction with the off-board components. The details of the sensor

board are beyond the scope of this thesis, as we have developed it fully in other

works [24, 25].

The ground station (bottom center in Figure 4.1) consists of: (1) a 2.4 GHz

video-patch antenna (not pictured), (2) a video-capture device from the Imaging

Source (formerly a Sony Video Walkman) for NTSC-to-firewire video conversion,

(3) a 12" G4 laptop (1GB/1GHz), (4) a custom-designed Futaba-compatible signal

generator for converting computer-generated control commands to PWM Futaba-

readable signals, and (5) a standard Futaba RC controller. Video is input to the

computer in uncompressed YUV format, then converted to RGB for subsequent

processing. The Futaba transmitter, the traditional remote-control mechanism for

piloting RC aircraft, is interfaced to the laptop computer through a Keyspan serial-

to-USB adapter and has a pass-through trainer switch that allows commands from

another transmitter to be selectively relayed to the aircraft. Our custom-designed

Futaba-compatible signal generator lets the laptop emulate that other transmitter,

and, therefore, allows for instantaneous switching between computer control and

human-piloted remote control of the flight vehicle during testing.
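The YUV-to-RGB step in the ground-station pipeline can be illustrated with a per-pixel conversion. The exact pixel format and coefficients used by the capture device are not specified here, so this sketch assumes full-range BT.601 values as a plausible example:

```python
# Illustrative YUV -> RGB conversion of the kind the ground station performs
# before vision processing. Assumes full-range BT.601 coefficients, which is
# an assumption here, not the documented behavior of the capture device.

def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YUV pixel (0-255) to RGB (0-255)."""
    c, d, e = float(y), u - 128.0, v - 128.0   # center chroma about zero
    r = c + 1.402 * e
    g = c - 0.344136 * d - 0.714136 * e
    b = c + 1.772 * d
    clamp = lambda x: max(0, min(255, int(round(x))))
    return clamp(r), clamp(g), clamp(b)
```

Neutral grays (u = v = 128) map unchanged, which is a quick sanity check on the coefficients.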















CHAPTER 5
EXPERIMENTAL RESULTS


5.1 Flight Testing Procedures

In this section we describe several experiments conducted using the proposed

testbed system. First, we contrast direct RC control with horizon-stabilized human-

directed control and illustrate object tracking on some sample image sequences.

Then, we apply the object tracking framework to develop autonomous landing

capabilities, first in the virtual environment simulator and then in field testing.

The principal difference in testing procedures between the virtual and real-flight

configurations occurs at take-off. In the virtual environment, the aircraft takes off

from a simulated runway, while in field testing our MAVs are hand-launched. After

take-off, however, testing is essentially the same for both environments. Initially,

the aircraft is under direct RC control from a human pilot until a safe altitude is

reached. Once the desired altitude has been attained, the controller is enabled.

Throughout our test flights, both virtual and real, throttle control is typically set to

a constant level.

5.2 Simple Stabilization Experiment

Here we illustrate simple horizon-based stabilization and contrast it to direct

RC control in the virtual testbed; similar experiments have previously been carried

out in field testing [8, 9]. Figure 5.1 illustrates some simple roll and pitch

trials for: (a) direct RC-piloted and (b) horizon-stabilized (human-directed) flight












Figure 5.1: Stabilization results: (a) Direct RC-piloted flight, and (b) horizon-
stabilized (human-directed) flight. Maneuvers for flight trajectory (b) were exe-
cuted to mimic flight trajectory (a) as closely as possible.


trajectories. As can be observed from Figure 5.1, horizon-stabilized control tends

to do a better job of maintaining steady roll and pitch than direct RC flight;

this phenomenon has previously been observed in field testing. Not only does

horizon stabilization lead to smoother flights, but no special training is required to

command the flight vehicle when horizon stabilization is engaged.
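The horizon-stabilized control contrasted above can be sketched as a simple proportional stabilizer: the horizon estimate supplies roll and pitch angles, and the controller commands deflections to null the error. The gains, limits, and structure below are illustrative assumptions, not the thesis's actual control law:

```python
# A minimal proportional stabilizer of the kind horizon-based control implies.
# Gains and the saturation limit are illustrative placeholders.

def stabilize(roll_deg, pitch_deg, roll_ref=0.0, pitch_ref=0.0,
              k_roll=0.04, k_pitch=0.05):
    """Return normalized aileron/elevator commands, saturated to [-1, 1]."""
    aileron = max(-1.0, min(1.0, k_roll * (roll_ref - roll_deg)))
    elevator = max(-1.0, min(1.0, k_pitch * (pitch_ref - pitch_deg)))
    return aileron, elevator
```

Because the command is proportional to the horizon-estimated error, wings-level flight produces zero deflection, while large errors saturate at the full command.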

5.3 Object Tracking

Here we report results on ground object tracking on some sample flight

sequences for both virtual and real-flight videos. Figure 5.2 shows sample

frames that illustrate typical tracking results for: (a) a virtual sequence and (b)

a real-flight sequence; complete videos are available at http://mil.ufl.edu/

~number9/mav_visualization.

Once we had determined that tracking was sufficiently robust for both virtual

and real-flight videos, we proceeded to engage the tracking controller in the virtual

testbed and verified that the aircraft was correctly turning toward the user-selected

targets. This led us to formulate autonomous landing as a ground object-tracking



















Figure 5.2: Object tracking: (a) virtual testbed, and (b) real flight image sequence.

problem, where the "object" to be tracked is the landing zone. We first developed

and verified autonomous landing in the virtual environment simulation and then,

without any modifications of the developed code, successfully executed several

autonomous landings in real-flight field testing. We describe our experiments in

autonomous landing in further detail in the next section.

5.4 Autonomous Landing: Virtual Environment

An aircraft without a power source is basically a glider, as long as roll and

pitch stability are maintained. It will land somewhere, but, without any heading

control, yaw drift can make the landing location very unpredictable. However, using

our object tracking technique, we are able to exercise heading control and execute a

predictable landing. Landing at a specified location requires knowledge of the glide

slope (i.e., the altitude and distance to the landing location). Since we currently do

not have access to this data in our virtual environment simulation, we assume that

we can visually approximate these values. Although somewhat crude, this method

works well in practice and is replicable.
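The glide-slope relation referred to above is just the geometry of altitude and horizontal distance to the landing point. The numbers below are illustrative, not measured flight data:

```python
# The glide slope ties altitude and distance to the landing location:
# the required descent angle is atan(altitude / distance). Values used in
# the tests are illustrative, not measurements from the testbed.
import math

def glide_slope_deg(altitude_m, distance_m):
    """Descent angle (degrees) needed to reach the landing point."""
    return math.degrees(math.atan2(altitude_m, distance_m))
```

Visually approximating either quantity, as described above, therefore yields an approximate descent angle rather than an exact one.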

We proceed as follows. First, the horizon-stabilized aircraft is oriented so that

the runway (or landing site) is within the field of view. The user then selects a



















Figure 5.3: Autonomous landing in a virtual environment: four sample frames.


location on the runway to be tracked, and the throttle is disengaged. Once tracking

is activated, the plane glides downward, adjusting its heading while maintaining

level flight. In our virtual environment, mountains are visible, introducing some

error in horizon estimates at low altitudes. As the plane nears ground level during

its descent, these errors become increasingly pronounced, causing slight roll and

pitch anomalies to occur. Nevertheless, the aircraft continues to glide forward,

successfully landing on the runway in repeated trials. Sample frames from one

autonomous landing are shown in Figure 5.3, while the roll, pitch and tracking

command are plotted in Figure 5.4 for that landing. (As before, complete videos

are available at http://mil.ufl.edu/~number9/mav_visualization.)




Figure 5.4: Roll, pitch and tracking command for virtual autonomous landing in
Figure 5.3.






















Figure 5.5: Real-flight autonomous landing in field testing: four sample frames.


5.5 Autonomous Landing: Real-flight Experiments

In real-flight testing of autonomous landing, we did not have access to the

same ground feature (i.e., a runway) as in the virtual environment. Our MAVs do

not have landing gear and do not typically land on a runway. Instead, they are

typically landed in large grass fields. As such, we sought to first identify ground

features in our test field that would be robustly trackable. We settled on a gated

area near a fence where the ground consisted mostly of sandy dirt, which provided

a good contrast to the surrounding field and good features for tracking.

During the flight testing, the horizon-stabilized MAV is oriented such that

the sandy area is within the field of view. The user then selects a location at

the edge of the sandy area to be tracked, and the throttle is disengaged. As in

the virtual environment, the MAV glides downward toward the target, adjusting

its heading to keep the target in the center of the image while maintaining level

flight. When the aircraft approaches ground level, the target being tracked may

fall out of view. However, if the target is lost at this point, the plane will still

land successfully. This occurs because the maximum allowable turn command














Figure 5.6: Roll, pitch and tracking command for real-flight autonomous landing in
Figure 5.5.


generated by the object tracking controller, at that speed, will not cause the

plane to roll significantly. Once on the ground, the MAV skids to a halt on its

smooth underbelly. In several repeated trials, we landed the MAV within 10 meters

of the target location. Sample frames from one of those autonomous landings

are shown in Figure 5.5 (along with ground views of the MAV during landing).

Figure 5.6 depicts the roll, pitch, and tracking commands for that landing. (As

before, complete videos are available at http://mil.ufl.edu/~number9/mav_

visualization).
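The saturation argument above, that a lost target near the ground cannot upset the landing, can be illustrated with a clamped turn command. The gain and limit are hypothetical placeholders, not the controller's actual parameters:

```python
# Illustrates why losing the target near touchdown is benign: the tracking
# controller's turn command is saturated, so even a huge image-plane error
# cannot command a roll large enough to disturb the landing at low speed.
# MAX_TURN and the gain are illustrative assumptions.

MAX_TURN = 0.2  # hypothetical maximum allowable turn command (normalized)

def turn_command(target_offset_px, k=0.005):
    """Proportional turn command from pixel offset, clamped to +/- MAX_TURN."""
    raw = k * target_offset_px
    return max(-MAX_TURN, min(MAX_TURN, raw))
```

Whatever the offset reported by the tracker, the command never exceeds the fixed limit in either direction.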















CHAPTER 6
CONCLUSION


Flight testing of MAVs is difficult in general because of the inherent instability

of these flight vehicles, and even more so when implementing complex vision-

based behaviors. Over the years, many planes have been destroyed in crashes due

to relatively simple errors in coding or algorithmic weaknesses. The proposed

testbed system described in this thesis was developed, in large measure, to deal

with these problems and to investigate potential uses of the full-scale UF HILS

facility currently under construction. It is virtually inconceivable that we could

have developed object tracking and autonomous landing without any crashes in the

absence of the virtual testbed. In the coming months, we plan to extend the use of

the virtual testbed facility to more complex vision problems, such as, for example,

3D scene estimation within complex urban environments, a problem which we are

now actively investigating.














REFERENCES


[1] GlobalSecurity.org, "RQ-4A Global Hawk (Tier II+ HAE UAV)," World Wide
Web, http://www.globalsecurity.org/intell/systems/global_hawk.htm,
March 2004.

[2] P. G. Ifju, S. Ettinger, D. A. Jenkins, Y. Lian, W. Shyy, and M. R. Waszak,
"Flexible-wing-based Micro Air Vehicles," in Proc. 40th AIAA Aerospace
Sciences Meeting, Reno, Nevada, January 2002, paper no. 2002-0705.

[3] P. G. Ifju, S. Ettinger, D. A. Jenkins, and L. Martinez, "Composite materials
for Micro Air Vehicles," in Presentation at SAMPE Conference, Long Beach,
California, May 2001.

[4] J. M. McMichael and Col. M. S. Francis, "Micro Air Vehicles-Toward a new
dimension in flight," World Wide Web, http://www.darpa.mil/tto/mav/mav_
auvsi.html, December 1997.

[5] J. W. Grzywna, J. Plew, M. C. Nechyba, and P. Ifju, "Enabling autonomous
flight," in Proc. Florida Conference on Recent Advances in Robotics, Miami,
Florida, April 2003, vol. 16, sec. TA3, pp. 1-3.

[6] J. M. Grasmeyer and M. T. Keennon, "Development of the Black Widow
Micro Air Vehicle," in Proc. 39th AIAA Aerospace Sciences Meeting, Reno,
Nevada, January 2001, paper no. 2001-0127.

[7] A. Kurdila, M. C. Nechyba, R. Lind, P. Ifju, W. Dahmen, R. DeVore, and
R. Sharpley, "Vision-based control of Micro Air Vehicles: Progress and
problems in estimation," in Presentation at IEEE Int. Conference on Decision
and Control, Nassau, Bahamas, December 2004.

[8] S. M. Ettinger, M. C. Nechyba, P. G. Ifju, and M. Waszak, "Vision-guided
flight stability and control for Micro Air Vehicles," in Proc. IEEE Int.
Conference on Intelligent Robots and Systems, Lausanne, October 2002, vol. 3,
pp. 2134-40.









[9] S. Ettinger, M. C. Nechyba, P. G. Ifju, and M. Waszak, "Vision-guided flight
stability and control for Micro Air Vehicles," in Journal of Advanced Robotics,
2003, vol. 17, no. 3, pp. 617-40.

[10] C. S. Sharp, O. Shakernia, and S. Sastry, "A vision system for landing
an Unmanned Aerial Vehicle," in Proc. IEEE Int'l Conf. on Robotics and
Automation, Seoul, Korea, May 2001, pp. 1720-27.

[11] B. Sinopoli, M. Micheli, G. Donato, and T. J. Koo, "Vision based navigation
for an unmanned aerial vehicle," in Proc. IEEE Int'l Conf. on Robotics and
Automation, Seoul, Korea, May 2001, pp. 1757-65.

[12] T. J. Mueller, "The influence of laminar separation and transition on low
Reynolds number airfoil hysteresis," in Journal of Aircraft, 1985, vol. 22, pp.
763-70.

[13] D. A. Jenkins, P. G. Ifju, M. Abdulrahim, and S. Olipra, "Assessment of
controllability of Micro Air Vehicles," in Presentation at 16th Intl. Conf.
Unmanned Air Vehicle Systems, Bristol, United Kingdom, April 2001.

[14] W. Shyy, D. A. Jenkins, and R. W. Smith, "Study of adaptive shape airfoils at
low Reynolds number in oscillatory flows," in AIAA Journal, 1997, vol. 35, pp.
1545-48.

[15] R. W. Smith and W. Shyy, "Computation of aerodynamics coefficients for
a flexible membrane airfoil in turbulent flow: A comparison with classical
theory," in Phys. Fluids, 1996, vol. 8 no. 12, pp. 3346-53.

[16] P. R. Ehrlich, D. S. Dobkin, and D. Wheye, "Adaptions for flight," World
Wide Web, http://www.stanfordalumni.org/birdsite/text/essays/
Adaptions.html, June 2001.

[17] B. Lucas and T. Kanade, "An iterative image registration technique with
an application to stereo vision," in Proc. 7th Int'l Joint Conf. on Artificial
Intelligence, 1981, pp. 674-79.

[18] T. Kanade, "Recovery of the three-dimensional shape of an object from a
single view," in Artificial Intelligence, 1981, vol. 17, pp. 409-60.

[19] L. Armesto, S. Chroust, M. Vincze, and J. Tornero, "Multi-rate fusion with
vision and inertial sensors," in Proc. IEEE Int'l Conf. on Robotics and
Automation, April 2004, vol. 1, pp. 193-99.









[20] R. Meier, T. Fong, C. Thorpe, and C. Baur, "Sensor fusion based user interface
for vehicle teleoperation," in Presentation at Int'l Conf. on Field and Service
Robotics, August 1999.

[21] S. M. Ettinger, "Design and implementation of autonomous vision-guided
Micro Air Vehicles," M.S. thesis, University of Florida, May 2001.

[22] J. Shi and C. Tomasi, "Good features to track," in Proc. IEEE Int'l Conf. on
Computer Vision and Pattern Recognition, Seattle, Washington, June 1994,
pp. 593-600.

[23] L. G. Brown, "A survey of image registration techniques," in ACM Computing
Surveys, December 1992, vol. 24, no. 4, pp. 325-76.

[24] J. Plew, J. W. Grzywna, M. C. Nechyba, and P. Ifju, "Recent progress in the
development of on-board electronics for Micro Air Vehicles," in Proc. Florida
Conference on Recent Advances in Robotics, Orlando, Florida, April 2004, vol.
17, sec. FP3, pp. 1-6.

[25] J. Plew, "Development of a flight avionics system for autonomous MAV
control," M.S. thesis, University of Florida, December 2004.















BIOGRAPHICAL SKETCH


In 2002, Jason W. Grzywna graduated from the University of Florida with

dual Bachelor of Science degrees in Electrical and Computer Engineering. Con-

tinuing his education, Jason was admitted to the master's degree program in the

summer of 2002 at the University of Florida. His main interests through-

out his graduate studies were developing intelligent systems for autonomous

vehicles and robotics. During his time at the University of Florida, Jason was part

of many other MAV research projects, including immediate bomb damage assess-

ment, the small folding wing PocketMAV with inertial and vision stabilization, and

MAV deployment from a Pointer.