Gradient Methods for Large-Scale Nonlinear Optimization

d462ca1c47509bb3aed92d33a96c4ac4
097d6d0ef0dc338955c3fdd0d27d392d5f952122
F20110404_AAALRH zhang_h_Page_084.tif
fc7c14a7a7bd53c35e3aa17418174918
6fe2cadff3edc4d0acea93f93c44a8d66ef46474
F20110404_AAALQT zhang_h_Page_053.tif
62f408f2f623fc1097d83a405325ac23
441d4ba9594a8632731f594a84038695ea15fa88
F20110404_AAALRI zhang_h_Page_085.tif
4350a17adc91ac7f5671c8ac7305f812
4d6d0ebd2cf89536bc58bb119eae2d7ceaefdecd
F20110404_AAALQU zhang_h_Page_055.tif
7fd305ea3fd7731636b4909a5ef5d69c
93edbdf32dc01991be6caef6bf12c14c0d3d35c0
F20110404_AAALRJ zhang_h_Page_086.tif
2f1162262062fd219938194d0291565a
3236d80a5c4023fbb6ae5ecb15c5d0a029376e01
F20110404_AAALQV zhang_h_Page_056.tif
8594e18830252ad4d888386a95763316
50bd1e5e1db8b6ee2605479923fe2cd7ed554e6c
F20110404_AAALRK zhang_h_Page_087.tif
b9e6710c6b6bf9d06600ac598acf5057
7c34363b828eedeac2787fb7c9257c3aac626d7c
F20110404_AAALQW zhang_h_Page_057.tif
6feb9d0e110de63efcfa7ad391271289
9a208ea6eed645c5f9a87f23cbe647d00393f19b
F20110404_AAALRL zhang_h_Page_088.tif
3e8ef3b73c803c16c32927f44b681aea
4b672c2791088c37092f8d37cf19a934485a2914
F20110404_AAALSA zhang_h_Page_120.tif
e9c298c34322066754e00302069e3cba
c9f71d3293e5c4db60739657d4869499f62f0b73
F20110404_AAALRM zhang_h_Page_090.tif
1d9e49d1038cc0c9f64af5925b4e1d30
8bb7db4c3fa97ddac94ca5cea4e3b03ff184c458
F20110404_AAALQX zhang_h_Page_060.tif
503066b72dcc95eace297abbc09cfd4c
8dcf086c9de758a49dac883ace7375d27a97fb6e
F20110404_AAALSB zhang_h_Page_122.tif
25ba8e80dff9ca71943627fc2616eae6
44fb1671a0adee03aa9a3f6af78425cd8e6d8d70
F20110404_AAALRN zhang_h_Page_092.tif
a1ef446d435cca197ccb06d7338b6443
63e9b911674d021647458fd3b81fba2a6c35ea86
F20110404_AAALQY zhang_h_Page_064.tif
b6d6016d11d517bf1d852d90f4513ce5
ea0973e04b9c40e5e4451e935cdd4f52bbc0828c
F20110404_AAALSC zhang_h_Page_123.tif
7c67e3dbdccf1c4f3e2cc2037708ca46
7a4842a4cb169454a001a46f0e685c885307a3e9
F20110404_AAALRO zhang_h_Page_094.tif
4f5402ed23094b08f9009ad447985761
5cd80f8decd3da6b6683200dd4f14ef830076286
F20110404_AAALQZ zhang_h_Page_066.tif
5e5737333bf83c6e15821a7b083b81d8
356a29c94b0ceccd558421025f7f952a8f342a5c
F20110404_AAALSD zhang_h_Page_124.tif
74f5d0615efdb4dde6077fef02bf1252
8ddd566f656fa415ad755e9b12d4baffa90df00f
F20110404_AAALRP zhang_h_Page_095.tif
25e5ebc173a1ccd04289b7c88f171df1
04e1097f40f323e0b32b8abd96826e42a683e728
F20110404_AAALSE zhang_h_Page_125.tif
39080c0799ed2f2b99941f01d25441be
abcbdbcf45fab7eed1a1cdb4ba04ab0478e86b51
F20110404_AAALRQ zhang_h_Page_096.tif
68cda15c803dd7cd7af4b585cb4e775f
b0a1a0650a75c1b5ce1265f78c8f9d3f7a48539e
F20110404_AAALSF zhang_h_Page_127.tif
5892ad55614533e1dc955afc9f94ead4
188d46899b0c2d67eef129934c2584f8c3758768
F20110404_AAALRR zhang_h_Page_099.tif
ae9480d1b6aec16d6c59cc3fcda852e1
1d8a4cf779a822e46d19128d3c35a74871aacc11
F20110404_AAALSG zhang_h_Page_129.tif
9ba8834db1f4beb04a8c4fd2750210a7
005e503f2741d51491df35b8b7d7bdffd5aceaa1
F20110404_AAALRS zhang_h_Page_100.tif
4c359cde1c3a107f684383d73225bb5f
b88cbb27549af16077d58ad2370dd594f8dcaab2
F20110404_AAALSH zhang_h_Page_130.tif
2eefe0bb3dbdde561840ce9780fd33c6
c02f85edbf7a872734d27b413a6f63f3657b69a8
F20110404_AAALSI zhang_h_Page_132.tif
9e913fcd0350eb4375647543940583aa
a73afb2107a2adc841c3b1013bc725d3ef5748a6
F20110404_AAALRT zhang_h_Page_101.tif
83c44d0b15f0da35dcf7702145f56d70
2a989aeeb4b7014e97f4fdee05577ffb56c2a07d
F20110404_AAALSJ zhang_h_Page_134.tif
3f62112151173c36cb5f344d5a1fc53b
36ff5c938ca68388eeb0c343aaa2f43fecfec6ae
F20110404_AAALRU zhang_h_Page_103.tif
100db06b769825b211efece5cd38dc05
41eb4dc71163329522f7eba3f4110c0ea6f840cb
F20110404_AAALSK zhang_h_Page_135.tif
5a5126da6f01087fec09eea741e908ad
707c354d62311eff59bb946f1acad00526c4e383
F20110404_AAALRV zhang_h_Page_105.tif
a6a1680062b8d5d248876e38335e7f71
149a7222c5f43d3f5ab349571d27166e693b9fd2
F20110404_AAALSL zhang_h_Page_138.tif
8fef587d01b48e12581f0edd08b7f523
85720b5d11d3e550b7709f449f454d6663e50133
F20110404_AAALRW zhang_h_Page_108.tif
1e7a109a33e314ac36f3c78032e82445
42eca012346b631c776cf0d0e22e8fe714952a82
F20110404_AAALSM zhang_h_Page_140.tif
0042c1469df4eee2321b10b3dc551d04
b4a84aa484c07111ee9777fed5e7e94564eb454e
F20110404_AAALRX zhang_h_Page_109.tif
c928f83dab9ff3e2bad4d062fd3b7e39
0b1f0b5668cc2fcd7f53390a3174ec062b5f378e
1603 F20110404_AAALTA zhang_h_Page_025.txt
93b8a08bfbf3a0d36a13da37e6dfbce1
47e806aa4c324c729b2d4e442d95962e20da04a2
109 F20110404_AAALSN zhang_h_Page_002.txt
7bf46d0b871539ba2ebe3187c6cdcdc5
fb6df063c8d7ce95d85b532aff37577ebf88babb
F20110404_AAALRY zhang_h_Page_110.tif
a01e8796c0ced71500168ced79b53521
0e93b8ae2f675b3976e3796ddc2715b088fc7945
1606 F20110404_AAALTB zhang_h_Page_026.txt
ff2aa11d8e64ca56bcdbf2289ccfbd0b
82f57985158b0ea90631aae1ff35a8abf035884e
44 F20110404_AAALSO zhang_h_Page_003.txt
17351451bff89e04d460db4d5e269ac1
6c8c0a9df9ab954b62d23e891f8e7eee7a6d209d
F20110404_AAALRZ zhang_h_Page_111.tif
971ee560e8e5f145f253172db0110f6b
17238bde3bdb808ad0a2d8fa0513b24d7ed495b6
1618 F20110404_AAALTC zhang_h_Page_027.txt
aa9deba2d6919e1674aaa9a3b7e3af27
0f41a371033de9ccdd5257985c2a41bc874dcbd9
876 F20110404_AAALSP zhang_h_Page_007.txt
9a79c8bfa7ccf791b64cff265096b2e9
97bfbc3f2d41e98e25ebec5e719bdcdd748e0aab
1987 F20110404_AAALTD zhang_h_Page_029.txt
a9b22d3521258889c76f4e75cbf2884e
85db2ba624ebdc53fc80782bae136119b0ae74ce
1247 F20110404_AAALSQ zhang_h_Page_008.txt
d6c39baeae42b0d9b31a5b589fde8208
a1a5ed13d28610bd784909ac8223db6fbe39736b
1716 F20110404_AAALTE zhang_h_Page_030.txt
b789a434ce832ebda946647a9a7ae07e
23273e9ef91737838591b61ea4724b4cc8dd500b
333 F20110404_AAALSR zhang_h_Page_009.txt
740a715937dcc14c790c77752a1e0e27
3fb7984e021dc013b3ee685e130fe88115cdd9c0
F20110404_AAALTF zhang_h_Page_032.txt
3dfadf73fbb40a624dc2de2178c46d1a
58efe50b37cacf79f01ece2106fe4fc2c4c8e2f5
1613 F20110404_AAALSS zhang_h_Page_015.txt
a7344a229812313d90aaef21e0af7ae3
3f5c5d02f38150da80e04e2c165eefd5254e02e6
1997 F20110404_AAALTG zhang_h_Page_033.txt
8dcda9a1f61b5ed57501b6d492e823a0
eaa375c4b2d041e51dfa53b2f64d28e1607a3c41
1546 F20110404_AAALST zhang_h_Page_016.txt
bca664b8cb1d1bba263f05b00bc80f9c
70844187e200e9622a2f62597fb55c1bc16025cf
1583 F20110404_AAALTH zhang_h_Page_034.txt
de7613d0287a4403a1eae3062f3329af
d0ad416563c649d7ccd94459dec578ae8b20136c
1669 F20110404_AAALTI zhang_h_Page_035.txt
68665eee78eef99f106c41e32f1e3c5f
6a9660ef01e12ea4e4692ecde3204fd2646c28a6
1839 F20110404_AAALSU zhang_h_Page_017.txt
672ba8dae31f78d9c491099c7269ef8d
ee98f057e0a32fd9b000fa6f8b5b08d1d1dbe428
1867 F20110404_AAALTJ zhang_h_Page_037.txt
31901fe3233fef94526247488336b383
b65d7395e11228dae578adf29169dd840d865a74
2062 F20110404_AAALSV zhang_h_Page_018.txt
fb161e9a21b1b91c6cf60802e390ec03
3f8ed5c08004f93889d14ddb2d3982defe21abbe
1768 F20110404_AAALTK zhang_h_Page_040.txt
5541e344624ce8ccb78b3337484b8d5a
ee356148cf48bdfba7dc9c625d8b2edea805846b
1597 F20110404_AAALSW zhang_h_Page_020.txt
c7098419dadc3d66fe8aa8140fe9575d
cead94b481c5c70d881cda748a08a1b51d5ae5c1
1714 F20110404_AAALTL zhang_h_Page_041.txt
87eb41d3bb0673bc9e9b6d9bc39f11cc
b1a388fa21482c8b4f3926853f8de6a65c66520d
1750 F20110404_AAALSX zhang_h_Page_021.txt
96ee220a9400703b2e972cfdf695ea29
39368cf66711ecfa2ef5ffd8747cffef21373c69
1561 F20110404_AAALUA zhang_h_Page_072.txt
5b2a6124f7aaca60a26c3a458134f882
e715f163219c6314ed68beef685a5bcb8e801ce3
1608 F20110404_AAALTM zhang_h_Page_045.txt
405c8c8d80972b1ebe40b4cdf0482984
2c0777df6627297f2adf2688c8f2f101e0c99302
1749 F20110404_AAALSY zhang_h_Page_023.txt
5adcad528112545d85088b8813a6a914
dc1d830f02f41862d2288ca5df14464f838d5daf
1500 F20110404_AAALUB zhang_h_Page_074.txt
8d2274aff10097bc43e91a3b37f6b9b7
8f72663f30ea09c2841018aedd90d097fed956ef
1282 F20110404_AAALTN zhang_h_Page_046.txt
523bc576ff5e94f64c3ae5134db93325
f84a5ce739f4ada88c302c89e95363997fa9f682
1854 F20110404_AAALSZ zhang_h_Page_024.txt
46c0573dd9d8f9f1c54325f936759ed8
f6154d80d3f7f11e9162764d56bd45ece193c391
1484 F20110404_AAALUC zhang_h_Page_075.txt
6cde9cd1dae722a0771d483fd5db26eb
bae879684944007ffef59dd73eb2904029fed516
F20110404_AAALTO zhang_h_Page_048.txt
9689398c95759938ce1106efe0f44653
3ed918c5911829615759a77f1f2694cc0928229c
1525 F20110404_AAALUD zhang_h_Page_077.txt
b2f2658098e5ce6e9b18c177d54862c6
e382092380dd3c4ae9a039e4e18d21ca7f2bdc6c
1494 F20110404_AAALTP zhang_h_Page_050.txt
8a3f1469b80ea4dd2ec5cdbd972d1802
3851ceef68b8a03c1984af4552fe293a7b060b6c
1991 F20110404_AAALUE zhang_h_Page_078.txt
9b4a8faae0564c5c378412cc78f3779d
f3b49a7a599ccb582592617bae23fdebb5d36aae
2159 F20110404_AAALTQ zhang_h_Page_052.txt
1e3b74887ce3a9210517d905d1c9c150
4f873e0595a5fc466d15389e5ebfeaa7f80fdd4a
1393 F20110404_AAALUF zhang_h_Page_079.txt
3bf4a1048dcc861a7e05d65140031fbd
f57f67dac3bdee23abff164be8bd88e9e92d6042
1662 F20110404_AAALTR zhang_h_Page_053.txt
bc809fe7525196741a3a20c1092672e9
12516a13683e12f10c534b5738ba31a191195b1d
51281 F20110404_AAAMAA zhang_h_Page_112.jpg
11bc6593cc9081bc70db824aa6320fdc
0a6fb132e9f7be1f7d9abc00a0811c77a7d70ee4
1617 F20110404_AAALUG zhang_h_Page_081.txt
b42525622acc96749b4bdc884c615288
1e16a13925caa8247b79a8ada905a594dfe0d656
1875 F20110404_AAALTS zhang_h_Page_055.txt
d3b65cfe599a256d627603e2fcd6456c
4ddf091aadacb3c0c1a9048d81af7adeab166059
2217 F20110404_AAALUH zhang_h_Page_085.txt
4e9cc9297304fb6ff6717d0530dd6f68
b920c43d51dd0321864aad11aa7d930b2044a71c
1963 F20110404_AAALTT zhang_h_Page_056.txt
ff5b144b747906cc7acf3688fe03085b
6922850205ca20b23e335fa431ff66cace36f030
100034 F20110404_AAAMAB zhang_h_Page_114.jpg
0471611c8cbb4e979aa55c18fa4b19ae
b06b96458b0e1d8f7dd381ac592c763d612737ce
2049 F20110404_AAALUI zhang_h_Page_086.txt
0879e2ccf48f0e73bc19a2a1be7cb759
754a5299faaae6e87d81144e4260a73d471f7aea
2017 F20110404_AAALTU zhang_h_Page_059.txt
c5af1f2cf30e3678a7452fd5277de5bd
9ce2df4c5369efbdf5780cf81f0e05419db372f2
74731 F20110404_AAAMAC zhang_h_Page_115.jpg
15cbb9992d9708fea593157ef18c638b
3b972020b3bedc6245fd0906bbc660364881d37c
706 F20110404_AAALUJ zhang_h_Page_087.txt
38bf20c81168cbe4d1132de5bb105987
454377ace164c86d313e7bae09464cba86c10c29
64843 F20110404_AAAMAD zhang_h_Page_116.jpg
f760d3403144628d749b43e1765fe356
4320113e00d7a2fce0e39c6b32e1b0f33d54ec0c
1198 F20110404_AAALUK zhang_h_Page_091.txt
a05d7d58ac5232640c5680953603a932
01823ed60c88deb7d4c1041dbc77a1e48ae57ed1
1638 F20110404_AAALTV zhang_h_Page_060.txt
d5017c8fe3f83895be4a19f82f4ab4dc
c74baff21a9541dbaa225451d51e5571906b7216
81833 F20110404_AAAMAE zhang_h_Page_117.jpg
ee56296134e633ff3ed70c822d1bc1e9
42e94b750fd6844bbf64bfd23660706cfc5cad22
1645 F20110404_AAALUL zhang_h_Page_092.txt
cd6cb87b45e560fc4e5886822d5ccd80
60500c0e03ff73b1635da3a664514e539c9e9c3f
1772 F20110404_AAALTW zhang_h_Page_062.txt
43391b8327ebd89ca0c7967916eff2f6
94a3c4948336e732ad5814b8d7dff9de0ac2c60f
94783 F20110404_AAAMAF zhang_h_Page_118.jpg
709898e8f8f15699c7a83bad69ca8c7b
3f65def5d64617887434afa4498ceff74f1d11ef
1761 F20110404_AAALVA zhang_h_Page_120.txt
71710fa69e09dcd139a52635f39bde53
db67c1c24e3174a5e15d8589a728185bed3158c7
1865 F20110404_AAALUM zhang_h_Page_094.txt
691099806d58efd15b1cb275c5a92c0b
eff9f35862ed5ec83a6be4ff7f9286eef470220e
1799 F20110404_AAALTX zhang_h_Page_065.txt
b960157dfd63aac0fa7753c7ceae1fd8
781540de257d3df08063d55508f5778b1372fa10
76865 F20110404_AAAMAG zhang_h_Page_119.jpg
1c23e0c35d3996dfe793ce7aa0062c87
fe868bfe1332662b2f42b6a5e22b482abfc68114
1241 F20110404_AAALVB zhang_h_Page_125.txt
1908e98e4643253d7e6e3a12b9cf39bd
76ba17e69ce6e9c7c1f33667a8bf8b64b4b33088
1723 F20110404_AAALUN zhang_h_Page_097.txt
23b58462a948db68f791aefda58d8ada
606fc2330e0f8a9b60cf2e0c110cf6de19f7107e
1453 F20110404_AAALTY zhang_h_Page_066.txt
e9c7f1f3b0c748de59351734edeb4c4e
bfe38905aa14551aaa7d6dc43a88f54f1cb6ab0d
64283 F20110404_AAAMAH zhang_h_Page_121.jpg
cd132dc19abc2207984f61a0240454e5
71060800441df2bb9a84b7a19603b473629ad3b4
2254 F20110404_AAALVC zhang_h_Page_127.txt
433a0e6c40061acc679e8148dc29fc68
fed6316be368b9fa177f11058c2964c0d9b9a9ca
1823 F20110404_AAALUO zhang_h_Page_099.txt
1919e5d0feb0d3927e72086df6b98c48
f721a473c2214593afb14b1460ebe70db0ff4646
1278 F20110404_AAALTZ zhang_h_Page_069.txt
e4fdcdad5bb0b6dcc41f29070e4702bf
728caf8efb3749063564374b183d046def801dfa
62578 F20110404_AAAMAI zhang_h_Page_125.jpg
540895cd2dbbde11379fa8b1e965d99e
549e01a6c0e57e83fbab09b8f18a49ccaa74992a
2097 F20110404_AAALVD zhang_h_Page_128.txt
f9d1afc4a526497899a1defd9c7d2b96
153184ef08f80dd3b8c700aba7cb90137d9d787b
1141 F20110404_AAALUP zhang_h_Page_100.txt
7bdcb4fbe22a8e2aa77d9c917a0d607b
cf206a07785e25c8a2d45f5f3f24ba9b51edf03a
109540 F20110404_AAAMAJ zhang_h_Page_127.jpg
c5f2768d50182d415f540d7623c2c0ec
c2e0f434c1a39e4379399a1ad1f129e2f1d76896
2477 F20110404_AAALVE zhang_h_Page_129.txt
b15a1a6df572f8aaac66f7e07be30885
138176b346244d7f35646e565b0d899489eb5d77
1718 F20110404_AAALUQ zhang_h_Page_104.txt
d18d67d6597f9abf8e9dd37be325306e
da156e858d157f7ee37ca8a3e389637e8e30876d
120504 F20110404_AAAMAK zhang_h_Page_129.jpg
9661aac620ea328b3fc5db2ad2d4d7ff
83db47f76b728f8e5ab428012ddcaf63b5de1639
2232 F20110404_AAALVF zhang_h_Page_131.txt
0617eda43df6cba3ea9b83af3dc1afc1
d57598c1f8f2773acaffbc7bc76ba0c008310fa9
1851 F20110404_AAALUR zhang_h_Page_105.txt
6ba71002e2374ad0e494088894572515
d05114b5f5da69bef30244ae3e22ef3c7587b7d1
895566 F20110404_AAAMBA zhang_h_Page_017.jp2
911b401269f4c6fc7ea9736a51aaa010
1ad0ff201826fa0fd7f25c4853c139bfa4aad38a
108208 F20110404_AAAMAL zhang_h_Page_131.jpg
67f7ef272cf05e89f59076a26d39e2dc
4c775dc63a0a30f1279c2536b5292486310489a7
2369 F20110404_AAALVG zhang_h_Page_132.txt
1ffe39e1e19879e8096ded7c3a1c1553
c9dab7f6e2b55e5a6b3047e0fe56778537f2eaad
1404 F20110404_AAALUS zhang_h_Page_106.txt
ad7a9c96e120e17f484c87ac05da645b
fd2a53b8669530f77b35408debcba30e5a882627
651514 F20110404_AAAMBB zhang_h_Page_019.jp2
54a7520d2502aa25fbee3c7b29801338
81b514480ed2da04d5483afe332ea55c7897e826
114469 F20110404_AAAMAM zhang_h_Page_133.jpg
52c6ea12a4f75650ae8c01f3dd337713
4bc0ce2613bac9e87f1ca42556f89dec11f2ddca
1892 F20110404_AAALVH zhang_h_Page_135.txt
239bdb1c1601cd73ddf0469761d0b264
4671b5a7e979bd042c17ba39d530c02b325f2c95
1612 F20110404_AAALUT zhang_h_Page_110.txt
1b3e9b0b7f945348da6ea60590ab8497
837e826412d1fc49e62fe7f5a7247ed2b8903543
32250 F20110404_AAAMAN zhang_h_Page_136.jpg
f891050106a1b67897dad942f936a0ba
ffa8c962add1b9a29591bfeb81196c5165c3f638
1224 F20110404_AAALVI zhang_h_Page_140.txt
55ca5a56e8c8f8b96b4ce45e5871a031
fda35a2640e1e1553b7ebf1836c66c5975a9c916
1463 F20110404_AAALUU zhang_h_Page_112.txt
3ffce455a75d5163b61c74b7a59965bb
203dc8e3b667032464945dadd8c38847c93f6e49
792539 F20110404_AAAMBC zhang_h_Page_020.jp2
d5da7d1488dfdd6376794eadbbf1e57a
2c09762df6d79c41aa4c1ea1ee92c2a734abedaa
17314 F20110404_AAAMAO zhang_h_Page_138.jpg
457860aebc653cb43e6fd830178cc778
488700ea86983dbdc9249ce70b942e1a159e47f2
1147 F20110404_AAALVJ zhang_h_Page_002.pro
ef3bb3f56f4c053d550f5a7a3c3b6441
1e036071280de31b66ae0b89bb346d3e8272c190
1512 F20110404_AAALUV zhang_h_Page_113.txt
10368f14b6b8d9b036cce29038de0332
4db5c33984e3181b18fb50edf58aae232321db0b
736640 F20110404_AAAMBD zhang_h_Page_021.jp2
2f9f6d35016e57f9e66f31b9cd311cf4
2f71ffbaf0ab93c2b337a6497bb2b688afc45b63
19821 F20110404_AAAMAP zhang_h_Page_139.jpg
6b20f5fd2b757868c4afa83c24c4688a
5c88c48dd07d00e5c586581cc0aced6ad5c5fd53
46367 F20110404_AAALVK zhang_h_Page_005.pro
c526d91a53d7198dbe1fece536ff831d
01cf55d3ec98df778a5525825e3fd8215802e366
521401 F20110404_AAAMBE zhang_h_Page_026.jp2
fcb378412bdbcf65d9c3238d04f5507b
4bef39768b6b6c6df7d0ad2a5b9d221a0d6a6079
64378 F20110404_AAAMAQ zhang_h_Page_140.jpg
65faaa72d7d2dd898cbd9b2e0ed94c82
94cd030b371720e00df6b347afda3ddc7f2bd8e8
15499 F20110404_AAALVL zhang_h_Page_006.pro
f8c2e68751977579e6a881ed64dc83e2
c71a41307a0f469287b26275dac7de6d15c2fef9
2093 F20110404_AAALUW zhang_h_Page_114.txt
970f35e8114312e2ed3b6fd1314c4eb1
aa6802ad4a89a13499970f6ba3575b3d41c8c242
566322 F20110404_AAAMBF zhang_h_Page_027.jp2
12a36516679776f15eea6294b6b85bd7
652a02a03b73a0f243667efdd037c66d05f89b71
28010 F20110404_AAAMAR zhang_h_Page_002.jp2
439e76cf910d6e709106f2f4c3548dae
a53c368a54426b7f1f3dd5a31c007911abe388cd
19326 F20110404_AAALVM zhang_h_Page_007.pro
55e5f0158c783d5ac39baddb77d4828c
13bcba9b81b5e219a8570355aee97bfddd3af15f
1537 F20110404_AAALUX zhang_h_Page_116.txt
075266bf67bb6c4a57dc1ceef6fad56d
bd2c63338ddf022f397d0f75a6913024d00a0579
443317 F20110404_AAAMBG zhang_h_Page_028.jp2
37ced1cfb69dc75c78375d11e8b31513
fc91e44ee39419a63e4be228201a67e5ab1f0f32
48970 F20110404_AAALWA zhang_h_Page_031.pro
62c2df49ae1fd9cb5dbd4d64a0e1117b
a7030f0f816a7cd678f206c5df90a7acf3983faa
12619 F20110404_AAAMAS zhang_h_Page_003.jp2
7ee326405b3dd980af89673414564c6c
1f9800c478bdaebc055b6ecb45799022da494ea9
27204 F20110404_AAALVN zhang_h_Page_008.pro
930eef322423ca564132d4740b2bd67e
6e784666999d488aed33e64c115ce2e7103bfac5
1602 F20110404_AAALUY zhang_h_Page_117.txt
fc02139ecd0a005cccd76514fa93fb7d
5bc1c569074059b1c8e58257cd613bb443d988d6
1051963 F20110404_AAAMBH zhang_h_Page_033.jp2
f65716575cd03341f83abbaa11acee2c
209452fb4eadd3d2e183682170812cf7abe2f71c
47112 F20110404_AAALWB zhang_h_Page_032.pro
b6e04bc5955b5a1bb29657399a5b2fbc
cf4ebe1dc79566fc3754127cf842c3d9dafc42ba
708577 F20110404_AAAMAT zhang_h_Page_005.jp2
67e4601adb83a27a9728909fe79226e9
7d17cbace42fc2a5131ee4eb5e3ce30467fc4063
7696 F20110404_AAALVO zhang_h_Page_009.pro
3e55289d51ed0f70a28a3fafe9a1c75e
4f4111cd19c7b2754824839b04cafcc3be561475
1891 F20110404_AAALUZ zhang_h_Page_118.txt
dec6df0b6e28c801dc14e9ca4c694e48
4d038b49a3e72ac93ebc9ef19ced92f4c59d0bea
722845 F20110404_AAAMBI zhang_h_Page_034.jp2
dd61cdc5cc133f42efb7b614707ccdcd
9496d318814bc56d15c294f3c2294e74dc7733a7
50095 F20110404_AAALWC zhang_h_Page_033.pro
9579b22aab1d4419ee397c8e527c5404
dba30b852f4c233ea72e7667e7b849bf3140fd1f
248080 F20110404_AAAMAU zhang_h_Page_006.jp2
76462f82c85d7250d18c773298d321dd
c1878bfa51bd415169a0e2a06968ef472c3d3aa6
55189 F20110404_AAALVP zhang_h_Page_012.pro
34b43c6b2248a11fd46586328ccbf2df
8f51e79224d2bc6da97e4fc0216b541d829185af
738372 F20110404_AAAMBJ zhang_h_Page_035.jp2
cc10049be5c6d74306c041a60b1fd0bc
ac91d94c1ea4f92a6a4c4739960fe88fdb60e25a
35338 F20110404_AAALWD zhang_h_Page_035.pro
fab9c7e008207b61985f0a3cb43315cf
a46213ccde38a3d843a55a51be9c3a219afce220
301825 F20110404_AAAMAV zhang_h_Page_007.jp2
dcf88cc2a0587ba973a9413c4ce325b1
7fca4cc3d417942c7afcf34f1c977e859d4ba4d2
42831 F20110404_AAALVQ zhang_h_Page_013.pro
a5241f2efae2ba915d25200121471b3d
8edfd573265acaad0130f12a0b21a09e40cd6172
1016380 F20110404_AAAMBK zhang_h_Page_037.jp2
6b8129fc4340169367a92ea3c23b4c96
e2e25fea696990678f5c1948ab9364ea58cf7f4f
33110 F20110404_AAALWE zhang_h_Page_041.pro
f0f6d04252f1a64b15a8c78bfc5ff429
d70f86a96fb10a5f87a10491059c3fc1b836bff7
426575 F20110404_AAAMAW zhang_h_Page_008.jp2
60a5e1d84b7fa6045f996ce664032210
eb8b02d898efabd9ba60845719c02d04030d0521
843225 F20110404_AAAMCA zhang_h_Page_071.jp2
bae247cc7baf27591d7c5ebdea1aacba
9d4eebf99f1bf588df414b406aede8c12cd6acc0
36374 F20110404_AAALVR zhang_h_Page_015.pro
3fbf0ab5622de699f7810c2ba887ba2a
63aeb8483e996419dd17862f7c3c1f569e5e3777
1044152 F20110404_AAAMBL zhang_h_Page_038.jp2
8484d93ce0c351033d942f3e477eafed
068114746183987f25073a731af5bbbcec59abf8
30670 F20110404_AAALWF zhang_h_Page_043.pro
be615fd0901ed4970a1cc2d58b455985
f252d2b12532fd22f530986cd35c06f6a51b0dca
190666 F20110404_AAAMAX zhang_h_Page_009.jp2
acb070ade4d1af4d72c5f67adc3c490f
232a6d513e117290285e803fb5bd6c7feb04153f
729089 F20110404_AAAMCB zhang_h_Page_072.jp2
07744dae87ab08909e919e87c482672d
7de0074521f67e095fa0d812028bf383e29c0c63
41277 F20110404_AAALVS zhang_h_Page_017.pro
0f282c025bca89ddec626525b3a53b48
3b110fb8a7e8d1e9e086ca0f748e96b054469b86
729230 F20110404_AAAMBM zhang_h_Page_042.jp2
e9117bdf8fe0c3000e265876d4a1b1f3
6defd3a055d2e3ec245ac15bfc9bc8ab400b9c8b
22301 F20110404_AAALWG zhang_h_Page_044.pro
9a509aa9fe9d10ef055eececb6125be3
21c117901be5f2c527b6ac7553e61dc88013273e
897207 F20110404_AAAMAY zhang_h_Page_014.jp2
7b51289f6fc93001573a1e8fd8da54c8
f13e233fbafbb97551bae6168acb2abffaa42d8f
681850 F20110404_AAAMCC zhang_h_Page_074.jp2
4d1057b98c668128d2f2e61f27a0211f
31b0bbe80b5e28f2d2302cafea5b031d210a2af7
30176 F20110404_AAALVT zhang_h_Page_019.pro
57c1fedfb11fa3e4a56ac4ec0ecfaeca
94775ca4e3fe9af538fb802657fd163ddcbcd60b
553118 F20110404_AAAMBN zhang_h_Page_047.jp2
468952ef7234caedeec12541c553a76e
f1a330d0b6bde07c903db9082c3e26c390ca7886
32531 F20110404_AAALWH zhang_h_Page_048.pro
93a04a19c1a6748c0fc75a7670cd45ed
ca49bda999edd17912fa00aabccdd4a55afb6684
796922 F20110404_AAAMAZ zhang_h_Page_016.jp2
bc7f7f12e0c5cf3b6ac27bd66f563fa4
7e5a2999ce21920be604e900a707849b7e2f44a8
36846 F20110404_AAALVU zhang_h_Page_020.pro
513200b9781f2a6634389350a1046ef9
6b499b433d469508826092f4d3d1b97a4ced430c
602075 F20110404_AAAMBO zhang_h_Page_050.jp2
2101e5c6188f7c9fca29bc7fe9cd2b14
c84245321f741c961b09e7b41d26cfdb4ecc9602
28415 F20110404_AAALWI zhang_h_Page_050.pro
61e67fc19d0238b94a21a7ffbedbf132
ea7621a1093291edfc91c4c74782d6b08c77c3e7
628842 F20110404_AAAMCD zhang_h_Page_075.jp2
c62d270560b3f3c873e0ee04eba2d94a
719b0398cb781a63b5a650f2491aefcdee53ba8c
1051965 F20110404_AAAMBP zhang_h_Page_052.jp2
521d2a1aa04d8c953f8c20d4b0c60fc3
787242eddb6b1228e257227b2e120ddbb305d4a0
37887 F20110404_AAALWJ zhang_h_Page_053.pro
2275f98ff6f966cb49ab0c48998f79aa
b658dc919c9d71abea9e23fb9606ce46df0172ce
35044 F20110404_AAALVV zhang_h_Page_021.pro
fdc77342da411fab86e2ac9c3a6a6be5
e07439a5e35372bf61fa58027d26355cb6ebb563
588122 F20110404_AAAMCE zhang_h_Page_076.jp2
83fabb3b85038e8c833e9dd53c4ac447
c709fd4bbfa2cd137d5db89ac68ce93030a7bb50
1051974 F20110404_AAAMBQ zhang_h_Page_054.jp2
3de21a29d421a0f65cfaa33c0837919a
5f419202abbee234159ecc243d7abd0142f53854
44892 F20110404_AAALWK zhang_h_Page_058.pro
d3e177dba5e5ec0a451511838763a02d
e64d6fc532e00a16dafca7f3c7219fa7f5389f29
37753 F20110404_AAALVW zhang_h_Page_024.pro
76dffd1f19d0c30c88f942038d8c7561
d80cc2b8974a22a425a0e4dfbf764bf129c1ed8f
721832 F20110404_AAAMCF zhang_h_Page_078.jp2
6e6ca81698c31334d6828782e5b5c61a
46d4155066913ccc28cd019e379a028fc66953e1
756377 F20110404_AAAMBR zhang_h_Page_057.jp2
8455d572f4376c53791dcdc2b50038eb
08c3588b8abf9b8a20adac9377d17b2202f86a67
38694 F20110404_AAALWL zhang_h_Page_063.pro
0d396d16604e42697fd1f7fc993e7565
52c085322f3989ade9363547ee1ebd60632f72a9
695568 F20110404_AAAMCG zhang_h_Page_079.jp2
4f2029a25a8b55eed17bb32d4ec14aea
0b78d9a9002af1bd6c6f2ff712dfd7a162ca57fd
32762 F20110404_AAALXA zhang_h_Page_095.pro
45c23d8f4d2d1907f87a2304b417cc55
269943175cf7e20d1e080ef9c25c57d79c7cbdfe
755074 F20110404_AAAMBS zhang_h_Page_060.jp2
4ecd87b81947ca2e9f40d9d94039f087
9b73ac8d4750e74ad75b2a29178987aa8ea1a18d
38629 F20110404_AAALWM zhang_h_Page_065.pro
a4ba3d11cbc3857dd69095fa47df58ca
ab232989ff6144e3b6b96f310cdca8d08af9d1f7
26316 F20110404_AAALVX zhang_h_Page_026.pro
a7adc290ce12daea6e32d30cf29267c6
f925c5e37f8d5a4047fdbfd4a1638de53eac8d6c
388540 F20110404_AAAMCH zhang_h_Page_080.jp2
eee9f2b9fd645faf468475b559c2149d
36d8270bcc6077485b869bcc8af811bc10e9ddde
29220 F20110404_AAALXB zhang_h_Page_096.pro
0a26d2a237e3b76c1258cf90f2fbceb8
8eb85d3f1e6dd44b3df8771c745505508746a443
1051967 F20110404_AAAMBT zhang_h_Page_061.jp2
70b54ff9829b32bc1464e17ec7b9e110
d29f888243fd723bd7ffdbd888ed6c91e25d338c
32001 F20110404_AAALWN zhang_h_Page_066.pro
6f1947efa1d0729149b41cfd6468839a
68e485201273a8e558984fa1e90c7c03289430f8
21642 F20110404_AAALVY zhang_h_Page_028.pro
7e7051f2e17325f79d34e846bc391479
6ee350af3ef38f956a7c981eb768b608733218df
653926 F20110404_AAAMCI zhang_h_Page_081.jp2
6b0f22c5eb8a750aa53fcffa2d153eb8
1813097c838cfeb4ea7351eae554c9deb9ecdb95
38644 F20110404_AAALXC zhang_h_Page_097.pro
39b159280243e116d96cbe98eebc6308
a46de104db239a91bd869e43ede40c68732b6a8c
832805 F20110404_AAAMBU zhang_h_Page_063.jp2
805ddd6ed0d2bc54682c78ee144aceef
a404900214e0f5b0dc8cdc24fa1f51658a7da8fe
30083 F20110404_AAALWO zhang_h_Page_068.pro
82f1af2c65e756256dc678cf5ff1f63a
fe557b071d35c231d10fdd50a005d20ddfb2776f
38961 F20110404_AAALVZ zhang_h_Page_030.pro
68d84194cc852855fe9e8911b2a6120b
2d078f52751bff2545362337bc22136f5ca44457
853300 F20110404_AAAMCJ zhang_h_Page_083.jp2
e25adf270d5e7fbd607eecf521395282
d95d7714b1b1bc9bb2c830fb3467f9df1dbf04b9
42594 F20110404_AAALXD zhang_h_Page_098.pro
2de8e20bc534fa772fdd44527f6092ac
1ad69d33ed5977484461bda48b3919a61cd6b59c
962479 F20110404_AAAMBV zhang_h_Page_064.jp2
5be0d0630db36caf988c945fd2047749
28a8616e46da878e436475f4af5789263554e05b
39132 F20110404_AAALWP zhang_h_Page_071.pro
6b0d1eb4e0844fd1dfaea8513fa81e79
b7824e1e4df55b4371f00ba176b7b8d156dc2327
872104 F20110404_AAAMCK zhang_h_Page_084.jp2
f637282ba6b14bf296b037a8020b52ec
3fc238befd2a58f0826cca7383394c0aff541b34
837694 F20110404_AAAMBW zhang_h_Page_065.jp2
a86de7cbec0e36bd2bc70386cf4c6ea8
57bd7d82fc60b3906eda594be1554468705497a5
43116 F20110404_AAALXE zhang_h_Page_099.pro
a8b696859945741c0104c9c55684c79e
751c245470ca56f26dcb5e2ce04355cec79f2193
F20110404_AAALWQ zhang_h_Page_074.pro
6ab54bb99c43fb82b2dcb268b053f5e6
cd156ada9396619961613f2e7e599ec4f69b8505
882202 F20110404_AAAMDA zhang_h_Page_117.jp2
debc1700760fedbc508341a07db85765
7dba56522061fd5b39c77e1955acb67c4ad3aa86
1051979 F20110404_AAAMCL zhang_h_Page_086.jp2
78cc5862a6a77ad9ed6848aeece6c165
a35f9ca9c946abc35539cab4e38d125d07c46378
701624 F20110404_AAAMBX zhang_h_Page_066.jp2
d0d2902e659c952aa0cc007889c535bd
85e5c547f948bfc9517301f14461381dcdb97f7a
24227 F20110404_AAALXF zhang_h_Page_100.pro
1d91c37c01e9955b49c10c89f5bde590
68b1e0948cf5f8f97a0c8beb04453cf620c6f180
29407 F20110404_AAALWR zhang_h_Page_075.pro
ba25695c350574de26e55a1b86a85237
cbf863d551ca92e93bb0e4605dfdc118a99265b9
1047591 F20110404_AAAMDB zhang_h_Page_118.jp2
da96e1bbbc9cd9149bae60ad4be47ca5
dda9baea1f397f6aa54120f87184f4972f4bb0a6
827139 F20110404_AAAMCM zhang_h_Page_089.jp2
dd8d135c08380e371e25849b36f6a040
9fe4c6384d483c481643873ab31c18e9b61964aa
612892 F20110404_AAAMBY zhang_h_Page_067.jp2
d352d4e3f342b861598a24eef3a2d49b
b5764156bbd5ab3d2436efc36d79384457352e7d
39118 F20110404_AAALXG zhang_h_Page_105.pro
cff67f4bd7390661d02dbc3e4cea4cff
64003e52ada2bdf774e0f447de8b2f2fe8d43597
28528 F20110404_AAALWS zhang_h_Page_076.pro
8b4b45928d87a2fc81ca99118695e624
c20405118c591fdf4cb8be2ee574081839e66dca
784858 F20110404_AAAMDC zhang_h_Page_120.jp2
b981bcd482e7ca78e596656e1b151f07
1e340eacf532939db89f3c2c353135fb2c177a63
747717 F20110404_AAAMCN zhang_h_Page_092.jp2
5026226f21eb15a79d9e5f8b627bcd79
d3cb3b39b0d88b8b716eec565ef03cc75961b93c
630896 F20110404_AAAMBZ zhang_h_Page_068.jp2
0918c844eb7092ab606a66347a344e49
77d2158e4e29585bd4acedc5d2226d7649a9849e
26607 F20110404_AAALXH zhang_h_Page_107.pro
21e3dc70935b3f179d8ef6bf2d811ceb
0831b802e7421dd72e82e0d15af32f9e7dd0e90e
37929 F20110404_AAALWT zhang_h_Page_078.pro
4a28cb0454563940dfbcbe6581b605ba
14af84d413fc9c3b3b4b89aa85cc8af3506aa927
F20110404_AAALAA zhang_h_Page_137.tif
a116c8b4dfe506a900fa32cc5f6843ed
c3489a027849054eadda2b26ec718f3bc0e6b6eb
695841 F20110404_AAAMDD zhang_h_Page_123.jp2
442942993102f2e7a0d7a92b2a511032
57d4a6ed4c5a0f00ab8a163960255f5f8e78633d
604096 F20110404_AAAMCO zhang_h_Page_096.jp2
5983860cfc65b8166cd7c98951367da3
45272394c1453d6d96c8e36895e79c8b2585b3fe
43844 F20110404_AAALXI zhang_h_Page_108.pro
36bbcfac5eb75a54a1cbc4f900812e4c
7106be493c38c7595ceadeffb3c4fb7b488bd15b
31305 F20110404_AAALWU zhang_h_Page_079.pro
98134fa9701cbeecf16cb66d351e5e6c
f3720b7c6b9202314622029efac0167e8aa442da
29426 F20110404_AAALAB zhang_h_Page_038.QC.jpg
1108bbd7564776e1435e1a7bd3759f39
0ff9ff56effbe86ac57a95ec83020d76f3ea2412
847918 F20110404_AAAMCP zhang_h_Page_097.jp2
197fab0d47cc551a6d6a11c46208cd7c
fe66bbac996cea5b6ae9e3a8f6d7a22fb7a9a27c
50867 F20110404_AAALXJ zhang_h_Page_114.pro
e73347ab77979c112d5c632aa2936fb4
bbc1591a5d535ff1fd53edbc2bf9db784d1c47e1
38668 F20110404_AAALWV zhang_h_Page_083.pro
d5c6b1681c3d604bae60f732e60f2832
35a12f5355b6d85cfa3d33ac6aa1352a9c2797e9
692283 F20110404_AAAMDE zhang_h_Page_125.jp2
6227acff8beee6609e2f6f3e7b34902f
fea027b0fd12aafc0adbd744c862013b5644692e
774733 F20110404_AAAMCQ zhang_h_Page_101.jp2
fe15f1963b58a6ded9989b190abd3029
ca107e13cc41127ff0dd20dda3e16ca75166c2d3
38654 F20110404_AAALXK zhang_h_Page_115.pro
9a717a7b1906b680910f82e6d7a5e09b
e95bb4e73126fed5ccd6f487473b17a0a18defe7
56345 F20110404_AAALWW zhang_h_Page_085.pro
ccdbd7d25a81e14ee9903a952b84f069
22923bcd989beec74576d14a1dafe34ef5851acb
F20110404_AAALAC zhang_h_Page_011.tif
d1258672b7857c97324a383eac20f740
3dd929c5ea13f07b94a8e6ab7fff0ebff4e96f39
1051923 F20110404_AAAMDF zhang_h_Page_126.jp2
4a34aa1ff747652a50379e697c4e160d
ffb8a8841319c811edd8c96f01464683e7962269
1011437 F20110404_AAAMCR zhang_h_Page_103.jp2
26facc6f5c19b340280c026b74e20424
f3d9089abc452dce7635f3ac83deee4ea645a335
46936 F20110404_AAALXL zhang_h_Page_118.pro
c4c2b93dca5cdba18303174720945ad4
3f5b5209f7c677d5f857b3b5dc2a744f16706be1
51945 F20110404_AAALWX zhang_h_Page_086.pro
518f4407562867531603d15512ac1fe9
523aaafec88552d3f147f9ae6a33fd07a00d57df
9814 F20110404_AAALAD zhang_h_Page_007.QC.jpg
37e89b9e93db429ae3f8f5579b16caff
a9038361a511ec31e83075531bf9dfd7defd73ac
1051986 F20110404_AAAMDG zhang_h_Page_130.jp2
d02b803d95ed29fca316d89eac049265
a04c27f47580bcd2649db711a5a2c41e5a88b159
777170 F20110404_AAAMCS zhang_h_Page_104.jp2
42d1a2a3744d9ac253ffe21348a937ab
9a09975fd67f40e4fcf5cb8b49e21cf47dae1d23
37979 F20110404_AAALXM zhang_h_Page_119.pro
512dc9262bbf614f94072d07248cdac3
acd6880716a7c64c158cfaf7b64f40f6b24af064
31061 F20110404_AAALAE zhang_h_Page_125.pro
4935c6637dd30fdd26f260c85aeff600
bbe99259b51cddc052e779eb7dc5c1a0aded04cd
63054 F20110404_AAALYA zhang_h_Page_010.jpg
084c392e67fec22332ef4f03e2295453
97f4e423ac0f9da298935badc1f45cc44e6f57cb
F20110404_AAAMDH zhang_h_Page_131.jp2
211f286eca9635cff8957bb8af447f5b
757236a7efbcc08d77b25762ab86ca9a41b53405
800433 F20110404_AAAMCT zhang_h_Page_105.jp2
65ef3d4394bf653735e3498ab95519bb
0ad494c501d93c898766db8a0aa2b66ef165fcba
31119 F20110404_AAALXN zhang_h_Page_121.pro
c834a3df198e03708b2e8aff4a9f043a
6b58f16d1168ad118a08c9a6bbcfa3781312e61e
31802 F20110404_AAALWY zhang_h_Page_093.pro
67ee5f19b3c5277ab935f3c5bf526fcd
99791fbc889c5168d5f2d49f48f71be61a270334
F20110404_AAALAF zhang_h_Page_061.tif
ba0d9196f508b33239270cad64e7d957
808ab794f7b0e233d69c6ec68819e2c3de8079d4
85477 F20110404_AAALYB zhang_h_Page_014.jpg
66335fd1ba9d88c9637703fac008fa01
1c31ff7bd59a13f056835eb8c26bc3ec5f86f6d5
F20110404_AAAMDI zhang_h_Page_132.jp2
f1ca901a240b5eb0f52766cebdb40217
04ca551942df910a1234a99147c3fe7ca5ebb926
562614 F20110404_AAAMCU zhang_h_Page_106.jp2
23d792a25287ec395f68e70b456fa990
921701c012621421fa7ab954fcc062737534d3ca
27212 F20110404_AAALXO zhang_h_Page_122.pro
896dde327bbb0d18bea2d6757a8ab3f0
bb1b29a1084ac81c74715549de2501beaf613fb7
43216 F20110404_AAALWZ zhang_h_Page_094.pro
de256276079ca461255050070cfeee7a
5f8a30f446d283be5fc7951dde4214b91b8d0fd1
61766 F20110404_AAALAG zhang_h_Page_129.pro
f927f5c96ffdd1eea108666f68b2ea8e
2f9d717b01c4aec33db914bef77def2eae9e0d94
72532 F20110404_AAALYC zhang_h_Page_016.jpg
d58c84068d30c6031abd76644437d0a9
3b148de32008f00b57e4b4868e8ba65283a60574
1051977 F20110404_AAAMDJ zhang_h_Page_134.jp2
7e98f16532c4b9a9f8a7313c96587b8e
0953bc01c5dffb0eee03341ad838d2a96c6fc6e8
919872 F20110404_AAAMCV zhang_h_Page_108.jp2
709329bce137c64c15b2dd5acd2fc9c5
c1fb1f2799c3341d323b68f219830672e0d2e535
50801 F20110404_AAALXP zhang_h_Page_124.pro
4d6d1fd77688a77faef8542e6edc4485
c67fefd7832460bb51dfc1852d1b45756fbd7ba4
26272 F20110404_AAALAH zhang_h_Page_069.pro
bb3f6f91d107d21ffe0d27223a4d10e4
0aec9aa57e8753f6dd5569f1f938daa8512c4c1c
96104 F20110404_AAALYD zhang_h_Page_018.jpg
eacc59e4017469c3787efdc2dbdac873
723c09d184cd563dc8f2364cc5d1a6bfd67f5748
662748 F20110404_AAAMDK zhang_h_Page_137.jp2
0c408f5885414e8b253f6ae0b3687793
ec3d7b898c9d79daffd636be423bf1f2ac1bc635
667605 F20110404_AAAMCW zhang_h_Page_109.jp2
98deed6a7b8be17a753107d109cbbfb2
a8ccc3d935213b0ce190ae0c1b80d506d8d27bbe
51859 F20110404_AAALXQ zhang_h_Page_128.pro
24e84c5b8ca1562e13de4c3732d82509
076a231a2efdc15d1d277692edefeff0c3a8abb1
F20110404_AAALAI zhang_h_Page_043.txt
3efd8a082de0a874e02fd9bceb21a774
8474ca95c99d72fd22e770550654b9f57c6c4da3
61041 F20110404_AAALYE zhang_h_Page_019.jpg
221586b2198e8c74448c65fc796365af
889d39a3b5f6f868277a8609a46d25427f39cfaa
5768 F20110404_AAAMEA zhang_h_Page_092thm.jpg
a0e52d7936a491785c2d4b2158d201ab
a89aa57d507dc265fa4c580aeb79e067f72168fe
706549 F20110404_AAAMDL zhang_h_Page_140.jp2
07879b19fa6b7c77500758ed34843cbe
d82458ec4f9c3a7e82bf63daf0758c1b2246ba09
688088 F20110404_AAAMCX zhang_h_Page_110.jp2
deb0a2a74dee3062b037ffca0d9397fb
e22ab3d8b9a321d63e0063aba7d9795e3b422941
54534 F20110404_AAALXR zhang_h_Page_130.pro
d083978c5c83067934b82192bac9716a
287f8de8e882f344c1c6d74da710a8aba097092a
99140 F20110404_AAALAJ zhang_h_Page_124.jpg
64538309a9af441030b56b3ed1cf19e2
7ba9c9c739c4f537c0d47a40650d515b9fc53ce5
75130 F20110404_AAALYF zhang_h_Page_020.jpg
7fb153c95036527c224f22ecb65f2a06
b1c0c5728beca86c630dad5dc6f1fb0b7ca9eb02
14841 F20110404_AAAMEB zhang_h_Page_044.QC.jpg
52611a0d011a9793153d70bdf6bf37a0
869c6efebb45c552c9b9d60d969697f022202715
618419 F20110404_AAAMDM zhang_h.pdf
9e694698ce353fa3f54d8ddde2b177d5
538d043b42bb61e4a718e2af6097bc518ceb674c
836033 F20110404_AAAMCY zhang_h_Page_111.jp2
64b7f2191bfc83d03a02826f2bfb41c7
163a7351bfde297986decb16aed466c2f4493449
55592 F20110404_AAALXS zhang_h_Page_134.pro
5c7a6c2a0ed9767acfa4ce11a6b64aad
0cfdb01ceca5f4da3b094b7185f25624c5167e9d
2272 F20110404_AAALAK zhang_h_Page_039.txt
18f846eef57a3eccbb9c2c07c96d12ce
e7ce5fc802fd921c686637baa66b87b7e99dfc3a
64089 F20110404_AAALYG zhang_h_Page_021.jpg
f32931088d88879c36118689f83599e0
1970df769ed719402509d56fed743c71bf42c4fd
20075 F20110404_AAAMEC zhang_h_Page_140.QC.jpg
0a4f843374eb31e95ae2d016f235e22b
daac734e2cc11b7f3a3938dba2d56495a1225448
6758 F20110404_AAAMDN zhang_h_Page_014thm.jpg
babb9890e9b151362423169e8ea375f4
096548939b3e705e76d79a1f62f3402767885ff5
577343 F20110404_AAAMCZ zhang_h_Page_113.jp2
453470b9473121e7f25b71f93af4aa12
8ec3894e573a0074bb7e3525155ae0281e35a170
14482 F20110404_AAALXT zhang_h_Page_136.pro
9da32a980aa9b1832d785532e7a3ec63
0918d43226224df56ae039503f4ad7f56825ec18
6016 F20110404_AAALBA zhang_h_Page_065thm.jpg
6e9c69f11808d2a1b34e0b9db2b2dd5d
f492462de628b2345b9625b0a6487913533ccc9c
F20110404_AAALAL zhang_h_Page_115.tif
394b94eaac3f0b323910716c5f408a79
18eb12f2f650f9aeea7a3046161a96025ae31115
67180 F20110404_AAALYH zhang_h_Page_023.jpg
5152029d617507e4e6fec6db750f40d1
6512a9af96b1e820bc9bb4bb75fc2c30ff2df31e
15192 F20110404_AAAMED zhang_h_Page_028.QC.jpg
e03b31fbce56d7e077f28915ed2e3182
191e3c306852e56810d95d35916d41dcede8658e
20351 F20110404_AAAMDO zhang_h_Page_048.QC.jpg
25d430f44579d56d301d0b39f353ce0f
3941d534151074670bdf1d301789eddcbd056447
30327 F20110404_AAALXU zhang_h_Page_137.pro
12c1aac17c110a8c0141cf50815109a3
6a87993a4ddf468ddb440fb5c738693a9590f6c5
F20110404_AAALBB zhang_h_Page_082.txt
332dd1a6b5bb16f7da1bbd3528d41bb6
dfb831e896a2ad3c17e88910ef0f92d8d7fac6fb
96548 F20110404_AAALAM zhang_h_Page_033.jpg
585204cfe809d1416c03a4aaaf5d59ce
bbef0213dcd969c809c2f6de55cfa8a72cecbd2e
51069 F20110404_AAALYI zhang_h_Page_026.jpg
73f2e4c8b6b470e90805c3af32318725
0bb390c5b071cb575260f1d8047d2ae5db9a49f0
7087 F20110404_AAAMEE zhang_h_Page_033thm.jpg
ce3ee14b242acfb39203fb3d37b54b38
cd38168a4e4466058ccfa3b356cf309e1b4793af
31068 F20110404_AAAMDP zhang_h_Page_124.QC.jpg
b69de7a63e4a746a10457e995a9dbe0f
7ee4abe4b14f861452f3d43e01516c7ea0dc79c9
30741 F20110404_AAALXV zhang_h_Page_140.pro
3326f3da69aa0d1f2ed7d7d97bb0ea74
510171661ca23ed98c74f5bada20f19c20e70b15
1381 F20110404_AAALBC zhang_h_Page_076.txt
499c90de6645aeaf4e56d682d07cb2d5
f9f7925d5fafb210fa26ae919e89ca6368333db2
70916 F20110404_AAALAN zhang_h_Page_060.jpg
0c51ff6b959b500d815abf85f643dcc2
15d15f04dd86aa2ed1e007fdc93b7ba13b641107
57465 F20110404_AAALYJ zhang_h_Page_027.jpg
fe437feb689dc379907f1ea5a7f8f80b
ace3929f9ed12439b9ae61960b3c4a450ab0807c
23439 F20110404_AAAMDQ zhang_h_Page_090.QC.jpg
526cd41e058107afb8f8f44f19bf73ea
e628a309f03a1ca21f20185a4e2cbe971e9ab40a
23130 F20110404_AAALXW zhang_h_Page_001.jpg
7337d85964efcb1d85fa4627943da766
d2e25cde0a754b418aecae1eb4862fdb0223ec6f
F20110404_AAALAO zhang_h_Page_116.tif
e1c119d57c86ab8b7609ad714dd17f3b
94db2af41716e5577fae21563e87fa4fe2180025
45015 F20110404_AAALYK zhang_h_Page_028.jpg
39e7fcc9b0d3294ad52c78d0b819e44f
de0834945da1257fd822c4ec7f5ed10ca20fc6cd
30552 F20110404_AAAMEF zhang_h_Page_114.QC.jpg
91253a372b46d296727e5896d19fb527
f3bac0b7c272b4c4035e5b672d1215038f746b99
27738 F20110404_AAAMDR zhang_h_Page_099.QC.jpg
5a2550d1d24cb79202f9546c0401a49c
e853b928d3c7cb5837163adea244a1ba2f11ac34
4572 F20110404_AAALXX zhang_h_Page_002.jpg
e3a5e9953bfbe51a3c309f3c65740723
f38cdf427b27f79bb67cb70a503cd51782ccb249
5164 F20110404_AAALBD zhang_h_Page_138.QC.jpg
e34aedad15d7d5c907bd5c2d26a4a6e2
940dd3e543b1adb4d5fbe07bbf087e6e87e62af1
704695 F20110404_AAALAP zhang_h_Page_004.jp2
0dcbb149bdb30b6e1143b310b42c60eb
bc1006a923f6bbe246439bdbb87d44743bbe263e
89351 F20110404_AAALYL zhang_h_Page_029.jpg
9a7b75a25ec284800268fbdb69fd71ed
565ba8b83e0e6dd8744e7353d89e298579093852
7325 F20110404_AAAMEG zhang_h_Page_128thm.jpg
8cd5588ed3d62e94c90f715c9638cb1f
fe2d79b6d59a05c98d592b991ccaba954ec0727c
5801 F20110404_AAAMDS zhang_h_Page_109thm.jpg
2968c2154d270b873019731ceac53ca0
9ebc9fa673dd8eaf58f3427e469e08ba296f9267
75749 F20110404_AAALXY zhang_h_Page_005.jpg
5d5697b26580c5caade1f7b41a0f84eb
8b0f51d76d9573d1a38d973a437aae7cd584318a
F20110404_AAALBE zhang_h_Page_015.tif
8c5ee951d6c6089ac3accfee8a22352e
2fc8e606daca40a24b490e8fbfb1f684163d6402
688624 F20110404_AAALAQ zhang_h_Page_041.jp2
1145b73e783d7fb56206601651283353
120326189a8a65e549dd232b0cf7732bfcfc41bc
88333 F20110404_AAALZA zhang_h_Page_064.jpg
d553719eddea8b6f2d41c84d255f5126
f3e93e48e02bb1487c3f654033a5c08e0e2f02d0
96958 F20110404_AAALYM zhang_h_Page_031.jpg
f54f13ed568f05da34fdd161b4ab9c95
c6ec3e2a68e9b2f9c0924662f150021769cc5c45
5643 F20110404_AAAMEH zhang_h_Page_091thm.jpg
b216491dbf7100646ac671d0521a282a
49e43574ea5a3a22a830d338fbc618893e614b47
1966 F20110404_AAAMDT zhang_h_Page_001thm.jpg
07fc1ed1f8efe5fa73b7161a9d19021f
56b09f91c5bebf21b9c0730f918e71d90751a423
655 F20110404_AAALBF zhang_h_Page_006.txt
1d98ef8e3decd47447c46fb6c02e7215
0325c55f0a82e37c070fc6cecb31e7a0e3e6d62f
34150 F20110404_AAALAR zhang_h_Page_090.pro
694bbbe6f89bc2ecdc4e17f48197d518
408a53e87a7ab1b96459094c459104a9f2d6dd3d
58343 F20110404_AAALZB zhang_h_Page_067.jpg
9343945f464468c897a965d69f514052
6624f82c58cbc48d9924bb34dbe6d089abf01572
95332 F20110404_AAALYN zhang_h_Page_032.jpg
75567e5a8cab1ce1d748e67f7cc6595d
92c891cd5c4226ff7819cb333aaa16dc1d10e91f
1646 F20110404_AAAMEI zhang_h_Page_009thm.jpg
5909acd5279e5b845cadadc5199ed68e
3ba93100b7c4936361e494cae33c0ffe27f9ef56
7294 F20110404_AAAMDU zhang_h_Page_032thm.jpg
8dc990048e77d1a6da36cf9e6e45bf50
b452fe14511cb30c9dff7f4f6f38c43652687d37
F20110404_AAALXZ zhang_h_Page_006.jpg
2c2a905291f806b2ffbecde2589e8b40
ea960f38506191a14f237a5fc4b39d188512d7e9
968676 F20110404_AAALBG zhang_h_Page_029.jp2
37ac4e7ce003545c8933ed2011949ff6
d836018a846aaf65279410453fba925bb413fa9b
6833 F20110404_AAALAS zhang_h_Page_058thm.jpg
5a5543d40d845bf162faab9787df3e1d
560952ecefdfb429cf8ebf3b01f28a00c748c6d3
58688 F20110404_AAALZC zhang_h_Page_068.jpg
372f18e9a6419719dde2123382239e1e
ded4b868465c9372c3fe36be0a65817fe675dda6
64207 F20110404_AAALYO zhang_h_Page_041.jpg
3c78706887c201fe057c7cb0fb75b697
537787e240a72e364333dc1417793af0afe60487
5214 F20110404_AAAMEJ zhang_h_Page_049thm.jpg
d728dd07f5e80a5f416042ed44abd833
fd3c879d34a3c802507e2ea290db4e5044c7c8a3
30165 F20110404_AAAMDV zhang_h_Page_061.QC.jpg
369b5c55cb3d8cc31abd866ab2ab4f52
0e3bed12821bd2034444e51a4cc5a41a2ca2e6b6
34168 F20110404_AAALBH zhang_h_Page_085.QC.jpg
1f0318e7f7cb8ae204cf6cd761b4c033
2859c8b7b8ea565bfc6bcf9a1dd6e2ab7cbf9238
F20110404_AAALAT zhang_h_Page_121.tif
47d811a3539cf156ada1d968b5353523
5f9b98ec3ca29769590a8e279044ac732afecba6
63588 F20110404_AAALZD zhang_h_Page_070.jpg
84a4ab1d692fb6a74ce77fc628ec61af
63b7e850133621ddd0a80923d275df6ab412a91d
68114 F20110404_AAALYP zhang_h_Page_042.jpg
5092c2f69ae108bfb58eeebb2d1d1a32
7acfc63c8cac50b354fd67723e0f71c239dbb7dc
4347 F20110404_AAAMEK zhang_h_Page_044thm.jpg
ad4d35c1cc4b515f39d37a176cc0e0f1
5f5e1696cfec637c63db7dc84535471b6fbf22fb
19239 F20110404_AAAMDW zhang_h_Page_067.QC.jpg
779fabdda33d6d12d0fc6270e7196102
caabf1141aa0420350ece0184399459f4de5de4b
6074 F20110404_AAALBI zhang_h_Page_104thm.jpg
c9056d6611e58e783d21256cfcdfd536
546310bdfd9f77cd3f19a76a162359dc803387c3
45757 F20110404_AAALAU zhang_h_Page_008.jpg
83222044ce756be4fb3b55f95562a845
d08d7e2df5cf3a0885e2a6c0282349d15f1f3ab7
75849 F20110404_AAALZE zhang_h_Page_071.jpg
1d0ee3f2400490850ae8c81bd63c9290
c1ffb8a3fbf4d44aaf8950664ff85bdfcf607639
53850 F20110404_AAALYQ zhang_h_Page_047.jpg
2e469cd01b51ace571ce7a71e37942d4
cc31289502cb8c3e5e37717d7f4ba25d8f9f377a
32145 F20110404_AAAMFA zhang_h_Page_039.QC.jpg
6b4b41752051c898add11bc5a341c9d8
416d8946435d40dfc13e6feaee41e56e1adfd0e8
F20110404_AAAMEL zhang_h_Page_113thm.jpg
bd2b4c9552dcd66e5ec5432b714cf13a
6d981e89dd9e314c45c3d04ec543186c412af3d5
30727 F20110404_AAAMDX zhang_h_Page_054.QC.jpg
37d03aa0a014061d3b667d9e00ee8e98
4750d123e97eadb94e1a887463ae12a55053452d
5841 F20110404_AAALBJ zhang_h_Page_115thm.jpg
76a8da39715f833ac13b01408bcaec84
cdfe49e16682e3ca25fbaf822d81df3b0a09e6ec
F20110404_AAALAV zhang_h_Page_039.jp2
bc25c6eba2a9da3a5adacd40f0091e43
c5d233550368f9cc16e8d8a9619c6c08503cb030
67572 F20110404_AAALZF zhang_h_Page_072.jpg
fa631d318db8b08bcbccbcad58499c1b
b9cae252e4b8f342a0e435ce28cc6bb817e3dcf6
53775 F20110404_AAALYR zhang_h_Page_049.jpg
cad7a75ea952454eda3ee1ff3114d073
ab9e3f238c9b5abee56ad6ace28fa8d2faefe382
24249 F20110404_AAAMFB zhang_h_Page_071.QC.jpg
467ee4d15f6967e6623959e91b13529a
15a2279f3b95320d1e8bf3bf4ce1bc974a89e60c
23097 F20110404_AAAMEM zhang_h_Page_104.QC.jpg
57f0d3304086e6e487fc148f24afc70b
8d1b5d75cdf4987eefb8bf5c43c78d511b30e159
6261 F20110404_AAAMDY zhang_h_Page_117thm.jpg
cf2c4622669781e504ab7822857b3580
43931bbaf38a14fb2423b8d4779267257a18d3bf
F20110404_AAALBK zhang_h_Page_104.tif
b2f4ed8cbde2e71b77f1c4c6a9c7113f
f71c1bad4ec33980cb1674f0a7dbc493114dbbe6
1522 F20110404_AAALAW zhang_h_Page_090.txt
9377919b8550a35f8e1d9577cc202512
41fe2d35f370096d5bb49f58a09afc6e6a0a2efb
47739 F20110404_AAALZG zhang_h_Page_073.jpg
c1c8edfc09f3eb0f2a3e854237e3e5c5
5248ee521dffbb84cf0a2a88bfbbe260c78d023b
94063 F20110404_AAALYS zhang_h_Page_051.jpg
ed94e96405b0f743c468886747877d09
3df2e3e35668198f2277ead0e32b83578ec6bb62
22409 F20110404_AAAMFC zhang_h_Page_060.QC.jpg
fc6bde0a4bcee9f62b8047c85d11176a
1164ecf9a09a820a73bbb2f828648ce323c3f61a
18080 F20110404_AAAMEN zhang_h_Page_076.QC.jpg
77a63867ce9292e1e976a9ed7f732348
157d829791308226a05e7a54004518722c67835a
27133 F20110404_AAAMDZ zhang_h_Page_098.QC.jpg
b14872c4b9448da4994e85b086df70e0
dd1b55fc7ad47ba98c7aa35e203e6f5bf6393a29
F20110404_AAALBL zhang_h_Page_054.tif
ca10fad42999f95f6c911c3d04cd924c
713c1365402857acd33ba0fe247f9bef068510ca
17043 F20110404_AAALAX zhang_h_Page_025.QC.jpg
4827e0684db2e5131e9c053381a0b529
5a16281c64114c1802b8dad77bb7a93e9e3d3656
63214 F20110404_AAALZH zhang_h_Page_074.jpg
2d567bd9ddc3f69013221d5707be3325
6d71c0a9f1f135d783b9f5535edee5d00463e9ce
78907 F20110404_AAALYT zhang_h_Page_053.jpg
699c6f0d07a8f226295a26afdfac163e
3ddefe64c0eed34eaae45e8e690fbb3819963336
589 F20110404_AAALCA zhang_h_Page_002thm.jpg
8abb223cf953b81f72feba177a5476f7
22224add76d9e84bec7321c463bac2f43c05a53b
27134 F20110404_AAAMFD zhang_h_Page_014.QC.jpg
b94447870a460d38f3cfb390cbe326d5
d8ff8841b80e6e259b437ff721a783d99e4e8182
5152 F20110404_AAAMEO zhang_h_Page_096thm.jpg
1e16008ade43a076df85924278fa96f2
db36f900fa6e2df056bc0778f5b37d533edcfe87
110425 F20110404_AAALBM zhang_h_Page_134.jpg
be85f4654978889ab6c5d58bc2180124
ca668181b6d1e5b2e1b655dc3d1f3ec6de1b1f48
16476 F20110404_AAALAY zhang_h_Page_026.QC.jpg
45c66e290f00f3b6621cdd8aa8ca441e
80a9079108097ded687dfbc76daf8a7940396767
56182 F20110404_AAALZI zhang_h_Page_076.jpg
eeaa526bfe51f7ad1299b5a8e275e6de
f7d4c1879758298bb1a43a5b06b57399e7436c41
85969 F20110404_AAALYU zhang_h_Page_055.jpg
f2f64740e35c26404917a1765d85b87b
b91c77c171de40ce27e80c8b8f452e8be3f23a43
26066 F20110404_AAALCB zhang_h_Page_062.QC.jpg
4e1c1c406be8b87d978bcbce0feb94ee
91881c6592d46989ff46b097c2f708a2eae3f3bf
19683 F20110404_AAAMFE zhang_h_Page_125.QC.jpg
449b53482de9d120fdfbc6ccfbbbcf59
8353de167f9b3748d8ce0b582823fa54cd5c5efa
5825 F20110404_AAAMEP zhang_h_Page_070thm.jpg
c949d6ff623683a97cfe378a54fb3ed4
f8fa146987edc7bc9b7d16cf61526568dbba3043
32905 F20110404_AAALBN zhang_h_Page_116.pro
74523ec46b1ead3670136a785d4cff2f
a4f8fdf68920025cff76e50d064972a82df61d7a
1630 F20110404_AAALAZ zhang_h_Page_022.txt
ad8f972b8be2afd6727428eb69720cad
7929ac5b0c69b3ad4e81f8db5c3922018955db5f
59141 F20110404_AAALZJ zhang_h_Page_077.jpg
01addf0d5f238bca154ff1f1f908e65d
d2404766e4cd13576850715c76ee655bce1f70e3
88890 F20110404_AAALYV zhang_h_Page_056.jpg
fd69c1fa09030fa693a5be7b8d1165f4
ef755c817eb71db4de32f0c3d4aa118adaf90268
5400 F20110404_AAALCC zhang_h_Page_068thm.jpg
029f1d0296ff903669d060cb3e4958aa
259048695bb41edd3aee35df545c704f9aeba81e
6615 F20110404_AAAMFF zhang_h_Page_040thm.jpg
519939d559b05ca04b0ed60b57309358
9c7b79b1828f89a1d35aaa35b3eb21a383b52896
20369 F20110404_AAAMEQ zhang_h_Page_123.QC.jpg
038c96e35e4cda5f65235a621dcbefd2
23bc099e27283827b7433538d206a404ba4425d6
27739 F20110404_AAALBO zhang_h_Page_046.pro
8a97f3220116852745c2f6221674cf53
ee1fba924bbf62b91076f7c22ef4309b44d51fd3
64130 F20110404_AAALZK zhang_h_Page_079.jpg
946b100d1d630fea9f311c6bc7fba150
9e4f176504167f4b0f99cafc5f80c7ae7579df7d
70828 F20110404_AAALYW zhang_h_Page_057.jpg
b2fd18cb0dd6dd203f0c6eed64894311
b09dec1eacd3d1d946cdf7309dfdda3f057bff56
30641 F20110404_AAALCD zhang_h_Page_131.QC.jpg
966ed93123a5d44270b15d158156794a
8dba0b2164b91875e3c218bc615553c8d70f6ce8
20039 F20110404_AAAMER zhang_h_Page_019.QC.jpg
12c43c1015e48fdc30635fefee1ef1e6
a0a3782cd2a8de4e24033b112ee23992ec51b1e9
F20110404_AAALBP zhang_h_Page_036.tif
b264875d2c2c88e0f989eace6b57ede2
800c25c6810965375ddd4d5ca1ea4434e035f6ce
62289 F20110404_AAALZL zhang_h_Page_081.jpg
42c89eea1f679c1e8390e4bbe454bd29
61bb9c3461148e822a8b0c04fff834a67e5e075e
91483 F20110404_AAALYX zhang_h_Page_058.jpg
ab88bfcd168113c09a74085e14205664
541cee05163655ddd2ce3a96406d771f15e0d47c
19876 F20110404_AAAMFG zhang_h_Page_004.QC.jpg
3244fc3f83febdd0d2309c2dbdd6c01f
1fa3f76afc26c34d78a8f2000ba94b1e6fad1640
21142 F20110404_AAAMES zhang_h_Page_023.QC.jpg
66ee53dabee1303e0352c4841ebf04db
95e8a69cf87862fc8a8049ffbdf4f47912ee3f6f
1521 F20110404_AAALBQ zhang_h_Page_088.txt
873199291d3fa43e9ccf224fa592144e
86160cda007e1801ad3441a9c10a269f08409c5c
90864 F20110404_AAALZM zhang_h_Page_082.jpg
71cbb28112df9197ca0b7d0def5e5628
6b8dce5e974586c7677f247860fcd6a1bdceaedc
85787 F20110404_AAALYY zhang_h_Page_062.jpg
df341b4a4c48861a21b2a46119127d6b
0664c8b2782b793c56801fac00f7b05c6f119c52
31636 F20110404_AAALCE zhang_h_Page_086.QC.jpg
e8c36e3050e0ad642d7d868a33861c56
9581c0a8fde8b691e2cf944d4468b275c855ab00
5239 F20110404_AAAMFH zhang_h_Page_047thm.jpg
90f3d51e92e6b8c75a138ffc9acc82b4
3d91798810caaa5d3504bc6c024597b874383eb8
7456 F20110404_AAAMET zhang_h_Page_133thm.jpg
321153be521e2079186154ff74f42df2
c7c255caad386fd10b72a955345a3c5214737d0c
1051878 F20110404_AAALBR zhang_h_Page_129.jp2
097df4e4759721cc22b2bbd254ff112e
06ff54050041ac56b027a167f8cd8e45cbc2f0bc
72867 F20110404_AAALZN zhang_h_Page_088.jpg
08a33c6792d4a6c9c6566e44c69a6e27
e418cdb6ac8cb6d739956b85cf6198ced1939c79
76431 F20110404_AAALYZ zhang_h_Page_063.jpg
6c19602e42819879b7002ee162228f6d
be4ce3768ba13b72d98d6fe919411b6d1ccb2b9f
5236 F20110404_AAALCF zhang_h_Page_107thm.jpg
62876b06f57e2114a118386b55b6da3a
b0f0d083ad133af9d350abbb3ce4e93129530346
F20110404_AAAMFI zhang_h_Page_099thm.jpg
726cfc2c9ed433e77a00de3c0c9876aa
fedf7f6468a819a2fb44ad31290c5c348f2f259b
7633 F20110404_AAAMEU zhang_h_Page_039thm.jpg
48c5c6d00f90e831438d95f7358d6c33
22c8cb02355dcc55da1e173eca4871416b99e57d
823749 F20110404_AAALBS zhang_h_Page_119.jp2
b67db2755deb6e8d1419e9511bb0246b
7fe10e4747e18b3fe22bb5099fd3ac392ddf6f82
79549 F20110404_AAALZO zhang_h_Page_089.jpg
0bb6aa714e7925cd7e124e70102f2c28
d4de9806a6e6e64cd306cc9955086e986f84dcad
76249 F20110404_AAALCG zhang_h_Page_024.jpg
2aa0a33c1b49480898765a8acb5e4270
63f55693d7fc97da447051f3b7396d1f4bbef6ff
6014 F20110404_AAAMFJ zhang_h_Page_020thm.jpg
7d86e7c42b3737050171eed82ea0eda2
c41d3962edd8e9e0a663cd71927003f789578001
7189 F20110404_AAAMEV zhang_h_Page_059thm.jpg
0f66b82dce2223098be4de3e0853b5a5
7cb2201f0c20978c743d98bbccebde5482363f35
23880 F20110404_AAALBT zhang_h_Page_105.QC.jpg
f3cbcb6b6e62941c485bbbfa23e5eff2
fa2e729b7fb8034ada5967e6d9ec6bf1e96aa466
68778 F20110404_AAALZP zhang_h_Page_090.jpg
97ea148368be9a1629f488d5fa37d081
5947887eea3a20d3d89101de1e0a7adfb21b80a2
1051972 F20110404_AAALCH zhang_h_Page_059.jp2
fe9e52c2ccdcbdd8f8f5cab14ee19be5
6724c63bc43430a0bf989161455182a7d22ef0fe
5835 F20110404_AAAMFK zhang_h_Page_023thm.jpg
6967fdc74e85005e0e51008e4456e491
b4e0d6ba12e2851c6b8d26d43768c64f5ca7fc85
7575 F20110404_AAAMEW zhang_h_Page_130thm.jpg
db954733d9b24076ccb56101541b540e
e0aa064dbe5d213eb553e309cc91d6917c1217ce
63869 F20110404_AAALBU zhang_h_Page_110.jpg
f5f4ad14644f08029843a2582b35ca82
e95435d900cb2994bff4678e4cf4b311bf7c1e68
70225 F20110404_AAALZQ zhang_h_Page_092.jpg
3efe462080b25eb0debe3f4d40030cb2
ee16cd72b70324c46cd776fb75eaf5c0c6672ee1
7015 F20110404_AAALCI zhang_h_Page_061thm.jpg
4e632af1cf16fbf560d531765e333fe8
d07cc6fa31e02d0228f17d6ce23b2334a7c6a95d
6079 F20110404_AAAMGA zhang_h_Page_030thm.jpg
0d03a6468c72411806286fea02e640b3
0909d52d34930959c33319ee58422b4dbcda076b
30297 F20110404_AAAMFL zhang_h_Page_032.QC.jpg
4bc6241775138e4b5de79a3d51879244
913a06391546c7bebd2e69622bda1421c2937b01
28668 F20110404_AAAMEX zhang_h_Page_064.QC.jpg
1ce51e92e81ebccb9e14ee89ee162366
dfa03ec8d560b42ee138357724174ae400441602
7035 F20110404_AAALBV zhang_h_Page_064thm.jpg
05743fd95a5305e2575f5e3126c6b0cb

TABLE OF CONTENTS

ACKNOWLEDGMENTS
LIST OF TABLES
LIST OF FIGURES
KEY TO ABBREVIATIONS
KEY TO SYMBOLS
ABSTRACT

CHAPTER
1  INTRODUCTION
   1.1  Motivation
   1.2  Optimality Conditions
2  UNCONSTRAINED OPTIMIZATION
   2.1  A New Conjugate Gradient Method with Guaranteed Descent
        2.1.1  Introduction to Nonlinear Conjugate Gradient Method
        2.1.2  CG_DESCENT
        2.1.3  Global Convergence
        2.1.4  Line Search
        2.1.5  Numerical Comparisons
   2.2  A Cyclic Barzilai-Borwein (CBB) Method
        2.2.1  Introduction to Nonmonotone Line Search
        2.2.2  Method and Local Linear Convergence
        2.2.3  Method for Convex Quadratic Programming
        2.2.4  An Adaptive CBB Method
        2.2.5  Numerical Comparisons
   2.3  Self-adaptive Inexact Proximal Point Methods
        2.3.1  Motivation and the Algorithm
        2.3.2  Local Error Bound Condition
        2.3.3  Local Convergence
        2.3.4  Global Convergence
        2.3.5  Preliminary Numerical Results
3  BOX CONSTRAINED OPTIMIZATION
   3.1  Introduction
   3.2  Gradient Projection Methods
   3.3  Active Set Algorithm (ASA)
        3.3.1  Global Convergence
        3.3.2  Local Convergence
        3.3.3  Numerical Comparisons
4  CONCLUSIONS AND FUTURE RESEARCH
REFERENCES
BIOGRAPHICAL SKETCH

LIST OF TABLES

2-1  Various choices for the CG update parameter
2-2  Solution time versus tolerance
2-3  Transition to superlinear convergence
2-4  Comparing the CBB(m) method with an adaptive CBB method
2-5  Number of times each method was fastest (time metric, stopping criterion (2.104))
2-6  CPU times for selected problems
2-7  ||g(x_k)|| versus iteration number k
2-8  Statistics for ill-conditioned CUTE problems and cg_descent

LIST OF FIGURES

2-1  Performance profiles
2-2  Performance profiles of conjugate gradient methods
2-3  Graphs of log(-log(||g_k||_inf)) versus k: (a) 3 <= n <= 6 and m = 3, (b) 6 <= n <= 9 and m = 4
2-4  Performance based on CPU time
3-1  The gradient projection step
3-2  Performance profiles, 50 CUTEr test problems
3-3  Performance profiles, 42 sparsest CUTEr problems, 23 MINPACK-2 problems
3-4  Performance profiles, epsilon = 10^-2 ||d^1(x_0)||_inf
3-5  Performance profiles, evaluation metric, 50 CUTEr test problems, gradient-based methods
3-6  Performance profiles, evaluation metric, 42 sparsest CUTEr problems, 23 MINPACK-2 problems

KEY TO ABBREVIATIONS

ASA: active set algorithm
CBB: cyclic Barzilai-Borwein
CG: conjugate gradient
LP: linear programming
NCG: nonlinear conjugate gradient
NGPA: nonmonotone gradient projection algorithm
NLP: nonlinear programming
SSOSC: strong second-order sufficient condition

ABSTRACT

In this dissertation, we develop theories and efficient algorithmic approaches on gradient methods to solve large-scale nonlinear optimization problems.

The first part of this dissertation discusses new gradient methods and techniques for dealing with large-scale unconstrained optimization. We first propose a new nonlinear CG method (CG_DESCENT), which satisfies the strong descent condition g_k^T d_k <= -(7/8)||g_k||^2 independent of the line search. This new CG method is one member of a one-parameter family of nonlinear CG methods with guaranteed descent. We also develop a new "approximate Wolfe" line search which is both efficient and highly accurate. CG_DESCENT is the first nonlinear CG method which satisfies the sufficient descent condition independent of the line search. Moreover, global convergence is established under the standard (not strong) Wolfe conditions. The CG_DESCENT software has become a benchmark code for unconstrained optimization. Then, we propose a so-called cyclic Barzilai-Borwein (CBB) method. It is proved that CBB is locally linearly convergent at a local minimizer with positive definite Hessian. Numerical evidence indicates that when m > n/2 >= 3, CBB is locally superlinearly convergent, where m is the cycle length.

The second part of this dissertation discusses using gradient methods to solve large-scale box constrained optimization. We first discuss the gradient projection methods. Then, an active set algorithm (ASA) for box constrained optimization is developed. The algorithm consists of a nonmonotone gradient projection step, an unconstrained optimization step, and a set of rules for branching between the two steps. Global convergence to a stationary point is established. Under the strong second-order sufficient optimality condition, without assuming strict complementarity, the algorithm eventually reduces to unconstrained optimization without restarts. For strongly convex quadratic box constrained optimization, ASA is shown to have finite convergence when a conjugate gradient method is used in the unconstrained optimization step. A specific implementation of ASA is given, which exploits the cyclic Barzilai-Borwein algorithm for the gradient projection step and CG_DESCENT for unconstrained optimization. Numerical experiments using the box constrained problems in the CUTEr and MINPACK test problem libraries show that this new algorithm outperforms benchmark software such as GENCAN, L-BFGS-B, and TRON.

CHAPTER 1
INTRODUCTION

1.1 Motivation

Throughout the dissertation, the nonlinear program (NLP) that we are trying to solve has the following general formulation:

    min { f(x) : x in S },                                        (1.1)

where f is a real-valued, continuous function defined on a nonempty set S contained in R^n. f is often called the objective function and S is called the feasible set. In Chapter 2, we consider the case where S = R^n, i.e., the unconstrained optimization problem; while in Chapter 3, we study the case where S is a box set defined on R^n, i.e., S = { x in R^n : l <= x <= u } with l < u. In constrained optimization, S is often defined by a finite sequence of equality and inequality constraints. More specifically, problem (1.1) can be reformulated as the following:

    min_{x in R^n} f(x)
    s.t.  c_i(x) = 0,   i = 1, 2, ..., m_e,
          c_i(x) >= 0,  i = m_e + 1, ..., m.

1.2 Optimality Conditions

When f is a convex function, we know that a (strictly) local minimum is also a (strictly) global minimum. However, it is hard to know in advance whether the objective function is convex, and in many cases it is very hard or even impossible to find a global minimum on a feasible region. So in the context of nonlinear optimization a local minimum is usually accepted as a solution of problem (1.1), and many algorithms try to find a feasible point which satisfies some necessary conditions for a local minimum. In the following, we list some of these necessary conditions.

In this dissertation, we mainly focus on developing gradient methods to generate iteration points which satisfy some first-order condition for large-scale unconstrained and box constrained optimization.

CHAPTER 2
UNCONSTRAINED OPTIMIZATION

In this chapter, we consider solving (1.1) with S = R^n, i.e., the following unconstrained optimization problem:

    min { f(x) : x in R^n },                                      (2.1)

where f is a real-valued, continuous function.

2.1 A New Conjugate Gradient Method with Guaranteed Descent

2.1.1 Introduction to Nonlinear Conjugate Gradient Method

When applied to the nonlinear problem (2.1), a nonlinear conjugate gradient method generates a sequence x_k, k >= 1, starting from an initial guess x_0 in R^n, using the recurrence

    x_{k+1} = x_k + alpha_k d_k,                                  (2.2)

where the positive stepsize alpha_k is obtained by a line search, and the directions d_k are generated by the rule:

    d_{k+1} = -g_{k+1} + beta_k d_k,    d_0 = -g_0.               (2.3)

[Table 2-1: Various choices for the CG update parameter.]

Table 2-1 provides a chronological list of some choices for the CG update parameter. The 1964 formula of Fletcher and Reeves is usually considered the first nonlinear CG algorithm since their paper [57] focuses on nonlinear optimization, while the 1952 paper [81] of Hestenes and Stiefel focuses on symmetric, positive-definite linear systems. Daniel's choice for the update parameter, which is fundamentally different from the other choices, is not discussed in this dissertation. For large-scale problems, choices for the update parameter that do not require the evaluation of the Hessian matrix are often preferred in practice over methods that require the Hessian in each iteration.
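Since the line search and the choice of beta_k are the only ingredients that change from one method in Table 2-1 to another, the whole family fits in a single loop. The following Python sketch is only an illustration (the codes discussed later in this chapter are Fortran): `beta_fr` and `beta_prp_plus` are the classical Fletcher-Reeves and truncated Polak-Ribiere-Polyak formulas, `beta_hz` is the Hager-Zhang beta^N_k used by CG_DESCENT as given in the published papers [73, 74] (its safeguarded form appears in Section 2.1.2), and `line_search` is an assumed routine returning a Wolfe or approximate-Wolfe stepsize.

```python
import numpy as np

def beta_fr(g_new, g, d, y):
    # Fletcher-Reeves: ||g_{k+1}||^2 / ||g_k||^2
    return (g_new @ g_new) / (g @ g)

def beta_prp_plus(g_new, g, d, y):
    # Polak-Ribiere-Polyak, truncated at zero (PRP+)
    return max((g_new @ y) / (g @ g), 0.0)

def beta_hz(g_new, g, d, y):
    # Hager-Zhang beta^N (assumed from the published CG_DESCENT papers)
    dy = d @ y
    return ((y - 2.0 * d * (y @ y) / dy) @ g_new) / dy

def nonlinear_cg(f, grad, x0, line_search, beta=beta_hz, tol=1e-6, max_iter=10000):
    """Generic nonlinear CG: x_{k+1} = x_k + alpha_k d_k, d_{k+1} = -g_{k+1} + beta_k d_k."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.max(np.abs(g)) <= tol:          # sup-norm stopping test
            break
        alpha = line_search(f, grad, x, d)    # assumed to return an acceptable stepsize
        x_new = x + alpha * d
        g_new = grad(x_new)
        d = -g_new + beta(g_new, g, d, g_new - g) * d
        x, g = x_new, g_new
    return x
```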

In the remaining methods of Table 2-1, except for the new method at the end, the numerator of the update parameter beta_k is either ||g_{k+1}||^2 or g_{k+1}^T y_k and the denominator is either ||g_k||^2 or d_k^T y_k or -d_k^T g_k. The 2 possible choices for the numerator and the 3 possible choices for the denominator lead to 6 different choices for beta_k shown in Table 2-1.

If f is a strongly convex quadratic, then in theory, all 8 choices for the update parameter in Table 2-1 are equivalent with an exact line search. However, for non-quadratic cost functions, each choice for the update parameter leads to quite different performance under inexact line searches.

2.1.2 CG_DESCENT

The new method, CG_DESCENT, has close connections to the memoryless quasi-Newton scheme of Perry [105] and Shanno [115]. To prove global convergence for a general nonlinear function, similar to the approach [60, 79, 121] taken for the Polak-Ribiere-Polyak [106, 107] version of the conjugate gradient method, we restrict the lower value of beta^N_k. In our restricted scheme, unlike the Polak-Ribiere-Polyak method, we dynamically adjust the lower bound on beta^N_k in order to make the lower bound smaller as the iterates converge:

    d_{k+1} = -g_{k+1} + bar_beta^N_k d_k,    d_0 = -g_0,         (2.4)

    bar_beta^N_k = max{ beta^N_k, eta_k },    eta_k = -1 / ( ||d_k|| min{ eta, ||g_k|| } ),   (2.5)

where eta > 0 is a constant; we took eta = 0.01 in the experiments.

With conjugate gradient methods, the line search typically requires sufficient accuracy to ensure that the search directions yield descent. Moreover, it has been shown [36] that for the Fletcher-Reeves [57] and the Polak-Ribiere-Polyak [106, 107] conjugate gradient methods, a line search that satisfies the strong Wolfe conditions

PAGE 19

maynotyieldadirectionofdescent,forasuitablechoiceoftheWolfelinesearchparameters,evenforthefunctionf(x)=kxk2,where>0isaconstant.Anattractivefeatureofthenewconjugategradientscheme,whichwenowestablish,isthatthesearchdirectionsalwaysyielddescentwhendTkyk6=0,aconditionwhichissatisedwhenfisstronglyconvex,orthelinesearchsatisestheWolfeconditions. 8kgkk2:(2.7) WeapplytheinequalityuTv1 2(kuk2+kvk2) tothersttermin(2.8)withu=1 2(dTkyk)gk+1andv=2(gTk+1dk)yk

PAGE 20

IfgTk+1dk0,then(2.7)followsimmediatelysince0.IfgTk+1dk<0,thengTk+1dk+1=kgk+1k2+gTk+1dkkgk+1k2+NkgTk+1dk 2.1.3 analysis for strongly convex functions. Althoughthesearchdi-rectionsgeneratedbyeither(2.2){(2.3)withk=Nkor(2.4){(2.5)arealwaysdescentdirections,weneedtoconstrainthechoiceofktoensureconvergence.WeconsiderlinesearchesthatsatisfyeitherGoldstein'sconditions[64]:1kgTkdkf(xk+kdk)f(xk)2kgTkdk;(2.9) where0<2<1 2<1<1andk>0,ortheWolfeconditions[122,123]:f(xk+kdk)f(xk)kgTkdk;

PAGE 21

where0<<1.AsininDaiandYuan[35],wedonotrequirethe\strongWolfe"conditionjgTk+1dkjgTkdk,whichisoftenusedtoproveconvergenceofnonlinearconjugategradientmethods. kdkk2:(2.12) LjgTkdkj kdkk2:(2.13)

PAGE 22

Theorem4andtheassumptiongk6=0implythatdk6=0.Sincek>0,itfollowsfrom(2.17)thatyTkdk>0.SincefisstronglyconvexoverL,fisboundedfrombelow.Aftersummingoverktheupperboundin(2.9)or(2.10),weconcludethat1Xk=0kgTkdk>: ByLipschitzcontinuity(2.15),kykk=kgk+1gkk=krf(xk+kdk)rf(xk)kLkkdkk:(2.19) Utilizing(2.17)and(2.3),wehavejNkj=yTkgk+1 +2L2 kdkk: Hence,wehavekdk+1kkgk+1k+jNkjkdkk1+L +2L2

PAGE 23

Insertingthisupperboundfordkin(2.18)yields1Xk=1kgkk2<1; wheresk=xk+1xk,whenfisstronglyconvexandthecosineoftheanglebetweendkandgk+1issucientlysmall.By(2.17)and(2.19),wehavejdTkgk+1j jdTkykjkykkL juTkgk+1j=c1kgk+1k;(2.22) whereuk=dk=kdkkistheunitvectorinthedirectiondk,isthecosineoftheanglebetweendkandgk+1,andc1=L=.Bythedenitionofdk+1in(2.2),wehavekdk+1k2kgk+1k22NkdTkgk+1:(2.23) BytheboundforNkin(2.20),jNkdTkgk+1jc2juTkgk+1jkgk+1k=c2kgk+1k2;(2.24) wherec2istheconstantappearingin(2.20).Combining(2.23)and(2.24),wehavekdk+1kp

PAGE 24

Convergence analysis for general nonlinear functions. Ouranalysisof(2.4){(2.5)forgeneralnonlinearfunctionsexploitsinsightsdevelopedbyGilbertandNocedalintheiranalysis[60]ofthePRP+scheme.Similartotheapproachtakenin[60],weestablishaboundforthechangeuk+1ukinthenormalizeddirectionuk=dk=kdkk,whichweusetoconclude,bycontradiction,thatthegradientscannotbeboundedawayfromzero.Thefollowingtheoremistheanalogueof[60,Lemma4.1],itdiersinthetreatmentofthedirectionupdateformula(2.4). 491Xk=0(gTkdk)2 Denethequantities:+k=maxfNk;0g;k=minfNk;0g;rk=gk+k1dk1 kdkk:

PAGE 25

Sincetheukareunitvectors,krkk=kukkuk1k=kkukuk1k: Bythedenitionofkandthefactthatk<0andNkkin(2.5),wehavethefollowingboundforthenumeratorofrk:kgk+k1dk1kkgkkminfNk1;0gkdk1kkgkkk1kdk1kkgkk+1 minf;g; where=maxx2Lkrf(x)k:(2.28) Letcdenotetheexpression+1=minf;gin(2.27).Thisboundforthenumeratorofrkcoupledwith(2.26)giveskukuk1k2krkk2c Finally,squaring(2.29),summingoverk,andutilizing(2.25),theproofiscom-plete.

PAGE 26

I.AboundforNk: BytheWolfeconditiongTk+1dkgTkdk,wehaveyTkdk=(gk+1gk)Tdk(1)gTkdk=(1)gTkdk:(2.31) ByTheorem4,gTkdk7 8kgkk27 82: 82:(2.32) Also,observethatgTk+1dk=yTkdk+gTkdk
PAGE 27

BythedenitionofNkin(2.5),wehaveNk=NkifNk0and0NkNkifNk<0: jyTkdkj8 71 (1)2Lkskk+2L2kskk2max whereisdenedin(2.28),C=8 71 (1)2L+2L2Dmax HereDisthediameterofL. II.Aboundonthestepssk: Thisisamodiedversionof[60,Thm.4.3].Observethatforanylk,xlxk=l1Xj=kxj+1xj=l1Xj=kksjkuj=l1Xj=kksjkuk+l1Xj=kksjk(ujuk): Letbeapositiveinteger,chosenlargeenoughthat4CD;(2.40)

PAGE 28

whereCandDappearin(2.37)and(2.38).Choosek0largeenoughthatXik0kui+1uik21 4:(2.41) ByLemma2,k0canbechoseninthisway.Ifj>kk0andjk,thenby(2.41)andtheCauchy-Schwarzinequality,wehavekujukkj1Xi=kkui+1uikp 41=2=1 2: whenl>kk0andlk. III.Aboundonthedirectionsdl: By(2.4)andtheboundonNkgiveninStepI,wehavekdlk2(kglk+jNl1jkdl1k)222+2C2ksl1k2kdl1k2;

PAGE 29

Above,theproductisdenedtobeonewhenevertheindexrangeisvacuous.LetusconsideraproductofconsecutiveSj,wherekk0:k+1Yj=kSj=k+1Yj=k2C2ksjk2=k+1Yj=kp 2 where
The interpolating (quadratic) polynomial q that matches phi(alpha) at alpha = 0 and phi'(alpha) at alpha = 0 and alpha = alpha_k is

    q(alpha) = ((phi'(alpha_k) - phi'(0)) / (2 alpha_k)) alpha^2 + phi'(0) alpha + phi(0).

With these insights, we terminate the line search when either of the following conditions holds:

T1. The original Wolfe conditions (2.10)-(2.11) are satisfied.

T2. The approximate Wolfe conditions (2.44) are satisfied and

    phi(alpha_k) <= phi(0) + epsilon_k,                            (2.45)

where epsilon_k >= 0 is an estimate for the error in the value of f at iteration k. In the experiments section, we took

    epsilon_k = epsilon |f(x_k)|,                                  (2.46)

where epsilon is a (small) fixed parameter.

We satisfy the termination criterion by constructing a nested sequence of (bracketing) intervals which converge to a point satisfying either T1 or T2. A typical interval [a, b] in the nested sequence satisfies the following opposite slope condition:

    phi(a) <= phi(0) + epsilon_k,   phi'(a) < 0,   phi'(b) >= 0.   (2.47)
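In terms of phi(alpha) = f(x_k + alpha d_k), the tests T1 and T2 only require phi, its derivative, and the error estimate epsilon_k. The fragment below is a schematic version of that acceptance test, assuming the usual form of the approximate Wolfe conditions from the published CG_DESCENT line search, sigma phi'(0) <= phi'(alpha) <= (2 delta - 1) phi'(0); it is not the bracketing and secant machinery of the actual routine, only the check that decides when to stop.

```python
def wolfe_ok(phi0, dphi0, phi_a, dphi_a, alpha, delta=0.1, sigma=0.9):
    # T1: original Wolfe conditions (2.10)-(2.11)
    return (phi_a <= phi0 + delta * alpha * dphi0) and (dphi_a >= sigma * dphi0)

def approx_wolfe_ok(phi0, dphi0, phi_a, dphi_a, eps_k, delta=0.1, sigma=0.9):
    # T2: approximate Wolfe conditions plus phi(alpha) <= phi(0) + eps_k   (2.45)
    in_band = sigma * dphi0 <= dphi_a <= (2.0 * delta - 1.0) * dphi0
    return in_band and (phi_a <= phi0 + eps_k)

def accept_step(phi0, dphi0, phi_a, dphi_a, alpha, f_k, eps=1e-6):
    eps_k = eps * abs(f_k)                    # error estimate (2.46)
    return (wolfe_ok(phi0, dphi0, phi_a, dphi_a, alpha)
            or approx_wolfe_ok(phi0, dphi0, phi_a, dphi_a, eps_k))
```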

PAGE 31

Givenaparameter2(0;1).Wealsodeveloptheintervalupdateruleswhichcanbefoundastheprocedure\intervalupdate"inthepaper[73].Andduringthe\intervalupdate"procedure,anewsocalled\DoubleSecantStep"isused.Weproveimplementingthisnew\DoubleSecantStep"inthe\intervalupdate",anasymptoticrootconvergenceorder1+p DESCENT,totheL-BFGSlimitedmemoryquasi-NewtonmethodofNocedal[103]andLiuandNocedal[91]andtootherconjugategradientmethodsaswell.Comparisonsbasedonothermetrics,suchasthenumberofiterationsornumberoffunction/gradientevaluations,canbefoundinpaper[74],whereextensivenumericaltestingofthemethodsisdone.WeconsideredboththePRP+versionoftheconjugategradientmethoddevelopedbyGilbertandNocedal[60],wherethekassociatedwiththePolak-Ribiere-Polyakconjugategradientmethod[106,107]iskeptnonnegative,andversionsoftheconjugategradientmethoddevelopedbyDaiandYuanin[35,37],denotedCGDYandCGDYH,whichachievedescentforanylinesearchthatsatisestheWolfe

PAGE 32

conditions(2.10){(2.11).ThehybridconjugategradientmethodCGDYHusesk=maxf0;minfHSk;DYkgg; TheL-BFGSandPRP+codeswereobtainedfromJorgeNocedal'swebpage.TheL-BFGScodeisauthoredbyJorgeNocedal,whilethePRP+codeisco-authoredbyGuanghuiLiu,JorgeNocedal,andRichardWaltz.InthedocumentationfortheL-BFGScode,itisrecommendedthatbetween3and7vectorsbeusedforthememory.Hence,wechose5vectorsforthememory.ThelinesearchinbothcodesisamodicationofsubroutineCSRCHofMoreandThuente[101],whichemploysvariouspolynomialinterpolationschemesandsafeguardsinsatisfyingthestrongWolfelinesearchconditions. WealsomanufacturedanewL-BFGScodebyreplacingtheMore/Thuentelinesearchbythenewlinesearchpresentedinourpaper.WecallthisnewcodeL-BFGS.ThenewlinesearchwouldneedtobemodiedforuseinthePRP+codetoensuredescent.Hence,weretainedtheMore/ThuentelinesearchinthePRP+code.SincetheconjugategradientalgorithmsofDaiandYuanachievesdescentforanylinesearchthatsatisestheWolfeconditions,weareabletousethenewlinesearchinourexperimentswithCGDYandwithCGDYH.AllcodeswerewritteninFortranandcompiledwithf77(defaultcompilersettings)onaSunworkstation. Forourlinesearchalgorithm,weusedthefollowingvaluesfortheparameters:=:1;=:9;=106;=:5;=:66;=:01 Ourrationaleforthesechoiceswasthefollowing:Theconstraintsonandare0<<1and<:5.Asapproaches0andapproaches1,thelinesearch

PAGE 33

terminatesquicker.Thechosenvalues=:1and=:9representacompromisebetweenourdesireforrapidterminationandourdesiretoimprovethefunctionvalue.WhenusingtheapproximateWolfeconditions,wewouldliketoachievedecayinthefunctionvalue,ifnumericallypossible.Hence,wemadethesmallchoice=106in(2.46).Whenrestrictingkin(2.5),wewouldliketoavoidtruncationifpossible,sincethefastestconvergenceforaquadraticfunctionisobtainedwhenthereisnotruncationatall.Thechoice=:01leadstoinfrequenttruncationofk.Thechoice=:66ensuresthatthelengthoftheinterval[a;b]decreasesbyafactorof2/3ineachiterationofthelinesearchalgorithm.Thechoice=:5intheupdateprocedurecorrespondstotheuseofbisection.Ourstartingguessforthestepkinthelinesearchwasobtainedbyminimizingaquadraticinterpolant. Intherstsetofexperiments,westoppedwhenever(a)krf(xk)k1106or(b)kgTkdk1020jf(xk+1)j;(2.48) wherekk1denotesthemaximumabsolutecomponentofavector.Inallbut3cases,theiterationsstoppedwhen(a)wassatised{thesecondcriterionessentiallysaysthattheestimatedchangeinthefunctionvalueisinsignicantcomparedtothefunctionvalueitself. Thecputimeinsecondsandthenumberofiterations,functionevaluations,andgradientevaluations,foreachofthemethodsarepostedattheauthor'swebsite.Inrunningthenumericalexperiments,wecheckedwhetherdierentcodesconvergedtodierentlocalminimizers;weonlyprovidedataforproblemswhereallsixcodesconvergedtothesamelocalminimizer.Thenumericalresultsarenowanalyzed. Theperformanceofthe6algorithms,relativetocputime,wasevaluatedusingtheprolesofDolanandMoree[43].Thatis,foreachmethod,weplotthefraction

PAGE 34

Figure2{1:Performanceproles Pofproblemsforwhichthemethodiswithinafactorofthebesttime.InFigure2{1,wecomparetheperformanceofthe4codesCG,L-BFGS,L-BFGS,andPRP+.Theleftsidesideoftheguregivesthepercentageofthetestproblemsforwhichamethodisthefastest;therightsidegivesthepercentageofthetestproblemsthatweresuccessfullysolvedbyeachofthemethods.Thetopcurveisthemethodthatsolvedthemostproblemsinatimethatwaswithinafactorofthebesttime.SincethetopcurveinFigure2{1correspondstoCG,thisalgorithmisclearlyfastestforthissetof113testproblemswithdimensionsrangingfrom50to10,000.Inparticular,CGisfastestforabout60%(68outof113)ofthetestproblems,anditultimatelysolves100%ofthetestproblems.SinceL-BFGS(fastestfor29problems)performedbetterthanL-BFGS(fastestfor17problems),thenewlinesearchledtoimprovedperformance.Nonetheless,L-BFGSwasstilldominatedbyCG. InFigure2{2wecomparetheperformanceofthefourconjugategradient

PAGE 35

Figure2{2:Performanceprolesofconjugategradientmethods algorithms.ObservethatCGisthefastestofthefouralgorithm.SinceCGDY,CGDYH,andCGusethesamelinesearch,Figure2{2indicatesthatthesearchdirectionofCGyieldsquickerdescentthanthesearchdirectionsofCGDYandCGDYH.Also,CGDYHismoreecientthanCGDY.Sinceeachofthesesixcodesdiersintheamountoflinearalgebrarequiredineachiterationandintherelativenumberoffunctionandgradientevaluations,dierentcodeswillbesuperiorindierentproblemsets.Inparticular,thefourthrankedPRP+codeinFigure2{1stillachievedthefastesttimein6ofthe113testproblems. Inournextseriesofexperiments,showninTable2{2,weexploretheabilityofthealgorithmsandlinesearchtoaccuratelysolvethetestproblems.Inthisseriesofexperiments,werepeatlysolvesixtestproblems,increasingthespeciedaccuracyineachrun.Fortheinitialrun,thestoppingconditionwaskgkk1102,andinthelastrun,thestoppingconditionwaskgkk11012.Thetestproblemsusedintheseexperiments,andtheirdimensions,werethefollowing:

PAGE 36

Table2{2:Solutiontimeversustolerance Tolerance Algorithm ProblemNumberkgkk1 #1#2#3#4#5#6 CG 5.222.320.860.001.5710.04102 L-BFGS 4.242.010.990.002.4616.48 PRP+ 6.773.551.430.003.0417.80 CG 9.205.272.090.002.2617.13103 L-BFGS 6.887.462.650.003.3022.63 PRP+ 12.797.163.610.004.2624.13 CG 10.795.765.040.003.2325.26104 L-BFGS 12.2410.926.770.004.1133.36 PRP+ 15.9711.408.130.005.01F CG 14.267.947.970.004.2727.49105 L-BFGS 16.6016.9910.970.004.90F PRP+ 21.5412.0912.310.006.22F CG 16.688.499.805.715.4232.03106 L-BFGS 21.8121.0813.977.785.83F PRP+ 24.5812.8115.338.077.95F CG 20.3111.4711.935.815.9339.79107 L-BFGS 26.47F17.379.986.39F PRP+ 31.17F17.348.509.50F CG 23.2212.8814.099.686.4947.50108 L-BFGS 32.23F20.4814.857.67F PRP+ 33.75F19.83F10.86F CG 27.9213.3216.8012.347.4656.68109 L-BFGS 33.64FFF8.50F PRP+ FFFF11.74F CG 33.2513.8921.1813.218.1165.471010 L-BFGS 39.12FFF9.53F PRP+ FFFF13.56F CG 38.8014.3825.5813.399.1277.031011 L-BFGS FFFF9.99F PRP+ FFFF14.44F CG 42.5115.6227.5413.389.7778.311012 L-BFGS FFFF10.54F PRP+ FFFF15.96F

PAGE 37

1.FMINSURF(5625)4.FLETCBV2(1000)2.NONCVXU2(1000)5.SCHMVETT(10000)3.DIXMAANE(6000)6.CURLY10(1000) Theseproblemswerechosensomewhatrandomlyexceptthatwedidnotincludeanyproblemforwhichtheoptimalcostwaszero.Whentheoptimalcostiszerowhiletheminimizerxisnotzero,theestimatejf(xk)jfortheerrorinfunctionvalue(whichweusedinthepreviousexperiments)canbeverypoorastheiteratesapproachtheminimizer(wherefvanishes).Thesesixproblemsallhavenonzerooptimalcost.InTable2{2,Fmeansthatthelinesearchterminatedbeforetheconvergencetoleranceforkgkkwassatised.AccordingtothedocumentationforthelinesearchintheL-BFGSandPRP+codes,\Roundingerrorspreventfurtherprogress.Theremaynotbeastepwhichsatisesthesucientdecreaseandcurvatureconditions.Tolerancesmaybetoosmall." AscanbeseeninTable2{2,thelinesearchbasedontheWolfeconditions(usedintheL-BFGSandPRP+codes)failsmuchsoonerthanthelinesearchbasedonboththeWolfeandtheapproximateWolfeconditions(usedinCGandL-BFGS).Roughly,alinesearchbasedontheWolfeconditionscancomputeasolutionwithaccuracyontheorderofthesquarerootofthemachineepsilon,whilealinesearchthatalsoincludestheapproximateWolfeconditionscancomputeasolutionwithaccuracyontheorderofthemachineepsilon. 2.2

A CYCLIC BARZILAI-BORWEIN (CBB) METHOD

2.2.1 Introduction to Nonmonotone Line Search

In monotone line search methods, alpha_k is chosen so that f(x_{k+1}) < f(x_k). In a nonmonotone line search, the trial point is instead compared with the largest of the recent function values:

    f(x_k + alpha_k d_k) <= max_{0 <= j <= m_k} f(x_{k-j}) + delta alpha_k g_k^T d_k,   0 <= m_k <= min{ m_{k-1} + 1, M }.

Dai and Zhang [39] developed a more adaptive nonmonotone line search in which the reference value is adjusted dynamically. For the unconstrained problems in the CUTE library, this new nonmonotone line search algorithm used fewer function and gradient evaluations, on average, than either the monotone or the traditional nonmonotone scheme.

The original method of Barzilai and Borwein was developed in [3]. The basic idea of Barzilai and Borwein is to regard the matrix D(alpha_k) = (1/alpha_k) I as an approximation of the Hessian and to impose a quasi-Newton property on it, choosing alpha_k to solve

    min_{alpha} || D(alpha) s_{k-1} - y_{k-1} ||,                  (2.51)

where s_{k-1} = x_k - x_{k-1}, y_{k-1} = g_k - g_{k-1}, and k >= 2. The proposed stepsize, obtained from (2.51), is

    alpha^BB_k = (s_{k-1}^T s_{k-1}) / (s_{k-1}^T y_{k-1}).        (2.52)

Other possible choices for the stepsize alpha_k include [29, 31, 34, 38, 58, 68, 109, 114]. In this dissertation, we refer to (2.52) as the Barzilai-Borwein (BB) formula. The iterative method (2.49), corresponding to the stepsize alpha_k taken to be the BB stepsize (2.52) and d_k = -g_k, is called the BB method. Due to their simplicity, efficiency, and low memory requirements, BB-like methods have been used in many applications. Glunt, Hayden, and Raydan [62] present a direct application of the BB method in chemistry. Birgin et al. [8] use a globalized BB method to estimate the optical constants and the thickness of thin films, while in Birgin et al. [10] further extensions are given, leading to more efficient projected gradient methods. Liu and Dai [92] provide a powerful scheme for solving noisy unconstrained optimization problems by combining the BB method and a stochastic approximation method. The projected BB-like method turns out to be very useful in machine learning for training support vector machines (see Serafini et al. [114] and Dai and Fletcher [31]). Empirically, good performance is observed on a wide variety of classification problems. All of the above good performance of BB-like methods is based on combining the method with some nonmonotone line search technique. In the remainder of this section, we are going to discuss a so-called cyclic BB method (CBB). By using a modified version of the adaptive nonmonotone line search developed in [39], a globally convergent adaptive scheme for the CBB method is also developed.

2.2.2 Method and Local Linear Convergence

In the cyclic BB (CBB) method, the BB stepsize (2.52) is computed at the start of a cycle and then reused for the following iterations of the cycle, where m >= 1 is again the cycle length. An advantage of the CBB method is that for general nonlinear functions, the stepsize is given by the simple formula (2.51), in contrast to the nontrivial optimization problem associated with the steepest descent step.

In this section we prove R-linear convergence for the CBB method. In [92], it is proposed that R-linear convergence for the BB method applied to a general nonlinear function could be obtained from the R-linear convergence results for a quadratic, by comparing the iterates associated with a quadratic approximation to the general nonlinear iterates. In our R-linear convergence result for the CBB method, we make such a comparison.

The CBB iteration can be expressed as

    x_{k+1} = x_k - alpha_k g_k,                                   (2.54)

where alpha_k = (s_i^T s_i) / (s_i^T y_i) and i is the iteration at which the current cycle's BB stepsize was computed.
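A bare-bones version of the recurrence (2.54) makes the bookkeeping explicit: the BB stepsize (2.52) is recomputed only at the start of each length-m cycle and reused inside the cycle. The sketch below omits the nonmonotone safeguards of Section 2.2.4, so it is a minimal illustration rather than a robust solver; the initial stepsize `alpha0` and the decision to keep the previous stepsize when s^T y <= 0 are assumptions of the sketch.

```python
import numpy as np

def cbb(grad, x0, m=4, alpha0=1e-2, tol=1e-8, max_iter=100000):
    """Cyclic Barzilai-Borwein: x_{k+1} = x_k - alpha_k g_k, with the BB
    stepsize s^T s / s^T y refreshed once every m iterations."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        if (k + 1) % m == 0:                 # start of a new cycle: refresh the stepsize
            s, y = x_new - x, g_new - g
            sy = s @ y
            if sy > 0.0:                     # keep the old stepsize if curvature is nonpositive
                alpha = (s @ s) / sy
        x, g = x_new, g_new
    return x
```

On convex quadratic test problems such as (2.92), this is where the role of the cycle length m can be observed directly: the experiments reported below indicate superlinear behavior once m is taken larger than roughly n/2.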

PAGE 41

second-orderTaylorapproximation^ftofaroundxisgivenby^f(x)=f(x)+1 2(xx)TH(xx):(2.56) Wewillcompareaniteratexk+jgeneratedby(2.54)toaCBBiterate^xk;jassociatedwith^fandthestartingpoint^xk;0=xk.Moreprecisely,wedene:^xk;0=xk^xk;j+1=^xk;j^k;j^gk;j;j0; where^k;j=8>><>>:kif(k+j)=(k)^sTi^si Here^sk+j=^xk;j+1^xk;j,^gk;j=H(^xk;jx),and^yk+j=^gk;j+1^gk;j. Weexploitthefollowingresultestablishedin[29,Thm.3.2]: 2k^xk;0xk:

PAGE 42

and1yTr2f(x)y yTy2forally2
PAGE 43

proof,wealsoshowthatkg(xk+j)^g(^xk;j)kckxkxk2; forallj2[0;`],where^g(x)=r^f(x)T=H(xx). Theproofof(2.64){(2.67)isbyinductionon`.For`=0,wetake=.Therelation(2.64)istrivialsince^xk;0=xk.By(2.59),wehavekg(xk)^g(^xk;0)k=kg(xk)^g(xk)kkxkxk2; Now,proceedingbyinduction,supposethatthereexistL2[1;N)and>0withthepropertythatif(2.63)holdsforany`2[0;L1],then(2.64){(2.67)aresatisedforallj2[0;`].Wewishtoshowthatforasmallerchoiceof>0,wecanreplaceLbyL+1.Hence,wesupposethat(2.63)holdsforallj2[0;L].Since(2.63)holdsforallj2[0;L1],itfollowsfromtheinductionhypothesisand(2.66)thatkxk+L+1xkkxkxk+LXi=0ksk+ikckxkxk: Consequently,bychoosingsmallerifnecessary,wehavexk+L+12B(x)whenxk2B(x).

PAGE 44

Bythetriangleinequality,kxk+L+1^xk;L+1k=kxk+Lk+Lg(xk+L)[^xk;L^k;L^g(^xk;L)]kkxk+L^xk;Lk+j^k;Ljkg(xk+L)^g(^xk;L)k+jk+L^k;Ljkg(xk+L)k: Wenowanalyzeeachofthetermsin(2.69).Bytheinductionhypothesis,thebound(2.64)withj=Lholds,whichgiveskxk+L^xk;Lkckxkxk2:(2.70) Bythedenitionof^,either^k;L=k2[12;11],or^k;L=^sTi^si 2^sTi^si 1: 1kg(xk+L)^g(^xk;L)kckxkxk2: Also,by(2.67)withj=Land(2.62),wehavejk+L^k;Ljkg(xk+L)kckxkxkkxk+Lxk:

PAGE 45

Wecombine(2.69){(2.72)toobtain(2.64)forj=L+1.Noticethatinestablishing(2.64),weexploited(2.65){(2.67).Consequently,tocompletetheinductionstep,eachoftheseestimatesshouldbeprovedforj=L+1. Focusingon(2.65)forj=L+1,wehavekg(xk+L+1)^g(^xk;L+1)kkg(xk+L+1)^g(xk+L+1)k+k^g(xk+L+1)^g(^xk;L+1)k=kg(xk+L+1)^g(xk+L+1)k+kH(xk+L+1^xk;L+1)kkg(xk+L+1)H(xk+L+1x)k+2kxk+L+1^xk;L+1kkg(xk+L+1)H(xk+L+1x)k+ckxkxk2; Observethatk+L+1eitherequalsk2[12;11],or(sTisi)=(sTiyi),wherek+Li=(k+L+1)>k.Inthislattercase,sincexk+j2B(x)for0jL+1,itfollowsfrom(2.61)thatk+L+11 1: Finally,wefocuson(2.67)forj=L+1.If(k+L+1)=(k),then^k;L+1=k+L+1=k,sowearedone.Otherwise,(k+L+1)>(k),andthere

PAGE 46

existsanindexi2(0;L]suchthatk+L+1=sTk+isk+i Since^k;i2[12;11],wehavek^sk+ik=k^k;i^gk;ik1 2kH(^xk;ix)k1 Hence,combining(2.73)and(2.74)gives1sTk+isk+i Nowletusconsiderthedenominatorsofk+iand^k;i.ObservethatsTk+iyk+i^sTk+i^yk+i=sTk+i(yk+i^yk+i)+(sk+i^sk+i)T^yk+i=sTk+i(yk+i^yk+i)+(sk+i^sk+i)TH^sk+i: By(2.64)and(2.66),wehavej(sk+i^sk+i)TH^sk+ij=j(sk+i^sk+i)THsk+i(sk+i^sk+i)TH(sk+i^sk+i)jckxkxk3

PAGE 47

forsucientlysmall.Also,by(2.65)and(2.66),wehavejsTk+i(yk+i^yk+i)jksk+ik(kgk+i+1^gk;i+1k+kgk+i^gk;ik)ckxkxk3:(2.78) Combining(2.76){(2.78)yieldssTk+iyk+i^sTk+i^yk+ickxkxk3:(2.79) Sincexk+iandxk+i+12B(x),itfollowsfrom(2.60)thatsTk+iyk+i=sTk+i(gk+i+1gk+i)1ksk+ik2=1jk+ij2kgk+ik2:(2.80) By(2.61)and(2.60),wehavejk+ij2kgk+ik21 22kgk+ik2=1 22kg(xk+i)g(x)k221 Finally,(2.63)giveskxk+ixk21 4kxkxk2:(2.82) Combining(2.80){(2.82)yieldssTk+iyk+i31 Combining(2.79)and(2.83)gives1^sTk+i^yk+i Observethatjk+L+1^k;L+1j=sTk+isk+i 11sTk+isk+i 1ja(1b)+bj1 1(jaj+jbj+jabj);

PAGE 48

wherea=1sTk+isk+i 2k^xk;0xkfor0j`1
PAGE 49

forallj2[0;`].Wedene=minf1;;(41)1g:(2.90) Foranyx0andx12B(x),wedeneasequence1=k10bethesmallestintegerwiththepropertythatk^xk1;j1xk1 2k^xk1;0xk=1 2kx1xk:Sincex0andx12B(x)B(x),itfollowsfrom(2.61)that^1;0=1=sT0s0 2k^xk1;0xk=1kxk1xk2+1 2kxk1xk3 4kxk1xk: Sincekx1xk,itfollowsthatxk22B(x).By(2.88),xj2B(x)for1jk1. Now,proceedbyinduction.Assumethatkihasbeendeterminedwithxki2B(x)andxj2B(x)for1jki.Letji>0bethesmallestintegerwiththepropertythatk^xki;jixk1 2k^xki;0xk=1 2kxkixk: 4kxkixk:

PAGE 50

Again,xki+12B(x)andxj2B(x)forj2[1;ki+1]. Foranyk2[ki;ki+1),wehavekki+N1Ni,sincekiN(i1)+1.Hence,ik=N.Also,(2.89)giveskxkxk3kxkixk33 4i1kxk1xk33 4(k=N)1kx1xk=ckkx1xk; 33andc=3 41=N<1: 2xTAxbTx;(2.92) whereA2
PAGE 51

Table2{3:Transitiontosuperlinearconvergence superlinearm132445678linearm21334567 Ifg(i)kdenotesthei-thcomponentofthegradientgk,thenby(2.94)and(2.93),wehaveg(i)k+1=(1ki)g(i)ki=1;2;:::;n:(2.95) Weassumethatg(i)k6=0forallsucientlylargek.Ifg(i)k=0,thenby(2.95)com-ponentiremainszeroduringallsubsequentiterations;henceitcanbediscarded.IntheBBmethod,startingvaluesareneededforx0andx1inordertocom-pute1.InourstudyofCBB,wetreat1asafreeparameter.Inournumericalexperiments,wechoose1tobetheexactstepsize. Fordierentchoicesofthediagonalmatrix(2.93)andthestartingpointx1,wehaveevaluatedtheconvergencerateofCBB.Bytheanalysisgivenin[58]forpositivedenitequadratics,orbytheresultgiveninTheorem8forgeneralnonlinearfunctions,theconvergencerateoftheiteratesisatleastlinear.Ontheotherhand,formsucientlylarge,weobserveexperimentally,thattheconvergencerateissuperlinear.Forxeddimensionn,thevalueofmwheretheconvergenceratemakesatransitionbetweenlinearandnonlinearisshowninTable2{3.Moreprecisely,foreachvalueofn,theconvergencerateissuperlinearwhenmisgreatthanorequaltotheintegergiveninthesecondrowoftheTable2{3.TheconvergenceislinearwhenmislessthanorequaltotheintegergiveninthethirdrowofTable2{3. ThelimitingintegersappearinginTable2{3arecomputedinthefollowingway:Foreachdimension,werandomlygenerate30problems,witheigenvaluesuni-formlydistributedon(0;n],and50startingpoints{atotalof1500problems.Foreachtestproblem,weperform1000nCBBiterations,andweplotlog(log(kgkk1))

PAGE 52

versustheiterationnumber.Wetthedatawithaleastsquaresline,andwecom-putethecorrelationcoecienttodeterminehowwellthelinearregressionmodeltsthedata.Ifthecorrelationcoecientis1(or1),thenthelineartisperfect,whileacorrelationcoecientof0meansthatthedataisuncorrelated.Alineartinaplotoflog(log(kgkk1))versustheiterationnumberindicatessuperlinearconvergence.Formlargeenough,thecorrelationcoecientsarebetween1:0and0:98,indicatingsuperlinearconvergence.Aswedecreasem,thecorrelationcoecientabruptlyjumpstotheorderof0:8.TheintegersshowninTable2{3reectthevaluesofmwherethecorrelationcoecientjumps. BasedonTable2{3,theconvergencerateisconjecturedtobesuperlinearform>n=23.Forn<6,therelationshipbetweenmandnatthetransitionbe-tweenlinearandsuperlinearconvergenceismorecomplicated,asseeninTable2{3.GraphsillustratingtheconvergenceappearinFigure2{3.Thehorizontalaxisintheseguresistheiterationnumber,whiletheverticalaxisgiveslog(log(kgkk1)).Herekk1representsthesup-norm.Inthiscase,straightlinescorrespondtosuper-linearconvergence{theslopeofthelinereectstheconvergenceorder.InFigure2{3,thebottomtwographscorrespondtosuperlinearconvergence,whilethetoptwographscorrespondtolinearconvergence{forthesetoptwoexamples,aplotoflog(kgkk1)versustheiterationnumberislinear.ForthetheoreticalvericationoftheexperimentalresultsgiveninTable2{3isnoteasy.However,somepartialresultinconnectionwithn=3andm=2canbetheoreticallyveried.Fordetails,onemayreferthepaper[32]. 2.2.4 2xTAx;A=diag(1;;n):(2.96)

PAGE 53

Figure2{3:Graphsoflog(log(kgkk1))versusk,(a)3n6andm=3,(b)6n9andm=4. Wewillseethatthechoiceformhasasignicantimpactonperformance.Thisleadsustoproposeanadaptivechoiceform.TheBBalgorithmwiththisadaptivechoiceformandanonmonotonelinesearchiscalledACBB.Numericalcompar-isonswithSPG2andwithconjugategradientcodesusingtheCUTErtestproblemlibraryaregiveninthenumericalcomparisonssection. A numerical investigation of CBB method. Weconsiderthetestproblem(2.96)withfourdierentconditionnumbersCforthediagonalmatrix:C=102,C=103,C=104,andC=105;andwiththreedierentdimensionsn=102,n=103,andn=104.Welet1=1,n=C,theconditionnumber.Theotherdiagonalelementsi,2in1,arerandomlygeneratedontheinterval(1;n).Thestartingpointsx(i)1,i=1;;n,arerandomlygeneratedontheinterval[5;5].Thestoppingconditioniskgkk2108:

PAGE 54

Table2{4:ComparingCBB(m)methodwithanadaptiveCBBmethod 10210214721915614515016016613613410350527154683643763954123673491041509F14258148527766288787711055412F5415307416701672115726071915 10310214727416015816216618115014510350517565485044935505404814601041609F18621533137715781447147013781055699F6760475535063516295744123187 10410215622716216616717018715615610353932005155515395365734975051041634F18231701178217471893158715171056362F6779519449654349473646874743 Ouradaptiveideaarisesfromthefollowingconsiderations.Ifastepsizeisusedinnitelyofteninthegradientmethod;namely,k,thenundertheassumptionthatthefunctionHessianAhasnomultipleeigenvalues,thegradientgkmustapproximateaneigenvectorofA,andgTkAgk=gTkgktendstothecorrespondingeigenvalueofA,see[29].Thus,itisreasonabletoassumethatrepeateduseofaBBstepsizeleadstogoodapproximationsofeigenvectorsofA.First,wedenek=gTkAgk

PAGE 55

IfgkisexactlyaneigenvectorofA,weknowthatk=1.Ifk1,thengkcanberegardedasanapproximationofaneigenvectorofAandBBkSDk.Inthiscase,itisworthwhiletocalculateanewBBstepsizeBBksothatthemethodacceptsastepclosetothesteepestdescentstep.Therefore,wetesttheconditionk;(2.98) where2(0;1)isconstant.Iftheaboveconditionholds,wecalculateanewBBstepsize.Wealsointroduceaparameter M,wecalculateanewBBstepsize.NumericalresultsforthisadaptiveCBBwith=0:95arelistedunderthecolumnadaptiveofTable2{4,wheretwovalues FromTable2{4,weseethattheadaptivestrategymakessense.Theperfor-mancewith Nonmonotone line search and cycle number. Aswesawinprevioussections,thechoiceofthestepsizekisveryimportantfortheperformanceofagradientmethod.FortheBBmethod,functionvaluesdonotdecreasemonotonically.Hence,whenimplementingBBorCBB,itisimportanttouseanonmonotonelinesearch. Assumingthatdkisadescentdirectionatthek-thiteration(gTkdk<0),acommonterminationconditionforthesteplengthalgorithmisf(xk+kdk)fr+kgTkdk;(2.99) wherefristheso-calledreferencefunctionvalueand2(0;1)aconstant.Iffr=f(xk),thenthelinesearchismonotonesincef(xk+1)
< f(x_k). A nonmonotone line search instead takes f^r to be the maximum function value for the M most recent iterates. That is, at the k-th iteration, we have

    f^r = f^max_k = max{ f(x_{k-i}) : 0 <= i <= min(k, M-1) }.     (2.100)

This nonmonotone line search is used by Raydan [108] to obtain GBB. Dai and Schittkowski [33] extended the same idea to a sequential quadratic programming method for general constrained nonlinear optimization. An even more adaptive way of choosing f^r is proposed by Toint [118] for trust region algorithms and then extended by Dai and Zhang [39]. Compared with (2.100), the new adaptive way of choosing f^r allows big jumps in function values, and is therefore very suitable for the BB algorithm (see [30], [31], and [39]).

The numerical results which we report in this section are based on the nonmonotone line search algorithm given in [39]. The line search in this paper differs from the line search in [39] in the initialization of the stepsize. Here, the starting guess for the stepsize coincides with the prior BB step until the cycle length has been reached, at which point we recompute the step using the BB formula. In each subsequent subiteration, after computing a new BB step, we replace (2.99) by

    f(x_k + alpha_k d_k) <= min{ f^max, f^r } + delta alpha_k g_k^T d_k.
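The modified acceptance test above depends only on the last M stored function values, the reference value f^r, and the directional derivative g_k^T d_k. The fragment below is a schematic stand-in for the Fortran line search actually used; the parameter names and the fixed memory M are illustrative.

```python
from collections import deque

def nonmonotone_accept(f_trial, alpha, gtd, history, f_ref, delta=1e-4):
    """Test f(x_k + alpha d_k) <= min{f_max, f_ref} + delta * alpha * g_k^T d_k."""
    f_max = max(history)                      # maximum of the last M function values
    return f_trial <= min(f_max, f_ref) + delta * alpha * gtd

# bookkeeping: a bounded history of function values, updated after each accepted step
M = 8
history = deque(maxlen=M)
history.append(1.0)                           # f(x_0) for a hypothetical starting point
```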

PAGE 57

search.Loosely,wewouldliketocomputeanewBBstepinanyofthefollowingcases: R1. ThenumberoftimesmthecurrentBBstepsizehasbeenreusedissucientlylarge:m R2. Thefollowingnonquadraticanalogueof(2.98)issatised:sTkyk where<1isnear1.Wefeelthatthecondition(2.101)shouldonlybeusedinaneighborhoodalocalminimizer,wherefisapproximatelyquadratic.Hence,weonlyusethecondition(2.101)whenthestepsizeissucientlysmall:kskk2
PAGE 58

Cycle termination/Stepsize initialization. T1. IfanyoftheconditionR1throughR4aresatisedandsTkyk>0,thenthecurrentcycleisterminatedandtheinitialstepsizeforthenextcycleisgivenbyk+1=maxfmin;minfsTksk T2. Ifthelengthmofthecurrentcyclesatisesm1:5 2.2.5 DESCENTcodeofHagerandZhang[73,74].TheSPG2algorithmisanextensionofRaydan's[108]GBBalgorithmwhichwasdownloadedfromtheTANGOwebpagemaintainedbyErnestoBirgin.Inourtests,wesettheboundsinSPG2toinnity.ThePRP+codeisavailableattheNocedal'swebpage.TheCG DESCENTcodeisfoundattheauthor'swebpage.ThelinesearchinthePRP+codeisamodicationofsubroutineCSRCHofMoreandThuente[101],whichemploysvariouspolynomialinterpolationschemesandsafeguardsinsatisfyingthestrongWolfeconditions.CG DESCENTemploysan\approximateWolfe"linesearch.AllcodesarewritteninFortranandcompiledwithf77underthedefaultcompilersettingsonaSunworkstation.TheparametersusedbyCG DESCENTarethedefaultparametervaluegivenin[74]forversion1.1

PAGE 59

ofthecode.ForSPG2,weuseparametervaluesrecommendedontheTANGOwebpage.Inparticular,thesteplengthwasrestrictedtotheinterval[1030;1030],whilethememoryinthenonmonotonelinesearchwas10. TheparametersoftheACBBalgorithmaremin=1030,max=1030,c1=c2=0:1,and Ournumericalexperimentsarebasedontheentiresetof160unconstrainedoptimizationproblemavailablefromCUTErintheFall,2004.Asexplainedin[74],wedeletedproblemsthatweresmall,orproblemswheredierentsolversconvergedtodierentlocalminimizers.Afterthedeletionprocess,wewereleftwith111testproblemswithdimensionrangingfrom50to104. Nominally,ourstoppingcriterionwasthefollowing:krf(xk)k1maxf106;1012krf(x0)k1g:(2.104) Inafewcases,thiscriterionwastoolenient.Forexample,withthetestproblempenalty1,thecomputedcoststilldiersfromtheoptimalcostbyafactorof105whenthecriterion(2.104)issatised.Asaresult,dierentsolversobtaincompletelydierentvaluesforthecost,andthetestproblemwouldbediscarded.Bychangingtheconvergencecriteriontokrf(xk)k1106,thecomputedcostsallagreedto6digits.Theproblemsforwhichtheconvergencecriterionwasstrengthenedweredqrtic,penalty1,power,quartc,andvardim. TheCPUtimeinsecondsandthenumberofiterations,functionevaluations,andgradientevaluationsforeachofthemethodsarepostedattheauthor'swebsite.HereweanalyzetheperformancedatausingtheprolesofDolanandMore[43].Thatis,weplotthefractionpofproblemsforwhichanygivenmethodis

within a factor tau of the best time. In a plot of performance profiles, the top curve is the method that solved the most problems in a time that was within a factor tau of the best time. The percentage of the test problems for which a method is the fastest is given on the left axis of the plot. The right side of the plot gives the percentage of the test problems that were successfully solved by each of the methods. In essence, the right side is a measure of an algorithm's robustness.

[Figure 2-4: Performance based on CPU time.]

In Figure 2-4, we use CPU time to compare the performance of the four codes ACBB, SPG2, PRP+, and CG_DESCENT. Note that the horizontal axis in Figure 2-4 is scaled proportional to log2(tau). The best performance, relative to the CPU time metric, was obtained by CG_DESCENT, the top curve in Figure 2-4, followed by ACBB. The horizontal axis in the figure stops at tau = 16 since the plots are essentially flat for larger values of tau. For this collection of methods, the number of times any method achieved the best time is shown in Table 2-5. The column total in Table 2-5 exceeds 111 due to ties for some test problems.
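The Dolan-Moré profile used here and in Figures 2-1 and 2-2 is straightforward to reproduce: for each problem p and solver s, form the ratio r_{p,s} = t_{p,s} / min_s t_{p,s} and plot, for every solver, the fraction of problems with r_{p,s} <= tau. The sketch below assumes the timings are stored as an array with one row per problem and one column per solver, with failures recorded as infinity; it illustrates the metric and is not the script used to produce the figures.

```python
import numpy as np

def performance_profile(times, taus):
    """times: (n_problems, n_solvers) array of CPU times, np.inf for failures.
    Returns rho with shape (len(taus), n_solvers): the fraction of problems
    each solver handled within a factor tau of the best time.
    Assumes at least one solver succeeds on every problem."""
    times = np.asarray(times, dtype=float)
    best = times.min(axis=1, keepdims=True)   # best time per problem
    ratios = times / best                     # r_{p,s}
    return np.array([(ratios <= tau).mean(axis=0) for tau in taus])

# rho at tau = 1 gives the fraction of problems on which each solver is fastest
```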

PAGE 61

Table2{5:Numberoftimeseachmethodwasfastest(timemetric,stoppingcrite-rion(2.104)) Method Fastest CG DESCENT 70ACBB 36PRP+ 9SPG2 9 TheresultsofFigure2{4indicatethatACBBismuchmoreecientthanSPG2,whileitperformedbetterthanPRP+,butnotaswellasCG DESCENT.>Fromtheexperiencein[108],theGBBalgorithm,withatraditionalnonmonotonelinesearch[66],maybeaectedsignicantlybynearlysingularHessiansatthesolution.WeobservethatnearlysingularHessiansdonotaectACBBsignicantly.Infact,Table2{4alsoindicatesthatACBBbecomesmoreecientastheproblembecomesmoresingular.Furthermore,sinceACBBdoesnotneedtocalculatetheBBstepsizeateveryiteration,CPUtimeissaved,whichcanbesignicantwhentheproblemdimensionislarge.Forthistestset,wefoundthattheaveragecyclelengthforACBBwas2.59.Inotherwords,theBBstepisreevaluatedafter2or3iterations,onaverage.Thismemorylengthissmallerthanthememorylengththatworkswellforquadraticfunction.Whentheiteratesarefarfromalocalminimizerofageneralnonlinearfunction,theiteratesmaynotbehaveliketheiteratesofaquadratic.Inthiscase,betternumericalresultsareobtainedwhentheBB-stepsizeisupdatedmorefrequently. EventhoughACBBdidnotperformaswellasCG DESCENTforthecom-pletesetoftestproblems,thereweresomecaseswhereitperformedexceptionallywell(seeTable2{6).OneimportantadvantageoftheACBBschemeoverconju-gategradientroutinessuchasPRP+orCG DESCENTisthatinmanycases,thestepsizeforACBBiseitherthepreviousstepsizeortheBBsizesize(2.51).Incon-trast,withconjugategradientroutines,eachiterationrequiresalinesearch.Dueto

PAGE 62

Table2{6:CPUtimesforselectedproblems ProblemDimension ACBBCG DESCENT FLETCHER5000 9.14989.55FLETCHER1000 1.3227.27BDQRTIC1000 .373.40VARDIM10000 .052.13VARDIM5000 .02.92 thesimplicityoftheACBBstepsize,itcanbemoreecientwhentheiteratesareinaregimewherethefunctionisirregularandtheasymptoticconvergenceprop-ertiesoftheconjugategradientmethodarenotineect.Onesuchapplicationisboundconstrainedoptimizationproblems{ascomponentsofxreachthebounds,thesecomponentsareoftenheldxed,andtheassociatedpartialderivativechangediscontinuously.InChapter3,ACBBiscombinedwithCG DESCENTtoobtainaveryecientactivesetalgorithmforboxconstrainedoptimizationproblems. 2.3 Theproximalpointmethodisonestrategyfordealingwithdegeneracyataminimum.Theiteratesxk,k1,aregeneratedbytherule:xk+12argminfFk(x):x2Rng;(2.105)

where

    F_k(x) = f(x) + (mu_k / 2) ||x - x_k||^2

and mu_k > 0 is the regularization parameter. The proximal point method, first proposed by Martinet [96, 97], has been studied in many papers including [69, 95, 84, 111, 112]. In [112] Rockafellar shows that if f is strongly convex at a solution of (2.1), then the proximal point method converges linearly when mu_k is bounded away from zero, and superlinearly when mu_k tends to zero. Here we develop linear and superlinear convergence results for problems that are not necessarily strongly convex.

Since the solution to (2.105) approximates a solution to (2.1), we do not need to solve (2.105) exactly. We analyze two criteria for the accuracy with which we solve (2.105). The first criterion is that an iterate x_{k+1} is acceptable when

(C1)  F_k(x_{k+1}) <= f(x_k)  and  ||grad F_k(x_{k+1})|| <= eta_k ||grad f(x_k)||.

Here C is a constant depending on local properties of f, and x_bar denotes the projection of x in R^n onto X:

    ||x - x_bar|| = min_{zeta in X} ||x - zeta||.
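The outer proximal loop is simple once a routine for minimizing F_k is available: at x_k, minimize F_k approximately until a test such as (C1) holds, then move to the approximate minimizer. The sketch below is only schematic and uses the notation reconstructed above; the self-adaptive choice mu_k = ||grad f(x_k)||^eta, the tolerance eta_k, and the `solve_subproblem` interface (standing in for cg_descent applied to F_k) are assumptions of the sketch, not part of the dissertation's code.

```python
import numpy as np

def inexact_proximal_point(f, grad, x0, solve_subproblem, eta_exp=1.0,
                           tol=1e-6, max_outer=100):
    """Outer loop of an inexact proximal point method with acceptance test (C1)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_outer):
        xk = x.copy()
        g = grad(xk)
        gnorm = np.linalg.norm(g)
        if gnorm <= tol:
            break
        mu = gnorm ** eta_exp                 # assumed self-adaptive regularization

        def Fk(z):                            # F_k(z) = f(z) + (mu/2) ||z - x_k||^2
            return f(z) + 0.5 * mu * np.dot(z - xk, z - xk)

        def grad_Fk(z):
            return grad(z) + mu * (z - xk)

        eta_k = min(0.5, gnorm)               # illustrative tolerance for (C1)
        # inner solver stops once F_k(x+) <= f(x_k) and ||grad F_k(x+)|| <= eta_k ||grad f(x_k)||
        x = solve_subproblem(Fk, grad_Fk, xk, grad_tol=eta_k * gnorm, f_cap=f(xk))
    return x
```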

PAGE 64

Sincefiscontinuous,thesetofminimizersXof(2.1)isclosed,andtheprojectionexists.Bytakingk=krf(xk)kin(2.106),weobtainquadraticconvergenceoftheapproximateproximaliteratestothesolutionsetX,whilethesequenceofiteratesapproachesalimitatleastlinearly. In[112]RockafellarstudiestheacceptanceconditionkrFk(xk+1)kkkkxk+1xkk 2.3.2 Usingthiscondition,YamashitaandFukushima[127],andFanandYuan[54],studytheLevenberg-Marquardtmethodtosolveasystemofnonlinearequations.Whentheirapproachisappliedto(1.1),thefollowinglinearsystemissolvedineachsubproblem:(H(xk)2+kI)d+H(xk)g(xk)=0;(2.108) wherek>0istheregularizationparameteranddisthesearchdirectionatstepk.In[54]and[127],theauthorschoosek=kg(xk)kandk=kg(xk)k2,

PAGE 65

respectively.Theyshowthatifrf(x)providedalocalerrorbound,thentheiteratesassociatedwith(2.108)arelocallyquadraticallyconvergent. Li,Fukushima,QiandYamashitapointoutin[88]thatthelinearsystemofequations(2.108)maylosesparsitywhenH(xk)issquared;moreover,squaringthematrixsquarestheconditionnumberofH(xk).Hence,theyconsiderasearchdirectiondchosentosatisfy:(H(xk)+kI)d+g(xk)=0;(2.109) wherek=ckg(xk)kforsomeconstantc>0.Whenf(x)isconvexandrf(x)providesalocalerrorbound,theyestablishalocalquadraticconvergenceresultforiteratesgeneratedbytheapproximatesolutionof(2.109)followedbyalinesearchalongtheapproximatesearchdirection. Whentheproblemdimensionislarge,computingtheHessianandsolving(2.109)canbeexpensive.Inourapproach,weuseournewlydevelopedconjugategradientroutinecg descent[73,72]tosolve(2.105)witheitherstoppingcriterion(C1)or(C2). First,byobservingthattheminimumofFkisboundedfromabovebyFk(xk),wecaneasilygetthefollowingproposition. 2kkxk+1xkk2Fk(xk)=f(xk)+1 2kkxkxkk2f(x)+1 2kkxkxkk2:

PAGE 66

Inourapproach,itismoreconvenienttoemployalocalerrorboundbasedonthefunctionvalueratherthanthefunctiongradientusedin(2.107).Wesaythatfprovidesalocalerrorboundat^x2Xifthereexistpositiveconstantsandsuchthatf(x)f(x)kxxk2wheneverkx^xk:(2.110) Wenowshowthatundercertainsmoothassumption,thesetwoconditionsareequivalent: Sincebothxandx2B(^x),wecanexpandf(x)inaTaylorseriesaroundxandapply(2.110)toobtain:kxxk2f(x)f(x)=1 2(xx)TH(xx)+R2(x;x); whereR2istheremaindertermandH=r2f(x)istheHessianatx.ChoosesmallenoughthatjR2(x;x)j

PAGE 67

Inthiscase,(2.112)gives4 Nowexpandrf(x)inaTaylorseriesaroundxtoobtainrf(x)=rf(x)rf(x)=H(xx)+R1(x;x);(2.114) whereR1istheremainderterm.ChoosesmallerifnecessarysothatkR1(x;x)k Combining(2.113){(2.115)yieldskrf(x)kkxxk: Conversely,supposerfprovidesalocalerrorboundat^x2Xwithpositivescalarsandsatisfying(2.107).ChoosesmallerifnecessarysothatfistwicecontinuouslydierentiableinB(^x)and(2.115)holds.CombiningtheTaylorexpansion(2.114),thefactthatHispositivesemidenite,thebound(2.115)ontheremainder,andthelocalerrorboundcondition(2.107),weobtain2

PAGE 68

whereisaboundfortheHessianoffoverB(^x).Similarto(2.112),weexpandfinaTaylorexpansionaroundxandutilize(2.116)toobtainf(x)f(x)R2(x;x)=1 2(xx)TH(xx)kxxk2;where=22 analysis for exact minimization. Werstanalyzetheproximalpointmethodwhentheiteratesareexactsolutionsof(2.105). 1;where=2

PAGE 69

Proceedingbyinduction,supposethat(2.119)holdsforallj2[0;k]andforsomek0.Weshowthat(2.119)holdsforallj2[0;k+1].ByProposition1andtheinductionhypothesis,kxk+1xkkkxkxkkkkx0x0k: 1kx0x0k1 1kx0^xk: 1kx0^xk:(2.120) Observethatkxk+1xkk2kxk+1xkk2=(xk+1+xk+12xk)(xk+1xk+1)(kxk+1xk+1k+2kxk+1xkk)kxk+1xk+1k: Combining(2.121)withtherelationFk(xk+1)Fk(xk+1)givesf(xk+1)f(xk+1)=1 2k(kxk+1xkk2kxk+1xkk2)1 2k(kxk+1xk+1k+2kxk+1xkk)kxk+1xk+1k:

PAGE 70

By(2.120),xk+12B(^x).Sincefprovidesalocalerrorboundat^x,weconcludethatkxk+1xk+1k2f(xk+1)f(xk+1):(2.123) Combiningthiswith(2.122)gives1 2kkxk+1xk+1kkkxk+1xkk:(2.124) Duetotheassumptionk<2=3,thecoecient(1 2k)in(2.124)ispositiveand2k=(2k)2=(2)=<1: Relations(2.120)and(2.126)completetheproofoftheinductionstep.Relation(2.125)givesestimate(2.118). By(2.119)andProposition1,wehavekxk+1xkkkxkxkkkkx0x0k:

PAGE 71

considerthecasewheretheregularizationsequencekofTheorem9isexpressedasafunctionofthecurrentiterate.Thatis,weassumethatk=(xk)where()isdenedonRn. 1r;where=2 Sincex02Br(^x),itfollowsby(Q3)that0=krf(x0)k.Hence,(2.128)issatisedforj=0.IntheproofofTheorem9,weshowthatifjforj2[0;k],thenthersttwoconditionsin(2.128)holdforj=k+1.Also,asin(2.120),xk+12Br(^x).Consequently,k+1=krf(xk+1)k,whichimpliesthatthelast

PAGE 72

conditionin(2.128)holdsforj=k+1.Thiscompletestheinductionstep;hence,(2.128)holdsforallj0. Sincexk2Br(^x),itfollowsthatxk2B(^x)(see(2.111)).Sincexkandxk2B(^x),wehavek=krf(xk)k=krf(xk)rf(xk)kLkxkxkk;(2.129) whereListheLipschitzconstantforrfinB(^x).Byestimate(2.118)inTheorem9,kxk+1xk+1k2k analysis for approximate minimization. Wenowanalyzethesituationwheretheproximalpointiteration(2.105)isimplementedinexactly;theapprox-imationtoasolutionof(2.105)needonlysatisfy(C1)or(C2).Thefollowingpropertyofaconvexfunctionisusedintheanalysis.

PAGE 73

SincefisconvexinN,themonotonicitycondition(rf(x)rf(x))(xx)0 holds.Combiningthiswith(2.130),weobtainrFk(x)(xx)kkxxk2:(2.131) BytheSchwarzinequality,kxxkkrFk(x)k Combining(2.131)and(2.132),theproofiscomplete. 122ifacceptancecriterion(C2)isused.

PAGE 74

(A4) 1;where= Iftheapproximateproximaliteratesxksatisfyeither(C1)or(C2)withk=(xk),thentheiteratesareallcontainedinB(^x),theyapproachaminimizerx2X,andforeachk,wehavekxkxkk Proceedingbyinduction,supposethat(2.136)holdsforallj2[0;k]andforsomek0.Weshowthat(2.136)holdsforallj2[0;k+1]. TheconditionFk(xk+1)f(xk)in(C1)or(C2)impliesthatkkxk+1xkk2f(xk)f(xk+1)f(xk)f(xk):(2.137) Since<1by(2.133),thersthalfof(2.136)withj=kimpliesthatxk2B(^x),wherer==2.Thuswehavexk2B(^x)(see(2.111)).Expandingfin(2.137)

PAGE 75

inaTaylorseriesaroundxkandusingthefactthatrf(xk)=0giveskkxk+1xkk2 whereistheboundfortheHessianoffoverB(^x).Bythetriangleinequality,wehavekxk+1^xkkxk+1xkk+kxk^xk:(2.139) Combining(2.139)withtheconditionxk2B(^x),(2.138),and(A4)yields:kxk+1^xk+kxk+1xkk+s Let^xk+1denoteanexactproximalpointiterate:^xk+12argminfFk(x):x2Rng: Bythetriangleinequality,(2.140),thefactthat1,and(2.136),weobtaink^xk+1^xkk^xk+1xkk+kxk^xkkkx0x0k+kxk^xkkkx0x0k+kx0^x0k1+k1Xl=0l!kx0^xk1+kXl=0l!: Byassumption(A1),fprovidesalocalerrorboundwithconstantsand.Hence,byLemma5,rfprovidesalocalerrorboundwithconstantsandr==2.SincerF(xk+1)=rf(xk+1)+k(xk+1xk)andxk+12Br(^x),thelocal

PAGE 76

errorboundconditiongiveskxk+1xk+1kkrf(xk+1)kkrFk(xk+1)k+kkxk+1xkk:(2.141) If(C1)isused,thenkrFk(xk+1)kkkrf(xk)k,and(2.141)impliesthatkxk+1xk+1kk(krf(xk)k+kxk+1xkk)k(Lkxkxkk+kxk+1xkk): If(C2)isused,thenkrFk(xk+1)kkkxk+1xkk,andby(2.141),wehavekxk+1xk+1kk(1+)kxk+1xkk:(2.143) Wenowderiveaboundforthekxk+1xkktermineither(2.142)or(2.143).Sincebothxk+1and^xk+1lieinB(^x),wecanapplyProposition2toobtainFk(xk+1)=Fk(^xk+1)+(Fk(xk+1)Fk(^xk+1))Fk(xk)+1 Aboveweobservedthatxk2B(^x)wherer==2.In(2.111)weshowthatxk2B(^x)whenxk2Br(^x).Hence,by(A2)wehavekrf(xk)k=krf(xk)rf(xk)kLkxkxkk:(2.145) If(C1)isused,thenby(2.145),krFk(xk+1)kkkrf(xk)kkLkxkxkk:(2.146) Usingtherelationf(xk)f(xk+1)in(2.144)gives:k

PAGE 77

Combiningthiswith(2.146),wehavekxk+1xkk2(1+2L2)kxkxkk2:(2.148) Similarly,ifcriterion(C2)isused,thenkrFk(xk+1)kkkxk+1xkk,andby(2.147),wehavekxk+1xkk21 122kxkxkk2:(2.149) Combining(2.148)and(2.149)andreferringtothedenitionofin(A3),weconcludethatkxk+1xkkkxkxkk(2.150) ifeitheracceptancecriterion(C1)or(C2)isused. Insertingthebound(2.150)in(2.142)or(2.143)yields(2.135).By(2.135)andthedenitionofin(A3),wehavekxk+1xk+1kk Forthersthalfof(2.136),weusethetriangleinequality,(2.150),andtheinductionhypothesistoobtain:kxk+1^xkkxk+1xkk+kxk^xkkxkxkk+kxk^xkkkx0x0k+kxk^xkkkx0x0k+kx0^xk1+k1Xl=0l!kx0^xk1+kXl=0l!

PAGE 78

By(2.136)and(2.150),wehavekxk+1xkkkxkxkkkkx0x0k:(2.151) Hence,thexkformaCauchysequence,whichhasalimitdenotedx.Bythetriangleinequalityand(2.151),kxkxk1Xj=kkxj+1xjk1Xj=kkxjxjk1Xj=kjkx0x0k=k andx0issucientlycloseto^x.Forexample,if(x)=krf(x)kwhere2[0;2)and>0isaconstant,andifrfprovidesalocalerrorboundat^x,thenkxxk2

PAGE 79

2.3.4 (L1) Thereexistpositiveconstantsc1andc2satisfyingboththesucientdescentconditiongTkdkc1kgkk2; (L2) TheWolfeconditions[122,123]hold.Thatis,ifkdenotesthestepsizeandyk=xk+kdk,thenFk(yk)Fk(xk)+krFk(xk)dk=Fk(xk)+kgTkdk; Ourglobalconvergenceresultisasfollows:

PAGE 80

By(L1)andtherstWolfeconditionin(L2),itfollowsthatFk(yk)Fk(xk)c1kkgkk2: BythesecondWolfeconditionin(L2)and(L1),wehave(rFk(yk)rFk(xk))dk(1)rFk(xk)dk=(1)gTkdk(1)c1kgkk2:

PAGE 81

IfListheLipschitzconstantforrfinB,thenwehave(rFk(yk)rFk(xk))d(xk)=(rf(yk)rf(xk)+k(ykxk)T)dk(L+k)kykxkkkdkk=k(L+k)kdkk2c22k(L+k)kgkk2: 21Xk=0minkg(xk)k2 21Xk=0minkg(xk)k2 2n1Xi=1(xixi+1)2+1 12n1Xi=1i(xixi+1)4;(P2)f(x)=nXi=1bi(xi1)2+nXi=1(xi1)4:

PAGE 82

Table2{7:kg(xk)kversusiterationnumberk (C1)(C2)(C1)(C2) 1 4.5e015.4e01 4.7e011.3e012 7.2e021.0e01 1.3e023.2e033 2.6e037.1e03 1.5e042.5e054 3.5e062.6e05 4.2e074.5e085 6.4e124.3e10 1.8e101.2e11 theminimizerisunique,however,theconditionnumberoftheHessianatthesolutionisaround:5e16.Table2{7givesthegradientnormscorrespondingto(x)=:05krf(x)k,andacceptancecriterions(C1)and(C2).For(C2),wechose=:66.Thesubproblems(2.105)weresolvedusingourconjugategradientroutinecg descent[73,72],stoppingwheneither(C1)or(C2)issatised. Inthenextseriesofnumericalexperiments,wesolvesomeill-conditionedprob-lemsfromtheCUTElibrary[13].Experimentally,wendthatitismoreecienttousetheproximalstrategyinaneighborhoodNofanoptimum;outsideN,weapplycd descenttotheoriginalproblem(2.1).Inournumericalexperiments,ourmethodforchoosingNwasthefollowing:Weappliedtheconjugategradientmethodtotheoriginalproblem(2.1)untilthefollowingconditionwassatised:kg(x)k1102(1+jf(x)j):(2.156) Whenthisconditionholds,wecontinuetoapplytheconjugategradientmethoduntilanestimatefortheconditionnumberexceeds103;thenweswitchtotheproximalpointmethod(2.105),usingcg descenttosolvethesubproblems.Weestimatetheconditionnumberbyrstestimatingthesecondderivativeofthefunctionalongthenormalizedsearchdirection.Ourestimatefortheconditionnumberistheratiobetweenthemaximumandminimumsecondderivative,duringtheiterationsafter(2.156)issatised.

PAGE 83

Table2{8givesconvergenceresultsforill-conditionproblemsfromCUTE[13].The\exactconditionnumbers"arecomputedinthefollowingway:We Table2{8:Statisticsforill-conditionCUTEproblemsandcg descent ItNFNGItNFNG SPARSINE20002.6e17 12,52825,05712,529 10,30720,61510,308SPARSINE10003.4e14 4,6579,3154,658 3,7607,5213,761NONDQUAR10006.6e10 4,0048,0154,152 3,0136,0323,068NONDQUAR5001.0e10 3,0276,0743,185 2,5265,0622,676EIGENALS4201.2e06 1,7923,5911,811 1,4642,9351,482EIGENBLS4203.0e05 5,08710,1855,099 2,4534,9102,458EIGENCLS4208.2e04 1,7333,4841,754 1,7743,5661,795NCB205103.7e16 1,6313,0482,372 1,2512,2621,684 solvetheproblemandoutputtheHessianmatrixatthesolution.TheextremeeigenvaluesoftheHessianwerecomputedusingMatlab,andtheeigenvalueratioappearsinthecolumnlabeled\Cond"ofTable2{8.Fortheproximalpointiteration,weusedacceptancecriterion(C2).Theiterationswerecontinueduntilthestoppingconditionkrf(x)k1106wassatised.Thenumberofiteration(It),numberoffunctionevaluations(NF),andnumberofgradientevaluations(NG)aregiveninthetable.ObservethatthereductioninthenumberoffunctionandgradientevaluationsvariesfromalmostnothinginEIGENCLStoabout50%forEIGENBLS.

CHAPTER 3
BOX CONSTRAINED OPTIMIZATION

3.1 Introduction

In this chapter, we consider the problem (1.1) with the feasible set S taken to be a box B; i.e., problem (1.1) becomes

    min { f(x) : x in B },                                         (3.1)

where f is a real-valued, continuously differentiable function defined on the set

    B = { x in R^n : l <= x <= u }.                                (3.2)

Here l < u.
PAGE 85

aniteration.Later,YangandTolle[128]furtherdevelopedthisalgorithmsoastoobtainnitetermination,evenwhentheproblemwasdegenerateatalocalminimizerx.Thatis,forsomei,xi=liorxi=uiandrf(x)i=0.AnothervariationoftheCGPalgorithm,forwhichthereisarigorousconvergencetheory,isdevelopedbyWright[124].MoreandToraldo[102]pointoutthatwhentheCGPschemestartsfarfromthesolution,manyiterationsmayberequiredtoidentifyasuitableworkingface.Hence,theyproposeusingthegradientprojectionmethodtoidentifyaworkingface,followedbytheconjugategradientmethodtoexploretheface.Theiralgorithm,calledGPCG,hasniteterminationfornondegeneratequadraticproblems.Recently,adaptiveconjugategradientalgorithmshavebeendevelopedbyDostaletal.[44,45,47]whichhaveniteterminationforastrictlyconvexquadraticcostfunction,evenwhentheproblemisdegenerate. Forgeneralnonlinearfunctions,someoftheearlierresearch[4,19,63,87,99,113]focusedongradientprojectionmethods.Toacceleratetheconvergence,morerecentresearchhasdevelopedNewtonandtrustregionmethods(see[26]forin-depthanalysis).In[5,17,24,51]superlinearandquadraticconvergenceisestablishedfornondegenerateproblems,while[53,59,86,90]establishanalogousconvergenceresults,evenfordegenerateproblems.AlthoughcomputingaNewtonstepcanbeexpensivecomputationally,approximationtechniques,suchasasparse,incompleteCholeskyfactorization[89],couldbeusedtoreducethecomputationalexpense.Nonetheless,forlarge-dimensionalproblemsorforproblemswheretheinitialguessisfarfromthesolution,theNewton/trustregionapproachcanbeinecient.IncaseswheretheNewtonstepisunacceptable,agradientprojectionstepispreferred. Theane-scalinginteriorpointmethodofColemanandLi[14,21,22,23]isadierentapproachto(3.1),relatedtothetrustregionalgorithm.Morerecentresearchonthisstrategyincludes[42,80,83,120,133].Thesemethodsarebased

PAGE 86

onareformulationofthenecessaryoptimalityconditionsobtainedbymultiplica-tionwithascalingmatrix.TheresultingsystemisoftensolvedbyNewton-typemethods.Withoutassumingstrictcomplementarity(i.e.fordegenerateproblems),theane-scalinginterior-pointmethodconvergessuperlinearlyorquadratically,forasuitablechoiceofthescalingmatrix,whenthestrongsecond-ordersucientoptimalitycondition[110]holds.Whenthedimensionislarge,formingandsolvingthesystemofequationsateachiterationcanbetimeconsuming,unlesstheprob-lemhasspecialstructure.Recently,Zhang[133]proposesaninterior-pointgradientapproachforsolvingthesystemateachiteration.Convergenceresultsforotherinterior-pointmethodsappliedtomoregeneralconstrainedoptimizationappearin[48,49,125]. Themethoddevelopedinthischapterisanactivesetalgorithm(ASA)whichconsistsofanonmonotonegradientprojectionstep,anunconstrainedoptimizationstep,andasetofrulesforbranchingbetweenthesteps.Agoodsurveyofthisactivesetmethodcanbefoundin[76].Globalconvergencetoastationarypointisestablished.Whenthestrongsecond-ordersucientoptimalityconditionholds,weshowthatASAeventuallyreducestounconstrainedoptimization,withoutrestarts.Thispropertyisobtainedwithoutassumingstrictcomplementaryslackness.Ifstrictcomplementarityholdsandalltheconstraintsareactiveatastationarypoint,thenconvergenceoccursinanitenumberofiterations.Ingeneral,ouranalysisdoesnotshowthatthestrictlyactiveconstraintsareidentiedinanitenumberofiterations;instead,whenthestrongsecond-ordersucientoptimalityconditionholds,weshowthatASAeventuallybranchestotheunconstrainedoptimizationstep,andhenceforth,theactivesetdoesnotchange.Thusinthelimit,ASAreducestounconstrainedoptimizationwithoutrestarts.

PAGE 87

Figure 3-1: The gradient projection step.

3.2 Gradient Projection Methods

We begin with an overview of our gradient projection algorithm. Step k in our algorithm is depicted in Figure 3-1. Here P denotes the projection onto Ω:

    P(x) = arg min_{y ∈ Ω} ‖x − y‖.    (3.4)

Starting at the current iterate x_k, we compute an initial iterate x̄_k = x_k − ᾱ_k g_k.
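
Since Ω is the box B = {x : l ≤ x ≤ u} in our setting, the Euclidean projection (3.4) reduces to componentwise clamping. The following minimal sketch, under that assumption, shows the projection and the trial point P(x − αg) of Figure 3-1.

    import numpy as np

    def project_onto_box(x, l, u):
        """Euclidean projection of x onto B = {y : l <= y <= u}."""
        return np.minimum(np.maximum(x, l), u)

    def projected_gradient_trial_point(x, g, alpha, l, u):
        """Trial point P(x - alpha*g) used to form the search direction d_k."""
        return project_onto_box(x - alpha * g, l, u)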

In the statement of the nonmonotone gradient projection algorithm (NGPA) given below, f^r_k denotes the "reference" function value. A monotone line search corresponds to the choice f^r_k = f(x_k). The nonmonotone GLL scheme takes f^r_k = f^max_k where

    f^max_k = max{f(x_{k−i}) : 0 ≤ i ≤ min(k, M−1)}.    (3.5)

Here M > 0 is a fixed integer, the memory. For a specific procedure on how to choose the reference function value based on our cyclic BB scheme, one can refer to the papers [39, 78, 132].

Nonmonotone Gradient Projection Algorithm (NGPA). Parameters include δ ∈ (0, 1) and [ᾱ_min, ᾱ_max], an interval containing the initial stepsize.

While ‖P(x_k − g_k) − x_k‖ > ε:

1. Choose ᾱ_k ∈ [ᾱ_min, ᾱ_max] and set d_k = P(x_k − ᾱ_k g_k) − x_k.

2. Choose f^r_k so that f(x_k) ≤ f^r_k ≤ max{f^r_{k−1}, f^max_k} and f^r_k ≤ f^max_k infinitely often.

3. Let f_R be either f^r_k or min{f^max_k, f^r_k}. If f(x_k + d_k) ≤ f_R + δ g_k^T d_k, then α_k = 1.

4. If f(x_k + d_k) > f_R + δ g_k^T d_k, then α_k = η^j where j > 0 is the smallest integer such that

    f(x_k + η^j d_k) ≤ f_R + η^j δ g_k^T d_k.    (3.6)

5. Set x_{k+1} = x_k + α_k d_k and k = k + 1.

End

The condition f(x_k) ≤ f^r_k guarantees that the Armijo line search in Step 4 can be satisfied. The requirement that "f^r_k ≤ f^max_k infinitely often" in Step 2 is needed for the global convergence result, Theorem 12. This is a rather weak requirement which can be satisfied by many strategies. For example, every L iterations, we could simply set f^r_k = f^max_k. Another strategy, closer in spirit to the one used in the numerical experiments, is to choose a decrease parameter Δ > 0 and an integer L > 0 and set f^r_k = f^max_k if f(x_{k−L}) − f(x_k) ≤ Δ.

To begin the convergence analysis, recall that x* is a stationary point for (3.3) if the first-order optimality condition holds:

    ∇f(x*)(x − x*) ≥ 0 for all x ∈ Ω.    (3.7)

Let d^α(x), α ∈ R, be defined in terms of the gradient g(x) = ∇f(x)^T as follows:

    d^α(x) = P(x − α g(x)) − x.

P1. (P(x) − x)^T (y − P(x)) ≥ 0 for all x ∈ Rⁿ and y ∈ Ω.

P2. (P(x) − P(y))^T (x − y) ≥ ‖P(x) − P(y)‖² for all x and y ∈ Rⁿ.
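
The next block is a minimal sketch, not the authors' implementation, of the nonmonotone Armijo step in Steps 3-4 and of the GLL reference value (3.5). Here f is the objective, fR the reference value, g the gradient at x, d the projected search direction, and delta, eta in (0, 1) play the roles of the descent and decay parameters.

    import numpy as np

    def ngpa_line_search(f, x, g, d, fR, delta=1e-4, eta=0.5, max_backtracks=50):
        gtd = float(np.dot(g, d))          # g_k^T d_k < 0 for a descent direction
        if f(x + d) <= fR + delta * gtd:   # Step 3: accept the unit step
            return 1.0
        step = 1.0
        for _ in range(max_backtracks):    # Step 4: backtrack by the factor eta
            step *= eta
            if f(x + step * d) <= fR + step * delta * gtd:
                return step
        return step

    def reference_value(f_history, M):
        """GLL choice (3.5): maximum of the last min(k, M) function values."""
        return max(f_history[-M:])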

P4. ‖d^1(x)‖ ≤ ...

Finally, let us consider P8. Replacing x by x − g(x) and replacing y by x* in P1 gives

    [P(x − g(x)) − x + g(x)]^T [x* − P(x − g(x))] ≥ 0.    (3.10)

By the definition of d^α(x), (3.10) is equivalent to

    [d^1(x) + g(x)]^T [x* − x − d^1(x)] ≥ 0.    (3.11)

... ≥ 0 by (3.7) since P(x − g(x)) ∈ Ω. Combining (3.11) and (3.12), the proof is complete.

for all j ∈ [0, k]. Again, since the search direction d_k generated in Step 1 of NGPA is a descent direction, it follows from Steps 3 and 4 of NGPA and the induction hypothesis that

    f(x_{k+1}) ≤ f^r_k ≤ f(x_0).    (3.15)

Hence, f^max_{k+1} ≤ f(x_0) and f^r_{k+1} ≤ max{f^r_k, f^max_{k+1}} ≤ f(x_0). This completes the induction. Thus (3.14) holds for all j. Consequently, we have f_R ≤ f(x_0) in Steps 3 and 4 of NGPA. Again, since the search direction d_k generated in Step 1 of NGPA is a descent direction, it follows from Steps 3 and 4 that f(x_k) ≤ f(x_0), which implies that x_k ∈ L for each k.

Let λ be the Lipschitz constant for ∇f on L. As in [131, Lem. 2.1], we have

    α_k ≥ min{1, (2η(1 − δ)/λ) |g_k^T d_k| / ‖d_k‖²}    (3.16)

for all k. By P6, |g_k^T d_k| ≥ ‖d_k‖²/ᾱ_k.

It follows from (3.16) that α_k ≥ min{1, 2η(1 − δ)/(λ ᾱ_max)} ... By Steps 3 and 4 of NGPA and P6, we conclude that

    f(x_{k+1}) ≤ f^r_k + c α_k g_k^T d_k ≤ f^r_k − c‖d_k‖²/ᾱ_k ≤ f^r_k − c‖d_k‖²/ᾱ_max.    (3.18)

We now prove that lim inf_{k→∞} ‖d_k‖ = 0. Suppose, to the contrary, that there exists a constant γ > 0 such that ‖d_k‖ ≥ γ for all k. By (3.18), we have

    f(x_{k+1}) ≤ f^r_k − τ, where τ = cγ²/ᾱ_max.    (3.19)

Let k_i, i = 0, 1, ..., denote an increasing sequence of integers with the property that f^r_j ≤ f^max_j for j = k_i and f^r_j > f^r_{j−1} when k_i < j < k_{i+1} ... and a − b > M, then by (3.20)-(3.22), we have f^max_a = max_{0 ≤ j < ...} ...

Thus lim inf_{k→∞} ‖d^1(x_k)‖ = 0.

... for all x and y ∈ Ω. Interchanging x and y in (3.23) and adding, we obtain the (usual) monotonicity condition

    (∇f(y) − ∇f(x))(y − x) ≥ μ‖y − x‖².    (3.24)

For a strongly convex function, (3.3) has a unique minimizer x* and the conclusion of Theorem 12 can be strengthened as follows:

Since f is strongly convex on Ω, x* is the unique stationary point for (3.3). Hence, when the iterates converge in a finite number of steps, they converge to x*. Otherwise, (3.25) holds, in which case there exists an infinite sequence l_1 < l_2 < ... and, for ε > 0 and j sufficiently large, we have f(x_k) ≤ f(x*) + ε for all k ∈ [l_j, l_j + M + L]. As in the proof of Theorem 12, let k_i, i = 0, 1, ..., denote an increasing sequence of integers with the property that f^r_j ≤ f^max_j for j = k_i and f^r_j > f^r_{j−1} when k_i < j < k_{i+1}.

for each i. The assumption that for each k, there exists j ∈ [k, k + L) such that f^r_j ≤ f^max_j implies that

    k_{i+1} − k_i ≤ L.    (3.29)

Combining (3.27) and (3.29), for each l_j, there exists some k_i ∈ [l_j + M, l_j + M + L] and

    f^max_{k_i} ≤ f(x*) + ε.    (3.30)

Since ε was arbitrary, it follows from (3.28) and (3.30) that

    lim_{i→∞} f^max_{k_i} = f(x*);    (3.31)

the convergence is monotone by (3.28). By the choice of k_i and by the inequality f(x_k) ≤ f^r_k in Step 2, we have

    f(x_k) ≤ f^r_k ≤ f^max_{k_i} for all k ≥ k_i.    (3.32)

Combining (3.31) and (3.32),

    lim_{k→∞} f(x_k) = f(x*).    (3.33)

Together, (3.7) and (3.23) yield f(x_k) ≥ f(x*) + ... Combining this with (3.33), the proof is complete.

3.3 Active Set Algorithm (ASA)

Although the gradient projection scheme NGPA has an attractive global convergence theory, the convergence rate can be slow in a neighborhood of a local minimizer. In contrast, for unconstrained optimization, the conjugate gradient algorithm often exhibits superlinear convergence in a neighborhood of a local minimizer. We develop an active set algorithm which uses NGPA to identify active constraints, and which uses an unconstrained optimization algorithm, such as the CG_DESCENT scheme in [73, 74, 72, 75], to optimize f over a face identified by NGPA.

We begin with some notation. For any x ∈ Ω, let A(x) and I(x) denote the active and inactive indices respectively:

    A(x) = {i ∈ [1, n] : x_i = 0},    I(x) = {i ∈ [1, n] : x_i > 0}.
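
A minimal sketch of these index sets, assuming the lower-bound-zero form used in this section (x ≥ 0). A small tolerance is included because in floating point "x_i = 0" is rarely exact; the exact definition corresponds to tol = 0.

    import numpy as np

    def active_set(x, tol=0.0):
        return np.flatnonzero(x <= tol)      # A(x): components at the bound

    def inactive_set(x, tol=0.0):
        return np.flatnonzero(x > tol)       # I(x): strictly feasible components

    # Example: x = [0, 0.3, 0, 1.2] gives A(x) = [0, 2] and I(x) = [1, 3].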

The indices in U correspond to components of x for which the associated gradient component g_i(x) is relatively large, while x_i is not close to 0 (in the sense that x_i ≥ ‖d^1(x)‖). When the set U of uncertain indices is empty, we feel that the indices with large associated gradient components are almost identified. In this case we prefer the unconstrained optimization algorithm.

Although our numerical experiments are based on the conjugate gradient code CG_DESCENT, a broad class of unconstrained optimization algorithms (UA) can be applied. The following requirements for the unconstrained algorithm are sufficient for establishing the convergence results that follow. Conditions U1-U3 are sufficient for global convergence, while U1-U4 are sufficient for the local convergence analysis. U4 could be replaced by another descent condition for the initial line search; however, our local convergence analysis has been carried out under U4.

U2. ...

U3. If A(x_{j+1}) = A(x_j) for j ≥ k, then lim inf_{j→∞} ‖g_I(x_j)‖ = 0.

U4. Whenever the unconstrained algorithm is started, x_{k+1} = P(x_k − α_k g_I(x_k)) where α_k is obtained from a Wolfe line search. That is, α_k is chosen to satisfy

    φ(α_k) ≤ φ(0) + δ α_k φ'(0)  and  φ'(α_k) ≥ σ φ'(0),    (3.35)

where

    φ(α) = f(P(x_k − α g_I(x_k))),  0 < δ < σ < 1.    (3.36)

Condition U1 implies that the UA is a monotone algorithm, so that the cost function can only decrease in each iteration. Condition U2 concerns how the algorithm behaves when an infeasible iterate is generated. Condition U3 describes the global convergence of the UA when the active set does not change. In U4, φ'(α) is the derivative from the right side of α; α_k exists since φ is piecewise smooth with a finite number of discontinuities in its derivative, and φ'(α) is continuous at α = 0.

We now present our Active Set Algorithm (ASA). In the first step of the algorithm, we execute NGPA until we feel that the active constraints satisfying strict complementarity have been identified. In Step 2, we execute the UA until a subproblem has been solved (Step 2a). When new constraints become active in Step 2b, we may decide to restart either NGPA or the UA. By restarting NGPA, we mean that x_0 in NGPA is identified with the current iterate x_k. By restarting the UA, we mean that iterates are generated by the UA using the current iterate as the starting point.

Parameters: μ ∈ (0, 1), ...

ASA (Active Set Algorithm):

Step 1. While ‖d^1(x_k)‖ > ε, execute NGPA and check the following:
    a. If U(x_k) = ∅, then
         If ‖g_I(x_k)‖ < μ‖d^1(x_k)‖, then ...
         ...
Step 2. Execute the UA and check the following:
    a. If ‖g_I(x_k)‖ ≤ ..., then ...
    b. If |A(x_k)| > |A(x_{k−1})| + n_2, restart the UA at x_k.
       Else restart NGPA.
    End
End
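
The following is a rough structural sketch, not the authors' code, of how ASA alternates between the gradient projection phase and the unconstrained phase. The names ngpa_step, ua_solve_face, and undecided_set are placeholders, and the threshold mu mimics the (partially lost) rule of Step 1a: hand the current face to the unconstrained algorithm once the undecided set is empty and the free-variable gradient is comparable to d^1(x).

    import numpy as np

    def asa(x, grad, ngpa_step, ua_solve_face, undecided_set,
            lower, upper, mu=0.1, eps=1e-6, max_iter=100000):
        for _ in range(max_iter):
            g = grad(x)
            d1 = np.clip(x - g, lower, upper) - x                # d^1(x)
            if np.linalg.norm(d1, np.inf) <= eps:
                return x                                         # approximate stationary point
            free = np.flatnonzero((x > lower) & (x < upper))     # inactive indices I(x)
            if (undecided_set(x, g).size == 0 and
                    np.linalg.norm(g[free]) >= mu * np.linalg.norm(d1)):
                x = ua_solve_face(x, free)     # Step 2: unconstrained phase on the current face
            else:
                x = ngpa_step(x)               # Step 1: nonmonotone gradient projection step
        return x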

3.3.1 Global Convergence

By U1,

    f(x_{k_{i+1}}) ≤ f(x_{k_i + l_i}).    (3.39)

Combining (3.38) and (3.39), we have f(x_{k_{i+1}}) ≤ f(x_{k_i}) − ..., which contradicts the assumption that f is bounded from below.

3.3.2 Local Convergence

Nondegenerate problems. In this case, it is relatively easy to show that ASA eventually performs only the UA without restarts. The analogous result for degenerate problems is established in the next section.

Let k+ be chosen large enough that x_k ∈ B(x*, ρ) for all k ≥ k+. If k ≥ k+ and x_{ki} = 0, then d_{ki} = 0 in Step 1 of NGPA. Hence, x_{k+1,i} = 0 if x_{k+1} is generated by NGPA. By U2, the UA cannot free a bound constraint. It follows that if k ≥ k+ and x_{ki} = 0, then x_{ji} = 0 for all j ≥ k. Consequently, there exists an index K ≥ k+ with the property that A(x_k) = A(x_j) for all j ≥ k ≥ K.

For any index i, |d^1_i(x)| ≤ |g_i(x)|. Suppose x ∈ B(x*, ρ); by (3.40), d^1_i(x) = 0 if x_i = 0. Hence,

    ‖d^1(x)‖ ≤ ‖g_I(x)‖    (3.41)

for all x ∈ B(x*, ρ). If k > K + n_1, then in Step 1b of ASA, it follows from (3.41) and the assumption μ ∈ (0, 1] that NGPA will branch to Step 2 (UA). In Step 2, the condition "‖g_I(x_k)‖ ≥ μ‖d^1(x_k)‖" is satisfied for k > K. Hence, the iterates never branch from the UA to NGPA and the UA is never restarted.

Degenerate problems. We now focus on degenerate problems, and show that a result analogous to Theorem 14 holds under the strong second-order sufficient optimality condition. We begin with a series of preliminary results.

Now, let us focus on the nontrivial indices in A+(x*). That is, suppose that there exists l ∈ A+(x*) and x_{kl} > 0 for all k ≥ k+. By the analysis given in the previous paragraph, when k+ is sufficiently large, either

    x_{ki} > 0  or  x_{ki} = 0    (3.43)

for all k ≥ k+ and i ∈ A+(x*) (since an index i ∈ A+(x*) which becomes active at iterate x_k remains active for all the subsequent iterations). We consider the following possible cases:

It follows from the definition (3.36) of φ(·) that

    φ'(0) = −Σ_{i ∈ I(x_k)} g_{ki}² = −‖g_I(x_k)‖²  and ...    (3.45)

By the Lipschitz continuity of ∇f and P3, we have

    ‖g(x_k) − g(x_{k+1})‖ = ‖g(P(x_k)) − g(P(x_k − α_k g_I(x_k)))‖ ≤ λ α_k ‖g_I(x_k)‖.

Hence, by the Schwarz inequality,

    Σ_{i ∈ I(x_k)} g_{ki}(g_{ki} − g_{k+1,i}) ≤ λ α_k ‖g_I(x_k)‖².    (3.47)

Since A(x_{k+1}) \ A(x_k) ⊂ I(x_k), the Schwarz inequality also gives

    Σ_{i ∈ A(x_{k+1})\A(x_k)} g_{ki} g_{k+1,i} ≤ ‖g_I(x_k)‖ ‖g_{k+1}‖_N,    (3.48)

where

    ‖g_{k+1}‖²_N = Σ_{i ∈ A(x_{k+1})\A(x_k)} g²_{k+1,i}.

    ... ‖g_{k+1}‖_N ...    (3.49)

For k sufficiently large, (3.43) implies that the newly activated constraints A(x_{k+1}) \ A(x_k) exclude all members of A+(x*). Since the x_k converge to x*, ‖g_{k+1}‖_N tends to zero. On the other hand, ‖g_I(x_k)‖ is bounded away from zero since the index l is contained in I(x_k). Hence, the last term in (3.49) tends to 0 as k increases, and the lower bound for α_k approaches (1 − σ)/λ. Since x*_l = 0, it follows that x_{kl} approaches 0. Since the lower bound for α_k approaches (1 − σ)/λ, g_l(x*) > 0, and x_k converges to x*, we conclude that

    x_{k+1,l} = x_{kl} − α_k g_{kl} < 0

for k sufficiently large. This contradicts the initial assumption that constraint l is inactive for k sufficiently large. Hence, in a finite number of iterations, x_{ki} = 0 for all i ∈ A+(x*).

violated for some l ∈ A+(x*). We show that this leads to a contradiction. By (3.43), x_{kl} > 0 for all k ≥ k+. Since x_k converges to x*, x*_l = 0, and g_l(x*) > 0, it is possible to choose k larger, if necessary, so that

    x_{kl} − ᾱ_min g_{kl} < 0.    (3.50)

Since (3.42) is violated and x_k converges to x*, we can choose k larger, if necessary, so that

    x_{kl} ≤ ...    (3.51)

where 0 < δ < 1 is the parameter appearing in Step 3 of NGPA, and λ is the Lipschitz constant for ∇f. We will show that for this k, we have

    f(x_k + d_k) ≤ f_R + δ g_k^T d_k,    (3.52)

where f_R is specified in Step 3 of NGPA. According to Step 3 of NGPA, when (3.52) holds, α_k = 1, which implies that

    x_{k+1,l} = x_{kl} + d_{kl}.    (3.53)

Since (3.50) holds and ᾱ_k ≥ ᾱ_min, we have

    d_{kl} = max{x_{kl} − ᾱ_k g_{kl}, 0} − x_{kl} = −x_{kl}.    (3.54)

This substitution in (3.53) gives x_{k+1,l} = 0, which contradicts the fact that x_{kl} > 0 for all k ≥ k+. To complete the proof, we need to show that when (3.51) holds,

(3.52) is satisfied. Expanding in a Taylor series around x_k and utilizing (3.54) gives

    f(x_k + d_k) = f(x_k) + ∫_0^1 ∇f(x_k + t d_k) d_k dt
                 = f(x_k) + g_k^T d_k + ∫_0^1 (∇f(x_k + t d_k) − g_k) d_k dt
                 ≤ f(x_k) + g_k^T d_k + (λ/2)‖d_k‖²
                 ≤ f(x_k) + δ g_k^T d_k − (1 − δ) g_{kl} x_{kl} + (λ/2)‖d_k‖².    (3.55)

The inequality (3.55) is due to the fact that g_{ki} d_{ki} ≤ 0 for each i. By P3, P4, P5, and P7, and by the Lipschitz continuity of ∇f, we have

    ‖d_k‖ ≤ max{1, ᾱ_max} ‖d^1(x_k)‖
          = max{1, ᾱ_max} ‖d^1(x_k) − d^1(x*)‖
          = max{1, ᾱ_max} ‖P(x_k − g_k) − x_k − P(x* − g(x*)) + x*‖
          ≤ max{1, ᾱ_max} (‖x_k − x*‖ + ‖P(x_k − g_k) − P(x* − g(x*))‖)
          ≤ max{1, ᾱ_max} (‖x_k − x*‖ + ‖x_k − g_k − (x* − g(x*))‖)
          ≤ max{1, ᾱ_max} (2‖x_k − x*‖ + ‖g_k − g(x*)‖)
          ≤ max{1, ᾱ_max} (2 + λ)‖x_k − x*‖.

    ... (λ/2)‖d_k‖² ≤ (1 − δ) x_{kl} g_{kl} ...

Hence, by (3.56) and by the choice for f_R specified in Step 3 of NGPA, we have

    f(x_k + d_k) ≤ f(x_k) + δ g_k^T d_k ≤ f_R + δ g_k^T d_k.    (3.57)

This completes the proof of (3.52).

If all the constraints are active at a stationary point x* and strict complementarity holds, then convergence is achieved in a finite number of iterations:

Since all the constraints are active at x*, x_{k,max} tends to zero. By (3.58) the conclusion (3.42) of Lemma 6 does not hold. Hence, after a finite number of iterations, x_k = x*.

Recall [110] that for any stationary point x* of (3.1), the strong second-order sufficient optimality condition holds if there exists γ > 0 such that

    d^T ∇²f(x*) d ≥ γ‖d‖²,  whenever d_i = 0 for all i ∈ A+(x*).    (3.59)

Using P8, we establish the following:

... for all x ∈ B(x*, ρ) with x_i = 0 for all i ∈ A+(x*). Choose ρ smaller if necessary so that

    x_i − g_i(x) ≤ 0 for all i ∈ A+(x*) and x ∈ B(x*, ρ).    (3.62)

Let x̄ be defined as follows:

    x̄_i = 0 if i ∈ A+(x*);  x̄_i = x_i otherwise.    (3.63)

Since (3.62) holds, it follows that

    ‖x − x̄‖ ≤ ‖d^1(x)‖    (3.64)

for all x ∈ B(x*, ρ). Also, by (3.62), we have

    [P(x − g(x)) − x̄]_i = 0  and  d^1(x)_i = [P(x − g(x)) − x]_i = −x_i

for all i ∈ A+(x*), while

    [P(x − g(x)) − x̄]_i = d^1(x)_i = [P(x̄ − g(x̄)) − x̄]_i ...

for all x ∈ B(x*, ρ). By the Lipschitz continuity of g, (3.64), (3.65), and P3, it follows that

    ‖d^1(x̄)‖ = ‖P(x̄ − g(x̄)) − P(x − g(x)) + P(x − g(x)) − x̄‖
             ≤ λ‖x − x̄‖ + ‖d^1(x)‖ ≤ (1 + λ)‖d^1(x)‖

for all x ∈ B(x*, ρ). By P8, (3.61), and (3.66), we have

    ‖x̄ − x*‖ ≤ (1 + λ) ... ‖d^1(x)‖ ...    (3.67)

Since ‖x̄ − x*‖² + ‖x − x̄‖² = ‖x − x*‖², the proof is completed by squaring and adding (3.67) and (3.64).

    ...    (3.69)

By Lemma 7, there exists a constant c such that ‖x − x*‖ ≤ c‖d^1(x)‖ for all x near x*. If i ∈ A+(x*), then by (3.68), we have lim sup_{k→∞} x_{ki}/... since the exponent lies in (1, 2). Hence, for each i ∈ A+(x*), (3.70) is violated for k sufficiently large.

If i ∉ A+(x*), then g_i(x*) = 0. By Lemma 7, we have

    lim sup_{k→∞} |g_i(x_k)| / ‖d^1(x_k)‖^... = lim sup_{k→∞} |g_i(x_k) − g_i(x*)| / ‖d^1(x_k)‖^...
        ≤ λ lim sup_{k→∞} ‖x_k − x*‖ / ‖d^1(x_k)‖^...
        ≤ λ lim sup_{k→∞} c‖d^1(x_k)‖^{1−...} = 0,

    ...    (3.73)

Utilizing (3.71) gives

    ‖x_k − x̄_k‖ ≤ Σ_{i=1}^n |x_{ki} − x̄_{ki}| = Σ_{i ∈ A+(x*)} x_{ki}
                ≤ n‖x_k − x*‖² ≤ n‖x_k − x*‖(‖x_k − x̄_k‖ + ‖x̄_k − x*‖)    (3.74)

when k is sufficiently large. Combining (3.73) and (3.74), there exists a constant c > 0 such that

    ‖d^1(x_k)‖ ≥ c‖x̄_k − x*‖    (3.75)

for k sufficiently large.

Let k be chosen large enough that ‖x_k − x*‖ < ... If x*_i > 0, then ‖x̄_k − x*‖ ≥ x*_i, which contradicts (3.76). Hence, x̄_{ki} = x*_i = 0. Moreover, if i ∈ A+(x*), then by the definition (3.63), x̄_{ki} = x*_i = 0. In summary,

    x̄_{ki} = x*_i = 0 for each i ∈ A(x̄_k) ∪ A+(x*);
    g_i(x*) = 0 for each i ∈ A+(x*)^c,    (3.77)

where A+(x*)^c is the complement of A+(x*). Define Z = A(x̄_k)^c ∩ A+(x*)^c.

By the strong second-order sufficient optimality condition and for x near x*, we have ... (3.78). We substitute x = x̄_k in (3.78) and utilize (3.77) to obtain

    (x̄_k − x*)^T (g(x̄_k) − g(x*)) = Σ_{i=1}^n (x̄_{ki} − x*_i)(g_i(x̄_k) − g_i(x*))
        = Σ_{i ∈ Z} (x̄_{ki} − x*_i) g_i(x̄_k)
        ≤ ‖x̄_k − x*‖ (Σ_{i ∈ I(x̄_k)} g_i(x̄_k)²)^{1/2},    (3.79)

since Z ⊂ A(x̄_k)^c = I(x̄_k). Exploiting the Lipschitz continuity of ∇f, (3.79) gives

    (x̄_k − x*)^T (g(x̄_k) − g(x*)) ≤ ‖x̄_k − x*‖ (‖g_I(x_k)‖ + λ‖x_k − x̄_k‖).    (3.80)

Combining (3.74), (3.78), and (3.80), we conclude that for k sufficiently large, ... (3.81). Combining (3.75) and (3.81), the proof is complete.

We now show that after a finite number of iterations, ASA will perform only the unconstrained algorithm (UA) with a fixed active constraint set.

We now give the proof of Corollary 3; that is, when f is strongly convex and twice continuously differentiable on B, and assumption A3 of Theorem 13 is satisfied, then the entire sequence of iterates generated by ASA converges to the global minimizer x*. Note that the assumptions of Corollary 3 are weaker than those of Corollary 2 (global convergence of NGPA) since Corollary 3 only requires that f^r_k ≤ f^max_k infinitely often in NGPA.

... holds for all k ≥ k_j (see (3.14) and (3.15)). Since x_{k_j} converges to x*, it follows that f(x_{k_j}) converges to f(x*), and hence, by (3.82) and (3.34), the entire sequence converges to x*.

or (3.83) is violated. By the analysis given in Case 3 of Lemma 6, when (3.83) is violated, (3.52) holds, from which it follows that for j sufficiently large,

    x_{k_j+1,i} = 0 for all i ∈ A+(x*).    (3.84)

Hence, either the sequence x_{k_j} satisfies (3.83) or the sequence x_{k_j+1} satisfies (3.84). In this latter case, it follows from (3.57) that f(x_{k_j+1}) ≤ f(x_{k_j}). In either case (3.83) or (3.84), there exists a sequence K_j (either K_j = k_j or K_j = k_j + 1) with the property that x_{K_j} converges to x* and

    lim sup_{j→∞} (x_{K_j})_i ...

3.3.3 Numerical Comparisons

In this section we compare the performance of our active set algorithm, implemented using CG_DESCENT for the unconstrained algorithm and the CBB method for the nonmonotone gradient projection algorithm, to the performance of the following codes:

A detailed description of our implementation of ASA can be found in [78]. L-BFGS-B was downloaded from Jorge Nocedal's web page, TRON was downloaded from Jorge More's web page, and SPG2 and GENCAN were downloaded on June 28, 2005, from the TANGO web page maintained by Ernesto Birgin. All codes are written in Fortran and compiled with f77 (default compiler settings) on a Sun workstation. The stopping condition was

    ‖P(x − g(x)) − x‖_∞ ≤ 10^{-6}.
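
A minimal sketch of this stopping test, assuming the feasible set is the box B = {x : l ≤ x ≤ u}: the run terminates once the projected gradient step moves x by at most 10^-6 in the sup-norm.

    import numpy as np

    def projected_gradient_error(x, g, l, u):
        return np.linalg.norm(np.clip(x - g, l, u) - x, np.inf)

    def converged(x, g, l, u, tol=1e-6):
        return projected_gradient_error(x, g, l, u) <= tol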

Here M is the memory used to evaluate f^max_k (see (3.5)). In ASA the parameter values were as follows: μ = .1, ρ = .5, n_1 = 2, n_2 = 1. The test set consisted of all 50 box constrained problems in the CUTEr library [12] with dimensions between 50 and 15,625, and all 23 box constrained problems in the MINPACK-2 library [2] with dimension 2500. TRON is somewhat different from the other codes since it employs Hessian information and an incomplete Cholesky preconditioner, while the codes ASA, L-BFGS-B, SPG2, and GENCAN only utilize gradient information. When we compare our code to TRON, we use the same Lin/More preconditioner [89] used by TRON for our unconstrained algorithm. The preconditioned ASA code is called P-ASA. Since TRON is targeted to large sparse problems, we compare with TRON using the 23 MINPACK-2 problems and the 42 sparsest CUTEr problems (the number of nonzeros in the Hessian was at most 1/5 the total number of entries). The codes L-BFGS-B, SPG2, and GENCAN were implemented for the CUTEr test problems, while ASA and TRON were implemented for both test sets, CUTEr and MINPACK-2.

The CPU time in seconds and the number of iterations, function evaluations, and gradient evaluations for each of the methods are posted at the author's website. In running the numerical experiments, we checked whether different codes converged to different local minimizers; when comparing the codes, we

restricted ourselves to test problems where all codes converged to the same local minimizer, and where the running time of the fastest code exceeded .01 seconds. The numerical results are now analyzed.

The performance of the algorithms, relative to CPU time, was evaluated using the performance profiles of Dolan and More [43]. That is, for each method, we plot the fraction P of problems for which the method is within a factor τ of the best time. In Figure 3-2, we compare the performance of the 4 codes ASA, L-BFGS-B, SPG2, and GENCAN using the 50 CUTEr test problems. The left side of the figure gives the percentage of the test problems for which a method is the fastest; the right side gives the percentage of the test problems that were successfully solved by each of the methods. The top curve is the method that solved the most problems in a time that was within a factor τ of the best time. Since the top curve in Figure 3-2 corresponds to ASA, this algorithm is clearly fastest for this set of 50 test problems with dimensions ranging from 50 to 15,625. The relative difference in performance between ASA and the competing methods seen in Figure 3-2 is greater than the relative difference in performance between the CG_DESCENT

Figure 3-2: Performance profiles, 50 CUTEr test problems
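
A minimal sketch of the Dolan-More performance profile used in Figures 3-2 through 3-6. Here times[s, p] is assumed to hold the CPU time of solver s on problem p (np.inf for a failure); each curve is the fraction of problems solved within a factor tau of the best time.

    import numpy as np
    import matplotlib.pyplot as plt

    def performance_profile(times, labels, tau_max=16.0):
        best = times.min(axis=0)                 # best time for each problem
        ratios = times / best                    # performance ratios
        taus = np.linspace(1.0, tau_max, 200)
        for s, label in enumerate(labels):
            frac = [(ratios[s] <= t).mean() for t in taus]
            plt.plot(taus, frac, label=label)
        plt.xlabel("tau"); plt.ylabel("P"); plt.legend()
        plt.show()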

Figure 3-3: Performance profiles, 42 sparsest CUTEr problems, 23 MINPACK-2 problems, ε = 10^{-6}

In Figure 3-3 we compare the performance of TRON with P-ASA and ASA for the 42 sparsest CUTEr test problems and the 23 MINPACK-2 problems. Observe that P-ASA has the top performance, and that ASA, which only utilizes the gradient, performs almost as well as the Hessian-based code TRON. The number of CG iterations performed by the P-ASA code is much less than the number of CG iterations performed by the ASA code. Finally, in Figure 3-4 we compare the performance of P-ASA with ASA for the relaxed convergence tolerance ε = 10^{-2}‖d^1(x_0)‖_∞. Based on Figures 3-3 and 3-4, the preconditioned ASA scheme is more efficient than unpreconditioned ASA for the more stringent stopping criterion, while the unpreconditioned and preconditioned schemes are equally effective for a more relaxed stopping criterion. Although the performance profile for ASA is beneath 1 in Figure 3-3, it reaches 1 as τ increases; there are some problems

Figure 3-4: Performance profiles, ε = 10^{-2}‖d^1(x_0)‖_∞

When we solve an optimization problem, the solution time consists of two parts:

T1. the time associated with the evaluation of the function or its gradient or its Hessian, and

T2. the remaining time, which is often dominated by the time used in the linear algebra.

The CPU time performance profile measures a mixture of T1 and T2 for a set of test problems. In some applications, T1 (the evaluation time) may dominate. In order to assess how the algorithms may perform in the limit, when T2 is negligible compared with T1, we could ignore T2 and compare the algorithms based on T1. In the next set of experiments, we explore how the algorithms perform in the limit, as T1 becomes infinitely large relative to T2.

Typically, the time to evaluate the gradient of a function is greater than the time to evaluate the function itself. And the time to evaluate the Hessian is greater

than the time to evaluate the gradient. If the time to evaluate the function is 1, then the average time to evaluate the gradient and Hessian for the CUTEr bound constrained test set is as follows: function = 1, gradient = 2.6, Hessian = 21.0. For each method and for each test problem, we compute an "evaluation time" where the time for a function evaluation is 1, the time for a gradient evaluation is either 2.6 (CUTEr) or 2.0 (MINPACK-2), and the time for a Hessian evaluation is either 21.0 (CUTEr) or 40.5 (MINPACK-2). In Figure 3-5 we compare the performance of gradient-based methods, and in Figure 3-6, we compare the performance of the gradient-based ASA and the methods which exploit the Hessian (P-ASA or TRON).

Figure 3-5: Performance profiles, evaluation metric, 50 CUTEr test problems, gradient-based methods
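
A minimal sketch of the "evaluation time" metric just described: the counts of function (NF), gradient (NG), and Hessian (NH) evaluations are weighted by the relative costs reported for each test library.

    EVAL_WEIGHTS = {
        "CUTEr":     {"function": 1.0, "gradient": 2.6, "hessian": 21.0},
        "MINPACK-2": {"function": 1.0, "gradient": 2.0, "hessian": 40.5},
    }

    def evaluation_time(nf, ng, nh, library="CUTEr"):
        w = EVAL_WEIGHTS[library]
        return nf * w["function"] + ng * w["gradient"] + nh * w["hessian"]

    # Example: a run with NF = 100, NG = 80, NH = 0 on CUTEr costs 100 + 208 = 308.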

Figure 3-6: Performance profiles, evaluation metric, 42 sparsest CUTEr problems, 23 MINPACK-2 problems

In Figure 3-5 we see that for the evaluation metric and τ near 1, L-BFGS-B performs better than ASA, but as τ increases, ASA dominates L-BFGS-B. In other words, in the evaluation metric, there are more problems where L-BFGS-B is faster than the other methods; however, ASA is not much slower than L-BFGS-B. When τ reaches 1.5, ASA starts to dominate L-BFGS-B.

In Figure 3-6 we see that P-ASA dominates TRON in the evaluation metric. Hence, even though TRON uses far fewer function evaluations, it uses many more Hessian evaluations. Since the time to evaluate the Hessian is much greater than the time to evaluate the function, P-ASA has better performance. In summary, by neglecting the time associated with the linear algebra, the relative gap between P-ASA and TRON decreases while the relative gap between TRON and ASA increases, as seen in Figure 3-6. Nonetheless, in the evaluation metric, the performance profile for P-ASA is still above the profile for TRON.

CHAPTER 4
CONCLUSIONS AND FUTURE RESEARCH

We have presented some novel approaches that use gradient methods for solving large-scale general unconstrained and box constrained optimization. In the first part of this dissertation, we focus on unconstrained optimization. We first introduced a nonlinear conjugate gradient method, CG_DESCENT, which is guaranteed to generate sufficient descent search directions independent of the line search. We proved global convergence of this method for strongly convex and general objective functions, respectively. Extensive numerical results indicate that CG_DESCENT is a very efficient and robust method which outperforms much state-of-the-art software for solving unconstrained optimization. We then introduced a recently developed cyclic Barzilai-Borwein (CBB) method. We proved that for general nonlinear functions, the CBB method is at least linearly convergent at a stationary point with positive definite Hessian. In addition, our numerical experiments indicated that if the cycle length is large, the convergence rate could be superlinear. By combining the CBB method with an adaptive nonmonotone line search, we globalized the method to solve general unconstrained optimization. For degenerate optimization problems, where multiple minimizers may exist, or where the Hessian may be singular at a local minimizer, we proposed a class of self-adaptive proximal point methods. By a particular choice of the regularization parameter, we showed that the distance between the iterates and the solution set is at least superlinearly convergent. In the second part of this dissertation, we proposed a completely new active set method for box constrained optimization. Our active set algorithm consists of a nonmonotone gradient projection step, an unconstrained optimization step, and a set of rules for branching between the two steps. Global convergence to a stationary point

is established. This algorithm eventually reduces to unconstrained optimization even without assuming the strict complementarity condition. A specific implementation of ASA is given which exploits the cyclic Barzilai-Borwein algorithm for the gradient projection step and CG_DESCENT for unconstrained optimization. Extensive numerical results are also provided.

This project is far from finished. I am continuing the line of research outlined above. First, I am working on extending ASA [78] to solve nonlinear optimization problems with box and linear constraints. Second, an efficient preconditioner which takes advantage of the structure of the problem needs to be developed. Both of the above projects are crucial to the development of highly efficient software for general nonlinear optimization. This work also exploits many techniques in numerical linear algebra, sparse matrix computations, and parallel computing. At the same time that I work on the development of general, efficient optimization techniques, I would like to apply these techniques to signal processing, medical image processing (both MRI and PET), molecular modeling, and related fields.


[1] M.Al-BaaliandR.Fletcher.Anecientlinesearchfornonlinearleastsquares.J.Optim.TheoryAppl.,48:359{377,1984. [2] B.M.Averick,R.G.Carter,J.J.More,andG.L.Xue.TheMINPACK-2testproblemcollection.Technicalreport,MathematicsandComputerScienceDivision,ArgonneNationalLaboratory,Argonne,IL,1992. [3] J.BarzilaiandJ.M.Borwein.Twopointstepsizegradientmethods.IMAJ.Numer.Anal.,8:141{148,1988. [4] D.P.Bertsekas.OntheGoldstein-Levitin-Polyakgradientprojectionmethod.IEEETrans.AutomaticControl,21:174{184,1976. [5] D.P.Bertsekas.ProjectedNewtonmethodsforoptimizationproblemswithsimpleconstraints.SIAMJ.ControlOptim.,20:221{246,1982. [6] D.P.Bertsekas.NonlinearProgramming.AthenaScientic,Belmont,MA,1999. [7] E.G.Birgin,R.Biloti,M.Tygel,andL.T.Santos.Restrictedoptimization:acluetoafastandaccurateimplementationofthecommonreectionsurfacestackmethod.J.Appl.Geophysics,42:143{155,1999. [8] E.G.Birgin,I.Chambouleyron,andJ.M.Martnez.Estimationoftheopticalconstantsandthethicknessofthinlmsusingunconstrainedopti-mization.J.Comput.Phys.,151:862{880,1999. [9] E.G.BirginandJ.M.Martnez.Large-scaleactive-setbox-constrainedoptimizationmethodwithspectralprojectedgradients.Comput.Optim.Appl.,23:101{125,2002. [10] E.G.Birgin,J.M.Martnez,andM.Raydan.Nonmonotonespectralprojectedgradientmethodsforconvexsets.SIAMJ.Optim.,10:1196{1211,2000. [11] E.G.Birgin,J.M.Martnez,andM.Raydan.Algorithm813:SPG-softwareforconvex-constrainedoptimization.ACMTrans.Math.Software,27:340{349,2001. [12] I.Bongartz,A.R.Conn,N.I.M.Gould,andP.L.Toint.CUTE:constrainedandunconstrainedtestingenvironments.ACMTrans.Math.Software,21:123{160,1995. 114


[13] I.Bongartz,A.R.Conn,N.I.M.Gould,andP.L.Toint.CUTE:constrainedandunconstrainedtestingenvironments.ACMTrans.Math.Software,21:123{160,1995. [14] M.A.Branch,T.F.Coleman,andY.Li.Asubspace,interior,andconjugategradientmethodforlarge-scalebound-constrainedminimizationproblems.SIAMJ.Sci.Comput.,21:1{23,1999. [15] J.V.BurkeandJ.J.More.Ontheidenticationofactiveconstraints.SIAMJ.Numer.Anal.,25:1197{1211,1988. [16] J.V.BurkeandJ.J.More.Exposingconstraints.SIAMJ.Optim.,25:573{595,1994. [17] J.V.Burke,J.J.More,andG.Toraldo.Convergencepropertiesoftrustregionmethodsforlinearandconvexconstraints.Math.Prog.,47:305{336,1990. [18] R.H.Byrd,P.Lu,andJ.Nocedal.Alimitedmemoryalgorithmforboundconstrainedoptimization.SIAMJ.Sci.Statist.Comput.,16:1190{1208,1995. [19] P.CalamaiandJ.More.Projectedgradientforlinearlyconstrainedproblems.Math.Prog.,39:93{116,1987. [20] P.G.Ciarlet.TheFiniteElementMethodforEllipticProblems.Amsterdam,North-Holland,1978. [21] T.F.ColemanandY.Li.Ontheconvergenceofinterior-reectiveNewtonmethodsfornonlinearminimizationsubjecttobounds.Math.Prog.,67:189{224,1994. [22] T.F.ColemanandY.Li.Aninteriortrustregionapproachfornonlinearminimizationsubjecttobounds.SIAMJ.Optim.,6:418{445,1996. [23] T.F.ColemanandY.Li.Atrustregionandanescalinginteriorpointmethodfornonconvexminimizationwithlinearinequalityconstraints.Technicalreport,CornellUniversity,Ithaca,NY,1997. [24] A.R.Conn,N.I.M.Gould,andPh.L.Toint.Globalconvergenceofaclassoftrustregionalgorithmsforoptimizationwithsimplebounds.SIAMJ.Numer.Anal.,25:433{460,1988. [25] A.R.Conn,N.I.M.Gould,andPh.L.Toint.AgloballyconvergentaugmentedLagrangianalgorithmforoptimizationwithgeneralconstraintsandsimplebounds.SIAMJ.Numer.Anal.,28:545{572,1991. [26] A.R.Conn,N.I.M.Gould,andPh.L.Toint.Trust-RegionMethods.SIAM,Philadelphia,2000.


[27] Y.H.Dai.Anonmonotoneconjugategradientalgorithmforunconstrainedoptimization.J.Syst.Sci.Complex.,15:139{145,2002. [28] Y.H.Dai.Onthenonmonotonelinesearch.J.Optim.TheoryAppl.,112:315{330,2002. [29] Y.H.Dai.Alternatestepsizegradientmethod.Optimization,52:395{415,2003. [30] Y.H.DaiandR.Fletcher.ProjectedBarzilai-Borweinmethodsforlarge-scalebox-constrainedquadraticprogramming.Numer.Math.,100:21{47,2005. [31] Y.H.DaiandR.Fletcher.Newalgorithmsforsinglylinearlyconstrainedquadraticprogramssubjecttolowerandupperbounds.Math.Prog.,toappear2006. [32] Y.H.Dai,W.W.Hager,K.Schittkowski,andH.Zhang.ThecyclicBarzilai-Borweinmethodforunconstrainedoptimization.IMAJ.Numer.Anal.,toappear. [33] Y.H.DaiandK.Schittkowski.Asequentialquadraticprogrammingalgorithmwithnon-monotonelinesearch.Technicalreport,Dept.Math.,Univ.Bayreuth,submitted,2005. [34] Y.H.DaiandX.Q.Yang.Anewgradientmethodwithanoptimalstepsizeproperty.Technicalreport,InstituteofComputationalMathematicsandScientic/EngineeringComputing,ChineseAcademyofSciences,2001. [35] Y.H.DaiandY.Yuan.Anonlinearconjugategradientmethodwithastrongglobalconvergenceproperty.SIAMJ.Optim.,10:177{182,1999. [36] Y.H.DaiandY.Yuan.NonlinearConjugateGradientMethods.ShangHaiScienceandTechnologyPublisher,Beijing,2000. [37] Y.H.DaiandY.Yuan.Anecienthybridconjugategradientmethodforunconstrainedoptimization.Ann.Oper.Res.,103:33{47,2001. [38] Y.H.DaiandY.Yuan.Alternateminimizationgradientmethod.IMAJ.Numer.Anal.,23:377{393,2003. [39] Y.H.DaiandH.Zhang.Anadaptivetwo-pointstepsizegradientalgorithm.Numer.Algorithms,27:377{385,2001. [40] J.W.Daniel.Theconjugategradientmethodforlinearandnonlinearoperatorequations.SIAMJ.Numer.Anal.,4:10{26,1967.


[41] R.S.DemboandU.Tulowitzki.Ontheminimizationofquadraticfunctionssubjecttoboxconstraints.Technicalreport,SchoolofOrganizationandManagement,YaleUniversity,NewHaven,CT,1983. [42] J.E.Dennis,M.Heinkenschloss,andL.N.Vicente.Trust-regioninterior-pointalgorithmsforaclassofnonlinearprogrammingproblems.SIAMJ.ControlOptim.,36:1750{1794,1998. [43] E.D.DolanandJ.J.More.Benchmarkingoptimizationsoftwarewithperformanceproles.Math.Program.,91:201{213,2002. [44] Z.Dostal.Boxconstrainedquadraticprogrammingwithproportioningandprojections.SIAMJ.Optim.,7:871{887,1997. [45] Z.Dostal.Aproportioningbasedalgorithmforboundconstrainedquadraticprogrammingwiththerateofconvergence.Numer.Algorithms,34:293{302,2003. [46] Z.Dostal,A.Friedlander,andS.A.Santos.SolutionofcoerciveandsemicoercivecontactproblemsbyFETIdomaindecomposition.Contemp.Math.,218:82{93,1998. [47] Z.Dostal,A.Friedlander,andS.A.Santos.AugmentedLagrangianswithadaptiveprecisioncontrolforquadraticprogrammingwithsimpleboundsandequalityconstraints.SIAMJ.Optim.,13:1120{1140,2003. [48] A.S.El-Bakry,R.A.Tapia,T.Tsuchiya,andY.Zhang.Ontheformulationandtheoryoftheprimal-dualNewtoninterior-pointmethodfornonlinearprogramming.J.Optim.TheoryAppl.,89:507{541,1996. [49] A.S.El-Bakry,R.A.Tapia,andY.Zhang.OntheconvergencerateofNewtoninterior-pointmethodsintheabsenceofstrictcomplementarity.Comp.Optim.Appl.,6:157{167,1996. [50] F.Facchinei,A.Fischer,andC.Kanzow.Ontheaccurateidenticationofactiveconstraints.SIAMJ.Optim.,9:14{32,1998. [51] F.Facchinei,J.Judice,andJ.Soares.AnactivesetNewton'salgorithmforlarge-scalenonlinearprogramswithboxconstraints.SIAMJ.Optim.,8:158{186,1998. [52] F.FacchineiandS.Lucidi.Aclassofpenaltyfunctionsforoptimizationproblemswithboundconstraints.Optimization,26:239{259,1992. [53] F.Facchinei,S.Lucidi,andL.Palagi.AtruncatedNewtonalgorithmforlarge-scaleboxconstrainedoptimization.SIAMJ.Optim.,4:1100{1125,2002. [54] J.FanandY.Yuan.OntheconvergenceoftheLevenberg-Marquardtmethodwithoutnonsingularityassumption.Computing,74:23{39,2005.


[55] R.Fletcher.PracticalMethodsofOptimizationVol.1:UnconstrainedOptimization.JohnWiley&Sons,NewYork,1987. [56] R.Fletcher.PracticalMethodsofOptimizationVol.2:ConstrainedOptimiza-tion.JohnWiley&Sons,NewYork,1987. [57] R.FletcherandC.Reeves.Functionminimizationbyconjugategradients.Comput.J.,7:149{154,1964. [58] A.Friedlander,J.M.Martnez,B.Molina,andM.Raydan.Gradientmethodwithretardsandgeneralizations.SIAMJ.Numer.Anal.,36:275{289,1999. [59] A.Friedlander,J.M.Martnez,andS.A.Santos.Anewtrustregionalgorithmforboundconstrainedminimization.Appl.Math.Optim.,30:235{266,1994. [60] J.C.GilbertandJ.Nocedal.Globalconvergencepropertiesofconjugategradientmethodsforoptimization.SIAMJ.Optim.,2:21{42,1992. [61] R.Glowinski.NumericalMethodsforNonlinearVariationalProblems.Springer-Verlag,Berlin,NewYork,1984. [62] W.Glunt,T.L.Hayden,andM.Raydan.Molecularconformationsfromdistancematrices.J.Comput.Chem.,14:114{120,1993. [63] A.A.Goldstein.ConvexprogramminginHilbertspace.Bull.Amer.Math.Soc.,70:709{710,1964. [64] A.A.Goldstein.Onsteepestdescent.SIAMJ.Control,3:147{151,1965. [65] G.H.GolubandD.P.O'leary.SomehistoryoftheconjugategradientandLanczosalgorithms:1948{1976.SIAMRev.,31:50{100,1989. [66] L.Grippo,F.Lampariello,andS.Lucidi.AnonmonotonelinesearchtechniqueforNewton'smethod.SIAMJ.Numer.Anal.,23:707{716,1986. [67] L.Grippo,F.Lampariello,andS.Lucidi.AtruncatedNewtonmethodwithnonmonotonelinesearchforunconstrainedoptimization.J.Optim.TheoryAppl.,60:401{419,1989. [68] L.GrippoandM.Sciandrone.NonmonotoneglobalizationtechniquesfortheBarzilai-Borweingradientmethod.Comput.Optim.Appl.,23:143{169,2002. [69] C.D.Ha.Ageneralizationoftheproximalpointalgorithm.SIAMJ.Control,28:503{512,1990. [70] W.W.Hager.Dualtechniquesforconstrainedoptimization.J.Optim.TheoryAppl.,55:37{71,1987.


[71] W.W.Hager.Analysisandimplementationofadualalgorithmforcon-strainedoptimization.J.Optim.TheoryAppl.,79:427{462,1993. [72] W.W.HagerandH.Zhang.CG DESCENTuser'sguide.Technicalreport,Dept.Math.,Univ.Florida,2004. [73] W.W.HagerandH.Zhang.Anewconjugategradientmethodwithguaranteeddescentandanecientlinesearch.SIAMJ.Optim.,16:170{192,2005. [74] W.W.HagerandH.Zhang.Algorithm851:CG DESCENT,aconjugategradientmethodwithguaranteeddescent.ACMTrans.Math.Software,32,2006. [75] W.W.HagerandH.Zhang.Asurveyofnonlinearconjugategradientmethods.PacicJ.Optim.,2:35{58,2006. [76] W.W.HagerandH.Zhang.Recentadvancesinboundconstrainedopti-mization.InF.Ceragioli,A.Dontchev,H.Furuta,K.Marti,andL.Pandol,editors,SystemModelingandOptimization,Proceedingsofthe22ndIFIPTC7ConferenceheldinJuly18{22,2005,Turin,Italy.Springer,2006,toappear. [77] W.W.HagerandH.Zhang.Self-adaptiveinexactproximalpointmethods.Comput.Optim.Appl.,submitted,2005. [78] W.W.HagerandH.Zhang.Anewactivesetalgorithmforboxconstrainedoptimization.SIAMJ.Optim.,toappear. [79] J.Y.Han,G.H.Liu,andH.X.Yin.ConvergenceofPerryandShanno'smemorylessquasi-Newtonmethodfornonconvexoptimizationproblems.ORTrans.,1:22{28,1997. [80] M.Heinkenschloss,M.Ulbrich,andS.Ulbrich.Superlinearandquadraticconvergenceofane-scalinginterior-pointNewtonmethodsforproblemswithsimpleboundswithoutstrictcomplementarityassumption.Math.Prog.,86:615{635,1999. [81] M.R.HestenesandE.L.Stiefel.Methodsofconjugategradientsforsolvinglinearsystems.J.ResearchNat.Bur.Standards,49:409{436,1952. [82] C.HumesandP.Silva.Inexactproximalpointalgorithmsanddescentmethodsinoptimization.Optim.Eng.,6:257{271,2005. [83] C.KanzowandA.Klug.Onane-scalinginterior-pointNewtonmethodsfornonlinearminimizationwithboundconstraints.Comput.Optim.Appl.,toappear2006.


[84] A.KaplanandR.Tichatschke.Proximalpointmethodsandnonconvexoptimization.J.GlobalOptim.,13:389{406,1998. [85] C.Lemarechal.Aviewofline-searches.InOptimizationandOptimalControl,volume30,pages59{79,Heidelberg,1981.Springer-Verlag. [86] M.Lescrenier.Convergenceoftrustregionalgorithmsforoptimizationwithboundswhenstrictcomplementaritydoesnothold.SIAMJ.Numer.Anal.,28:476{495,1991. [87] E.S.LevitinandB.T.Polyak.Constrainedminimizationproblems.USSRComput.Math.Math.Physics,6:1{50,1966. [88] D.Li,M.Fukushima,L.Qi,andN.Yamashita.RegularizedNewtonmethodsforconvexminimizationproblemswithsingularsolutions.Comput.Optim.Appl.,28:131{147,2004. [89] C.J.LinandJ.J.More.Incompletecholeskyfactorizationswithlimitedmemory.SIAMJ.Sci.Comput.,21:24{45,1999. [90] C.J.LinandJ.J.More.Newton'smethodforlargebound-constrainedoptimizationproblems.SIAMJ.Optim.,9:1100{1127,1999. [91] D.C.LiuandJ.Nocedal.OnthelimitedmemoryBFGSmethodforlargescaleoptimization.Math.Program.,45:503{528,1989. [92] W.B.LiuandY.H.Dai.Minimizationalgorithmsbasedonsupervisorandsearchercooperation.J.Optim.TheoryAppl.,111:359{379,2001. [93] Y.LiuandC.Storey.Ecientgeneralizedconjugategradientalgorithms,part1:Theory.J.Optim.TheoryAppl.,69:129{137,1991. [94] S.Lucidi,F.Rochetich,andM.Roma.CurvilinearstabilizationtechniquesfortruncatedNewtonmethodsinlarge-scaleunconstrainedoptimization.SIAMJ.Optim.,8:916{939,1998. [95] F.J.Luque.Asymptoticconvergenceanalysisoftheproximalpointalgo-rithm.SIAMJ.Control,22:277{293,1984. [96] B.Martinet.Regularisationd'inequationsvariationnellesparapproximationssuccessives.Rev.FrancaiseInform.Rech.Oper.Ser.R-3,4:154{158,1970. [97] B.Martinet.Determinationapproacheed'unpointxed'uneapplicationpseudo-contractante.ComptesRendusdesSeancesdel'AcademiedesSciences,274:163{165,1972. [98] J.M.Martnez.BOX-QUACANandtheimplementationofaugmentedLagrangianalgorithmsforminimizationwithinequalityconstraints.J.Comput.Appl.Math.,19:31{56,2000.


[99] G.P.McCormickandR.A.Tapia.Thegradientprojectionmethodundermilddierentiabilityconditions.SIAMJ.Control,10:93{98,1972. [100] J.J.MoreandD.C.Sorensen.Newton'smethod.InG.H.Golub,editor,StudiesinNumericalAnalysis,pages29{82,Washington,D.C.,1984.MathematicalAssociationofAmerica. [101] J.J.MoreandD.J.Thuente.Linesearchalgorithmswithguaranteedsucientdecrease.ACMTrans.Math.Software,20:286{307,1994. [102] J.J.MoreandG.Toraldo.Onthesolutionoflargequadraticprogrammingproblemswithboundconstraints.SIAMJ.Optim.,1:93{113,1991. [103] J.Nocedal.Updatingquasi-Newtonmatriceswithlimitedstorage.Math.Comp.,35:773{782,1980. [104] E.R.PanierandA.L.Tits.Avoidingthemaratoseectbymeansofanonmonotonelinesearch.SIAMJ.Numer.Anal.,28:1183{1195,1991. [105] J.M.Perry.Aclassofconjugategradientalgorithmswithatwostepvariablemetricmemory.TechnicalReport269,CenterforMathematicalStudiesinEconomicsandManagementScience,NorthwesternUniversity,1977. [106] E.PolakandG.Ribiere.Notesurlaconvergencedemethodesdedirectionsconjuguees.Rev.FrancaiseInformat.RechercheOperationnelle,3:35{43,1969. [107] B.T.Polyak.Theconjugategradientmethodinextremalproblems.USSRComp.Math.Math.Phys.,9:94{112,1969. [108] M.Raydan.TheBarzilaiandBorweingradientmethodforthelargescaleunconstrainedminimizationproblem.SIAMJ.Optim.,7:26{33,1997. [109] M.RaydanandB.F.Svaiter.RelaxedsteepestdescentandCauchy-Barzilai-Borweinmethod.Comput.Optim.Appl.,21:155{167,2002. [110] S.M.Robinson.Stronglyregulargeneralizedequations.Math.Oper.Res.,5:43{62,1980. [111] R.T.Rockafellar.AugmentedLagrangiansandapplicationsoftheproximalpointalgorithminconvexprogramming.Math.Oper.Res.,2:97{116,1976. [112] R.T.Rockafellar.Monotoneoperatorsandtheproximalpointalgorithm.SIAMJ.Control,14:877{898,1976. [113] A.SchwartzandE.Polak.Familyofprojecteddescentmethodsforopti-mizationproblemswithsimplebounds.J.Optim.TheoryAppl.,92:1{31,1997.


[114] T.Serani,G.Zanghirati,andL.Zanni.Gradientprojectionmethodsforquadraticprogramsandapplicationsintrainingsupportvectormachines.Optim.MethodsSoftw.,20:353{378,2005. [115] D.F.Shanno.Ontheconvergenceofanewconjugategradientalgorithm.SIAMJ.Numer.Anal.,15:1247{1257,1978. [116] P.L.Toint.GlobalconvergenceofaclassoftrustregionmethodsfornonconvexminimizationinHilbertspace.IMAJ.Numer.Anal.,8:231{252,1988. [117] P.L.Toint.Anassessmentofnon-monotonelinesearchtechniquesforunconstrainedoptimization.SIAMJ.Sci.Comput.,17:725{739,1996. [118] P.L.Toint.Anon-monotonetrustregionalgorithmfornonlinearoptimiza-tionsubjecttoconvexconstraints.Math.Prog.,77:69{94,1997. [119] P.Tseng.ErrorboundsandsuperlinearconvergenceanalysisofsomeNewton-typemethodsinoptimization.InG.DiPilloandF.Giannessi,editors,NonlinearOptimizationandRelatedTopics,pages445{462.Kluwer,2000. [120] M.Ulbrich,S.Ulbrich,andM.Heinkenschloss.Globalconvergenceofane-scalinginterior-pointNewtonmethodsforinnite-dimensionalnonlinearproblemswithpointwisebounds.SIAMJ.ControlOptim.,37:731{764,1999. [121] C.Wang,J.Han,andL.Wang.GlobalconvergenceofthePolak-RibiereandHestenes-Stiefelconjugategradientmethodsfortheunconstrainednonlinearoptimization.ORTransactions,4:1{7,2000. [122] P.Wolfe.Convergenceconditionsforascentmethods.SIAMRev.,11:226{235,1969. [123] P.Wolfe.ConvergenceconditionsforascentmethodsII:somecorrections.SIAMRev.,13:185{188,1971. [124] S.J.Wright.Implementingproximalpointmethodsforlinearprogramming.J.Optim.TheoryAppl.,65:531{554,1990. [125] H.YamashitaandH.Yabe.Superlinearandquadraticconvergenceofsomeprimal-dualinterior-pointmethodsforconstrainedoptimization.Math.Prog.,75:377{397,1996. [126] N.YamashitaandM.Fukushima.Theproximalpointalgorithmwithgenuinesuperlinearconvergenceforthemonotonecomplementarityproblem.SIAMJ.Optim.,11:364{379,2000.


[127] N.YamashitaandM.Fukushima.OntherateofconvergenceoftheLevenberg-Marquardtmethod.InTopicsinnumericalanalysis,volume15ofComput.Suppl.,pages239{249.Springer,2001. [128] E.K.YangandJ.W.Tolle.Aclassofmethodsforsolvinglargeconvexquadraticprogramssubjecttoboxconstraints.Math.Prog.,51:223{228,1991. [129] E.H.Zarantonello.ProjectionsonconvexsetsinHilbertspaceandspectraltheory.InE.H.Zarantonello,editor,ContributionstoNonlinearFunctionalAnalysis,pages237{424,NewYork,1971.AcademicPress. [130] H.Zhang.Anonmonotonetrustregionalgorithmfornonlinearoptimizationsubjecttogeneralconstrains.JornalofComputationalMathematics,2:237{276,2003. [131] H.ZhangandW.W.Hager.Anonmonotonelinesearchtechniqueanditsapplicationtounconstrainedoptimization.SIAMJ.Optim.,14:1043{1056,2004. [132] H.ZhangandW.W.Hager.PACBB:AprojectedadaptivecyclicBarzilai-Borweinmethodforboxconstrainedoptimization.InWilliamW.Hager,Shu-JenHuang,PanosM.Pardalos,andOlegA.Prokopyev,editors,Mul-tiscaleOptimizationMethodsandApplications,pages387{392,NewYork,2005.Springer. [133] Y.Zhang.Interior-pointgradientmethodswithdiagonal-scalingsforsimple-boundconstrainedoptimization.TechnicalReportTR04-06,DepartmentofComputationalandAppliedMathematics,RiceUniversity,Houston,Texas,2004. [134] J.L.ZhouandA.L.Tits.Nonmonotonelinesearchforminimaxproblem.J.Optim.TheoryAppl.,76:455{476,1993. [135] C.Zhu,R.H.Byrd,andJ.Nocedal.Algorithm778:L-BFGS-B,Fortransubroutinesforlarge-scalebound-constrainedoptimization.ACMTrans.Math.Software,23:550{560,1997.



William W. Hager, Chair
Professor of Mathematics

I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

Panos M. Pardalos
Professor of Industrial and Systems Engineering

I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

Shari Moskow
Associate Professor of Mathematics

I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

Sergei S. Pilyugin
Associate Professor of Mathematics


Jay Gopalakrishnan
Assistant Professor of Mathematics


Dean, College of Liberal Arts and Sciences


Hongchao Zhang
(352) 392-0281 ext. 329
Department of Mathematics
Chair: William W. Hager
Degree: Doctor of Philosophy
Graduation Date: August 2006

Optimization might be defined as the science of determining the "best strategies" by solving certain types of mathematical problems. The applicability of optimization methods now extends into almost every activity in which a numerical optimal solution needs to be found and in which a huge number of variables need to be determined. This dissertation considers, from a mathematical point of view, how to solve such large-scale problems efficiently. All of the methods discussed in this dissertation use very little memory and are shown by extensive numerical experiments to be very efficient for optimization problems without constraints or with only bound constraints. These recently developed methods should be interesting not only to mathematicians who work on numerical analysis and computational optimization, but also to the many scientists and engineers who deal with real optimization problems coming from industry, economics, commerce, etc.













GRADIENT METHODS FOR LARGE-SCALE NONLINEAR OPTIMIZATION


By

HONGCHAO ZHANG















A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA


2006

































Copyright 2006

by

Hongchao Zhang
















To my parents















ACKNOWLEDGMENTS

First of all, I would like to express my gratitude to my advisor, Professor

William W. Hager. His enthusiasm, encouragement, consistent support, and

guidance made my graduate experience fruitful and thoroughly enjoyable. I am

grateful to have had the opportunity to study under such a kind, intelligent, and

energetic advisor. His confidence in me will always stimulate me to move forward

on my research.

Second, I would also like to thank Dr. Panos M. Pardalos, Dr. Shari Moskow,

Dr. Sergei S. Pilyugin, and Dr. Jay Gopalakrishnan, for serving on my supervisory

committee. Their valuable suggestions have been very helpful to my research.

Third, thanks go to my officemates (Dr. Shu-jen Huang, Bv- .i Sukanya), and

all colleagues and friends in the Department of Mathematics at the University of

Florida; their company alleviated the stress and frustration of this time.

Last, but not least, I wish to express my special thanks to my parents for their

immeasurable support and love. Without their support and encouragement, this

dissertation could not have been completed successfully.

This work was supported by the National Science Foundation Grant 0203270.















TABLE OF CONTENTS
page

ACKNOWLEDGMENTS ................... ...... iv

LIST OF TABLES ...................... ......... vii

LIST OF FIGURES ................... ......... viii

KEY TO ABBREVIATIONS ........ ................... ix

KEY TO SYMBOLS ......... ....................... x

ABSTRACT .................. ............ xi

CHAPTER

1 INTRODUCTION .. ... ......... ............ 1

1.1 Motivation ..... .................... 1
1.2 Optimality Conditions ......... ............ 2

2 UNCONSTRAINED OPTIMIZATION .......... ........ 4

2.1 A New Conjugate Gradient Method with Guaranteed De-
scent ................................ 4
2.1.1 Introduction to Nonlinear Conjugate Gradient Method ... 4
2.1.2 CG_DESCENT ..... ........ ....... 6
2.1.3 Global Convergence .......... ........... 8
2.1.4 Line Search .......... ............... 17
2.1.5 Numerical Comparisons .......... ........ 19
2.2 A Cyclic Barzilai-Borwein (CBB) Method ......... 25
2.2.1 Introduction to Nonmonotone Line Search ........ 25
2.2.2 Method and Local Linear Convergence ...... ..... 28
2.2.3 Method for Convex Quadratic Programming . ... 38
2.2.4 An Adaptive CBB Method ................. .. 40
2.2.5 Numerical Comparisons . . . ... .. 46
2.3 Self-adaptive Inexact Proximal Point Methods ...... 50
2.3.1 Motivation and the Algorithm ............... .. 50
2.3.2 Local Error Bound Condition ................ .. 52
2.3.3 Local Convergence ...... ......... .... 56
2.3.4 Global Convergence ...... .......... ...... 67
2.3.5 Preliminary Numerical Results . . ..... 69









3 BOX CONSTRAINED OPTIMIZATION ....... ......... 72

3.1 Introduction ............................ 72
3.2 Gradient Projection Methods ................ .. 75
3.3 Active Set Algorithm (ASA) ................. .. 84
3.3.1 Global Convergence .................. .... 88
3.3.2 Local Convergence .................. .. 90
3.3.3 Numerical Comparisons .............. .. 105

4 CONCLUSIONS AND FUTURE RESEARCH . . ..... 112

REFERENCES .................. ................ .. 114

BIOGRAPHICAL SKETCH .................. ......... 124















LIST OF TABLES
Table page

2-1 Various choices for the CG update parameter ..... . . 5

2-2 Solution time versus tolerance .................. ..... 24

2-3 Transition to superlinear convergence ................. 39

2-4 Comparing CBB(m) method with an adaptive CBB method . 42

2-5 Number of times each method was fastest (time metric, stopping cri-
terion (2.104)) . . . . . . ... .. 49

2-6 CPU times for selected problems .................. .. 50

2-7 ‖g(x_k)‖ versus iteration number k ................ 70

2-8 Statistics for ill-condition CUTE problems and CG_DESCENT ..... .71















LIST OF FIGURES
Figure page

2-1 Performance profiles .................. ........ .. 22

2-2 Performance profiles of conjugate gradient methods . .... 23

2-3 Graphs of log(log(‖g_k‖)) versus k, (a) 3 < n < 6 and m = 3, (b)
6 < n < 9 and m = 4. ............... ... ...... .. 41

2-4 Performance based on CPU time. .................. 48

3-1 The gradient projection step. .................. .... 75

3-2 Performance profiles, 50 CUTEr test problems . . ..... 107

3-3 Performance profiles, 42 sparsest CUTEr problems, 23 MINPACK-2
problems, ε = 10^{-6} .. ............. ...... 108

3-4 Performance profiles, ε = 10^{-2}‖d^1(x_0)‖_∞ . . .109

3-5 Performance profiles, evaluation metric, 50 CUTEr test problems,
gradient-based methods ................ ..... 110

3-6 Performance profiles, evaluation metric, 42 sparsest CUTEr prob-
lems, 23 MINPACK-2 problems ................ 111















KEY TO ABBREVIATIONS


ASA: active set algorithm

CBB: cyclic Barzilai-Borwein

CG: conjugate gradient

LP: linear programming

NCG: nonlinear conjugate gradient

NGPA: nonmonotone gradient projection algorithm

NLP: nonlinear programming

SSOSC: strong second order sufficient condition














KEY TO SYMBOLS

The list shown below gives a brief description of the main mathematical

symbols defined in this work. For each symbol, the page number corresponds to the

place where the symbol is first used.

S : The feasible set ............................ 11

Rⁿ : The space of real n-dimensional vectors .................. 11

k : Integer, often used to denote the iteration number in an algorithm ..... 11

x : Rⁿ vector of unknown variables .................. .... .. 11

x_{ki} : Stands for the i-th component of the iterate x_k ........... 11

f(x) : The objective function f : Rⁿ → R .................... 11

f_k : f_k = f(x_k) ........................ 11

g(x) : A column vector, the transpose of the gradient of the objective function at
x, i.e. g(x) = ∇f(x)^T ............. 11

g_k : g_k = g(x_k) .................. 11

H(x) : The Hessian of the objective function at x, i.e. H(x) = ∇²f(x) ..... 11

H_k : H_k = H(x_k) ............................ 11

‖ · ‖ : The Euclidean norm of a vector ................ ...... 11

B(x, ρ) : The ball with center x and radius ρ > 0 ............... .. 11

|S| : Stands for the number of elements (cardinality) of S, for any set S .. 11

S^c : The complement of S .................. .......... .. 11















Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy

GRADIENT METHODS FOR LARGE-SCALE NONLINEAR OPTIMIZATION

By

Hongchao Zhang

August 2006

Chair: William W. Hager
Major Department: Mathematics

In this dissertation, we develop theories and efficient algorithmic approaches on

gradient methods to solve large-scale nonlinear optimization problems.

The first part of this dissertation discusses new gradient methods and tech-

niques for dealing with large-scale unconstrained optimization. We first propose

a new nonlinear CG method (CG_DESCENT), which satisfies the strong descent

condition g_k^T d_k ≤ −(7/8)‖g_k‖² independent of the line search. This new CG method is

one member of a one parameter family of nonlinear CG method with guaranteed

descent. We also develop a new "Approximate Wolfe" linesearch which is both

efficient and highly accurate. CGDESCENT is the first nonlinear CG method

which satisfies the sufficient descent condition, independent of linesearch. Moreover,

global convergence is established under the standard (not strong) Wolfe conditions.

The CG_DESCENT software turns out to be a benchmark software for solving

unconstrained optimization. Then, we propose a so-called cyclic Barzilai-Borwein

(CBB) method. It is proved that CBB is locally linearly convergent at a local

minimizer with positive definite Hessian. Numerical evidence indicates that when

m > n/2 > 3, CBB is locally superlinearly convergent, where m is the cycle length









and n is the dimension. However, in the special case m = 3 and n = 2, we give an

example which shows that the convergence rate is in general no better than linear.

Combining a nonmonotone line search and an adaptive choice for the cycle length,

an implementation of the CBB method, called adaptive CBB (ACBB), is proposed.

The adaptive CBB (ACBB) performs much better than the traditional BB methods

and is even competitive with some established nonlinear conjugate gradient meth-

ods. Finally, we propose a class of self-adaptive proximal point methods suitable

for degenerate optimization problems where multiple minimizers may exist, or

where the Hessian may be singular at a local minimizer. Two different acceptance

criteria for an approximate solution to the proximal problem are analyzed, and the

convergence rates are analogous to those of exact iterates.

The second part of this dissertation discusses using gradient methods to solve

large-scale box constrained optimization. We first discuss the gradient projection

methods. Then, an active set algorithm (ASA) for box constrained optimization

is developed. The algorithm consists of a nonmonotone gradient projection step,

an unconstrained optimization step, and a set of rules for branching between

the two steps. Global convergence to a stationary point is established. Under

the strong second-order sufficient optimality condition, without assuming strict

complementarity, the algorithm eventually reduces to unconstrained optimization

without restarts. For strongly convex quadratic box constrained optimization, ASA

is shown to have finite convergence when a conjugate gradient method is used in

the unconstrained optimization step. A specific implementation of ASA is given,

which exploits the cyclic Barzilai-Borwein algorithm for the gradient projection

step and CGDESCENT for unconstrained optimization. Numerical experiments

using the box constrained problems in the CUTEr and MINPACK test problem

libraries show that this new algorithm outperforms benchmark software such as

GENCAN, L-BFGS-B, and TRON.















CHAPTER 1
INTRODUCTION

1.1 Motivation

Although computational optimization can be dated back to the maximum

value problems, it became one branch of computational science only after ap-

pearance of the simplex method proposed by Dantzig in the 1940s for linear

programming. Loosely speaking, optimization method seeks to answer the question

"What is best?" for problems in which the quality of any answer can be expressed

as a numerical value. Now as computer power increases so much, it makes possible

for researchers to tackle really large nonlinear problems in many practical appli-

cations. Such problems arise in all areas of mathematics, the physical, chemical

and biological sciences, engineering, architecture, economics, and management.

However, to take advantage of these powers, good algorithms must also be devel-

oped. So developing theories and efficient methods to solve large-scale optimization

problems is a very important and active research area. This is essentially the goal

of this dissertation.

Throughout the dissertation, the nonlinear program (NLP) that we are trying

to solve has the following general formulations:


min{f(x) : x S}, (1.1)


where f is a real-valued, continuous function defined on a nonempty set S ⊂ Rⁿ. f

is often called the objective function and S is called the feasible set. In Chapter 2,

we consider the case where S = Rⁿ, i.e., the unconstrained optimization problem;

while in Chapter 3, we study the case where S is a box set defined on Rⁿ, i.e.,

S = {x ∈ Rⁿ : l ≤ x ≤ u} and l ≤ u are vectors in Rⁿ. For general nonlinear









optimization, people often consider S to be defined by a finite sequence of equality and

inequality constraints. More specifically, problem (1.1) can be reformulated as the

following:

min f(x)

s.t. ci(x) = 0, i 1,2,...,m ;

ci (x) > 0, i = me + 1,..., m,

where ci : R" -- R, i 1, 2, m are smooth functions and at least one of

them is nonlinear. We often denote E = {1,2,..., me}, I = {me + 1,..., m} and

I(x) {ic (x) < 0, i I}.
1.2 Optimality Conditions

First, we give the concepts of global minimum and local minimum.

Definition 1. Given x* E S, if f(x) > f(x*) for all x E S, then x* is called

a global minimum of the problem (1.1). If f(x) > f(x*) for all x E S and x / x*,

then x* is called a strictly global minimum of the problem (1.1).

Definition 2. Given x* E S, if there exists a 6 > 0 such that f(x) > f(x*)

for all x E S n B(x*, 6), then x* is called a local minimum of the problem (1.1). If

f(x) > f(x*) for all x E S n B(x*, 6) and x / x*, then x* is called a strictly local

minimum of the problem (1.1).

When f is a convex function, we know (strictly) local minimum is also a

(strictly) global minimum. However, it is hard in advance to know whether the

objective function is convex or not and in many cases it is very hard or even

impossible to find a global minimum on a feasible region. So in the context of

nonlinear optimization it is often found that a local minimum is a solution of

problem (1.1), and many algorithms are trying to find a feasible point which

satisfies some necessary conditions of a local minimum. In the following, we list

some of these necessary conditions.









Theorem 1 (First order necessary condition for unconstrained optimization)

Suppose f(x) : R" -- R is a continue;, -l; differentiable function. If x* is a local

minimum of problem (1.1), then


Vg(x*) 0.

Theorem 2 ( Second order necessary condition for unconstrained optimization)

Suppose f(x) : R" -- R is a twice continue;, -li differentiable function. If x* is a

local minimum of problem (1.1), then


Vg(x*) 0 and H(x*) > 0.


Because for the general nonlinear optimization, the necessary conditions become

more complicated and have many variants, we only state one first order first order

necessary condition here which is most often used in practice.

Theorem 3 ( First order necessary condition for constrained optimization)

Suppose f(x) and ci(x)(i 1 ,m) in problem (1.2) are continue;, -l; differ-

entiable functions. If x* is a local minimum of problem (1.2) and Vc7(x*)(i E

E U I(x*)) are linearly independent, then there exists AX(i = 1, m) such that


Vf(x*) = A*Vci(x*)
i= 1
A* > 0, A ci(x*) 0, i c I.


For the proof of these theorems and many other necessary optimality conditions,

one may refer to R. Fletcher's books [55, 56].

In this dissertation, we mainly focus on developing gradient methods to

generate iteration points which satisfy some first order condition for large-scale

unconstrained and box constrained optimization.














CHAPTER 2
UNCONSTRAINED OPTIMIZATION

In this chapter, we consider to solve (1.1) with S = R", i.e. the following

unconstrained optimization problem:


min {f(x) : x }, (2.1)


where f is a real-valued, continuous function.

2.1 A New Conjugate Gradient Method with Guaranteed Descent

2.1.1 Introduction to Nonlinear Conjugate Gradient Method

Conjugate gradient (CG) methods comprise a class of unconstrained opti-

mization algorithms which are characterized by low memory requirements and

strong local and global convergence properties. CG history, surveyed by Golub

and O'leary [65], begins with research of Cornelius Lanczos and Magnus Hestenes

and others (Forsythe, Motzkin, Rosser, Stein) at the Institute for Numerical Anal-

ysis (National Applied Mathematics Laboratories of the United States National

Bureau of Standards in Los Angeles), and with independent research of Eduard

Stiefel at Eidg. Technische Hochschule Zirich. In the seminal 1952 paper [81] of

Hestenes and Stiefel, the algorithm is presented as an approach to solve symmetric,

positive-definite linear systems.

When applied to the nonlinear problem (2.1), a nonlinear conjugate gradient

method generates a sequence xk, k > 1, starting from an initial guess xo E R",

using the recurrence

Xk+1 = Xk + kdk, (2.2)









Table 2-1: Various choices for the CG update parameter


T
nHS gk+lYk
dTyk

nFR llgk+1 l2
k|cIg, 1:-


D gk+ V2f(xk)dk
dT7V2f(xk)dk

T
kPRP gk+lYk
k3c 11g,11


QCD 1gk+1 1
-dkgk


g-dygk
L I_ g +l 12


k Yk- 2dk dllg+ Tk
S2y4-2d y) gk+1
d y/ k

where the positive step size ak is

are generated by the rule:


dk+l


(1952) in the original (linear) CG paper
of Hestenes and Stiefel [81]
(1964) first nonlinear CG method, proposed
by Fletcher and Reeves [57]
(1967) proposed by Daniel [40], requires
evaluation of the Hessian V2f(x)

(1969) proposed by Polak and Ribikre [106]
and by Polyak [107]
(1987) proposed by Fletcher [55], CD
stands for "Conjugate D. il

(1991) proposed by Liu and Storey [93]


(1999)


proposed by Dai and Yuan [35]


(2005) proposed by Hager and Zhang [73]


obtained by a line search, and the directions dk


gk+l + 1 kdk, do


(2.3)


Table 2-1 provides a chronological list of some choices for the CG update

parameter. The 1964 formula of Fletcher and Reeves is usually considered the

first nonlinear CG algorithm since their paper [57] focuses on nonlinear opti-

mization, while the 1952 paper [81] of Hestenes and Stiefel focuses on symmetric,

positive-definite linear systems. Daniel's choice for the update parameter, which is

fundamentally different from the other choices, is not discussed in this dissertation.

For large-scale problems, choices for the update parameter that do not require

the evaluation of the Hessian matrix are often preferred in practice over methods









that require the Hessian in each iteration. In the remaining methods of Table 2-1,

except for the new method at the end, the numerator of the update parameter

3k is either ||gk+1 12 or gT LYk and the denominator is either ||g, ||- or d yk or
-dkgk. The 2 possible choices for the numerator and the 3 possible choices for the

denominator lead to 6 different choices for f3k shown in Table 2-1.

If f is a strongly convex quadratic, then in theory, all 8 choices for the update

parameter in Table 2-1 are equivalent with an exact line search. However, for

non-quadratic cost functions, each choice for the update parameter leads to quite

different performance under inexact line searches.

2.1.2 CG_DESCENT

The method corresponding to the last parameter f3 in Table 2-1 is a recently

developed NCG method [73] with guaranteed descent, named CG_DESCENT.

It has close connections to memoryless qu -i-N. .-ton scheme of Perry [105] and

Shanno [115]. To prove the global convergence for a general nonlinear function,

similar to the approach [60, 79, 121] taken for the Polak-Ribiere-Polyak [106, 107]

version of the conjugate gradient method, we restrict the lower value of O3N. In our

restricted scheme, unlike the Polak-Ribiere-Polyak method, we dynamically adjust

the lower bound on fkN in order to make the lower bound smaller as the iterates

converge:

dk+1 gk+l + ~kdk, do -go, (2.4)
-1
N max{ kk}, k d llmin{, (2.5)


where Tr > 0 is a constant; we took r = .01 in the experiments

With conjugate gradient methods, the line search typically requires sufficient

accuracy to ensure that the search directions yield descent. Moreover, it has been

shown [36] that for the Fletcher-Reeves [57] and the Polak-Ribiere-Polyak [106, 107]

conjugate gradient methods, a line search that satisfies the strong Wolfe conditions









may not yield a direction of descent, for a suitable choice of the Wolfe line search
parameters, even for the function f(x) = A||x||2, where A > 0 is a constant. An

attractive feature of the new conjugate gradient scheme, which we now establish, is
that the search directions alv--i yield descent when dTyk / 0, a condition which is

satisfied when f is strongly convex, or the line search satisfies the Wolfe conditions.
Theorem 4 If d Tyk / 0 and

dk+l -gk+l + Tdk, do = -go, (2.6)

for w,', T e [3fv,max{3 N,0}], then

gkdk < -- Ilg12. (2.7)
8

Proof. Since do = -go, we have gTdo = -lgo l2, which satisfies (2.7). Suppose
T = fN. Multiplying (2.6) by gk+, we have


gk+ldk+l -llgk+12 k gk+ldk
2+ T yTgk+ l IgT Idk
-|g1k+1 2 gk+ldk (yk+ 2 2y+ldk)
kg l (dy +gyk
YkgTk+l(d kk)(g k+dk) lg+1 2(dyk2 2|y, I(gldk) (2.8)
(dTyk )2

We apply the inequality
UTV 1 12+ IV12)
2
to the first term in (2.8) with

1
u -= (d yk)gkc+ and v 2(g Ildk)yk
2

to obtain (2.7). On the other hand, if T / O3N, then fnv < T < 0. After multiplying
(2.6) by gkT+, we have


gk+Idk+I lgk+ll2 + g ldk.









If gk+ldk > 0, then (2.7) follows immediately since 7 < 0. If gk+ldk < 0, then

gk l112 + Id112 N T 2d
gk+1dk+- -d-1g9k+2 + g dk _< -g9k+2 + /kg+1dk

since 3fN < < 0. Hence, (2.7) follows by our previous analysis. D
By taking T = 3 we see that the directions generated by (2.2)-(2.3) are
descent directions. Since rlk in (2.5) is negative, it follows that

~ = max { 77k} e [3k ,max{/3, 0}]

Hence, the direction given by (2.4) and (2.5) is a descent direction. Dai and Yuan
[35, 37] present conjugate gradient schemes with the property that d gk < 0 when

dkyk > 0. If f is strongly convex or the line search satisfies the Wolfe conditions,
then dkyk > 0 and the Dai/Yuan schemes yield descent. Note that in (2.7) we
bound dkgk by -(7/8) lgk 2, while for the schemes [35, 37], the negativity of dTgk
is established.
2.1.3 Global Convergence

Convergence analysis for strongly convex functions. Although the search di-
rections generated by either (2.2)-(2.3) with /3k = 3N or (2.4)-(2.5) are ah--v
descent directions, we need to constrain the choice of ak to ensure convergence. We
consider line searches that satisfy either Goldstein's conditions [64]:

61akg dk <_ f(xk + akdk) f(xk) < 62akgkdk, (2.9)

where 0 < 62 < < 61 < 1 and ak > 0, or the Wolfe conditions [122, 123]:

f(xk + kdk) f(xk) < 6ag, dk, (2.10)

gk+ldk > 'gkdk, (2.11)









where 0 < 6 < a < 1. As in in Dai and Yuan [35], we do not require the -~i ii,

W\. ., condition Ig ldkl < --aggdk, which is often used to prove convergence of

nonlinear conjugate gradient methods.

Lemma 1 Suppose that dk is a descent direction and Vf .il.:;i. the Lipschitz

condition

I|Vf(x) Vf(xk) | < L|x x, |

for all x on the line segment connecting Xk and Xk+l, where L is a constant. If the

line search -./.-I. / the Goldstein conditions, then

t(1 1) |g dk
k > -(2.12)
L Idcl, ||-

If the line search -./.:/ the Wolfe conditions, then

1 7 |g Td
ak > 1 k (2.13)
L Idc, II-

Proof. The proof is standard and we omit its proof here (for example similar

proofs can be found [73, 131]). D

We now prove convergence of the unrestricted scheme (2.2)-(2.3) with O3k = P3

when f is strongly convex.

Theorem 5 Suppose that f is strongly convex and Lipschitz continuous on the

level set

= {x E : f(x) < f(xo)}. (2.14)

That is, there exists constants L and p > 0 such that


I|Vf(x) f(y)|| < L||x yl and (2.15)

pl|x -y1|2 < (Vf(x) -Vf(y))(x -y)

for all x and y c L. If the conjugate gradient method (2.2)-(2.3) is implemented

using a line search that -./.;'i. -. either the Wolfe or the Goldstein conditions in









each step, then either gk 0 for some k, or

lim gk 0. (2.16)
k->oo

Proof. Suppose that gk / 0 for all k. By the strong convexity assumption,


kdk (gk+1 g)kTdk > ,.,, ldk 2. (2.17)

Theorem 4 and the assumption gk / 0 imply that dk / 0. Since ak > 0, it follows

from (2.17) that Yk dk > 0. Since f is strongly convex over L, f is bounded from

below. After summing over k the upper bound in (2.9) or (2.10), we conclude that


akgkdk > -o .
k-0
Combining this with the lower bound for ak given in Lemma 1 and the descent

property (2.7) gives

Sgl < 00. (2.18)
kO0
By Lipschitz continuity (2.15),


Ykll lgk+1 g, II lVf(xk + akdk) Vf(xk)l| < Lakl|d, I|. (2.19)

Utilizing (2.17) and (2.3), we have

nN Ygk+1 H l gk gk+
S dTUyk (d yk)2

Yk k1 12 1
< lYkfllllgf+lll + 11Y, 11-'llC- 111 kc+ill
,- .. ldk 2 2ca ||dk 14
SL(, Ldk gik+ 1| L2 II k gk+111
akc|d, ||- /-I 2|d, |1k

-< + ) 19 (2.20)

Hence, we have


||dk+1 < gk+1| + |1l |dkj < 2 + + 2 gk+1.









Inserting this upper bound for dk in (2.18) yields
00
iig9i2 < 00
k= 1
which completes the proof. O

We now observe that the directions generated by the new conjugate gradient

update (2.2) approximately point along the Perry/Shanno direction,

T
PS Yk Sk dk gk+l t)
dk+1 I dk + Yk (2.21)

where Sk Xk+1 xk, when f is strongly convex and the cosine of the angle

between dk and gk+1 is sufficiently small. By (2.17) and (2.19), we have

d T k1 L T
Id gklYk < L ugk+1 C1 gk+ I (2.22)

where uk = dk/l|dkl is the unit vector in the direction dk, e is the cosine of the

angle between dk and gk+1, and c = L/p. By the definition of dk+1 in (2.2), we

have

ldk+1 2 > gk+1 2/3 dkgk+1. (2.23)

By the bound for O1 in (2.20),
ION T IUT 112
|/Pdgk+l < C2 uk k+1gI k+1 c2C2I k+1 2, (2.24)

where c2 is the constant appearing in (2.20). Combining (2.23) and (2.24), we have


|ldk+i1 > /1- 2c2Cg1k+ll.

This lower bound for ||dk+l|| and the upper bound (2.22) for the Yk term in (2.21)

imply that the ratio between them is bounded by cie/ 1\ 2c2c. As a result,

when c is small, the direction generated by (2.2) is approximately a multiple of the

Perry/Shanno direction (2.21).









Convergence analysis for general nonlinear functions. Our analysis of (2.4)-(2.5)

for general nonlinear functions exploits insights developed by Gilbert and Nocedal

in their analysis [60] of the PRP+ scheme. Similar to the approach taken in

[60], we establish a bound for the change uk+I Uk in the normalized direction

Uk = dk/llI II, which we use to conclude, by contradiction, that the gradients
cannot be bounded away from zero. The following theorem is the analogue of [60,

Lemma 4.1], it differs in the treatment of the direction update formula (2.4).

Lemma 2 If the level set (2.14) is bounded and the Lipschitz condition (2.15)

holds, then for the scheme (2.4)-(2.5) and a line search that -,/'. f, the Wolfe

conditions (2.10)-(2.11), we have
00
dk 0 for each and ||Iuk+ U- < o
k-0

whenever inf { ||gk : k > 0} > 0.

Proof. Define 7 = inf {||g, I| : k > 0}. Since 7 > 0 by assumption, it follows from

the descent property Theorem 4 that dk / 0 for each k. Since L is bounded, f is

bounded from below, and by (2.10) and (2.13),


k-0
(gfdc)2


Again, the descent property yields

4 < < 64 (gdk)2 < o. (2.25)
k Z Id I- Z d, I|- 49 lcd, II
k=0 k11 -0 k-0 0
Define the quantities:
-gk +Bfldk-1 Ild-l||
k+ max{fo,0}, 0f min{f ,0}, rk k k-, 6k k- i1 *

By (2.4)-(2.5), we have

dk -gk + (kk-1 + O -l)dk-i
Uk IIII II rk + 6kuk-1.









Since the uk are unit vectors,


||rkl = |Uk 6kUk-11| i,, Uk-1 .-

Since 6k > 0, it follows that

||Uk Uk- 1 < (t + 6k)(Uk Uk-1)

< ||jUk 6kUk-1Uk_1 + 1,I, Uk-1_l

211rkl |. (2.26)

By the definition of fk and the fact that rlk < 0 and 3ikN > rlk in (2.5), we have the
following bound for the numerator of rk:


11 gk 1+/_idk-1| < 11gkII min{sk1I,0}||dk-I||

< I1g llk Tk-lll||dk-i

< ||gk + -Ildk-lI
||dk- 1 min{l7, 7}
< 1+ (2.27)
minmr,y 7}

where

F max| Vf(x) |. (2.28)
xE6
Let c denote the expression F + 1/min{q, 7} in (2.27). This bound for the
numerator of rk coupled with (2.26) gives

2c
iUk- Uk-1i|| 21|rk||
Finally, squaring (2.29), summing over k, and utilizing (2.25), the proof is com-

plete. D
Theorem 6 If the level set (2.14) is bounded and the Lipschitz condition (2.15)

holds, then for the scheme (2.4) (2.5) and a line search that -.i-/. f, the Wolfe









conditions (2.10)-(2.11), either gk = 0 for some k or


liminf Il gkl 0. (2.30)
k->oo

Proof. We suppose that gk / 0 for all k, and liminf I||gr| > 0, and we obtain a
k->oo
contradiction. Defining 7 = inf {||g, || : k > 0}, we have 7 > 0 due to (2.30) and the

fact that gk / 0 for all k. The proof is divided into 3 steps.

I. A bound for /3kN

By the Wolfe condition gk+dk > > Jgkdk, we have


y dk (g k+I gkTdk > (a 1)g d -(1 a)gdk. (2.31)

By Theorem 4,
T 7 7d2
-gjdk > 81g, II- > -72
8 8

Combining this with (2.31) gives

T 7
ykdk > (1- a)82. (2.32)

Also, observe that
gTdk T T T (2.33)
g+ldk Y dk + gdk < ydk. (2.33)

Again, the Wolfe condition gives


gk+ldk > >g dk = --y dk + gldk. (2.34)

Since a < 1, we can rearrange (2.34) to obtain

--T T
gk+ldk > t y dk.
1-a

Combining this lower bound for gk+ldk with the upper bound (2.33) yields


gT dk ydk 1 -









By the definition of 3kN in (2.5), we have

f=N =N if N > 0 and 0> "N > fN if N < 0.


Hence, I3fN < I3fkN for each k. We now insert the upper bound (2.35) for

gkldk /lydkl, the lower bound (2.32) for ykdk, and the Lipschitz estimate
(2.19) for yk into the expression (2.3) to obtain:

| kN < N ION
1 1 _< 1 k 1

< 1 Y k+1| + 2y, kl+Idk)
Id< Tyk I k+ll+2Y yTdk
8 1 a
< LF )2 ||1||+2L-|| ||2maX{ ,1)
7 (1 a)72 -a )
< C|-, |\, (2.36)

where F is defined in (2.28),

8 1 2
C 7= ( -) (L + 2L2Dmax ,l } (2.37)

D = max{|ly- zl| :y,z e }. (2.38)

Here D is the diameter of L.

II. A bound on the steps sk:

This is a modified version of [60, Thm. 4.3]. Observe that for any I > k,
1-1 1-1 1-1 1-1
X1 Xk Xj+1 X +- = IISj U = Y ISjk + Y ISSjH(Uj Uk)
j=k j=k j=k j=k

By the triangle inequality:
1-1 1-1 1-1
||s ll|| < ||x x, || + I||sjl||||u ukI| < D + l||sj||||u u, ||. (2.39)
j=k j=k j=k

Let A be a positive integer, chosen large enough that


A > 4CD,


(2.40)









where C and D appear in (2.37) and (2.38). C'! ...- ko large enough that


i I- u 2 < (2.41)
I l l 4A
i>ko

By Lemma 2, ko can be chosen in this way. If j > k > ko and j k < A, then by

(2.41) and the Cauchy-Schwarz inequality, we have

j-1
u -, Y|| < ||u+ ull||
i=k
< k i,+1i -

(1 1/2 t
4A 2

Combining this with (2.39) yields

1-1
S||s,| < 2D, (2.42)
j-k

when I > k > ko and k < A.

III. A bound on the directions dl:

By (2.4) and the bound on O3N given in Step I, we have


|ld112 < (lgll + lNl |lll|d- 1)2 < 2F2 + 2C2 S1 -1l 2|| 12 12,


where F is the bound on the gradient given in (2.28). Defining Si = 2C2 11S 2, we

conclude that for 1 > ko,
( 1 1-1 1-1
|dl12 < 22 i s + IIdko 2 SI (2.43)
i=ko+l j=i j=ko









Above, the product is defined to be one whenever the index range is vacuous. Let

us consider a product of A consecutive Sj, where k > ko:

k+A-1 k+A-l 1 (k+A-1 2
U Sj 2C2s112 S Cl s)l
j=k j=k j=k

k-1 2C\s\ 2A ( 2 CD 1

The first inequality above is the arithmetic-geometric mean inequality, the second is

due to (2.42), and the third comes from (2.40). Since the product of A consecutive

Sj is bounded by 1/2^, it follows that the sum in (2.43) is bounded, independent

of 1. This bound for ||dll|, independent of I > ko, contradicts (2.25). Hence,

7 liminf Il|gk| 0. O
k->oo
2.1.4 Line Search

The line search is an important factor in the overall efficiency of any opti-

mization algorithm. Papers focusing on the development of efficient line search

algorithms include [1, 85, 100, 101]. The algorithm [101] of Mor6 and Thuente

is used widely; it is incorporated in the L-BFGS limited memory qu -i-N. i-ton

code of Nocedal and in the PRP+ conjugate gradient code of Liu, Nocedal, and

Waltz. However, there is a fundamental numerical problem associated with the first

condition (2.10) in the standard Wolfe conditions (for detail explanations, please

refer the paper [73]). Based on this observation, in practice we proposed the the

approximate Wolfe conditions:

(26 1)Q'(0) > _'(ak) > a7'(0), (2.44)

where 6 < min{.5, a} and Q(a) = f(xk + adk). The second inequality in (2.44)

is identical to the second Wolfe condition (2.11). The first inequality in (2.44) is

identical to the first Wolfe condition (2.10) when f is quadratic. For general f, we

now show that the first inequality in (2.44) and the first Wolfe condition agree to









order ac. The interpolating (quadratic) polynomial q that matches ((a) at a = 0

and 0'(a) at a = 0 anda = ak is

q(ak) '(0)2 + 0'(0)a + 0(0).
2ak

For such an interpolating polynomial, |q(a) 0(a)| = O(a3). After replacing 4 by

q in the first Wolfe condition, we obtain the first inequality in (2.44) (with an error

term of order a ). We emphasize that this first inequality is an approximation to

the first Wolfe condition. On the other hand, this approximation can be evaluated

with greater precision than the original condition, when the iterates are near a

local minimizer, since the approximate Wolfe conditions are expressed in terms of a

derivative, not the difference of function values.

With these insights, we terminate the line search when either of the following

conditions holds:

T1. The original Wolfe conditions (2.10)-(2.11) are satisfied.

T2. The approximate Wolfe conditions (2.44) are satisfied and


(ak) < 0(0) + Ck, (2.45)

where ek > 0 is an estimate for the error in the value of f at iteration k. In

the experiments section, we took


Ek /(xk) (2.46)

where c is a (small) fixed parameter.

We satisfy the termination criterion by constructing a nested sequence of

(bracketing) intervals which converge to a point satisfying either T1 or T2. A

typical interval [a, b] in the nested sequence satisfies the following opposite slope

condition:


0(a) < 0(0) + Ck, 0'(a) < 0, 0'(b) > 0.


(2.47)









Given a parameter 0 E (0, 1). We also develop the interval update rules which can

be found as the procedure ili,, i1 I! update" in the paper [73]. And during the

i, i, i i,! up(d ,. procedure, a new so called "Double Secant Step" is used. We

prove implementing this new "Double Secant Step" in the \I -, i. 1l update", an

..i-mptotic root convergence order 1 + v/2 2.4 can be obtained with is slightly

less the the square of the convergence speed of the traditional second method

((1 + V5)2/4 m 2.6). More specifically, we have the following theorem. For the
detail proof, please refer the paper [73].

Theorem 7 Suppose that Q is three times continuo;,-l/; differentiable near a local

minimizer a*, with J"(a*) > 0 and ('" (a*) / 0. Then for ao and bo suff.. :'. ll;,

close to a* with ao < a* < bo, the iteration


[ak+l,bk+l] 1 l(ak, bk)

converges to a*. Moreover, the interval width Ibk akl tends to zero with root

convergence order 1 + /2.

2.1.5 Numerical Comparisons

In this section we compare the CPU time performance of the new conjugate

gradient method, denoted CGDESCENT, to the L-BFGS limited memory quasi-

Newton method of Nocedal [103] and Liu and Nocedal [91] and to other conjugate

gradient methods as well. Comparisons based on other metrics, such as the

number of iterations or number of function/gradient evaluations, can be found

in paper [74], where extensive numerical testing of the methods is done. We

considered both the PRP+ version of the conjugate gradient method developed by

Gilbert and Nocedal [60], where the /3k associated with the Polak-Ribikre-Polyak

conjugate gradient method [106, 107] is kept nonnegative, and versions of the

conjugate gradient method developed by Dai and Yuan in [35, 37], denoted CGDY

and CGDYH, which achieve descent for any line search that satisfies the Wolfe









conditions (2.10)-(2.11). The hybrid conjugate gradient method CGDYH uses


/3k max{0, min{/3kHS DY}},


where /3Hs is the choice of Hestenes-Stiefel [81] and /3Y appears in [35]. The test

problems are the unconstrained problems in the CUTE [12] test problem library.

The L-BFGS and PRP+ codes were obtained from Jorge Nocedal's web

page. The L-BFGS code is authored by Jorge Nocedal, while the PRP+ code

is co-authored by Guanghui Liu, Jorge Nocedal, and Richard Waltz. In the

documentation for the L-BFGS code, it is recommended that between 3 and

7 vectors be used for the memory. Hence, we chose 5 vectors for the memory.

The line search in both codes is a modification of subroutine CSRCH of Mord

and Thuente [101], which employs various polynomial interpolation schemes and

safeguards in satisfying the strong Wolfe line search conditions.

We also manufactured a new L-BFGS code by replacing the Mor6/Thuente

line search by the new line search presented in our paper. We call this new code

L-BFGS*. The new line search would need to be modified for use in the PRP+

code to ensure descent. Hence, we retained the Mor6/Thuente line search in the

PRP+ code. Since the conjugate gradient algorithms of Dai and Yuan achieves

descent for any line search that satisfies the Wolfe conditions, we are able to use

the new line search in our experiments with CGDY and with CGDYH. All codes

were written in Fortran and compiled with f77 (default compiler settings) on a Sun

workstation.

For our line search algorithm, we used the following values for the parameters:


6 .1, a =.9, 10-6, 0 .5, 7 .66, r = .01


Our rationale for these choices was the following: The constraints on 6 and a are

0 < 6 < a < 1 and 6 < .5. As 6 approaches 0 and a approaches 1, the line search









terminates quicker. The chosen values 6 = .1 and a = .9 represent a compromise

between our desire for rapid termination and our desire to improve the function

value. When using the approximate Wolfe conditions, we would like to achieve

decay in the function value, if numerically possible. Hence, we made the small

choice c = 10-6 in (2.46). When restricting /3k in (2.5), we would like to avoid

truncation if possible, since the fastest convergence for a quadratic function is

obtained when there is no truncation at all. The choice r = .01 leads to infrequent

truncation of 3k. The choice 7 = .66 ensures that the length of the interval [a, b]

decreases by a factor of 2/3 in each iteration of the line search algorithm. The

choice 0 = .5 in the update procedure corresponds to the use of bisection. Our

starting guess for the step ak in the line search was obtained by minimizing a

quadratic interpolant.

In the first set of experiments, we stopped whenever


(a) ||Vf(xk) Io < 10-6 or (b) akg k < 10-20 fXk1), (2.48)


where || || denotes the maximum absolute component of a vector. In all but

3 cases, the iterations stopped when (a) was satisfied -the second criterion

essentially -i,- that the estimated change in the function value is insignificant

compared to the function value itself.

The cpu time in seconds and the number of iterations, function evaluations,

and gradient evaluations, for each of the methods are posted at the author's web

site. In running the numerical experiments, we checked whether different codes

converged to different local minimizers; we only provide data for problems where

all six codes converged to the same local minimizer. The numerical results are now

analyzed.

The performance of the 6 algorithms, relative to cpu time, was evaluated using

the profiles of Dolan and More6 [43]. That is, for each method, we plot the fraction










P


09-

o08 CG -BFGS L-BFGS

07-
07 ~ / /
/ PRP+
06-

05-

04

03

02

01 -


1 4 16

Figure 2-1: Performance profiles



P of problems for which the method is within a factor r of the best time. In Figure

2-1, we compare the performance of the 4 codes CG, L-BFGS*, L-BFGS, and

PRP+. The left side side of the figure gives the percentage of the test problems

for which a method is the fastest; the right side gives the percentage of the test

problems that were successfully solved by each of the methods. The top curve is

the method that solved the most problems in a time that was within a factor r of

the best time. Since the top curve in Figure 2-1 corresponds to CG, this algorithm

is clearly fastest for this set of 113 test problems with dimensions ranging from

50 to 10,000. In particular, CG is fastest for about '.1', (68 out of 113) of the

test problems, and it ultimately solves 1li 1' of the test problems. Since L-BFGS*

(fastest for 29 problems) performed better than L-BFGS (fastest for 17 problems),

the new line search led to improved performance. Nonetheless, L-BFGS* was still

dominated by CG.

In Figure 2-2 we compare the performance of the four conjugate gradient










P


09- CG
CGDY
08-

07 PRP+

06

05

04-



02
/
01

0 t
1 4 16

Figure 2-2: Performance profiles of conjugate gradient methods



algorithms. Observe that CG is the fastest of the four algorithm. Since CGDY,

CGDYH, and CG use the same line search, Figure 2-2 indicates that the search

direction of CG yields quicker descent than the search directions of CGDY and

CGDYH. Also, CGDYH is more efficient than CGDY. Since each of these six codes

differs in the amount of linear algebra required in each iteration and in the relative

number of function and gradient evaluations, different codes will be superior in

different problem sets. In particular, the fourth ranked PRP+ code in Figure 2 1

still achieved the fastest time in 6 of the 113 test problems.

In our next series of experiments, shown in Table 2-2, we explore the ability

of the algorithms and line search to accurately solve the test problems. In this

series of experiments, we repeatly solve six test problems, increasing the specified

accuracy in each run. For the initial run, the stopping condition was I||gk < 10-2,

and in the last run, the stopping condition was I||gk l < 10-12. The test problems

used in these experiments, and their dimensions, were the following:









Table 2-2: Solution time versus tolerance


Tolerance
11g 1 ,


Algorithm
Name


Problem Number
#1 #2 #3 #4 #5


CG 5.22 2.32 0.86 0.00 1.57 10.04
10-2 L-BFGS* 4.19 1.57 0.75 0.01 1.81 14.80
L-BFGS 4.24 2.01 0.99 0.00 2.46 16.48
PRP+ 6.77 3.55 1.43 0.00 3.04 17.80
CG 9.20 5.27 2.09 0.00 2.26 17.13
10-3 L-BFGS* 6.72 6.18 2.42 0.01 2.65 19.46
L-BFGS 6.88 7.46 2.65 0.00 3.30 22.63
PRP+ 12.79 7.16 3.61 0.00 4.26 24.13
CG 10.79 5.76 5.04 0.00 3.23 25.26
10-4 L-BFGS* 11.56 10.87 6.33 0.01 3.49 31.12
L-BFGS 12.24 10.92 6.77 0.00 4.11 33.36
PRP+ 15.97 11.40 8.13 0.00 5.01 F
CG 14.26 7.94 7.97 0.00 4.27 27.49
10-5 L-BFGS* 17.14 16.05 10.21 0.01 4.33 36.30
L-BFGS 16.60 16.99 10.97 0.00 4.90 F
PRP+ 21.54 12.09 12.31 0.00 6.22 F
CG 16.68 8.49 9.80 5.71 5.42 32.03
10-6 L-BFGS* 21.43 19.07 14.58 9.01 5.08 46.86
L-BFGS 21.81 21.08 13.97 7.78 5.83 F
PRP+ 24.58 12.81 15.33 8.07 7.95 F
CG 20.31 11.47 11.93 5.81 5.93 39.79
10-7 L-BFGS* 26.69 25.74 17.30 12.00 6.10 54.43
L-BFGS 26.47 F 17.37 9.98 6.39 F
PRP+ 31.17 F 17.34 8.50 9.50 F
CG 23.22 12.88 14.09 9.68 6.49 47.50
10-8 L-BFGS* 28.18 33.19 20.16 16.58 6.73 63.42
L-BFGS 32.23 F 20.48 14.85 7.67 F
PRP+ 33.75 F 19.83 F 10.86 F
CG 27.92 13.32 16.80 12.34 7.46 56.68
10-9 L-BFGS* 32.19 38.51 26.50 26.08 7.67 72.39
L-BFGS 33.64 F F F 8.50 F
PRP+ F F F F 11.74 F
CG 33.25 13.89 21.18 13.21 8.11 65.47
10-10 L-BFGS* 34.16 50.60 29.79 33.60 8.22 79.08
L-BFGS 39.12 F F F 9.53 F
PRP+ F F F F 13.56 F
CG 38.80 14.38 25.58 13.39 9.12 77.03
10-11 L-BFGS* 36.78 55.70 34.81 39.02 9.14 88.86
L-BFGS F F F F 9.99 F
PRP+ F F F F 14.44 F


10-12


CG
L-BFGS*
L-BFGS
PRP+


42.51
41.73
F
F


15.62
60.89
F
F


27.54
39.29
F
F


13.38
43.95
F
F


9.77
9.97
10.54
15.96


78.31
101.36
F
F









1. FMINSURF (5625) 4. FLETCBV2 (1000)

2. NONCVXU2 (1000) 5. SCHMVETT (10000)

3. DIXMAANE (6000) 6. CURLY10 (1000)
These problems were chosen somewhat randomly except that we did not include

any problem for which the optimal cost was zero. When the optimal cost is zero

while the minimizer x is not zero, the estimate lf(xk)| for the error in function

value (which we used in the previous experiments) can be very poor as the iterates

approach the minimizer (where f vanishes). These six problems all have nonzero

optimal cost. In Table 2-2, F means that the line search terminated before the

convergence tolerance for ||g, 1| was satisfied. According to the documentation

for the line search in the L-BFGS and PRP+ codes, "Rounding errors prevent

further progress. There may not be a step which satisfies the sufficient decrease and

curvature conditions. Tolerances may be too small."

As can be seen in Table 2-2, the line search based on the Wolfe conditions

(used in the L-BFGS and PRP+ codes) fails much sooner than the line search

based on both the Wolfe and the approximate Wolfe conditions (used in CG and

L-BFGS*). Roughly, a line search based on the Wolfe conditions can compute a

solution with accuracy on the order of the square root of the machine epsilon, while

a line search that also includes the approximate Wolfe conditions can compute a

solution with accuracy on the order of the machine epsilon.

2.2 A Cyclic Barzilai-Borwein (CBB) Method

2.2.1 Introduction to Nonmonotone Line Search

We know many iterative methods for (2.1) produce a sequence x0, Xl, 2...,

where Xk+1 is generated from xk, the current direction dk, and the stepsize ak > 0

by the rule


Xk+1 Xk + akdk.


(2.49)









In monotone line search methods, ak is chosen so that f(xk+l) < f(xk). In

nonmonotone line search methods, some growth in the function value is permitted.

As pointed out by many researchers (for example, see [28, 117]), nonmonotone

schemes can improve the likelihood of finding a global optimum; also, they can

improve convergence speed in cases where a monotone scheme is forced to creep

along the bottom of a narrow curved valley. Encouraging numerical results have

been reported [27, 67, 94, 104, 108, 117, 130] when nonmonotone schemes were

applied to difficult nonlinear problems.

The earliest nonmonotone line search framework was developed by Grippo,

Lampariello, and Lucidi in [66] for Newton's methods. Their approach was roughly

the following: Parameters Ai, A2, a, and 6 are introduced where 0 < A1 < A2 and

a, 6 E (0, 1), and they set ak akJah where ak E (1, A2) is the I i i! step" and hk

is the smallest nonnegative integer such that


f(Xk + kdk) < max f(xk-j) + cakV/(xk)dk. (2.50)

The memory mk at step k is a nondecreasing integer, bounded by some fixed

integer M. More precisely,


mo = 0 and for k > 0, 0 < mk < min{mkl + 1, M}.


T ii.: subsequent papers, such as [10, 27, 67, 94, 108, 134], have exploited non-

monotone line search techniques of this nature. Recently, a completely new

nonmonotone technique is proposed and analyzed in [131]. In that scheme, the

authors require that an average of the successive function values decreases, while

the above traditional nonmonotone approach (2.50) requires that a maximum of

recent function values decreases. They also proved global convergence for noncon-

vex, smooth functions, and R-linear convergence for strongly convex functions.

For the L-BFGS method and the unconstrained optimization problems in the









CUTE library, this new nonmonotone line search algorithm used fewer function

and gradient evaluations, on average, than either the monotone or the traditional

nonmonotone scheme.

The original method of Barzilai and Borwein was developed in [3]. The

basic idea of Barzilai and Borwein is to regard the matrix D(ak) = 1 as an
ak
approximation of the Hessian V2f(xk) and impose a q, ,--1. ton property on

D(oak):

k = argmin ||D(Q)sk_1 Yk-1 2, (2.51)
aEs

where Sk-1 = Xk xk-1, Yk-1 g k gk-1, and k > 2. The proposed stepsize,

obtained from (2.51),is
T
BB Sk-lSk-1
ak T (2.52)

Other possible choices for the stepsize ak include [29, 31, 34, 38, 58, 68, 109, 114].

In this dissertation, we refer to (2.52) as the Barzilai-Borwein (BB) formula. The

iterative method (2.49), corresponding to the step size to be the BB stepsize (2.52)

and dk -gk, is called the BB method. Due to their simplicity, efficiency and

low memory requirements, BB-like methods have been used in many applications.

Glunt, Hayden, and R wdan [62] present a direct application of the BB method in

chemistry. Birgin et al. [8] use a globalized BB method to estimate the optical con-

stants and the thickness of thin films, while in Birgin et al. [10] further extensions

are given, leading to more efficient projected gradient methods. Liu and Dai [92]

provide a powerful scheme for solving noisy unconstrained optimization problems

by combining the BB method and a stochastic approximation method. The pro-

jected BB-like method turns out to be very useful in machine learning for training

support vector machines (see Serafini et al [114] and Dai and Fletcher [31]). Em-

pirically, good performance is observed on a wide variety of classification problems.

All the above good performances of the BB-like method are based on combining

the method with some nonmonotone line search technique. In the following of this









section, we are going to discuss a so called cyclic BB method (CBB). By using

a modified version of the adaptive nonmonotone line search developed in [39], a

globally adaptive convergent scheme for CBB method is also developed.

2.2.2 Method and Local Linear Convergence

The superior performance of cyclic steepest descent, compared to the ordinary

steepest descent, as shown in [30], leads us to consider the cyclic BB method

(CBB):

am+i = a+1BB for i 1,... m, (2.53)

where m > 1 is again the cycle length. An advantage of the CBB method is that

for general nonlinear functions, the stepsize is given by the simple formula (2.51)

in contrast to the nontrivial optimization problem associated with the steepest

descent step.

In this section we prove R-linear convergence for the CBB method. In [92],

it is proposed that R-linear convergence for the BB method applied to a general

nonlinear function could be obtained from the R-linear convergence results for a

quadratic by comparing the iterates associated with a quadratic approximation

to the general nonlinear iterates. In our R-linear convergence result for the CBB

method, we make such a comparison.

The CBB iteration can be expressed as


Xk+1 = Xk akgk, (2.54)

where
8T i
ak Si i =v(k), and v(k) = m[(k )/m], (2.55)

k > 1. For rE R, [r] denotes the largest integer j such that j < r. We assume

that f is two times Lipschitz continuously differentiable in a neighborhood of

a local minimizer x* where the Hessian H = V2f(*) is positive definite. The









second-order Taylor approximation f to f around x* is given by

1
f(x) f(x*) + (x x*)TH(x x*). (2.56)
2

We will compare an iterate Xk+j generated by (2.54) to a CBB iterate Xk,j

associated with f and the starting point Xk,0 Xk. More precisely, we define:


Xk,O = Xk

k,j+1 k,j ak,jgkj, j> 0, (2.57)

where
Ok if v(k + j) = v(k)
ak,j T T
dk, i = v(k +j), otherwise.

Here k+j = k,j+1 Xk,j, k,j = H(xk,j x*), and yk+j g= k,j+l 9k,j-

We exploit the following result established in [29, Thm. 3.2]:

Lemma 3 Let {Xk,j : j > 0} be the CBB iterates associated with the starting point

Xk,o Xk and the quadratic f in (2.56), where H is positive /. I;, Given two
arbitrary constants C2 > C1 > 0, there exists a positive integer N with the following

1", '/' 11 For ,:; k > 1 and

ck,0 E [C1,C2], (2.58)


1
l.i X*ll __< |.,, X*||.
2
In our next lemma, we estimate the distance between Xk,j and Xk+j. Let

Bp(x) denote the ball with center x and radius p. Since f is two times Lipschitz

continuously differentiable and V2f(x*) is positive definite, there exists positive

constants p, A, and A2 > A1 such that


I|Vf(x) H(x x*)|| < A|lx- x* |2 for all x e B,(x*)


(2.59)









and
yTV2fq)y
A < Y V y< A for all y c Rn and x e B,(x*). (2.60)

Notice that if xi and x{+1 E Bp(x*), then the fundamental theorem of calculus

applied to yi = gi+ gi yields

1 s s 1
A < < (2.61)
A2- s STi A

Hence, when the CBB iterates lie in Bp(x*), the condition (2.58) of Lemma 3 is

satisfied with C1 = 1/A2 and C2 = 1/A1. If we define g(x) = Vf(x)T, then the

fundamental theorem of calculus can also be used to deduce that


11g(x)11 = 1g(x) g(x*) _< A,|.r,- x* 1 (2.62)

for all x c B,(x*).

Lemma 4 Let {xj : j > k} be a sequence generated by the CBB method applied to

a function f with a local minimizer x*, and assume that the Hessian H = V2f(x*)

is positive I;,..:'/ with (2.60) -,/l.:-/. Then for i,.,, fixed positive integer N, there

exist positive constants 6 and 7 with the following j .pI* '/; For .; Xk E B6(x*),

ak e [A2, A1], Alf [0, N] with

1
,, x*|| > .11,,- x*| for allj C [0, max{0, 1}], (2.63)
2

we have

Xk+j E Bp(x*) and I ,+j k,4j\ <-' II,, x*112 (2.64)

for allj e [0, ].

Proof. Throughout the proof, we let c denote a generic positive constant,

which depends on fixed constants such as N or A1 or A2 or A, but not on either

k or the choice of xk E B6(x*) or the choice of ak E [A2 1, Al ]. To facilitate the









proof, we also show that


|g(xk+j) gxk j) < I, 2, (2.65)

| ,+j|| < 1|| *||, (2.66)

Iak+j k,j| < (| -x*||, (2.67)

for all j e [0, ], where g(x) = Vf(x)T H(x x*).

The proof of (2.64)-(2.67) is by induction on f. For = 0, we take 6 = p. The

relation (2.64) is trivial since Xk,0 Xk. By (2.59), we have


||g(xk) g(xk,0o)l g(xk) g(xk)\\ < All jX* 2,

which gives (2.65). Since 6 = p and xk c Bs(x*), it follows from (2.62) that

A2
A,

which gives (2.66). The relation (2.67) is trivial since i^k,o = ak.

Now, proceeding by induction, suppose that there exist L E [1, N) and 6 > 0

with the property that if (2.63) holds for any f [0, L 1], then (2.64)-(2.67) are

satisfied for all j E [0, f]. We wish to show that for a smaller choice of 6 > 0, we

can replace L by L + 1. Hence, we suppose that (2.63) holds for all j E [0, L]. Since

(2.63) holds for all j E [0, L 1], it follows from the induction hypothesis and (2.66)

that
L
I',++l- *||ll < -X*\ +Y I,+|
i=0
< (, x* |. (2.68)

Consequently, by choosing 6 smaller if necessary, we have Xk+L+1 c Bp(x*) when

Xk c B6(x*).









By the triangle inequality,


S+L+1 k- k,L+l

'' +L ak+Lg(Xk+L) [k,L k,L9(k,L) II

< 'i+L Xk,L + ak,L g(x k+L) g(x k,L)

+Iak+L ak,L\ \g(xk+L)||. (2.69)

We now analyze each of the terms in (2.69). By the induction hypothesis, the

bound (2.64) with j =L holds, which gives


I' +L- Xk,L < x 2. (2.70)

By the definition of 6, either ak,L = ak E [A2 A ], or


6k,L v(k + L).
Ysi i

In this latter case,
1 < is 5T 5si 1
A2- sHsi S i- A'
Hence, in either case 6k,L e [A21, A1A]. It follows from (2.65) with j =L that

1
ak,L 9 g(xk+L) gxk,L) I < A| g(xk+L) g(xk,L)\

< 1 x*2. (2.71)

Also, by (2.67) with j =L and (2.62), we have


La-k+L 6,L| g (Xk+L) < II '( X* I +L X*||.

Utilizing (2.68) (with L replaced by L 1) gives


Iak+L ak,L g(Xk+L) << -X12. (2.72)








We combine (2.69)-(2.72) to obtain (2.64) for j = L + 1. Notice that in establishing
(2.64), we exploited (2.65)-(2.67). Consequently, to complete the induction step,
each of these estimates should be proved for j = L + 1.
Focusing on (2.65) for j =L + 1, we have

IIg(Xk+L+1) g(k,L+1)\\

< llg(xkf+L+1) (xk+L+1) + (k+L+1) gk,L+1)

= |g(xk+L+1) -(Xk+L+ 11 + H(xk+L+ 1 Xk,L+1)11

< IIg(Xk+L+1) H(Xk+L+1 X*)l + A | +L+1 k~,L+1

< IIg(Xk+L+1) H(Xk+L+1 X*) I + I ', X* 2,

since IIHII < A2 by (2.60). The last inequality is due to (2.64) for j L + 1, which
was just established. Since we chose 6 small enough that Xk+L+1 E Bp(x*) (see
(2.68)), (2.59) implies that

IIg(Xk+L+1) H(Xk+L+1- X*)l < A1 '' +L+1 X* 2 < I X12.

Hence, IIg(xk+L+1) -g(xk,L+1)\ < II -X*2, which establishes (2.65) for j L + 1.
Observe that ak+L+1 either equals ak E [A21, A1], or (sTsi)/(sTyi), where
k + L > i = v(k + L + 1) > k. In this latter case, since xk+j E Bp(x*) for
0 < < L + 1, it follows from (2.61) that

1
ak+L+1 < A

Combining this with (2.62), (2.68), and the bound (2.66) for j < L, we obtain

+L+1 L \ak+L+lg9(k+L+1) 1 < L I +L+ -'* .-

Hence, (2.66) is established for j = L + 1.
Finally, we focus on (2.67) for j = L + 1. If v(k + L + 1) v(k), then

Ok,L+1 = ak+L+1 = ak, so we are done. Otherwise, v(k + L + 1) > v(k), and there









exists an index i E (0, L] such that


T
Sk+iSk+i
ak+L+1 T
Sk+iYk+i


and (k,L+1


By (2.64) and the fact that i < L, we have


I- +i k+i11 < 12.


Combining this with (2.66), and choosing 6 smaller if necessary, gives


S j 2sT+(Sk+i
Sk+iSk+i k+iSk+i 2 k+iSk+i

Since (ik,i E [A2 1, A ], we have


Sk+i) -'+i Sk+i1 2 1 ( I


x*ll3. (2.73)


-'+ill


1
lk,i9k,ill |> II H(xk,i
A2


Furthermore, by (2.63) it follows that


-+il| > A II'., x* I
2A2


Hence, combining (2.73) and (2.74) gives


SSkT+iSk+i
Sk+iSk+i


T T
Sk+iSk+i k+iSk+i <
T ^
k+iSk+i


Now let us consider the denominators of ak+i and ak,i. Observe that


T T
5k+iYk+i k- k+i+i


k+i(Yk+i k+i) + (Sk+i

k+i(Yk+i k+i) + (Sk+i


Sk+i)T&k+i


Sk+i)THSk+i.


By (2.64) and (2.66), we have


\(Sk+i Sk+i)THSk+i


\(Sk+i Sk+i)THSk+i


(Sk+i Sk+i)TH(Sk+i


< (I', l3


T
Sk+ik+i
^T i
Sk+i k+i


A1
A2


x*)ll


X*ll.


A1
2A2


(2.74)


( |', *||.


(2.75)


(2.76)


Sk+i)\


(2.77)









for 6 sufficiently small. Also, by (2.65) and (2.66), we have


Isk (Yk+i k+i) < +i ( 'l+i+1- gk,i+l + I '+zi ) < (1' -x |3. (2.78)

Combining (2.76)-(2.78) yields

T y T (2.79)
k+iYk+z +ik+i -< I 3. (2.79)

Since Xk+i and Xk+i+l c Bp(x*), it follows from (2.60) that


S+iYk+i = s+i(gk+i+l gk+i) > A1I +||2 A lak+i2 gk+i l2. (2.80)

By (2.61) and (2.60), we have


lak+i12 lk+i 2 > gk+i2 A )2 -g(kxi) (*)2 > A2,,+ x 2. (2.81)

Finally, (2.63) gives


S X* 12 > I X* 2
S


(2.82)


Combining (2.80)-(2.82) yields


T A3
S +iYk+i > 42 XI2
4A2

Combining (2.79) and (2.83) gives

T T T
Sk+ik+i k+ +ik i< ,, x*kl.
Sk+iY+i Ski+ik+i
T T


(2.83)


(2.84)


Observe that


Itk+L+1 dk,L+1l


ST T
k+iSk+i k+iSk+i
TT T
k+iYk+i k+iSk+i
T GT
1 Sk+iSk+i\ k+i+i
= k,L+l 1 T- T T
Sk+iSk+i \k+ik+i
< 1 (S Tk+i) (T +iYk+i
S Sk+iSk+i Sk+iYk+i

Sa(1 b) + bl < -(la + lbl + lab),
A A


(2.85)









where
T T
a=- +ik+i and b k+ik+i
T T
Sk+iSk+i sk+iYk+i
Together, (2.75), (2.84), and (2.85) yield


|ak+L+1 (k,L+1 < II x*l

for 6 sufficiently small. This completes the proof of (2.64)-(2.67). D

Theorem 8 Let x* be a local minimizer of f, and assume that the Hessian

V2f(x*) is positive 1. fI;,'.: Then there exist positive constants 6 and 7, and a

positive constant c < 1 with the j, *I*/' '/, that for all starting points xo, X1 E Bs(x*),

xo / xl, the CBB iterates generated by (2.54)-(2.55) i/.l;rfy

S,, -x* < 7Ck 1X1 -X* .

Proof. Let N > 0 be the integer given in Lemma 3, corresponding to

C1 = Al1 and C2 = A21, and let 61 and 71 denote the constants 6 and 7 given in

Lemma 4r. Let 72 denote the constant c in (2.66). In other words, these constant

61, 71, and 72 have the property that whenever i, x*| < <61, ak E [A2 Al ], and

1
i, ,-x*|| > ll,,,-x*|l for0 2

we have


'+j ,J < 71 i' X* 2, (2.86)

-, || < 1 ,, *||, (2.87)

xk+j C B,(x*), (2.88)

for all j E [0, f]. Moreover, by the triangle inequality and (2.87), it follows that


S+j | < (N2+ 1) x*11

73 -x*, 73- (72 + 1), (2.89)









for all j E [0, ]. We define


6 = min{jl, p, (471)-1}. (2.90)


For any xo and xi E Bs(x*), we define a sequence 1 = kl < k2 < ... in the

following way: Starting with the index k 1, let jl > 0 be the smallest integer

with the property that

1 1
I.,t',jl X*|| < II||.'1 o X*|l = ||1 i X*|l.
2 2

Since xo and xl E Bs(x*) C Bp(x*), it follows from (2.61) that

T 0
So Yo

By Lemma 3, ji < N. Define k2 k + jl > kl. By (2.86) and (2.90), we have


I X, 1 lA, +j X*11 < I l., +j1 X ijill + l j-1A j X*I

< 71II. X* l2 + I. 0 *ll

S71I.~- X* +2 + t 11 -
2
< 311. *|. (2.91)
-4

Since 1||x x*l| < 6, it follows that xk2 E B6(x*). By (2.88), xj E B,(x*) for

1 < < ki.

Now, proceed by induction. Assume that ki has been determined with xkj E

Ba(x*) and xj e Bp(x*) for 1 < j < ki. Let j, > 0 be the smallest integer with the

property that
1 1
I,, *11 < 1 x*|11 |,, x*11.
2 2

Set ki+l = ki + ji > ki. Exactly as in (2.91), we have

3
I,, x* < ',, x*|.
4









Again, xk E B6(x*) and xj e B,(x*) for j e [1, k~i+].

For any k E [ki, ki+l), we have k < ki + N 1 < Ni, since k < N(i 1) + 1.

Hence, i > k/N. Also, (2.89) gives


I, < 3 ,, *

_< 73 t, -11*|
(43 (k/N)II-
73(4) x11

7c II ,'1 *I,

where

7 73 and c () < 1.
3/ 4
This completes the proof. O

2.2.3 Method for Convex Quadratic Programming

In this section, we give numerical evidence which indicates that when m is

sufficiently large, the CBB method is superlinearly convergent for a quadratic

function

f(x) = TAx bTx, (2.92)
2
where AE R""' is symmetric, positive definite and b E c". Since CBB is invariant

under an orthogonal transformation and since gradient components corresponding

to identical eigenvalues can be combined (see for example Dai and Fletcher [30]),

we assume without loss of generality that A is diagonal:

A = diag(Ai, A2,..., AX ) with 0 < A1 < A2 < ... < A. (2.93)

In the quadratic case, it follows from (2.49) and (2.92) that


9k+1 = (I akA)gk.


(2.94)









Table 2-3: Transition to superlinear convergence

n 2 3 4 5 6 8 10 12 14
superlinear m 1 3 2 4 4 5 6 7 8
linear m 2 1 3 3 4 5 6 7


If g) denotes the i-th component of the gradient gk, then by (2.94) and (2.93), we
have

S(1- aki)g i 1,2,...,n. (2.95)

We assume that g) / 0 for all sufficiently large k. If g) = 0, then by (2.95) com-

ponent i remains zero during all subsequent iterations; hence it can be discarded.

In the BB method, starting values are needed for x0 and xl in order to com-

pute ac. In our study of CBB, we treat ac as a free parameter. In our numerical

experiments, we choose ac to be the exact stepsize.

For different choices of the diagonal matrix (2.93) and the starting point xl,

we have evaluated the convergence rate of CBB. By the analysis given in [58]

for positive definite quadratics, or by the result given in Theorem 8 for general

nonlinear functions, the convergence rate of the iterates is at least linear. On

the other hand, for m sufficiently large, we observe experimentally, that the

convergence rate is superlinear. For fixed dimension n, the value of m where the

convergence rate makes a transition between linear and nonlinear is shown in Table

2-3. More precisely, for each value of n, the convergence rate is superlinear when m

is great than or equal to the integer given in the second row of the Table 2-3. The

convergence is linear when m is less than or equal to the integer given in the third

row of Table 2-3.

The limiting integers appearing in Table 2-3 are computed in the following

way: For each dimension, we randomly generate 30 problems, with eigenvalues uni-

formly distributed on (0, n], and 50 starting points -a total of 1500 problems. For

each test problem, we perform 1000n CBB iterations, and we plot log(log(||l/, || I))









versus the iteration number. We fit the data with a least squares line, and we com-

pute the correlation coefficient to determine how well the linear regression model

fits the data. If the correlation coefficient is 1 (or -1), then the linear fit is perfect,

while a correlation coefficient of 0 means that the data is uncorrelated. A linear

fit in a plot of log(log( ||l/,.|| )) versus the iteration number indicates superlinear

convergence. For m large enough, the correlation coefficients are between -1.0

and -0.98, indicating superlinear convergence. As we decrease m, the correlation

coefficient abruptly jumps to the order of -0.8. The integers shown in Table 2-3

reflect the values of m where the correlation coefficient jumps.

Based on Table 2-3, the convergence rate is conjectured to be superlinear for

m > n/2 > 3. For n < 6, the relationship between m and n at the transition be-

tween linear and superlinear convergence is more complicated, as seen in Table 2-3.

Graphs illustrating the convergence appear in Figure 2-3. The horizontal axis in

these figures is the iteration number, while the vertical axis gives log(log(lli/ || I)).

Here I|| |, represents the sup-norm. In this case, straight lines correspond to super-

linear convergence -the slope of the line reflects the convergence order. In Figure

2-3, the bottom two graphs correspond to superlinear convergence, while the top

two graphs correspond to linear convergence -for these top two examples, a plot

of log( ||lg l) versus the iteration number is linear. For the theoretical verification

of the experimental results given in Table 2-3 is not easy. However, some partial

result in connection with n = 3 and m = 2 can be theoretically verified. For details,

one may refer the paper [32].

2.2.4 An Adaptive CBB Method

In this section, we examine the convergence speed of CBB for different values

of m C [1, 7], using quadratic programming problems of the form:


f(x) = TAx, A diag(Ai, A,). (2.96)
2















n=3\
1` n -61'1'1


o10 20 0 40 500 60 70 0 90 10 o 500 1000 1500 2000 2500 3000
(a) (b)

Figure 2-3: Graphs of log(log(ll/l || ,)) versus k, (a) 3 < n < 6 and m = 3, (b) 6 <
n < 9 and m = 4.

We will see that the choice for m has a significant impact on performance. This

leads us to propose an adaptive choice for m. The BB algorithm with this adaptive

choice for m and a nonmonotone line search is called ACBB. Numerical compar-

isons with SPG2 and with conjugate gradient codes using the CUTEr test problem

library are given in the numerical comparisons section.

A numerical investigation of CBB method. We consider the test problem (2.96)

with four different condition numbers C for the diagonal matrix: C = 102, C 103,

C = 104, and C = 105; and with three different dimensions n = 102, n = 103,

and n = 104. We let A1 1, XA = C, the condition number. The other diagonal

elements Ai, 2 < i < n 1, are randomly generated on the interval (1, A,). The

starting points x ), i = 1, n, are randomly generated on the interval [-5, 5].

The stopping condition is

'/11,2 < o10-8

For each case, 10 runs are made and the average number of iterations required by

each algorithm is listed in Table 2-4 (under the columns labeled BB and CBB).

The upper bound for the number of iterations is 9999. If this upper bound is

exceeded, then the corresponding entry in Table 2-4 is F.









Table 2-4: Comparing CBB(m) method with an adaptive CBB method

BB CBB adaptive
n cond m=2 m=3 m=4 m=5 m=6 m=7 M=5 M=10
102 102 147 219 156 145 150 160 166 136 134
103 505 2715 468 364 376 395 412 367 349
104 1509 F 1425 814 852 776 628 878 771
105 5412 F 5415 3074 1670 1672 1157 2607 1915
103 102 147 274 160 158 162 166 181 150 145
103 505 1756 548 504 493 550 540 481 460
104 1609 F 1862 1533 1377 1578 1447 1470 1378
105 5699 F 6760 7-". 3506 3516 2957 4412 3187
104 102 156 227 162 166 167 170 187 156 156
103 539 3200 515 551 539 536 573 497 505
104 1634 F 1823 1701 1782 1747 1893 1587 1517
105 6362 F 6779 5194 i1''. 4349 17 ;1. 4687 4743


In Table 2-4 we see that m = 2 gives the worst numerical results -in the

previous subsection, we saw that as m increases, convergence became superlinear.

For each case, a suitably chosen m drastically improves the efficiency the BB

method. For example, in case of n = 102 and cond = 105, CBB with m = 7 only

requires one fifth of the iterations of the BB method. The optimal choice of m

varies from one test case to another. If the problem condition is relatively small

(cond = 102, 103), a smaller value m (3 or 4) is preferred. If the problem condition

is relatively large (cond = 104, 105), a larger value of m is more efficient. This

observation is the motivation for introducing an adaptive choice for m in the CBB

method.

Our adaptive idea arises from the following considerations. If a stepsize is used

infinitely often in the gradient method; namely, ak a- c, then under the assumption

that the function Hessian A has no multiple eigenvalues, the gradient gk must

approximate an eigenvector of A, and gTAgk/g gk tends to the corresponding

eigenvalue of A, see [29]. Thus, it is reasonable to assume that repeated use of a

BB stepsize leads to good approximations of eigenvectors of A. First, we define

Tn.
Vk rA9 (2.97)
II. I 11 IIA .,, I1









If gk is exactly an eigenvector of A, we know that vk = 1. If lk t 1, then gk can be
regarded as an approximation of an eigenvector of A and afB B aD. In this case,

it is worthwhile to calculate a new BB stepsize fa so that the method accepts a

step close to the steepest descent step. Therefore, we test the condition


_>k > 3, (2.98)

where 3 e (0, 1) is constant. If the above condition holds, we calculate a new BB

stepsize. We also introduce a parameter M, and if the number of cycles m > f,

we calculate a new BB stepsize. Numerical results for this adaptive CBB with

3 = 0.95 are listed under the column adaptive of Table 2-4, where two values

M = 5, 10 are tested.

From Table 2-4, we see that the adaptive strategy makes sense. The perfor-

mance with M = 5 or M = 10 is often better than that of the BB method. This

motivates the use of a similar strategy for designing an efficient gradient algorithms

for unconstrained optimization.

Nonmonotone line search and cycle number. As we saw in previous sections,

the choice of the stepsize ak is very important for the performance of a gradient

method. For the BB method, function values do not decrease monotonically.

Hence, when implementing BB or CBB, it is important to use a nonmonotone line

search.

Assuming that dk is a descent direction at the k-th iteration (gkTdk < 0), a

common termination condition for the steplength algorithm is


f(xk + akdk) < fr + 6akgk dk, (2.99)

where f, is the so-called reference function value and 6 E (0, 1) a constant.

If f, = f(xk), then the line search is monotone since f(xk+l) < f(xk). The

nonmonotone line search proposed in [67] chooses f, to be the maximum function









value for the M most recent iterates. That is, at the k-th iteration, we have


fr= fmax Imax f(xk-i). (2.100)
0
This nonmonotone line search is used by Raydan [108] to obtain GBB. Dai and

Schittkowski [33] extended the same idea to a sequential quadratic programming

method for general constrained nonlinear optimization. An even more adaptive

way of choosing f, is proposed by Toint [118] for trust region algorithms and then

extended by Dai and Zhang [39]. Compared with (2.100), the new adaptive way of

choosing f, allows big jumps in function values, and is therefore very suitable for

the BB algorithm (see [30], [31], and [39]).

The numerical results which we report in this section are based on the

nonmonotone line search algorithm given in [39]. The line search in this paper

differs from the line search in [39] in the initialization of the stepsize. Here, the

starting guess for the stepsize coincides with the prior BB step until the cycle

length has been reached; at which point we recompute the step using the BB

formula. In each subsequent subiteration, after computing a new BB step, we

replace (2.99) by

f(xk + Okdk) < min{fmax, fr} + cak9gdk,

where f, is the reference value given in [39] and ak is the initial trial stepsize (the

previous BB step). It is proved in [39, Thm. 3.2] that the criteria given in [39] for

choosing the nonmonotone stepsize ensures convergence in the sense that


lim inf I|1 || 0.
k-*oo

We now explain how we decided to terminate the current cycle, and recompute

the stepsize using the BB formula. Notice that the reinitialization of the stepsize

has no effect on convergence, it only effects the initial stepsize used in the line









search. Loosely, we would like to compute a new BB step in any of the following

cases:

R1. The number of times m the current BB stepsize has been reused is sufficiently

large: m > M, where M is a constant.

R2. The following nonquadratic analogue of (2.98) is satisfied:

T
skk > (2.101)
i ', II |2

where 3 < 1 is near 1. We feel that the condition (2.101) should only be used

in a neighborhood a local minimizer, where f is approximately quadratic.

Hence, we only use the condition (2.101) when the stepsize is sufficiently

small:

2 < min{ c ,1}, (2.102)
IIgk+lll
where cl is a constant.

R3. The current step Sk is sufficiently large:


I', |1 > max{c2 k+ 1}, (2.103)
-- .,I' +lI 'loo

where c2 is a constant.

R4. In the previous iteration, the BB step was truncated in the line search. That

is, the BB step had to be modified by the nonmonotone line search routine to

ensure convergence.

Nominally, we recompute the BB stepsize in any of the cases R1-R4. One case

where we prefer to retain the current stepsize is the case where the iterates lie in

a region where f is not strongly convex. Notice that if sTyk < 0, then there exists

a point between xk and Xk+l where the Hessian of f has negative eigenvalues. In

detail, our rules for terminating the current cycle and reinitializing the BB stepsize

are the following:









Cycle termination/Stepsize initialization.

T1. If any of the condition Ri through R4 are satisfied and sTyk > 0, then the

current cycle is terminated and the initial stepsize for the next cycle is given

by
5kSk
ak+1 = max{Omin, mm T, amax},
k Yk
where amin < amax are fixed constants.

T2. If the length m of the current cycle satisfies m > 1.5M, then the current cycle

is terminated and the initial stepsize for the next cycle is given by


ak+1 max{/ll gk+lll ak}

Condition T2 is a safeguard for the situation where sTyk < 0 in a series of

iterations.

2.2.5 Numerical Comparisons

In this subsection, we compare the performance of our adaptive cyclic BB

stepsize algorithm, denoted ACBB, with the SPG2 algorithm of Birgin, Martinez,

and Raydan [10, 11], with the PRP+ conjugate gradient code developed by Gilbert

and Nocedal [60], and with the CG_DESCENT code of Hager and Zhang [73, 74].

The SPG2 algorithm is an extension of Raydan's [108] GBB algorithm which was

downloaded from the TANGO web page maintained by Ernesto Birgin. In our

tests, we set the bounds in SPG2 to infinity. The PRP+ code is available at the

Nocedal's web page. The CGDESCENT code is found at the author's web page.

The line search in the PRP+ code is a modification of subroutine CSRCH of Mor4

and Thuente [101], which employs various polynomial interpolation schemes and

safeguards in satisfying the strong Wolfe conditions. CGDESCENT employs an

111 o i ::;in ii. Wolfe" line search. All codes are written in Fortran and compiled

with f77 under the default compiler settings on a Sun workstation. The parameters

used by CGDESCENT are the default parameter value given in [74] for version 1.1









of the code. For SPG2, we use parameter values recommended on the TANGO web

page. In particular, the step length was restricted to the interval [10-30, 1030], while

the memory in the nonmonotone line search was 10.

The parameters of the ACBB algorithm are amin = 10-30, max = 1030,

C = c2 = 0.1, and M = 4. For the initial iteration, the starting stepsize for the line

search was al = 1/||gil|,. The parameter values for the nonmonotone line search

routine from [39] were 6 = 10-4, 1 = 0.1, 2 = 0.9, / 0.975, L 3, M =8, and

P 40.

Our numerical experiments are based on the entire set of 160 unconstrained

optimization problem available from CUTEr in the Fall, 2004. As explained in [74],

we deleted problems that were small, or problems where different solvers converged

to different local minimizers. After the deletion process, we were left with 111 test

problems with dimension ranging from 50 to 104.

Nominally, our stopping criterion was the following:


I Vf(Xk)ll < max{10-6, 10-12 Vf(xo) l}. (2.104)

In a few cases, this criterion was too lenient. For example, with the test problem

PENALTYI, the computed cost still differs from the optimal cost by a factor of

105 when the criterion (2.104) is satisfied. As a result, different solvers obtain

completely different values for the cost, and the test problem would be discarded.

By changing the convergence criterion to I Vf(xk)ll < 10-6, the computed

costs all agreed to 6 digits. The problems for which the convergence criterion was

strengthened were DQRTIC, PENALTY, POWER, QUARTC, and VARDIM.

The CPU time in seconds and the number of iterations, function evaluations,

and gradient evaluations for each of the methods are posted at the author's web

site. Here we analyze the performance data using the profiles of Dolan and Mor4

[43]. That is, we plot the fraction p of problems for which any given method is










P
1
CG DESCENT -
09

08-

0"7 ACBB
06 -
PRP+


04

03

0 2 SPG2

01-

0 t
1 4 16

Figure 2-4: Performance based on CPU time



within a factor r of the best time. In a plot of performance profiles, the top curve

is the method that solved the most problems in a time that was within a factor

'- of the best time. The percentage of the test problems for which a method is

the fastest is given on the left axis of the plot. The right side of the plot gives

the percentage of the test problems that were successfully solved by each of the

methods. In essence, the right side is a measure of an algorithm's robustness.

In Figure 2-4, we use CPU time to compare the performance of the four codes

ACBB, SPG2, PRP+, and CG_DESCENT. Note that the horizontal axis in Figure

2-4 is scaled proportional to log2(r). The best performance, relative to the CPU

time metric, was obtained by CGDESCENT, the top curve in Figure 2-4, followed

by ACBB. The horizontal axis in the figure stops at r = 16 since the plots are

essentially flat for larger values of '-. For this collection of methods, the number of

times any method achieved the best time is shown in Table 2-5. The column total


in Table 2-5 exceeds 111 due to ties for some test problems.









Table 2-5: Number of times each method was fastest (time metric, stopping crite-
rion (2.104))

Method Fastest
CG_DESCENT 70
ACBB 36
PRP+ 9
SPG2 9


The results of Figure 2-4 indicate that ACBB is much more efficient than

SPG2, while it performed better than PRP+, but not as well as CG_DESCENT.

fFrom the experience in [108], the GBB algorithm, with a traditional nonmonotone

line search [66], may be affected significantly by nearly singular Hessians at

the solution. We observe that nearly singular Hessians do not affect ACBB

significantly. In fact, Table 2-4 also indicates that ACBB becomes more efficient

as the problem becomes more singular. Furthermore, since ACBB does not need

to calculate the BB stepsize at every iteration, CPU time is saved, which can

be significant when the problem dimension is large. For this test set, we found

that the average cycle length for ACBB was 2.59. In other words, the BB step is

reevaluated after 2 or 3 iterations, on average. This memory length is smaller than

the memory length that works well for quadratic function. When the iterates are

far from a local minimizer of a general nonlinear function, the iterates may not

behave like the iterates of a quadratic. In this case, better numerical results are

obtained when the BB-stepsize is updated more frequently.

Even though ACBB did not perform as well as CG_DESCENT for the com-

plete set of test problems, there were some cases where it performed exceptionally

well (see Table 2-6). One important advantage of the ACBB scheme over conju-

gate gradient routines such as PRP+ or CGDESCENT is that in many cases, the

stepsize for ACBB is either the previous stepsize or the BB sizesize (2.51). In con-

trast, with conjugate gradient routines, each iteration requires a line search. Due to









Table 2-6: CPU times


Problem Dimension ACBB CGDESCENT
FLETCHER 5000 9.14 989.55
FLETCHER 1000 1.32 27.27
BDQRTIC 1000 .37 3.40
VARDIM 10000 .05 2.13
VARDIM 5000 .02 .92


the simplicity of the ACBB stepsize, it can be more efficient when the iterates are

in a regime where the function is irregular and the .,-i- i,! i. i c convergence prop-

erties of the conjugate gradient method are not in effect. One such application is

bound constrained optimization problems -as components of x reach the bounds,

these components are often held fixed, and the associated partial derivative change

discontinuously. In C'! lpter 3, ACBB is combined with CGDESCENT to obtain a

very efficient active set algorithm for box constrained optimization problems.

2.3 Self-adaptive Inexact Proximal Point Methods

2.3.1 Motivation and the Algorithm

The convergence rate of algorithms for solving an unconstrained optimiza-

tion problem (2.1) often depends on the eigenvalues of the Hessian matrix at a

local minimizer. As the ratio between largest and smallest eigenvalues grows,

convergence rates can degrade. In this section, we propose a class of self-adaptive

proximal point methods for ill conditioned problems where the smallest eigenvalue

of the Hessian can be zero, and where the solution set X may have more than one

element [77].

The proximal point method is one strategy for dealing with degeneracy at a

minimum. The iterates xk, k > 1, are generated by the rule:


Xk+1 G arg min (Fk(x) : x E },


for selected problems


(2.105)









where
1
Fk(x) f=(X) + "x x, :.
2
Here xo E R' is an initial guess for a minimizer and the parameters pk, k > 0,

are positive scalars. Due to the quadratic term, Fk is strictly convex at a local

minimizer. Hence, the proximal point method improves the conditioning at the

expense of replacing the single minimization (2.1) by a sequence of minimizations

(2.105).

The proximal point method, first proposed by Martinet [96, 97], has been

studied in many papers including [69, 95, 84, 111, 112]. In [112] Rockafellar shows

that if f is strongly convex at a solution of (2.1), then the proximal point method

converges linearly when ik is bounded away from zero, and superlinearly when

PIk tends to zero. Here we develop linear and superlinear convergence results for
problems that are not necessarily strongly convex.

Since the solution to (2.105) approximates a solution to (2.1), we do not need

to solve (2.105) exactly. We analyze two criteria for the accuracy with which we

solve (2.105). The first criterion is that an iterate xk+1 is acceptable when


(Cl) Fk(Xk+1) < /(xk) and VFk(Xk+) < Vf(xk).

The second acceptance criterion is


(C2) Fk(xk+1) < f(xk) and |VFk(xk+l) 1< 0,,, H,+ x,


where 0 < 1/-/2. In either case, we show, for rk sufficiently small, that


I +1- Xk+1 < C, |Ix, -x |, (2.106)

where C is a constant depending on local properties of f and x is the projection of

x e R' onto X:


Ix XII min Il xl.
xEX









Since f is continuous, the set of minimizers X of (2.1) is closed, and the projection
exists. By taking Pk = Vf(k)ll in (2.106), we obtain quadratic convergence of the

approximate proximal iterates to the solution set X, while the sequence of iterates

approaches a limit at least linearly.

In [112] Rockafellar studies the acceptance condition


||VFk(xk+1)ll
where Ek 6k < oo (see his condition B'). Our criterion (C2) corresponds to the case

Ek Ck = co. (C2) is also studied in [82] where the authors give an introduction to
proximal point algorithm, borrowing ideas from descent methods for unconstrained

optimization. In [82] global convergence for convex functions is established, while

here we obtain local convergence rates for general nonlinear functions.
2.3.2 Local Error Bound Condition

Our analysis of the approximate proximal iterates makes use of a local error

bound condition employ, -1 when the Hessian is singular at a minimizer of f see

[54, 119, 127, 126]. Referring to [54] and [127], we have the following terminology:

Vf provides a local error bound at i E X if there exist positive constants a and p
such that

||Vf(x)|| > ( I| xll whenever Ix xl| < p. (2.107)

Using this condition, Yamashita and Fukushima [127], and Fan and Yuan [54],

study the Levenberg-Marquardt method to solve a system of nonlinear equations.

When their approach is applied to (1.1), the following linear system is solved in

each subproblem:

(H(xk)2 + kI)d + H(xk)g(xk) = 0, (2.108)

where pk > 0 is the regularization parameter and d is the search direction at

step k. In [54] and [127], the authors choose ik = g(xk)ll and lk = I(xk) 2,









respectively. They show that if Vf(x) provided a local error bound, then the

iterates associated with (2.108) are locally quadratically convergent.

Li, Fukushima, Qi and Yamashita point out in [88] that the linear system of

equations (2.108) may lose sparsity when H(xk) is squared; moreover, squaring

the matrix squares the condition number of H(xk). Hence, they consider a search

direction d chosen to satisfy:


(H(xk) + kI)d + g(xk) 0, (2.109)

where ik = C g(xk) for some constant c > 0. When f(x) is convex and Vf(x)

provides a local error bound, they establish a local quadratic convergence result for

iterates generated by the approximate solution of (2.109) followed by a line search

along the approximate search direction.

When the problem dimension is large, computing the Hessian and solving

(2.109) can be expensive. In our approach, we use our newly developed conjugate

gradient routine CG_DESCENT [73, 72] to solve (2.105) with either stopping criterion

(Cl) or (C2).

First, by observing that the minimum of Fk is bounded from above by Fk(xk),

we can easily get the following proposition.

Proposition 1 If f is continuous and its set of minimizers X is non, ijil,, then

for each k, Fk has a minimizer xk+1 and we have


I ,+1 Xk1 < X, Xk|

Proof. The proposition follows directly from

1 1
Fk(xk+,) = f(xk+,) + -k I X,+1 < Fk(x) f(xk)) + i x, -x, 1
2 2
< f(x) + pk IX, 1
2









In our approach, it is more convenient to employ a local error bound based on

the function value rather than the function gradient used in (2.107). We -w that f

provides a local error bound at x E X if there exist positive constants a and p such

that

f(x) f(x) > aj|x x1|2 whenever |x i|| < p. (2.110)

We now show that under certain smooth assumption, these two conditions are

equivalent:

Lemma 5 If f is twice continue.;, l differentiable in a neighborhood of x X,

then f provides a local error bound at x is equivalent to Vf provides a local error

bound at i.

Proof. Suppose f provides a local error bound at x E X with positive

scalars a and p satisfying (2.110). C'! .... p smaller if necessary so that f is twice

continuously differentiable in Bp(x). Define r = p/2. Given x E Br(), the triangle

inequality implies that

IIx- xll < II x|l + IIx- xll < 2r p. (2.111)

Since both x and x E Bp,(x), we can expand f(x) in a Taylor series around x and

apply (2.110) to obtain:

a||x- x1|2 < f(x)- f(x)

( X- x)TH(x x) + R2(x, x), (2.112)
2

where R2 is the remainder term and H = V2f() is the Hessian at x. C'! ....- p

small enough that


IR2(x, x) llx x12 whenever x E Bp().









In this case, (2.112) gives


4a-
S|x x|| < I|H(x- x) |. (2.113)

Now expand Vf(x) in a Taylor series around x to obtain

Vf(x) = Vf(x) Vf(x) = H(x x) + Ri(x, x), (2.114)

where R1 is the remainder term. C'! ....- p smaller if necessary so that

IIRl(x, x)|| < I|x xll whenever x E Bp(x). (2.115)

Combining (2.113)-(2.115) yields


|V/f(x)I| > a||x- x||.

Hence, Vf provides a local error bound at x with constants a and p/2.
Conversely, suppose Vf provides a local error bound at kx E X with positive
scalars a and p satisfying (2.107). C(! ....- p smaller if necessary so that f is twice

continuously differentiable in Bp(i) and (2.115) holds. Combining the Taylor

expansion (2.114), the fact that H is positive semidefinite, the bound (2.115) on
the remainder, and the local error bound condition (2.107), we obtain

2a
| Ix- x|| < IIH(x x)| < I H1/2 11 IH1/2x x).

Squaring both sides gives

4 Ix X1l2 HIIHII IH1/2(X X)i2 = IIHII( x)TH(x x).


Consequently,

(X x)TH(X X) 4a2 x 2 > (2.116)2
(X T x)Hx x) > ||x x|| > |-x|| (2.116)
~ 9||H||) 9A )









where A is a bound for the Hessian of f over Bp(x). Similar to (2.112), we expand

f in a Taylor expansion around x and utilize (2.116) to obtain

1 222
f(x) f(x) R2(x, x) (x X)TH(X X) > PX X112, where 2a2
2 9A

To complete the proof, choose p smaller if necessary so that


|R2(x,x)l < IX X 2
2

whenever x E Bp(x).

2.3.3 Local Convergence

Convergence analysis for exact minimization. We first analyze the proximal point

method when the iterates are exact solutions of (2.105).

Theorem 9 Assume the following conditions are ,/.'17 ,I1

(El) f provides a local error bound at x CE X with positive scalars a and p
.I'-Ifying (2.110).

(E2) Pk < 3 for each k where 3 < 2a/3.
(E3) xo is close enough to x that


x|| (- + < p, where 7 2 (2.117)
1 -7 2a -

Then the proximal iterates xk are all contained in Bp(:x) and they approach a

minimizer x* E X; for each k, we have

x, x*l < Ixo xoll and
1-7
Ix,+i -X+11 < 2 I, -Xkll. (2.118)
2a pk

Proof. For j 0, we have


I|xj- ~i|| p and I|xj xj|| < 7 lxo xo||.


(2.119)







57

Proceeding by induction, suppose that (2.119) holds for all j E [0, k] and for some

k > 0. We show that (2.119) holds for all j [0, k + 1]. By Proposition 1 and the

induction hypothesis,


IX,+i x, || < |x,


, II < 7' l' x,,


xo 1.


By (E2), we have 3 < 2a/3 and hence,


23
2ac- 3


< 1.


By the triangle inequality, Proposition 1, and the induction hypothesis, it follows

that


k
Xjl| < |x 1 xj|
j=0


Again, by the triangle inequality and (2.117),


IX'+I Xi < | ,+I xo0 + Ix X| < (1


Ilxo -xo| <


t1 -7 0


x|| < p.


Observe that


IX,+I -i Xk 2 X,+I X, 1 -


< ( x,+1 Xk+1l + 211,+1 -


X, II) X +1 Xk+ ||.1


(2.121)


Combining (2.121) with the relation Fk(Xk+l) < Fk(Xk+l) gives


f(Xk+l) f(Xk+l)


1
-= k +1
<
2


Xk 12 +I -X, 1 I-)


Xk 1) Ix,+1- Xk+1| (2.122)


k
x+I- xo011 < ||x IXj+i
j=0
k
j< il
j=0


xo| <


Ilxo xll.


(2.120)


(Xk+l + Xk+l


2xk)(Xk+ Xk+l)


Xk+ 11 + 2 iX, +1









By (2.120), Xk+1E 3 p(:B(). Since f provides a local error bound at :, we conclude
that

( | x, +1 Xk+1 2 < (xk+i) (xk+1). (2.123)

Combining this with (2.122) gives

(a- 2- k + -- Xk+1 /'" i + Xk |. (2.124)

Due to the assumption Pk < 2a/3, the coefficient (a 1k) in (2.124) is positive
and

2pk/(2a k) < 23/(2a 3) 7 < 1.

Hence, (2.124), Proposition 1, and (2.119), with j = k, give

ix, +-1 ll < 2 I ,+1 Xkc< 2 k Xk | (2.125)

< X- |x, x, II < 7k+l xo xo0| (2.126)

Relations (2.120) and (2.126) complete the proof of the induction step. Relation
(2.125) gives estimate (2.118).
By (2.119) and Proposition 1, we have

X '+1 X, II < X x, < 7kjlxo xoll.

Hence, the proximal iterates Xk form a Cauchy sequence, which has a limit denoted
x*. Again, it follows from (2.119) and Proposition 1 that

X, <>11 < X*l ||< Y ||Xj+l Xj 11 < Y ||Xj Xj |
j k j k

< Z 1xo -xoll 0 1- I|xo- xo||.
j k

By (2.125)-(2.126), x* e X. O
Note that neither smoothness nor convexity assumptions enter into the
convergence results of Theorem 9. In a further extension of these results, let us









consider the case where the regularization sequence pk of Theorem 9 is expressed as
a function of the current iterate. That is, we assume that pk = p(xk) where p(-) is

defined on R".

Corollary 1 We make the following assumptions:

(Q1) f provides a local error bound at x E X with positive scalars a and p
I/.:-fying (2.110).

(Q2) Vf is Lipschitz continuo.,-l/, differentiable in Bp(x) with Lipschitz constant L.
(Q3) p is small enough that for some scalar 3, we have

2a
I Vf(x) | < P < for all x e r(xi) where r = p/2.
3

(Q4) xo is close enough to x that

2j
|xo || t1 + < r, where 7 2
I1-7) 2a- 0

Then for the choice p(x) = |Vf(x)|| and ik (= (Xk), the proximal iterates (2.105)
are all contained in Br(x) and they approach a minimizer of f. Moreover, we have

|X,+i Xk+1 < 3 IX, xk 12 (2.127)

for each k.
Proof. The proof is identical to the inductive proof of Theorem 9 except that

we append the condition pij < 3 for each j E [0, k] to the induction hypothesis
(2.119). That is, we assume that for all j E [0, k], we have

||xj- x|| < r, ||x xj,| < 'Jxo- xo l, and pj < 3. (2.128)

Since xo c B,(x), it follows by (Q3) that po = I|Vf(xo)|| < 3. Hence, (2.128) is

satisfied for j = 0. In the proof of Theorem 9, we show that if pij < P for j E [0, k],
then the first two conditions in (2.128) hold for j = k + 1. Also, as in (2.120),

Xk+1 E B (:x). Consequently, Uk+1 = IVf(xk+1i) < /3, which implies that the last









condition in (2.128) holds for j =k + 1. This completes the induction step; hence,

(2.128) holds for all j > 0.

Since Xk E Br(x), it follows that xk E Bp(x) (see (2.111)). Since xk and

Xk CE p(x), we have


Sk = Vf(xk)ll = IVf(xk) Vf(xk) | < Ll|x, x, ||, (2.129)

where L is the Lipschitz constant for Vf in Bp(x). By estimate (2.118) in Theorem

9,

X, +1 Xk+l < 2 k -Xk -.
2 a- Pk)
Using the bound (2.129) in the numerator and the bound i k < / < 2a/3 in the

denominator, we obtain (2.127). O

Convergence analysis for approximate minimization. We now analyze the situation

where the proximal point iteration (2.105) is implemented inexactly; the approx-

imation to a solution of (2.105) need only satisfy (Cl) or (C2). The following

property of a convex function is used in the analysis.

Proposition 2 If x* is a local minimizer of Fk and f is convex and continuoe;,'1

differentiable in a convex neighborhood A] of x*, then

Fk(x)< F (x*) +
lk

for all x E c .

Proof. The convexity of f in AH implies that Fk is convex in AH and

Fk(x*) > Fk(x) + VFk(x)(x* x).

Since x* is a local minimizer of Fk, we have

VFk(x)(x -x*) (VFk(x)- VFk(x*))(x-x*)

(Vf(x) Vf(x*)(x- x*) +,,, | x* |2. (2.130)









Since f is convex in N, the monotonicity condition


(Vf(x)- Vf(x*))(x- x*) > 0

holds. Combining this with (2.130), we obtain


VFk(x)(x- x*) > ,,, I x* 2. (2.131)

By the Schwarz inequality,

IIVFk(x)II
IX x* lI < (2.132)
P-k

Combining (2.131) and (2.132), the proof is complete. D

Our convergence result for the inexact proximal point iterates is now estab-

lished.

Theorem 10 Assume that the following conditions are -,,l.:'. ,./

(Al) f provides a local error bound at x E X with positive scalars a and p
.I'-Ifying (2.110).

(A2) f is twice continuo;,-l,; differentiable and convex throughout Bp,(x); let L be a

Lipschitz constant for Vf in Bp,(x).

(A3) The parameter 3 = sup{p(x) : x cE p(x)} -,/,-7. -


3 < ac/A (2.133)

where


A = L + 7 and 72 = 1 + 2L2 if acceptance criterion (Cl) is used,

while

1


A T(l + 0) and T


if acceptance criterion (C2) is used.
1 202









(A4) The parameter

1 p3A
eC T Xo (i 1+ where 7=
.x xa|



e + sup x : x x e (:),x X < r, where r = p/2,


and A is i.;;, upper bound for the 1.,I .' -.: ,,.:, i;l;, of V2f(x) over x E Bp(x).

If the approximate proximal iterates xk '.i, fy either (C1) or (C2) with ik p= (xk),

then the iterates are all contained in Be(x), they approach a minimizer x* E X, and

for each k, we have
T7k
x, -x*|| < I|xo xol| and (2.134)
1-7

i-xlXk+li < (Ak Xk |, (2.135)

where A is 1. I;., / in (A3).

Proof. The following relations hold trivially for j = 0 since the index range for
the summation is vacuous and r > 1:


I||xj I|
Proceeding by induction, suppose that (2.136) holds for all j E [0, k] and for some

k > 0. We show that (2.136) holds for all j e [0, k + 1].

The condition Fk(xk+l) < f(Xk) in (C1) or (C2) implies that


k I ,+1 Xk 2 < f(xk) f(xk+1) < f(Xk)- f(Xk). (2.137)

Since 7 < 1 by (2.133), the first half of (2.136) with j = k implies that xk E c (x),

where e < r = p/2. Thus we have Xk G Bp(x:) (see (2.111)). Expanding f in (2.137)









in a Taylor series around xk and using the fact that Vf(xk) = 0 gives


Ilk I ,+1 -x, || < x, -x 1, (2.138)

where A is the bound for the Hessian of f over Bp(x). By the triangle inequality, we

have

x,+,I x|| < Ix,+ x, || + I, x||. (2.139)

Combining (2.139) with the condition xk E B,( (), (2.138), and (A4) yields:


|x'+1 xl < C+ IX+1 x, || < e+ x, Xk < r.

Hence, Xk+1E G B(x).

Let Xik+l denote an exact proximal point iterate:

Xk+l c arg min {FF(x) : x }.

By Proposition 1 and (2.136), we have


|x+1 x,j| < |x, x, < 7kXO xo0|. (2.140)

By the triangle inequality, (2.140), the fact that r > 1, and (2.136), we obtain


X, '+1 x|| < X, +1 Xk+ X, X
k-
< T-lxo -xol + Ix, xl

T-'||x,,- xol + ||o- o|| 1+ i )
l=0

< T||xo-x|| 1+ .


Referring to the definition of c, we have :ik+l E e(x), where c < r = p/2.

By assumption (Al), f provides a local error bound with constants a and

p. Hence, by Lemma 5, Vf provides a local error bound with constants a and

r = p/2. Since VF(xk+l) = Vf(xk+l) + Ik(Xk+1 Xk) and xk+ E c r(x), the local








error bound condition gives

-|| x, +1 X+ii _< i Vf(xk+l)i _< iBVFk(xk+i)ii + ,,, IIx+1 x, 11. (2.141)

If (Cl) is used, then I|VFk(xk+l)ll < ,,, IVf(xk)l|, and (2.141) implies that

('X,+l -X+ 1 -k1 k k) I )

< k(L|x, -IXk| + iX+1 i|). (2.142)

If (C2) is used, then IIVFk(xk+l)ll < 0,1,, x,+1 x, I|, and by (2.141), we have

(.mixI+1 -Xk+l < lk( + 0) ix,+1-, ||. (2.143)

We now derive a bound for the i x,+1 x, I| term in either (2.142) or (2.143).
Since both Xk+l and X:k+l lie in Bp(x), we can apply Proposition 2 to obtain

Fk(xk+1) Fk(:xk++) + (Fk(xk+1)- Fk (x:k+I))

< Fk(xk) + tVFk(Xk+l) 2. (2.144)
Ilk

Above we observed that xk E (Be(x) where c < r = p/2. In (2.111) we show that
Xk C Bp(x) when xk cE B(x). Hence, by (A2) we have

||Vf(xk) = Il Vf(xk) Vf(xk)| < L ix, x, ||. (2.145)

If (Cl) is used, then by (2.145),

||VFk(xk+l)ll < ,,, Vf(xk) < pkL I, x x|. (2.146)

Using the relation f(xk) < f(xk+1) in (2.144) gives:

Il I x, +1 < x + VFk(Xk+) 2. (2.147)
2 2 Ilk









Combining this with (2.146), we have


x,+ x, |- < (1 + 2L2) ix, -Xk 112. (2.148)

Similarly, if criterion (C2) is used, then I|VFk(xk+l)1) < Oik Ix, +1 Xk |, and by

(2.147), we have
1
IX,+I i || -< |x x, -'. (2.149)
1 202
Combining (2.148) and (2.149) and referring to the definition of r in (A3), we

conclude that

X, +1 X, || < r||T x, || (2.150)

if either acceptance criterion (Cl) or (C2) is used.

Inserting the bound (2.150) in (2.142) or (2.143) yields (2.135). By (2.135) and

the definition of 3 in (A3), we have

X, +1 Xk+ | ( <- A X, Xk < X- Xk 1 < 7k+l ||XO X0 11.

This establishes the second half of (2.136) for j =k + 1.

For the first half of (2.136), we use the triangle inequality, (2.150), and the

induction hypothesis to obtain:

IX,+i- x- < xI+i-X, i| + x x||


< IIX, x, II + Ix, xl
< T7- lxo -_ X1o + I x, -(
k-1


~< (x I=0I
"< -||xo | 1 7+

This completes the proof of the induction step, and in particular, it shows that

Xk+l E 8,e(x), where c is defined in (A4).









By (2.136) and (2.150), we have


,x,+1 Xk T 11. ,_ -X, 11 < T7kjjXo Xo||. (2.151)

Hence, the xk form a Cauchy sequence, which has a limit denoted x*. By the

triangle inequality and (2.151),
OO OO
IX' -X*II < < IXj+l xjl T |IIXj Xj
j k j k
00
c IIxT- x011.
< T 7Y 1X -xo|| 11 --1||Xo X
jk 7

This establishes (2.134). O

Remark. In our analysis of the exact proximal point algorithm, our proof

of Theorem 9 could rely on the bound provided by Proposition 1 for the step size

Xk+1 Xk. In Theorem 10, we obtain a similar bound using either condition (C1) or

(C2) along with the relationship between the regularization parameter pIk and the

current iterate Xk. Condition (A4) is satisfied if

lX- x|
lim x- 0 (2.152)
x x P(x)

and xo is sufficiently close to x. For example, if p(x) = P|Vf(x)l|| where TI E [0, 2)

and 3 > 0 is a constant, and if Vf provides a local error bound at :, then
||x-x_ |2 ||x -x_ |2 ( 1 )
l x 12< jI x xll2-
p(x) P|V/ f(x)|| )

Hence, (2.152) holds for r, E [0, 2). Also, (A3) holds if either Tr E (0, 2) and p is

sufficiently small, or r = 0 and 3 is sufficiently small. For this choice of p(-), it

follows from Theorem 10, that the convergence rate to the set of minimizers X is

linear for r = 0, superlinear for rT E (0, 1), and at least quadratic for rT E [1, 2).









2.3.4 Global Convergence

We now establish a global convergence result. It is assumed that the algorithm

used to approximately solve (2.105) starts with a direction dk which satisfies (L1)

below, and which employs a line search in the direction dk which satisfies (L2)

below:

(L1) There exist positive constants cl and c2 satisfying both the sufficient descent

condition

gjdk _< -Clg, ',

and the search direction bound


||d, || < c2 |11g|,

where g9 Vf(Xk)T VFk(xk).

(L2) The Wolfe conditions [122, 123] hold. That is, if ak denotes the stepsize and

yk = k + akdk, then

Fk(yk) < Fk(xk) + 6akVFk(xk)dk Fk(xk) + kgj dk,

and

VFk(yk)dk > 7VFk(xk)dk agJdk.

Moreover, since yk is an intermediate point in the move from xk to Xk+l, we

require that Fk(yk) > Fk(xk+l).

Our global convergence result is as follows:

Theorem 11 Let xk, k > 1, denote a sequence of inexact proximal point iterates

associated with (2.105). We assume that the ili,.rithm used to '/'i,';''~l','I l; solve

(2.105) -.,l.:.f. (L1) and (L2). If the iterates are all contained in a convex set B

where Vf is Lipschitz continuous, and if lik < j IIVf(xk) jl for some ] e [0,2) and









some constant 3, then we have


lim g(xk) 0.
k--oo

Proof. Since X is nonempty, f is bounded from below. By (L2) and the

definition of Fk,


f(xk+1) < Fkc(Xk+1) < k Fyk) <(xk)


f(xk).


Hence, we have


00
00 > f (xk)
k-0

> Fk Xk)
00



> Fk (Xk)
k-0
> Fcx)
kco


00
f(xk+1) Z Fk(xk)
k=0


- k (Xk+ 1)


Fk(yk).


By (L1) and the first Wolfe condition in (L2), it follows that


Fk(yk) < Fk(xk)


JClak g, II-.


Combining this with (2.153) gives


,, |g(Xk) 2 < 00.
kO0

By the second Wolfe condition in (L2) and (LI), we have


(VFk(yk) VFk(xk))dk >


(1 o7)VFk(xk)dk


(1 a)gTdk


> ( 7)Ci||g, ||2.


f(Xk+l)


(2.153)


(2.154)


(2.155)









If L is the Lipschitz constant for Vf in B, then we have


(VFFk(yk)- VFk(xk))d(xk) (Vf(yk- V() f + Pk(Yk Xk)T)dk

< (L + pk) Yk x d, C ak(L + pk)d, C-

< c2ak(L +Ik) lg 2

In the last inequality, we utilize (LI). Combining this with (2.155) yields:

ci(1- o-)
ak c(L + k)

By (2.154) and the bound Uk < llVf(xk) '",

o > 9 (|g(xk) 2
co > L+/
k L + lk
rn1 ||g(xk)ll 2 g(xk) 2
> m

>1 min g(xk) l2 lg(xk) l2-
> min L '


Since rl < 2, we conclude that g(xk) approaches 0. O

2.3.5 Preliminary Numerical Results

We now present some preliminary numerical examples to illustrate the conver-

gence theory. To illustrate the quadratic convergence when p(x) = /31Vf(x)l|, we

consider the following two problems (the first is introduced in [88]):
n-1 n-1

= i=1

(P2) f (x)= b(x -)2 )4.
i= 1 i= 1

In (P1) we take n = 10, ai 1, and the starting guess xi = i, 1 < i < n. The

set of minimizers are given by xi = x2 = ... = Xn. In (P2) we take n = 10,

bi = e-4i, and the starting guess xi = 1 + 1/i, 1 < i < n. For this problem,









Table 2-7: I|g(xk)l versus iteration number k

Problem 1 Problem 2
Iteration (C1) (C2) (C1) (C2)
1 4.5e-01 5.4e-01 4.7e-01 1.3e-01
2 7.2e-02 1.0e-01 1.3e-02 3.2e-03
3 2.6e-03 7.1e-03 1.5e-04 2.5e-05
4 3.5e-06 2.6e-05 4.2e-07 4.5e-08
5 6.4e-12 4.3e-10 1.8e-10 1.2e-11


the minimizer is unique, however, the condition number of the Hessian at the

solution is around .5e16. Table 2-7 gives the gradient norms corresponding to

p(x) = .051|Vf(x)l|, and acceptance criterions (C1) and (C2). For (C2), we chose
0 = .66. The subproblems (2.105) were solved using our conjugate gradient routine

CG_DESCENT [73, 72], stopping when either (C1) or (C2) is satisfied.

In the next series of numerical experiments, we solve some ill-conditioned prob-

lems from the CUTE library [13]. Experimentally, we find that it is more efficient

to use the proximal strategy in a neighborhood NA of an optimum; outside NA, we

apply CD_DESCENT to the original problem (2.1). In our numerical experiments,

our method for choosing N" was the following: We applied the conjugate gradient

method to the original problem (2.1) until the following condition was satisfied:


Ilg(x) ll 10-2(1 + If(x) ). (2.156)

When this condition holds, we continue to apply the conjugate gradient method

until an estimate for the condition number exceeds 103; then we switch to the

proximal point method (2.105), using CGDESCENT to solve the subproblems. We

estimate the condition number by first estimating the second derivative of the

function along the normalized search direction. Our estimate for the condition

number is the ratio between the maximum and minimum second derivative, during

the iterations after (2.156) is satisfied.









Table 2-8 gives convergence results for ill-condition problems from CUTE

[13]. The :: I I condition numbers" are computed in the following way: We

Table 2-8: Statistics for ill-condition CUTE problems and CG_DESCENT

No Proximal Point With Proximal Point
Problem Dim Cond It NF NG It NF NG
SPARSINE 2000 2.6e17 12,528 25,057 12,529 10,307 20,615 10,308
SPARSINE 1000 3.4e14 4,657 9,315 4,658 3,760 7,521 3,761
NONDQUAR 1000 6.6e10 4,004 8,015 4,152 3,013 6,032 3,068
NONDQUAR 500 1.0el0 3,027 6,074 3,185 2,526 5,062 2,676
EIGENALS 420 1.2e06 1,792 3,591 1,811 1,464 2,935 1,482
EIGENBLS 420 3.0e05 5,087 10,185 5,099 2,453 4,910 2,458
EIGENCLS 420 8.2e04 1,733 3,484 1,754 1,774 3,566 1,795
NCB20 510 3.7e16 1,631 3,048 2,372 1,251 2,262 1,684


solve the problem and output the Hessian matrix at the solution. The extreme

eigenvalues of the Hessian were computed using Matlab, and the eigenvalue ratio

appears in the column labeled "Cond" of Table 2-8. For the proximal point

iteration, we used acceptance criterion (C2). The iterations were continued until

the stopping condition I|Vf(x)||1 < 10-6 was satisfied. The number of iteration

(It), number of function evaluations (NF), and number of gradient evaluations

(NG) are given in the table. Observe that the reduction in the number of function

and gradient evaluations varies from almost nothing in EIGENCLS to about 50'.

for EIGENBLS.














CHAPTER 3
BOX CONSTRAINED OPTIMIZATION

In this chapter, we will consider the problem (1.1) with the feasible set S to be

a box set B, i.e. the problem (1.1) turns out to be


min {f(x) :x E B}, (3.1)

where f is a real-valued, continuously differentiable function defined on the set


B = x :1< x < u}. (3.2)

Here 1 < u and possibly, i = -oo or ui = o0.

3.1 Introduction

The box constrained optimization problem appears in a wide range of applica-

tions including the obstacle problem [102], the elastic-plastic torsion problem [61],

optimal design problems [8], journal bearing lubrication [20], inversion problems

in elastic wave propagation [7], and molecular conformation analysis [62]. Prob-

lem (3.1) is often a subproblem of augmented Lagrangian or penalty schemes for

general constrained optimization (see [24, 25, 46, 47, 52, 59, 70, 71, 98]). Thus the

development of numerical algorithms to solve (3.1) efficiently, especially when the

dimension is large, is important in both theory and applications.

We begin with an overview of the development of active set methods. A

seminal paper is P. i, il:'s 1969 paper [107] which considers a convex, quadratic cost

function. The conjugate gradient method is used to explore a face of the feasible

set, and the negative gradient is used to leave a face. Since Polyak's algorithm

only added or dropped one constraint in each iteration, Dembo and Tulowitzki

proposed [41] an algorithm CGP which could add and drop many constraints in









an iteration. Later, Yang and Tolle [128] further developed this algorithm so as

to obtain finite termination, even when the problem was degenerate at a local

minimizer x*. That is, for some i, x* = li or x* = ui and Vf(x*)i = 0. Another

variation of the CGP algorithm, for which there is a rigorous convergence theory, is

developed by Wright [124]. Mor6 and Toraldo [102] point out that when the CGP

scheme starts far from the solution, many iterations may be required to identify a

suitable working face. Hence, they propose using the gradient projection method

to identify a working face, followed by the conjugate gradient method to explore

the face. Their algorithm, called GPCG, has finite termination for nondegenerate

quadratic problems. Recently, adaptive conjugate gradient algorithms have been

developed by Dostdl et al. [44, 45, 47] which have finite termination for a strictly

convex quadratic cost function, even when the problem is degenerate.

For general nonlinear functions, some of the earlier research [4, 19, 63, 87,

99, 113] focused on gradient projection methods. To accelerate the convergence,

more recent research has developed Newton and trust region methods (see [26]

for in-depth analysis). In [5, 17, 24, 51] superlinear and quadratic convergence is

established for nondegenerate problems, while [53, 59, 86, 90] establish analogous

convergence results, even for degenerate problems. Although computing a Newton

step can be expensive computationally, approximation techniques, such as a sparse,

incomplete ('!!. .1 -l:y factorization [89], could be used to reduce the computational

expense. Nonetheless, for large-dimensional problems or for problems where the

initial guess is far from the solution, the Newton/trust region approach can be

inefficient. In cases where the Newton step is unacceptable, a gradient projection

step is preferred.

The affine-scaling interior point method of Coleman and Li [14, 21, 22, 23] is

a different approach to (3.1), related to the trust region algorithm. More recent

research on this strategy includes [42, 80, 83, 120, 133]. These methods are based









on a reformulation of the necessary optimality conditions obtained by multiplica-

tion with a scaling matrix. The resulting system is often solved by Newton-type

methods. Without assuming strict complementarity (i. e. for degenerate problems),

the affine-scaling interior-point method converges superlinearly or quadratically,

for a suitable choice of the scaling matrix, when the strong second-order sufficient

optimality condition [110] holds. When the dimension is large, forming and solving

the system of equations at each iteration can be time consuming, unless the prob-

lem has special structure. Recently, Zhang [133] proposes an interior-point gradient

approach for solving the system at each iteration. Convergence results for other

interior-point methods applied to more general constrained optimization appear in

[48, 49, 125].

The method developed in this chapter is an active set algorithm (ASA) which

consists of a nonmonotone gradient projection step, an unconstrained optimization

step, and a set of rules for branching between the steps. A good survey of this

active set method can be found in [76]. Global convergence to a stationary point is

established. When the strong second-order sufficient optimality condition holds, we

show that ASA eventually reduces to unconstrained optimization, without restarts.

This property is obtained without assuming strict complementary slackness. If

strict complementarity holds and all the constraints are active at a stationary

point, then convergence occurs in a finite number of iterations. In general, our

analysis does not show that the strictly active constraints are identified in a finite

number of iterations; instead, when the strong second-order sufficient optimality

condition holds, we show that ASA eventually branches to the unconstrained

optimization step, and henceforth, the active set does not change. Thus in the

limit, ASA reduces to unconstrained optimization without restarts.









xk = xk kgk


P(xk)


Figure 3-1: The gradient projection step.


3.2 Gradient Projection Methods

In this section, we consider a generalization of (3.1) in which the box B is

replaced by a nonempty, closed convex set Q:


min {f(x) :x E Q}.


(3.3)


We begin with an overview of our gradient projection algorithm. Step k in our

algorithm is depicted in Figure 3-1. Here P denotes the projection onto Q:


P(x) = arg min x y||.
yEQ


(3.4)


Starting at the current iterate xk, we compute an initial iterate xk = Xk akgk.

The only constraint on the initial steplength ak is that Ok C [Omin, Omax], where









amin and amax are fixed, positive constants, independent of k. Since the nominal
iterate may lie outside 2, we compute its projection P(xk) onto 2. The search

direction is dk = P(xk) Xk, similar to the choice made in SPG2 [11]. Using a

nonmonotone line search along the line segment connecting xk and P(xk), we arrive

at the new iterate xk+l.

In the statement of the nonmonotone gradient projection algorithm (NGPA)

given below, fJ denotes the "reference" function value. A monotone line search

corresponds to the choice fJ = f(xk). The nonmonotone GLL scheme takes

fk = ffax where

f~ = max{f(xk-i) : 0 < i < min(k, M 1)}. (3.5)

Here M > 0 is a fixed integer, the memory. For a specific procedure on how to

choose the reference function value based on our cyclic BB scheme, one can refer

the paper [39, 78, 132].

NGPA Parameters

e C [0, oo), error tolerance

6 E (0, 1), descent parameter used in Armijo line search

rl E (0, 1), decay factor for stepsize in Armijo line search

[amin, amax] C (0, oo), interval containing initial stepsize

Nonmonotone Gradient Projection Algorithm (NGPA)

Initialize k = 0, xo = starting guess, and f1 = f(xo).

While IIP(xk gk) Xk >

1. C'! ....-* k c [Cmin, cmax] and set dk = P(xk akgk) Xk.
2. ('! .....- f so that f(xk) < f < max{fY ffmax} and f; < fmax

infinitely often.









3. Let fR be either f or min ffka, ff}. If f(xk + dk) < f+ Jgdk, then
ak 1.
4. If f(xk + dk) > fR + g dk, then akc = r where j > 0 is the smallest

integer such that

f(xk + Tjdk) < fR + lj6gTdk. (3.6)

5. Set xk+l = X + ckdk and k = k + 1.
End

The condition f(xk) < f/ guarantees that the Armijo line search in Step 4 can be

satisfied. The requirement that "f/; < f"ax infinitely often" in Step 2 is needed
for the global convergence result Theorem 12. This is a rather weak requirement
which can be satisfied by many strategies. For example, every L iteration, we could

simply set f =- f'ax. Another strategy, closer in spirit to the one used in the
numerical experiments, is to choose a decrease parameter A > 0 and an integer

L > 0 and set f; flxX if f(xk-L) f(xk) < A.
To begin the convergence analysis, recall that x* is a stationary point for (3.3)
if the first-order optimality condition holds:

Vf(x*)(x x*) > 0 for all x e Q. (3.7)

Let da(x), a E R, be defined in terms of the gradient g(x) Vf(x)T as follows:

d&(x) P(x- ag(x)) x.

In NGPA, the search direction is dk = d& (xk). For unconstrained optimization,

da(x) points along the negative gradient at x when a > 0. Some properties of P
and d" are summarized below:

Proposition 3 P1. (P(x) x)T(y P(x)) > 0 for all x e R" and y e Q.
P2. (P(x) P(y))T(x y) > i|P(x) P(y) |2 for all x andy e R".

P3. I|P(x) P(y)|| < |x yll for all x and y E R".









P4. Ilda(x)ll is nondecreasing in a > 0 for rI;, x E Q.
P5. lda(x)ll/a is nonincreasing in a > 0 for r,;; x cE .
P6. g(x)Tda(x) < -Ilda(x) l2/a for I,', x eG and a > 0.

P7. For ,,q; x E Q anda > 0, d(x) = 0 if and only if x is a stat.:',,inq point for
(3.3).

P8. Suppose x* is a station, ; point for (3.3). If for some x e R", there exist

positive scalars A and 7 such that


(g(x) g(x*))T(X X*) > 7X xX 2 (3.8)

and

Ilg(x) g(x*)ll < AXIIx- x* l, (3.9)

then we have

l|x x* 1+ A Id(x)||.

Proof. P1 is the first-order optimality condition associated with the solution
of (3.4). Replacing y by P(y) in P1 gives


(P(x) -x)T(P(y)- P(x)) > 0.

Adding this to the corresponding inequality obtained by interchanging x and

y yields P2 (see [129]). P3 is the nonexpansive property of a projection (for

example, see [6, Prop. 2.1.3]). P4 is given in [116]. For P5, see [6, Lem. 2.3.1].
P6 is obtained from P1 by replacing x with x ag(x) and replacing y with x.

If x* is a stationary point satisfying (3.7), then P6 with x replaced by x* yields
d"(x*) = 0. Conversely, if d"(x*) = 0, then by P1 with x replaced by x* ag(x*),
we obtain

0 < ag(x*)T(y P(x* ag(x*)) = g(x*)T(y x*),

which implies that x* is a stationary point (see [6, Fig. 2.3.2]).









Finally, let us consider P8. Replacing x by x g(x) and replacing y by x* in
P1 gives


[P(x g(x)) x + g(x)]T[x*- P(x g(x))] > 0.


(3.10)


By the definition of da(x), (3.10) is equivalent to

[dl(x) + g(x)]T[x* x- dl(x)] > 0.

Rearranging this and utilizing (3.8) gives

dl(x)T(x*- x) -g(x)Tdl(x)- dll(x) 12 > g(x)T(x- X*) (3.11)

> 7Yx- x*2 + g(x*)T(x-x*).

Focusing on the terms involving g and utilizing (3.9), we have


g(x*)T(x* X) g(X)Tdl(x) <


A||x x*|| ||dl(x) | + g(x*)T(x*

A||x x*| || dl(x)|| + g(x*)T[x*

Allx- x*ll |dl(x)||


x dl(x))

P(x- g(x))]

(3.12)


by (3.7) since P(x g(x)) E Q. Combining (3.11) and (3.12), the proof is complete.
D
Next, we establish a convergence result for NGPA:
Theorem 12 Let L be the level set 1. ,,' J by


= {x E Q: f(x) < f(xo)}.


(3.13)


We assume the following conditions hold:
Gl. f is bounded from below on L and dmax = suPldk < oo.
G2. If L is the collection of x e Q whose distance to L is at most dmax, then Vf
is Lipschitz continuous on L.









Then NGPA with e = 0 either terminates in a finite number of iterations at a

stationary point, or we have

liminf Idl(Xk) II 0.
k->oo

Proof. By P6, the search direction dk generated in Step 1 of NGPA is a

descent direction. Since ft > f(xk) and 6 < 1, the Armijo line search condition

(3.6) is satisfied for j sufficiently large. We now show that xk E L for each k. Since
fomax f = f(xo), Step 2 of NGPA implies that f~o < f(xo). Proceeding by

induction, suppose that for some k > 0, we have


fj < f(xo) and fjax < f(xo) (3.14)

for all j e [0, k]. Again, since the search direction dk generated in Step 1 of NGPA

is a descent direction, it follows from Steps 3 and 4 of NGPA and the induction

hypothesis that

f(Xk+1) < f < /f(x). (3.15)

Hence, fk'7 < f(xo) and f/+, < max{f,, fkX} < f(xo). This completes the

induction. Thus (3.14) holds for all j. Consequently, we have fR < f(xo) in Steps 3

and 4 of NGPA. Again, since the search direction dk generated in Step 1 of NGPA

is a descent direction, it follows from Steps 3 and 4 that f(xk) < f(xo), which

implies that xk E L for each k.

Let A be the Lipschitz constant for Vf on L. As in [131, Lem. 2.1], we have


k > min ( ))(3.16)

for all k. By P6,

g dk cl, II dk 112
Ig dkl > _> m
Oak Oamax









It follows from (3.16) that

ak > min 1, 12 ):) c. (3.17)
I amax

By Steps 3 and 4 of NGPA and P6, we conclude that

f(Xk+l) < fF + gdk < f/- 6c i/ak < f cl dk 2/max- (3.18)

We now prove that liminf li | = 0. Suppose, to the contrary, that there exists
kc-oo
a constant 7 > 0 such that ||d, I| > 7 for all k. By (3.18), we have


f(xk+l) < f i- T, where 7 = 6c2/max. (3.19)

Let ki, i = 0, 1,..., denote an increasing sequence of integers with the property that

ff < fnax for j = k and fy < fj,- when ki < j < ki+1. Such a sequence exists by
the requirement on fF given in Step 2 of NGPA. Hence, we have


fT < fk < fTkx, when k
By (3.19) it follows that


f(xj) < ff_, r < f ax T when k< < j < ki+. (3.21)

It follows that
fr+ fmax < fmax (3.22)

Hence, if a = ki and b = ki where i1 > i2 and a b > M, then by (3.20)-(3.22),

we have
fmax = max f(x-,_) < max f,_ r < fmax r.
O
Since the sequence ki, i = 0, 1,..., is infinite, this contradicts the fact that f is

bounded from below. Consequently, liminf ||dkl| 0. By P4 and P5, it follows that
k--oo


lld, II > min{imin, 1} ld(Xk) \-









Thus liminf Ildl(xk)l 0. O
k->oo
Recall that f is strongly convex on Q if there exists a scalar 7 > 0 such that


f(x) > f(y) + Vf(y)(x y) + ||x y|2 (3.23)
2

for all x and y E Q. Interchanging x and y in (3.23) and adding, we obtain the

(usual) monotonicity condition


(Vf(y) Vf(x))(y x) > 7|y x1|2. (3.24)

For a strongly convex function, (3.3) has a unique minimizer x* and the conclusion

of Theorem 12 can be strengthened as follows:

Corollary 2 Suppose f is .-li,'ili convex and twice .,'/:i',', '.';i-;i differentiable on

Q, and there is a positive integer L with the I" ii., I; that for each k, there exists

j E [k, k + L) such that f < f'ax. Then the iterates Xk of NGPA with e 0

converge to the g/ ..l/.r minimizer x*.

Proof. As shown at the start of the proof of Theorem 12, f(xk) < f(xo)

for each k. Hence, Xk lies in the level set L defined in (3.13). Since f is strongly

convex, L is a bounded set; since f is twice continuously differentiable, |IVf(xk)ll is

bounded uniformly in k. For any x e Q, we have P(x) = x. By P3, it follows that


ldlld|- I P(x ag(x)) xll|| IP(x ag(x)) P(x)|| < o||g(x)||.

Since ak E [Cmin, amax], dmax = supk ||cl, || < oc. Consequently, the set L defined in

G2 is bounded. Again, since f is twice continuously differentiable, Vf is Lipschitz

continuous on L. By assumption, fk < fkmax infinitely often. Consequently, the

hypotheses of Theorem 12 are satisfied, and NGPA with e 0 either terminates in

a finite number of iterations at a stationary point, or we have


liminf |ldl(xk)| = 0.
k-oo


(3.25)









Since f is strongly convex on Q, x* is the unique stationary point for (3.3).

Hence, when the iterates converge in a finite number of steps, they converge to x*.

Otherwise, (3.25) holds, in which case, there exists an infinite sequence 11 < 12 < ...

such that Ildl(xlj)ll approaches zero as j tends to oo. Since (3.24) holds, it follows

from P8 that x;l approaches x* as j tends to oo. By P4 and P5, we have


Ild'(x) _< max{1, a lldl(x)ll.

Since Ok E [a min,, max], it follows that

|| d, || < max{1, amax} ld (xk) -

Since the stepsize ak E (0, 1], we deduce that


x,+1 x, II = ||, I < c||d, I < max{1, cmax}ldl'(xk) l. (3.26)

By P3, P is continuous; consequently, d'(x) is a continuous function of x. The

continuity of d"(.) and f(-) combined with (3.26) and the fact that x1j converges to

x* implies that for any 6 > 0 and for j sufficiently large, we have


f(xk) < f(x*) + 6 for all k e [lj, I + M + L].

By the definition of fax,


fkrx < f(x*) + 6 for all k e [lj + M, l + M + L]. (3.27)

As in the proof of Theorem 12, let ki, i = 0, 1,..., denote an increasing

sequence of integers with the property that f/ < fmax for j =ki and f < f' 1_

when ki < j < ki+l. As shown in (3.22),


fkmax < fknax
_k,_ 1 ^ i,


(3.28)









for each i. The assumption that for each k, there exists j e [k, k + L) such that

ff < fnmax implies that

ki+i ki < L. (3.29)

Combining (3.27) and (3.29), for each lj, there exists some ki E [lj + M, lj + M + L]

and

fkax < f(x*) + (3.30)

Since 6 was arbitrary, it follows from (3.28) and (3.30) that

lim fk = f(x*); (3.31)

the convergence is monotone by (3.28). By the choice of ki and by the inequality

f(xk) < fk in Step 2, we have

f(xk) < f, < f fax for all k > ki. (3.32)

Combining (3.31) and (3.32),


lim f(xk) f(x*). (3.33)
k->oo

Together, (3.7) and (3.23) yield


f(xk) > f(x*) + x -X 2 (3.34)
2

Combining this with (3.33), the proof is complete. D

3.3 Active Set Algorithm (ASA)

Starting with this section, we focus on the box constrained problem (3.1). To

simplify the exposition, we consider the special case 1 0 and u = oo:

min {f(x) : x > 0}.

We emphasize that the analysis and algorithm apply to the general box constrained

problem (3.1) with both upper and lower bounds.









Although the gradient projection scheme NGPA has an attractive global

convergence theory, the convergence rate can be slow in a neighborhood of a local

minimizer. In contrast, for unconstrained optimization, the conjugate gradient

algorithm often exhibits superlinear convergence in a neighborhood of a local

minimizer. We develop an active set algorithm which uses NGPA to identify active

constraints, and which uses an unconstrained optimization algorithm, such as the

CG_DESCENT scheme in [73, 74, 72, 75], to optimize f over a face identified by

NGPA.

We begin with some notation. For any x CE let A(x) and Z(x) denote the

active and inactive indices respectively:


A(x) {i [1, n] : x= 0},

Z(x) = {i [1, n] : xi > 0}.

The active indices are further subdivided into those indices satisfying strict

complementarity and the degenerate indices:


A+(x) = {i e A(x) : g(x) > 0},

Ao(x) = {i A(x) : g(x) 0}.

We let gi(x) denote the vector whose components associated with the set Z(x) are

identical to those of g(x), while the components associated with A(x) are zero:

S 0 if x 0,

gi(x) if xi > 0.

An important feature of our algorithm is that we try to distinguish between

active constraints satisfying strict complementarity, and active constraints that

are degenerate using an identification strategy, which is related to the idea of an

identification function introduced in [50]. Given fixed parameters a C (0, 1) and









f3 (1, 2), we define the (undecided index) set U at x c B as follows:


L(x) = {i [1, n] : |gi(x)| > |dl(x) a" and x, > dl'(x)|)}.


In the numerical experiments, we take a = 1/2 and 3 = 3/2. In practice, U

is almost ahv--, empty when we reach a neighborhood of a minimizer, and the

specific choice of a and f does not have a significant effect on convergence. The

introduction of the U set leads to a strong local convergence theory developed in

the local convergence section.

The indices in U correspond to components of x for which the associated

gradient component gi(x) is relatively large, while xi is not close to 0 (in the sense

that xi > IId'(x) ll). When the set U of uncertain indices is empty, we feel that the

indices with large associated gradient components are almost identified. In this case

we prefer the unconstrained optimization algorithm.

Although our numerical experiments are based on the conjugate gradient

code CG_DESCENT, a broad class of unconstrained optimization algorithms (UA)

can be applied. The following requirement for the unconstrained algorithm are

sufficient for establishing the convergence results that follow. Conditions Ul

U3 are sufficient for global convergence, while U1-U4 are sufficient for the local

convergence analysis. U4 could be replaced by another descent condition for the

initial line search, however, our local convergence analysis has been carried out

under U4.


Unconstrained Algorithm (UA) Requirements

Ul. xk > 0 and f(xk+1) < f(xk) for each k.

U2. A(xk) C A(Xk+l) for each k.

U3. If A(xj+l) = A(xj) for j > k, then liminf ||gj(xj)|| 0.
j~oo









U4. Whenever the unconstrained algorithm is started, xk+1 P(Xk cakg(xk))

where ck is obtained from a Wolfe line search. That is, ak is chosen to satisfy


(Oak) < (0) + 6, ./'(0) and Q'(ak) > o7'(0), (3.35)

where

0(a) f(P(xk agl(xk))), 0 < 6 < a < 1. (3.36)

Condition U1 implies that the UA is a monotone algorithm, so that the cost

function can only decrease in each iteration. Condition U2 concerns how the

algorithm behaves when an infeasible iterate is generated. Condition U3 describes

the global convergence of the UA when the active set does not change. In U4, 0'(c)

is the derivative from the right side of c; ak exists since Q is piecewise smooth with

a finite number of discontinuities in its derivative, and 0'(a) is continuous at a = 0.

We now present our Active Set Algorithm (ASA). In the first step of the

algorithm, we execute NGPA until we feel that the active constraints satisfying

strict complementarity have been identified. In Step 2, we execute the UA until

a subproblem has been solved (Step 2a). When new constraints become active in

Step 2b, we may decide to restart either NGPA or UA. By restarting NGPA, we

mean that xo in NGPA is identified with the current iterate xk. By restarting the

UA, we mean that iterates are generated by the UA using the current iterate as the

starting point.

ASA Parameters

e E [0, oc), error tolerance, stop when Ildl(Xk)Il <

p E (0, 1), I||g(xk) < p Ildl(Xk)I implies subproblem solved

p E (0, 1), decay factor for p tolerance

ni e [1, n), number of repeated A(xk) before switch from NGPA to UA

n2 E [1, n), used in switch from UA to NGPA









Active Set Algorithm (ASA)

1. While Ildl(xk)l > e execute NGPA and check the following:
a. If U(xk) = 0, then

If I||g(xk)l| < plld'(xk)ll, then p pp.
Otherwise, goto Step 2.
b. Else if A(xk) = A(xk-) =...= A(xk-nl), then

If ||gI(xk)|| > plld'(Xk)I, then goto Step 2.
End

2. While Ildl(xk)ll > c execute UA and check the following:
a. If I||g(xk)l| < plldl(xk)ll, then restart NGPA (Step 1).
b. If IA(xk_)l < IA(xk)|, then

If U(xk) = or IA(xk) > IA(Xk-1i) + n2, restart the UA at xk.
Else restart NGPA.

End
End

3.3.1 Global Convergence

We begin with a global convergence result for ASA.
Theorem 13 Let L be the level set 1. ./1 by

= {x E B: f (x) < f(xo)}.

Assume the following conditions hold:
Al. f is bounded from below on and dmax = supk ldk < oo.
A2. If L is the collection of x E B whose distance to L is at most dmax, then Vf

is Lipschitz continuous on L.
A3. The UA ,/.:./7, U1-U3.