
Information Theoretic Content and Probability

[DAITSS ingest report for package UFE0019613_00001 (ingest IEID E20101113_AAAAOU, 2010-11-13T20:30:55Z; account UF, project UFDC). The report's per-file listing — JPEG, JP2, TIFF, PDF, OCR text, and thumbnail derivatives of liu_j.pdf with their MD5 and SHA-1 checksums — is machine-generated preservation metadata and is omitted here.]
31713 F20101113_AADFAR liu_j_Page_09.QC.jpg
d16793852d56791a6283200f7d0e21aa
0813fe5bfa426dc98df9842aa097a819e70f0e68
1460 F20101113_AADETU liu_j_Page_20.txt
e9b7fa526ec48786df8d76912344feaa
7f0454279e0c658f9a20e44cc73d68505dac8843
33438 F20101113_AADEOW liu_j_Page_53.pro
9bb3287d858d54ab8702e045a8606341
636581308b5b6b94b80f29333c5d9a171113511b
32804 F20101113_AADEYS liu_j_Page_43.pro
11bd7926566bd3b0c62b61c28c7a8587
c72d322d4e0fcc626131e52d55a8a7462c98b638
7846 F20101113_AADFAS liu_j_Page_09thm.jpg
5f6d79696c6fc072891476ec6af85b6a
4da17806c322f39204eb51d9cc15a410dbea3ee2
54913 F20101113_AADETV liu_j_Page_44.jpg
be85d4b3dfffe45595214ccbdd1a9731
81b29e23574d14a891c8ba7739b488b20ca293b4
2381 F20101113_AADEOX liu_j_Page_04thm.jpg
f18ebbb032f507dc70dc419ba01e4431
5a16b45f454eefd55f42afeb09b8f3b1abdb6247
35823 F20101113_AADEYT liu_j_Page_44.pro
9bfd68b2f3b653417f93f986e5c785f2
57c3856d3bcc58bb7b035580a1cd1c0464b678d4
23430 F20101113_AADFAT liu_j_Page_11.QC.jpg
90d37da3b63eb9416d61e3babbcd9aac
d19e0cdec385a676a7c310f8bc1ee2e5b4c54c08
F20101113_AADETW liu_j_Page_14.tif
c6cf17d2fc1f7e83c924a0d8fb695fdb
1db0d1f33b51c25d9b5a89177ba929e2589bc5e8
4475 F20101113_AADEOY liu_j_Page_40thm.jpg
489d9f569410d83503a1e595764e7284
cbccfcfbef8091db796a8e33e786c91d70309a01
34786 F20101113_AADEYU liu_j_Page_46.pro
63ae040c2755b84ba5c6959482c2babf
cbeb48144fbba4b1fbb084a947bccd648b981ad1
4786 F20101113_AADFAU liu_j_Page_13thm.jpg
555965c920c8f551df882439fb104f1e
8252d2d0040936d3439fac576aff8053ff3d6a53
F20101113_AADETX liu_j_Page_45.tif
975e75494cdf64cc9cc0b232dc3e0c6f
356b1b1a197442c6bf32a2a58262742ecc56dcca
F20101113_AADEOZ liu_j_Page_19.tif
2e0dd8a5454fbfa1dce420a61a3b39f8
34f7a1b44ae0163b7f1707cdb6cb978dae13af71
36885 F20101113_AADEYV liu_j_Page_48.pro
92a38964f8c7ea9abfdce53a6bff6d28
063596924610075a049095419843d04d441d7c24
4715 F20101113_AADFAV liu_j_Page_14thm.jpg
a8b510c7a529719e7cb32aef6aa5b1ef
de5c991e70f67ad4265a910ba2c2987bf72ddb22
35324 F20101113_AADEYW liu_j_Page_51.pro
cc4e993dfdcf8819859e08392263f43b
14047478167262d8dfeedd7e4f162bfb55579e1b
4360 F20101113_AADFAW liu_j_Page_15thm.jpg
417ae35e9272374bc48f5d3b8f130247
00bfa14529aeede798f82078fb76a1a05889fc69
34808 F20101113_AADERA liu_j_Page_18.pro
a64d19967db934278a79669cba22387a
9449107e48dbe5d19535bf170d6bb32be9418564
1911 F20101113_AADETY liu_j_Page_37.txt
2c583eb7dc8f7a4b549cc34f864f8415
943c608334ce0c99712fb5c180b8f23fe649a1e3
28331 F20101113_AADEYX liu_j_Page_55.pro
9df9fb9e5a99058ee35fba73a42f8560
8d4a57c9a850118c9065ca74f0c75aa70f3d080e
4817 F20101113_AADFAX liu_j_Page_17thm.jpg
9b1f795766fc89c0983499100a18c7eb
3970de8253cce7aa545ee28613c9b6bd6f9f17fd
26378 F20101113_AADERB liu_j_Page_22.pro
1e48a557888e0b448d3864fdf14feaee
7ee116d06027b773b09a7474ea9a82290109a9dc
52488 F20101113_AADETZ liu_j_Page_53.jpg
338b8d1483df457b5f547ba886f6ff79
248266e7349d3d7ac8f9ce3d272f55689104f5ca
54018 F20101113_AADEYY liu_j_Page_56.pro
5a7040dfadc16e70911062cdb6ebcb3c
1394601fb411f392a6c68b077d8fff4668cab53e
17048 F20101113_AADFAY liu_j_Page_18.QC.jpg
c5b7efb2c8d213b23531ecd8bdfe3197
dd4a51392b8cc24e4446076034069f25e227ef84
14084 F20101113_AADERC liu_j_Page_55.QC.jpg
2f05d0e59ccdef1558b8b960ea686ac8
c8c0c027a1c8f405e78a31a991d0e5a9929e4087
64701 F20101113_AADEYZ liu_j_Page_57.pro
f423509b604f2ca1d1ff62a94fad6c78
4d4a1d4e03314b754162a6ead15e0d60c734d9d1
5678 F20101113_AADFAZ liu_j_Page_19thm.jpg
4565d3f5a7cbcdcb2e6c641e3706315c
e58e5a74478866fd30566fce98a9230c74484496
50850 F20101113_AADERD liu_j_Page_26.jpg
8f51991a560bf3db1829127ae6f88b7c
d7552253aa60a740c32bf1f3a7825979f9046995
84604 F20101113_AADEWA liu_j_Page_32.jp2
d61f1eb7488083cbd00f2d8fe6d99c3a
942b69c7dc6f2d6a8311ea4737f5a96f934f9a5a
68420 F20101113_AADERE liu_j_Page_27.jpg
1932c45dd3487eb40283ea76688d15d8
d68c485c8aef379532741c6832de8dcce833c703
61063 F20101113_AADEWB liu_j_Page_34.jp2
f323b66d9c55d0c51e619fb096f48018
3983c43dbafba866cfa29b51c108937dc0b3b9a6
1508 F20101113_AADERF liu_j_Page_17.txt
565a43a51800a62d636590a0aaef520c
4276bda79ed2b6632ec0fd33b087b1fd874e6e84
67269 F20101113_AADEWC liu_j_Page_36.jp2
0074a0f26248e2ce60a25ea5412136dd
d63bce7657a9860edfd40b19b003fc4c6e816efa
25588 F20101113_AADERG liu_j_Page_34.pro
42314e4958417bfbde2de126af76cfdc
e3f6f48be511003cce64ec8f88a52d627bbe3c28
76624 F20101113_AADEWD liu_j_Page_41.jp2
6892c748b676e77ab9ba5c023fbb4bba
11b307ce529efc133a15494f437753ae583bd7e6
1363 F20101113_AADERH liu_j_Page_15.txt
2f97d45ba8a43fbcc2433823873906dc
9970a438c2dbed54f28fa93cd15acfff7ff3f705
86459 F20101113_AADEWE liu_j_Page_42.jp2
a025f8cf2cca3ded852a4b03a02b7172
fc92abc57a8293c1cde906ee5f791f8ec42998e6
98651 F20101113_AADERI liu_j_Page_47.jp2
52e0aaf0cfd787dfc57ad9823019684d
501080733ab64c2f8c96fdecce8f1746a3194c5c
74964 F20101113_AADEWF liu_j_Page_43.jp2
60eabacc9e7abb68f5c94fe6877ada9d
2a4a376642a5a018bbba6808d141ec9f5b2ca2ba
40801 F20101113_AADERJ liu_j_Page_15.jpg
91f9847145c76f354aa833335ec51ae1
7931d490e8949486861265777fbcaea560c3d526
63849 F20101113_AADEWG liu_j_Page_45.jp2
768c39173ab2467351c6387d909d7075
8961a7ce0659be1dd9b2c6cd15ff08590a9ca97e
2611 F20101113_AADERK liu_j_Page_10.txt
8034b0e5504163c8e746800f83b996c3
4be4f5f33d700210ef3c5cf749f5c5620559ea8b
77826 F20101113_AADEWH liu_j_Page_46.jp2
6b3b70caebd502dd71971b2281422b5b
cbf23b914a3e328125db435030cc927c03865c64
F20101113_AADERL liu_j_Page_06.tif
514c886f580500ed6f830e7dc0689a52
72116dde847cf4f98c556afb1fdcfe9518d5fb57
505718 F20101113_AADEWI liu_j_Page_49.jp2
6a4dcad5bd2ca3adab89b6b9a054e88d
79f35ae8df46dc769aace5de890d30820bc45b2e
5610 F20101113_AADERM liu_j_Page_31thm.jpg
23cfc40af49f6031400ac96be75dd737
6ee60904a6a6cfa3b0b95bd7fb360de7e175abb7
744339 F20101113_AADEWJ liu_j_Page_50.jp2
7629ae242cd978aadc3207a8f1ee58a2
f1a2a71fe0cf8b115777c7d4ed8621997a4c5aae
1831 F20101113_AADERN liu_j_Page_22.txt
652a2bf6e96d715e8a3c1b936ebdf923
f56b8266a8134c1f34962b16e23d26dfb64a9d9e
795532 F20101113_AADEWK liu_j_Page_52.jp2
7be2f33dfd31e69af11028a7aa626674
3519be653ca6c6edab736d14452a8d5a64c2e235
1051951 F20101113_AADERO liu_j_Page_09.jp2
3b185d6e394d7928eadd9b570950cf9b
e8dc97405d2c7c2e71ad04d2a2eaa7ad82110b6c
76569 F20101113_AADEWL liu_j_Page_53.jp2
2e9ee21b9c3d7e3fa4010924c2319f1d
ed14c898d72e0e50f3b58f1399487b7f9373ce1a
F20101113_AADERP liu_j_Page_49.tif
d632b08977469cfa63e55f0a52165441
7fbc17d60c39be454fe966c4a9968c376768a0e9
115216 F20101113_AADEWM liu_j_Page_56.jp2
c022ff1e5378f1a44a0603dc18ae0421
6365179cefa90c26b21a9f9266e830b331781b90
36323 F20101113_AADEWN liu_j_Page_58.jp2
437b9d97de9c2072b784662e6c0a5c0f
69bcea2a6616715aea1f3cd47e730babee1cfe28
69893 F20101113_AADERQ liu_j_Page_33.jpg
5bc6f9239de9e31a1f256b91c0bda473
c19603f7b746317c72beb89e69425aea6a91ce71
F20101113_AADEWO liu_j_Page_01.tif
505ed45f4cbb938b148025ddeffc6147
bf9a50373c609fc90d62a9e035baf5c702271c0b
81 F20101113_AADERR liu_j_Page_03.txt
1d52adc938ce5a77d982b5d0ec1825f8
ecadacb86ae1ac39177e4c3d0dda26f9046a6b52
F20101113_AADEWP liu_j_Page_02.tif
68179713fd437c8b0a05d2017e6be5b8
2ca2830bd9c3552d2daa8fe7f044643dbdcdac47
35163 F20101113_AADERS liu_j_Page_42.pro
a945552a9a09a4797b33381d9f771065
103d91fba1540fddfb6c573f9f0ef457e33b8e79
F20101113_AADEWQ liu_j_Page_03.tif
63bb15fb43095b6b7bc29dcc1fac4572
b5cd7b9f2dd420e091a9f3c6d57e618392526c4e
6415 F20101113_AADERT liu_j_Page_06thm.jpg
89c0a3918bb2f766d6e16659ba1b8ce3
2450fb95f5b25fea2fc957f737f2c7a5a2113fc2
F20101113_AADEWR liu_j_Page_07.tif
03a6f34045951b7303ee2882d977e7ed
59a57be1443dcb4cb2d322e96817bb27aee986cb
18176 F20101113_AADERU liu_j_Page_21.QC.jpg
ac4156c059b6405445d92674bd2a62bd
682dc10b05358244c7aae6ce1617ac21dffa6373
5151 F20101113_AADEMX liu_j_Page_50thm.jpg
ed0e5774042c1f788cd1b820265b971c
50c7ade3ff17e52eb620868b4ce6089c940e1916
F20101113_AADEWS liu_j_Page_08.tif
7375ddac6e9391042fa83934f1a978e6
e1809bd66007fb690cc222e0bf3880a1875a7cb1
F20101113_AADERV liu_j_Page_56.tif
298df09259a6f613800ff9fe3998f576
0f6ffb1c3c8a9a4c775fc089c862fca11fb6e19d
19220 F20101113_AADEMY liu_j_Page_16.QC.jpg
1dfcdf04c7cbfda9c7453d124719db12
ff22aa00c9d24183a8f0440d9d2879deea97fe0f
F20101113_AADEWT liu_j_Page_09.tif
9c2de668f69a581226a8f554f23fea41
5e448a35c5f33b70cc32f18dcfa073d9793455bd
17708 F20101113_AADEMZ liu_j_Page_12.QC.jpg
e74fb41a580111172ff0a12a10069321
f6b7b43be572aa4f5b8bb4323ffdedfc9c7b83ae
F20101113_AADEWU liu_j_Page_10.tif
8da39ae936d51aae316422188c5aa914
38926f4a4b7919a9da27be4c3ac18d89c26dbac6
6122 F20101113_AADERW liu_j_Page_56thm.jpg
8d654340f8d2f31cfffa2b5b0ea4b97c
57288fc8799577704a6d32c3897504bd53e5d1c8
F20101113_AADEWV liu_j_Page_13.tif
276d9745eb26db37fe904b18cb477ea2
28f5707606649decbb2ae829c21eead7b90fcb9a
1051970 F20101113_AADERX liu_j_Page_11.jp2
ad3e1252d9129eab49a52870026d825f
24af385bc88f00289378b16e771191cf6b9c680e
F20101113_AADEWW liu_j_Page_15.tif
7c206b5d29beb4b30eb793f0b7628663
b0efd8628185d29fd5dbf30b5249395849f51317
F20101113_AADEPA liu_j_Page_52.tif
24ecc292abf5e6469197273a469713f2
8d0e24f1658cac0f6b8e451a35ea01a263983bd4
644 F20101113_AADERY liu_j_Page_02.pro
7c6b6080ab27498cdc9bb988c5799bd3
00b69307b7dd443fa95db217b1cacf4b84e21b54
F20101113_AADEWX liu_j_Page_16.tif
0633a34047a35fd7dd14af91827c34a8
a893ead9bb5c279ec08bcc298065859545a60352
36230 F20101113_AADEPB liu_j_Page_30.pro
a6247e331943096fb6cb2d70375b596a
5d3ad8a1ceb1469764c579e76be187b2a6b33493
42811 F20101113_AADERZ liu_j_Page_31.pro
3c3fec4a4edffb3b890a8680bd870a1f
ab222a07bf4f7233419b3328a3021e6b6eeb3c4f
F20101113_AADEWY liu_j_Page_17.tif
ab510ad083ed99447dc0fc8195612b85
4de24dc2ed12c3048f719e5ad80d8d66bb200928
119710 F20101113_AADEPC liu_j_Page_06.jp2
45fd4b4520796be4c9b0170ba59aaba4
eb413d612be0d65e9baeaa5415a7ad78ce62821a
31463 F20101113_AADEUA liu_j_Page_17.pro
6a83102e39c56ff97bcf192214f2607b
985d4a8e4634611c01f57f3948860c7f2a4c42ca
F20101113_AADEWZ liu_j_Page_18.tif
0345bbdacf0c9f0dc65671bbb81dd413
379c9402233443f8ca4a8bcc923466bc8b4874fc
F20101113_AADEPD liu_j_Page_24.tif
ce4619677d57b00129597e983502b926
00aa39c99514056fe6c62314bf778acfe92db3ce
5955 F20101113_AADEUB liu_j_Page_01.QC.jpg
9c6e50fbb59c5677d607a30e9cec54fe
0a230d3eb8e372cece6ca1be4dcfa1ee25cc960c
F20101113_AADEPE liu_j_Page_37.tif
96d1a1b5594863c82e9b11c7d56e6644
d9946889d62a5bb9daed6149da49c49bc6428644
68172 F20101113_AADEUC UFE0019613_00001.mets
2d4d3625654b7a121d6b830cb1ea6a25
56e068757a328e2dc71635a790d7b359499766ed
6823 F20101113_AADEPF liu_j_Page_11thm.jpg
fb81d2178dffb3f1152a263b1ac61131
fade1670d7017e912d0979246d035daee9ad4888
16228 F20101113_AADFBA liu_j_Page_20.QC.jpg
a9ee4d484acd3d26b2604ed8b79201d8
9f1d7cf0e4fdbb6ba764e739f00846e3ae580dba
84205 F20101113_AADEPG liu_j_Page_12.jp2
7415a76f687207e195c6a4862a58923d
e23c538c56272a0dd7f196e30204a8b1a2e8b8f4
490 F20101113_AADEZA liu_j_Page_04.txt
906d6d84efba0b4a9de8be86c451358e
d07f58b9db476b57903d0da2a0f518774e102d4e
4659 F20101113_AADFBB liu_j_Page_22thm.jpg
56823826d99d458e1007815c6f6a9122
761f71ff57887b051d441a1c8013035b53ab390f
2705 F20101113_AADEPH liu_j_Page_09.txt
0f50c6464c250f78e220a76f3ec0dc9d
9e9bfd5798f9074cece39aad329a1de0954516ac
1523 F20101113_AADEZB liu_j_Page_05.txt
bd3f4a4e8f5a08af595b734d0defb1e1
e6d7c58219ce16d3af843708b2ede6a5fc83f971
21737 F20101113_AADFBC liu_j_Page_24.QC.jpg
56ce2380185e330530074e43edb59147
144b1f2f15eaa9ec4562f8b427bd7a71b416c644
19899 F20101113_AADEUF liu_j_Page_01.jpg
84edf5170b47207d1493008639e47321
f5e86ea6aea36f092c60d3a6c6f1ba1689900b0f
5159 F20101113_AADEPI liu_j_Page_44thm.jpg
5b5fe0ab9e08204fb98566d425e8c794
11dda345c491000f71a372da82b21ea89665ec46
2431 F20101113_AADEZC liu_j_Page_06.txt
8e82e6aebff5f531dbb1fba1fbe8555e
d3ed6ec19c6d74326f8fada3f21919df99d33eeb
19133 F20101113_AADFBD liu_j_Page_25.QC.jpg
4c8901bca897da4ac394850e660a440c
188e9e22420daba23a8c198cdeda8875da258824
9547 F20101113_AADEUG liu_j_Page_02.jpg
7e8a41294c676d4f2e0a918d0227ad5f
b68e1c13feeebd9c09dd4174942f92f4d489f7ee
55400 F20101113_AADEPJ liu_j_Page_48.jpg
6bd386e257a046820f9b8c038f5733e9
39f737c7463619098da4a5d4642476aa70287bf2
417 F20101113_AADEZD liu_j_Page_07.txt
8371d84d2f1791c6b46496bfc6c47efd
4a5b618f23ed7f2cd1684ce3ff60db3418c01842
5590 F20101113_AADFBE liu_j_Page_25thm.jpg
01cca222853b9a692f705bc269f8a38f
145792f939d673d0964090279fb68b6aa2e7d197
9356 F20101113_AADEUH liu_j_Page_03.jpg
35dbe4726b912f62b02e5e2f0fd756af
44b1348c495974df4a0ae084cac762c8f3693746
53608 F20101113_AADEPK liu_j_Page_50.jpg
aba0ee67c60e3108549a446e2079f4fa
00cb6c86a546ca2865106ee442b0804a7250c540
2051 F20101113_AADEZE liu_j_Page_11.txt
d866ccd35ddde10747ead5f9d4855d15
14634342d0d6600a075827d81851d51b93698ee6
54407 F20101113_AADEUI liu_j_Page_05.jpg
083c5d046acd7e736399258abb0a2d16
29951bcbf4f6a6a618e2c7061b10beee1f43b263
26383 F20101113_AADEPL liu_j_Page_04.jp2
5ddbcebec5f3abebb65438f113685f66
8085d1d429cf332744c73530b1f5c6aa6ac433ae
1245 F20101113_AADEZF liu_j_Page_14.txt
aea1f3eedff474b2191b2ef781ef56c6
32d98ebaf40252446764983b4a6c4d00c474cfc0
4914 F20101113_AADFBF liu_j_Page_26thm.jpg
042937a3c2f2d36d13cb48dbbce1d06d
946f4cee75dee0eaf30dec4578568645a55ef957
76417 F20101113_AADEUJ liu_j_Page_06.jpg
c619b7a9df31e9410736d56e5f31791b
ade44e1d071cb512212e6c4c1cd51e39c2d0fe3e
19120 F20101113_AADEPM liu_j_Page_19.QC.jpg
209eb2b50ff57383d92db4fca45a830d
aff6be121e99a5d68e740bff7875890045d1634b
1547 F20101113_AADEZG liu_j_Page_16.txt
5b0d66731412087abb366e1b774abc06
4ea6b3bbd449bbc9ef45d580a138f936e835267e
22170 F20101113_AADFBG liu_j_Page_27.QC.jpg
fa79dece0b2f2b1a0bc32e75189e95bc
b95b97c5570a896fe2b39201353f849f9b46ee5d



PAGE 4

I am indebted to Prof. Murali Rao for his encouragement and guidance throughout my study in graduate school; to Prof. F. AitSahlia, Prof. Y. Chen, Prof. B. Mair and Prof. L. Shen for their interest in my dissertation; and to the faculty of the Mathematics Department for their hospitality. Last, but not least, I would like to thank Dr. S. A. Melikhov for stimulating discussions and support.

PAGE 5

ACKNOWLEDGMENTS ..... 4
ABSTRACT ..... 6

CHAPTER
1 INTRODUCTION ..... 8
2 CUMULATIVE RESIDUAL ENTROPY ..... 11
  2.1 More Results about Cumulative Residual Entropy ..... 11
  2.2 Generalization ..... 16
  2.3 Cumulative Residual Entropy and Weak Law of Large Numbers ..... 20
  2.4 Cumulative Residual Entropy and Independence ..... 23
3 KULLBACK-LEIBLER DIVERGENCE OF DISTRIBUTIONS ..... 25
4 ENTROPY VARIABLE ..... 27
  4.1 Information Volatility ..... 27
  4.2 Weak Convergence ..... 33
  4.3 Applications of Information Volatility ..... 40
  4.4 Distribution and Laplace Transform of the Information Variable ..... 47
5 GENERALIZED GAUSSIAN FAMILY ..... 50
6 SUMMARY ..... 56

REFERENCES ..... 57
BIOGRAPHICAL SKETCH ..... 58

PAGE 6

The first part of the research studies the information content of random variables with distributions. Given a positive random variable X with distribution F(x), its information content can be measured using the Cumulative Residual Entropy (CRE). In this study, we extend the original definition of CRE and obtain results for arbitrary random variables. As an application, we are close to finding a new proof of the Weak Law of Large Numbers using CRE, just as the Shannon entropy was used to prove the Central Limit Theorem. We also introduce a measure of independence between two random variables using CRE. At last, we study the Kullback-Leibler divergence between two empirical distributions and apply it to parameter estimation.

The second part studies random variables with density functions. Given a random variable X with density function f(x), the Shannon entropy of X is the expectation of -log f(X), which is itself a random variable. In this research work, we found that the variance of log f(X) (we call it Information Volatility, or IV) has very interesting properties: 1. IV equaling zero characterizes the Uniform distribution; 2. IV is able to separate the Uniform, Gaussian, Gamma and a subfamily of Beta distributions; 3. IV is translation invariant; for a vector variable, IV is invariant under affine transformations. Moreover, IV has a good weak convergence property which the Shannon entropy usually does not have. For instance, IV of Binomial(n, p) converges to IV of the Normal as n goes to infinity. In this paper we also propose a new proof of the asymptotic approximation of the Shannon entropy of Binomial(n, p) and prove that the discrete Shannon entropy will never converge to its continuous analogue, but IV will converge. The Generalized Gaussian (GG) density is one of the maximal entropy distributions. In this

PAGE 8

The discipline of information theory was established and brought to worldwide attention by Shannon in his landmark paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in 1948. Information theory is based on probability theory and statistics. The most important quantity of information theory is the entropy, a functional of probability distributions that measures the uncertainty in a random variable or vector. For a random variable X on some probability space (Ω, F, P), the Shannon entropy is defined as

    H(X) = -E[log f(X)],

where f is the density if X is continuous, and the probability mass function if X is discrete. The Shannon entropy plays an important role in various fields (Cover [5]).

However, the Shannon entropy has certain disadvantages. First, it requires knowledge of the density function for non-discrete random variables. Second, the discrete Shannon entropy does not converge to its continuous analogue. Third, in order to estimate the Shannon entropy for a continuous density, one has to obtain a density estimate, which is not a trivial task. In 2005, Rao, Chen, et al. [13] developed another new measure of randomness called Cumulative Residual Entropy (CRE), which is defined using distributions rather than densities. The CRE of a positive random variable X with distribution F(x) is

    ε(X) = -∫_0^∞ P(X > x) log P(X > x) dx.

Extending the results of [13], we obtain some new results. As an application, we are close to finding a new proof of the Weak Law of Large Numbers using CRE, just as the Shannon entropy was used to prove the Central Limit Theorem. At last, we derived a new measure of independence using CRE, which can be
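The CRE definition above is easy to probe numerically. The following is a minimal sketch (not the dissertation's estimator): it plugs the empirical survival function into -∫_0^∞ P(X > x) log P(X > x) dx; for the Exponential distribution with mean 1 the true CRE is 1.

```python
import math
import random

def empirical_cre(sample):
    """Plug the empirical survival function into -integral P(X>x) log P(X>x) dx."""
    xs = sorted(sample)
    n = len(xs)
    total = 0.0
    for i in range(n - 1):
        surv = (n - 1 - i) / n          # empirical P(X > xs[i])
        total -= (xs[i + 1] - xs[i]) * surv * math.log(surv)
    return total

random.seed(0)
sample = [random.expovariate(1.0) for _ in range(50000)]
print(empirical_cre(sample))  # close to 1, the CRE of Exp(1)
```

The chapter on CRE notes that such sample estimates converge asymptotically to the true value; this sketch only illustrates the definition, not a convergence proof.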

PAGE 9

Shannon's Kullback-Leibler divergence (KL-divergence, Kapur [9]) can be used to measure the distance between two densities. Since it is much easier to estimate the empirical distribution than the density, we study the KL-divergence of distributions, denoted by KLD, and use it to obtain parameter estimators. For the exponential distribution with parameter θ, we can see that the new KLD estimator of θ is unique, and its square is an unbiased estimator of θ². For the Uniform distribution, the interval estimation based on KLD has smaller error than the maximum likelihood estimator.

Notice that the Shannon entropy of a r.v. X with density f(x) is the expectation of -log f(X), which is itself a random variable. The expectation of a random variable is a very useful statistic. In statistical applications the distribution of a random variable is usually unknown. Two of the most important statistics of this distribution that one wants to estimate are its mean and variance, and the only way to do this is to use data: the mean and the variance of the data are good estimates of the unknowns. Since log f(X) is also a random variable, it is natural to look at its other statistics, for example the variance, higher moments, distribution, etc.

In this research we first study the variance of log f(X), and call it Information Volatility (IV). We know that the Shannon entropy is shift invariant (Cover [5]), which means the entropy of X + b for some constant b is the same as that of X. IV goes one step further in the sense that it is invariant under linear transformations (the Shannon entropy is not scale invariant), which means IV of the random variable aX + b is the same as IV of X. For a random vector, IV is invariant under affine transformations. This property implies that IV is independent of both the mean and the variance of a random variable, while the Shannon entropy is only independent of the mean. Therefore, if a distribution family can be described solely by the mean and the variance, then it has a constant IV value for the whole family. For example, IV of Uniform distributions is zero, IV of Gaussian distributions is 1/2, IV of Exponential distributions is 1, and so on.

One of the interpretations of the Shannon entropy is that it measures "information". Unfortunately there does not seem to be any way of estimating this quantity from the Shannon

PAGE 10

(Jacquet [7], Knessl [10], Worsch [16]).

The other good property of the Shannon entropy is that maximizing it under different constraints characterizes the Uniform, Exponential, Gaussian and some other common distributions (Kapur [9]). Actually, it will be shown that IV equaling zero characterizes the Uniform distributions and, without any constraints, IV is able to separate the Uniform, Gaussian, Gamma and a subfamily of Beta distributions.

The moment generating function is also of interest. As a matter of fact, we show that many other measures of entropy (such as Rényi's) can be expressed in terms of the Laplace transform of log f(X). Moreover, a lower bound for the Fisher information in terms of the Laplace transform of log f(X) can be found, and an upper bound on the distribution of log f(X) in terms of Rényi's entropy is obtained. Some brief discussion of the Conditional Value-at-Risk of log f(X) is included at the end.

The Generalized Gaussian density is one of the maximal entropy distributions. It is widely used in engineering to model noise (Dominguez-Molina et al. [6]). Lutwak et al. [11] show that the probability distributions that have maximal Rényi entropy with given generalized Fisher information are the generalized Gaussians. In this article, we show that given the variance and the Rényi quadratic or the Shannon entropy, there are at most two generalized Gaussian distributions with the given variance and entropy. We also propose that GG can be used for density estimation via the Parzen window method. We prove that the IV of GG is exactly the reciprocal of its shape parameter p, which gives us a new method to estimate p.

The rest is organized as follows: §2 studies CRE, §3 is about the KL-divergence of distributions, §4 considers log f(X), and at last the generalized Gaussian is studied in §5.

PAGE 11

In this chapter, we study the Cumulative Residual Entropy (CRE), which is a functional of the distribution. CRE was first introduced by Rao, Chen, et al. [13] and developed further by Rao [12]. CRE has many good properties. First, its definition is valid in both the continuous and discrete cases, which is more general than the Shannon entropy; second, it has more general mathematical properties than the Shannon entropy; and at last, it can be easily estimated from sample data, and the estimate converges asymptotically to the true value.

Originally, CRE is defined for a positive random vector X in R^N by

    ε(X) = -∫_{R^N_+} P(|X| > x) log P(|X| > x) dx,

where X = (X_1, X_2, ..., X_N), x = (x_1, ..., x_N), |X| > x means |X_i| > x_i, and R^N_+ = {x ∈ R^N : x_i > 0}.

We first derive more results on CRE based on this definition; then we generalize the definition to arbitrary random variables and prove some new properties. The third part is about CRE and the Weak Law of Large Numbers. At last, we show an application of CRE to independence testing.

Proof. By the inequality x log(x/y) ≥ x - y for nonnegative x and y,

…

Hence ε(Y) ≥ -∫_0^∞ F(t) log G(t) dt + E(Y) - E(X) ≥ -∫_0^∞ G(t) log G(t) dt + E(Y) - E(X). So ε(X) + E(X) ≥ ε(Y) + E(Y).

Next, we prove a sufficient and necessary condition for the convergence of ε(X).

PAGE 12

Proof. By the log-sum inequality, ∫ F log(F/G) ≥ (∫ F) log(∫ F / ∫ G), so …

Again, for any G and p > 1, ∫ F log(F/G^p) ≥ ∫ F - ∫ G^p, so ∫ F log F - p ∫ F log G ≥ ∫ F - ∫ G^p. Thus if ∫ F < ∞, ∫ G^p < ∞ and -∫ F log G < ∞, then -∫ F log F < ∞.

Multiplying both sides by F and integrating with respect to t entails that …

Since ∫_0^1 F(t) |log t| dt ≤ ∫_0^1 |log t| dt < ∞, ε(X) < ∞ implies that ∫_1^∞ log(t) F(t) dt < ∞.

Hence ε(X) < ∞ if and only if ∫_0^∞ log(1+t) F(t) dt < ∞, by the previous lemma.

This theorem studies the CRE of the sum of two independent discrete random variables. If we interpret ε as "information", then the theorem simply says that adding two independent discrete r.v.s increases the "information" content.

PAGE 13

Since x log x is convex, by Jensen's inequality, for all z_j,

…

Multiplying both sides by z_j/(z_{j+1} - z_j) and summing over j, we get

(2-8)
… = ∑_{i=1}^m q_i ∑_{j=1}^N z_j P(X > z_j - y_i) |log P(X > z_j - y_i)|
…
= ∑_{k=1}^n (z_{k+1} - z_k) P(X > x_k) |log P(X > x_k)| = ∑_{k=1}^n x_k P(X > x_k) |log P(X > x_k)| = ε(X).

PAGE 14

The same arguments can be applied to ε(Y).

Stam [15] gives a new proof of the Shannon power inequality. We obtain an analogous one for CRE.

Furthermore, if F_{X+Y}(x+y) e^{(x+y) r(x+y)} ≥ max{F_X(x) e^{x r(x)}, F_Y(y) e^{y r(y)}}, then …

Proof. By Rao [12], ε(X) = E[X + X log F_X(X)]. So

(2-14)
… = E[(X+Y) log F_{X+Y}(X+Y) - X log F_X(X) - Y log F_Y(Y)]
= E[X log(F_{X+Y}(X+Y)/F_X(X)) + Y log(F_{X+Y}(X+Y)/F_Y(Y))].

Let Φ(x, y) = x log(F_{X+Y}(x+y)/F_X(x)) + y log(F_{X+Y}(x+y)/F_Y(y)). Then

(2-16)
… ≥ x log P(Y > y | X > x) + y log P(X > x | Y > y) = Ψ(x, y).

Taking the expectation entails the claim.

PAGE 15

So ∂Φ(x, y)/∂y …

By the symmetry of Φ(x, y), one has: if F_Z(z) e^{z r(z)} ≥ F_Y(y) e^{y r(y)}, then Φ(x, y) is increasing in y, so Φ(x, y) ≥ Φ(x, 0) ≥ Ψ(x, 0) = 0.

Hence if F_Z(z) e^{z r(z)} ≥ max{F_X(x) e^{x r(x)}, F_Y(y) e^{y r(y)}}, then E[Φ(X, Y)] ≥ 0. The theorem follows.

At last, we establish the connection between CRE and the distribution.

    (1/(1-p)) log ∫_0^∞ F(x)^p dx    (2-18)

    lim_{p→1} (1/(1-p)) log ∫_0^∞ F(x)^p dx
    = lim_{p→1} (1/(1-p)) [ log ∫_0^∞ F(x)^p dx - log ∫_0^∞ F(x) dx ]
    = -(∂/∂p) log ∫_0^∞ F(x)^p dx |_{p=1}
    = -∫_0^∞ F(x) log F(x) dx.

The last equality holds because ∫_0^∞ F(x) dx = E(X) = 1.

PAGE 16

… ≤ (∫_0^∞ (x f(x))^{p+1} dx)^{1/(p+1)} …

    (1/p) log ∫_0^∞ F(x)^{p+1} dx ≤ (p+1) log(p+1)/p.    (2-21)

Let p → 0+; by Lemma 2.1.2, one has -ε(X) ≤ 1 - E(X f(X)), or ε(X) ≥ E(X f(X)) - 1.

Notice that this definition is also valid for both continuous and discrete random variables. Using this new generalized definition, we can obtain several results similar to those in Rao et al. [13]. The first theorem is the convergence property.

Proof. For λ large enough, |log P(X > λ)| / … < 1 + ε.

PAGE 17

Then

    |∫ … log P(X > λ) dλ| ≤ (1+ε) ∫_{-∞}^0 P(X ≤ λ) dλ + (1+ε) ∫_0^{+∞} P(X > λ) dλ ≤ (1+ε) E(|X|) < +∞.

Hence ε(X) < +∞.

…, where F_Y(·) is the cumulative distribution function of Y. Using Jensen's inequality,

…

Integrating both sides with respect to t from -∞ to +∞, … Hence ε(X) ≤ ε(X+Y).

In the same way, we can show ε(Y) ≤ ε(X+Y); hence max{ε(X), ε(Y)} ≤ ε(X+Y).

One advantage of CRE over the Shannon entropy is that the sample CRE converges to the true CRE as the sample size increases, which is stated in the following Weak Convergence Theorem.

PAGE 18

For k large, |log P(X_k > λ)| / … < 1 + ε. Since

…

we only need to show that the first integral is finite. We know that for X_k < … ,

… |log P(X_k > λ)| …

Since … (1+ε) E(X_k) … ,

Therefore, lim_{k→+∞} ε(X_k) = ε(X).

Proof.

PAGE 19

By Rao [12], ε(X) = E[(X - EX)(-log F(X))]. Hölder's inequality entails that for p, q > 1 such that 1/p + 1/q = 1,

    … = m_p^{1/p} (q!)^{1/q} …,

so that … σ_X²/2 ≤ ε(X) ≤ √… .

If X ∈ [0, 1], then E(|X - E(X)|) ≥ E[(X - E(X))²] = σ_X². Hence ε(X) ≥ (1/2) E(|X - E(X)|) ≥ σ_X²/2.

Since both CRE and the Shannon entropy measure the uncertainty of a random event, we study their connections in the next two theorems, one for the continuous case, the other for the discrete case.

… / F(x).

Proof. ε(X) = E[… / F(X)] (Rao [12]). Simplifying by integration by parts, one has

    ε(X) = -E(X) + E[∫_X^∞ t f(t) dt / F(X)] = E(φ(X) - X),    (2-36)

where φ(x) = ∫_x^∞ t f(t) dt / F(x).

So by Jensen's inequality, log ε(X) ≥ E log |φ(X) - X|.

Differentiating φ, φ'(x) = f(x) … Since H(X) = -E log f(X), …

PAGE 20

… (1 - p_1) e^C …

(2-39)
… = (1 - p_1) log(1/(1 - p_1)) …

But ∑_{i=1}^{n-1} p_i log p_i …

Simplifying the inequality Eq. 2-39, we have

    H(X) + ∑_{i=1}^{n-1} p_i log(x_i) + ∑_{i=1}^{n-1} p_i log(g_i |log g_i|) + (1 - p_1) log(1 - p_1) + p_n log p_n ≤ (1 - p_1) log ε(X).

Rewriting this inequality further, we get the claimed inequality (Eq. 2-38).

PAGE 21

Proof. …

Proof. … so … , where M is the upper bound of E(X_i log X_i) for all i.

Step 2: We want to show that ε(S_n/n) …

(2-42)
… = E[X_1 log(S_n/n …)] …

The second equality follows from the exchangeability of {X_n}_{n∈N}.

By the inequality xy ≤ x log x + e^{y-1}, for all x > 0 and y > 0, Eq. 2-42 leads to

(2-43)
… = E[X_1 log X_1 ; S_n/n > λ] …

The first expectation tends to zero uniformly as λ → ∞, since X_1 ∈ L log⁺ L.

PAGE 22

To get the precise bound, we have the following. Since …, we have the bound

    log E(S_n/n …) …

which tends to zero uniformly as λ → ∞.

Step 3: For any positive r.v. X and p > 1, by the log-sum inequality,

… (2-47)

The second "≥" follows from the inequality x log(x/y) ≥ x - y, for all x > 0, y > 0. But

    … + p ∫_λ^∞ P(X > t) log t dt    (2-49)
    = E[∫_λ^X log t dt ; X > λ] = E[X log X - X - λ log λ + λ ; X > λ].

Plugging Eq. 2-48 and Eq. 2-49 into Eq. 2-47 and rearranging the terms, we have

(2-50)
… + p (log λ) P(X > λ) + E[((X - λ)^+)^{p+1}] …

PAGE 23

(2-51)

When λ is large enough, the right-hand side of inequality Eq. 2-51 is uniformly small by Step 1 and Step 2; that is, ∫_λ^∞ f_n(t) dt is uniformly small when λ is large enough.

This theorem shows that ε(S_n/n) …

Suppose we have n samples of two continuous r.v.s (X, Y), {(x_1, y_1), (x_2, y_2), ..., (x_n, y_n)}, and we aim to infer from the sample data whether X and Y are independent. The commonly used method is to calculate the correlation between X and Y and then perform the hypothesis test of H_0: X and Y are independent, against H_1: not H_0. However, we know that cor(X, Y) = 0 does not imply that X and Y are independent, except for the Normal distribution, for which independence and uncorrelatedness are equivalent. Here, we derive a new measure of independence using CRE, which can measure true independence rather than the correlation between two r.v.s. The simulation results show that the CRE measure is asymptotically better than the correlation measure, especially when n is sufficiently large.

PAGE 24

Proof. …

We perform two experiments to test the independence between two r.v.s using the measure r and the correlation method. Since all random numbers can be obtained from Uniform random numbers, we focus on the Uniform distributions.

Experiment 1. Generate samples from Unif(0, 1) as follows:

Step 1: generate 50 samples, {x^(1)_1, x^(1)_2, ..., x^(1)_50}, and let X^(1) denote this array;
Step 2: generate 100 samples, {x^(2)_1, x^(2)_2, ..., x^(2)_100}, and let X^(2) denote this array;
...
Step n: generate 50n samples, {x^(n)_1, x^(n)_2, ..., x^(n)_{50n}}, and let X^(n) denote this array.

We generate 200 times and get a sequence of arrays X^(1), X^(2), ..., X^(200), where the length of X^(n) is 50n. Following the same procedure, we obtain the other sequence of arrays Y^(1), Y^(2), ..., Y^(200).

Since the two sets of random samples are generated independently, we may assume X^(n) and Y^(n) are independent. By the above corollary, r_n(X^(n), Y^(n)) → 0 as n → ∞. The numerical results show that as n → ∞, r_n(X^(n), Y^(n)) decreases to zero asymptotically, but cor(X^(n), Y^(n)) does not have this pattern.

Experiment 2. Construct a new sequence Z^(1), Z^(2), ..., Z^(200), where Z^(n) = sin(100 X^(n)), n = 1, 2, ..., 200, a highly nonlinear function of X^(n). Recalculate r_n and cor for (X^(n), Z^(n)). The results show that r_n concentrates around 0.97 as n increases, which indicates high dependence between X^(n) and Z^(n), but the values of cor(X^(n), Z^(n)) reveal no information.

The trends of the graph of r_n in the two experiments are extremely different, but the trends of the correlations in the two experiments are so similar that we cannot distinguish them.
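The failure mode of the correlation measure in Experiment 2 is easy to reproduce. The sketch below is illustrative only (it does not implement the CRE measure r_n): it shows that Z = sin(100X) is a deterministic function of X yet has nearly zero sample correlation with it.

```python
import math
import random

def cor(xs, ys):
    """Pearson sample correlation."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

random.seed(2)
x = [random.random() for _ in range(10000)]   # Unif(0,1) samples
z = [math.sin(100 * xi) for xi in x]          # fully determined by x
print(cor(x, z))   # near zero despite total dependence
```

This is exactly why the chapter argues for a measure of true independence rather than of linear association.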

PAGE 25

Let X_1, X_2, ..., X_n be an i.i.d. sample of size n from the decreasing distribution F(x; θ) with an unknown parameter θ. In this chapter we develop a new method to estimate θ using the distribution rather than the density of X_1.

Proof. … By the inequality x log(x/y) ≥ x - y, for all x > 0 and y > 0. …

The empirical distribution G_n(x) = ∑_{i=1}^{n-1} (n - i + 1)/n … Let P_i = (n - i + 1)/n, …

Let A_n := … ; … entails θ̂ = √… .

Proof.

PAGE 26

Given X ~ exp(θ), E(X) = θ and Var(X) = θ².

1) E(θ̂_MLE) = θ, while E(θ̂²) = θ².

2) Considering the mean square error, E(θ̂_MLE - θ)² = E(θ̂²_MLE) - θ² = θ²/n.

Now let f(x) = 1/θ, 0 ≤ x ≤ θ, where θ is unknown. We draw a sample of size n from this distribution, X = (X_1, X_2, ..., X_n), and E(X_1) = θ/2.

The empirical distribution G_n(x) = (n - i + 1)/n … X_i.    (3-3)

So … θ/2.    (3-4)

Since g''(θ) = 1/… ,

We do the following experiment: generate 100 samples from Unif(0, 1) and calculate θ̂ using Newton's method. Let e_1 = |θ̂ - 1| and e_2 = |X_(n) - 1|, then compare e_1 and e_2. Repeating the same procedure 1000 times, we could see that more than 75% of the time e_1 < e_2.
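The mean-square-error claim for the exponential MLE (the sample mean) can be checked by simulation. This sketch only verifies E(θ̂_MLE - θ)² = θ²/n; it does not reproduce the KLD estimator itself, and the values θ = 2 and n = 20 are arbitrary choices for illustration.

```python
import random

random.seed(3)
theta, n, reps = 2.0, 20, 20000
sq_err = 0.0
for _ in range(reps):
    sample = [random.expovariate(1.0 / theta) for _ in range(n)]
    theta_hat = sum(sample) / n        # MLE of the mean parameter theta
    sq_err += (theta_hat - theta) ** 2
mse = sq_err / reps
print(mse, theta ** 2 / n)  # both near 0.2
```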
PAGE 27

It is well known that the Shannon entropy is the expectation of -log f(X); we call log f(X) the entropy variable. Since log f(X) is a random variable, it is natural to study some of its other statistics (i.e., expectation, variance, moment generating function, etc.). In this chapter, we first study the variance of log f(X), and call it Information Volatility, or IV. We look at the general properties of IV and its values for common distributions, for instance Uniform, Normal, Gamma, Beta, Weibull and so on. Then we study the distribution and the Laplace transform of log f(X).

Definition: Let X be a r.v. on some probability space (Ω, F, P) with density f(x). Define the Information Volatility of X as IV(X) = Var(log f(X)).

For the Normal distribution N(μ, σ²), H(X) = 1/2 + log(√(2π) σ). Notice that -log f(X) ≥ log(√(2π) σ). By the Chebyshev inequality, P[|-log f(X) - H(X)| < ε] ≥ 1 - IV(X)/ε².
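The IV values claimed for the constant-IV families can be confirmed by Monte Carlo. The sketch below uses assumed standard parameterizations, Unif(0, 1), N(0, 1), and Exp with mean 1, and estimates Var(log f(X)) from samples; the targets are 0, 1/2 and 1 respectively.

```python
import math
import random

def mc_iv(sample, logpdf):
    """Monte Carlo estimate of IV(X) = Var(log f(X))."""
    vals = [logpdf(x) for x in sample]
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

random.seed(1)
N = 200000
u = [random.random() for _ in range(N)]
g = [random.gauss(0, 1) for _ in range(N)]
e = [random.expovariate(1.0) for _ in range(N)]

print(mc_iv(u, lambda x: 0.0))                                         # Uniform: 0
print(mc_iv(g, lambda x: -0.5 * x * x - 0.5 * math.log(2 * math.pi)))  # Normal: 1/2
print(mc_iv(e, lambda x: -x))                                          # Exponential: 1
```

For the Gaussian case the target follows directly from Var(X²/2) = (1/4)·Var(χ²_1) = 1/2, matching the χ² argument used later for the multivariate normal.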
PAGE 28

Let X have the double exponential density f(x) = (1/(2λ)) e^{-|x|/λ}. It can be easily verified that IV(X) = 1. But |X| is exponentially distributed, so IV(|X|) = 1; hence IV(X) = IV(|X|).

The Shannon entropy is not invariant under affine transformations: it is shift invariant (Cover [5]) but not scale invariant. However, the following result shows that IV is invariant under affine transformations.

Proof. … |dφ^{-1}(y)/dy|. Without loss of generality, assume φ is increasing. Then

    E[log f_Y(Y)] = ∫ f_X(φ^{-1}(y)) (dφ^{-1}(y)/dy) log[f_X(φ^{-1}(y)) (dφ^{-1}(y)/dy)] dy.    (4-3)

Do the change of variable: let x = φ^{-1}(y), or φ(x) = y, so dx = (dφ^{-1}(y)/dy) dy and dφ^{-1}(y)/dy = dx/dy = 1/φ'(x). Then

    … = ∫ f_X(x) (log f_X(x) + log(1/φ'(x))) dx.

Hence

    IV(Y) = Var(log f_X(X) - log φ'(X));

for affine φ, φ' is constant, so IV(Y) = IV(X).
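The affine-invariance result can be illustrated numerically. For Y = aX + b with X ~ Exp(1), log f_Y(Y) = -log a - X differs from log f_X(X) = -X only by a constant, so the two variances agree. The values a = 3 and b = -7 below are arbitrary illustration choices.

```python
import math
import random

def var(vals):
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

random.seed(4)
x = [random.expovariate(1.0) for _ in range(100000)]
a, b = 3.0, -7.0
y = [a * xi + b for xi in x]

iv_x = var([-xi for xi in x])                          # log f_X(x) = -x
iv_y = var([-math.log(a) - (yi - b) / a for yi in y])  # log f_Y(y) = -log a - (y-b)/a
print(iv_x, iv_y)  # equal: IV is affine invariant
```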


=2Zflogflog0dxZflogfdxZflog0dx 061 =21 Solog0(x)=cforsomeconstantcbytheCauchy-Schwarzinequality,hence(x)isananefunction. Proof. [E(logg(Y))]2=[log1 =log1 =ZRn1 HenceVar(logg(Y))=E[(logf(X))2][E(logf(X))]2=Var(logf(X)). 29


Proof. The density of the n-dimensional normal N(μ, Σ) is

f(x) = (2π)^{−n/2} |Σ|^{−1/2} exp{−(1/2)(x − μ)ᵀ Σ⁻¹ (x − μ)}.

Since for any n-dimensional normal random vector X, (X − μ)ᵀ Σ⁻¹ (X − μ) ~ χ²_n,

IV(X) = (1/4) Var((X − μ)ᵀ Σ⁻¹ (X − μ)) = (1/4) · 2n = n/2.

For any invertible matrix A and vector b, AX + b ~ N(Aμ + b, AΣAᵀ), which is also an n-dimensional normal random vector. Hence IV(AX + b) = n/2 = IV(X).
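The value n/2 is easy to confirm by simulation in the simplest case μ = 0, Σ = I, where the quadratic form is ||X||² ~ χ²_n (dimension 5 is an arbitrary illustrative choice):

```python
import random

random.seed(4)
n_dim = 5
samples = 100_000

# log f(X) = const - ||X||^2 / 2, so IV = Var(||X||^2)/4 = (2 * n_dim)/4 = n_dim/2.
vals = []
for _ in range(samples):
    q = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(n_dim))   # ||X||^2 ~ chi^2_n
    vals.append(-q / 2)

m = sum(vals) / samples
iv = sum((v - m) ** 2 for v in vals) / (samples - 1)
print(round(iv, 2))   # about n_dim/2 = 2.5
```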


F(x)andlogf(x)=logr(x)+logF(x). (4{14) =E[(logfX;Y)2][E(logfX;Y)]2E[(logfXfY)2]+[E(logfXfY)]26[E(logfX;Y)]2+[E(logfXfY)]2=E(logfXfYfX;Y)E(logfXfYlogfX;Y)=[H(X)+H(Y)+H(X;Y)][H(X)+H(Y)H(X;Y)]60 Forthelaststep,weusedthepropertyoftheShannonentropyH(X)+H(Y)>H(X;Y)andtheassumptionH(X;Y)>0. (1)(Theassumptionisfalse)ThejointdensityofXandYisf(x;y)=x+y,06x61,06y61.ThemarginaldensitiesarefX(x)=x+1 2,06x61,andfY(y)=y+1 2,06y61. ItcanbenumericallyveriedthatE[(logfX;Y(X;Y))2]>E[(logfX(X)fY(Y))2]. 31


ItcanbenumericallyveriedthatE[(logfX;Y(X;Y))2]

(4{19) Proof. 7 ],Knessl[ 10 ],Worsch[ 16 ]). Proof. SinceX(n)!Xindistribution,thenlimn!1P(X(n)6x(n)i)P(X6x(n)i)=0. Denep(n)i=P(X(n)=x(n)i),sotheabovelimitcanberewrittenas limn!1iXj=1p(n)jZx(n)1f(x)dx+i1Xj=1Zx(n)j+1x(n)jf(x)dx!=0(4{21) 33


NextconstructasequencefX(n)gsuchthatlimn!1IV(X(n))=IV(X). ConsiderthesequencefX(n)gwithsupportfx(n)1;x(n)2;:::;x(n)mg,wherex(n)i

=IV(X) NextwestudysomespecialdistributionsandexploretheirconvergencepropertiesofIV. ni)=1


Let X_n take the values i/n with P(X_n = i/n) = p(1 − p)^{i−1}, i = 1, 2, ..., and let X ~ Exponential(λ) with density f(x) = (1/λ) e^{−x/λ}.

Proof. The Laplace transform of X_n at 1 is

L̃(X_n) = Σ_{i=1}^∞ e^{−i/n} p (1 − p)^{i−1} = p e^{−1/n} / (1 − e^{−1/n}(1 − p))

Let p = 1 − e^{−1/(λn)}; then L̃(X_n) → 1/(1 + λ) = L(X)(1) as n → ∞. (4-26)

Proposition. If P(X_n = i/n) = p(1 − p)^{i−1}, i = 1, 2, ..., then IV(X_n) = (log(1 − p))² (1 − p)/p².

Proof. Write p_i = p(1 − p)^{i−1}; the geometric index i has mean Σ_{i≥1} i p_i = 1/p and variance (1 − p)/p². Since IV(X_n) = Σ_{i≥1} p_i (log p_i)² − (Σ_{i≥1} p_i log p_i)², and log p_i = log p + (i − 1) log(1 − p) is an affine function of i,

IV(X_n) = (log(1 − p))² (1 − p)/p² (4-27)
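The closed form IV(X_n) = (log(1 − p))²(1 − p)/p² and its exponential limit can be checked numerically (a standard-library sketch; the direct summation stops once the terms underflow):

```python
import math

def iv_geometric_direct(p, max_terms=2_000_000):
    # IV = sum p_i (log p_i)^2 - (sum p_i log p_i)^2 with p_i = p(1-p)^(i-1).
    s1 = s2 = 0.0
    pi = p
    for _ in range(max_terms):
        if pi < 1e-300:
            break
        lg = math.log(pi)
        s1 += pi * lg
        s2 += pi * lg * lg
        pi *= 1.0 - p
    return s2 - s1 * s1

def iv_geometric_closed(p):
    return (math.log(1.0 - p)) ** 2 * (1.0 - p) / p ** 2

for p in (0.5, 0.1, 0.01):
    print(round(iv_geometric_direct(p), 6), round(iv_geometric_closed(p), 6))

print(round(iv_geometric_closed(1 / 1000), 3))  # about 1 = IV(Exponential), as p = 1/n -> 0
```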


=(logp p2IV(Xpn)=P1i=1pi(logpi)2(P1i=1pilogpi)2=(log(1p) Plugginginp=1 n)=p(1p)i1;i=1;2;:::.Xn!Xindistributionasn!1.limn!1IV(Xn)=IV(X),butlimn!1H(Xn)6=H(X). Proof. 11 TheShannonentropyofBinomial(n;p)distributionhasbeenstudiedthoroughlyintheliterature(Jacquet[ 7 ],Knessl[ 10 ],Worsch[ 16 ]).Hereweproposeanewsimpleprooftotheasymptoticapproximationoftheentropy.ThesamemethodleadstotheasymptoticapproximationofIV(Binomial(n,p)). Worsch[ 16 ]givesthefollowinglemma. 1+B(n;i)<(ni)<(1+)B(n;i)(4{29) i(ni)(n i)i(n ni)ni. 6Sn6np]g+o(ecn)(4{30) Proof. 6Sn6npg.ThenbytheChernobound,theprobabilityofAcngoestozeroexponentially,orP(Acn)=o(ecn),wherecisaconstantdependingonn. 37


wherecdependson.Forthesamereason, 6Sn6np]g+o(ecn)(4{32) wherecdependsonandm. 6Sn6npg,2(1;1+)forasmallconstant>0andcisaconstantdependingon. Proof. Given1<<1+and>0,forlargen,onthesetAn=fnp 6Sn6npg,wecanapplyLemma4.2.2. 1 1+B(n;Sn)<(nSn)<(1+)B(n;Sn)(4{34) So log1 1++logB(n;Sn)

2+O(1 logB(n;Sn) (4{39) =log1 2logn Sn(nSn)+Snlogn Sn+(nSn)logn nSn=log1 2logn+1 2log1 1Sn 2[logSn (4{40) =logp 2[logSn 2[logSn 1pSn 1plog(1Sn 1p] Letf(x)=logx+log(1x),itsT3Taylorestimationatcenterpis p(1p)(xp)1 2(1 (1p)2)(xp)2+1 3(1 (1p)3)(xp)3+O(xp)4 ;p]for2(1;1+). Bythesamemethod,theTaylorestimationofthefunctiong(x)=x p+1x 2p(1p)2(xp)2+2p1 6p2(1p)3(xp)3+O(xp)4 ;p]for2(1;1+). Takingx=Sn 2logp(1p)+12p 4(1 (1p)2)+n 6(1 (1p)3)+(2p1)n 39


TakingtheloganddierentiatingbothsidesentailsthatM0(t)=npM(t)Y(t)whereY(t)=et SinceY(0)=0,Y0(0)=1p,Y00(0)=(1p)(12p),andY(3)(0)=(1p)(16p+6p2),onehas Hence,E(Sn Sowehavethefollowing (4{41) =logp 2logp(1p)+[n 4(1 (1p)2)+n (4{42) =[n 2+O(1 ByLemma4.2.4,H(Sn)=logp 2+O(1 40


2,IV(Normal)=1 2,02p 2and02p Weneedthefollowingspecialfunctionsfortheproof.Hurwitzzetafunction:(s;) (k+)s(4{43) and d)m()=(d d)(m+1)log()(4{45)Thefollowingidentityholds: ()x1ex=,where06x<1;;>0,thenIV(X)=(1)2(1)()+2,where(1)isthePolygammafunction. Proof. ()+1 ()Z10x1exlogxdx(4{47) wheretheidentity(+1)=()wasused.Secondly, ())22log1 ()+2(1) ()log1 ()Z10x1exlogxdx +(1)2 ()Z10xex(logx)dx+(+1) 41


whichcanbeobtainedbythepartialintegrationwithrespecttoex.Witheverythingathand,itiseasytoshowthat ButVar(logX)=(d d)(2)log()=(1)(),thestatementfollows. TheabovelemmaprovidesuswithaniceformulaforIV(X)whenXbelongstotheGammafamily.Thenextpropositionsaysthatactually,thequantityIV(X)isabletoseparatetheentireGammafamilyfromtheNormalfamily.RecallthatIVofaNormalrandomvariableequals1 2. ()exx1x>0.ThenIV(X)>1 2andlim!1IV(X)=1 2 ThedensityfunctionofXnn Twofactsareneeded. 1).Sinceex=limn!1(1+x n)n,8">0,thereexistssucientlylargeNsuchthatjexp n)nj<",8n>N. 2).TheStirlingformula:n!=p limn!1p =limn!1(nn+1=2 n+1)n1exp n+1)n1(1xp n)n=limn!11 n+1)1=1
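The separation of the Gamma family from the Normal family (IV > 1/2, with limit 1/2 as α → ∞) can be illustrated numerically. The closed form used below, IV(Gamma(α,1)) = (α−1)²ψ′(α) + α − 2(α−1), is derived independently from Var(log X) = ψ′(α), Var(X) = α, and Cov(X, log X) = 1, rather than quoted from the text; `trigamma` is a small illustrative helper:

```python
import math
import random

def trigamma(a, terms=100_000):
    # psi'(a) = sum_{k>=0} 1/(a+k)^2, with an Euler-Maclaurin tail correction.
    s = sum(1.0 / (a + k) ** 2 for k in range(terms))
    t = a + terms
    return s + 1.0 / t + 1.0 / (2.0 * t * t)

def iv_gamma_closed(a):
    # Derived identity (see above), not the text's own display:
    # IV(Gamma(a,1)) = (a-1)^2 psi'(a) + a - 2(a-1).
    return (a - 1.0) ** 2 * trigamma(a) + a - 2.0 * (a - 1.0)

def iv_gamma_mc(a, n=200_000):
    # Monte Carlo Var(log f(X)) for Gamma(a, 1).
    random.seed(5)
    lg = math.lgamma(a)
    vals = [(a - 1.0) * math.log(x) - x - lg
            for x in (random.gammavariate(a, 1.0) for _ in range(n))]
    m = sum(vals) / n
    return sum((v - m) ** 2 for v in vals) / (n - 1)

print(round(iv_gamma_closed(1.0), 3))   # 1.0: Gamma(1) is the exponential law
print(round(iv_gamma_closed(50.0), 3))  # just above 0.5, approaching IV(Normal) = 1/2
print(round(iv_gamma_mc(2.0), 2), round(iv_gamma_closed(2.0), 2))
```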


2andlim!1IV(X)=1 2.SoIV(X)separatestheentireGammafamilyfromtheNormalfamily. Proof. d)(2)log()=(1)()=(2;)(4{52) Hence (4{53) =(1)21Xk=01 (k+)2+2 Since (x+)2dx<1Xk=01 (k+)21 2;8.It'strivialwhen62.When>2,since1 2=(1)2(2;)(1)+1 2 (4{56) =(1)2[(2;)+1 2(1)21 2(1)2+1Xk=11 (+k)21Xk=1Z+k+1+k1 Thefunction1 1 (+k)2>Z+k+1+k1 2[1 (+k)21 (+k+1)2](4{57) 43


(+k)2istheareaoftherectangularon(+k;+k+1)withheight1 (+k)2,R+k+1+k1 2[1 (+k)21 (+k+1)2]istheareaofthetriangleabovethecurvebutinsidetherectangular. Byaddinguptheinequalitiesforallk,onehas (+k)2>1Xk=1Z+k+1+k1 2(1)2(4{58) ItfollowsthatIV(X)1 2>08. Now,foranyn,letY=Xnn 2.ThereforeIV(X)!1 2as!1. 6;1). Proof. dIV(X)=2(1)(2;)+(1)2@(2;) (4{59) =2(1)(2;)2(1)2(3;)1=2(1)1Xk=0k+1 (k+)31 Proof. ItistrivialtoverifythatVar(logX(1X))=d2 SinceB(;)=()2 (4{60) =2(1)()4(1)(2)=2(2;)4(2;2) 44


HenceIV(X)=(1)2[2(2;)4(2;2)]. ThenextpropositionsaysthatIV(X)ofBeta(;)familyisactuallysmallerthan1 2whenisgreaterthanasmallnumber2p 2,8>2p 2. Proof. 2.Now,assume6=1,wewanttoshowthat2(2;)4(2;2)<1 2(1)2.Since 2(2;)4(2;2)=21Xk=01 (+k)241Xk=01 (2+k)2 =21Xk=01 (+k)21Xk=01 (+k=2)2=1Xk=01 (+k)21Xk=01 (+k+0:5)2<1 2(1)2entailsthat2(2p NextshowthatIV(X)<1 2for>2+p (+k)2=1 (+k)2 =1 (+k1)(+k)261 (+k)361 2(+1)2+1 2(+1)3) also, (+k+0:5)2>1 2(+0:5)2 45


(+k)21Xk=01 (+k+0:5)21 2(1)2 2(+1)2+1 2(+1)3)(1 2(+0:5)2)1 2(1)2=P() whereP()=69524:54123+5:22+6+1,Q()=4(1)22(+0:5)2(+1)3.Sowehave 2(4{65) SinceQ()>0;8,butP0()<0for>1andP(1)<0,soP() 2for>1.ButwehaveshownthatIV(X)<1 2for2(2p 2,8>2p Since(1)2P() 2. CombiningprevioustwotheoremsgivesusaproveofTheorem4.3.1.NextwecomputeIVofothercommondistributions. 2+2. Proof. Integratingusingthechangeofthevariablesentailsthat 4+2 2 soIV(X)=1 2+2,theclaimfollows.


x1ex=,06x<1,>0and>0.ThenIV(X;;)=(11 Proof. SinceR10eylog(y)2dy=2 (4{68) Bytakingthederivative,theminimumvalueofIV(X)isobtainedat=2=6 Consideranexamplewheref(x)isanone-to-onefunction. 1lnZ1f(x)dx(4{69)


Thetwo-sidedLaplacetransformof-logf(X)isgivenby Noticethefollowingtwoobservations. (1).TheRenyi'smeasureofentropyis 1lnZ1f(x)dx=1 1lnL(1)(4{71) (2).WhenXispositiveands>0,Hardy'sinequalitygivesalowerboundforL(s)intermsofthedistributionofX. s+1]s+1Z10[F(x) ThefollowingpropositiongivesalowerboundfortheFisherinformationintermsoftheLaplacetransform. @logf(X;))2]>n @L(s;)]2 @logf(X;))2]=nE[(@ @logf(X;))2],whereX=X1.Ontheotherhand, @L(s;)=(s+1)Z1f(x;)s+1[@ @logf(x;)]dx =(s+1)Eff(X;)s[@ @logf(X;)]g6(s+1)fE[f(X;)2s]E[(@ @logf(X;))2]g1=2 @logf(X;))2]>1 (s+1)2[@ @L(s;)]2 48


Conditional Value-at-Risk (CVaR; Rockafellar and Uryasev [14]) is a risk measure. For continuous variables, the α-CVaR of a r.v. Y with distribution F_Y is simply the expectation of Y conditioning on Y ≥ VaR_α, where VaR_α = F_Y⁻¹(α). So CVaR_α(Y) = E[Y | Y ≥ VaR_α]. For our purpose, given a random variable X with density f, suppose we only care about the small probability events {ω : P(X(ω) ∈ (x, x + ε)) ≤ s}.

Consider a specific example. Let X ~ exp(λ); the density function is f_X(x; λ) = (1/λ) e^{−x/λ}, so −log f(X) = log λ + X/λ with X/λ ~ exp(1). Writing v = VaR_α(−log f(X)) = log λ − log(1 − α),

CVaR_α(−log f(X)) = v + 1 = log λ + 1 − log(1 − α) = H(X) − log(1 − α) (4-76)


Generalized Gaussian distribution is commonly used to model the noise in engineering (Dominguez-Molina et al. [6]). Lutwak et al. [11] showed that the probability distributions that have maximal Renyi entropy with given generalized Fisher information are the generalized Gaussian.

We first show that the generalized Gaussian density is one of the maximal entropy distributions. Let Y have the generalized Gaussian density with shape p and scale

A(p, σ) = [σ² Γ(1/p)/Γ(3/p)]^{1/2} (5-1)

g(x) = (1/(2 Γ(1 + 1/p) A(p, σ))) exp{−|x/A(p, σ)|^p} (5-2)

By Dominguez-Molina et al. [6], E|Y|^p = A(p, σ)^p/p, so the Shannon entropy of Y is H(Y) = log(2 Γ(1 + 1/p) A(p, σ)) + 1/p.

Now assume X has the density f; by the log-sum inequality,

∫ f log(f/g) ≥ (∫ f) log(∫ f / ∫ g) = 0 (5-5)

So ∫ f log f ≥ ∫ f log g, or H(X) ≤ − ∫ f log g, where H(X) is the Shannon entropy of X.


So

− ∫ f log g = ∫ f(x) log(2 Γ(1 + 1/p) A(p, σ)) dx + ∫ f(x) |x/A(p, σ)|^p dx = log(2 Γ(1 + 1/p) A(p, σ)) + (1/A(p, σ)^p) E_f|X|^p

In particular, among densities with E_f|X|^p = A(p, σ)^p/p, the bound equals H(Y), so the generalized Gaussian is the maximal entropy distribution under this moment constraint.

The generalized Gaussian distribution is commonly used by the engineers to model the noise. However, it is not easy to get a good estimator for the shape parameter p. Here we develop an alternative method to estimate p using IV.

Lemma 5.0.1. Let X have the generalized Gaussian density

f(x) = (1/(2 Γ(1 + 1/p) A(p, σ))) exp{−|(x − μ)/A(p, σ)|^p} (5-8)

where A(p, σ) = [σ² Γ(1/p)/Γ(3/p)]^{1/2}. Without loss of generality, assume μ = 0. Then IV(X) = 1/p.

Proof. By Dominguez-Molina et al. [6], E|X|^p = A(p, σ)^p/p and E|X|^{2p} = (1 + 1/p) A(p, σ)^{2p}/p, (5-9)

so Var(|X|^p) = p⁻¹ A(p, σ)^{2p}, hence Var(log f(X)) = Var(|X/A(p, σ)|^p) = p⁻¹.

Lemma 5.0.1 can be used to estimate the shape parameter p, denoted by p̂. Given i.i.d. samples x_1, x_2, ..., x_n, assume they are from the generalized Gaussian distribution with unknown parameter p. The estimation procedure is as follows:

1. Estimate the density using the Parzen window method. Assume the estimated density is φ(x), the usual kernel average φ(x) = (1/n) Σ_{i=1}^n K_h(x − x_i).
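The identity IV = 1/p can be spot-checked at the two shape values with familiar samplers, p = 2 (Gaussian) and p = 1 (Laplace), where log f reduces to a quadratic and an absolute value respectively (a standard-library sketch):

```python
import random

def sample_var(vals):
    n = len(vals)
    m = sum(vals) / n
    return sum((v - m) ** 2 for v in vals) / (n - 1)

random.seed(6)
N = 200_000

# p = 2: the generalized Gaussian is the Normal; log f(X) = const - X^2/2 for N(0,1).
iv_p2 = sample_var([-g * g / 2 for g in (random.gauss(0.0, 1.0) for _ in range(N))])

# p = 1: the generalized Gaussian is the Laplace; log f(X) = const - |X| at unit
# scale, and |X| ~ Exp(1), so we may sample |X| directly.
iv_p1 = sample_var([-random.expovariate(1.0) for _ in range(N)])

print(round(iv_p2, 2))  # about 1/2 = 1/p with p = 2
print(round(iv_p1, 2))  # about 1   = 1/p with p = 1
```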


3.TheestimatorofIVis^IV(X)=1 OnepropertiesofGG. Proof. 2(1+1=p)[2(1=p) (3=p)]1=2expfjx (3=p)]1=2jpg(5{11) TheRenyi'squadraticentropyofXis 2log(3=p)+3 2log(1=p)(5{12) WanttoshowR(X;p)isnotmonotoneinp.Consider@R(X;p) 2p20(3=p) (3=p)3 2p20(1=p) (1=p) (5{13) By[ 4 ](page179), 0(z) (z)=1 WhereistheEulerConstant.Theabovederivativecanbewrittenas 21Xn=1(1 Where'(p)=log2+3 2P1n=1(1 Case1:limp!1'(p)=log2. Since06P1n=1(1 (n+1=p)(n+3=p)62 Case2:limp!0+'(p)=1 2log27 4. 52


=1Xn=1(1 Bythedominatedconvergenttheorem,limp!0+P1n=1Rn+1n1 Thesameargumentcanbeappliedtoshowlimp!0+P1n=1Rn+1n1 limp!0+'(p)=limp!0+log2+3 2logp+3 (5{17) 3 2[1Xn=1(1 2log3=1 2log27 4'(p)takesthepositivevalueat0+andthenegativevalueatinnity.HenceR(X;p)isnotmonotoneinp. WanttoshowthatR(X;p)onlyhasoneglobalmaximum. @p. @p=3 2p2[1Xn=11 (n+1=p)21Xn=13 (n+3=p)2] (5{18) LetA=P1n=11 (n+1=p)2R111 (x+1=p)2dx,andB=R103 (x+3=p)2dxP1n=13 (n+3=p)2. NoticethatbothAandBareincreasingfunctionsofp. (n+1=p)21 (x+1=p)2dx =1Xn=1Zn+1n[1


(x+1=p)21 (n+1=p)2dx =31Xn=1Znn1[1 Butthen @p=3 2p2[Z111 (x+1=p)2dx+AZ103 (x+3=p)2dx+B] (5{21) =3 2p2[p2 Sincep2 @p=0hasatmostonesolution,hence'(p)cancrossxaxisonlyonce,or'(p)=0onlyhasonesolution.Itfollowsthatthestatementholds. EjXjp6r By[ 4 ](page179), 0(z) (z)=1 54


(1=p)+2log(p)+2log(r)1] (5{24) =1 1=p+n)+2log(p)+2log(r)1]=1 1=p+n)isadecreasingfunctionofpandlog(p)isanincreasingfunctionofp.Since'(0+)=and'(1)=1,'(p)=0hasanuniquesolution,denotedbyp. Since@H(X;p) (1=p)]1=2(5{25) (,p)istheoptimalsolutiontothestatedmaximizationproblem. AnotherpossibleapplicationofgeneralizedGaussiandistributionistheestimationofthedensityusingtheparzonwindowmethod.Supposetheempiricalhistogramestimatedusingadatasetofsizenisnandthetruedensityis.Assumehasatmostoneinectionpoint.Sincewhentheshapeparameterp>1 2,thegeneralizedGaussiandensitygg;pbecomeatter,sothenormjngg;pj2>jngg;1=2j2. 55


The main part of this research studies Cumulative Residual Entropy (CRE) and Information Volatility (IV). Both CRE and IV entropy measures are able to improve the classical Shannon entropy in certain ways.

The Cumulative Residual Entropy gives a more general definition of the entropy using the distributions. The estimation of CRE is much simpler than the estimation of the Shannon entropy since it is much easier to estimate the empirical distributions than the densities. We generalize the original definition of CRE to the real line and study its new properties. A new proof of the Weak Law of Large Numbers is examined.

The Information Volatility examines the variance of the information flow (i.e., the variance of −log f(X)), where X is a random variable with density function f(x). Recall that the Shannon entropy is the expectation of −log f(X). Unfortunately there does not seem to be any way of estimating this quantity from the Shannon entropy of the empirical distribution of the sample. Thus in this sense the mean of −log f(X) is not a "good" statistic. However, we found that its variance IV has very interesting properties. IV equaling zero characterizes the Uniform distribution; IV is able to separate common probability distributions; IV is translation invariant; for a vector variable, IV is invariant under affine transformations. Moreover, IV has a good weak convergence property which the Shannon entropy does not have. We show that one can always find a sequence X_n which converges to X in distribution with IV(X_n) converging to IV(X), but for the Shannon entropy this is never the case. For instance, IV of Binomial(n, p) converges to IV of Normal as n goes to infinity, and IV of Geometric(1/n) converges to IV of the Exponential.

We also study the parameter estimations using distributions and explore the properties of the generalized Gaussian family. Further research can be done in these areas.


[1] M. S. Bazaraa, J. J. Jarvis, H. D. Sherali, Linear Programming and Network Flows. New York: Wiley, 2005.

[2] P. Billingsley, Probability and Measure. New York: John Wiley & Sons, Inc., 1995.

[3] G. Casella, R. L. Berger, Statistical Inference. Pacific Grove: Duxbury, 2002.

[4] J. B. Conway, Functions of One Complex Variable. New York: Springer, 1989.

[5] T. M. Cover, J. A. Thomas, Elements of Information Theory. New York: Wiley, 1991.

[6] J. A. Dominguez-Molina, G. Gonzalez-Farias, R. M. Rodriguez-Dagnino, "A practical procedure to estimate the shape parameter in the generalized Gaussian distribution," preprint, 2001.

[7] P. Jacquet, W. Szpankowski, "Entropy computations via analytic depoissonization," IEEE Transactions on Information Theory, vol. 45, pp. 1072-1081, 1999.

[8] O. Johnson, "A conditional entropy power inequality for dependent variables," IEEE Transactions on Information Theory, vol. 50, pp. 1581-1583, 2004.

[9] J. N. Kapur, H. K. Kesavan, Entropy Optimization Principles with Applications. Boston: Academic Press, 1992.

[10] C. Knessl, "Integral representations and asymptotic expansions for Shannon and Renyi entropies," Applied Mathematics Letters, vol. 11, pp. 69-74, 1998.

[11] E. Lutwak, D. Yang, G. Zhang, "Cramer-Rao and moment-entropy inequalities for Renyi entropy and generalized Fisher information," IEEE Transactions on Information Theory, vol. 51, pp. 473-478, 2005.

[12] M. Rao, "More on a new concept of entropy and information," Journal of Theoretical Probability, vol. 18, pp. 967-981, 2005.

[13] M. Rao, Y. Chen, B. C. Vemuri, F. Wang, "Cumulative residual entropy: a new measure of information," IEEE Transactions on Information Theory, vol. 50, pp. 1220-1228, 2003.

[14] R. T. Rockafellar, S. Uryasev, "Optimization of conditional value-at-risk," Journal of Risk, vol. 2, pp. 21-41, 2000.

[15] A. J. Stam, "Some inequalities satisfied by the quantities of information of Fisher and Shannon," Information and Control, vol. 2, pp. 101-112, 1959.

[16] T. Worsch, "Lower and upper bounds for (sums of) binomial coefficients," Technical Report 31/94, Universitat Karlsruhe, Fakultat fur Informatik, 1994.


Dr. Juan Liu was born in Shaanxi province, China in 1979. From 1992 to 1998, Juan studied in the No. 9 high school in the city of Kelamayi in Xinjiang province. From 1998 to 2002, she continued her education at the Department of the Mathematical Sciences at Nankai University, which conferred a Bachelor's degree in applied mathematics on her. From 2002 to 2007, Juan studied in the Graduate School of the University of Florida, which conferred a Ph.D. degree in applied mathematics and a Master's degree in operations research on her.


Permanent Link: http://ufdc.ufl.edu/UFE0019613/00001

Material Information

Title: Information Theoretic Content and Probability
Physical Description: Mixed Material
Copyright Date: 2008

Record Information

Source Institution: University of Florida
Holding Location: University of Florida
Rights Management: All rights reserved by the source institution and holding location.
System ID: UFE0019613:00001






INFORMATION THEORETIC CONTENT AND PROBABILITY


By
JUAN LIU





















A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2007



































© 2007 Juan Liu



































To my Family








ACKNOWLEDGMENTS

I am indebted to Prof. Murali Rao for his encouragement and guidance throughout

my study in the graduate school, to Prof. F. AitSahlia, Prof. Y. Chen, Prof. B. Mair and

Prof. L. Shen for their interest in my dissertation, and to the faculty of the Mathematics

Department for their hospitality.

Last, but not least, I would like to thank Dr. S. A. Melikhov for stimulating discussions and

support.









TABLE OF CONTENTS

page

ACKNOWLEDGMENTS .................................................. 4

ABSTRACT ......................................................... 6

CHAPTER

1 INTRODUCTION .................................................. 8

2 CUMULATIVE RESIDUAL ENTROPY ................................. 11

2.1 More Results about Cumulative Residual Entropy ......... 11
2.2 Generalization .......................................... 16
2.3 Cumulative Residual Entropy and Weak Law of Large Numbers 20
2.4 Cumulative Residual Entropy and Independence ............ 23

3 KULLBACK-LEIBLER DIVERGENCE OF DISTRIBUTIONS ............... 25

4 ENTROPY VARIABLE ............................................ 27

4.1 Information Volatility .................................. 27
4.2 Weak Convergence ........................................ 33
4.3 Applications of Information Volatility .................. 40
4.4 Distribution and Laplace Transform of the Information Variable 47

5 GENERALIZED GAUSSIAN FAMILY ................................ 50

6 SUMMARY ..................................................... 56

REFERENCES ...................................................... 57

BIOGRAPHICAL SKETCH ............................................. 58








Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy

INFORMATION THEORETIC CONTENT AND PROBABILITY

By

Juan Liu

May 2007

Chair: Murali Rao
Major: Mathematics

The first part of the research studies the information content of random variables with

distributions. Given a positive random variable X with distribution F(x), its information

content can be measured using the Cumulative Residual entropy(CRE). In this study, we

extend the original definition of CRE and obtain results for arbitrary random variables. As

an application, we are close to finding a new proof of Weak Law of Large Numbers using CRE

just as the Shannon entropy was used to prove the Central Limit Theorem. We also introduce

a measure of independence between two random variables using CRE. At last, we study the

Kullback-Leibler divergence between two empirical distributions and apply it to the parameter

estimates.

The second part studies the random variables with density functions. Given the random

variable X with the density function f(x). The Shannon entropy of X is the expectation of

- log f(X) which itself is also a random variable. In this research work, we found that the vari-

ance of log f (X) (we call it Information Volatility or IV) has very interesting properties: 1.

IV equaling zero characterizes the Uniform distribution; 2. IV is able to separate the Uniform,

Gaussian, Gamma and a subfamily of Beta distributions; 3. IV is translation invariant; for

a vector variable, IV is invariant under affine transformations. Moreover, IV has good weak

convergence property which the Shannon entropy usually does not have. For instance, IV of

Binomial(n, p) converges to IV of Normal as n goes to infinity. In this paper we also propose a

new proof of the asymptotic approximation of the Shannon entropy of Binomial(n, p) and prove

that the discrete Shannon entropy will never converge to its continuous analogue but IV will con-

verge. Generalized Gaussian(GG) density is one of the maximal entropy distributions. In this








research, we show that given the variance and the Renyi's quadratic or the Shannon entropy, we

have at most two generalized Gaussian distributions with the given variance and entropy. We

also propose that GG can be used to do density estimation using the Parzen window method. We

prove that IV of GG is exactly the reciprocal of its shape parameter p, which gives us a new

method to estimate p.








CHAPTER 1
INTRODUCTION

The discipline of the information theory was established and brought to the worldwide

attention by Shannon in his landmark paper "A Mathematical Theory of Communication" in

the Bell System Technical Journal in 1948. The information theory is based on the probability

theory and statistics. The most important quantity of information theory is the Entropy which

is a functional of probability distributions, measuring the uncertainty in a random variable or

vector. For a random variable X on some probability space (Ω, F, P), the Shannon entropy is

defined as

H(X) = − ∫ f(x) log f(x) dx (1-1)

where f is the density if X is continuous, probability mass function if X is discrete. The

Shannon entropy plays an important role in various fields (Cover [5]).

However, the Shannon entropy has certain disadvantages. First, it requires the knowledge of

density function for non-discrete random variables. Second, the discrete Shannon entropy does

not converge to its continuous analogue. Third, in order to estimate the Shannon entropy for

a continuous density, one has to obtain the density estimation, which is not a trivial task. In

2005, Rao, Chen, et al. [13] developed another new measure of randomness called Cumulative

Residual Entropy (CRE), which is defined using distributions rather than densities. The CRE of

a positive random variable X with distribution F(x) is


S(X) = − ∫₀^∞ P(X > x) log P(X > x) dx (1-2)


CRE has advantages over the Shannon entropy in the sense that its definition is valid for

both continuous and discrete cases and the sample CRE converges to true CRE as sample

size increases. In this article, we first develop more results about CRE. Since in most cases,

we need to have negative random variables, we then extend the original definition of CRE

to an arbitrary random variable X. Besides the generalization of theorems in Rao [13], we

obtain some new results. As an application, we are close to finding a new proof of Weak Law

of Large Numbers using CRE just as the Shannon entropy was used to prove the Central

Limit Theorem. At last, we derived a new measure of independence using CRE, which can be








used to measure the independence between two r.v.'s without looking at the correlation. The

simulation results show that given sample points, the CRE measure is asymptotically better than

the correlation measure, especially when the sample size is large.

Shannon's Kullback-Leibler divergence (KL-divergence, Kapur [9]) can be used to measure

the distance between two densities. Since it is much easier to estimate empirical distribution

than the density, we study the KL divergence of distributions denoted by KLD, and use it to

obtain parameter estimators. For the exponential distribution with parameter θ, we can see the

new KLD estimator of θ is unique, and its square is an unbiased estimator of θ². For Uniform

distribution, the interval estimation based on KLD has smaller error than the maximum

likelihood estimator.

Notice that the Shannon entropy of a r.v. X with density f(x) is the expectation of − log f(X)

which itself is a random variable. The expectation of a random variable is a very useful statistic.

In statistical applications the distribution of a random variable is usually unknown. Two of

the most important statistics of this distribution that one wants to estimate are its mean and

variance. And the only way to do this is to use data: mean and variance of the data are good

estimates of the unknowns. Since log f (X) is also a random variable, it is natural to look at

its other statistics, for example variance, higher moments, distribution etc.

In this research we first study the variance of log f(X), call it Information Volatility(IV).

We know that the Shannon entropy is shift invariant(Cover [5]), which means the entropy of the

event X + b for some constant b is the same as that of X. IV goes one step further in the sense

that it is invariant under linear transformations (the Shannon entropy is not scale invariant),

which means IV of the random variable aX + b is the same as IV of X. For a random vector,

IV is invariant under affine transformations. This property implies that IV is independent of

the mean and the variance of a random variable while Shannon is only independent of the mean.

Therefore, if a distribution family can be described solely by the mean and the variance, then it

has a constant IV value for the whole family. For example, IV of Uniform distributions is zero,

IV of Gaussian distributions is 1/2, IV of Exponential distributions is 1, and so on.
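These family-wise constants follow directly from the densities; for instance:

```latex
% Exponential(\lambda): f(x) = \lambda^{-1} e^{-x/\lambda}, so -\log f(X) = \log\lambda + X/\lambda
\mathrm{IV}(X) = \operatorname{Var}(X)/\lambda^{2} = \lambda^{2}/\lambda^{2} = 1.

% Normal(\mu,\sigma^2): -\log f(X) = \tfrac{1}{2}\log(2\pi\sigma^{2}) + (X-\mu)^{2}/(2\sigma^{2})
\mathrm{IV}(X) = \tfrac{1}{4}\operatorname{Var}\!\bigl((X-\mu)^{2}/\sigma^{2}\bigr)
              = \tfrac{1}{4}\operatorname{Var}(\chi^{2}_{1}) = \tfrac{1}{2}.
```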

One of the interpretations of the Shannon entropy is that it measures "information".

Unfortunately there does not seem to be any way of estimating this quantity from the Shannon








entropy of the empirical distribution of the sample. By this we mean it is not possible to find

discrete distributions converging weakly to the distribution in such a way that the corresponding

Shannon entropies converge. Thus in this sense the mean of log f(X) is not a "good" statistic.

However, we will show that IV has this convergence property. We prove some special cases:

IV of Geometric(1/n) converges to IV of the Exponential and IV of Binomial(n, p) converges

to IV of the Normal family as n goes to infinity. In this article we also propose a new proof of the

asymptotic approximation of the Shannon entropy of Binomial(n, p) (Jacquet [7], Knessl [10],

Worsch [16]).

The other good property about the Shannon entropy is that maximizing Shannon entropy

with different constraints characterizes Uniform, Exponential, Gaussian and some other common

distributions (Kapur [9]). Actually, it will be shown that IV equaling zero characterizes

the Uniform distributions and without any constraints, IV is able to separate the Uniform,

Gaussian, Gamma and a subfamily of Beta distributions.

The moment generating function is also of interest. As a matter of fact, we show that many

other measures of entropy (such as Renyi) can be expressed in terms of the Laplace transform

of log f (X). Moreover, a lower bound for the Fisher information in terms of the Laplace

transform of log f(X) can be found and an upper bound of the distribution of log f (X) in

terms of the Renyi entropy is obtained. Some brief discussions about Conditional-Value-at-

Risk of log f(X) are included at the end.

Generalized Gaussian density is one of the maximal entropy distributions. It is widely used

in engineering to model noise (Dominguez-Molina et al. [6]). Lutwak et al. [11] shows that

the probability distributions that have maximal Renyi entropy with given generalized Fisher

information are the generalized Gaussian. In this article, we show that given the variance

and the Renyi's quadratic or the Shannon entropy, we have at most two generalized Gaussian

distributions with the given variance and entropy. We also propose that GG can be used to

do density estimation using the Parzen window method. We prove that IV of GG is exactly the

reciprocal of its shape parameter p, which gives us a new method to estimate p.

The rest of this work is organized as follows: Chapter 2 studies CRE, Chapter 3 is about KL-divergence of

distributions, Chapter 4 considers − log f(X), and at last the generalized Gaussian is studied in Chapter 5.







CHAPTER 2
CUMULATIVE RESIDUAL ENTROPY

In this chapter, we study the Cumulative Residual Entropy (CRE) which is a functional of

the distribution. CRE was first introduced by Rao, Chen, et al. [13] and developed further

by Rao [12]. CRE has many good properties. First, its definition is valid in the continuous
and discrete cases which is more general than the Shannon entropy; second, it has more general

mathematical properties than the Shannon entropy; and at last, it can be easily estimated from

sample data and the estimation asymptotically converges to the true value.

Originally, CRE is defined for a positive random vector X in R₊^N by

S(X) = − ∫_{R₊^N} P(|X| > x) log P(|X| > x) dx (2-1)

where X = (X_1, X_2, ..., X_N), x = (x_1, ..., x_N), |X| > x means |X_i| > x_i for each i, and R₊^N = {x ∈ R^N : x_i ≥ 0}.
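For a scalar positive sample, the CRE in Eq. 2-1 can be estimated by plugging in the empirical survival function, which is constant between order statistics. A standard-library sketch (for the exponential law with mean θ one can check that the true value is S(X) = θ, used here as a sanity check):

```python
import math
import random

def cre_empirical(sample):
    # S(X) = -∫ P(X > x) log P(X > x) dx with the empirical survival
    # function, which equals (n - 1 - i)/n on [x_(i), x_(i+1)).
    xs = sorted(sample)
    n = len(xs)
    s = 0.0
    for i in range(n - 1):
        surv = (n - 1 - i) / n
        if surv > 0.0:
            s -= (xs[i + 1] - xs[i]) * surv * math.log(surv)
    return s

random.seed(7)
theta = 2.0
sample = [random.expovariate(1 / theta) for _ in range(50_000)]
print(round(cre_empirical(sample), 2))  # close to theta = 2.0
```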


We first derive more results of CRE based on this definition, then we generalize the

definition to arbitrary random variables and prove some new properties. The third part is about

CRE and the Weak Law of Large Numbers. At last, we show an application of CRE to the

independence test.
2.1 More Results about Cumulative Residual Entropy

In this section, we study more properties of CRE defined for positive random vectors. The

first theorem tells us CRE(X) + E(X) is an increasing function of X.

Theorem 2.1.1. Let X and Y be positive random variables. If X ≥ Y ≥ 0, then

S(X) + E(X) ≥ S(Y) + E(Y).

Proof. Let F(t) = P(Y > t) and G(t) = P(X > t). By the log-sum inequality and the inequality
x log(x/y) ≥ x − y for nonnegative x and y,

∫₀^∞ F(t) log (F(t)/G(t)) dt ≥ E(Y) log (E(Y)/E(X)) ≥ E(Y) − E(X) (2-2)

Hence −S(Y) = ∫₀^∞ F(t) log F(t) dt ≥ ∫₀^∞ F(t) log G(t) dt + E(Y) − E(X) ≥ ∫₀^∞ G(t) log G(t) dt + E(Y) − E(X), where the last step uses F ≤ G and log G ≤ 0. So

S(X) + E(X) ≥ S(Y) + E(Y). □

Next, we prove a sufficient and necessary condition for the convergence of S(X).







Lemma 2.1.1. Let X be a positive random vector. Then S(X) < ∞ if for some decreasing
G ∈ [0, 1] and some p > 1, ∫ G^p < ∞ and −∫ F log G < ∞, where F(t) = P(X > t).

Proof. Since log u ≤ u − 1 for u > 0,

−F log F = F log(G^p/F) − p F log G ≤ (G^p − F) − p F log G (2-3)

Integrating over t,

S(X) = − ∫ F log F ≤ ∫ G^p − ∫ F − p ∫ F log G (2-4)

Thus if ∫ G^p < ∞ and −∫ F log G < ∞, then S(X) = −∫ F log F < ∞. □

Theorem 2.1.2. For a positive random vector X, S(X) < ∞ if and only if

∫₀^∞ log(1 + t) F(t) dt < ∞

Proof. Assume S(X) < ∞; then E(X) < ∞. By Markov's inequality P(X > t) ≤ E(X)/t, thus
log P(X > t) ≤ log E(X) − log t, or |log P(X > t)| ≥ log t − log E(X).
Multiplying both sides by F(t) and integrating with respect to t entails that

∫ F(t) |log F(t)| dt ≥ − E(X) log E(X) + ∫₁^∞ F(t) log t dt (2-5)

Since ∫₀¹ F(t) log(1/t) dt ≤ ∫₀¹ |log t| dt < ∞, S(X) < ∞ implies that ∫₀^∞ log(1 + t) F(t) dt < ∞.
Hence, S(X) < ∞ if and only if ∫₀^∞ log(1 + t) F(t) dt < ∞: the converse follows from the previous lemma. □

The next theorem studies CRE of the sum of two independent discrete random variables. If
we interpret S as "information", then the theorem simply says that adding two independent
discrete r.v.s increases the "information" content.

Theorem 2.1.3. For any independent discrete r.v.s X and Y,

max(S(X), S(Y)) ≤ S(X + Y) (2-6)

Proof. Let X and Y be non-negative and independent discrete r.v.'s. Assume X takes the
values x_1 < x_2 < ... < x_n with probabilities p_1, ..., p_n and Y takes the values
y_1 < y_2 < ... < y_m with probabilities q_1, ..., q_m. Also assume X + Y ∈ {z_1, z_2, ..., z_N},
where z_1 < z_2 < ... < z_N.

Since x log x is convex, by Jensen's inequality, ∀ z_j,

P(X + Y > z_j) log P(X + Y > z_j) = (Σ_{i=1}^m P(X > z_j − y_i) q_i) log (Σ_{i=1}^m P(X > z_j − y_i) q_i)

≤ Σ_{i=1}^m P(X > z_j − y_i) log P(X > z_j − y_i) q_i (2-7)

Multiply both sides by −Δz_j = −(z_{j+1} − z_j) and sum up with respect to j; we get

− Σ_{j=1}^N Δz_j P(X + Y > z_j) log P(X + Y > z_j) ≥ Σ_{j=1}^N Δz_j Σ_{i=1}^m P(X > z_j − y_i) |log P(X > z_j − y_i)| q_i (2-8)

So we have

S(X + Y) ≥ Σ_{j=1}^N Σ_{i=1}^m Δz_j P(X > z_j − y_i) |log P(X > z_j − y_i)| q_i (2-9)

= Σ_{i=1}^m q_i Σ_{j=1}^N Δz_j P(X > z_j − y_i) |log P(X > z_j − y_i)|

For each y_i, there exists a subsequence {z_{j_k}}_{k=1}^n of the grid {z_j}_{j=1}^N s.t. z_{j_k} − y_i = x_k.
Now fix k; for z_{j_k} ≤ z_j < z_{j_{k+1}} we have P(X > z_j − y_i) = P(X > x_k), and the increments Δz_j between z_{j_k} and z_{j_{k+1}} telescope to x_{k+1} − x_k. Thus the inner summation of Eq. 2-9 can be simplified as

Σ_{j=1}^N Δz_j P(X > z_j − y_i) |log P(X > z_j − y_i)| = − Σ_k (x_{k+1} − x_k) P(X > x_k) log P(X > x_k) = S(X) (2-10)

Hence Eq. 2-9 leads to

S(X + Y) ≥ Σ_{i=1}^m q_i S(X) = S(X) (2-11)

The same arguments can be applied to S(Y). □


Stam [15] gives a new proof of the Shannon entropy power inequality. We obtain an analogous
one for CRE.

Theorem 2.1.4. For any nonnegative random variables X and Y,

S(X + Y) ≤ S(X) + S(Y) + E(φ(X, Y)) (2-12)

where φ(x, y) = − log [P(Y > y|X > x)^x P(X > x|Y > y)^y].

Furthermore, writing Z = X + Y and z = x + y, if F_Z(z) e^{−z r_Z(z)} ≥ max{F_X(x) e^{−x r_X(x)}, F_Y(y) e^{−y r_Y(y)}}, then

S(X + Y) ≤ S(X) + S(Y) (2-13)

where F denotes the survival function and r(·) is the hazard rate.

Proof. By Rao [12], S(X) = −E[X + X log F_X(X)], where F_X(t) = P(X > t). So

S(X) + S(Y) − S(X + Y)

= E[(X + Y) log F_{X+Y}(X + Y) − X log F_X(X) − Y log F_Y(Y)] (2-14)

= E[X log (F_{X+Y}(X + Y)/F_X(X)) + Y log (F_{X+Y}(X + Y)/F_Y(Y))]

Let

ψ(x, y) = x log (F_{X+Y}(x + y)/F_X(x)) + y log (F_{X+Y}(x + y)/F_Y(y)) (2-15)

Since F_{X+Y}(x + y) = P(X + Y > x + y) ≥ P(X > x, Y > y),

F_{X+Y}(x + y)/F_X(x) ≥ P(X > x, Y > y)/P(X > x) = P(Y > y|X > x) (2-16)

and likewise F_{X+Y}(x + y)/F_Y(y) ≥ P(X > x|Y > y). Hence

ψ(x, y) ≥ x log P(Y > y|X > x) + y log P(X > x|Y > y)

= log P(Y > y|X > x)^x P(X > x|Y > y)^y

= − φ(x, y)

Taking the expectation entails the claim.

Furthermore, write Z = X + Y and z = x + y. For any fixed y > 0,

∂ψ/∂x (x, y) = log (F_Z(z)/F_X(x)) + x r_X(x) − z r_Z(z) (2-17)

So ∂ψ/∂x ≥ 0 if and only if F_Z(z) e^{−z r_Z(z)} ≥ F_X(x) e^{−x r_X(x)}, in which case ψ(x, y) is increasing in x, so

ψ(x, y) ≥ ψ(0, y) ≥ 0.

By the symmetry of ψ(x, y), one has if F_Z(z) e^{−z r_Z(z)} ≥ F_Y(y) e^{−y r_Y(y)}, then ψ(x, y) is increasing in y,

so ψ(x, y) ≥ ψ(x, 0) ≥ 0.

Hence if F_Z(z) e^{−z r_Z(z)} ≥ max{F_X(x) e^{−x r_X(x)}, F_Y(y) e^{−y r_Y(y)}}, then E[ψ(X, Y)] ≥ 0. The theorem
follows. □

At last, we establish the connection between CRE and the distribution.

Lemma 2.1.2. Let X be a r.v. with decreasing distribution F(x) and mean 1. Then

S(X) = −lim_{p→0+} (1/p) log ∫₀^∞ F(x)^{p+1} dx.    (2-18)

Proof. Consider the following limit. Since ∫ F(x) dx = E(X) = 1, the expression is of the form 0/0, and by L'Hôpital's rule,

lim_{p→0+} (1/p) log ∫ F(x)^{p+1} dx = lim_{p→0+} [ ∫ F(x)^{p+1} log F(x) dx / ∫ F(x)^{p+1} dx ]    (2-19)
= ∫ F(x) log F(x) dx / ∫ F(x) dx
= −S(X).

The last equality holds because ∫ F(x) dx = E(X) = 1. □

Theorem 2.1.5. Given a positive r.v. X with density f(x) and mean 1, S(X) ≥ −E[X log(X f(X))] − 1.

Proof. Let F(x) be the decreasing distribution of X, with ∫₀^∞ F(x) dx = 1. For p > 0, integration by parts and Hölder's inequality entail that

∫₀^∞ F(x)^{p+1} dx = (p+1) ∫₀^∞ x f(x) F(x)^p dx
≤ (p+1) ( ∫₀^∞ F(x)^{p+1} dx )^{p/(p+1)} ( ∫₀^∞ (x f(x))^{p+1} dx )^{1/(p+1)}.    (2-20)

Rearranging, ∫₀^∞ F(x)^{p+1} dx ≤ (p+1)^{p+1} ∫₀^∞ x^{p+1} f(x)^{p+1} dx. Taking logs and dividing both sides by p, one has

(1/p) log ∫₀^∞ F(x)^{p+1} dx ≤ ((p+1)/p) log(p+1) + (1/p) log ∫₀^∞ (x f(x))^{p+1} dx.    (2-21)

Letting p → 0+, by Lemma 2.1.2 on the left-hand side, and by the same L'Hôpital argument applied to x f(x) (whose integral is E(X) = 1) on the right-hand side, one has −S(X) ≤ 1 + E[X log(X f(X))], or S(X) ≥ −E[X log(X f(X))] − 1. □

2.2 Generalization

The previous section dealt solely with the CRE of positive random variables. Since in many cases we need to work with random variables that may be negative (for instance, the Normal family), in this section we extend the original definition of CRE to an arbitrary r.v. X ∈ ℝ and explore its properties.

Definition: Let X be a random variable in ℝ. We define the Cumulative Residual Entropy of X by

S(X) = −∫_{−∞}^{+∞} P(X > x) log P(X > x) dx.    (2-22)

Notice that this definition is valid for both continuous and discrete random variables. Using this generalized definition, we can obtain several results similar to those in Rao et al. [13]. The first theorem is a convergence property.

Theorem 2.2.1. S(X) < +∞ if for some p > 1, E(|X|^p) < +∞.

Proof. Since lim_{λ→−∞} log P(X > λ)/[1 − P(X > λ)] = −1, for every ε > 0 there exists −∞ < Λ < 0 such that for all λ < Λ we have |log P(X > λ)/(1 − P(X > λ))| ≤ 1 + ε.

Since ∫_{−∞}^{+∞} P(X > λ) log P(X > λ) dλ = (∫_{−∞}^Λ + ∫_Λ^{+∞}) P(X > λ) log P(X > λ) dλ and the second integral is finite by the moment condition, we only need to show that the first integral is also finite. We know that

|∫_{−∞}^Λ P(X > λ) log P(X > λ) dλ| ≤ ∫_{−∞}^Λ P(X > λ)[1 − P(X > λ)] |log P(X > λ)/(1 − P(X > λ))| dλ    (2-23)
≤ (1+ε) ∫_{−∞}^Λ P(X ≤ λ) dλ
≤ (1+ε) E(X⁻) < +∞,

where X⁻ = max(−X, 0). Hence S(X) < +∞. □
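A sample version of the generalized definition (2-22) replaces P(X > x) by the empirical survival function. The sketch below is our own illustration (not the thesis's code); it uses the fact that the CRE of Exp(1) equals its mean, 1.

```python
import math, random

def sample_cre(xs):
    """Empirical CRE: -sum over gaps (x_{i+1} - x_i) * Gbar_i * log(Gbar_i)."""
    xs = sorted(xs)
    n = len(xs)
    s = 0.0
    for i in range(n - 1):
        gbar = (n - 1 - i) / n          # empirical P(X > xs[i])
        if 0.0 < gbar < 1.0:
            s -= (xs[i + 1] - xs[i]) * gbar * math.log(gbar)
    return s

random.seed(0)
cre_hat = sample_cre([random.expovariate(1.0) for _ in range(20000)])
print(cre_hat)   # close to 1, the CRE of Exp(1)
```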


Theorem 2.2.2. For any independent random variables X and Y,

max(S(X), S(Y)) ≤ S(X + Y).    (2-24)

Proof. Since X and Y are independent,

P(X+Y > t) = ∫_{−∞}^{+∞} P(X > t − a) dF_Y(a),    (2-25)

where F_Y(·) is the cumulative distribution function of Y. Using Jensen's inequality (x log x is convex),

P(X+Y > t) log P(X+Y > t) ≤ ∫_{−∞}^{+∞} P(X > t − a) log P(X > t − a) dF_Y(a).    (2-26)

Integrating both sides with respect to t from −∞ to +∞ and multiplying by −1,

S(X+Y) ≥ ∫_{−∞}^{+∞} dF_Y(a) [ −∫_{−∞}^{+∞} P(X > t − a) log P(X > t − a) dt ]    (2-27)
= ∫_{−∞}^{+∞} dF_Y(a) S(X)
= S(X).

Hence S(X) ≤ S(X+Y). In the same way, we can show S(Y) ≤ S(X+Y), hence max(S(X), S(Y)) ≤ S(X+Y). □

One advantage of CRE over the Shannon entropy is that the sample CRE converges to the true CRE as the sample size increases, as stated in the following Weak Convergence Theorem.

Theorem 2.2.3. (Weak Convergence) Let X_k, k = 1, 2, …, and X be random variables in ℝ, and let X_k converge to X in distribution, that is,

lim_{k→+∞} E[φ(X_k)] = E[φ(X)]    (2-28)

for all bounded continuous functions φ on ℝ. If all the X_k are bounded in L^p for some p > 1, then

lim_{k→+∞} S(X_k) = S(X).    (2-29)

Proof. Using the same argument as in Theorem 2.2.1, for every X_k and every ε > 0 there exists −∞ < Λ_k < 0 such that for all λ < Λ_k we have |log P(X_k > λ)/(1 − P(X_k > λ))| ≤ 1 + ε. Since

∫ P(X_k > λ) log P(X_k > λ) dλ = (∫_{−∞}^{Λ_k} + ∫_{Λ_k}^{+∞}) P(X_k > λ) log P(X_k > λ) dλ,    (2-30)

we only need to control the first integral uniformly. We know that for λ < Λ_k,

|P(X_k > λ) log P(X_k > λ)| = P(X_k > λ)[1 − P(X_k > λ)] |log P(X_k > λ)/(1 − P(X_k > λ))|    (2-31)
≤ (1 + ε) P(X_k ≤ λ)
= (1 + ε) P(−X_k ≥ −λ)
≤ (1 + ε) E(|X_k|^p)/|λ|^p    (2-32)

for some p > 1, by Markov's inequality and the uniform L^p bound. Since (1 + ε)|λ|^{−p} is integrable near −∞, by the dominated convergence theorem,

∫ P(X > λ) log P(X > λ) dλ = lim_{k→+∞} ∫ P(X_k > λ) log P(X_k > λ) dλ.

Therefore lim_{k→+∞} S(X_k) = S(X). □

The next theorem gives S(X) an upper bound in terms of the pth central moment of X.

Theorem 2.2.4. Let X be a random variable with pth central moment m_p. Then S(X) ≤ m_p^{1/p} (q!)^{1/q}, where 1/p + 1/q = 1.

Proof. Let F(x) be the decreasing distribution of X. For any integer q > 1,

E[(−log F(X))^q] = ∫ f(x)(−log F(x))^q dx = ∫₀¹ (−log u)^q du = q!,

since F(X) is uniformly distributed on (0, 1). By Rao [12], S(X) = E[(X − EX)(−log F(X))]. Hölder's inequality entails that for p, q > 1 with 1/p + 1/q = 1,

S(X) ≤ E[|X − EX|^p]^{1/p} E[(−log F(X))^q]^{1/q}.    (2-33)

The statement follows. □

Corollary 2.2.1. Let X ∈ [0, 1] be a random variable with standard deviation σ_X. Then

σ_X²/2 ≤ S(X) ≤ √2 σ_X.    (2-34)

Proof. Let p = q = 2 in Theorem 2.2.4, so S(X) ≤ m_2^{1/2} (2!)^{1/2} = √2 σ_X.

If X ∈ [0, 1], then E(|X − E(X)|) ≥ E[(X − E(X))²] = σ_X². Hence, by the bound E|X − E(X)| ≤ 2S(X) (proposition 1 in [1]), S(X) ≥ E(|X − E(X)|)/2 ≥ σ_X²/2. □


Since both CRE and the Shannon entropy measure the uncertainty of a random event, we study their connections in the next two results, one for the continuous case and one for the discrete case.

Proposition 2.2.1. Let X be a r.v. with decreasing distribution F(x) and density f(x). Then

S(X) ≥ e^{H(X) + E log|φ'(X)| − 1},    (2-35)

where φ(x) = ∫_x^∞ t f(t) dt / F(x) = E(X | X > x).

Proof. Since S(X) = E[ ∫_X^∞ F(t) dt / F(X) ] (Rao [12]), simplifying it by integration by parts, one has

S(X) = E[ (−X F(X) + ∫_X^∞ t f(t) dt)/F(X) ] = −E(X) + E[ ∫_X^∞ t f(t) dt / F(X) ] = E(φ(X) − X).    (2-36)

So by the concavity of the log function, log S(X) ≥ E log|φ(X) − X|.

Differentiating φ(x) gives φ'(x) = (f(x)/F(x))(φ(x) − x), so log|φ'(x)| = log(f(x)/F(x)) + log(φ(x) − x).

Since H(X) = −E log f(X), E log F(X) = −1, and log S(X) ≥ E log(|φ(X) − X|), one has

E log|φ'(X)| = E log f(X) − E log F(X) + E log(|φ(X) − X|) ≤ 1 − H(X) + log S(X),    (2-37)

or S(X) ≥ e^{H(X) + E log|φ'(X)| − 1}. □


Proposition 2.2.2. (Discrete CRE and Shannon entropy) Let X be a discrete r.v. with pmf P(X = x_i) = p_i, where x_1 < x_2 < … < x_n, n ≥ 2. Let H(X) be the Shannon entropy of X and S(X) the CRE of X. Then we have

S(X) ≥ e^{H(X)/(1−p_n)} · e^{C/(1−p_n)},    (2-38)

where

C = ∑_{i=1}^{n−1} p_i log(Δx_i) + ∑_{i=1}^{n−1} p_i log(g_i |log g_i|) + log[(1−p_n)^{1−p_n} p_n^{p_n}],

with Δx_i = x_{i+1} − x_i and g_i = ∑_{k=i+1}^n p_k.

Proof. Note that S(X) = ∑_{i=1}^{n−1} Δx_i g_i |log g_i|. By the log sum inequality,

∑_{i=1}^{n−1} p_i log ( p_i / (Δx_i g_i |log g_i|) ) ≥ ( ∑_{i=1}^{n−1} p_i ) log ( ∑_{i=1}^{n−1} p_i / ∑_{i=1}^{n−1} Δx_i g_i |log g_i| )    (2-39)
= (1 − p_n) log ( (1 − p_n)/S(X) ).

But

∑_{i=1}^{n−1} p_i log ( p_i / (Δx_i g_i |log g_i|) ) = ∑_{i=1}^{n−1} p_i log p_i − ∑_{i=1}^{n−1} p_i log(Δx_i) − ∑_{i=1}^{n−1} p_i log(g_i |log g_i|),

and ∑_{i=1}^{n−1} p_i log p_i = −H(X) − p_n log p_n. Simplifying the inequality Eq. 2-39, we have

H(X) + ∑_{i=1}^{n−1} p_i log(Δx_i) + ∑_{i=1}^{n−1} p_i log(g_i |log g_i|) + p_n log p_n + (1 − p_n) log(1 − p_n) ≤ (1 − p_n) log S(X).

Rewriting this inequality further, we get the claimed inequality (Eq. 2-38). □


2.3 Cumulative Residual Entropy and Weak Law of Large Numbers

One important application of the Shannon entropy is that it was used to give a new proof

of the Central Limit Theorem. In this section, we try to find a new proof of the Weak Law of








Large Numbers using CRE. The motivation is the Weak Convergence property of CRE and the

property that CRE of a constant is zero.

Lemma 2.3.1. Let {X_n}_{n∈N} be any sequence of nonnegative r.v.s. If S(X_n) → 0, then X_n − E(X_n) → 0 in probability.

Proof. By proposition 1 in [1], for all n ∈ N, 0 ≤ E[|X_n − E(X_n)|] ≤ 2S(X_n) → 0, so X_n − E(X_n) → 0 in L¹, hence in probability, as n → ∞. □

Theorem 2.3.1. Let {X_n}_{n∈N} be a sequence of nonnegative r.v.s bounded in L log⁺ L, and S_n := ∑_{i=1}^n X_i the nth partial sum. Let f_n(t) := P(S_n/n > t) log P(S_n/n > t). If {X_n}_{n∈N} is exchangeable, then ∫_A^∞ f_n(t) dt is uniformly small when A is large enough.

Proof. Step 1: Since x log⁺ x is convex,

(S_n/n) log⁺(S_n/n) ≤ (1/n) ∑_{i=1}^n X_i log⁺ X_i,    (2-40)

so

E[(S_n/n) log⁺(S_n/n)] ≤ (1/n) ∑_{i=1}^n E[X_i log⁺ X_i] ≤ M,    (2-41)

where M is the uniform upper bound for E(X_i log⁺ X_i) over i.

Step 2: We want to show that E[(S_n/n) log(S_n/n) : S_n/n > A] is uniformly small if A is large enough. Fix n ∈ N and A > 1. We have

E[(S_n/n) log(S_n/n) : S_n/n > A] = (1/n) ∑_{i=1}^n E[X_i log(S_n/n) : S_n/n > A]    (2-42)
= E[X_1 log(S_n/n) : S_n/n > A].

The second equality follows from the exchangeability of {X_n}_{n∈N}.

By the inequality xy ≤ x log x + exp{y − 1}, valid for x > 0 and y > 0, Eq. 2-42 leads to

E[X_1 log(S_n/n) : S_n/n > A] ≤ E[X_1 log X_1 + exp{log(S_n/n) − 1} : S_n/n > A]    (2-43)
= E[X_1 log X_1 : S_n/n > A] + (1/e) E[S_n/n : S_n/n > A].

The first expectation tends to zero uniformly as A → ∞ since X_1 ∈ L log⁺ L.

For the second expectation, since X_1 ∈ L¹,

E[S_n/n : S_n/n > A] = E[X_1 : S_n/n > A] → 0 as A → ∞.    (2-44)

To get a precise bound, note that since

E[(S_n/n) log A : S_n/n > A] ≤ E[(S_n/n) log(S_n/n) : S_n/n > A],    (2-45)

we have the bound

E[S_n/n : S_n/n > A] ≤ (1/log A) E[(S_n/n) log⁺(S_n/n)] ≤ M/log A,    (2-46)

which tends to zero uniformly as A → ∞.

Step 3: For any positive r.v. X and p > 1, by the log sum inequality,

∫_A^∞ P(X > t) log ( P(X > t)/t^{−p} ) dt ≥ E((X − A)⁺) log ( E((X − A)⁺) / (A^{−p+1}/(p−1)) )    (2-47)
≥ E((X − A)⁺) − A^{−p+1}/(p−1).

The second "≥" follows from the inequality x log(x/y) ≥ x − y, valid for x > 0 and y > 0.

But

∫_A^∞ P(X > t) log ( P(X > t)/t^{−p} ) dt = ∫_A^∞ P(X > t) log P(X > t) dt + p ∫_A^∞ P(X > t) log t dt.    (2-48)

The second item can be simplified as

p ∫_A^∞ P(X > t) log t dt = p E( ∫_A^∞ 1_{X>t} log t dt )    (2-49)
= p E[X log X − X − A log A + A : X > A].

Plugging Eq. 2-48 and Eq. 2-49 into Eq. 2-47 and rearranging the items, we have

∫_A^∞ P(X > t) log P(X > t) dt ≥ −p E[X log X : X > A] + p E[X : X > A]    (2-50)
+ p(A log A − A) P(X > A) + E((X − A)⁺) − A^{−p+1}/(p−1).

Replacing X by S_n/n throughout and multiplying both sides by −1, we have

∫_A^∞ −P(S_n/n > t) log P(S_n/n > t) dt ≤ p E[(S_n/n) log(S_n/n) : S_n/n > A] − p E[S_n/n : S_n/n > A]    (2-51)
− p(A log A − A) P(S_n/n > A) − E((S_n/n − A)⁺) + A^{−p+1}/(p−1).

When A is large enough, the right-hand side of inequality Eq. 2-51 is uniformly small in n by Step 1 and Step 2, that is, ∫_A^∞ |f_n(t)| dt is uniformly small when A is large enough. □

This theorem shows that the tail ∫_A^∞ f_n(t) dt is uniformly controlled as n → ∞. The next step is to show that S(S_n/n) → 0; then the Weak Law of Large Numbers follows from the above lemma.
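The program just described, showing that the CRE of the sample mean shrinks to zero and then invoking the lemma, can be seen in a small simulation. This is our own illustration (the distribution, sample counts and seed are assumptions, and the sample-CRE estimator is the natural plug-in version of the definition):

```python
import math, random

def sample_cre(xs):
    """Plug-in CRE estimate from a sample."""
    xs = sorted(xs)
    n = len(xs)
    s = 0.0
    for i in range(n - 1):
        g = (n - 1 - i) / n             # empirical survival at xs[i]
        if 0.0 < g < 1.0:
            s -= (xs[i + 1] - xs[i]) * g * math.log(g)
    return s

random.seed(6)

def sample_mean(k):
    """Mean of k i.i.d. Exp(1) draws, i.e. one realization of S_k/k."""
    return sum(random.expovariate(1.0) for _ in range(k)) / k

cres = [sample_cre([sample_mean(k) for _ in range(4000)]) for k in (1, 16, 256)]
print(cres)   # decreasing toward 0 as k grows
```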
2.4 Cumulative Residual Entropy and Independence

In this section, we derive a new measure of independence using CRE, which can be used to measure the dependence between two r.v.'s.

Suppose we have samples of two continuous r.v.s (X, Y), {(x_1, y_1), (x_2, y_2), …, (x_n, y_n)}, and we aim to infer from the sample data whether X and Y are independent. The commonly used method is to calculate the correlation between X and Y, then perform the hypothesis test of H_0: X and Y are independent, versus H_1: not H_0. However, we know that cor(X, Y) = 0 does not imply that X and Y are independent, except for the Normal distribution, for which independence and uncorrelatedness are equivalent. Here we derive a new measure of independence using CRE, which measures true independence rather than the correlation between two r.v.'s. The simulation results show that the CRE measure is asymptotically better than the correlation measure, especially when n is sufficiently large.

Definition: Let X, Y be two real valued random variables. Define

r(X, Y) = 1 − S(Y|X)/S(Y).

Proposition 2.4.1. For any r.v. X and Y,

r(X, Y) = 0, if X and Y are independent;    (2-52)
r(X, Y) = 1, if Y is a function of X.

Proof. It follows from the properties of conditional CRE. □








Corollary 2.4.1. Let {X_n} be r.v.s converging to X in distribution and {Y_n} be r.v.s converging to Y in distribution. Then

lim_n r(X_n, Y_n) = 0, if X_n and Y_n are independent;
lim_n r(X_n, Y_n) = 1, if Y_n is a function of X_n.

Proof. It follows from the Weak Convergence Theorem and the above proposition. □

We perform two experiments to test the independence between two r.v.s using the measure r and the correlation method. Since all random numbers can be obtained from Uniform random numbers, we focus on the Uniform distribution.

Experiment 1. Generate samples from Unif(0, 1) as follows:

step 1: generate 50 samples, {x_1^(1), x_2^(1), …, x_50^(1)}, and let X^(1) denote this array;
step 2: generate 100 samples, {x_1^(2), x_2^(2), …, x_100^(2)}, and let X^(2) denote this array;
…
step n: generate 50n samples, {x_1^(n), x_2^(n), …, x_{50n}^(n)}, and let X^(n) denote this array.

We repeat this 200 times and get a sequence of arrays X^(1), X^(2), …, X^(200), where the length of X^(n) is 50n. Doing the same procedure, we obtain another sequence of arrays Y^(1), Y^(2), …, Y^(200).

Since we generate the two sets of random samples independently, we may assume X^(n) and Y^(n) are independent. By the above corollary, r(X^(n), Y^(n)) → 0 as n → ∞. The numerical results show that as n → ∞, r(X^(n), Y^(n)) decreases to zero asymptotically, but cor(X^(n), Y^(n)) does not have this pattern.

Experiment 2. Construct a new sequence Z^(1), Z^(2), …, Z^(200), where Z^(n) = sin(100 X^(n)), n = 1, 2, …, 200, a highly nonlinear function of X^(n). Recalculate r and cor for (X^(n), Z^(n)). The results show that r concentrates around .97 as n increases, which indicates high dependence between X^(n) and Z^(n). But the values of cor(X^(n), Z^(n)) reveal no information.

The trends of the graph of r in the two experiments are extremely different, but the trends of the correlations in the two experiments are so similar that we cannot distinguish them.
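These quantities can be estimated directly from data. The sketch below is our own illustration, not the thesis's code: S(Y) is estimated by the plug-in sample CRE, S(Y|X) by binning X and averaging the within-bin sample CRE of Y (the binning scheme is an assumption on our part), and a smooth dependence Y = X² stands in for the sin(100X) example.

```python
import math, random

def sample_cre(ys):
    """Plug-in CRE estimate from a sample."""
    ys = sorted(ys)
    n = len(ys)
    s = 0.0
    for i in range(n - 1):
        g = (n - 1 - i) / n
        if 0.0 < g < 1.0:
            s -= (ys[i + 1] - ys[i]) * g * math.log(g)
    return s

def r_measure(xs, ys, bins=20):
    """r(X, Y) = 1 - S(Y|X)/S(Y), with S(Y|X) estimated by binning X in [0, 1)."""
    n = len(xs)
    groups = [[] for _ in range(bins)]
    for x, y in zip(xs, ys):
        groups[min(int(x * bins), bins - 1)].append(y)
    s_cond = sum(len(g) / n * sample_cre(g) for g in groups if len(g) > 1)
    return 1.0 - s_cond / sample_cre(ys)

random.seed(1)
xs = [random.random() for _ in range(20000)]
ys_indep = [random.random() for _ in range(20000)]
ys_dep = [x * x for x in xs]
r_indep, r_dep = r_measure(xs, ys_indep), r_measure(xs, ys_dep)
print(r_indep, r_dep)   # near 0 for independence, near 1 for functional dependence
```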







CHAPTER 3
KULLBACK-LEIBLER DIVERGENCE OF DISTRIBUTIONS

Let X_1, X_2, …, X_n be an i.i.d. sample of size n from the decreasing distribution F(x, θ) with an unknown parameter θ. In this chapter we develop a new method to estimate θ using the distribution rather than the density of X_1.

Definition: Let F(x, θ) be the true decreasing distribution with parameter θ, and let G_n(x) be the empirical decreasing distribution of a random sample of size n from F(x, θ), X = (X_1, X_2, …, X_n). Define

CS(X; θ) := ∫ { G_n(x) log ( G_n(x)/F(x, θ) ) − [G_n(x) − F(x, θ)] } dx.    (3-1)

CS(X; θ) is the KL-divergence of G_n(x) and F(x, θ).

Proposition 3.0.2. Given a probability space (Ω, F, P), CS(X(ω); θ) ≥ 0 for all ω ∈ Ω, and equality holds if and only if F = G_n.

Proof. It follows from the log-sum inequality and the inequality x log(x/y) ≥ x − y, valid for x > 0 and y > 0. □

Definition: Let θ* = argmin_θ CS(X(ω); θ). Call θ* the KLD estimator of θ.

Example: (Exponential Distribution) Suppose F(x, θ) = e^{−x/θ}, where θ is unknown, from which we draw a random sample of size n, X = (X_1, X_2, …, X_n), with E(X_1) = θ. WLOG, we assume X_1 ≤ X_2 ≤ … ≤ X_n.

The empirical decreasing distribution is G_n(x) = (1/n) ∑_{i=1}^n 1(X_i > x). Let P_i = (n−i)/n. We have

g(θ) = CS(X; θ) := ∑_{i=1}^{n−1} (P_i log P_i)(X_{i+1} − X_i) + (1/(2nθ)) ∑_{i=1}^n X_i² − (1/n) ∑_{i=1}^n X_i + θ.    (3-2)

Let A_n := ∑_{i=1}^n X_i²/(2n). Then 0 = g'(θ) = 1 − A_n/θ² entails θ* = √A_n. Note that this θ* is unique.

Proposition 3.0.3. Let {X_i}_{i=1}^n be a sample of size n from F(x; θ) = e^{−x/θ} and θ* the KLD estimator of θ based on {X_i}. Then θ* → θ a.e. as n → ∞ and E(θ*²) = θ².

Proof. E(θ*²) = E(A_n) = E(X_1²)/2 = θ² is trivial.

By the Law of Large Numbers, A_n = (1/(2n)) ∑ X_i² → E(X_1²)/2 = θ² a.e. Hence θ* → θ a.e. as n → ∞. □

Given X ~ exp(θ), E(X) = θ and Var(X) = θ²:

1). E(θ̂_MLE) = θ, while E(θ*²) = θ².

2). Consider the mean square errors,

E(θ̂_MLE − θ)² = Var(θ̂_MLE) = θ²/n → 0,

E(θ* − θ)² = 2θ² − 2θ E(θ*) → 0.

Example: (Uniform Distribution) Now assume we have the uniform decreasing distribution F(x, θ) = 1 − x/θ, 0 ≤ x ≤ θ, where θ is unknown. We draw a sample of size n from this distribution, X = (X_1, X_2, …, X_n), with E(X_1) = θ/2. WLOG, we assume X_1 ≤ X_2 ≤ … ≤ X_n.

The empirical decreasing distribution is G_n(x) = (1/n) ∑_{i=1}^n 1(X_i > x). Let P_i = (n−i)/n and X_0 = 0. We have

g(θ) = CS(X; θ) = ∑_{i=1}^{n−1} (P_i log P_i)(X_{i+1} − X_i) − ∑_{i=0}^{n−1} P_i ∫_{X_i}^{X_{i+1}} log(1 − x/θ) dx − (1/n) ∑_{i=1}^n X_i + θ/2,    (3-3)

and differentiating,

g'(θ) = 1/2 − (1/θ) ∑_{i=0}^{n−1} P_i ∫_{X_i}^{X_{i+1}} x/(θ − x) dx.    (3-4)

Since g''(θ) > 0, the solution to g'(θ) = 0 is unique, that is, θ* is unique. Solving g'(θ) = 0 for θ numerically entails the KLD estimator θ*.

We perform the following experiment: generate 100 samples from Unif(0, 1) and calculate θ* using Newton's method. Let e_1 = |θ* − 1| and e_2 = |X_(n) − 1|, then compare e_1 and e_2. Repeating the same procedure 1000 times, we see that e_1 < e_2 in well over half of the runs, so θ* is better than X_(n) in the MSE sense.

The same thing happens when generating samples from Unif(0, 10).
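The objective CS(X; θ) can also be evaluated by direct numerical integration and minimized on a grid, a crude stand-in for Newton's method. The sketch below is our own illustration; the grid, step counts and seed are assumptions:

```python
import bisect, math, random

def cs_uniform(xs_sorted, theta, steps=2000):
    """CS(X; theta) for the survival function 1 - x/theta on [0, theta], midpoint rule."""
    n, h, total = len(xs_sorted), theta / steps, 0.0
    for k in range(steps):
        x = (k + 0.5) * h
        g = (n - bisect.bisect_right(xs_sorted, x)) / n   # empirical survival
        fbar = 1.0 - x / theta
        term = fbar - g
        if g > 0.0:
            term += g * math.log(g / fbar)
        total += term * h
    return total

random.seed(3)
xs = sorted(random.random() for _ in range(100))          # Unif(0, 1), theta = 1
grid = [xs[-1] + 1e-6 + 0.002 * k for k in range(400)]    # theta must exceed the max
theta_star = min(grid, key=lambda t: cs_uniform(xs, t))
print(theta_star, xs[-1])   # theta_star sits above the sample maximum by construction
```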







CHAPTER 4
ENTROPY VARIABLE

It is well known that the Shannon entropy is the expectation of −log f(X); we call −log f(X) the entropy variable. Since log f(X) is a random variable, it is natural to study some of its other statistics (variance, moment generating function, etc.). In this chapter, we first study the variance of log f(X), which we call the Information Volatility, or IV. We look at the general properties of IV and its values for common distributions, for instance the Uniform, Normal, Gamma, Beta, Weibull and so on. Then we study the distribution and the Laplace transform of log f(X).

4.1 Information Volatility

Definition: Let X be a r.v. in some probability space (Ω, F, P) with density f(x). Define the Information Volatility of X as IV(X) = Var(log f(X)).

Example: Let X ~ N(a, σ²). Then −log f(X) = log √(2πσ²) + (X − a)²/(2σ²).

So H(X) = 1/2 + log √(2πσ²) and IV(X) = 1/2.

Notice that −log f(X) ≥ log √(2πσ²) = H(X) − IV(X).

By the Chebyshev inequality, P[|−log f(X) − H(X)| < t] ≥ 1 − IV(X)/t², or

P[|−log f(X) − H(X)| < t] ≥ 1 − 1/(2t²).    (4-1)

Example: Let X ~ exp(λ) with density f(x) = (1/λ)e^{−x/λ}. Then −log f(X) = log λ + X/λ, so H(X) = 1 + log λ and IV(X) = 1.

Notice that −log f(X) ≥ log λ = H(X) − IV(X), and

P[|−log f(X) − H(X)| < t] ≥ 1 − 1/t².    (4-2)
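Both examples are easy to confirm by Monte Carlo. A minimal sketch (sample sizes and seeds are our own choices); the constant part of log f drops out of the variance, so only the random part is simulated:

```python
import random

random.seed(4)
n = 200000

def var(v):
    m = sum(v) / len(v)
    return sum((x - m) ** 2 for x in v) / len(v)

# N(0, 1): log f(X) = -log sqrt(2*pi) - X^2/2, so IV = Var(X^2)/4 = 1/2.
iv_normal = var([-z * z / 2 for z in (random.gauss(0, 1) for _ in range(n))])
# exp with lambda = 1: log f(X) = -X, so IV = Var(X) = 1.
iv_exp = var([-x for x in (random.expovariate(1.0) for _ in range(n))])
print(iv_normal, iv_exp)   # close to 0.5 and 1
```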

Theorem 4.1.1. Let X be a r.v. with pdf f(x). Then X has a uniform distribution if and only if the information volatility of X is zero. The analogous assertion holds when X is discrete.

Proof. Let H(X) be the Shannon entropy of X. By definition, H(X) = −E(log f(X)).

Since IV(X) = Var(log f(X)) = 0, log f(X) is a constant. So log f(X) = E(log f(X)) = −H(X), or f(X) = e^{−H(X)} with probability 1.

Let θ = e^{H(X)}. We have X ~ Unif(a, a + θ) for some −∞ < a < +∞. □

Theorem 4.1.2. Given a r.v. X with density f(x), if f is symmetric with respect to x = a for some constant a, then IV(X) = IV(|X − a|).

Proof. Without loss of generality, assume a = 0, so the density of |X| is f(x) + f(−x) = 2f(x) for x > 0. By symmetry, f(|X|) = f(X), so

IV(|X|) = Var(log 2f(|X|)) = Var(log f(|X|)) = Var(log f(X)) = IV(X). □

Example: Let X be double exponentially distributed with density f(x; μ, σ) = (1/(2σ)) e^{−|x−μ|/σ}. It can be easily verified that IV(X) = 1. But |X − μ| is exponentially distributed, so IV(|X − μ|) = 1, hence IV(X) = IV(|X − μ|).
The Shannon entropy is not invariant under affine transformations, because it is shift invariant (Cover [5]) but not scale invariant. However, the following result shows that IV is invariant under affine transformations.

Theorem 4.1.3. Let φ(x) be a monotone function. If IV(φ(X)) = IV(X) for all continuous random variables X, then φ(x) is affine.


Proof. The pdf of Y = φ(X) is f_Y(y) = f_X(φ^{−1}(y)) |dφ^{−1}(y)/dy|. Without loss of generality, assume φ is increasing. Changing the variable x = φ^{−1}(y), so that dy = φ'(x) dx,

E(log f_Y(Y)) = ∫ f_Y(y) log f_Y(y) dy = ∫ f_X(x) [log f_X(x) − log φ'(x)] dx,    (4-3)

and in the same way,

E[(log f_Y(Y))²] = ∫ f_X(x) [log f_X(x) − log φ'(x)]² dx.    (4-4)

Hence

IV(φ(X)) = Var[log f_X(X) − log φ'(X)]    (4-5)
= IV(X) + Var(log φ'(X)) − 2 Cov(log f_X(X), log φ'(X)),

so IV(φ(X)) = IV(X) if and only if Var(log φ'(X)) = 2 Cov(log f_X(X), log φ'(X)), or equivalently,

∫ f_X (log φ')² dx − ( ∫ f_X log φ' dx )² = 2 [ ∫ f_X log f_X log φ' dx − ∫ f_X log f_X dx ∫ f_X log φ' dx ].    (4-6)

For any set A with positive measure a, let f_X = (1/a) 1_A. Substituting this density in the above equation makes the right-hand side vanish, because log f_X is constant on A, and the equation entails

0 = (1/a) ∫_A (log φ')² dx − ( (1/a) ∫_A log φ' dx )².    (4-7)

So log φ'(x) = c on A for some constant c by the Cauchy–Schwarz inequality. Since A is arbitrary, φ(x) is an affine function. □


Theorem 4.1.4. Let X be an n-dimensional continuous random vector on some probability space (Ω, F, P). Then IV(X) = IV(AX + b) for any invertible n×n matrix A and any n×1 vector b.

Proof. Assume X has the density f(x), so the density of Y = AX + b is g(y) = |A|^{−1} f(A^{−1}(y − b)), where |A| = |det(A)|.

[E(log g(Y))]² = [−log|A| + E(log f(X))]²    (4-8)
= log²|A| − 2 log|A| E(log f(X)) + [E(log f(X))]².

E[(log g(Y))²] = ∫ g(y) (log g(y))² dy    (4-9)
= ∫ |A|^{−1} f(A^{−1}(y − b)) [log f(A^{−1}(y − b)) − log|A|]² dy
= ∫ f(z) [log f(z) − log|A|]² dz
= log²|A| − 2 log|A| E(log f(X)) + E[(log f(X))²].

Hence Var(log g(Y)) = E[(log f(X))²] − [E(log f(X))]² = Var(log f(X)). □

Corollary 4.1.1. For any random variable X, IV(X) does not depend on the mean or the variance of X.

Proposition 4.1.1. Let X be an n-dimensional normal random vector, A an invertible n×n matrix and b an n×1 vector. Then IV(AX + b) = n/2. So all n-dimensional Normal random vectors have the same IV.

Proof. Assume X ~ N(μ, Σ), where μ = (μ_1, μ_2, …, μ_n)^T and Σ is an n×n covariance matrix. The density of X is f(x) = (2π)^{−n/2} |Σ|^{−1/2} exp{−(x − μ)^T Σ^{−1} (x − μ)/2}.

Since for any n-dimensional normal random vector X, (X − μ)^T Σ^{−1} (X − μ) ~ χ²_n,

IV(X) = Var(log f(X)) = (1/4) Var((X − μ)^T Σ^{−1} (X − μ)) = (1/4)(2n) = n/2.    (4-10)

For any invertible matrix A and vector b, AX + b ~ N(Aμ + b, AΣA^T), which is also an n-dimensional normal random vector. Hence IV(AX + b) = n/2. □
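Proposition 4.1.1 reduces the IV of an n-dimensional normal vector to the variance of a χ²_n variable, which makes a quick Monte Carlo check possible. The dimension, sample size and seed below are our own choices:

```python
import random

random.seed(5)
d, n = 3, 200000

# For X ~ N(mu, Sigma) in R^d, log f(X) = const - Q/2 with Q ~ chi^2_d,
# so IV(X) = Var(Q)/4 = 2d/4 = d/2.
qs = [sum(random.gauss(0, 1) ** 2 for _ in range(d)) for _ in range(n)]
mean_q = sum(qs) / n
iv = sum((q - mean_q) ** 2 for q in qs) / n / 4
print(iv)   # close to d/2 = 1.5
```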

Theorem 4.1.5. Let f(x), 0 ≤ x < ∞, be a density with Var(X) = m².

(1) If f' < −f/m, then IV(X) > 1.

(2) If |f'| < f/m, then IV(X) < 1.

Proof. Let g(y) = (1/m) e^{−y/m}. Differentiating the functions fg and f/g entails that (fg)' = (f' − f/m) e^{−y/m}/m and (f/g)' = (f' + f/m) m e^{y/m}. Consider case (1): if f' < −f/m, both fg and f/g are decreasing, so Cov_X(log fg, log f/g) > 0. But

0 < Cov_X(log fg, log f/g) = ∫ f (log fg)(log f/g) − ∫ f log fg · ∫ f log f/g    (4-11)
= ∫ f [(log f)² − (log g)²] − [ (∫ f log f)² − (∫ f log g)² ]
= IV(X) − Var(log g(X))
= IV(X) − 1,

since Var(log g(X)) = Var(X)/m² = 1. So IV(X) > 1.

For case (2), if −f/m < f' < f/m, then fg is decreasing but f/g is increasing, so Cov_X(log fg, log f/g) < 0, hence IV(X) < 1. □







Proposition 4.1.2. Let X be a continuous random variable with decreasing distribution F(x) and hazard rate function r(x). Then

IV(X) = Var(log r(X)) + 2 Cov(log r(X), log F(X)) + 1.    (4-12)

Proof. It is immediate, since r(x) = f(x)/F(x), so log f(x) = log r(x) + log F(x), and Var(log F(X)) = 1 because −log F(X) ~ exp(1). □

Notations: σ²_{f_X} = Var(log f_X(X)), σ²_{f_{X,Y}} = Var(log f_{X,Y}(X, Y)), σ²_{f_X f_Y} = Var(log f_X(X) f_Y(Y)), and σ²_{f_{X,Y}/(f_X f_Y)} = Var(log [f_{X,Y}(X, Y)/(f_X(X) f_Y(Y))]).
Theorem 4.1.6. Let X and Y be r.v.'s with joint density f_{X,Y}(x, y) and marginal densities f_X and f_Y respectively. If E[(log f_{X,Y}(X, Y))²] ≤ E[(log f_X(X) f_Y(Y))²], then

σ²_{f_{X,Y}} ≤ σ²_{f_X f_Y}.    (4-13)

Equality holds if and only if X and Y are independent, in which case σ²_{f_{X,Y}} = σ²_{f_X} + σ²_{f_Y}.

Proof. Since Var[log f_{X,Y}(X, Y)] and Var[log f_X(X) f_Y(Y)] are scaling invariant, without loss of generality we may assume H(X, Y) ≥ 0. (If H(X, Y) < 0, choose a, b ≠ 0 such that H(aX, bY) = log|ab| + H(X, Y) > 0.)

σ²_{f_{X,Y}} − σ²_{f_X f_Y} = Var[log f_{X,Y}(X, Y)] − Var[log f_X(X) f_Y(Y)]    (4-14)
= E[(log f_{X,Y})²] − [E(log f_{X,Y})]² − E[(log f_X f_Y)²] + [E(log f_X f_Y)]²
≤ −[E(log f_{X,Y})]² + [E(log f_X f_Y)]²
= [H(X) + H(Y) + H(X, Y)][H(X) + H(Y) − H(X, Y)]
≤ 0.

For the last step, we used the property of the Shannon entropy H(X) + H(Y) ≥ H(X, Y) and the assumption H(X, Y) ≥ 0. □

Example: About the assumption E[(log f_{X,Y}(X, Y))²] ≤ E[(log f_X(X) f_Y(Y))²]:

(1) (The assumption is false.) The joint density of X and Y is f(x, y) = x + y, 0 ≤ x ≤ 1, 0 ≤ y ≤ 1. The marginal densities are f_X(x) = x + 1/2, 0 ≤ x ≤ 1, and f_Y(y) = y + 1/2, 0 ≤ y ≤ 1. It can be numerically verified that E[(log f_{X,Y}(X, Y))²] > E[(log f_X(X) f_Y(Y))²].

(2) (The assumption is true.) The joint density of X and Y is f(x, y) = e^{−y}, 0 < x < y < ∞. The marginal densities are f_X(x) = e^{−x} and f_Y(y) = y e^{−y}. It can be numerically verified that E[(log f_{X,Y}(X, Y))²] < E[(log f_X(X) f_Y(Y))²].

Corollary 4.1.2. Let X_1, X_2, …, X_n be r.v.'s with joint density f_{X_1,…,X_n}(x_1, …, x_n) and marginal densities f_{X_i}(x_i), i = 1, 2, …, n. If E[(log f_{X_1,…,X_n}(X_1, …, X_n))²] ≤ E[(log f_{X_1}(X_1) ⋯ f_{X_n}(X_n))²], then

σ²_{f_{X_1,…,X_n}} ≤ σ²_{f_{X_1} ⋯ f_{X_n}}.    (4-15)

Equality holds if and only if X_1, X_2, …, X_n are independent, in which case σ²_{f_{X_1,…,X_n}} = ∑_{i=1}^n σ²_{f_{X_i}}.

Proof. This is a straightforward generalization of Theorem 4.1.6. □

Theorem 4.1.7. Let X and Y be r.v.'s with densities f_X(x) and f_Y(y) respectively, and let f(x|y) be the conditional density of X given Y, with IV(X|Y) := Var(log f(X|Y)). If E[(log f_{X,Y}(X, Y))²] ≤ E[(log f_X(X) f_Y(Y))²], then

IV(X|Y) − IV(X) ≤ 2 Cov[log (f_X(X)/f(X|Y)), log f_Y(Y)].    (4-16)

Equality holds if and only if X and Y are independent.

Proof. The joint density of X and Y is f_{X,Y}(x, y) = f(x|y) f_Y(y), so

Var(log f_{X,Y}(X, Y)) = Var(log f(X|Y)) + Var(log f_Y(Y)) + 2 Cov[log f(X|Y), log f_Y(Y)].

But, by Theorem 4.1.6,

Var(log f_{X,Y}(X, Y)) ≤ Var[log f_X(X) f_Y(Y)],    (4-17)

so we have

Var(log f(X|Y)) + Var(log f_Y(Y)) + 2 Cov[log f(X|Y), log f_Y(Y)]    (4-18)
≤ Var[log f_X(X) f_Y(Y)]
= Var(log f_X(X)) + Var(log f_Y(Y)) + 2 Cov[log f_X(X), log f_Y(Y)].

Hence

Var(log f(X|Y)) − Var(log f_X(X)) ≤ 2 Cov[log (f_X(X)/f(X|Y)), log f_Y(Y)].    (4-19)
□

Theorem 4.1.8. Let X and Y be discrete r.v.'s; assume X takes values x_1, …, x_n with probabilities p_1, …, p_n, Y takes values y_1, …, y_m with probabilities q_1, …, q_m, and P(X = x_i, Y = y_j) = p_{ij}, 1 ≤ i ≤ n, 1 ≤ j ≤ m. If E[(log P(X, Y))²] ≤ E[(log P(X)P(Y))²], then

Var[log P(X, Y)] ≤ Var[log P(X)P(Y)].    (4-20)

Equality holds if and only if X and Y are independent.

Proof. Apply the same arguments used for the continuous case. □

4.2 Weak Convergence

In this section, we show that IV has a nice convergence property which the Shannon entropy does not have. We examine a sequence of discrete random variables converging in distribution to a continuous random variable. The first result is that the Shannon entropy of any such discrete sequence never converges to the Shannon entropy of the continuous limit, but one can always construct the sequence in a way such that IV converges. Some examples for special distributions are studied in detail. We also propose a new proof of the asymptotic approximation of the Shannon entropy of Binomial(n, p) (Jacquet [7], Knessl [10], Worsch [16]).

Theorem 4.2.1. Let X be a continuous r.v. with density f(x) and {X^(n)} a sequence of discrete r.v.s converging to X in distribution. Then lim_{n→∞} H(X^(n)) = +∞. Moreover, there exists such a sequence {X^(n)} with lim_{n→∞} IV(X^(n)) = IV(X).

Proof. For each n, assume X^(n) takes the values x_1^(n) < x_2^(n) < … < x_{m_n}^(n). Since X^(n) → X in distribution, lim_{n→∞} [P(X^(n) ≤ x_l^(n)) − P(X ≤ x_l^(n))] = 0. Define p_l^(n) = P(X^(n) = x_l^(n)), so the above limit can be rewritten as

lim_{n→∞} [ ∑_{i=1}^l p_i^(n) − ∫_{−∞}^{x_l^(n)} f(x) dx ] = 0.    (4-21)

By the Mean Value Theorem, one can choose a point ξ_l^(n) in each subinterval [x_l^(n), x_{l+1}^(n)] such that p_l^(n) ≈ f(ξ_l^(n)) Δx_l^(n), where Δx_l^(n) = x_{l+1}^(n) − x_l^(n). Without loss of generality, assume that for each n, Δx_l^(n) ≤ Δx^(n), a small constant depending on n, with lim_{n→∞} Δx^(n) = 0 since {X^(n)} converges to X in distribution.

H(X^(n)) = −∑_l p_l^(n) log p_l^(n)    (4-22)
= −∑_l f(ξ_l^(n)) Δx_l^(n) log[f(ξ_l^(n)) Δx_l^(n)]
= −∑_l f(ξ_l^(n)) Δx_l^(n) log Δx_l^(n) − ∑_l f(ξ_l^(n)) Δx_l^(n) log f(ξ_l^(n))
≥ −log Δx^(n) ∑_l f(ξ_l^(n)) Δx_l^(n) − ∑_l f(ξ_l^(n)) Δx_l^(n) log f(ξ_l^(n))
→ +∞,

since −log Δx^(n) → +∞, ∑_l f(ξ_l^(n)) Δx_l^(n) → 1, and |∑_l f(ξ_l^(n)) Δx_l^(n) log f(ξ_l^(n))| → |∫ f(x) log f(x) dx| = |H(X)| < ∞.

Next we construct a sequence {X^(n)} such that lim_{n→∞} IV(X^(n)) = IV(X). Consider a sequence {X^(n)} with support x_1^(n) < … < x_{m_n}^(n) in which the distance between two adjacent values is a constant: x_{l+1}^(n) − x_l^(n) = Δx^(n), a small constant with lim_{n→∞} Δx^(n) = 0. Then

IV(X^(n)) = ∑_l p_l^(n) (log p_l^(n))² − ( ∑_l p_l^(n) log p_l^(n) )²    (4-23)
= ∑_l f(ξ_l^(n)) Δx^(n) [log f(ξ_l^(n)) + log Δx^(n)]² − ( ∑_l f(ξ_l^(n)) Δx^(n) [log f(ξ_l^(n)) + log Δx^(n)] )².

Expanding both squares, the (log Δx^(n))² terms and the cross terms involving log Δx^(n) cancel in the limit, because ∑_l f(ξ_l^(n)) Δx^(n) → 1, ∑_l f(ξ_l^(n)) Δx^(n) log f(ξ_l^(n)) → ∫ f log f dx and ∑_l f(ξ_l^(n)) Δx^(n) (log f(ξ_l^(n)))² → ∫ f (log f)² dx. Hence

lim_{n→∞} IV(X^(n)) = ∫ f(x) (log f(x))² dx − ( ∫ f(x) log f(x) dx )² = IV(X).    (4-24)
□
Next we study some special distributions and explore the convergence properties of their IV.

Proposition 4.2.1. Let X ~ Unif(a, b), and let X_n be the discrete r.v. with probability mass function P(X_n = a + (b−a)i/n) = 1/n, i = 1, 2, …, n. Then IV(X) = IV(X_n) = 0.




Lemma 4.2.1. Given n, let X_n ~ Geometric(p) on the lattice {i/n}, with probability mass function P(X_n = i/n) = p(1−p)^{i−1}, i = 1, 2, …, and let X ~ Exponential(β) with density f(x) = (1/β) e^{−x/β}. Choose p = 1/(nβ); then X_n → X in distribution as n → ∞.

Proof. The Laplace transform of X is L(X)(t) = E(e^{−tX}) = ∫₀^∞ (1/β) e^{−x/β} e^{−tx} dx = 1/(βt + 1). The Laplace transform of X_n is

L(X_n)(t) = ∑_{i=1}^∞ e^{−ti/n} p(1−p)^{i−1} = (p/(1−p)) ∑_{i=1}^∞ [e^{−t/n}(1−p)]^i    (4-25)
= p e^{−t/n} / (1 − (1−p) e^{−t/n}).

Let p = 1/(nβ). Then

L(X_n)(t) = 1/(nβ(e^{t/n} − 1) + 1) → 1/(βt + 1) as n → ∞.    (4-26)

Since lim_{n→∞} L(X_n) = L(X), X_n → X in distribution as n → ∞. □

Proposition 4.2.2. Given n, let X_n ~ Geometric(p) with probability mass function P(X_n = i) = p(1−p)^{i−1}, i = 1, 2, …. Then IV(X_n) = [log(1−p)]² (1−p)/p². Choosing p = 1/n, lim_{n→∞} IV(X_n) = 1; and as p → 1, IV(X_n) → 0.

Proof. Let p_i = P(X_n = i) = p(1−p)^{i−1}; then E(X_n) = ∑_i i p_i = 1/p and E(X_n²) = ∑_i i² p_i = (2−p)/p².

Since IV(X_n) = ∑_i p_i (log p_i)² − (∑_i p_i log p_i)², we compute the two summations as follows.

∑_i p_i log p_i = ∑_i p_i [log p + (i−1) log(1−p)]    (4-27)
= log p + log(1−p) ∑_i (i−1) p_i
= log p + ((1−p)/p) log(1−p).

∑_i p_i (log p_i)² = ∑_i p_i [log p + (i−1) log(1−p)]²    (4-28)
= (log p)² + 2 log p log(1−p) ∑_i (i−1) p_i + [log(1−p)]² ∑_i (i−1)² p_i
= (log p)² + 2 ((1−p)/p) log p log(1−p) + ((1−p)(2−p)/p²) [log(1−p)]².

Hence

IV(X_n) = ∑_i p_i (log p_i)² − ( ∑_i p_i log p_i )² = [log(1−p)]² [ (1−p)(2−p)/p² − (1−p)²/p² ] = [log(1−p)]² (1−p)/p².

Plugging in p = 1/n entails lim_{n→∞} IV(X_n) = lim_{n→∞} n² [log(1 − 1/n)]² (1 − 1/n) = 1. □
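The closed form can be checked against a direct summation of Var(log pmf). A minimal sketch (the choice p = 0.1 and the truncation threshold are our own):

```python
import math
from itertools import count

def iv_geometric_direct(p):
    """Var(log pmf) for P(X = i) = p(1-p)^(i-1), summed until the tail is negligible."""
    m1 = m2 = 0.0
    for i in count(1):
        pi = p * (1 - p) ** (i - 1)
        if pi < 1e-15:
            break
        lp = math.log(pi)
        m1 += pi * lp
        m2 += pi * lp * lp
    return m2 - m1 * m1

p = 0.1
closed = math.log(1 - p) ** 2 * (1 - p) / p ** 2
print(iv_geometric_direct(p), closed)   # agree; near 1, as the p -> 0 limit predicts
```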

Corollary 4.2.1. Let X ~ Exponential(β) with density f(x) = (1/β) e^{−x/β}, and let X_n ~ Geometric(1/(nβ)) on the lattice {i/n}, with P(X_n = i/n) = p(1−p)^{i−1}, p = 1/(nβ), i = 1, 2, …. Then X_n → X in distribution as n → ∞ and lim_n IV(X_n) = IV(X), but lim_n H(X_n) ≠ H(X).

Proof. We only need to show that lim_{n→∞} H(X_n) ≠ H(X).

For every n > 0, by Eq. 4-27, H(X_n) = −log p − ((1−p)/p) log(1−p) → ∞ as n → ∞, since p = 1/(nβ) → 0. But H(X) = 1 + log β < ∞. □

The Shannon entropy of the Binomial(n, p) distribution has been studied thoroughly in the literature (Jacquet [7], Knessl [10], Worsch [16]). Here we propose a new, simple proof of the asymptotic approximation of the entropy. The same method leads to the asymptotic approximation of IV(Binomial(n, p)).

Worsch [16] gives the following lemma.

Lemma 4.2.2. For any ε > 0, there exists n₀ > 0 such that for any n > 2n₀ and i ∈ (n₀, n − n₀),

(1/(1+ε)) B(n, i) ≤ C(n, i) ≤ (1+ε) B(n, i),    (4-29)

where C(n, i) denotes the binomial coefficient and B(n, i) its Stirling approximation (see Eq. 4-39).

Lemma 4.2.3. Let S_n be the Binomial(n, p) random variable. Then

E{[log p^{S_n}(1−p)^{n−S_n} C(n, S_n)]^m} = E{[log p^{S_n}(1−p)^{n−S_n} C(n, S_n)]^m : np/α ≤ S_n ≤ αnp} + o(e^{−cn}),    (4-30)

where α ∈ (1, 1+δ) for a small constant δ > 0 and c is a constant depending on α and m.

Proof. Define the set A_n = {np/α ≤ S_n ≤ αnp}. By the Chernoff bound, the probability of the complement goes to zero exponentially, i.e., P(A_n^c) = o(e^{−cn}), where c is a constant depending on α.

Since for every ω in the probability space S_n(ω) ∈ {0, 1, …, n} and 1 ≤ C(n, S_n) ≤ 2^n, we have |log p^{S_n}(1−p)^{n−S_n} C(n, S_n)| ≤ |log 2^n p^n (1−p)^n|, which grows only linearly in n. Hence

E[log p^{S_n}(1−p)^{n−S_n} C(n, S_n)] = E[log p^{S_n}(1−p)^{n−S_n} C(n, S_n) : A_n] + o(e^{−cn}),    (4-31)

where c depends on α. For the same reason,

E{[log p^{S_n}(1−p)^{n−S_n} C(n, S_n)]^m} = E{[log p^{S_n}(1−p)^{n−S_n} C(n, S_n)]^m : A_n} + o(e^{−cn}),    (4-32)

where c depends on α and m. □


Lemma 4.2.4. Let S_n be the Binomial(n, p) random variable. Then for large n, −H(S_n) = E[log p^{S_n}(1−p)^{n−S_n} B(n, S_n) : A_n] + o(e^{−cn}) and IV(S_n) = Var[log p^{S_n}(1−p)^{n−S_n} B(n, S_n) : A_n] + o(e^{−cn}), where A_n = {np/α ≤ S_n ≤ αnp}, α ∈ (1, 1+δ) for a small constant δ > 0 and c is a constant depending on α.

Proof. Let p_i = P(S_n = i) = C(n, i) p^i (1−p)^{n−i}. The Shannon entropy of S_n is

H(S_n) = −∑_{i=0}^n p_i log p_i = −∑_{i=0}^n p_i log[p^i (1−p)^{n−i} C(n, i)].    (4-33)

Given 1 < α < 1+δ and ε > 0, for large n, on the set A_n = {np/α ≤ S_n ≤ αnp} we can apply Lemma 4.2.2:

(1/(1+ε)) B(n, S_n) ≤ C(n, S_n) ≤ (1+ε) B(n, S_n).    (4-34)

So

log(1/(1+ε)) + log B(n, S_n) ≤ log C(n, S_n) ≤ log(1+ε) + log B(n, S_n).    (4-35)

By Lemma 4.2.3, taking the expectation of Eq. 4-35 over the set A_n entails that

P(A_n) log(1/(1+ε)) + E[log p^{S_n}(1−p)^{n−S_n} B(n, S_n) : A_n] + o(e^{−cn}) ≤ −H(S_n)    (4-36)
≤ P(A_n) log(1+ε) + E[log p^{S_n}(1−p)^{n−S_n} B(n, S_n) : A_n] + o(e^{−cn}).

Letting ε → 0, −H(S_n) = E[log p^{S_n}(1−p)^{n−S_n} B(n, S_n) : A_n] + o(e^{−cn}) for large n. For the same reason, IV(S_n) = Var[log p^{S_n}(1−p)^{n−S_n} B(n, S_n) : A_n] + o(e^{−cn}). □








Theorem 4.2.2. Let S_n be a Binomial(n, p) random variable. Then

H(S_n) = (1/2) log(2πenp(1-p)) + O(1/n)    (4-37)

and

IV(S_n) = 1/2 + O(1/n)    (4-38)

Proof. We first estimate B(n, S_n):

log B(n, S_n) = (1/2) log(1/(2π)) + (1/2) log(n/(S_n(n - S_n))) + S_n log(n/S_n) + (n - S_n) log(n/(n - S_n))
= -(1/2) log 2π - (1/2) log n - (1/2)[log(S_n/n) + log(1 - S_n/n)] - S_n log(S_n/n) - (n - S_n) log(1 - S_n/n)    (4-39)

So

-log p^{S_n}(1-p)^{n-S_n}B(n, S_n)
= (1/2) log 2πn + (1/2)[log(S_n/n) + log(1 - S_n/n)] + S_n log(S_n/(np)) + (n - S_n) log((1 - S_n/n)/(1-p))    (4-40)

Let f(x) = log x + log(1 - x); its third-order Taylor expansion at center p is

f(x) = log p(1-p) + (1/p - 1/(1-p))(x - p) - (1/2)(1/p² + 1/(1-p)²)(x - p)² + (1/3)(1/p³ - 1/(1-p)³)(x - p)³ + O((x - p)⁴)

where x ∈ [p/a, pa] for a ∈ (1, 1 + δ).

By the same method, the Taylor expansion of the function g(x) = x log(x/p) + (1 - x) log((1 - x)/(1 - p)) at center p is

g(x) = (x - p)²/(2p(1-p)) + (1/6)(1/(1-p)² - 1/p²)(x - p)³ + O((x - p)⁴)

where x ∈ [p/a, pa] for a ∈ (1, 1 + δ).

Taking x = S_n/n, we get that on the set A_n = {np/a ≤ S_n ≤ anp},

-log p^{S_n}(1-p)^{n-S_n}B(n, S_n) = (1/2) log 2πn + (1/2) f(x) + n g(x)
= (1/2) log 2πn + (1/2) log p(1-p) + n(x - p)²/(2p(1-p)) + (1/2) f'(p)(x - p) + O((x - p)²) + O(n(x - p)³) + O(n(x - p)⁴)

where x = S_n/n.

The moment generating function of S_n - np is M(t) = e^{-npt}(pe^t + 1 - p)^n. Taking the log and differentiating both sides entails that M'(t) = npM(t)Y(t), where

Y(t) = e^t/(pe^t + 1 - p) - 1

Since Y(0) = 0, Y'(0) = 1 - p, Y''(0) = (1-p)(1-2p), and Y^{(3)}(0) = (1-p)(1 - 6p + 6p²), one has

M'(0) = 0;
M''(0) = np(1-p);
M^{(3)}(0) = np(1-p)(1-2p); and
M^{(4)}(0) = np(1-p)[3np(1-p) + 1 - 6p + 6p²].

Hence E(S_n - np) = 0;
E(S_n - np)² = np(1-p);
E(S_n - np)³ = np(1-p)(1-2p); and
E(S_n - np)⁴ = np(1-p)[3np(1-p) + 1 - 6p + 6p²].

So we have the following:

-E[log p^{S_n}(1-p)^{n-S_n}B(n, S_n) : A_n] = (1/2) log 2πn + (1/2) log p(1-p) + (n/(2p(1-p))) E[(S_n/n - p)²] + O(1/n)
= (1/2) log(2πnep(1-p)) + O(1/n)    (4-41)

and

Var[log p^{S_n}(1-p)^{n-S_n}B(n, S_n) : A_n] = (n²/(4p²(1-p)²)) Var[(S_n/n - p)²] + O(1/n)
= (n²/(4p²(1-p)²)) · (2p²(1-p)²/n²) + O(1/n) = 1/2 + O(1/n)    (4-42)

since Var[(S_n/n - p)²] = E(S_n/n - p)⁴ - [E(S_n/n - p)²]² = 2p²(1-p)²/n² + O(1/n³) by the moments above.

By Lemma 4.2.4, H(S_n) = (1/2) log(2πnep(1-p)) + O(1/n) and IV(S_n) = 1/2 + O(1/n). □
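As a numerical sanity check of (4-37) and (4-38) (our addition, not part of the original proof), the exact entropy and information volatility of a Binomial(n, p) variable can be computed by summing over the pmf; all function names below are ours.

```python
import math

def binomial_entropy_and_iv(n, p):
    """Exact H(S_n) and IV(S_n) for S_n ~ Binomial(n, p), via the log-pmf."""
    H = 0.0    # Shannon entropy (nats)
    m2 = 0.0   # E[(log p_i)^2]
    for i in range(n + 1):
        logp = (math.lgamma(n + 1) - math.lgamma(i + 1) - math.lgamma(n - i + 1)
                + i * math.log(p) + (n - i) * math.log(1 - p))
        w = math.exp(logp)
        H -= w * logp
        m2 += w * logp ** 2
    return H, m2 - H ** 2   # IV = E[(log p_i)^2] - (E log p_i)^2

n, p = 2000, 0.3
H, IV = binomial_entropy_and_iv(n, p)
H_approx = 0.5 * math.log(2 * math.pi * math.e * n * p * (1 - p))
print(H, H_approx, IV)   # H matches its asymptotic value to O(1/n); IV is near 1/2
```

With n = 2000 the O(1/n) error terms are already far below the printed precision.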

4.3 Applications of Information Volatility

This section studies applications of IV. The main application is that IV is able to separate the Gamma, Normal, Beta and Uniform distributions, which is stated in the following theorem.







Theorem 4.3.1. IV(Gamma) > 1/2, IV(Normal) = 1/2, 0 ≤ IV(Beta(α, α)) < 1/2 for α > 2 - √2, and IV(Uniform) = 0.

We only need to prove IV(Gamma) > 1/2 and 0 ≤ IV(Beta(α, α)) < 1/2 for α > 2 - √2. We need the following special functions for the proof. Hurwitz zeta function:

ζ(s, α) = Σ_{k=0}^∞ 1/(k + α)^s,  s > 1    (4-43)

and

∂ζ(s, α)/∂α = -s ζ(s + 1, α)    (4-44)

Polygamma function:

ψ^{(m)}(α) = (d/dα)^m ψ(α) = (d/dα)^{m+1} log Γ(α)    (4-45)

The following identity holds:

ψ^{(m)}(α) = (-1)^{m+1} m! ζ(m + 1, α)    (4-46)


Lemma 4.3.1. Let X ~ Gamma(α, β) with density f(x) = (1/(Γ(α)β^α)) x^{α-1} e^{-x/β}, where 0 < x < ∞ and α, β > 0. Then IV(X) = (α - 1)² ψ^{(1)}(α) - α + 2, where ψ^{(1)} is the polygamma function.

Proof. Since β is a scaling parameter, IV(X) does not depend on β. Thus, without loss of generality, consider Gamma(α, 1). By definition, IV(X) = Var(log f(X)), so calculate E(log f(X)) and E[(log f(X))²] as follows. First of all,

E(log f(X)) = -log Γ(α) + ((α - 1)/Γ(α)) ∫₀^∞ x^{α-1} e^{-x} log x dx - α    (4-47)

where the identity Γ(α + 1) = αΓ(α) was used. Secondly,

E[(log f(X))²] = (log Γ(α))² + 2α log Γ(α) - (2(α - 1) log Γ(α)/Γ(α)) ∫₀^∞ x^{α-1} e^{-x} log x dx
+ ((α - 1)²/Γ(α)) ∫₀^∞ x^{α-1} e^{-x} (log x)² dx - (2(α - 1)/Γ(α)) ∫₀^∞ x^{α} e^{-x} log x dx + α(α + 1)    (4-48)

The last step is to simplify E[(log f(X))²] - [E(log f(X))]². The following identity is needed for this purpose,

∫₀^∞ x^{α} e^{-x} log x dx = α ∫₀^∞ x^{α-1} e^{-x} log x dx + Γ(α)    (4-49)

which can be obtained by integration by parts with respect to e^{-x}. With everything at hand, it is easy to show that

IV(X) = (α - 1)² Var(log X) - α + 2    (4-50)

But Var(log X) = (d/dα)² log Γ(α) = ψ^{(1)}(α), so the statement follows. □

The above lemma provides us with a nice formula for IV(X) when X belongs to the Gamma family. The next theorem shows that the quantity IV(X) is actually able to separate the entire Gamma family from the Normal family. Recall that the IV of a Normal random variable equals 1/2.

Lemma 4.3.2. Let X_α ~ Gamma(α, 1) with density f_{X_α}(x) = (1/Γ(α)) x^{α-1} e^{-x}, x > 0. Then (X_α - α)/√α converges to N(0, 1) in density as α → ∞.

Proof. First consider the case when α is an integer, say α = n. The density function of (X_n - n)/√n is √n f_{X_n}(√n x + n). Want to show that √n f_{X_n}(√n x + n) → (1/√(2π)) e^{-x²/2}.

Two facts are needed.

1). Since e^y = lim_{n→∞}(1 + y/n)^n, for each fixed x, (n - 1) log(1 + x/√n) = √n x - x²/2 + o(1), so (1 + x/√n)^{n-1} e^{-√n x} → e^{-x²/2}.

2). The Stirling formula: n! = √(2π) n^{n+1/2} e^{-n} e^{θ/(12n)} (0 < θ < 1). Now,

√n f_{X_n}(√n x + n) = √n (√n x + n)^{n-1} e^{-(√n x + n)} / (n - 1)!
= √n · n^{n-1} (1 + x/√n)^{n-1} e^{-√n x} e^{-n} / (√(2π) n^{n-1/2} e^{-n} e^{θ/(12n)})
= (1/√(2π)) e^{-θ/(12n)} (1 + x/√n)^{n-1} e^{-√n x}
→ (1/√(2π)) e^{-x²/2}

Next, assume α is any positive number. For fixed n, let m be the largest integer such that m ≤ nα < m + 1. Since Γ(m) ≤ Γ(nα) ≤ Γ(m + 1), the squeeze theorem yields the same limit. □



Theorem 4.3.2. Let X_α ~ Gamma(α, 1), so that IV(X_α) = (α - 1)² Var(log X_α) - α + 2, where α > 0. Then IV(X_α) > 1/2 and lim_{α→∞} IV(X_α) = 1/2. So IV(X) separates the entire Gamma family from the Normal family.

Proof. Using the above identity involving the Hurwitz zeta function and the polygamma function, one has

Var(log X_α) = (d/dα)² log Γ(α) = ψ^{(1)}(α) = ζ(2, α)    (4-52)

Hence

IV(X_α) = (α - 1)² ζ(2, α) - α + 2 = (α - 1)² Σ_{k=0}^∞ 1/(α + k)² - α + 2    (4-53)

Since

1/α = ∫₀^∞ dx/(x + α)² < Σ_{k=0}^∞ 1/(k + α)² < 1/α² + ∫₀^∞ dx/(x + α)² = 1/α² + 1/α    (4-54)

one has

1/α < IV(X_α) < 1/α + (1 - 1/α)²    (4-55)

So IV(X_α) tends to infinity as α → 0, but it is uniformly bounded as α → ∞.

Next, want to show IV(X_α) > 1/2 for all α. It is trivial when α ≤ 2, since then the lower bound 1/α ≥ 1/2. When α > 2, since 1/(α - 1) = ∫_{α-1}^∞ dx/x², one has

IV(X_α) - 1/2 = (α - 1)² ζ(2, α) - (α - 1) + 1/2    (4-56)
= (α - 1)² [ζ(2, α) - 1/(α - 1) + 1/(2(α - 1)²)]
= (α - 1)² [Σ_{k=0}^∞ 1/(α + k)² - Σ_{k=0}^∞ ∫_{α-1+k}^{α+k} dx/x² + 1/(2(α - 1)²)]

The function 1/x² is convex, so for any k,

1/(α - 1 + k)² > ∫_{α-1+k}^{α+k} dx/x² + (1/2)[1/(α - 1 + k)² - 1/(α + k)²]    (4-57)

where 1/(α - 1 + k)² is the area of the rectangle on (α - 1 + k, α + k) with height 1/(α - 1 + k)², the integral is the area below the curve 1/x² on (α - 1 + k, α + k), and (1/2)[1/(α - 1 + k)² - 1/(α + k)²] is the area of the triangle above the curve but inside the rectangle.

By adding up the inequalities for all k (the triangle terms telescope), one has

Σ_{k=0}^∞ 1/(α - 1 + k)² > Σ_{k=0}^∞ ∫_{α-1+k}^{α+k} dx/x² + 1/(2(α - 1)²)    (4-58)

Since Σ_{k=0}^∞ 1/(α + k)² = Σ_{k=0}^∞ 1/(α - 1 + k)² - 1/(α - 1)², inequality (4-58) shows that the bracketed term in (4-56) is positive. It follows that IV(X_α) > 1/2 for all α.

Now, for any n, let Y = (X_{nα} - nα)/√(nα) and Z ~ N(0, 1). By the Central Limit Theorem, Y → Z in distribution, and thus in density by Lemma 4.3.2. Since IV(Y) = IV(X_{nα}), which is uniformly bounded as n → ∞, one has IV(X_{nα}) → IV(Z) = 1/2. Therefore IV(X_α) → 1/2 as α → ∞. □
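Formula (4-53) is easy to evaluate numerically. The sketch below (our addition; the truncation scheme is an assumption) approximates ζ(2, α) by a truncated series with an integral tail, and illustrates that IV(X_α) stays above 1/2 and approaches 1/2 for large α.

```python
import math

def hurwitz_zeta2(a, terms=200000):
    """zeta(2, a) = sum_{k>=0} 1/(a + k)^2, truncated with an integral tail estimate."""
    s = sum(1.0 / (a + k) ** 2 for k in range(terms))
    return s + 1.0 / (a + terms)   # tail: integral of dx/(a + x)^2 from `terms` on

def iv_gamma(alpha):
    """IV of Gamma(alpha, beta) via (4-53); the scale beta plays no role."""
    return (alpha - 1) ** 2 * hurwitz_zeta2(alpha) - alpha + 2

values = {a: iv_gamma(a) for a in (0.5, 1.0, 2.0, 10.0, 100.0)}
print(values)   # every value exceeds 1/2, and IV at alpha = 100 is already close to 1/2
```

At α = 2 the formula reduces to ζ(2, 2) = π²/6 − 1 ≈ 0.6449, a convenient spot check.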

Proposition 4.3.1. IV(X_α) decreases on (0, 1] and on (1, ∞).

Proof. Consider the derivative of IV(X_α),

(d/dα) IV(X_α) = 2(α - 1) ζ(2, α) + (α - 1)² ∂ζ(2, α)/∂α - 1
= 2(α - 1) ζ(2, α) - 2(α - 1)² ζ(3, α) - 1    (4-59)

which is negative on each of the stated intervals. □


Lemma 4.3.3. Let X be a Beta(α, α) random variable. Then IV(X) = (α - 1)²[2ζ(2, α) - 4ζ(2, 2α)], where ζ(·, ·) is the Hurwitz zeta function.

Proof. IV(X) = Var(log f(X)) = (α - 1)² Var(log X(1 - X)). It is trivial to verify that Var(log X(1 - X)) = (d/dα)² log B(α, α).

Since B(α, α) = Γ(α)²/Γ(2α), log B(α, α) = 2 log Γ(α) - log Γ(2α). So we have

(d/dα)² log B(α, α) = 2 (d/dα)² log Γ(α) - (d/dα)² log Γ(2α)
= 2ψ^{(1)}(α) - 4ψ^{(1)}(2α)
= 2ζ(2, α) - 4ζ(2, 2α)    (4-61)

where ψ^{(m)} is the polygamma function. Hence IV(X) = (α - 1)²[2ζ(2, α) - 4ζ(2, 2α)]. □

The next theorem shows that IV(X) of the Beta(α, α) family is actually smaller than 1/2 when α is greater than the small number 2 - √2.


Theorem 4.3.3. Let X_α ~ Beta(α, α). Then IV(X_α) < 1/2 for all α > 2 - √2, and lim_{α→∞} IV(X_α) = 1/2.

Proof. By the previous lemma, IV(X_α) = (α - 1)²[2ζ(2, α) - 4ζ(2, 2α)]. When α = 1, IV(X_α) = 0 < 1/2. Now, assume α ≠ 1; we want to show that 2ζ(2, α) - 4ζ(2, 2α) < 1/(2(α - 1)²). Since

4ζ(2, 2α) = Σ_{k=0}^∞ 1/(α + k/2)² = Σ_{k=0}^∞ 1/(α + k)² + Σ_{k=0}^∞ 1/(α + k + 0.5)²,

we have

2ζ(2, α) - 4ζ(2, 2α) = Σ_{k=0}^∞ [1/(α + k)² - 1/(α + k + 0.5)²] < Σ_{k=0}^∞ [1/(α + k)² - 1/(α + k + 1)²] = 1/α²    (4-62)

Solving (α - 1)²/α² < 1/2 entails that α ∈ (2 - √2, 2 + √2).

Next show that IV(X_α) < 1/2 for α ≥ 2 + √2. Bounding the first series from above,

Σ_{k=0}^∞ 1/(α + k)² < 1/α² + 1/(α + 1) + 1/(2(α + 1)²) + 1/(2(α + 1)³)    (4-63)

and the second series from below,

Σ_{k=0}^∞ 1/(α + k + 0.5)² > 1/(α + 0.5) + 1/(2(α + 0.5)²)    (4-64)

leads, after clearing denominators, to

IV(X_α) - 1/2 < P(α)/Q(α)    (4-65)

where P(α) = -α⁶ - 9α⁵ - 24.5α⁴ - 12α³ + 5.2α² + 6α + 1 and Q(α) = 4(α - 1)²α²(α + 0.5)²(α + 1)³. Since Q(α) > 0 for all α > 1, while P'(α) < 0 for α ≥ 1 and P(1) < 0, we get P(α)/Q(α) < 0 for α > 1. Hence IV(X_α) < 1/2 for α > 1. But we have shown that IV(X_α) < 1/2 for α ∈ (2 - √2, 2 + √2), so IV(X_α) < 1/2 for all α > 2 - √2.

Finally, since ζ(2, x) = 1/x + 1/(2x²) + O(1/x³) as x → ∞, we get 2ζ(2, α) - 4ζ(2, 2α) = 1/(2α²) + O(1/α³), and hence lim_{α→∞} IV(X_α) = 1/2. □
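To illustrate the theorem numerically (our sketch, not from the thesis), the closed form of Lemma 4.3.3 can be evaluated on a grid of α just above 2 − √2:

```python
import math

def hurwitz_zeta2(a, terms=5000):
    """zeta(2, a) with an Euler-Maclaurin tail correction for the truncated series."""
    s = sum(1.0 / (a + k) ** 2 for k in range(terms))
    t = a + terms
    return s + 1.0 / t + 1.0 / (2.0 * t * t)

def iv_beta(alpha):
    """IV of Beta(alpha, alpha) from Lemma 4.3.3."""
    return (alpha - 1) ** 2 * (2 * hurwitz_zeta2(alpha) - 4 * hurwitz_zeta2(2 * alpha))

threshold = 2 - math.sqrt(2)
grid = [threshold + 0.01 * k for k in range(1, 500)]   # alpha just above 2 - sqrt(2)
vals = [iv_beta(a) for a in grid]
print(max(vals), iv_beta(200.0))   # all below 1/2; large alpha approaches 1/2
```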

Combining the previous two theorems gives a proof of Theorem 4.3.1. Next we compute IV of other common distributions.

Proposition 4.3.2. Let X > 0 be Lognormal(μ, σ²) with density f(x) = (1/(xσ√(2π))) e^{-(log x - μ)²/(2σ²)}, σ > 0. Then IV(X; μ, σ²) = 1/2 + σ².

Proof. Since IV is invariant under linear transformations, assume μ = 0. So

IV(X; μ, σ²) = Var(log f(X)) = Var[(log X)²/(2σ²) + log X]    (4-66)

Integrating using a change of variables entails that

E[((log X)²/(2σ²) + log X)²] = 3/4 + σ²  and  E[(log X)²/(2σ²) + log X] = 1/2    (4-67)

so IV(X) = 1/2 + σ², and the claim follows. □
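A quick Monte Carlo check of this proposition (our addition): sample Z = log X from N(0, σ²) and compare the empirical variance of −log f(X), which differs from Z + Z²/(2σ²) only by a constant, with 1/2 + σ².

```python
import math
import random

random.seed(7)
sigma = 0.8
n = 200000
vals = []
for _ in range(n):
    z = random.gauss(0.0, sigma)               # z = log X, with mu = 0
    vals.append(z + z * z / (2 * sigma ** 2))  # -log f(X) up to an additive constant
m = sum(vals) / n
iv_mc = sum((v - m) ** 2 for v in vals) / n
print(iv_mc, 0.5 + sigma ** 2)   # the empirical IV vs the closed form 1/2 + sigma^2
```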

Proposition 4.3.3. Let X > 0 be Pareto(α, β) with density f(x) = βα^β/x^{β+1}, α ≤ x < ∞, α > 0 and β > 0. Then IV(X; α, β) = (1 + 1/β)².

Proof. Simple integration entails that

Var(log X) = [(log α)² + 2(log α)/β + 2/β²] - (log α + 1/β)² = 1/β²,

so IV(X) = (β + 1)² Var(log X) = (1 + 1/β)². □

Proposition 4.3.4. Let X > 0 be Weibull(α, β) with density f(x) = (α/β)(x/β)^{α-1} e^{-(x/β)^α}, 0 ≤ x < ∞, α > 0 and β > 0. Then IV(X; α, β) = (1 - 1/α)² π²/6 + 2/α - 1.

Proof. Change variables y = (x/β)^α, so that Y = (X/β)^α ~ Exponential(1) and log f(X) = log(α/β) + (1 - 1/α) log Y - Y. Since ∫₀^∞ e^{-y} log y dy = -γ, where γ is the Euler constant, and ∫₀^∞ e^{-y}(log y)² dy = π²/6 + γ², we have Var(log Y) = π²/6; also Var(Y) = 1 and Cov(log Y, Y) = Γ'(2) + γ = 1. Hence

Var(log f(X)) = (1 - 1/α)² Var(log Y) + Var(Y) - 2(1 - 1/α) Cov(log Y, Y) = (1 - 1/α)² π²/6 + 2/α - 1.

By taking the derivative, the minimum value of IV(X) is attained at α* = π²/(π² - 6), and min IV(X) = 1 - 6/π². □
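Both the closed form and the location of its minimum can be checked numerically (our sketch; the Monte Carlo cross-check at α = 2 uses the standard library's Weibull sampler):

```python
import math
import random

def iv_weibull(alpha):
    """IV of Weibull(alpha, beta); the scale beta drops out."""
    return (1 - 1 / alpha) ** 2 * math.pi ** 2 / 6 + 2 / alpha - 1

alpha_star = math.pi ** 2 / (math.pi ** 2 - 6)   # claimed minimizer
min_val = 1 - 6 / math.pi ** 2                   # claimed minimum value
grid = [0.2 + 0.001 * k for k in range(20000)]
best = min(grid, key=iv_weibull)

# Monte Carlo cross-check at alpha = 2, beta = 1: f(x) = alpha x^(alpha-1) exp(-x^alpha)
random.seed(1)
a = 2.0
logf = []
for _ in range(200000):
    x = random.weibullvariate(1.0, a)   # scale 1, shape a
    logf.append(math.log(a) + (a - 1) * math.log(x) - x ** a)
m = sum(logf) / len(logf)
iv_mc = sum((v - m) ** 2 for v in logf) / len(logf)
print(best, iv_weibull(best), iv_mc, iv_weibull(a))
```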

4.4 Distribution and Laplace Transform of the Information Variable

This section examines the general mathematical properties of the information variable -log f(X); for instance, its distribution, moment generating function, etc.

Consider an example where f(x) is a one-to-one function.

Example: Let X ~ exp(λ); the density function is f_X(x; λ) = (1/λ)e^{-x/λ}. Then the density of -log f(X) is g_Y(y) = λe^{-y}, y ≥ log λ.

In general, it is difficult to get the distribution of -log f(X) because usually f(x) is not a one-to-one function. However, an upper bound in terms of Renyi's entropy can be obtained by the Chernoff bounds.

Definition: The Renyi's measure of entropy for a random variable X with density f(x) is given by

R(X; α) = (1/(1 - α)) ln ∫ f^α(x) dx    (4-69)

Theorem 4.4.1. Let X₁, X₂, ..., X_n be an iid sample with common probability density function f(x) and Renyi's quadratic entropy R(X₁; 2). Let X = (X₁, X₂, ..., X_n), and suppose the distribution of -log f(X) is G(a). Then for all a > 0, G(a) ≤ e^{a - nR(X₁; 2)}.

Proof. The moment generating function of -log f(X) is M(t) = E[e^{-t log f(X)}] = [∫ f(x)^{1-t} dx]^n, since the X_i's are independent. By the Chernoff bounds, for a > 0 and t < 0,

G(a) = P(-log f(X) ≤ a) ≤ e^{-ta} M(t) = e^{-ta}[∫ f(x)^{1-t} dx]^n = e^{-ta + n ln ∫ f(x)^{1-t} dx}

Especially, letting t = -1 entails that G(a) ≤ e^{a + n ln ∫ f(x)² dx} = e^{a - nR(X₁; 2)} for all a > 0. □
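The bound can be exercised numerically (our illustration): for an Exp(1) sample, R(X₁; 2) = ln 2 and −log f(X) is simply the sum of the observations, so the empirical G(a) can be compared with e^{a − nR}.

```python
import math
import random

random.seed(42)
n = 20                 # iid sample size
trials = 100000
R2 = math.log(2.0)     # Renyi quadratic entropy of Exp(1): -ln(integral of f^2) = ln 2
a = 10.0               # threshold for -log f(X) = sum of the sample values
hits = 0
for _ in range(trials):
    s = sum(random.expovariate(1.0) for _ in range(n))  # -log of the joint density
    if s <= a:
        hits += 1
G_a = hits / trials
bound = math.exp(a - n * R2)   # Chernoff / Renyi bound of Theorem 4.4.1
print(G_a, bound)
```

The empirical lower-tail probability sits comfortably below the bound, as the theorem requires.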

The two-sided Laplace transform of -log f(X) is given by

L(s) = E[e^{-s(-log f(X))}] = E[f(X)^s] = ∫ f(x)^{s+1} dx    (4-70)

Notice the following two observations.

(1). The Renyi's measure of entropy is

R(X; α) = (1/(1 - α)) ln ∫ f^α(x) dx = (1/(1 - α)) ln L(α - 1)    (4-71)

(2). When X is positive and s > 0, Hardy's inequality gives a lower bound for L(s) in terms of the distribution of X:

L(s) ≥ (s/(s + 1))^{s+1} ∫₀^∞ (F(x)/x)^{s+1} dx    (4-72)

The following theorem gives a lower bound for the Fisher information in terms of the Laplace transform.

Theorem 4.4.2. Let X₁, X₂, ..., X_n be an iid sample with common probability density function f(x; θ), where θ is an unknown parameter. Let X = (X₁, X₂, ..., X_n). Then for any s > 0,

E[((∂/∂θ) log f(X; θ))²] ≥ (n/(s + 1)²) [∂L(s; θ)/∂θ]² / L(2s; θ)    (4-73)

Proof. For an iid sample, E[((∂/∂θ) log f(X; θ))²] = n E[((∂/∂θ) log f(X₁; θ))²]. On the other hand,

∂L(s; θ)/∂θ = (s + 1) ∫ f(x; θ)^s (∂f(x; θ)/∂θ) dx = (s + 1) E[f(X₁; θ)^s (∂/∂θ) log f(X₁; θ)]    (4-74)
≤ (s + 1) {E[f(X₁; θ)^{2s}] E[((∂/∂θ) log f(X₁; θ))²]}^{1/2}

by the Cauchy-Schwarz inequality. Since E[f(X₁; θ)^{2s}] = L(2s; θ),

E[((∂/∂θ) log f(X; θ))²] ≥ (n/(s + 1)²) [∂L(s; θ)/∂θ]² / L(2s; θ)    (4-75)

□










Conditional Value-at-Risk (CVaR, Rockafellar et al. [14]) is a risk measure. For continuous variables, the α-CVaR of a r.v. Y with distribution F_Y is simply the expectation of Y conditioned on Y > VaR_α, where VaR_α = F_Y^{-1}(α). So CVaR_α(Y) = E[Y | Y > VaR_α]. For our purpose, given a random variable X with density f, suppose we only care about small-probability events {ω : P(X(ω) ∈ (x, x + δ)) < t}; then it is meaningful to study the CVaR of the entropy, E(-log f(X) | -log f(X) > s).

Consider a specific example. Let X ~ exp(λ); the density function is f_X(x; λ) = (1/λ)e^{-x/λ}. The density of Y = -log f(X) is f_Y(y) = λe^{-y}, y ≥ log λ. The distribution of Y is F_Y(y) = 1 - λe^{-y}, y ≥ log λ. Fix α ∈ [0, 1) and denote VaR_α by v_α for short. F_Y(v_α) = α implies that v_α = log(λ/(1 - α)). So

CVaR_α(-log f(X)) = E(Y | Y > v_α)    (4-76)
= v_α + 1
= log λ + 1 - log(1 - α)
= H(X) - log(1 - α)






CHAPTER 5
GENERALIZED GAUSSIAN FAMILY

Generalized Gaussian distribution is commonly used to model the noise in engineering (Dominguez-Molina et al. [6]). Lutwak et al. [11] showed that the probability distributions that have maximal Renyi entropy with given generalized Fisher information are the generalized Gaussians.

We first show that the generalized Gaussian density is one of the maximal entropy distributions.

Theorem 5.0.3. Let X ∈ R be a random variable with unknown density. Given the mean E(X) = μ and the p-th central moment E|X - μ|^p = r, the generalized Gaussian distribution gg(x; μ, σ, p) maximizes the Shannon entropy, where

σ = (pr)^{1/p} [Γ(3/p)/Γ(1/p)]^{1/2}    (5-1)

Proof. Let Y be generalized Gaussian distributed with density function

gg(x; μ, σ, p) = (1/(2Γ(1 + 1/p)A(p, σ))) exp{-|(x - μ)/A(p, σ)|^p}    (5-2)

where A(p, σ) = [σ²Γ(1/p)/Γ(3/p)]^{1/2}. By Dominguez-Molina et al. [6],

E(|Y - μ|^r) = (A(p, σ)^r / Γ(1 + 1/p)) · Γ((r + 1)/p)/p    (5-3)

So the Shannon entropy of Y is

H(Y) = log(2Γ(1 + 1/p)A(p, σ)) + 1/p    (5-4)

Now assume X has the density f; by the Gibbs inequality,

∫ f log(f/gg) ≥ 0    (5-5)

So ∫ f log f ≥ ∫ f log gg, or H(X) ≤ -∫ f log gg, where H(X) is the Shannon entropy of X.

Choose σ such that r = A(p, σ)^p / p, which yields that σ = (pr)^{1/p}[Γ(3/p)/Γ(1/p)]^{1/2}. So

-∫ f log gg = log(2Γ(1 + 1/p)A(p, σ)) + E|X - μ|^p / A(p, σ)^p    (5-6)
= log(2Γ(1 + 1/p)A(p, σ)) + 1/p
= H(Y)

So H(X) ≤ H(Y). □




The generalized Gaussian distribution is commonly used by engineers to model noise. However, it is not easy to get a good estimator for the shape parameter p. Here we develop an alternative method to estimate p using IV.

Proposition 5.0.1. Let X be generalized Gaussian with mean μ, variance σ² and shape parameter p. Then

IV(X) = 1/p    (5-7)

Proof. The density of X is

f(x; μ, σ, p) = (1/(2Γ(1 + 1/p)A(p, σ))) exp{-|(x - μ)/A(p, σ)|^p}    (5-8)

where A(p, σ) = [σ²Γ(1/p)/Γ(3/p)]^{1/2}. Without loss of generality, assume μ = 0.

Var(log f(X)) = A(p, σ)^{-2p} Var(|X|^p)    (5-9)

By Dominguez-Molina et al. [6],

E(|X|^r) = (A(p, σ)^r / Γ(1 + 1/p)) · Γ((r + 1)/p)/p    (5-10)

so Var(|X|^p) = A(p, σ)^{2p}/p, hence Var(log f(X)) = 1/p. □
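Proposition 5.0.1 is easy to test by simulation (our sketch): since |X/A|^p ~ Gamma(1/p, 1), a generalized Gaussian variate can be drawn from a Gamma variate with a random sign, and the sample variance of log f(X) should be near 1/p.

```python
import math
import random

random.seed(11)
p = 1.5
A = 1.0     # the scale A(p, sigma); IV should not depend on it
n = 200000
c = math.log(2 * math.gamma(1 + 1 / p) * A)   # log of the normalizing constant
vals = []
for _ in range(n):
    g = random.gammavariate(1 / p, 1.0)                # |X/A|^p ~ Gamma(1/p, 1)
    x = random.choice((-1.0, 1.0)) * A * g ** (1 / p)  # generalized Gaussian draw
    vals.append(-c - abs(x / A) ** p)                  # log f(x)
m = sum(vals) / n
iv_mc = sum((v - m) ** 2 for v in vals) / n
print(iv_mc, 1 / p)   # sample IV vs the claimed 1/p
```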

Proposition 5.0.1 can be used to estimate the shape parameter p, denoted by p*. Given i.i.d. samples x₁, x₂, ..., x_n, assume they are from the generalized Gaussian distribution with unknown parameter p. The estimation procedure is as follows:

1. Estimate the density using the Parzen window method. Assume the estimated density is f̂(x) = (1/n) Σ_{i=1}^n W(x - x_i), where the function W(x) is the kernel density.

2. Let m̂ = (1/n) Σ_{i=1}^n log f̂(x_i), the estimator of E(log f(X));

3. The estimator of IV is ÎV(X) = (1/n) Σ_{i=1}^n (log f̂(x_i) - m̂)², and set p* = 1/ÎV(X).

Next we give a uniqueness property of the generalized Gaussian family.

Theorem 5.0.4. Let X be a random variable with generalized Gaussian distribution. Suppose Var(X) = σ² and the quadratic Renyi entropy R(X; 2) = r. Then the shape parameter p of the distribution is uniquely determined when r < log(2√3 σ); otherwise p can take two possible values. The same holds for the Shannon entropy.

Proof. The pdf of X is given by

gg(x; μ, σ, p) = (1/(2Γ(1 + 1/p)A(p, σ))) exp{-|(x - μ)/A(p, σ)|^p},  A(p, σ) = [σ²Γ(1/p)/Γ(3/p)]^{1/2}    (5-11)

The Renyi's quadratic entropy of X is

R(X; p) = -log ∫ gg² dx = (1 + 1/p) log 2 + log σ - log p + (3/2) log Γ(1/p) - (1/2) log Γ(3/p)    (5-12)

Want to show R(X; p) is not monotone in p. Consider

∂R(X; p)/∂p = (1/p²)[-log 2 - p - (3/2)ψ(1/p) + (3/2)ψ(3/p)]    (5-13)

By [4] (page 179),

Γ'(z)/Γ(z) = -γ + Σ_{n=0}^∞ (1/(n + 1) - 1/(n + z))    (5-14)

where γ is the Euler constant. The above derivative can be written as

∂R(X; p)/∂p = (1/p²)[-log 2 + (3/2) Σ_{n=1}^∞ (1/(n + 1/p) - 1/(n + 3/p))] = (1/p²) φ(p)    (5-15)

(the n = 0 term of the series cancels the -p.)

Case 1: lim_{p→∞} φ(p) = -log 2.

Since 0 ≤ Σ_{n=1}^∞ (1/(n + 1/p) - 1/(n + 3/p)) = (2/p) Σ_{n=1}^∞ 1/((n + 1/p)(n + 3/p)) ≤ (2/p) Σ_{n=1}^∞ 1/n² = (2/p)(π²/6) → 0 as p → ∞.

Case 2: lim_{p→0+} φ(p) = (1/2) log(27/4).

Since ∫₁^∞ (1/(x + 1/p) - 1/(x + 3/p)) dx = log((p + 3)/(p + 1)) → log 3 as p → 0+, and

Σ_{n=1}^∞ (1/(n + 1/p) - 1/(n + 3/p)) - ∫₁^∞ (1/(x + 1/p) - 1/(x + 3/p)) dx
= Σ_{n=1}^∞ ∫_n^{n+1} [(1/(n + 1/p) - 1/(x + 1/p)) - (1/(n + 3/p) - 1/(x + 3/p))] dx → 0

by the dominated convergence theorem, one has

lim_{p→0+} φ(p) = -log 2 + (3/2) log 3 = (1/2) log(27/4)    (5-17)

φ(p) takes a positive value at 0+ and a negative value at infinity. Hence R(X; p) is not monotone in p.

Want to show that R(X; p) has only one global maximum. ∂R/∂p = 0 if and only if φ(p) = 0, so it suffices to show that φ(p) = 0 has only one solution. A sufficient condition for φ(p) to have one zero is that φ(p) has at most one critical value, so we consider ∂φ/∂p:

∂φ/∂p = (3/(2p²)) Σ_{n=1}^∞ [1/(n + 1/p)² - 3/(n + 3/p)²]    (5-18)

Let

A = Σ_{n=1}^∞ [1/(n + 1/p)² - ∫_n^{n+1} dx/(x + 1/p)²] = Σ_{n=1}^∞ ∫_n^{n+1} [1/(n + 1/p)² - 1/(x + 1/p)²] dx    (5-19)

which is nonnegative since x - n ≥ 0 over the interval (n, n + 1), and

B = 3 Σ_{n=1}^∞ [∫_{n-1}^n dx/(x + 3/p)² - 1/(n + 3/p)²] = 3 Σ_{n=1}^∞ ∫_{n-1}^n [1/(x + 3/p)² - 1/(n + 3/p)²] dx    (5-20)

which is nonnegative since n - x ≥ 0 over the interval (n - 1, n). Notice that both A and B are increasing functions of p. But then

∂φ/∂p = (3/(2p²)) [∫₁^∞ dx/(x + 1/p)² + A - 3∫₀^∞ dx/(x + 3/p)² + B]    (5-21)
= (3/(2p²)) [-p²/(p + 1) + (A + B)]

Since -p²/(p + 1) decreases in p and (A + B) increases in p, ∂φ/∂p = 0 has at most one solution; hence φ(p) can cross the x axis only once, or φ(p) = 0 has only one solution. It follows that the statement holds. □
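The closed form (5-12) makes the non-monotonicity visible numerically (our sketch, with σ = 1): R(X; p) rises to an interior maximum and falls back to the uniform limit log(2√3 σ), and at p = 2 it recovers the Gaussian value ln(2√π).

```python
import math

def renyi2_gg(p, sigma=1.0):
    """Quadratic Renyi entropy of the generalized Gaussian, per the closed form (5-12)."""
    return ((1 + 1 / p) * math.log(2) + math.log(sigma) - math.log(p)
            + 1.5 * math.lgamma(1 / p) - 0.5 * math.lgamma(3 / p))

gauss_check = renyi2_gg(2.0)          # N(0,1): -ln(1/(2 sqrt(pi))) = ln(2 sqrt(pi))
limit = math.log(2 * math.sqrt(3))    # p -> infinity (uniform) limit
grid = [0.1 + 0.01 * k for k in range(3000)]   # p in [0.1, 30.1)
vals = [renyi2_gg(p) for p in grid]
p_max = grid[vals.index(max(vals))]
print(p_max, max(vals), gauss_check, limit)
```

Any value of r strictly between `limit` and `max(vals)` is attained by two distinct shapes p, while r below `limit` is attained by exactly one, matching Theorem 5.0.4.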

Corollary 5.0.1. Suppose that X has a generalized Gaussian distribution with unknown parameters. Given the mean, the variance and the Renyi's quadratic entropy of X, there are at most two possibilities for the distribution of X.

Theorem 5.0.5. Let X be generalized Gaussian distributed with density gg(x; μ, σ, p), where μ is known and σ, p are unknown. Given the constant r, the solution to the following optimization problem is unique:

max_{p,σ} H(X)  subject to  E|X - μ|^p ≤ r

Proof. H(X) is an increasing function of r, so let the constraint bind, E|X - μ|^p = r; then A(p, σ) = (pr)^{2/p}, where A(p, σ) is defined as above. So

H(X; p) = log(2Γ(1 + 1/p)(pr)^{2/p}) + 1/p    (5-22)

By [4] (page 179),

Γ'(z)/Γ(z) = -γ + Σ_{n=0}^∞ (1/(n + 1) - 1/(n + z))    (5-23)

where γ is the Euler constant. The derivative of H(X; p) can be written as

∂H(X; p)/∂p = (-1/p²)[ψ(1 + 1/p) + 2 log p + 2 log r - 1]
= (-1/p²)[-γ + Σ_{n=0}^∞ (1/(n + 1) - 1/(n + 1 + 1/p)) + 2 log p + 2 log r - 1] = (-1/p²) φ(p)    (5-24)

φ(p) = 0 has no more than one solution, since ψ(1 + 1/p) + 2 log p is a strictly increasing function of p. Since φ(0+) = -∞ and φ(∞) = ∞, φ(p) = 0 has a unique solution, denoted by p*.

Since φ(p) changes from negative to positive as p crosses p*, ∂H/∂p changes from positive to negative, so H(X; p) attains its maximum at p*. Choosing

σ* = (p*r)^{2/p*} [Γ(3/p*)/Γ(1/p*)]^{1/2}    (5-25)

(σ*, p*) is the optimal solution to the stated maximization problem. □

Another possible application of the generalized Gaussian distribution is the estimation of a density using the Parzen window method. Suppose the empirical histogram estimated using a data set of size n is f̂ and the true density is f. Assume f̂ has at most one inflection point. Since the generalized Gaussian density becomes flatter as the shape parameter p > 1 increases, the norm of the difference between f̂ and the fitted generalized Gaussian density can be controlled through the choice of p.







CHAPTER 6
SUMMARY

The main part of this research studies Cumulative Residual Entropy (CRE) and Information Volatility (IV). Both CRE and IV are able to improve on the classical Shannon entropy in certain respects.

The Cumulative Residual Entropy gives a more general definition of the entropy using distributions. The estimation of CRE is much simpler than the estimation of the Shannon entropy, since it is much easier to estimate empirical distributions than densities. We generalize the original definition of CRE to the real line and study its new properties. A new proof of the Weak Law of Large Numbers is examined.

The Information Volatility examines the variance of the information flow (i.e., the variance of -log f(X)), where X is a random variable with density function f(x). Recall that the Shannon entropy is the expectation of -log f(X). Unfortunately, there does not seem to be any way of estimating this quantity from the Shannon entropy of the empirical distribution of the sample. Thus, in this sense, the mean of -log f(X) is not a "good" statistic. However, we found that its variance IV has very interesting properties. IV equaling zero characterizes the Uniform distribution; IV is able to separate common probability distributions; IV is translation invariant; for a vector variable, IV is invariant under affine transformations. Moreover, IV has a good weak convergence property which the Shannon entropy does not have. We show that one can always find a sequence X_n which converges to X in distribution such that IV(X_n) converges to IV(X), but for the Shannon entropy this is never the case. For instance, IV of Binomial(n, p) converges to IV of the Normal as n goes to infinity, and IV of Geometric(p) converges to IV of the Exponential. We also propose a new proof of the asymptotic approximation of the Shannon entropy of Binomial(n, p).

We also study parameter estimation using distributions and explore the properties of the generalized Gaussian family. Further research can be done in these areas.








REFERENCES

[1] M. S. Bazaraa, J. J. Jarvis, H. D. Sherali, Linear Programming and Network Flows. New York: Wiley, 2005.

[2] P. Billingsley, Probability and Measure. New York: John Wiley & Sons, Inc., 1995.

[3] G. Casella, R. L. Berger, Statistical Inference. Pacific Grove: Duxbury, 2002.

[4] J. B. Conway, Functions of One Complex Variable. New York: Springer, 1989.

[5] T. M. Cover, J. A. Thomas, Elements of Information Theory. New York: Wiley, 1991.

[6] J. A. Dominguez-Molina, G. Gonzalez-Farias, R. M. Rodriguez-Dagnino, "A practical procedure to estimate the shape parameter in the generalized Gaussian distribution," preprint, 2001. http://www.cimat.mx/reportes/enlinea/I-01-18_eng.pdf, last accessed in April, 2007.

[7] P. Jacquet, W. Szpankowski, "Entropy computations via analytic depoissonization," IEEE Transactions on Information Theory, vol. 45, pp. 1072-1081, 1999.

[8] O. Johnson, "A conditional entropy power inequality for dependent variables," IEEE Transactions on Information Theory, vol. 50, pp. 1581-1583, 2004.

[9] J. N. Kapur, H. K. Kesavan, Entropy Optimization Principles with Applications. Boston: Academic Press, 1992.

[10] C. Knessl, "Integral representations and asymptotic expansions for Shannon and Renyi entropies," Applied Mathematics Letters, vol. 11, pp. 69-74, 1998.

[11] E. Lutwak, D. Yang, G. Zhang, "Cramer-Rao and moment-entropy inequalities for Renyi entropy and generalized Fisher information," IEEE Transactions on Information Theory, vol. 51, pp. 473-478, 2005.

[12] M. Rao, "More on a new concept of entropy and information," Journal of Theoretical Probability, vol. 18, pp. 967-981, 2005.

[13] M. Rao, Y. Chen, B. C. Vemuri, F. Wang, "Cumulative residual entropy: a new measure of information," IEEE Transactions on Information Theory, vol. 50, pp. 1220-1228, 2004.

[14] R. T. Rockafellar, S. Uryasev, "Optimization of conditional value-at-risk," Journal of Risk, vol. 2, pp. 21-41, 2000.

[15] A. J. Stam, "Some inequalities satisfied by the quantities of information of Fisher and Shannon," Information and Control, vol. 2, pp. 101-112, 1959.

[16] T. Worsch, "Lower and upper bounds for (sums of) binomial coefficients," Technical Report 31/94, Universitat Karlsruhe, Fakultat fur Informatik, 1994. http://liinwww.ira.uka.de/worsch/research/papers/, last accessed in Feb 2007.








BIOGRAPHICAL SKETCH

Dr. Juan Liu was born in Shaanxi province, China, in 1979.

From 1992 to 1998, Juan studied in the No.9 high school in the city of Kelamayi in Xinjiang province.

From 1998 to 2002, she continued her education at the Department of the Mathematical

Sciences at Nankai University, which conferred a Bachelor's degree in applied mathematics on

her.

From 2002 to 2007, Juan studied in the Graduate School of the University of Florida, which

conferred a Ph.D. degree in applied mathematics and a Master's degree in operations research

on her.