
Biased Bootstrap Methods for Semiparametric Models

Permanent Link: http://ufdc.ufl.edu/UFE0021391/00001

Material Information

Title: Biased Bootstrap Methods for Semiparametric Models
Physical Description: 1 online resource (112 p.)
Language: english
Creator: Giurcanu, Mihai C
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2007

Subjects

Subjects / Keywords: 2sls, bootstrap, consistency, gmm, iteration, regression, resampling
Statistics -- Dissertations, Academic -- UF
Genre: Statistics thesis, Ph.D.
bibliography   ( marcgt )
theses   ( marcgt )
government publication (state, provincial, territorial, dependent)   ( marcgt )
born-digital   ( sobekcm )
Electronic Thesis or Dissertation

Notes

Abstract: The finite sample properties of estimators in some moment condition models often differ substantially from the approximations provided by asymptotic theory. The bootstrap can provide a way to circumvent the inadequacies of asymptotic approximations, but in overidentified models, where the dimension of the parameter is smaller than the number of moment conditions, the usual uniform bootstrap may be inconsistent. This problem is usually solved by recentering the residuals, the sample estimating equations, or the statistic of interest. In this dissertation, we develop a new biased bootstrap methodology for moment condition models. The biased bootstrap is a form of weighted bootstrap with the weights chosen to satisfy the constraints imposed by the model. First, we construct a pseudo-parametric family of weighted empirical distributions, obtained by minimizing the Cressie-Read distance to the empirical distribution under the constraints imposed by the model. The resulting family has the least favorable property, meaning that the inverse of the Fisher information matrix evaluated at the MLE equals the sandwich estimator. By resampling within this family, we 'mimic' the parametric bootstrap for semiparametric models. An extension of this methodology to time series applies the biased bootstrap to the sample of blocks of consecutive observations. Our overall goal is to extend and develop the range of applications and theoretical properties of the biased bootstrap, focusing mainly on three directions. First, we prove that the biased bootstrap is consistent in moment condition models, with no need for 'recentering'. Second, by applying bootstrap recycling within the pseudo-parametric family, we obtain computationally feasible and more accurate iterated biased bootstrap procedures. The main idea here is to reuse the first-level bootstrap resamples to estimate higher-level parameters corresponding to the iterated bootstrap.
This methodology is a competitor of the jackknife-after-bootstrap developed by Efron (1992) in the iid case and by Lahiri (2002) for dependent data. Third, new biased bootstrap procedures are proposed for problems where the usual uniform bootstrap fails, such as on the boundary of the parameter space and for certain asymptotically nonnormal statistics.
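The abstract's core construction chooses resampling weights by minimizing a Cressie-Read divergence to the empirical distribution subject to the model's moment constraints, then resampling from the resulting weighted distribution. A minimal sketch of one member of that family, the Kullback-Leibler case (exponential tilting), for a single moment constraint; the function name `tilted_weights`, the Newton solver, and the toy data are illustrative assumptions, not the dissertation's implementation:

```python
import numpy as np

def tilted_weights(g, tol=1e-10, max_iter=100):
    """Exponential-tilting weights w_i proportional to exp(t'g_i)
    chosen so that sum_i w_i g_i = 0 (the KL member of the
    Cressie-Read family). g has shape (n, q): n observations of
    q moment functions evaluated at the data."""
    g = np.atleast_2d(g)
    n, q = g.shape
    t = np.zeros(q)
    for _ in range(max_iter):
        u = g @ t
        u -= u.max()                      # numerical stability
        w = np.exp(u)
        w /= w.sum()
        grad = w @ g                      # weighted mean of moments
        if np.linalg.norm(grad) < tol:
            break
        # Hessian of log-sum-exp: weighted covariance of the moments
        H = (g * w[:, None]).T @ g - np.outer(grad, grad)
        t -= np.linalg.solve(H, grad)
    return w

# Toy example: force the weighted mean of x to equal 0 even though
# the sample mean is near 0.3, then resample from the tilted weights.
rng = np.random.default_rng(0)
x = rng.normal(0.3, 1.0, size=200)
w = tilted_weights(x[:, None])            # moment constraint E[x] = 0
resample = rng.choice(x, size=x.size, p=w)
```

Because the weights satisfy the constraint exactly, bootstrap resamples drawn with probabilities `w` already obey the model, which is why no separate recentering step is needed in this approach.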
General Note: In the series University of Florida Digital Collections.
General Note: Includes vita.
Bibliography: Includes bibliographical references.
Source of Description: Description based on online resource; title from PDF title page.
Source of Description: This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Statement of Responsibility: by Mihai C Giurcanu.
Thesis: Thesis (Ph.D.)--University of Florida, 2007.
Local: Adviser: Presnell, Brett D.

Record Information

Source Institution: UFRGP
Rights Management: Applicable rights reserved.
Classification: lcc - LD1780 2007
System ID: UFE0021391:00001



25022 F20101206_AACNBW giurcanu_m_Page_056.QC.jpg
233f7235af14fa4c92680d073ee2ddaa
25969796d4017fe7e79da939d9118e2908f715e1
2157 F20101206_AACMZN giurcanu_m_Page_001thm.jpg
e0ebdf07b77f692a76c8fd92f4b92832
e3a29917640edbc4fe6087f815216f58783349d5
F20101206_AACMCF giurcanu_m_Page_064.tif
92f0981a1b6d53ddbca736610341d493
a0d9261a32a551c2c266c0133f861bf9a63f6d40
67031 F20101206_AACMBS giurcanu_m_Page_025.jpg
d3afa677d7aa530d0b62eaf2935a5347
1bc148eaa2c439db697c57c17ae670f256751b49
13046 F20101206_AACNBX giurcanu_m_Page_058.QC.jpg
e5fa834ef9b1353d1240d5cef6dc98d7
1658f6635078af393588e211c3dfa4684e19305f
713252 F20101206_AACMZO giurcanu_m.pdf
37346fad14c115424e8d76d8d01ad266
fbfa6b8e38f6690dbfddc5465852821b14fea994
4649 F20101206_AACNDA giurcanu_m_Page_081thm.jpg
beb6b306058b9b47d4f9c8eea6eaf438
d1e739a4f8c44320f669c2ea93f2d0bf24ec5814
1270 F20101206_AACMYZ giurcanu_m_Page_082.txt
fb1ee9f0faff45cfff5f7d1ce0763a09
789f573cf4a111075a376f5ca9874a96bafa1c8d
1051980 F20101206_AACMCG giurcanu_m_Page_027.jp2
02da27874d6a79c13bf633f425ad45a7
b5f3e8a3c2f9482f1a56aaebf288d7962972699b
5096 F20101206_AACNCL giurcanu_m_Page_070thm.jpg
2d508e5770018346b4661781d24b5da3
cc87547a4fdd2ea9346c2a582817a7eedab32222
4048 F20101206_AACMBT giurcanu_m_Page_057thm.jpg
741811a441fc037ddc19559a224c8ac6
3f80445ba5e360963178f4ee5fbbc9918ce20bc6
3846 F20101206_AACNBY giurcanu_m_Page_058thm.jpg
0d07c46002bc573e28b3d38918785f68
031fcc5523f26a3c8c5117dc6af7eec6dcde1008
169931 F20101206_AACMZP UFE0021391_00001.xml FULL
ea29d0a9776b63c323cb66cc2dfea736
a6149f40db20c5e4701229194ca555cb3bae2745
16401 F20101206_AACNDB giurcanu_m_Page_083.QC.jpg
ff99fde5ebf2927e3f04b83167161fee
6a8d587722bd5f4b343d8cce99971a40920ae64d
20548 F20101206_AACMCH giurcanu_m_Page_074.QC.jpg
8f01cd23e0cf0c83e75fddf24bb8b854
fbe8dd5b7b313e04770072a5d0010088c26a958f
4528 F20101206_AACNCM giurcanu_m_Page_071thm.jpg
b700b216757157dd9c7bdd7d944d4152
30fdef413b48193ec8a6003cf488a882cf7fa35d
1051985 F20101206_AACMBU giurcanu_m_Page_020.jp2
ff1b0119d1f59c5daa713ef1686072c9
301a6afe51f8c581216f7a504bb31103cf6d4ed0
27427 F20101206_AACNBZ giurcanu_m_Page_059.QC.jpg
c2075609393a6e8017780a62e566ec4c
a9f10efc56636b6b43f161edaf27df25c4848439
3238 F20101206_AACMZQ giurcanu_m_Page_002.QC.jpg
69966389324695b1d8083a7e7e865b9a
165ad01e47cfe22d83eb5b54a832c2900b7a329f
5361 F20101206_AACNDC giurcanu_m_Page_084thm.jpg
51bc5686ab0bec8e1e08a5720b478061
7e1122d43304ba48297b0193bca2c880ba9f58f9
5784 F20101206_AACMCI giurcanu_m_Page_063thm.jpg
ea417a9f7e73d191a7d15062615aba74
b045bb2c5cd8028d21e65af3dd54a35723cbb68c
5456 F20101206_AACNCN giurcanu_m_Page_072thm.jpg
5ecbb645b970eeac9bd4003fc9ef1e7a
db20d52dc18db9d8cd5fdb3daf1aad2f019d265e
1051958 F20101206_AACMBV giurcanu_m_Page_061.jp2
3b40fdd50a501b518e3987a2c5008651
08ddde4ce0235fab845059a4feb0d11b2397f2bd
1353 F20101206_AACMZR giurcanu_m_Page_002thm.jpg
48b409e6a233961631a586bbe99087f2
eb474f4a2e8cf02b327ce02b4713cda3e9ea8144
74127 F20101206_AACMCJ giurcanu_m_Page_077.jpg
3475599e47d822d793e6eb16c6c6d9b4
4ef39eb71f3364622ace04738c533f93784be9a1
8090 F20101206_AACNCO giurcanu_m_Page_073.QC.jpg
87f6d20039a38867e8a7d9a0b68d0984
a4f9d5ee3cfd33e4b12181ff0850838398af1fd0
860383 F20101206_AACMBW giurcanu_m_Page_055.jp2
18193df543ddbcd3c6f98b0e411688bf
cdda91d5d94a8a8576007af3181f80cf740e4087
1509 F20101206_AACMZS giurcanu_m_Page_003thm.jpg
01c35ade840d591607768d7250230254
185ec3b637de1d6635fb72d18ae928cfe706093e
15710 F20101206_AACNDD giurcanu_m_Page_085.QC.jpg
78a223c95a0a1067f942f5a92b9e84db
81e0a163fecb9564eabaf7cc667e5347aa38b013
5131 F20101206_AACMCK giurcanu_m_Page_083thm.jpg
fc9be167eeffefee297563b845ea7138
c42f9da6e2196260aef62e0cd682a992c8b710a7
2618 F20101206_AACNCP giurcanu_m_Page_073thm.jpg
2584fcb351f5e7339ff25efc31066b8b
43f70527fc234c9515369a32216015f208e637f8
814393 F20101206_AACMBX giurcanu_m_Page_097.jp2
292acfa4341feb414158203b31ce0cfd
21d9988e3a424f533e0978668af474844ea2d464
2634 F20101206_AACMZT giurcanu_m_Page_006thm.jpg
3a349d26701b4f9c2e34cb1f56176248
1eddc70644a57b3d981c48bab48d3fb9e18c8cd2
1717 F20101206_AACMDA giurcanu_m_Page_090.txt
3390e5c06a15623a6b9b19697b221ab2
4c94b4dfceea410850e26dc7c921932d83786fae
15273 F20101206_AACNDE giurcanu_m_Page_086.QC.jpg
c0d14c690fc4611b482893dfff119ce9
f8ce2507dc9cfb5aa17ed77a7c1638dd7f4e9e96
57644 F20101206_AACMCL giurcanu_m_Page_012.pro
64aa8500ba5a2cd8aa8899bb74f89034
6e7161ad6b349f974cf9fd81735cdc4538f0b7ce
5485 F20101206_AACNCQ giurcanu_m_Page_074thm.jpg
76811f7c80336b2fff6316ad8a1f3bcf
84dde67183bd225fd724931d5395d8708adb462e
19272 F20101206_AACMBY giurcanu_m_Page_057.pro
73cc5aa069540834ec34c959c0681ed0
1e413bdf966d587aa9373461fb305dbcda41e659
21198 F20101206_AACMZU giurcanu_m_Page_009.QC.jpg
9ce9938ed5c9effc767e6c93099f7ead
bc88dc213af9ba4e8c7e4d6ede4369fb87bf1cb1
1711 F20101206_AACMDB giurcanu_m_Page_080.txt
068b6fc9bf15cd39e4990059fc3ad645
4863d31f7343827603d441d16eccfdbbd7d8a43c
5202 F20101206_AACNDF giurcanu_m_Page_087thm.jpg
525abce5130c4841516dea570038832e
be2c72dce4095529546bb030456a0a2fd830dd62
2057 F20101206_AACMCM giurcanu_m_Page_037.txt
ee0103422d990a6a15038573c04d817c
b14c38723bccc36421c622f5780ecc6668ede32c
14341 F20101206_AACNCR giurcanu_m_Page_075.QC.jpg
daa4871a8589d321ec5d091c711b18c9
eba77684a7141078c34bfcd79bcb3e6a80feda85
798298 F20101206_AACMBZ giurcanu_m_Page_071.jp2
5ed519baa11dd4a6e2b1c96ef391260f
73fe37d6a186586bc6e4eff351f5f2372b3225e1
5609 F20101206_AACMZV giurcanu_m_Page_009thm.jpg
233700e7eb3542bfb7e7e4b064ad0de6
4609d729257553cdcaf3799ea5819852aabe4126
51845 F20101206_AACMDC giurcanu_m_Page_090.jpg
635a0d639922dafdd5363fd1816c242b
9fe5eb45cbbf4e62878a578236bd751000925cb6
5652 F20101206_AACNDG giurcanu_m_Page_088thm.jpg
6810b18c69bb3fd233cf3a86d93362f8
4355fcb149a9f03a3d5e7f65580d19c9cc55ce79
795690 F20101206_AACMCN giurcanu_m_Page_039.jp2
c4addfcaf8b0fcd73ad60a7c6f57794a
59671de510d9b29c7477c0fb32aca8aacd77612f
17702 F20101206_AACNCS giurcanu_m_Page_076.QC.jpg
7c4d766724b44731f6d43ea3e8ed82ba
e3ca35394b3c948f973ac8f1a74bf0cc3dd3a308
9116 F20101206_AACMZW giurcanu_m_Page_010.QC.jpg
4c2cc26f3e30631cccf5f3e6e644e914
080146c2a73e0057c1095e9a1f8cf09bb8bdba62
F20101206_AACMDD giurcanu_m_Page_087.tif
5fe946f2e537d1f832d0ff67572b6b11
43a4856976591ca40b5ab665f8e3ee5d5a49a094
16739 F20101206_AACNDH giurcanu_m_Page_090.QC.jpg
05762551d298684e62bb10fbd3ebd598
cd42cc54a825e9eed2c75077f2bc28805d0c22c5
976045 F20101206_AACMCO giurcanu_m_Page_018.jp2
6a23e5afdd96ffd6c59170592129d58b
41c4cba59edb0db55e84fbe2e1a900f486ff3bc3
5161 F20101206_AACNCT giurcanu_m_Page_076thm.jpg
a5cc013a9542d343075a679449ccfc73
016dd47d0ed0bd4d756b4968f7d15bec4932ed86
F20101206_AACMZX giurcanu_m_Page_010thm.jpg
acde353c28deabe5a15edd06255d8bd0
c1c437c9af17ab0a80cca414336b6efc406ff0e9
4881 F20101206_AACNDI giurcanu_m_Page_090thm.jpg
b7bdb867225119f54bca19d183933620
a28f3d65cee908c78e4c0a8a902ba4defe8bc691
F20101206_AACMCP giurcanu_m_Page_059.tif
d61e52068b1975a50a56e05ce5ccd94f
45cf9af6f433aadf7e594f719d00213a37833e5e
23496 F20101206_AACNCU giurcanu_m_Page_077.QC.jpg
1a2bf3bfeceb8ae575959e73d9a92151
5f5c3fa02129b82f1cea293db537418e3ea4f399
6664 F20101206_AACMZY giurcanu_m_Page_012thm.jpg
86dada5066a5c320ed3242c520c6d660
9e5c3b0c23b4b9cdc6214893aae8d5d8dd2f87d4
3903 F20101206_AACMDE giurcanu_m_Page_004thm.jpg
9070b28a6f9c6f8eb10242d5e22be80e
8b930b445df6ad72d884799b6c752ea0704ac1f8
17853 F20101206_AACNDJ giurcanu_m_Page_091.QC.jpg
d31b3660e2f165067c43601b305c45eb
099d796642f690a2ddb012772e38a17a98c1e247
53010 F20101206_AACMCQ giurcanu_m_Page_021.pro
b3a3546461fea36dd9e0472720c74a3a
2cd813e9e8c8fe12e98bbbb07f287c53ae2ea48c
5964 F20101206_AACNCV giurcanu_m_Page_077thm.jpg
6281b3d49b880accf35117b53f971698
43fdcf9b0d4b91551de15d1671a14704a77173e3
20466 F20101206_AACMZZ giurcanu_m_Page_013.QC.jpg
f64a5793ba920fd3b2ca6eef78173371
8cd27e0a9b4d9af15479eab31870333439c7f385
798444 F20101206_AACMDF giurcanu_m_Page_103.jp2
8f26babb4cbe4bd4c2f6675544f4cb9a
23f066b0d7c431f6d9dd698f4038a78a92fbfe40
5290 F20101206_AACNDK giurcanu_m_Page_091thm.jpg
5d3275f5e79247a50e9d16c7b3164c51
47ef43d58f8bd4d734e73e46010b623841725f87
33338 F20101206_AACMCR giurcanu_m_Page_070.pro
b365035963293e3490365d72b31f43c0
539f25bc7e3c0a28a4ccd00f5088e2e09db0e37c
4765 F20101206_AACNCW giurcanu_m_Page_078thm.jpg
46c8cfb72474648f8e40027dd5e0a7ec
876236c4f280c5b91c627d0b9149169818569127
23035 F20101206_AACNEA giurcanu_m_Page_106.QC.jpg
e6ddfb1c0a0f5de793ecf145c25c0389
41cf36c2d5212b1136f1bb601de7f0de0a7284fe
68099 F20101206_AACMDG giurcanu_m_Page_050.jpg
645936a051aa3a8a2357a7e2bcf03d47
8adfcd840ece367e9b0dabde93e38fbc616f2c3b
17637 F20101206_AACNDL giurcanu_m_Page_094.QC.jpg
86d5ac6ca10aae92f81449180ef2d72a
c5eef0011a85157b39ff9176ffb2a4733563f0a3
57861 F20101206_AACMCS giurcanu_m_Page_100.jpg
fb6f658ace4fe60dcd821a5c69e37fd9
bfd001ca63355d6d1d4ae0c9ae615897d35d285d
16728 F20101206_AACNCX giurcanu_m_Page_079.QC.jpg
a1068cdac7103b58759d4cb22b726ed3
fd7f9d52504b261afea24c54e4563c9cd556cc55
6430 F20101206_AACNEB giurcanu_m_Page_107thm.jpg
62ca34d361056e7771eef7e0f54a66f0
f3a6d63dff8b6680e4698f9fbe1d57b9398125d8
53115 F20101206_AACMDH giurcanu_m_Page_070.jpg
96afbfe550383223bcc775d0d73c1253
e459a5a2f1df744ee9f4952b6eae47cf53c4e648
11646 F20101206_AACNDM giurcanu_m_Page_096.QC.jpg
234cb4bc8e8166d7fad8852c76abfad4
ed2b8937f8e6f9a46148864b96963ebd15c530e8
50958 F20101206_AACMCT giurcanu_m_Page_093.jpg
6f7a58adb8a3a73f5b26249497bbad1f
aee287296d66e40a2edba1cd00368db9675cb42b
4526 F20101206_AACNCY giurcanu_m_Page_080thm.jpg
4e511058fda6f1f4b5fd40de0472ca5c
4c9253b7dc92d418314946cbf4a22475204e7ac6
22133 F20101206_AACNEC giurcanu_m_Page_108.QC.jpg
7a4f0fb5a1c7c4b2581bc8a795282b50
ee6d4ccc6ed76e2b0fcd52d7819bc71e0ade2bd7
1051938 F20101206_AACMDI giurcanu_m_Page_038.jp2
5da029d2210e04e29c1e5dd31a28da50
2597f04dc78ac737715c1ee852afee28bf420d6e
3876 F20101206_AACNDN giurcanu_m_Page_096thm.jpg
64409df13f816a6085a969c79991a405
fac4376890dfc3dbc4acb862389598b3ab1e8321
60652 F20101206_AACMCU giurcanu_m_Page_040.jpg
35aaa19785911ee9797469b47d5a5434
9566d64b9e7c76f1c50eebd22bf0a1aabd125ec7
15979 F20101206_AACNCZ giurcanu_m_Page_081.QC.jpg
2f47e253a6fc8c577988e070e35fa501
b0402a023066465bd4323196ba7aed29d2bef742
22573 F20101206_AACNED giurcanu_m_Page_109.QC.jpg
21b5aed4603dbd61b34556ff74d74ce8
f5ffdf3fd812a12b1a7a14739c204679b6a732bd
24301 F20101206_AACMDJ giurcanu_m_Page_032.pro
3974c9aba22450d78871d02f2c6799b8
a4e6d709b548452d00e06af279d2e378dae585bf
18618 F20101206_AACNDO giurcanu_m_Page_097.QC.jpg
c1a99e58655e7748e820326ed6f3c637
9e0dee43195837531bfd6e60dde85263d7ca8325
88077 F20101206_AACMCV giurcanu_m_Page_012.jpg
111c536120a955b57899dd818af22267
a34e3bdb82fca48e850050c05e897b0efd1e782a
70035 F20101206_AACMDK giurcanu_m_Page_023.jpg
18da5b1110202e3333dd67a35943e99c
acb786285f110fa280a1cf5f65657c8d4cc7ddd9
5205 F20101206_AACNDP giurcanu_m_Page_097thm.jpg
ca1f8d49c9f793b3058d760b4bb851d9
af322d83549e6f0bcd35b80a9ce064567780dc22
F20101206_AACMCW giurcanu_m_Page_095.tif
32a18cab78fa3499047a6f20d92e26a9
c0ca7c978901026e9de311be978c3b3de24c3837
36595 F20101206_AACMEA giurcanu_m_Page_100.pro
01722693f4eb584628798fa1582c28c5
d7c6d215690c3a955757dfbe836cea889e423c46
6102 F20101206_AACNEE giurcanu_m_Page_109thm.jpg
e5e08149cfb7129940f12639c781cb04
437f4e96f5b29a89c68df68590531d76077f621a
2078 F20101206_AACMDL giurcanu_m_Page_103.txt
7b866e4c79ccbf920595d36d51270568
0f0e4c083963e9a4f80b37d77df352354f1595f6
4798 F20101206_AACNDQ giurcanu_m_Page_098thm.jpg
5a26a71c5c220c67c61bec854c2ad4e0
0bcfe8df586a6583f14718056be8430120d2fe43
24533 F20101206_AACMCX giurcanu_m_Page_044.QC.jpg
37dd90f07c71f9d3eaa489709681da3f
dc8db8f0d6c1ae3714c6b28abd0f9668ef9661ea
33886 F20101206_AACMEB giurcanu_m_Page_076.pro
17f53a4a4d2e8e91d3baad344b269993
a987286ca273d3c76a8f45c6d83c7f241c54c987
23700 F20101206_AACNEF giurcanu_m_Page_110.QC.jpg
f484296df27e34926a30318f3c35121e
3119b200adb2e7315a1dd375e475e7d259d94b93
21348 F20101206_AACMDM giurcanu_m_Page_066.QC.jpg
76d816f8efd0ccc707ef6cae1552d952
0d335ad6084f69656057d46138047161105279c6
13941 F20101206_AACNDR giurcanu_m_Page_099.QC.jpg
5bf50db53b7e43c625dd854d25df9929
c4fad4dcecce7c5c8534d9b0e7ba09ba38e76e20
77331 F20101206_AACMCY giurcanu_m_Page_110.jpg
db3211097b4c77525477f48327d95e05
79eadfe3e387bb099575b780a341d9056e01c160
20406 F20101206_AACNEG giurcanu_m_Page_111.QC.jpg
030bebc28b52dfa8c111c7964161ceeb
8ad916728e3ebca27eac84a802821450bf95e2c2
953156 F20101206_AACMEC giurcanu_m_Page_052.jp2
62f60097c086f1c3d0027e05d4a202a8
4120031efceec3ade2922315e58eadbd6c1a9097
26778 F20101206_AACMDN giurcanu_m_Page_019.QC.jpg
3d00f12f885973ce18529e5c2fedef4a
2b1a31b941d17e5fe0493dfb67b13f84d9d06138
4271 F20101206_AACNDS giurcanu_m_Page_099thm.jpg
f27cc81868af0896978707f7fe07dcf7
b20c8524bac174e499afe1812397428c6814ff4a
F20101206_AACMCZ giurcanu_m_Page_090.tif
52aeec68aa68e8f9a8b7dc1dbc6cd92b
8225e94753f8fb769098f8aee470fb3255683f03
16034 F20101206_AACNEH giurcanu_m_Page_112.QC.jpg
ac45c5d31a7fa6bbe6c8bd0f02f360db
f5c2e01ff74b3bc8e2cfcf90710e2aef250605b5
4803 F20101206_AACMED giurcanu_m_Page_079thm.jpg
5bc299616a92260967936ccfe0a95488
bb5fc72d3faeb61d3cee0decb9c72db10d863b24
F20101206_AACMDO giurcanu_m_Page_056.tif
1803cab888228bcf1f8bd28df65a7767
deb5dee50494186e8a785e4ae66f71711b70cd11
5328 F20101206_AACNDT giurcanu_m_Page_100thm.jpg
2e6816f3f4782aa47a8627cb583796d0
43ca5dfbb4c560ae112075b2b136adebc2fd23de
4109 F20101206_AACNEI giurcanu_m_Page_112thm.jpg
9f87593049900e97942393561f321751
e5d3e546fb3029818b5dd5a89df0cc3d20ed0d53
61869 F20101206_AACMEE giurcanu_m_Page_055.jpg
525fff9bc699ba2e8aa5734b23f1ab20
9e857bd0842b130ace951435f1466af8c70d3129
915 F20101206_AACMDP giurcanu_m_Page_002.pro
b0ee8321304dfc5a0f5bc14ddeac3994
0aa1f57a49458b0eb7784818730f2a4c3b452ea7
15468 F20101206_AACNDU giurcanu_m_Page_101.QC.jpg
d12f99a2533472614f15ee971a6f81b8
d38e398544288b4d186da1c93d4f6b599a53eeab
73195 F20101206_AACMDQ giurcanu_m_Page_015.jpg
4a7814c14867225b23671e6d527c614c
3848f238e0a538017a94723384de8e128734a5f6
4618 F20101206_AACNDV giurcanu_m_Page_101thm.jpg
62ec1da7c3d2904a3cf594e6faf816d4
3971f15d5dfead099dc2c2acd47a9edf27ebdc4b
28010 F20101206_AACMEF giurcanu_m_Page_034.QC.jpg
30dcc9cd00018919ac372056a74dad9e
d96e0cf87f4a1e7c12e4126acb9ecd695e86c895
34731 F20101206_AACMDR giurcanu_m_Page_091.pro
dbec20b59c160b38f9d56a05e4a4c9d8
8fbd0d8052bd743b85fbb224ad5b558477ce755d
15284 F20101206_AACNDW giurcanu_m_Page_102.QC.jpg
50cc9a624c897a20b53a2244250f27a1
8c119dc0f36881b3d69a61f68c22dfeb0a77bce3
76181 F20101206_AACMEG giurcanu_m_Page_061.jpg
175c397282cda0e82b83f63228a1e4e5
fe5698acded0125ec99ffc9a86d19c39da3ccea7
22672 F20101206_AACMDS giurcanu_m_Page_001.jp2
923c7bdb67bbb3448b622b8c230a48ee
8393102a63b9c266f86c1f1385721cf874ba7238
15363 F20101206_AACNDX giurcanu_m_Page_104.QC.jpg
07cb136ea86fa26ecedb2bdad7c8d1fd
1a795a3da48d9f81dffe0f6be8cbc3af641e6061
18332 F20101206_AACMEH giurcanu_m_Page_103.QC.jpg
8652de15a995c0fdc532c1eab4850d73
e4eaf1b50b550ab3353a40d1ad97cb959ed1bdea
20121 F20101206_AACMDT giurcanu_m_Page_088.QC.jpg
8c48429e2f69f732b065f85b171a9dcf
fa0a0be87b3b6051151f7f7bd17684cd7916f5c2
4431 F20101206_AACNDY giurcanu_m_Page_104thm.jpg
0e6fdd07f2445f432732891c3ba92be4
9aeec000a905f4c33364f7ce216e497388890cb5
F20101206_AACMEI giurcanu_m_Page_080.tif
459b5e3f3f327ba86f9fda5b4a278e50
5ca64ca2c0a130bd3117fcd16d58a37b6e23f1b1
1566 F20101206_AACMDU giurcanu_m_Page_008.txt
778a37ded05fafd5caabad3e1300aec2
2c785da9784dfa5b6d557f89c2d5ae9ee092ee56
4860 F20101206_AACNDZ giurcanu_m_Page_105thm.jpg
4dc26b86df002ef944a62f58b4c22d6d
75e098107620bfc443caf779b294d572b9877268
F20101206_AACMEJ giurcanu_m_Page_066.tif
6853a7be597253f290ae7c29ef377417
c2e9681e57883b921c14b9f9bfe1ef0719c8a663
982421 F20101206_AACMDV giurcanu_m_Page_014.jp2
724f913460847942b0eb909fc03c4c07
7b16bb22693faf44ef5d0db20085740f1d1c4c84
6624 F20101206_AACMEK giurcanu_m_Page_028thm.jpg
c0947f7f39c9f37bc70c049b7852ef2f
ec4cd4f6973c37fd55e9f441d460818d610503a6
35655 F20101206_AACMDW giurcanu_m_Page_054.pro
eed2e9c2f5b8f1f9a62ce50a291dbf3f
d767232949d1d73fec3d422ee081f13ac905229f
421 F20101206_AACMDX giurcanu_m_Page_001.txt
8e97ad39f545664300f7896a18fa5b44
ccd8eeea95ff45964379ebd5c1332815791988e4
47691 F20101206_AACMFA giurcanu_m_Page_035.pro
6edc101d3d7e0ce5fe3eaf0d95302f2e
5d783491ee2d19b8b051f94de246b287a589f8cf
745180 F20101206_AACMEL giurcanu_m_Page_094.jp2
0de6b842e2731dc83b56f39f829e3464
cc8d9620cab1e20ff98b4884a98dd1e0c1c758dc
6858 F20101206_AACMDY giurcanu_m_Page_001.QC.jpg
b718fe6682d4a779afd57ded7f5c4a09
fc5159b2297bea237382d0036d0cb99e2dbd67f4
6063 F20101206_AACMFB giurcanu_m_Page_024thm.jpg
a6565a96a651f5995d6b2d658bb6f91f
c9316c20592e3ace8eb0644715436d4fb8785d10
6529 F20101206_AACMEM giurcanu_m_Page_021thm.jpg
96ec05b3104997b23332efb518690055
38ccf9467088862e12510d2b2e2a3f12f78ffce6
16096 F20101206_AACMDZ giurcanu_m_Page_093.QC.jpg
3613b5a539eea69ce17b39d2f0733e16
ec910284673f8fa755f05d0d5cf5f7a1a9549a71
22963 F20101206_AACMFC giurcanu_m_Page_015.QC.jpg
1e3f53c05357f78d04502ee746ddff9b
a07cd5acbc150d897c9da834872872812f5cf21c
F20101206_AACMEN giurcanu_m_Page_058.tif
c496a2d4640a61b3fceb76edf4ececbe
fa4627f9d9603e2007d17a3b1a96d8864e9ee081
749725 F20101206_AACMFD giurcanu_m_Page_070.jp2
ea433acb83c12efd9c0a9db75a23d772
bf86708780a72c2bfee6b070caab21b0767d7988
724246 F20101206_AACMEO giurcanu_m_Page_095.jp2
e9540a024c54fb0b27f223e41434d79b
e045a0894b54f89fef969f52fe42fa768a73bf37
2797 F20101206_AACMFE giurcanu_m_Page_005.txt
eea78306330d3a2368d8efa0d9bc2815
264b3810a1778d847d61f024d9de6dc131e06593
55828 F20101206_AACMEP giurcanu_m_Page_087.jpg
36e8f22ff9361df504c05f52893ea2eb
64f530bf0d7f36ac82907a43d8ab4c75f7b17c62
7047 F20101206_AACMFF giurcanu_m_Page_030thm.jpg
f12408cb5b7df6185db0fe850bb44189
49e0733fcdf9e56748d07cad87acc977a19b6790
63289 F20101206_AACMEQ giurcanu_m_Page_088.jpg
f515a1c051977a0bc65c228fb1606de8
b147ac97ea3198b6758f0aedbe7267e60ce43d0a
65053 F20101206_AACMER giurcanu_m_Page_005.pro
e879c729144700bea0bf2d1b1498df63
f8e273817ac0dbdfe8da7d2f581772e6ad767a18
58639 F20101206_AACMFG giurcanu_m_Page_031.pro
76b8e5d5706979b07a53def1e0958084
294c36d7b195f1cf751981e5ccf27c097437c97d
6190 F20101206_AACMES giurcanu_m_Page_061thm.jpg
f9aa8c154aac2e55e1b4a616d27c1ff3
b5bbe31b7c0bcf5501506c8648fed493e79215c5
1045135 F20101206_AACMFH giurcanu_m_Page_064.jp2
4edbeaea5af07eef621b10b0836bc495
fcf17bb5a0a0b7c0a23fcedb8cc1be7d9d1a2874
28774 F20101206_AACMET giurcanu_m_Page_043.QC.jpg
43def5c923269cbccadac78ad26b5cea
659c1735b32be3e8a69162ec71eb8365f42b226e
19152 F20101206_AACMFI giurcanu_m_Page_040.QC.jpg
4b479df9adb5f4b27b8aac343a60547e
40dae34f9c6ac920469f7cb8dbc303792fb18e5d
6423 F20101206_AACMEU giurcanu_m_Page_042thm.jpg
5f1b603875a602bb076d63bdc517331c
5ecf38b3a461b41ee29f5e47030b300367befeff
5554 F20101206_AACMFJ giurcanu_m_Page_005thm.jpg
d75c448aea75be5024626edb3df4d84f
d8103ba2bfd481b61cefa407e96478f1ac3f3cc7
4711 F20101206_AACMEV giurcanu_m_Page_075thm.jpg
3a85b161803ab65a2e3eb77c7836d8a1
1442238d2cca0d9580c4f5026e25e08fb2233e7d
15329 F20101206_AACMFK giurcanu_m_Page_080.QC.jpg
1c82c9b3e47cb68c0f5bf1acf2cedb1d
c8d98f3621aa54007ae2243b64f067f3835f95b1
47231 F20101206_AACMEW giurcanu_m_Page_064.pro
45c9f5abad1a887ddf6a909e9956ff92
01962f0b0c058f3a8497e72b2653ba4cbc42a4d8
21811 F20101206_AACMGA giurcanu_m_Page_089.QC.jpg
3e30cfff34dabe103247b5c7db0020f6
1244396f38c5102ab390f707a33ef6998b9bb95f
4841 F20101206_AACMFL giurcanu_m_Page_051thm.jpg
7cccb73e495ccb63b8274628bf80c62f
63e46392c8c240fd6935752fa520a8a8268bca1d
1977 F20101206_AACMEX giurcanu_m_Page_017.txt
5a4d84bdba862fcdd8cd74e2b0e315ee
4e739d08230d8342dcc1a0e88ee00976497b74d1
694740 F20101206_AACMGB giurcanu_m_Page_041.jp2
d15b5a3c79cd7de96f255cc81f71020d
a6bd6c16d545158147cf3bb1ed462c49f53e99b9
F20101206_AACMFM giurcanu_m_Page_038.tif
a28c83f79569032035cdbd66de9bfe8d
7b44a4eaeded0cfaa9f4e77b303add3598e63b2a
56511 F20101206_AACMEY giurcanu_m_Page_059.pro
47c49023bc479a71294607a7d27d6a90
9e40bb8ecab831d528da9b061f75c8bfc4a00e5a
17929 F20101206_AACMGC giurcanu_m_Page_071.QC.jpg
78344817bcf3dcb3d7af19efc1dcb80a
62e923c2246dcec9c0fa02798b22aa8065398467
24349 F20101206_AACMFN giurcanu_m_Page_042.QC.jpg
484e7ac3dee92afc4012dd7925de19ee
b3a2b08ed5d5ff24a072f4a1d147d9c8a85769ac
26214 F20101206_AACMEZ giurcanu_m_Page_028.QC.jpg
418467210bb66aba33e92c917de04ff1
300ec1c83e9f8c2365013dbcd2564d4eb9dc190e
18438 F20101206_AACMGD giurcanu_m_Page_100.QC.jpg
544c29b2b5198c0d0f21b2d7420b6a22
2f3a791a2285d6eb7bfc3c4a6f90d2b8fd07e747
89345 F20101206_AACMFO giurcanu_m_Page_072.jpg
5bad8a43eedf7bd078069de69f9ea942
2f95f557466b86c6c6a6f3bb6f973b242df14cb6
55911 F20101206_AACMGE giurcanu_m_Page_027.pro
781c18f0074bfcdc6076a5cea251169f
3b4ebb8f9654eb9c18d696bab103ab61ad5dd037
F20101206_AACMFP giurcanu_m_Page_072.tif
e84f98e2e2b82ac410115a4732d74f6e
5a39b1d3c4b2759931e41febd1e7c389c2eee8c1
14076 F20101206_AACMGF giurcanu_m_Page_057.QC.jpg
53f63e8de1e77e1a7297b270fe24c279
07258561ce6880b153afd1fb9d2bc166bccaa939
28886 F20101206_AACMFQ giurcanu_m_Page_041.pro
4e896c7d07f872e47c934183fb4a67fc
6c270f0b88f430820d799df5a21a6b14265735f9
50359 F20101206_AACMGG giurcanu_m_Page_083.jpg
85f1dde85e9907ba1dd41c642aac93c8
3be1a27545346e4bb5d6f0b188d3be54f56c44d5
F20101206_AACMFR giurcanu_m_Page_073.tif
163473ec8cdc54dfc253963332306fb5
f703b483db95b0aa14b4e32990179057a2803ed5
45805 F20101206_AACMFS giurcanu_m_Page_089.pro
73f1d8db407c5e0d2be3839af5c9fd1f
0f3db7e1c854aa4c255a3b3fe8e954b08eb4c8c4
89594 F20101206_AACMGH giurcanu_m_Page_034.jpg
91b07cc87aad2f39e3f92de1df867ec4
c0fea15cd341ec14a64b2dec71bfbf7fa7396063
21086 F20101206_AACMFT giurcanu_m_Page_052.QC.jpg
448f8ca659e59a5c25b78afacd650f0f
ba62af6d72be8f55539a75d875cf45ff889d269d
4865 F20101206_AACMGI giurcanu_m_Page_082thm.jpg
b6237f7c2dfbc4f96f950b6b975d1f44
338404658111165e2b88992bf6a708cf1b3b8dcf
1011479 F20101206_AACMFU giurcanu_m_Page_025.jp2
5731ceade20a15ff00c188732ab2b123
ef87c31858e7d2976dbb6e4c286a21b2a6246bc0
76327 F20101206_AACMGJ giurcanu_m_Page_037.jpg
3e66437b3424fccb117995c695edf623
a18ee7b365240661d1c208e1c203445084747876
22455 F20101206_AACMFV giurcanu_m_Page_016.QC.jpg
0e3e3489c3323245715aa9df7c448f2b
69c269899e7c1e7a3d735044fcdf32683dc01248
657270 F20101206_AACMGK giurcanu_m_Page_082.jp2
747a3b91f2e3c7fde3a68eb36266c188
4186f01e214e0a99927bda699561c04e7a7089ad
4884 F20101206_AACMFW giurcanu_m_Page_085thm.jpg
bd3608cde85f2ad74b6c4395e63f44cc
eb12e520de242251307426bea642b12699075025
48856 F20101206_AACMGL giurcanu_m_Page_011.pro
c4aef072911b156395aff6b959b79a12
25216a8758c5e7376a625919974183e1e193ef2d
1051984 F20101206_AACMFX giurcanu_m_Page_021.jp2
5531d4d8429cabd98f32a17a24d93b93
a66ff62bc89ae8a8fe7c21c1b3765612d75bd22f
F20101206_AACMHA giurcanu_m_Page_062.tif
86403bf5a38a4f7f1ab1ccb64a836e77
f2bc4950c76c8e8f6093da218cf4e8f6af4fa957
F20101206_AACMGM giurcanu_m_Page_060.txt
d2147c3e604e811328eda79a2d6e7dcf
866e62ed1b16cef4875f00185ab6841ba3b1e5d7
6762 F20101206_AACMFY giurcanu_m_Page_020thm.jpg
a96bf53901088c040f2c55f6ab5df915
b93fc72031cb18449adb3fd6e448240336d0a86a
73108 F20101206_AACMHB giurcanu_m_Page_064.jpg
2728fbbe8afb6396c95d5c2e0bb5a801
0b3bc25c17eca0ebf274126a05b671def19209eb
1706 F20101206_AACMGN giurcanu_m_Page_047.txt
e6c90962a168ffde84df4998de6a21ce
4e7d52badb983b0be3eb107f4f3d0b6d8c13f6f3
22349 F20101206_AACMFZ giurcanu_m_Page_064.QC.jpg
7adaedffeae87d97d2b2ecaefa807c8d
2ebdcafea15c55227e7ce928e84babcaeaf79bb5
F20101206_AACMHC giurcanu_m_Page_084.tif
95b2f70ef297d5d0fc33997b72e7d3ac
33ec7e61e0de185ace6fbe3817ee10507d4aa44d
81048 F20101206_AACMGO giurcanu_m_Page_011.jpg
801a12d9ba41234a0537782669af6b88
aed3c4a9549cfb8b673af372fd47511e9efff7cc
3803 F20101206_AACMHD giurcanu_m_Page_003.QC.jpg
f67c388f8fc8f8595a990bc0417c675d
53a67daaaa1626dcf91c0a3740314b1fa4332ca5
151 F20101206_AACMGP giurcanu_m_Page_003.txt
afb2d50dd18b3f4d0929564d25e773c9
ffa40aa82331dde16c283fa6f4d78d94f7d8e1a6
6246 F20101206_AACMHE giurcanu_m_Page_106thm.jpg
d7d38470dc7410a845168f3ddfde501b
754439d8e016f2ddd38686f83a8cd4859634ca98
5426 F20101206_AACMGQ giurcanu_m_Page_026thm.jpg
f3de063f97513c166372b3daeaa0412b
cd5817a77e7b0a48399e2c2436aa58200559702c
21711 F20101206_AACMHF giurcanu_m_Page_067.QC.jpg
c8519ebaccc1a41dbbd0aff06ba2de8e
8d95091336fc7d388529ce033b4e6e27931e35d1
1915 F20101206_AACMGR giurcanu_m_Page_070.txt
ab48964a9ed5e852218ace2177c07fd2
1c75ffa7627339116223130aafb630d504a99836
4631 F20101206_AACMHG giurcanu_m_Page_041thm.jpg
d0c4b6e6a5bd4d2aaef2f62a3b3bb660
882399e62fdc1f605db5dff9dd81d3ed57ccf3ec
16858 F20101206_AACMGS giurcanu_m_Page_098.QC.jpg
12e8ed77a3b787f4a29107eaddda0271
b24b18a8bd8ac117b95671dfd09304d49d25acb7
41607 F20101206_AACMHH giurcanu_m_Page_045.pro
6d1dcd2e8c5c23d5ff7c65ffe7578dfc
f85bb57dab589c240cd53a17b8951368aa19cd12
1051986 F20101206_AACMGT giurcanu_m_Page_005.jp2
310090092ae07bdea00b0020d7c7efda
9ca5e35079a5671bed92398d3d3391696d1004d3
18026 F20101206_AACMGU giurcanu_m_Page_084.QC.jpg
6025a803e1c40a6b05415e86c71651ed
60601a40c8525dd2de50a24d9d3376311ee66b09
72324 F20101206_AACMHI giurcanu_m_Page_089.jpg
44086112060da489e0d5945878d43931
075b20b9e24f8873f639898053aca1d2fa30d6cc
59778 F20101206_AACMGV giurcanu_m_Page_034.pro
766b5c585380fdb6c3dcd53f0bc3b096
8c1f66f1c5d05e38104460e8f25c3595a5c7f6a5
636804 F20101206_AACMHJ giurcanu_m_Page_101.jp2
921fb32a996fbb5eabcaf3046716c794
4ef93cecff40ff9a71fd16651e3cf122eeb7c1e0
23217 F20101206_AACMGW giurcanu_m_Page_072.QC.jpg
553f4104f259459ea00573908167568b
c92438de0648e4443a6b431a8205255df8a5e20b
1296 F20101206_AACMHK giurcanu_m_Page_095.txt
9bb418209b239e08ff2edfdfde8c0fdd
7d95bf0d8e571b7f26c1de5bd297b55564539e11
132353 F20101206_AACMGX giurcanu_m_Page_107.jp2
dbe7e206589ba71f0853232fda9f1c48
47c31617b0841755638071452fdcbdf4368ff5b7
27925 F20101206_AACMIA giurcanu_m_Page_029.QC.jpg
a58d259f7abffd7939369871f1d499f6
ba19dbbde0899ab9c1b38c079458de5b7d0e25ce
47743 F20101206_AACMHL giurcanu_m_Page_101.jpg
548cacb007be8a10210274ea6d73864c
939fc0c09d69669b89b45e573afcaee2a8c9e218
77007 F20101206_AACMGY giurcanu_m_Page_109.jpg
e7d71fbb545582a28e46f5911e48eb0c
54c7b1e9c00ea36a5392ae55c00ac4cb8188afd1
F20101206_AACMIB giurcanu_m_Page_071.tif
e4a51783201cd9bd4f2a30376e2628ed
91715f33fc96aee7d2d69860df5092bfc45e769e
69811 F20101206_AACMHM giurcanu_m_Page_004.jp2
a72ccf3bf8e80c51c1160a594264ee96
d15b038706d209e2edfe6656ee758b4c8303ccd5
5425 F20101206_AACMGZ giurcanu_m_Page_002.jp2
4b66a35f3d016fd995f605fcaafe926e
54eba3bdff01e281a32cddb081cec0c4469990ef
51442 F20101206_AACMIC giurcanu_m_Page_111.pro
ecb60dec00ac76afc8e3e82435f8fc98
6b6bfefa53cc6ed2ed5a06bdbfd3e8e3610900d5
59782 F20101206_AACMHN giurcanu_m_Page_029.pro
5a7532ca8229eb90ccb79a56da7092f2
c2c11234ae31227a8c7d761bfb6f6e3e016dd0cc
1051981 F20101206_AACMID giurcanu_m_Page_030.jp2
ed7e86f25fc7b2457b531b8d89d416f3
b353652228ff75110ff3710494acf5e1a7401dac
30088 F20101206_AACMHO giurcanu_m_Page_081.pro
da2120db522086bf58d91ce8bf4f3c09
b9f635de28b9d616555877696487cb1644dae367
517234 F20101206_AACMIE giurcanu_m_Page_032.jp2
49a2fc48a5eb5eb26ccb697b23aa4c85
3752bc1183b28efe7bfbb4239a2b98d31c2dfbfe
F20101206_AACMHP giurcanu_m_Page_013.tif
03ea65127f3f6ceb24ad1a85525a3027
c7dfe5c7c988848d789a676dc2415be2ac518e6d
F20101206_AACMIF giurcanu_m_Page_094.tif
fba73ebe20927d8724bde3300b4dec21
2508a197b08b4501cd3a4e793778e1acf05564ff
18915 F20101206_AACMHQ giurcanu_m_Page_069.QC.jpg
a7584fc8c89b887705776ae03c14cd7f
5bbbda92025e183b6105a677dc1e8cd006cbd3ac
50399 F20101206_AACMIG giurcanu_m_Page_079.jpg
e1d82015b1ca8bb5c8c7de2586b9ab66
275ac44457e2c3f7d442e3c5e40ce587ecb190ef
28642 F20101206_AACMHR giurcanu_m_Page_030.QC.jpg
bf4ec33a7ee1aa2566e881c2e4d03d7f
8ca10ff27f2cca52bdf387e11e860952e0204633
5196 F20101206_AACMIH giurcanu_m_Page_103thm.jpg
cacede904850bdbcc221e53126a8c067
839351c03303770cd6b58f0953513ba296084e69
5919 F20101206_AACMHS giurcanu_m_Page_018thm.jpg
95d89e1db8514ce0f42d07a0b445acfa
14d8c7109478b8b175d55e655ff19fcd77004ff1
37819 F20101206_AACMII giurcanu_m_Page_103.pro
5d0a2d4daa9bcb10231fe376092ab01d
941d7cd1d56cc60ce11b1e699cc9cc57df66ec7c
1257 F20101206_AACMHT giurcanu_m_Page_086.txt
40411d39821c96b45185a90d7c9a9b92
5ed5111cf5a4f4f591381e3ab1ddc0f33f655353
F20101206_AACMUE giurcanu_m_Page_103.tif
07e5bcb842bc945afc2f4993b3b0612e
deca45ba6f74210f1ad2552d3fe4eff7dc1d6499
F20101206_AACMTQ giurcanu_m_Page_078.tif
b096a6211434d9d12d8537eeab906d5f
aaac6608bb7ab5e09e9bd2a3115d57398078de6f
F20101206_AACMUF giurcanu_m_Page_104.tif
de2eee2a76f7ac40f881952495613984
b9a5861de3f73a2f8035e818833967fa940404d5
F20101206_AACMTR giurcanu_m_Page_079.tif
956a14c47987d32c579c6b8aa4e83e6c
2c3fec814deab9384b19dc0bf1e9aaba4889a13f
F20101206_AACMUG giurcanu_m_Page_105.tif
5e58ac7545b826bdbfbb278b981388a8
ef88c11612e7e2987a01d033215c6efd0076285d
F20101206_AACMTS giurcanu_m_Page_082.tif
4e910f03d8bdf4cdb51a221753b21040
5c22fcf59304764cdb98b76ff7627dfb436783e8
F20101206_AACMUH giurcanu_m_Page_106.tif
c576459e028249f9494fad97dbbf4e54
8ca1d0b489123c8f2d881001d038763a10475982
F20101206_AACMTT giurcanu_m_Page_083.tif
672d2f06df04fe92cafdc1260be8f486
0b8579c67175dd65cf44e2d6025016879e820eed
F20101206_AACMUI giurcanu_m_Page_107.tif
d17fea73259162b5bd935b3bf4807603
fd38c48621b79f8ff9e8471cf555117bd1ae6cef
F20101206_AACMUJ giurcanu_m_Page_109.tif
1792878059a8336131fb93ceb3590b39
bdfd94f22251e5d606cae60df226287b561fe49f
F20101206_AACMTU giurcanu_m_Page_085.tif
09479f1658c5f3bb298c4b37193e9306
bab40dcf90c2d8ae773b6e10a84b9c5044db6bae
F20101206_AACMUK giurcanu_m_Page_111.tif
69f6fbd95b74b27b1cdbb3dafb03a81c
ec0307b4d8a7052f91b7bf03246a77f3dc1a33ed
F20101206_AACMTV giurcanu_m_Page_088.tif
3324b8999c8a6ec35c368c3876768fb8
5cb81f8fe52ff06d991f6084bbeb47d7d48bc2c3
7527 F20101206_AACMUL giurcanu_m_Page_001.pro
6f19d71709a63c2fd2210728fd1e6630
ef90203b7ac4c264f0a6a8e948ce8afe2da0e802
F20101206_AACMTW giurcanu_m_Page_091.tif
be9f070880a27f61c868177dd0ef7067
ad64eff848fd0518677bbb193245f57e42bc4e33
55351 F20101206_AACMVA giurcanu_m_Page_028.pro
f611d617cc0fe1a872542d46f67217ec
ff3ee78c531d7582e2a994822160f3e25103885e
2462 F20101206_AACMUM giurcanu_m_Page_003.pro
473788329f13a957d3f7000f02c6cea4
514af5347ec73069d48860f59cb758ce40158e8e
F20101206_AACMTX giurcanu_m_Page_092.tif
6a587719b4994afb397f767b4cd04e9c
c0de096e9ce7189ad4e42b9231d720bc1e549520
61266 F20101206_AACMVB giurcanu_m_Page_030.pro
d9f69da5058c86767bf558a531207393
52e269555918006cd514732f36c50e122cebdcf1
14249 F20101206_AACMUN giurcanu_m_Page_006.pro
ada4788c4c25e7a09e6c6c239c62f40f
17415b007486cbaf8cf7011a1e881edea82425db
F20101206_AACMTY giurcanu_m_Page_096.tif
de935d993785d5eae85881146845b024
54f2a234c9229000fcd1436451da7fcd93171f0c
44746 F20101206_AACMVC giurcanu_m_Page_033.pro
15d1a94466f8e11b058fe3b0d5d7f13d
e39d1a12599dd41716dda380e6b633562e179531
49529 F20101206_AACMUO giurcanu_m_Page_009.pro
943dfc0def02e49a0d395f05f803d42c
9d45741260022be2c6fe4e036b2140f9123d41a1
F20101206_AACMTZ giurcanu_m_Page_097.tif
584b3c9db10575ae7ccffdaae27a0940
5295722029b975ba65a5bb138039ca9ecb64a112
50270 F20101206_AACMVD giurcanu_m_Page_037.pro
11786eecdd46446025b9b130738edc95
ce10aaa137942c308b7b2339a0ebf05ad2e070ab
16852 F20101206_AACMUP giurcanu_m_Page_010.pro
517d0652e09267fd266246b519f85cee
058461e15bade9fd86bf30cf205e5a8109c9d3ad
48491 F20101206_AACMVE giurcanu_m_Page_038.pro
1c6e96645c49cc644e4f76e3e30239a5
7056eeca768bddee722184bd8bce0081789bcf2f
41884 F20101206_AACMUQ giurcanu_m_Page_013.pro
d066573da107857b3e2de06b3816a735
f66a5a34c6817e035879871f81cbfe4529bdcf08
62063 F20101206_AACMVF giurcanu_m_Page_043.pro
baede11a26f07fa71efaa8f0c07a59d1
a9fb634c21bb871072b694ad4385f46ace1f88a0
44333 F20101206_AACMUR giurcanu_m_Page_014.pro
38bc5594b826f3782353cb57cbe2ac76
15a007a3e74c222810ecd13251af07785dbb5c2c
51425 F20101206_AACMVG giurcanu_m_Page_044.pro
fe78c1e8a6d50de1b72cd18c88295b77
ce5aa722592a47a45ed8d473c177c937aae0a3a6
43626 F20101206_AACMUS giurcanu_m_Page_016.pro
8a864ba3ca5d1742a35b8d350e856c72
82194df4256aac4d332e0ab7c338d256b7d6e9c3
32930 F20101206_AACMVH giurcanu_m_Page_046.pro
4344fe5e1962195416013c2083793583
2b78c5b738b352bc63004dddc048a560ceacfd72
42658 F20101206_AACMUT giurcanu_m_Page_018.pro
f060cc96ecde0ecbcfe22868c227ad3f
8c52ac46f9166aab0898689832a2331569d9e01e
37815 F20101206_AACMVI giurcanu_m_Page_047.pro
5acdc5c58c589888a51bde7ada6060e4
ce1845b1bd43a587a024c06b8a69199c043f0c24
56732 F20101206_AACMUU giurcanu_m_Page_019.pro
62467b20da50247a353094f3b12a79f6
140a5e3c736c860eb179fd91840319fe3c12efc9
49877 F20101206_AACMVJ giurcanu_m_Page_048.pro
57e7183ef32306b9a9ee3cd53181bed0
c8c6d22dd14517661c539fda58f49feae0b1ba32
56083 F20101206_AACMVK giurcanu_m_Page_049.pro
1038c304900e0b58cb3089fe37353153
6929d73c8ae28ad24ce038d45b73662576f3ad92
56052 F20101206_AACMUV giurcanu_m_Page_020.pro
ffcd3bfc958631125155dcdcbfc1102b
39a5179a2fd25d14d74ee49d5c3d156629c7656a
34873 F20101206_AACMVL giurcanu_m_Page_051.pro
579c7e7e3e42c10f5b015ed52e3da3e7
5c4ef491d90e44673dfa68936868a87562e4f97d
45282 F20101206_AACMUW giurcanu_m_Page_023.pro
051eea8cfbbe3822a5cde54a32c9699c
f2e7d19dda8a071bd9fb9c67613d77f44e3c2d44
45818 F20101206_AACMVM giurcanu_m_Page_052.pro
e6a3ca392f577e7c254e1a67fc6a1745
00c683ff4eeb56e1b318fba847ade65b225f6d10
43630 F20101206_AACMUX giurcanu_m_Page_024.pro
ddd06eb5d60d06e839963ea46849862b
79036a1647f87b706dc8f343f6fab3609fa67826
40611 F20101206_AACMWA giurcanu_m_Page_074.pro
1dd7ac6d138d76a23347ff004bab6b24
221b3b9cbb3cfc52e7cb13cc9a5473bdd43e0755







BIASED BOOTSTRAP METHODS FOR SEMIPARAMETRIC MODELS


By

MIHAI C. GIURCANU



















A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2007


































2007 Mihai C. Giurcanu

































To my parents, my lovely wife Magda, and

my wonderful children Michael and Stefanie









ACKNOWLEDGMENTS

I am indebted to many people who gave me support and advice while a graduate

student. First, I want to thank Professor Brett Presnell for giving me the opportunity

to work on very interesting statistical topics during my graduate education. He was an

invaluable source of inspiration in this fascinating research. This dissertation would not

have been possible without his constant feedback and guidance. Our weekly discussions

helped me better understand the statistical issues involved in our research. I also want

to thank to my graduate committee members Professor Malay Ghosh, Professor Alex

Trindade, Professor Jim Hobert, and Professor Murali Rao for reading my research.

Our conversations on different statistical topics improved my understanding and vision

of statistics. I also want to thank my first mathematics teacher, Professor Ignatie

Henny, who made me discover the beauty of mathematics. I especially want to thank my

parents for their love and interest in my education and my family, my wife Magda, and my

children Michael and Stefanie for their love, support, encouragement, and understanding

when sometimes I had to stay long hours at school to finish this dissertation.









TABLE OF CONTENTS


page

ACKNOWLEDGMENTS ................................. 4

LIST OF TABLES .................................. 7

LIST OF FIGURES ................................. 8

ABSTRACT ........................................ 9

CHAPTER

1 ESTIMATION IN MOMENT CONDITION MODELS ..................... 11

    1.1 Introduction ........................................ 11
    1.2 Generalized Method of Moments ....................... 11
        1.2.1 Review on Generalized Method of Moments ....... 11
        1.2.2 GMM Estimation and Asymptotic Results ......... 13
        1.2.3 Nested GMM Models ............................. 17
    1.3 M-Estimation ........................................ 18
    1.4 Empirical Likelihood Estimation ..................... 20
        1.4.1 Review on Empirical Likelihood ................ 20
        1.4.2 Empirical Likelihood for Moment Condition Models 22

2 THE BIASED BOOTSTRAP WITH IID OBSERVATIONS ................ 27

    2.1 Review on the Bootstrap with IID Observations ....... 27
    2.2 Least Favorable Families Corresponding to Z-Estimation Model 31
    2.3 The Biased Bootstrap for GMM ........................ 33
        2.3.1 Consistency Results for the Biased Bootstrap .. 35
        2.3.2 The Biased Bootstrap Recycling ................ 38
    2.4 Instrumental Variables .............................. 42
        2.4.1 Review on Instrumental Variables .............. 42
        2.4.2 Bootstrapping 2SLS Estimators ................. 48
        2.4.3 Simulations ................................... 51

3 THE BLOCK BIASED BOOTSTRAP FOR TIME SERIES ................ 59

    3.1 Review on Bootstrap for Time Series ................. 59
    3.2 The Block Biased Bootstrap for Generalized M-Estimators 60
    3.3 Consistency Results for the Block Biased Bootstrap .. 63
    3.4 Iterated Block Biased Bootstrap Recycling ........... 65
    3.5 An Application to the Optimal Block Size Selection .. 67

4 A HYBRID BIASED BOOTSTRAP ................................. 74

    4.1 On the Boundary of the Parametric Space ............. 74
    4.2 Certain Asymptotically Nonnormal Statistics ......... 76









APPENDIX

A PROOFS OF CONSISTENCY RESULTS FROM CHAPTER I .............. 79

B PROOFS OF RESULTS FROM CHAPTER II ......................... 82

    B.1 Least Favorable Families ............................ 82
    B.2 Consistency of the Biased Bootstrap for GMM Estimators 84
    B.3 Consistency Results for Bootstrapping 2SLS Estimators 95

C PROOFS OF CONSISTENCY RESULTS FROM CHAPTER III ............ 100

REFERENCES .................................................. 106

BIOGRAPHICAL SKETCH ......................................... 112









LIST OF TABLES


Table page

2-1 Estimated coverage probabilities for bootstrap one-sided, upper confidence intervals
at a .90 nominal level, =7.8, B=1000, S=1000: GMM biased bootstrap (GBB),
GMM uniform bootstrap (GUB), centered residuals (FCR), and based on asymptotic
approximation (ASY) .................. .............. .. 56

2-2 Estimated coverage probabilities for bootstrap one-sided, upper confidence intervals
at a .95 nominal level, =7.8, B=1000, S=1000: GMM biased bootstrap (GBB),
GMM uniform bootstrap (GUB), centered residuals (FCR), and based on asymptotic
approximation (ASY) .................. .............. .. 57

3-1 Computation of the optimal block size for uniform block bootstrap estimation
of level-2 parameters Q1 and Q2 given by (3-22) and (3-23). The number of simulations
is S=1000, and the number of bootstrap resamples is B=1000. An asterisk (*)
shows the block size for which the minimum RMSE has been attained. . 72

3-2 Computation of the optimal block size for moving block biased bootstrap estimation
of level-2 parameters Q1 and Q2 given by (3-22) and (3-23). The number of simulations
is S=1000, and the number of bootstrap resamples is B=1000. An asterisk (*)
shows the block size for which the minimum RMSE has been attained. . 72

4-1 The quantiles of the distribution of T_n under p=0, and their bootstrap approximations
given by the ordinary (uniform) bootstrap (UB) and the "hybrid" biased bootstrap (HBB),
using B=1000 bootstrap resamples, S=1000 simulation runs and 6 n4 78

4-2 The quantiles of the asymptotic distribution of T_n under p=0.2, and their bootstrap
approximations given by the ordinary (uniform) bootstrap (UB) and the "hybrid"
biased bootstrap (HBB), using B=1000 bootstrap resamples, S=1000 simulation
runs and T n-.4 ................ ............... 78









LIST OF FIGURES


Figure page

2-1 Estimated coverage errors corresponding to different bootstrap confidence intervals,
at different α levels, sample size n=20, 40: o GMM Biased Bootstrap (GBB),
□ GMM uniform bootstrap, ◇ GMM Recycling Biased Bootstrap (RBB), △
centered residuals (FCR), ▽ based on asymptotic approximation (ASY) . 57

2-2 Estimated coverage errors corresponding to different bootstrap confidence intervals,
at different α levels, sample size n=60, 80: o GMM Biased Bootstrap (GBB),
□ GMM uniform bootstrap, ◇ GMM Recycling Biased Bootstrap (RBB), △
centered residuals (FCR), ▽ based on asymptotic approximation (ASY) . 58

2-3 Estimated coverage errors corresponding to different bootstrap confidence intervals,
at different α levels, sample size n=100, 200: o GMM Biased Bootstrap (GBB),
□ GMM uniform bootstrap, ◇ GMM Recycling Biased Bootstrap (RBB), △
centered residuals (FCR), ▽ based on asymptotic approximation (ASY) . 58

3-1 The bootstrap estimates of the RMSEs corresponding to different block bootstrap
schemes. We used B=1000 outer bootstrap resamples (for the biased bootstrap
recycling (BBR) and the adjusted biased bootstrap recycling (ABBR)) and, for
the uniform double bootstrap (UB) and the double biased bootstrap (BB), an
additional 500 inner bootstrap resamples for each outer bootstrap resample 73









Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy

BIASED BOOTSTRAP METHODS FOR SEMIPARAMETRIC MODELS

By

Mihai C. Giurcanu

August 2007

Chair: Brett Presnell
Major: Statistics

The finite sample properties of estimators in some moment condition models

often differ substantially from the approximations provided by asymptotic theory. The

bootstrap can provide a way to circumvent the inadequacies of asymptotic approximations,

but in over-identified models, where the dimension of the parameter is smaller than the

number of moment conditions, the usual uniform bootstrap may be inconsistent. This

problem is usually solved by recentering either the residuals, the sample estimating

equations, or the statistic of interest.

In this dissertation, we developed a new biased bootstrap methodology for moment

condition models. This biased bootstrap is a form of weighted bootstrap with the

weights chosen to satisfy the constraints imposed by the model. First, we construct a

pseudo-parametric family of weighted empirical distributions, obtained by minimizing the

Cressie-Read distance to the empirical distribution under the constraints imposed by the

model. The resulting family has the least favorable property, meaning that the inverse of

the Fisher information matrix evaluated at the MLE equals the sandwich estimator. By

resampling within this family, we "mimic" the parametric bootstrap for semiparametric

models. An extension of this methodology for time series applies the biased bootstrap to

the sample of blocks of consecutive observations.

Our overall goal is to extend and develop the range of applications and theoretical

properties of the biased bootstrap, focusing mainly in three directions. First, we prove









that the biased bootstrap is consistent in moment condition models, with no need for

"recentering." Moreover, by applying bootstrap recycling within the pseudo-parametric

family, we obtain computationally feasible and more accurate iterated biased bootstrap

procedures. The main idea here is to reuse the first level bootstrap resamples in order

to estimate higher level parameters corresponding to the iterated bootstrap. Third, new

biased bootstrap procedures are proposed for problems where the usual uniform bootstrap

fails, such as on the boundary of the parameter space and for certain asymptotically

nonnormal statistics.









CHAPTER 1
ESTIMATION IN MOMENT CONDITION MODELS

1.1 Introduction

In this chapter we review three moment condition models that will be used when

defining the biased bootstrap procedures in the next chapters. We first present the

Generalized Method of Moments (GMM), a very popular method of estimation in

econometrics. We describe the GMM estimators and their asymptotic properties, and

we give a new proof of consistency for GMM estimators. In the next section we describe

M-estimation. M-estimators are more general than the GMM estimators, but their

asymptotic properties are usually studied for some particular cases. Here we prove a

consistency result for M-estimators based on concave criterion functions and an extension

allowing for nuisance parameters. We close this chapter with empirical likelihood, a

more recently proposed approach to inference. Empirical likelihood estimation offers an

alternative to GMM estimation for over-identified models and it is closely connected with

the biased bootstrap defined in the next sections. Some techniques used in proving the

asymptotic properties of empirical likelihood are also used in proving the corresponding

results for the biased bootstrap.

1.2 Generalized Method of Moments

1.2.1 Review on Generalized Method of Moments

The generalized method of moments was first introduced in the econometric

literature by Hansen (1982) and has been widely applied to time series, cross sectional

data, and panel models, particularly with applications to economic and financial data.

This methodology generalizes many standard estimation methods, including maximum

likelihood (ML), ordinary least squares (OLS), generalized estimating equations (GEE),

and the method of moments (MM), by allowing the number of estimating equations to

exceed the number of free parameters.









As Hall (2005) remarks in a recent econometrics textbook, GMM has had a great

impact in the econometrics literature mostly because economic models rarely provide

a complete specification of the probability distributions of the data. Moreover, the

optimality properties of maximum likelihood estimators (MLEs) are only attained when

the distributional assumptions are correctly specified. Under misspecification, White

(1982b) proved that (pseudo) MLEs are no longer optimal, emphasizing the need for

alternative methods of estimation to reduce the impact of misspecification in parametric

modeling.

In a recent review paper, Lindsay and Qu (2003) remark that GMM combines

estimating equations in an optimal way and that the corresponding procedures are highly

efficient in the following sense: GMM estimators are equivalent to the estimators based

on the best linear combinations of the estimating equations, in terms of the asymptotic

variance. In particular, if the set of estimating equations (moment conditions) contains

the score equations of a correctly specified parametric model, then the GMM estimator is

asymptotically equivalent to the MLE, though a second order effect may be evident with

smaller samples when additional estimating equations are included in the set of scores.

Qu et al. (2000) argue that GMM can be used to improve the efficiency of generalized

estimating equations in longitudinal data models (Liang and Zeger, 1987). They

optimally combine an extended set of scores in such a way that, under misspecification,

the estimators are more efficient than those based on the GEE for a given working

correlation matrix. Park (2000) uses GMM to balance robustness and efficiency of point

estimators combining both efficient and robust scores for parameters in order to obtain

a GMM estimator associated with the implied semiparametric model that is efficient for

both heavy and light tailed distributions. Qu and Song (2002) use the GMM J-test of

over-identifying restrictions for testing whether missing data is ignorable by constructing

an additional set of scores based on different missing data patterns. They distinguish









between ignorable and non-ignorable missing data models by whether the semiparametric

model induced by the extended set of scores on different missing data patterns is true.

1.2.2 GMM Estimation and Asymptotic Results

Let X = \{X_1, \ldots, X_n\} be an iid sample from a distribution F with support in
R^d and let \theta_0 \in \Theta \subset R^p be the parameter to be estimated. For a given f, we denote
by E_F[f] = E[f(X_1)] the expectation of f with respect to the underlying distribution
function F and denote by F_n the empirical distribution function corresponding to the
sample. As discussed in the introduction, a GMM model specifies population moment
conditions in terms of a function b : R^d \times \Theta \to R^q whose expectation under F is zero. Let
b_\theta(x) = b(x, \theta). We give below a formal definition following Hall (2005).
Definition 1.2.1 (Population moment condition). The moment condition of a GMM model
is given by

    E[b(X, \theta_0)] = E[b_{\theta_0}] = 0.    (1-1)

Having defined the moment condition that identifies the GMM model, we also require
that \theta_0 be globally identifiable (Hall, 2005).

Definition 1.2.2 (Global identifiability). \theta_0 is globally identifiable if

    E_F[b_\theta] \neq 0 for all \theta \in \Theta with \theta \neq \theta_0.    (1-2)

Usually, in order for \theta_0 to be globally identifiable, it is necessary for the dimension
of b_\theta (the "basic score") to equal or exceed the dimension of the parameter vector \theta, i.e.,
p \leq q. Henceforth we will assume that this condition is satisfied.

When p = q, the Z-estimator \hat{\theta} for \theta_0 = \theta(F) is obtained by considering the sample
version of (1-1),

    E_{F_n}[b_\theta] = n^{-1} \sum_{i=1}^n b(X_i, \theta) = 0,    (1-3)
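As a computational aside (a sketch of ours, not part of the dissertation), the root of the sample estimating equation (1-3) can be found by Newton-Raphson when p = q; the function names below are illustrative, and the toy check uses the mean score b(x, \theta) = x - \theta, whose unique root is the sample mean.

```python
import numpy as np

def z_estimate(score, jac, x, theta0, steps=25):
    """Solve the sample estimating equation (1-3),
    n^{-1} sum_i b(X_i, theta) = 0, by Newton-Raphson.
    Assumes p = q and an invertible average Jacobian."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(steps):
        bn = np.mean([score(xi, theta) for xi in x], axis=0)  # b_n(theta)
        Dn = np.mean([jac(xi, theta) for xi in x], axis=0)    # average Jacobian
        theta = theta - np.linalg.solve(Dn, bn)               # Newton step
    return theta

# Toy check: for b(x, theta) = x - theta the root is the sample mean,
# and Newton converges in one step since the score is linear in theta.
rng = np.random.default_rng(4)
x = rng.normal(2.0, 1.0, 300)
theta_hat = z_estimate(lambda xi, t: np.array([xi - t[0]]),
                       lambda xi, t: np.array([[-1.0]]), x, np.array([0.0]))
```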









and solving it in \theta. In this case, under regularity conditions, \hat{\theta} is consistent and
asymptotically normal, with

    n^{1/2}(\hat{\theta} - \theta_0) \Rightarrow N(0, D^{-1} V (D^{-1})^T),    (1-4)

where D = E_F[\nabla b_{\theta_0}], \nabla b(x, \theta_0) is the Jacobian of b(x, \theta_0), and V = Var_F[b_{\theta_0}]
(van der Vaart, 1998, Theorem 5.21, p. 52). The sandwich estimator \hat{\Sigma}, the nonparametric
estimator of the asymptotic covariance matrix of the Z-estimator, is given by

    \hat{\Sigma} = \Big( n^{-1} \sum_{i=1}^n \nabla b(X_i, \hat{\theta}) \Big)^{-1}
        \Big( n^{-1} \sum_{i=1}^n b(X_i, \hat{\theta}) b(X_i, \hat{\theta})^T \Big)
        \Big( n^{-1} \sum_{i=1}^n \nabla b(X_i, \hat{\theta})^T \Big)^{-1}.    (1-5)
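The sandwich estimator (1-5) is straightforward to assemble from the scores and their Jacobians. The sketch below is our own illustration (not code from the dissertation); for the mean score b(x, \theta) = x - \theta evaluated at the sample mean, it collapses to the sample variance.

```python
import numpy as np

def sandwich(score, jac, x, theta):
    """Sandwich estimator (1-5): D_n^{-1} V_n (D_n^{-1})^T, where D_n
    averages the score Jacobians and V_n the score outer products."""
    b = np.array([score(xi, theta) for xi in x])        # n x q score matrix
    Dn = np.mean([jac(xi, theta) for xi in x], axis=0)  # average Jacobian
    Vn = b.T @ b / len(x)                               # average of b b^T
    Dinv = np.linalg.inv(Dn)
    return Dinv @ Vn @ Dinv.T

# For b(x, theta) = x - theta at theta = sample mean, D_n = -1 and
# V_n is the (ddof = 0) sample variance, so Sigma-hat equals x.var().
rng = np.random.default_rng(0)
x = rng.normal(size=200)
S = sandwich(lambda xi, t: np.array([xi - t]),
             lambda xi, t: np.array([[-1.0]]), x, x.mean())
```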

Unfortunately, (1-3) usually has no solution when q > p, even though the population
equation (1-1) is satisfied. One way around this problem is to consider the GMM
estimator.

Definition 1.2.3 (GMM estimator). For any symmetric, positive definite matrix W_n
(possibly random), the GMM estimator of \theta_0 is defined as

    \hat{\theta}_n = \operatorname{argmin}_{\theta \in \Theta} b_n(\theta)^T W_n b_n(\theta),    (1-6)

where b_n(\theta) = n^{-1} \sum_{i=1}^n b(X_i, \theta).

We will always assume that the sequence of matrices (W_n) converges in probability
to a (nonrandom) positive definite matrix W. Denote by b(\theta) = E_F[b_\theta] the expectation
of the basic score under the true distribution F and let Q(\theta) = b(\theta)^T W b(\theta) and
Q_n(\theta) = b_n(\theta)^T W_n b_n(\theta). Then it is obvious that

    \theta_0 = \operatorname{argmin}_{\theta \in \Theta} Q(\theta).    (1-7)

Hansen (1982) first showed that under classical regularity conditions, the GMM estimator
is consistent and asymptotically normal. Usually, consistency results for GMM estimators
are obtained assuming one of two types of conditions. Either the parameter space is
assumed to be compact, in which case the criterion function is required to be continuous
in the parameters, or the parameter space is arbitrary and the criterion function to be
maximized (minimized) is assumed to be concave (convex) (Hayashi, 2000, pp. 456-458).
We present here a general consistency result based on Theorem 5.7 of van der Vaart (1998,
p. 45). Suppose that for every \epsilon > 0,

    \sup_{\theta \in \Theta} |Q_n(\theta) - Q(\theta)| \to_P 0,    (1-8)

and

    \inf_{\theta : \|\theta - \theta_0\| \geq \epsilon} Q(\theta) > Q(\theta_0).    (1-9)

Then any sequence of estimators \hat{\theta}_n with Q_n(\hat{\theta}_n) \leq Q_n(\theta_0) + o_P(1) converges in probability
to \theta_0.

It is easy to see that if \Theta is compact and b(\theta) is continuous then the global
identifiability property (1-2) and condition (1-9) are equivalent. Moreover, if we further
assume that b(x, \theta) is continuous in \theta for all x and E_F[\sup_{\theta \in \Theta} \|b_\theta\|] < \infty, then using a
Uniform Law of Large Numbers (van der Vaart, 1998, Example 19.8, p. 272), we can also
establish (1-8). Consequently, we have established the following corollary (for other proofs,
see, e.g., Hall (2005, p. 67), Davidson and Mackinnon (1993, p. 592), and Matyas (1999,
p. 13)).

Corollary 1.2.1. Suppose that \theta_0 is an interior point of the parameter space \Theta satisfying
the population moment and the global identifiability conditions given in Definitions 1.2.1
and 1.2.2, respectively. Suppose further that \Theta \subset R^p is compact, that b(x, \theta) is continuous
in \theta, and that E_F[\sup_{\theta \in \Theta} \|b_\theta\|] < \infty. Then the GMM estimator defined in (1-6) is weakly
consistent, i.e.,

    \hat{\theta}_n \to_P \theta_0.


Under additional regularity conditions, Hansen (1982) showed that GMM estimators
are asymptotically normally distributed. We present here the most common conditions
found in the statistical and econometrics literature to assure asymptotic normality. If, in
addition to the conditions of Corollary 1.2.1, b(x, \theta) is continuously differentiable with
respect to \theta for all x, E_F\|b_{\theta_0}\|^2 < \infty, and E_F[\sup_\theta \|\nabla b_\theta\|] < \infty, then

    n^{1/2}(\hat{\theta}_n - \theta_0) \Rightarrow N(0, (D^T W D)^{-1} D^T W V W D (D^T W D)^{-1}),

where \nabla b(x, \theta_0) is the Jacobian of b(x, \theta_0), with (i, j)th entry \partial b_i(x, \theta_0)/\partial \theta_j,
D = E_F[\nabla b_{\theta_0}], and V = Var_F[b_{\theta_0}].

Theorem 3.2 of Hansen (1982) proves that the choice W_n = \hat{V}^{-1} gives the
smallest asymptotic variance of GMM estimators, where \hat{V} is any consistent estimator
of V (note though that W_n is then random). In this case the GMM estimator is called
efficient. However, in order to estimate V, we must first estimate \theta_0. Hansen proposed
a two-step GMM estimator, which is obtained by first computing an initial (inefficient)
estimator \tilde{\theta}_n of \theta_0 using an arbitrary weight matrix W (usually the identity), and then
letting \hat{\theta}_n = \operatorname{argmin}_\theta b_n(\theta)^T V(\tilde{\theta}_n)^{-1} b_n(\theta), where V(\theta) is an estimator of the asymptotic
covariance of b_n(\theta). Usually, V(\theta) = n^{-1} \sum_{i=1}^n b(X_i, \theta) b(X_i, \theta)^T, or the centered version
V(\theta) = n^{-1} \sum_{i=1}^n (b(X_i, \theta) - b_n(\theta))(b(X_i, \theta) - b_n(\theta))^T in semiparametric estimation,
but the covariance can also be modeled parametrically. The asymptotic distribution of the
two-step GMM estimator is

    n^{1/2}(\hat{\theta}_n - \theta_0) \Rightarrow N(0, (D^T V^{-1} D)^{-1}).    (1-10)

The k-step GMM estimator is defined by iterating the second step above k times,
replacing \tilde{\theta}_n by the current value of \hat{\theta}_n at each iteration. Another efficient estimator,
called the continuous updating GMM estimator, was developed by Hansen et al. (1996)
and is defined to be

    \hat{\theta}_n = \operatorname{argmin}_\theta b_n(\theta)^T V(\theta)^{-1} b_n(\theta).    (1-11)

It can be shown that the continuous updating GMM estimator is consistent and has the
same asymptotic properties as the two-step GMM estimator given in (1-10).
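For moment conditions that are linear in the parameter, such as the instrumental-variables condition E[Z(y - X^T\beta)] = 0, each GMM step has a closed form, and the two-step estimator is easy to sketch. The code below is our own toy illustration under a simulated over-identified design (q = 3 moments, p = 2 parameters), not an implementation from the dissertation.

```python
import numpy as np

def linear_gmm(y, X, Z, W):
    """One GMM step for E[Z'(y - X beta)] = 0: the closed-form
    minimizer of b_n(beta)' W b_n(beta), as in (1-6)."""
    A = X.T @ Z @ W @ Z.T @ X
    c = X.T @ Z @ W @ Z.T @ y
    return np.linalg.solve(A, c)

def two_step_gmm(y, X, Z):
    """Hansen's two-step estimator: identity weight first, then the
    efficient weight W = V(beta_1)^{-1}, V the sample score covariance."""
    q = Z.shape[1]
    beta1 = linear_gmm(y, X, Z, np.eye(q))    # first step, W = I
    u = (y - X @ beta1)[:, None] * Z          # n x q scores at beta_1
    V = u.T @ u / len(y)                      # sample covariance of scores
    return linear_gmm(y, X, Z, np.linalg.inv(V))

# Toy over-identified design (q = 3 > p = 2) with exogenous regressors,
# so all three moments hold at the true beta.
rng = np.random.default_rng(1)
x = rng.normal(size=1000)
X = np.column_stack([np.ones(1000), x])
Z = np.column_stack([np.ones(1000), x, x**2])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(size=1000)
beta_hat = two_step_gmm(y, X, Z)
```

With n = 1000 observations the two-step estimate lands close to the true coefficient vector, reflecting the root-n consistency discussed above.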









The J-test of over-identifying restrictions given in Lemma 4.2 of Hansen (1982) uses
the statistic

    Q_n(\hat{\theta}_n) = b_n(\hat{\theta}_n)^T V(\hat{\theta}_n)^{-1} b_n(\hat{\theta}_n),    (1-12)

where \hat{\theta}_n is an efficient GMM estimator, to test whether the GMM moment condition
E[b_\theta] = 0 holds for any value of \theta. Under the same assumptions as before, the J-test
statistic given in (1-12) for testing the over-identifying restrictions (1-1) satisfies

    n Q_n(\hat{\theta}_n) \Rightarrow \chi^2_{q-p}.    (1-13)
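To make (1-12) and (1-13) concrete, here is a toy J-test of our own construction (not the dissertation's code) with q = 2 moments and p = 1 parameter: paired samples assumed to share a common mean \theta, so b(x, y, \theta) = (x - \theta, y - \theta), the efficient step is closed form, and J = nQ_n(\hat{\theta}_n) is referred to the \chi^2_1 distribution.

```python
import numpy as np

def j_test(x, y):
    """Efficient GMM and the J-statistic (1-12) for the over-identified
    toy model E[x - theta] = E[y - theta] = 0 (q = 2, p = 1)."""
    n = len(x)
    theta1 = (x.mean() + y.mean()) / 2        # first step, identity weight
    b = np.column_stack([x - theta1, y - theta1])
    Winv = np.linalg.inv(b.T @ b / n)         # efficient weight V(theta_1)^{-1}
    m = np.array([x.mean(), y.mean()])
    ones = np.ones(2)
    # b_n(theta) = m - theta * 1, so the efficient step is a weighted mean
    theta2 = (ones @ Winv @ m) / (ones @ Winv @ ones)
    bn = m - theta2
    return theta2, n * bn @ Winv @ bn         # (theta_hat, J = n Q_n)

rng = np.random.default_rng(2)
theta_hat, J = j_test(rng.normal(1.0, 1.0, 500), rng.normal(1.0, 2.0, 500))
```

Since the restriction holds in this simulation (both samples have mean 1), J is compared with \chi^2_{q-p} = \chi^2_1 quantiles; large values would indicate incompatible moments.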

1.2.3 Nested GMM Models

Lindsay and Qu (2003) analyzed the GMM methodology within parametric,
semiparametric, and nonparametric frameworks. They identify three levels of nested
models. At the first level, we find the parametric model. Here, in general, we identify
some basic scores that define the parameter of interest. For example, for a univariate
distribution, we can take b_1(x, \theta) = x - \theta as a basic score for the mean and
b_2(x, \theta) = sign(x - \theta) \cdot 0.5 as a basic score for the median. In the case of a normal
distribution with known variance, say N(\theta_0, 1), both basic scores provide consistent
estimators of the location parameter \theta_0. In this case, any efficient GMM estimator based
on the basic scores b_1 and b_2 is fully efficient, in the sense that its asymptotic variance
equals the inverse Fisher information.

If the parametric model is not correct, Lindsay and Qu (2003) define the semiparametric
model implied by the moment conditions to be the set of all distributions F compatible
with the scores, i.e., the set of all F for which there exists \theta \in \Theta such that E_F[b_\theta] = 0.
Then the GMM estimators are consistent under this weakening of the model assumptions.
On the other hand, if the parametric model holds and the scores are correctly specified in
the semiparametric model, then any efficient GMM estimator is first order equivalent to
the MLE.









The semiparametric model is false when the underlying distribution F is incompatible
with the basic scores, i.e., E_F[b_\theta] \neq 0 for all \theta \in \Theta. In this situation, we can still obtain
consistent estimators by taking as our parameter the value of \theta \in \Theta that minimizes the
quadratic distance between the expected scores b(\theta) and the zero vector (also known as
the Mahalanobis distance), i.e.,

    \theta_0 = \operatorname{argmin}_{\theta \in \Theta} Q(\theta),    (1-14)

where Q(\theta) = b(\theta)^T V(\theta)^{-1} b(\theta), with b(\theta) = E_F[b_\theta] and V(\theta) = E_F[b_\theta b_\theta^T]. Let Q_n(\theta) be
the sample version of Q(\theta), i.e.,

    Q_n(\theta) = b_n(\theta)^T V_n(\theta)^{-1} b_n(\theta),    (1-15)

with V_n(\theta) the sample version of V(\theta). Lindsay and Qu (2003) call Q_n(\theta) "the quadratic
inference function" and they show that this inference function mimics the properties of the
log-likelihood (even when the semiparametric model is false):

"Likelihood ratio" test: n Q_n(\theta_0) - n Q_n(\hat{\theta}_n) \Rightarrow \chi^2_p.

"Profile likelihood" test: n Q_n(\theta_{01}, \hat{\theta}_2(\theta_{01})) - n Q_n(\hat{\theta}_1, \hat{\theta}_2) \Rightarrow \chi^2_{\dim(\theta_1)} for testing
H_0 : \theta_1 = \theta_{01}, where \hat{\theta}_2(\theta_{01}) = \operatorname{argmin}_{\theta_2} Q_n(\theta_{01}, \theta_2).

"J-test of over-identifying restrictions": n Q_n(\hat{\theta}_n) \Rightarrow \chi^2_{q-p} under the semiparametric
model, yielding a valid goodness-of-fit test for the semiparametric model.

1.3 M-Estimation

M-estimation is another popular method for finding estimators. For more references
and results, see van der Vaart (1998), Serfling (1980), and Huber (1981). In this case, the
parameter of interest \theta_0 is given as a maximizer of the population "criterion function"
m(\theta),

    \theta_0 = \operatorname{argmax}_{\theta \in \Theta} m(\theta).    (1-16)

The M-estimator is defined as a maximizer of the "sample criterion" function,

    \hat{\theta} = \operatorname{argmax}_{\theta \in \Theta} m_n(\theta).    (1-17)

For instance, if m(\theta) = E[m(X, \theta)], we usually take m_n(\theta) = n^{-1} \sum_{i=1}^n m(X_i, \theta).
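A simple instance of (1-16) and (1-17) with a concave criterion is the median: m(\theta) = -E|X - \theta| is concave and maximized at the population median. The sketch below is our own illustration (a grid maximizer, adequate for a one-dimensional concave criterion), not code from the dissertation.

```python
import numpy as np

def m_estimate(x, crit, grid):
    """M-estimator (1-17): maximize the sample criterion
    m_n(theta) = mean_i crit(X_i, theta) over a 1-D grid."""
    values = [np.mean(crit(x, t)) for t in grid]
    return grid[int(np.argmax(values))]

# m_n(theta) = -mean|x - theta| is concave piecewise linear, and its
# maximizer is the sample median (unique for odd n).
rng = np.random.default_rng(3)
x = rng.standard_normal(401)
grid = np.linspace(-2.0, 2.0, 4001)           # grid resolution 0.001
med_hat = m_estimate(x, lambda x, t: -np.abs(x - t), grid)
```

Because the criterion is concave, the grid maximizer sits within one grid step of the sample median, illustrating how the consistency arguments of this section apply to concave criteria.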

We consider now consistency results for M-estimators given as maximizers of concave

criterion functions when the parameter space is not necessarily compact, a topic that

has been actively researched in econometric theory. In the case of GMM estimators,

the assumption of concavity is not restrictive for (basic) scores that are linear in the

parameters, since, in this case, it can be shown that the negative of the quadratic inference

function (1-15) is concave (H--ihi 2000, p. 468).

H i-l-hi (2000, p. 458) presents a proposition from N. y and McFadden (1994,

pp. 2133-2134) that establishes consistency for M-estimators based on concave criterion

functions. Proposition 1.3.1 below gives sufficient conditions for consistency under weaker

assumptions. Its proof, given in Appendix A, does not require that the sample criterion

functions converge in probability on the entire parameter space, nor does it depend on the

result that M-estimators corresponding to continuous criterion functions are consistent.

Moreover, it does not require the parameter space to be convex, as in Hayashi (2000,

p. 458). The proof uses only the fact that pointwise convergence in probability for concave

functions on an open set implies uniform convergence on compact subsets of that open set

(Pollard, 1991, sec. 6) and is easily adapted to accommodate nuisance parameters, as in

Proposition 1.3.2 (proven in Appendix A). For further details and applications to some

financial risk measures, see Giurcanu and Trindade (2006).

Denote by $S(t_0, \epsilon) = \{t \in \mathbb{R}^q : \|t - t_0\| = \epsilon\}$ and $B(t_0, \epsilon) = \{t \in \mathbb{R}^q : \|t - t_0\| \le \epsilon\}$ the sphere and the closed ball, respectively, centered at $t_0$ of radius $\epsilon$.

Proposition 1.3.1 (Consistency under concavity). Let $m(\theta)$ and $m_n(\theta)$ be the population and sample criterion functions, respectively, and let $\theta_0 = \arg\max_{\theta \in \Theta} m(\theta)$ and $\hat\theta_n = \arg\max_{\theta \in \Theta} m_n(\theta)$. Suppose that $\theta_0$ is globally identifiable, that $m_n(\theta)$ is concave in $\theta \in \Theta$ with probability 1, and that there exists a neighborhood $C \subset \Theta$ of $\theta_0$ such that for every $\theta \in C$, $m_n(\theta) \xrightarrow{P} m(\theta)$. Then $\hat\theta_n \xrightarrow{P} \theta_0$.

Proposition 1.3.2 (Consistency under concavity with nuisance parameters). Let $m(\theta)$ and $m_n(\theta)$ be the population and sample criterion functions, respectively, and let $\theta_0 = \arg\max_{\theta \in \Theta} m(\theta)$, with $\theta_0 = (\theta_{0,1}, \theta_{0,2})$. Let $C \subset \Theta$ be a neighborhood of $\theta_0$ and define, for every $\theta = (\theta_1, \theta_2) \in C$, $\hat\theta_{n,2}(\theta_1) = \arg\max_{(\theta_1, \theta_2) \in C} m_n(\theta_1, \theta_2)$. Suppose that $\theta_0$ is globally identifiable, $m_n(\theta)$ is concave in $\theta$ with probability 1, and that for every $\theta \in C$, $m_n(\theta) \xrightarrow{P} m(\theta)$. For any consistent sequence of estimators $\hat\theta_{n,1}$ for $\theta_{0,1}$, let $\hat\theta_{n,2} = \hat\theta_{n,2}(\hat\theta_{n,1})$. Then $\hat\theta_{n,2} \xrightarrow{P} \theta_{0,2}$.
1.4 Empirical Likelihood Estimation

1.4.1 Review of Empirical Likelihood

Empirical likelihood, introduced in a series of papers by Owen (1988, 1990, 1991),

is a nonparametric approach to inference with applications in many areas of statistics.

Empirical likelihood allows the use of likelihood methods without necessarily assuming

that the data are drawn from a parametric family of distributions. As Owen (2001,

p. 1) remarks in his comprehensive monograph on empirical likelihood, the advantages

of empirical likelihood arise because "it combines the reliability of the nonparametric

methods with the flexibility and effectiveness of the likelihood approach." He adopted

the name "empirical likelihood" because the empirical distribution of the data plays

an important role. As we will describe later in this section, alternative nonparametric

likelihood ratios have been developed that are also based on the empirical distribution

function, and as Owen (2001, p. 2) states in his book, "empirical likelihood ... is

distinguished more by being a likelihood than by being empirical".

The main idea of empirical likelihood is to construct a likelihood ratio statistic for the

parameter of interest using a multinomial distribution on the observed data. Owen (1988)

proves an analogue of Wilks' theorem, obtaining a $\chi^2$ asymptotic distribution for the

negative of twice the log empirical likelihood ratio. As Owen (1988) remarks, this result is

surprising because the number of nuisance parameters, $n - 1$, increases with the sample

size.









Empirical likelihood was initially applied to construct confidence regions for

parameters defined by statistical functionals, such as Z-estimators, Fréchet differentiable

functionals, and smooth functions of means (Owen, 1988). Owen (1991) applied empirical

likelihood to regression models by extending the theory to independent and non-identically

distributed observations. Kolaczyk (1994) made further extensions to generalized linear

models.

Empirical likelihood was soon recognized as a serious competitor to contemporary

methods of nonparametric inference, such as the bootstrap. Hall and La Scala (1990)

argue that "empirical likelihood ... deserves a prominent place in the modern statistician's

armory of computer-intensive methods." They identify the following advantages of empirical

likelihood over the bootstrap:

(1) empirical likelihood provides confidence regions for multivariate parameters, and

the shapes are data driven, being concentrated in places where the density of the

parameter estimator is greatest;

(2) empirical likelihood is Bartlett correctable, i.e., a correction for the mean reduces the

coverage error of confidence regions based on empirical likelihood from order $n^{-1}$ to

order $n^{-2}$;

(3) empirical likelihood does not require estimation of scale or skewness;

(4) empirical likelihood regions are range preserving and transformation respecting.

Imbens (2002) gives a review of recent developments concerning maximum empirical

likelihood estimators, which are defined as the maximizers of the empirical likelihood

over the parameter space. He remarks that their main merit is that they circumvent

the need to estimate the covariance matrix (the asymptotic covariance of the sample

criterion function) necessary in the case of GMM estimators and also they have a nice

information-theoretic interpretation. It turns out that the continuously-updating GMM

estimator is a particular case of the generalized empirical likelihood estimator obtained

in the class of Cressie-Read power divergences for the Euclidean distance ($\rho = 2$). He

compares these estimators on a simulated dynamic panel data set.

Empirical likelihood has been successfully applied in a time series context as well;

chapter 8 of Owen (2001) is dedicated to this subject. By reducing to independence, he

shows how to apply empirical likelihood in the case of AR(1) processes. Extensions to

arbitrary order autoregressive processes are easily obtained, and it would be interesting to

see how inference based on empirical likelihood competes with the classical approaches in

time series. Kitamura (1997) introduced a blockwise empirical likelihood that preserves

the dependence structure within the observations. By extending results from Qin and

Lawless (1994), he derives an efficient estimator by maximizing the blockwise empirical

likelihood. This estimator is called the maximum blockwise empirical likelihood estimator

and is the counterpart for time series of the maximum empirical likelihood estimator.

1.4.2 Empirical Likelihood for Moment Condition Models

Let $X_1, \ldots, X_n$ be an iid sample from $F$, $\theta_0 \in \Theta \subset \mathbb{R}^p$ the parameter of interest, and $b(x, \theta) = (b_1(x, \theta), \ldots, b_q(x, \theta))^T$ a $q$-dimensional function indexed by $\theta$, such that $\theta_0 = \theta(F)$ is the unique solution to the population equation

$$E_F[b_{\theta_0}] = 0. \qquad (1\text{-}18)$$


If p = q we obtain the classical Z-estimation model, and if q > p, then we obtain the GMM

model, as discussed in previous sections. We consider both models at the same time, and,

when necessary, we underline the differences. We first give some definitions.

Definition 1.4.1 (Nonparametric Likelihood). The nonparametric likelihood of a distribution $G$ is defined by

$$L(G) = \prod_{i=1}^n G\{X_i\}, \qquad (1\text{-}19)$$

where $G\{X_i\}$ represents the probability of getting the value $X_i$ under the distribution $G$.

This is not a likelihood as defined in classical statistical theory, but $L(G)$ is the

probability of obtaining exactly the observed values $X_1, \ldots, X_n$. In order to have a

positive nonparametric likelihood, $G$ must place positive probability on each observed

value $X_i$, $i = 1, \ldots, n$.

Definition 1.4.2 (Nonparametric Likelihood Ratio). The nonparametric likelihood ratio is defined to be

$$\mathcal{R}(G) = \frac{L(G)}{L(F_n)}, \qquad (1\text{-}20)$$

where $F_n$ is the empirical distribution function.

Definition 1.4.3 (Empirical Likelihood). The empirical likelihood for $\theta$ is defined as

$$\mathcal{R}(\theta) = \sup\{\mathcal{R}(G) : G \ll F_n,\; E_G[b_\theta] = 0\}, \qquad (1\text{-}21)$$

where the supremum is taken over all distributions supported on the sample.

It is shown in Owen (2001, pp. 11-12) that we can treat the data as if there were no ties, by considering the probabilities associated with observations and not with their values. If we represent any distribution $G \ll F_n$ by a vector of weights $p = (p_1, \ldots, p_n)$, where $p_i = G\{X_i\}$, then the empirical likelihood ratio can be written in an equivalent form as

$$\mathcal{R}(\theta) = \sup_p \left\{ \prod_{i=1}^n n p_i : \sum_{i=1}^n p_i b(X_i, \theta) = 0,\; \sum_{i=1}^n p_i = 1,\; p_i \ge 0 \right\}. \qquad (1\text{-}22)$$
Owen (1988) proves the following fundamental result. Let $X_1, \ldots, X_n \in \mathbb{R}^d$ be independent random vectors with common distribution $F$. For $\theta \in \Theta \subset \mathbb{R}^p$ and $x \in \mathbb{R}^d$, let $b(x, \theta) \in \mathbb{R}^p$. Let $\theta_0$ be such that $\mathrm{Cov}_F[b_{\theta_0}]$ is finite and has rank $p$. If $\theta_0$ satisfies $E[b_{\theta_0}] = 0$, then $-2\log(\mathcal{R}(\theta_0)) \rightsquigarrow \chi^2_p$. As Owen (2001, p. 41) remarks in his monograph, an interesting aspect of this asymptotic result is that it does not include conditions on $b(x, \theta)$ nor on $E_F[b_\theta]$.

Let $\mathcal{F}_{c,n} = \{G : \mathcal{R}(G) \ge c,\; G \ll F_n\}$ and $S_{c,n} = \{\theta : E_G[b_\theta] = 0 \text{ for some } G \in \mathcal{F}_{c,n}\}$. Owen's result suggests taking $c = \exp(-\chi^2_p(1-\alpha)/2)$, where $\chi^2_p(1-\alpha)$ is the $1-\alpha$ quantile of $\chi^2_p$, in order to obtain an asymptotic $100(1-\alpha)\%$ confidence region for $\theta_0$, i.e.,

$$P(\theta_0 \in S_{c,n}) \to 1 - \alpha, \quad \text{as } n \to \infty. \qquad (1\text{-}23)$$

Inspection of the proof reveals that the leading term in the asymptotic expansion of the empirical likelihood ratio is

$$-2\log(\mathcal{R}(\theta_0)) = n \bar b^T \hat V^{-1} \bar b + O_P(n^{-1}),$$

where $\bar b = n^{-1} \sum_{i=1}^n b(X_i, \theta_0)$ and $\hat V = n^{-1} \sum_{i=1}^n b(X_i, \theta_0) b(X_i, \theta_0)^T$. Let $T^2 = n \bar b^T S^{-1} \bar b$ be the corresponding Hotelling's $T^2$ statistic, where

$$S = \frac{1}{n-1} \sum_{i=1}^n (b(X_i, \theta_0) - \bar b)(b(X_i, \theta_0) - \bar b)^T.$$

Then $-2\log(\mathcal{R}(\theta_0)) = T^2 + O_P(n^{-1})$. Thus, Owen (1990) suggested using the quantiles of a scaled Fisher's $F$ distribution, $\frac{p(n-1)}{n-p} F_{p,n-p}$, instead of a $\chi^2_p$ when constructing the empirical likelihood confidence regions for $\theta_0$.

Hall and La Scala (1990), DiCiccio and Romano (1989), and DiCiccio et al. (1991) show that empirical likelihood is Bartlett correctable. Bartlett correction amounts to a mean correction of the empirical likelihood in order to achieve a coverage accuracy of order $O(n^{-2})$. Empirical likelihood is Bartlett correctable because the third and the fourth cumulants of the components of the signed root of the empirical likelihood are of orders at most $O(n^{-3/2})$ and $O(n^{-2})$, respectively. Consequently, the empirical likelihood admits the expansion

$$P\left( \frac{-2\log(\mathcal{R}(\theta_0))}{1 + a/n} \le x \right) = P(\chi^2_p \le x) + O(n^{-2}). \qquad (1\text{-}24)$$

Since the algebraic expression for $a$ is fairly complicated, Hall and La Scala (1990) suggest a bootstrap approximation.

Qin and Lawless (1994) extend empirical likelihood for Z-estimators to models where the dimension of the estimating equation is greater than that of the parameter. They define the maximum empirical likelihood estimator (MELE) to be the maximizer of the empirical likelihood ratio over the parameter space, i.e.,

$$\tilde\theta = \arg\max_{\theta \in \Theta} \mathcal{R}(\theta). \qquad (1\text{-}25)$$

Following the same arguments as in Owen (1990), Qin and Lawless (1994) prove that the optimal weights are given by $p_i(\theta) = (n(1 + \lambda^T b(X_i, \theta)))^{-1}$ for any $\theta$ in a small neighborhood of the true parameter $\theta_0$, where $\lambda$ is the Lagrange multiplier of the system (1-22) and satisfies

$$\sum_{i=1}^n \frac{b(X_i, \theta)}{n(1 + \lambda^T b(X_i, \theta))} = 0.$$

It is an easy exercise to show that for $p = q$, $\tilde\theta = \hat\theta$, the usual Z-estimator, but for $p < q$, $\tilde\theta$ generally differs from the GMM estimator $\hat\theta$.
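As a concrete illustration (mine, not the dissertation's), the weights can be computed in the simplest case $b(x, \mu) = x - \mu$, empirical likelihood for a univariate mean. The Lagrange multiplier $\lambda$ is found by a damped Newton iteration on the equation displayed above, and $-2\log\mathcal{R}(\mu) = 2\sum_i \log(1 + \lambda(X_i - \mu))$ can then be compared with $\chi^2_1$ quantiles.

```python
import numpy as np

def el_log_ratio(x, mu, iters=50):
    """Return (-2 log R(mu), weights p_i(mu)) for the mean, via Newton on lambda."""
    b = x - mu
    lam = 0.0
    for _ in range(iters):
        d = 1.0 + lam * b
        f = np.sum(b / d)             # root of f in lambda gives the optimal weights
        fp = -np.sum(b**2 / d**2)     # derivative, always negative
        step = f / fp
        while np.any(1.0 + (lam - step) * b <= 1e-8):
            step /= 2.0               # damp the step to keep all weights positive
        lam -= step
        if abs(step) < 1e-12:
            break
    w = 1.0 / (x.size * (1.0 + lam * b))   # p_i(mu) as in Qin and Lawless (1994)
    return 2.0 * np.sum(np.log(1.0 + lam * b)), w

rng = np.random.default_rng(1)
x = rng.normal(size=200)
stat0, w0 = el_log_ratio(x, x.mean())        # ratio is 1 at the sample mean
stat1, w1 = el_log_ratio(x, x.mean() + 0.3)  # a shifted mean shrinks the ratio
```

At the sample mean the multiplier is $\lambda = 0$, the weights are uniform, and the statistic vanishes, which is the $p = q$ degenerate case noted above.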

Qin and Lawless (1994) show that $\tilde\theta$, the MELE of $\theta_0$, is asymptotically normally distributed, with the same limiting distribution as the efficient GMM estimator. Specifically, assume that $E[b(X_i, \theta_0) b(X_i, \theta_0)^T]$ is positive definite, that $b(x, \theta)$ is two times continuously differentiable in a neighborhood of $\theta_0$ where $\|b(x, \theta)\|$, $\|\nabla b(x, \theta)\|$, and $\|\nabla^2 b(x, \theta)\|$ are bounded by some $F$-integrable function $G(x)$, and that the rank of $E[\nabla b(X_i, \theta_0)]$ is $p$. Then

(1) with probability 1, $\mathcal{R}(\theta)$ attains its maximum at a value $\tilde\theta$ in the interior of the ball $\|\theta - \theta_0\| \le n^{-1/3}$;

(2) $\sqrt{n}(\tilde\theta - \theta_0) \rightsquigarrow N(0, A)$ and $\sqrt{n}\,\tilde\lambda \rightsquigarrow N(0, U)$, and

(3) $-2\log\mathcal{R}(\tilde\theta) \rightsquigarrow \chi^2_{q-p}$,

where $I$ is the identity matrix, $A = (D^T V^{-1} D)^{-1}$, and $U = V^{-1}(I - D A D^T V^{-1})$.

Result (3) gives a test for over-identifying restrictions, a competitor to the GMM J-test as described in previous sections.

Baggerly (1998) has generalized empirical likelihood by considering the family of Cressie-Read power divergences. The Cressie-Read power divergence between $F_n$ and $F_p$ (or equivalently, between the vector of uniform probabilities $p_0 = n^{-1}\mathbf{1}$ and $p$) is given by

$$D_\rho(p) = \frac{1}{n} \sum_{i=1}^n \ell_\rho(n p_i), \qquad (1\text{-}26)$$

where

$$\ell_\rho(u) = \begin{cases} [\rho(\rho-1)]^{-1}(u^\rho - 1), & \text{if } \rho \ne 0, 1, \\ -\log(u), & \text{if } \rho = 0, \\ u\log(u), & \text{if } \rho = 1. \end{cases} \qquad (1\text{-}27)$$

The power divergences contain as special cases the forward ($\rho = 0$) and backward ($\rho = 1$) Kullback-Leibler divergences, the Hellinger distance ($\rho = 1/2$), and the Euclidean distance ($\rho = 2$).
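A quick numerical check (mine, not from the text): on a probability vector, the $\rho = 0$ and $\rho = 1$ branches agree with the limits of the general branch of the divergence, even though the $\rho = 1$ branch is not a pointwise limit of the general kernel formula.

```python
import math

def ell(u, rho):
    """Cressie-Read kernel from (1-27); rho = 0, 1 are the limit cases."""
    if rho == 0:
        return -math.log(u)
    if rho == 1:
        return u * math.log(u)
    return (u**rho - 1.0) / (rho * (rho - 1.0))

def cressie_read(p, rho):
    """D_rho(p) from (1-26): divergence between the uniform weights and p."""
    n = len(p)
    return sum(ell(n * pi, rho) for pi in p) / n

p = [0.1, 0.2, 0.3, 0.4]
kl_forward = cressie_read(p, 0)    # forward Kullback-Leibler
kl_backward = cressie_read(p, 1)   # backward Kullback-Leibler
# continuity in rho holds on the simplex, where sum_i (n p_i - 1) = 0
assert abs(cressie_read(p, 1e-7) - kl_forward) < 1e-5
assert abs(cressie_read(p, 1 + 1e-7) - kl_backward) < 1e-5
```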

Baggerly defined the empirical divergence for the mean for the whole class of Cressie-Read discrepancy measures by generalizing (1-22) in the case of the mean, i.e., for every $\rho \in \mathbb{R}$,

$$CR_\rho(\theta) = \inf_p \left\{ D_\rho(p) : \sum_{i=1}^n p_i X_i = \theta,\; \sum_{i=1}^n p_i = 1,\; p_i \ge 0 \right\}. \qquad (1\text{-}28)$$

Baggerly (1998) showed that Owen's result on the asymptotics of empirical likelihood holds for any member of the Cressie-Read family. Jing and Wood (1996) show that the exponential empirical likelihood (obtained for $\rho = 1$) is not Bartlett correctable, and later, Baggerly showed that empirical likelihood is the only element in the Cressie-Read family of divergences that admits a Bartlett correction. Nevertheless, in his simulation results, the use of a scaled Fisher's $F$ distribution gives better coverage than both asymptotic $\chi^2$ and Bartlett corrected confidence regions. Corcoran (1998) extends the class of discrepancy statistics that admit Bartlett corrections. Smith (1997) introduces the class of generalized empirical likelihood estimators, defined as saddle points of an optimization problem formulated in terms of a normalized convex function. Newey and Smith (2004) show that this class of estimators generalizes the class of minimum Cressie-Read discrepancy estimators.

CHAPTER 2
THE BIASED BOOTSTRAP WITH IID OBSERVATIONS

2.1 Review of the Bootstrap with IID Observations

Since its introduction by Efron (1979), the bootstrap has provided new methods

to applied statistics and motivated a myriad of new theoretical results. In a recent

edition of Statistical Science dedicated to the bootstrap, Efron (2003) remarked that the

bootstrap was initially introduced as an alternative to the jackknife for estimating the bias

and variance of an estimator. Since then, many new applications have been developed,

including bootstrap confidence intervals and significance tests, bootstrap bias reduction,

and bootstrap diagnostics.

In reviewing recent developments in bootstrapping, Davison et al. (2003) mentioned

several new directions of research, including highly accurate parametric bootstrap

procedures, theoretical properties for the nonparametric bootstrap with unequal

probabilities, the m-out-of-n bootstrap, bootstrap failures and remedies for superefficient

estimators, significance testing, and resampling for dependent data. Books that deal with

both theoretical properties and applications of bootstrap include Hall (1992), Efron and

Tibshirani (1993), Shao and Tu (1995), Davison and Hinkley (1997), and Lahiri (2003).

The main idea of the bootstrap is to estimate the sampling distribution of a statistic

by its (re)sampling distribution obtained under an estimate of the underlying distribution

of the data. This definition applies to both parametric and nonparametric problems as

follows. Suppose $\mathcal{X} = \{X_1, \ldots, X_n\}$ is an iid sample from a distribution $F$ and we want to estimate the sampling distribution of a statistic $T_n = T_n(X_1, \ldots, X_n)$. Let $\mathcal{L}(T_n \mid F)$ represent the distribution of $T_n$ when the data $X_i$'s are drawn from $F$.

In the parametric bootstrap, we consider a parametric model $\{F_\theta : \theta \in \Theta\}$ for the underlying $F$, with $F = F_{\theta_0}$ and $\theta_0 \in \Theta$, where $\Theta$ is the parameter space. In order to apply the bootstrap principle to this problem, we first estimate the parameter $\theta_0$ by a consistent (and efficient) estimator $\hat\theta$ (usually the MLE) and take $F_{\hat\theta}$ as our estimate of $F$. Next, let $\mathcal{X}^* = \{X_1^*, \ldots, X_n^*\}$ be a bootstrap resample, i.e., conditional on $\mathcal{X}$, $X_1^*, \ldots, X_n^*$ are iid from $F_{\hat\theta}$, and denote by $T_n^* = T_n(X_1^*, \ldots, X_n^*)$ the value of $T_n$ computed for this (hypothetical) bootstrap resample $\mathcal{X}^*$. Then, the parametric bootstrap estimate of $\mathcal{L}(T_n \mid F)$ is given by $\mathcal{L}(T_n^* \mid F_{\hat\theta})$.

In the nonparametric bootstrap, we usually take $F_n$, the empirical distribution function of the sample $\mathcal{X}$, as our estimate of the underlying distribution $F$, though weighted versions are also used. As before, let $\mathcal{X}^* = \{X_1^*, \ldots, X_n^*\}$ be a bootstrap resample, i.e., conditional on $\mathcal{X}$, $X_1^*, \ldots, X_n^*$ are iid from $F_n$, and denote by $T_n^* = T_n(X_1^*, \ldots, X_n^*)$ the value of $T_n$ computed for $\mathcal{X}^*$. Then, the nonparametric bootstrap estimate of $\mathcal{L}(T_n \mid F)$ is given by $\mathcal{L}(T_n^* \mid F_n)$.

Except for some special cases, there are no closed form expressions for $\mathcal{L}(T_n^* \mid F_n)$ (Hall, 1992, pp. 9-11), and bootstrap estimates are usually found by Monte Carlo simulation. In this case, for a given integer $B$, we consider $B$ simulated bootstrap resamples $\mathcal{X}_1^*, \ldots, \mathcal{X}_B^*$, and we compute the statistic $T_b^* = T_n(\mathcal{X}_b^*)$ for every resample $\mathcal{X}_b^*$. Then, we approximate $\mathcal{L}(T_n^* \mid F_n)$ by the empirical distribution of the $T_b^*$'s, i.e.,

$$\mathcal{L}(T_n^* \mid F_n) \approx \frac{1}{B} \sum_{b=1}^B \delta_{T_b^*},$$

where $\delta_x$ is the unit point mass at $x$. Having a bootstrap estimate of the sampling distribution of a statistic, we can estimate its bias, variance, and the quantiles of interest.
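The Monte Carlo recipe above takes only a few lines in practice; this sketch (an illustration of mine, not from the text) estimates the bootstrap bias, standard error, and quantiles of the sample median.

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(loc=5.0, scale=2.0, size=100)  # the observed sample
B = 2000

# draw B resamples of size n iid from F_n, i.e., with replacement from x
t_star = np.array([np.median(rng.choice(x, size=x.size, replace=True))
                   for _ in range(B)])

bias = t_star.mean() - np.median(x)            # bootstrap bias estimate
se = t_star.std(ddof=1)                        # bootstrap standard error
lo, hi = np.quantile(t_star, [0.025, 0.975])   # basis of percentile intervals
```

The array `t_star` is exactly the empirical distribution $B^{-1}\sum_b \delta_{T_b^*}$ used as the estimate of $\mathcal{L}(T_n^* \mid F_n)$.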

The bootstrap can be iterated, and usually each iteration reduces the order of error of bootstrap confidence intervals and tests by a factor of $n^{-1/2}$, as in Hall and Martin (1988). Generally, no more than two levels of bootstrap are employed, and such procedures are referred to in the literature as "the iterated bootstrap", "the nested bootstrap", and "the double bootstrap". The computational effort required by the iterated bootstrap is generally taken to be the square of that required for one level of bootstrapping, which

is already computationally involved. Applications of the iterated bootstrap include

calibration of confidence regions (Beran, 1987, 1988; Hall, 1992), bias reduction (Hall and
Martin, 1988; Davison and Hinkley, 1997), variance stabilization (Tibshirani, 1988; Hall

and Presnell, 1999), and bootstrap diagnostics (Efron, 1992; Canty et al., 2000).

Often, the bootstrap provides more accurate results than first order asymptotic approximations, without making use of the complex algebra of higher order expansions. The analysis of the performance of bootstrap procedures generally relies on Edgeworth expansions. The Edgeworth expansion is a refinement of the Central Limit Theorem that gives the form of the error terms in an asymptotic approximation of the distribution of the sample mean, extended by Bhattacharya and Ghosh (1978) to smooth functions of means. Bootstrap versions of these expansions were developed by Hall (1988) in order to analyze the performance of different types of bootstrap confidence intervals. As a consequence, the bootstrap often gives rejection and coverage probabilities that are more accurate than those of approximate large sample methods.

Generally, a good bootstrap procedure should satisfy two desiderata: it should yield an asymptotically consistent estimate of the sampling distribution of a statistic and, for small to moderate sample sizes, it should outperform asymptotic approximations. Shao and Tu (1995) identify some techniques used in the statistical literature to establish bootstrap consistency. The most popular technique is called imitation. The main idea here is to imitate the proof for obtaining the asymptotic distribution of the statistic in order to extend it to the bootstrap. The consistency of the bootstrap for the sample mean can be proven this way (van der Vaart, 1998, Theorem 23.4). Then, by applying the delta method for the bootstrap (van der Vaart, 1998, Theorem 23.5), the consistency of the bootstrap for smooth functions of sample means follows. Another technique uses Berry-Esseen type inequalities. The main advantage of this method is that one can also obtain the rate of convergence of the bootstrap estimates. Unfortunately, it is often difficult to obtain such inequalities.

In order to show consistency of the bootstrap, one usually considers a metric $\delta$ that metrizes weak convergence in the space of distribution functions (Huber, 1981), and then shows that the distance between the bootstrap distribution and the sampling distribution of the statistic converges to zero either almost surely or in probability, as the sample size increases. In the former case the bootstrap is called strongly consistent, in the latter weakly consistent. We will also use the terminology of convergence in distribution, conditionally given the data, almost surely or in probability.

If $G$ is continuous, $\delta$ can be taken to be the Kolmogorov-Smirnov distance, which metrizes weak convergence in this case, but other metrics have been used in studying the consistency of the bootstrap. For example, Bickel and Freedman (1981) used the Mallows distance in proving consistency of the bootstrap for t-statistics, von Mises functionals, and empirical processes. Freedman (1981, 1984) also uses the Mallows distance to prove the consistency of the bootstrap distribution of ordinary least squares (OLS) and two stage least squares (2SLS) estimators in certain linear regression models. Using empirical process theory, bootstrap consistency can also be established for Hadamard differentiable statistical functionals using the consistency of the empirical bootstrap for the Brownian bridge (Giné, 1990), and more recently van der Vaart (1998, pp. 332-334). Using a functional delta method, one can prove bootstrap consistency for a myriad of statistics, such as sample quantiles, L-estimators, and nonparametric goodness-of-fit statistics such as the von Mises and Kolmogorov-Smirnov statistics.

To illustrate, using the same notation as above, let $\mathcal{X}$ be an iid sample from $F$ and let $G_n = \mathcal{L}(T_n \mid F)$ be the distribution of $T_n$. Having a bootstrap resample $\mathcal{X}^*$, let $T_n^* = T_n(X_1^*, \ldots, X_n^*)$ be the bootstrap version of $T_n$ corresponding to $\mathcal{X}^*$, and let $\hat G_n = \mathcal{L}(T_n^* \mid F_n)$ be its bootstrap distribution. Suppose that $T_n$ converges weakly to $G$, so that $\delta(G_n, G) \to 0$ as $n \to \infty$, where $\delta$ is a metric that metrizes weak convergence of distributions, such as the Lévy distance or the bounded Lipschitz distance (Huber, 1981). While the asymptotic theory approximates the distribution $G_n$ by its limit $G$, the bootstrap approximates $G_n$ by its bootstrap distribution (which is a random distribution). If the sequence of random distributions $\hat G_n$ converges to $G$ in probability, i.e., $\delta(\hat G_n, G) \xrightarrow{P} 0$, then we will say that $T_n^*$ converges weakly to $G$ in probability and write $T_n^* \rightsquigarrow_P G$. In this case, by the triangle inequality, $\delta(\hat G_n, G_n) \xrightarrow{P} 0$, and the bootstrap is weakly consistent.

In order to prove that $T_n^* \rightsquigarrow_P G$, the following notations are often used. We will say that $T_n^* = o_{P^*}(1)$ if for every $\epsilon > 0$, $P(\|T_n^*\| > \epsilon \mid \mathcal{X}) \xrightarrow{P} 0$, where $P(\cdot \mid \mathcal{X})$ is the conditional probability given the sample $\mathcal{X}$. In Appendix B.2, we present a (conditional) version of Slutsky's theorem, which shows that if $T_n^* \rightsquigarrow_P T$, $B_n^* = B + o_{P^*}(1)$, and $C_n^* = C + o_{P^*}(1)$, then $B_n^* T_n^* + C_n^* \rightsquigarrow_P BT + C$. Finally, using a classical subsequence argument (Kallenberg, 2002, Lemma 4.2, p. 63), note that $\delta(\hat G_n, G) \xrightarrow{P} 0$ if and only if for every subsequence $\hat G_{m_n}$ of $\hat G_n$, there exists a further subsequence $\hat G_{l_n}$ of $\hat G_{m_n}$ such that $\delta(\hat G_{l_n}, G) \to 0$ almost surely as $n \to \infty$.

2.2 Least Favorable Families Corresponding to Z-Estimation Model

In this section, we introduce pseudo-parametric families of distributions associated with the Z-estimation model described in Section 1.2.2. Presnell (2002) shows that these families are least favorable for parameters that are smooth functions of the mean vector. In Theorem 2.2.1 below, we prove that this result is also true for the Z-estimation model (1-3). Here, by least favorable we mean that, when evaluated at the maximum likelihood estimator, the inverse of the Fisher information matrix corresponding to the pseudo-parametric families defined in Section 1.2.2 is equal to the sandwich matrix, the usual nonparametric estimator of the asymptotic variance of the Z-estimator. In this sense, inference about the parameters is not made artificially easier by restricting attention to these families of distributions (Stein, 1956).

There are a number of bootstrap confidence interval procedures based on least favorable families of distributions. Efron's tilted bootstrap confidence intervals are based on inverting a bootstrap hypothesis test carried out within a least favorable family. The automatic percentile method (DiCiccio and Romano, 1989) can be applied in conjunction with a least favorable family approach (DiCiccio and Romano, 1990). Also, DiCiccio and Romano (1990) and Hall and Presnell (1999) suggest estimating the variance of $\hat\theta$ as a function of $\theta$ by resampling along a least favorable family in order to compute variance stabilized bootstrap-t intervals.
Choose and fix any value of $\rho$. For a given value of $\theta$, we choose the vector of probabilities $p = p(\theta)$ in order to

$$\text{minimize } D_\rho(p) = \frac{1}{n} \sum_{i=1}^n \ell_\rho(n p_i), \qquad (2\text{-}1a)$$

$$\text{subject to } \sum_{i=1}^n p_i b(X_i, \theta) = 0, \qquad \sum_{i=1}^n p_i = 1, \qquad p_i \ge 0, \qquad (2\text{-}1b)$$

where $\ell_\rho$ is given by (1-27). Note that

$$\ell_\rho'(u) = \begin{cases} (\rho-1)^{-1} u^{\rho-1}, & \text{if } \rho \ne 1, \\ 1 + \log(u), & \text{if } \rho = 1. \end{cases} \qquad (2\text{-}2)$$

Using the same notation as in Presnell (2002), define

$$\gamma_\rho(p) \overset{\text{def}}{=} \frac{1}{n} \sum_{i=1}^n n p_i\, \ell_\rho'(n p_i) = \begin{cases} (\rho-1)^{-1} + \rho\, D_\rho(p), & \rho \ne 1, \\ 1 + D_1(p), & \rho = 1. \end{cases} \qquad (2\text{-}3)$$

Applying a Lagrange multipliers argument to (2-1), for a fixed $\theta$,

$$0 = \frac{\partial}{\partial p_i} \left[ D_\rho(p) - \lambda_0 \left( \sum_{j=1}^n p_j - 1 \right) - \lambda^T \sum_{j=1}^n p_j b(X_j, \theta) \right] = \ell_\rho'(n p_i) - \lambda_0 - \lambda^T b(X_i, \theta). \qquad (2\text{-}4)$$

Multiplying (2-4) by $p_i$ and summing over $i$ gives

$$\gamma_\rho(p) - \lambda_0 = 0. \qquad (2\text{-}5)$$

Equation (2-4) may be solved for $p_i$ using (2-5), yielding

$$n p_i = \begin{cases} \{1 + (\rho - 1)[\rho\delta + \lambda^T b(X_i, \theta)]\}^{1/(\rho-1)}, & \text{if } \rho \ne 1, \\ \exp\{\delta + \lambda^T b(X_i, \theta)\}, & \text{if } \rho = 1, \end{cases} \qquad (2\text{-}6)$$

where $\delta$ and $\lambda$ are chosen to satisfy the constraints $\sum_{i=1}^n p_i = 1$ and $\sum_{i=1}^n p_i b(X_i, \theta) = 0$. Note that in this representation $\delta$ is equal to the minimized power divergence, i.e., $\delta = D_\rho(p)$.
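For $\rho = 1$, the weights are proportional to $\exp\{\lambda^T b(X_i, \theta)\}$ (the normalizing constant absorbs $\delta$), which is exponential tilting. The sketch below (mine, not the dissertation's) solves the scalar case by Newton's method; the derivative of the tilted moment equation is the tilted variance of $b$, so the equation is monotone in $\lambda$.

```python
import numpy as np

def tilted_weights(b, iters=100):
    """Weights p_i proportional to exp(lam * b_i) with sum_i p_i b_i = 0 (scalar b)."""
    lam = 0.0
    for _ in range(iters):
        w = np.exp(lam * b)
        w /= w.sum()                   # normalization absorbs the delta term
        g = np.sum(w * b)              # tilted mean of b; we want it to be 0
        gp = np.sum(w * b**2) - g**2   # tilted variance of b (always positive)
        step = g / gp
        lam -= step
        if abs(step) < 1e-12:
            break
    return w, lam

rng = np.random.default_rng(3)
b = rng.normal(loc=0.5, size=100)  # moment values b(X_i, theta) with nonzero mean
w, lam = tilted_weights(b)
```

A solution exists whenever 0 lies in the convex hull of the observed $b(X_i, \theta)$, the same condition that appears in Lemma 2.3.1 below.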

For any $\theta$, let $F_\theta$ be the weighted empirical distribution on the sample $\mathcal{X}$, corresponding to the vector of weights $p(\theta)$ given by (2-6), i.e.,

$$F_\theta = \sum_{i=1}^n p_i(\theta)\, \delta_{X_i},$$

where $\delta_x$ is the unit point mass at $x$. We show in the appendix that $\mathcal{F} = \{F_\theta\}_{\theta \in \Theta}$, the resulting family of weighted empirical distributions indexed by $\theta$, has the nonparametric least favorable property (Stein, 1956; DiCiccio and Romano, 1990).

Theorem 2.2.1. The inverse of the Fisher information matrix corresponding to the pseudo-parametric families defined in Section 1.2.2, evaluated at the maximum likelihood estimator, is equal to the sandwich matrix, the usual nonparametric estimator of the asymptotic variance-covariance matrix of the Z-estimator.

2.3 The Biased Bootstrap for GMM

In spite of the good asymptotic properties of GMM estimators, Monte Carlo experiments have shown that the actual sizes of tests based on first order asymptotic theory differ greatly from their nominal levels (Tauchen, 1986). In an effort to improve the finite sample performance of these tests, Hall and Horowitz (1996) devised a modified bootstrap procedure that can also be applied with dependent data. Their procedure recenters the moment conditions so that the modified moment conditions are fulfilled by the sample.

Hahn (1996) showed that the bootstrap t-test for individual parameters is asymptotically consistent without recentering. However, if the uncentered moment conditions are used, then the bootstrap estimate of the distribution of the J-test statistic of over-identifying restrictions is not consistent, a fact also claimed by Brown and Newey (2002) and Lindsay and Qu (2003). As noted by Lindsay and Qu (2003), using the uncentered moment conditions, the null hypothesis of mean-zero scores does not hold for the sample, so that one is sampling under an alternative hypothesis in which the mean of the scores is not zero. This may have a small impact on the critical values if the null is true (if the bootstrap is consistent under the null), but will have a great impact if the null is false; consequently, the size of the bootstrap test might be nearly correct but its power may be poor.

Hall and Presnell (1999) devised the biased bootstrap in order to improve the performance of a wide range of statistical procedures for hypothesis testing, shrinkage, robust estimation, and variance stabilization. We use this methodology in order to construct a semiparametric bootstrap for GMM. A biased bootstrap for GMM was introduced by Brown and Newey (2002), though from another perspective. They argue that the weighted empirical distribution that minimizes the Kullback-Leibler divergence to the empirical distribution while satisfying the GMM equations (they call it "the empirical likelihood distribution") attains a semiparametric efficiency lower bound (Brown and Newey, 1998). To show the consistency of the weighted bootstrap for the t and J statistics, they claimed that because the empirical likelihood distribution is more efficient than the empirical distribution of the sample, the corresponding weighted bootstrap is both consistent and more efficient than the uniform bootstrap.

Our approach is similar. We first introduce a family of weighted empirical distributions $\mathcal{F} = \{F_\theta\}_{\theta \in \Theta}$ defined by (2-1), associated with the GMM model (1-1). We bootstrap as if we had a parametric model: first we estimate the parameter $\theta_0 = \theta(F)$ using, say, the GMM estimator $\hat\theta$; then, for the first level of bootstrap, a generic resample $\mathcal{X}^* = \{X_1^*, \ldots, X_n^*\}$ is a sequence of (conditional) iid draws from $F_{\hat\theta} \in \mathcal{F}$. It is obvious that if $\sum_{i=1}^n b(X_i, \hat\theta) = 0$, which is generally the case when $\dim(b) = \dim(\theta)$, then $F_{\hat\theta} = F_n$, so that the biased bootstrap coincides in this case with the classical uniform bootstrap. Let $\hat\theta^*$ be the bootstrap version of $\hat\theta$ on the resample $\mathcal{X}^*$. At the second iteration of the biased bootstrap, a typical re-resample $\mathcal{X}^{**} = \{X_1^{**}, \ldots, X_n^{**}\}$ is a (conditional) iid sample from the weighted empirical distribution $F_{\hat\theta^*}$, corresponding to the parameter estimate $\hat\theta^*$. Here, the resampling procedure of the biased bootstrap differs from the uniform bootstrap even when $\dim(b) = \dim(\theta)$, since typically $F_{\hat\theta^*} \ne F_n^*$, where $F_n^*$ is the empirical distribution of the resample $\mathcal{X}^*$. In Section 2.3.2, we will see that the biased bootstrap has certain computational advantages over the "uniform" bootstrap when the bootstrap is iterated.

In this way, we mimic the parametric bootstrap for semiparametric models without any adjustment of the moment conditions and without the need to find the centering value for the bootstrap version of the statistics, as is usually required in such situations; see, e.g., Shorack (1981), Freedman (1981), Hall and Horowitz (1996), and Lahiri (2003). To summarize, the main steps in the biased bootstrap for GMM are as follows:

Estimate the parameter by the GMM estimator $\hat\theta$.

Find $F_{\hat\theta}$ as in (2-6).

Resample the data drawn from $F_{\hat\theta}$ to obtain bootstrap estimates of the distributions of $\hat\theta$ and $nQ_n(\hat\theta)$.
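These steps can be sketched end to end. The toy implementation below is mine, not the dissertation's: it uses the $\rho = 1$ (exponential tilting) member of (2-6) rather than $\rho = 0$, a crude grid search for the GMM step, and an illustrative Exponential($\theta$) model with moment functions $b(x, \theta) = (x - \theta,\, x^2 - 2\theta^2)^T$.

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.exponential(scale=1.0, size=400)  # true theta_0 = 1
n = x.size

def moments(x, theta):
    return np.column_stack([x - theta, x**2 - 2 * theta**2])

def Qn(x, theta):
    b = moments(x, theta)
    bbar = b.mean(axis=0)
    return bbar @ np.linalg.solve(b.T @ b / len(x), bbar)

def gmm(x, grid=np.linspace(0.3, 3.0, 271)):
    return grid[np.argmin([Qn(x, t) for t in grid])]

def tilted_weights(b, iters=200):
    """rho = 1 weights of (2-6): p_i proportional to exp(lam' b_i), sum_i p_i b_i = 0."""
    lam = np.zeros(b.shape[1])
    for _ in range(iters):
        w = np.exp(b @ lam)
        w /= w.sum()
        g = w @ b                                     # tilted mean of the moments
        H = (b * w[:, None]).T @ b - np.outer(g, g)   # tilted covariance matrix
        step = np.linalg.solve(H, g)
        lam -= step
        if np.max(np.abs(step)) < 1e-12:
            break
    return w

theta_hat = gmm(x)                          # step 1: GMM estimate
w = tilted_weights(moments(x, theta_hat))   # step 2: weights of F_theta_hat

# step 3: biased bootstrap -- resample from F_theta_hat, no recentering needed
B = 100
J_star = np.array([n * Qn(xs, gmm(xs))
                   for xs in (rng.choice(x, size=n, replace=True, p=w)
                              for _ in range(B))])
crit = np.quantile(J_star, 0.95)   # bootstrap critical value for the J-test
```

The observed $nQ_n(\hat\theta)$ would then be compared with `crit` in place of the asymptotic $\chi^2_{q-p}$ quantile.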

2.3.1 Consistency Results for the Biased Bootstrap

Using the biased-bootstrap procedure, we estimate the distributions $\mathcal{L}(n^{1/2}(\hat\theta - \theta_0) \mid F)$ and $\mathcal{L}(nQ_n(\hat\theta) \mid F)$ with their bootstrap versions $\mathcal{L}(n^{1/2}(\hat\theta^* - \hat\theta) \mid F_{\hat\theta})$ and $\mathcal{L}(nQ_n^*(\hat\theta^*) \mid F_{\hat\theta})$, where the weights of $F_{\hat\theta}$ are given by (2-6). We assume the following regularity conditions hold.

Assumption 2.3.1. The parameter space $\Theta \subset \mathbb{R}^p$ is compact and $\theta_0$ is an interior point of $\Theta$ that satisfies the population moment condition and the global identifiability condition given by Definitions 1.2.1-1.2.2.

Assumption 2.3.2. Suppose that $b_\theta(x)$ is three times differentiable in $\theta$ in a neighborhood of $\theta_0$ for every $x$, such that $\|b(x, \theta)\|$, $\|\nabla b(x, \theta)\|$, $\|\nabla^2 b(x, \theta)\|$, and $\|\nabla^3 b(x, \theta)\|$ are all bounded by some square integrable function $k$, with $E_F k^2 < \infty$.

Assumption 2.3.3. Assume that $V = E[b(X_i, \theta_0) b(X_i, \theta_0)^T]$ is positive definite and that the rank of $D = E[\nabla b(X_i, \theta_0)]$ is $p$.

In the following proofs, we assume, without loss of generality, that $\theta_0 = 0$. The first lemma is concerned with the existence of the weights satisfying (2-6). We next show the consistency of the biased bootstrap for the sample mean of the criterion functions (Theorem 2.3.1); then, we give a lemma on conditional uniform convergence in probability that will be needed in proving the weak consistency of GMM estimators and their bootstrap versions. We end with a theorem on the consistency of the biased bootstrap distribution of the GMM estimators (Theorem 2.3.3). Theorem 2.3.4 shows the consistency of the biased bootstrap for the J-test of over-identifying restrictions.

Lemma 2.3.1. Under Assumptions 2.3.1-2.3.3, $0$ is inside the convex hull of $\{b(X_i, \hat\theta) : i = 1, \ldots, n\}$ with probability tending to one.

Theorem 2.3.1. Let $\mathcal{X} = \{X_1, \ldots, X_n\}$ be an iid sample from $F$, and let $\theta_0 = \theta(F)$ be the parameter of interest, assumed to satisfy equation (1-1). Let $\hat\theta$ be a sequence of GMM estimators defined by (1-6) and let $F_{\hat\theta}$ be the weighted empirical distribution given by (2-6) corresponding to $\hat\theta$. Let $\mathcal{X}^* = \{X_1^*, \ldots, X_n^*\}$ be a biased bootstrap resample. Under Assumptions 2.3.1-2.3.3, as $n \to \infty$,

$$n^{-1/2} \sum_{i=1}^n b(X_i^*, \hat\theta) \rightsquigarrow N(0, V) \quad \text{in probability}. \qquad (2\text{-}7)$$

We now give a conditional uniform convergence result.









Lemma 2.3.2. Under the same conditions as in Theorem 2.3.1, the following uniform convergence for the biased bootstrap holds:

$$\sup_{\theta \in \Theta} \left\| n^{-1} \sum_{i=1}^n b(X_i^*, \theta) - E\, b_\theta \right\| = o_p(1), \qquad (2\text{-}8)$$

where $E\, b_\theta = E\, b(X_i, \theta) = b(\theta)$.

Theorem 2.3.2. Under the same conditions as in Theorem 2.3.1, for any sequence of GMM estimators $\hat\theta$ and (biased) bootstrap estimators $\theta^*$ with $Q_n(\hat\theta) \le Q_n(\theta_0) + o_p(1)$ and $Q_n^*(\theta^*) \le Q_n^*(\theta_0) + o_p(1)$, we have $\hat\theta = \theta_0 + o_p(1)$ and $\theta^* = \theta_0 + o_p(1)$, and hence also $\theta^* = \hat\theta + o_p(1)$.

Theorem 2.3.3. Let $\mathcal{X} = \{X_1, \ldots, X_n\}$ be an iid sample from $F$, and let $\theta_0 = \theta(F)$ be the parameter of interest, assumed to satisfy equation (1-1). Let $\hat\theta$ be a sequence of GMM estimators defined by (1-6) and let $F_{\hat\theta}$ be the weighted empirical distribution given by (2-6) corresponding to $\hat\theta$. Let $\mathcal{X}^* = \{X_1^*, \ldots, X_n^*\}$ be a biased bootstrap resample and let $\theta^*$ be the bootstrap version of $\hat\theta$ on $\mathcal{X}^*$. Under Assumptions 2.3.1-2.3.3, as $n \to \infty$,

$$n^{1/2}(\theta^* - \hat\theta) \rightsquigarrow N\big(0,\ (D^T W D)^{-1} D^T W V W D (D^T W D)^{-1}\big) \quad \text{in probability}. \qquad (2\text{-}9)$$


Remark 2.3.1. Using the conditional form of Slutsky's theorem given by Lemma B.2.4 in Appendix B.2, we can substitute for the nonrandom matrix $W$ the nonparametric estimator of $V^{-1}$. As a consequence, the consistency of the bootstrap for the two-step GMM estimator follows.

Remark 2.3.2. In the proof of Theorem 2.3.1, we use the fact that the weighted empirical distribution satisfies the moment conditions. If we do not use the biased bootstrap, the conditional mean of the (uniform) bootstrap sample mean of $b(X_i^*, \hat\theta)$ contains a random element that disappears asymptotically, a result that is proven in Appendix B.2. The fact that recentering is not necessary in the case of GMM estimation has also been proven by Hahn (1996), though using another approach. As a consequence, without reweighting or recentering, the bootstrap estimate of $\mathcal{L}(\sqrt{n}(\hat\theta - \theta_0) \mid F)$ is still consistent. On the other hand, we will show, also in Appendix B.2, that the usual (uniform, uncentered) bootstrap distribution estimate of the J-test of over-identifying restrictions is weakly inconsistent.

Theorem 2.3.4. Under the same conditions as in Theorem 2.3.3,

$$nQ_n^*(\theta^*) \rightsquigarrow \chi^2_{q-p} \quad \text{in probability}. \qquad (2\text{-}10)$$

As a consequence of these asymptotic results, the Kolmogorov-Smirnov distance between the bootstrap distributions and the sampling distributions of these statistics converges to zero in probability as the sample size grows to infinity, that is,

$$\sup_x \left| P\big(n^{1/2}(\theta^* - \hat\theta) \le x \mid F_{\hat\theta}\big) - P\big(n^{1/2}(\hat\theta - \theta_0) \le x \mid F\big) \right| \xrightarrow{P} 0,$$

and

$$\sup_x \left| P\big(nQ_n^*(\theta^*) \le x \mid F_{\hat\theta}\big) - P\big(nQ_n(\hat\theta) \le x \mid F\big) \right| \xrightarrow{P} 0.$$

2.3.2 The Biased Bootstrap Recycling

Bootstrap recycling was first proposed by Newton and Geyer (1995). They suggested drawing a common set of potential re-resamples from a single "design" distribution, and then using importance weighting to "recycle" these re-resamples to estimate (conditional) expectations for each first level bootstrap resample. Unfortunately, this method is applicable only to the parametric bootstrap, mostly because in the nonparametric bootstrap the support of the resample empirical distributions varies from resample to resample. Thus, the majority of samples from any candidate distribution that dominates all the resample distributions will have zero importance weights for most resamples, leading to extremely inefficient and unstable Monte Carlo estimates (Ventura, 2000).

Using the recycling method, Presnell and Giurcanu (2007) construct a recycling

algorithm for the iterated biased bootstrap that yields a second order correct confidence

interval in the smooth function of means model. At the same time, this procedure

preserves the computational requirements of the single level bootstrap. The main idea of









this method is to use the importance sampling identity (or change of measure)

$$E_{\theta^*}(T^{**}) = E_{\hat\theta}\!\left( T^* \, \frac{dP_{\theta^*}}{dP_{\hat\theta}} \right), \qquad (2\text{-}11)$$

where $P_\theta = F_\theta^{(n)}$ is the $n$-fold product measure of $F_\theta$ and $dP_{\theta^*}/dP_{\hat\theta}$ is the Radon-Nikodym derivative of $P_{\theta^*}$ with respect to $P_{\hat\theta}$. The Monte Carlo recycling approximation of (2-11) is given by

$$E_{\theta^*}(T^{**}) \approx \frac{1}{B} \sum_{b=1}^B T_b^*\, w_b, \qquad (2\text{-}12)$$

where the resamples $\mathcal{X}_b^*$, $b = 1, \ldots, B$, are iid from the design distribution $P_{\hat\theta}$ and

$$w_b = \frac{P_{\theta^*}(\mathcal{X}_b^*)}{P_{\hat\theta}(\mathcal{X}_b^*)} \qquad (2\text{-}13)$$

is the likelihood ratio corresponding to the probability models $P_{\theta^*}$ and $P_{\hat\theta}$, evaluated at the resample $\mathcal{X}_b^*$. Using the definitions of $P_{\theta^*}$ and $P_{\hat\theta}$, we see that

$$\frac{P_{\theta^*}(\mathcal{X}_b^*)}{P_{\hat\theta}(\mathcal{X}_b^*)} = \prod_{i=1}^n \left( \frac{p_i(\theta^*)}{p_i(\hat\theta)} \right)^{m_i}, \qquad (2\text{-}14)$$

where $m_i = \#\{j : X_{bj}^* = X_i\}$.
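Because a biased bootstrap resample is a multinomial draw on the original sample points, the ratio in (2-14) depends on the resample only through the counts $m_i$. A small sketch of this computation, with hypothetical weight vectors standing in for $p_i(\hat\theta)$ and $p_i(\theta^*)$:

```python
import numpy as np

def recycling_weight(p_design, p_target, counts):
    """Likelihood ratio of (2-13)/(2-14): prod_i (p_target_i / p_design_i)^{m_i},
    computed in log space for numerical stability."""
    return float(np.exp(counts @ (np.log(p_target) - np.log(p_design))))

rng = np.random.default_rng(1)
n = 50
p_design = np.full(n, 1.0 / n)         # stands in for the weights p_i(theta_hat)
p_target = rng.dirichlet(np.ones(n))   # stands in for the weights p_i(theta_star)

# One re-resample drawn from the design distribution, summarized by its counts.
idx = rng.choice(n, size=n, p=p_design)
m = np.bincount(idx, minlength=n)
w = recycling_weight(p_design, p_target, m)
```

When the target and design weights coincide, the importance weight is exactly one, which is a convenient sanity check on an implementation.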

In the following, we illustrate the use of the biased bootstrap, iterated biased

bootstrap and biased bootstrap recycling in a standard bootstrap estimation problem.

Suppose that $\mathcal{X} = \{X_1, \ldots, X_n\}$ is an iid sample from $F$ and we want to estimate $T(F)$, the solution in $t$ of the equation

$$E\big[ f_t(F, F_{\hat\theta}) \big] = 0, \qquad (2\text{-}15)$$

where $F_{\hat\theta}$ is the weighted empirical distribution corresponding to $\hat\theta$ and $f_t$ is a given functional. Examples of such functionals include (Hall, 1992):

$$f_t(F, F_{\hat\theta}) = \hat\theta - \theta_0 - t, \qquad (2\text{-}16)$$

for bias estimation/reduction,

$$f_t(F, F_{\hat\theta}) = (\hat\theta - \theta_0)^2 - t, \qquad (2\text{-}17)$$

for mean squared error estimation, and

$$f_t(F, F_{\hat\theta}) = I\{\theta_0 \le \hat\theta + t\} - \alpha, \qquad (2\text{-}18)$$

for $\alpha$-level upper confidence intervals, where $I\{\cdot\}$ represents the indicator function of a set.
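For the bias functional (2-16), the root in $t$ of the plug-in equation is the familiar bootstrap bias estimate $E_{\hat\theta}(\theta^*) - \hat\theta$. A minimal sketch, using the uniform empirical distribution in place of the weighted one and a hypothetical smooth-function-of-a-mean parameter:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=100)
theta_hat = np.exp(x.mean())   # plug-in estimate of theta = exp(E X); biased in finite samples

# Root in t of the sample analogue of E[theta_hat - theta_0 - t] = 0:
# t = mean over resamples of theta* minus theta_hat, the usual bootstrap bias estimate.
B = 2000
theta_star = np.array(
    [np.exp(rng.choice(x, size=x.size).mean()) for _ in range(B)]
)
bias_hat = theta_star.mean() - theta_hat
theta_corrected = theta_hat - bias_hat   # additively bias-corrected estimate
```

The iterated versions described next replay this root-finding one level deeper, correcting the correction itself.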
The bootstrap estimate $T(F_{\hat\theta})$ of $T(F)$ solves the sample analogue of (2-15),

$$E_{\hat\theta}\big[ f_t(F_{\hat\theta}, F_{\theta^*}) \big] = 0, \qquad (2\text{-}19)$$

obtained from (2-15) using the "plug-in rule," i.e., by substituting $F_{\hat\theta}$ for $F$ and $F_{\theta^*}$ for $F_{\hat\theta}$.

Usually, $E\big[ f_{T(F_{\hat\theta})}(F, F_{\hat\theta}) \big] \neq 0$, and we might be interested in a further (additive) correction $t(F)$, the solution in $t$ of the equation

$$E\big[ f_{T(F_{\hat\theta}) + t}(F, F_{\hat\theta}) \big] = 0. \qquad (2\text{-}20)$$

As before, let $t(F_{\hat\theta})$ be the bootstrap estimate of $t(F)$, which solves in $t$ the sample analogue of (2-20),

$$E_{\hat\theta}\big[ f_{T(F_{\theta^*}) + t}(F_{\hat\theta}, F_{\theta^*}) \big] = 0. \qquad (2\text{-}21)$$

This equation is again obtained using the "plug-in" rule, by substituting $F_{\hat\theta}$ for $F$ and $F_{\theta^*}$ for $F_{\hat\theta}$ in (2-20). Note that $T(F_{\theta^*})$ solves $E_{\theta^*}\big[ f_t(F_{\theta^*}, F_{\theta^{**}}) \big] = 0$, and this is where the second level of bootstrapping enters. The hope is then that $E\big[ f_{T(F_{\hat\theta}) + t(F_{\hat\theta})}(F, F_{\hat\theta}) \big] \approx 0$. This approximation needs to be taken in the following sense: it is not that $T(F_{\hat\theta}) + t(F_{\hat\theta})$ is closer to $T(F)$ than is $T(F_{\hat\theta})$, but that $E\big[ f_{T(F_{\hat\theta}) + t(F_{\hat\theta})}(F, F_{\hat\theta}) \big]$ is closer to zero than is $E\big[ f_{T(F_{\hat\theta})}(F, F_{\hat\theta}) \big]$.
We can argue as in Hall and Martin (1988, pp. 663-665) that any iteration of the biased bootstrap increases the accuracy of the estimation. Specifically, suppose that

$$E\big[ f_{T(F_{\hat\theta})}(F, F_{\hat\theta}) \big] = c(F)\, n^{-j/2} + O_p\big(n^{-(j+1)/2}\big),$$

and

$$\frac{\partial}{\partial t} E\big[ f_{T(F_{\hat\theta}) + t}(F, F_{\hat\theta}) \big] \Big|_{t=0} \neq 0,$$

where $c$ is a "smooth functional." Then

$$E\big[ f_{T(F_{\hat\theta}) + t(F_{\hat\theta})}(F, F_{\hat\theta}) \big] = O_p\big(n^{-(j+1)/2}\big).$$

The argument is identical to that of Hall and Martin (1988), using the additional fact that if $\hat\theta = \theta_0 + O_p(n^{-1/2})$, then $\|F_{\hat\theta} - F\|_\infty = O_p(n^{-1/2})$.
The recycling biased bootstrap can be used to find the Monte Carlo estimate of $T(F_{\theta_b^*})$, the solution in $t$ of the equation

$$E_{\theta_b^*}\big[ f_t(F_{\theta_b^*}, F_{\theta^{**}}) \big] = 0. \qquad (2\text{-}22)$$

For a given $b = 1, \ldots, B$, let $\mathcal{X}_b^* = \{X_{b1}^*, \ldots, X_{bn}^*\}$ be a biased bootstrap resample. Let $\theta_b^* = \hat\theta(\mathcal{X}_b^*)$, $b = 1, \ldots, B$, be the version of $\hat\theta$ corresponding to $\mathcal{X}_b^*$. Then we obtain the following recycling Monte Carlo approximation of the conditional expectation in (2-22), as in (2-12):

$$E_{\theta_b^*}\big[ f_t(F_{\theta_b^*}, F_{\theta^{**}}) \big] \approx \frac{1}{B} \sum_{b'=1}^B w_{bb'}\, f_t(F_{\theta_b^*}, F_{\theta_{b'}^*}), \qquad (2\text{-}23)$$

where

$$w_{bb'} = \prod_{i=1}^n \left( \frac{p_i(\theta_b^*)}{p_i(\hat\theta)} \right)^{m_{b'i}}$$

and

$$m_{b'i} = \#\{j : X_{b'j}^* = X_i\}.$$

The recycling Monte Carlo estimate $T_B(F_{\theta_b^*})$ of $T(F_{\theta_b^*})$ is the solution in $t$ of the equation

$$\frac{1}{B} \sum_{b'=1}^B w_{bb'}\, f_t(F_{\theta_b^*}, F_{\theta_{b'}^*}) = 0, \qquad (2\text{-}24)$$

where the upper index $B$ represents the number of simulated bootstrap samples. Hence, the recycling Monte Carlo approximation $t_B(F_{\hat\theta})$ of $t(F_{\hat\theta})$ is the solution in $t$ of the equation

$$\frac{1}{B} \sum_{b=1}^B f_{T_B(F_{\theta_b^*}) + t}(F_{\hat\theta}, F_{\theta_b^*}) = 0. \qquad (2\text{-}25)$$
b= 1

2.4 Instrumental Variables

2.4.1 Review of Instrumental Variables

In the classical regression model, the assumption that the errors are independent of

regressors is necessary in order for OLS estimators to be consistent. In many observational

studies, this orthogonality condition between the regressors and the errors is not satisfied,

so new techniques have been developed to analyze such data. One such technique is the

method of instrumental variables, which assumes the existence of an instrumental variable,

i.e., a variable that is correlated with the regressors but uncorrelated with the errors. This

technique was proposed in the econometric literature by Reiersol (1941) and has been

developed theoretically by Durbin (1954), Sargan (1958), Brundy and Jorgenson (1971),

and White (1982a), among others. Extensive treatments of this methodology are given

in econometric texts, such as Davidson and MacKinnon (1993), Matyas (1999), Hayashi (2000), and Hall (2005).

The method of instrumental variables is widely applied to cross-sectional, panel, and

time series models, and is more generally used to make causal inferences in observational

studies and errors-in-variables models. A variety of estimators have appeared in the

econometrics literature, including two-stage least squares (2SLS), instrumental variable

(IV), and two-stage instrumental variable (2SIV) estimators. All these estimators can also

be viewed as particular types of GMM estimators.

Consider now the estimation of causal effects in an observational study (Angrist et al.,

1996). If one wants to estimate a treatment effect, but the individuals exert some control

over the treatment assignment, then differences between group means are biased. An

instrumental variable is a variable that is correlated with the exposure to the treatment,

but uncorrelated with the outcome after controlling for exposure to the treatment. Thus,









the variation in the instrumental variable can be used to replace the variation in the

treatment assignment when comparing the group effects.

As an example, consider the causal effect of years of schooling on earnings (Angrist

and Krueger, 1991). Individual and institutional decisions generate a correlation between

schooling and unobserved covariates such as ability and motivation, that are related to

potential earnings. For instance, if compulsory attendance laws were extended, then

those who had planned to be in school less would continue to earn less due to unobserved

covariates such as ability, motivation, and family background. A set of instruments in this

case is a set of variables that affect schooling but not earnings, once schooling is controlled

for (i.e., included in the regression equation). Angrist and Krueger (1991) use the quarter of an individual's birth as an instrument. They argued that students who are born earlier in the calendar year are typically older when they enter school than students who are born later in the year. This pattern arises because most districts do not admit students unless they attain age 6 by January 1. Consequently, children born earlier in the year attain the drop-out age after attending school for a shorter period of time than those

born later in the calendar year. Other instruments used in studying the causal effect of

schooling on earnings include siblings composition (Butcher and Case, 1994) and proximity

to a nearby college (Card, 1995).

Instrumental variables have also been successfully applied in bio-statistical research.

In a recent article, Newhouse and McClellan (1998) describe how instrumental variables

can be applied to estimate treatment effects in an observational study when a controlled

trial cannot be done. They illustrate with an application to aggressive treatment of acute myocardial infarction in the elderly. They use instrumental variables to estimate the effect of catheterization on mortality rate. As instrument, they use the "differential distance" (the additional distance, if any, beyond the distance to the nearest hospital, to reach a catheterization hospital). They argue that the differential distance has no direct effect on myocardial infarction, but it affects the likelihood of catheterization (the greater the









differential distance, the less likely it is for a patient to be admitted to a catheterization

hospital). When estimating the effect of catheterization on mortality rate, the authors

obtain a substantial reduction in magnitude when using the IV estimate instead of the OLS estimate. In their conclusion, the authors argue in favor of the IV estimation technique in

observational studies: "The results of a well-designed observational study are useful even if

the results of a clinical trial are available."

Hogan and Lancaster (2004) apply IV in order to infer causal effects in longitudinal

repeated measurements. They review two methods for estimating causal effects in

longitudinal data: inverse probability weighting (IPW) and instrumental variables.

They apply these methods to the HERS data, a six-year natural history study that

enrolled 871 HIV-infected women starting in 1993, in order to estimate the therapeutic

effect of highly active antiretroviral therapy regimen (HAART) on CD4 cell count, using

marginal structural modeling. In this data set, the receipt of therapy varies with time and

depends on CD4 count and other covariates. They remark that both methods rely on two

important assumptions: no unmeasured confounding for IPW and the reliability of the

instruments for IV (they must be strongly correlated with the exposure to the HAART

therapy).

Before going into further detail, we present the 2SLS estimator for the linear IV regression model. Consider the regression model defined by

$$y_i = x_i^T \beta + \varepsilon_i, \quad i = 1, \ldots, n, \qquad (2\text{-}26)$$

where $x_i$ is a $p \times 1$ vector of explanatory random variables for the observed response $y_i$, $\varepsilon_i$ is the unobserved error term, $\beta$ is the regression parameter, and the $q$ instruments are included in the $q \times 1$ vector $z_i$. Some of the following assumptions from Hall (2005, pp. 34-42) can be relaxed (White, 1982a), but we confine ourselves to these to simplify the presentation.









Assumption 2.4.1. The vectors $v_i = (x_i^T, z_i^T, \varepsilon_i)^T$, for $i = 1, \ldots, n$, are iid, $\operatorname{rank}(Q_{xz}) = p$, and $\operatorname{rank}(Q_{zz}) = q$ with $p \le q$, where $Q_{xz} = E[x_i z_i^T]$ and $Q_{zz} = E[z_i z_i^T]$.

Assumption 2.4.2 (Classical assumptions about the error $\varepsilon_i$). (i) $E[\varepsilon_i] = 0$; (ii) $E[\varepsilon_i^2] = \sigma^2$; (iii) $\varepsilon_i$ and $z_i$ are independent.

Assumption 2.4.3 (Moment conditions).

$$E\big[ \| (x_i^T, z_i^T, \varepsilon_i)^T \|^4 \big] < \infty. \qquad (2\text{-}27)$$

Let $X = [x_1, \ldots, x_n]^T$ be the $n \times p$ matrix of explanatory random variables, $y$ the $n \times 1$ vector of responses, $\varepsilon = (\varepsilon_1, \ldots, \varepsilon_n)^T$ the $n \times 1$ vector of unobserved error terms, and $Z = [z_1, \ldots, z_n]^T$ the $n \times q$ matrix of instrumental random variables, which usually includes a column of 1's. In matrix notation, the linear model (2-26) can be written as

$$y = X\beta + \varepsilon, \qquad (2\text{-}28)$$

where $Z$ and $\varepsilon$ are independent, with $E[\varepsilon] = 0$ and $\operatorname{Var}(\varepsilon) = \sigma^2 I$, where $I$ is the identity matrix. The 2SLS estimator of $\beta$ is given by

$$\hat\beta = \big( X^T Z (Z^T Z)^{-1} Z^T X \big)^{-1} X^T Z (Z^T Z)^{-1} Z^T y. \qquad (2\text{-}29)$$

Let $P_Z = Z(Z^T Z)^{-1} Z^T$ be the projection matrix onto the column space of $Z$; then the 2SLS estimator can be written in the equivalent form

$$\hat\beta = \big( X^T P_Z X \big)^{-1} X^T P_Z y. \qquad (2\text{-}30)$$

The estimator given by (2-29) is called the two-stage least squares (2SLS) estimator because it can be obtained by a two-step least squares procedure: at the first stage, regress the columns of $X$ on the column space of $Z$; then, at the second stage, regress $y$ on the column space of the fitted values of $X$ obtained from the first stage regressions. To be more exact, for each $j = 1, \ldots, p$, consider the regressions

$$x^{(j)} = Z\delta_j + \gamma_j, \quad j = 1, \ldots, p,$$

where $x^{(j)}$ is the $j$th column of $X$, $Z$ is the matrix of explanatory variables, and $\gamma_j$ is the error term. The OLS estimate of $\delta_j$ is $\hat\delta_j = (Z^T Z)^{-1} Z^T x^{(j)}$, a $q \times 1$ vector of estimates. The fitted values from these regressions are $\hat x^{(j)} = P_Z x^{(j)}$, each of dimension $n \times 1$. By concatenating these fitted column vectors $\hat x^{(j)}$, we obtain the fitted values of $X$ from these regressions,

$$\hat X = [\hat x^{(1)}, \ldots, \hat x^{(p)}] = Z(Z^T Z)^{-1} Z^T X = P_Z X,$$

of dimension $n \times p$. Now, at the second stage, we consider the regression model with $y$ as the response and $\hat X$ as the matrix of explanatory variables,

$$y = \hat X \beta + \gamma, \qquad (2\text{-}31)$$

where $\gamma$ is the error term. The OLS estimator of $\beta$ in (2-31) is

$$\hat\beta = (\hat X^T \hat X)^{-1} \hat X^T y = (X^T P_Z X)^{-1} X^T P_Z y,$$

in agreement with (2-29).
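The algebraic equivalence between the closed form (2-30) and the explicit two-stage procedure is easy to verify numerically; the data-generating values below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, q = 500, 2, 3
Z = rng.normal(size=(n, q))                                # instruments
X = Z @ rng.normal(size=(q, p)) + rng.normal(size=(n, p))  # regressors correlated with Z
y = X @ np.array([1.0, -0.5]) + rng.normal(size=n)

# Direct form (2-30): beta = (X' Pz X)^{-1} X' Pz y.
Pz = Z @ np.linalg.solve(Z.T @ Z, Z.T)
beta_direct = np.linalg.solve(X.T @ Pz @ X, X.T @ Pz @ y)

# Two-stage form: first-stage fitted values, then OLS of y on them.
X_hat = Pz @ X
beta_two_stage = np.linalg.solve(X_hat.T @ X_hat, X_hat.T @ y)
```

Since $P_Z$ is symmetric and idempotent, $\hat X^T \hat X = X^T P_Z X$ and $\hat X^T y = X^T P_Z y$, so the two solves coincide up to floating-point error.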

The 2SLS estimator $\hat\beta$ is also a feasible generalized least squares estimator (Hayashi, 2000, p. 59). To see this, multiply (2-28) by $Z^T$ to obtain

$$Z^T y = Z^T X \beta + Z^T \varepsilon. \qquad (2\text{-}32)$$

Since $E[\varepsilon_i] = 0$ and $z_i$ and $\varepsilon_i$ are independent,

$$\operatorname{Cov}[\varepsilon_i z_i] = E[\varepsilon_i^2 z_i z_i^T] = \sigma^2 Q_{zz},$$

and the GLS estimate of $\beta$ corresponding to (2-32) is

$$\tilde\beta = \big( X^T Z Q_{zz}^{-1} Z^T X \big)^{-1} X^T Z Q_{zz}^{-1} Z^T y.$$

Now, since $n^{-1} Z^T Z \xrightarrow{P} Q_{zz}$, a feasible GLS estimator of $\beta$ is obtained by substituting the consistent estimator $n^{-1} Z^T Z$ for $Q_{zz}$, which again yields $\hat\beta$ as given by (2-29).








The 2SLS estimator is also a particular case of an efficient GMM estimator. For $i = 1, \ldots, n$, let $u_i = (y_i, x_i^T, z_i^T)^T$ be the observations arranged in vector form. For $u = (y, x^T, z^T)^T \in \mathbb{R}^{1+p+q}$ and $\beta \in \mathbb{R}^p$, let $g(u, \beta) = (y - x^T\beta) z$. From Assumption 2.4.2, for all $i = 1, \ldots, n$,

$$E[g(u_i, \beta)] = 0, \qquad (2\text{-}33)$$

where $\beta$ is the regression parameter in (2-28). By Assumption 2.4.1, $q \ge p$, so the $q$ moment conditions in (2-33) define a GMM model as in (1-1). Then, for a given weight matrix $W_n$, the GMM estimator is

$$\hat\beta = \operatorname*{argmin}_{\beta}\ g_n(\beta)^T W_n\, g_n(\beta), \qquad (2\text{-}34)$$

where $g_n(\beta) = n^{-1} \sum_{i=1}^n (y_i - x_i^T\beta) z_i = n^{-1} Z^T(y - X\beta)$. Since $Q_n(\beta) = g_n(\beta)^T W_n g_n(\beta)$ is differentiable with respect to $\beta$, the GMM estimator $\hat\beta$ is a solution to $X^T Z W_n Z^T (y - X\beta) = 0$, and it is given by

$$\hat\beta = \big( X^T Z W_n Z^T X \big)^{-1} X^T Z W_n Z^T y. \qquad (2\text{-}35)$$

In order to obtain an efficient GMM estimator, we need to choose $W_n$ to be an estimator of the inverse of the asymptotic covariance matrix of $g_n(\beta)$. Since $g_n(\beta) = n^{-1} \sum_i (y_i - x_i^T\beta) z_i = n^{-1} \sum_i \varepsilon_i z_i$, using Assumptions 2.4.1-2.4.3 we obtain $n^{-1/2} \sum_{i=1}^n \varepsilon_i z_i \rightsquigarrow N(0, \sigma^2 Q_{zz})$. Since $n^{-1} Z^T Z \xrightarrow{P} Q_{zz}$, an efficient GMM estimator is obtained from (2-35) by substituting $(Z^T Z)^{-1}$ for $W_n$, i.e.,

$$\hat\beta = \big( X^T P_Z X \big)^{-1} X^T P_Z y, \qquad (2\text{-}36)$$

the same as in (2-29).

Finally, if Assumptions 2.4.1-2.4.3 are fulfilled, then the 2SLS estimator is asymptotically normally distributed (Hall, 2005); in particular,

$$n^{1/2}(\hat\beta - \beta) \rightsquigarrow N\big( 0,\ \sigma^2 ( Q_{xz} Q_{zz}^{-1} Q_{xz}^T )^{-1} \big).$$









2.4.2 Bootstrapping 2SLS Estimators

Consistency results and bootstrap procedures for regression models have been

developed in Freedman (1981), Shorack (1981), Freedman (1984), and Wu (1986).

Freedman (1981) identifies two main bootstrap procedures for linear models: bootstrapping

residuals and bootstrapping cases. Consider the linear model

$$y_i = x_i^T \beta + \varepsilon_i, \qquad (2\text{-}37)$$

where $y_i$ and $x_i$ are the response and the explanatory variables, respectively, $\beta$ is the regression parameter, and $\varepsilon_i$ is the error term. The observed data are of the form $\mathcal{X} = \{(x_1, y_1), \ldots, (x_n, y_n)\}$. In regression models, the $\varepsilon_i$'s are taken to be independent, zero mean, and usually identically distributed. In this case, one usually first fits the model to the data, and then the residuals are resampled and added to the fitted values to obtain the bootstrap sample. More precisely, let $\hat\varepsilon_i = y_i - \hat y_i$ be the residuals from the regression fit. Then, we first draw $n$ iid resamples from $\hat{\mathcal{E}} = \{\hat\varepsilon_1, \ldots, \hat\varepsilon_n\}$, obtaining the bootstrap residual resample $\mathcal{E}^* = \{\varepsilon_1^*, \ldots, \varepsilon_n^*\}$. Finally, for each $i$, define

$$y_i^* = x_i^T \hat\beta + \varepsilon_i^*. \qquad (2\text{-}38)$$

In the regression model, a typical bootstrap resample is then $\mathcal{X}^* = \{(y_i^*, x_i) : i = 1, \ldots, n\}$, so that the bootstrap estimate $\beta^*$ is defined as the version of $\hat\beta$ on $\mathcal{X}^*$. This is called resampling residuals.

When $\{(y_i, x_i) : i = 1, \ldots, n\}$ are iid pairs, the linear model (2-37) is called the correlation model, using the same terminology as Hall (1992, p. 170). In this case, the pairs $(y_i, x_i)$ are resampled randomly from $\mathcal{X}$, and the bootstrap estimate $\beta^*$ is defined as the version of $\hat\beta$ on $\mathcal{X}^*$.
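A minimal residual-resampling sketch for an ordinary linear model with an intercept (the data and coefficients below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one regressor
y = X @ np.array([2.0, 1.0]) + rng.normal(size=n)

beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta_hat                               # residuals from the fit

B = 500
boot = np.empty((B, 2))
for b in range(B):
    e_star = rng.choice(resid, size=n, replace=True)   # resample the residuals
    y_star = X @ beta_hat + e_star                     # bootstrap responses, as in (2-38)
    boot[b] = np.linalg.lstsq(X, y_star, rcond=None)[0]

se_boot = boot.std(axis=0)                             # bootstrap standard errors
```

Bootstrapping cases would instead resample the rows of $(y_i, x_i)$ jointly, leaving the residuals untouched.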

In the case of the zero-intercept regression model, Freedman (1981) and Shorack

(1981) argue that residuals need to be first re-centered before resampling, in order to

obtain consistent bootstrap estimates. Freedman remarks that "..., without centering,









the bootstrap will usually fail." We will prove that this depends on the underlying assumptions on the data generating mechanism. The main reason is that in the case of bootstrapping residuals, the (uniform) bootstrap introduces a bias in the estimates that sometimes does not vanish asymptotically. (This issue does not arise in ordinary least squares regression with an intercept.) We do not include the proof of the next theorem in the appendix, since it is similar to the following proofs regarding consistency results for bootstrapping 2SLS estimators, later in this section.

Theorem 2.4.1. Suppose that the vectors $v_i = (x_i, \varepsilon_i)$ are iid, with $\varepsilon_i$ and $x_i$ independent. Moreover, suppose that $E\|v_i\|^4 < \infty$, and let $\sigma^2 = \operatorname{Var}[\varepsilon_i]$ and $\Sigma = E[x_i x_i^T]$. Then:

(i) If $E x_i = 0$, then the bootstrap is (weakly) consistent: $\sqrt{n}(\beta^* - \hat\beta) \rightsquigarrow N(0, \sigma^2 \Sigma^{-1})$ in probability;

(ii) If $E x_i = \mu \neq 0$, then the (conditional) weak convergence $\sqrt{n}(\beta^* - \hat\beta) \rightsquigarrow N(0, \sigma^2 \Sigma^{-1})$ does not hold.

In the case of the 2SLS estimator, in order for the ordinary uniform bootstrap to be

consistent, Freedman (1984, p. 834) argues that the residuals first need to be recentered,

since "As data, the residuals are not orthogonal to the instruments." This statement is

potentially misleading and might lead a practitioner to recenter the residuals unnecessarily.

We will show in this section that under general conditions, the uniform bootstrap estimate

of the distribution of 2SLS estimator is indeed weakly consistent without the need for

re-centering. The main idea here is that it can be shown that bootstrapping the data in

2SLS regression model is equivalent to bootstrapping cases, and bootstrapping cases in

2SLS model is equivalent to bootstrapping efficient GMM estimators, in the GMM model

associated with the regression model. We have shown in the appendix that the (uniform)

bootstrap is consistent for the sampling distribution of GMM estimators. Although Hahn

(1996) also proved the result of consistency for GMM estimators, he does not make the

connection between Freedman's resampling procedure with uncentered residuals and

bootstrapping the GMM estimators.









To be more specific, let the residuals from the 2SLS fit be given by

$$\hat\varepsilon_i = y_i - x_i^T \hat\beta, \qquad (2\text{-}39)$$

where $x_i^T$ is the $i$th row of the matrix $X$. Generally, the $\hat\varepsilon_i$'s are not orthogonal to the instruments, i.e.,

$$\sum_{i=1}^n \hat\varepsilon_i z_i \neq 0. \qquad (2\text{-}40)$$

Freedman defines the recentered residuals as the part of the residuals that is orthogonal to the column space of $Z$ (the instruments), i.e.,

$$\tilde\varepsilon_i = \hat\varepsilon_i - z_i^T (Z^T Z)^{-1} Z^T \hat\varepsilon.$$

In order to preserve the dependence structure within $v_i = (x_i, z_i, \tilde\varepsilon_i)$, Freedman resamples from $\mathcal{V} = \{(x_i^T, z_i^T, \tilde\varepsilon_i),\ i = 1, \ldots, n\}$. Let $\mathcal{V}^* = \{(x_1^{*T}, z_1^{*T}, \tilde\varepsilon_1^*), \ldots, (x_n^{*T}, z_n^{*T}, \tilde\varepsilon_n^*)\}$ be a generic uniform resample from $\mathcal{V}$. Denote by $X^*$, $Z^*$, $P_Z^*$, and $\tilde\varepsilon^*$ the bootstrap versions of $X$, $Z$, $P_Z$, and $\tilde\varepsilon$ on $\mathcal{V}^*$. Define the bootstrap observations $y^*$ by

$$y^* = X^* \hat\beta + \tilde\varepsilon^*. \qquad (2\text{-}41)$$

The analogue of $\hat\beta$ for the bootstrap resample $\mathcal{V}^*$ is given by

$$\hat\beta^* = \big( X^{*T} P_Z^* X^* \big)^{-1} X^{*T} P_Z^* y^*. \qquad (2\text{-}42)$$

We now show that recentering is not necessary when bootstrapping 2SLS estimators. Let $\hat{\mathcal{V}} = \{(x_1, z_1, \hat\varepsilon_1), \ldots, (x_n, z_n, \hat\varepsilon_n)\}$, and let $\hat{\mathcal{V}}^* = \{(x_1^*, z_1^*, \hat\varepsilon_1^*), \ldots, (x_n^*, z_n^*, \hat\varepsilon_n^*)\}$ be a (uniform) bootstrap resample from $\hat{\mathcal{V}}$, i.e., $(x_1^*, z_1^*, \hat\varepsilon_1^*), \ldots, (x_n^*, z_n^*, \hat\varepsilon_n^*)$ are (conditionally) iid, (discretely) uniformly distributed on $\hat{\mathcal{V}}$. Denote by $X^*$, $Z^*$, $P_Z^*$, and $\hat\varepsilon^*$ the bootstrap versions of $X$, $Z$, $P_Z$, and $\hat\varepsilon$ on $\hat{\mathcal{V}}^*$, and define the bootstrap observations $y^*$ and the bootstrap analogue $\hat\beta^*$ of $\hat\beta$ for this resample as before. It is an easy exercise to show that resampling the $(x_i, z_i, \hat\varepsilon_i)$'s with equal probabilities is equivalent to resampling the cases $(y_i, x_i, z_i)$ with equal probabilities. Therefore, since 2SLS estimators are particular cases of efficient GMM estimators (as in (2-36)), and using the result that the uniform bootstrap is consistent for the distribution of GMM estimators, we conclude that the uniform bootstrap gives consistent estimators of the distribution of 2SLS estimators, without recentering the residuals.

The following consistency results can be generalized to allow for heteroskedasticity of the error terms, but we limit the proofs to these simpler hypotheses (White, 1982a).

Theorem 2.4.2. Under Assumptions 2.4.1-2.4.3, for every $x \in \mathbb{R}^q$,

$$P_*\!\left( n^{-1/2} \sum_{i=1}^n \hat\varepsilon_i^* z_i^* \le x \right) \xrightarrow{P} \Phi_{0, \sigma^2 Q_{zz}}(x), \qquad (2\text{-}43)$$

where $\Phi_{0, \sigma^2 Q_{zz}}(x)$ is the joint cumulative distribution function of a multivariate normal vector with mean $0$ and covariance matrix $\sigma^2 Q_{zz}$.

Theorem 2.4.3. If Assumptions 2.4.1-2.4.3 are fulfilled, then the bootstrap estimate of the distribution of the 2SLS estimator is weakly consistent, i.e.,

$$n^{1/2}(\hat\beta^* - \hat\beta) \rightsquigarrow N\big( 0,\ \sigma^2 ( Q_{xz} Q_{zz}^{-1} Q_{xz}^T )^{-1} \big) \quad \text{in probability}. \qquad (2\text{-}44)$$


2.4.3 Simulations

We present some simulation studies to compare the coverage performances of

confidence intervals based on different bootstrap procedures developed in this chapter.

We consider the regression model

$$y_i = \alpha x_i + \varepsilon_{1i}, \quad i = 1, \ldots, n, \qquad (2\text{-}45)$$

where the (random) explanatory variables are given by

$$x_i = \lambda \varepsilon_{1i} + \varepsilon_{2i}, \qquad (2\text{-}46)$$

and the instruments $z_{1i}$ and $z_{2i}$ are generated by

$$z_{1i} = \gamma \varepsilon_{2i} + \varepsilon_{3i}, \qquad (2\text{-}47)$$

$$z_{2i} = \gamma \varepsilon_{2i} + \varepsilon_{4i}, \qquad (2\text{-}48)$$

where $\varepsilon_{1i}, \varepsilon_{2i}, \varepsilon_{3i}, \varepsilon_{4i}$ are iid, $(\chi_1^2 - 1)$ distributed. Hence, the $\varepsilon$'s have mean 0 and variance 2. Let $x = (x_1, \ldots, x_n)^T$ be the $n \times 1$ vector of explanatory variables, $y$ the $n \times 1$ vector of responses, $\varepsilon = (\varepsilon_{11}, \ldots, \varepsilon_{1n})^T$ the $n \times 1$ vector of unobserved error terms, and $z_1 = (z_{11}, \ldots, z_{1n})^T$ and $z_2 = (z_{21}, \ldots, z_{2n})^T$ the $n \times 1$ vectors of instrumental variables. In matrix notation, the model given by (2-45)-(2-48) can be written in the more compact form

$$y = \alpha x + \varepsilon, \qquad (2\text{-}49)$$

with $z_1$ and $z_2$ as instrumental variables. We can easily see that $z_1$ and $z_2$ are valid instruments, in the sense that they are uncorrelated with the error term $\varepsilon$ ($\operatorname{Cov}(z_{1i}, \varepsilon_{1i}) = 0$) and correlated with the endogenous variable $x$ ($\operatorname{Cov}(z_{1i}, x_i) = \gamma \operatorname{Var}(\varepsilon_{2i})$). Moreover, $\operatorname{Cov}(x_i, \varepsilon_{1i}) = \lambda \operatorname{Var}(\varepsilon_{1i})$, so the classical regression model assumptions are not fulfilled. As a consequence, the OLS estimate of $\alpha$ is inconsistent in this case.
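A generator for this design (with $\gamma = .8$ as in the tables; the values of $\alpha$ and $\lambda$ below are illustrative) might look like:

```python
import numpy as np

def generate(n, alpha=1.0, lam=0.5, gamma=0.8, rng=None):
    """Simulate (2-45)-(2-48): y = alpha*x + e1, x = lam*e1 + e2,
    z1 = gamma*e2 + e3, z2 = gamma*e2 + e4, with e_k iid chi2(1) - 1."""
    rng = rng if rng is not None else np.random.default_rng()
    e1, e2, e3, e4 = (rng.chisquare(1, n) - 1.0 for _ in range(4))
    x = lam * e1 + e2
    y = alpha * x + e1
    Z = np.column_stack([gamma * e2 + e3, gamma * e2 + e4])
    return y, x, Z

rng = np.random.default_rng(5)
y, x, Z = generate(100_000, rng=rng)
e1 = y - 1.0 * x                              # structural error, known here by construction
cov_z1_e = np.mean(Z[:, 0] * e1)              # near 0: valid instrument
cov_z1_x = np.mean(Z[:, 0] * (x - x.mean()))  # near gamma * Var(e2) = 1.6: relevant instrument
```

The large sample makes the two empirical covariances visibly match their population values.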

In this simulation study, we compare the coverage performances of confidence

intervals based on biased bootstrap, double biased bootstrap, biased bootstrap recycling,

centered residual bootstrap (Freedman, 1984), uncentered residual bootstrap, centered

double bootstrap, and uncentered residual double bootstrap.

In order to define the biased bootstrap confidence intervals, we consider the instrumental variable regression model in the GMM framework, as described in Subsection 2.4.1. For $i = 1, \ldots, n$, let $u_i = (y_i, x_i, z_i^T)^T$ be the observations arranged in vector form. For $u = (y, x, z^T)^T \in \mathbb{R}^4$ and $a \in \mathbb{R}$, let $g(u, a) = (y - ax) z$. By assumption,

$$E[g(u_i, a)] = 0, \quad i = 1, \ldots, n, \qquad (2\text{-}50)$$

where $a$ is the regression parameter. It is worth noting that $n^{-1} \sum_i (y_i - a x_i) z_i = 0$ does not have solutions, since the dimension of the system of equations (2 in this case) is bigger than the dimension of the parameter (1 in this case). Thus, we are in the GMM setup.

Let $\mathcal{U} = \{u_i : i = 1, \ldots, n\}$ be the sample of observations arranged in vector form, and let $\mathcal{F} = \{F_a : a \in \mathbb{R}\}$ be the least favorable family of weighted empirical distributions on the sample $\mathcal{U}$, defined in Section 2.3, associated with the implied GMM model defined by (2-50). Let $\hat a$ be the 2SLS estimator of $a$. We apply the bootstrap technique described in Subsection 2.3.2 to construct bootstrap confidence intervals. For an $\alpha$-level upper confidence interval for $a$, we need to find $t_{1-\alpha}$, the solution of the "population equation"

$$P\big( a \le \hat a - t_{1-\alpha} \big) - \alpha = 0. \qquad (2\text{-}51)$$

We rewrite the left side of this equation as follows:

$$P\big( a \le \hat a - t_{1-\alpha} \big) - \alpha = P\big( \hat a - a \ge t_{1-\alpha} \big) - \alpha = 1 - \alpha - P\big( \hat a - a < t_{1-\alpha} \big) = 1 - \alpha - E\big[ I(\hat a - a < t_{1-\alpha}) \big]. \qquad (2\text{-}52)$$

Hence, $t_{1-\alpha}$ is the solution of the "population equation"

$$E\big[ I(\hat a - a < t_{1-\alpha}) \big] = 1 - \alpha. \qquad (2\text{-}53)$$

Let $\hat t_{1-\alpha}$ be the bootstrap estimate of $t_{1-\alpha}$, i.e., the solution of the "sample equation"

$$E_{\hat a}\big[ I(\hat a^* - \hat a < \hat t_{1-\alpha}) \big] = 1 - \alpha. \qquad (2\text{-}54)$$

The biased bootstrap upper confidence interval with nominal coverage $\alpha$ is then $J_{bb} = (-\infty, \hat a - \hat t_{1-\alpha})$. Since typically

$$P\big( a \le \hat a - \hat t_{1-\alpha} \big) \neq \alpha, \qquad (2\text{-}55)$$









we can look for a (further) correction of the coverage level $\alpha$. Let $q(\alpha)$ be the solution of

$$P\big( a \le \hat a - \hat t_{q(\alpha)} \big) = \alpha. \qquad (2\text{-}56)$$

Let $\hat q(\alpha)$ be the bootstrap estimator of $q(\alpha)$, so that $\hat q(\alpha)$ satisfies

$$P_{\hat a}\big( \hat a \le \hat a^* - \hat t^*_{\hat q(\alpha)} \big) = \alpha, \qquad (2\text{-}57)$$

where $\hat t^*_{\hat q(\alpha)}$ satisfies $P\big( \hat a^{**} - \hat a^* < \hat t^*_{\hat q(\alpha)} \mid F_{\hat a^*} \big) = \hat q(\alpha)$. Since $\hat q(\alpha)$ satisfies $E_{\hat a}\big[ I\big( \hat t^*_{\hat q(\alpha)} \le \hat a^* - \hat a \big) \big] = \alpha$, and

$$\hat t^*_{\hat q(\alpha)} \le \hat a^* - \hat a \iff E_{\hat a^*}\big[ I\big( \hat a^{**} - \hat a^* < \hat a^* - \hat a \big) \big] \ge \hat q(\alpha), \qquad (2\text{-}58)$$

it follows that $\hat q(\alpha)$ satisfies

$$E_{\hat a}\Big[ I\Big\{ E_{\hat a^*}\big[ I\big( \hat a^{**} - \hat a^* < \hat a^* - \hat a \big) \big] \ge \hat q(\alpha) \Big\} \Big] = \alpha. \qquad (2\text{-}59)$$

Consequently, $\hat q(\alpha)$ is the $(1-\alpha)$th quantile of $E_{\hat a^*}\big[ I\big( \hat a^{**} - \hat a^* < \hat a^* - \hat a \big) \big]$.

We now describe the Monte Carlo implementation of this bootstrap procedure. For each biased bootstrap resample $\mathcal{X}_b^*$, $b = 1, \ldots, B$, compute $\hat a_b^*$ and draw $C$ biased bootstrap re-resamples $\mathcal{X}_{bc}^{**}$, $c = 1, \ldots, C$. For each $c$, let $\hat a_{bc}^{**}$ be the bootstrap version of $\hat a$ on $\mathcal{X}_{bc}^{**}$. For every $b = 1, \ldots, B$, let

$$z_b^* = \frac{1}{C} \sum_{c=1}^C I\big( \hat a_{bc}^{**} - \hat a_b^* < \hat a_b^* - \hat a \big). \qquad (2\text{-}60)$$

The Monte Carlo approximation of $\hat q(\alpha)$ is the $(1-\alpha)$th quantile of $z_1^*, \ldots, z_B^*$. Then, compute $\hat t_{\hat q(\alpha)}$, the $\hat q(\alpha)$th quantile of $\hat a_b^* - \hat a$, $b = 1, \ldots, B$. Hence, the $\alpha$-level upper bootstrap (coverage) calibrated confidence interval for $a$ is given by

$$\tilde J_{bb} = \big( -\infty,\ \hat a - \hat t_{\hat q(\alpha)} \big). \qquad (2\text{-}61)$$
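The calibration step can be sketched as follows, with a nested bootstrap for a sample mean standing in for the 2SLS problem (the estimator and data are illustrative; only the quantile bookkeeping of (2-60)-(2-61) is the point):

```python
import numpy as np

def calibrated_upper_bound(a_hat, a_star, z, alpha):
    """Coverage-calibrated upper limit: q_hat is the (1-alpha)th quantile of
    the inner coverage levels z_b (2-60); the endpoint then uses the q_hat-th
    quantile of a_star - a_hat instead of the raw (1-alpha)th quantile (2-61)."""
    q_hat = np.quantile(z, 1.0 - alpha)
    t_hat = np.quantile(a_star - a_hat, q_hat)
    return a_hat - t_hat

rng = np.random.default_rng(6)
x = rng.normal(loc=1.0, size=50)
a_hat = x.mean()

B, C = 400, 200
a_star = np.empty(B)
z = np.empty(B)
for b in range(B):
    xb = rng.choice(x, size=x.size)                        # outer resample
    a_star[b] = xb.mean()
    a_ss = np.array([rng.choice(xb, size=xb.size).mean() for _ in range(C)])
    z[b] = np.mean(a_ss - a_star[b] < a_star[b] - a_hat)   # inner coverage level (2-60)

upper = calibrated_upper_bound(a_hat, a_star, z, alpha=0.95)
```

The nested loop is exactly the $B \times C$ cost that the recycling scheme below avoids.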

The same procedure is used for the (ordinary) double bootstrap, with the only difference that the resamples are taken from the uniform empirical distributions on the samples. Hence $\tilde J_{ub}$, the Monte Carlo approximation of the double bootstrap (coverage) calibrated upper confidence interval for $a$, can be found in a similar way.

We now describe how recycling can be applied to this problem. Instead of redrawing the second level bootstrap re-resamples $\mathcal{X}_{bc}^{**}$, $c = 1, \ldots, C$, in order to estimate

$$E_{\hat a_b^*}\big( I\big( \hat a^{**} - \hat a_b^* < \hat a_b^* - \hat a \big) \big),$$

corresponding to the $b$th resample $\mathcal{X}_b^*$, we use the importance sampling (change of measure) identity

$$z_b^* = E_{\hat a_b^*}\big( I\big( \hat a^{**} - \hat a_b^* < \hat a_b^* - \hat a \big) \big) = E_{\hat a}\!\left( I\big( \hat a^* - \hat a_b^* < \hat a_b^* - \hat a \big)\, \frac{dP_{\hat a_b^*}}{dP_{\hat a}} \right). \qquad (2\text{-}62)$$

Let $w_{bb'} = \prod_{i=1}^n \big( p_i(\hat a_b^*) / p_i(\hat a) \big)^{m_{b'i}}$, where $m_{b'i} = \#\{j : X_{b'j}^* = X_i\}$. Hence,

$$z_b^* \approx \frac{1}{B} \sum_{b'=1}^B I\big( \hat a_{b'}^* - \hat a_b^* < \hat a_b^* - \hat a \big)\, w_{bb'}. \qquad (2\text{-}63)$$

The Monte Carlo approximation of $\hat q(\alpha)$ is the $(1-\alpha)$th quantile of $z_1^*, \ldots, z_B^*$, and we then compute $\hat t_{\hat q(\alpha)}$, the $\hat q(\alpha)$th quantile of $\hat a_b^* - \hat a$, $b = 1, \ldots, B$. Hence, the $\alpha$-level upper bootstrap (coverage) calibrated confidence interval for $a$ is given by

$$J_{rbb} = \big( -\infty,\ \hat a - \hat t_{\hat q(\alpha)} \big). \qquad (2\text{-}64)$$

Let $\bar w_{bb'} = w_{bb'} / \sum_{b''=1}^B w_{bb''}$ be the normalized weights. Then, the (adjusted) Monte Carlo estimates in (2-63) are

$$\tilde z_b^* = \sum_{b'=1}^B I\big( \hat a_{b'}^* - \hat a_b^* < \hat a_b^* - \hat a \big)\, \bar w_{bb'}. \qquad (2\text{-}65)$$

We denote by $\tilde J_{rbb}$ the (adjusted) recycling biased bootstrap confidence interval.

Tables 2-1 and 2-2 show the Monte Carlo estimates of the coverage probabilities corresponding to different bootstrap confidence intervals: GMM biased bootstrap (GBB), GMM uniform bootstrap (GUB), and Freedman's centered residuals (FCR). We have also included the coverage of confidence intervals based on the asymptotic approximation (ASY). The coverage probabilities were estimated by the proportion of nominal level $\alpha$ upper confidence intervals containing the parameter $a$, using 1000 simulation runs and 1000 bootstrap resamples, for values $\alpha = .9, .95$ and sample sizes $n = 20, 40, 60, 80, 100, 200, 300$. From these tables we remark that all three resampling procedures perform similarly, for both types of bootstrap confidence intervals: percentile and percentile-t. The bootstrap percentile-t confidence intervals perform slightly better than the bootstrap percentile confidence intervals for all resampling procedures. The confidence intervals based on the asymptotic approximation perform quite poorly in this case, even for relatively large sample sizes.

Figures 2-1, 2-2, and 2-3 show the coverage errors for different bootstrap percentile confidence intervals, using different bootstrapping procedures, for nominal sizes α = .80, .85, .90, .95, .99. The coverage probabilities were examined using 500 simulation runs and 500 outer level resamples. From these plots we can conclude that, except for the sample size n = 20, when all bootstrap procedures perform similarly, for all other sample sizes n = 40, 60, 80, 100, 200 the biased bootstrap recycling confidence intervals outperform all other bootstrap percentile confidence intervals.

Table 2-1. Estimated coverage probabilities for bootstrap one-sided, upper confidence intervals at the .90 nominal level, γ = .8, B = 1000, S = 1000: GMM biased bootstrap (GBB), GMM uniform bootstrap (GUB), centered residuals (FCR), and based on the asymptotic approximation (ASY)
Percentile Percentile-t
n GBB GUB FCR GBB GUB FCR ASY
20 0.912 0.928 0.915 0.825 0.831 0.816 0.998
40 0.964 0.972 0.969 0.858 0.862 0.854 0.989
60 0.968 0.973 0.976 0.861 0.869 0.861 0.989
80 0.967 0.968 0.966 0.874 0.878 0.876 0.97
100 0.963 0.969 0.967 0.865 0.874 0.869 0.961
200 0.944 0.946 0.944 0.88 0.881 0.875 0.951
300 0.932 0.93 0.929 0.877 0.879 0.878 0.95















Table 2-2. Estimated coverage probabilities for bootstrap one-sided, upper confidence intervals at the .95 nominal level, γ = .8, B = 1000, S = 1000: GMM biased bootstrap (GBB), GMM uniform bootstrap (GUB), centered residuals (FCR), and based on the asymptotic approximation (ASY)


Figure 2-1. Estimated coverage errors corresponding to different bootstrap confidence intervals, at different α levels, sample size n = 40: ○ GMM biased bootstrap (GBB), □ GMM uniform bootstrap (GUB), ◇ GMM recycling biased bootstrap (RBB), △ centered residuals (FCR), ▽ based on the asymptotic approximation (ASY)























Figure 2-2. Estimated coverage errors corresponding to different bootstrap confidence intervals, at different α levels: ○ GMM biased bootstrap (GBB), □ GMM uniform bootstrap (GUB), ◇ GMM recycling biased bootstrap (RBB), △ centered residuals (FCR), ▽ based on the asymptotic approximation (ASY)

Figure 2-3. Estimated coverage errors corresponding to different bootstrap confidence intervals, at different α levels, sample size n = 100: ○ GMM biased bootstrap (GBB), □ GMM uniform bootstrap (GUB), ◇ GMM recycling biased bootstrap (RBB), △ centered residuals (FCR), ▽ based on the asymptotic approximation (ASY)






CHAPTER 3
THE BLOCK BIASED BOOTSTRAP FOR TIME SERIES

3.1 Review on Bootstrap for Time Series

There are essentially two different ways to bootstrap dependent data. One is

model based, when the residuals from the model fit are assumed to be approximately

independent. In this case, the bootstrap samples are obtained by first resampling

appropriately the residuals from the model fit; the bootstrap resamples are then recovered

using the model structure with the estimated parameters. See for example Freedman

and Peters (1984), Freedman (1984), Efron and Tibshirani (1986), and Bose (1981).

Model free bootstrap procedures have been proposed in a series of papers by Hall (1985),

Carlstein (1986), and Künsch (1989). These are based on blocking arguments, in which

the data are first divided into blocks of consecutive observations, and the blocks, instead of

the observations are resampled in order to obtain the bootstrap resamples. As a result, the

dependence structure within the time series is preserved within each block.

There are different types of blocking, including overlapping and nonoverlapping

blocks. These in turn give rise to different block bootstrap procedures, such as the

nonoverlapping block bootstrap (Carlstein, 1986), the moving block bootstrap (Künsch,

1989), the circular block bootstrap (Politis and Romano, 1992), and the stationary

bootstrap (Politis and Romano, 1994). All these block bootstrap methods are particular

cases of the generalized block bootstrap, as defined in Lahiri (2003, pp. 31-33). By

approximating general stationary time series with families of (increasingly more complex)

parametric models, sieve bootstraps have also been developed in a series of papers by

Kreiss (1992), Bühlmann (1997), and Choi and Hall (2000).

Instead of resampling the blocks with equal probabilities, we propose a new procedure

to resample the blocks from a weighted empirical distribution on the sample of blocks

which satisfies the sample moment conditions. The weights are obtained by minimizing the

Cressie-Read distance to the empirical distribution under the constraint that it satisfies









the moment conditions at the M-estimator. This resampling procedure can be applied

successfully also for the GMM model, and consistency proofs can be easily extended in

this case for bootstrapping GMM and GEL estimators and the test of over-identifying

restrictions. Moreover, as in the iid case, by applying a recycling algorithm (Section 2.3.2),

we can reuse the first level block bootstrap resamples when iterating the block biased

bootstrap in order to estimate higher level parameters.

3.2 The Block Biased Bootstrap for Generalized M-Estimators

In this section, we propose a different method to bootstrap generalized M-estimators

for time series, based on the biased bootstrap. Brown and Newey (2002) applied the biased bootstrap to the GMM model with iid observations. They anticipated that a similar

bootstrap procedure for dependent data would be possible.

Let 𝒳 = {X_1, ..., X_n} be a realization of a d-dimensional stationary time series. Denote by F the common distribution of the X_i's, and by F_n = (1/n) Σ_{i=1}^n δ_{X_i} the empirical distribution of the X_i's. Suppose that the parameter of interest θ_0 ∈ Θ ⊂ R^p is defined implicitly as the unique solution to the "population equation"

E_F[b_{θ_0}] = E[b(X_1; θ_0)] = 0, (3-1)

for some b : R^d × Θ → R^p. Bustos (1982) introduced the generalized M-estimator θ̂ of θ_0 as a solution to the "sample equation"

E_{F_n}[b_{θ̂}] = (1/n) Σ_{i=1}^n b(X_i; θ̂) = 0. (3-2)
i=1

This class of estimators includes the (pseudo) maximum likelihood estimators (MLEs) and certain robust statistics in classical parametric time series. We now describe the moving block bootstrap (though the methodology can be extended to non-overlapping blocks as well). Let B_i = {X_i, ..., X_{i+l-1}} denote the block of length l starting at X_i, with i = 1, ..., N, where N = n - l + 1 is the total number of blocks, and let ℬ = {B_1, ..., B_N} be the sample of moving blocks. We will suppose throughout that the block length l = l(n) satisfies l → ∞ and l/n → 0 as n → ∞.

Let B*_1, ..., B*_{b_l} denote a simple random sample drawn from ℬ, where b_l = [n/l], i.e. b_l is the integer part of the ratio between the sample size n and the block length l (though other values are also possible). Since each resampled block contains l elements, in total we resample n_1 = l·b_l elements. If we denote the elements of block B*_i by X*_{(i-1)l+1}, ..., X*_{il}, then 𝒳* = {X*_1, ..., X*_{n_1}} is a block bootstrap resample of size n_1. Generally, the bootstrap version of θ̂ and of its centered and scaled version T_n = n^{1/2}(θ̂ - θ_0) are defined in one of two ways.

One way is to define the bootstrap version θ̂* of θ̂ as the solution of

(1/n_1) Σ_{i=1}^{n_1} b(X*_i; θ̂*) = 0, (3-3)

and then the bootstrap version of T_n is T*_n = n_1^{1/2}(θ̂* - θ̃), where the centering value θ̃ is defined as the solution to

b̃(θ̃) = (1/n_1) Σ_{i=1}^{n_1} E_*[b(X*_i; θ̃)] = 0, (3-4)

where E_*[·] = E[· | 𝒳] is the conditional expectation given the sample 𝒳. It is easy to see that θ̃ is generally different from θ̂, so computation of T*_n requires solving an additional set of equations for the right centering value. Using θ̂ instead of θ̃ when defining the bootstrap version of T_n induces an extra bias (due only to the resampling procedure) that leads to either a worse approximation or inconsistency of the bootstrap (Lahiri, 2003, pp. 81-83).

Another way to define the bootstrap version of θ̂ is as a solution to the modified equation

(1/n_1) Σ_{i=1}^{n_1} [b(X*_i; θ̂*) - b̃(θ̂)] = 0. (3-5)

This approach was first suggested by Shorack (1981) in the context of bootstrapping M-estimators in linear regression models, and also by Hall and Horowitz (1996) in the context of bootstrapping GMM estimators. Note that b̃(θ̂) is the appropriate quantity that makes the estimating equation (3-5) conditionally unbiased. As Lahiri (2003, p. 83) remarks, one advantage of using (3-5) over (3-3) is that we need to solve only one set of equations (3-5), compared with the two sets of equations (3-3) and (3-4) in the latter case.
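The mechanics of drawing a single moving block resample can be sketched as follows (the function name and the toy data are ours; this is the uniform-weight version that the biased bootstrap below reweights):

```python
import numpy as np

def moving_block_resample(x, l, rng):
    """One moving block bootstrap resample of size n_1 = b_l * l."""
    n = len(x)
    N = n - l + 1          # number of overlapping blocks B_1, ..., B_N
    b_l = n // l           # integer part of n / l
    starts = rng.integers(0, N, size=b_l)      # uniform block selection
    return np.concatenate([x[s:s + l] for s in starts])

rng = np.random.default_rng(2)
x = rng.normal(size=125)
xstar = moving_block_resample(x, l=5, rng=rng)   # here n_1 = 25 * 5 = 125
```

Because whole blocks are copied, the within-block dependence of the original series is preserved in the resample.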

In order to define the weights of the blocks corresponding to the moving block biased bootstrap (MBBB), denote by U_i(θ) = (1/l) Σ_{j=i}^{i+l-1} b(X_j; θ) the average of the moments at θ in block B_i. Then let p = p(θ̂) = (p_1, ..., p_N) be the vector of probabilities that is closest to the vector of uniform weights (1/N, ..., 1/N) such that the weighted mean of the block averages is equal to the zero vector. In other words, p is the solution of the minimization problem

minimize D_ρ(p) = Σ_{i=1}^N ψ_ρ(N p_i),
subject to Σ_{i=1}^N p_i U_i(θ̂) = 0 and Σ_{i=1}^N p_i = 1, (3-6)

where ψ_ρ is the kernel of the Cressie-Read distance. The optimization problem (3-6) may be solved for p_i as in the iid case, yielding

p_i = (1/N) { 1 + (ρ - 1)[ρδ + λᵀ U_i(θ̂)] }^{1/(ρ-1)}, if ρ ≠ 1,
p_i = (1/N) exp{ δ + λᵀ U_i(θ̂) }, if ρ = 1, (3-7)

where δ and λ are chosen to satisfy the constraints Σ_{i=1}^N p_i = 1 and Σ_{i=1}^N p_i U_i(θ̂) = 0.

To obtain the MBBB samples, we first compute the estimate θ̂ of θ_0, then randomly select b_l blocks from the collection ℬ = {B_1, ..., B_N} according to the vector of probabilities p(θ̂) defined by (3-7). As before, each resampled block B*_i contains l elements, which we denote by B*_i = (X*_{(i-1)l+1}, ..., X*_{il}), i = 1, ..., b_l, and n_1 = l·b_l is the total number of bootstrap values. Then 𝒳* = {X*_1, ..., X*_{n_1}} is a MBBB sample of size n_1. Let θ̂* denote the bootstrap version of θ̂ corresponding to the biased bootstrap resample 𝒳*, defined as the solution to (1/n_1) Σ_{i=1}^{n_1} b(X*_i; θ̂*) = 0. By construction, the (conditional) expectation given the sample 𝒳 of the biased bootstrap mean of the moment conditions at θ̂ is the zero vector, i.e.

E_*[b̄*(θ̂)] = E_*[ (1/n_1) Σ_{i=1}^{n_1} b(X*_i; θ̂) ]
            = E_*[ (1/b_l) Σ_{i=1}^{b_l} U*_i(θ̂) ]
            = Σ_{i=1}^N p_i(θ̂) U_i(θ̂) = 0, (3-8)

where b̄*(θ̂) = (1/n_1) Σ_{i=1}^{n_1} b(X*_i; θ̂) and U*_i(θ̂) = (1/l) Σ_{j=(i-1)l+1}^{il} b(X*_j; θ̂).
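For a scalar moment condition, the weights solving (3-6) for the empirical-likelihood member of the family (ρ = 0) can be computed by a one-dimensional search for the Lagrange multiplier, as in the following sketch (the function name, tolerance and toy data are ours; vector-valued moments would require a multivariate root finder):

```python
import numpy as np

def el_block_weights(U, tol=1e-12):
    """Weights p_i = 1/(N(1 + lam*U_i)) for scalar block averages U_i,
    with lam solving sum_i U_i/(1 + lam*U_i) = 0.
    Requires min(U) < 0 < max(U), i.e. 0 inside the convex hull."""
    N = len(U)
    lo = -1.0 / U.max() + 1e-9   # keep all 1 + lam*U_i > 0
    hi = -1.0 / U.min() - 1e-9
    g = lambda lam: np.sum(U / (1.0 + lam * U))
    while hi - lo > tol:          # g is strictly decreasing: bisection
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return 1.0 / (N * (1.0 + lam * U))

rng = np.random.default_rng(3)
U = rng.normal(0.3, 1.0, size=50)   # toy block averages of the moments
p = el_block_weights(U)
```

The two constraints of (3-6) are then satisfied automatically: the weights sum to one and the weighted block averages sum to zero.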

3.3 Consistency Results for the Block Biased Bootstrap

Let T_n = n^{1/2}(θ̂ - θ_0) and T*_n = n_1^{1/2}(θ̂* - θ̂), and denote by G_n and G*_n their distributions (G*_n is a random distribution). Before describing the bootstrap for time series, we first introduce the notions of stationarity and the α-mixing property, necessary for the central limit theorem in the case of dependent data. A sequence of random vectors X_i, i = 1, 2, ..., is called (strictly) stationary if for every i_1 < i_2 < ... < i_k, k ∈ N, and for every m ∈ N, the distributions of (X_{i_1}, ..., X_{i_k}) and (X_{i_1+m}, ..., X_{i_k+m}) are the same. The strong mixing or α-mixing coefficient of (X_i) is defined as

α(n) = sup{ |P(A ∩ B) - P(A)P(B)| : A ∈ σ{X_j : 1 ≤ j ≤ k},
            B ∈ σ{X_j : j ≥ k + n + 1}, k ∈ N }, n ∈ N. (3-9)

We assume the following regularity conditions hold.

Assumption 3.3.1. The parameter space Θ ⊂ R^p is compact, and θ_0 is an inner point of Θ which satisfies the population moment condition given by (3-1).

Assumption 3.3.2. The function b_θ(x) is twice differentiable in θ in a neighborhood of θ_0 for every x, such that ‖b(x, θ)‖, ‖∇_θ b(x, θ)‖, and ‖∇²_θ b(x, θ)‖ are all bounded by some square integrable function k, with E_F k² < ∞.

Assumption 3.3.3. Both Σ_∞ = Cov(b(X_1, θ_0), b(X_1, θ_0)) + 2 Σ_{i=1}^∞ Cov(b(X_1, θ_0), b(X_{1+i}, θ_0)) and D = E[∇_θ b(X_1, θ_0)] are nonsingular.









Assumption 3.3.4. There exists a δ > 0 such that Σ_{n=1}^∞ α(n)^{δ/(2+δ)} < ∞. Also, the block length l satisfies l^{-1} + n^{-δ/(4+2δ)} l = o(1), and E‖b(X_1, θ_0)‖^{2+δ} < ∞.

Under the conditions above (Lahiri, 2003, Theorem 4.2, p. 83), the following weak convergence holds: G_n ⇒ G, where G = N(0, D^{-1} Σ_∞ (D^{-1})ᵀ), Σ_∞ = lim_{n→∞} n Var(b̄_n(θ_0)) = Cov(b(X_1, θ_0), b(X_1, θ_0)) + 2 Σ_{i=1}^∞ Cov(b(X_1, θ_0), b(X_{1+i}, θ_0)), b̄_n(θ) = (1/n) Σ_{i=1}^n b(X_i; θ), and D = E[∇_θ b(X_1, θ_0)].

The next lemma is concerned with the existence of the weights satisfying (3-6). Let U_i = U_i(θ̂).

Lemma 3.3.1. Under the same conditions as in Theorem 3.3.1, 0 is inside the convex hull of {U_i : i = 1, ..., N}, with probability approaching 1 as n → ∞.

Let Co{U_i : i = 1, ..., N} = { Σ_{i=1}^N α_i U_i : α_i ≥ 0, Σ_{i=1}^N α_i = 1 } be the convex hull of {U_i : i = 1, ..., N}. Since 0 ∈ Co{U_i : i = 1, ..., N}, there exists a unique set of weights {p_i : i = 1, ..., N} solving (3-6) such that Σ_{i=1}^N p_i = 1, p_i > 0 and Σ_{i=1}^N p_i U_i = 0. We consider in detail the case ρ = 0, with similar proofs holding for any member of the Cressie-Read family. Taking ρ = 0 in (3-7), the weights have the following simplified expression:

p_i = 1 / ( N(1 + λᵀ U_i) ), (3-10)

where the Lagrange multiplier λ satisfies

Σ_{i=1}^N U_i / (1 + λᵀ U_i) = 0. (3-11)
The next theorem shows the consistency of the moving block biased bootstrap distribution

of the mean of moment conditions.

Theorem 3.3.1. Let 𝒳 = {X_1, ..., X_n} be a realization of a strictly stationary time series and let θ_0 be the parameter of interest, assumed to satisfy (3-1). Suppose that θ̂ is a sequence of generalized M-estimators defined by (3-2), and let F_θ̂ be the weighted empirical distribution given by (3-6). Let 𝒳* = {X*_1, ..., X*_{n_1}} be a block biased bootstrap resample. Under Assumptions 3.3.1-3.3.4, as n → ∞,

n_1^{1/2} b̄*(θ̂) ⇝_p N(0, Σ_∞). (3-12)


Lemma 2.3.2 and Theorem 2.3.2 from Section 2.3.1 also hold in this case. Since the

proofs are identical, we have not included them in the appendix.

Lemma 3.3.2. Under the same conditions as in Theorem 3.3.1, the following uniform convergence for the biased bootstrap holds:

sup_{θ∈Θ} ‖ (1/n_1) Σ_{i=1}^{n_1} b(X*_i; θ) - E_F b_θ ‖ = o*_p(1), (3-13)

where E_F b_θ = E b(X_1; θ) = b(θ).

Theorem 3.3.2. Under the same conditions as in Theorem 3.3.1, and for every sequence of generalized M-estimators θ̂ and (biased) bootstrap estimators θ̂*, with b̄(θ̂) = n^{-1} Σ_{i=1}^n b(X_i; θ̂) = o_p(1) and b̄*(θ̂*) = n_1^{-1} Σ_{i=1}^{n_1} b(X*_i; θ̂*) = o*_p(1), we have θ̂ = θ_0 + o_p(1) and θ̂* = θ_0 + o*_p(1), and hence also that θ̂* = θ̂ + o*_p(1).

The next theorem shows that the MBBB distribution of the generalized M-estimator

is consistent.

Theorem 3.3.3. Let 𝒳 = {X_1, ..., X_n} be a realization of a strictly stationary time series and let θ_0 be the parameter of interest, assumed to satisfy (3-1). Let θ̂ be a sequence of generalized M-estimators defined by (3-2), and let F_θ̂ be the weighted empirical distribution given by (3-6). Let 𝒳* = {X*_1, ..., X*_{n_1}} be a block biased bootstrap resample and let θ̂* be the bootstrap version of θ̂ on 𝒳*. Under Assumptions 3.3.1-3.3.4, as n → ∞,

n_1^{1/2}(θ̂* - θ̂) ⇝_p N(0, D^{-1} Σ_∞ (D^{-1})ᵀ). (3-14)

3.4 Iterated Block Biased Bootstrap Recycling

For any block size l, first introduce a family ℱ = {F_θ}_{θ∈Θ} of weighted empirical distributions on the sample ℬ of blocks of length l, defined by (3-7) and associated with the generalized M-estimation model (3-1). Then, estimate the parameter θ_0 = θ(F) by θ̂, and, for the first level of the block biased bootstrap, resample the blocks according to F_θ̂ ∈ ℱ, obtaining the resample 𝒳*. Compute θ̂*, the biased bootstrap version of θ̂ corresponding to the resample 𝒳*. Then, at the second iteration, the block biased bootstrap re-resample is obtained from iid draws B**_i from the weighted empirical distribution F_θ̂* ∈ ℱ, obtaining the re-resample 𝒳** (also called the inner level bootstrap resample).

Instead of drawing the second level bootstrap re-resamples in order to estimate E_θ̂*(T**), we may use the importance sampling identity as in the iid case:

E_θ̂*(T**) = E_G( T* dP_θ̂*/dG ), (3-15)

where G is a given "design" distribution, P_θ̂* = F_θ̂*^{b_l} is the b_l-fold product measure of F_θ̂*, and dP_θ̂*/dG is the Radon-Nikodym derivative of P_θ̂* with respect to G. Of course, if the statistic itself is difficult to evaluate for each sample, we can choose as design distribution the one that generated the first level biased bootstrap resamples 𝒳*_b, b = 1, ..., B. Hence, we have the following recycling approximation of (3-15), corresponding to the resample 𝒳*_b:

E_θ̂*_b(T**) ≈ (1/B) Σ_{b'=1}^B T(𝒳*_{b'}) w̃_{bb'}, (3-16)

where w̃_{bb'} = Π_{i=1}^N ( p_i(θ̂*_b) / p_i(θ̂) )^{m_i^{(b')}} and m_i^{(b')} = #{ j : B*_{b'j} = B_i }.


To illustrate the use of biased bootstrap recycling, we consider a typical bootstrap estimation problem. Suppose that we want to estimate a characteristic ψ of the sampling distribution of θ̂ - θ_0, such as its bias, variance or αth quantile. Lahiri (2003, p. 2) calls θ_0 a level-1 parameter and defines the level-2 parameter ψ as a functional related to the sampling distribution of an estimator of a level-1 parameter. Generally, ψ is the solution of a functional equation

E_F f_ψ(F, F_θ̂) = 0. (3-17)









Examples of such functionals are given by (2-17)-(2-19). Denote by ψ̂(l) and ψ̂*(l) the biased bootstrap solutions of (3-17), which satisfy

E_θ̂[ f_{ψ̂(l)}(F_θ̂, F_θ̂*) ] = 0 and E_θ̂*[ f_{ψ̂*(l)}(F_θ̂*, F_θ̂**) ] = 0. (3-18)

Let 𝒳*_1, ..., 𝒳*_B be B biased bootstrap resamples and let ψ̂_B(l) be the Monte Carlo approximation of ψ̂(l), defined as the solution in ψ of the equation

(1/B) Σ_{b=1}^B f_ψ(F_θ̂, F_θ̂*_b) = 0. (3-19)

The recycling Monte Carlo approximation ψ̂*_B(l) of ψ̂*(l) (corresponding to the resample 𝒳*_b) is the solution in ψ of the equation

(1/B) Σ_{b'=1}^B f_ψ(F_θ̂*_b, F_θ̂*_{b'}) w̃_{bb'} = 0, (3-20)

where w̃_{bb'} is given by (3-16).

Remark 3.4.1. The advantage of the biased bootstrap recycling becomes more obvious

when the statistic is difficult to evaluate for each sample (as may be the case for the

generalized M-estimator) and we need to iterate the bootstrap. In this case, we do not

need to evaluate these estimators for the second level bootstrap re-resamples because we

use only the estimators already computed at the first level and the vectors of probabilities

corresponding to these estimates.

Remark 3.4.2. This biased bootstrap recycling methodology can be viewed as a competitor

of the jackknife-after-bootstrap developed by Efron (1992) in the iid case and by Lahiri

(2003) in the case of dependent data. The main idea of these methods is to use the first

level resamples when estimating higher level parameters corresponding to the iterated

bootstrap.

3.5 An Application to the Optimal Block Size Selection

We now describe a small simulation study to show the finite sample properties

of optimal block size selection using the recycling MBBB method described in the









previous section. We use the same time series model as Hall et al. (1995) and Lahiri

(2003, pp. 182-186) in which


Xi (c + ci+)/v, i 0,1,... (3-21)

where ci are iid, (X 1) distributed. Hence, E = 0 and Varc = 2. We consider the

level-1 parameter 0 = EX1 and its obvious estimator = X, with the sample size n = 125.

The level-2 parameters of interest are


01 = nVar(X) (3-22)

and

2 P (X < 0). (3-23)
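The simulated value ψ_1 = 3.984 reported below is close to the theoretical long-run variance of X̄_n, which can be verified directly from the moving-average structure of (3-21) (a short check, assuming X_i = (ε_i + ε_{i+1})/√2 with Var ε_i = 2):

```latex
\gamma_0 = \operatorname{Var}(X_1) = \tfrac12\bigl(\operatorname{Var}\varepsilon_1 + \operatorname{Var}\varepsilon_2\bigr) = 2, \qquad
\gamma_1 = \operatorname{Cov}(X_1, X_2) = \tfrac12\operatorname{Var}\varepsilon_2 = 1, \qquad
\gamma_h = 0 \ (h \ge 2),
```

so that n Var(X̄_n) → γ_0 + 2γ_1 = 4 as n → ∞; the simulated value at n = 125 is consistent with this limit.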

For each block size l, let ψ̂*(l) denote the block biased bootstrap estimator of ψ, based on the block size l. The optimal choice of the block size can be taken to be

l_0 = argmin_l MSE(ψ̂*(l)) = argmin_l E( ψ̂*(l) - ψ )². (3-24)

Using the same terminology as Lahiri (2003), l_0 is a level-3 parameter, since it relates to the sampling distribution of ψ̂*(l), which is an estimator of the level-2 parameter ψ. Since the underlying F is unknown, we can approximate the expectation in the last display by its biased bootstrap version,

l̂_0 = argmin_l MSE_θ̂(ψ̂**(l)) = argmin_l E_θ̂( ψ̂**(l) - ψ̂(l̃) )², (3-25)

where l̃ is a reasonable pilot block size. Hence, in order to estimate l_0, we need to iterate the bootstrap or to apply recycling.

We now describe in more detail the computations for the recycling procedure proposed in this section, for the particular case ψ_1 = n Var(X̄_n); the same ideas apply for distribution function estimation. We consider the forward Kullback-Leibler distance when defining the weights of the biased bootstrap, obtained for ρ = 0 in the family of








power divergences given by (1-27). For a given value of the block size l, the vector of weights given by (3-7) is computed as

p_i(X̄) = 1 / ( N(1 + λ U_i(X̄)) ), (3-26)

where U_i(X̄) = (1/l) Σ_{j=i}^{i+l-1} X_j - X̄ and λ satisfies

Σ_{i=1}^N U_i(X̄) / (1 + λ U_i(X̄)) = 0. (3-27)

Let F_X̄ be the weighted empirical distribution on the sample of blocks that assigns probability p_i(X̄) to the block B_i. Now, let 𝒳*_1, ..., 𝒳*_B be B moving-block biased-bootstrap resamples. For each b = 1, ..., B, denote by X̄*_b the sample mean of 𝒳*_b. Let

ψ̂_B(l) = (1/B) Σ_{b=1}^B n_1 ( X̄*_b - X̄ )²

be the Monte Carlo approximation of the bootstrap estimate ψ̂(l), where n_1 = l·b_l. Now, for each b, compute the vector of weights p(X̄*_b) = (p_i(X̄*_b) : i = 1, ..., N) using (3-26) and (3-27). First, a description of the computational details for the double biased bootstrap is presented, and then its recycling version.

For each b, consider C re-resamples {𝒳**_{bc} : c = 1, ..., C} from F_X̄*_b. For each c, compute X̄**_{bc}, the sample mean of 𝒳**_{bc}, so that the Monte Carlo approximation of ψ̂*_b(l) = n_1 Var_X̄*_b(X̄**) is

ψ̂*_{bC}(l) = (1/C) Σ_{c=1}^C n_1 ( X̄**_{bc} - X̄*_b )².

Finally, the Monte Carlo approximation MSE_C(ψ̂*(l)) of MSE_X̄(ψ̂**(l)) is

MSE_C(ψ̂*(l)) = (1/B) Σ_{b=1}^B ( ψ̂*_{bC}(l) - ψ̂_B(l̃) )²,

where l̃ is a pilot value for the block size (in the simulations below, l̃ = [125^{1/3}] = 5).

Although instead of ψ̂_B(l̃) we could have used any consistent estimator of ψ_1 (e.g., one based on a spectral density estimator evaluated at 0), a choice of a smoothing parameter (the bandwidth of the spectral density estimator) is still necessary due to the infinite dimensional nature of ψ_1. The MC approximation of the bootstrap estimate of the optimal block size l_0 is

l̂_0 = argmin_{l ∈ J_n} MSE_C(ψ̂*(l)),

where J_n is the set of potential block sizes. In the simulations below, we considered J_125 = {1, ..., 10}.
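The block-size search described above can be sketched as follows for ψ_1 = n Var(X̄_n) under model (3-21). To keep the sketch short it uses uniform block weights rather than the biased weights (3-26), and much smaller B and C than the study; all variable names are ours:

```python
import numpy as np

rng = np.random.default_rng(4)
n, pilot_l, B, C = 125, 5, 60, 40   # small B, C to keep the sketch fast

eps = rng.chisquare(1, size=n + 1) - 1.0        # model (3-21) innovations
x = (eps[:-1] + eps[1:]) / np.sqrt(2.0)

def mbb_means(data, l, reps, rng):
    """Sample means of `reps` uniform moving-block bootstrap resamples."""
    m = len(data)
    N, b_l = m - l + 1, m // l
    starts = rng.integers(0, N, size=(reps, b_l))
    blocks = np.array([data[s:s + l] for s in range(N)])  # (N, l)
    return blocks[starts].mean(axis=(1, 2))

# pilot estimate of psi_1 = n Var(xbar) at the pilot block size
psi_pilot = n * mbb_means(x, pilot_l, 400, rng).var()

mse = {}
for l in range(1, 11):
    outer = []
    for _ in range(B):
        # outer resample, then inner (double bootstrap) estimate of psi_1
        xb = np.concatenate(
            [x[s:s + l] for s in rng.integers(0, n - l + 1, size=n // l)])
        psi_bb = len(xb) * mbb_means(xb, l, C, rng).var()
        outer.append((psi_bb - psi_pilot) ** 2)
    mse[l] = float(np.mean(outer))
l_opt = min(mse, key=mse.get)      # MC analogue of the argmin over J_n
```

Replacing the uniform block selection by draws from p(X̄*_b), or the inner loop by the recycling weights below, yields the biased and recycling versions of this search.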
In recycling, instead of drawing C additional re-resamples for each b, we use the initial resamples {𝒳*_b : b = 1, ..., B} together with the corresponding vectors of probabilities (p(X̄*_b) : b = 1, ..., B), computed as before. Using the importance sampling identity, we have

ψ̂*_b(l) = n_1 E_X̄*_b( X̄** - X̄*_b )² = n_1 E_X̄[ ( X̄* - X̄*_b )² dP_X̄*_b / dP_X̄ ], (3-28)

where P_X̄ = F_X̄^{b_l} is the b_l-fold product measure of F_X̄. Using the resamples {𝒳*_b : b = 1, ..., B}, the Monte Carlo recycling approximation ψ̂*_{rB}(l) of ψ̂*_b(l) is

ψ̂*_{rB}(l) = (1/B) Σ_{b'=1}^B n_1 ( X̄*_{b'} - X̄*_b )² w_{bb'}, where w_{bb'} = Π_{i=1}^N ( p_i(X̄*_b) / p_i(X̄) )^{m_i^{(b')}}, (3-29)

and

m_i^{(b')} = #{ j : B*_{b'j} = B_i }. (3-30)

Hence, the recycling approximation MSE_r(ψ̂*(l)) of MSE_X̄(ψ̂**(l)) is

MSE_r(ψ̂*(l)) = (1/B) Σ_{b=1}^B ( ψ̂*_{rB}(l) - ψ̂_B(l̃) )².

Now let w̄_{bb'} = w_{bb'} / Σ_{b''=1}^B w_{bb''} be the normalized weights. Then, the (adjusted) Monte Carlo approximation of ψ̂*_b(l) given by (3-28) is

ψ̂*_{rB}(l) = Σ_{b'=1}^B n_1 ( X̄*_{b'} - X̄*_b )² w̄_{bb'}. (3-31)









By running 20000 simulations from the model (3-21), the true values of the level-2 parameters were found to be ψ_1 = 3.984 and ψ_2 = 0.516. To find the true optimal block sizes using both the uniform and the biased bootstrap, we generated moving block bootstrap estimators of ψ_1 and ψ_2 for block sizes l = 1, ..., 10. Tables 3-1 and 3-2 give the mean, bias, standard deviation (SD) and root mean square error (RMSE) of the MBB and MBBB estimators of ψ_1 and ψ_2, based on S = 1000 simulation runs and B = 1000 bootstrap resamples. From these tables we see that the optimal block size is l = 3 for ψ_1 and l = 2 for ψ_2.

Figure 3-1 shows the bootstrap estimates of the RMSEs of the different bootstrap procedures for the parameters ψ_1 and ψ_2, for different block sizes, for one realization of the process (3-21). For each bootstrap procedure, the optimal block length is the minimizer of the bootstrap estimates of the RMSEs over all block lengths l = 1, ..., 10. We can see that, for this particular sample, the bootstrap procedures choose similar "optimal" block sizes. In these simulations, we used B = 1000 outer bootstrap resamples (for the biased bootstrap recycling (BBR) and the adjusted biased bootstrap recycling (ABBR)) and an additional 500 inner bootstrap re-resamples for the uniform double bootstrap (UB) and the double biased bootstrap (BB).












Table 3-1. Computation of the optimal block size for uniform block bootstrap estimation of the level-2 parameters ψ_1 and ψ_2 given by (3-22) and (3-23). The number of simulations is S=1000, and the number of bootstrap resamples is B=1000. An asterisk (*) shows the block size for which the minimum RMSE has been attained.
Variance Estimation Distribution Function Estimation
l Mean Bias SD RMSE Mean Bias SD RMSE
1 1.972 -2.012 0.679 2.124 0.5094 -0.0068 0.0162 0.0175
2 2.936 -1.048 1.032 1.470 0.5139 -0.0023 0.0164 0.0166*
3 3.248 -0.737 1.200 1.408* 0.5141 -0.0021 0.0165 0.0167
4 3.386 -0.598 1.329 1.457 0.5133 -0.0029 0.0171 0.0173
5 3.448 -0.536 1.394 1.493 0.5135 -0.0027 0.0167 0.0169
6 3.497 -0.488 1.474 1.552 0.5128 -0.0034 0.0169 0.0172
7 3.516 -0.468 1.544 1.612 0.5122 -0.0040 0.0180 0.0183
8 3.528 -0.457 1.617 1.680 0.5112 -0.0050 0.0176 0.0182
9 3.545 -0.439 1.670 1.726 0.5114 -0.0048 0.0184 0.0190
10 3.526 -0.458 1.710 1.770 0.5121 -0.0041 0.0186 0.0189

Table 3-2. Computation of the optimal block size for moving block biased bootstrap estimation of the level-2 parameters ψ_1 and ψ_2 given by (3-22) and (3-23). The number of simulations is S=1000, and the number of bootstrap resamples is B=1000. An asterisk (*) shows the block size for which the minimum RMSE has been attained.
Variance Estimation Distribution Function Estimation
l Mean Bias SD RMSE Mean Bias SD RMSE
1 1.977 -2.007 0.688 2.122 0.5092 -0.0070 0.0161 0.0175
2 2.939 -1.045 1.040 1.474 0.5135 -0.0027 0.0164 0.0166*
3 3.245 -0.740 1.201 1.411* 0.5133 -0.0029 0.0167 0.0169
4 3.383 -0.601 1.318 1.448 0.5131 -0.0031 0.0174 0.0177
5 3.451 -0.534 1.405 1.502 0.5124 -0.0038 0.0178 0.0180
6 3.484 -0.500 1.479 1.561 0.5119 -0.0043 0.0171 0.0176
7 3.504 -0.480 1.533 1.606 0.5124 -0.0038 0.0174 0.0178
8 3.524 -0.460 1.618 1.681 0.5123 -0.0039 0.0174 0.0178
9 3.523 -0.462 1.686 1.747 0.5111 -0.0051 0.0187 0.0194
10 3.514 -0.470 1.725 1.787 0.5108 -0.0054 0.0182 0.0189





































Figure 3-1. Bootstrap estimates of the RMSEs corresponding to different block bootstrap schemes. We used B = 1000 outer bootstrap resamples (for the biased bootstrap recycling (BBR) and the adjusted biased bootstrap recycling (ABBR)) and, for the uniform double bootstrap (UB) and the double biased bootstrap (BB), an additional 500 inner bootstrap resamples for each outer bootstrap resample.









CHAPTER 4
A HYBRID BIASED BOOTSTRAP

4.1 On the Boundary of the Parametric Space

We consider now a problem where the ordinary bootstrap fails. Unfortunately the

biased bootstrap, applied in the natural way, also fails. However, we are able to devise a

somewhat contrived biased bootstrap approach that is similar to the recentering solution

proposed by Datta (1995), but can be more generally applied.

Andrews (2000) gives an example where the uniform bootstrap fails. He remarks

"... we provide such a counterexample ... [which] is quite simple but it generalizes to a

wide variety of estimation problems that are of importance in econometric applications".

Let 𝒳 = {X_1, ..., X_n} be an iid sample from the N(μ, 1) distribution. Suppose that the parameter space for μ is the nonnegative reals. Then the MLE of μ is μ̂ = max{X̄_n, 0}, where X̄_n = n^{-1} Σ_{i=1}^n X_i. Let T_n = n^{1/2}(μ̂ - μ) and

T = Z, if μ > 0,
T = max{Z, 0}, if μ = 0, (4-1)

where Z ~ N(0, 1). Obviously, T_n ⇒ T as n → ∞.

Let 𝒳* = {X*_1, ..., X*_n} be a bootstrap sample from F_n, where F_n is the empirical distribution function. Andrews (2000) shows that, in the case μ = 0, the bootstrap is inconsistent. We devise here a biased bootstrap procedure that gives consistent bootstrap estimates of the distribution of the MLE. Consider a sequence of positive reals δ_n = n^{-a}, with a ∈ (0, 1/2), and define the resampling distribution as

G_n = F̂_0, if X̄_n ≤ δ_n,
G_n = F_n, if X̄_n > δ_n, (4-2)

where F̂_0 = argmin_p { D_ρ(p) : Σ_{i=1}^n p_i X_i = 0, Σ_{i=1}^n p_i = 1, p_i ≥ 0 } and D_ρ is the Cressie-Read distance.









Denote by 𝒳* = {X*_1, ..., X*_n} a bootstrap sample from G_n, and by μ̂* the bootstrap version of μ̂, computed on 𝒳*. The biased bootstrap version of T_n is

T*_n = n^{1/2} μ̂*, if X̄_n ≤ δ_n,
T*_n = n^{1/2}(μ̂* - μ̂), if X̄_n > δ_n. (4-3)


In order to prove that T*_n ⇝_p T, we show that

sup_x | P(T*_n ≤ x | G_n) - P(T ≤ x) | →_p 0. (4-4)

The following relations are true:

sup_x | P(T*_n ≤ x | G_n) - P(T ≤ x) |
  ≤ sup_x | P(n^{1/2} μ̂* ≤ x | F̂_0) - P(T ≤ x) | I{X̄_n ≤ δ_n}
  + sup_x | P(n^{1/2}(μ̂* - μ̂) ≤ x | F_n) - P(T ≤ x) | I{X̄_n > δ_n}, (4-5)

where I{·} denotes the indicator function of a set. If μ = 0, it follows from the Law of the Iterated Logarithm that

lim sup_n I{X̄_n > δ_n} ≤ I{ lim sup_n X̄_n / δ_n ≥ 1 }
  = I{ lim sup_n [ n^{1/2} X̄_n / (2 log log n)^{1/2} ] · [ (2 log log n)^{1/2} / (n^{1/2} δ_n) ] ≥ 1 } = 0 a.s., (4-6)

since the first factor in the product is bounded almost surely by the Law of the Iterated Logarithm, while the second factor converges to zero because n^{1/2} δ_n = n^{1/2-a} grows faster than (2 log log n)^{1/2}. Since, when μ = 0, sup_x | P(n^{1/2} μ̂* ≤ x | F̂_0) - P(T ≤ x) | = o_p(1), it follows that (4-4) holds. When μ > 0, since X̄_n → μ > 0 a.s. and δ_n → 0, it follows that lim sup_n I{X̄_n ≤ δ_n} = 0 a.s. Since sup_x | P(n^{1/2}(μ̂* - μ̂) ≤ x | F_n) - P(T ≤ x) | = o_p(1), it follows that (4-4) holds in this case as well.









4.2 Certain Asymptotically Nonnormal Statistics

Datta (1995) cites Babu (1984) to argue that the classical bootstrap approximation "breaks down ... even for nice statistics." Babu (1984) remarked that

the bootstrap approximation of a smooth function of multivariate sample mean is not

consistent for "certain" values of the mean vector. Following their example, we propose

a biased bootstrap version that can correct the inconsistency of the classical uniform

bootstrap.

Let 𝒳 = {X_1, ..., X_n} be an iid sample from the N(μ, 1) distribution and suppose that we are interested in estimating μ², for which the MLE is X̄_n². Consider

T_n = n^{1/2}(X̄_n² - μ²), if μ ≠ 0,
T_n = n X̄_n², if μ = 0. (4-7)

Obviously, T_n ⇒ T, where

T ~ N(0, 4μ²), if μ ≠ 0,
T ~ χ²_1, if μ = 0. (4-8)

Datta (1995) shows that the classical uniform bootstrap is not valid in this case. To rectify this, as before, consider a sequence of positive reals δ_n = n^{-a}, with a ∈ (0, 1/2), and define the resampling distribution G_n to be

G_n = F̂_0, if |X̄_n| ≤ δ_n,
G_n = F_n, if |X̄_n| > δ_n, (4-9)

where F̂_0 = argmin_p { D_ρ(p) : Σ_{i=1}^n p_i X_i = 0, Σ_{i=1}^n p_i = 1, p_i ≥ 0 } and D_ρ is the

Cressie-Read distance. Let 𝒳* = {X*_1, ..., X*_n} be a biased bootstrap sample, and let X̄*_n be the bootstrap sample mean. The bootstrap version of (4-7) on the bootstrap sample 𝒳* is

T*_n = n X̄*_n², if |X̄_n| ≤ δ_n,
T*_n = n^{1/2}( X̄*_n² - X̄_n² ), if |X̄_n| > δ_n. (4-10)

Using the same arguments as in the previous example, it can be shown that T*_n ⇝_p T.
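A compact simulation of the hybrid rule (4-9)-(4-10) under μ = 0 can be sketched as follows (the function name, the bisection search and the centering of the simulated sample, which forces the mean-zero branch to be exercised, are our own choices):

```python
import numpy as np

def hybrid_boot_quantiles(x, B, a, q_levels, rng):
    """Hybrid biased bootstrap quantiles for T_n of (4-7): resample from
    the reweighted distribution F-hat_0 (rho = 0, empirical-likelihood
    weights with sum p_i x_i = 0) when |xbar| <= n^{-a}, from F_n otherwise."""
    n = len(x)
    xbar = x.mean()
    delta = n ** (-a)
    if abs(xbar) <= delta:
        # one-dimensional bisection for the EL Lagrange multiplier
        lo, hi = -1.0 / x.max() + 1e-9, -1.0 / x.min() - 1e-9
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            if np.sum(x / (1.0 + mid * x)) > 0:
                lo = mid
            else:
                hi = mid
        p = 1.0 / (n * (1.0 + 0.5 * (lo + hi) * x))
        stars = rng.choice(x, size=(B, n), replace=True, p=p / p.sum())
        T = n * stars.mean(axis=1) ** 2                    # n (xbar*)^2
    else:
        stars = rng.choice(x, size=(B, n), replace=True)
        T = np.sqrt(n) * (stars.mean(axis=1) ** 2 - xbar ** 2)
    return np.quantile(T, q_levels)

rng = np.random.default_rng(5)
x = rng.normal(size=200)
x = x - x.mean()          # force mu-hat = 0, so the F-hat_0 branch runs
q = hybrid_boot_quantiles(x, B=2000, a=0.4, q_levels=[0.5, 0.9], rng=rng)
# q should be roughly near the chi-square(1) quantiles, as (4-8) predicts
```

With |X̄_n| above the threshold, the same function instead reproduces the ordinary bootstrap approximation of the normal limit in (4-8).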

Table 4-1 gives the results of a simulation study that shows the performance of this "hybrid" biased bootstrap. Under μ = 0, T_n = n X̄_n² has a chi-square distribution with one degree of freedom. For these simulations, we considered sample sizes n = 50, 100, 200, 500. For each sample size, we computed the quantiles of the ordinary (uniform) bootstrap estimate ℒ( n(X̄*_n² - X̄_n²) | F_n ) as well as of the "hybrid" biased bootstrap ℒ( T*_n | G_n ). For this simulation study we used B = 1000 bootstrap resamples, S = 1000 simulated samples and δ_n = n^{-.4}. We have also included the true quantiles of T_n, given by the quantiles of the chi-square distribution with one degree of freedom, for comparison. It is obvious from this simulation study that the "hybrid" biased bootstrap outperforms the ordinary bootstrap, as expected from the theoretical result. The uniform bootstrap performs erratically in the tails, and the approximation does not improve with increasing sample size.

Table 4-2 gives the bootstrap estimates of the same quantities as above, under μ = 0.2. In this case, T_n = n^{1/2}(X̄_n² - .2²) has asymptotically the normal distribution N(0, .4²), for the sample sizes n = 50, 100, 200, 500. As before, for each sample size, we computed the quantiles of the ordinary (uniform) bootstrap estimate ℒ( n^{1/2}(X̄*_n² - X̄_n²) | F_n ) as well as of the "hybrid" biased bootstrap ℒ( T*_n | G_n ). We have also included the true quantiles of T_n for comparison. It is obvious from this simulation study that the "hybrid" biased bootstrap approaches the ordinary bootstrap, as expected from the theoretical result. As the sample size increases, the two bootstrap procedures give almost indistinguishable results.



















Table 4-1. T-.. i of the
given


simulation runs and 6,


S: buetion of TI under p ., and their bootstrap

by the ordinary (uniform) bo i: (UB) and the


,(HBB), using B
n- 4


1000 bootstrap resamples, S=1000


10 .15 .: .85 .95


Table 4-2. True quantiles of the asymptotic distribution of T_n under \rho = 0.2, and their
bootstrap approximations given by the ordinary (uniform) bootstrap (UB)
and the "hybrid" biased bootstrap (HBB), using B = 1000 bootstrap resamples,
S = 1000 simulation runs and \delta_n = n^{-.4}.











APPENDIX A
PROOFS OF CONSISTENCY RESULTS FROM CHAPTER I

In order to prove Propositions 1.3.1 and 1.3.2, we first present two results shared by
concave criterion functions; for more details and applications, see Giurcanu and Trindade
(2006).

Proposition A.0.1. Let f : R^q \to R be a concave function and suppose that there exist
t_0 \in R^q and \epsilon > 0 such that f(t_0) > \sup_{t \in S(t_0, \epsilon)} f(t). If t^* = \argmax_{t \in R^q} f(t), then
t^* \in B(t_0, \epsilon).

Proof. Suppose on the contrary that t^* \notin B(t_0, \epsilon). Then there exists 0 < \lambda < 1 such that
t_1 = \lambda t^* + (1 - \lambda) t_0 and t_1 \in S(t_0, \epsilon). Then the following relations are true:

f(t_1) = f( \lambda t^* + (1 - \lambda) t_0 )
\ge \lambda f(t^*) + (1 - \lambda) f(t_0)
\ge \lambda f(t_1) + (1 - \lambda) f(t_0)
> f(t_1),

a contradiction. □

Proposition A.0.2. Let m_n(\theta) and m(\theta) be random and deterministic functions,
respectively. If \sup_{\theta \in \Theta} | m_n(\theta) - m(\theta) | \to_P 0, then \sup_{\theta \in \Theta} m_n(\theta) \to_P \sup_{\theta \in \Theta} m(\theta).

Proof. First, consider the case when \sup_\theta m(\theta) = \infty. Since

m_n(\theta) + | m_n(\theta) - m(\theta) | \ge m(\theta)

with probability one, it follows that

\sup_\theta m_n(\theta) + \sup_\theta | m_n(\theta) - m(\theta) | \ge \sup_\theta m(\theta)

with probability one. Since \sup_\theta | m_n(\theta) - m(\theta) | \to_P 0 and \sup_\theta m(\theta) = \infty, it follows that
\sup_\theta m_n(\theta) \to_P \infty.









If \sup_\theta m(\theta) < \infty, using the following relations

\sup_\theta m_n(\theta) - \sup_\theta m(\theta) = \sup_\theta \big( m_n(\theta) - \sup_{\theta_1} m(\theta_1) \big)
\le \sup_\theta \big( m_n(\theta) - m(\theta) \big)
\le \sup_\theta | m_n(\theta) - m(\theta) |,

and the symmetric relations with the roles of m_n and m interchanged, we obtain that, with probability one,

\big| \sup_\theta m_n(\theta) - \sup_\theta m(\theta) \big| \le \sup_\theta | m_n(\theta) - m(\theta) |,

and from here the result follows. □
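Proposition A.0.2 is easy to check numerically. The following sketch uses an illustrative concave limit m and adds uniform noise of size O(n^{-1/2}) on a grid; the gap between the suprema is bounded by the uniform distance, as in the proof.

```python
# Numerical illustration of Proposition A.0.2: if m_n -> m uniformly,
# then sup m_n -> sup m.  The grid, the concave limit m, and the noise
# level are illustrative choices, not quantities from the dissertation.
import numpy as np

rng = np.random.default_rng(1)
theta = np.linspace(-2.0, 2.0, 401)
m = -(theta - 0.5) ** 2              # deterministic concave limit, sup m = 0

for n in (100, 10_000, 1_000_000):
    # uniform error of size O(n^{-1/2}) over the grid
    m_n = m + rng.normal(scale=n ** -0.5, size=theta.size)
    print(n, abs(m_n.max() - m.max()))
```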

Proof of Proposition 1.3.1. Let \epsilon > 0 be small enough that B(\theta_0, \epsilon) \subset C. By
Proposition A.0.1, ignoring sets of probability zero,

{ \sup_{\theta \in S(\theta_0, \epsilon)} m_n(\theta) < m_n(\theta_0) } \subset { \hat\theta_n \in B(\theta_0, \epsilon) }.   (A-1)

Since pointwise convergence in probability for concave functions on an open set implies
uniform convergence in probability on compact subsets of that open set (Pollard, 1991,
sec. 6), using Proposition A.0.2 we obtain

\sup_{\theta \in S(\theta_0, \epsilon)} m_n(\theta) \to_P \sup_{\theta \in S(\theta_0, \epsilon)} m(\theta).   (A-2)

Since \theta_0 is globally identifiable, \sup_{\theta \in S(\theta_0, \epsilon)} m(\theta) < m(\theta_0), and using (A-2) it follows that

P \big( \sup_{\theta \in S(\theta_0, \epsilon)} m_n(\theta) < m_n(\theta_0) \big) \to 1, as n \to \infty.

From (A-1), we finally obtain P( \hat\theta_n \in B(\theta_0, \epsilon) ) \to 1 as n \to \infty, which establishes
the result. (When measurability is an issue, all results hold with respect to the outer
probability P*.) □








Proof of Proposition 1.3.2. As before, since m_n(\theta) is concave in \theta, uniform convergence in
probability on any compact neighborhood D of \theta_0 included in C holds, i.e.,

\sup_{\theta \in D} | m_n(\theta) - m(\theta) | \to_P 0.   (A-3)

The consistency of \hat\theta_{n,1} and (A-3) imply that for every \theta_2 with \theta = (\theta_{0,1}, \theta_2) \in C,

m_n( \hat\theta_{n,1}, \theta_2 ) \to_P m( \theta_{0,1}, \theta_2 ).   (A-4)

To see this, note that for large enough n, and for any compact neighborhood D of \theta_0
containing (\theta_{0,1}, \theta_2) and contained in C,

P( | m_n( \hat\theta_{n,1}, \theta_2 ) - m( \theta_{0,1}, \theta_2 ) | > \epsilon )
\le P( | m_n( \hat\theta_{n,1}, \theta_2 ) - m( \hat\theta_{n,1}, \theta_2 ) | > \epsilon/2 ) + P( | m( \hat\theta_{n,1}, \theta_2 ) - m( \theta_{0,1}, \theta_2 ) | > \epsilon/2 )
\le P( \sup_{\theta \in D} | m_n(\theta) - m(\theta) | > \epsilon/2 ) + P( | m( \hat\theta_{n,1}, \theta_2 ) - m( \theta_{0,1}, \theta_2 ) | > \epsilon/2 )
\to 0 as n \to \infty.   (A-5)

We used the fact that m(\cdot, \theta_2) is continuous at \theta_{0,1}, so that the second term in the third
line of the last display converges to 0. We have also used that P( (\hat\theta_{n,1}, \theta_2) \notin D ) \to 0. Since
k_n(\theta_2) := m_n( \hat\theta_{n,1}, \theta_2 ) satisfies the hypotheses of Proposition 1.3.1, we obtain

\argmax_{\theta_2 : \theta \in \Theta} k_n(\theta_2) \to_P \argmax_{\theta_2 : \theta \in \Theta} k(\theta_2),

where k(\cdot) = m( \theta_{0,1}, \cdot ) is the in-probability limit of k_n. Noting that \hat\theta_{n,2} = \argmax_{\theta_2} k_n(\theta_2)
and \theta_{0,2} = \argmax_{\theta_2} k(\theta_2) gives the required result. □









APPENDIX B
PROOFS OF RESULTS FROM CHAPTER II

B.1 Least Favorable Families

Proof of Theorem 2.2.1. We consider only the case when p \ne 1, a similar proof holding
also for p = 1. From (2-6),

n \nabla p_i |_{\hat\theta} = \{ 1 + (p-1) [ p\delta + \lambda^T b(X_i, \theta) ] \}^{(2-p)/(p-1)} \big( p \nabla\delta + (\nabla\lambda)^T b(X_i, \theta) + (\nabla b(X_i, \theta))^T \lambda \big) \big|_{\hat\theta}.   (B-1)

From (2-1a), (2-2), and (2-5) it follows that \delta|_{\hat\theta} = 0 and \lambda|_{\hat\theta} = 0, so that, from (B-1),

n \nabla p_i |_{\hat\theta} = p \nabla\delta|_{\hat\theta} + (\nabla\lambda)^T|_{\hat\theta} b(X_i, \hat\theta).   (B-2)

Using (2-1b), it follows that \sum_{i=1}^n \nabla p_i|_{\hat\theta} = 0, so by (2-1a) we obtain

p \nabla\delta|_{\hat\theta} = - \frac{1}{n} (\nabla\lambda)^T|_{\hat\theta} \sum_{i=1}^n b(X_i, \hat\theta) = 0,

and from (B-2) it follows that

n \nabla p_i|_{\hat\theta} = (\nabla\lambda)^T|_{\hat\theta} b(X_i, \hat\theta).   (B-3)

Multiplying both sides of (B-3) by (\nabla p_i)^T|_{\hat\theta} and summing over i yields

n \sum_{i=1}^n \nabla p_i|_{\hat\theta} (\nabla p_i)^T|_{\hat\theta} = (\nabla\lambda)^T|_{\hat\theta} \sum_{i=1}^n b(X_i, \hat\theta) (\nabla p_i)^T|_{\hat\theta}.   (B-4)

Differentiating the first restriction of (2-1b) with respect to \theta at \hat\theta yields

\sum_{i=1}^n b(X_i, \hat\theta) (\nabla p_i)^T|_{\hat\theta} = - \frac{1}{n} \sum_{i=1}^n \nabla b(X_i, \hat\theta).   (B-5)

From (B-4) and (B-5),

n \sum_{i=1}^n \nabla p_i|_{\hat\theta} (\nabla p_i)^T|_{\hat\theta} = - \frac{1}{n} (\nabla\lambda)^T|_{\hat\theta} \sum_{i=1}^n \nabla b(X_i, \hat\theta).   (B-6)

Multiplying both sides of (B-3) by b(X_i, \hat\theta)^T and summing over i yields

n \sum_{i=1}^n \nabla p_i|_{\hat\theta} b(X_i, \hat\theta)^T = (\nabla\lambda)^T|_{\hat\theta} \sum_{i=1}^n b(X_i, \hat\theta) b(X_i, \hat\theta)^T.   (B-7)

From (B-5) and (B-7) it follows that

(\nabla\lambda)^T|_{\hat\theta} = - \Big( \sum_{i=1}^n \nabla b(X_i, \hat\theta) \Big)^T \Big( \sum_{i=1}^n b(X_i, \hat\theta) b(X_i, \hat\theta)^T \Big)^{-1}.   (B-8)

Moreover, from (B-6) and (B-8) we have

n \sum_{i=1}^n \nabla p_i|_{\hat\theta} (\nabla p_i)^T|_{\hat\theta} = \Big( \frac{1}{n} \sum_{i=1}^n \nabla b(X_i, \hat\theta) \Big)^T \Big( \frac{1}{n} \sum_{i=1}^n b(X_i, \hat\theta) b(X_i, \hat\theta)^T \Big)^{-1} \Big( \frac{1}{n} \sum_{i=1}^n \nabla b(X_i, \hat\theta) \Big),   (B-9)

whose inverse is the usual sandwich estimator of the asymptotic variance of the Z-estimator \hat\theta
given in (1-5).

Now, consider \{ F_\theta \}_{\theta \in \Theta} as a parametric family of distributions, indexed by \theta.
The Fisher information corresponding to this parametric model is given by

I(\theta) = E_{F_\theta} \big[ \nabla(\log p_i) ( \nabla(\log p_i) )^T \big]
= \sum_{i=1}^n p_i [ \nabla(\log p_i) ] [ \nabla(\log p_i) ]^T
= \sum_{i=1}^n \frac{1}{p_i} \nabla p_i (\nabla p_i)^T.   (B-10)

If we evaluate the Fisher information at \hat\theta, where p_i|_{\hat\theta} = 1/n, from (B-10) we obtain

I(\hat\theta) = n \sum_{i=1}^n \nabla p_i|_{\hat\theta} (\nabla p_i)^T|_{\hat\theta}.   (B-11)

From (B-9) and (B-11) it follows that

I(\hat\theta) = \Big( \frac{1}{n} \sum_{i=1}^n \nabla b(X_i, \hat\theta) \Big)^T \Big( \frac{1}{n} \sum_{i=1}^n b(X_i, \hat\theta) b(X_i, \hat\theta)^T \Big)^{-1} \Big( \frac{1}{n} \sum_{i=1}^n \nabla b(X_i, \hat\theta) \Big).   (B-12)

The last display shows that the Fisher information matrix I(\theta), evaluated at the
Z-estimator \hat\theta, equals the inverse of the sandwich covariance matrix. □

B.2 Consistency of the Biased Bootstrap for GMM Estimators

First we prove that the weights given by (2-6) exist with probability approaching one.
This result extends a result of Owen (2001).

Proof of Lemma 2.3.1. Suppose to the contrary that 0 is not inside the convex hull of
{ b(X_i, \hat\theta_n) : i = 1, ..., n } with probability approaching one. Then there exist an \epsilon > 0 and
a sequence (n_k)_{k \ge 1} such that for all k \ge 1,

P \big( 0 \notin Co\{ b(X_j, \hat\theta_{n_k}) : j = 1, ..., n_k \} \big) > \epsilon,

where Co\{ b(X_j, \hat\theta_{n_k}) : j = 1, ..., n_k \} denotes the convex hull generated by { b(X_j, \hat\theta_{n_k}) :
j = 1, ..., n_k }. Let

A_{n_k} = \{ \omega : 0 \notin Co\{ b(X_j, \hat\theta_{n_k})(\omega) : j = 1, ..., n_k \} \}.

Then for all \omega \in A_{n_k}, there exists \eta_\omega \in R^q such that \|\eta_\omega\| = 1 and \eta_\omega^T b(X_j, \hat\theta_{n_k})(\omega) >
0 for all j = 1, ..., n_k. From this, it follows that for all \omega \in A_{n_k},

P_{n_k}( \eta_\omega^T x > 0 ) = \frac{1}{n_k} \sum_{j=1}^{n_k} I\{ \eta_\omega^T b(X_j, \hat\theta_{n_k})(\omega) > 0 \} = 1,   (B-13)

where P_{n_k} is the empirical distribution of { b(X_j, \hat\theta_{n_k}) : j = 1, ..., n_k }. Consequently, for all
\omega \in A_{n_k},

\sup_{\|\eta\|=1} P_{n_k}( \eta^T x > 0 ) = 1.   (B-14)

Using a Glivenko-Cantelli result given by Theorem 19.4 of van der Vaart (1998,
p. 270), the following convergence holds with probability 1:

\lim_n \sup_{\|\eta\|=1} | P_n( \eta^T x > 0 ) - P( \eta^T Z > 0 ) | = 0,   (B-15)

where Z ~ N(0, V). From (B-15), it follows that

\sup_{\|\eta\|=1} P_n( \eta^T x > 0 ) \to \sup_{\|\eta\|=1} P( \eta^T Z > 0 ),   (B-16)

almost surely as n \to \infty. We now take B = \limsup_k A_{n_k} = \cap_{k \ge 1} \cup_{m \ge k} A_{n_m}. Since for all
m \ge 1, P( A_{n_m} ) > \epsilon, it follows that P(B) \ge \epsilon. For almost all \omega \in B, from (B-14) and
(B-16), it follows that

\limsup_n \sup_{\|\eta\|=1} P_n( \eta^T x > 0 ) = 1 = \sup_{\|\eta\|=1} P( \eta^T Z > 0 ).   (B-17)

Since V is positive definite (Assumption 2.3.3), we know that \eta^T V \eta > 0 for all \eta such
that \|\eta\| = 1. Thus

P( \eta^T Z > 0 ) = 1/2 for all \|\eta\| = 1,

hence \sup_{\|\eta\|=1} P( \eta^T Z > 0 ) = 1/2, contradicting (B-17). □

The next lemma gives the order of the Lagrange multiplier \lambda, and generalizes
Theorem 3.2 of Owen (2001, pp. 219-222).

Lemma B.2.1. Under Assumptions 2.3.1-2.3.3, \lambda = O_P(n^{-1/2}).

Proof. Using (2-6) for the particular case when p = 0, it follows that the weights are given
by p_i(\hat\theta) = \big( n ( 1 + \lambda^T b(X_i, \hat\theta) ) \big)^{-1}, where \lambda satisfies \sum_{i=1}^n p_i(\hat\theta) b(X_i, \hat\theta) = 0. Hence

0 = \frac{1}{n} \sum_{i=1}^n \frac{b(X_i, \hat\theta)}{1 + \lambda^T b(X_i, \hat\theta)} = \frac{1}{n} \sum_{i=1}^n b(X_i, \hat\theta) - \frac{1}{n} \sum_{i=1}^n \frac{\lambda^T b(X_i, \hat\theta)}{1 + \lambda^T b(X_i, \hat\theta)} b(X_i, \hat\theta).   (B-18)

Hence,

\frac{1}{n} \sum_{i=1}^n b(X_i, \hat\theta) = \frac{1}{n} \sum_{i=1}^n \frac{b(X_i, \hat\theta) b(X_i, \hat\theta)^T \lambda}{1 + \lambda^T b(X_i, \hat\theta)}.   (B-19)

Let \tilde S = \frac{1}{n} \sum_{i=1}^n \frac{b(X_i, \hat\theta) b(X_i, \hat\theta)^T}{1 + \lambda^T b(X_i, \hat\theta)} and S = \frac{1}{n} \sum_{i=1}^n b(X_i, \hat\theta) b(X_i, \hat\theta)^T, so that from (B-19) it follows
that \tilde S \lambda = \frac{1}{n} \sum_{i=1}^n b(X_i, \hat\theta). Since p_i > 0, we obtain 1 + \lambda^T b(X_i, \hat\theta) > 0, hence

\frac{\lambda^T}{\|\lambda\|} S \frac{\lambda}{\|\lambda\|} \cdot \frac{\|\lambda\|}{1 + \|\lambda\| W_n} \le \frac{\lambda^T}{\|\lambda\|} \tilde S \lambda = \frac{\lambda^T}{\|\lambda\|} \cdot \frac{1}{n} \sum_{i=1}^n b(X_i, \hat\theta) \le \Big\| \frac{1}{n} \sum_{i=1}^n b(X_i, \hat\theta) \Big\|,

where

W_n = \max_{i=1,...,n} \| b(X_i, \hat\theta) \|.   (B-20)

From the last display, we obtain

\|\lambda\| \Big( \frac{\lambda^T}{\|\lambda\|} S \frac{\lambda}{\|\lambda\|} - W_n \Big\| \frac{1}{n} \sum_{i=1}^n b(X_i, \hat\theta) \Big\| \Big) \le \Big\| \frac{1}{n} \sum_{i=1}^n b(X_i, \hat\theta) \Big\|.   (B-21)

Since

S = \frac{1}{n} \sum_{i=1}^n b(X_i, \hat\theta) b(X_i, \hat\theta)^T = V + o_P(1), \qquad \Big\| \frac{1}{n} \sum_{i=1}^n b(X_i, \hat\theta) \Big\| = O_P(n^{-1/2}),

and, by Lemma B.2.2, W_n = o_P(n^{1/2}), from (B-21) it follows that \lambda = O_P(n^{-1/2}). □
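For a scalar moment, the p = 0 weights appearing in Lemma B.2.1 can be computed by solving the one-dimensional equation for \lambda numerically. The sketch below is illustrative: the moment values b and the bisection solver el_lambda are our own simplifications for one dimension, not code or quantities from the dissertation.

```python
# Illustrative computation of empirical-likelihood (p = 0) weights
# p_i = 1 / (n (1 + lam * b_i)), with lam solving sum_i b_i / (1 + lam * b_i) = 0.
import numpy as np

def el_lambda(b):
    """Solve sum_i b_i / (1 + lam * b_i) = 0 by bisection over the range
    where every weight stays positive (requires min(b) < 0 < max(b))."""
    lo = -1.0 / b.max() + 1e-10      # keeps 1 + lam * b_i > 0 for all i
    hi = -1.0 / b.min() - 1e-10
    f = lambda lam: np.sum(b / (1.0 + lam * b))   # strictly decreasing in lam
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

rng = np.random.default_rng(2)
for n in (100, 400, 1600):
    x = rng.normal(loc=0.3, size=n)
    b = x - x.mean() - 0.05          # hypothetical scalar moment b(X_i, theta-hat)
    lam = el_lambda(b)
    p = 1.0 / (n * (1.0 + lam * b))  # p = 0 (empirical-likelihood) weights
    print(n, lam, p.sum(), np.dot(p, b))
```

The printed \lambda shrinks as n grows, in line with the O_P(n^{-1/2}) order; the weights sum to one and re-center the moments at zero.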

The next lemma gives the order of W_n defined above.

Lemma B.2.2. Under Assumptions 2.3.1-2.3.3, we have W_n = o_P(n^{1/2}), where W_n is
given by

W_n = \max_{i=1,...,n} \| b(X_i, \hat\theta) \|.   (B-22)

Proof. Using Assumption 2.3.2, by the mean value theorem and the triangle inequality, it
follows that

\| b(X_i, \hat\theta) \| \le \| b(X_i, \theta_0) \| + \| \hat\theta - \theta_0 \| k(X_i),

with probability one. Hence, the following inequality holds with probability one:

\max_{i=1,...,n} \| b(X_i, \hat\theta) \| \le \max_{i=1,...,n} \| b(X_i, \theta_0) \| + \| \hat\theta - \theta_0 \| \max_{i=1,...,n} k(X_i).   (B-23)

Using Lemma 11.2 of Owen (2001, p. 218), it follows that

\max_{i=1,...,n} \| b(X_i, \theta_0) \| = o_P(n^{1/2}) and \max_{i=1,...,n} k(X_i) = o_P(n^{1/2}).

Using that \hat\theta = \theta_0 + O_P(n^{-1/2}), from (B-23) we have W_n = o_P(n^{1/2}). □

The following lemma provides an order of convergence that will be needed in
the proofs of the following theorems.

Lemma B.2.3. Under Assumptions 2.3.1-2.3.3,

\max_{i=1,...,n} \frac{1}{1 + \lambda^T b(X_i, \hat\theta)} = O_P(1).

Proof. The following inequality holds on the set D_n = \{ \|\lambda\| W_n < 1 \}:

\frac{1}{1 + \lambda^T b(X_i, \hat\theta)} \le \frac{1}{1 - \|\lambda\| W_n}.   (B-24)

Since \lambda = O_P(n^{-1/2}) and W_n = o_P(n^{1/2}), it follows that \|\lambda\| W_n = o_P(1) and P(D_n) \to 1.
From (B-24), we obtain that \max_{i=1,...,n} \frac{1}{1 + \lambda^T b(X_i, \hat\theta)} = O_P(1). □

The next proof shows the consistency of the biased bootstrap for the sample mean of the
moment conditions.

Proof of Theorem 2.3.1. Conditional on X_1, X_2, ..., the term \frac{1}{n} \sum_{i=1}^n b(X_i^*, \hat\theta) is the
sample mean of n iid random variables. Using the definition of \hat F_{\hat\theta}, the (conditional) mean
and variance of these observations are

E_{\hat\theta} [ b(X_i^*, \hat\theta) ] = \sum_{i=1}^n p_i b(X_i, \hat\theta) = 0   (B-25)

and

Var_{\hat\theta} [ b(X_i^*, \hat\theta) ] = \sum_{i=1}^n p_i b(X_i, \hat\theta) b(X_i, \hat\theta)^T.   (B-26)

First, let us show that \sum_{i=1}^n p_i b(X_i, \hat\theta) b(X_i, \hat\theta)^T \to_P V as n \to \infty. We rewrite (B-26)
as

\sum_{i=1}^n p_i b(X_i, \hat\theta) b(X_i, \hat\theta)^T = \frac{1}{n} \sum_{i=1}^n b(X_i, \hat\theta) b(X_i, \hat\theta)^T + \sum_{i=1}^n \Big( p_i - \frac{1}{n} \Big) b(X_i, \hat\theta) b(X_i, \hat\theta)^T.   (B-27)

From Assumption 2.3.2 it follows that the following uniform law of large numbers holds,
being an application of Theorem 19.4 of van der Vaart (1998, p. 270):

\sup_\theta \Big\| \frac{1}{n} \sum_{i=1}^n b(X_i, \theta) b(X_i, \theta)^T - V(\theta) \Big\| \to_P 0 as n \to \infty,

where V(\theta) = E_F [ b_\theta b_\theta^T ]. Now, since \hat\theta \to_P \theta_0 and V = V(\theta_0), it follows that the first
term on the right side of (B-27) converges in probability to the asymptotic covariance
V as n \to \infty (for this we also use the fact that V(\theta) is continuous at \theta_0, which follows
from Assumption 2.3.2). For the second term of (B-27), using the fact that p_i = \big( n ( 1 +
\lambda^T b(X_i, \hat\theta) ) \big)^{-1} from (2-9), we obtain the following relations (we use \|\cdot\| to denote the
Euclidean norm for vectors and the induced operator norm for matrices):

\Big\| \sum_{i=1}^n ( p_i - 1/n ) b(X_i, \hat\theta) b(X_i, \hat\theta)^T \Big\| \le \|\lambda\| \max_{i=1,...,n} \frac{1}{1 + \lambda^T b(X_i, \hat\theta)} \cdot \frac{1}{n} \sum_{i=1}^n \| b(X_i, \hat\theta) \|^3
= O_P(n^{-1/2}) O_P(1) o_P(n^{1/2}) = o_P(1),   (B-28)

where the next-to-last relation follows from Lemmas B.2.1-B.2.3.

Since the X_i^* are sampled from \hat F_{\hat\theta}, which changes with n, in order to complete the proof,
the Central Limit Theorem for triangular arrays is used. It suffices to show that for every
\epsilon > 0 (van der Vaart, 1998, p. 20),

E_{\hat\theta} [ \| b(X^*, \hat\theta) \|^2 I( \| b(X^*, \hat\theta) \| > \epsilon n^{1/2} ) ] \to_P 0 as n \to \infty.   (B-29)

The following relations are true:

E_{\hat\theta} [ \| b(X^*, \hat\theta) \|^2 I( \| b(X^*, \hat\theta) \| > \epsilon n^{1/2} ) ] = \sum_{i=1}^n p_i \| b(X_i, \hat\theta) \|^2 I( \| b(X_i, \hat\theta) \| > \epsilon n^{1/2} )
\le \sum_{i=1}^n p_i \| b(X_i, \hat\theta) \|^2 I( W_n > \epsilon n^{1/2} ).   (B-30)

As before, \sum_{i=1}^n p_i \| b(X_i, \hat\theta) \|^2 \to_P E \| b_{\theta_0} \|^2. From Lemma B.2.2, I( W_n > \epsilon n^{1/2} ) \to_P 0, and
hence (B-29) follows from (B-30).

Let \hat G_n = L( \sqrt{n} \bar b_n^*(\hat\theta) | F_n ) and G = N(0, V). Using a subsequence argument for
convergence in probability (Kallenberg, 2002, Lemma 4.2, p. 63), it follows that for any
subsequence (m_n) \subset (n), there is a further subsequence (l_n) \subset (m_n) such that (B-29)
holds along (l_n) almost surely. Using the Central Limit Theorem for triangular arrays
along (l_n), it follows that \delta( \hat G_n, G ) \to 0 almost surely along (l_n), where \delta is a metric on the space of distributions
that metrizes weak convergence. From this, using again the subsequence criterion, it
follows that \delta( \hat G_n, G ) \to_P 0. In other words, we have shown that any subsequence of \{ \hat G_n \}
has a further subsequence that converges in distribution to G almost surely, so that the
sequence converges in distribution in probability. □
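Theorem 2.3.1 can be illustrated by resampling from a weighted empirical distribution whose weights re-center the moments, so that no explicit recentering of the bootstrap statistic is needed. For simplicity the sketch below uses the exponential-tilting (p = 1) member of the Cressie-Read family, with weights proportional to exp(t b_i); the data and the scalar moment are hypothetical, not the dissertation's examples.

```python
# Illustrative biased bootstrap: resample from weights p_i satisfying
# sum_i p_i b_i = 0 (here via exponential tilting, p_i proportional to exp(t b_i)).
import numpy as np

rng = np.random.default_rng(3)
n, B = 500, 2000
x = rng.normal(size=n)
b = x - 0.1                      # scalar moment b(X_i, theta-hat); sample mean is not 0

# Solve sum_i exp(t b_i) b_i = 0 for t by bisection; f is increasing in t.
f = lambda t: np.sum(np.exp(t * b) * b)
lo, hi = -10.0, 10.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if f(mid) < 0.0:
        lo = mid
    else:
        hi = mid
t = 0.5 * (lo + hi)

p = np.exp(t * b)
p /= p.sum()                     # tilted weights, sum to 1 and re-center b

# Biased bootstrap: resample indices with probabilities p_i.
boot_means = np.array([
    b[rng.choice(n, size=n, replace=True, p=p)].mean() for _ in range(B)
])
print(np.dot(p, b), boot_means.mean(), boot_means.std())
```

The bootstrap means are centered at zero by construction, which is exactly what removes the recentering step required by the uniform bootstrap in over-identified models.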

We now prove the (conditional) uniform convergence in probability result that will be

used in proving the consistency of the GMM estimators (and their bootstrap versions) in

Theorem 2.3.2.

Proof of Lemma 2.3.2. Without loss of generality, we suppose q = 1. Using the same
arguments as in Example 19.8 of van der Vaart (1998), there exists a finite sequence
of open balls U_j, j = 1, ..., m, such that \Theta \subset \cup_j U_j, b_{U_j}(x) \le b(x, \theta) \le b^{U_j}(x) for
all \theta \in U_j and for all x, and E_F | b^{U_j} - b_{U_j} | < \epsilon, where b_{U_j}(x) = \inf_{\theta \in U_j} b(x, \theta) and
b^{U_j}(x) = \sup_{\theta \in U_j} b(x, \theta). For all \theta \in U_j,

\frac{1}{n} \sum_{i=1}^n b_{U_j}(X_i^*) \le \frac{1}{n} \sum_{i=1}^n b(X_i^*, \theta) \le \frac{1}{n} \sum_{i=1}^n b^{U_j}(X_i^*)   (B-31)

with probability one. Using the same techniques as in Theorem 2.3.1, we have \frac{1}{n} \sum_i b^{U_j}(X_i^*) =
E_F b^{U_j} + o_P^*(1), \frac{1}{n} \sum_i b_{U_j}(X_i^*) = E_F b_{U_j} + o_P^*(1), and \frac{1}{n} \sum_i b(X_i^*, \theta) = E_F b_\theta + o_P^*(1).
Hence, for all \theta \in U_j, E_F b_{U_j} + o_P^*(1) \le \bar b_n^*(\theta) \le E_F b^{U_j} + o_P^*(1). Hence, for all \theta \in U_j,

\Big| \frac{1}{n} \sum_{i=1}^n b(X_i^*, \theta) - E_F b_\theta \Big| \le \epsilon + o_P^*(1).   (B-32)

Taking now the supremum over \theta \in \Theta in (B-32), we obtain

\sup_{\theta \in \Theta} \Big| \frac{1}{n} \sum_{i=1}^n b(X_i^*, \theta) - E_F b_\theta \Big| \le \epsilon + o_P^*(1).   (B-33)

Letting \epsilon \to 0 in (B-33), we obtain the desired result. □

Proof of Theorem 2.3.2. First, by Lemma 2.3.2, we have

\sup_{\theta \in \Theta} | Q_n(\theta) - Q(\theta) | = o_P(1)   (B-34)

and

\sup_{\theta \in \Theta} | Q_n^*(\theta) - Q(\theta) | = o_P^*(1),   (B-35)

where we use the same technique as for proving the first uniform convergence result
(B-34). By Corollary 1.2.1 we obtain the consistency of \hat\theta. Using standard techniques,
this in turn implies that \hat\theta = \theta_0 + O_P(n^{-1/2}), so that all conditions of Theorem 2.3.1
hold. Since Q_n^*(\hat\theta^*) \le Q_n^*(\hat\theta) + o_P^*(1) (by hypothesis) and Q_n^*(\hat\theta) = Q(\hat\theta) + o_P^*(1) (by
Theorem 2.3.1), we obtain

Q_n^*(\hat\theta^*) \le Q(\hat\theta) + o_P^*(1).

Hence, using the fact that Q(\hat\theta) \le Q(\theta_0) + o_P(1), we have Q_n^*(\hat\theta^*) \le Q(\theta_0) + o_P^*(1), and
therefore, by (B-35),

Q(\hat\theta^*) \le Q_n^*(\hat\theta^*) + o_P^*(1) \le Q(\theta_0) + o_P^*(1).   (B-36)

But, for every \epsilon > 0, there exists \eta > 0 such that for every \theta with \| \theta - \theta_0 \| > \epsilon,
Q(\theta) > Q(\theta_0) + \eta. Thus

\{ \| \hat\theta^* - \theta_0 \| > \epsilon \} \subset \{ Q(\hat\theta^*) > Q(\theta_0) + \eta \}.

Since P_* \{ Q(\hat\theta^*) > Q(\theta_0) + \eta \} \to 0, we have P_* \{ \| \hat\theta^* - \theta_0 \| > \epsilon \} \to 0, and hence
\hat\theta^* = \theta_0 + o_P^*(1). □

The next lemma will be used in the proof of Theorem 2.3.3. It is a version of
Slutsky's theorem formulated for conditional convergence in distribution in probability.

Lemma B.2.4 (Lahiri (2003, Lemma 4.1, p. 77)). For n \in N, let a_n^*, b_n^* and T_n^* be random
variables, all defined on a common probability space (\Omega, \mathcal{F}, P). Suppose that \mathcal{X}_n is a
sub-\sigma-field of \mathcal{F} and that there exist \mathcal{X}_n-measurable random variables a and b such that

P( | a_n^* - a | > \epsilon | \mathcal{X}_n ) + P( | b_n^* - b | > \epsilon | \mathcal{X}_n ) \to_P 0 as n \to \infty,   (B-37)

for every \epsilon > 0. Also, suppose that T_n^* \to_d T conditionally given \mathcal{X}_n, where T represents a
\mathcal{F}-measurable random variable. Then,

a_n^* T_n^* + b_n^* \to_d aT + b.   (B-38)

Proof of Theorem 2.3.3. For simplicity we take the case when p = 1. Let \psi_n^*(\theta) =
\nabla \bar b_n^*(\theta)^T W \bar b_n^*(\theta). By Taylor's theorem there exists a \tilde\theta^* between \hat\theta and \hat\theta^* such that

0 = \psi_n^*(\hat\theta^*) = \psi_n^*(\hat\theta) + (\hat\theta^* - \hat\theta) \nabla\psi_n^*(\hat\theta) + \frac{1}{2} (\hat\theta^* - \hat\theta)^2 \nabla^2\psi_n^*(\tilde\theta^*),   (B-39)

where

\nabla\psi_n^*(\theta) = \nabla^2 \bar b_n^*(\theta)^T W \bar b_n^*(\theta) + \nabla \bar b_n^*(\theta)^T W \nabla \bar b_n^*(\theta),   (B-40)

and

\nabla^2\psi_n^*(\theta) = \nabla^3 \bar b_n^*(\theta)^T W \bar b_n^*(\theta) + 3 \nabla^2 \bar b_n^*(\theta)^T W \nabla \bar b_n^*(\theta).   (B-41)

Since \nabla^2 \bar b_n^*(\hat\theta) = \Sigma + o_P^*(1), \bar b_n^*(\hat\theta) = o_P^*(1), and \nabla \bar b_n^*(\hat\theta) = D + o_P^*(1), it follows that

\nabla\psi_n^*(\hat\theta) = D^T W D + o_P^*(1),   (B-42)

where \Sigma = E [ \nabla^2 b(X_i, \theta_0) ]. Moreover,

| \nabla^2\psi_n^*(\tilde\theta^*) | \le \| \nabla^3 \bar b_n^*(\tilde\theta^*) \| \| W \| \| \bar b_n^*(\tilde\theta^*) \| + 3 \| \nabla^2 \bar b_n^*(\tilde\theta^*) \| \| W \| \| \nabla \bar b_n^*(\tilde\theta^*) \|
\le 4 \| W \| \Big( \frac{1}{n} \sum_{i=1}^n k(X_i^*) \Big)^2.

Since \big( \frac{1}{n} \sum_i k(X_i^*) \big)^2 (\hat\theta^* - \hat\theta) = o_P^*(1), it follows that

(\hat\theta^* - \hat\theta)^2 \nabla^2\psi_n^*(\tilde\theta^*) = o_P^*(1) (\hat\theta^* - \hat\theta).   (B-43)

By substituting (B-42) and (B-43) in (B-39), we obtain

-\psi_n^*(\hat\theta) = (\hat\theta^* - \hat\theta) \Big( o_P^*(1) + D^T W D + \frac{1}{2} (\hat\theta^* - \hat\theta) \nabla^2\psi_n^*(\tilde\theta^*) \Big)
= (\hat\theta^* - \hat\theta) ( D^T W D + o_P^*(1) ).   (B-44)

Hence

\sqrt{n} (\hat\theta^* - \hat\theta) = -( D^T W D )^{-1} \sqrt{n} \psi_n^*(\hat\theta) + o_P^*(1).   (B-45)

Using (the conditional version of) Slutsky's theorem and Theorem 2.3.1, it follows that
\sqrt{n} \psi_n^*(\hat\theta) \to_d N(0, D^T W V W D), and hence

\sqrt{n} (\hat\theta^* - \hat\theta) \to_d N \big( 0, ( D^T W D )^{-1} D^T W V W D ( D^T W D )^{-1} \big). □



Remark B.2.1. Neither recentering nor the biased bootstrap is necessary in order to
ensure bootstrap consistency of the GMM estimators. In other words, the ordinary
uniform bootstrap consistently estimates the distribution of \sqrt{n} (\hat\theta - \theta_0), a fact also
confirmed by Hahn (1996).









We now show that in the case of the uniform bootstrap the previous consistency
result holds. To this end, all results from the previous proof hold, up to and including
(B-45). Hence, we need to show that

\sqrt{n} \psi_n^*(\hat\theta) \to_d N(0, D^T W V W D).   (B-46)

In the same way as in the proof of Theorem 2.3.1, it is easy to prove that the following
convergence holds for the uniform bootstrap:

\sqrt{n} ( \bar b_n^*(\hat\theta) - \bar b_n(\hat\theta) ) \to_d N(0, V).

In order to show (B-46), by the (conditional) Slutsky theorem, it is hence enough to show
that

\sqrt{n} \nabla \bar b_n^*(\hat\theta)^T W \bar b_n(\hat\theta) = o_P^*(1).

This follows from the following relations:

\sqrt{n} \nabla \bar b_n^*(\hat\theta)^T W \bar b_n(\hat\theta) = \sqrt{n} ( \nabla \bar b_n^*(\hat\theta) - \nabla \bar b_n(\hat\theta) )^T W \bar b_n(\hat\theta) = O_P^*(n^{-1/2}) O_P(1) = o_P^*(1),

where we used the first-order condition \nabla \bar b_n(\hat\theta)^T W \bar b_n(\hat\theta) = 0 for \hat\theta, the order
\nabla \bar b_n^*(\hat\theta) - \nabla \bar b_n(\hat\theta) = O_P^*(n^{-1/2}), and \sqrt{n} \bar b_n(\hat\theta) = O_P(1) (by the conditional Slutsky theorem).


We now prove that the biased bootstrap for the J-test for overidentifying restrictions is
consistent.

Proof of Theorem 2.3.4. As in the proof of Theorem 2.3.3, using the Taylor expansion we
obtain that

V^{-1/2} \sqrt{n} \bar b_n^*(\hat\theta^*) = V^{-1/2} \sqrt{n} \bar b_n^*(\hat\theta) + \sqrt{n} (\hat\theta^* - \hat\theta) V^{-1/2} \nabla \bar b_n^*(\hat\theta) + o_P^*(1).   (B-47)

From (B-45), for the particular case when W = V^{-1}, it follows that

\sqrt{n} (\hat\theta^* - \hat\theta) = -( D^T V^{-1} D )^{-1} \sqrt{n} \psi_n^*(\hat\theta) + o_P^*(1).   (B-48)

From (B-47) and (B-48), we obtain

V^{-1/2} \sqrt{n} \bar b_n^*(\hat\theta^*) = V^{-1/2} \sqrt{n} \bar b_n^*(\hat\theta) - V^{-1/2} D ( D^T V^{-1} D )^{-1} D^T V^{-1/2} V^{-1/2} \sqrt{n} \bar b_n^*(\hat\theta) + o_P^*(1)
= [ I - V^{-1/2} D ( D^T V^{-1} D )^{-1} D^T V^{-1/2} ] V^{-1/2} \sqrt{n} \bar b_n^*(\hat\theta) + o_P^*(1).   (B-49)

Since V^{-1/2} D ( D^T V^{-1} D )^{-1} D^T V^{-1/2} is idempotent of rank p = 1, it follows that [ I -
V^{-1/2} D ( D^T V^{-1} D )^{-1} D^T V^{-1/2} ] is idempotent of rank q - p (remember that we consider
the case p = 1). Moreover, from Theorem 2.3.1, it follows that V^{-1/2} \sqrt{n} \bar b_n^*(\hat\theta) \to_d N(0, I).
Hence

n Q_n^*(\hat\theta^*) = n \bar b_n^*(\hat\theta^*)^T V^{-1} \bar b_n^*(\hat\theta^*) = [ V^{-1/2} \sqrt{n} \bar b_n^*(\hat\theta^*) ]^T V^{-1/2} \sqrt{n} \bar b_n^*(\hat\theta^*)
= [ V^{-1/2} \sqrt{n} \bar b_n^*(\hat\theta) ]^T [ I - V^{-1/2} D ( D^T V^{-1} D )^{-1} D^T V^{-1/2} ] V^{-1/2} \sqrt{n} \bar b_n^*(\hat\theta) + o_P^*(1)
\to_d \chi^2_{q-p} (by Theorem 2.3.1).   (B-50)

Consequently, n Q_n^*(\hat\theta^*) \to_d \chi^2_{q-p}. □
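The chi-square limit of the J-statistic can be checked in a toy over-identified model with q = 2 moments and a scalar parameter (p = 1), so that n Q_n(\hat\theta) should be approximately \chi^2_1. The model and the closed-form GMM solution below are illustrative assumptions, not the dissertation's design.

```python
# Toy J-statistic check: moments b(X, theta) = (X1 - theta, X2 - theta),
# q = 2, p = 1, so n Q_n(theta-hat) is approximately chi-square(1).
import numpy as np

rng = np.random.default_rng(4)
n, S = 300, 500
ones = np.ones(2)
jstats = np.empty(S)
for s in range(S):
    x = rng.normal(size=(n, 2))          # both moments have mean theta_0 = 0
    m = x.mean(axis=0)
    Vinv = np.linalg.inv(np.cov(x.T))    # inverse of the estimated moment covariance
    th_hat = (ones @ Vinv @ m) / (ones @ Vinv @ ones)   # GMM with W = V^{-1}
    bbar = m - th_hat                    # sample moments at theta-hat
    jstats[s] = n * bbar @ Vinv @ bbar   # J-statistic n Q_n(theta-hat)
print(jstats.mean(), np.quantile(jstats, 0.95))
```

The simulated mean is close to 1 and the 95% quantile close to 3.84, the chi-square(1) values, consistent with the q - p = 1 degrees of freedom above.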

Remark B.2.2. We now prove that without recentering (or without applying
the biased bootstrap), the bootstrap estimate of the distribution of the J-statistic is
inconsistent. First we show the following (unconditional) weak convergence:

\big( \sqrt{n} ( \bar b_n^*(\hat\theta) - \bar b_n(\hat\theta) ), \sqrt{n} \bar b_n(\hat\theta) \big) \to_d N(0, B), \qquad B = diag(V, A),   (B-51)

where A = I - V^{-1/2} D ( D^T V^{-1} D )^{-1} D^T V^{-1/2}. For every x, y \in R^q, using the bounded
convergence theorem and the analogue of Theorem 2.3.1 for the uniform bootstrap, the
following relations are true:

P \big( \sqrt{n} ( \bar b_n^*(\hat\theta) - \bar b_n(\hat\theta) ) \le x ; \sqrt{n} \bar b_n(\hat\theta) \le y \big)
= E \big( I( \sqrt{n} \bar b_n(\hat\theta) \le y ) E_* [ I( \sqrt{n} ( \bar b_n^*(\hat\theta) - \bar b_n(\hat\theta) ) \le x ) | \mathcal{X}_n ] \big)
= E \big( I( \sqrt{n} \bar b_n(\hat\theta) \le y ) F_X(x) \big) + E \big( I( \sqrt{n} \bar b_n(\hat\theta) \le y ) ( E_* [ I( \sqrt{n} ( \bar b_n^*(\hat\theta) - \bar b_n(\hat\theta) ) \le x ) | \mathcal{X}_n ] - F_X(x) ) \big)
= F_X(x) F_Y(y) + o(1),   (B-52)

where X ~ N(0, V) and Y ~ N(0, A). Hence (B-51) holds, so that the following
(unconditional) weak convergence holds:

V^{-1/2} \sqrt{n} \bar b_n^*(\hat\theta) \to_d N( 0, I + V^{-1/2} A V^{-1/2} ).   (B-53)

Hence,

A V^{-1/2} \sqrt{n} \bar b_n^*(\hat\theta) \to_d N( 0, A + A V^{-1/2} A V^{-1/2} A ).   (B-54)

Suppose on the contrary that n Q_n^*(\hat\theta^*) \to_d \chi^2_{q-p}; then, by the bounded convergence theorem
and (B-49), it follows that n \bar b_n^*(\hat\theta)^T V^{-1/2} A V^{-1/2} \bar b_n^*(\hat\theta) \to_d \chi^2_{q-p}. From (B-53), we see that
C = A + A V^{-1/2} A V^{-1/2} A should then be idempotent of rank q - p (Driscoll, 1999). Of course
this is not always the case; e.g., for V = I, C = 2A, which is not idempotent.

B.3 Consistency Results for Bootstrapping 2SLS Estimators

First we prove a useful lemma. Throughout, Q_{zz} = E[z_i z_i^T], Q_{zx} = E[z_i x_i^T], and Q_{xz} = Q_{zx}^T.

Lemma B.3.1. Under Assumptions 2.4.1-2.4.3,

\frac{1}{\sqrt{n}} \sum_{i=1}^n z_i ( y_i - x_i^T \hat\beta ) = Q \frac{1}{\sqrt{n}} \sum_{i=1}^n z_i \epsilon_i + o_P(1),   (B-55)

where

Q = I - Q_{zx} ( Q_{xz} Q_{zz}^{-1} Q_{zx} )^{-1} Q_{xz} Q_{zz}^{-1}   (B-56)

is an idempotent matrix of rank q - p.

Proof. Since

\sqrt{n} ( \hat\beta - \beta ) = \sqrt{n} ( X^T Z ( Z^T Z )^{-1} Z^T X )^{-1} X^T Z ( Z^T Z )^{-1} Z^T \epsilon
= ( Q_{xz} Q_{zz}^{-1} Q_{zx} )^{-1} Q_{xz} Q_{zz}^{-1} \frac{1}{\sqrt{n}} \sum_{i=1}^n z_i \epsilon_i + o_P(1),   (B-57)

and

\frac{1}{\sqrt{n}} \sum_{i=1}^n z_i ( y_i - x_i^T \hat\beta ) = \frac{1}{\sqrt{n}} \sum_{i=1}^n z_i \epsilon_i - \frac{1}{n} \sum_{i=1}^n z_i x_i^T \sqrt{n} ( \hat\beta - \beta ),   (B-58)

from (B-57) and (B-58), it follows that

\frac{1}{\sqrt{n}} \sum_{i=1}^n z_i ( y_i - x_i^T \hat\beta ) = \big[ I - Q_{zx} ( Q_{xz} Q_{zz}^{-1} Q_{zx} )^{-1} Q_{xz} Q_{zz}^{-1} \big] \frac{1}{\sqrt{n}} \sum_{i=1}^n z_i \epsilon_i + o_P(1)
= Q \frac{1}{\sqrt{n}} \sum_{i=1}^n z_i \epsilon_i + o_P(1).   (B-59)

□

Remark B.3.1. As an immediate consequence of Lemma B.3.1, it follows that

\frac{1}{\sqrt{n}} \sum_{i=1}^n z_i ( y_i - x_i^T \hat\beta ) \to_d N(0, R),   (B-60)

where

R = \sigma^2 Q Q_{zz} Q^T = \sigma^2 \big( Q_{zz} - Q_{zx} ( Q_{xz} Q_{zz}^{-1} Q_{zx} )^{-1} Q_{xz} \big),   (B-61)

which is a matrix of rank q - p.










Here is the consistency proof for the proposed biased bootstrap procedure for 2SLS
estimators. The main steps of the proof are similar to the proof of consistency of GMM
estimators discussed in the previous section.

Proof of Theorem 2.4.2. We need to show that

\frac{1}{\sqrt{n}} \sum_{i=1}^n \Big( z_i^* \hat\epsilon_i^* - \frac{1}{n} \sum_{j=1}^n z_j \hat\epsilon_j \Big) \to_d N(0, \sigma^2 Q_{zz}),

conditionally in probability. For fixed sequences y_1, y_2, ..., z_1, z_2, ..., the term \frac{1}{n} \sum_{i=1}^n ( z_i^* \hat\epsilon_i^* - \frac{1}{n} \sum_{j=1}^n z_j \hat\epsilon_j )
is the mean of n observations z_i^* \hat\epsilon_i^* - \frac{1}{n} \sum_j z_j \hat\epsilon_j. The (conditional) mean and variance
of these (hypothetical) observations are

E_* \Big[ z_i^* \hat\epsilon_i^* - \frac{1}{n} \sum_{j=1}^n z_j \hat\epsilon_j \Big] = \frac{1}{n} \sum_{j=1}^n z_j \hat\epsilon_j - \frac{1}{n} \sum_{j=1}^n z_j \hat\epsilon_j = 0   (B-62)

and

Var_* \Big[ z_i^* \hat\epsilon_i^* - \frac{1}{n} \sum_{j=1}^n z_j \hat\epsilon_j \Big] = \frac{1}{n} \sum_{j=1}^n z_j z_j^T \hat\epsilon_j^2 - \Big( \frac{1}{n} \sum_{j=1}^n z_j \hat\epsilon_j \Big) \Big( \frac{1}{n} \sum_{j=1}^n z_j \hat\epsilon_j \Big)^T.   (B-63)

The second term of (B-63) is o_P(1), since \frac{1}{n} \sum_j z_j \hat\epsilon_j = O_P(n^{-1/2}). For the first term, using
\hat\epsilon_i = \epsilon_i - x_i^T ( \hat\beta - \beta ), the following convergence holds:

\frac{1}{n} \sum_{i=1}^n z_i z_i^T \hat\epsilon_i^2 = \frac{1}{n} \sum_{i=1}^n z_i z_i^T \epsilon_i^2 - \frac{2}{n} \sum_{i=1}^n z_i z_i^T \epsilon_i x_i^T ( \hat\beta - \beta ) + \frac{1}{n} \sum_{i=1}^n z_i z_i^T ( x_i^T ( \hat\beta - \beta ) )^2
= \sigma^2 Q_{zz} + o_P(1).   (B-64)

Since ( x_i^*, z_i^*, \hat\epsilon_i^* ) are sampled from the empirical distribution on the sample, which
changes with n, we use the Central Limit Theorem for triangular arrays. By Proposition 2.27
of van der Vaart (1998), we need to verify that the Lindeberg condition holds, i.e., for any
\delta > 0,

E_* \Big[ \Big\| z_i^* \hat\epsilon_i^* - \frac{1}{n} \sum_{j=1}^n z_j \hat\epsilon_j \Big\|^2 I \Big( \Big\| z_i^* \hat\epsilon_i^* - \frac{1}{n} \sum_{j=1}^n z_j \hat\epsilon_j \Big\| > \delta n^{1/2} \Big) \Big] \to_P 0.   (B-65)

The following relations hold:

E_* \Big[ \Big\| z_i^* \hat\epsilon_i^* - \frac{1}{n} \sum_j z_j \hat\epsilon_j \Big\|^2 I \Big( \Big\| z_i^* \hat\epsilon_i^* - \frac{1}{n} \sum_j z_j \hat\epsilon_j \Big\| > \delta n^{1/2} \Big) \Big]
= \frac{1}{n} \sum_{i=1}^n \Big\| z_i \hat\epsilon_i - \frac{1}{n} \sum_j z_j \hat\epsilon_j \Big\|^2 I \Big( \Big\| z_i \hat\epsilon_i - \frac{1}{n} \sum_j z_j \hat\epsilon_j \Big\| > \delta n^{1/2} \Big)
\le \frac{1}{n} \sum_{i=1}^n \Big\| z_i \hat\epsilon_i - \frac{1}{n} \sum_j z_j \hat\epsilon_j \Big\|^2 I \Big( \max_{j=1,...,n} \| z_j \hat\epsilon_j \| + \Big\| \frac{1}{n} \sum_j z_j \hat\epsilon_j \Big\| > \delta n^{1/2} \Big).   (B-66)

Using Assumptions 2.4.1-2.4.3,

\max_{j=1,...,n} \| z_j \hat\epsilon_j \| + \Big\| \frac{1}{n} \sum_j z_j \hat\epsilon_j \Big\| = \max_{j=1,...,n} \| z_j ( y_j - x_j^T \hat\beta ) \| + o_P(n^{1/2})
\le \max_{j=1,...,n} \| z_j ( y_j - x_j^T \beta ) \| + \| \hat\beta - \beta \| \max_{j=1,...,n} \| z_j x_j^T \| + o_P(n^{1/2})
= o_P(n^{1/2}) + O_P(n^{-1/2}) o_P(n^{1/2}) + o_P(n^{1/2}) = o_P(n^{1/2}),   (B-67)

where for the last relation we used the result of Lemma 11.2 of Owen (2001, p. 218). Hence
I( \max_{j=1,...,n} \| z_j \hat\epsilon_j \| + \| \frac{1}{n} \sum_j z_j \hat\epsilon_j \| > \delta n^{1/2} ) \to_P 0. Moreover, using the same arguments
as in the proof of (B-64),

\frac{1}{n} \sum_{i=1}^n \Big\| z_i \hat\epsilon_i - \frac{1}{n} \sum_j z_j \hat\epsilon_j \Big\|^2 \to_P E \| z \epsilon \|^2,

therefore (B-65) holds. As before, using the subsequence argument, it follows that any
subsequence has a further subsequence that converges in distribution conditionally almost
surely, so that the sequence converges in distribution conditionally in probability. □










Proof of Theorem 2.4.3. The following relations are true:

n^{1/2} ( \hat\beta^* - \hat\beta ) = n^{1/2} \big( ( X^{*T} P_Z^* X^* )^{-1} X^{*T} P_Z^* y^* - \hat\beta \big)
= n^{1/2} \big( ( X^{*T} P_Z^* X^* )^{-1} X^{*T} P_Z^* ( X^* \hat\beta + \hat\epsilon^* ) - \hat\beta \big)
= n^{1/2} ( X^{*T} P_Z^* X^* )^{-1} X^{*T} P_Z^* \hat\epsilon^*.   (B-68)

Using that n^{-1} X^{*T} Z^* = n^{-1} \sum_i x_i^* z_i^{*T} = Q_{xz} + o_P^*(1) and n^{-1} Z^{*T} Z^* = n^{-1} \sum_i z_i^* z_i^{*T} =
Q_{zz} + o_P^*(1), we obtain

n^{1/2} ( \hat\beta^* - \hat\beta ) = ( Q_{xz} Q_{zz}^{-1} Q_{zx} )^{-1} Q_{xz} Q_{zz}^{-1} \frac{1}{\sqrt{n}} \sum_{i=1}^n z_i^* \hat\epsilon_i^* + o_P^*(1).   (B-69)

Using Theorem 2.4.2, we only need to prove that

( Q_{xz} Q_{zz}^{-1} Q_{zx} )^{-1} Q_{xz} Q_{zz}^{-1} \frac{1}{\sqrt{n}} \sum_{i=1}^n z_i \hat\epsilon_i = o_P(1).   (B-70)

Using Lemma B.3.1, we obtain

( Q_{xz} Q_{zz}^{-1} Q_{zx} )^{-1} Q_{xz} Q_{zz}^{-1} \frac{1}{\sqrt{n}} \sum_{i=1}^n z_i \hat\epsilon_i = ( Q_{xz} Q_{zz}^{-1} Q_{zx} )^{-1} Q_{xz} Q_{zz}^{-1} Q \frac{1}{\sqrt{n}} \sum_{i=1}^n z_i \epsilon_i + o_P(1) = o_P(1),   (B-71)

since

Q_{xz} Q_{zz}^{-1} Q = Q_{xz} Q_{zz}^{-1} - Q_{xz} Q_{zz}^{-1} Q_{zx} ( Q_{xz} Q_{zz}^{-1} Q_{zx} )^{-1} Q_{xz} Q_{zz}^{-1} = 0.

Hence (B-70) holds, so that (2-44) holds. □


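The 2SLS estimator \hat\beta = ( X^T P_Z X )^{-1} X^T P_Z y analyzed above, together with a uniform (pairs) bootstrap of it, can be sketched as follows. The data-generating design (one endogenous regressor, two instruments, standard normal errors) is an illustrative assumption; the biased bootstrap studied in Theorem 2.4.3 would additionally reweight the resampled triples.

```python
# Illustrative 2SLS estimate and a uniform (pairs) bootstrap of it.
import numpy as np

rng = np.random.default_rng(5)
n, B, beta0 = 400, 500, 1.0
z = rng.normal(size=(n, 2))              # q = 2 instruments
u = rng.normal(size=n)                   # structural error
x = z @ np.array([1.0, 0.5]) + 0.8 * u + rng.normal(size=n)   # endogenous regressor
y = beta0 * x + u

def tsls(y, x, z):
    """2SLS for a scalar coefficient: (x' P_Z x)^{-1} x' P_Z y."""
    zx, zy = z.T @ x, z.T @ y
    zz_inv = np.linalg.inv(z.T @ z)
    return (zx @ zz_inv @ zy) / (zx @ zz_inv @ zx)

beta_hat = tsls(y, x, z)

# Pairs bootstrap: resample (y_i, x_i, z_i) triples with replacement.
boot = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, size=n)
    boot[b] = tsls(y[idx], x[idx], z[idx])
print(beta_hat, boot.mean(), boot.std())
```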









APPENDIX C
PROOFS OF CONSISTENCY RESULTS FROM CHAPTER III

Proof of Lemma 3.3.1. We need to show that 0 is inside the convex hull of { U_j :
j = 1, ..., N } with probability approaching one. Suppose to the contrary that the above
statement is false. Then there exist an \epsilon > 0 and a sequence (N_k)_{k \ge 1} such that for all
k \ge 1,

P \big( 0 \notin Co\{ U_j : j = 1, ..., N_k \} \big) > \epsilon,

where Co\{ U_j : j = 1, ..., N_k \} denotes the convex hull generated by the U_j's. Let

A_{N_k} = \{ \omega : 0 \notin Co\{ U_j(\omega) : j = 1, ..., N_k \} \}.

Then for all \omega \in A_{N_k}, there exists \eta_\omega \in R^q with \|\eta_\omega\| = 1 such that \eta_\omega^T U_j(\omega) > 0 for all
j = 1, ..., N_k. From this, it follows that for all \omega \in A_{N_k},

P_{N_k}( \eta_\omega^T x > 0 ) = \frac{1}{N_k} \sum_{j=1}^{N_k} I\{ \eta_\omega^T U_j(\omega) > 0 \} = 1,   (C-1)

where P_{N_k} is the empirical distribution of { U_j : j = 1, ..., N_k }. Consequently, for
all \omega \in A_{N_k}, \sup_{\|\eta\|=1} P_{N_k}( \eta^T x > 0 ) = 1. Using a Glivenko-Cantelli result given
by Theorem 19.4 of van der Vaart (1998, p. 270), the following convergence holds with
probability one:

\lim_N \sup_{\|\eta\|=1} | P_N( \eta^T x > 0 ) - P( \eta^T Z > 0 ) | = 0,   (C-2)

where Z ~ N(0, \Sigma_\infty). From (C-2), it follows that

\sup_{\|\eta\|=1} P_N( \eta^T x > 0 ) \to \sup_{\|\eta\|=1} P( \eta^T Z > 0 ),

almost surely as N \to \infty. We now take B = \limsup_k A_{N_k} = \cap_{k \ge 1} \cup_{m \ge k} A_{N_m}. Since for all
m \ge 1, P( A_{N_m} ) > \epsilon, it follows that P(B) \ge \epsilon. For almost all \omega \in B,

\limsup_k \sup_{\|\eta\|=1} P_{N_k}( \eta^T x > 0 ) = 1 = \sup_{\|\eta\|=1} P( \eta^T Z > 0 ).   (C-3)

Since \Sigma_\infty is positive definite (Assumption 3.3.3), we know that \eta^T \Sigma_\infty \eta > 0 for all \eta such
that \|\eta\| = 1. Thus

P( \eta^T Z > 0 ) = 1/2 for all \|\eta\| = 1,

hence \sup_{\|\eta\|=1} P( \eta^T Z > 0 ) = 1/2, contradicting (C-3). □

Let W_N = \max_{i=1,...,N} \| U_i \|. The next lemmas give the orders of \lambda and W_N, and generalize
the results of the previous section to dependent data.

Lemma C.0.2. Under the same conditions as in Theorem 3.3.1,

\|\lambda\| = O_P( l N^{-1/2} ).   (C-4)

Proof of Lemma C.0.2. From (3-11), it follows that

\frac{1}{N} \sum_{i=1}^N \frac{U_i}{1 + \lambda^T U_i} = 0.   (C-5)

Hence,

\frac{1}{N} \sum_{i=1}^N U_i = \frac{1}{N} \sum_{i=1}^N \frac{U_i U_i^T \lambda}{1 + \lambda^T U_i}.   (C-6)

Let \tilde S = \frac{1}{N} \sum_{i=1}^N \frac{U_i U_i^T}{1 + \lambda^T U_i} and S = \frac{1}{N} \sum_{i=1}^N U_i U_i^T, so that from (C-6) it follows that \tilde S \lambda =
\frac{1}{N} \sum_{i=1}^N U_i. Since p_i > 0, we obtain 1 + \lambda^T U_i > 0, hence

\frac{\lambda^T}{\|\lambda\|} S \frac{\lambda}{\|\lambda\|} \cdot \frac{\|\lambda\|}{1 + \|\lambda\| W_N} \le \frac{\lambda^T}{\|\lambda\|} \tilde S \lambda \le \Big\| \frac{1}{N} \sum_{i=1}^N U_i \Big\|,

where W_N = \max_{i=1,...,N} \| U_i \|. From the last display, we obtain

\|\lambda\| \Big( \frac{\lambda^T}{\|\lambda\|} S \frac{\lambda}{\|\lambda\|} - W_N \Big\| \frac{1}{N} \sum_{i=1}^N U_i \Big\| \Big) \le \Big\| \frac{1}{N} \sum_{i=1}^N U_i \Big\|.   (C-7)

Using the similar results of Lahiri (2003, p. 52), the following orders hold:

S = \frac{1}{N} \sum_{i=1}^N U_i U_i^T = \Sigma_\infty + o_P(1) and \Big\| \frac{1}{N} \sum_{i=1}^N U_i \Big\| = O_P( l N^{-1/2} ),

and, using the result of the next lemma that W_N = o( l^{-1} N^{1/2} ), from (C-7) we obtain the
following order for the Lagrange multiplier \lambda:

\|\lambda\| = O_P( l N^{-1/2} ).   (C-8)

□




Lemma C.0.3. Under the same conditions as in Theorem 3.3.1,

W_N = o( N^{1/2} l^{-1} )   (C-9)

with probability one.

Proof of Lemma C.0.3. Since

\| U_i(\hat\theta) \| \le \| U_i(\theta_0) \| + \| \hat\theta - \theta_0 \| \, l^{-1/2} \sum_{j=i}^{i+l-1} k(X_j),   (C-10)

it follows that

\max_{i=1,...,N} \| U_i(\hat\theta) \| \le \max_{i=1,...,N} \| U_i(\theta_0) \| + \| \hat\theta - \theta_0 \| \max_{i=1,...,N} l^{-1/2} \sum_{j=i}^{i+l-1} k(X_j).   (C-11)

Hence, it is enough to show that V_N = \max_{i=1,...,N} \| U_i(\theta_0) \| = o( N^{1/2} l^{-1} ). Since
E [ \| b(X_i, \theta_0) \|^{2+\delta} ] < \infty and, by the (strict) stationarity property, the b(X_i, \theta_0) are
identically distributed, it follows that for any A > 0,

\sum_{n \ge 1} P( \| b(X_n, \theta_0) \|^{2+\delta} > A n ) < \infty.   (C-12)

Therefore, by the Borel-Cantelli lemma, \| b(X_n, \theta_0) \| \le A^{1/(2+\delta)} n^{1/(2+\delta)} eventually, for any
A > 0. This implies that for every i = 1, ..., N,

\| U_i(\theta_0) \| = l^{-1/2} \Big\| \sum_{j=i}^{i+l-1} b(X_j, \theta_0) \Big\| \le l^{1/2} A^{1/(2+\delta)} N^{1/(2+\delta)}

eventually, for any A > 0. Consequently, V_N \le l^{1/2} A^{1/(2+\delta)} N^{1/(2+\delta)} eventually, for any
A > 0. Taking A arbitrarily close to 0 in a countable set, we obtain V_N = o( l^{1/2} N^{1/(2+\delta)} )
with probability one. Obviously, l^{1/2} N^{1/(2+\delta)} \le N^{1/2} l^{-1} when l = O( N^{\delta/(6+3\delta)} ). Taking
now l of this order, we finally obtain V_N = o( N^{1/2} l^{-1} ) with probability one. Hence
W_N = o( N^{1/2} l^{-1} ) with probability one. □

In the proof of Theorem 3.3.1, we will also need the following result.

Lemma C.0.4. Under the same conditions as in Theorem 3.3.1,

\max_{i=1,...,N} \frac{1}{1 + \lambda^T U_i} = O_P(1).

Proof of Lemma C.0.4. The following inequality holds on the set D_N = \{ \|\lambda\| W_N < 1 \}:

\frac{1}{1 + \lambda^T U_i} \le \frac{1}{1 - \|\lambda\| W_N}.   (C-13)

Since \lambda = O_P( l N^{-1/2} ) and W_N = o_P( l^{-1} N^{1/2} ), it follows that P(D_N) \to 1 and \|\lambda\| W_N =
o_P(1). From (C-13), \max_{i=1,...,N} \frac{1}{1 + \lambda^T U_i} = O_P(1). □

Proof of Theorem 3.3.1. Let U_{1,i} = l^{-1/2} \sum_{j=i}^{i+l-1} b(X_j, \hat\theta) be the (scaled) sum of the
moments at \hat\theta in block B_i, and let U_{1,i}^* be its bootstrap version. For
a fixed sequence X_1, X_2, ..., the bootstrap statistic is built from b (hypothetical) block
observations: \sqrt{n} \bar b_n^*(\hat\theta) = b^{-1/2} \sum_{i=1}^b U_{1,i}^*. Conditionally on \mathcal{X}_n, the U_{1,i}^*, i = 1, ..., b, are iid, with P( U_{1,i}^* = U_{1,j} ) = p_j for
all j = 1, ..., N. Using the definition of \hat F_{\hat\theta}, the (conditional) mean and variance of these
observations are

E_{\hat\theta} [ U_{1,i}^* ] = \sum_{i=1}^N p_i U_{1,i} = 0   (C-14)

and

Var_{\hat\theta} [ U_{1,i}^* ] = \sum_{i=1}^N p_i U_{1,i} U_{1,i}^T.   (C-15)

Since \sqrt{n} \bar b_n^*(\hat\theta) = b^{-1/2} \sum_{i=1}^b U_{1,i}^*, it follows that

Var_{\hat\theta} \Big[ b^{-1/2} \sum_{i=1}^b U_{1,i}^* \Big] = \sum_{i=1}^N p_i U_{1,i} U_{1,i}^T.   (C-16)

Using the same result as in Lahiri (2003, p. 52), we obtain

\frac{1}{N} \sum_{i=1}^N U_{1,i} U_{1,i}^T \to_P \Sigma_\infty.   (C-17)

In order to show that Var_{\hat\theta} [ U_{1,i}^* ] \to_P \Sigma_\infty, we only need to show that

\sum_{i=1}^N p_i U_{1,i} U_{1,i}^T - \frac{1}{N} \sum_{i=1}^N U_{1,i} U_{1,i}^T = o_P(1).   (C-18)

To this end, we use the following inequalities:

\Big\| \sum_{i=1}^N \Big( p_i - \frac{1}{N} \Big) U_{1,i} U_{1,i}^T \Big\|
\le \|\lambda\| \max_{i=1,...,N} \| U_{1,i} \| \max_{i=1,...,N} \frac{1}{1 + \lambda^T U_{1,i}} \cdot \frac{1}{N} \sum_{i=1}^N \| U_{1,i} \|^2
= O_P( l N^{-1/2} ) o_P( l^{-1} N^{1/2} ) O_P(1) O_P(1) = o_P(1),

where, for the last relation, Lemmas C.0.2-C.0.4 are used. Since E_{\hat\theta} [ U_{1,i}^* ] = 0 and
Var_{\hat\theta} [ U_{1,i}^* ] \to_P \Sigma_\infty, the central limit theorem for triangular arrays is used. We only need to
verify that the Lindeberg condition holds, i.e., for every \epsilon > 0,

E_{\hat\theta} [ \| U_{1,i}^* \|^2 I( \| U_{1,i}^* \| > b^{1/2} \epsilon ) ] \to_P 0.   (C-19)

Since the U_{1,i}^*, i = 1, ..., b, are conditionally iid, it follows that

E_{\hat\theta} [ \| U_{1,i}^* \|^2 I( \| U_{1,i}^* \| > b^{1/2} \epsilon ) ] = \sum_{i=1}^N p_i \| U_{1,i} \|^2 I( \| U_{1,i} \| > b^{1/2} \epsilon )
\le \sum_{i=1}^N p_i \| U_{1,i} \|^2 I( \max_{j=1,...,N} \| U_{1,j} \| > b^{1/2} \epsilon ).

Since \max_{i=1,...,N} \| U_{1,i} \| = o_P( N^{1/2} l^{-1} ) = o_P( b^{1/2} ), we have

I( \max_{i=1,...,N} \| U_{1,i} \| > b^{1/2} \epsilon ) \to_P 0.

Since \sum_{i=1}^N p_i \| U_{1,i} \|^2 \to_P E \| b(X_1, \theta_0) \|^2 (similarly to (C-17)), it follows that (C-19) holds.
Using the same subsequence argument as before, it follows that any subsequence has a
further subsequence that converges weakly, conditionally almost surely, so that the full
sequence converges weakly, conditionally in probability. □
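The block construction used throughout Appendix C can be sketched numerically: scaled sums U_i = l^{-1/2} \sum_{j=i}^{i+l-1} b(X_j, \hat\theta) over the N = n - l + 1 overlapping blocks, from which a block-bootstrap resample concatenates roughly b = n/l randomly chosen blocks. The AR(1) series, the block-length rate, and the uniform block selection below are illustrative assumptions; the biased bootstrap of Theorem 3.3.1 would draw blocks with the weights p_i instead of uniformly.

```python
# Illustrative overlapping-block construction and a uniform block-bootstrap resample.
import numpy as np

rng = np.random.default_rng(6)
n = 1024
l = int(n ** 0.25)                   # block length growing with n (illustrative rate)
eps = rng.normal(size=n)
x = np.empty(n)
x[0] = eps[0]
for t in range(1, n):                # AR(1) dependence
    x[t] = 0.5 * x[t - 1] + eps[t]

bvals = x - x.mean()                 # centered moments b(X_j, theta-hat)
N = n - l + 1                        # number of overlapping blocks
# Scaled overlapping block sums via a cumulative sum.
cs = np.concatenate([[0.0], np.cumsum(bvals)])
U = (cs[l:] - cs[:-l]) / np.sqrt(l)  # U_1, ..., U_N

nblocks = n // l
starts = rng.integers(0, N, size=nblocks)
resample = np.concatenate([bvals[s:s + l] for s in starts])
print(N, U.shape, resample.shape)
```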

Proof of Theorem 3.3.3. For simplicity we consider the case when q = 1. By Taylor's
theorem, there exists a \tilde\theta^* on the line between \hat\theta and \hat\theta^* such that

0 = \bar b_n^*(\hat\theta^*) = \bar b_n^*(\hat\theta) + \nabla \bar b_n^*(\hat\theta) (\hat\theta^* - \hat\theta) + \frac{1}{2} (\hat\theta^* - \hat\theta)^T \nabla^2 \bar b_n^*(\tilde\theta^*) (\hat\theta^* - \hat\theta).   (C-20)

Using Assumption 3.3.2, it follows that \| \nabla^2 \bar b_n^*(\tilde\theta^*) \| \le n^{-1} \sum_i k(X_i^*). Hence

(\hat\theta^* - \hat\theta)^T \nabla^2 \bar b_n^*(\tilde\theta^*) = o_P^*(1).

Since \nabla \bar b_n^*(\hat\theta) = D + o_P^*(1), from (C-20) it follows that

-\bar b_n^*(\hat\theta) = \Big( D + o_P^*(1) + \frac{1}{2} (\hat\theta^* - \hat\theta)^T \nabla^2 \bar b_n^*(\tilde\theta^*) \Big) (\hat\theta^* - \hat\theta)
= ( D + o_P^*(1) ) (\hat\theta^* - \hat\theta).   (C-21)

Hence,

\sqrt{n} (\hat\theta^* - \hat\theta) = -D^{-1} \sqrt{n} \bar b_n^*(\hat\theta) + o_P^*(1).   (C-22)

Using Theorem 3.3.1, it follows that \sqrt{n} \bar b_n^*(\hat\theta) \to_d N(0, \Sigma_\infty), and hence, using the
(conditional) Slutsky theorem, we have

\sqrt{n} (\hat\theta^* - \hat\theta) \to_d N( 0, D^{-1} \Sigma_\infty (D^{-1})^T ).   (C-23)

□









REFERENCES


Andrews, D. W. (2000). Inconsistency of the bootstrap when a parameter is on the
boundary of the parameter space. Econometrica 68, 399-405.

Angrist, J. D., G. W. Imbens, and D. B. Rubin (1996). Identification of causal effects
using instrumental variables. Journal of the American Statistical Association 91,
444-455.

Angrist, J. D. and A. B. Krueger (1991). Does compulsory school attendance affect
schooling and earnings? Quarterly Journal of Economics 106, 979-1014.

Babu, G. J. (1984). Bootstrapping statistics with linear combinations of chi-squares as
weak limit. Sankhyā Ser. A 46, 85-93.

Baggerly, K. A. (1998). Empirical likelihood as a goodness-of-fit measure. Biometrika 85,
535-547.

Beran, R. (1987). Prepivoting to reduce level error of confidence sets. Biometrika 74,
457-468.

Beran, R. (1988). Prepivoting test statistics: A bootstrap view of asymptotic refinements.
Journal of the American Statistical Association 83(403), 687-697.

Bhattacharya, R. N. and J. K. Ghosh (1978). On the validity of the formal Edgeworth
expansion. The Annals of Statistics 6, 434-451.

Bickel, P. J. and D. A. Freedman (1981). Some asymptotic theory for the bootstrap. The
Annals of Statistics 9, 1196-1217.

Bose, A. (1988). Edgeworth correction by bootstrap in autoregressions. The Annals of
Statistics 16, 1709-1722.

Brown, B. W. and W. K. Newey (1998). Efficient semiparametric estimation of
expectations. Econometrica 66, 453-464.

Brown, B. W. and W. K. Newey (2002). GMM, efficient bootstrapping, and improved
inference. Journal of Business and Economic Statistics 20, 507-517.

Brundy, J. and D. Jorgenson (1971). Efficient estimation of simultaneous equations by
instrumental variables. Review of Economics and Statistics 53, 207-224.

Bühlmann, P. (1997). Sieve bootstrap for time series. Bernoulli 3, 123-148.

Bustos, O. H. (1982). General M-estimates for contaminated p-th order autoregressive
processes: consistency and asymptotic normality. Zeitschrift für Wahrscheinlichkeits-
theorie und Verwandte Gebiete 59, 491-504.

Butcher, K. F. and A. Case (1994). The effect of sibling composition on women's
education and earnings. Quarterly Journal of Economics 109, 531-563.









Canty, A. J., A. C. Davison, D. V. Hinkley, and V. Ventura (2000). Bootstrap diagnostics.
Technical Report 726, Department of Statistics, Carnegie Mellon University.

Card, D. (1995). Using geographic variation in college proximity to estimate the returns to
schooling. In Aspects of Labor Market Behavior: Essays in Honor of John Vanderkamp.
University of Toronto Press.

Carlstein, E. (1986). The use of subseries values for estimating the variance of a general
statistic from a stationary sequence. The Annals of Statistics 14, 1171-1179.

Choi, E. and P. Hall (2000). Bootstrap confidence regions computed from autoregressions
of arbitrary order. Journal of the Royal Statistical Society, Series B (Statistical
Methodology) 62, 461-477.

Corcoran, S. A. (1998). Bartlett adjustment of empirical discrepancy statistics.
Biometrika 85, 967-972.

Datta, S. (1995). On a modified bootstrap for certain asymptotically nonnormal statistics.
Statistics & Probability Letters 24, 91-98.

Davidson, R. and J. G. Mackinnon (1993). Estimation and Inference in Econometrics.
New York: Oxford University Press.

Davison, A. and D. V. Hinkley (1997). Bootstrap Methods and their Applications.
Cambridge University Press, Cambridge, UK.

Davison, A., D. V. Hinkley, and G. Young (2003). Recent developments in bootstrap
methodology. Statistical Science 18, 141-157.

DiCiccio, T. J., P. Hall, and J. P. Romano (1991). Empirical likelihood is
Bartlett-correctable. The Annals of Statistics 19(2), 1053-1061.

DiCiccio, T. J. and J. P. Romano (1989). The automatic percentile method: accurate
confidence limits in parametric models. Canadian Journal of Statistics 17(2), 155-169.

DiCiccio, T. J. and J. P. Romano (1990). Nonparametric confidence limits by resampling
methods and least favorable families. International Statistical Review 58, 59-76.

Driscoll, M. F. (1999). An improved result relating quadratic forms and chi-square
distributions. The American Statistician 53, 273-275.

Durbin, J. (1954). Errors in variables. Review of the International Statistical Institute 22,
23-32.

Efron, B. (1979). Bootstrap methods: another look at the jackknife. The Annals of Statis-
tics 7, 1-26.

Efron, B. (1992). Jackknife-after-bootstrap standard errors and influence functions.
Journal of the Royal Statistical Society, Series B 54, 83-127.









Efron, B. (2003). Second thoughts on the bootstrap. Statistical Science 18, 135-140.

Efron, B. and R. Tibshirani (1986). Bootstrap methods for standard errors, confidence
intervals, and other measures of statistical accuracy. Statistical Science 1, 54-77.

Efron, B. and R. J. Tibshirani (1993). An Introduction to the Bootstrap. New York:
Chapman and Hall.

Freedman, D. A. (1981). Bootstrapping regression models. The Annals of Statistics 9,
1218-1226.

Freedman, D. A. (1984). On bootstrapping two-stage least squares estimates in stationary
linear models. The Annals of Statistics 12, 827-842.

Freedman, D. A. and S. F. Peters (1984). Bootstrapping an econometric model: Some
empirical results. Journal of Business and Economic Statistics 2, 150-158.

Giné, E. (1990). Bootstrapping general empirical measures. The Annals of Probability 18,
851-869.

Giurcanu, M. and A. Trindade (2006). Establishing consistency of M-estimators under
concavity with an application to some financial risk measures. Technical Report 00000,
Department of Statistics, University of Florida.

Hahn, J. (1996). A note on bootstrapping generalized method of moments estimators.
Econometric Theory 12, 187-196.

Hall, A. R. (2005). Generalized Method of Moments. New York: Oxford University Press.

Hall, P. (1985). Resampling a coverage process. Stochastic Processes and their Applica-
tions 20, 231-246.

Hall, P. (1988). Theoretical comparison of bootstrap confidence intervals. The Annals of
Statistics 16, 927-953.

Hall, P. (1992). The Bootstrap and Edgeworth Expansion. New York: Springer-Verlag.

Hall, P., J. Horowitz, and B. Jing (1995). On blocking rules for the bootstrap with
dependent data. Biometrika 82, 561-574.

Hall, P. and J. L. Horowitz (1996). Bootstrap critical values for tests based on
generalized-method-of-moments estimators. Econometrica 64, 891-916.

Hall, P. and B. La Scala (1990). Methodology and algorithms of empirical likelihood.
International Statistical Review 58, 109-127.

Hall, P. and M. A. Martin (1988). On bootstrap resampling and iteration. Biometrika 75,
661-671.









Hall, P. and B. Presnell (1999). Intentionally biased bootstrap methods. Journal of the
Royal Statistical Society, Series B 61, 143-158.

Hansen, L., J. Heaton, and A. Yaron (1996). Finite-sample properties of some alternative
GMM estimators. Journal of Business and Economic Statistics 14, 262-280.

Hansen, L. P. (1982). Large sample properties of generalized method of moments
estimators. Econometrica 50, 1029-1054.

Hayashi, F. (2000). Econometrics. Princeton University Press.

Hogan, J. W. and T. Lancaster (2004). Instrumental variables and inverse probability
weighting for causal inference from longitudinal observational studies. Statistical
Methods for Medical Research 13, 17-48.

Huber, P. J. (1981). Robust Statistics. New York: Wiley-Interscience.

Imbens, G. W. (2002). Generalized method of moments and empirical likelihood. Journal
of Business & Economic Statistics 20, 493-506.

Jing, B. Y. and A. T. A. Wood (1996). Exponential empirical likelihood is not Bartlett
correctable. Annals of Statistics 24, 365-369.

Kallenberg, O. (2002). Foundations of Modern Probability. New York: Springer.

Kitamura, Y. (1997). Empirical likelihood methods with weakly dependent processes. The
Annals of Statistics 25, 2084-2102.

Kolaczyk, E. D. (1994). Empirical likelihood for generalized linear models. Statistica
Sinica 4, 199-218.

Kreiss, J. P. (1992). Bootstrap procedures for AR(∞) processes. In Jöckel, K. H., Rothe,
G., and Sendler, W. (Eds.), Bootstrapping and Related Techniques. Heidelberg: Springer.

Künsch, H. R. (1989). The jackknife and the bootstrap for general stationary
observations. The Annals of Statistics 17, 1217-1241.

Lahiri, S. N. (2003). Resampling Methods for Dependent Data. New York: Springer-Verlag.

Liang, K. Y. and S. L. Zeger (1987). Longitudinal data analysis using generalized linear
models. Biometrika 73, 13-22.

Lindsay, B. G. and A. Qu (2003). Inference functions and quadratic score tests. Statistical
Science 18, 394-410.

Matyas, L. (Ed.) (1999). Generalized Method of Moments Estimation. New York: Cambridge
University Press.









Newey, W. K. and D. L. McFadden (1994). Large sample estimation and hypothesis
testing. In Handbook of Econometrics, Volume IV. Amsterdam: North-Holland.

Newey, W. K. and R. J. Smith (2004). Higher order properties of GMM and generalized
empirical likelihood estimators. Econometrica 72, 219-255.

Newhouse, J. P. and M. McClellan (1998). Econometrics in outcomes research: The use of
instrumental variables. Annual Review of Public Health 19, 17-34.

Newton, M. A. and C. J. Geyer (1994). Bootstrap recycling: A Monte Carlo alternative to
the nested bootstrap. Journal of the American Statistical Association 89, 905-912.

Owen, A. B. (1988). Empirical likelihood ratio confidence intervals for a single functional.
Biometrika 75, 237-249.

Owen, A. B. (1990). Empirical likelihood ratio confidence regions. The Annals of
Statistics 18, 90-120.

Owen, A. B. (1991). Empirical likelihood for linear models. The Annals of Statistics 19,
1725-1747.

Owen, A. B. (2001). Empirical Likelihood. New York: Chapman & Hall/CRC.

Park, C. (2000). Robust estimation and testing based on quadratic inference functions.
Ph.D. dissertation, Dept. of Statistics, Pennsylvania State Univ.

Politis, D. N. and J. P. Romano (1992). A circular block-resampling procedure for
stationary data. In R. Lepage and L. Billard (Eds.), Exploring the Limits of Bootstrap.
New York: Wiley.

Politis, D. N. and J. P. Romano (1994). The stationary bootstrap. Journal of the
American Statistical Association 89, 1303-1313.

Pollard, D. (1991). Asymptotics for least absolute deviations regression estimators.
Econometric Theory 7, 186-199.

Presnell, B. (2002). A note on least favorable families and power divergence. Technical
Report 0000, Department of Statistics, University of Florida.

Presnell, B. and M. Giurcanu (2007). Biased-bootstrap recycling. Technical Report 0000,
Department of Statistics, University of Florida.

Qin, J. and J. Lawless (1994). Empirical likelihood and general estimating equations. The
Annals of Statistics 22, 300-325.

Qu, A., B. G. Lindsay, and B. Li (2000). Improving generalized estimating equations using
quadratic inference functions. Biometrika 87, 823-836.









Qu, A. and P. X. Song (2002). Testing ignorable missingness in estimating equation
approaches for longitudinal data. Biometrika 89, 841-850.

Reiersøl, O. (1941). Confluence analysis by means of lag moments and other methods of
confluence analysis. Econometrica 9, 1-24.

Sargan, J. D. (1958). The estimation of economic relationships using instrumental
variables. Econometrica 26, 393-415.

Serfling, R. (1980). Approximation Theorems of Mathematical Statistics. New York: Wiley.

Shao, J. and D. Tu (1995). The Jackknife and Bootstrap. New York: Springer.

Shorack, G. (1981). Bootstrapping robust regression. Technical Report 8, Department of
Statistics, University of Washington.

Smith, R. (1997). Alternative semiparametric likelihood approaches to Generalized
Method of Moments estimation. Economic Journal 107, 503-519.

Stein, C. (1956). Efficient nonparametric testing and estimation. In J. Neyman (Ed.), Pro-
ceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability,
Volume 1, pp. 187-195. University of California Press.

Tauchen, G. (1986). Statistical properties of generalized method-of-moments estimators
of structural parameters obtained from financial market data. Journal of Business &
Economic Statistics 4, 397-416.

Tibshirani, R. (1988). Variance stabilization and the bootstrap. Biometrika 75(3),
433-444.

van der Vaart, A. W. (1998). Asymptotic Statistics. New York: Springer-Verlag.

Ventura, V. (2000). Non-parametric bootstrap recycling. Technical Report 673,
Department of Statistics, Carnegie Mellon University.

White, H. (1982a). Instrumental variables regression with independent observations.
Econometrica 50, 483-499.

White, H. (1982b). Maximum likelihood estimation of misspecified models. Economet-
rica 50, 1-25.

Wu, C. F. J. (1986). Jackknife, bootstrap and other resampling methods in regression
analysis (with Discussion). The Annals of Statistics 14, 1261-1350.









BIOGRAPHICAL SKETCH

Mihai Giurcanu was born on September 19, 1975, in Mangalia, Romania. Upon
graduation from high school in July 1994, he enrolled as a student in the Faculty of
Mathematics at the University of Bucharest, where he received a Bachelor of Arts degree
in Mathematics in July 1998. In September 1998 he entered a Master's program in
Applied Statistics and Optimization at the Faculty of Mathematics at the University of
Bucharest. During this time, he was appointed as a Teaching Assistant in Mathematics at
the Polytechnic University of Bucharest. In July 2000, he received a Master of Arts degree
in Applied Statistics and Optimization from the University of Bucharest. In November
2000, he obtained a fellowship in Statistics at the Weierstrass Institute of Berlin. In
August 2002 he entered the Ph.D. program in the Department of Statistics at the
University of Florida. During his graduate education at the University of Florida, he was
also appointed as a teaching assistant and instructor for different classes in the
Department of Statistics. He graduated in August 2007. His dissertation is entitled
Biased Bootstrap Methods for Semiparametric Models. His main research interests in
Statistics are resampling techniques, biostatistics, econometrics, and time series.





ACKNOWLEDGMENTS

I am indebted to many people who gave me support and advice while a graduate student. First, I want to thank Professor Brett Presnell for giving me the opportunity to work on very interesting statistical topics during my graduate education. He was an invaluable source of inspiration in this fascinating research. This dissertation would not have been possible without his constant feedback and guidance. Our weekly discussions helped me better understand the statistical issues involved in our research. I also want to thank my graduate committee members, Professor Malay Ghosh, Professor Alex Trindade, Professor Jim Hobert, and Professor Murali Rao, for reading my research. Our conversations on different statistical topics improved my understanding and vision of statistics. I also want to thank my first mathematics teacher, Professor Ignatie Henny, who made me discover the beauty of mathematics. I especially want to thank my parents for their love and interest in my education, and my family, my wife Magda and my children Michael and Stefanie, for their love, support, encouragement, and understanding when sometimes I had to stay long hours at school to finish this dissertation.

TABLE OF CONTENTS

page

ACKNOWLEDGMENTS ..... 4
LIST OF TABLES ..... 7
LIST OF FIGURES ..... 8
ABSTRACT ..... 9

CHAPTER

1 ESTIMATION IN MOMENT CONDITION MODELS ..... 11
  1.1 Introduction ..... 11
  1.2 Generalized Method of Moments ..... 11
    1.2.1 Review on Generalized Method of Moments ..... 11
    1.2.2 GMM Estimation and Asymptotic Results ..... 13
    1.2.3 Nested GMM Models ..... 17
  1.3 M-Estimation ..... 18
  1.4 Empirical Likelihood Estimation ..... 20
    1.4.1 Review on Empirical Likelihood ..... 20
    1.4.2 Empirical Likelihood for Moment Condition Models ..... 22

2 THE BIASED BOOTSTRAP WITH IID OBSERVATIONS ..... 27
  2.1 Review on the Bootstrap with IID Observations ..... 27
  2.2 Least Favorable Families Corresponding to the Z-Estimation Model ..... 31
  2.3 The Biased Bootstrap for GMM ..... 33
    2.3.1 Consistency Results for the Biased Bootstrap ..... 35
    2.3.2 The Biased Bootstrap Recycling ..... 38
  2.4 Instrumental Variables ..... 42
    2.4.1 Review on Instrumental Variables ..... 42
    2.4.2 Bootstrapping 2SLS Estimators ..... 48
    2.4.3 Simulations ..... 51

3 THE BLOCK BIASED BOOTSTRAP FOR TIME SERIES ..... 59
  3.1 Review on Bootstrap for Time Series ..... 59
  3.2 The Block Biased Bootstrap for Generalized M-Estimators ..... 60
  3.3 Consistency Results for the Block Biased Bootstrap ..... 63
  3.4 Iterated Block Biased Bootstrap Recycling ..... 65
  3.5 An Application to the Optimal Block Size Selection ..... 67

4 A HYBRID BIASED BOOTSTRAP ..... 74
  4.1 On the Boundary of the Parametric Space ..... 74
  4.2 Certain Asymptotically Nonnormal Statistics ..... 76

A PROOFS OF CONSISTENCY RESULTS FROM CHAPTER I ..... 79

B PROOFS OF RESULTS FROM CHAPTER II ..... 82
  B.1 Least Favorable Families ..... 82
  B.2 Consistency of the Biased Bootstrap for GMM Estimators ..... 84
  B.3 Consistency Results for Bootstrapping 2SLS Estimators ..... 95

C PROOFS OF CONSISTENCY RESULTS FROM CHAPTER III ..... 100

REFERENCES ..... 106

BIOGRAPHICAL SKETCH ..... 112

LIST OF TABLES

Table                                                                  page

2-1 Estimated coverage probabilities for bootstrap one-sided, upper confidence intervals at the .90 nominal level (model parameter .8), B = 1000, S = 1000: GMM biased bootstrap (GBB), GMM uniform bootstrap (GUB), centered residuals (FCR), and based on asymptotic approximation (ASY) ..... 56

2-2 Estimated coverage probabilities for bootstrap one-sided, upper confidence intervals at the .95 nominal level (model parameter .8), B = 1000, S = 1000: GMM biased bootstrap (GBB), GMM uniform bootstrap (GUB), centered residuals (FCR), and based on asymptotic approximation (ASY) ..... 57

3-1 Computation of the optimal block size for uniform block bootstrap estimation of the level-2 parameters given by (3-22) and (3-23). The number of simulations is S = 1000, and the number of bootstrap resamples is B = 1000. An asterisk (*) marks the block size at which the minimum RMSE is attained. ..... 72

3-2 Computation of the optimal block size for moving block biased bootstrap estimation of the level-2 parameters given by (3-22) and (3-23). The number of simulations is S = 1000, and the number of bootstrap resamples is B = 1000. An asterisk (*) marks the block size at which the minimum RMSE is attained. ..... 72

4-1 The quantiles of the distribution of T_n under θ = 0, and their bootstrap approximations given by the ordinary (uniform) bootstrap (UB) and the "hybrid" biased bootstrap (HBB), using B = 1000 bootstrap resamples, S = 1000 simulation runs, and a tuning sequence of order n^{.4} ..... 78

4-2 The quantiles of the asymptotic distribution of T_n under θ = 0.2, and their bootstrap approximations given by the ordinary (uniform) bootstrap (UB) and the "hybrid" biased bootstrap (HBB), using B = 1000 bootstrap resamples, S = 1000 simulation runs, and a tuning sequence of order n^{.4} ..... 78

LIST OF FIGURES

Figure                                                                 page

2-1 Estimated coverage errors corresponding to different bootstrap confidence intervals, at different levels, sample sizes n = 20, 40: GMM Biased Bootstrap (GBB), GMM uniform bootstrap, GMM Recycling Biased Bootstrap (RBB), centered residuals (FCR), and based on asymptotic approximation (ASY) ..... 57

2-2 Estimated coverage errors corresponding to different bootstrap confidence intervals, at different levels, sample sizes n = 60, 80: GMM Biased Bootstrap (GBB), GMM uniform bootstrap, GMM Recycling Biased Bootstrap (RBB), centered residuals (FCR), and based on asymptotic approximation (ASY) ..... 58

2-3 Estimated coverage errors corresponding to different bootstrap confidence intervals, at different levels, sample sizes n = 100, 200: GMM Biased Bootstrap (GBB), GMM uniform bootstrap, GMM Recycling Biased Bootstrap (RBB), centered residuals (FCR), and based on asymptotic approximation (ASY) ..... 58

3-1 The bootstrap estimates of the RMSEs corresponding to different block bootstrap schemes. We used B = 1000 outer bootstrap resamples (for the biased bootstrap recycling (BBR) and the adjusted biased bootstrap recycling (ABBR)) and, for the uniform double bootstrap (UB) and the double biased bootstrap (BB), an additional 500 inner bootstrap resamples for each outer bootstrap resample ..... 73

1.2.1 Review on Generalized Method of Moments

The generalized method of moments was first introduced in the econometric literature by Hansen (1982) and has been widely applied to time series, cross-sectional data, and panel models, particularly with applications to economic and financial data. This methodology generalizes many standard estimation methods, including maximum likelihood (ML), ordinary least squares (OLS), generalized estimating equations (GEE), and the method of moments (MM), by allowing the number of estimating equations to exceed the number of free parameters.

As Hall (2005) remarks in a recent econometrics textbook, GMM has had a great impact in the econometrics literature, mostly because economic models rarely provide a complete specification of the probability distributions of the data. Moreover, the optimality properties of maximum likelihood estimators (MLEs) are only attained when the distributional assumptions are correctly specified. Under misspecification, White (1982b) proved that (pseudo) MLEs are no longer optimal, emphasizing the need for alternative methods of estimation to reduce the impact of misspecification in parametric modeling.

In a recent review paper, Lindsay and Qu (2003) remark that GMM combines estimating equations in an optimal way and that the corresponding procedures are highly efficient in the following sense: GMM estimators are equivalent, in terms of asymptotic variance, to the estimators based on the best linear combinations of the estimating equations. In particular, if the set of estimating equations (moment conditions) contains the score equations of a correctly specified parametric model, then the GMM estimator is asymptotically equivalent to the MLE, though a second-order effect may be evident with smaller samples when additional estimating equations are included in the set of scores.

Qu et al. (2000) argue that GMM can be used to improve the efficiency of generalized estimating equations in longitudinal data models (Liang and Zeger 1987). They optimally combine an extended set of scores in such a way that, under misspecification, the estimators are more efficient than those based on the GEE for a given "working correlation matrix". Park (2000) uses GMM to balance robustness and efficiency of point estimators, combining both efficient and robust scores in order to obtain a GMM estimator, associated with the implied semiparametric model, that is efficient for both heavy- and light-tailed distributions. Qu and Song (2002) use the GMM J-test of over-identifying restrictions to test whether missing data is ignorable, by constructing an additional set of scores based on different missing data patterns.

1.2.2 GMM Estimation and Asymptotic Results

The GMM model is specified by a moment condition (Hall 2005):

    E[b(X_1; θ_0)] = E_F[b_{θ_0}] = 0.    (1-1)

Having defined the moment condition that identifies the GMM model, we also require that θ_0 be globally identifiable (Hall 2005):

    E_F[b_θ] ≠ 0 for all θ ∈ Θ with θ ≠ θ_0.    (1-2)

Usually, in order for θ_0 to be globally identifiable, it is necessary for the dimension of b (the "basic score") to equal or exceed the dimension of the parameter vector, i.e., q ≥ p. Henceforth we will assume that this condition is satisfied. When p = q, the Z-estimator for θ_0 = θ(F) is obtained by solving the sample version of (1-1),

    E_{F_n}[b_θ] = (1/n) ∑_{i=1}^n b(X_i; θ) = 0.    (1-3)
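As a concrete toy illustration of solving the sample estimating equation when p = q (a hypothetical pure-Python sketch; the score choice and function names are invented for exposition, not taken from the text): with the monotone median-type score b(x; θ) = 0.5·sign(x − θ), the sample score is nonincreasing in θ, so the root of the sample equation can be found by bisection.

```python
# Hypothetical sketch: Z-estimation with the score b(x, t) = 0.5 * sign(x - t).
# The sample estimating equation (1/n) * sum_i b(X_i, t) = 0 is solved by
# bisection, using the fact that the sample score is nonincreasing in t.

def b(x, t):
    return 0.5 * (1 if x > t else -1 if x < t else 0)

def sample_score(data, t):
    return sum(b(x, t) for x in data) / len(data)

def z_estimate(data, lo, hi, tol=1e-9):
    # Invariant: sample_score(lo) > 0 >= sample_score(hi).
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if sample_score(data, mid) > 0:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

data = [1.0, 2.0, 3.0, 10.0]
theta_hat = z_estimate(data, min(data), max(data))
```

The resulting estimate is a sample median, which illustrates how a Z-estimator is determined entirely by the choice of score function.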

Under standard regularity conditions, the Z-estimator θ̂ is consistent and asymptotically normal (van der Vaart 1998, Theorem 5.21, p. 52):

    √n (θ̂ − θ_0) →_d N(0, B⁻¹ V B⁻ᵀ),    (1-4)

where B = E_F[∇_θ b_{θ_0}] and V = Cov_F[b_{θ_0}]. The sandwich estimator Σ̂, the nonparametric estimator of the asymptotic covariance matrix of the Z-estimator, is given by

    Σ̂ = ((1/n) ∑_{i=1}^n ∇_θ b(X_i; θ̂))⁻¹ ((1/n) ∑_{i=1}^n b(X_i; θ̂) b(X_i; θ̂)ᵀ) ((1/n) ∑_{i=1}^n ∇_θ b(X_i; θ̂))⁻ᵀ.    (1-5)

The sample equation (1-3) usually has no solution when q > p, even though the population equation (1-1) is satisfied. One way around this problem is to consider the GMM estimator,

    θ̂_n = argmin_{θ∈Θ} b_n(θ)ᵀ W_n b_n(θ),    (1-6)

where b_n(θ) = (1/n) ∑_{i=1}^n b(X_i; θ) and W_n is a positive semidefinite (possibly data-dependent) weight matrix. Hansen (1982) first showed that under classical regularity conditions, the GMM estimator is consistent and asymptotically normal. Usually, consistency results for GMM estimators are obtained assuming one of two types of conditions. Either the parameter space is assumed to be compact, in which case the criterion function is required to be continuous
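The quadratic GMM criterion can be made concrete with a small sketch (hypothetical pure Python; the moment functions and data are invented for illustration): a scalar location parameter with q = 2 moment conditions b(x, θ) = (x − θ, (x − θ)² − 1), weight matrix W = I, minimized by grid search.

```python
# Hypothetical sketch of the GMM criterion b_n(t)' W b_n(t) with W = I,
# for q = 2 illustrative moment conditions and a scalar parameter t.

def bn(data, t):
    n = len(data)
    m1 = sum(x - t for x in data) / n
    m2 = sum((x - t) ** 2 - 1.0 for x in data) / n
    return (m1, m2)

def gmm_criterion(data, t):
    m1, m2 = bn(data, t)
    return m1 * m1 + m2 * m2  # quadratic form with W = identity

def gmm_estimate(data, grid):
    # A grid search stands in for a proper numerical optimizer.
    return min(grid, key=lambda t: gmm_criterion(data, t))

data = [0.2, -0.5, 1.3, 0.9, -0.1]
grid = [i / 1000.0 for i in range(-2000, 2001)]
theta_hat = gmm_estimate(data, grid)
```

Because q = 2 > p = 1, neither sample moment is exactly zero at the minimizer; the criterion balances the two conditions rather than solving them.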

(Hayashi 2000, pp. 456-458). We present here a general consistency result based on Theorem 5.7 of van der Vaart (1998, p. 45). Suppose that for every ε > 0,

    sup_{θ∈Θ} |Q_n(θ) − Q(θ)| →_P 0    (1-8)

and

    inf_{θ: ‖θ−θ_0‖ ≥ ε} Q(θ) > Q(θ_0).    (1-9)

Then any sequence of estimators θ̂_n with Q_n(θ̂_n) ≤ Q_n(θ_0) + o_P(1) converges in probability to θ_0. It is easy to see that if Θ is compact and E_F[b_θ] is continuous in θ, then the global identifiability property (1-2) and condition (1-9) are equivalent. Moreover, if we further assume that b(x; θ) is continuous in θ for all x and that E_F[sup_{θ∈Θ} ‖b_θ‖] < ∞, then using a Uniform Law of Large Numbers (van der Vaart 1998, Example 19.8, p. 272), we can also establish (1-8). Consequently, we have established the following corollary (for other proofs, see, e.g., Hall (2005, p. 67), Davidson and Mackinnon (1993, p. 592), and Matyas (1999, p. 13)).

Corollary 1.2.1. Suppose that θ_0 and b satisfy the moment condition and global identifiability assumptions, (1-1) and (1-2), respectively. Suppose further that Θ ⊂ R^p is compact, that b(x; θ) is continuous in θ, and that E_F[sup_{θ∈Θ} ‖b_θ‖] < ∞. Then the GMM estimator defined in (1-6) is weakly consistent, i.e., θ̂_n →_P θ_0.

Under additional regularity conditions, Hansen (1982) showed that GMM estimators are asymptotically normally distributed. We present here the most common conditions found in the statistical and econometrics literature to assure asymptotic normality. If, in addition to the conditions of Corollary 1.2.1, b(x; θ) is continuously differentiable with suitably dominated derivatives in a neighborhood of θ_0, then the GMM estimator is asymptotically normal.

Hansen (1982) proves that the choice W_n = V̂⁻¹ gives the smallest asymptotic variance of GMM estimators, where V̂ is any consistent estimator of V (note, though, that W_n is then random). In this case the GMM estimator is called efficient. However, in order to estimate V̂, we must first estimate θ_0. Hansen proposed a two-step GMM estimator, which is obtained by first computing an initial (inefficient) estimator θ̃_n of θ_0 using an arbitrary weight matrix W (usually the identity), and then letting θ̂_n = argmin_θ b_n(θ)ᵀ V̂(θ̃_n)⁻¹ b_n(θ), where V̂(θ) is an estimator of the asymptotic covariance of b_n(θ). Usually, V̂(θ) = n⁻¹ ∑_{i=1}^n b(X_i; θ) b(X_i; θ)ᵀ, or the centered version V̂(θ) = n⁻¹ ∑_{i=1}^n (b(X_i; θ) − b_n(θ))(b(X_i; θ) − b_n(θ))ᵀ in semiparametric estimation, but the covariance can also be modeled parametrically. The asymptotic distribution of the two-step GMM estimator is

    √n (θ̂_n − θ_0) →_d N(0, (Bᵀ V⁻¹ B)⁻¹),    (1-10)

where B = E_F[∇_θ b_{θ_0}] and V = Cov_F[b_{θ_0}]. The continuously updated GMM estimator was introduced by Hansen et al. (1996) and is defined to be

    θ̂_n = argmin_θ b_n(θ)ᵀ V̂(θ)⁻¹ b_n(θ).    (1-11)

It can be shown that the continuous-updating GMM estimator is consistent and has the same asymptotic properties as the two-step GMM estimator given in (1-10).
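The two-step construction can be sketched in a few lines (hypothetical pure Python; the moment functions, data, and helper names are illustrative, not from the text): step one minimizes the criterion with W = I, and step two reweights with the inverse of V̂(θ̃) = n⁻¹ ∑ b(Xᵢ, θ̃) b(Xᵢ, θ̃)ᵀ.

```python
# Hypothetical two-step GMM sketch for a scalar parameter with q = 2
# illustrative moment functions b(x, t) = (x - t, (x - t)**2 - 1).

def b(x, t):
    return (x - t, (x - t) ** 2 - 1.0)

def bn(data, t):
    n = len(data)
    return (sum(b(x, t)[0] for x in data) / n,
            sum(b(x, t)[1] for x in data) / n)

def vhat(data, t):
    # Uncentered second-moment matrix of the scores, stored as (v11, v12, v22).
    n = len(data)
    v11 = v12 = v22 = 0.0
    for x in data:
        g1, g2 = b(x, t)
        v11 += g1 * g1
        v12 += g1 * g2
        v22 += g2 * g2
    return (v11 / n, v12 / n, v22 / n)

def inv2(v11, v12, v22):
    det = v11 * v22 - v12 * v12
    return (v22 / det, -v12 / det, v11 / det)

def criterion(data, t, w):
    w11, w12, w22 = w
    m1, m2 = bn(data, t)
    return w11 * m1 * m1 + 2.0 * w12 * m1 * m2 + w22 * m2 * m2

def argmin_grid(data, w, grid):
    return min(grid, key=lambda t: criterion(data, t, w))

data = [0.2, -0.5, 1.3, 0.9, -0.1]
grid = [i / 500.0 for i in range(-1000, 1001)]
step1 = argmin_grid(data, (1.0, 0.0, 1.0), grid)           # W = identity
step2 = argmin_grid(data, inv2(*vhat(data, step1)), grid)  # W = V_hat(step1)^{-1}
```

Replacing the fixed weight matrix in step two by V̂(θ)⁻¹ evaluated at the candidate θ itself would give the continuous-updating variant.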

The J-test of over-identifying restrictions, introduced by Hansen (1982), uses the statistic

    J_n = n b_n(θ̂_n)ᵀ V̂(θ̂_n)⁻¹ b_n(θ̂_n).    (1-12)

Under the moment condition (1-1), the statistic (1-12) for testing the over-identifying restrictions satisfies J_n →_d χ²_{q−p}.

1.2.3 Nested GMM Models

Lindsay and Qu (2003) analyzed the GMM methodology within parametric, semiparametric, and nonparametric frameworks. They identify three levels of nested models. At the first level, we find the parametric model. Here, in general, we identify some basic scores that define the parameter of interest. For example, for a univariate distribution, we can take b_1(x; θ) = x − θ as a basic score for the mean and b_2(x; θ) = 0.5·sign(x − θ) as a basic score for the median. In the case of a normal distribution with known variance, say N(θ_0, 1), both basic scores provide consistent estimators of the location parameter θ_0. In this case, any efficient GMM estimator based on the basic scores b_1 and b_2 is fully efficient, in the sense that its asymptotic variance equals the inverse Fisher information. If the parametric model is not correct, Lindsay and Qu (2003) define the semiparametric model implied by the moment conditions to be the set of all distributions F compatible with the scores, i.e., the set of all F for which there exists θ ∈ Θ such that E_F[b_θ] = 0. Then the GMM estimators are consistent under this weakening of the model assumptions. On the other hand, if the parametric model holds and the scores are correctly specified in the semiparametric model, then any efficient GMM estimator is first-order equivalent to the MLE.
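A minimal computation of the J statistic can look as follows (hypothetical pure Python; the q = 2 moment functions and data are invented for illustration). With p = 1 parameter, J_n is compared with the χ²(q − p) = χ²(1) distribution, whose .95 quantile is about 3.841.

```python
# Hypothetical sketch of the over-identification J statistic
# J_n = n * b_n(t)' V_hat(t)^{-1} b_n(t) for illustrative moment functions
# b(x, t) = (x - t, (x - t)**2 - 1), q = 2, p = 1.

def b(x, t):
    return (x - t, (x - t) ** 2 - 1.0)

def bn(data, t):
    n = len(data)
    return (sum(b(x, t)[0] for x in data) / n,
            sum(b(x, t)[1] for x in data) / n)

def vhat_inv(data, t):
    n = len(data)
    v11 = v12 = v22 = 0.0
    for x in data:
        g1, g2 = b(x, t)
        v11 += g1 * g1
        v12 += g1 * g2
        v22 += g2 * g2
    v11, v12, v22 = v11 / n, v12 / n, v22 / n
    det = v11 * v22 - v12 * v12
    return (v22 / det, -v12 / det, v11 / det)

def j_stat(data, t):
    m1, m2 = bn(data, t)
    w11, w12, w22 = vhat_inv(data, t)
    return len(data) * (w11 * m1 * m1 + 2.0 * w12 * m1 * m2 + w22 * m2 * m2)

data = [0.2, -0.5, 1.3, 0.9, -0.1]
grid = [i / 500.0 for i in range(-1000, 1001)]
theta_hat = min(grid, key=lambda t: j_stat(data, t))  # continuous-updating fit
jn = j_stat(data, theta_hat)
reject = jn > 3.841  # reject the over-identifying restrictions at the .05 level
```

Note that minimizing j_stat over θ is exactly the continuous-updating criterion (1-11), so the same quadratic form serves both for estimation and for the specification test.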

Lindsay and Qu (2003) call Q_n(θ) "the quadratic inference function," and they show that this inference function mimics the properties of the log-likelihood, even when the semiparametric model is false.

1.3 M-Estimation

Classical references for the asymptotic theory of M-estimation include van der Vaart (1998), Serfling (1980), and Huber (1981). In this case, the parameter of interest θ_0 is given as a maximizer of the population "criterion function" m(θ), and the M-estimator is a maximizer of the sample criterion function m_n(θ):

    θ̂ = argmax_{θ∈Θ} m_n(θ).    (1-17)
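A small worked case of (1-17) (a hypothetical pure-Python sketch; the criterion choice is illustrative, not from the text): the concave sample criterion m_n(θ) = −(1/n) ∑ᵢ |Xᵢ − θ| is maximized by a sample median.

```python
# Hypothetical illustration of M-estimation: maximize the concave sample
# criterion m_n(t) = -(1/n) * sum_i |X_i - t|. Because m_n is piecewise
# linear with kinks at the data points, a maximizer can be found among
# the observations themselves.

def mn(data, t):
    return -sum(abs(x - t) for x in data) / len(data)

def m_estimate(data):
    return max(data, key=lambda t: mn(data, t))

data = [1.0, 2.0, 3.0, 10.0]
theta_hat = m_estimate(data)
```

Concavity of the criterion is what makes such estimators well behaved, a point the consistency results below exploit.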

An important special case occurs when the criterion function (1-15) is concave (Hayashi 2000, p. 468). Hayashi (2000, p. 458) presents a proposition from Newey and McFadden (1994, pp. 2133-2134) that establishes consistency for M-estimators based on concave criterion functions. Proposition 1.3.1 below gives sufficient conditions for consistency under weaker assumptions. Its proof, given in Appendix A, does not require that the sample criterion functions converge in probability on the entire parameter space, nor does it depend on the result that M-estimators corresponding to continuous criterion functions are consistent. Moreover, it does not require the parameter space to be convex, as in Hayashi (2000, p. 458). The proof uses only the fact that pointwise convergence in probability for concave functions on an open set implies uniform convergence on compact subsets of that open set (Pollard 1991, sec. 6), and it is easily adapted to accommodate nuisance parameters, as in Proposition 1.3.2 (proven in Appendix A). For further details and applications to some financial risk measures, see Giurcanu and Trindade (2006).

Denote by S(t_0; δ) = {t ∈ R^q : ‖t − t_0‖ = δ} and B(t_0; δ) = {t ∈ R^q : ‖t − t_0‖ ≤ δ} the sphere and the closed ball, respectively, centered at t_0 with radius δ.

1.4 Empirical Likelihood Estimation

1.4.1 Review on Empirical Likelihood

Empirical likelihood, introduced in a series of papers by Owen (1988, 1990, 1991), is a nonparametric approach to inference with applications in many areas of statistics. Empirical likelihood allows the use of likelihood methods without necessarily assuming that the data are drawn from a parametric family of distributions. As Owen (2001, p. 1) remarks in his comprehensive monograph on empirical likelihood, the advantages of empirical likelihood arise because "it combines the reliability of the nonparametric methods with the flexibility and effectiveness of the likelihood approach". He adopted the name "empirical likelihood" because the empirical distribution of the data plays an important role. As we will describe later in this section, alternative nonparametric likelihood ratios have been developed that are also based on the empirical distribution function, and as Owen (2001, p. 2) states in his book, "empirical likelihood ... is distinguished more by being a likelihood than by being empirical". The main idea of empirical likelihood is to construct a likelihood ratio statistic for the parameter of interest using a multinomial distribution on the observed data. Owen (1988) proves an analogue of Wilks' Theorem, obtaining a χ² asymptotic distribution for the negative of twice the log empirical likelihood ratio. As Owen (1988) remarks, this result is surprising because the number of nuisance parameters, n − 1, increases with the sample size.

(Owen 1988). Owen (1991) applied empirical likelihood to regression models by extending the theory to independent and non-identically distributed observations. Kolaczyk (1994) made further extensions to generalized linear models. Empirical likelihood was soon recognized as a serious competitor to contemporary methods of nonparametric inference, such as the bootstrap. Hall and La Scala (1990) argue that "empirical likelihood ... deserves a prominent place in the modern statistician's armory of computer-intensive tools". They identify the following advantages of empirical likelihood over the bootstrap:

(1) empirical likelihood provides confidence regions for multivariate parameters, and the shapes are data driven, being concentrated in places where the density of the parameter estimator is greatest;

(2) empirical likelihood is Bartlett correctable, i.e., a correction for the mean reduces the coverage error of confidence regions based on empirical likelihood from order n⁻¹ to order n⁻²;

(3) empirical likelihood does not require estimation of scale or skewness;

(4) empirical likelihood regions are range preserving and transformation respecting.

Imbens (2002) gives a review of recent developments concerning maximum empirical likelihood estimators, which are defined as the maximizers of the empirical likelihood over the parameter space. He remarks that their main merit is that they circumvent the need to estimate the covariance matrix (the asymptotic covariance of the sample criterion function) necessary in the case of GMM estimators, and also that they have a nice information-theoretic interpretation. It turns out that the continuously-updating GMM estimator is a particular case of the generalized empirical likelihood estimator obtained

A chapter of Owen (2001) is dedicated to this subject. By reducing to independence, he shows how to apply empirical likelihood in the case of AR(1) processes. Extensions to autoregressive processes of arbitrary order are easily obtained, and it would be interesting to see how inference based on empirical likelihood competes with the classical approaches in time series. Kitamura (1997) introduced a blockwise empirical likelihood that preserves the dependence structure within the observations. By extending results from Qin and Lawless (1994), he derives an efficient estimator by maximizing the blockwise empirical likelihood. This estimator is called the maximum blockwise empirical likelihood estimator and is the counterpart for time series of the maximum empirical likelihood estimator.

1.4.2 Empirical Likelihood for Moment Condition Models

Consider again the model specified by the moment condition

    E_F[b_{θ_0}] = 0.    (1-18)

If p = q we obtain the classical Z-estimation model, and if q > p, then we obtain the GMM model, as discussed in the previous sections. We consider both models at the same time and, when necessary, underline the differences. We first give some definitions.

Owen (2001, pp. 11-12) shows that we can treat the data as if there were no ties, by considering the probabilities associated with the observations and not with their values. If we represent any distribution G ≪ F_n by a vector of weights p = (p_1, ..., p_n), where p_i = G{X_i}, then the empirical likelihood ratio can be written in the equivalent form

    R(p) = ∏_{i=1}^n n p_i.

Owen (1988) proves the following fundamental result. Let X_1, ..., X_n ∈ R^d be independent random vectors with common distribution F. For θ ∈ R^p and x ∈ R^d, let b(x; θ) ∈ R^p. Let θ_0 be such that Cov_F[b_{θ_0}] is finite and has rank p. If θ_0 satisfies E_F[b_{θ_0}] = 0, then −2 log(R(θ_0)) →_d χ²_p. As Owen (2001, p. 41) remarks in his monograph, an interesting aspect of this asymptotic result is that it does not include conditions on b(x; θ) nor on E_F[b_θ]. Let F_{c,n} = {G : R(G) ≥ c, G ≪ F_n} and S_{c,n} = ∪_{G∈F_{c,n}} {t : E_G[b_t] = 0}. Owen's result suggests taking c = exp(−χ²_p(1−α)/2), where χ²_p(1−α) is the 1−α quantile of χ²_p, in order to obtain an asymptotic 100(1−α)% confidence region for θ_0, i.e.,

    P(θ_0 ∈ S_{c,n}) → 1 − α, as n → ∞.    (1-23)
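For a scalar mean, the log empirical likelihood ratio in Owen's theorem can be computed directly (a hypothetical pure-Python sketch using the standard Lagrange-multiplier form of the optimal weights; function names are invented for exposition). With b_i = X_i − θ, the weights are p_i = 1/(n(1 + λ b_i)), where λ solves ∑ᵢ b_i/(1 + λ b_i) = 0, and −2 log R(θ) = 2 ∑ᵢ log(1 + λ b_i).

```python
import math

# Hypothetical sketch: empirical likelihood for a scalar mean via the
# Lagrange-multiplier equation, solved by bisection.

def el_log_ratio(data, theta):
    n = len(data)
    b = [x - theta for x in data]
    assert max(b) > 0 > min(b)  # theta must lie strictly inside the data range
    # lam must keep every 1 + lam * b_i strictly positive:
    lo = -1.0 / max(b) + 1e-10
    hi = -1.0 / min(b) - 1e-10

    def g(lam):
        return sum(bi / (1.0 + lam * bi) for bi in b)

    for _ in range(200):  # g is strictly decreasing in lam, so bisect
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    weights = [1.0 / (n * (1.0 + lam * bi)) for bi in b]
    stat = 2.0 * sum(math.log(1.0 + lam * bi) for bi in b)
    return stat, weights

stat, w = el_log_ratio([1.0, 2.0, 3.0, 4.0], 2.0)
```

The weights sum to one automatically and satisfy the mean constraint, and the statistic vanishes exactly at the sample mean, as the theory predicts.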

Owen (1990) suggests using the quantiles of a scaled Fisher F distribution, ((n−1)p/(n−p)) F_{p,n−p}, instead of the χ²_p quantiles when constructing the empirical likelihood confidence regions for θ_0. Hall and La Scala (1990), DiCiccio and Romano (1989), and DiCiccio et al. (1991) show that empirical likelihood is Bartlett correctable. Bartlett correction amounts to a mean correction of the empirical likelihood in order to achieve a coverage accuracy of order O_P(n⁻²). Empirical likelihood is Bartlett correctable because the third and fourth cumulants of the components of the signed root of the empirical likelihood are of orders at most O_P(n^{−3/2}) and O_P(n⁻²), respectively. Consequently, the empirical likelihood admits an expansion of the form

    −2 log R(θ_0) = (1 + a/n) χ²_q + O_P(n⁻²)    (1-24)

in distribution, for a Bartlett constant a. Since the algebraic expression for a is fairly complicated, Hall and La Scala (1990) suggest a bootstrap approximation. Qin and Lawless (1994) extend empirical likelihood for Z-estimators to models where the dimension of the estimating equation is greater than that of the parameter. They define the maximum empirical likelihood estimator (MELE) to be the maximizer of the empirical likelihood ratio over the parameter space, i.e.,

    θ̃ = argmax_{θ∈Θ} R(θ).    (1-25)

Following Owen (1990), Qin and Lawless (1994) prove that the optimal weights are given by p_i(θ) = n⁻¹ (1 + λᵀ b(X_i; θ))⁻¹ for any θ in a small neighborhood of the true parameter θ_0, where λ = λ(θ) is the Lagrange multiplier of the system (1-22) and satisfies

    ∑_{i=1}^n b(X_i; θ) / (1 + λᵀ b(X_i; θ)) = 0.

Qin and Lawless (1994) show that θ̃, the MELE of θ_0, is asymptotically normally distributed, with the same limiting distribution as the efficient GMM estimator. Specifically, assume that E[b(X_i; θ_0) b(X_i; θ_0)ᵀ] is positive definite, that b(x; θ) is twice continuously differentiable in a neighborhood of θ_0 where ‖b(x; θ)‖, ‖∇b(x; θ)‖, and ‖∇²b(x; θ)‖ are bounded by some F-integrable function G(x), and that the rank of E[∇b(X_i; θ_0)] is p. Then:

(1) with probability 1, R(θ) attains its maximum at a value θ̃ in the interior of the ball ‖θ − θ_0‖ ≤ n^{−1/3} as n → ∞;

(2) √n (θ̃ − θ_0) converges in distribution to a normal law with the same asymptotic covariance as the efficient GMM estimator.

Baggerly (1998) has generalized empirical likelihood by considering the family of Cressie-Read power divergences. The Cressie-Read power divergence between F_n and F_p (or, equivalently, between the vector of uniform probabilities p_0 = n⁻¹ 1 and p) is given by

    CR_λ(p_0, p) = (2 / (λ(λ+1))) ∑_{i=1}^n p_{0i} [(p_{0i}/p_i)^λ − 1].


Baggerly defined the empirical divergence for the mean for the whole class of Cressie-Read discrepancy measures by generalizing (1-22) in the case of the mean, i.e., for every $\mu \in \mathbb{R}$, minimizing the chosen divergence subject to the weighted mean equaling $\mu$. Baggerly (1998) showed that Owen's result on the asymptotics of empirical likelihood holds for any member of the Cressie-Read family. Jing and Wood (1996) show that the exponential empirical likelihood (obtained for $\lambda = -1$) is not Bartlett correctable, and later, Baggerly shows that empirical likelihood is the only element in the Cressie-Read family of divergences that admits a Bartlett correction. Nevertheless, in his simulation results, the use of a scaled Fisher F distribution gives better coverage than both asymptotic $\chi^2$ and Bartlett corrected confidence regions. Corcoran (1998) extends the class of discrepancy statistics that admit Bartlett corrections. Smith (1997) introduces the class of generalized empirical likelihood estimators, defined as saddle points of an optimization problem formulated in terms of a normalized convex function. Newey and Smith (2004) show that this class of estimators generalizes the class of minimum Cressie-Read discrepancy estimators.


Since its introduction by Efron (1979), the bootstrap has provided new methods to applied statistics and motivated a myriad of new theoretical results. In a recent edition of Statistical Science dedicated to the bootstrap, Efron (2003) remarked that the bootstrap was initially introduced as an alternative to the jackknife for estimating the bias and variance of an estimator. Since then, many new applications have been developed, including bootstrap confidence intervals and significance tests, bootstrap bias reduction, and bootstrap diagnostics. In reviewing recent developments in bootstrapping, Davison et al. (2003) mentioned several new directions of research, including highly accurate parametric bootstrap procedures, theoretical properties for the nonparametric bootstrap with unequal probabilities, the m-out-of-n bootstrap, bootstrap failures and remedies for superefficient estimators, significance testing, and resampling for dependent data. Books that deal with both theoretical properties and applications of the bootstrap include Hall (1992), Efron and Tibshirani (1993), Shao and Tu (1995), Davison and Hinkley (1997), and Lahiri (2003). The main idea of the bootstrap is to estimate the sampling distribution of a statistic by its (re)sampling distribution obtained under an estimate of the underlying distribution of the data. This definition applies to both parametric and nonparametric problems as follows. Suppose $X = \{X_1,\dots,X_n\}$ is an iid sample from a distribution $F$ and we want to estimate the sampling distribution of a statistic $T_n = T_n(X_1,\dots,X_n)$. Let $\mathcal{L}(T_n \mid F)$ represent the distribution of $T_n$ when the data $X$ are drawn from $F$. In the parametric bootstrap, we consider a parametric model $\{F_\theta : \theta \in \Theta\}$ for the underlying $F$, with $F = F_{\theta_0}$ and $\theta_0 \in \Theta$, where $\Theta$ is the parameter space. In order to apply the bootstrap principle to this problem, we first estimate the parameter $\theta_0$ by a consistent (and efficient) estimator $\hat\theta$ (usually the MLE) and take $F_{\hat\theta}$ as our estimate of $F$.


Closed-form expressions for bootstrap estimates are rarely available (Hall 1992, pp. 9-11), and bootstrap estimates are usually found by Monte Carlo simulation. In this case, for a given integer $B$, we consider $B$ simulated bootstrap resamples $X^{*1},\dots,X^{*B}$, and we compute the statistic $T^{*b} = T(X^{*b})$ for every resample $X^{*b}$. Then, we approximate $\mathcal{L}(T_n \mid F_n)$ by the empirical distribution of the $T^{*b}$'s, i.e.,
$$\mathcal{L}(T_n \mid F_n) \approx \frac{1}{B}\sum_{b=1}^B \delta_{T^{*b}}.$$
The bootstrap can also be iterated, as first studied by Hall and Martin (1988). Generally, no more than two levels of bootstrap are employed, and such procedures are referred to in the literature as "the iterated bootstrap", "the nested bootstrap", and "the double bootstrap". The computational effort required by the iterated bootstrap is generally taken to be the square of that required for one level of bootstrapping, which is already computationally involved. Applications of the iterated bootstrap include calibration of confidence regions (Beran 1987, 1988; Hall 1992), bias reduction (Hall and Martin 1988; Davison and Hinkley 1997), variance stabilization (Tibshirani 1988; Hall and Presnell 1999), and bootstrap diagnostics (Efron 1992; Canty et al. 2000). Often, the bootstrap provides more accurate results than first order asymptotic approximations, without making use of the complex algebra of higher order expansions. The analysis of the performance of bootstrap procedures generally relies on Edgeworth expansions. The Edgeworth expansion is a refinement of the Central Limit Theorem that gives the form of the error terms in an asymptotic approximation of the distribution of the sample mean, extended by Bhattacharya and Ghosh (1978) to smooth functions of means. Bootstrap versions of these expansions were developed by Hall (1988) in order to analyse the performance of different types of bootstrap confidence intervals. As a consequence, the bootstrap often gives rejection and coverage probabilities that are more accurate than approximate large sample methods. Generally, a good bootstrap procedure should satisfy two desiderata: it should yield an asymptotically consistent estimate of the sampling distribution of a statistic and, for small to moderate sample sizes, it should outperform asymptotic approximations. Shao and Tu (1995) identify some techniques used in the statistical literature to establish bootstrap consistency. The most popular technique is called imitation. The main idea here is to imitate the proof for obtaining the asymptotic distribution of the statistic in order to extend it to the bootstrap. The consistency of the bootstrap for the sample mean can be proven this way (van der Vaart 1998, Theorem 23.4). Then, by applying the delta method for the bootstrap (van der Vaart 1998, Theorem 23.5), the consistency of the bootstrap for smooth functions of sample means follows. Another technique uses Berry-Esseen type inequalities. The main advantage of this method is that one can also obtain the rate of convergence of the bootstrap estimates. Unfortunately, it is often difficult to obtain such inequalities. In order to show consistency of the bootstrap, one usually considers a metric that metrizes weak convergence in the space of distribution functions (Huber 1981).
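The Monte Carlo approximation just described can be sketched as follows (illustrative Python; the helper name and the choice of the sample mean as the statistic are mine, not the dissertation's):

```python
import numpy as np

def bootstrap_distribution(x, stat, B=1000, rng=None):
    """Approximate L(T_n | F_n) by the empirical distribution of
    T*^b = stat(X*^b), where the X*^b are iid uniform resamples
    (drawn with replacement) from the data x."""
    rng = np.random.default_rng(rng)
    n = len(x)
    return np.array([stat(x[rng.integers(0, n, size=n)]) for _ in range(B)])

rng = np.random.default_rng(0)
x = rng.normal(size=50)
tstar = bootstrap_distribution(x, np.mean, B=2000, rng=1)
# e.g., a basic percentile interval for the mean:
lo, hi = np.quantile(tstar, [0.025, 0.975])
```

The array `tstar` plays the role of the empirical distribution $B^{-1}\sum_b \delta_{T^{*b}}$; its quantiles, moments, and so on then estimate the corresponding features of $\mathcal{L}(T_n \mid F_n)$.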


Bickel and Freedman (1981) used Mallows' distance in proving consistency of the bootstrap for t-statistics, von Mises functionals, and empirical processes. Freedman (1981, 1984) also uses Mallows' distance to prove the consistency of the bootstrap distribution of ordinary least squares (OLS) and two-stage least squares (2SLS) estimators in certain linear regression models. Using empirical processes theory, bootstrap consistency can also be established for Hadamard differentiable statistical functionals using the consistency of the empirical bootstrap for the Brownian bridge (Giné 1990), and more recently van der Vaart (1998, pp. 332-334). Using a functional delta method, one can prove bootstrap consistency for a myriad of statistics, such as sample quantiles, L-estimators, and nonparametric goodness-of-fit statistics such as the von Mises and Kolmogorov-Smirnov statistics. To illustrate, using the same notations as above, let $X$ be an iid sample from $F$ and let $G_n = \mathcal{L}(T_n \mid F)$ be the distribution of $T_n$. Having a bootstrap resample $X^*$, let $T_n^* = T_n(X_1^*,\dots,X_n^*)$ be the bootstrap version of $T_n$ corresponding to $X^*$ and let $\hat G_n = \mathcal{L}(T_n^* \mid F_n)$ be its bootstrap distribution. Suppose that $T_n$ converges weakly to $G$, so that $\rho(G_n, G) \to 0$ as $n \to \infty$, where $\rho$ is a metric that metrizes weak convergence of distributions, such as the Lévy distance or the bounded Lipschitz distance (Huber 1981). While the asymptotic theory approximates the distribution $G_n$ by its limit $G$, the bootstrap approximates $G_n$ by its bootstrap distribution $\hat G_n$ (which is a random distribution). If the sequence of random distributions $\hat G_n$ converges to $G$ in probability, i.e., $\rho(\hat G_n, G) \xrightarrow{P} 0$, then the bootstrap is said to be (weakly) consistent.


In Appendix B.2, we present a (conditional) version of Slutsky's theorem that shows that if $T_n^* \Rightarrow_P T$, $B_n = B + o_P(1)$, and $C_n = C + o_P(1)$, then $B_n T_n^* + C_n \Rightarrow_P BT + C$. Finally, using a classical subsequence argument (Kallenberg 2002, Lemma 4.2, p. 63), note that $\rho(\hat G_n, G) \xrightarrow{P} 0$ if and only if for every subsequence $\hat G_{m_n}$ of $\hat G_n$ there exists a further subsequence $\hat G_{l_n}$ of $\hat G_{m_n}$ such that $\rho(\hat G_{l_n}, G) \xrightarrow{a.s.} 0$ as $n \to \infty$.

1.2.2 Least Favorable Families

Presnell (2002) shows that these families are least favorable for parameters that are smooth functions of the mean vector. In Theorem 2.2.1 below, we prove that this result is also true for the Z-estimation model (1-3). Here, by least favorable we mean that, when evaluated at the maximum likelihood estimator, the inverse of the Fisher information matrix corresponding to the pseudo-parametric families defined in Section 1.2.2 is equal to the sandwich matrix, the usual nonparametric estimator of the asymptotic variance of the Z-estimator. In this sense, inference about the parameters is not made artificially easier by restricting attention to these families of distributions (Stein 1956). There are a number of bootstrap confidence interval procedures based on least favorable families of distributions. Efron's tilted bootstrap confidence intervals are based on inverting a bootstrap hypothesis test carried out within a least favorable family. The automatic percentile method (DiCiccio and Romano 1989) can be applied in conjunction with a least favorable family approach (DiCiccio and Romano 1990). Also, DiCiccio and Romano (1990) and Hall and Presnell (1999) suggest estimating the variance of $\hat\theta$ using a least favorable family of distributions.


The biased bootstrap weights solve the constrained minimization problem
$$\text{minimize } D(p) = -\frac{1}{n}\sum_{i=1}^n \log(np_i) \quad \text{subject to } \sum_{i=1}^n p_i = 1 \text{ and } \sum_{i=1}^n p_i\, b(X_i,\theta) = 0, \qquad (2\text{-}1)$$
for a fixed $\theta$ (cf. (1-27)); note that Presnell (2002) defines analogous families for smooth functions of means. Differentiating the Lagrangian,
$$\frac{\partial}{\partial p_i}\left\{D(p) - \lambda_0\left[\sum_{i=1}^n p_i - 1\right] + \lambda^T \sum_{i=1}^n p_i\, b(X_i,\theta)\right\} = 0, \qquad (2\text{-}4)$$
which gives $-(np_i)^{-1} = \lambda_0 - \lambda^T b(X_i,\theta)$ for each $i$. Multiplying (2-4) by $p_i$ and summing over $i$ gives, using the two constraints,
$$\lambda_0 = -1. \qquad (2\text{-}5)$$


Equation (2-4) may be solved for $p_i$ using (2-5), yielding
$$p_i(\theta) = \frac{1}{n\left(1 + \lambda^T b(X_i,\theta)\right)}. \qquad (2\text{-}6)$$
For each $\theta$, the corresponding weighted empirical distribution is $F_\theta = \sum_{i=1}^n p_i(\theta)\, \delta_{X_i}$, where $\delta_x$ is the unit point mass at $x$. We show in the appendix that $\mathcal{F} = \{F_\theta\}_{\theta \in \Theta}$, the resulting family of weighted empirical distributions indexed by $\theta$, has the nonparametric least favorable property (Stein 1956; DiCiccio and Romano 1990): the inverse of the Fisher information matrix of the family of Section 1.2.2, evaluated at the maximum likelihood estimator, is equal to the sandwich matrix, the usual nonparametric estimator of the asymptotic variance-covariance matrix of the Z-estimator. Tests based on GMM moment conditions are known to perform poorly in finite samples (Tauchen 1986). In an effort to improve the finite sample performance of these tests, Hall and Horowitz (1996) devised a modified bootstrap procedure that can also be applied with dependent data. Their procedure recenters the moment conditions so that the modified moment conditions are fulfilled by the sample.


Hahn (1996) showed that the bootstrap t-test for individual parameters is asymptotically consistent without recentering. However, if the uncentered moment conditions are used, then the bootstrap estimate of the distribution of the J-test statistic of over-identifying restrictions is not consistent, a fact also claimed by Brown and Newey (2002) and Lindsay and Qu (2003). As noted by Lindsay and Qu (2003), using the uncentered moment conditions, the null hypothesis of mean-zero scores does not hold for the sample, so that one is sampling under an alternative hypothesis in which the mean of the scores is not zero. This may have a small impact on the critical values if the null is true (if the bootstrap is consistent under the null), but will have a great impact if the null is false; consequently, the size of the bootstrap test might be nearly correct but its power may be poor. Hall and Presnell (1999) devised the biased bootstrap in order to improve the performance of a wide range of statistical procedures for hypothesis testing, shrinkage, robust estimation, and variance stabilization. We use this methodology in order to construct a semiparametric bootstrap for GMM. A biased bootstrap for GMM was introduced by Brown and Newey (2002), though from another perspective. They argue that the weighted empirical distribution that minimizes the Kullback-Leibler divergence to the empirical distribution while satisfying the GMM equations (they call it "the empirical likelihood distribution") attains a semiparametric efficiency lower bound (Brown and Newey 1998). To show the consistency of the weighted bootstrap for t and J statistics, they claimed that because the empirical likelihood distribution is more efficient than the empirical distribution of the sample, the corresponding weighted bootstrap is both consistent and more efficient than the uniform bootstrap. Our approach is similar. We first introduce a family of weighted empirical distributions $\mathcal{F} = \{F_\theta\}_{\theta \in \Theta}$ defined by (2-1), associated with the GMM model (1-1). We bootstrap as if we had a parametric model: first we estimate the parameter $\theta_0 = \theta(F)$ using, say, the GMM estimator $\hat\theta$; then, for the first level of bootstrap, a generic resample is drawn from $F_{\hat\theta}$.


In Section 2.3.2, we will see that the biased bootstrap has certain computational advantages over the "uniform" bootstrap when the bootstrap is iterated. In this way, we mimic the parametric bootstrap for semiparametric models without any adjustment of the moment conditions and without the need to find the centering value for the bootstrap version of the statistics, as is usually required in such situations; see, e.g., Shorack (1981), Freedman (1981), Hall and Horowitz (1996), and Lahiri (2003). To summarize, the main steps in the biased bootstrap for GMM are as follows: (1) estimate $\theta_0$ by the GMM estimator $\hat\theta$ and compute the weights $p_i(\hat\theta)$ given by (2-6); (2) resample iid from the weighted empirical distribution $F_{\hat\theta}$ given by (2-6), and compute on each resample the bootstrap version of the statistic of interest. We assume the following regularity conditions hold.
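Assuming the weight formula (2-6), the resampling step above can be sketched for a one-dimensional score as follows (illustrative Python; the names and the inlined Newton solver are mine, not the dissertation's):

```python
import numpy as np

def biased_bootstrap_resample(x, scores, rng=None):
    """One biased-bootstrap resample: draw n points from the weighted
    empirical distribution whose weights p_i = 1/(n(1 + lam*b_i)) satisfy
    sum_i p_i b_i = 0.  `scores` holds the scalar values b(X_i, theta_hat)."""
    rng = np.random.default_rng(rng)
    b, n, lam = np.asarray(scores, float), len(x), 0.0
    for _ in range(100):                     # Newton steps for the dual equation
        d = 1.0 + lam * b
        g = np.sum(b / d)                    # dual score
        h = -np.sum(b**2 / d**2)             # its derivative
        lam -= g / h
        if abs(g) < 1e-12:
            break
    p = 1.0 / (n * (1.0 + lam * b))
    p /= p.sum()                             # guard against round-off
    idx = rng.choice(n, size=n, replace=True, p=p)
    return x[idx], p
```

Resampling with `np.random.choice(..., p=p)` is the only change relative to the uniform bootstrap; the moment conditions hold exactly under the resampling distribution.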


We show next the consistency of the biased bootstrap for the sample mean of the criterion functions (Theorem 2.3.1); then we give a lemma on conditional uniform convergence in probability that will be needed in proving the weak consistency of GMM estimators and their bootstrap versions. We end with a theorem on the consistency of the biased bootstrap distribution of the GMM estimators (Theorem 2.3.3). Theorem 2.3.4 shows the consistency of the biased bootstrap for the J-test of over-identifying restrictions. Under Assumptions 2.3.1-2.3.3, $0$ is inside the convex hull of $\{b(X_i,\hat\theta) : i=1,\dots,n\}$ with probability approaching 1. Consider the GMM model (1-1). Let $\hat\theta$ be a sequence of GMM estimators defined by (1-6) and let $F_{\hat\theta}$ be the weighted empirical distribution given by (2-6) corresponding to $\hat\theta$. Let $X^* = \{X_1^*,\dots,X_n^*\}$ be a biased bootstrap resample. Under Assumptions 2.3.1-2.3.3, as $n \to \infty$, the biased bootstrap distribution of $n^{1/2}\, b_n^*(\hat\theta)$, where $b_n^*(\theta) = n^{-1}\sum_{i=1}^n b(X_i^*,\theta)$, consistently estimates the sampling distribution of $n^{1/2}\, b_n(\theta_0)$.


Under the conditions of Theorem 2.3.1, the following uniform convergence for the biased bootstrap holds:
$$\sup_{\theta \in \Theta}\, |Q_n^*(\theta) - Q_n(\theta)| = o_P(1).$$
Under Assumption 2.3.1, for any sequence of GMM estimators $\hat\theta$ and (biased) bootstrap estimators $\hat\theta^*$, with $Q_n(\hat\theta) \le Q_n(\theta_0) + o_P(1)$ and $Q_n^*(\hat\theta^*) \le Q_n^*(\hat\theta) + o_P(1)$, we have $\hat\theta = \theta_0 + o_P(1)$ and $\hat\theta^* = \theta_0 + o_P(1)$, and hence also that $\hat\theta^* = \hat\theta + o_P(1)$. Consider the GMM model (1-1). Let $\hat\theta$ be a sequence of GMM estimators defined by (1-6) and let $F_{\hat\theta}$ be the weighted empirical distribution given by (2-6) corresponding to $\hat\theta$. Let $X^*$ be a biased bootstrap resample and let $\hat\theta^*$ be the bootstrap version of $\hat\theta$ on $X^*$. Under Assumptions 2.3.1-2.3.3, as $n \to \infty$, the conditional distribution of $n^{1/2}(\hat\theta^* - \hat\theta)$ consistently estimates the sampling distribution of $n^{1/2}(\hat\theta - \theta_0)$. By Lemma B.2.4 in Appendix B.2, we can substitute for the nonrandom matrix $W$ the nonparametric estimator of $V^{-1}$. As a consequence, the consistency of the bootstrap for the two-step GMM estimator follows. In the proof of Theorem 2.3.1, we use the fact that the weighted empirical distribution satisfies the moment conditions. If we do not use the biased bootstrap, the conditional mean of the (uniform) bootstrap sample mean $b_n^*(\hat\theta)$ contains a random element that disappears asymptotically, a result that is proven in Appendix B.2. The fact that recentering is not necessary in the case of GMM estimation has also been proven by Hahn (1996), though using another approach. As a consequence, without reweighting or recentering, the bootstrap estimate of the distribution of the GMM estimator remains consistent. We show in Appendix B.2, however, that the usual (uniform, uncentered) bootstrap distribution estimate of the J-test of over-identifying restrictions is weakly inconsistent.

2.3.3 Bootstrap Recycling

Bootstrap recycling was introduced by Newton and Geyer (1995). They suggested drawing a common set of potential re-resamples from a single "design" distribution, and then using importance weighting to "recycle" these re-resamples to estimate (conditional) expectations for each first level bootstrap resample. Unfortunately, this method is applicable only to the parametric bootstrap, mostly because in the nonparametric bootstrap the support of the resample empirical distributions varies from resample to resample. Thus, the majority of samples from any candidate distribution that dominates all the resample distributions will have zero importance weights for most resamples, leading to extremely inefficient and unstable Monte Carlo estimates (Ventura 2000). Using the recycling method, Presnell and Giurcanu (2007) construct a recycling algorithm for the iterated biased bootstrap that yields a second order correct confidence interval in the smooth function of means model. At the same time, this procedure preserves the computational requirements of the single level bootstrap. The main idea of the method is as follows.


A conditional expectation under $P_{\hat\theta^*}$ can be rewritten with respect to $P_{\hat\theta}$,
$$E_{\hat\theta^*}(T^{**}) = E_{\hat\theta}\left[T^{**}\, \frac{dP_{\hat\theta^*}}{dP_{\hat\theta}}\right], \qquad (2\text{-}11)$$
so a Monte Carlo approximation of (2-11) based on resamples drawn from $P_{\hat\theta}$ is given by
$$E_{\hat\theta^*}(T^{**}) \approx \frac{1}{B}\sum_{b=1}^B T^{*b}\, \frac{P_{\hat\theta^*}(X^{*b})}{P_{\hat\theta}(X^{*b})}, \qquad (2\text{-}12)$$
where
$$\frac{P_{\hat\theta^*}(X^{*b})}{P_{\hat\theta}(X^{*b})} \qquad (2\text{-}13)$$
is the likelihood ratio statistic corresponding to the probability models $P_{\hat\theta^*}$ and $P_{\hat\theta}$, evaluated for the resample $X^{*b}$. Using the definitions of $P_{\hat\theta^*}$ and $P_{\hat\theta}$, we see that
$$\frac{P_{\hat\theta^*}(X^{*b})}{P_{\hat\theta}(X^{*b})} = \prod_{j=1}^{n} \frac{p_{I_j^b}(\hat\theta^*)}{p_{I_j^b}(\hat\theta)}, \qquad (2\text{-}14)$$
where $I_j^b$ is the index of the data point resampled as $X_j^{*b}$. Many bootstrap parameters $t(F)$ of interest are defined as solutions of a "population equation"
$$E f_t(F, F_{\hat\theta}) = 0, \qquad (2\text{-}15)$$
where $F_{\hat\theta}$ is the weighted empirical distribution corresponding to $\hat\theta$ and $f_t$ is a given functional. Examples of such functionals include bias-correction and coverage-calibration functionals (Hall 1992).


The sample version of (2-15),
$$E_{\hat\theta}[f_t(F_{\hat\theta}, F_{\hat\theta^*})] = 0, \qquad (2\text{-}19)$$
is obtained from (2-15) using the "plug-in rule", i.e., by substituting $F_{\hat\theta}$ for $F$ and $F_{\hat\theta^*}$ for $F_{\hat\theta}$. Usually, $E f_{T(F_{\hat\theta})}(F, F_{\hat\theta}) \ne 0$, and we might be interested in a further (additive) correction $t(F)$, the solution in $t$ of the equation
$$E f_{T(F_{\hat\theta})+t}(F, F_{\hat\theta}) = 0. \qquad (2\text{-}20)$$
As before, let $t(F_{\hat\theta})$ be the bootstrap estimate of $t(F)$, which solves in $t$ the sample analogue of (2-20),
$$E_{\hat\theta}\, f_{T(F_{\hat\theta^*})+t}(F_{\hat\theta}, F_{\hat\theta^*}) = 0. \qquad (2\text{-}21)$$
This equation is again obtained using the "plug-in" rule, by substituting $F_{\hat\theta}$ for $F$ and $F_{\hat\theta^*}$ for $F_{\hat\theta}$ in (2-20). Note that $T(F_{\hat\theta^*})$ solves $E_{\hat\theta^*} f_t(F_{\hat\theta^*}, F_{\hat\theta^{**}}) = 0$, and this is where the second level of bootstrapping enters. The hope is then that $E f_{T(F_{\hat\theta})+t(F_{\hat\theta})}(F, F_{\hat\theta}) \approx 0$. This approximation needs to be taken in the following sense: it is not that $T(F_{\hat\theta}) + t(F_{\hat\theta})$ is closer to $t(F)$ than is $T(F_{\hat\theta})$, but that $E f_{T(F_{\hat\theta})+t(F_{\hat\theta})}(F, F_{\hat\theta})$ is closer to zero than is $E f_{T(F_{\hat\theta})}(F, F_{\hat\theta})$. We can argue as in Hall and Martin (1988, pp. 663-665) that any iteration of the biased bootstrap increases the accuracy of the estimation. Specifically, suppose that
$$E f_{T(F_{\hat\theta})}(F, F_{\hat\theta}) = c(F)\, n^{-j/2} + O_P(n^{-(j+1)/2})$$
and
$$\frac{\partial}{\partial t}\, E f_{T(F_{\hat\theta})+t}(F, F_{\hat\theta})\Big|_{t=0} \to a \ne 0,$$
where $c$ is a "smooth functional". Then
$$E f_{T(F_{\hat\theta})+t(F_{\hat\theta})}(F, F_{\hat\theta}) = O_P(n^{-(j+1)/2}).$$
The argument is identical to Hall and Martin (1988), using the additional fact that if $\hat\theta = \theta_0 + O_P(n^{-1/2})$, then $\|F_{\hat\theta} - F\|_\infty = O_P(n^{-1/2})$. The recycling biased bootstrap can be used to find the Monte Carlo estimate of $T(F_{\hat\theta})$, the solution in $t$ of the equation
$$E_{\hat\theta}\, f_t(F_{\hat\theta}, F_{\hat\theta^*}) = 0. \qquad (2\text{-}22)$$
For a given $b = 1,\dots,B$, let $X^{*b} = \{X_1^{*b},\dots,X_n^{*b}\}$ be a biased bootstrap resample. Let $\hat\theta^{*b} = \hat\theta(X^{*b})$, $b = 1,\dots,B$, be the version of $\hat\theta$ corresponding to $X^{*b}$. Then we obtain the following recycling Monte Carlo approximation of the conditional expectation from (2-22), as in (2-12):
$$E_{\hat\theta^{*b}}\, f_t(F_{\hat\theta^{*b}}, F_{\hat\theta^{**}}) \approx \frac{1}{B}\sum_{c=1}^{B} f_t\big(F_{\hat\theta^{*b}}, F_{\hat\theta^{*c}}\big)\, \frac{P_{\hat\theta^{*b}}(X^{*c})}{P_{\hat\theta}(X^{*c})},$$
so that the first-level resamples $X^{*c}$ are reused for every $b$.
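The likelihood ratio in (2-14) is a product of per-observation probability ratios, so the recycling weights can be computed on the log scale for numerical stability. A sketch (illustrative Python; the self-normalization is a common Monte Carlo convention, not prescribed by the text):

```python
import numpy as np

def recycling_weights(indices, p_hat, p_star):
    """Self-normalized importance weights for bootstrap recycling.

    indices : (B, n) int array; indices[c] are the data indices making up
              the first-level resample X*^c
    p_hat   : (n,) weights p_i(theta_hat) under which the X*^c were drawn
    p_star  : (n,) weights p_i(theta*) of the target resampling distribution

    Returns w proportional to prod_j p_star[I_cj] / p_hat[I_cj], normalized
    to sum to one; computed on the log scale to avoid underflow."""
    logw = np.sum(np.log(p_star[indices]) - np.log(p_hat[indices]), axis=1)
    w = np.exp(logw - logw.max())            # stabilize before exponentiating
    return w / w.sum()
```

A recycled estimate of a conditional expectation is then `w @ t_values`, where `t_values` holds the statistic evaluated on the first-level resamples.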


2.4.1 Review on Instrumental Variables

In the classical regression model, the assumption that the errors are independent of the regressors is necessary in order for OLS estimators to be consistent. In many observational studies, this orthogonality condition between the regressors and the errors is not satisfied, so new techniques have been developed to analyze such data. One such technique is the method of instrumental variables, which assumes the existence of an instrumental variable, i.e., a variable that is correlated with the regressors but uncorrelated with the errors. This technique was proposed in the econometric literature by Reiersøl (1941) and has been developed theoretically by Durbin (1954), Sargan (1958), Brundy and Jorgenson (1971), and White (1982a), among others. Extensive treatments of this methodology are given in econometric texts, such as Davidson and MacKinnon (1993), Matyas (1999), Hayashi (2000), and Hall (2005). The method of instrumental variables is widely applied to cross-sectional, panel, and time series models, and is more generally used to make causal inference in observational studies and errors-in-variables models. A variety of estimators have appeared in the econometrics literature, including two-stage least squares (2SLS), instrumental variable (IV), and two-stage instrumental variable (2SIV) estimators. All these estimators can also be viewed as particular types of GMM estimators. Consider now the estimation of causal effects in an observational study (Angrist et al. 1996). If one wants to estimate a treatment effect, but the individuals exert some control over the treatment assignment, then differences between group means are biased. An instrumental variable is a variable that is correlated with the exposure to the treatment, but uncorrelated with the outcome after controlling for exposure to the treatment.


A classic example is the causal effect of schooling on earnings (Angrist and Krueger 1991). Individual and institutional decisions generate a correlation between schooling and unobserved covariates, such as ability and motivation, that are related to potential earnings. For instance, if compulsory attendance laws were extended, then those who had planned to be in school less would continue to earn less due to unobserved covariates such as ability, motivation, and family background. A set of instruments in this case is a set of variables that affect schooling but not earnings, once schooling is controlled for (i.e., included in the regression equation). Angrist and Krueger (1991) use the quarter of an individual's birth as an instrument. They argued that students who are born earlier in the calendar year are typically older when they enter school than students who are born later in the year. This pattern arises because most districts do not admit students unless they attain age 6 by January 1. Consequently, children born earlier in the year attain the drop-out age after attending school for a shorter period of time than those born later in the calendar year. Other instruments used in studying the causal effect of schooling on earnings include sibling composition (Butcher and Case 1994) and proximity to a nearby college (Card 1995). Instrumental variables have also been successfully applied in biostatistical research. In a recent article, Newhouse and McClellan (1998) describe how instrumental variables can be applied to estimate treatment effects in an observational study when a controlled trial cannot be done. They illustrate with an application to aggressive treatment of acute myocardial infarction in the elderly. They use instrumental variables to estimate the effect of catheterization on mortality rate. As instrument, they use the "differential distance" (the additional distance, if any, beyond the distance to the nearest hospital, needed to reach a catheterization hospital). They argue that the differential distance has no direct effect on myocardial infarction but it affects the likelihood of catheterization (the greater the differential distance, the lower the likelihood of catheterization).


Hogan and Lancaster (2004) apply IV in order to infer causal effects in longitudinal repeated measurements. They review two methods for estimating causal effects in longitudinal data, inverse probability weighting (IPW) and instrumental variables. They apply these methods to the HERS data, a six-year natural history study that enrolled 871 HIV-infected women starting in 1993, in order to estimate the therapeutic effect of a highly active antiretroviral therapy regimen (HAART) on CD4 cell count, using marginal structural modeling. In this data set, the receipt of therapy varies with time and depends on CD4 count and other covariates. They remark that both methods rely on two important assumptions: no unmeasured confounding for IPW, and the reliability of the instruments for IV (they must be strongly correlated with the exposure to the HAART therapy). Before going into further detail, we present the 2SLS estimator for the linear IV regression model. Consider the regression model $y = X\beta + \epsilon$, where the $n \times p$ matrix of regressors $X$ may be correlated with the errors, together with an $n \times q$ matrix of instruments $Z$. Some of the assumptions in Hall (2005, pp. 34-42) can be relaxed (White 1982a), but we confine ourselves to these to simplify the presentation.


The estimator defined by (2-26) can be written as
$$\hat\beta = \left[X^T Z (Z^T Z)^{-1} Z^T X\right]^{-1} X^T Z (Z^T Z)^{-1} Z^T y. \qquad (2\text{-}29)$$
Let $P_Z = Z(Z^T Z)^{-1} Z^T$ be the projection matrix onto the column space of $Z$; then the 2SLS estimator can be written in an equivalent form as
$$\hat\beta = (X^T P_Z X)^{-1} X^T P_Z y. \qquad (2\text{-}30)$$
The estimator given by (2-29) is called the two-stage least squares (2SLS) estimator because it can be obtained in a two-step least squares procedure. At the first stage, regress the columns of $X$ on the column space of $Z$; then, at the second stage, regress $y$ on the column space of the fitted values of $X$ obtained from the first stage regressions. To be more exact, for each $j \in \{1,\dots,p\}$, consider the regressions $x^{(j)} = Z\pi_j + \eta_j$, $j = 1,\dots,p$, and let $\hat X = P_Z X$ denote the matrix of first-stage fitted values.


The OLS estimator from the second-stage regression of $y$ on $\hat X$ (2-31) is
$$\hat\beta = (\hat X^T \hat X)^{-1} \hat X^T y = (X^T P_Z X)^{-1} X^T P_Z y,$$
in agreement with (2-29). The 2SLS estimator $\hat\beta$ is also a feasible generalized least squares estimator (Hayashi 2000, p. 59). To see this, multiplying (2-28) by $Z^T$, we obtain
$$Z^T y = Z^T X \beta + Z^T \epsilon. \qquad (2\text{-}32)$$
The GLS estimator in (2-32) is $\tilde\beta = (X^T Z Q_{zz}^{-1} Z^T X)^{-1} X^T Z Q_{zz}^{-1} Z^T y$. Now, since $n^{-1} Z^T Z \xrightarrow{a.s.} Q_{zz}$, a feasible GLS estimator of $\beta$ is obtained by substituting the consistent estimator $n^{-1} Z^T Z$ for $Q_{zz}$, which again yields $\hat\beta$ as given by (2-29).
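These algebraic equivalences are easy to verify numerically. The sketch below (illustrative Python on simulated data, not part of the original text) computes the estimator three ways: the closed form (2-29)/(2-30), the explicit two-stage procedure, and feasible GLS on the Z-transformed equation; all three agree to machine precision.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q = 200, 2, 3
Z = rng.normal(size=(n, q))                  # instruments
X = Z @ rng.normal(size=(q, p)) + rng.normal(size=(n, p))
beta = np.array([1.0, -0.5])
y = X @ beta + rng.normal(size=n)

PZ = Z @ np.linalg.solve(Z.T @ Z, Z.T)       # projection onto col(Z)
b_closed = np.linalg.solve(X.T @ PZ @ X, X.T @ PZ @ y)        # (2-30)

Xhat = PZ @ X                                # first-stage fitted values
b_2stage = np.linalg.solve(Xhat.T @ Xhat, Xhat.T @ y)         # second-stage OLS

Qzz = Z.T @ Z / n                            # feasible GLS on Z'y = Z'X b + Z'e
A = X.T @ Z @ np.linalg.solve(Qzz, Z.T @ X)
b_fgls = np.linalg.solve(A, X.T @ Z @ np.linalg.solve(Qzz, Z.T @ y))
```

The agreement of `b_2stage` with `b_closed` uses the idempotence and symmetry of $P_Z$; the agreement of `b_fgls` follows because the factor $n$ in $Q_{zz} = n^{-1}Z^TZ$ cancels.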


By Assumption 2.4.2, for all $i = 1,\dots,n$,
$$E[g(u_i, \beta)] = 0, \qquad (2\text{-}33)$$
where $\beta$ is the regression parameter in (2-28). By Assumption 2.4.1, $q \ge p$, so the $q$ moment conditions in (2-33) define a GMM model as in (1-1). Then, for a given weight matrix $W_n$, the GMM estimator is
$$\hat\beta = \arg\min_\beta\, g_n(\beta)^T W_n\, g_n(\beta), \qquad (2\text{-}34)$$
where $g_n(\beta) = n^{-1}\sum_{i=1}^n (y_i - x_i^T\beta) z_i = n^{-1} Z^T (y - X\beta)$. Since $Q_n(\beta) = g_n(\beta)^T W_n g_n(\beta)$ is differentiable with respect to $\beta$, the GMM estimator $\hat\beta$ is a solution to $\nabla Q_n(\hat\beta) = X^T Z W_n Z^T (y - X\hat\beta) = 0$, and it is given by
$$\hat\beta = \left(X^T Z W_n Z^T X\right)^{-1} X^T Z W_n Z^T y. \qquad (2\text{-}35)$$
In order to obtain an efficient GMM estimator, we need to choose $W_n$ to be an estimator of the inverse of the asymptotic covariance matrix of $g_n(\beta)$. Since $g_n(\beta) = n^{-1}\sum_{i=1}^n (y_i - x_i^T\beta) z_i = n^{-1}\sum_{i=1}^n \epsilon_i z_i$, using Assumptions 2.4.1-2.4.3, we obtain $n^{-1/2}\sum_{i=1}^n \epsilon_i z_i \Rightarrow N(0, \sigma^2 Q_{zz})$. Since $n^{-1} Z^T Z \xrightarrow{a.s.} Q_{zz}$, an efficient GMM estimator is obtained from (2-35) by substituting $(Z^T Z)^{-1}$ for $W_n$, i.e.,
$$\hat\beta = (X^T P_Z X)^{-1} X^T P_Z y, \qquad (2\text{-}36)$$
the same as in (2-29). Finally, if Assumptions 2.4.1-2.4.3 are fulfilled, then the 2SLS estimator is asymptotically normally distributed (Hall 2005), and in particular
$$n^{1/2}(\hat\beta - \beta) \Rightarrow N\left(0,\ \sigma^2 \left(Q_{xz} Q_{zz}^{-1} Q_{xz}^T\right)^{-1}\right).$$


Bootstrap procedures for regression models have been studied by Freedman (1981), Shorack (1981), Freedman (1984), and Wu (1986). Freedman (1981) identifies two main bootstrap procedures for linear models: bootstrapping residuals and bootstrapping cases. Consider the linear model
$$y_i = x_i^T \beta + \epsilon_i, \quad i = 1,\dots,n. \qquad (2\text{-}37)$$
When the pairs $(y_i, x_i)$ are iid, (2-37) is called the correlation model, using the same terminology as Hall (1992, p. 170). In this case, the pairs $(y_i, x_i)$ are resampled randomly from $\mathcal{Z}$. Then, the bootstrap estimate $\hat\beta^*$ is defined as the version of $\hat\beta$ on $\mathcal{Z}^*$. In the case of the zero-intercept regression model, Freedman (1981) and Shorack (1981) argue that the residuals need first to be re-centered before resampling, in order to obtain consistent bootstrap estimates. Freedman remarks that, without centering, the bootstrap can be inconsistent:


(i) If $E x_i = 0$, then the bootstrap is (weakly) consistent: $n^{1/2}(\hat\beta^* - \hat\beta)$ has the same limiting distribution, conditionally in probability, as $n^{1/2}(\hat\beta - \beta)$.
(ii) If $E x_i = \mu \ne 0$, then this (conditional) weak convergence does not hold.
Freedman (1984, p. 834) argues that the residuals first need to be recentered, since "As data, the residuals are not orthogonal to the instruments." This statement is potentially misleading and might lead a practitioner to recenter the residuals unnecessarily. We will show in this section that, under general conditions, the uniform bootstrap estimate of the distribution of the 2SLS estimator is indeed weakly consistent without the need for re-centering. The main idea is that bootstrapping the data in the 2SLS regression model is equivalent to bootstrapping cases, and bootstrapping cases in the 2SLS model is equivalent to bootstrapping efficient GMM estimators in the GMM model associated with the regression model. We have shown in the appendix that the (uniform) bootstrap is consistent for the sampling distribution of GMM estimators. Although Hahn (1996) also proved the consistency result for GMM estimators, he does not make the connection between Freedman's resampling procedure with uncentered residuals and bootstrapping the GMM estimators.


Define the 2SLS residuals
$$\hat\epsilon_i = y_i - x_i^T \hat\beta, \quad i = 1,\dots,n, \qquad (2\text{-}39)$$
where $x_i^T$ is the $i$-th row of the matrix $X$. Generally, the $\hat\epsilon_i$'s are not orthogonal to the instruments, i.e.,
$$Z^T \hat\epsilon \ne 0. \qquad (2\text{-}40)$$
Freedman defines the recentered residuals as the part of the residuals that is orthogonal to the column space of $Z$ (the instruments), i.e., $\tilde\epsilon_i = \hat\epsilon_i - z_i^T (Z^T Z)^{-1} Z^T \hat\epsilon$. In order to preserve the dependence structure within $v_i = (x_i, z_i, \epsilon_i)$, Freedman resamples from $\tilde{\mathcal{Z}} = \{(x_i^T, z_i^T, \tilde\epsilon_i)^T : i = 1,\dots,n\}$. Let $\tilde{\mathcal{Z}}^* = \{(x_1^*, z_1^*, \tilde\epsilon_1^*),\dots,(x_n^*, z_n^*, \tilde\epsilon_n^*)\}$ be a generic uniform resample from $\tilde{\mathcal{Z}}$. Denote by $X^*$, $Z^*$, $P_Z^*$, and $\tilde\epsilon^*$ the bootstrap versions of $X$, $Z$, $P_Z$, and $\tilde\epsilon$ on $\tilde{\mathcal{Z}}^*$. Define the bootstrap observations $\tilde y^*$ by
$$\tilde y^* = X^* \hat\beta + \tilde\epsilon^*. \qquad (2\text{-}41)$$
The analog of $\hat\beta$ for the bootstrap resample $\tilde{\mathcal{Z}}^*$ is given by
$$\tilde\beta^* = \left(X^{*T} P_Z^* X^*\right)^{-1} X^{*T} P_Z^* \tilde y^*. \qquad (2\text{-}42)$$
We now show that recentering is not necessary when bootstrapping 2SLS estimators. Let $\mathcal{Z} = \{(x_1, z_1, \hat\epsilon_1),\dots,(x_n, z_n, \hat\epsilon_n)\}$ and let $\mathcal{Z}^* = \{(x_1^*, z_1^*, \epsilon_1^*),\dots,(x_n^*, z_n^*, \epsilon_n^*)\}$ be a (uniform) bootstrap resample from $\mathcal{Z}$, i.e., $(x_1^*, z_1^*, \epsilon_1^*),\dots,(x_n^*, z_n^*, \epsilon_n^*)$ are (conditionally) iid, (discrete) uniformly distributed on $\mathcal{Z}$. Denote by $X^*$, $Z^*$, $P_Z^*$, and $\hat\epsilon^*$ the bootstrap versions of $X$, $Z$, $P_Z$, and $\hat\epsilon$ on $\mathcal{Z}^*$. Define the bootstrap observations $y^*$ and the bootstrap analog $\hat\beta^*$ of $\hat\beta$ for the bootstrap resample $\mathcal{Z}^*$ as before. It is an easy exercise to show that resampling the $(x_i, z_i, \hat\epsilon_i)$'s with equal probabilities is equivalent to resampling the cases $(y_i, x_i, z_i)$ with equal probabilities.


Therefore, since 2SLS estimators are particular efficient GMM estimators (cf. (2-36)), and using the result that the uniform bootstrap is consistent for the distribution of the GMM estimators, we obtain that the uniform bootstrap gives consistent distribution estimates for 2SLS estimators, without recentering the residuals. The following consistency results can be generalized to allow for heteroskedasticity of the error terms, but we limit the proof to these simpler hypotheses (White 1982a). Under Assumptions 2.4.1-2.4.3, for every $x \in \mathbb{R}^k$,
$$P\big(n^{1/2}(\hat\beta^* - \hat\beta) \le x \mid \mathcal{Z}\big) - P\big(n^{1/2}(\hat\beta - \beta) \le x\big) \xrightarrow{P} 0.$$
Consequently, if Assumptions 2.4.1-2.4.3 are fulfilled, then the bootstrap estimate of the distribution of the 2SLS estimator is asymptotically consistent, i.e.,
$$\sup_{x \in \mathbb{R}^k}\left|P\big(n^{1/2}(\hat\beta^* - \hat\beta) \le x \mid \mathcal{Z}\big) - P\big(n^{1/2}(\hat\beta - \beta) \le x\big)\right| \xrightarrow{P} 0.$$
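The case-resampling scheme, with no recentering of the residuals, can be sketched as follows (illustrative Python; the function names are mine, not the dissertation's):

```python
import numpy as np

def tsls(X, Z, y):
    """Two-stage least squares estimate (X'PZ X)^{-1} X'PZ y,
    computed without forming the n x n projection matrix."""
    W = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)   # PZ X
    return np.linalg.solve(X.T @ W, W.T @ y)

def case_bootstrap_tsls(X, Z, y, B=500, rng=None):
    """Bootstrap the 2SLS estimator by resampling the cases (y_i, x_i, z_i)
    with equal probabilities; no recentering of residuals is performed."""
    rng = np.random.default_rng(rng)
    n = len(y)
    out = np.empty((B, X.shape[1]))
    for b in range(B):
        idx = rng.integers(0, n, size=n)
        out[b] = tsls(X[idx], Z[idx], y[idx])
    return out
```

The empirical distribution of the rows of `out`, centered at the full-sample estimate, then serves as the bootstrap estimate of the sampling distribution of the 2SLS estimator.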


The errors $\epsilon_{1i}, \epsilon_{2i}, \epsilon_{3i}, \epsilon_{4i}$ are iid, $(\chi^2_1 - 1)$-distributed. Hence, the $\epsilon_i$'s have mean 0 and variance 2. Let $x = (x_1,\dots,x_n)^T$ be the $n \times 1$ vector of explanatory variables, $y$ the $n \times 1$ vector of responses, $\epsilon = (\epsilon_{11},\dots,\epsilon_{1n})^T$ the $n \times 1$ vector of unobserved error terms, and $z_1 = (z_{11},\dots,z_{1n})^T$ and $z_2 = (z_{21},\dots,z_{2n})^T$ the $n \times 1$ vectors of instrumental variables. In matrix notation, the model given by (2-45)-(2-48) can be written in a more compact form. We compare the biased bootstrap confidence intervals with several alternatives: the centered residual bootstrap (Freedman 1984), the uncentered residual bootstrap, the centered double bootstrap, and the uncentered residual double bootstrap. In order to define the biased bootstrap confidence intervals, we consider the instrumental variable regression model in the GMM framework, as described in Subsection 2.4.1. For $i = 1,\dots,n$, let $u_i = (y_i, x_i, z_i^T)^T$ be the observations arranged in vector form. For $u = (y, x, z^T)^T \in \mathbb{R}^4$ and $a \in \mathbb{R}$, let $g(x, y, z, a) = (y - ax)z$. By assumption,
$$E[g(u_i, a)] = 0, \quad i = 1,\dots,n. \qquad (2\text{-}50)$$


We construct the pseudo-parametric family of Section 2.3 associated with the implied GMM model defined by (2-50). Let $\hat a$ be the 2SLS estimator of $a$. We apply the bootstrap technique described in Subsection 2.3.2 to construct bootstrap confidence intervals. For an $\alpha$-level upper confidence interval for $a$, we need to find $t_{1-\alpha}$, the solution of the "population equation"
$$P(a \le \hat a - t_{1-\alpha}) - \alpha = 0. \qquad (2\text{-}51)$$
We rewrite the left side of this equation as follows:
$$P(a \le \hat a - t_{1-\alpha}) = P(\hat a - a \ge t_{1-\alpha}) = 1 - P(\hat a - a \le t_{1-\alpha}) = 1 - E\, I(\hat a - a \le t_{1-\alpha}).$$
Hence, $t_{1-\alpha}$ is the solution of the "population equation"
$$E\, I(\hat a - a \le t_{1-\alpha}) = 1 - \alpha. \qquad (2\text{-}53)$$
Let $\hat t_{1-\alpha}$ be the bootstrap estimate of $t_{1-\alpha}$, i.e., the solution of the "sample equation"
$$E_{\hat a}\, I(\hat a^* - \hat a \le \hat t_{1-\alpha}) = 1 - \alpha. \qquad (2\text{-}54)$$
The biased bootstrap upper confidence interval with nominal coverage $\alpha$ is then $J_{bb} = (-\infty, \hat a - \hat t_{1-\alpha})$. Since typically
$$P(a \le \hat a - \hat t_{1-\alpha}) \ne \alpha, \qquad (2\text{-}55)$$
the nominal level needs to be calibrated.


We therefore seek a calibrated level $q(\alpha)$ such that
$$P\big(a \le \hat a - \hat t_{q(\alpha)}\big) = \alpha. \qquad (2\text{-}56)$$
Let $\hat q(\alpha)$ be the bootstrap estimator of $q(\alpha)$, so that $\hat q(\alpha)$ satisfies
$$P_{\hat a}\big(\hat a \le \hat a^* - \hat t^*_{\hat q(\alpha)}\big) = \alpha, \qquad (2\text{-}57)$$
where $\hat t^*_\beta$ satisfies $P\big(\hat a^{**} - \hat a^* \le \hat t^*_\beta \mid F_{\hat a^*}\big) = \beta$. Since $\hat q(\alpha)$ satisfies $E_{\hat a}\big[I\big(\hat t^*_{\hat q(\alpha)} \le \hat a^* - \hat a\big)\big] = \alpha$, and
$$\hat t^*_{\hat q(\alpha)} \le \hat a^* - \hat a \iff E_{\hat a^*}\big[I\big(\hat a^{**} - \hat a^* \le \hat a^* - \hat a\big)\big] \ge \hat q(\alpha), \qquad (2\text{-}58)$$
it follows that $\hat q(\alpha)$ satisfies
$$E_{\hat a}\, I\Big\{E_{\hat a^*}\big[I\big(\hat a^{**} - \hat a^* \le \hat a^* - \hat a\big)\big] \ge \hat q(\alpha)\Big\} = \alpha. \qquad (2\text{-}59)$$
Consequently, $\hat q(\alpha)$ is the $(1-\alpha)$-th quantile of $E_{\hat a^*}\big[I\big(\hat a^{**} - \hat a^* \le \hat a^* - \hat a\big)\big]$. We now describe the Monte Carlo implementation of this bootstrap procedure. For each biased bootstrap resample $X^{*b}$, $b = 1,\dots,B$, compute $\hat a^{*b}$ and draw $C$ biased bootstrap re-resamples $X^{**bc}$, $c = 1,\dots,C$. For each $c$, let $\hat a^{**bc}$ be the bootstrap version of $\hat a$ on $X^{**bc}$. For every $b = 1,\dots,B$, let $\hat u_b = C^{-1}\sum_{c=1}^C I\big(\hat a^{**bc} - \hat a^{*b} \le \hat a^{*b} - \hat a\big)$ denote the inner coverage proportion.
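Given the first- and second-level estimates, the quantile characterization of $\hat q(\alpha)$ translates directly into code. A sketch (illustrative Python; the arrays of bootstrap estimates are assumed to have been computed elsewhere):

```python
import numpy as np

def calibration_quantile(a_hat, a_star, a_dstar, alpha):
    """q_hat(alpha): the (1 - alpha) quantile of the inner proportions
    u_b = (1/C) sum_c I(a**_{bc} - a*_b <= a*_b - a_hat).

    a_hat   : full-sample estimate
    a_star  : (B,) first-level bootstrap estimates
    a_dstar : (B, C) second-level (re-resample) estimates
    """
    u = np.mean(a_dstar - a_star[:, None] <= (a_star - a_hat)[:, None], axis=1)
    return np.quantile(u, 1.0 - alpha)
```

The calibrated interval is then formed by using `calibration_quantile(...)` in place of the nominal level when extracting the first-level bootstrap quantile.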


Tables 2-1 and 2-2 show the Monte Carlo estimates of the coverage probabilities corresponding to different bootstrap confidence intervals: GMM biased bootstrap (GBB), GMM uniform bootstrap (GUB), and Freedman's centered residuals (FCR).


Figures 2-1, 2-2, and 2-3 show the coverage errors for different bootstrap percentile confidence intervals, using different bootstrapping procedures, for the nominal sizes $\alpha = .80, .85, .90, .95, .99$. The coverage probabilities were examined using 500 simulation runs and 500 outer level resamples. From these plots we can conclude that, except for the sample size $n = 20$, where all bootstrap procedures perform relatively similarly, for all other sample sizes $n = 40, 60, 80, 100, 200$ the biased bootstrap recycling confidence intervals outperform all other bootstrap percentile confidence intervals.

Table 2-1. Estimated coverage probabilities for bootstrap one-sided, upper confidence intervals at the $\alpha = .90$ nominal level, $a = .8$, B = 1000, S = 1000: GMM biased bootstrap (GBB), GMM uniform bootstrap (GUB), centered residuals (FCR), and based on asymptotic approximation (ASY)

            Percentile                Percentile-t
   n     GBB    GUB    FCR        GBB    GUB    FCR      Asym
  20    0.912  0.928  0.915     0.825  0.831  0.816     0.998
  40    0.964  0.972  0.969     0.858  0.862  0.854     0.989
  60    0.968  0.973  0.976     0.861  0.869  0.861     0.989
  80    0.967  0.968  0.966     0.874  0.878  0.876     0.970
 100    0.963  0.969  0.967     0.865  0.874  0.869     0.961
 200    0.944  0.946  0.944     0.880  0.881  0.875     0.951
 300    0.932  0.930  0.929     0.877  0.879  0.878     0.950


Table 2-2. Estimated coverage probabilities for bootstrap one-sided, upper confidence intervals at the $\alpha = .95$ nominal level, $a = .8$, B = 1000, S = 1000: GMM biased bootstrap (GBB), GMM uniform bootstrap (GUB), centered residuals (FCR), and based on asymptotic approximation (ASY)

            Percentile                Percentile-t
   n     GBB    GUB    FCR        GBB    GUB    FCR      Asym
  20    0.976  0.984  0.979     0.872  0.874  0.862     1.000
  40    0.997  0.995  0.996     0.890  0.891  0.886     1.000
  60    0.999  0.999  0.999     0.905  0.907  0.899     1.000
  80    0.997  0.996  0.996     0.912  0.916  0.913     0.996
 100    0.995  0.997  0.997     0.925  0.922  0.922     0.996
 200    0.990  0.991  0.991     0.927  0.927  0.928     0.988
 300    0.990  0.992  0.992     0.930  0.930  0.925     0.979

Figure 2-1. Estimated coverage errors corresponding to different bootstrap confidence intervals, at different levels, sample sizes n = 20, 40: GMM biased bootstrap (GBB), GMM uniform bootstrap (GUB), GMM recycling biased bootstrap (RBB), centered residuals (FCR), and asymptotic approximation (ASY)


Figure 2-2. Estimated coverage errors corresponding to different bootstrap confidence intervals, at different levels, sample sizes n = 60, 80: GMM biased bootstrap (GBB), GMM uniform bootstrap (GUB), GMM recycling biased bootstrap (RBB), centered residuals (FCR), and asymptotic approximation (ASY)

Figure 2-3. Estimated coverage errors corresponding to different bootstrap confidence intervals, at different levels, sample sizes n = 100, 200: GMM biased bootstrap (GBB), GMM uniform bootstrap (GUB), GMM recycling biased bootstrap (RBB), centered residuals (FCR), and asymptotic approximation (ASY)


Model-based bootstrap procedures for dependent data have been studied by Freedman and Peters (1984), Freedman (1984), Efron and Tibshirani (1986), and Bose (1981). Model-free bootstrap procedures have been proposed in a series of papers by Hall (1985), Carlstein (1986), and Künsch (1989). These are based on blocking arguments, in which the data are first divided into blocks of consecutive observations, and the blocks, instead of the observations, are resampled in order to obtain the bootstrap resamples. As a result, the dependence structure within the time series is preserved within each block. There are different types of blocking, including overlapping and nonoverlapping blocks. These in turn give rise to different block bootstrap procedures, such as the nonoverlapping block bootstrap (Carlstein 1986), the moving block bootstrap (Künsch 1989), the circular block bootstrap (Politis and Romano 1992), and the stationary bootstrap (Politis and Romano 1994). All these block bootstrap methods are particular cases of the generalized block bootstrap, as defined in Lahiri (2003, pp. 31-33). By approximating general stationary time series with families of (increasingly more complex) parametric models, sieve bootstraps have also been developed in a series of papers by Kreiss (1992), Bühlmann (1997), and Choi and Hall (2000). Instead of resampling the blocks with equal probabilities, we propose a new procedure that resamples the blocks from a weighted empirical distribution on the sample of blocks which satisfies the sample moment conditions. The weights are obtained by minimizing the Cressie-Read distance to the empirical distribution under the constraint that it satisfies


Moreover, by applying recycling (Section 2.3.2), we can reuse the first-level block bootstrap resamples when iterating the block biased bootstrap in order to estimate higher-level parameters. Brown and Newey (2002) applied the biased bootstrap to GMM models with iid observations. They anticipated that a similar bootstrap procedure for dependent data would be possible. Let $X = \{X_1,\dots,X_n\}$ be a realization of a $d$-dimensional stationary time series. Denote by $F$ the common distribution of the $X_i$'s, and by $F_n = n^{-1}\sum_{i=1}^n \delta_{X_i}$ the empirical distribution of the $X_i$'s. Suppose that the parameter of interest $\theta_0 \in \mathbb{R}^p$ is defined implicitly as the unique solution to the "population equation"

    $E_F b_{\theta_0} = E\, b(X_1, \theta_0) = 0$,    (3-1)

for some $b : \mathbb{R}^d \times \mathbb{R}^p \to \mathbb{R}^p$. Bustos (1982) introduced the generalized M-estimator $\hat\theta$ of $\theta$ as a solution to the "sample equation"

    $E_{F_n} b_{\hat\theta} = \frac{1}{n}\sum_{i=1}^n b(X_i, \hat\theta) = 0$.    (3-2)
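The sample equation (3-2) is a root-finding problem in $\theta$. A minimal sketch, assuming a scalar parameter, a sample equation that changes sign on a bracketing interval, and the illustrative choice $b(x,\theta) = x - \theta$ (so that the generalized M-estimator is the sample mean); the function name `m_estimate` and the bracket are hypothetical:

```python
def m_estimate(xs, b, theta_lo, theta_hi, tol=1e-10):
    """Solve the sample equation (1/n) * sum_i b(x_i, theta) = 0 by bisection,
    assuming g(theta) = (1/n) sum_i b(x_i, theta) changes sign on the bracket."""
    def g(theta):
        return sum(b(x, theta) for x in xs) / len(xs)
    lo, hi = theta_lo, theta_hi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        # keep the half-interval on which g still changes sign
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# b(x, theta) = x - theta makes theta-hat the sample mean
theta_hat = m_estimate([1.0, 2.0, 4.0, 5.0], lambda x, t: x - t, -10.0, 10.0)
```

For a vector-valued $b$ one would use a Newton-type solver instead of bisection; the bisection version is only meant to make the definition concrete.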


Assume the block length $l$ satisfies $l \to \infty$ and $l/n \to 0$ as $n \to \infty$. Let $B^*_1,\dots,B^*_{b_1}$ denote a simple random sample drawn from $\mathcal{B}$, where $b_1 = \lfloor n/l \rfloor$, i.e., $b_1$ is the integer part of the ratio between the sample size $n$ and the block size $l$ (though other values are also possible). Since each resampled block contains $l$ elements, in total we resample $n_1 = l\,b_1$ elements. If we denote the elements of block $B^*_i$ by $X^*_{(i-1)l+1},\dots,X^*_{il}$, then $X^* = \{X^*_1,\dots,X^*_{n_1}\}$ is a block bootstrap resample of size $n_1$. Generally, the bootstrap version of $\hat\theta$ and of its centered and scaled version $T_n = n^{1/2}(\hat\theta - \theta_0)$ are defined in one of two ways. One way is to define the bootstrap version $\hat\theta^*$ of $\hat\theta$ as the solution of

    $\frac{1}{n_1}\sum_{i=1}^{n_1} b(X^*_i, \hat\theta^*) = 0$,    (3-3)

with the centering value defined through the conditional-expectation equation (3-4) (Lahiri 2003, pp. 81-83). Another way to define the bootstrap version of $\hat\theta$ is as a solution of the modified equation

    $\frac{1}{n_1}\sum_{i=1}^{n_1}\bigl[b(X^*_i, \hat\theta^*) - \tilde b_n(\hat\theta)\bigr] = 0$,    (3-5)

where $\tilde b_n(\hat\theta) = E_*\bigl[n_1^{-1}\sum_{i=1}^{n_1} b(X^*_i,\hat\theta)\bigr]$; this recentering was used by Shorack (1981) in the context of bootstrapping M-estimators in linear regression models and also by Hall and Horowitz (1996) in the context of bootstrapping GMM estimators. Note that $\tilde b_n(\hat\theta)$ is the appropriate centering quantity here.
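The block resampling step just described can be sketched in a few lines (a toy implementation with hypothetical names; overlapping blocks are used, as in the moving block bootstrap):

```python
import random

def moving_block_resample(x, l, rng=random):
    """Draw b1 = floor(n/l) overlapping blocks of length l with replacement
    and concatenate them into a resample of size n1 = l * b1."""
    n = len(x)
    blocks = [x[i:i + l] for i in range(n - l + 1)]  # the N = n - l + 1 blocks
    out = []
    for _ in range(n // l):
        out.extend(rng.choice(blocks))
    return out

resample = moving_block_resample(list(range(10)), l=3)  # b1 = 3 blocks, n1 = 9
```

Resampling blocks rather than individual observations preserves the dependence structure within each block, which is the point of the construction above.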


This centering makes (3-5) conditionally unbiased. As Lahiri (2003, p. 83) remarks, one advantage of using (3-5) over (3-3) is that we need to solve only one set of equations, (3-5), compared with two sets of equations, (3-3) and (3-4), in the latter case. In order to define the weights of the blocks corresponding to the moving block biased bootstrap (MBBB), denote by $U_i(\theta) = l^{-1}\sum_{j=i}^{i+l-1} b(X_j,\theta)$ the average of the moments at $\theta$ in block $B_i$. Then let $p = p(\theta) = (p_1,\dots,p_N)$ be the vector of probabilities that is closest to the vector of uniform weights $(1/N,\dots,1/N)$ such that the weighted mean of the block averages is equal to the zero vector. In other words, $p$ is the solution of the minimization problem

    minimize $D_\alpha(p)$ subject to $\sum_{i=1}^N p_i U_i(\hat\theta) = 0$ and $\sum_{i=1}^N p_i = 1$,    (3-6)

where $D_\alpha$ denotes the Cressie-Read distance to the uniform weights; (3-6) may be solved for $p_i$ as in the iid case, yielding (3-7). As before, each resampled block $B^*_i$ contains $l$ elements, which we denote by $B^*_i = (X^*_{(i-1)l+1},\dots,X^*_{il})$, $i = 1,\dots,b_1$, and $n_1 = l\,b_1$ is the total number of bootstrap values. Then $X^* = \{X^*_1,\dots,X^*_{n_1}\}$ is an MBBB sample of size $n_1$. Let $\hat\theta^*$ denote the bootstrap version of $\hat\theta$ corresponding to the biased bootstrap resample $X^*$, defined as the solution of $(1/n_1)\sum_{i=1}^{n_1} b(X^*_i,\hat\theta^*) = 0$. By construction, the (conditional) expectation, given the sample $X$, of the biased bootstrap mean of the moment conditions


is equal to the zero vector:

    $E_{\hat\theta}\bigl[\bar b^*_n(\hat\theta)\bigr] = E_{\hat\theta}\Bigl[\frac{1}{b_1}\sum_{i=1}^{b_1} U^*_i(\hat\theta)\Bigr] = \sum_{i=1}^{N} p_i U_i(\hat\theta) = 0$,

where $\bar b^*_n(\theta) = n_1^{-1}\sum_{i=1}^{n_1} b(X^*_i,\theta)$ and $U^*_i(\theta) = l^{-1}\sum_{j=(i-1)l+1}^{il} b(X^*_j,\theta)$. Thus the bootstrap estimating equation is conditionally unbiased at $\hat\theta$, and no recentering as in (3-5) is needed for the biased bootstrap version of the model (3-1).
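For the Kullback-Leibler member ($\alpha = 0$) of the Cressie-Read family, the weights enforcing $\sum_i p_i U_i = 0$ take the empirical-likelihood form $p_i = 1/(N(1 + \lambda^T U_i))$, with $\lambda$ solving a dual equation. A sketch for scalar block averages, using Newton's method on the dual (the function name is hypothetical, and 0 must lie strictly inside the range of the $U_i$):

```python
def el_block_weights(u, tol=1e-12, max_iter=100):
    """Weights p_i = 1 / (N * (1 + lam * u_i)) on scalar block averages u_i,
    with lam chosen by Newton's method so that sum_i p_i * u_i = 0."""
    n = len(u)
    lam = 0.0
    for _ in range(max_iter):
        g = sum(ui / (1.0 + lam * ui) for ui in u)              # dual gradient
        h = -sum(ui * ui / (1.0 + lam * ui) ** 2 for ui in u)   # dual derivative
        step = g / h
        lam -= step
        if abs(step) < tol:
            break
    return [1.0 / (n * (1.0 + lam * ui)) for ui in u]

p = el_block_weights([-1.0, 0.5, 2.0])
```

An identity worth noting: once $\lambda$ solves the dual equation, the weights sum to one automatically, so no separate normalization step is needed.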


Under the assumptions of this section and by Theorem 4.2 of Lahiri (2003, p. 83), the following weak convergence holds: $G_n \Rightarrow G$, where $G = N\bigl(0, D^{-1}\Sigma_1 (D^{-1})^T\bigr)$ and

    $\Sigma_1 = \lim_{n\to\infty} n\,\mathrm{Var}(\bar b_n(\theta_0)) = \mathrm{Cov}\bigl(b(X_1,\theta_0), b(X_1,\theta_0)\bigr) + 2\sum_{i=1}^{\infty}\mathrm{Cov}\bigl(b(X_1,\theta_0), b(X_{1+i},\theta_0)\bigr)$,

with $\bar b_n(\theta) = n^{-1}\sum_{i=1}^n b(X_i,\theta)$ and $D = E\,\nabla b(X_1,\theta_0)$. The next lemma is concerned with the existence of the weights satisfying (3-6). Let $U_i = U_i(\hat\theta)$.

Lemma 3.3.1. Under Assumption 3.3.1, $0$ is inside the convex hull of $\{U_i : i = 1,\dots,N\}$ with probability approaching 1 as $n \to \infty$. Let $\mathrm{Co}\{U_i : i=1,\dots,N\} = \bigl\{\sum_{i=1}^N \alpha_i U_i : \alpha_i \ge 0, \sum_{i=1}^N \alpha_i = 1\bigr\}$ denote this convex hull. Since $0 \in \mathrm{Co}\{U_i : i=1,\dots,N\}$, there exists a unique set of weights $\{p_i : i = 1,\dots,N\}$ solving (3-6) such that $\sum_{i=1}^N p_i = 1$, $p_i \ge 0$, and $\sum_{i=1}^N p_i U_i = 0$. We consider in detail the case when $\alpha = 0$, with similar proofs holding for any member of the Cressie-Read family. Taking $\alpha = 0$ in (3-7), the weights have the simplified expression

    $p_i = \frac{1}{N(1 + \lambda^T U_i)}$, where $\lambda$ solves $\sum_{i=1}^N \frac{U_i}{1 + \lambda^T U_i} = 0$.

Theorem 3.3.1. Suppose $\theta_0$ is the unique solution of (3-1), that $\hat\theta$ is a sequence of generalized M-estimators defined by (3-2), and let $F_{\hat\theta}$ be the weighted empirical distribution given by (3-6). Let $X^* = \{X^*_1,\dots,X^*_{n_1}\}$ be a block biased bootstrap resample.


Then, under Assumptions 3.3.1-3.3.4, as $n \to \infty$, the conclusions of Lemma 2.3.2 and Theorem 2.3.2 from Section 2.3.1 hold also in this case. Since the proofs are identical, we have not included them in the appendix. Under Assumption 3.3.1, the following uniform convergence for the biased bootstrap holds: for every sequence of generalized M-estimators $\hat\theta$ and (biased) bootstrap estimators $\hat\theta^*$ with $\bar b(\hat\theta) = n^{-1}\sum_{i=1}^n b(X_i,\hat\theta) = o_P(1)$ and $\bar b^*(\hat\theta^*) = n_1^{-1}\sum_{i=1}^{n_1} b(X^*_i,\hat\theta^*) = o_P(1)$, we have $\hat\theta = \theta_0 + o_P(1)$ and $\hat\theta^* = \theta_0 + o_P(1)$, and hence also $\hat\theta^* = \hat\theta + o_P(1)$. The next theorem shows that the MBBB distribution of the generalized M-estimator is consistent.

Theorem 3.3.3. Suppose $\theta_0$ is the unique solution of (3-1). Let $\hat\theta$ be a sequence of generalized M-estimators defined by (3-2) and let $F_{\hat\theta}$ be the weighted empirical distribution given by (3-6). Let $X^* = \{X^*_1,\dots,X^*_{n_1}\}$ be a block biased bootstrap resample and let $\hat\theta^*$ be the bootstrap version of $\hat\theta$ on $X^*$. Then, under Assumptions 3.3.1-3.3.4, as $n \to \infty$, the conditional law of $\sqrt{n}(\hat\theta^* - \hat\theta)$ converges weakly, in probability, to the limit law of $\sqrt{n}(\hat\theta - \theta_0)$. Consider now the pseudo-parametric family of weighted empirical distributions given by (3-7) and associated with the generalized M-estimation model (3-1). Then, estimate the parameter $\theta_0 = \theta(F)$ by $\hat\theta$,


and, within this family, compute bootstrap expectations by importance sampling,

    $E_{\hat\theta}(T) = E_G\Bigl[T\,\frac{dP_{\hat\theta}}{dG}\Bigr]$,

where $G$ is a given "design" distribution, $P_{\hat\theta} = F_{\hat\theta}^{n}$ is the $n$-fold product measure of $F_{\hat\theta}$, and $dP_{\hat\theta}/dG$ is the likelihood ratio in (3-15) corresponding to resample $X^*_b$. Lahiri (2003, p. 2) calls $\theta_0$ a level-1 parameter and defines a level-2 parameter as a functional related to the sampling distribution of an estimator of a level-1 parameter. Generally, $\varphi$ is the solution of a functional equation

    $E f_\varphi(F, F_{\hat\theta}) = 0$.    (3-17)


This parallels (2-17)-(2-19). Denote by $\hat\varphi(l)$ and $\hat\varphi^*(l)$ the biased bootstrap solutions of (3-17), which satisfy

    $E_{\hat\theta} f_{\hat\varphi(l)}(F_{\hat\theta}, F_{\hat\theta^*}) = 0$ and $E_{\hat\theta^*} f_{\hat\varphi^*(l)}(F_{\hat\theta^*}, F_{\hat\theta^{**}}) = 0$.    (3-18)

Let $X^*_1,\dots,X^*_B$ be $B$ biased bootstrap resamples and let $\hat\varphi_B(l)$ be the Monte Carlo approximation of $\hat\varphi(l)$, defined as the solution in $\varphi$ of the Monte Carlo version of (3-16). Bootstrap recycling methods of this type have been proposed by Efron (1992) in the iid case and by Lahiri (2003) in the case of dependent data. The main idea of these methods is to reuse the first-level resamples when estimating higher-level parameters corresponding to the iterated bootstrap.


This is the approach of Hall et al. (1995) and Lahiri (2003, pp. 182-186), in which the optimal block size is chosen to minimize the mean squared error of the block bootstrap estimator. Using the same terminology as Lahiri (2003), $l_0$ is a level-3 parameter, since it relates to the sampling distribution of $\hat\varphi(l)$, which is an estimator of the level-2 parameter. Since the underlying $F$ is unknown, we can approximate the expectation from the last display by its biased bootstrap version,

    $\hat l = \arg\min_l \mathrm{MSE}_{\hat\theta}\bigl(\hat\varphi^*(l)\bigr) = \arg\min_l E_{\hat\theta}\bigl(\hat\varphi^*(l) - \hat\varphi(\tilde l)\bigr)^2$,    (3-25)

where $\tilde l$ is a reasonable pilot block size. Hence, in order to estimate $\hat l$, we need to iterate the bootstrap or to apply recycling. We now describe the computational details of the recycling procedure proposed in this section for the particular case when $\varphi_1 = n\,\mathrm{Var}(\bar X_n)$; the same ideas apply for distribution function estimation. We define the weights of the biased bootstrap through the forward Kullback-Leibler distance, obtained for $\alpha = 0$ in the Cressie-Read family (1-27).
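The selection rule (3-25) can be mimicked in a few lines. The sketch below estimates $n_1\,\mathrm{Var}(\bar X)$ by the moving block bootstrap for each candidate $l$ and picks the $l$ whose estimate is closest to a pilot estimate at $\tilde l$ — a crude stand-in for the full bootstrap MSE criterion; all names are hypothetical:

```python
import random
import statistics

def mbb_var_of_mean(x, l, B=200, rng=random):
    """Moving block bootstrap estimate of n1 * Var(mean) at block size l."""
    n = len(x)
    blocks = [x[i:i + l] for i in range(n - l + 1)]
    b1 = n // l
    means = []
    for _ in range(B):
        xs = [v for _ in range(b1) for v in rng.choice(blocks)]
        means.append(sum(xs) / len(xs))
    return l * b1 * statistics.pvariance(means)

def pick_block_size(x, candidates, l_pilot, B=200, rng=random):
    """Choose l minimizing the squared distance to a pilot estimate at l_pilot."""
    target = mbb_var_of_mean(x, l_pilot, B, rng)
    return min(candidates, key=lambda l: (mbb_var_of_mean(x, l, B, rng) - target) ** 2)
```

The dissertation's criterion averages over bootstrap resamples of $\hat\varphi^*(l)$ rather than comparing point estimates, which is exactly where the double bootstrap or recycling enters.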


For a given value of the block size $l$, the vector of weights given by (3-7) is computed, yielding the estimators (3-26) and (3-27). First we describe the computational details for the double biased bootstrap, and then its recycling version. For each $b$, consider $C$ re-resamples $\{X^{**}_{bc} : c = 1,\dots,C\}$ from $F_{X^*_b}$. For each $c$, compute $\bar X^{**}_{bc}$, the sample mean of $X^{**}_{bc}$, so that the Monte Carlo approximation of $\hat\varphi^*_b(l) = n_1\,\mathrm{Var}_{X^*_b}(\bar X^{**})$ is

    $\hat\varphi^*_{Cb}(l) = \frac{n_1}{C}\sum_{c=1}^{C}\bigl(\bar X^{**}_{bc} - \bar X^{**}_{b\cdot}\bigr)^2$,

where $\bar X^{**}_{b\cdot}$ denotes the average of the $C$ re-resample means.


Alternatively, by importance sampling,

    $\hat\varphi^*_b(l) = n_1 E_{X^*_b}\bigl(\bar X^{**} - \bar X^*_b\bigr)^2 = n_1 \int \bigl(\bar X^* - \bar X^*_b\bigr)^2 \frac{dP_{X^*_b}}{d\hat P}\, d\hat P$,

so that the recycling estimator reuses the first-level resamples,

    $\hat\varphi^{rB}_b(l) = \frac{1}{B}\sum_{b'=1}^{B} n_1\bigl(\bar X^*_{b'} - \bar X^*_b\bigr)^2 w_{bb'}$,    (3-28)

where $w_{bb'}$ is the likelihood ratio of resample $X^*_{b'}$ under the resampling distribution of $X^*_b$ relative to the design distribution. The adjusted version of (3-28) is

    $\hat\varphi^{arB}_b(l) = \sum_{b'=1}^{B} n_1\bigl(\bar X^*_{b'} - \bar X^*_b\bigr)^2 k_{bb'}$,    (3-31)

where the $k_{bb'}$ are the normalized recycling weights.
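The recycling weights have a simple product form: a resample made of blocks $i_1,\dots,i_{b_1}$ has likelihood ratio $\prod_j N p_{i_j}$ under a tilted block law relative to uniform block resampling. A sketch (hypothetical names) of this reweighting, which is what lets one set of resamples serve several resampling distributions:

```python
def recycling_weight(block_idx, p):
    """Likelihood ratio dP_w / dP_unif of a resample made of the blocks in
    block_idx: block i is drawn with probability p[i] under the tilted law
    and with probability 1/N under the uniform law."""
    N = len(p)
    w = 1.0
    for i in block_idx:
        w *= N * p[i]
    return w

def recycled_mean(stats, resample_blocks, p):
    """Importance-sampling estimate of E_w[T] from uniform-bootstrap resamples:
    the average of T(X*_b) weighted by the likelihood ratios."""
    ws = [recycling_weight(idx, p) for idx in resample_blocks]
    return sum(w * t for w, t in zip(ws, stats)) / len(stats)

w = recycling_weight([0, 1], [0.4, 0.2, 0.2, 0.2])  # (4 * 0.4) * (4 * 0.2)
```

With uniform weights the likelihood ratio is identically one and the recycled estimate reduces to the plain Monte Carlo average.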


For the process (3-21), true values of the level-2 parameters were found; they are $\varphi_1 = 3.984$ and $\varphi_2 = 0.516$. To find the true optimal block sizes using both the uniform and the biased bootstrap, we generate moving block bootstrap estimators of $\varphi_1$ and $\varphi_2$ for block sizes $l = 1,\dots,10$. Tables 3-1 and 3-2 give the mean, bias, standard deviation and root mean square errors (RMSE) of the MBB and MBBB estimators of $\varphi_1$ and $\varphi_2$ based on $S = 1000$ simulation runs and $B = 1000$ bootstrap resamples. From these tables we see that the optimal block size is $l_{1,0} = 3$ for $\varphi_1$ and $l_{2,0} = 2$ for $\varphi_2$. Figure 3-1 shows the bootstrap estimates of the RMSEs of different bootstrap procedures for the parameters $\varphi_1$ and $\varphi_2$, for different block sizes, for one realization of the process (3-21). For each bootstrap procedure, the optimal block length is the minimum of the bootstrap estimates of the RMSEs over all block lengths $l = 1,\dots,10$. We can see that, for this particular sample, the bootstrap procedures choose similar "optimal" block sizes. In these simulations, we used $B = 1000$ outer bootstrap resamples (for the biased bootstrap recycling (BBR) and the adjusted biased bootstrap recycling (ABBR)) and an additional 500 inner bootstrap re-resamples for the uniform double bootstrap (UB) and the double biased bootstrap (BB).


Table 3-1. Computation of the optimal block size for uniform block bootstrap estimation of the level-2 parameters $\varphi_1$ and $\varphi_2$ given by (3-22) and (3-23). The number of simulations is S = 1000, and the number of bootstrap resamples is B = 1000. An asterisk (*) shows the block size for which the minimum RMSE is attained.

        Variance estimation                 Distribution function estimation
  l    Mean     Bias      SD     RMSE       Mean      Bias       SD      RMSE
  1   1.977   -2.007   0.688   2.122      0.5092   -0.0070   0.0161   0.0175
  2   2.939   -1.045   1.040   1.474      0.5135   -0.0027   0.0164   0.0166*
  3   3.245   -0.740   1.201   1.411*       --     -0.0029   0.0167   0.0169
  4   3.383   -0.601   1.318   1.448      0.5131   -0.0031   0.0174   0.0177
  5   3.451   -0.534   1.405   1.502      0.5124   -0.0038   0.0178   0.0180
  6   3.484   -0.500   1.479   1.561      0.5119   -0.0043   0.0171   0.0176
  7   3.504   -0.480   1.533   1.606      0.5124   -0.0038   0.0174   0.0178
  8   3.524   -0.460   1.618   1.681      0.5123   -0.0039   0.0174   0.0178
  9   3.523   -0.462   1.686   1.747      0.5111   -0.0051   0.0187   0.0194
 10   3.514   -0.470   1.725   1.787      0.5108   -0.0054   0.0182   0.0189

Table 3-2. Computation of the optimal block size for moving block biased bootstrap estimation of the level-2 parameters $\varphi_1$ and $\varphi_2$ given by (3-22) and (3-23). The number of simulations is S = 1000, and the number of bootstrap resamples is B = 1000. An asterisk (*) shows the block size for which the minimum RMSE is attained.

        Variance estimation                 Distribution function estimation
  l    Mean     Bias      SD     RMSE       Mean      Bias       SD      RMSE
  1   1.972   -2.012   0.679   2.124      0.5094   -0.0068   0.0162   0.0175
  2   2.936   -1.048   1.032   1.470      0.5139   -0.0023   0.0164   0.0166*
  3   3.248   -0.737   1.200   1.408*       --     -0.0021   0.0165   0.0167
  4   3.386   -0.598   1.329   1.457      0.5133   -0.0029   0.0171   0.0173
  5   3.448   -0.536   1.394   1.493      0.5135   -0.0027   0.0167   0.0169
  6   3.497   -0.488   1.474   1.552      0.5128   -0.0034   0.0169   0.0172
  7   3.516   -0.468   1.544   1.612      0.5122   -0.0040   0.0180   0.0183
  8   3.528   -0.457   1.617   1.680      0.5112   -0.0050   0.0176   0.0182
  9   3.545   -0.439   1.670   1.726      0.5114   -0.0048   0.0184   0.0190
 10   3.526   -0.458   1.710   1.770      0.5121   -0.0041   0.0186   0.0189


Figure 3-1. The bootstrap estimates of the RMSEs corresponding to different block bootstrap schemes. We used B = 1000 outer bootstrap resamples (for the biased bootstrap recycling (BBR) and the adjusted biased bootstrap recycling (ABBR)) and, for the uniform double bootstrap (UB) and the double biased bootstrap (BB), an additional 500 inner bootstrap resamples for each outer bootstrap resample.


This idea is related to Datta (1995), but can be more generally applied. Andrews (2000) gives an example where the uniform bootstrap fails. He remarks, "...we provide such a counterexample... [which] is quite simple but it generalizes to a wide variety of estimation problems that are of importance in econometric applications." Let $X = \{X_1,\dots,X_n\}$ be an iid sample from the $N(\theta, 1)$ distribution. Suppose that the parameter space for $\theta$ is the positive reals. Then the MLE of $\theta$ is $\hat\theta_n = \max(\bar X_n, 0)$, where $\bar X_n = n^{-1}\sum_{i=1}^n X_i$. Let $T_n = n^{1/2}(\hat\theta_n - \theta)$, and denote its limit distribution by $T$. Andrews (2000) shows that in the case when $\theta = 0$, the bootstrap is inconsistent. We devise here a biased bootstrap procedure that gives consistent bootstrap estimates for the distribution of the MLE. Consider a sequence of positive reals $\epsilon_n = n^{-\gamma}$, with $\gamma \in (0, 1/2)$, and define the resampling distribution $G_n$ to mimic $\theta = 0$ when $\bar X_n \le \epsilon_n$ and to be the ordinary centered bootstrap otherwise.


It follows that

    $\sup_x \bigl|P(T^*_n \le x \mid G_n) - P(T \le x)\bigr| \xrightarrow{P} 0$.    (4-4)

The following relations are true:

    $\sup_x \bigl|P(T^*_n \le x \mid G_n) - P(T \le x)\bigr| \le \sup_x \bigl|P(n^{1/2}\hat\theta^*_n \le x \mid F_0) - P(T \le x)\bigr|\, I(\bar X_n \le \epsilon_n) + \sup_x \bigl|P(n^{1/2}(\hat\theta^*_n - \hat\theta_n) \le x \mid F_n) - P(T \le x)\bigr|\, I(\bar X_n > \epsilon_n)$,

where $I\{\cdot\}$ is the indicator function of a set. If $\theta = 0$, it follows from the Law of the Iterated Logarithm that $\limsup_n I\{\bar X_n > \epsilon_n\} = 0$ almost surely, since $\bar X_n = O(\sqrt{\log\log n / n})$ a.s. while $\epsilon_n = n^{-\gamma}$ with $\gamma < 1/2$; hence (4-4) holds. When $\theta > 0$, it follows that $\limsup_n I\{\bar X_n \le \epsilon_n\} = 0$ almost surely, since $\bar X_n \to \theta > 0$ while $\epsilon_n \to 0$, and again (4-4) holds.
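The hybrid scheme above can be sketched as follows: resample as if $\theta = 0$ whenever $\bar X_n \le \epsilon_n$, and fall back to the ordinary centered bootstrap otherwise. This is a toy reading of the construction; using the centred sample as a stand-in for $F_0$, and all names, are assumptions:

```python
import math
import random

def hybrid_boundary_bootstrap(x, B=1000, gamma=0.4, rng=random):
    """Hybrid biased bootstrap for theta_hat = max(mean, 0) under N(theta, 1):
    if the sample mean is below eps_n = n**(-gamma), resample as if theta = 0
    (draws from the centred sample); otherwise use the ordinary bootstrap."""
    n = len(x)
    xbar = sum(x) / n
    theta_hat = max(xbar, 0.0)
    eps = n ** (-gamma)
    centred = [xi - xbar for xi in x]
    draws = []
    for _ in range(B):
        if xbar <= eps:
            # theta = 0 regime: T* = sqrt(n) * max(mean of centred resample, 0)
            m = sum(rng.choice(centred) for _ in range(n)) / n
            draws.append(math.sqrt(n) * max(m, 0.0))
        else:
            # ordinary centred bootstrap: T* = sqrt(n) * (theta* - theta_hat)
            m = sum(rng.choice(x) for _ in range(n)) / n
            draws.append(math.sqrt(n) * (max(m, 0.0) - theta_hat))
    return draws

draws = hybrid_boundary_bootstrap([0.1, -0.2, 0.3, 0.05, -0.1, 0.2, -0.05, 0.15], B=200)
```

Because $\epsilon_n \to 0$ more slowly than $\bar X_n$ when $\theta = 0$, the first branch is eventually selected in that case, which is what restores consistency.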


Datta (1995) cites Babu (1984) to argue that the classical bootstrap approximation "breaks down... even for nice statistics." Babu (1984) remarked that the bootstrap approximation of a smooth function of a multivariate sample mean is not consistent for "certain" values of the mean vector. Following their example, we propose a biased bootstrap version that can correct the inconsistency of the classical uniform bootstrap. Let $X = \{X_1,\dots,X_n\}$ be an iid sample from the $N(\theta, 1)$ distribution and suppose that we are interested in estimating $\theta^2$, for which the MLE is $\bar X_n^2$. Consider the statistic $T_n$, equal to $n\bar X_n^2$ when $\theta = 0$ and to $n^{1/2}(\bar X_n^2 - \theta^2)$ otherwise. Datta (1995) shows that the classical uniform bootstrap is not valid in this case. To rectify this, as before, consider a sequence of positive reals $\epsilon_n = n^{-\gamma}$, with $\gamma \in (0, 1/2)$, and define the resampling distribution $G_n$ to switch between the $\theta = 0$ regime and the ordinary bootstrap according to whether $|\bar X_n| \le \epsilon_n$, with $T^*_n$ the resulting bootstrap version of (4-7).


Table 4-1 gives the results of a simulation study that shows the performance of this "hybrid" biased bootstrap. Under $\theta = 0$, $T_n = n\bar X_n^2$ has a chi-square distribution with one degree of freedom. For these simulations, we have considered sample sizes n = 50, 100, 200, 500. For each sample size, we have computed the quantiles of the ordinary (uniform) bootstrap estimate $\mathcal{L}\bigl(n(\bar X_n^{*2} - \bar X_n^2) \mid F_n\bigr)$ as well as the "hybrid biased bootstrap" $\mathcal{L}(T^*_n \mid G_n)$. For this simulation study we have used B = 1000 bootstrap resamples, S = 1000 simulated samples, and $\epsilon_n = n^{-.4}$. We have also included the true quantiles of $T_n$, given by the quantiles of the chi-square distribution with one degree of freedom, for comparison. It is clear from this simulation study that the "hybrid biased bootstrap" outperforms the ordinary bootstrap, as expected from the theoretical result. The uniform bootstrap performs erratically in the tails, and the approximation does not improve with increasing sample size. Table 4-2 gives the bootstrap estimates of the same quantities as above, under $\theta = 0.2$. In this case, $T_n = n^{1/2}(\bar X_n^2 - .2^2)$ has, asymptotically, a normal $N(0, .4^2)$ distribution, for the sample sizes n = 50, 100, 200, 500. As before, for each sample size, we have computed the quantiles of the ordinary (uniform) bootstrap estimate $\mathcal{L}\bigl(n^{1/2}(\bar X_n^{*2} - \bar X_n^2) \mid F_n\bigr)$ as well as the "hybrid biased bootstrap" $\mathcal{L}(T^*_n \mid G_n)$. We have also included the true quantiles of $T_n$ for comparison. It is clear from this simulation study that the "hybrid" biased bootstrap approaches the ordinary bootstrap, as expected from the theoretical result. As the sample size increases, the two different bootstrap procedures give almost indistinguishable results.
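The bootstrap quantiles reported in Tables 4-1 and 4-2 are order statistics of the B bootstrap draws; a small helper (hypothetical name, simple order-statistic rule):

```python
import math

def empirical_quantile(draws, q):
    """q-th empirical quantile: the ceil(q * B)-th order statistic of B draws."""
    s = sorted(draws)
    k = min(len(s), max(1, math.ceil(q * len(s))))
    return s[k - 1]

q50 = empirical_quantile(range(1, 101), 0.5)  # the 50th order statistic
```

In the tables, this is applied to the B = 1000 draws of the uniform-bootstrap and hybrid-bootstrap statistics at each level $\alpha$.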


Table 4-1. The quantiles of the distribution of $T_n$ under $\theta = 0$, and their bootstrap approximations given by the ordinary (uniform) bootstrap (UB) and the "hybrid" biased bootstrap (HBB), using B = 1000 bootstrap resamples, S = 1000 simulation runs, and $\epsilon_n = n^{-.4}$

               .05       .10       .15       .20       .80       .85       .90       .95
True          0.0039    0.0158    0.0358    0.0642    1.6424    2.0723    2.7055    3.8415
UB, n=50     -0.9823   -0.9084   -0.8186   -0.7157    2.3037    2.9697    3.9059    5.5182
HBB, n=50    -0.0714   -0.0539   -0.0292    0.0027    1.4750    1.8641    2.4293    3.4429
UB, n=100    -0.9497   -0.8806   -0.7956   -0.6956    2.2888    2.9500    3.8959    5.5003
HBB, n=100   -0.0405   -0.0257   -0.0030    0.0264    1.4977    1.8891    2.4608    3.4919
UB, n=200    -1.0056   -0.9321   -0.8421   -0.7390    2.3399    3.0151    3.9796    5.6105
HBB, n=200   -0.0241   -0.0101    0.0111    0.0398    1.5202    1.9210    2.5082    3.5573
UB, n=500    -0.9200   -0.8538   -0.7710   -0.6765    2.2791    2.9427    3.8789    5.4813
HBB, n=500   -0.0087    0.0039    0.0240    0.0520    1.5538    1.9587    2.5534    3.6234

Table 4-2. The quantiles of the asymptotic distribution of $T_n$ under $\theta = 0.2$, and their bootstrap approximations given by the ordinary (uniform) bootstrap (UB) and the "hybrid" biased bootstrap (HBB), using B = 1000 bootstrap resamples, S = 1000 simulation runs, and $\epsilon_n = n^{-.4}$

               .05      .10      .15      .20      .80      .85      .90      .95
True         -0.657   -0.512   -0.414   -0.336    0.336    0.414    0.512    0.657
UB, n=50     -0.379   -0.336   -0.297   -0.256    0.470    0.607    0.783    1.088
HBB, n=50    -0.323   -0.276   -0.230   -0.181    1.128    1.432    1.863    2.610
UB, n=100    -0.424   -0.368   -0.319   -0.272    0.421    0.538    0.695    0.952
HBB, n=100   -0.386   -0.329   -0.275   -0.223    0.878    1.112    1.443    2.016
UB, n=200    -0.481   -0.406   -0.346   -0.292    0.395    0.501    0.642    0.866
HBB, n=200   -0.469   -0.394   -0.333   -0.277    0.538    0.680    0.877    1.203
UB, n=500    -0.535   -0.437   -0.365   -0.304    0.369    0.465    0.588    0.783
HBB, n=500   -0.534   -0.437   -0.365   -0.303    0.372    0.470    0.593    0.791


Before proving Theorems 1.3.1 and 1.3.2, we first present two results shared by concave criterion functions; for more details and applications, see Giurcanu and Trindade (2006).


Proof of Theorem 1.3.1. By Proposition A.0.1, ignoring sets of probability zero, $m_n(\theta) \xrightarrow{P} m(\theta)$ pointwise. Since pointwise convergence in probability for concave functions on an open set implies uniform convergence in probability on compact subsets of that open set (Pollard 1991, Section 6), using Proposition A.0.2 we obtain

    $\sup_{\theta \in S(\theta_0,\delta)} m_n(\theta) \xrightarrow{P} \sup_{\theta \in S(\theta_0,\delta)} m(\theta)$.    (A-2)

Since $\theta_0$ is globally identifiable, $\sup_{\theta \in S(\theta_0,\delta)} m(\theta) < m(\theta_0)$.

Proof of Theorem 1.3.2. By the uniform convergence established above,

    $\sup_{\theta \in D} |m_n(\theta) - m(\theta)| \xrightarrow{P} 0$.    (A-3)

The consistency of $\hat\theta_{n,1}$ and (A-3) imply that for every $\theta_2$ with $\theta = (\theta_{0,1}, \theta_2) \in C$,

    $m_n(\hat\theta_{n,1}, \theta_2) \xrightarrow{P} m(\theta_{0,1}, \theta_2)$.    (A-4)

To see this, note that for large enough $n$, and for any compact neighborhood $D$ of $\theta_0$ containing $(\theta_{0,1}, \theta_2)$ and contained in $C$,

    $P\bigl(|m_n(\hat\theta_{n,1},\theta_2) - m(\theta_{0,1},\theta_2)| \ge \epsilon\bigr) \le P\bigl(|m_n(\hat\theta_{n,1},\theta_2) - m(\hat\theta_{n,1},\theta_2)| \ge \epsilon/2\bigr) + P\bigl(|m(\hat\theta_{n,1},\theta_2) - m(\theta_{0,1},\theta_2)| \ge \epsilon/2\bigr) \le P\bigl(\sup_{\theta\in D}|m_n(\theta)-m(\theta)| \ge \epsilon/2\bigr) + P\bigl(|m(\hat\theta_{n,1},\theta_2) - m(\theta_{0,1},\theta_2)| \ge \epsilon/2\bigr) \to 0$ as $n \to \infty$.

Applying Theorem 1.3.1, we obtain $\arg\max_{\theta_2 : \theta \in C} k_n(\theta_2) \xrightarrow{P} \arg\max_{\theta_2 : \theta \in C} k(\theta_2)$, where $k(\cdot) = m(\theta_{0,1}, \cdot)$ is the in-probability limit of $k_n$. Noting that $\hat\theta_{n,2} = \arg\max_{\theta_2} k_n(\theta_2)$ and $\theta_{0,2} = \arg\max_{\theta_2} k(\theta_2)$ gives the required result.


Proof of Theorem 2.2.1. Consider the Lagrangian associated with the constrained minimization (2-6), with multipliers $\lambda$ and $\mu$ for the moment and normalization constraints. From (2-1a), (2-2), and (2-5) it follows that $\lambda|_{\theta=\hat\theta} = 0$ and $\mu|_{\theta=\hat\theta} = 0$, so that (B-2) follows from (B-1). Differentiating (2-1b), it follows that $\sum_{i=1}^n \nabla p_i = 0$, so by (2-1a) we obtain $\nabla\mu|_{\theta=\hat\theta} = (\alpha+1)\sum_{i=1}^n \nabla p_i|_{\theta=\hat\theta} = 0$, and from (B-2) it follows that (B-3) holds. Multiplying (B-3) by $(\nabla p_i)^T|_{\theta=\hat\theta}$ and summing over $i$ yields (B-4). Differentiating (2-1b) with respect to $\theta$ at $\hat\theta$ yields (B-5). From (B-4) and (B-5), (B-6) follows.


Multiplying (B-3) by $b(X_i,\hat\theta)^T$ and summing over $i$ yields (B-7). From (B-5) and (B-7) it follows that (B-8) holds, and from (B-6) and (B-8) we obtain (B-9) (cf. (1-5)). Now, consider $\mathcal{F} = \{F_\theta\}_{\theta\in\Theta}$ as a parametric family of distributions, indexed by $\theta$. The Fisher information corresponding to this parametric model is given by (B-10). If we evaluate the Fisher information at $\hat\theta$, from (B-10) we obtain (B-11). From (B-9) and (B-11) it follows that the inverse of the Fisher information matrix evaluated at $\hat\theta$ equals the sandwich estimator of the asymptotic covariance of $\hat\theta$.
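The identity sketched in (B-9)-(B-11) is the least favorable property stated in the introduction. In the notation used here, with $\hat D = n^{-1}\sum_i \nabla b(X_i,\hat\theta)$ and $\hat V = n^{-1}\sum_i b(X_i,\hat\theta)b(X_i,\hat\theta)^T$ (a reconstruction of the conclusion only, since the intermediate displays are not recoverable):

```latex
I(\hat\theta) \;=\; \hat{D}^{T}\hat{V}^{-1}\hat{D},
\qquad\text{so that}\qquad
I(\hat\theta)^{-1} \;=\; \bigl(\hat{D}^{T}\hat{V}^{-1}\hat{D}\bigr)^{-1},
```

the usual sandwich estimator of the asymptotic variance of $\hat\theta$; this is what justifies "mimicking" the parametric bootstrap within the pseudo-parametric family.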


It follows that the weights defined by (2-6) exist with probability approaching one. This result extends a result of Owen (2001).

Proof of Lemma 2.3.1. Suppose, by way of contradiction, that along a subsequence there are events of probability at least $\delta$ on which, for some unit vector $\lambda_k$, $P^\omega_n(\lambda_k^T x > 0) = 1$, and hence

    $\sup_{\|\lambda\|=1} P^\omega_n(\lambda^T x > 0) = 1$.    (B-14)

Using a Glivenko-Cantelli result given by Theorem 19.4 of van der Vaart (1998, p. 270), the following convergence holds with probability 1:

    $\lim_n \sup_{\|\lambda\|=1} \bigl|P_n(\lambda^T x > 0) - P(\lambda^T Z > 0)\bigr| = 0$,    (B-15)


From (B-15), it follows that

    $\sup_{\|\lambda\|=1} P_n(\lambda^T x > 0) \to \sup_{\|\lambda\|=1} P(\lambda^T Z > 0)$    (B-16)

almost surely as $n \to \infty$. We now take $B = \limsup_k A_{n_k} = \bigcap_{k\ge 1}\bigcup_{m\ge k} A_{n_m}$. Since $P(A_{n_m}) \ge \delta$ for all $m \ge 1$, it follows that $P(B) \ge \delta$. For almost all $\omega \in B$, from (B-14) and (B-16) it follows that

    $\limsup_n \sup_{\|\lambda\|=1} P^\omega_n(\lambda^T x > 0) = 1 = \sup_{\|\lambda\|=1} P(\lambda^T Z > 0)$.    (B-17)

Since $V$ is positive definite (Assumption 2.3.3), we know that $\lambda^T V \lambda > 0$ for all $\lambda$ such that $\|\lambda\| = 1$. Thus $P(\lambda^T Z > 0) = 1/2$ for all $\|\lambda\| = 1$; hence $\sup_{\|\lambda\|=1} P(\lambda^T Z > 0) = 1/2$, contradicting (B-17).

The next lemma gives the order of the Lagrange multiplier, and generalizes Theorem 3.2 of Owen (2001, pp. 219-222).

Lemma B.2.1. Under Assumptions 2.3.1-2.3.3, $\lambda = O_P(n^{-1/2})$.

Proof. From (2-6), for the particular case when $\alpha = 0$, it follows that the weights are given by $p_i(\hat\theta) = \bigl(n(1 + \lambda^T b(X_i,\hat\theta))\bigr)^{-1}$, where $\lambda$ satisfies $\sum_{i=1}^n p_i(\hat\theta)\, b(X_i,\hat\theta) = 0$. Hence

    $\frac{1}{n}\sum_{i=1}^{n} \frac{b(X_i,\hat\theta)}{1 + \lambda^T b(X_i,\hat\theta)} = 0$.    (B-18)

Writing $\lambda = \|\lambda\|\tilde\lambda$ with $\|\tilde\lambda\| = 1$, from (B-18),

    $0 = \tilde\lambda^T \bar b_n(\hat\theta) - \frac{\|\lambda\|}{n}\sum_{i=1}^{n} \frac{\bigl(\tilde\lambda^T b(X_i,\hat\theta)\bigr)^2}{1 + \lambda^T b(X_i,\hat\theta)}$.    (B-19)


From (B-19) it follows that, with $\tilde S = n^{-1}\sum_{i=1}^n b(X_i,\hat\theta)\, b(X_i,\hat\theta)^T$,

    $\frac{\|\lambda\|\,\tilde\lambda^T \tilde S \tilde\lambda}{1 + \|\lambda\| W_n} \le \bigl|\tilde\lambda^T \bar b_n(\hat\theta)\bigr| = O_P(n^{-1/2})$,    (B-21)

where $W_n = \max_{i=1,\dots,n} \|b(X_i,\hat\theta)\|$. From Lemma B.2.2 we have $W_n = o_P(n^{1/2})$; hence, from (B-21) it follows that $\lambda = O_P(n^{-1/2})$.

The next lemma gives the order of $W_n$ defined above.

Lemma B.2.2. Under Assumptions 2.3.1-2.3.3, $W_n = o_P(n^{1/2})$, where $W_n = \max_{i=1,\dots,n}\|b(X_i,\hat\theta)\|$.

Proof. Under Assumption 2.3.2, by the mean value theorem and the triangle inequality, it follows that $\|b(X_i,\hat\theta)\| \le \|b(X_i,\theta_0)\| + \|\hat\theta - \theta_0\|\, k(X_i)$;


hence

    $\max_{i=1,\dots,n}\|b(X_i,\hat\theta)\| \le \max_{i=1,\dots,n}\|b(X_i,\theta_0)\| + \|\hat\theta - \theta_0\| \max_{i=1,\dots,n} k(X_i)$.    (B-23)

Using Lemma 11.2 of Owen (2001, p. 218), it follows that $\max_{i=1,\dots,n}\|b(X_i,\theta_0)\| = o_P(n^{1/2})$ and $\max_{i=1,\dots,n} k(X_i) = o_P(n^{1/2})$. Using that $\hat\theta = \theta_0 + O_P(n^{-1/2})$, from (B-23) we have $W_n = o_P(n^{1/2})$.

The following lemma provides us with an order of convergence that will be needed in the proofs of the following theorems.

Lemma B.2.3. Under Assumptions 2.3.1-2.3.3, $\max_{i=1,\dots,n}\bigl(1 + \lambda^T b(X_i,\hat\theta)\bigr)^{-1} = O_P(1)$.

Proof. On the event $D_n = \{\|\lambda\| W_n < 1\}$,

    $\frac{1}{1 + \lambda^T b(X_i,\hat\theta)} \le \frac{1}{1 - \|\lambda\| W_n}$.    (B-24)

Since $\lambda = O_P(n^{-1/2})$ and $W_n = o_P(n^{1/2})$, it follows that $\|\lambda\| W_n = o_P(1)$ and $P(D_n) \to 1$. From (B-24), we obtain $\max_{i=1,\dots,n}\bigl(1 + \lambda^T b(X_i,\hat\theta)\bigr)^{-1} = O_P(1)$.

The next proof shows the consistency of the biased bootstrap for the sample mean of the moment conditions.

Proof of Theorem 2.3.1. We have

    $E_{\hat\theta}\, b(X^*_1,\hat\theta) = \sum_{i=1}^n p_i\, b(X_i,\hat\theta) = 0$    (B-25)

and

    $\mathrm{Var}_{\hat\theta}\, b(X^*_1,\hat\theta) = \sum_{i=1}^n p_i\, b(X_i,\hat\theta)\, b(X_i,\hat\theta)^T$.    (B-26)


Write (B-26) as the sum of two terms, (B-27). From Assumption 2.3.2 it follows that a uniform law of large numbers holds, as an application of Theorem 19.4 of van der Vaart (1998, p. 270):

    $\sup_{\theta\in\Theta} \Bigl\|\frac{1}{n}\sum_{i=1}^n b(X_i,\theta)\, b(X_i,\theta)^T - V(\theta)\Bigr\| \xrightarrow{P} 0$,

so the first term of (B-27) converges in probability to the asymptotic covariance $V$ as $n \to \infty$ (for this we also use the fact that $V(\theta)$ is continuous at $\theta_0$, which follows from Assumption 2.3.2). For the second term of (B-27), using the fact that $p_i = \bigl(n(1+\lambda^T b(X_i,\hat\theta))\bigr)^{-1}$ from (2-9), we obtain a chain of norm inequalities (we use $\|\cdot\|$ to denote the Euclidean norm for vectors and the induced operator norm for matrices) whose next-to-last relation follows from Lemmas B.2.1-B.2.3. Since $X^*$ is a resample from $F_{\hat\theta}$, which changes with $n$, the Central Limit Theorem for triangular arrays is used to complete the proof. It suffices to show that for every $\epsilon > 0$ (van der Vaart 1998, p. 20)

    $E_{\hat\theta}\bigl[\|b(X^*_i,\hat\theta)\|^2\, I\bigl(\|b(X^*_i,\hat\theta)\| \ge \epsilon n^{1/2}\bigr)\bigr] \xrightarrow{P} 0$ as $n \to \infty$.    (B-29)


We have

    $E_{\hat\theta}\bigl[\|b(X^*_i,\hat\theta)\|^2 I(\|b(X^*_i,\hat\theta)\| \ge \epsilon n^{1/2})\bigr] = \sum_{i=1}^n p_i \|b(X_i,\hat\theta)\|^2 I\bigl(\|b(X_i,\hat\theta)\| \ge \epsilon n^{1/2}\bigr) \le \sum_{i=1}^n p_i \|b(X_i,\hat\theta)\|^2\, I\bigl(W_n \ge \epsilon n^{1/2}\bigr)$.    (B-30)

As before, $\sum_{i=1}^n p_i \|b(X_i,\hat\theta)\|^2 \xrightarrow{P} E\|b_{\theta_0}\|^2$. From Lemma B.2.2, $I(W_n \ge \epsilon n^{1/2}) \xrightarrow{P} 0$, and hence (B-29) follows from (B-30). Let $G_{\hat\theta,n} = \mathcal{L}\bigl(\sqrt{n}\,\bar b^*_n(\hat\theta) \mid F_{\hat\theta}\bigr)$. Using the subsequence criterion (Kallenberg 2002, Lemma 4.2, p. 63), it follows that for any subsequence $(m_n) \subseteq (n)$ there is a further subsequence $(l_n) \subseteq (m_n)$ such that (B-29) holds along $(l_n)$ almost surely. Using the Central Limit Theorem for triangular arrays along $(l_n)$, it follows that $\rho(G_{\hat\theta,l_n}, G) \xrightarrow{a.s.} 0$, where $\rho$ is a metric on the space of distributions that metrizes weak convergence. From this, using again the subsequence criterion, it follows that $\rho(G_{\hat\theta,n}, G) \xrightarrow{P} 0$. In other words, we have shown that any subsequence of $\{G_{\hat\theta,n}\}$ has a further subsequence that converges in distribution to $G$ almost surely, so that the sequence converges in distribution in probability.

We now prove the (conditional) uniform convergence in probability result that will be used in proving the consistency of the GMM estimators (and their bootstrap versions) in Theorem 2.3.2.

Proof of Lemma 2.3.2. Following the bracketing argument of van der Vaart (1998), for every $\epsilon > 0$ there exists a finite collection of open balls $U_1,\dots,U_l$ such that $\Theta \subseteq \bigcup_{i=1}^l U_i$, $b_{U_i}(x) \le b(x,\theta) \le b^{U_i}(x)$ for all $\theta \in U_i$ and all $x$, and $E_F|b^{U_i} - b_{U_i}| \le \epsilon$, where $b_{U_i}(x) = \inf_{\theta\in U_i} b(x,\theta)$ and $b^{U_i}(x) = \sup_{\theta\in U_i} b(x,\theta)$. For all $\theta \in U_j$,


the bracketing bounds control $n_1^{-1}\sum_i b(X^*_i,\theta)$ above and below, and by Theorem 2.3.1 the bracketing averages converge conditionally in probability to their limits, (B-32). From (B-32), we obtain

    $\sup_{\theta\in\Theta} \Bigl|\frac{1}{n_1}\sum_{i=1}^{n_1} b(X^*_i,\theta) - E_F b_\theta\Bigr| = o_P(1)$,    (B-33)

which gives the desired result.

Proof of Theorem 2.3.2. By Lemma 2.3.2, we have

    $\sup_{\theta\in\Theta} |Q^*_n(\theta) - Q(\theta)| = o_P(1)$    (B-34)

and

    $\sup_{\theta\in\Theta} |Q_n(\theta) - Q(\theta)| = o_P(1)$,    (B-35)

where we use the same technique for proving the first uniform convergence result, (B-34). By Corollary 1.2.1 we obtain the consistency of $\hat\theta$. Using standard techniques, this in turn implies that $\hat\theta = \theta_0 + O_P(n^{-1/2})$, so that all conditions of Theorem 2.3.1 hold. Since $Q^*_n(\hat\theta^*) \le Q^*_n(\hat\theta) + o_P(1)$ (by hypothesis) and $Q^*_n(\hat\theta) = Q_n(\hat\theta) + o_P(1)$ (by Theorem 2.3.1), we obtain $Q^*_n(\hat\theta^*) \le Q_n(\hat\theta) + o_P(1)$. Hence, using the fact that $Q_n(\hat\theta) \ge Q(\theta_0) - o_P(1)$, we have


ThenextlemmawillbeusedintheproofoftheThereom 2.3.3 .ItisaversionofSlutsky'stheoremformulatedforconditionalconvergenceindistributioninprobability. 2.3.3 0=n(^)=n(^)+(^^)rn(^)+1 2(^^)2r2n(~);(B{39)where 91


so that

    $(\hat\theta^* - \hat\theta)\,\nabla^2\psi_n(\tilde\theta^*) = o_P(1)$.    (B-43)

By substituting (B-42) and (B-43) in (B-39), we obtain

    $-\psi_n(\hat\theta) = (\hat\theta^* - \hat\theta)\bigl(\nabla\psi_n(\hat\theta) + \tfrac{1}{2}(\hat\theta^* - \hat\theta)\,\nabla^2\psi_n(\tilde\theta^*)\bigr) = (\hat\theta^* - \hat\theta)\bigl(D^T W D + o_P(1)\bigr)$.    (B-44)

Hence, by Theorem 2.3.1, it follows that $\sqrt{n}(\hat\theta^* - \hat\theta)$ converges conditionally in distribution in probability; this extends the result of Hahn (1996).


given (B-45). Hence, we need to show now that the corresponding convergence (B-46) holds. Under the conditions of Theorem 2.3.1, it is easy to prove that the analogous convergence holds for the uniform bootstrap. From (B-46), by the (conditional) Slutsky theorem, it is hence enough to show that the normalized moment vector converges conditionally in distribution in probability.

Proof of Theorem 2.3.4. As in the proof of Theorem 2.3.3, using the Taylor expansion we obtain (B-47). From (B-45), for the particular case when $W = V^{-1}$, it follows that (B-48) holds.


Combining (B-47) and (B-48), we obtain (B-49). Since $V^{-1/2}D\,(D^T V^{-1} D)^{-1} D^T V^{-1/2}$ is idempotent of rank $p = 1$, it follows that $I - V^{-1/2}D\,(D^T V^{-1}D)^{-1}D^T V^{-1/2}$ is idempotent of rank $q - p$ (remember that we consider the case $p = 1$). Moreover, from Theorem 2.3.3 and Theorem 2.3.1, it follows that $V^{-1/2}\sqrt{n}\,\bar b^*_n(\hat\theta^*)$ converges conditionally in distribution in probability to the corresponding projected Gaussian limit. Consequently, $nQ^*_n(\hat\theta^*) \Rightarrow \chi^2_{q-p}$ conditionally in probability.


By Theorem 2.3.1 for the uniform bootstrap and Theorem 2.3.3, (B-52) holds, so that the following (unconditional) weak convergence holds: from (B-49) it follows that

    $n\,\bar b_n(\hat\theta)^T V^{-1/2} A\, V^{-1/2}\, \bar b_n(\hat\theta) \Rightarrow \chi^2_{q-p}$.

From (B-53), we see that $C = A + A V^{1/2} A V^{-1/2} A$ should be idempotent of rank $q - p$ (Driscoll 1999). Of course this is not always the case; e.g., for $V = I$, $C = 2A$, which is not idempotent. The next results are proved under Assumptions 2.4.1-2.4.3.


Since (B-56) holds, (B-57) and (B-58) together give (B-59). By Lemma B.3.1, the limit in (B-59) is

    $I - V^{-1/2} D\,(D^T V^{-1} D)^{-1} D^T V^{-1/2}$,

which is a matrix of rank $q - p$.


Proof of Theorem 2.4.2. We have

    $E_*\Bigl(z^*_i\epsilon^*_i - \frac{1}{n}\sum_{j=1}^n z_j\hat\epsilon_j\Bigr) = \frac{1}{n}\sum_{j=1}^n z_j\hat\epsilon_j - \frac{1}{n}\sum_{j=1}^n z_j\hat\epsilon_j = 0$    (B-62)

and

    $\mathrm{Var}_*\Bigl(z^*_i\epsilon^*_i - \frac{1}{n}\sum_{j=1}^n z_j\hat\epsilon_j\Bigr) = \frac{1}{n}\sum_{j=1}^n \hat\epsilon_j^2 z_j z_j^T - \Bigl(\frac{1}{n}\sum_{j=1}^n z_j\hat\epsilon_j\Bigr)\Bigl(\frac{1}{n}\sum_{j=1}^n z_j\hat\epsilon_j\Bigr)^T$.    (B-63)

Since $(x^*_i, z^*_i, \epsilon^*_i)$ are sampled from the empirical distribution on the sample $\mathcal{Z}$, which changes with $n$, we use the Central Limit Theorem for triangular arrays. By Proposition 2.27 of van der Vaart (1998), we need to verify that the Lindeberg condition holds, i.e., for any $\epsilon > 0$,

    $E_*\Bigl[\Bigl\|z^*_i\epsilon^*_i - \frac{1}{n}\sum_{j=1}^n \hat\epsilon_j z_j\Bigr\|^2\, I\Bigl(\Bigl\|z^*_i\epsilon^*_i - \frac{1}{n}\sum_{j=1}^n \hat\epsilon_j z_j\Bigr\| \ge \epsilon n^{1/2}\Bigr)\Bigr] \xrightarrow{P} 0$.    (B-65)


Under Assumptions 2.4.1-2.4.3,

    $\max_{j=1,\dots,n}\|z_j\hat\epsilon_j\| + \Bigl\|\frac{1}{n}\sum_{j=1}^n \hat\epsilon_j z_j\Bigr\| = \max_{j=1,\dots,n}\|z_j(y_j - \hat\theta^T x_j)\| + o_P(1) \le \max_{j=1,\dots,n}\|z_j(y_j - \theta_0^T x_j)\| + \|\hat\theta - \theta_0\|\max_{j=1,\dots,n}\|z_j\|\|x_j\| + o_P(1) = o_P(n^{1/2}) + O_P(n^{-1/2})\, o_P(n^{1/2}) = o_P(n^{1/2})$,

where for the last relation we used the result of Lemma 11.2 of Owen (2001, p. 218). Hence $I\bigl(\max_{j=1,\dots,n}\|z_j\hat\epsilon_j - n^{-1}\sum_{k}\hat\epsilon_k z_k\| \ge \epsilon n^{1/2}\bigr) \xrightarrow{P} 0$. Moreover, using the same arguments as in the proof of (B-64), $n^{-1}\sum_j \hat\epsilon_j^2 z_j z_j^T$ converges in probability, so (B-65) holds. As before, using the subsequence argument, it follows that any subsequence has a further subsequence that converges in distribution conditionally almost surely, so that the sequence converges in distribution conditionally in probability.


Proof of Theorem 2.4.3. Using that $n^{-1}X^T Z = n^{-1}\sum_{i=1}^n x_i z_i^T = Q_{xz} + o_P(1)$ and $n^{-1}Z^T Z = n^{-1}\sum_{i=1}^n z_i z_i^T = Q_{zz} + o_P(1)$, we obtain

    $n^{1/2}(\hat\theta^* - \hat\theta) = (Q_{xz}Q_{zz}^{-1}Q_{xz}^T)^{-1} Q_{xz} Q_{zz}^{-1}\, n^{-1/2}\sum_{i=1}^n \Bigl(z^*_i\epsilon^*_i - \frac{1}{n}\sum_{j=1}^n z_j\hat\epsilon_j\Bigr) + o_P(1)$.    (B-69)

Using Theorem 2.4.2, we only need to prove that

    $(Q_{xz}Q_{zz}^{-1}Q_{xz}^T)^{-1} Q_{xz}Q_{zz}^{-1}\, n^{-1/2}\sum_{i=1}^n z^*_i\epsilon^*_i$

converges to the limit stated in (B-70). By Lemma B.3.1, we obtain the form of the limiting covariance; hence (B-70) holds, so that (2-44) holds.


Proof of Lemma 3.3.1. Suppose, by way of contradiction, that along a subsequence there are events of probability at least $\delta$ on which, for some unit vector $\lambda_k$, $P^\omega_N(\lambda_k^T x > 0) = 1$, and hence $\sup_{\|\lambda\|=1} P^\omega_N(\lambda^T x > 0) = 1$. Using the Glivenko-Cantelli result given by Theorem 19.4 of van der Vaart (1998, p. 270), the following convergence holds with probability one:

    $\lim_n \sup_{\|\lambda\|=1} \bigl|P_n(\lambda^T x > 0) - P(\lambda^T Z > 0)\bigr| = 0$,    (C-2)

where $Z \sim N(0, \Sigma_1)$. From (C-2), it follows that $\sup_{\|\lambda\|=1} P_n(\lambda^T x > 0) \to \sup_{\|\lambda\|=1} P(\lambda^T Z > 0)$ almost surely as $n \to \infty$. We now take $B = \limsup_k A_{N_k} = \bigcap_{k\ge1}\bigcup_{m\ge k} A_{N_m}$. Since $P(A_{N_m}) \ge \delta$ for all $m \ge 1$, it follows that $P(B) \ge \delta$. For almost all $\omega \in B$,

    $\limsup_k \sup_{\|\lambda\|=1} P^\omega_{N_k}(\lambda^T x > 0) = 1 = \sup_{\|\lambda\|=1} P(\lambda^T Z > 0)$.    (C-3)


Since $\Sigma_1$ is positive definite (Assumption 3.3.3), we know that $\lambda^T \Sigma_1 \lambda > 0$ for all $\lambda$ such that $\|\lambda\| = 1$. Thus $P(\lambda^T Z > 0) = 1/2$ for all $\|\lambda\| = 1$; hence $\sup_{\|\lambda\|=1} P(\lambda^T Z > 0) = 1/2$, contradicting (C-3).

Let $W_N = \max_{i=1,\dots,N} \|U_i\|$. The next lemmas give the orders of $\lambda$ and $W_N$, and generalize the results of the previous section to dependent data.

Lemma C.0.2. Under Assumption 3.3.1, $\lambda = O_P(l N^{-1/2})$.

Proof. From (3-11), it follows that

    $\frac{1}{N}\sum_{i=1}^N \frac{U_i}{1 + \lambda^T U_i} = 0$.    (C-6)

Writing $\lambda = \|\lambda\|\tilde\lambda$ with $\|\tilde\lambda\| = 1$, and setting $\tilde S = N^{-1}\sum_{i=1}^N U_i U_i^T$, from (C-6) it follows that

    $\frac{\|\lambda\|\,\tilde\lambda^T \tilde S \tilde\lambda}{1 + \|\lambda\|\max_{i=1,\dots,N}\|U_i\|} = \frac{\|\lambda\|\,\tilde\lambda^T \tilde S \tilde\lambda}{1 + \|\lambda\| W_N} \le \bigl|\tilde\lambda^T \bar U_N\bigr|$,    (C-7)

where $\bar U_N = N^{-1}\sum_{i=1}^N U_i$.


Using the moment bounds in Lahiri (2003, p. 52), the following orders hold:

    $\frac{l}{N}\sum_{i=1}^N U_i U_i^T = \Sigma_1 + o_P(1)$ and $\bar U_N = O_P(N^{-1/2})$,

so that from (C-7) we obtain the order $\lambda = O_P(l N^{-1/2})$ for the Lagrange multiplier.

Lemma C.0.3. Under Assumption 3.3.1, $W_N = o_P(l^{-1} N^{1/2})$.

Proof. We have

    $\max_{i=1,\dots,N}\|U_i(\hat\theta)\| \le \max_{i=1,\dots,N}\|U_i(\theta_0)\| + \|\hat\theta - \theta_0\| \max_{i=1,\dots,N} \frac{1}{l}\sum_{j=i}^{i+l-1} k(X_j)$.    (C-11)

Hence, it is enough to show that $\max_{i=1,\dots,N}\|U_i(\theta_0)\| = o_P(l^{-1} N^{1/2})$. Since for any integer $i \ge 1$, $E\|b(X_i,\theta_0)\|^{2+\delta} < \infty$, it follows that for any $A > 0$, $\sum_n P(\|b(X_n,\theta_0)\|^{2+\delta} \ge An) < \infty$. Using the (strict) stationarity property, the $b(X_i,\theta_0)$'s are identically distributed; hence


In the proof of Theorem 3.3.1, we will also need the following result.

Lemma C.0.4. Under Assumption 3.3.1, $\max_{i=1,\dots,N}\bigl(1 + \lambda^T U_i\bigr)^{-1} = O_P(1)$.

Proof. On the event $D_N = \{\|\lambda\| W_N < 1\}$,

    $\frac{1}{1 + \lambda^T U_i} \le \frac{1}{1 - \|\lambda\| W_N}$.    (C-13)

Since $\lambda = O_P(l N^{-1/2})$ and $W_N = o_P(l^{-1} N^{1/2})$, it follows that $P(D_N) \to 1$ and $\|\lambda\| W_N = o_P(1)$. From (C-13), $\max_{i=1,\dots,N}\bigl(1 + \lambda^T U_i\bigr)^{-1} = O_P(1)$.

Proof of Theorem 3.3.1. We have

    $E_{\hat\theta}\, U^*_i = \sum_{i=1}^N p_i U_i = 0$    (C-14)

and

    $\mathrm{Var}_{\hat\theta}\, U^*_i = \sum_{i=1}^N p_i U_i U_i^T$.    (C-15)


Similarly, for the rescaled block averages,

    $\mathrm{Var}_{\hat\theta}\, U^*_{1,i} = \sum_{i=1}^N p_i U_{1,i} U_{1,i}^T$.

Using the same result as in Lahiri (2003, p. 52), we obtain

    $\frac{l}{N}\sum_{i=1}^N U_i U_i^T = \Sigma_1 + o_P(1)$,

where Lemmas C.0.2-C.0.4 are used. Since $E_{\hat\theta}[U^*_{1,i}] = 0$ and $\mathrm{Var}_{\hat\theta}[U^*_{1,i}] \xrightarrow{P} \Sigma_1$, the central limit theorem for triangular arrays is used. We only need to verify that the Lindeberg condition holds, i.e., for every $\epsilon > 0$,

    $E_{\hat\theta}\bigl[\|U^*_{1,i}\|^2\, I\bigl(\|U^*_{1,i}\| \ge \epsilon b^{1/2}\bigr)\bigr] \xrightarrow{P} 0$.    (C-19)

Since $U^*_{1,i}$, $i = 1,\dots,b$, are conditionally iid, it follows that

    $E_{\hat\theta}\bigl[\|U^*_{1,i}\|^2 I(\|U^*_{1,i}\| \ge \epsilon b^{1/2})\bigr] = \sum_{i=1}^N p_i \|U_{1,i}\|^2 I\bigl(\|U_{1,i}\| \ge \epsilon b^{1/2}\bigr) \le \sum_{i=1}^N p_i \|U_{1,i}\|^2\, I\bigl(\max_{i=1,\dots,N}\|U_{1,i}\| \ge \epsilon b^{1/2}\bigr)$.


Combining this with (C-17), it follows that (C-19) holds. Using the same subsequence argument as before, it follows that any subsequence has a further subsequence that converges weakly, conditionally almost surely, so that the full sequence converges weakly, conditionally in probability.

Proof of Theorem 3.3.3. By a Taylor expansion,

    $0 = \bar b^*_n(\hat\theta^*) = \bar b^*_n(\hat\theta) + \nabla\bar b^*_n(\hat\theta)(\hat\theta^* - \hat\theta) + \tfrac{1}{2}(\hat\theta^* - \hat\theta)^T \nabla^2 \bar b^*_n(\tilde\theta^*)(\hat\theta^* - \hat\theta)$.    (C-20)

Using Assumption 3.3.2, it follows that $\|\nabla^2\bar b^*_n(\tilde\theta^*)\| \le n^{-1}\sum_i k(X^*_i)$. Hence $(\hat\theta^* - \hat\theta)^T \nabla^2\bar b^*_n(\tilde\theta^*) = o_P(1)$. Since $\nabla\bar b^*_n(\hat\theta) = D + o_P(1)$, from (C-20) it follows that

    $\bar b^*_n(\hat\theta) + \tfrac{1}{2}(\hat\theta^* - \hat\theta)^T \nabla^2\bar b^*_n(\tilde\theta^*)(\hat\theta^* - \hat\theta) = -(D + o_P(1))(\hat\theta^* - \hat\theta)$.

Hence, by Theorem 3.3.1, it follows that $\sqrt{n}(\hat\theta^* - \hat\theta)$ converges weakly, conditionally in probability.


Andrews, D. W. (2000). Inconsistency of the bootstrap when a parameter is on the boundary of the parameter space. Econometrica 68, 399-405.

Angrist, J. D., G. W. Imbens, and D. B. Rubin (1996). Identification of causal effects using instrumental variables. Journal of the American Statistical Association 91, 444-455.

Angrist, J. D. and A. B. Krueger (1991). Does compulsory school attendance affect schooling and earnings? Quarterly Journal of Economics 106, 979-1014.

Babu, G. J. (1984). Bootstrapping statistics with linear combinations of chi-squares as weak limit. Sankhya Ser. A 46, 85-93.

Baggerly, K. A. (1998). Empirical likelihood as a goodness-of-fit measure. Biometrika 85, 535-547.

Beran, R. (1987). Prepivoting to reduce level error of confidence sets. Biometrika 74, 457-68.

Beran, R. (1988). Prepivoting test statistics: A bootstrap view of asymptotic refinements. Journal of the American Statistical Association 83(403), 687-697.

Bhattacharya, R. N. and J. K. Ghosh (1978). On the validity of the formal Edgeworth expansion. The Annals of Statistics 6, 434-451.

Bickel, P. J. and D. A. Freedman (1981). Some asymptotic theory for the bootstrap. The Annals of Statistics 9, 1196-1217.

Bose, A. (1981). Edgeworth correction by bootstrap in autoregressions. The Annals of Statistics 16, 1709-1722.

Brown, B. W. and W. K. Newey (1998). Efficient semiparametric estimation of expectations. Econometrica 66, 453-464.

Brown, B. W. and W. K. Newey (2002). GMM, efficient bootstrapping, and improved inference. Journal of Business and Economic Statistics 20, 507-517.

Brundy, J. and D. Jorgenson (1971). Efficient estimation of simultaneous equations by instrumental variables. Review of Economics and Statistics 53, 207-224.

Bulmann, P. (1997). Sieve bootstrap for time series. Bernoulli 3, 123-148.

Bustos, O. H. (1982). General M-estimates for contaminated p-th order autoregressive processes: consistency and asymptotic normality. Zeitschrift fur Wahrscheinlichkeitstheorie und Verwandte Gebiete 59, 491-504.

Butcher, K. F. and A. Case (1994). The effect of sibling composition on women's education and earnings. Quarterly Journal of Economics 109, 531-563.


Card, D. (1995). Using geographic variation in college proximity to estimate the returns to schooling. In Aspects of Labor Market Behavior: Essays in Honor of John Vanderkamp. University of Toronto Press.

Carlstein, E. (1986). The use of subseries methods for estimating the variance of a general statistic from a stationary time series. The Annals of Statistics 14, 1171-1179.

Choi, E. and P. Hall (2000). Bootstrap confidence regions computed from autoregressions of arbitrary order. Journal of the Royal Statistical Society, Series B (Statistical Methodology) 62, 461-477.

Corcoran, S. A. (1998). Bartlett adjustment of empirical discrepancy statistics. Biometrika 85, 967-972.

Datta, S. (1995). On a modified bootstrap for certain asymptotically nonnormal statistics. Statistics & Probability Letters 24, 91-98.

Davidson, R. and J. G. MacKinnon (1993). Estimation and Inference in Econometrics. New York: Oxford University Press.

Davison, A. and D. V. Hinkley (1997). Bootstrap Methods and their Applications. Cambridge, UK: Cambridge University Press.

Davison, A., D. V. Hinkley, and G. Young (2003). Recent developments in bootstrap methodology. Statistical Science 18, 141-157.

DiCiccio, T. J., P. Hall, and J. P. Romano (1991). Empirical likelihood is Bartlett-correctable. The Annals of Statistics 19(2), 1053-1061.

DiCiccio, T. J. and J. P. Romano (1989). The automatic percentile method: accurate confidence limits in parametric models. Canadian Journal of Statistics 17(2), 155-169.

DiCiccio, T. J. and J. P. Romano (1990). Nonparametric confidence limits by resampling methods and least favorable families. International Statistical Review 58, 59-76.

Driscoll, M. F. (1999). An improved result relating quadratic forms and chi-square distributions. The American Statistician 53, 273-275.

Durbin, J. (1954). Errors in variables. Review of the International Statistical Institute 22, 23-32.

Efron, B. (1979). Bootstrap methods: another look at the jackknife. The Annals of Statistics 7, 1-26.

Efron, B. (1992). Jackknife-after-bootstrap standard errors and influence functions. Journal of the Royal Statistical Society, Series B 54, 83-127.


Efron, B. and R. Tibshirani (1986). Bootstrap methods for standard errors, confidence intervals, and other measures of statistical accuracy. Statistical Science 1, 54-77.

Efron, B. and R. J. Tibshirani (1993). Introduction to the Bootstrap. New York: Chapman and Hall.

Freedman, D. A. (1981). Bootstrapping regression models. The Annals of Statistics 9, 1218-1226.

Freedman, D. A. (1984). On bootstrapping two-stage least squares estimates in stationary linear models. The Annals of Statistics 12, 827-842.

Freedman, D. A. and S. F. Peters (1984). Bootstrapping an econometric model: Some empirical results. Journal of Business and Economic Statistics 2, 150-158.

Gine, E. (1990). Bootstrapping general empirical measures. The Annals of Probability 18, 851-869.

Giurcanu, M. and A. Trindade (2006). Establishing consistency of M-estimators under concavity with an application to some financial risk measures. Technical Report 00000, Department of Statistics, University of Florida.

Hahn, J. (1996). A note on bootstrapping generalized method of moments estimators. Econometric Theory 12, 187-196.

Hall, A. R. (2005). Generalized Method of Moments. New York: Oxford University Press.

Hall, P. (1985). Resampling a coverage process. Stochastic Processes and their Applications 20, 231-246.

Hall, P. (1988). Theoretical comparison of bootstrap confidence intervals. The Annals of Statistics 16, 927-953.

Hall, P. (1992). The Bootstrap and Edgeworth Expansion. New York: Springer-Verlag.

Hall, P., J. Horowitz, and B. Jing (1995). On blocking rules for the bootstrap with dependent data. Biometrika 82, 561-574.

Hall, P. and J. L. Horowitz (1996). Bootstrap critical values for tests based on generalized-method-of-moments estimators. Econometrica 64, 891-916.

Hall, P. and B. La Scala (1990). Methodology and algorithms of empirical likelihood. International Statistical Review 58, 109-127.

Hall, P. and M. A. Martin (1988). On bootstrap resampling and iteration. Biometrika 75, 661-671.


Hansen, L., J. Heaton, and A. Yaron (1996). Finite-sample properties of some alternative GMM estimators. Journal of Business and Economic Statistics 14, 262-280.

Hansen, L. P. (1982). Large sample properties of generalized method of moments estimators. Econometrica 50, 1029-1054.

Hayashi, F. (2000). Econometrics. Princeton University Press.

Hogan, J. W. and T. Lancaster (2004). Instrumental variables and inverse probability weighting for causal inference from longitudinal observational studies. Statistical Methods in Medical Research 13, 17-48.

Huber, P. J. (1981). Robust Statistics. New York: Wiley-Interscience.

Imbens, G. W. (2002). Generalized method of moments and empirical likelihood. Journal of Business & Economic Statistics 20, 493-506.

Jing, B. Y. and A. T. A. Wood (1996). Exponential empirical likelihood is not Bartlett correctable. Annals of Statistics 24, 365-369.

Kallenberg, O. (2002). Foundations of Modern Probability. New York: Springer.

Kitamura, Y. (1997). Empirical likelihood methods with weakly dependent processes. The Annals of Statistics 25, 2084-2102.

Kolaczyk, E. D. (1994). Empirical likelihood for generalized linear models. Statistica Sinica 4, 199-218.

Kreiss, J. P. (1992). Bootstrap procedures for AR(infinity) processes. In Jockel, K. H., Rothe, G. and Sendler, W., eds., Bootstrapping and Related Techniques. Heidelberg: Springer.

Kunsch, H. R. (1989). The jackknife and the bootstrap for general stationary observations. The Annals of Statistics 17, 1217-1241.

Lahiri, S. N. (2003). Resampling Methods for Dependent Data. New York: Springer-Verlag.

Liang, K. Y. and S. L. Zeger (1987). Longitudinal data analysis using generalized linear models. Biometrika 73, 13-22.

Lindsay, B. G. and A. Qu (2003). Inference functions and quadratic score tests. Statistical Science 18, 394-410.

Matyas, L., ed. (1999). Generalized Method of Moments Estimation. New York: Cambridge University Press.


Newey, W. K. and R. J. Smith (2004). Higher order properties of GMM and generalized empirical likelihood estimators. Econometrica 72, 219-255.
Newhouse, J. P. and M. McClellan (1998). Econometrics in outcomes research: The use of instrumental variables. Annual Review of Public Health 19, 17-34.
Newton, M. A. and C. J. Geyer (1995). Bootstrap recycling: A Monte Carlo alternative to the nested bootstrap. Journal of the American Statistical Association 89, 905-912.
Owen, A. B. (1988). Empirical likelihood ratio confidence intervals for a single functional. Biometrika 75, 237-249.
Owen, A. B. (1990). Empirical likelihood ratio confidence regions. The Annals of Statistics 18, 90-120.
Owen, A. B. (1991). Empirical likelihood for linear models. The Annals of Statistics 19, 1725-1747.
Owen, A. B. (2001). Empirical Likelihood. New York: Chapman & Hall/CRC.
Park, C. (2000). Robust estimation and testing based on quadratic inference functions. Ph.D. dissertation, Department of Statistics, Pennsylvania State University.
Politis, D. N. and J. P. Romano (1992). A circular block resampling procedure for stationary data. In R. Lepage and L. Billard (Eds.), Exploring the Limits of Bootstrap. New York: Wiley.
Politis, D. N. and J. P. Romano (1994). The stationary bootstrap. Journal of the American Statistical Association 89, 1303-1313.
Pollard, D. (1991). Asymptotics for least absolute deviations regression estimators. Econometric Theory 7, 186-199.
Presnell, B. (2002). A note on least favorable families and power divergence. Technical Report 0000, Department of Statistics, University of Florida.
Presnell, B. and M. Giurcanu (2007). Biased-bootstrap recycling. Technical Report 0000, Department of Statistics, University of Florida.
Qin, J. and J. Lawless (1994). Empirical likelihood and general estimating equations. The Annals of Statistics 22, 300-325.
Qu, A., B. G. Lindsay, and B. Li (2000). Improving generalized estimating equations using quadratic inference functions. Biometrika 87, 823-836.


Reiersøl, O. (1941). Confluence analysis by means of lag moments and other methods of confluence analysis. Econometrica 9, 1-24.
Sargan, J. D. (1958). The estimation of economic relationships using instrumental variables. Econometrica 26, 393-415.
Serfling, R. (1980). Approximation Theorems of Mathematical Statistics. New York: Wiley.
Shao, J. and D. Tu (1995). The Jackknife and Bootstrap. New York: Springer.
Shorack, G. (1981). Bootstrapping robust regression. Technical Report 8, Department of Statistics, University of Washington.
Smith, R. (1997). Alternative semiparametric likelihood approaches to generalized method of moments estimation. Economic Journal 107, 503-519.
Stein, C. (1956). Efficient nonparametric testing and estimation. In J. Neyman (Ed.), Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, Volume 1, pp. 187-195. University of California Press.
Tauchen, G. (1986). Statistical properties of generalized method-of-moments estimators of structural parameters obtained from financial market data. Journal of Business & Economic Statistics 4, 397-416.
Tibshirani, R. (1988). Variance stabilization and the bootstrap. Biometrika 75, 433-444.
van der Vaart, A. W. (1998). Asymptotic Statistics. Cambridge: Cambridge University Press.
Ventura, V. (2000). Non-parametric bootstrap recycling. Technical Report 673, Department of Statistics, Carnegie Mellon University.
White, H. (1982a). Instrumental variables regression with independent observations. Econometrica 50, 483-499.
White, H. (1982b). Maximum likelihood estimation of misspecified models. Econometrica 50, 1-25.
Wu, C. F. J. (1986). Jackknife, bootstrap and other resampling methods in regression analysis (with discussion). The Annals of Statistics 14, 1261-1350.


Mihai Giurcanu was born on September 19, 1975, in Mangalia, Romania. Upon graduating from high school in July 1994, he enrolled as a student in the Faculty of Mathematics at the University of Bucharest, where he received a Bachelor of Arts degree in mathematics in July 1998. In September 1998 he entered a Master's program in Applied Statistics and Optimization at the Faculty of Mathematics of the University of Bucharest. During this time, he was appointed as a teaching assistant in mathematics at the Polytechnic University of Bucharest. In July 2000, he received a Master of Arts degree in Applied Statistics and Optimization from the University of Bucharest. In November 2000, he obtained a fellowship in statistics at the Weierstrass Institute in Berlin. In August 2002 he entered the Ph.D. program in the Department of Statistics at the University of Florida. During his graduate education at the University of Florida, he was also appointed as a teaching assistant and as an instructor for various courses in the Department of Statistics. He graduated in August 2007. His dissertation is entitled Biased Bootstrap Methods for Semiparametric Models. His main research interests in statistics are resampling techniques, biostatistics, econometrics, and time series.