Inference for Correlation Matrices for Longitudinal and Ordered Data

MISSING IMAGE

Material Information

Title:
Inference for Correlation Matrices for Longitudinal and Ordered Data
Physical Description:
1 online resource (129 p.)
Language:
english
Creator:
Wang, Yanpin
Publisher:
University of Florida
Place of Publication:
Gainesville, Fla.
Publication Date:

Thesis/Dissertation Information

Degree:
Doctorate ( Ph.D.)
Degree Grantor:
University of Florida
Degree Disciplines:
Biostatistics
Committee Chair:
Daniels, Michael J
Committee Members:
Ghosh, Malay
Lu, Xiaomin
Shorr, Ronald I.

Subjects

Subjects / Keywords:
autocorrelation -- bandwidth -- bayesian -- covariance -- matrix -- methods -- nonparametric -- partial -- regression
Biostatistics -- Dissertations, Academic -- UF
Genre:
Biostatistics thesis, Ph.D.
bibliography   ( marcgt )
theses   ( marcgt )
government publication (state, provincial, terriorial, dependent)   ( marcgt )
born-digital   ( sobekcm )
Electronic Thesis or Dissertation

Notes

Abstract:
Many parameters and postive-definiteness are two major obstacles in estimating and modelling a correlation matrix for longitudinal data. In addition, when longitudinal data is incomplet, incorreclty modelling the correlation matrix often results in bias in estimating mean regression parameters (Little and Rubin (2002) and Daniels and Hogan (2008)). Although the smaple covariance matrix is an unbiased estmator of the covariance matrix of a Guassian random vector, it has poor prpoerties if hte dimension (p) is large (Stein (1975)). Besides, covariancec matrices are often sparce for large p. Recently estimating large covariance matrices has seen an upsurge in practical and theoretical approaches due to a plethora of high dimensional data.In my dissertation, we introduce a flexible class of regression models for a covariance matric parameterized using marginal variances and partial autocorrelations. We propose a class of priors for the regression coefficents and examine the importance of correctly modeling the correlation structure on estimation of longitudinal (mean) trajectories via simulaitons. The regression approach is illustrated on data from a longitudinal clinical trial. In addition, we propse a computationally efficent approach to estimate (large) p-dimensional correlation matrices of orderd data based on an independent sample of size n. This approach is considerabley faster than many existing methods and only requires inversion of k (K= n-2 (even when n is derived based on a Toepltiz condition, and a plug-in bandwidth selector is suggested. The improvement is demonstrated by simulations. The simulations suggest that theToeplitz condition on partial autocorrelation matrices could be removed.
General Note:
In the series University of Florida Digital Collections.
General Note:
Includes vita.
Bibliography:
Includes bibliographical references.
Source of Description:
Description based on online resource; title from PDF title page.
Source of Description:
This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Statement of Responsibility:
by Yanpin Wang.
Thesis:
Thesis (Ph.D.)--University of Florida, 2012.
Local:
Adviser: Daniels, Michael J.
Electronic Access:
RESTRICTED TO UF STUDENTS, STAFF, FACULTY, AND ON-CAMPUS USE UNTIL 2013-08-31

Record Information

Source Institution:
UFRGP
Rights Management:
Applicable rights reserved.
Classification:
lcc - LD1780 2012
System ID:
UFE0044449:00001


This item is only available as the following downloads:


Full Text

PAGE 1

INFERENCEFORCORRELATIONMATRICESFORLONGITUDINALANDORDEREDDATAByYANPINWANGADISSERTATIONPRESENTEDTOTHEGRADUATESCHOOLOFTHEUNIVERSITYOFFLORIDAINPARTIALFULFILLMENTOFTHEREQUIREMENTSFORTHEDEGREEOFDOCTOROFPHILOSOPHYUNIVERSITYOFFLORIDA2012

PAGE 2

c2012YanpinWang 2

PAGE 3

Tomyparents,myson,andmyhusband 3

PAGE 4

ACKNOWLEDGMENTS Iplacemydeepestappreciationtomyadvisor,ProfessorMichealDaniels.Withouthisguidanceandpersistenthelp,Iwillneverbeabletoreachthisfar.Iamthankfultomydoctoralcommitteemembers,Prof.MalayGhosh,Prof.XiaominLu,andProf.RonaldShorr,whohaveprovidedplentifulsupportandvaluableinsightduringmydissertation.IoweadebtofgratitudetoallofmyteachersattheDepartmentofStatisticsforprovidinganoutstandingeducation.IwouldliketogivemyspecialthankstoTinaGreenly,SummerLayton,RobynCrawford,andMarilynSaddlerfortheirextrahelpineverypossibleway.Myspecialthankswillgotomyhusband,JiangtoLuo,whoencouragesmealltime,andmyson,BinjieLuo,whocheerstheeverystepImade.Iwillplacemydeepestgratitudetomyparents,whogivemeunconditionallyandconstantlylove.Ideeplyobligethesacricesmyparentshadtomaketoprovidemetheopportunitytohavebettereducation.Lastbutnotleast,IwouldliketothankallofmyfriendsattheUniversityofFlorida.Iwillneverforgettheirhumor,friendship,andunwaveringsupport. 4

PAGE 5

TABLEOFCONTENTS page ACKNOWLEDGMENTS .................................. 4 LISTOFTABLES ...................................... 8 LISTOFFIGURES ..................................... 9 ABSTRACT ......................................... 10 CHAPTER 1INTRODUCTION ................................... 12 2BAYESIANMODELINGOFTHEDEPENDENCEINLONGITUDINALDATAVIAPARTIALAUTOCORRELATIONSANDMARGINALVARIANCES ..... 27 2.1Background ................................... 27 2.1.1BriefliteratureReviewforEstimatingaCorrelationMatrix ..... 27 2.1.2ReviewofPartialAutocorrelationsandModelingCorrelationMatrices 28 2.1.3OutlineofthisChapter ......................... 29 2.2ModelsfortheCovarianceMatrix ....................... 30 2.2.1PartialAutocorrelations ........................ 30 2.2.2MarginalVariances ........................... 31 2.3Priorsfor ................................... 31 2.3.1ReviewofPriorsforUnstructuredPartialAutocorrelations ..... 32 2.3.2AnalternativePriorforUnstructuredPartialAutocorrelations ... 32 2.3.3ProposedPrioron .......................... 33 2.3.4ExtensiontoUnit-levelCovariates ................... 36 2.3.5Connectiontog-priors ......................... 36 2.3.6Priorfor ................................ 37 2.3.7PosteriorPropriety ........................... 38 2.4Simulations ................................... 39 2.4.1Models .................................. 39 2.4.2Results ................................. 40 2.5DataExample:SchizophreniaTrial ...................... 41 2.6Discussions ................................... 43 3ESTIMATINGLARGECORRELATIONMATRICESBYBANDINGTHEPARTIALAUTOCORRELATIONMATRIX ........................... 52 3.1Background ................................... 52 3.2ReviewofPartialAutocorrelations ...................... 53 3.3BandingthePartialAutocorrelationMatrix .................. 54 3.3.1EstimatingthePartialAutocorrelationsforeachBand ........ 55 3.3.1.1Statementofneededresults ................ 56 3.3.1.2Resultsforcomputingthemleforeachband ....... 57 5

PAGE 6

3.3.2ChoosingtheBand ........................... 58 3.3.2.1Theorems ........................... 58 3.3.2.2Overallprocedure ...................... 60 3.4Simulations ................................... 61 3.4.1Models .................................. 61 3.4.2Results ................................. 61 3.4.3ChoiceofforBonferroniCorrection ................. 63 3.5ApplicationstoSonarData .......................... 63 3.6Discussions ................................... 64 4NONPARAMETRICESTIMATIONOFLARGECORRELATIONMATRICESBYSMOOTHINGBANDSINTHEPARTIALAUTOCORRELATION ....... 70 4.1Introduction ................................... 70 4.2ReviewofPartialAutocorrelations ...................... 72 4.3SomePropertiesofk0BandPartialAutocorrelationMatrices ....... 73 4.4NonparametricEstimationofPartialAutocorrelationCoefcientswithinBands ...................................... 77 4.4.1TheoreticResults ............................ 78 4.4.2Estimation ................................ 79 4.4.2.1Choosenumberofbands .................. 79 4.4.2.2Smoothestimateswithineachband ............ 79 4.5SimulationStudy ................................ 80 4.6Applications ................................... 81 4.7Discussion ................................... 82 5CONCLUSION .................................... 90 APPENDIX ASUPPORTINGMATERIALFORCHAPTER2 ................... 92 A.1SamplingAlgorithm .............................. 92 A.2SimulatingfromtheFullConditionalfor .................. 93 A.3DerivingtheExpectedInformationMatrixfor ............... 93 A.4ProofofTheorem1 ............................... 97 BSUPPORTINGMATERIALFORCHAPTER3 ................... 101 B.1ProofofLemma 3.0.1 ............................. 101 B.2PreliminariesandNotationforProofofTheorem 3.1 ............ 102 B.3ProofofTheorem 3.1 ............................. 104 CSUPPORTINGMATERIALFORCHAPTER4 ................... 114 C.1ProveTheorem 4.1 ............................... 114 C.2EstimateAsymptoticOptimalBandwidth ................... 117 6

PAGE 7

REFERENCES ....................................... 121 BIOGRAPHICALSKETCH ................................ 128 7

PAGE 8

LISTOFTABLES Table page 2-1Posteriormeansof ................................. 45 2-2Summarymeasuresforthesimulation ....................... 49 2-3Summarycountfromthesimulations ........................ 50 2-4Descriptivesummariesofschizophreniatrial ................... 51 2-5Posteriorsummariesofschizophrenia ....................... 51 3-1Risksimulationsbasedonmetalandrockdata .................. 65 3-2RisksimulationbasedonAR(1)correlationmatrices ............... 68 3-3Risksimulationsof4bandpartialautocorrelationmatrices ............ 69 3-4Estimatednumberofbandsforrockandmetalofsonardata .......... 69 4-1SimulationrecordsforAR(1)structure ....................... 83 4-2Simulationrecordsformodel2 ........................... 84 4-3Simulationrecordsformodel3 ........................... 84 4-4Simulationrecordsformodel4 ........................... 86 8

PAGE 9

LISTOFFIGURES Figure page 2-1Marginalpriors .................................... 46 2-2Trajectoriesofsamplesize30,100,400 ...................... 47 2-3Observedtrajectoriesandttedtrajectories .................... 48 3-1Imagesofsamplecorrelationandestimatedcorrelationmatrices ........ 67 4-1Truepartialautocorrelationcurvesformodels ................... 85 4-2Samplepartialautocorrelationsandsmoothedcurveswithineachbandofthemetaldata ....................................... 87 4-3Samplepartialautocorrelationsandsmoothedcurveswithineachbandoftherockdata. ....................................... 88 4-4Intensityplotsforthemetalandrockdata ..................... 89 9

PAGE 10

AbstractofDissertationPresentedtotheGraduateSchooloftheUniversityofFloridainPartialFulllmentoftheRequirementsfortheDegreeofDoctorofPhilosophyINFERENCEFORCORRELATIONMATRICESFORLONGITUDINALANDORDEREDDATAByYanpinWangAugust2012Chair:MichaelJ.DanielsMajor:BiostatisticsManyparametersandpositive-denitenessaretwomajorobstaclesinestimatingandmodellingacorrelationmatrixforlongitudinaldata.Inaddition,whenlongitudinaldataisincomplete,incorrectlymodellingthecorrelationmatrixoftenresultsinbiasinestimatingmeanregressionparameters( LittleandRubin ( 2002 )and DanielsandHogan ( 2008 )).AlthoughthesamplecovariancematrixisanunbiasedestimatorofthecovariancematrixofaGaussianrandomvector,ithaspoorpropertiesifthedimension(p)islarge( Stein ( 1975 )).Besides,covariancematricesareoftensparseforlargep.Recently,estimatinglargecovariancematriceshasseenanupsurgeinpracticalandtheoreticalapproachesduetoaplethoraofhighdimensionaldata.Inmydissertation,weintroduceaexibleclassofregressionmodelsforacovariancematrixparameterizedusingmarginalvariancesandpartialautocorrelations.Weproposeaclassofpriorsfortheregressioncoefcientsandexaminetheimportanceofcorrectlymodelingthecorrelationstructureonestimationoflongitudinal(mean)trajectoriesviasimulations.Theregressionapproachisillustratedondatafromalongitudinalclinicaltrial.Inaddition,weproposeacomputationallyefcientapproachtoestimate(large)p-dimensionalcorrelationmatricesofordereddatabasedonanindependentsampleofsizen.Thisapproachisconsiderablyfasterthanmanyexistingmethodsandonlyrequiresinversionofk(k<
PAGE 11

whenn
PAGE 12

CHAPTER1INTRODUCTIONEstimationofcovariancematricesisimportantfortheanalysisofmultivariatedata.Basedonapopulationcovariancematrix,onecanestimateprincipalcomponentsandeigenvalues,constructlineardiscriminantfunctionsandcondenceintervals(condencebounds)onlinearfunctions,andestablishdependenciesandconditionaldependencies.Longitudinaldata,whichismultivariatedataorderedintime,possessesfeaturesofbothgeneralmultivariatedataandtimeseriesdata.Timeseriesanalysisisconcernedwithstatisticalinferencefromdatawhicharenotnecessarilyindependentandidenticallydistributed.Timeseriesanalysisoftenassumestationaritywhichdoesnotholdstrue.Inlongitudinalstudies,wehaverepeated(acrossunits)timeseries.Modelingthedependencestructurecarefullyisimportantinmakinginferencefromlongitudinaldata.Itsimportanceismagniedinincompletedatasinceincorrectlymodelingthedependencestructureoftenresultsinabiasedestimateofthemeanparameters( LittleandRubin ( 2002 ); DanielsandHogan ( 2008 )).Moreover,covariancematricesareoftensparseforlargepandhavespecialfeaturesunderordered(longitudinal)data,suchasYjandYkbeingclosertoindependenceorconditionalindependenceasjj)]TJ /F4 11.955 Tf 12.57 0 Td[(kjincreases.Findingawaytoestimatecovariancematricesiscrucialaspislarge,especiallywhensamplesizen
PAGE 13

autocorrelationmatrix=(jk),wherejj=1andfor1j
PAGE 14

2.ModelingtheDependenceStructure2.1MatrixDecompositionSupposeY1,Y2,...,Ynareindependentandidenticallydistributedrandomvectorsofdimensionpfollowingamultivariatenormaldistributionwithmeanandvariance-covariancematrix.Wereviewseveralcommondecompositionthatcanbeusefulforparsimoniouslymodelingthedependencestructure.1.SpectralDecomposition(Eigenvaluedecomposition) =PP0 (1) whereisthediagonalmatrixofeigenvalueswhichspecifyitsshapeandthecolumnvectorsofParethecorrespondingeigenvectors.Modelsusingthisdecomposition(Eq. 1 )havebeendevelopedin LeonardandHsu ( 1992 )and Chiuetal. ( 1996 ).2.ModiedCholeskyDecompositionofCovarianceMatrices TT0=B (1) whereTisauniqueunitlowertriangularmatrixwith1sasdiagonalentriesanduniquediagonalmatrixBwithpositivediagonalentries.Thisdecompositionprovidesaniceinterpretationunderalongitudinalsetting.Thebelow-diagonalelementsofTinEq. 1 arethenegativesofthecoefcientsof^Yt=t+Pt)]TJ /F5 7.97 Tf 6.59 0 Td[(1j=1t,j(Yj)]TJ /F3 11.955 Tf 9.85 0 Td[(j),thelinearleast-squarepredictorsofYtbasedonitspredecessorsYt)]TJ /F5 7.97 Tf 6.59 0 Td[(1,...,Y1.AndthediagonalentriesofBarethepredictionerrorvariances2t=var(Yt)]TJ /F6 11.955 Tf 16.34 2.66 Td[(^Yt),for1tn.Thepositive-deniteconditionofcovariancematricescanbeeasilysatisedbyassuming2t>0( Pourahmadi ( 1999 ); DanielsandPourahmadi ( 2002 )).3.Variance/CorrelationDecomposition =DRD (1) 14

PAGE 15

whereDisdiagonalmatrixwithstandarddeviationsonitsdiagonalandR=(jk)isacorrelationmatrix.4.ReparameterizeR=(jk)ppinEq. 1 topartialautocorrelationmatrix=(jk)Thepartialautocorrelationjkisdenedas:thelag-1partialautocorrelationsj,j+1j,j+1,j=1,,p)]TJ /F6 11.955 Tf 11.96 0 Td[(1andthepartialautocorrelations jk=jkjj+1,,k)]TJ /F5 7.97 Tf 6.58 0 Td[(1,k)]TJ /F4 11.955 Tf 11.96 0 Td[(j2. (1) Furthermore,wecanuseFisherz-transformationz=)]TJ /F5 7.97 Tf 10.5 4.71 Td[(1 2log1+ 1)]TJ /F12 7.97 Tf 6.59 0 Td[(transforming2()]TJ /F6 11.955 Tf 9.3 0 Td[(1,1)toz2(,+1),whichmotivatesthegeneralizedlinearmodelframeworkwithlinkfunctiong()=)]TJ /F5 7.97 Tf 10.49 4.7 Td[(1 2log1+ 1)]TJ /F12 7.97 Tf 6.58 0 Td[(.ThelagkpartialAutocorrelationsarebasedonconditionalregressionswheretheconditioningsetshavethesamenumberofelements.Theyareexchangeableinsomesense(allconditioningonk)]TJ /F6 11.955 Tf 12.47 0 Td[(1interveningvariables).Theante-dependencemodelsintroducedby Gabriel ( 1962 )havecloserelationtothisreparameterization.Wereviewitnext.Observations(Y1,...,Yp)whosejointdistributionismultivariatenormalaresth-orderante-dependenceifYjandYj+k+1areindependentgiventheinterveningobservationsYj+1,...,Yj+k,forallj=1,...,p)]TJ /F4 11.955 Tf 11.95 0 Td[(k)]TJ /F6 11.955 Tf 11.95 0 Td[(1andallks( Gabriel ( 1962 )),i.e.Cov(Yj,Yj+k+1jYj+1,...,Yj+k)=0.Hence,wecanmodeltheserandomvariablesasfollows: 8><>:y1=xT1+1yj=xTj+Psk=1jk(yj)]TJ /F7 7.97 Tf 6.58 0 Td[(k)]TJ /F4 11.955 Tf 11.95 0 Td[(xTj)]TJ /F7 7.97 Tf 6.59 0 Td[(k)+j(j=2,...,p). (1) 15

PAGE 16

wheres=min(s,j)]TJ /F6 11.955 Tf 12.54 0 Td[(1),thejsareindependentnormalrandomvariableswithzeromeanandvariance2j,andthejksareunrestrictedparameters. ZimmermanandNunez-Anton ( 2010 )compareseveralmodels,including(1)theunstructuredcovariancemodel,(2)unstructuredante-dependencemodels,(3)structuredante-dependencemodels,(4)autoregressiveintegratedmovingaverageandsimilarmodels,and(5)randomcoefcientsmodels.2.2PriorsforCovarianceMatricesThemostcommonpriorforacovariancematrixisInverseWishartdistribution,aconjugatepriorforanormalcovariancematrix.However,thispriorlacksexibility,allowingonlyoneprecisionparameterforallp(p-1)/2elementsandrequiresspecicationofameanmatrix( DanielsandKass ( 1999 )). YangandBerger ( 1994 )developareferencepriorforacovariancematrixbasedonaspectraldecompositionofthecovariancematrix. SmithandKohn ( 2002 )provideapriorforlargedatathatallowszeroelementsinthestrictlowertriangleoftheCholeskydecomposition(Eq. 1 )oftheinversecovariancematrixtoobtainaparsimoniousrepresentationofthecovariancematrix. DanielsandPourahmadi ( 2002 )proposeshrinkingelementsoftheTmatrixinEq. 1 towardspecicstructure. Barnardetal. ( 2000 )modelthecorrelationsgivenconstraintsinEq. 1 andproposeajointuniformpriorandmarginalindependentuniformpriorsonconditionaldistributionof(RjD). Liechtyetal. ( 2004 )placeaprioronRwithpdff(Rj,2)Qj
PAGE 17

Estimatinglargecovariancematrices(piscomparableorlargerthanthesamplesizen)hasgainedincreasedinterestrecentlysincehighdimensionaldataissocommonincurrentapplications,suchasclimatedata,nancialdata,functionaldata,geneexpressiondata,andfunctionalMagneticResonanceImaging(fMRI)data.Manyapproachescanbefoundintherecentliterature.Typically,therearealotofzerosinentriesofthevariance-covariancematrix(correlationorconditionalcorrelation)aspgrows.Thesimplestestimateofisthesamplevariance-covariancematrix^=1 nPni=1(Yi)]TJ ET q .478 w 91.67 -169.33 m 101.82 -169.33 l S Q BT /F4 11.955 Tf 91.67 -179.31 Td[(Y)(Yi)]TJ ET q .478 w 137.48 -169.33 m 147.63 -169.33 l S Q BT /F4 11.955 Tf 137.48 -179.31 Td[(Y)T.^isoptimalandunbiasedestimateofwhenpn( BickelandLevina ( 2008b ); BickelandLevina ( 2008a ); Johnstone ( 2001 );and Karoui ( 2009 )).Developingmethodsinthissituationaredesired.Recentapproachesinclude1)Lassoregression,and2)Banding.Wereviewthesenext.3.1LassoRegressionBasedonLikelihoodFunctionLassoregressioncanbedenedusingaloglikelihoodfunctionoralossfunction.Suppose(yi,xi),i=1,2,...,n,wherexi=(xi1,...,xiq)Tisthepredictorofresponsevariableyi.Forsimplication,supposeyiiscentered.LetYi=xTi+,where=(1,...,q)TandN(0,2),Denition1:BasedonloglikelihoodfunctionSuppose`()istheloglikelihoodfunctionofparameterq1.Thelassoestimate^isdenedby ^=argmin`() (1) subjecttoPjijs,wheres0( Tibshirani ( 1996 )).Denition2:BasedonSquaredLoss 17

PAGE 18

Undersquarederrorloss,thelassoestimate^isdenedas ^=argminfnXi=1(y1)]TJ /F13 11.955 Tf 11.96 11.36 Td[(Xjjxij)2g (1) subjecttoPjjjjs,wheres0( Tibshirani ( 1996 )). Fu ( 1998 )showsthatEq. 1 isequivalentto ^=argminfnXi=1(y1)]TJ /F13 11.955 Tf 11.96 11.35 Td[(Xjjxij)2+Xjjjjg (1) whichisapenalizedregression,forsome.ThepropertyoftheL1normallowsLassoregressiontocapturezeroentriesofvariance-covarianceorcorrelationmatrix. Huangetal. ( 2006 )applythelassopenaltytothemodiedCholeskycoefcients. RothmanandZhou ( 2008 )useanestedlassowhichreplacesL1penaltybyanestedLassopenalty.Applyinggrouplasso, Bigotetal. ( 2009 )selectasparsesetofbasisfunctionsinthedictionaryusedtoapproximatetheprocess.Mostoftheseapproachesdealdirectlywithacovariancematrixanddonotadapteasilytoacorrelationmatrix.3.2BandingonVariance-covarianceorCorrelationMatrixToconstructaspareestimatorofacovariancematrix, BickelandLevina ( 2008b )bandthesamplecovariancematrixandtheregressionparametersoftheCholeskydecompositionofcovariancematrix,while WuandPourahmadi ( 2003 )bandtheregressionparametersafteraCholeskydecompositionofinversecovariancematrix.Todeterminethenumberofbandsk, BickelandLevina ( 2008b )minimizedthefollowingrisk,Rk=Ejj^k)]TJ /F6 11.955 Tf 11.95 0 Td[(jj(1,1)andk0=argminkfRkg 18

PAGE 19

wherejjMjj(1,1)supfjjMxjj1:jjxjj1=1g=maxjPijmijj,forM=(mij).TheyproposearesamplingschemetoestimatetheriskRkandk0byrandomlydividingtheoriginalsampleintotwodataset(1,2).1iscalledthetargetdatasetwhosesamplesizeischosentobebn 3c,thentheyestimatetheriskby^Rk=1 NNX=1jjBk(^1()))]TJ /F6 11.955 Tf 15.81 2.66 Td[(^2()jj(1,1)whereNisnumberofresampling,andBk(M)isk-bandmatrixofM=(mij)denedas: Bk(M)=[mijI(ji)]TJ /F4 11.955 Tf 11.96 0 Td[(jjk)] (1) Thenkisselectedas^k=argmink^R(k). WuandPourahmadi ( 2003 )minimizeanAIC-typecriteriondenedas,AIC(d)=npXi=1logt2(d)+2d,whered=0,1,...,bp1 3c,andt2isdenedast(d)2=8><>:n)]TJ /F5 7.97 Tf 6.59 0 Td[(1Pni=1fyt,i)]TJ /F6 11.955 Tf 11.95 0 Td[((t,t)]TJ /F5 7.97 Tf 6.59 0 Td[(1yt)]TJ /F5 7.97 Tf 6.58 0 Td[(1,i+...+t,t)]TJ /F7 7.97 Tf 6.59 0 Td[(dyt)]TJ /F7 7.97 Tf 6.59 0 Td[(d,i)g2d1n)]TJ /F5 7.97 Tf 6.59 0 Td[(1Pni=1(yt,i)]TJ /F6 11.955 Tf 13.15 0 Td[(yi)2d=0.4.ModelsforLongitudinalDataModelsforlongitudinaldatacanbeclassiedintotwogroups,1)directlyspecied(marginal)models;2)indirectlyspecied(conditional)models.Inamarginalmodel,themarginalexpectation,E(Yij),isdirectlymodeledasafunctionofcovariatesxij,e.g.E(Yij)=ij=xTij,( LiangandZeger ( 1986 ); Heagerty ( 1999 );and Heagerty ( 2002 )).Inindirectlyspeciedmodels,theeffectofcovariatesonresponsesismodeledconditionally(conditiononarandomeffectsorprevioushistoryofresponses)( ZegerandKarim ( 1991 ); HedekerandGibbons ( 1994 )). 19

PAGE 20

Here,wereviewtwomarginalmodels,oneforcontinuousresponsesandoneforbinaryresponses.Therstonecapturedependenceviaacovariancematrix,andthesecondviaacorrelationmatrix.4.1MarginalModelsforLongitudinalStudies1)Multivariatenormalmodel:LetYi=(Yi1,...Yip)Tbearesponserandomvectorandxitbeaq1vectorofcovariatesfori=1,..,n.WeassumeYiN(i,i),where E(Yijxi)=xTii. (1) WeconsidermodelsforiinChapter2.2)Multivariateprobitmodel:LetYi=(Yi1,...,Yip)Tbemultivariatebinaryresponsevector,andZi=(Zi1,...,Zip)TbealatentvariablewithZiN(xTi,i).SupposemultivariatebinaryresponsesYisatisfy:Yij(Zij)=8><>:0ifZij<01ifZij0wherej=1,2,...,p.SoP(Yi=yij,i,xi)=ZBip...ZBi1p(ZijXi,i)dZiwhereBijistheinterval(0,+1),ifyij=1and(,0),otherwise.Andp()isthepdfofapdimensionmultivariatenormalwithmeanXTiandcovariancematrixi.Foridentiability,wesetthediagonalelementsofitobe1.(Soiisacorrelationmatrix).WewillexplorewaystoputexiblestructureoniviapartialautocorrelationmodelsandwaystoconstructparsimoniousestimatesoflargecovariancematricesusingpartialautocorrelationsinChapter2,Chapter3,andChapter4.4.2MissingDataAnalysis 20

PAGE 21

Missingdataisverycommoninlongitudinalstudies.Wereviewkeyconceptsnext.4.2.1ModelsforlongitudinalmissingdatastructureSupposethefulldataresponsevectorisYi=(Yi1,...,Yip)T,Xiisapqmatrixofcovariates,andthevectorRi=(Ri1,...,Rip)Tindicateswhichcomponentsareobserved,withRij=1ifYijisobserved,andRij=0ifYijismissing.TheresponsevectorYicanbedividedintotwoparts:1)observeddata(Yiobs)and2)missingdata(Yimis),i.e.Yi=(Yiobs,Yimis).Wedenethefulldataresponsemodelasf(yjx,).Wedenethefulldatamodelasf(y,rjx,!).Therelationshipbetweenfulldataresponsemodelandfulldatamodelis:f(yjx,(!))=Xr2
PAGE 22

Therstfactoristhefulldataresponsemodel.Wecallthesecondfactorf(rjy,x, (!))asthemissingdatamechanism(MDM),wewilldiscussdifferentwaystheMDMisclassiednext.4.2.2MissingdatamechanismStartingfrom Rubin ( 1976 )and LittleandRubin ( 2002 ),missingnesswasclassiedintothreecategories:1)MissingCompletelyAtRandom(MCAR)p(RjY,x)=p(Rjx).2)MissingAtRandom(MAR).p(RjY,x)=p(RjYobs,x).3)MissingNotAtRandom(MNAR)p(RjYobs,Ymis,x)6=p(RjYobs,Y0mis,x).forYmis6=Y0mis.InaBayesiansetting,itisoftenmorenaturaltoclassifymissingnessasignorableornon-ignorable,wedeneignorablenext.Amissingdatamechanismiscalledignorable,if1)ThemissingdatamechanismisMAR,2)Thefulldataparameter!canbedecomposedas!=(, ),a)indexesthefull-dataresponsemodelf(Yj),b) indexesthemissingdatamechanismf(RjY, ).3)Theparametersand areaprioriindependent;i.e.f(, )=f()f( ). 22

PAGE 23

Underignorablemissingnessmechanism,thelikelihoodfunctioncanbefactoredoverand L(, jyobs,R,x)=L1( jR,yobs,x)L2(jyobs,x)andobserved-dataposterioralsocanbefactoredasp(, jyobs,R,x)=ff( )L1( jR,yobs,x)gff()L2(jyobs,x)g.Therefore,theobserved-dataposteriorfor,theparametersofthefulldataresponsemodel,isp(jyobs,R,x)=f()L2(jyobs,x)whichdoesnotcontain .Sothemissingdatamechanismdoesnotneedtobeexplicitlymodeled.Inthefollowingchapters,weassumemissingdataisignorable.5.NonparametricRegressionAssumewehavenpairsofobservations(Y1,X1),(Y2,X2),...,(Yn,Xn).TheresponsevariableYisrelatedtocovariantxwithfollowingregressions Yi=r(xi)+i,i=1,2,...,n (1) whererisaregressionfunctionandiisameanzerorandomprocess.Thisspecicationimpliesr(x)=E(YjX=x)andwewanttoestimatethefunctionr.5.1KernelEstimation Nadaraya ( 1964 )and Watson ( 1964 )introduceanestimatorofr(x)whichisthespecialcaseofttingaconstantlocallyatanypointx0.TheNadaraya-Wastonestimatorofr(x)isdenedasfollows, 23

PAGE 24

^r(x)=Pjxi)]TJ /F7 7.97 Tf 6.59 0 Td[(x0j0.Commonlyusedkernelsinclude, GaussianKernel:K(x)=1 2expf)]TJ /F7 7.97 Tf 16.47 4.7 Td[(x2 2g, BoxcarKernel:K(x)=1 2I(jx1) EpanechnikovKernel:K(x)=3 4(1)]TJ /F4 11.955 Tf 11.95 0 Td[(x2)I(jxj1) TricubeKernel:K(x)=70 81(1)-222(jxj3)2I(jxj1)Wedeneanestimator^r(x)ofr(x)istheonewhichminimizesaselectedlossfunction.ThemostcommonlyusedlossfunctionisLplosswhichisdenedasfollows, Lp=Zjr(x))]TJ /F6 11.955 Tf 11.58 0 Td[(^r(x)jpdF(x)1 p. (1) L1lossisresistenttooutliersandinvariantunder1-1transformations.However,L2loss(squarederrorloss)givesmanyniceproperties.Forexample,undersquarederrorloss,theBayesianestimatoristheposteriormean^=E(jY),andmeansquareerrorissumofvarianceandsquaredbiasE(^Y)]TJ /F4 11.955 Tf 11.96 0 Td[(Y)2=(E(^Y)]TJ /F4 11.955 Tf 11.96 0 Td[(E(Y)))2+E(Y)]TJ /F4 11.955 Tf 11.95 0 Td[(E(Y)2.Undersquarederrorloss,argmin^r(x)E(r(x))]TJ /F6 11.955 Tf 11.57 0 Td[(^r(x))2=argmin^r(x)fBias2+VariancegargminhfBias2+VariancegClearly,^r(x)isafunctionofsmoothparameterh(xi). 24

PAGE 25

ThedecreaseinBiasisaccompaniedbyincreasingvarianceandweneedabalancebetweenthesetwotominimizesquarederrorloss(ormeanintegratedsquarederrorloss).Thereisanextensiveliteratureonhowtochooseasmoothingparameterh.Weoutlinesomeofthemorerelevantliteraturenext.5.2ChooseSmoothParameterCommonlymethodstoestimatethesmoothingparameterinclude cross-validationmethod( Stone ( 1974 )). datadrivenplug-inmethod( Gasseretal. ( 1991 ); Ruppertetal. ( 1995 )).Leave-one-outcross-validationscoreisdenedas: CV(h)=^R(h)=1 nnXi=1(Yi)]TJ /F6 11.955 Tf 11.57 0 Td[(^r()]TJ /F7 7.97 Tf 6.59 0 Td[(i)(xi))2, (1) where^r()]TJ /F7 7.97 Tf 6.59 0 Td[(i)istheestimatorobtainedbyomittingtheithpair(Yi,Xi).WechoosehtominimizeCV(h).Leave-one-outcrossvalidationisthesimplestmethodformodelselectionbutitiscomputationallyexpensive,doesnotprovideaconsistentestimator,andresultsseriousproblemasdataareclustered( CravenandWahba ( 1979 )).Therearemanymodiedversionstoovercomethesedeciency,suchasbiasedcross-validation( ScottandTerrell ( 1987 )),partitionedcrossvalidation( Marron ( 1987 )),andgeneralizedcrossvalidation( CravenandWahba ( 1979 )).Generalizedcrossvalidationhasreceivedmuchattention. Golubetal. ( 1979 )ItmodiesCVbyintroducingafactor1 (1)]TJ /F7 7.97 Tf 6.59 0 Td[(hw(0)=n)2,i.e.GCV(h)=^R(h)=1 nPni=1(Yi)]TJ /F6 11.955 Tf 11.57 0 Td[(^r()]TJ /F7 7.97 Tf 6.59 0 Td[(i)(xi))2 (1)]TJ /F4 11.955 Tf 11.95 0 Td[(hw(0)=n)2hischosentominimizeGCV(h).Datadrivenplug-inruleshavereceivedmoreattentionrecently( Gasseretal. ( 1991 ); Ruppertetal. ( 1995 )).Theoptimalbandwidthiscalculatedbyestimatingsome 25

PAGE 26

relatedquantities,suchasvariance(2)andcurvatureofanunknownfunction(r"(x)).Itiscomputationallyefcientcomparedtocross-validationmethods.However,thepilotbandwidth,whichisminimizerofmeansquareerrorofestimating2andr"(x)withanarbitraryspecicationmayresultinoversmoothingand/ormissingimportantfeatures( Loader ( 1999 ); ParkandMarron ( 1990a )).Generally,crossvalidationismorerobust,whiletheplug-inruleismoreefcientbutdependsonsomestrongerassumptions.6.OutlineoftheDissertationMydissertationhasthreeparts.First,wedevelopBayesianmethodstomodeldependenceinlongitudinaldataviapartialautocorrelationsandmarginalvariances.Second,weproposeamethodtoestimatelargecovariancematricesbybandingthepartialautocorrelationmatrix.Third,weexplorenonparametricsmoothingofapartialautocorrelationmatrixwithinbands.BayesianModelingdependencestructure:Manyparametersandpositive-denitenessaretwomajorobstaclesinestimatingandmodellingacorrelationmatrixforlongitudinaldata.Inaddition,whenlongitudinaldataisincomplete,incorrectlymodellingthecorrelationmatrixoftenresultsinbiasinestimatingmeanregressionparameters.InChapter2,weintroduceaexibleclassofregressionmodelsforacovariancematrixparameterizedusingmarginalvariancesandpartialautocorrelations.Weproposeaclassofpriorsfortheregressioncoefcientsandexaminetheimportanceofcorrectlymodelingthecorrelationstructureonestimationoflongitudinal(mean)trajectoriesviasimulations.Theregressionapproachisillustratedondatafromalongitudinalclinicaltrial.Estimatinglargecovariancematrix:InChapter3,weproposeacomputationallyefcientapproachtoestimate(large)p-dimensionalcorrelationmatricesofordereddatabasedonanindependentsampleofsizenbybandingthepartialautocorrelationmatrix.Thenumberofbands(k)ischosenbyanexactmultiplehypothesistestingprocedure. 26

PAGE 27

Thisapproachisconsiderablyfasterthanmanyexistingmethodsandonlyrequiresinversionofkdimensionalcovariancematrices.Inaddition,theresultingestimatorisguaranteedtobepositivedeniteaslongaskn)]TJ /F6 11.955 Tf 12.12 0 Td[(2(evenwhenn
PAGE 28

CHAPTER2BAYESIANMODELINGOFTHEDEPENDENCEINLONGITUDINALDATAVIAPARTIALAUTOCORRELATIONSANDMARGINALVARIANCESManyparametersandpositive-denitenessaretwomajorobstaclesinestimatingandmodellingacorrelationmatrixforlongitudinaldata.Inaddition,whenlongitudinaldataisincomplete,incorrectlymodellingthecorrelationmatrixoftenresultsinbiasinestimatingmeanregressionparameters.Inthispaper,weintroduceaexibleclassofregressionmodelsforacovariancematrixparameterizedusingmarginalvariancesandpartialautocorrelations.Thepartialautocorrelationscanfreelyvaryintheinterval()]TJ /F6 11.955 Tf 9.3 0 Td[(1,1)whilemaintainingpositivedenitenessofthecorrelationmatrixsotheregressionparametersinthesemodelswillhavenoconstraints.Weproposeaclassofpriorsfortheregressioncoefcientsandexaminetheimportanceofcorrectlymodelingthecorrelationstructureonestimationoflongitudinal(mean)trajectoriesviasimulations.Theregressionapproachisillustratedondatafromalongitudinalclinicaltrial. 2.1BackgroundLongitudinaldata,measurementsonthesamesubjectovertime,ariseinmanyareas,fromclinicaltrialstoenvironmentalstudies.Insuchstudies,todrawvalidinference,thecovariancebetweenrepeatedobservationsonthesameindividualsneedstobeproperlymodeled.Inparticular,inincompletelongitudinaldata,mis-modelingthecovariancematrixcanresultinbiasedestimatesofxedeffectmeanparameters( LittleandRubin ( 2002 ); DanielsandHogan ( 2008 )).Twomajorobstaclesformodelingcovariancematricesare1)the(potentially)highdimensionality,and2)positive-deniteness. 2.1.1BriefliteratureReviewforEstimatingaCorrelationMatrixManyapproacheshavebeenproposedforestimatingacovariancematrixmoreefciently,whetherbyshrinkingeigenvaluestoobtainmorestability( YangandBerger ( 1994 ); EfronandMorris ( 1976 ))orreducingthedimensionviastructure( LeonardandHsu ( 1992 ); Chiuetal. ( 1996 ); Pourahmadi ( 1999 2000 ); DanielsandZhao ( 2003 )). 28

PAGE 29

Therehasalsobeenresearchonshrinkagetointroducestabilityinstructuredways( DanielsandKass ( 1999 ); DanielsandPourahmadi ( 2002 ))orwithoutstructure( Liechtyetal. ( 2004 ); Wongetal. ( 2003 )).Theseapproachescanoftenbethoughtofintermsofspecicdecompositionsofacovariancematrix.Ourapproachwillfocusonthevariance/correlationdecomposition,usedrecentlyby Barnardetal. ( 2000 ),whichdecomposesthecovariancematrixinto=DRD,whereRiscorrelationmatrixandDisdiagonalmatrixofstandarddeviations.OurapproachherewillrelyonthisdecompositionandafurtherdecompositionofthecorrelationmatrixRintopartialautocorrelationswhichwereviewnext. 2.1.2ReviewofPartialAutocorrelationsandModelingCorrelationMatricesConsiderappcorrelationmatrixRwith(j,j+k)thelement,jj+k=Cor(Yj,Yj+k).ThematrixRcanbere-parameterizedusingpartialautocorrelations,jj+k=Cor(Yj,Yj+kjYl,j<>:j,j+1=j,j+1j,j+k=j,j+kjj+1,...,j+k)]TJ /F5 7.97 Tf 6.59 0 Td[(1=j,j+k)]TJ /F5 7.97 Tf 6.59 0 Td[(r1(j,k)R)]TJ /F16 5.978 Tf 5.75 0 Td[(12(j,k)rT3(j,k) [1)]TJ /F5 7.97 Tf 6.58 0 Td[(r1(j,k)R)]TJ /F16 5.978 Tf 5.76 0 Td[(12(j,k)rT1(j,k)]1=2[1)]TJ /F5 7.97 Tf 6.58 0 Td[(r3(j,k)R)]TJ /F16 5.978 Tf 5.76 0 Td[(12(j,k)rT3(j,k)]1=22kp)]TJ /F4 11.955 Tf 11.95 0 Td[(j. (2) 29

PAGE 30

Themarginalcorrelations,j,j+kcanalsobewrittenasasimplefunctionofthepartialautocorrelations, j,j+k=rjk+j,j+kAjk, (2) whererjk=r1(j,k)R)]TJ /F5 7.97 Tf 6.59 0 Td[(12(j,k)rT3(j,k)and A2jk=[1)]TJ /F4 11.955 Tf 11.95 0 Td[(r1(j,k)R)]TJ /F5 7.97 Tf 6.59 0 Td[(12(j,k)rT1(j,k)][1)]TJ /F4 11.955 Tf 11.96 0 Td[(r3(j,k)R)]TJ /F5 7.97 Tf 6.59 0 Td[(12(j,k)rT3(j,k)]. (2) Oneofadvantagesofthisparameterizationisthatjkcanvaryindependentlyin()]TJ /F6 11.955 Tf 9.3 0 Td[(1,1)whilemaintainingpositivedenitenessofR,unlikejk(see,e.g., Joe ( 2006 )).Basedonreparameterizingthemarginalcorrelationsintopartialautocorrelations, DanielsandPourahmadi ( 2009 )introduceapriorforRinducedbyindependentuniformpriorsonthepartialautocorrelations,i.e.,p()=2)]TJ /F7 7.97 Tf 6.59 0 Td[(p(p)]TJ /F5 7.97 Tf 6.59 0 Td[(1)=2WereparameterizeR=(ij)intermsoftheentriesofthepartialcorrelationmatrix=(jk),withthepartialautocorrelationsdenedaboveandjj1.WewilltransformthesepartialautocorrelationsusingFisher'sztransformationmappingtoewheretheoffdiagonalelementsofthelattertakevaluesinentirerealline(,1).MovingfromaconstrainedRtoarealsymmetricmatrixegivesusalinkfunctionframeworksimilartothetheoryofgeneralizedlinearmodelsin McCullaghandNelder ( 1989 ).Thesemodelswillextendrecentmodelsfromtheliteratureforcorrelationmatricesincludingthemultivariateprobit( Czado ( 2000 ))andrelatedmodels( DanielsandNormand ( 2006 )). 2.1.3OutlineofthisChapterThischapterisarrangedasfollows.InSection2.2,weintroduceregressionmodelsforthepartialautocorrelationsandmarginalvariances.WederiveandinvestigatepriorsfortheregressionparametersforthepartialautocorrelationandmarginalvarianceparametersinSection2.3.WeprovidedetailsonposteriorcomputationsinSection2.4.Resultsofasimulationstudytoinvestigatecorrelationstructuremisspecicationare 30

PAGE 31

giveninSection2.5.ApplicationofthemodelstoaschizophreniaclinicaltrialisgiveninSection2.6.Section2.7providesconclusionsandextensions. 2.2ModelsfortheCovarianceMatrixLetYi:i=1,...,nbeap1vectoroflongitudinalresponsesmeasured(withoutlossofgenerality)attimes1,...,pwithdistribution YiNp(xTi,i),(2)whereisavectorof(mean)regressionparameterswithdimensionp1,xiisappcovariatematrix,andi=DiRiDTi.WebuildregressionmodelsforRiviathepartialautocorrelationsandDi,viathemarginalvariancesinthefollowingsubsections. 2.2.1PartialAutocorrelationsConsiderthefollowingregressionmodelfori,jk,thejkthpartialautocorrelationforsubjecti, z(i,jk)=w?i,jk,(2)wherez()isFisher'sz-transform,z()=1 2log1+ 1)]TJ /F12 7.97 Tf 6.59 0 Td[(andw?i,jkisa1qvectorofcovariatestomodelstructureandsubject-levelcovariates;isunconstrainedinq-dimensionrealspaceRq.Giventhatthepartialautocorrelationsarecorrelationsbetweenlongitudinalobservations,conditionalonintermediateones,wemightexpecthigherorderonestobezero.Forexample,wemightspecifyw?i,jk=I(jk)]TJ /F4 11.955 Tf 12.93 0 Td[(jj=1)correspondingtoanAR(1)structurewithalllagpartialautocorrelationsbiggerthanoneequaltozero.w?Ti,jk=(1,jk)]TJ /F4 11.955 Tf 12.67 0 Td[(jj)impliesthatehasaToeplitzformwithz-transformoftheelementoneachsubdiagonalshavingalinearrelationshipinlag.ForrelatedstructuresfortheparametersofthemodiedCholeskidecomposition(see Pourahmadi ( 1999 ); PourahmadiandDaniels ( 2002 )). 31

PAGE 32

2.2.2MarginalVariancesWeassumethelogarithmofthemarginalstandarddeviations,i,j(i.e.,thejthdiagonalelementofDi)followtheregressionmodels, log(i,j)=Ai,j,(2)whereAi,jisa1q0vectorofcovariatestomodelstructureandunit-levelcovariates.Forexample, Ai,j=(I(j=1),I(j>1)) (2) inducesastructureofequalvarianceexceptfortime1.Thefollowing, Ai,j=(1,j) (2) correspondstothemarginalvariancesthatareloglinearintime. Verbyla ( 1993 )proposedmodelsforthemarginalvariances(residualvariances)intermsofunitlevelcovariates(i.e.,heterogeneity)inthesettingofindependentresponses. 2.3PriorsforStandarddiffusepriorsforinEq. 2 ,e.g.,animproperuniformprioronRqoradiffusenormalprior,resultinmostofthemassforthepartialautocorrelations,i,jkbeingputat)]TJ /F6 11.955 Tf 9.3 0 Td[(1and+1.Thishappensinmanysettingswithdiffusepriorsontransformedspaces,e.g.,coefcientsinlogisticregression(see AgrestiandHitchcock ( 2005 )).Thesearenotsensiblepriorbeliefs.Inthenexttwosubsections,wewillreviewapriorproposedforthepartialautocorrelationsfrom DanielsandPourahmadi ( 2009 )andproposeanalternativeonethatbothavoidthisbehaviorforanunstructured.Wethenproposeawaytousethesepriors,whicharebothwithintheclassofindependentBetapriors,toconstructpriorsforandpointouttheirconnectionstog-priors.WealsoconstructasimilarpriorforinEq. 2 32

PAGE 33

2.3.1ReviewofPriorsforUnstructuredPartialAutocorrelationsIndependentuniformpriors,aspecialcaseofatransformedBeta()]TJ /F5 7.97 Tf 6.58 0 Td[(1,1)(a,b)distributionwithshapeparametersa=1andb=1,onpartialautocorrelationsinducedesirablebehaviorforlongitudinal(ordered)databyshrinkinghigherlagmarginalcorrelationstowardzero( DanielsandPourahmadi ( 2009 )).ThebehaviorcanbeunderstoodbyexaminingtheJacobianfromRto,J(R!)=p)]TJ /F5 7.97 Tf 6.59 0 Td[(1Yk=1p)]TJ /F7 7.97 Tf 6.59 0 Td[(kYj=1(1)]TJ /F3 11.955 Tf 11.96 0 Td[(2j,j+k)(p)]TJ /F5 7.97 Tf 6.59 0 Td[(1)]TJ /F7 7.97 Tf 6.59 0 Td[(k)=2.Aslagincreases,moremassisplacedtowardzero.ThisisnotsurprisingsincemostpriorsonRdonotuseinformationonpotentialorderingoftheresponsesandinducethispriorformonpartialcorrelationstoobtainidenticalmarginalpriorsforthemarginalcorrelations.ThisbehavioroftheBeta(1,1)priorisconsistentwithserialcorrelationoftenseeninlongitudinalandordereddata.However,itdoesnotfavorpositivecorrelationsaswetypicallyseeinlongitudinaldata. 2.3.2AnalternativePriorforUnstructuredPartialAutocorrelationsHereweintroduceaprioronthepartialautocorrelationsthatfavorspositivecorrelationstonegativecorrelations.Weproposeindependentpriorsonjkwithpdf's,p(jk)=1+jk 2,whichisatransformedBeta()]TJ /F5 7.97 Tf 6.59 0 Td[(1,1)(a,b)distributionwithparametersa=2andb=1;werefertotheseastriangularpriorsgiventheirshape.TheimpliedmarginalpriorsforjkaregiveninFigure 2-1B .Thepriorshavedecreasingmasscloseto1aslag(jj)]TJ /F4 11.955 Tf 12.05 0 Td[(kj)increases.ThisisconsistentwithserialcorrelationoftenseeninlongitudinaldataandfavorsmoremassonpositivecorrelationsthantheBeta(1,1)priors.Inthefollowingsection,wewilltrytousethesetwopriorsasastartingpointtoconstructapriorfortheregressioncoefcients,inEq. 2 .Intheremaining,allBetapriorswillbespeciedontheinterval()]TJ /F6 11.955 Tf 9.3 0 Td[(1,1),butwejustdenotethemasBeta(a,b). 33

PAGE 34

2.3.3ProposedPrioronInthefollowing,whennotneededwewilldropthesubscriptsonjk.Westartbyderivingthedistributionofz(jk)whenthejkfollowindependentBeta(1,1)priors.Forthisprioron,z()=z=1 2log1+ 1)]TJ /F12 7.97 Tf 6.58 0 Td[(,withpdf f(z)=2e2z (1+e2z)2, (2) wherez2(,+1).Thisisthepdfofalogisticdistribution,zslogistic(0,1 2).Itiswellknownthatthelogisticdistributioncanbeapproximatedwithat-distribution( AlbertandChib ( 1993 )).However,theeasytouseconstructionofthemultivariatet-distributionasagammamixtureofnormalshast-marginalsbuttheyarenotindependentasweneedbasedonouroriginalspecicationofindependentBeta's.Asaresult,wewilluseanormalapproximationtothelogisticdistribution,whosemultivariateversiondoeshaveindependentmarginals,zN(0, 12);thatis,therandomvectorzsN(0 12ITT),whereT=p(p)]TJ /F5 7.97 Tf 6.59 0 Td[(1) 2.Figure 2-1A showshowwellthenormalpriorapproximatestheoriginaluniformprioronthehypercube(i.e.,theindependentBeta(1,1)priors)intermsofthemarginalcorrelations.Theuppertriangularelementsrepresentthemarginalpriorsofjkfromtheoriginaluniformpriorandthelowertriangularelementsrepresentthemarginalpriorsofjkfromthepriorbasedonthenormalapproximation.Theapproximatepriorappearstobehavesufcientlysimilarly.Now,weshowhowthispriorcanbeusedtoconstructapriorforinEq. 2 .Werstfocusonthecaseofz(i,jk)=z(jk)andforeaseofnotation,letzjk=z(jk)andz=(z12,..,z1p,z23,...z2p,...zp)]TJ /F5 7.97 Tf 6.58 0 Td[(1p)T.Considerthefullranklineartransformationz=w,wherew=w?w?. 34

PAGE 35

w?isaTq(Tq)fullcolumnrankmatrixcorrespondingtotheregressioninEq. 2 .Thematrixw?isaT(T)]TJ /F4 11.955 Tf 10.16 0 Td[(q)fullcolumnrankmatrixsuchthat(w?)T(w?)=0q(T)]TJ /F7 7.97 Tf 6.59 0 Td[(q)and(w?)Tw?=I(T)]TJ /F7 7.97 Tf 6.59 0 Td[(q)(T)]TJ /F7 7.97 Tf 6.58 0 Td[(q).Therefore =0B@??1CA=w?w?)]TJ /F5 7.97 Tf 6.59 0 Td[(1zwithw?w?)]TJ /F5 7.97 Tf 6.58 0 Td[(1=0B@((w?)Tw?))]TJ /F5 7.97 Tf 6.58 0 Td[(1(w?)T(w?)T1CA. (2) Thus,z=w?w?0B@??1CA,isa1)]TJ /F6 11.955 Tf 11.96 0 Td[(1transformationfromzto.WedeneE(z)=andVar(z)=2basedonthemultivariatenormalprioronz.UndertheBeta(1,1)prioron,=0and2==12;undertheBeta(2,1)prior(triangularprior),=1 2and2=0.5722.Thecorrespondingpriorforisalsomultivariatenormalwithmeanandvariancegivenbelow,E()=w?w?)]TJ /F5 7.97 Tf 6.59 0 Td[(1E(z())=0B@((w?)Tw?))]TJ /F5 7.97 Tf 6.59 0 Td[(1(w?)T(w?)T1CA1T1=0B@((w?)Tw?))]TJ /F5 7.97 Tf 6.59 0 Td[(1(w?)T(w?)T1CA1T1andVar()=w?w?)]TJ /F5 7.97 Tf 6.59 0 Td[(1Var(z())(w?w?)]TJ /F5 7.97 Tf 6.59 0 Td[(1)T=0B@((w?)Tw?))]TJ /F5 7.97 Tf 6.58 0 Td[(1(w?)T(w?)T1CA2ITT((w?)Tw?))]TJ /F5 7.97 Tf 6.59 0 Td[(1(w?)T)Tw?=20B@((w?)Tw?))]TJ /F5 7.97 Tf 6.59 0 Td[(1(w?)T(w?)T1CA((w?)Tw?))]TJ /F5 7.97 Tf 6.58 0 Td[(1(w?)T)Tw?=20B@((w?)Tw?))]TJ /F5 7.97 Tf 6.59 0 Td[(100I(T)]TJ /F7 7.97 Tf 6.59 0 Td[(q)(T)]TJ /F7 7.97 Tf 6.58 0 Td[(q)1CA. 35

PAGE 36

Theresultingpriorfor?isalsomultivariatenormalwithexpectationandvariance,E(?)=(w?Tw?))]TJ /F5 7.97 Tf 6.58 0 Td[(1w?T1T1andVar(?)=2(w?Tw?))]TJ /F5 7.97 Tf 6.59 0 Td[(1.Thedimensionreductionfromzto?resultsinthepriorvariancebeingtoosmall.Toseethisnotethatthevarianceofithcomponentofzinz=wisvar(zi())=var(w(i,:))=var(w?(i,:)?)+var(w?(i,:)?)=2(w?(i,:)((w?)Tw?))]TJ /F5 7.97 Tf 6.58 0 Td[(1(w?(i,:))T+w?(i,:)(w?(i,:))T).Theithcomponentofz=w??hasvariancevar(zi(?))=var(w?(i,:)?)=2(w?(i,:)((w?)Tw?))]TJ /F5 7.97 Tf 6.59 0 Td[(1(w?(i,:))T).Clearly,var(zi())>var(zi(?)).Itiseasytoadjustforthisbynotingthattheaveragevarianceofz(jk)=w?jk?, var(z)is var(z)=1 TTXi=1var(zi)=21 Ttrfw?((w?)Tw?))]TJ /F5 7.97 Tf 6.58 0 Td[(1(w?)Tg=21 Ttrf((w?)Tw?))]TJ /F5 7.97 Tf 6.58 0 Td[(1(w?)Tw?g=21 TtrfIqqg=q T2,where2isthedesiredvariance.Hencewecaninatevar(?)byafactorofT q.Theresultingpriorfor?is?N(((w?)Tw?))]TJ /F5 7.97 Tf 6.59 0 Td[(1(w?)T1T1,T q2((w?)Tw?))]TJ /F5 7.97 Tf 6.58 0 Td[(1). 36

PAGE 37

2.3.4ExtensiontoUnit-levelCovariatesWecaneasilyextendthispriortosubjectspeciccovariates.Suppose,fori=1,...,n, z(i,jk)=w?i,jk.(2)Letw?ibeapqmatrixsuchthatzi=w?i.Werststackz1,...,znandw?1,...w?ntogether,i.e.,z=(zT1,zT2,...,zTn)T,andw?=(w?T1,w?T2,...,w?Tn)T.So,wehavez=w?andw?isnTqfullcolumnrankmatrix.Similartothepreviouscase,N((w?Tw?))]TJ /F5 7.97 Tf 6.58 0 Td[(1w?T1nT1,nT q2(w?Tw?))]TJ /F5 7.97 Tf 6.59 0 Td[(1).Wecanrewritethisas N((nXi=1w?Tiw?i))]TJ /F5 7.97 Tf 6.58 0 Td[(1w?T1nT1,nT q2(nXi=1w?Tiw?i))]TJ /F5 7.97 Tf 6.59 0 Td[(1), (2) whichisourrecommendedpriorinthegeneralcase. 2.3.5Connectiontog-priorsOurpriorsonhavesimilarformtotheg)]TJ /F1 11.955 Tf 9.3 0 Td[(priorsintroducedby Zellner ( 1986 ).However,ourderivationbeginswithaprioronanunconstrainedparameterspaceasopposedtoZellner'sconstructionofapriorbasedontheposteriordistributionofimaginarydatay0,y0=xT+whereN(0,20In)(withindependentpriorson/1and0/1 0).TheZellnerpriorforj0hastheformN(^0,20 g(xxT))]TJ /F5 7.97 Tf 6.59 0 Td[(1),where^0istheleastsquaresestimatebasedontheimaginarydataandgisapenaltyparameter;inpractice,themeanistypicallysettozerosonoimaginarydataisactuallyrequired.Ourpriorhasasimilarformbutitisbasedontheprojectionofz(i)onw?iwithweightsbasedontheoriginalpriorforontheunconstrainedspace(hereahypercube).The'weights'basedonthepriorinEq. 2 comeinthroughinthepriormean, 37

PAGE 38

(Pni=1w?Tiw?i))]TJ /F5 7.97 Tf 6.59 0 Td[(1w?T1nT1and2inthepriorvariance,2(w?Tw?))]TJ /F5 7.97 Tf 6.59 0 Td[(1.Asaresult,withthesepriors,wedonothavetodealwiththeissueofthechoiceofg(forsomediscussion(see GeorgeandFoster ( 2000 ); ClydeandGeorge ( 2000 )). 2.3.6PriorforThemostcommonlyusedprioronthemarginalvariancesistheinversegammaprior,whichfacilitatescomputationsduetoconditionalconjugacy. Daniels ( 2006 )usedauniformprioronthetransformedinnovation(IV)parameterswithorwithoutstructuresimilartothemodelsinSection2.2forthemarginalstandarddeviations. Barnardetal. ( 2000 )discussedindependentnormalpriorsonlogarithmictransformed.Inparticular,theyproposedthefollowingprior log(i)N(,),(2)withdiagonal.WewillderiveapriorforsimilartothatforbasedonBarnardetal.'spriorforthemarginalstandarddeviations.Theresultingprioris N((nXi=1ATiAi))]TJ /F5 7.97 Tf 6.59 0 Td[(1AT1np1,2np q0(nXi=1ATiAi))]TJ /F5 7.97 Tf 6.59 0 Td[(1)). (2) Noteinthederivation,wehaveassumed=1p1and=2IppinEq. 2 ,whereandarexed.Posteriordistributionandcomputations:Thefulldatalikelihood,L(,,jy)isproportionaltoPni=1jDi()Ri()Di()j)]TJ /F16 5.978 Tf 7.78 3.26 Td[(1 2expf)]TJ /F5 7.97 Tf 16.47 4.71 Td[(1 2tr[ni=1Ri())]TJ /F5 7.97 Tf 6.58 0 Td[(1Di())]TJ /F5 7.97 Tf 6.59 0 Td[(1(Yi)]TJ /F4 11.955 Tf 10.02 0 Td[(xi)(Yi)]TJ /F4 11.955 Tf 10.02 0 Td[(xi)TDi())]TJ /F5 7.97 Tf 6.58 0 Td[(1]g.Wespecifythefollowingpriorsfor,,and, /1 (2) N(0(nXi=1ATiAi))]TJ /F5 7.97 Tf 6.59 0 Td[(1AT1np1,2np q0(nXi=1ATiAi))]TJ /F5 7.97 Tf 6.59 0 Td[(1) (2) 38

PAGE 39

N((nXi=1w?Tiw?i))]TJ /F5 7.97 Tf 6.58 0 Td[(1w?T1nT1,nT q2(nXi=1w?Tiw?i))]TJ /F5 7.97 Tf 6.59 0 Td[(1) (2) wherePni=1ATiAiandPni=1w?Tiw?iarenon-singular.Inthesettingofincompletelongitudinalresponses,underanassumptionofignorablemissingness,weonlyneedtospecifythefulldataresponsemodelandthelikelihoodofinterestistheobserveddatalikelihood,L(,,jyobs,x),wheretheobserveddataresponseisyobs( DanielsandHogan ( 2008 ));theformoftheobserveddatalikelihoodisgiveninthesupplementarymaterials.Sincewespecifyanimproperprioron,weneedtoprovetheposteriordistributionof(,,)isproper.Inthenextsection,weprovideatheoremwhichgivessimplesufcientconditionsunderwhichtheposteriorisproper.ThesupplementarymaterialscontaindetailsontheMCMCalgorithmtosamplefromtheposteriordistribution. 2.3.7PosteriorProprietyInthefollowingtheorem,westateconditionsthataresufcientfortheposteriortobeproper.First,weneedtointroducesomenotation.Supposefull-dataYi:i=1,...,nareindependentlydistributedrandomvariableswithdistributionYiNp(xTi,i),wherexiisappcovariatematrix,isap1(mean)regressionparametervector,i=D1 2i()Ri()D1 2Ti(),andDi()=diag(2i())with2i=(2i1,2i2,...,2ip)speciedbyEq. 2 andRi()speciedbyEq. 2 andEq. 2 ;dene=,=,=tobesamplespacesof,,,respectively.Let(Qi1,...,Qip)TbeavectorofmissingdataindicatorsandletYkii=fYij,j=1,...,kiwhereki:Qiki=1,Qi,ki+1=0g,Sk=fi,Qiki=1andQiki+1=0,andki=k,where1kp)]TJ /F6 11.955 Tf 11.95 0 Td[(1;i=1,..ng. Theorem2.1. Weassumetheobserveddatadistributionfortheithsubject(i=1,...,n)isYkiiNk(xkiTi,kii),wherexkii=xi[:,1:ki]isapkisubmatrixofxiandkii=i[1:ki,1:ki]isakikiprincipalsubmatrixofi.WealsoassumethepriorsontheparametersaregivenbyEq. 2 -Eq. 2 andmissingnessisignorable.Thentheposteriorof(,,)willbeproperunderthefollowingthree(easytocheck)conditions: 39

PAGE 40

1.Pi2SkxkiixkiTiisnon-singularforallk2f1,2,...,p)]TJ /F6 11.955 Tf 11.96 0 Td[(1g.2.PAiATiisnon-singular.3.Pw?iw?Tiisnon-singular.Theproofisgiveninthesupplementarymaterials.Notethatthethreeconditionsareconditionsforthethreedesignmatricesinourmodel(forthemean,thevariance,andthecorrelations,respectively);thelattertwoguaranteethatthepriorsEq. 2 andEq. 2 areproper. 2.4Simulations 2.4.1ModelsToassesstheimportanceofthecorrelationstructureonestimating(mean)longitudinaltrajectoriesinincompletedata,weconductedasimulation.ThetruemodelwasEq. 2 withp=6.Foreachindividualtherowsofthemeandesignmatrixwerespeciedasan(orthogonal)quadratictrajectory,speciedasinEq. 2 .Weset=(27,)]TJ /F6 11.955 Tf 9.3 0 Td[(2.3,0.50)T.Weconsideredthreesamplesizes(30,100,and400).Foreachscenariowesimulated200datasets.Thetruemodelsformarginalvariancesandpartialautocorrelationcoefcientsweregivenby z(jk)=1I(jk)]TJ /F4 11.955 Tf 11.96 0 Td[(jj=1\j=1)+2I(jk)]TJ /F4 11.955 Tf 11.96 0 Td[(jj=2)+3I(jk)]TJ /F4 11.955 Tf 11.95 0 Td[(jj=1\j>1)(2)andlogjj=1I(j=1)+2I(j>1),with=(0.65,0.21,0.85)T,and=(150,200)T.Thestructureonrepresentsasecondordermodelwiththelagonepartialautocorrelationsconstantexceptfortime1,thelagtwopartialautocorrelationsconstantovertime,andhigherlagpartialautocorrelationsequaltozero.Thestructureonthevariancescorrespondstoaconstantvarianceovertimeaftertimeone. 40

PAGE 41

Aftersimulatingthecompletedata,weinduceignorablemissingnessviathefollowingmissingdatamechanism,logitP(Qik=1jQi,k)]TJ /F5 7.97 Tf 6.58 0 Td[(1=1,yobs)=1+2yk)]TJ /F5 7.97 Tf 6.59 0 Td[(1,whereQjk=IfYjkisobservedg,yobsdenotesrealizationofobserveddata,and=(3.86324,)]TJ /F6 11.955 Tf 9.29 0 Td[(0.05).Wetfourmodelstothesimulateddata.Foreachmodel,weusethesametruemeanandmarginalvariancemodels,butdifferentpartialautocorrelationmodels.Ourobjectiveistoevaluatetheimpactofmis-specifyingthepartialautocorrelationmodeloninferenceonthemarginalmeanregressioncoefcients,.Specically,themodelswecompareare:1)TruemodelforgiveninEq. 2 2)Independencemodel,=03)AR(1)model:z(jk)=1Ifjk)]TJ /F4 11.955 Tf 11.96 0 Td[(jj=1g4)Unstructuredmodel(nostructureon)Foreachofthe200simulateddatasetsforeachsamplesize,weran20,000iterationsforeachofthefourmodels.Tocompareinferenceonthemeanunderallfourmodels,wecomputedthefollowingtwoquantities:1)TotalMSE,sumofmeansquarederrorofthecomponentsofand2)ChangefromBaseline,changeofestimatedmeanresponsesfromtimeonetotimesix.Wealsocomparethemeantrajectoriesgraphically. 2.4.2ResultsThesimulationresultsaregiveninTables 2-1 2-2 andFigures( 2-2A 2-2B ,and 2-2C ).Asthesamplesizeincreases,theestimatesforquicklyapproachthetruevalueforthetruecorrelationmodel,moreslowlyfortheunstructuredcorrelationmodelandtothewrongvaluesfortheAR(1)andindependencecorrelationmodels(withthelatterwithconsiderablebias)(Table 2-1 ).Table 2-2 presentsasimilarstorywithbiasfromtheincorrectmodelsandlargerMSE'sfortheestimatesofthe'sandthechange 41

PAGE 42

frombaseline.Graphically,thettedtrajectoriescanbeseeninFigures( 2-2A 2-2B ,and 2-2C )andillustratethebiasinthettedtrajectorywhenthecorrelationstructureisincorrect.Clearly,theunstructuredmodelisconsistent,however,itisquiteunstableandvariableforthesmallersamplesizes.WishartdistributionispopularprioraswedoBayesianinferenceonvariance-covarianceofnormalrandomvector.Toillustration,wedomodelselectionamongIndependentModel,AR(1)model,TrueModel,andWishartprior.TheDICvaluesarerecordedinTable( 2-5 ).Fromthistable,itiseasytoseethatTruemodeldobestamongallmodelweconsiderhereandTruemodeldoesmuchbetterjobthanwishartpriorassamplesizeislarge. 2.5DataExample:SchizophreniaTrialThedatawerecollectedaspartofarandomized,double-blindclinicaltrialforanewpharmacologictreatmentofschizophrenia( Lapierreetal. ( 1990 )).Thetrialcomparedthreedosesofthenewtreatment(low,medium,high)tothestandarddoseofhaloperidol,aneffectiveantipsychoticthathadknownsideeffects.Atthetimeofthestudy,thetrialwasdesignedtondtheappropriatedosinglevelsincetheexperimentaltherapywasthoughttohavesimilarantipsychoticeffectivenesswithfewersideeffects.Twohundredforty-vepatientswereenrolledandrandomizedtooneofthefourtreatmentarms.Theintendedlengthoffollow-upwas6weeks,withmeasurestakenweeklyexpectforweek5.SchizophreniaseveritywasassessedusingtheBriefPsychiatricRatingScale(BPRS)asumofscoresof18itemsthatreectbehaviors,mode,andfeelings.Thescoresrangedfrom0to108withhigherscoresindicatinghigherseverity.Toenterthestudy,theBPRSscorehadtobenolessthan20.Wewillillustrateourapproachusingonlythemediumdosearm.ThemaininferentialinterestisthechangeinBPRSfromthebeginningtotheendofthestudy.Thedropoutrateonthemediumdosearmwashigh,withonly40outof61(about66%)participantshavingameasurementatweek7(thesixthmeasurementtime).Reasonsfordropoutincludedadverseevents(e.g.,sideeffects),lackoftreatmenteffect,and 42

PAGE 43

withdrawalforunspeciedreasons.Thetrajectoriesofcompletersvs.non-completersisshowninFigure 2-3A .Clearlythosedroppingoutweredoingworsepriortodroppingout(higherBPRS).LetthelongitudinalvectorofoutcomesforsubjectibeYi=(Yi1,...,Yi6)T,measuredatweekst=(t1,...,t6)=(1,2,3,4,5,7).WeassumeYifollows( 2 )withmean. E(Yij)=0+1xj1+2xj2, (2) wherexj1=(tj)]TJ ET q .478 w 89.35 -194.29 m 94.8 -194.29 l S Q BT /F4 11.955 Tf 89.35 -203.22 Td[(t)andxj2=(tj)]TJ ET q .478 w 179.75 -194.29 m 185.2 -194.29 l S Q BT /F4 11.955 Tf 179.75 -203.22 Td[(t)2)]TJ /F26 7.97 Tf 13 12.17 Td[(P6k=1(tk)]TJ ET q .359 w 249.18 -190.93 m 252.99 -190.93 l S Q BT /F7 7.97 Tf 249.18 -197.02 Td[(t)3 P6k=1(tk)]TJ ET q .359 w 249.18 -202.29 m 252.99 -202.29 l S Q BT /F7 7.97 Tf 249.18 -208.39 Td[(t)2(tj)]TJ ET q .478 w 288.58 -194.29 m 294.03 -194.29 l S Q BT /F4 11.955 Tf 288.58 -203.22 Td[(t))]TJ /F26 7.97 Tf 13 12.17 Td[(P6k=1(tk)]TJ ET q .359 w 353.08 -190.93 m 356.89 -190.93 l S Q BT /F7 7.97 Tf 353.08 -197.02 Td[(t)2 6,i.e.,anorthogonalquadraticpolynomial.Weassumemissingnessisignorable.Wetthevepartialautocorrelationmodelsgivenbelow:IndependenceModel:z(jj+k)=0,log(j)=I(j=1)1+I(j>1)2.AR(1)Model:z(jj+k)=I(k=1)1,log(j)=I(j=1)1+I(j>1)2.UnstructuredCovarianceModel:z(jj+k)=jj+k(with=(12,13,...,p)]TJ /F5 7.97 Tf 6.58 0 Td[(1,p)),log(j)=j.AfterexaminingtheunstructuredcovariancematrixinTable 2-4 ,weconsidertwostructuredmodels;StructuredModel1:z(jj+k)=I(k=1\j<2)1+I(k=1\j>1)2+I(k=2)3,log(j)=I(j=1)1+I(j>1)2.Thisisthesamemodelastheoneconsideredinthesimulation.StructuredModel2:z(jj+k)=I(k=1\j<2)1+I(k=1\j>1)2+I(k=2\j<3)3+I(k=2\j>2)4+I(k=3)5+I(k=4)6+I(k=5)7,log(j)=I(j=1)1+I(j>1)2. 43

PAGE 44

Thismodelismoreexibleforthepartialautocorrelationsallowingnonstationarylagoneandlagtwoautocorrelationsandstationarylagthree,fourandve(withnostructuralzeros).ThestructureonthevariancesisthesameasStructuredModel1.WeusepriorsspeciedinEq. 2 ,Eq. 2 ,andEq. 2 for,,,respectively.Forallmodels,weran200,000iterationswithnoburn-insincetheyconvergedafterafewiterations.TheplotofallttedmeantrajectoriesisgiveninFigure 2-3B .ThemeanBPRSinitiallydecreasedbutstartedtogobackupbyweek5.Thisisrelatedtothosedroppingoutdoingmorepoorlythanthosestayinginthestudy.Table 2-5 containstheposteriormeanof,thechangefrombaselinetoweek7,their95%credibleintervals,andtheDICbasedontheobserveddatalikelihood.Thechangesfrombaselineinallmodelswerenegativewith95%credibleintervalexcluding0,showingthatMedium-dosereducedtheBPRSscoresignicantly,whichagreeswithearlieranalysisdonein DanielsandHogan ( 2008 ).Thechangesfrombaselinevariedfrom)]TJ /F6 11.955 Tf 9.3 0 Td[(14to)]TJ /F6 11.955 Tf 9.3 0 Td[(11basedonthecovariancemodelchosen.BasedontheDIC,StructuredModel2providedthebestt.ThechangefrombaselineinStructuredModel2wasalmostafullpointdifferentfromtheunstructuredmodel. 2.6DiscussionsInthispaper,werstextendedthepriorsin DanielsandPourahmadi ( 2009 )forpartialautocorrelationsfortheunstructuredcasebyintroducingasetoftriangularpriorswhichfavorpositivemarginalcorrelations.UsingFisher'sz-transformationonthepartialautocorrelations,weintroducedaGLMframeworkforregressionmodelstoinducestructureand/orunit-speciccovariatesinthecorrelationmatrix.Basedonpriorsproposedforthepartialautocorrelationsinthenon-regressionsetting,weintroducedapriorforthecoefcientsinthepartialautocorrelationregressions(andforthecoefcientsofthemarginalvarianceregressions).Weconductedsimulationsthatillustratedtheimportanceofcorrectspecicationofthecorrelationstructureinthe 44

PAGE 45

settingofignorablemissingnessinlongitudinaldataandtthemodelstodatafromalongitudinalschizophreniaclinicaltrial.Thereareavarietyofextensionstothemodelingproposedhere.Clearly,itcanbedifculttondagoodparametricmodelthatimposesstructureonthecorrelationmatrix.Thusextendingapproachesdevelopedunderdifferentparameterizations( SmithandKohn ( 2002 ); Wongetal. ( 2003 ))tooursettingisanimportantextension.Correlationmatrices(insteadofcovariancematrices)arisecommonlyinmodelsforlongitudinaldatamodeledusingGaussiancopulas(e.g.,themultivariateprobitmodel)( Nelsen ( 1999 ));efcientcomputationsusingthepartialautocorrelationinthesesettingswillbeachallengingproblemduetothelackofconjugacy.Finally,tooffersomerobustnesstoaselectedmodelforthecorrelationstructure,analternativewouldbetoshrinkthepartialautocorrelationstothestructureusingindependentBetapriorsashasbeendonepreviouslyusingnormalpriorsonotherparameterizationsofacovariancematrix( DanielsandKass ( 2001 ); DanielsandPourahmadi ( 2002 )) 45

PAGE 46

Table2-1. Posteriormeansof SampleSize30SampleSize100SampleSize400 UnstrTrueAR(1)Indep.UnstrTrueAR(1)Indep.UnstrTrueAR(1)Indep. 26.526.926.725.326.726.926.625.326.927.026.725.4-2.1-2.0-2.1-2.5-2.0-2.0-2.1-2.5-2.0-2.0-2.1-2.50.520.510.500.580.510.510.500.580.500.500.490.57 46

PAGE 47

A B Figure2-1. Marginalpriors 47

PAGE 48

A B C Figure2-2. Trajectoriesofsamplesize30,100,400 48

PAGE 49

A B Figure2-3. Observedtrajectoriesandttedtrajectories 49

PAGE 50

Table2-2. Summarymeasuresforthesimulation TotalMSEChangefromBaseline UnstrTrueAR(1)Indep.UnstrTrueAR(1)Indep. 6.96.56.610.112.612.012.715.11.81.72.04.812.212.012.714.90.420.410.523.412.112.012.815.0 50

PAGE 51

Table2-3. Summarycountfromthesimulations SampleSize30SampleSize100SampleSize400SampleSize1000 AR(1)IndTrueWishAR(1)IndTrueWishAR(1)IndTrueWishAR(1)IndTrueWish 69013102019800018911001946(0.345)(0)(0.655)(0)(0.010)(0)(0.990)(0)(0)(0)(0.945)(0.055)(0)(0)(0.970)(0.030)131069014602520011189006194(0.655)(0)(0.345)(0)(0.730)(0)(0.010)(0.260)(0)(0)(0.055)(0.945)(0)(0)(60.030)(0.970)03001705200148200000200000(0)(0.150)(0)(0.850)(0.260)(0)(0)(0.740)(1)(0)(0)(0)(1)(0)(0)(0)0170030020000020000020000(0)(0.850)(0)(0.150)(0)(1)(0)(0)(0)(1)(0)(0)(0)(1)(0)(0) 51

PAGE 52

Table2-4. Descriptivesummariesofschizophreniatrial Variances(Diagonal)andCorrelations(OffDiagonal) 126.250.6578-0.07380.0804-0.0253-0.52300.7889210.350.8543-0.0593-0.33280.0292-0.07401.2718224.420.85590.26480.43750.0806-0.05941.2779240.840.89610.3506-0.0253-0.34600.27131.4522221.980.8433-0.58050.02920.46920.36611.2325243.08 Table2-5. Posteriorsummariesofschizophrenia (0,1,2)ChangesfromBaseline(95%CI)DIC Independent(25.6,-2.35,0.68)-14.1(-18.7,-9.4)1924.6AR(1)(27.5,-1.97,0.69)-11.8(-15.8,-7.9)1681.8Unstructured(26.9,-2.03,0.62)-12.2(-16.3,-7.9)1663.0StructuredModel1(27.9,-1.81,0.58)-10.8(-14.6,-7.0)1669.5StructuredModel2(27.9,-1.89,0.54)-11.3(-15.3,-7.3)1659.3 52

PAGE 53

CHAPTER3ESTIMATINGLARGECORRELATIONMATRICESBYBANDINGTHEPARTIALAUTOCORRELATIONMATRIXInthischapter,weproposeacomputationallyefcientapproachtoestimate(large)p-dimensionalcorrelationmatricesofordereddatabasedonanindependentsampleofsizen.Todothis,weconstructtheestimatorbasedonak-bandpartialautocorrelationmatrixwiththenumberofbandschosenusinganexactmultiplehypothesistestingprocedure.Thisapproachisconsiderablyfasterthanmanyexistingmethodsandonlyrequiresinversionofkdimensionalcovariancematrices.Inaddition,theresultingestimatorisguaranteedtobepositivedeniteaslongaskn)]TJ /F6 11.955 Tf 12.22 0 Td[(2(evenwhenn
PAGE 54

decompositionofthecovariancematrixtoperformwhattheycalled'bandingtheinversecovariancematrix'. BickelandLevina ( 2008b )achievesparsitybybandingthesamplecovariancematrixwhile Rothmanetal. ( 2010 )bandtheCholeskyfactorofthecovariancematrix.Theformeruses'tapering'toachievepositive-denite( FurrerandBengtsson ( 2007 )).Inaddition,thesemethodsusecross-validationtondthenumberofbands,whichcanbecomputationallyintensive.Inthischapter,wepresentabandingmethodbasedonapartialautocorrelationmatrix,whichhasseveralfavorablepropertiesincludingtheestimatorguaranteedtobepositive-deniteness(w/oanyadjustment),evenforn
PAGE 55

ThereisasimplerelationshipbetweentheelementsofRand.Thepartialautocorrelationj,j+1=j,j+1andj,j+k,(26k6p)]TJ /F6 11.955 Tf 11.96 0 Td[(1)is (j,j+k)=j,j+k)]TJ /F3 11.955 Tf 11.95 0 Td[(T(j,k)R)]TJ /F5 7.97 Tf 6.58 0 Td[(12(j,k)r(j,k) [1)]TJ /F3 11.955 Tf 11.95 0 Td[(T(j,k)R)]TJ /F5 7.97 Tf 6.59 0 Td[(12(j,k)(j,k)]1=2[1)]TJ /F11 11.955 Tf 11.96 0 Td[(rT(j,k)R)]TJ /F5 7.97 Tf 6.58 0 Td[(12(j,k)r(j,k)]1=2, (3) with(j,k)=(j,j+1,...,j,j+k)]TJ /F5 7.97 Tf 6.59 0 Td[(1)Tandr(j,k)=(j+k,j+1,...,,j+k,j+k)]TJ /F5 7.97 Tf 6.59 0 Td[(1)T.R2(j,k)isamatrixthatcontainsthemiddlek)]TJ /F6 11.955 Tf 11.95 0 Td[(1rowsandcolumnsofR[j:j+k],i.e. R[j:j+k]=0BBBB@1T(j,k)j,j+k(j,k)R2(j,k)r(j,k)j+k,jrT(j,k)11CCCCA (3) Correspondingly,j,j+kcanbewrittenasafunctionofj,j+k, j,j+k=rjk+jj+kAjk, (3) whererjk=T(j,k)R)]TJ /F5 7.97 Tf 6.59 0 Td[(12(j,k)r(j,k),andA2jk=[1)]TJ /F3 11.955 Tf 11.96 0 Td[(T(j,k)R)]TJ /F5 7.97 Tf 6.59 0 Td[(12(j,k)(j,k)][1)]TJ /F11 11.955 Tf 11.96 0 Td[(rT(j,k)R)]TJ /F5 7.97 Tf 6.59 0 Td[(12(j,k)r(j,k)].Thepartialautocorrelationmatrixwilloftenbesparse( Dempster ( 1972 ); Friedmanetal. ( 2008 ))sincethecorrelationsinitareconditional(ontheinterveningvariables)correlations.Forexample,anAR(1)correlationmatricescorrespondstoapartialautocorrelationmatrixwithonlyonenon-zeroband.Moregenerallythepartialautocorrelationwillhaveknon-zerobandsunderak-thorderante-dependencemodel( ZimmermanandNunez-Anton ( 2010 ); Gabriel ( 1962 )). 55

PAGE 56

3.3BandingthePartialAutocorrelationMatrixForappmatrixM=[mij]andanyk2f0,1,2,...,p)]TJ /F6 11.955 Tf 12.49 0 Td[(1g,wedeneak-bandmatrix(Bk(M))ofM( BickelandLevina ( 2008b ))asfollows, Bk(M)=[mijI(ji)]TJ /F4 11.955 Tf 11.96 0 Td[(jjk)]. (3) Adiagonalmatrixandfullmatrixarespecialcasesofk-bandmatriceswithk=0andp-1,respectively.Herewewillbandthepartialautocorrelationmatrix,=(jk).Thecomputationalattractivenessofourapproachwillrelyonestimatingthepartialautocorrelationsonebandatatimeandthenforeachband,doingasimplehypothesistestofwhethertoaddanotherband.Ourmultiplehypothesistestingprocedureisbasedonexactsmallsampleresultsfornp+1,notasymptoticones,whichwillbeespeciallyimportantwhenpislargerelativeton;fornp,thecorrespondingtestswillbeapproximate.Forourprocedure,thelargestmatrixwewillneedtomanipulatewillbea(k+1)-dimensionalmatrixwherekisthenumberofbands.Inthefollowingtwosubsections,wewillprovidedetailsonrstestimatingthepartialautocorrelationsforeachbandandsecond,theprocedureforhypothesistesting. 3.3.1EstimatingthePartialAutocorrelationsforeachBandAsstatedintheprevioussection,werstestimatethepartialautocorrelationsforeachband,thentestwhethertokeepthebandorstop.Wewillneedseveralresultstoeasilyandefcientlyestimatethepartialautocorrelationsineachbandwhichweprovidenext.However,rst,weintroducesomenotation.Inthefollowing,weassumethedata,fYi:i=1,...,ngareindependentandidenticallynormallydistributedp-vectorwithmean0andcovariancematrix.TheMLEofisS=1 nnXi=1(Yi)]TJ ET q .478 w 156.31 -577.02 m 166.45 -577.02 l S Q BT /F4 11.955 Tf 156.31 -587 Td[(Y)(Yi)]TJ ET q .478 w 202.07 -577.02 m 212.22 -577.02 l S Q BT /F4 11.955 Tf 202.07 -587 Td[(Y)T= 1 nnXi=1(yij)]TJ ET q .478 w 308.63 -579.68 m 315.48 -579.68 l S Q BT /F4 11.955 Tf 308.63 -587 Td[(yj)(yik)]TJ ET q .478 w 358.33 -579.68 m 365.19 -579.68 l S Q BT /F4 11.955 Tf 358.33 -587 Td[(yk)T!.andthecorrespondingmleofthecorrelationmatrixis 56

PAGE 57

^R=0@Pni=1(yij)]TJ ET q .478 w 229.54 -21.66 m 236.39 -21.66 l S Q BT /F4 11.955 Tf 229.54 -28.98 Td[(yj)(yik)]TJ ET q .478 w 279.24 -21.66 m 286.1 -21.66 l S Q BT /F4 11.955 Tf 279.24 -28.98 Td[(yk) q Pni=1(yij)]TJ ET q .478 w 216.18 -43.48 m 223.03 -43.48 l S Q BT /F4 11.955 Tf 216.18 -50.8 Td[(yj)2Pni=1(yik)]TJ ET q .478 w 299.62 -43.48 m 306.48 -43.48 l S Q BT /F4 11.955 Tf 299.62 -50.8 Td[(yk)21App.Thejkthsamplecorrelationcoefcient^j,k=Pni=1(yij)]TJ ET q .478 w 242.59 -97.38 m 249.44 -97.38 l S Q BT /F4 11.955 Tf 242.59 -104.7 Td[(yj)(yik)]TJ ET q .478 w 292.3 -97.38 m 299.15 -97.38 l S Q BT /F4 11.955 Tf 292.3 -104.7 Td[(yk) q Pni=1(yij)]TJ ET q .478 w 229.23 -119.2 m 236.09 -119.2 l S Q BT /F4 11.955 Tf 229.23 -126.52 Td[(yj)2Pni=1(yik)]TJ ET q .478 w 312.68 -119.2 m 319.53 -119.2 l S Q BT /F4 11.955 Tf 312.68 -126.52 Td[(yk)2,isonlyrelatedtojthandkthcomponentsofthosenormalrandomvectors.Thisimpliesthatwecanestimatecorrelationcoefcientsandvariancesoneatatime.Partialautocorrelationcoefcientshaveonetoonemappingtothecorrelationcoefcients(Section2).Furthermore,Eq. 3 shows^j,kisonlyrelatedtothejththroughkthrowsandcolumnsin^R. 3.3.1.1StatementofneededresultsWebrieyreviewandprovesomeusefulresultsthatwillbeneededinourapproach.Result1:( Anderson ( 1984 ))Let=0B@111221221CAbeasquarematrix,where11,22aresquaresubmatriceswithnon-zerodeterminants.Wedenote112=11)]TJ /F6 11.955 Tf 11.95 0 Td[(12)]TJ /F5 7.97 Tf 6.59 0 Td[(12221and221=22)]TJ /F6 11.955 Tf 11.96 0 Td[(21)]TJ /F5 7.97 Tf 6.59 0 Td[(11112.Thena)det()=det(11)det(221),b))]TJ /F5 7.97 Tf 6.59 0 Td[(1=0B@)]TJ /F5 7.97 Tf 6.58 0 Td[(1112)]TJ /F6 11.955 Tf 9.3 0 Td[()]TJ /F5 7.97 Tf 6.58 0 Td[(111212)]TJ /F5 7.97 Tf 6.58 0 Td[(122)]TJ /F6 11.955 Tf 9.3 0 Td[()]TJ /F5 7.97 Tf 6.58 0 Td[(12221)]TJ /F5 7.97 Tf 6.59 0 Td[(1112)]TJ /F5 7.97 Tf 6.59 0 Td[(122+)]TJ /F5 7.97 Tf 6.59 0 Td[(12221)]TJ /F5 7.97 Tf 6.59 0 Td[(111212)]TJ /F5 7.97 Tf 6.59 0 Td[(1221CA,withsimilarresultsfor112and22. 57

PAGE 58

Result2:( Joe ( 2006 ))For)]TJ /F6 11.955 Tf 9.29 0 Td[(1j.ThenextresultpresentsanexpressionforthedeterminantofthecorrelationmatrixasafunctionofthepartitionedmatrixinEq. 3 Lemma3.0.1. LetRpp=(j,k)beacorrelationmatrix,andpp=(j,k)bethecorrespondingpartialautocorrelationmatrixofRpp.Then,det(Rpp)=[det(R2)(1)]TJ /F3 11.955 Tf 11.95 0 Td[(T2R)]TJ /F5 7.97 Tf 6.58 0 Td[(122)(1)]TJ /F4 11.955 Tf 11.96 0 Td[(rT2R)]TJ /F5 7.97 Tf 6.59 0 Td[(12r2)](1)]TJ /F3 11.955 Tf 11.96 0 Td[(21,p). 58

PAGE 59

Proofinappendix.Weintroduceasequentialproceduretoestimatethepartialautocorrelationsineachbandandshowthatisequivalenttothemle(^)offullmultivariatenormallikelihoodL(jy1,...,yn).Letf(yi,j,...,yi,j+ljj,j+l)bethemarginalpdfoftherandomsub-vectorYij,j+l=(Yi,j,...,Yi,j+l)ofamultivariatenormalrandomvectorYi,andletG(j,j+l)=L(j,j+ljy1j,j+l,...,ynj,j+l)bethecorrespondinglikelihoodfunction.Denee=(e1,e2,...,ep)]TJ /F5 7.97 Tf 6.59 0 Td[(1)asthemaximizerofthefollowingobjectivefunction, G?(l)=p)]TJ /F7 7.97 Tf 6.59 0 Td[(lYj=1G(j,j+l),andl=(1,1+l,...,p)]TJ /F7 7.97 Tf 6.59 0 Td[(l,p),forl=1,...,p)]TJ /F6 11.955 Tf 11.96 0 Td[(1(3)Sincej,k,j,k2f1,2,...,pgandk6=j,independentlyvarysintheinterval()]TJ /F6 11.955 Tf 9.3 0 Td[(1,1),wecanestimatelsequentiallybymaximizingtheobjectivefunctionsin(Eq. 3 )forl=1,...,p)]TJ /F6 11.955 Tf 11.96 0 Td[(1,i.e. ~l=maxlG?(l),forl=1,2,...,p)]TJ /F6 11.955 Tf 11.95 0 Td[(1(3)Inthefollowing,weprovethemaximizeroftheobjectivefunctionforeachlisequivalenttothemleofbasedonthemultivariatenormallikelihood. Theorem3.1. TheMLE^ofthepartialautocorrelationcoefcientsf^1,2,^2,3,...,^p)]TJ /F5 7.97 Tf 6.59 0 Td[(1,p,^1,3,...,^p)]TJ /F5 7.97 Tf 6.59 0 Td[(2,p,...,^1,pgbasedonthemultivariatenormallikelihoodfunctionL(jy1,y2,...,yn)issamease,f~1,2,~2,3,...,~p)]TJ /F5 7.97 Tf 6.59 0 Td[(1,p,~1,3,...,~p)]TJ /F5 7.97 Tf 6.59 0 Td[(2,p,...,~1,pgobtainedfromEq. 3 .Proofinappendix.CombiningwithEq. 3 ,thismeanstheestimatedlagkpartialautocorrelationcoefcientsareinvarianttotheestimatedpartialautocorrelationcoefcientswithlaggreaterthank.Wehavefollowingcorollaryimmediately. Corollary3.1.1. Let^betheMLEofpartialautocorrelationbasedonthemultivariatenormallikelihoodfunctionL(jy1,y2,...,yn)andebeasinEq. 3 .Thenthemle(^)of 59

PAGE 60

ak-bandpartialautocorrelationmatrixisequivalenttothecorrespondinge,basedonak-bandmatrix. 3.3.2ChoosingtheBand 3.3.2.1TheoremsToestimatethenumberofbands(k)ofapartialcorrelationmatrix=(j,k),ourstrategywillbetosequentiallytestthenullhypothesisthateachbandiszerostartingfromtherstband.Implicitly,ifthejthbandiszero,thesubsequentbands,j+1,...,p)]TJ /F6 11.955 Tf 10.72 0 Td[(1arezeroaswell.Ingeneral,fork=(1,1+k,...,p)]TJ /F7 7.97 Tf 6.59 0 Td[(k,k),weconstructmultipletestsunderthefollowinghypotheses: H0:l=0,lkvsHa:k6=0.(3)WechoosethebandastherstkforwhichH0cannotberejected.Notewejustneedtotestthepartialautocorrelationsinthekthbandundertheassumptionofatruebandk-1matrix.Forcomputationalreasons,wedonotexplicitlyincludeinthetestforbandkthepartialautocorrelationsinbands>k;weassesstheimpactofthisonoperatingcharacteristicsofthetestinSection4.Thelemmabelowprovidesthekeyresultforthetheoremthatgivestheresultneededtoefcientlyconductthesehypothesistests. Lemma3.1.1. SupposeY1,Y2,...Yn(n>p)areiidN(0 ,DRD),whereDisadiagonalmatrixofmarginalstandarddeviations.Let^=(^j,t)ppbethemleofthepartialauto-correlationmatrix.Forabandk0partialautocorrelationmatrix,themle'softhepartialautocorrelationswithlagsgreaterthank0areindependentwithmarginaldistributionsgivenbyf(^j,j+k)/(1)]TJ /F6 11.955 Tf 12.35 0 Td[(^j,j+k)(1+^j,j+k)where==8><>:n)]TJ /F7 7.97 Tf 6.59 0 Td[(k)]TJ /F5 7.97 Tf 6.59 0 Td[(2 4fork0
PAGE 61

Proofintheappendix.Theabovelemmaisthekeyresultfortheproofofthefollowingtheorem. Theorem3.2. SupposeY1,Y2,...YnareiidN(0 ,DRD)and=(j,t)ppisthepartialcorrelationmatrixofR.Then,fornp+1,underhypothesesinEq. 3 ,themle'sofj,j+l,denotedas^j,j+l,l=1,...,p)]TJ /F6 11.955 Tf 11.99 0 Td[(1followindependenttransformedBetadistributionson()]TJ /F6 11.955 Tf 9.3 0 Td[(1,1)withparameters==8><>:n)]TJ /F7 7.97 Tf 6.59 0 Td[(l)]TJ /F5 7.97 Tf 6.58 0 Td[(2 4if1klp)]TJ /F6 11.955 Tf 11.95 0 Td[(2n)]TJ /F7 7.97 Tf 6.59 0 Td[(p)]TJ /F5 7.97 Tf 6.58 0 Td[(1 2ifkl=p)]TJ /F6 11.955 Tf 11.96 0 Td[(1.. Proof. ItisadirectresultofLemma 3.1.1 Fornp,thelagk(k>k0)samplepartialautocorrelationsarecorrelatedevenifthetruebandisk0,butthecorrelationsamongthemareverysmall(fromempiricalchecks).Therefore,westillapproximatethedistributionofthek-bandestimatedpartialautocorrelationswiththeindependentBetadistributionsgiveninTheorem 3.2 .SincethepartialautocorrelationsareindependentBetadistributions,weadjustformultipletestingusingaBonferronicorrection.WeexplorethisinthesimulationsinSection3.4forbothn>pandnp.Beforethatthough,wewilloutlinetheexactprocedureinthenextsubsection. 3.3.2.2OverallprocedureHerearethestepsweusetoestimatethepartialautocorrelationmatrix.Startingwithbandj, 1. Computethemleofthe'sinbandjusingtheresultsinSection3.1.2. 2. Testthenullhypothesisthatallthepartialautocorrelationsinthejthbandarezerousingthemultivariatehypothesestestingprocedureoutlinedabove. 3. Ifreject,repeatthersttwostepsforbandj+1;otherwise,stopandthenumberofbandsisj)]TJ /F6 11.955 Tf 11.96 0 Td[(1. 61

PAGE 62

Practically,wedohypothesistestingstartingfromband1andstopattherstj(jmin(n)]TJ /F6 11.955 Tf 11.78 0 Td[(2,p)]TJ /F6 11.955 Tf 11.78 0 Td[(1))forwhichH0cannotberejected.Thereforethisprocedureonlyrelatesto(j+1)-dimensionalprincipalsubmatricesofRandonlyrequiresasamplesizenj+2. 3.4Simulations 3.4.1ModelsToevaluateourbandingmethodinpractice,weconductseveralsimulationsusingBonferronicriteriaatdifferentlevelsof.Weconsiderseveraltruematriceswhichwedescribenext: TwosetsofmatricesbasedonbandingtheestimatedpartialautocorrelationmatricesoftheMetalandRockdata(availableathttp:www.ics.uci.edu=smelearn=MLRepository.html);formoredetailsonthedata,seeSection3.5.Inparticular,weconsider4,9,and14bands. AR(1)correlationmatricesforseveralvaluesofthelag1correlation. Fourbandpartialautocorrelationmatrix:j,j+l=0.4I(jlj=1)+0.2I(2jlj3)+0.1I(jlj=4).Foreachtruematrix,weestimatethebandkbyaveragingtheestimatedbandsover100replicateddatasetsgeneratedfromamultivariatenormaldistributionwithmeanzeroandvariance-covariancematrixbethecorrespondingcorrelationmatrixforeachcaseabove.Weestimatethepartialautocorrelationmatrixbyaveragingthese100estimatedbandedpartialautocorrelationmatricescorrespondingtoeachreplicateddatasetandthenconstructingtheestimatedcorrelationmatrix,which,ingeneral,isnotbanded.Recall,forak-bandpartialautocorrelationmatrix,weonlyneedtoinvertatmostk-dimensionalmatricestoconstructthecorrespondinginversecorrelationmatrixinstead. 3.4.2ResultsWecompareourestimatortothesamplecorrelationmatrixandtheLedoit-Wolfshrinkageestimator( LedoitandWolf ( 2003 ))ofthecovariancematrix(shrinkagetargetisadiagonalmatrix).TheLedoit-Wolf(L-W)estimatoriscomputationallytractablelike 62

PAGE 63

ourestimatorbutdoesnotexploitthedecayingcorrelationthatwedowithourbandedestimator.Andtomakeanappropriatecomparison,wecomputetheL-Westimatorofthecovariancematrixandthenuseasacomparison,theresultingcorrelationmatrix.WeusetheprocedureinSection3.2.1withtheBonferroni(Bonf)criteriaatdifferentlevelsof2f0.05,0.2,0.3,0.5g.TocompareourestimatortotheLedoit-Wolfandthesamplecorrelationmatrix,weusesquarederrorlossoftheestimatedcorrelationmatrix.Tables 3-1 to 3-3 containthesimulationresults.Theriskresultsindicateourapproachdoesmuchbetterthanthesamplecorrelationmatrix(whichisnotsurprising).ThetoppartofTable 3-1 containssimulationresultsbasedonbandingtheestimatedpartialautocorrelationmatrixofthemetaldatawith4,9,and14bandsasthetruematrices.Werstfocusonn>p.Thechoiceof=0.05doesverywellforthelargersamplesizes,n>100,butseverelyunderestimatesforthesmallersamplesizes.Choicesof=0.2,0.3doconsiderablybetterforthesmallersamplesizeswithlessthan1bandofoverestimationforthelargersamples.ThebottompartofTable 3-1 showscorrespondingresultsfortherockdatawiththesamebehaviorasthemetaldata.Table 3-2 containsresultsbasedonanAR(1)correlationmatrixfordifferentchoicesofthelag1correlation(0,0.2,0.5,0.9).Weseesimilarbehaviorasthemetalandrockdatainthatwehaveincreasedpowerforthesmallestsamplesizesfor=0.2,0.3over=0.05andlessoverestimationthan=0.5.FinallyTable 3-3 containstheresultsofatruefourbandpartialautocorrelationmatrix,thatwasusedin Rothmanetal. ( 2010 )forcovariancematrix.Byn=60,theestimatordoesverywellfor=0.2,0.3.Fornp,wecontinuetoseesubstantialriskimprovements,butaconsistentunderestimationofthebands.Thisisduetolowpowerinsmallsamplesizes.ThebestperformanceappearstobebasedonusingaBonferronicorrectionwith0.30.Wealsocompareourestimatorsto LedoitandWolf ( 2003 )shrinkageestimator,whichshrinksthesamplecovariancematrixtowardadiagonalmatrix.RisksofourestimatorsaremuchsmallerthanthoseofLedoit-Wolf(L-W)estimator.The 63

PAGE 64

performanceversustheL-WestimatorisnotsurprisingsincetheL-Westimatorisnotdesignedforoursettingwherethetruecovariancematricesherearesparse. 3.4.3ChoiceofforBonferroniCorrectionSimulationresultsdemonstratedthatdifferentchoicesofgivequitedifferentpowerforthesmallersamplesizeswithminimaloverestimationofthebandsforthelargersamplesizes(atmostoverestimateby1or2bandsonaverage)asseeninTables 3-1 to 3-3 .Tooptimizethepowerandminimizetheoverestimationforlargersamplesizes,werecommendintherangeof0.2to0.3forn>p.However,werecommendthemoreconservativeintherangeof0.3to0.5forthecaseofnpwhenthesamplepartialautocorrelationsineachbandarenolongerindependent. 3.5ApplicationstoSonarDataWeillustrateourapproachontwodatasets,theMetalandRockdataofthesonardata,whichisavailableathttp:www.ics.uci.edu=smelearn=MLRepository.html.Thisdatasetcontains111signalsfromaMetalcylinderand97signalsfromaRock,whereeachsignalhas60frequencyenergymeasurementsrangingfrom0.0to1.0.Thesesignalsweremeasuredatdifferentanglesforthesameobjects.Weassumethesignalsareiidnormalrandomvectors.ImagesofabsolutesamplecorrelationmatricesoftheMetalandRockdatainFigure 3-1 (hottocoolcorrespondsto1.0to0.0)showageneralpatternofdecayingcorrelationswithincreasinglag,whichmotivatesthebandedestimatorhere.Sincethesamplesizesarenotlargerelativetothedimensionofmatrices(p=60),basedonoursimulationsweuseestimatorsbasedon=0.2or0.3.Theresultingestimatorfortherockdatahadfourbandsandthemetaldatahadelevenbands(Table 3-4 ).TheimagesofestimatedcorrelationmatricesbasedonthebandedpartialautocorrelationmatrixestimatorsareshowninFigure 3-1 .ThetoppartofFigure 3-1 forthemetaldatashowsthatmostofthemarginalcorrelationsuptolag37arecapturedquitewellandthebottompartofFigure 3-1 fortheRockdatashowsimilarresultsuptolag17.This(tosomeextent)agreeswithpreviousanalysisofthe 64

PAGE 65

datain Rothmanetal. ( 2010 )(theyfoundbandsof37and17formetalandrockdata,respectively,basedonbandingthesamplecovariancematrixandtheCholeskyfactor).However,ourestimatorhasconsiderablyfewerparameterstoestimate(4vs.17bandsfortherockdataand11vs.37forthemetaldata)andiscomputationallymuchquicker,withlessthanonesecondforeachwithMatlabR2006a. 3.6DiscussionsWehaveproposedak-bandedestimatorforacorrelationmatrixthatexistswhennpandonlyrequiresinversionofatmostk-dimensionalmatrices.Thealgorithmfortheestimatorreliesonexactdistributionalresultsunderthenullhypothesisforn>p.Theestimatorcanbecomputedveryquickly.WerecommendaBonferronicorrectionwith2[0.2,0.3]forn>pand2[0.3,0.5]fornp.RelatedbandedestimatorsbasedonthesamplecovariancematricesneedadjustmentstoensurepositivedenitenessandbandedestimatorsforthesamplecovarianceanditsCholeskyfactordonotprovideexactdistributionalresultsandrequirecomputationallyintensivecross-validationprocedurestoobtainthenumberofbands( BickelandLevina ( 2008b ); Rothmanetal. ( 2010 )).Inaddition,theseestimatorsrequiren>p.Currentextensionsofthisworkincludedeterminingtherateofconvergenceofourestimatorandtheadditionofsmoothingthepartialautocorrelationswithinbands(similartopreviousworkby WuandPourahmadi ( 2003 )usingtheGARPsofthemodiedCholeskydecompositionofacovariancematrix)tobetterestimatethepartialautocorrelationsineachnon-zeroband. 65

PAGE 66

Table3-1. Risksimulationsbasedonmetalandrockdata ^RL-W=0.05=0.2=0.3=0.5 nBandriskriskEBriskEBriskEBriskEBrisk 43591261.463.92.261.42.664.13.274.11093541571.61322.11172.51173.0121143551671.51662.11522.61493.4152417391.12.732.33.333.03.533.64.137.22091641052.687.03.480.63.976.65.075.5141621052.71163.41093.91065.198.4410567.93.321.63.721.94.022.54.424.030911680.13.569.75.258.76.254.97.753.71410582.63.795.35.978.56.775.38.071.5484.258.03.616.34.016.44.116.64.617.740981.364.24.654.36.843.77.840.39.039.61480.664.85.080.47.761.59.457.310.955.3468.050.23.812.64.212.74.212.94.513.750964.552.45.345.47.932.68.630.39.130.01463.253.85.869.19.252.910.948.213.143.8453.742.74.010.24.210.44.310.64.611.160953.244.86.733.18.825.39.125.09.725.31448.340.37.552.911.039.312.435.813.834.1434.328.76.23.054.26.284.33.204.56.61100928.025.18.714.49.214.09.414.09.714.41431.129.212.221.813.920.214.220.014.620.2410.510.04.12.024.22.114.32.154.72.2930099.479.129.14.449.24.499.34.549.74.65149.549.2714.16.2514.36.2914.36.3114.66.3646.936.754.11.204.21.244.31.264.61.3450096.626.369.02.769.32.839.42.859.62.88146.106.0414.03.6814.03.7114.33.7314.63.7543.553.474.00.584.20.604.30.614.60.65100092.942.899.01.339.21.319.31.329.61.35142.992.9514.01.8714.21.8814.41.8914.71.92 66

PAGE 67

Table 3-1 .Continued ^RL-W=0.05=0.2=0.3=0.5 nBandriskriskEBriskEBriskEBriskEBrisk 43641391.146.91.858.32.061.82.772.41093541451.274.61.978.92.384.42.991.8143491611.21161.81182.21172.8119417193.72.333.13.232.33.532.54.034.220916698.72.160.23.154.23.552.24.452.3141671092.199.83.390.23.887.04.785.6411472.63.222.73.721.54.022.24.423.230910980.93.244.44.238.34.736.96.140.11410981.93.181.63.973.84.569.55.865.3482.059.13.615.23.915.14.215.94.516.540982.461.13.834.95.331.95.932.17.532.91477.361.93.670.74.861.85.857.98.253.7468.251.13.713.04.113.34.314.14.915.050965.453.13.833.45.728.86.828.08.527.71463.654.24.165.56.353.67.849.810.045.6455.442.63.710.54.210.14.410.54.711.160949.941.84.626.96.523.47.922.89.122.61452.043.14.758.38.343.910.439.012.635.6434.929.74.15.904.36.044.36.094.66.42100933.431.44.816.48.813.89.113.69.613.61430.327.58.535.513.021.713.620.414.518.8410.810.24.11.924.21.994.32.024.62.11300910.19.579.14.049.24.099.34.129.64.201410.59.6714.06.0914.26.1714.36.1814.56.2746.176.034.11.174.21.194.41.224.61.2650096.896.879.12.379.22.419.32.439.62.49145.785.6014.13.5314.23.5414.33.5614.63.6043.433.334.10.594.30.614.40.624.70.66100092.942.979.11.169.21.179.31.199.51.21143.333.3214.11.7714.21.7814.41.7914.81.81 67

PAGE 68

A B C D E F G HFigure3-1. Imagesofsamplecorrelationandestimatedcorrelationmatrices 68

PAGE 69

Table3-2. RisksimulationbasedonAR(1)correlationmatrices ^RL-W=0.05=0.2=0.3=0.5 nriskriskEBriskEBriskEBriskEBrisk 039416.60.00.350.11.110.11.760.34.28100.239019.90.15.440.27.630.49.360.612.10.538549.30.429.80.919.81.018.71.219.30.93262071.034.91.135.51.135.51.339.301862.100.10.530.21.300.31.870.53.53200.21856.570.25.420.76.680.87.061.18.330.518434.51.07.541.18.381.29.221.510.80.91521221.015.51.115.71.216.21.417.601220.600.00.200.20.820.21.170.52.32300.21225.370.34.740.74.881.05.581.46.840.512030.31.14.811.25.341.25.631.46.460.998.386.51.010.11.110.11.210.61.411.9090.70.310.00.140.20.670.41.160.61.95400.290.75.010.64.011.13.801.33.991.54.650.589.427.71.13.521.24.031.34.431.55.120.971.965.31.17.641.38.091.48.291.79.27072.10.18000.10.3810.30.6590.51.30500.272.04.770.83.081.23.181.33.441.74.330.572.125.71.12.821.33.511.53.961.84.740.958.053.51.05.761.26.081.36.331.77.02059.80.120.10.180.30.630.40.800.71.42600.260.14.670.82.691.22.541.42.871.83.580.558.623.71.12.371.32.851.33.031.53.430.949.146.21.15.211.25.581.36.031.66.37035.80.0510.10.100.30.360.30.460.70.891000.235.74.381.01.321.21.521.41.721.72.050.535.018.61.11.351.21.561.31.651.51.880.927.927.41.03.101.33.371.43.511.63.92 69

PAGE 70

Table 3-2 .Continued ^RL-W=0.05=0.2=0.3=0.5 nriskriskEBriskEBriskEBriskEBrisk 017.70.010.10.030.20.130.40.260.70.412000.217.83.871.00.621.20.711.30.771.60.950.517.512.11.00.631.10.701.30.771.81.020.914.413.91.11.571.31.661.41.711.71.8907.130.010.10.020.40.100.50.120.80.205000.27.082.911.00.241.10.271.30.311.60.390.56.995.971.10.281.20.321.30.341.50.400.95.615.581.10.621.30.651.40.711.80.8003.530.000.10.010.20.030.30.030.60.0810000.23.542.061.00.121.30.151.40.171.80.220.53.473.191.00.131.20.151.30.171.60.200.92.842.831.10.281.20.291.30.301.60.35 Table3-3. Risksimulationsof4bandpartialautocorrelationmatrices ^RL-W=0.05=0.2=0.3=0.5 nriskriskEBriskEBriskEBriskEBrisk 1037872.70.267.40.760.41.055.61.554.12018053.81.044.81.736.82.134.32.932.73011846.31.535.02.426.02.923.73.422.24087.240.21.926.23.016.83.415.43.915.55069.836.62.124.13.314.53.712.64.212.06058.132.42.716.43.69.543.99.934.210.110034.123.73.36.093.85.574.05.564.25.5130011.29.884.01.774.11.744.21.774.41.855006.856.274.01.054.21.094.41.144.61.1810003.453.294.00.494.10.514.30.534.60.57 Table3-4. Estimatednumberofbandsforrockandmetalofsonardata 0.050.20.30.5 Rock3444Metal6111111 70

PAGE 71

CHAPTER4NONPARAMETRICESTIMATIONOFLARGECORRELATIONMATRICESBYSMOOTHINGBANDSINTHEPARTIALAUTOCORRELATIONInthispaper,Weimprovetheestimatorofacorrelationmatrixbyusingnonparametricregressionmethodonthebandedestimatorofpartialautocorrelationmatrixobtainedfrom WangandDaniels ( 2012 ).Inparticular,wesmooththepartialautocorrelationswithinbands.AnasymptoticoptimalbandwidthisderivedbasedonToepltizconditiononabandedpartialautocorrelationmatrix,andaplug-inbandwidthselectorissuggested.Thefavorablepropertiesofthisestimatorisdemonstratedbysimulation.Theestimatorisillustratedontwohighdimensionaldatasets.ThesimulationssuggeststhattheToeplitzconditiononpartialautocorrelationmatriceswithinlagisnotrestrictive. 4.1IntroductionEstimatingalargecovariancematrixisveryimportantinmanyapplicationsandhasbeenanactiveresearchareainthelasttenyears.Thesamplecovariancematrixisasimplenonparametricestimatorofcovariancematrix,butitbehavespoorlywhenthedataareseverelyunbalancedorthedimensionofcovariancematrixislargecomparedtothenumberofobservations( Stein ( 1975 )).Wereviewsomeofthemostrelevantliteraturehere. RiceandSilverman ( 1991 )suggestedanonparametricmethodtoestimateacovariancematrixbysmoothingtherstfeweigenfunctions. Halletal. ( 1994 )usedkernelmethodsforestimatingcovariancefunctionsforastationarystochasticprocess,andemployedtheFouriertransformationtoensurethepositivesemidenitenessconditionfortheestimatedcovariancefunctions. DiggleandVerbyla ( 1998 )providedakernel-weightedlocallinearregressionestimatorforestimatingthenonstationaryvariogramforlongitudinalstudies,butcannotguaranteetheestimatedcovariancematrixispositivedenite. WuandPourahmadi ( 2003 )examinedanonparametricmethodbasedonCholeskydecompositionofacovariancematrixwithalocalregressionpolynomialsmoother. Huangetal. ( 2007 )consideramethodfollowingCholeskydecomposingacovariancematrix,thenapproximatingthesubdiagonals 71

PAGE 72

ofCholeskydecompositionbysplines. Lietal. ( 2007 )proposedanonparametricestimatorofcorrelationfunctionsandprovedasymptoticnormalityforitssamplingdistributionundercertainassumptions,buttheyneedusetheprocedurein Halletal. ( 1994 )toensurepositivedeniteness. Fanetal. ( 2007 )and FanandWu ( 2008 )proposeaviablesemiparametricmodelforcovariancematrixbyestimatingvariancefunctionsnonparametrically,whileestimatingthecorrelationfunctionparametricallyaccordingtoinformationfromirregularandsparsedatapointsoneachsubject. Yinetal. ( 2010 )suggestedanonparametricmodeltoestimateconditionalcovariancematrices. Li ( 2011 )proposedamethodtoestimatecovariancematricesforlongitudinaldataandshowedthatkernelcovarianceestimationprovidesuniformlyconsistentestimatorsforthewithin-subjectcovariancematrices.Boundarybiasisaproblemassociatedwithkernelsmoothingforestimationneartheboundariesoftherange. Jones ( 1993 )outlinedboundarybiascorrectionmethodsforkerneldensityestimationandpointedoutimportantconnectionswiththeregressionsituation.Specically,alinearcombinationoftwokernelsprovidesasimpleandeffectivebiascorrection HastieandLoader ( 1993 ).Selectingthesmoothingparameterisanotherissuelinkedtononparametricestimation.Cross-validationandvariationsprovidesimplemethodstochoosethesmoothingparameterwiththegivenanswerasymptoticallyoptimalunderveryweakconditions( Stone ( 1984 )).However,itperformspoorlyforkerneldensityestimation ParkandMarron ( 1990b )andcanbecomputationallyexpensive. Ruppertetal. ( 1995 )proposedamethodtoestimateanasymptoticallyoptimalbandwidthforindependentdatabyplugginginestimatorsbasedonablockedlocallyquarticpolynomialtwiththenumberofblock,N,chosenbyMallows'Cp.Theyadjustforboundarybiasbytruncatingboundaryregions.Plug-inmethodsassumingindependentdataunderestimate(overestimate)theappropriatebandwidthasdataispositively(negatively)correlated. Chiu ( 1989 )providedabandwidthselectionmethod 72

PAGE 73

forcorrelateddatabymodifyingMallows0Criterion,while Herrmannetal. ( 1992 )modiedtheindependentplug-inbandwidthwithacorrectionfactorforcorrelateddata.Generally,covariancematricesaresparseasthedimensionmislarge. BickelandLevina ( 2008b )achievesparsitybybandingthesamplecovariancematrixwhile Rothmanetal. ( 2010 )bandtheCholeskyfactorofthecovariancematrix,buttheformerhastouse'tapering'toachievepositive-denite( FurrerandBengtsson ( 2007 )). WangandDaniels ( 2012 )investigatesparsitybybandingthepartialautocorrelationmatrixwithoutconcernaboutthepositivedenitenessconditionontheestimatedcorrelationmatrix.Inthisfollowuppaperof WangandDaniels ( 2012 ),wepresentamethodtosmooththebandsofthepartialautocorrelationmatrix,usingkernelsmoothingasin Altman ( 1990 ).Weusetheapproachin FunandGijbels ( 1996 )toaccommodatetheboundarybiasprobleminkernelestimation.Theprocedureiscomputationallyquickevenforhighdimensionalcovariancematricesviatheuseofaplug-inbandwidth(nocross-validation).Thischapterisarrangedasfollows.InSection4.2,webrieyreviewthepartialautocorrelationmatrix.InSection4.3,wediscusssomerelevantpropertiesofabandedpartialautocorrelationmatrix.InSection4.4,wediscusssometheoreticalresultsrelatedtooursmoothingestimatorandproposeasimplealgorithmtocomputetheestimator.Section4.5and4.6investigatetheoperatingcharacteristicsofourprocedureviarisksimulationsandapplyittotworealdataexamples,respectively.Note,inthischapter,mdenotesthedimensionofpartialautocorrelationmatrix,andndenotesthenumberofobservations. 4.2ReviewofPartialAutocorrelationsWerstreviewreparameterizingthecorrelationmatrixR=(jk)usingtheelementsinthepartialautocorrelationmatrix=(jk),wherejj=1andfor1j
PAGE 74

fYj+1,...,Yk)]TJ /F5 7.97 Tf 6.59 0 Td[(1g.UnlikeR,theoff-diagonalelementsofcanindependentlyvaryin()]TJ /F6 11.955 Tf 9.3 0 Td[(1,1)withthecorrespondingcorrelationmatrix,R,remainingpositive-denite.ThereisasimplerelationshipbetweentheelementsofRand.Thepartialautocorrelationjj+1=jj+1andj,j+k,(26k6m)]TJ /F6 11.955 Tf 11.95 0 Td[(1)is (j,j+k)=j,j+k)]TJ /F11 11.955 Tf 11.95 0 Td[(r1(j,k)R)]TJ /F5 7.97 Tf 6.59 0 Td[(12(j,k)rT3(j,k) [1)]TJ /F11 11.955 Tf 11.95 0 Td[(r1(j,k)R)]TJ /F5 7.97 Tf 6.58 0 Td[(12(j,k)rT1(j,k)]1=2[1)]TJ /F11 11.955 Tf 11.95 0 Td[(r3(j,k)R)]TJ /F5 7.97 Tf 6.58 0 Td[(12(j,k)rT3(j,k)]1=2, (4) withr1(j,k)=(j,j+1,...,j,j+k)]TJ /F5 7.97 Tf 6.58 0 Td[(1),r3(j,k)=(j+k,j+1,...,j+k,j+k)]TJ /F5 7.97 Tf 6.59 0 Td[(1),andR2(j,k)containsthemiddlek)]TJ /F6 11.955 Tf 11.95 0 Td[(1rowsandcolumnsofR[j:j+k],i.e. R[j:j+k]=0BBBB@1r1(j,k)j,j+krT1(j,k)R2(j,k)rT3(j,k)j+k,jr3(j,k)11CCCCA (4) Correspondingly,j,j+kcanbewrittenasafunctionofj,j+k, j,j+k=rjk+j,j+kDjk, (4) whererjk=r1(j,k)R)]TJ /F5 7.97 Tf 6.59 0 Td[(12(j,k)rT3(j,k)D2jk=[1)]TJ /F4 11.955 Tf 11.96 0 Td[(r1(j,k)R)]TJ /F5 7.97 Tf 6.59 0 Td[(12(j,k)rT1(j,k)][1)]TJ /F4 11.955 Tf 11.95 0 Td[(r3(j,k)R)]TJ /F5 7.97 Tf 6.58 0 Td[(12(j,k)rT3(j,k)].Apartialautocorrelationmatrixwilloftenbesparse( Dempster ( 1972 ); Friedmanetal. ( 2008 ))sincethecorrelationsinitareconditional(ontheinterveningvariables)correlations.Forexample,anAR(1)correlationmatrixcorrespondstoapartialautocorrelationmatrixwithonlyonenon-zeroband.Moregenerally,apartialautocorrelationmatrixwillhaveknon-zerobandsunderak-thorderante-dependencemodel( Gabriel ( 1962 ); ZimmermanandNunez-Anton ( 2010 )) 74

PAGE 75

4.3SomePropertiesofk0BandPartialAutocorrelationMatricesThestrategyinthispaperwillbetorstbandthepartialautocorrelationmatrixandthensmooththenon-zerobands.Werstprovidesomeusefulpropertiesofthepartialautocorrelationmatrix.AssumeisapartialautocorrelationmatrixandRisitscorrespondingcorrelationmatrix.Toinvestigatepropertiesofk0bandpartialautocorrelationmatrices,werststatesomeusefulfactshere.Fact1:( Joe ( 2006 ))Ifisatoeplitzmatrix,soisR.Fact2:( WangandDaniels ( 2012 ))Ifisak0-bandpartialautocorrelationmatrix,thentheinverseofthecorrespondingcorrelationmatrix,R)]TJ /F5 7.97 Tf 6.59 0 Td[(1isak0-bandmatrix.Fact3:( Demkoetal. ( 1984 ))LetAbeapositiondenite,k0-band,boundedandinvertiblematrixwithanEuclideanmetric,and[a,b]bethesmallestintervalcontainingthespectrum,(A),ofthematrixA.Setr=b a,C0=(1+p r)2 2arand=p r)]TJ /F5 7.97 Tf 6.58 0 Td[(1 p r+12 k0.Then,the(i,j)thelementofA)]TJ /F5 7.97 Tf 6.59 0 Td[(1satises jA)]TJ /F5 7.97 Tf 6.59 0 Td[(1(i,j)jCji)]TJ /F7 7.97 Tf 6.59 0 Td[(jj (4) whereC=maxfa)]TJ /F5 7.97 Tf 6.59 0 Td[(1,C0g.Thisresultshowsthatentriesofinversematrixofabandedmatrixexponentiallydecaywithlag.Moreoverthedecayrateisonlyrelatedtonumberofbands(k0)andtherangeofitseigenvalues,regardlessofthedimensionofthematrix.Inthecaseofabandedpartialautocorrelationmatrix,theentriesofcorrespondingcorrelationmatrixareexponentiallydecayingaslagincreases.ThisisclearlyseeninanAR(1)correlationmatrixwhichinducesa1-bandpartialautocorrelationmatrix.Inthefollowing,letmdenotethedimensionoftherandomvectorYi:i=1,2,...,n.Wehavefollowingpropositionimmediately. Proposition4.1. LetRm=(m(jj+k))bethecorrespondingcorrelationmatrixofak0-bandpartialautocorrelationmatrixm,andlet[am,bm]bethesmallestinterval 75

PAGE 76

containingthespectrum(R)]TJ /F5 7.97 Tf 6.58 0 Td[(1m)ofthematrixR)]TJ /F5 7.97 Tf 6.59 0 Td[(1m.Setrm=bm am,Cm=(1+p rm)2 2amrm,and=p rm)]TJ /F5 7.97 Tf 6.58 0 Td[(1 p rm+12 k0.Then,jm(j,j+k)jCmjkj,whereCm=maxfa)]TJ /F5 7.97 Tf 6.58 0 Td[(1m,Cmg.Furthermore,form=2,3,...assumea0=infmfamg>0,b0=supmfbmg<+1,andC0=supmfCmg<+1.Setr0=b0 a0,and0=p r0)]TJ /F5 7.97 Tf 6.58 0 Td[(1 p r0+12 k0.Then,jm(j,j+k)jC0jkj0,forallm2.Now,assumeY1,Y2,...,Ynareiidm-dimensionalmultivariatenormalrandomvectorswithmeanzeroandvariance-covariancematrixm=DmRmDm,whereDmisadiagonalmatrixwithstandarddeviationsonitsmaindiagonalandRmisacorrelationmatrixwhichcorrespondstopartialautocorrelationmatrixm.Let^mbethecorrespondingsamplepartialautocorrelationmatrixofsamplecorrelationmatrix^Rm.^mistheMLEofm( WangandDaniels ( 2012 )).Now,wedenevh(^m)asvh(^m)=(^21,^31,...,^m1,^32,...,^m2,...,^mm)]TJ /F5 7.97 Tf 6.59 0 Td[(1)T,avectorwithalltheelementsinthelowertriangularpartof^m(withoutthemaindiagonal).Apropertyoftheasymptoticcovariancematrixofvh(^m)isgivenbelow. Theorem4.1. Let^m=(^m(j,j+k))bethesamplepartialautocorrelationmatrixofanm-dimensionalmultivariatenormaldistributionbasedonaniidsample,YiN(m,DmRmDm),i=1,2,...,n.Rmisthecorrelationmatrixwhichcorrespondstoak0-bandpartialautocorrelationmatrix.Let[am,bm]bethesmallestintervalcontainingthespectrum(R)]TJ /F5 7.97 Tf 6.59 0 Td[(1m)ofthematrixR)]TJ /F5 7.97 Tf 6.58 0 Td[(1m.Setrm=bm am,,Cm=(1+p rm)2 2amrmand=p rm)]TJ /F5 7.97 Tf 6.59 0 Td[(1 p rm+12 k0.Assume 76

PAGE 77

a. a0=infmfamg>0andb0=supmfbmg<+1,C0=supmfCmg<+1.Setr0=b0 a0,and0=p r0)]TJ /F5 7.97 Tf 6.59 0 Td[(1 p r0+12 k0. b. )]TJ /F6 11.955 Tf 9.3 0 Td[(1
PAGE 78

4.4NonparametricEstimationofPartialAutocorrelationCoefcientswithinBandsWeusetheresultsinSection3tosmooththepartialautocorrelationsineachband.Letk,mdenotethevector(1k+1,2k+2,...,m)]TJ /F7 7.97 Tf 6.58 0 Td[(k,m),k=1,2,...,m)]TJ /F6 11.955 Tf 12.58 0 Td[(1,thevectorofentriesonkthsubdiagonalsofm.Weassumethatcomponentsofk,m,arerealizationsofasmoothfunctionk,m()on[0,1],i.e.k,m(t)=k,m(t m)]TJ /F7 7.97 Tf 6.58 0 Td[(k).Forconvenience,wedenotex=t m)]TJ /F7 7.97 Tf 6.59 0 Td[(kandxj=j m)]TJ /F7 7.97 Tf 6.59 0 Td[(k,j=1,2,...,m-k.Let^k,m(t m)]TJ /F7 7.97 Tf 6.58 0 Td[(k)):=^k,m(x).Assume ^k,m(x)=k,m(x)+k(x)(4)wherek,m(x)isasmoothmeanfunctionandkisanerrorprocesswithmeanzero.NowwechooseakernelK()satisfyingconditionA,B,Cin Altman ( 1990 ), A. Kissymmetricabout0. B. Khassupportonlyontheinterval()]TJ /F5 7.97 Tf 10.5 4.71 Td[(1 2,1 2). C. KisLipschitzcontinuousoforder>0.WealsoassumethatKisapthorderkernel( GasserandMuller ( 1979 )),whichsatises,sK=ZxiK(x)dx8><>:=0ifi
PAGE 79

wherehisthebandwidth,misthedimensionofthepartialautocorrelationmatrix,andkislag. 4.4.1TheoreticResultsWithoutlossofgenerality,wedropsubscriptskandmofk,m(x)and^k,m(x,h,K),andthenusingresultsin Altman ( 1990 )wearriveatthefollowingtheorem. Theorem4.2. Supposethatthemeanfunction()inEq. 4 haspderivativesandthekernelfunctionKisoforderpandsatisesconditionsA-C.SupposethatthepointsareequallyspacedonsupportoffunctionintervalofestimationandpartialautocorrelationmatrixmsatisestheconditionsinTheorem1.Thenfor& 2x1)]TJ /F12 7.97 Tf 13.51 4.71 Td[(& 2,where&isasmallpositivenumber.MSE(^(x,h,K))=E(^(x,h,K))]TJ /F3 11.955 Tf 11.96 0 Td[((x))2=hp(p)(x)sK p!2+NK(2,k+2S,k) (m)]TJ /F4 11.955 Tf 11.95 0 Td[(k)h+o(1=(m)]TJ /F4 11.955 Tf 11.95 0 Td[(k)h)+o(h2p)wheresK=RxpK(x)dx,NK=RK2(x)dx,andS,kisdenedinEq. 4 .Typically,wewanttoestimate(x)globallyoverthesupportof(x)andanappropriateerrorcriteriaisconditionalmeanintegratedsquarederror( RuppertandWand ( 1994 )).MISE(^(x,h,K))=EZ[^(x,h,K))]TJ /F3 11.955 Tf 11.96 0 Td[((x)]2]v(x)dx=hpsK p!2Z((p)(x))2v(x)dx+NK(2,k+2S,k)Rv(x)dx (m)]TJ /F4 11.955 Tf 11.95 0 Td[(k)h+o(1=(m)]TJ /F4 11.955 Tf 11.96 0 Td[(k)h)+o(h2p)wherev(x)isaweightfunction.ByminimizingMISEasafunctionofh,weobtainasymptoticoptimalbandwidthgiveninthefollowingcorollary. Corollary4.2.1. UndertheconditionsofTheorem2,theasymptoticallyoptimalband-widthis hm,k=(p!)2NK(2,k+2S,k)Rv(x)dx 2ps2KR((p)(x))2v(x)dx1 2p+1(m)]TJ /F4 11.955 Tf 11.95 0 Td[(k))]TJ /F16 5.978 Tf 13.66 3.26 Td[(1 2p+1. (4) 79

PAGE 80

Theasymptoticoptimalbandwidthgivenaboveisageneralizationofasymptoticoptimalbandwidthin Hart ( 1991 )and Herrmannetal. ( 1992 )forp=2. Corollary4.2.2. UndertheconditionsofTheorem2,thekernelestimatorisconsistent,ash)166(!0and(m)]TJ /F4 11.955 Tf 11.96 0 Td[(k)h)166(!+1.Proofs:Trivialextensionsofresultson Altman ( 1990 ).Boundarybias,whichhappensneartheboundariesoftherange.Tobalancebetweenbiasandvariance,alocallinearregressionsmoother,whichisalinearcombinationofK(x)andxK(x)( HastieandLoader ( 1993 ))ischosenforoursituationduetothatourprimarygoalisestimate(x)insteadsomedegreeofderivativeof(x).Italsoprovidesdesirablepropertiesatboundaries. 4.4.2EstimationOurprocedureforestimatingacorrelationmatrixproceedsintwosteps, Estimate,^k0thenumberofbandsinthesamplepartialautocorrelationmatrix. Smooththepartialautocorrelationcoefcientswithineachbandk(k^k0). 4.4.2.1Choosenumberofbands WangandDaniels ( 2012 )developedaproceduretoestimatethenumberofbandsinapartialautocorrelationmatrixbasedonasequentialmultiplehypothesistestingprocedurewithmodiedtypeIerrorrate.Theadvantagesofthisprocedureincludes;1)Hypothesistestingisbasedonexactdistributions,2)Inversionofhighdimensionalmatrices.3)Noconcernsaboutthepositive-deniteconditionontheestimatedcorrelationmatrix.Weestimatethenumberofbands,^k0withthisprocedure. 4.4.2.2SmoothestimateswithineachbandOnceweestimatethenumberofbands,wesmooththeirentries.Corollary 4.2.2 guaranteesoursmoothingestimatorisconsistent.Themainremainingconcernischoosingthesmoothingparameterh.ToestimatethebandwidthinEq. 4 ,weneedtoestimate2,k,S,k,andp,p=R((p)(x))2v(x)dx. Ruppertetal. ( 1995 )suggestedawaytoestimate2,k,r,s=R(r)(x)(s)(x)v(x)dx.Detailsaregivenintheappendix. 80

PAGE 81

4.5SimulationStudyToevaluatetheoperatingcharacteristicsofourestimator,weconductseveralsimulationswithtruemodelsforpartialautocorrelations,j,j+k,below. Model1:AR(1)structure(j,j+k)=I(jkj=1),for=0,0.2,0.5,0.9,whichcorrespondstoallhigherlagpartialautocorrelationsequaltozeroexceptlag1.=0isaspecialcaseofwithinsubjectindependence,and=0.2,0.5,0.9representtoweak,moderate,andstrongwithinsubjectcorrelation. Model2:4bandpartialautocorrelation(j,j+k)=0.4I(jkj=1)+0.2I(2jkj3)+0.1I(jkj=4),whichcorrespondstothepartialautocorrelationmatrixsatisfyingToeplitzconditionandreectsthepartialautocorrelationmatrixhavingmoderatepartialautocorrelationsonlag1andsmallpartialautocorrelationsonlag2,andlag3,verysmallpartialautocorrelationsonlag4. Model3:5bandpartialautocorrelation(I)(j,j+k)=I(jkj<6)maxf0,exp()]TJ /F6 11.955 Tf 9.3 0 Td[((jkj+2)=4))]TJ /F6 11.955 Tf 11.95 0 Td[((1+j))]TJ /F5 7.97 Tf 6.59 0 Td[(1.5g.Thepartialautocorrelationcoefcientsaredecreasingwithtime(j)withineachlag,andareexponentiallydecayingaslag(k)increasesforagiventime. Model4:9bandpartialautocorrelations(II)jj+k=1 2(1+sin60)]TJ /F7 7.97 Tf 6.58 0 Td[(j 5)(1 2+k))]TJ /F5 7.97 Tf 6.59 0 Td[(1.02I(k<10).Thepartialautocorrelationsareperiodicfunctionsoftime.Foreachtruematrix,wesimulate100datasetswithdimensionm=60anddifferentsamplesizesfromamultivariatenormaldistributionwithmeanzeroandvariance-covariancematrixequaltocorrespondingcorrelationmatrixforeachcaseabove.Weestimatethesmoothedpartialautocorrelationmatrixbyaveragingthesmoothedestimatorsofpartialautocorrelationmatricesover100replicateddatasetsandthenconstructingtheestimatedcorrelationmatrix.Wecomparethesmoothedcorrelationestimatortosamplecorrelation,Ledoit-Wolfestimator,thebandedestimatorandsmoothedestimatorwithsmoothingparameterchosenby Ruppertetal. ( 1995 ).Wecalculateameansumofsquareerrorloss(MSSR)oftheestimatedcorrelationmatrixbyrstcalculatingsquaredL2normbetweenthecorrespondingcorrelationmatrixof 81

PAGE 82

estimatedpartialautocorrelationandtruecorrelationmatrix,thenaveragethissquarederrorlossover100replicates.TheresultsarerecordedinTables 4-1 to 4-4 .Table 4-1 providesthemeansumofsquareerrorfortheveestimatorsmentionedformodel1.Wecanseethatsmoothedpartialautocorrelationcoefcientswithineachbandsignicantlyreducethemeansumofsquareerror(MSSR)comparedtobandedpartialautocorrelationestimatorandsamplecorrelationestimator.Furthermore,oursmoothingestimatorreducesthisquantityaround5%comparedtothemethodsuggestedby Ruppertetal. ( 1995 ).Table 4-2 providesresultsformodel2andhasasimilarstorytomodel1.Table 4-3 recordstheresultsformodel3,whichdoesnotsatisfyToeplitzcondition.Wecanseethat WangandDaniels ( 2012 )estimatethebandverywell,evenifasamplesizeissmall(comparingtodimensionp=60),andthesmoothedestimatorsreducetheirMSSRmorethan50%forsmallsamplesize(comparingtoMLEandW-Lestimator).Althoughmodel4ismorecomplicatedduetoperiodicallyvaryingpattern,MSSRshowssimilarimprovements(Table 4-4 ). 4.6ApplicationsWeillustrateourapproachontwodatasets,theMetalandRockdataofthesonardata,whichisavailableathttp:www.ics.uci.edu=smelearn=MLRepository.html.Thisdatasetcontains111signalsfromaMetalcylinderand97signalsfromaRock,whereeachsignalhas60frequencyenergymeasurementsrangingfrom0.0to1.0.Thesesignalsweremeasuredatdifferentanglesforthesameobjects.Asinpreviousanalysisofthisdataweassumethesignalsareiidnormalrandomvectors( Rothmanetal. ( 2010 )).Figure 4-2 and 4-3 displayestimatedpartialautocorrelationsandtheestimatedsmoothedpartialautocorrelationsforeachbandformetalandrockdata,respectively.Forthemetaldata,thetattheboundariesisverygood,exceptontherightboundaryoflag2tting.Thethereisdrivenbytheestimatedpartialautocorrelationatthenalfrequency,whichmaybeanoutlier.Forlag5,thereisabigchangearoundfrequency 82

PAGE 83

30,whichisconrmedbytheimageplotofcorrelationmatrixofmetaldatainFigure 4-4A .Theimageofthecorrelationmatrixwhichcorrespondstosmoothedpartialautocorrelationwithinestimatedbands,appearstocapturethemainfeaturesofthesamplecorrelationmatrix.Comparedtometaldata,thecorrelationstructureofrockdataappearstobemuchsimpler.Figure 4-3 showsclearlyhowthesmoothedcurvescapturethecharacteristicofestimatedpartialautocorrelationwithineachband.NotethatthesmoothedcurvesviolatetheToeplitzconditionusedinourtheorems.However,thecurvettingstillworkswellwhichweobservedfromthedecreaseinsumofmeansquarederrorinthesimulationstudyinsection4.5. 4.7DiscussionWehaveproposedanonparametricsmoothingmethodtoimprovetheestimationofabandedpartialautocorrelationmatrix,evenwhenthedimensionmislessthansamplesizen.ThisestimatorisconsistentaslongaspartialautocorrelationssatisfysomeregularityconditionsforthekernelandaToeplitzconditiononthepartialautocorrelationmatrix.Themaincondition,convergenceofcovariancesbetweenthepartialautocorrelationsdoesnotrequiretheToeplitzcondition,andinanycase,theToeplitzconditiondoesnotappeartoberestrictivefromoursimulationresults.WearedoingfurtherworktoseeiftheToeplitzconditionisreallynecessary.Anotherconditioninourtheoremisequallyspaceddesignpoints,acommonassumptioninthenonparametricliterature( Altman ( 1990 ); Hart ( 1991 ),and Ruppertetal. ( 1995 )).However, RiceandWu ( 2001 )discussaplug-inbandwidthestimatorforunequallyspaceddesignpoints.Weplantoexploremoreonthistopicaswell. 83

PAGE 84

Table4-1. SimulationrecordsforAR(1)structure TypeIError=0.2I(n>m)+0.3I(nmnMSSE(MLE)MSSE(L-W)MSSE(B-E)MSSE(S-R)MSSE(S-D)EB 10394.4216.5629.86830.77550.73800.620186.292.10496.51600.46970.45080.930122.000.59824.73790.33380.31931.004090.7460.30624.49520.35020.33191.45072.1180.18212.48590.18680.18271.06059.8480.12162.56850.22030.21201.210035.8100.05140.59660.04230.04010.420017.6990.01250.36650.03280.03130.65007.13380.00590.15870.01180.01110.610390.4419.89317.1763.24903.19230.920184.766.57289.61111.15811.11561.330121.925.37338.69660.91570.88481.90.24090.6685.00776.28990.49470.46781.95072.0044.77185.98390.52330.49992.36060.1024.67385.14960.43660.41952.410035.6754.37811.83530.15980.15171.520017.7563.86900.88910.07850.07601.55007.07932.90740.36430.03240.03131.510385.3249.31821.5592.91312.82881.320183.5834.54914.2132.09152.03091.930119.9330.3039.02321.19181.13861.90.54089.38827.6607.02380.93960.90512.15072.07325.7386.79780.97360.95112.56058.62023.7253.96210.59060.57001.810035.01518.5881.82690.26190.25381.420017.48612.1060.86390.12990.12601.45006.99155.96710.37550.05530.05311.4 84

PAGE 85

Table 4-1 .Continued TypeIError=0.2I(n>m)+0.3I(nm)nMSSE(MLE)MSSE(L-W)MSSE(B-E)MSSE(S-R)MSSE(S-D)EB 10326.50206.6242.09924.83024.4881.520152.37122.3020.85313.24813.0281.93098.26586.53413.6088.33558.20511.80.94071.97365.31810.1576.27846.18752.25058.03353.4848.92945.56265.46662.46049.06846.2247.29294.59164.54222.210027.89227.3903.76172.57582.55551.520014.37813.9461.81391.24211.22881.65005.61285.58010.74740.50070.49571.6 Table4-2. Simulationrecordsformodel2 TypeIError=0.3I(n>m)+0.4I(nm)nMSSE(MLE)MSSE(L-W)MSSE(B-E)MSSE(S-R)MSSE(S-D)EB 10378.0572.72754.01825.00824.8842.120180.9253.81933.83811.43011.2303.730117.8246.25822.2055.86475.76644.34087.24640.17116.5513.80793.71954.55069.83636.56613.5032.80642.72744.96058.13532.43311.1352.56232.50454.810034.14223.6985.52701.34421.31544.130011.2379.88331.81310.36770.35764.45006.85136.62921.15410.24530.23834.6 Table4-3. Simulationrecordsformodel3 TypeIError=0.3I(n>m)+0.4I(nm)nMSSE(MLE)MSSE(L-W)MSSE(B-E)MSSE(S-R)MSSE(S-D)EB 10346.60164.76135.11113.69113.752.520158.13107.4153.90229.45629.4304.930104.4683.95034.43915.43315.4245.84076.97262.54625.85410.65910.6816.15063.14853.99319.9448.78598.83766.16052.32246.12016.5107.17347.18646.210031.00530.3348.59794.36244.44375.530010.2599.92342.93611.85431.88165.45006.15946.04011.73281.20491.17955.4 85

PAGE 86

A BFigure4-1. Truepartialautocorrelationcurvesformodels 86

PAGE 87

Table4-4. Simulationrecordsformodel4 TypeIError=0.3I(n>m)+0.4I(nm)nMSSE(MLE)MSSE(L-W)MSSE(B-E)MSSE(S-R)MSSE(S-D)EB 10365.56119.1886.52572.51072.2102.020169.2383.04646.66428.21327.9714.330113.2967.56935.80117.89317.6675.84084.89156.25825.36112.15011.9816.25066.76947.53821.1718.10357.92827.26055.23241.33918.5416.26156.11048.110033.12327.82911.2834.61774.48018.130010.82210.0793.94601.44291.37329.75006.54106.29432.33790.93780.85859.7 87

PAGE 88

A BFigure4-2. Samplepartialautocorrelationsandsmoothedcurveswithineachbandofthemetaldata 88

PAGE 89

A BFigure4-3. Samplepartialautocorrelationsandsmoothedcurveswithineachbandoftherockdata 89

PAGE 90

A B C D E F G HFigure4-4. Intensityplotsforthemetalandrockdata 90

PAGE 91

CHAPTER5CONCLUSIONInthisdissertation,wedevelopednewmethodstomodelthedependencestructureinmultivariatedatabasedonre-parameterizingthecorrelationmatrixusingpartialautocorrelations.InChapter2,weproposedseveralpriorsforpartialautocorrelationcoefcients.Thesepriorscapturebothdecayingcorrelationaslagincreasesandplacemoreweightonpositivecorrelations,whichisappreciateforlongitudinaldata.WetransformedthepartialautocorrelationsusingFisher'sZ-transform,andintroducedageneralizedlinearmodelframeworktomodelpartialautocorrelation(correlation)structure.Modelingstructuredpartialautocorrelationsgreatlyreducesthenumberofparametersthatneedtobeestimated.InChapter3,weinvestigatedestimatingcorrelation(covariance)matricesinhighdimensions.Ourapproachwasbasedonbandingapartialautocorrelationmatrix,anddiscoveredthatalllagk0partialautocorrelationcoefcientsareindependenttransformedBetadistributionson(-1,1)withequalshapeandscaleparameters.ThisndingprovidedafoundationfordoingsequentialmultiplehypothesistestingtoestimatebandusingaBonferronicorrection.WealsoexploredtherelationshipoftypeIerrorwithsamplesizeanddimensionviasimulations.Ourestimationprocessismuchfasterthanmanyexistingmethodsandonlyrequiresinversionofmatriceswithdimensionequaltothenumberofbandsplus1.Weareworkingonprovingconsistencypropertiesofourestimatorofthecorrelationmatrix,andthetheoryunderlyingourtypeIerrorcorrection.InChapter4,weimprovedestimatorsofourbandedpartialautocorrelationmatrixestimatorsdiscussedinChapter3bynonparametricsmoothingthepartialautocorrelationfunctionswithineachband.Wederiveanasymptoticallyoptimalbandwidthforkernelestimationofapartialautocorrelationfunctionthatprovidesaplug-inbandwidthestimator.Ourtheoreticalderivationfortheerrorisbasedon 91

PAGE 92

equallyspaceddesignpointsandaToeplitzconditionforthetruepartialautocorrelationmatrices.OursimulationshowsthattheToeplitzconditionisnotoverlyrestrictive,whichmayduetothecorrelationsbetweenthepartialautocorrelationswithinabandexponentiallydecayingtozero.WearecurrentlyexploringwaystoweakentheToeplitzconditionforderivingtheoptimalbandwidth. 92

PAGE 93

APPENDIXASUPPORTINGMATERIALFORCHAPTER2 A.1SamplingAlgorithmWesample(,,)usinga(block)Gibbssampleralongwithdataaugmentationforthemissingresponses(straightforwardgiventhefulldataresponseismultivariatenormal).Wesamplethefullconditionalfortheparametersasfollows.Atiterationk0, 1. SampleYimissj((k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1),(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1),(k0)]TJ /F5 7.97 Tf 6.58 0 Td[(1),yiobs)bydataaugmentation.YimissN(xTimiss(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1)+(k0)]TJ /F5 7.97 Tf 6.58 0 Td[(1)i12((k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1)iobs))]TJ /F5 7.97 Tf 6.58 0 Td[(1(yiobs)]TJ /F4 11.955 Tf 11.95 0 Td[(xTiobs(k0)]TJ /F5 7.97 Tf 6.58 0 Td[(1)),(k0)]TJ /F5 7.97 Tf 6.58 0 Td[(1)imiss)]TJ /F6 11.955 Tf 11.96 0 Td[((k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1)i21((k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1)iobs))]TJ /F5 7.97 Tf 6.59 0 Td[(1(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1)i12)wherexTi=(xTiobs,xTimiss)T,and(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1)i= (k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1)iobs(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1)i12(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1)i21(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1)imiss!. 2. Samplej((k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1),(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1),y(k0))fromanormaldistributionwithmean(k0)]TJ /F5 7.97 Tf 6.58 0 Td[(1)=(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1)0nXi=1xiD)]TJ /F5 7.97 Tf 6.58 0 Td[(1i((k0)]TJ /F16 5.978 Tf 5.76 0 Td[(1))R)]TJ /F5 7.97 Tf 6.59 0 Td[(1i((k0)]TJ /F16 5.978 Tf 5.75 0 Td[(1))D)]TJ /F5 7.97 Tf 6.59 0 Td[(1i((k0)]TJ /F16 5.978 Tf 5.75 0 Td[(1))y(k0)iandvariance,(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1)0=(nXi=1xiD)]TJ /F5 7.97 Tf 6.58 0 Td[(1i((k0)]TJ /F16 5.978 Tf 5.76 0 Td[(1))R)]TJ /F5 7.97 Tf 6.59 0 Td[(1i((k0)]TJ /F16 5.978 Tf 5.75 0 Td[(1))D)]TJ /F5 7.97 Tf 6.59 0 Td[(1i((k0)]TJ /F16 5.978 Tf 5.75 0 Td[(1))xTi))]TJ /F5 7.97 Tf 6.59 0 Td[(1. 3. Samplej((k0),(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1),y(k0))usingarandomwalkMetropolis-Hastingsalgorithm.Thefullconditionalisproportionalto(j,,Y,X)/f(nYi=1jDi()j)]TJ /F5 7.97 Tf 6.59 0 Td[(1)jnXi=1xiD)]TJ /F5 7.97 Tf 6.59 0 Td[(1i()R)]TJ /F5 7.97 Tf 6.58 0 Td[(1i((k0)]TJ /F16 5.978 Tf 5.76 0 Td[(1))D)]TJ /F5 7.97 Tf 6.58 0 Td[(1i()xTij)]TJ /F16 5.978 Tf 7.79 3.25 Td[(1 2gexpf)]TJ /F6 11.955 Tf 16.47 8.09 Td[(1 2nXi=1(y(k0)i)]TJ /F4 11.955 Tf 11.96 0 Td[(xTi(k0))TD)]TJ /F5 7.97 Tf 6.59 0 Td[(1i()R)]TJ /F5 7.97 Tf 6.59 0 Td[(1i((k0)]TJ /F16 5.978 Tf 5.76 0 Td[(1))D)]TJ /F5 7.97 Tf 6.59 0 Td[(1i()(y(k0)i)]TJ /F4 11.955 Tf 11.95 0 Td[(xTi(k0))gexp()]TJ /F6 11.955 Tf 10.5 8.08 Td[(1 2T). 4. Samplej((k0),(k0),y(k0))usingaQuasi-NewtonMetropolis-HastingAlgorithm(detailsinthewebappendix).Thefullconditionalisproportionalto 93

PAGE 94

(j,,Y,X)/(Qni=1jRi()j)]TJ /F16 5.978 Tf 7.78 3.26 Td[(1 2)jPni=1xiD)]TJ /F5 7.97 Tf 6.59 0 Td[(1i((k0))R)]TJ /F5 7.97 Tf 6.59 0 Td[(1i()D)]TJ /F5 7.97 Tf 6.58 0 Td[(1i((k0))xTij)]TJ /F16 5.978 Tf 7.78 3.26 Td[(1 2expf)]TJ /F5 7.97 Tf 15.13 4.29 Td[(1 2Pni=1(y(k0)i)]TJ /F20 10.909 Tf 11.95 0 Td[(xTi(k0))TD)]TJ /F5 7.97 Tf 6.59 0 Td[(1i((k0))R)]TJ /F5 7.97 Tf 6.58 0 Td[(1i()D)]TJ /F5 7.97 Tf 6.58 0 Td[(1i((k0))(y(k0)i)]TJ /F20 10.909 Tf 11.95 0 Td[(xTi(k0))gexpf)]TJ /F5 7.97 Tf 15.14 4.29 Td[(1 2()]TJ /F22 10.909 Tf -390.24 -17 Td[()T)]TJ /F5 7.97 Tf 6.58 0 Td[(1()]TJ /F22 10.909 Tf 10.9 0 Td[()g. A.2SimulatingfromtheFullConditionalforForconvenience,wedenotelogfullconditional(j,,Y,X)by`?().Letdenotethemodeof`?()andassumeisinaneighborhoodof0.Then,@`?() @=@`?() @j=0+@2`?() @@Tj=0()]TJ /F3 11.955 Tf 11.96 0 Td[(0)+o(j)]TJ /F3 11.955 Tf 11.96 0 Td[(0j2)1q1.Therefore,0)]TJ /F6 11.955 Tf 11.95 0 Td[((@2`?() @@Tj=0))]TJ /F5 7.97 Tf 6.59 0 Td[(1@`?() @j=0.Wecanapproximatetheobservedinformationmatrix)]TJ /F12 7.97 Tf 10.49 5.48 Td[(@2`?() @@Tj=0bytheexpectedsherinformationmatrixat0.Usingaquasi-Newtonmethodtoupdate,=0+I(0))]TJ /F5 7.97 Tf 6.59 0 Td[(15@`?() @j=0,wechoosetosatisfyWolf'scondition( Wolfe ( 1969 )).Theformoftheexpectedinformation,I()isgiveninthefollowingsection.Atiterationk0,forStep4inouralgorithm,werstapproximatethemodelofl?(),((k0))by(k0)=(k0)]TJ /F5 7.97 Tf 6.58 0 Td[(1)+I((k0)]TJ /F5 7.97 Tf 6.58 0 Td[(1)))]TJ /F5 7.97 Tf 6.58 0 Td[(15@`?() @j=(k0)]TJ /F16 5.978 Tf 5.75 0 Td[(1).Thenwesample (k0)N((k0),I)]TJ /F5 7.97 Tf 6.59 0 Td[(1((k0))), (A) andaccept(k0)=(k0)withprobability p=minf1,((k0))j(k0),(k0),Y(k0),X) ((k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1))j(k0),(k0),Y(k0),X)h((k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1))j(k0),(k0),Y(k0),X) h((k0))j(k0),(k0),Y(k0),X)g, (A) whereh()isthepdfofEq. A A.3DerivingtheExpectedInformationMatrixforWederivetheexpectedinformationmatrixfor(andfor(,0)),where0=(11,...,pp)T.Todothis,werstdenesomeneededquantities.LetAbeann 94

PAGE 95

symmetricmatrixwithelementsfaijg,Bbeamnmatrixwithelementsfbijg,andCbeassmatrixwithelementsfcijgmatrices.Wedenevec(A)=a11a21an1a12an2annTtobeavectorincludingalltheelementsinthematrixsortedbycolumn.Wedenev(A)=a11a21an1a22an2annTtobeavectorincludingalltheelementsinthelowertriangularpartofasquarematrix(sortedbycolumn).Finally,wedenevh(A)=a21a31an1a32an2ann)]TJ /F5 7.97 Tf 6.59 0 Td[(1T,tobeavectorwithalltheelementsinthelowertriangularpartofasquarematrix(withoutthemaindiagonal).DenetheKroneckerproductoftwosquarematricesasBNC=(bijC).Now,deneamatrixDnsuchthatDnv(A)=vec(A).So,v(A)=Dynvec(A),whereDynistheMoore-PenroseinverseofDn,andLetY1,Y2,...,Ynbeindependentlyandidenticallydistributedp1randomvectorsuchthatYiNp(,)wherei=1,2,...,n,=(jk)ispositivedenite,andletnp+1.Theexpectedinformationmatrixforv()isFn=n 2DTp()]TJ /F5 7.97 Tf 6.59 0 Td[(1)]TJ /F5 7.97 Tf 6.58 0 Td[(1)Dp( MagnusandNeudecker ( 1984 ))ToderivetheexpectedFisherinformationof(,0),wespecifythefollowingseriesoftransformations, 95

PAGE 96

!!!!!z!(,?,0)Thetransformationsaredenedas=(v()),=(vh()T,T0)T,=(vh(R)T,T0)T,=(12,23,13,...,p)]TJ /F5 7.97 Tf 6.58 0 Td[(1p,...,1p,T0)T,=(12,23,13,...,p)]TJ /F5 7.97 Tf 6.59 0 Td[(1p,...,1p,T0)T,andz=(z12,z23,z13,...,zp)]TJ /F5 7.97 Tf 6.59 0 Td[(1p,...,z1p,0)T.DetailsontheJacobianofeachtransformationfollow. 1. g1(;):!separatesthediagonalandoff-diagonalelements.TheJacobinmatrixJofg1(;)isobtainedbyre-orderingthep(p+1) 2p(p+1) 2dimensionalidentitymatrixcorrespondingtothisre-orderingtransformation. 2. g2(;):!isa1-1transformationfromthecovarianceparameterstovariance/correlationparameters.TheJacobian,JisJ=J11J120 IppwhereJ11=diag(vh(0NT0)1 2)andJ12=0BBBBBBBBBBBBBBBBBBB@12p 1122 21112p 1122 2220...0013p 1133 211013p 1133 233...00......1pp 11pp 21100...01pp 11pp 2pp023p 2233 22223p 2233 233...00......02pp 22pp 2220...02pp 22pp 2pp......000...p)]TJ /F34 4.981 Tf 5.4 0 Td[(1pp p)]TJ /F34 4.981 Tf 5.39 0 Td[(1p)]TJ /F34 4.981 Tf 5.4 0 Td[(1pp 2p)]TJ /F34 4.981 Tf 5.39 0 Td[(1p)]TJ /F34 4.981 Tf 5.4 0 Td[(1p)]TJ /F34 4.981 Tf 5.4 0 Td[(1pp p)]TJ /F34 4.981 Tf 5.4 0 Td[(1p)]TJ /F34 4.981 Tf 5.4 0 Td[(1pp 2pp1CCCCCCCCCCCCCCCCCCCA 3. g3(;):!isa1-1transformationwhichchangestheorderoftheparameterstowithJacobian,J=J110 0 Ipp,whereJ11isamatrixobtainedbyreorderingidentitymatrixIp(p)]TJ /F16 5.978 Tf 5.75 0 Td[(1) 2p(p)]TJ /F16 5.978 Tf 5.75 0 Td[(1) 2correspondingtothereorderingfrom(12,...,1p,23,...,2p,...,p)]TJ /F5 7.97 Tf 6.59 0 Td[(1p),to(12,23,13,...,p)]TJ /F5 7.97 Tf 6.59 0 Td[(1p,...,1p). 4. g4(;):!isa1-1transformationdenedin(1)withjjunchanged.TheJacobianisJ=J110 0 Ipp 96

PAGE 97

whereJ11isJacobianmatrixoftransformationfrom(12,23,13,...,p)]TJ /F5 7.97 Tf 6.59 0 Td[(1p,...,1p)to(12,23,13,...,p)]TJ /F5 7.97 Tf 6.58 0 Td[(1p,...,1p),whichisalowertriangularmatrixwithelementsn@jk @lmoinposition((j)]TJ /F5 7.97 Tf 6.59 0 Td[(1)(2p)]TJ /F7 7.97 Tf 6.58 0 Td[(j) 2+k,(l)]TJ /F5 7.97 Tf 6.59 0 Td[(1)(2p)]TJ /F7 7.97 Tf 6.58 0 Td[(l) 2+m). 5. g5(;z):!zisa1-1transformationwhichtransformsthetoz().TheJacobianisJz=Jz110 0 IppwhereJz11=0BBBB@1)]TJ /F3 11.955 Tf 11.95 0 Td[(21200...001)]TJ /F3 11.955 Tf 11.95 0 Td[(2230...0001)]TJ /F3 11.955 Tf 11.96 0 Td[(213...0......000...1)]TJ /F3 11.955 Tf 11.96 0 Td[(21p1CCCCA. 6. g6(z;(,?,0)):z!(,?,0)isdenedinEq. 2 withjjunchanged,andtheJacobianJ(,?,0)=J(,?)1100Ipp,whereJ(,?)11=)]TJ /F4 11.955 Tf 10.46 -9.58 Td[(w?w?.Now,letI()denotetheexpectedinformationof.Sincetheinformationmatrixisinvariantundertransformation, I()=J?Tfn 2[DTT()]TJ /F5 7.97 Tf 6.59 0 Td[(1O)]TJ /F5 7.97 Tf 6.58 0 Td[(1)DT]gJ?=0B@I11I12I21I221CA. (A) Then,I()=JTI()J=0B@J11J120 Ipp1CAT0B@I11I12I21I221CA0B@J11J120 Ipp1CA=0B@JT11I11J11JT11I11J12+J11I12JT12I11J11+I21J11JT12I11J12+(I21J12+JT12I12)+I221CA 97

PAGE 98

=0B@I(0)I(00)IT(00)I(0)1CA.where0=(12,...,1p,23,...p)]TJ /F5 7.97 Tf 6.58 0 Td[(1p).ItthenfollowsthatI(,?,0)=(J(,?,0)TJzTJTJT)I()(JJJzJ(,?,0)) =0B@(J(,?)T11JzT11JT11JT11)I(0)(J11J11Jz11J(,?)11)J(,?)T11JzT11JT11JT11I(00)(J(,?)T11JzT11JT11JT11I(00))TI(0)1CA. (A) Sothethreeblocksoftheinformationmatrixfor(,0)are I()=(w?TJzT11JT11JT11)I(0)(J11J11Jz11w?) (A) I(,0)=w?TJzT11JT11JT11I(00) (A) I(0)=JT12I11J12+(I21J12+JT12I12)+I22. (A) A.4ProofofTheorem1 Proof. LetYkii=fYij,j=1,...,kiwhereki:Qiki=1,Qi,ki+1=0g,Sk=fi,Qiki=1andQiki+1=0,andki=k,where1kp)]TJ /F6 11.955 Tf 12.8 0 Td[(1;i=1,...,ngand=,=,=besamplespacesof,,,respectively.ThustheobserveddatadistributionofithsubjectisYkiiNk(xkiTi,kii),wherexkii=xi[:,1:ki]isapkisubmatrixofxiandkii=i[1:ki,1:ki]isakikiprincipalsubmatrixofi.Denetheobserveddata,Yobs=(Yk11,...,Yknn).Therefore,f(ykiij,,,xkii)/jkiij)]TJ /F16 5.978 Tf 7.79 3.25 Td[(1 2exp[)]TJ /F6 11.955 Tf 10.5 8.09 Td[(1 2(ykii)]TJ /F4 11.955 Tf 11.96 0 Td[(xkiTi)T(kii))]TJ /F5 7.97 Tf 6.59 0 Td[(1(ykii)]TJ /F4 11.955 Tf 11.95 0 Td[(xkiTi)],andf(yobsj,,,x)/nYi=1jkiij)]TJ /F16 5.978 Tf 7.78 3.26 Td[(1 2expf)]TJ /F6 11.955 Tf 16.47 8.08 Td[(1 2[nXi=1(ykii)]TJ /F4 11.955 Tf 11.96 0 Td[(xkiTi)T(kii))]TJ /F5 7.97 Tf 6.58 0 Td[(1(ykii)]TJ /F4 11.955 Tf 11.95 0 Td[(xkiTi)g 98

PAGE 99

=p)]TJ /F5 7.97 Tf 6.59 0 Td[(1Yk=1Yi2Skjkiij)]TJ /F16 5.978 Tf 7.79 3.26 Td[(1 2expf)]TJ /F6 11.955 Tf 16.47 8.09 Td[(1 2p)]TJ /F5 7.97 Tf 6.59 0 Td[(1Xk=1[Xi2Sk(yik)]TJ /F4 11.955 Tf 11.96 0 Td[(xkiTi)T(kii))]TJ /F5 7.97 Tf 6.58 0 Td[(1(ykii)]TJ /F4 11.955 Tf 11.95 0 Td[(xkiTi)]g=p)]TJ /F5 7.97 Tf 6.59 0 Td[(1Yk=1Yi2Skjkiij)]TJ /F16 5.978 Tf 7.79 3.25 Td[(1 2expf)]TJ /F6 11.955 Tf 16.47 8.09 Td[(1 2[p)]TJ /F5 7.97 Tf 6.58 0 Td[(1Xk=1Xi2Sk(ykiTi(kii))]TJ /F5 7.97 Tf 6.58 0 Td[(1ykii))]TJ /F6 11.955 Tf 9.3 0 Td[(2(p)]TJ /F5 7.97 Tf 6.58 0 Td[(1Xk=1Xi2SkykiTi(kii))]TJ /F5 7.97 Tf 6.58 0 Td[(1xkiTi)+T(p)]TJ /F5 7.97 Tf 6.58 0 Td[(1Xk=1Xi2Skxkii(kii))]TJ /F5 7.97 Tf 6.58 0 Td[(1xkiTi)]gTherefore,posteriordistributionof(,,)is(,,jyobs,x)=m(,,jyobs,x) R2=R2=R2=m(,,jyobs,x)ddd,wherem(,,jyobs,x)=p)]TJ /F5 7.97 Tf 6.59 0 Td[(1Yk=1Yi2Skj(kii)j)]TJ /F16 5.978 Tf 7.78 3.26 Td[(1 2expf)]TJ /F6 11.955 Tf 16.48 8.09 Td[(1 2[p)]TJ /F5 7.97 Tf 6.59 0 Td[(1Xk=1Xi2Sk(ykiTi(kii))]TJ /F5 7.97 Tf 6.59 0 Td[(1ykii))]TJ /F6 11.955 Tf 11.96 0 Td[(2(p)]TJ /F5 7.97 Tf 6.59 0 Td[(1Xk=1Xi2SkykiTi(kii))]TJ /F5 7.97 Tf 6.59 0 Td[(1xkiTi)+T(p)]TJ /F5 7.97 Tf 6.58 0 Td[(1Xk=1Xi2Skxkii(kii))]TJ /F5 7.97 Tf 6.58 0 Td[(1xkiTi)]g()().SincePxkiixkiTiisfullrank,Pp)]TJ /F5 7.97 Tf 6.59 0 Td[(1k=1Pi2Skxkii)]TJ /F5 7.97 Tf 6.59 0 Td[(1ikxkiTiisapositive-denitematrix.Therefore,itssmallesteigenvaluepislargerthanzeroandjPp)]TJ /F5 7.97 Tf 6.59 0 Td[(1k=1Pi2Skxkii(kii))]TJ /F5 7.97 Tf 6.58 0 Td[(1xkiTij)]TJ /F16 5.978 Tf 7.78 3.26 Td[(1 2<)]TJ /F17 5.978 Tf 7.78 3.53 Td[(p 2p.Dene^=(Pp)]TJ /F5 7.97 Tf 6.59 0 Td[(1k=1Pi2Skxkii(kii))]TJ /F5 7.97 Tf 6.59 0 Td[(1xkiTi))]TJ /F5 7.97 Tf 6.58 0 Td[(1Pp)]TJ /F5 7.97 Tf 6.59 0 Td[(1k=1Pi2Skxkii)]TJ /F5 7.97 Tf 6.59 0 Td[(1kykii.Weobtain,R=m(,,jyobs,x)d/jPp)]TJ /F5 7.97 Tf 6.58 0 Td[(1k=1Pi2Skxkii(kii))]TJ /F5 7.97 Tf 6.58 0 Td[(1xkiTij)]TJ /F16 5.978 Tf 7.78 3.25 Td[(1 2(Qp)]TJ /F5 7.97 Tf 6.58 0 Td[(1k=1Qi2Skj(kii)j)]TJ /F16 5.978 Tf 7.79 3.25 Td[(1 2)expf)]TJ /F5 7.97 Tf 16.47 4.71 Td[(1 2Pp)]TJ /F5 7.97 Tf 6.59 0 Td[(1k=1Pi2Sk(ykii)]TJ /F4 11.955 Tf 11.95 0 Td[(xkiTi^)T(kii))]TJ /F5 7.97 Tf 6.59 0 Td[(1(ykii)]TJ /F4 11.955 Tf 11.96 0 Td[(xkiTi^)g()())]TJ /F17 5.978 Tf 7.78 3.53 Td[(p 2p(Qp)]TJ /F5 7.97 Tf 6.59 0 Td[(1k=1Qi2Skj(kii)j)]TJ /F16 5.978 Tf 7.78 3.26 Td[(1 2)expf)]TJ /F5 7.97 Tf 16.47 4.71 Td[(1 2Pp)]TJ /F5 7.97 Tf 6.58 0 Td[(1k=1Pi2Sk(ykii)]TJ /F4 11.955 Tf 13.15 0 Td[(xkiTi^)T(kii))]TJ /F5 7.97 Tf 6.59 0 Td[(1(ykii)]TJ /F4 11.955 Tf -416.41 -23.91 Td[(xkiTi^)g()(). 99

PAGE 100

Now,deneMkii(^)=(ykii)]TJ /F4 11.955 Tf 12.84 0 Td[(xkiTi^)(ykii)]TJ /F4 11.955 Tf 12.85 0 Td[(xkiTi^)T.TheonlypositiveeigenvalueofMkii(^)is1(Mkii(^))=(ykii)]TJ /F4 11.955 Tf 12.55 0 Td[(xkiTi^)T(ykii)]TJ /F4 11.955 Tf 12.56 0 Td[(xkiTi^)>0( MarshallandOlkin ( 1979 )).Similarto Daniels ( 2006 ),weremovethedependenceofMkii(^)onkiibyboundingtheexponentialterm.Re-writetheexponentialtermintheaboveexpressionas:(ykii)]TJ /F4 11.955 Tf 11.96 0 Td[(xkiTi^)T(kii))]TJ /F5 7.97 Tf 6.58 0 Td[(1(ykii)]TJ /F4 11.955 Tf 11.95 0 Td[(xkiTi^)=trace[(kii))]TJ /F5 7.97 Tf 6.59 0 Td[(1Mkii(^)]kXt=1t()]TJ /F7 7.97 Tf 6.59 0 Td[(kii)k)]TJ /F7 7.97 Tf 6.58 0 Td[(t+1(Mkii(^))=k()]TJ /F7 7.97 Tf 6.59 0 Td[(kii)1(Mkii(^))wheret(),denedas1(A),2(A),...,p(A)aretheorderedeigenvaluesofappmatrixA.TherstinequalityisfromdimciteMarshallOlkin1979.Letmin,k=mini2Skfk()]TJ /F7 7.97 Tf 6.58 0 Td[(kii)g>0and(M)min,k=mini2Skf1(M)]TJ /F7 7.97 Tf 6.59 0 Td[(kii)g>0.Then,Xi2Sk(ykii)]TJ /F4 11.955 Tf 11.95 0 Td[(xkiTi^)T(kii))]TJ /F5 7.97 Tf 6.59 0 Td[(1(ykii)]TJ /F4 11.955 Tf 11.96 0 Td[(xkiTi^)Xi2Skk()]TJ /F7 7.97 Tf 6.59 0 Td[(kii)1(M)]TJ /F7 7.97 Tf 6.59 0 Td[(kii)skmin,k(M)min,k=tracefskmin,k kIk(M)min,kIkgwhereskdenotescardinalityofsetSkandIkisakkidentitymatrix.Finally,foreachi,simulatea'new'setofdata,y?kiifromamultivariatenormaldistributionsuchthatPi2Sky?kiiy?kiTi=(M)min,kIk.Wethenobtain(p)]TJ /F5 7.97 Tf 6.59 0 Td[(1Yk=1Yi2Skj()]TJ /F7 7.97 Tf 6.59 0 Td[(ki)j1 2)expf)]TJ /F6 11.955 Tf 16.47 8.08 Td[(1 2p)]TJ /F5 7.97 Tf 6.59 0 Td[(1Xk=1Xi2Sk(ykii)]TJ /F4 11.955 Tf 11.96 0 Td[(xkiTi^)T(kii))]TJ /F5 7.97 Tf 6.58 0 Td[(1(ykii)]TJ /F4 11.955 Tf 11.95 0 Td[(xkiTi^)g()() 100

PAGE 101

(p)]TJ /F5 7.97 Tf 6.59 0 Td[(1Yk=1Yi2Sk[kt=1t()]TJ /F7 7.97 Tf 6.59 0 Td[(kii)]1 2)expf)]TJ /F6 11.955 Tf 16.47 8.09 Td[(1 2p)]TJ /F5 7.97 Tf 6.58 0 Td[(1Xk=1Xi2Sky?kiTiskmin,k kIky?kiig()()=(p)]TJ /F5 7.97 Tf 6.59 0 Td[(1Yk=1Yi2Skkt=1[kt()]TJ /F7 7.97 Tf 6.59 0 Td[(kii) skmin,k]1 2[skmin,k k]1 2)expf)]TJ /F6 11.955 Tf 16.47 8.09 Td[(1 2p)]TJ /F5 7.97 Tf 6.59 0 Td[(1Xk=1Xi2Sky?kiTiskmin,k kIky?kiig()()=(p)]TJ /F5 7.97 Tf 6.59 0 Td[(1Yk=1Yi2Skkt=1[kt()]TJ /F7 7.97 Tf 6.59 0 Td[(kii) skmin,k]1 2)p)]TJ /F5 7.97 Tf 6.58 0 Td[(1Yk=1Yi2Skjskmin,k kIkj1 2expf)]TJ /F6 11.955 Tf 16.47 8.09 Td[(1 2p)]TJ /F5 7.97 Tf 6.59 0 Td[(1Xk=1Xi2Sky?kiTiskmin,k kIky?kiig()()M0whereM0isaniteconstantsinceallthreetermsabovearebounded.Therefore,R=m(,,jyobs,x)disnite.Sincethepriorson,areproperundertheassumptionthatPAiATiandPwiwTiarefullrank,wehaveZ2=Z2=Z2=m(,,jyobs,x)dddZ2=Z2=M0()()dd<1.Hence,theposteriorof(,,)isproper. 101

PAGE 102

APPENDIXBSUPPORTINGMATERIALFORCHAPTER3 B.1ProofofLemma 3.0.1 First,wepartitionRppasfollows,Rpp=0B@Rp)]TJ /F5 7.97 Tf 6.59 0 Td[(1p)]TJ /F5 7.97 Tf 6.59 0 Td[(1r1rT1p,p1CA,whereRp)]TJ /F5 7.97 Tf 6.59 0 Td[(1p)]TJ /F5 7.97 Tf 6.58 0 Td[(1=R[1:p)]TJ /F5 7.97 Tf 6.59 0 Td[(11:p)]TJ /F5 7.97 Tf 6.59 0 Td[(1]andr1=RT[1,1:p)]TJ /F5 7.97 Tf 6.58 0 Td[(1]=(1,p,rT2)T.ByFact1,det(Rpp)=det(Rp)]TJ /F5 7.97 Tf 6.58 0 Td[(1p)]TJ /F5 7.97 Tf 6.58 0 Td[(1)(1)]TJ /F4 11.955 Tf 11.96 0 Td[(rT1R)]TJ /F5 7.97 Tf 6.59 0 Td[(1p)]TJ /F5 7.97 Tf 6.59 0 Td[(1p)]TJ /F5 7.97 Tf 6.59 0 Td[(1r1).Furthermore,wecanpartitionRp)]TJ /F5 7.97 Tf 6.58 0 Td[(1p)]TJ /F5 7.97 Tf 6.58 0 Td[(1inthefollowingway,R(p)]TJ /F5 7.97 Tf 6.59 0 Td[(1)(p)]TJ /F5 7.97 Tf 6.59 0 Td[(1)=0B@1,1T22R21CA,where2=RTpp[1,2:p)]TJ /F6 11.955 Tf 11.95 0 Td[(1]andR2=Rpp[2:p)]TJ /F6 11.955 Tf 11.96 0 Td[(2,2:p)]TJ /F6 11.955 Tf 11.95 0 Td[(2].So,det(R(p)]TJ /F5 7.97 Tf 6.58 0 Td[(1)(p)]TJ /F5 7.97 Tf 6.59 0 Td[(1))=(1)]TJ /F3 11.955 Tf 11.95 0 Td[(T2R)]TJ /F5 7.97 Tf 6.59 0 Td[(122)det(R2).Now,letA112=1)]TJ /F3 11.955 Tf 11.96 0 Td[(T2R)]TJ /F5 7.97 Tf 6.59 0 Td[(122.Then,R)]TJ /F5 7.97 Tf 6.59 0 Td[(1(p)]TJ /F5 7.97 Tf 6.59 0 Td[(1)(p)]TJ /F5 7.97 Tf 6.59 0 Td[(1)=0B@A)]TJ /F5 7.97 Tf 6.59 0 Td[(1112)]TJ /F4 11.955 Tf 9.3 0 Td[(A)]TJ /F5 7.97 Tf 6.59 0 Td[(1112T2R)]TJ /F5 7.97 Tf 6.58 0 Td[(12)]TJ /F4 11.955 Tf 9.3 0 Td[(R)]TJ /F5 7.97 Tf 6.59 0 Td[(122A)]TJ /F5 7.97 Tf 6.59 0 Td[(1112R)]TJ /F5 7.97 Tf 6.59 0 Td[(122A)]TJ /F5 7.97 Tf 6.58 0 Td[(1112T2R)]TJ /F5 7.97 Tf 6.59 0 Td[(121CA+0B@000R)]TJ /F5 7.97 Tf 6.59 0 Td[(121CA =A)]TJ /F5 7.97 Tf 6.59 0 Td[(11120B@1)]TJ /F3 11.955 Tf 9.3 0 Td[(T2R)]TJ /F5 7.97 Tf 6.59 0 Td[(12)]TJ /F4 11.955 Tf 9.3 0 Td[(R)]TJ /F5 7.97 Tf 6.59 0 Td[(122R)]TJ /F5 7.97 Tf 6.59 0 Td[(122T2R)]TJ /F5 7.97 Tf 6.58 0 Td[(121CA+0B@000R)]TJ /F5 7.97 Tf 6.58 0 Td[(121CA. (B) WenowshowrT1R(p)]TJ /F5 7.97 Tf 6.59 0 Td[(1)(p)]TJ /F5 7.97 Tf 6.58 0 Td[(1)r1=1,prT2264A)]TJ /F5 7.97 Tf 6.59 0 Td[(11120B@1)]TJ /F3 11.955 Tf 9.3 0 Td[(T2R)]TJ /F5 7.97 Tf 6.58 0 Td[(12)]TJ /F4 11.955 Tf 9.3 0 Td[(R)]TJ /F5 7.97 Tf 6.59 0 Td[(122R)]TJ /F5 7.97 Tf 6.59 0 Td[(122T2R)]TJ /F5 7.97 Tf 6.59 0 Td[(121CA+0B@000R)]TJ /F5 7.97 Tf 6.58 0 Td[(121CA3750B@1,pr21CA=A)]TJ /F5 7.97 Tf 6.59 0 Td[(1112[21,p)]TJ /F6 11.955 Tf 11.95 0 Td[(21,prT2R)]TJ /F5 7.97 Tf 6.59 0 Td[(122+(rT2R)]TJ /F5 7.97 Tf 6.59 0 Td[(122)2]+r2R)]TJ /F5 7.97 Tf 6.59 0 Td[(12rT2 102

PAGE 103

=A)]TJ /F5 7.97 Tf 6.59 0 Td[(1112(21,p)]TJ /F3 11.955 Tf 11.96 0 Td[(T2R)]TJ /F5 7.97 Tf 6.58 0 Td[(12r2)2+rT2R)]TJ /F5 7.97 Tf 6.59 0 Td[(12r2.Therefore,det(Rpp)=(1)]TJ /F3 11.955 Tf 11.96 0 Td[(T2R)]TJ /F5 7.97 Tf 6.58 0 Td[(122)det(R2)f1)]TJ /F4 11.955 Tf 11.95 0 Td[(rT2R)]TJ /F5 7.97 Tf 6.59 0 Td[(12r2)]TJ /F4 11.955 Tf 11.95 0 Td[(A)]TJ /F5 7.97 Tf 6.59 0 Td[(1112(21,p)]TJ /F3 11.955 Tf 11.96 0 Td[(T2R)]TJ /F5 7.97 Tf 6.59 0 Td[(12r2)2g=(1)]TJ /F3 11.955 Tf 11.96 .01 Td[(T2R)]TJ /F5 7.97 Tf 6.58 0 Td[(12r2)(1)]TJ /F4 11.955 Tf 11.95 -.01 Td[(rT2R)]TJ /F5 7.97 Tf 6.58 0 Td[(12r2)det(R2)241)]TJ /F13 11.955 Tf 11.95 20.45 Td[( 21,p)]TJ /F3 11.955 Tf 11.96 0 Td[(2R)]TJ /F5 7.97 Tf 6.58 0 Td[(12r2 p (1)]TJ /F3 11.955 Tf 11.96 0 Td[(T2R)]TJ /F5 7.97 Tf 6.59 0 Td[(122)(1)]TJ /F4 11.955 Tf 11.96 0 Td[(rT2R)]TJ /F5 7.97 Tf 6.59 0 Td[(12r2)!235=(1)]TJ /F3 11.955 Tf 11.96 0 Td[(T2R)]TJ /F5 7.97 Tf 6.58 0 Td[(122)(1)]TJ /F4 11.955 Tf 11.95 0 Td[(rT2R)]TJ /F5 7.97 Tf 6.58 0 Td[(12r2)det(R2)(1)]TJ /F3 11.955 Tf 11.95 0 Td[(21,p). B.2PreliminariesandNotationforProofofTheorem 3.1 BeforeprovingTheorem 3.1 ,weintroducesomenotationandderivesomepreliminaryresults.LetfYi:i=1,...,ngbep1vectorsofindependent,normallydistributedrandomvariableswithmean0andcovariancematrix.ThemleofisS=1 nPni=1YiY0i.ApplyingthetransformationXi=TYi,whereT=diag(s)]TJ /F16 5.978 Tf 7.78 3.26 Td[(1 2jj,),wedeneS=1 nnXi=1XiX0i=(s(j,k)).Thelikelihoodforjk,G(j,k)isproportionaltoG(j,k)/(1)]TJ /F3 11.955 Tf 11.95 0 Td[(2j,k))]TJ /F17 5.978 Tf 7.78 3.26 Td[(n 2expf)]TJ /F4 11.955 Tf 16.47 8.09 Td[(n 2tr(R)]TJ /F5 7.97 Tf 6.59 0 Td[(1[j:k]S[j:k])g.Thecorrespondingloglikelihoodcanbere-writtenaslogG(j,k)=g(j,k)/)]TJ /F4 11.955 Tf 23.11 8.09 Td[(n 2log(1)]TJ /F3 11.955 Tf 11.95 0 Td[(2j,k))]TJ /F4 11.955 Tf 46.69 8.09 Td[(n 2det(R[j:k])tr(A[j:k]S[j:k])whereA[j:k]istheadjointmatrixofR[j:k]andR)]TJ /F5 7.97 Tf 6.59 0 Td[(1[j:k]=1 det(R[j:k])A[j:k].SoA[j:k]isaquadraticfunctionofj,k,i.e.A[j:k]=A0+A1j,k+A22j,k,whereA0,A1,A2are(k)]TJ /F4 11.955 Tf 11.95 0 Td[(j+1)(k)]TJ /F4 11.955 Tf 11.95 0 Td[(j+1)matrices.Therstderivativeoftheloglikelihoodforj,kis@g(j,k) @j,k 103

PAGE 104

=)]TJ /F4 11.955 Tf 10.49 8.09 Td[(n 2)]TJ /F6 11.955 Tf 9.3 0 Td[(2j,k 1)]TJ /F3 11.955 Tf 11.95 0 Td[(2j,k)]TJ /F4 11.955 Tf 42.56 8.09 Td[(n 2a(1)]TJ /F3 11.955 Tf 11.95 0 Td[(2j,k)2fDjk(1)]TJ /F3 11.955 Tf 11.95 0 Td[(2j,k)[tr(A1S[j:k])+2j,ktr(A2S[j,k])]+2j,ktr(A[j:k]S[j:k])g=)]TJ /F4 11.955 Tf 9.3 0 Td[(n 2a(1)]TJ /F3 11.955 Tf 11.96 0 Td[(2j,k)2f)]TJ /F6 11.955 Tf 15.28 0 Td[(2aj,k(1)]TJ /F3 11.955 Tf 11.95 0 Td[(2j,k)+Djk(1)]TJ /F3 11.955 Tf 11.96 0 Td[(2j,k)[tr(A1S[j:k])+2j,ktr(A2S[j:k])]+2j,k[tr(A0S[j:k])+j,ktr(A1S[j:k])+2j,ktr(A2S[j:k])]g.wherea=det(R[j:k])=(1)]TJ /F3 11.955 Tf 11.95 0 Td[(2j,k),whichisnotafunctionofj,k(cf:Result3).BasedonthefollowingpartitionforRpp,Rpp=0BBBB@1T21,p2R2r2p,1rT211CCCCA,whereR2=Rpp[2:p)]TJ /F6 11.955 Tf 12.18 0 Td[(1,2:p)]TJ /F6 11.955 Tf 12.18 0 Td[(1],2=RTpp[1,2:p)]TJ /F6 11.955 Tf 12.18 0 Td[(1],r2=Rpp[2:p)]TJ /F6 11.955 Tf 12.18 0 Td[(1,p],weobtainA1=0BBBB@0rT2R)]TJ /F5 7.97 Tf 6.59 0 Td[(12det(R2))]TJ /F6 11.955 Tf 11.29 0 Td[(det(R2)R)]TJ /F5 7.97 Tf 6.59 0 Td[(12r2det(R2)BR)]TJ /F5 7.97 Tf 6.59 0 Td[(122det(R2))]TJ /F6 11.955 Tf 11.29 0 Td[(det(R2)T2R)]TJ /F5 7.97 Tf 6.59 0 Td[(12det(R2)01CCCCA,whereB(i)]TJ /F5 7.97 Tf 6.59 0 Td[(1,j)]TJ /F5 7.97 Tf 6.59 0 Td[(1)=f()]TJ /F6 11.955 Tf 9.3 0 Td[(1)i+jT2(i)M)]TJ /F5 7.97 Tf 6.59 0 Td[(1ijr2(j)+()]TJ /F6 11.955 Tf 9.3 0 Td[(1)i+jT2(j)M)]TJ /F5 7.97 Tf 6.59 0 Td[(1ijr2(i)gdet(Mij),i,j2f2,3,...,pg,MijistheadjointmatrixofR2(ji)(note:R2(ji)isthesubmatrixofR2obtainedbydeletingthej-throwandi-thcolumn),2(i)isthematrix2withthei-thelementdeleted,r2(j)isthematrixr2withthej-thelementdeletedandA2=)]TJ /F13 11.955 Tf 11.29 38.38 Td[(0BBBB@0000R)]TJ /F5 7.97 Tf 6.59 0 Td[(12det(R2)00001CCCCA. 104

PAGE 105

B.3ProofofTheorem 3.1 Assume^=(^1,2,^2,3,...,^p)]TJ /F5 7.97 Tf 6.59 0 Td[(1,p,...,^1,p)]TJ /F5 7.97 Tf 6.58 0 Td[(1,^2,p,^1,p)ismleofmultivariatenormallikelihoodfunction.Wearegoingtoproveitisequalto~inEq. 3 .Let^=(^1,2,^2,3,...,^p)]TJ /F5 7.97 Tf 6.59 0 Td[(1,p,...,^1,p)]TJ /F5 7.97 Tf 6.58 0 Td[(1,^2,p,^1,p)bethecorrespondingmleofthecorrelationcoefcients.SincetheMLEofi,jissi,j,underthemultivariatenormallikelihood,weonlyneedtoprovethatsijistheestimatorofi,j,~ijfromsequentiallymaximizingtheobjectivefunctionsinEq. 3 .Wedothisbyinduction.1)Fori=1,thelag1estimatoroft,t+i,t=1,2,...,p,isobviouslyst,t+i.2)Supposeitistruefori=k,k2f2,3,...,p)]TJ /F6 11.955 Tf 12.62 0 Td[(1g.Thenwehaveobjectivefunctionestimator~t,t+k=st,t+kfort=1,2,...,p)]TJ /F4 11.955 Tf 11.96 0 Td[(k,andt
PAGE 106

0BBBBBBBBBBBBBBBBBBBB@1s(1,2)s(1,3)..s(1,k)]TJ /F5 7.97 Tf 6.59 0 Td[(1)s(1,k)?s(2,1)1s(2,3)..s(2,k)]TJ /F5 7.97 Tf 6.59 0 Td[(1)s(2,k)s(2,k+1)s(3,1)s(3,2)1..s(3,k)]TJ /F5 7.97 Tf 6.59 0 Td[(1)s(3,k)s(3,k+1)..............s(k)]TJ /F5 7.97 Tf 6.59 0 Td[(1,1)s(k)]TJ /F5 7.97 Tf 6.59 0 Td[(1,2)s(k)]TJ /F5 7.97 Tf 6.58 0 Td[(1,3)..1s(k)]TJ /F5 7.97 Tf 6.58 0 Td[(1,k)s(k)]TJ /F5 7.97 Tf 6.59 0 Td[(1,k+1)s(k,1)s(k,2)s(k,3)..s(k,k)]TJ /F5 7.97 Tf 6.59 0 Td[(1)1s(k,k+1)?s(k+1,2)s(k+1,3)..s(k+1,k)]TJ /F5 7.97 Tf 6.59 0 Td[(1)s(k+1,k)11CCCCCCCCCCCCCCCCCCCCA. (B) Pluggingb1,k+1intoEq. B ,~R[1:k+1]=S1k+1.Now,wesimplifytermsinEq. B.3 ,tr(A1S)=4bT2bR)]TJ /F5 7.97 Tf 6.59 0 Td[(12br2det(bR2))]TJ /F6 11.955 Tf 11.96 0 Td[(2s(1,k+1)det(bR2)+2(k)]TJ /F6 11.955 Tf 11.95 0 Td[(2)bT2bR)]TJ /F5 7.97 Tf 6.59 0 Td[(12br2det(bR2)=2[)]TJ /F4 11.955 Tf 9.3 0 Td[(s(1,k+1)+kbT2bR)]TJ /F5 7.97 Tf 6.58 0 Td[(12br2]det(bR2)tr(A2S)=)]TJ /F6 11.955 Tf 11.29 0 Td[(det(bR2)tr(bR)]TJ /F5 7.97 Tf 6.58 0 Td[(12S2)=)]TJ /F6 11.955 Tf 9.3 0 Td[((k)]TJ /F6 11.955 Tf 11.95 0 Td[(1)det(bR2).andtr(A1S1k+1)+2(1,k+1)tr(A2S1k+1)=2[)]TJ /F4 11.955 Tf 9.3 0 Td[(s(1,k+1)+kbT2bR)]TJ /F5 7.97 Tf 6.59 0 Td[(12r2]det(bR2))]TJ /F6 11.955 Tf 11.95 0 Td[(2(k)]TJ /F6 11.955 Tf 11.96 0 Td[(1)s(1,k+1)det(bR2)=)]TJ /F6 11.955 Tf 9.3 0 Td[(2k[s(1,k+1))]TJ /F13 11.955 Tf 12.23 0 Td[(bT2bR)]TJ /F5 7.97 Tf 6.59 0 Td[(12r2]detbR2tr(A[1:k+1]S1k+1)=(k+1)det(bR[1:k+1])=(k+1)ba(1)]TJ /F13 11.955 Tf 12.17 0 Td[(b21,k+1)D21k+1=[1)]TJ /F13 11.955 Tf 12.22 0 Td[(bT2bR)]TJ /F5 7.97 Tf 6.59 0 Td[(12b2][1)]TJ /F13 11.955 Tf 11.39 .5 Td[(brT2bR)]TJ /F5 7.97 Tf 6.58 0 Td[(12br2].So,@g(t,t+i) @t,t+i=)]TJ /F6 11.955 Tf 9.3 0 Td[(2b1,k+1(1)]TJ /F13 11.955 Tf 12.16 0 Td[(b21,1+k)+1 ba[bD1k+1[tr(bA1S1k+1)+2b1,k+1tr(bA2S1k+1)]+2b1,k+1tr(bR)]TJ /F5 7.97 Tf 6.59 0 Td[(1[1:k+1]S1k+1)g=(1)]TJ /F13 11.955 Tf 12.17 0 Td[(b21,1+k)f)]TJ /F6 11.955 Tf 21.26 0 Td[(2b1,k+1 106

PAGE 107

+)]TJ /F6 11.955 Tf 9.3 0 Td[(2bD1k+1det(bR2) ba[s(1,k+1))]TJ /F6 11.955 Tf 11.95 0 Td[(2kbT2bR)]TJ /F5 7.97 Tf 6.58 0 Td[(12br2+2(k)]TJ /F6 11.955 Tf 11.96 0 Td[(2)s(1,k+1)]+2(k+1)b1,1+kg=(1)]TJ /F13 11.955 Tf 12.17 0 Td[(b21,1+k)f2kb1,k+1+)]TJ /F6 11.955 Tf 9.3 0 Td[(2kbD21k+1det(bR2) bas(1,k+1))]TJ /F13 11.955 Tf 12.23 0 Td[(bT2bR)]TJ /F5 7.97 Tf 6.59 0 Td[(12br2 q [1)]TJ /F13 11.955 Tf 12.22 0 Td[(bT2bR)]TJ /F5 7.97 Tf 6.59 0 Td[(12b2][1)]TJ /F13 11.955 Tf 11.39 .5 Td[(brT2(bR)]TJ /F5 7.97 Tf 6.59 0 Td[(12br2]g=(1)]TJ /F13 11.955 Tf 12.17 0 Td[(b21,1+k)f2kb1,k+1)]TJ /F6 11.955 Tf 11.95 0 Td[(2kb1,k+1g=0.Similarly,wecanprovethatitistrueforfi,k+1+i,i2f2,3,...p)]TJ /F4 11.955 Tf 12.22 0 Td[(k)]TJ /F6 11.955 Tf 12.22 0 Td[(1gg.Therefore,byinductionitistrue.Nowdenee1,p=(e1,2,e2,3,...ep)]TJ /F5 7.97 Tf 6.58 0 Td[(1,p,...,e1,p)]TJ /F5 7.97 Tf 6.59 0 Td[(1,e2,p,e1p)asinEq. 3 (maximizeroftheobjectivefunction).Wearegoingtoproveitisalsomleofthelikelihoodfunction.Lete1,p=(e1,2,e2,3,...,ep)]TJ /F5 7.97 Tf 6.59 0 Td[(1,p,...,e1,p)]TJ /F5 7.97 Tf 6.59 0 Td[(1,e2,p,e1,p)bethecorrespondingestimatorsofthecorrelationcoefcients.Weagainuseaninductionargument.1)Fork=1,ej,j+1,themaximizerofG(j,j+1),isalsothemleofthecorrespondingcorrelationcoefcientsj,j+1sincej,j+1=j,j+1.2)assumeitisalsotruefork=t2f1,....p)]TJ /F6 11.955 Tf 11.96 0 Td[(2g.Thatisej,j+t=(ej,j+1,ej+1,j+2,...ej+t)]TJ /F5 7.97 Tf 6.59 0 Td[(1,j+t,...,ej,j+t)]TJ /F5 7.97 Tf 6.59 0 Td[(1,ej+1,j+t,ej,j+t)isthemaximizerofthemultivariatenormallikelihoodwithj=1,2,...,p)]TJ /F4 11.955 Tf 11.96 0 Td[(t.Therefore,thecorrespondingcorrelationcoefcientestimatorsej,j+t=(ej,j+1,ej+1,j+2,...,ej+t)]TJ /F5 7.97 Tf 6.58 0 Td[(1,j+t,...,ej,j+t)]TJ /F5 7.97 Tf 6.59 0 Td[(1,ej+1,j+t,ej,j+t)arethemaximizersofthemultivariatenormallikelihoodonthescale,ej,j+t=^j,j+t.3)fork=t+1,let^j,j+t+1nj+t+1=f^j,j+1,...,^j+t,j+t+1,...,^j,j+t,^j+1,j+t+1,^j,j+t+1g,and^Pj,j+t+1nj+t+1=f^j,j+1,...,^j+t,j+t+1,...,^j,j+t,^j+1,j+t+1,^j,j+t+1g. 107

PAGE 108

Sinceej,j+t+1=(ej,j+1,ej+1,j+2,...,ej+t,j+t+1,...,ej,j+t,ej+1,j+t+1,ej,j+t+1)=(^j,j+1,^j+1,j+2,...,^j+t,j+t+1,...,^j,j+t,^j+1,j+t+1,ej,j+t+1)=f^Pj,j+t+1nj+t+1,ej,j+t+1gand^j,j+t+1=(^j,j+1,^j+1,j+2,...,^j+t,j+t+1,...,^j,j+t,^j+1,j+t+1,^j,j+t+1)=f^Pj,j+t+1nj+t+1,^j,j+t+1gisthemaximizerofL(j,j+t+1).Moreover,ejj+t+1=maxargfG(j,j+t+1)g=maxargfL(jj+t+1j^j,j+t+1nj+t+1)g=^j,j+t+1.Therefore,ej,j+t+1=^j,j+t+i.Correspondingly,ejj+t+1=maxargfG(j,j+t+1)g=maxargfL(jj+t+1j^j,j+t+1nj+t+1)g=^j,j+t+1.Hence,byinduction,e1pismleofthelikelihood. 108

PAGE 109

ProofofResult4.First,letR=(j,k)ppbeacorrelationmatrixanddenoteRi=R[i:p)]TJ /F5 7.97 Tf 6.59 0 Td[(1,i:p)]TJ /F5 7.97 Tf 6.59 0 Td[(1],i=RT[1,i:p)]TJ /F5 7.97 Tf 6.59 0 Td[(1],andri=R[i:p)]TJ /F5 7.97 Tf 6.58 0 Td[(1,p].WepartitionRasR=0BBBBBBBBBB@11,2...1,p)]TJ /F5 7.97 Tf 6.58 0 Td[(11,p2,11...2,p)]TJ /F5 7.97 Tf 6.58 0 Td[(12,p......p)]TJ /F5 7.97 Tf 6.59 0 Td[(1,1p)]TJ /F5 7.97 Tf 6.59 0 Td[(1,2...1p)]TJ /F5 7.97 Tf 6.58 0 Td[(1,pp,1p,2...p,p)]TJ /F5 7.97 Tf 6.59 0 Td[(111CCCCCCCCCCA=0B@R1r1rT11111CA,whererT1=1,p2,p...p)]TJ /F5 7.97 Tf 6.59 0 Td[(1,pT=1,prT2T.Thus,byResult1R)]TJ /F5 7.97 Tf 6.58 0 Td[(1=A)]TJ /F5 7.97 Tf 6.59 0 Td[(12210B@R)]TJ /F5 7.97 Tf 6.58 0 Td[(11r1rT1R)]TJ /F5 7.97 Tf 6.59 0 Td[(11)]TJ /F4 11.955 Tf 9.3 0 Td[(R)]TJ /F5 7.97 Tf 6.58 0 Td[(11r1)]TJ /F4 11.955 Tf 9.3 0 Td[(rT1R)]TJ /F5 7.97 Tf 6.59 0 Td[(111111CA+0B@R10(p)]TJ /F5 7.97 Tf 6.59 0 Td[(1)101(p)]TJ /F5 7.97 Tf 6.58 0 Td[(1)0111CA,whereA221=1)]TJ /F4 11.955 Tf 11.96 0 Td[(rT1R)]TJ /F5 7.97 Tf 6.59 0 Td[(11r1=Qp)]TJ /F5 7.97 Tf 6.59 0 Td[(1j=1(1)]TJ /F3 11.955 Tf 11.96 0 Td[(2j,p),anddet(R)=A221det(R1).Furthermore,partitionR1asR1=0B@111T22R21CA.WecanthenshowthatR1=A)]TJ /F5 7.97 Tf 6.59 0 Td[(11120B@111)]TJ /F3 11.955 Tf 9.3 0 Td[(T2R)]TJ /F5 7.97 Tf 6.59 0 Td[(12)]TJ /F4 11.955 Tf 9.3 0 Td[(R)]TJ /F5 7.97 Tf 6.59 0 Td[(122R)]TJ /F5 7.97 Tf 6.59 0 Td[(122T2R)]TJ /F5 7.97 Tf 6.59 0 Td[(121CA+0B@01101(p)]TJ /F5 7.97 Tf 6.59 0 Td[(2)0(p)]TJ /F5 7.97 Tf 6.59 0 Td[(2)1R)]TJ /F5 7.97 Tf 6.58 0 Td[(121CA,R1r1=1,p)]TJ /F4 11.955 Tf 11.95 0 Td[(rT2R)]TJ /F5 7.97 Tf 6.58 0 Td[(12r2 1)]TJ /F3 11.955 Tf 11.95 0 Td[(T2R)]TJ /F5 7.97 Tf 6.58 0 Td[(1220B@11(p)]TJ /F5 7.97 Tf 6.59 0 Td[(2))]TJ /F4 11.955 Tf 9.3 0 Td[(R)]TJ /F5 7.97 Tf 6.59 0 Td[(1221CA+0B@01(p)]TJ /F5 7.97 Tf 6.59 0 Td[(2)R)]TJ /F5 7.97 Tf 6.58 0 Td[(12r21CA 109

PAGE 110

=1,ps 1)]TJ /F4 11.955 Tf 11.96 0 Td[(rT2R)]TJ /F5 7.97 Tf 6.59 0 Td[(12r2 1)]TJ /F3 11.955 Tf 11.96 0 Td[(T2R)]TJ /F5 7.97 Tf 6.59 0 Td[(1220B@11(p)]TJ /F5 7.97 Tf 6.58 0 Td[(2))]TJ /F4 11.955 Tf 9.3 0 Td[(R)]TJ /F5 7.97 Tf 6.58 0 Td[(1221CA+0B@01(p)]TJ /F5 7.97 Tf 6.59 0 Td[(2)R)]TJ /F5 7.97 Tf 6.58 0 Td[(12r21CAR)]TJ /F5 7.97 Tf 6.59 0 Td[(11r1rT1R)]TJ /F5 7.97 Tf 6.58 0 Td[(11=21,p1)]TJ /F4 11.955 Tf 11.96 0 Td[(rT2R)]TJ /F5 7.97 Tf 6.59 0 Td[(12r2 1)]TJ /F3 11.955 Tf 11.96 0 Td[(T2R)]TJ /F5 7.97 Tf 6.59 0 Td[(1220B@111)]TJ /F3 11.955 Tf 9.3 0 Td[(T2R)]TJ /F5 7.97 Tf 6.58 0 Td[(12)]TJ /F4 11.955 Tf 9.3 0 Td[(R)]TJ /F5 7.97 Tf 6.58 0 Td[(122R)]TJ /F5 7.97 Tf 6.58 0 Td[(122T2R)]TJ /F5 7.97 Tf 6.59 0 Td[(121CA+1,ps 1)]TJ /F4 11.955 Tf 11.95 0 Td[(rT2R)]TJ /F5 7.97 Tf 6.59 0 Td[(12r2 1)]TJ /F3 11.955 Tf 11.95 0 Td[(T2R)]TJ /F5 7.97 Tf 6.59 0 Td[(1220B@011rT2R)]TJ /F5 7.97 Tf 6.59 0 Td[(12R)]TJ /F5 7.97 Tf 6.59 0 Td[(12r2)]TJ /F4 11.955 Tf 9.3 0 Td[(R)]TJ /F5 7.97 Tf 6.59 0 Td[(12(2rT2+r2T2)R)]TJ /F5 7.97 Tf 6.58 0 Td[(121CA+0B@01101(p)]TJ /F5 7.97 Tf 6.59 0 Td[(2)0(p)]TJ /F5 7.97 Tf 6.58 0 Td[(2)1R)]TJ /F5 7.97 Tf 6.59 0 Td[(12r2rT2R)]TJ /F5 7.97 Tf 6.58 0 Td[(121CA.Therefore,0B@R)]TJ /F5 7.97 Tf 6.59 0 Td[(11r1rT1R)]TJ /F5 7.97 Tf 6.58 0 Td[(11)]TJ /F4 11.955 Tf 9.29 0 Td[(R)]TJ /F5 7.97 Tf 6.59 0 Td[(11r1)]TJ /F4 11.955 Tf 9.3 0 Td[(rT1R)]TJ /F5 7.97 Tf 6.58 0 Td[(111111CA=21,pQp)]TJ /F16 5.978 Tf 5.75 0 Td[(1j=2(1)]TJ /F12 7.97 Tf 6.58 0 Td[(2j,p) Qp)]TJ /F16 5.978 Tf 5.75 0 Td[(1k=2(1)]TJ /F12 7.97 Tf 6.59 0 Td[(21,k)0BBBB@111)]TJ /F3 11.955 Tf 9.3 0 Td[(T2R)]TJ /F5 7.97 Tf 6.59 0 Td[(12011)]TJ /F4 11.955 Tf 9.3 0 Td[(R)]TJ /F5 7.97 Tf 6.59 0 Td[(122R)]TJ /F5 7.97 Tf 6.59 0 Td[(122T2R)]TJ /F5 7.97 Tf 6.58 0 Td[(120(p)]TJ /F5 7.97 Tf 6.58 0 Td[(2)101101(p)]TJ /F5 7.97 Tf 6.59 0 Td[(2)0111CCCCA+1,pr Qp)]TJ /F16 5.978 Tf 5.76 0 Td[(1j=2(1)]TJ /F12 7.97 Tf 6.59 0 Td[(2j,p) Qp)]TJ /F16 5.978 Tf 5.76 0 Td[(1k=2(1)]TJ /F12 7.97 Tf 6.59 0 Td[(21,k)0BBBB@011rT2R)]TJ /F5 7.97 Tf 6.59 0 Td[(12)]TJ /F6 11.955 Tf 9.3 0 Td[(111R)]TJ /F5 7.97 Tf 6.59 0 Td[(12r2)]TJ /F4 11.955 Tf 9.3 0 Td[(R)]TJ /F5 7.97 Tf 6.59 0 Td[(12(2rT2+r2T2)R)]TJ /F5 7.97 Tf 6.59 0 Td[(12R)]TJ /F5 7.97 Tf 6.59 0 Td[(122)]TJ /F6 11.955 Tf 9.3 0 Td[(111T2R)]TJ /F5 7.97 Tf 6.58 0 Td[(10111CCCCA+0BBBB@01101(p)]TJ /F5 7.97 Tf 6.59 0 Td[(2)0110(p)]TJ /F5 7.97 Tf 6.59 0 Td[(2)1R)]TJ /F5 7.97 Tf 6.58 0 Td[(12r2rT2R)]TJ /F5 7.97 Tf 6.59 0 Td[(12)]TJ /F4 11.955 Tf 9.29 0 Td[(R)]TJ /F5 7.97 Tf 6.58 0 Td[(12r2011)]TJ /F4 11.955 Tf 9.3 0 Td[(rT2R)]TJ /F5 7.97 Tf 6.59 0 Td[(121111CCCCA, 110

PAGE 111

sincethepartialautocorrelationmatrix=(j,k)pphask0bands,j,j+k=0fork>k0.Hence,0B@R)]TJ /F5 7.97 Tf 6.59 0 Td[(11r1rT1R)]TJ /F5 7.97 Tf 6.59 0 Td[(11)]TJ /F4 11.955 Tf 9.3 0 Td[(R)]TJ /F5 7.97 Tf 6.59 0 Td[(11r1)]TJ /F4 11.955 Tf 9.3 0 Td[(rT1R)]TJ /F5 7.97 Tf 6.59 0 Td[(1111CA=0BBBB@01101(p)]TJ /F5 7.97 Tf 6.59 0 Td[(2)0110(p)]TJ /F5 7.97 Tf 6.59 0 Td[(2)1R)]TJ /F5 7.97 Tf 6.59 0 Td[(12r2rT2R)]TJ /F5 7.97 Tf 6.59 0 Td[(12)]TJ /F4 11.955 Tf 9.3 0 Td[(R)]TJ /F5 7.97 Tf 6.59 0 Td[(12r2011)]TJ /F4 11.955 Tf 9.3 0 Td[(rT2R)]TJ /F5 7.97 Tf 6.59 0 Td[(121111CCCCA,(forp)]TJ /F6 11.955 Tf 11.95 0 Td[(1>k0).Similarly,forp)]TJ /F4 11.955 Tf 11.96 0 Td[(i>k0,weobtain0B@R)]TJ /F5 7.97 Tf 6.59 0 Td[(1irirTiR)]TJ /F5 7.97 Tf 6.58 0 Td[(1i)]TJ /F4 11.955 Tf 9.29 0 Td[(R)]TJ /F5 7.97 Tf 6.59 0 Td[(1iri)]TJ /F4 11.955 Tf 9.3 0 Td[(rTiR)]TJ /F5 7.97 Tf 6.59 0 Td[(1i1111CA=0BBBB@01101(p)]TJ /F7 7.97 Tf 6.58 0 Td[(i)]TJ /F5 7.97 Tf 6.59 0 Td[(1)01101(p)]TJ /F7 7.97 Tf 6.59 0 Td[(i)]TJ /F5 7.97 Tf 6.59 0 Td[(1)R)]TJ /F5 7.97 Tf 6.58 0 Td[(1i+1ri+1rTi+1R)]TJ /F5 7.97 Tf 6.59 0 Td[(1i+1)]TJ /F4 11.955 Tf 9.3 0 Td[(R)]TJ /F5 7.97 Tf 6.59 0 Td[(1i+1ri+1011)]TJ /F4 11.955 Tf 9.3 0 Td[(rTi+1R)]TJ /F5 7.97 Tf 6.59 0 Td[(1i+11111CCCCA.Therefore,0B@R)]TJ /F5 7.97 Tf 6.58 0 Td[(11r1rT1R)]TJ /F5 7.97 Tf 6.59 0 Td[(11)]TJ /F4 11.955 Tf 9.3 0 Td[(R)]TJ /F5 7.97 Tf 6.58 0 Td[(11r1)]TJ /F4 11.955 Tf 9.3 0 Td[(rT1R)]TJ /F5 7.97 Tf 6.59 0 Td[(111111CA=0BBBB@0(p)]TJ /F7 7.97 Tf 6.58 0 Td[(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1)(p)]TJ /F7 7.97 Tf 6.58 0 Td[(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1)0(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)]TJ /F5 7.97 Tf 6.58 0 Td[(1)k00110k0(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1)R)]TJ /F5 7.97 Tf 6.58 0 Td[(1p)]TJ /F7 7.97 Tf 6.58 0 Td[(k0rp)]TJ /F7 7.97 Tf 6.59 0 Td[(k0rTp)]TJ /F7 7.97 Tf 6.59 0 Td[(k0R)]TJ /F5 7.97 Tf 6.59 0 Td[(1p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)]TJ /F4 11.955 Tf 9.3 0 Td[(R)]TJ /F5 7.97 Tf 6.59 0 Td[(1p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0rp)]TJ /F7 7.97 Tf 6.59 0 Td[(k0011)]TJ /F4 11.955 Tf 9.3 0 Td[(rTp)]TJ /F7 7.97 Tf 6.58 0 Td[(k0R)]TJ /F5 7.97 Tf 6.58 0 Td[(1p)]TJ /F7 7.97 Tf 6.59 0 Td[(k01111CCCCA,whereRp)]TJ /F7 7.97 Tf 6.59 0 Td[(k0=R[p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0:p)]TJ /F5 7.97 Tf 6.58 0 Td[(1,p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0:p)]TJ /F5 7.97 Tf 6.58 0 Td[(1]isaprinciplesubmatrixofRwithrowsandcolumnsfrom(p)]TJ /F4 11.955 Tf 11.95 0 Td[(k0)to(p)]TJ /F6 11.955 Tf 11.95 0 Td[(1),andrp)]TJ /F7 7.97 Tf 6.59 0 Td[(k0=R[p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0:p)]TJ /F5 7.97 Tf 6.58 0 Td[(1,p].Therefore,undertheassumptionofk0bands,weobtainR)]TJ /F40 6.974 Tf 6.23 0 Td[(1=A)]TJ /F40 6.974 Tf 6.23 0 Td[(12210BBBB@0(p)]TJ /F42 6.974 Tf 6.23 0 Td[(k0)]TJ /F40 6.974 Tf 6.23 0 Td[(1)(p)]TJ /F42 6.974 Tf 6.23 0 Td[(k0)]TJ /F40 6.974 Tf 6.22 0 Td[(1)01k00(p)]TJ /F42 6.974 Tf 6.23 0 Td[(k0)]TJ /F40 6.974 Tf 6.23 0 Td[(1)10k0(p)]TJ /F42 6.974 Tf 6.22 0 Td[(k0)]TJ /F40 6.974 Tf 6.23 0 Td[(1)R)]TJ /F40 6.974 Tf 6.22 0 Td[(1p)]TJ /F42 6.974 Tf 6.22 0 Td[(k0rTp)]TJ /F42 6.974 Tf 6.23 0 Td[(k0rp)]TJ /F42 6.974 Tf 6.23 0 Td[(k0R)]TJ /F40 6.974 Tf 6.23 0 Td[(1p)]TJ /F42 6.974 Tf 6.22 0 Td[(k0)]TJ /F28 9.963 Tf 7.75 0 Td[(R)]TJ /F40 6.974 Tf 6.22 0 Td[(1p)]TJ /F42 6.974 Tf 6.23 0 Td[(k0rp)]TJ /F42 6.974 Tf 6.23 0 Td[(k001(p)]TJ /F42 6.974 Tf 6.22 0 Td[(k0)]TJ /F40 6.974 Tf 6.23 0 Td[(1))]TJ /F28 9.963 Tf 7.75 0 Td[(rTp)]TJ /F42 6.974 Tf 6.22 0 Td[(k0R)]TJ /F40 6.974 Tf 6.22 0 Td[(1p)]TJ /F42 6.974 Tf 6.22 0 Td[(k01111CCCCA+0@R)]TJ /F40 6.974 Tf 6.23 0 Td[(110(p)]TJ /F40 6.974 Tf 6.23 0 Td[(1)101(p)]TJ /F40 6.974 Tf 6.22 0 Td[(1)0111A=p)]TJ /F40 6.974 Tf 6.22 0 Td[(1Yj=1(1)]TJ /F31 9.963 Tf 9.96 0 Td[(2j,p))]TJ /F40 6.974 Tf 6.23 0 Td[(10BBBB@0(p)]TJ /F42 6.974 Tf 6.22 0 Td[(k0)]TJ /F40 6.974 Tf 6.23 0 Td[(1)(p)]TJ /F42 6.974 Tf 6.23 0 Td[(k0)]TJ /F40 6.974 Tf 6.23 0 Td[(1)0(p)]TJ /F42 6.974 Tf 6.22 0 Td[(k0)]TJ /F40 6.974 Tf 6.23 0 Td[(1)k00(p)]TJ /F42 6.974 Tf 6.23 0 Td[(k0)]TJ /F40 6.974 Tf 6.22 0 Td[(1)101(p)]TJ /F42 6.974 Tf 6.23 0 Td[(k0)]TJ /F40 6.974 Tf 6.23 0 Td[(1)R)]TJ /F40 6.974 Tf 6.23 0 Td[(1p)]TJ /F42 6.974 Tf 6.23 0 Td[(k0rp)]TJ /F42 6.974 Tf 6.23 0 Td[(k0rTp)]TJ /F42 6.974 Tf 6.22 0 Td[(k0R)]TJ /F40 6.974 Tf 6.22 0 Td[(1p)]TJ /F42 6.974 Tf 6.23 0 Td[(k0)]TJ /F28 9.963 Tf 7.74 0 Td[(R)]TJ /F40 6.974 Tf 6.23 0 Td[(1p)]TJ /F42 6.974 Tf 6.23 0 Td[(k0rp)]TJ /F42 6.974 Tf 6.22 0 Td[(k001(p)]TJ /F42 6.974 Tf 6.23 0 Td[(k0)]TJ /F40 6.974 Tf 6.23 0 Td[(1))]TJ /F28 9.963 Tf 7.75 0 Td[(rTp)]TJ /F42 6.974 Tf 6.23 0 Td[(k0R)]TJ /F40 6.974 Tf 6.23 0 Td[(1p)]TJ /F42 6.974 Tf 6.23 0 Td[(k01111CCCCA+0@R)]TJ /F40 6.974 Tf 6.22 0 Td[(110(p)]TJ /F40 6.974 Tf 6.22 0 Td[(1)101(p)]TJ /F40 6.974 Tf 6.23 0 Td[(1)0111A=p)]TJ /F40 6.974 Tf 6.23 0 Td[(1Yj=p)]TJ /F42 6.974 Tf 6.23 0 Td[(k0(1)]TJ /F31 9.963 Tf 9.96 0 Td[(2j,p))]TJ /F40 6.974 Tf 6.22 0 Td[(10BBBB@0(p)]TJ /F42 6.974 Tf 6.23 0 Td[(k0)]TJ /F40 6.974 Tf 6.22 0 Td[(1)(p)]TJ /F42 6.974 Tf 6.22 0 Td[(k0)]TJ /F40 6.974 Tf 6.23 0 Td[(1)0(p)]TJ /F42 6.974 Tf 6.23 0 Td[(k0)]TJ /F40 6.974 Tf 6.22 0 Td[(1)k00(p)]TJ /F42 6.974 Tf 6.22 0 Td[(k0)]TJ /F40 6.974 Tf 6.23 0 Td[(1)101(p)]TJ /F42 6.974 Tf 6.23 0 Td[(k0)]TJ /F40 6.974 Tf 6.23 0 Td[(1)R)]TJ /F40 6.974 Tf 6.23 0 Td[(1p)]TJ /F42 6.974 Tf 6.23 0 Td[(k0rp)]TJ /F42 6.974 Tf 6.22 0 Td[(k0rTp)]TJ /F42 6.974 Tf 6.23 0 Td[(k0R)]TJ /F40 6.974 Tf 6.23 0 Td[(1p)]TJ /F42 6.974 Tf 6.23 0 Td[(k0)]TJ /F28 9.963 Tf 7.75 0 Td[(R)]TJ /F40 6.974 Tf 6.23 0 Td[(1p)]TJ /F42 6.974 Tf 6.22 0 Td[(k0rp)]TJ /F42 6.974 Tf 6.23 0 Td[(k001(p)]TJ /F42 6.974 Tf 6.23 0 Td[(k0)]TJ /F40 6.974 Tf 6.23 0 Td[(1))]TJ /F28 9.963 Tf 7.74 0 Td[(rTp)]TJ /F42 6.974 Tf 6.23 0 Td[(k0R)]TJ /F40 6.974 Tf 6.23 0 Td[(1p)]TJ /F42 6.974 Tf 6.23 0 Td[(k01111CCCCA 111

PAGE 112

+0@R)]TJ /F40 6.974 Tf 6.22 0 Td[(110(p)]TJ /F40 6.974 Tf 6.23 0 Td[(1)101(p)]TJ /F40 6.974 Tf 6.23 0 Td[(1)0111A.Now,were-writeR1similartoRtoobtainR)]TJ /F5 7.97 Tf 6.59 0 Td[(1=p)]TJ /F5 7.97 Tf 6.59 0 Td[(1Yj=p)]TJ /F7 7.97 Tf 6.58 0 Td[(k0(1)]TJ /F3 11.955 Tf 11.95 0 Td[(2j,p))]TJ /F5 7.97 Tf 6.58 0 Td[(10BBBB@0(p)]TJ /F7 7.97 Tf 6.58 0 Td[(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1)(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)]TJ /F5 7.97 Tf 6.58 0 Td[(1)0(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1)k00(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1)10k0(p)]TJ /F7 7.97 Tf 6.58 0 Td[(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1)MjpMTjp)]TJ /F4 11.955 Tf 9.3 0 Td[(Mjp01(p)]TJ /F7 7.97 Tf 6.58 0 Td[(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(1))]TJ /F4 11.955 Tf 9.3 0 Td[(MTjp1111CCCCA+......++s)]TJ /F5 7.97 Tf 6.59 0 Td[(1Yj=s)]TJ /F7 7.97 Tf 6.59 0 Td[(k0(1)]TJ /F3 11.955 Tf 11.96 0 Td[(2j,s))]TJ /F5 7.97 Tf 6.59 0 Td[(10BBBBBBB@0(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)]TJ /F7 7.97 Tf 6.59 0 Td[(s)]TJ /F5 7.97 Tf 6.58 0 Td[(1)(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)]TJ /F7 7.97 Tf 6.59 0 Td[(s)]TJ /F5 7.97 Tf 6.58 0 Td[(1)0(p)]TJ /F7 7.97 Tf 6.58 0 Td[(k0)]TJ /F7 7.97 Tf 6.59 0 Td[(s)]TJ /F5 7.97 Tf 6.59 0 Td[(1)k00(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)]TJ /F7 7.97 Tf 6.59 0 Td[(s)]TJ /F5 7.97 Tf 6.58 0 Td[(1)10(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)]TJ /F7 7.97 Tf 6.59 0 Td[(s)]TJ /F5 7.97 Tf 6.58 0 Td[(1)s0k0(p)]TJ /F7 7.97 Tf 6.58 0 Td[(k0)]TJ /F7 7.97 Tf 6.59 0 Td[(s)]TJ /F5 7.97 Tf 6.59 0 Td[(1))MjsMTjs)]TJ /F4 11.955 Tf 9.29 0 Td[(Mjs0k0s01(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)]TJ /F7 7.97 Tf 6.59 0 Td[(s)]TJ /F5 7.97 Tf 6.58 0 Td[(1))]TJ /F4 11.955 Tf 9.3 0 Td[(MTjs11101s0s(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)]TJ /F7 7.97 Tf 6.58 0 Td[(s)]TJ /F5 7.97 Tf 6.58 0 Td[(1)0sk00s10ss1CCCCCCCA+......++k0Yj=1(1)]TJ /F3 11.955 Tf 11.96 0 Td[(2j,k0+1))]TJ /F5 7.97 Tf 6.59 0 Td[(10BBBBBBB@01101k001101(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)]TJ /F5 7.97 Tf 6.58 0 Td[(2)0k01Mjk0+1MTjk0+1)]TJ /F4 11.955 Tf 9.3 0 Td[(Mjk0+101(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)]TJ /F5 7.97 Tf 6.58 0 Td[(2)011)]TJ /F4 11.955 Tf 9.3 0 Td[(MTjk0+111101(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)]TJ /F5 7.97 Tf 6.58 0 Td[(2)0(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(2)10(p)]TJ /F7 7.97 Tf 6.58 0 Td[(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(2)k00(p)]TJ /F7 7.97 Tf 6.58 0 Td[(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(2)10(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)]TJ /F5 7.97 Tf 6.58 0 Td[(2)(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)]TJ /F5 7.97 Tf 6.59 0 Td[(2)1CCCCCCCA+0B@R)]TJ /F5 7.97 Tf 6.59 0 Td[(1[1:k01:k0]0k0(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)0(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)k00(p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0)(p)]TJ /F7 7.97 Tf 6.58 0 Td[(k0)1CA,whereMjp=R)]TJ /F5 7.97 Tf 6.59 0 Td[(1[p)]TJ /F7 7.97 Tf 6.58 0 Td[(k0:p)]TJ /F5 7.97 Tf 6.59 0 Td[(1,p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0:p)]TJ /F5 7.97 Tf 6.59 0 Td[(1]R[p)]TJ /F7 7.97 Tf 6.59 0 Td[(k0:p)]TJ /F5 7.97 Tf 6.59 0 Td[(1,p],Mjs=R)]TJ /F5 7.97 Tf 6.59 0 Td[(1[s)]TJ /F7 7.97 Tf 6.58 0 Td[(k0:s)]TJ /F5 7.97 Tf 6.58 0 Td[(1,s)]TJ /F7 7.97 Tf 6.59 0 Td[(k0:s)]TJ /F5 7.97 Tf 6.59 0 Td[(1]R[s)]TJ /F7 7.97 Tf 6.59 0 Td[(k0:s)]TJ /F5 7.97 Tf 6.59 0 Td[(1,s],andMjk0+1=R)]TJ /F5 7.97 Tf 6.58 0 Td[(1[2:k0+1,2:k0+1]R[2:k0+1,k0+2].ThisshowsthatR)]TJ /F5 7.97 Tf 6.59 0 Td[(1isasumofppmatricesincludingonly(k0+1)(k0+1)non-zeroprinciplesub-matrices.Asaresult,we 112

PAGE 113

onlyneedtoinvert(k0)]TJ /F6 11.955 Tf 12.77 0 Td[(1)-dimensionalmatricestomovefromak0bandtoR)]TJ /F5 7.97 Tf 6.59 0 Td[(1.Furthermore,bothR)]TJ /F5 7.97 Tf 6.59 0 Td[(1andtheprecisionmatrix,)]TJ /F5 7.97 Tf 6.58 0 Td[(1=D)]TJ /F5 7.97 Tf 6.59 0 Td[(1R)]TJ /F5 7.97 Tf 6.59 0 Td[(1D)]TJ /F5 7.97 Tf 6.58 0 Td[(1arek0-bandmatrices.ProofofResult5.UsingResult4,undertheconditionofak0bandpartialautocorrelationmatrix,h(^k0,^0)=exp()]TJ /F6 11.955 Tf 10.5 8.09 Td[(1 2trace(D)]TJ /F5 7.97 Tf 6.59 0 Td[(1R)]TJ /F5 7.97 Tf 6.58 0 Td[(1D)]TJ /F5 7.97 Tf 6.59 0 Td[(1S))isonlyrelatedto(k0+1)(k0+1)principlesub-matricesofS,i.e.,itisonlyaffectedbysamplepartialautocorrelationswithlagnotgreaterthank0.ProofofLemma 3.1.1 SinceY1,Y2,...,Ynarei.i.d.multivariatenormalrandomvectorsN(0,DRD),thesamplecovarianceSfollowsaWishartdistributionWn()fornp+1withpdf,p(s)/jSjn)]TJ /F17 5.978 Tf 5.76 0 Td[(p)]TJ /F16 5.978 Tf 5.76 0 Td[(1 2exp[)]TJ /F6 11.955 Tf 10.49 8.09 Td[(1 2trace(D)]TJ /F5 7.97 Tf 6.59 0 Td[(1R)]TJ /F5 7.97 Tf 6.59 0 Td[(1D)]TJ /F5 7.97 Tf 6.59 0 Td[(1S)].LetS=f^j,l:j,l=1,...,pg,A=(^1,2,^2,3,^1,3,...,^1,p,^1,1,...,^p,p),B=(^1,2,^2,3,^1,3,...,^1,p,^1,1,...,^p,p),and^=(^1,2,^2,3,^1,3,...,^1,p,^1,1,...,^p,p).TheJacobianfromAtoBis^J=0B@^J1100Ipp1CA,where^J11=diag(vh(^0^0))and^0=(^1,1,..^p,p).Accordingto Joe ( 2006 ),thedeterminantoftheJacobianfor^JB!isj^JB!j=p)]TJ /F5 7.97 Tf 6.59 0 Td[(1Yj=1(1)]TJ /F3 11.955 Tf 11.96 0 Td[(2j,j+1)p)]TJ /F16 5.978 Tf 5.76 0 Td[(2 2p)]TJ /F5 7.97 Tf 6.58 0 Td[(2Yk=2p)]TJ /F7 7.97 Tf 6.59 0 Td[(kYj=1(1)]TJ /F6 11.955 Tf 12.35 0 Td[(^2j,j+k)p)]TJ /F16 5.978 Tf 5.76 0 Td[(1)]TJ /F17 5.978 Tf 5.76 0 Td[(k 2.Also,recallj^Rj=Qp)]TJ /F5 7.97 Tf 6.58 0 Td[(1k=1Qp)]TJ /F7 7.97 Tf 6.58 0 Td[(kj=1(1)]TJ /F6 11.955 Tf 12.36 0 Td[(^2j,j+k)(Result3).Therefore,p(B)/j^JjjSjn)]TJ /F17 5.978 Tf 5.76 0 Td[(p)]TJ /F16 5.978 Tf 5.76 0 Td[(1 2exp[)]TJ /F6 11.955 Tf 10.5 8.08 Td[(1 2trace(D)]TJ /F5 7.97 Tf 6.59 0 Td[(1R)]TJ /F5 7.97 Tf 6.58 0 Td[(1D)]TJ /F5 7.97 Tf 6.58 0 Td[(1S)]andp(,0)/j^J11jj^JB!jj^D^R^Djn)]TJ /F17 5.978 Tf 5.76 0 Td[(p)]TJ /F16 5.978 Tf 5.76 0 Td[(1 2exp[)]TJ /F6 11.955 Tf 10.49 8.08 Td[(1 2trace(D)]TJ /F5 7.97 Tf 6.59 0 Td[(1R)]TJ /F5 7.97 Tf 6.59 0 Td[(1D)]TJ /F5 7.97 Tf 6.59 0 Td[(1S)]. 113

PAGE 114

Wecansimplifythisasfollows,p(,0)/j^J11jj^Djn)]TJ /F7 7.97 Tf 6.59 0 Td[(p)]TJ /F5 7.97 Tf 6.59 0 Td[(1[p)]TJ /F5 7.97 Tf 6.59 0 Td[(1Yj=1(1)]TJ /F22 10.909 Tf 10.9 0 Td[(2j,j+1)p)]TJ /F16 5.978 Tf 5.75 0 Td[(2 2p)]TJ /F5 7.97 Tf 6.58 0 Td[(2Yk=2p)]TJ /F7 7.97 Tf 6.59 0 Td[(kYj=1(1)]TJ /F21 10.909 Tf 11.35 0 Td[(^2j,j+k)p)]TJ /F16 5.978 Tf 5.76 0 Td[(1)]TJ /F17 5.978 Tf 5.75 0 Td[(k 2][p)]TJ /F5 7.97 Tf 6.59 0 Td[(1Yk=1p)]TJ /F7 7.97 Tf 6.58 0 Td[(kYj=1(1)]TJ /F22 10.909 Tf 10.9 0 Td[(2j,j+k)]n)]TJ /F17 5.978 Tf 5.76 0 Td[(p)]TJ /F16 5.978 Tf 5.76 0 Td[(1 2h(^k0,^0)=(1)]TJ /F21 10.909 Tf 11.35 0 Td[(^21,p)n)]TJ /F16 5.978 Tf 5.76 0 Td[(3 2p)]TJ /F5 7.97 Tf 6.58 0 Td[(2Yk=k0+1p)]TJ /F7 7.97 Tf 6.59 0 Td[(kYj=k(1)]TJ /F21 10.909 Tf 11.35 0 Td[(^2j,j+k)n)]TJ /F17 5.978 Tf 5.75 0 Td[(k)]TJ /F16 5.978 Tf 5.76 0 Td[(2 2p)]TJ /F5 7.97 Tf 6.58 0 Td[(1Yj=1(1)]TJ /F22 10.909 Tf 10.91 0 Td[(2j,j+1)p)]TJ /F16 5.978 Tf 5.76 0 Td[(2 2[k0Yk=2p)]TJ /F7 7.97 Tf 6.59 0 Td[(kYj=1(1)]TJ /F21 10.909 Tf 11.35 0 Td[(^2j,j+k)]n)]TJ /F17 5.978 Tf 5.76 0 Td[(k)]TJ /F16 5.978 Tf 5.75 0 Td[(2 2j^J11jj^Djn)]TJ /F7 7.97 Tf 6.59 0 Td[(p)]TJ /F5 7.97 Tf 6.59 0 Td[(1h(^k0,^0) =(1)]TJ /F21 10.909 Tf 11.35 0 Td[(^21,p)n)]TJ /F17 5.978 Tf 5.76 0 Td[(p)]TJ /F16 5.978 Tf 5.76 0 Td[(1 2p)]TJ /F5 7.97 Tf 6.58 0 Td[(2Yk=k0+1p)]TJ /F7 7.97 Tf 6.59 0 Td[(kYj=1(1)]TJ /F21 10.909 Tf 11.35 0 Td[(^2j,j+k)n)]TJ /F17 5.978 Tf 5.75 0 Td[(k)]TJ /F16 5.978 Tf 5.76 0 Td[(2 2h?(^k0,^0),whereh?(^k0,^0)=[Qp)]TJ /F5 7.97 Tf 6.59 0 Td[(1j=1(1)]TJ /F3 11.955 Tf 9.29 0 Td[(2j,j+1)p)]TJ /F16 5.978 Tf 5.75 0 Td[(2 2][Qk0k=2Qp)]TJ /F7 7.97 Tf 6.58 0 Td[(kj=1(1)]TJ /F6 11.955 Tf 9.69 0 Td[(^2j,j+k)]n)]TJ /F17 5.978 Tf 5.76 0 Td[(k)]TJ /F16 5.978 Tf 5.75 0 Td[(2 2j^J11jj^Djn)]TJ /F7 7.97 Tf 6.59 0 Td[(p)]TJ /F5 7.97 Tf 6.59 0 Td[(1h(^k0,^0).Hence,allsamplepartialautocorrelationswithlagsnotlessthank0areindependentwithmarginaldistributionsgivenbyf(^jj+k)/8><>:(1)]TJ /F6 11.955 Tf 12.35 0 Td[(^2j,j+k)n)]TJ /F17 5.978 Tf 5.76 0 Td[(k)]TJ /F16 5.978 Tf 5.76 0 Td[(2 4fork2fk0+1,...,p)]TJ /F6 11.955 Tf 11.95 0 Td[(2g(1)]TJ /F6 11.955 Tf 12.36 0 Td[(^21,p)n)]TJ /F17 5.978 Tf 5.76 0 Td[(p)]TJ /F16 5.978 Tf 5.76 0 Td[(1 2fork0
PAGE 115

APPENDIXCSUPPORTINGMATERIALFORCHAPTER4 C.1ProveTheorem 4.1 Proof:LetAbeannsymmetricmatrixwithelements,faijg.Dene vh(A)=a21a31an1a32an2nn)]TJ /F6 11.955 Tf 11.96 0 Td[(1T, (C) tobeavectorwithalltheelementsinthelowertriangularpartofasquarematrix(withoutthemaindiagonal).LetY1,Y2,...,Ynbeiidm-dimensionalmultivariatenormalrandomvectorswithmeanzeroandcovariancematrixm=DmRmDm,whereDmisdiagonalmatrixwithstandarddeviationonitsmaindiagonalandRmisthecorrelationmatrixcorrespondingtom.Forsimplicity,wedropsubscriptsmofmandminthefollowingproofwithoutlackofclarity.Toderivetheasymptoticcovariancematrixofthelagjestimatedpartialautocorrelations,^j=(^1j+1,^2j+2,...,^m)]TJ /F7 7.97 Tf 6.59 0 Td[(jm),theMLEofj=(1j+1,2j+2,...,m)]TJ /F7 7.97 Tf 6.58 0 Td[(jm),weintroducesomeadditionalnotationwhichrelatestothetransformationfrom(vh()T,T0)T!(vh(R)T,T0)T!(vh()T,T0)T,where0=(11,...mm).LetJ11=diag(vh(0T0))]TJ /F16 5.978 Tf 7.78 3.25 Td[(1 2)and 115

PAGE 116

J12=)]TJ /F23 10.909 Tf 10.3 103.74 Td[(0BBBBBBBBBBBBBBBBBBBBBBBBB@12 21112 2220...0013 211013 233...00......1m 21100...01m 2mm023p 2233 22223 233...00......02m 2220...02m 2mm......000...m)]TJ /F16 5.978 Tf 5.75 0 Td[(1m 2m)]TJ /F16 5.978 Tf 5.75 0 Td[(1m)]TJ /F16 5.978 Tf 5.75 0 Td[(1m)]TJ /F16 5.978 Tf 5.76 0 Td[(1m 2mm1CCCCCCCCCCCCCCCCCCCCCCCCCAAlso,let ()=0B@111221221CA (C) bethepartitionedcovariancematrixof=(vh(^)T,^0)AccordingtoaTheoremin MagnusandNeudecker ( 1988 )andresultsdiscussedin WangandDaniels ( 2012 ),theasymptoticcovariancematrixof^=vh(^R)is (0)=J1111JT11+J1222JT12+(J1112JT12+J1221JT11)=(!ij,i1j1), (C) where!ij;i1j1=!(1)ij;i1j1+!(2)ij,i1j1+!(3)ij;i1j1whichisthecovariancebetweenijandi1j1,J1111JT11=(!(1)ij,i1j1),J1222JT12=(!(2)ij,i1j1),and(J1112JT12+J1221JT11)=(!(3)ij,i1j1).Then, 116

PAGE 117

!(1)ij;i1j1=ii1jj1+ij1i1j,!(2)ij;i1j1=)]TJ /F6 11.955 Tf 9.3 0 Td[([i1j1(ii1i1j+ij1jj1)+ij(ii1ij1+i1j+jj1)],!(3)ij;i1j1=iji1j1(2ii1+2ij1+2i1j+2jj1).Let0betheconstantstatedinProposition 4.1 .Then,wehave!(1)(j+s,j+t+1;j1+s,j1+t+1)=j+s,j1+sj+t+1,j1+t+1+j+s,j1+t+1j1+s,j+t+1j!(1)(j+s,j+t+1;j1+s,j1+t+1)jo(jj1)]TJ /F7 7.97 Tf 6.58 0 Td[(jj0)o(jj1)]TJ /F7 7.97 Tf 6.59 0 Td[(jj0)+o((jj1)]TJ /F7 7.97 Tf 6.59 0 Td[(jj+js)]TJ /F7 7.97 Tf 6.58 0 Td[(tj)0)o((jj1)]TJ /F7 7.97 Tf 6.59 0 Td[(jjjs)]TJ /F7 7.97 Tf 6.59 0 Td[(tj)0)=o(2jj1)]TJ /F7 7.97 Tf 6.59 0 Td[(jj0)!(2)(j+s,j+t+1;j1+s,j1+t+1)=)]TJ /F6 11.955 Tf 9.3 0 Td[([j1+s,j1+t+1(j+s,j1+sj1+s,j+t+1+j+s,j1+t+1j+t+1,j1+t+1)+j+s,j+t+1(j+s,j1+sj+s,j1+t+1+j1+s,j+t+1j+t+1,j1+t+1)]j!(2)(j+s,j+t+1;j1+s,j1+t+1)jo(js)]TJ /F7 7.97 Tf 6.59 0 Td[(t)]TJ /F5 7.97 Tf 6.59 0 Td[(1j0)[o(jj1)]TJ /F7 7.97 Tf 6.59 0 Td[(jj0)o(jj1)]TJ /F7 7.97 Tf 6.59 0 Td[(j+s)]TJ /F7 7.97 Tf 6.58 0 Td[(t)]TJ /F5 7.97 Tf 6.59 0 Td[(1j0)+o(jj1)]TJ /F7 7.97 Tf 6.59 0 Td[(j)]TJ /F7 7.97 Tf 6.59 0 Td[(s+t+1j0)o(jj1)]TJ /F7 7.97 Tf 6.58 0 Td[(jj0)]+o(js)]TJ /F7 7.97 Tf 6.59 0 Td[(t)]TJ /F5 7.97 Tf 6.59 0 Td[(1j0)[o(jj1)]TJ /F7 7.97 Tf 6.59 0 Td[(jj0)o(jj1)]TJ /F7 7.97 Tf 6.59 0 Td[(j)]TJ /F7 7.97 Tf 6.59 0 Td[(s+t+1j0)+o(jj1)]TJ /F7 7.97 Tf 6.59 0 Td[(j+s)]TJ /F7 7.97 Tf 6.59 0 Td[(t)]TJ /F5 7.97 Tf 6.59 0 Td[(1j0)o(jj1)]TJ /F7 7.97 Tf 6.58 0 Td[(jj0)]=o(2jj1)]TJ /F7 7.97 Tf 6.59 0 Td[(jj0)!(3)(j+s,j+t+1;j1+s,j1+t+1)=j+s,j+t+1j1+s,j1+t+1(2j+s,j1+s+2j+s,j1+t+1+2j1+s,j+t+1+2j+t+1,j1+t+1)j!(3)(j+s,j+t+1;j1+s,j1+t+1)jo(2js)]TJ /F7 7.97 Tf 6.58 0 Td[(t+1j0)[o(2jj1)]TJ /F7 7.97 Tf 6.59 0 Td[(jj0)+o(2jj1)]TJ /F7 7.97 Tf 6.58 0 Td[(j)]TJ /F7 7.97 Tf 6.58 0 Td[(s+t+1j0)+o(2jj1)]TJ /F7 7.97 Tf 6.59 0 Td[(j+s)]TJ /F7 7.97 Tf 6.59 0 Td[(t)]TJ /F5 7.97 Tf 6.59 0 Td[(1j0)+o(2jj1)]TJ /F7 7.97 Tf 6.58 0 Td[(jj0)]=o(2jj1)]TJ /F7 7.97 Tf 6.59 0 Td[(jj0) 117

PAGE 118

Therefore,!(j+s,j+t+1;j1+s,j1+t+1)o(2jj1)]TJ /F7 7.97 Tf 6.59 0 Td[(jj0).Now,letsfocusoncovarianceof^0swithinlagk,i.e.,covarianceofbetween^jj+kand^j1j1+k.cov(^jj+k,^j1j1+k)=@jj+k @T@j1j1+k @=Pks=1Ps)]TJ /F5 7.97 Tf 6.59 0 Td[(1t=0@jj+k @j+s,j+t+1!(j+s,j+t+1;j1+s,j1+t+1)@j1j1+k @j1+s,j1+t+1.Since)]TJ /F6 11.955 Tf 9.29 0 Td[(1p(=2)),andletX=0BBBBBBB@11 m)]TJ /F7 7.97 Tf 6.59 0 Td[(k)]TJ /F4 11.955 Tf 11.95 0 Td[(x...(1 m)]TJ /F7 7.97 Tf 6.59 0 Td[(k)]TJ /F4 11.955 Tf 11.95 0 Td[(x)q12 m)]TJ /F7 7.97 Tf 6.59 0 Td[(k)]TJ /F4 11.955 Tf 11.95 0 Td[(x...(2 m)]TJ /F7 7.97 Tf 6.59 0 Td[(k)]TJ /F4 11.955 Tf 11.95 0 Td[(x)q......11)]TJ /F4 11.955 Tf 11.95 0 Td[(x...(1)]TJ /F4 11.955 Tf 11.95 0 Td[(x)q1CCCCCCCAandWt=diag"K(1 (m)]TJ /F7 7.97 Tf 6.59 0 Td[(k)gr,s) (m)]TJ /F4 11.955 Tf 11.95 0 Td[(k)gr,s,...,K(m)]TJ /F7 7.97 Tf 6.59 0 Td[(k)]TJ /F7 7.97 Tf 6.58 0 Td[(x (m)]TJ /F7 7.97 Tf 6.59 0 Td[(k)gr,s) (m)]TJ /F4 11.955 Tf 11.95 0 Td[(k)gr,s#,wheregr,sisabandwidthforestimatingr,s.Hereandinwhatfollows,wedenoteejisacolumnvectorwith1asitsjthentryandallotherentrieszero.Wethenhave^(j)(x,h,q)=eTj+1(XTq,xWxXq,x))]TJ /F5 7.97 Tf 6.59 0 Td[(1XTq,xWx)^, 118

PAGE 119

andthenestimates,tas ^r,s=1 m)]TJ /F4 11.955 Tf 11.96 0 Td[(km)]TJ /F7 7.97 Tf 6.59 0 Td[(kXi=1^(r)(xi,gr,s)^(s)(xi,gr,s), (C) Now,letKr,q(u)=r![jMr,q(u)j=jNqj]K(u),whereNqisa(q+1)(q+1)matrixhaving(i,j)entryequaltoRui+j)]TJ /F5 7.97 Tf 6.59 0 Td[(2K(u)duandMr,q(u)isthesameasNq,exceptthatthe(r+1)throwisreplacedby(1,u,...,uq).ThekernelKqisdenedtobeK0,q.Finally,let(L1L2)(x)=RL1(u)L2(x)]TJ /F4 11.955 Tf 12.06 0 Td[(u)dudenotetheconvolutionoftworealvaluedfunctionsL1andL2.Then,theminimizerofthemeansquareerrorof^2,2,calledg2,2is g2,2C1(K)2,k(b)]TJ /F4 11.955 Tf 11.95 0 Td[(a) j24j(m)]TJ /F4 11.955 Tf 11.96 0 Td[(k)1 7, (C) whereC1(K)=8><>:h12R(K2,3) 4(K2,3)i1 724<0h30R(K2,3) 4(K2,3)i1 724>0.Nowweneedtoestimate2,kand24inEq. C .Weestimate2,jviaablockedquarticpolynomialtasfollows,letNbethenumberofsubsamples,andletXj=fX(j)]TJ /F5 7.97 Tf 6.58 0 Td[(1)t+1),...,Xjtg,t=1,2,...,bm)]TJ /F7 7.97 Tf 6.59 0 Td[(k Nc,denotethejthsubsampleoforderedXi's.LetQj(x)betheleastsquaresqaurtictobtainedfromdatacontaininginXj.Formax(r,s)4,theblockedquarticestimatorforr,sis^Qr,s(N)=1 m)]TJ /F4 11.955 Tf 11.95 0 Td[(km)]TJ /F7 7.97 Tf 6.59 0 Td[(kXi=1NXj=1(^Qj)(r)(Xi)(^Qj)(s)(Xi)IfXi2Xjg.and^2Q(N)=1 (m)]TJ /F4 11.955 Tf 11.95 0 Td[(k))]TJ /F6 11.955 Tf 11.95 0 Td[(5Nm)]TJ /F7 7.97 Tf 6.59 0 Td[(kXi=1NXj=1f^i)]TJ /F6 11.955 Tf 12.35 0 Td[(^Qj(Xi)g2IfXi2XjgTheoptimalN2f1,2,...,NgischosenbyminimizingMallows'Cp(N),Cp(N)=RSS(N) RSS(Nmax)=((m)]TJ /F4 11.955 Tf 11.95 0 Td[(k))]TJ /F6 11.955 Tf 11.95 0 Td[(5Nmax))]TJ /F6 11.955 Tf 11.95 0 Td[(((m)]TJ /F4 11.955 Tf 11.95 0 Td[(k))]TJ /F6 11.955 Tf 11.96 0 Td[(10N), 119

PAGE 120

whereRSS(N)isresidualsumofsquaresforablockedquartictwithNblocks,N2f1,...,Nmaxg,andRSS(Nmax)isresidualsumofsquaresforablockedquartictwithNmax, Nmax=maxfmin(bm)]TJ /F4 11.955 Tf 11.95 0 Td[(k 20c,N),1g, (C) whereN,issomepositiveinteger. Ruppertetal. ( 1995 )suggestsN=5.Afterestimatingg2,2,weusecorrespondingestimatoroftocalculatetheoptimalbandwidth,,for2inEq. 4 =C2(K)4 2q+1,q+1(m)]TJ /F4 11.955 Tf 11.96 0 Td[(k)21 4q+5,whereC2(K)=h((q+1)!)4R(KqKq)]TJ /F5 7.97 Tf 6.58 0 Td[(2Kq) 2(q+1)q+1(Kq)4i1 4q+5withp=2.Then,weestimate2,kas^2q()=)]TJ /F5 7.97 Tf 6.59 0 Td[(1m)]TJ /F7 7.97 Tf 6.59 0 Td[(kXi=1fi)]TJ /F6 11.955 Tf 12.36 0 Td[(^(xi;)g2.where=(m)]TJ /F4 11.955 Tf 11.96 0 Td[(k))]TJ /F6 11.955 Tf 11.96 0 Td[(2Pi!ii+PPi,j!2ijand!ij=eT1(XTq,XiWXiXq,Xi))]TJ /F5 7.97 Tf 6.58 0 Td[(1XTq,XiWXiej.Thenweusetruncateddatawithin100%isusedtoadjustboundarybiasandtheestimatorofr,sisreplacedby^r,s(g)=1 m)]TJ /F4 11.955 Tf 11.96 0 Td[(km)]TJ /F7 7.97 Tf 6.58 0 Td[(kXi=1^(r)(Xi)^(s)(Xi)I[
PAGE 121

andestimate2,kas^21(),where^=C2(K)"^4Q(^N) (^0.052,2(^g2,2))2(m)]TJ /F4 11.955 Tf 11.95 0 Td[(k)2#1=7.weusethedifferences^(i))]TJ /F6 11.955 Tf 13.05 0 Td[(^(i)()asestimatedresiduals(andcalculatethecovariancefunctionestimatorbasedontheseresiduals, Herrmannetal. ( 1992 )thenestimatethelongruncovarianceS,kby^S,k()=m)]TJ /F7 7.97 Tf 6.58 0 Td[(k)]TJ /F5 7.97 Tf 6.59 0 Td[(1Xj=1"1 m)]TJ /F4 11.955 Tf 11.95 0 Td[(j)]TJ /F4 11.955 Tf 11.95 0 Td[(k+1m)]TJ /F7 7.97 Tf 6.58 0 Td[(j)]TJ /F7 7.97 Tf 6.58 0 Td[(kXi=1(^(i))]TJ /F6 11.955 Tf 12.35 0 Td[(^(i)(^))(^(i+j))]TJ /F6 11.955 Tf 12.35 0 Td[(^(i+j)(^))#. 3. EstimatethebandwidthhwithEpanechnikovkernel,K(x)=3 4(1)]TJ /F4 11.955 Tf 11.95 0 Td[(x2)I(jxj1)as ^haobm,k=(15(^21()+2^S,k()) ^0.052,2(^g2,2))1 51 m)]TJ /F4 11.955 Tf 11.96 0 Td[(k1 5. (C) 121

PAGE 122

REFERENCES Agresti,A.andHitchcock,D.B.(2005),BayesianInferenceforCategoricalDataAnalysis,StatisticalMethodsandApplications,14,297. Albert,J.H.andChib,S.(1993),BayesianAnalysisofBinaryandPolychotomousResponseData,JournaloftheAmericanStatisticalAssociation,88,pp.669. Altman,N.S.(1990),KernelSmoothingofDataWithCorrelatedErrors,JournaloftheAmericanStatisticalAssociation,85,pp.749. Anderson,T.W.(1984),AnIndroductiontoMultivariateStatisticalAnalysis,Wiley,NewYork. Barnard,J.,McCulloch,R.,andMeng,X.-l.(2000),ModelingCovarianceMatricesinTermsofStandardDeviationsandCorrelations,StatisticaSinica,10,1281. Bickel,P.J.andLevina,E.(2008a),CovarianceRegularizationbyThresholding,TheAnnalsofStatistics,36,2577. (2008b),RegularizedEstimationofLargeCovarianceMatrices,TheAnnalsofStatistics,36,199. Bigot,J.,Biscay,R.,Loubes,J.-M.,andAlvarez,L.M.(2009),NonparametricEstimationofCovarianceFunctionsbyModelSelection,ElectronicJournalofStatistics,4,822. Chiu,S.T.(1989),BandwidthSelectionforKernelEstimatewithCorrelatedNoise,ProbabilityLetters,8,347354. Chiu,T.Y.M.,Leonard,T.,andTsui,K.-W.(1996),TheMatrix-LogarithmicCovarianceModel,JournaloftheAmericanStatisticalAssociation,91,198. Clyde,M.andGeorge,E.I.(2000),FlexibleEmpiricalBayesEstimationforWavelets,JournaloftheRoyalStatisticalSociety.SeriesB(StatisticalMethodology),62,pp.681. Craven,P.andWahba,G.(1979),SmoothingNoisyDatawithSplineFunctions:EstimatingtheCorrectDegreeofSmoothingbytheMethodofGeneralizedCross-Validation,Numer.Math.,,31,377. Czado,C.(2000),MultivariateRegressionAnalysisofPanelDatawithBinaryOutcomesAppliedtoUnemploymentData,StatisticalPapers,41. Daniels,M.J.(2006),BayesianModelingofSeveralCovarianceMatricesandSomeResultsonPoprietyofthePosteriorforLinearRegressionwithCorrelatedand/orHeterogeneousErrors,JournalofMultivariateAnalysis,97,1185. Daniels,M.J.andHogan,J.W.(2008),MissingDatainLongitudinalStudies:StrategiesforBayesianModelingandSensitivityAnalysis,Chapman&Hall. 122

PAGE 123

Daniels,M.J.andKass,R.E.(1999),NonconjugateBayesianEstimationofCovarianceMatricesandItsUseinHierarchicalModels,JournaloftheAmericanStatisticalAssociation,94,1254. (2001),ShrinkageEstimatorsforCovarianceMatrices,Biometrics,57,1173. Daniels,M.J.andNormand,S.-l.(2006),LongitudinalProlingofHealthCareUnitsBasedonMixedMultivariatePatientOutcomes,Biostatistics,7,1. Daniels,M.J.andPourahmadi,M.(2002),BayesianAnalysisofCovarianceMatricesandDynamicModelsforLongitudinalData,Biometrika,89,553. (2009),ModelingCovarianceMatricesviaPartialAutocorrelations,JournalofMultivariateAnalysis,100,2352. Daniels,M.J.andZhao,Y.D.(2003),ModelingtheRandomEffectsCovarianceMatrixinLongitudinalData,StatisticsinMedicine,22,1631. Demko,S.,Moss,W.F.,andSmith,P.W.(1984),DecayRatesforInversesofBandMatrices,MathematicsofComputation,43,491. Dempster,A.P.(1972),CovarianceSelection,Biometrics,28,pp.157. Diggle,P.J.andVerbyla,A.P.(1998),NonparametricEstimationofCovarianceStructureinLongitudinalData,Biometrics,54,pp.401. Efron,B.andMorris,C.(1976),MultivariateEmpiricalBayesandEstimationofCovarianceMatrices,TheAnnalsofStatistics,4,pp.22. Fan,J.,Huang,T.,andLi,R.(2007),AnalysisofLongitudinalDatawithSemiparametricEstimationofCovarianceFunction,JournaloftheAmericanStatisticalAssociation,102,632. Fan,J.andWu,Y.(2008),SemiparametricEstimationofCovarianceMatrixesforLongitudinalData,JournaloftheAmericanStatisticalAssociation,103,1520. Fan,J.F.Y.andLv,J.(2008),HighDimensionalCovarianceMatrixEstimationUsingaFactorModel,JournalofEconometrics,147,186. Friedman,J.,Hastie,T.,andTibshirani,R.(2008),SparseInverseCovarianceEstimationwiththeGraphicalLasso,Biostatistics,9,432. Friedman,J.H.(1989),RegularizedDiscriminantAnalysis,JournaloftheAmericanStatisticalAssociation,84,pp.165. Fu,W.J.(1998),TheBridgeversustheLasso,JournalofComputationalandGraphicalStatistics,7,379. Fun,J.andGijbels,I.(1996),LocalPolynomialModellinganditsApplications,ChapmanandHall. 123

PAGE 124

Furrer,R.andBengtsson,T.(2007),EstimationofHigh-dimensionalPriorandPosteriorCovarianceMatricesinKalmanFilterVariants,JournalofMultivariateAnalysis,98,227. Gabriel,K.R.(1962),Ante-dependenceAnalysisofanOrderedSetofVariables,TheAnnalsofMathematicalStatistics,33,pp.201. Gasser,T.,Kneip,A.,andKohler,W.(1991),AFlexibleandFastMethodforAutomaticSmoothing,JournaloftheAmericanStatisticalAssociation,86,pp.643. Gasser,T.andMuller,H.-G.(1979),KernelEstimationofRegressionFunctions:inSmoothingTechniquesforCurveEstimation,LectureNotesinMath,Berlin:Spring-Verlag. George,E.I.andFoster,D.P.(2000),CalibrationandEmpiricalBayesvariableSelection,Biometrika,87,731. Golub,G.H.,Heath,M.,andWahba,G.(1979),GeneralizedCross-ValidationasaMethodforChoosingaGoodRidgeParameter,Technometrics,21,215. Hall,P.,Fisher,N.I.,andHoffmann,B.(1994),OntheNonparametricEstimationofCovarianceFunctions,TheAnnalsofStatistics,22,pp.2115. Hart,J.D.(1991),KernelRegressionEstimationWithTimeSeriesErrors,JournaloftheRoyalStatisticalSociety.SeriesB(Methodological),53,pp.173. Hastie,T.andLoader,C.(1993),LocalRegression:AutomaticKernelCarpentry,StatisticalScience,8,120. Heagerty,P.J.(1999),MarginallySpeciedLogistic-NormalModelsforLongitudinalBinaryData,Biometrics,55,688. (2002),MarginalizedTransitionModelsandLikelihoodInferenceforLongitudinalCategoricalData,Biometrics,58,pp.342. Hedeker,D.andGibbons,R.D.(1994),ARandom-EffectsOrdinalRegressionModelforMultilevelAnalysis,Biometrics,50,pp.933. Herrmann,E.,Gasser,T.,andKneip,A.(1992),ChoiceofBandwidthforKernelRegressionwhenResidualsareCorrelated,Biometrika,79,783. Huang,J.Z.,Liu,L.,andLiu,N.(2007),EstimationofLargeCovarianceMatricesofLongitudinalDatawithBasisFunctionApproximations,JournalofComputationalandGraphicalStatistics,16,189. Huang,J.Z.,Liu,N.,Pourahmadi,M.,andLiu,L.(2006),CovarianceMatrixSelectionandEstimationviaPenalisedNormalLikelihood,Biometrika,93,pp.85. Joe,H.(2006),GeneratingRandomCorrelationMatricesBasedonPartialCorrelations,JournalofMultivariateAnalysis,97,2177. 124

PAGE 125

Johnstone,I.M.(2001),OntheDistributionoftheLargestEigenvalueinPrincipalComponentsAnalysis,TheAnnalsofStatistics,29,pp.295. Johnstone,I.M.andLu,A.(2007),SparePrincipalComponentsAnalysis,JournaloftheAmericanStatisticalAssociation. Jones,M.C.(1993),SimpleBoundaryCorrectionforKernelDensityEstimation,StatisticsandComputing,3,135. Karoui,N.E.(2009),OperatorNormConsistentEstimationofLarge-dimensionalSparseCovarianceMatrices,TheAnnalsofStatistics,36,2717. Lapierre,Y.D.,Nair,N.P.V.,Chouinard,G.,Awad,A.G.,Saxena,B.,Jones,B.,McClure,D.J.,Bakish,D.,Max,P.,Manchanda,R.,Beaudry,P.,BIoom,D.,Rotstein,E.,Ancill,R.,Sandor,P.,Sladen-Dew,N.,Durand,C.,Chandrasena,R.,Horn,E.,Elliot,D.,Das,M.,Ravindran,A.,andMatsos,G.(1990),AControlledDose-rangingStudyofRemoxiprideandHaloperidolinSchizophrenia-aCanadianMulticentreTrial,ActaPsychiatricaScandinavica,82,72. Ledoit,O.andWolf,M.(2003),ImprovedEstimationoftheCovarianceMatrixofStockReturnswithanApplicationtoPortfolioSelection,JournalofEmpiricalFinance,10,603. Leonard,T.andHsu,J.S.J.(1992),BayesianInferenceforaCovarianceMatrix,TheAnnalsofStatistics,20,pp.1669. Li,Y.(2011),EfcientSemiparametricRegressionforLongitudinalDatawithNonparametricCovarianceEstimation,Biometrika,98,355. Li,Y.,Wang,N.,Hong,M.,Turner,N.D.,Lupton,J.R.,andCarroll,R.J.(2007),NonparametricEstimationofCorrelationFunctionsinLongitudinalandSpatialData,withApplicationtoColonCarcinogenesisExperiments,AnnalsofStatistics,35,1608. Liang,K.Y.andZeger,S.L.(1986),LongitudinalDataAnalysisUsingGeneralizedLinearModels,Biometrics,73,13. Liechty,J.C.,Liechty,M.W.,andMuller,P.(2004),BayesianCorrelationEstimation,Biometrika,91,1. Little,R.J.andRubin,D.B.(2002),StatisticalAnalysiswithMissingData,JohnWiley,NewYork. Loader,C.R.(1999),BandwidthSelection:ClassicalorPlug-In?TheAnnalsofStatistics,27,pp.415. Magnus,J.R.andNeudecker,H.(1984),MatrixDifferentialCalculuswithApplicationsinStatisticsandEconometrics,JohnWileyandSonsLtd. 125

PAGE 126

(1988),MatrixDifferentialCalculuswithApplicationsinStatisticsandEconometrics,JohnWileyandSons,1sted. Marron,J.S.(1987),AComparisonofCross-ValidationTechniquesinDensityEstimation,TheAnnalsofStatistics,15,pp.152. Marshall,A.W.andOlkin,I.(1979),Inequalities:TheoryofMajorizationanditsApplica-tions,AcademicPress,1sted. McCullagh,P.andNelder,J.A.(1989),GeneralizedLinearModels,ChapmanandHall,2nded. Nadaraya,E.(1964),OnEstimatingRegression,TheoryofProbabilityanditsApplica-tions,9,141. Nelsen,R.B.(1999),AnIntroductiontoCopulas,Springer. Park,B.U.andMarron,J.S.(1990a),ComparisonofData-DrivenBandwidthSelectors,JournaloftheAmericanStatisticalAssociation,85,pp.66. (1990b),ComparisonofData-DrivenBandwidthSelectors,JournaloftheAmericanStatisticalAssociation,85,pp.66. Pourahmadi,M.(1999),JointMean-CovarianceModelswithApplicationstoLngitudinalData:UnconstrainedParameterisation,Biometrika,86,677. (2000),MaximumLikelihoodEstimationofGeneralisedLinearModelsforMultivariateNormalCovarianceMatrix,Biometrika,87,pp.425. Pourahmadi,M.andDaniels,M.J.(2002),DynamicConditionallyLinearMixedModelsforLongitudinalData,Biometrics,58,225. Rice,J.A.andSilverman,B.W.(1991),EstimatingtheMeanandCovarianceStructureNonparametricallyWhentheDataareCurves,JournaloftheRoyalStatisticalSociety.SeriesB(Methodological),53,pp.233. Rice,J.A.andWu,C.O.(2001),NonparametricMixedEffectsModelsforUnequallySampledNoisyCurves,Biometrics,57,pp.253. Rothman,A.J.,Levina,E.,andZhu,J.(2010),AnewApproachtoCholesky-basedCovarianceRegularizationinHighDimensions,Biometrika,97,539. Rothman,E.L.A.andZhou,J.(2008),SparseEstimationofLargeCovarianceMatricesviaaNestedLassoPenalty,TheAnnalsofAppliedStatistics,2,245. Rubin,D.B.(1976),InferenceandMissingData,Biometrika,63,581. Ruppert,D.,Sheather,S.J.,andWand,M.P.(1995),AnEffectiveBandwidthSelectorforLocalLeastSquaresRegression,JournaloftheAmericanStatisticalAssociation,90,pp.1257. 126

PAGE 127

Ruppert,D.andWand,M.(1994),MultivariateLocallyWeightedLeastSquaresRegression,TheAnnalsofStatistics,22,1346. Scott,D.W.andTerrell,G.R.(1987),BiasedandUnbiasedCross-ValidationinDensityEstimation,JournaloftheAmericanStatisticalAssociation,82,pp.1131. Smith,M.andKohn,R.(2002),ParsimoniousCovarianceMatrixEstimationforLongitudinalData,JournaloftheAmericanStatisticalAssociation,97,pp.1141. Stein,C.(1975),EstimationofaCovarianceMatrix,RietzLecture,39thAnnualMeetingIMS.Atlanta,Georgia. Stone,C.J.(1984),AnAsymptoticallyOptimalWindowSelectionRuleforKernelDensityEstimates,TheAnnalsofStatistics,12,pp.1285. Stone,M.(1974),Cross-ValidatoryChoiceandAssessmentofStatisticalPredictions,JournaloftheRoyalStatisticalSociety.SeriesB(Methodological),36,pp.111. Tibshirani,R.(1996),RegressionShrinkageandSelectionviatheLasso,JournaloftheRoyalStatisticalSociety.SeriesB(Methodological),58,pp.267. Verbyla,A.P.(1993),ModellingVarianceHeterogeneity:ResidualMaximumLikelihoodandDiagnostics,JournaloftheRoyalStatisticalSociety.SeriesB(Methodological),55,pp.493. Wang,Y.andDaniels,M.J.(2012),EstimatingLargeCorrelationMatricesbyBandingthePartialAutocorrelationMatrix,TechnicalReport. Warton,D.I.(2008),PenalizedNormalLikelihoodandRidgeRegularizationofCorrelationandCovarianceMatrices,JournaloftheAmericanStatisticalAssoci-ation,103,340. Watson,G.S.(1964),SmoothRegressionAnalysis,Sankhya,SeriesA,26,359. Witten,D.M.andTibshirani,R.(2009),Covariance-RegularizedRegressionandClassicationforHighDimensionalProblems,JournalOfTheRoyalStatisticalSocietySeriesB,71,615. Wolfe,P.(1969),ConvergenceConditionsforAscentMethods,SIAMReview,11,pp.226. Wong,F.,Carter,C.K.,andKohn,R.(2003),EfcientEstimationofCovarianceSelectionModels,Biometrika,90,809. Wu,W.B.andPourahmadi,M.(2003),NonparametericEstimationofLargeCovarianceMatricesofLongitudinalData,Biometrika,90,831. Yang,R.andBerger,J.O.(1994),EstimationofaCovarianceMatrixUsingtheReferencePrior,TheAnnalsofStatistics,22,pp.1195. 127

PAGE 128

Yin,J.,Geng,Z.,Li,R.,andWang,H.(2010),NonparamterocCovarianceModel,StatisticaSinica,20,467. Zeger,S.L.andKarim,R.M.(1991),GeneralizedLinearModelsWithRandomEffects;AGibbsSamplingApproach,JournaloftheAmericanStatisticalAssociation,86,pp.79. Zellner,A.(1986),OnAssessingPriorDistributionsandBayesianRegressionAnalysiswithg-priorDistributions,Tech.rep. Zimmerman,D.L.andNunez-Anton,V.(2010),AntedependenceModelsforLongitudi-nalData,BocaRaton,Florida:CRCPress. 128

PAGE 129

BIOGRAPHICALSKETCH YanpinWangwasbornandraisedintheSichuan,China.ShegraduatedfromSichuanNormalUniversitywithBachelordegreeinmathematics.ThenshetaughtmathematicsinmiddleschoolandhighschoolinChina.In2001,shemovedtoUSAwithherHusbandJiangtaoLuoandhersonBinjie.YanpinwasofferedgraduateassistantshipbytheDepartmentofMathematicsattheUniversityofFloridain2005.In2007,sheearnedherMasterDegreeinmathematicalscienceandwasadmittedtotheDepartmentofStatistics.AfterreceivingMasterdegreeinstatistics,shetransferredtotheBiostatisticsDepartmentintheCollegeofPublicHealthandHealthProfessionsattheUniversityofFloridaasherwish.ShegraduatesinAugust2012withherDoctoratedegreeinbiostatistics.DuringhertimeattheUniversityofFlorida,ShehasbeenaninstructorintheDepartmentofMathematicsandaresearchassistantintheCollegeofPublicHealthandHealthProfessionsandtheDepartmentofStatistics.YanpinhaswonseveraltravelawardsforconferencesandENARDistinguishStudentPaperAward. 129