<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?fcla fda="yes"?>
<?fcla dl?>
<!-- Bayesian modeling of nonstationarity in normal and lognormal processes ( Book ) -->
<METS:mets OBJID="UF00098090_00001"
  xmlns:METS="http://www.loc.gov/METS/"
  xmlns:mods="http://www.loc.gov/mods/v3"
  xmlns:xlink="http://www.w3.org/1999/xlink"
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xmlns:daitss="http://www.fcla.edu/dls/md/daitss/"
  xmlns:sobekcm="http://digital.uflib.ufl.edu/metadata/sobekcm/"
  xsi:schemaLocation="http://www.loc.gov/METS/ http://www.loc.gov/standards/mets/mets.xsd
    http://www.loc.gov/mods/v3 http://www.loc.gov/mods/v3/mods-3-3.xsd
    http://www.fcla.edu/dls/md/daitss/ http://www.fcla.edu/dls/md/daitss/daitss.xsd
    http://digital.uflib.ufl.edu/metadata/sobekcm/ http://digital.uflib.ufl.edu/metadata/sobekcm/sobekcm.xsd">
  <METS:metsHdr CREATEDATE="2010-09-20T08:48:29Z" ID="" LASTMODDATE="2010-02-11T00:00:00Z" RECORDSTATUS="NEW">
    <METS:agent ROLE="CREATOR" TYPE="ORGANIZATION">
      <METS:name>UF</METS:name>
      <METS:note>server=TC</METS:note>
      <METS:note>projects=</METS:note>
    </METS:agent>
    <METS:agent OTHERTYPE="SOFTWARE" ROLE="CREATOR" TYPE="OTHER">
      <METS:name>Go UFDC - FDA Preparation Tool</METS:name>
    </METS:agent>
    <METS:agent ROLE="CREATOR" TYPE="INDIVIDUAL">
      <METS:name>UFAD\mariner1</METS:name>
    </METS:agent>
  </METS:metsHdr>
  <METS:dmdSec ID="DMD1">
    <METS:mdWrap MDTYPE="MODS" MIMETYPE="text/xml" LABEL="Metadata">
      <METS:xmlData>
        <mods:mods>
          <mods:genre authority="marcgt">bibliography</mods:genre>
          <mods:genre authority="marcgt">non-fiction</mods:genre>
          <mods:identifier type="AlephBibNum">000071549</mods:identifier>
          <mods:identifier type="OCLC">04536074</mods:identifier>
          <mods:identifier type="NOTIS">AAH6803</mods:identifier>
          <mods:language>
            <mods:languageTerm type="text">English</mods:languageTerm>
            <mods:languageTerm type="code" authority="iso639-2b">eng</mods:languageTerm>
          </mods:language>
          <mods:location>
            <mods:physicalLocation>University of Florida</mods:physicalLocation>
            <mods:physicalLocation type="code">UF</mods:physicalLocation>
          </mods:location>
          <mods:name type="personal">
            <mods:namePart>Velez-Arocho, Jorge Ivan</mods:namePart>
            <mods:namePart type="date">1947-</mods:namePart>
            <mods:role>
              <mods:roleTerm type="text">Main Entity</mods:roleTerm>
            </mods:role>
          </mods:name>
          <mods:note type="thesis">Thesis--University of Florida.</mods:note>
          <mods:note type="bibliography">Bibliography: leaves 198-212.</mods:note>
          <mods:note type="statement of responsibility">by Jorge Ivan Velez-Arocho.</mods:note>
          <mods:note>Typescript.</mods:note>
          <mods:note>Vita.</mods:note>
          <mods:originInfo>
            <mods:place>
              <mods:placeTerm type="code" authority="marccountry">flu</mods:placeTerm>
            </mods:place>
            <mods:dateIssued encoding="marc">1978</mods:dateIssued>
            <mods:dateIssued encoding="marc" point="start">1978</mods:dateIssued>
            <mods:copyrightDate encoding="marc">1978</mods:copyrightDate>
          </mods:originInfo>
          <mods:recordInfo>
            <mods:recordIdentifier source="ufdc">UF00098090_00001</mods:recordIdentifier>
            <mods:recordCreationDate encoding="marc">790109</mods:recordCreationDate>
            <mods:recordOrigin>Imported from (ALEPH)000071549</mods:recordOrigin>
            <mods:recordContentSource>University of Florida</mods:recordContentSource>
            <mods:recordContentSource authority="marcorg">FUG</mods:recordContentSource>
            <mods:languageOfCataloging>
              <mods:languageTerm type="text">English</mods:languageTerm>
              <mods:languageTerm type="code" authority="iso639-2b">eng</mods:languageTerm>
            </mods:languageOfCataloging>
          </mods:recordInfo>
          <mods:relatedItem type="original">
            <mods:physicalDescription>
              <mods:extent>xiii, 24 leaves : graphs ; 28 cm.</mods:extent>
            </mods:physicalDescription>
          </mods:relatedItem>
          <mods:subject ID="SUBJ650_1" authority="lcsh">
            <mods:topic>Bayesian statistical decision theory</mods:topic>
          </mods:subject>
          <mods:subject ID="SUBJ650_2" authority="lcsh">
            <mods:topic>Break-even analysis</mods:topic>
          </mods:subject>
          <mods:subject ID="SUBJ650_3" authority="lcsh">
            <mods:topic>Economic forecasting</mods:topic>
            <mods:topic>Mathematical models</mods:topic>
          </mods:subject>
          <mods:subject ID="SUBJ690_1" authority="lcsh">
            <mods:topic>Management thesis Ph. D</mods:topic>
          </mods:subject>
          <mods:subject ID="SUBJ690_2" authority="lcsh">
            <mods:topic>Dissertations, Academic</mods:topic>
            <mods:topic>Management</mods:topic>
            <mods:geographic>UF</mods:geographic>
          </mods:subject>
          <mods:titleInfo>
            <mods:title>Bayesian modeling of nonstationarity in normal and lognormal processes</mods:title>
            <mods:subTitle>with applications in CVP analysis and life testing models</mods:subTitle>
          </mods:titleInfo>
          <mods:typeOfResource>text</mods:typeOfResource>
        </mods:mods>
      </METS:xmlData>
    </METS:mdWrap>
  </METS:dmdSec>
  <METS:dmdSec ID="DMD2">
    <METS:mdWrap MDTYPE="OTHER" OTHERMDTYPE="SobekCM" LABEL="SobekCM Custom">
      <METS:xmlData>
        <sobekcm:procParam>
          <sobekcm:Collection.Primary>UFIR</sobekcm:Collection.Primary>
          <sobekcm:Collection.Alternate>VENDORIA</sobekcm:Collection.Alternate>
          <sobekcm:SubCollection>UFETD</sobekcm:SubCollection>
          <sobekcm:MainThumbnail>bayesianmodeling00velerich_Page_008thm.jpg</sobekcm:MainThumbnail>
          <sobekcm:Download>
            <sobekcm:fptr FILEID="UR2"/>
          </sobekcm:Download>
          <sobekcm:EncodingLevel>I</sobekcm:EncodingLevel>
        </sobekcm:procParam>
        <sobekcm:bibDesc>
          <sobekcm:BibID>UF00098090</sobekcm:BibID>
          <sobekcm:VID>00001</sobekcm:VID>
          <sobekcm:Source>
            <sobekcm:statement code="UF">University of Florida</sobekcm:statement>
          </sobekcm:Source>
          <sobekcm:Type>Book</sobekcm:Type>
          <sobekcm:SortDate>-1</sobekcm:SortDate>
        </sobekcm:bibDesc>
      </METS:xmlData>
    </METS:mdWrap>
  </METS:dmdSec>
  <METS:amdSec>
    <METS:digiprovMD ID="AMD_DAITSS">
      <METS:mdWrap MDTYPE="OTHER" OTHERMDTYPE="DAITSS">
        <METS:xmlData>
          <daitss:daitss>
            <daitss:AGREEMENT_INFO ACCOUNT="" PROJECT="UFDC"/>
          </daitss:daitss>
        </METS:xmlData>
      </METS:mdWrap>
    </METS:digiprovMD>
  </METS:amdSec>
  <METS:fileSec>
    <METS:fileGrp USE="reference">
      <METS:file GROUPID="G1" ID="J1" MIMETYPE="image/jpeg" SIZE="23259">
        <METS:FLocat LOCTYPE="OTHERLOCTYPE" OTHERLOCTYPE="SYSTEM" xlink:href="bayesianmodeling00velerich_Page_001.jpg"/>
      </METS:file>
      <METS:file GROUPID="G2" ID="J2" MIMETYPE="image/jpeg" SIZE="12667">
        <METS:FLocat LOCTYPE="OTHERLOCTYPE" OTHERLOCTYPE="SYSTEM" xlink:href="bayesianmodeling00velerich_Page_002.jpg"/>
      </METS:file>
G3 J3 19298
bayesianmodeling00velerich_Page_003.jpg
G4 J4 61567
bayesianmodeling00velerich_Page_004.jpg
G5 J5 56285
bayesianmodeling00velerich_Page_005.jpg
G6 J6 64490
bayesianmodeling00velerich_Page_006.jpg
G7 J7 65181
bayesianmodeling00velerich_Page_007.jpg
G8 J8 16256
bayesianmodeling00velerich_Page_008.jpg
G9 J9 20508
bayesianmodeling00velerich_Page_009.jpg
G10 J10 22507
bayesianmodeling00velerich_Page_010.jpg
G11 J11 64959
bayesianmodeling00velerich_Page_011.jpg
G12 J12 69522
bayesianmodeling00velerich_Page_012.jpg
G13 J13 22915
bayesianmodeling00velerich_Page_013.jpg
G14 J14 60922
bayesianmodeling00velerich_Page_014.jpg
G15 J15 75668
bayesianmodeling00velerich_Page_015.jpg
G16 J16 66014
bayesianmodeling00velerich_Page_016.jpg
G17 J17 71667
bayesianmodeling00velerich_Page_017.jpg
G18 J18 62772
bayesianmodeling00velerich_Page_018.jpg
G19 J19 61285
bayesianmodeling00velerich_Page_019.jpg
G20 J20 67013
bayesianmodeling00velerich_Page_020.jpg
G21 J21 30878
bayesianmodeling00velerich_Page_021.jpg
G22 J22 60536
bayesianmodeling00velerich_Page_022.jpg
G23 J23 69272
bayesianmodeling00velerich_Page_023.jpg
G24 J24 60321
bayesianmodeling00velerich_Page_024.jpg
G25 J25 67566
bayesianmodeling00velerich_Page_025.jpg
G26 J26 66355
bayesianmodeling00velerich_Page_026.jpg
G27 J27 69760
bayesianmodeling00velerich_Page_027.jpg
G28 J28 70605
bayesianmodeling00velerich_Page_028.jpg
G29 J29 68303
bayesianmodeling00velerich_Page_029.jpg
G30 J30 65045
bayesianmodeling00velerich_Page_030.jpg
G31 J31 64765
bayesianmodeling00velerich_Page_031.jpg
G32 J32 69474
bayesianmodeling00velerich_Page_032.jpg
G33 J33 68335
bayesianmodeling00velerich_Page_033.jpg
G34 J34 43642
bayesianmodeling00velerich_Page_034.jpg
G35 J35 61367
bayesianmodeling00velerich_Page_035.jpg
G36 J36 50577
bayesianmodeling00velerich_Page_036.jpg
G37 J37 57751
bayesianmodeling00velerich_Page_037.jpg
G38 J38 58363
bayesianmodeling00velerich_Page_038.jpg
G39 J39 55042
bayesianmodeling00velerich_Page_039.jpg
G40 J40 42039
bayesianmodeling00velerich_Page_040.jpg
G41 J41 63732
bayesianmodeling00velerich_Page_041.jpg
G42 J42 67938
bayesianmodeling00velerich_Page_042.jpg
G43 J43 68850
bayesianmodeling00velerich_Page_043.jpg
G44 J44 67153
bayesianmodeling00velerich_Page_044.jpg
G45 J45 67927
bayesianmodeling00velerich_Page_045.jpg
G46 J46 63717
bayesianmodeling00velerich_Page_046.jpg
G47 J47 56180
bayesianmodeling00velerich_Page_047.jpg
G48 J48 66982
bayesianmodeling00velerich_Page_048.jpg
G49 J49 68138
bayesianmodeling00velerich_Page_049.jpg
G50 J50 65562
bayesianmodeling00velerich_Page_050.jpg
G51 J51 69009
bayesianmodeling00velerich_Page_051.jpg
G52 J52 67654
bayesianmodeling00velerich_Page_052.jpg
G53 J53 55797
bayesianmodeling00velerich_Page_053.jpg
G54 J54 67136
bayesianmodeling00velerich_Page_054.jpg
G55 J55 71767
bayesianmodeling00velerich_Page_055.jpg
G56 J56 72815
bayesianmodeling00velerich_Page_056.jpg
G57 J57 74684
bayesianmodeling00velerich_Page_057.jpg
G58 J58 65500
bayesianmodeling00velerich_Page_058.jpg
G59 J59 69920
bayesianmodeling00velerich_Page_059.jpg
G60 J60 67215
bayesianmodeling00velerich_Page_060.jpg
G61 J61 71158
bayesianmodeling00velerich_Page_061.jpg
G62 J62 65416
bayesianmodeling00velerich_Page_062.jpg
G63 J63 47419
bayesianmodeling00velerich_Page_063.jpg
G64 J64 63665
bayesianmodeling00velerich_Page_064.jpg
G65 J65 71882
bayesianmodeling00velerich_Page_065.jpg
G66 J66 66066
bayesianmodeling00velerich_Page_066.jpg
G67 J67 69247
bayesianmodeling00velerich_Page_067.jpg
G68 J68 49671
bayesianmodeling00velerich_Page_068.jpg
G69 J69 69093
bayesianmodeling00velerich_Page_069.jpg
G70 J70 49835
bayesianmodeling00velerich_Page_070.jpg
G71 J71 68334
bayesianmodeling00velerich_Page_071.jpg
G72 J72 65435
bayesianmodeling00velerich_Page_072.jpg
G73 J73 42652
bayesianmodeling00velerich_Page_073.jpg
G74 J74 52890
bayesianmodeling00velerich_Page_074.jpg
G75 J75 73560
bayesianmodeling00velerich_Page_075.jpg
G76 J76 71426
bayesianmodeling00velerich_Page_076.jpg
G77 J77 75950
bayesianmodeling00velerich_Page_077.jpg
G78 J78 53446
bayesianmodeling00velerich_Page_078.jpg
G79 J79 54358
bayesianmodeling00velerich_Page_079.jpg
G80 J80 62249
bayesianmodeling00velerich_Page_080.jpg
G81 J81 39289
bayesianmodeling00velerich_Page_081.jpg
G82 J82 62818
bayesianmodeling00velerich_Page_082.jpg
G83 J83 45812
bayesianmodeling00velerich_Page_083.jpg
G84 J84 56354
bayesianmodeling00velerich_Page_084.jpg
G85 J85 48693
bayesianmodeling00velerich_Page_085.jpg
G86 J86 54615
bayesianmodeling00velerich_Page_086.jpg
G87 J87 58901
bayesianmodeling00velerich_Page_087.jpg
G88 J88 52746
bayesianmodeling00velerich_Page_088.jpg
G89 J89 43831
bayesianmodeling00velerich_Page_089.jpg
G90 J90 61712
bayesianmodeling00velerich_Page_090.jpg
G91 J91 61310
bayesianmodeling00velerich_Page_091.jpg
G92 J92 59505
bayesianmodeling00velerich_Page_092.jpg
G93 J93 63778
bayesianmodeling00velerich_Page_093.jpg
G94 J94 63150
bayesianmodeling00velerich_Page_094.jpg
G95 J95 62554
bayesianmodeling00velerich_Page_095.jpg
G96 J96 64652
bayesianmodeling00velerich_Page_096.jpg
G97 J97 77199
bayesianmodeling00velerich_Page_097.jpg
G98 J98 76070
bayesianmodeling00velerich_Page_098.jpg
G99 J99 65664
bayesianmodeling00velerich_Page_099.jpg
G100 J100 48493
bayesianmodeling00velerich_Page_100.jpg
G101 J101 49988
bayesianmodeling00velerich_Page_101.jpg
G102 J102 49299
bayesianmodeling00velerich_Page_102.jpg
G103 J103 65084
bayesianmodeling00velerich_Page_103.jpg
G104 J104 62165
bayesianmodeling00velerich_Page_104.jpg
G105 J105 72383
bayesianmodeling00velerich_Page_105.jpg
G106 J106 59806
bayesianmodeling00velerich_Page_106.jpg
G107 J107 59621
bayesianmodeling00velerich_Page_107.jpg
G108 J108 61454
bayesianmodeling00velerich_Page_108.jpg
G109 J109 57252
bayesianmodeling00velerich_Page_109.jpg
G110 J110 42583
bayesianmodeling00velerich_Page_110.jpg
G111 J111 55117
bayesianmodeling00velerich_Page_111.jpg
G112 J112 39520
bayesianmodeling00velerich_Page_112.jpg
G113 J113 36518
bayesianmodeling00velerich_Page_113.jpg
G114 J114 36608
bayesianmodeling00velerich_Page_114.jpg
G115 J115 43126
bayesianmodeling00velerich_Page_115.jpg
G116 J116 76628
bayesianmodeling00velerich_Page_116.jpg
G117 J117 69212
bayesianmodeling00velerich_Page_117.jpg
G118 J118 75967
bayesianmodeling00velerich_Page_118.jpg
G119 J119 74967
bayesianmodeling00velerich_Page_119.jpg
G120 J120 63911
bayesianmodeling00velerich_Page_120.jpg
G121 J121 72084
bayesianmodeling00velerich_Page_121.jpg
G122 J122 59282
bayesianmodeling00velerich_Page_122.jpg
G123 J123 71894
bayesianmodeling00velerich_Page_123.jpg
G124 J124 69606
bayesianmodeling00velerich_Page_124.jpg
G125 J125 55989
bayesianmodeling00velerich_Page_125.jpg
G126 J126 61503
bayesianmodeling00velerich_Page_126.jpg
G127 J127 77221
bayesianmodeling00velerich_Page_127.jpg
G128 J128 71400
bayesianmodeling00velerich_Page_128.jpg
G129 J129 70737
bayesianmodeling00velerich_Page_129.jpg
G130 J130 63514
bayesianmodeling00velerich_Page_130.jpg
G131 J131 21070
bayesianmodeling00velerich_Page_131.jpg
G132 J132 63128
bayesianmodeling00velerich_Page_132.jpg
G133 J133 71623
bayesianmodeling00velerich_Page_133.jpg
G134 J134 69698
bayesianmodeling00velerich_Page_134.jpg
G135 J135 61820
bayesianmodeling00velerich_Page_135.jpg
G136 J136 62969
bayesianmodeling00velerich_Page_136.jpg
G137 J137 76025
bayesianmodeling00velerich_Page_137.jpg
G138 J138 54271
bayesianmodeling00velerich_Page_138.jpg
G139 J139 51618
bayesianmodeling00velerich_Page_139.jpg
G140 J140 57656
bayesianmodeling00velerich_Page_140.jpg
G141 J141 54075
bayesianmodeling00velerich_Page_141.jpg
G142 J142 52005
bayesianmodeling00velerich_Page_142.jpg
G143 J143 65149
bayesianmodeling00velerich_Page_143.jpg
G144 J144 59483
bayesianmodeling00velerich_Page_144.jpg
G145 J145 68737
bayesianmodeling00velerich_Page_145.jpg
G146 J146 49482
bayesianmodeling00velerich_Page_146.jpg
G147 J147 68628
bayesianmodeling00velerich_Page_147.jpg
G148 J148 48425
bayesianmodeling00velerich_Page_148.jpg
G149 J149 68945
bayesianmodeling00velerich_Page_149.jpg
G150 J150 66730
bayesianmodeling00velerich_Page_150.jpg
G151 J151 62160
bayesianmodeling00velerich_Page_151.jpg
G152 J152 63808
bayesianmodeling00velerich_Page_152.jpg
G153 J153 68325
bayesianmodeling00velerich_Page_153.jpg
G154 J154 67275
bayesianmodeling00velerich_Page_154.jpg
G155 J155 69680
bayesianmodeling00velerich_Page_155.jpg
G156 J156 67402
bayesianmodeling00velerich_Page_156.jpg
G157 J157 52602
bayesianmodeling00velerich_Page_157.jpg
G158 J158 63074
bayesianmodeling00velerich_Page_158.jpg
G159 J159 75867
bayesianmodeling00velerich_Page_159.jpg
G160 J160 77582
bayesianmodeling00velerich_Page_160.jpg
G161 J161 71989
bayesianmodeling00velerich_Page_161.jpg
G162 J162 19319
bayesianmodeling00velerich_Page_162.jpg
G163 J163 64620
bayesianmodeling00velerich_Page_163.jpg
G164 J164 57394
bayesianmodeling00velerich_Page_164.jpg
G165 J165 73856
bayesianmodeling00velerich_Page_165.jpg
G166 J166 70704
bayesianmodeling00velerich_Page_166.jpg
G167 J167 74011
bayesianmodeling00velerich_Page_167.jpg
G168 J168 63786
bayesianmodeling00velerich_Page_168.jpg
G169 J169 75626
bayesianmodeling00velerich_Page_169.jpg
G170 J170 77000
bayesianmodeling00velerich_Page_170.jpg
G171 J171 53814
bayesianmodeling00velerich_Page_171.jpg
G172 J172 12424
bayesianmodeling00velerich_Page_172.jpg
G173 J173 62490
bayesianmodeling00velerich_Page_173.jpg
G174 J174 41833
bayesianmodeling00velerich_Page_174.jpg
G175 J175 59475
bayesianmodeling00velerich_Page_175.jpg
G176 J176 59303
bayesianmodeling00velerich_Page_176.jpg
G177 J177 68890
bayesianmodeling00velerich_Page_177.jpg
G178 J178 53289
bayesianmodeling00velerich_Page_178.jpg
G179 J179 54822
bayesianmodeling00velerich_Page_179.jpg
G180 J180 44465
bayesianmodeling00velerich_Page_180.jpg
G181 J181 66378
bayesianmodeling00velerich_Page_181.jpg
G182 J182 43176
bayesianmodeling00velerich_Page_182.jpg
G183 J183 58531
bayesianmodeling00velerich_Page_183.jpg
G184 J184 12442
bayesianmodeling00velerich_Page_184.jpg
G185 J185 60747
bayesianmodeling00velerich_Page_185.jpg
G186 J186 42713
bayesianmodeling00velerich_Page_186.jpg
G187 J187 52749
bayesianmodeling00velerich_Page_187.jpg
G188 J188 46707
bayesianmodeling00velerich_Page_188.jpg
G189 J189 54372
bayesianmodeling00velerich_Page_189.jpg
G190 J190 53188
bayesianmodeling00velerich_Page_190.jpg
G191 J191 45968
bayesianmodeling00velerich_Page_191.jpg
G192 J192 44123
bayesianmodeling00velerich_Page_192.jpg
G193 J193 50607
bayesianmodeling00velerich_Page_193.jpg
G194 J194 59667
bayesianmodeling00velerich_Page_194.jpg
G195 J195 39062
bayesianmodeling00velerich_Page_195.jpg
G196 J196 21508
bayesianmodeling00velerich_Page_196.jpg
G197 J197 12756
bayesianmodeling00velerich_Page_197.jpg
G198 J198 51336
bayesianmodeling00velerich_Page_198.jpg
G199 J199 36691
bayesianmodeling00velerich_Page_199.jpg
G200 J200 41370
bayesianmodeling00velerich_Page_200.jpg
G201 J201 33109
bayesianmodeling00velerich_Page_201.jpg
G202 J202 30310
bayesianmodeling00velerich_Page_202.jpg
G203 J203 67617
bayesianmodeling00velerich_Page_203.jpg
G204 J204 40443
bayesianmodeling00velerich_Page_204.jpg
G205 J205 39560
bayesianmodeling00velerich_Page_205.jpg
G206 J206 51861
bayesianmodeling00velerich_Page_206.jpg
G207 J207 31199
bayesianmodeling00velerich_Page_207.jpg
G208 J208 31589
bayesianmodeling00velerich_Page_208.jpg
G209 J209 36283
bayesianmodeling00velerich_Page_209.jpg
G210 J210 15726
bayesianmodeling00velerich_Page_210.jpg
G211 J211 66145
bayesianmodeling00velerich_Page_211.jpg
G212 J212 76457
bayesianmodeling00velerich_Page_212.jpg
G213 J213 83927
bayesianmodeling00velerich_Page_213.jpg
G214 J214 74875
bayesianmodeling00velerich_Page_214.jpg
G215 J215 72788
bayesianmodeling00velerich_Page_215.jpg
G216 J216 77835
bayesianmodeling00velerich_Page_216.jpg
G217 J217 83353
bayesianmodeling00velerich_Page_217.jpg
G218 J218 81282
bayesianmodeling00velerich_Page_218.jpg
G219 J219 81820
bayesianmodeling00velerich_Page_219.jpg
G220 J220 77766
bayesianmodeling00velerich_Page_220.jpg
G221 J221 82235
bayesianmodeling00velerich_Page_221.jpg
G222 J222 78079
bayesianmodeling00velerich_Page_222.jpg
G223 J223 84656
bayesianmodeling00velerich_Page_223.jpg
G224 J224 78232
bayesianmodeling00velerich_Page_224.jpg
G225 J225 35346
bayesianmodeling00velerich_Page_225.jpg
G226 J226 60416
bayesianmodeling00velerich_Page_226.jpg
G227 J227 42580
bayesianmodeling00velerich_Page_227.jpg
G228 J228 61943
bayesianmodeling00velerich_Page_228.jpg
G229 J229 33949
bayesianmodeling00velerich_Page_229.jpg
G230 J230 13437
bayesianmodeling00velerich_Page_230.jpg
      <METS:file GROUPID="G1" ID="E1" MIMETYPE="image/jp2" SIZE="313568">
        <METS:FLocat LOCTYPE="OTHERLOCTYPE" OTHERLOCTYPE="SYSTEM" xlink:href="bayesianmodeling00velerich_Page_001.jp2"/>
      </METS:file>
E2 113046
bayesianmodeling00velerich_Page_002.jp2
E3 211684
bayesianmodeling00velerich_Page_003.jp2
E4 835282
bayesianmodeling00velerich_Page_004.jp2
E5 760379
bayesianmodeling00velerich_Page_005.jp2
E6 798354
bayesianmodeling00velerich_Page_006.jp2
E7 801754
bayesianmodeling00velerich_Page_007.jp2
E8 147419
bayesianmodeling00velerich_Page_008.jp2
E9 224823
bayesianmodeling00velerich_Page_009.jp2
E10 248463
bayesianmodeling00velerich_Page_010.jp2
E11 870689
bayesianmodeling00velerich_Page_011.jp2
E12 887400
bayesianmodeling00velerich_Page_012.jp2
E13 273675
bayesianmodeling00velerich_Page_013.jp2
E14 823681
bayesianmodeling00velerich_Page_014.jp2
E15 887366
bayesianmodeling00velerich_Page_015.jp2
E16 887386
bayesianmodeling00velerich_Page_016.jp2
E17 887372
bayesianmodeling00velerich_Page_017.jp2
E18 826527
bayesianmodeling00velerich_Page_018.jp2
E19 828119
bayesianmodeling00velerich_Page_019.jp2
E20 887371
bayesianmodeling00velerich_Page_020.jp2
E21 379143
bayesianmodeling00velerich_Page_021.jp2
E22 838839
bayesianmodeling00velerich_Page_022.jp2
E23 861952
bayesianmodeling00velerich_Page_023.jp2
E24 788670
bayesianmodeling00velerich_Page_024.jp2
E25 861948
bayesianmodeling00velerich_Page_025.jp2
E26 887378
bayesianmodeling00velerich_Page_026.jp2
E27 861863
bayesianmodeling00velerich_Page_027.jp2
E28
bayesianmodeling00velerich_Page_028.jp2
E29 861956
bayesianmodeling00velerich_Page_029.jp2
E30 887383
bayesianmodeling00velerich_Page_030.jp2
E31 853742
bayesianmodeling00velerich_Page_031.jp2
E32
bayesianmodeling00velerich_Page_032.jp2
E33 861960
bayesianmodeling00velerich_Page_033.jp2
E34 548249
bayesianmodeling00velerich_Page_034.jp2
E35 814302
bayesianmodeling00velerich_Page_035.jp2
E36 643206
bayesianmodeling00velerich_Page_036.jp2
E37 748907
bayesianmodeling00velerich_Page_037.jp2
E38 776963
bayesianmodeling00velerich_Page_038.jp2
E39 709742
bayesianmodeling00velerich_Page_039.jp2
E40 537421
bayesianmodeling00velerich_Page_040.jp2
E41 841494
bayesianmodeling00velerich_Page_041.jp2
E42 887398
bayesianmodeling00velerich_Page_042.jp2
E43 861953
bayesianmodeling00velerich_Page_043.jp2
E44 887397
bayesianmodeling00velerich_Page_044.jp2
E45 861959
bayesianmodeling00velerich_Page_045.jp2
E46 853327
bayesianmodeling00velerich_Page_046.jp2
E47 726170
bayesianmodeling00velerich_Page_047.jp2
E48 887375
bayesianmodeling00velerich_Page_048.jp2
E49 861911
bayesianmodeling00velerich_Page_049.jp2
E50 887399
bayesianmodeling00velerich_Page_050.jp2
E51 861919
bayesianmodeling00velerich_Page_051.jp2
E52 887388
bayesianmodeling00velerich_Page_052.jp2
E53 738955
bayesianmodeling00velerich_Page_053.jp2
E54 887387
bayesianmodeling00velerich_Page_054.jp2
E55 861946
bayesianmodeling00velerich_Page_055.jp2
E56
bayesianmodeling00velerich_Page_056.jp2
E57 861930
bayesianmodeling00velerich_Page_057.jp2
E58
bayesianmodeling00velerich_Page_058.jp2
E59 861939
bayesianmodeling00velerich_Page_059.jp2
E60 887393
bayesianmodeling00velerich_Page_060.jp2
E61 861904
bayesianmodeling00velerich_Page_061.jp2
E62 872347
bayesianmodeling00velerich_Page_062.jp2
E63 598071
bayesianmodeling00velerich_Page_063.jp2
E64 859777
bayesianmodeling00velerich_Page_064.jp2
E65 861940
bayesianmodeling00velerich_Page_065.jp2
E66 887392
bayesianmodeling00velerich_Page_066.jp2
E67
bayesianmodeling00velerich_Page_067.jp2
E68 627863
bayesianmodeling00velerich_Page_068.jp2
E69
bayesianmodeling00velerich_Page_069.jp2
E70 658633
bayesianmodeling00velerich_Page_070.jp2
E71 861950
bayesianmodeling00velerich_Page_071.jp2
E72 868977
bayesianmodeling00velerich_Page_072.jp2
E73 523518
bayesianmodeling00velerich_Page_073.jp2
E74 696790
bayesianmodeling00velerich_Page_074.jp2
E75 861957
bayesianmodeling00velerich_Page_075.jp2
E76 887353
bayesianmodeling00velerich_Page_076.jp2
E77 861954
bayesianmodeling00velerich_Page_077.jp2
E78 691779
bayesianmodeling00velerich_Page_078.jp2
E79 700007
bayesianmodeling00velerich_Page_079.jp2
E80 825582
bayesianmodeling00velerich_Page_080.jp2
E81 468390
bayesianmodeling00velerich_Page_081.jp2
E82 827581
bayesianmodeling00velerich_Page_082.jp2
E83 572749
bayesianmodeling00velerich_Page_083.jp2
E84 727870
bayesianmodeling00velerich_Page_084.jp2
E85 605736
bayesianmodeling00velerich_Page_085.jp2
E86 721544
bayesianmodeling00velerich_Page_086.jp2
E87 764591
bayesianmodeling00velerich_Page_087.jp2
E88 685484
bayesianmodeling00velerich_Page_088.jp2
E89 550304
bayesianmodeling00velerich_Page_089.jp2
E90 820417
bayesianmodeling00velerich_Page_090.jp2
E91 797138
bayesianmodeling00velerich_Page_091.jp2
E92 789813
bayesianmodeling00velerich_Page_092.jp2
E93 860454
bayesianmodeling00velerich_Page_093.jp2
E94 829756
bayesianmodeling00velerich_Page_094.jp2
E95 824061
bayesianmodeling00velerich_Page_095.jp2
E96 852781
bayesianmodeling00velerich_Page_096.jp2
E97 861943
bayesianmodeling00velerich_Page_097.jp2
E98
bayesianmodeling00velerich_Page_098.jp2
E99 843764
bayesianmodeling00velerich_Page_099.jp2
E100 613740
bayesianmodeling00velerich_Page_100.jp2
E101 624880
bayesianmodeling00velerich_Page_101.jp2
E102 659063
bayesianmodeling00velerich_Page_102.jp2
E103 825687
bayesianmodeling00velerich_Page_103.jp2
E104 816661
bayesianmodeling00velerich_Page_104.jp2
E105
bayesianmodeling00velerich_Page_105.jp2
E106 771811
bayesianmodeling00velerich_Page_106.jp2
E107 761633
bayesianmodeling00velerich_Page_107.jp2
E108 795175
bayesianmodeling00velerich_Page_108.jp2
E109 722355
bayesianmodeling00velerich_Page_109.jp2
E110 544166
bayesianmodeling00velerich_Page_110.jp2
E111 689140
bayesianmodeling00velerich_Page_111.jp2
E112 494568
bayesianmodeling00velerich_Page_112.jp2
E113 454629
bayesianmodeling00velerich_Page_113.jp2
E114 463867
bayesianmodeling00velerich_Page_114.jp2
E115 535097
bayesianmodeling00velerich_Page_115.jp2
E116 887373
bayesianmodeling00velerich_Page_116.jp2
E117
bayesianmodeling00velerich_Page_117.jp2
E118
bayesianmodeling00velerich_Page_118.jp2
E119
bayesianmodeling00velerich_Page_119.jp2
E120 885692
bayesianmodeling00velerich_Page_120.jp2
E121 861928
bayesianmodeling00velerich_Page_121.jp2
E122 806467
bayesianmodeling00velerich_Page_122.jp2
E123
bayesianmodeling00velerich_Page_123.jp2
E124 908646
bayesianmodeling00velerich_Page_124.jp2
E125 723923
bayesianmodeling00velerich_Page_125.jp2
E126 842567
bayesianmodeling00velerich_Page_126.jp2
E127
bayesianmodeling00velerich_Page_127.jp2
E128 908650
bayesianmodeling00velerich_Page_128.jp2
E129
bayesianmodeling00velerich_Page_129.jp2
E130 870561
bayesianmodeling00velerich_Page_130.jp2
E131 230144
bayesianmodeling00velerich_Page_131.jp2
E132 873317
bayesianmodeling00velerich_Page_132.jp2
E133 861949
bayesianmodeling00velerich_Page_133.jp2
E134 908689
bayesianmodeling00velerich_Page_134.jp2
E135 808921
bayesianmodeling00velerich_Page_135.jp2
E136 889278
bayesianmodeling00velerich_Page_136.jp2
E137 861947
bayesianmodeling00velerich_Page_137.jp2
E138 752827
bayesianmodeling00velerich_Page_138.jp2
E139 650616
bayesianmodeling00velerich_Page_139.jp2
E140 781731
bayesianmodeling00velerich_Page_140.jp2
E141 688524
bayesianmodeling00velerich_Page_141.jp2
E142 720035
bayesianmodeling00velerich_Page_142.jp2
E143 840805
bayesianmodeling00velerich_Page_143.jp2
E144 824316
bayesianmodeling00velerich_Page_144.jp2
E145
bayesianmodeling00velerich_Page_145.jp2
E146 674721
bayesianmodeling00velerich_Page_146.jp2
E147
bayesianmodeling00velerich_Page_147.jp2
E148 644479
bayesianmodeling00velerich_Page_148.jp2
E149
bayesianmodeling00velerich_Page_149.jp2
E150 908661
bayesianmodeling00velerich_Page_150.jp2
E151 805112
bayesianmodeling00velerich_Page_151.jp2
E152 892436
bayesianmodeling00velerich_Page_152.jp2
E153 861941
bayesianmodeling00velerich_Page_153.jp2
E154 908655
bayesianmodeling00velerich_Page_154.jp2
E155
bayesianmodeling00velerich_Page_155.jp2
E156
bayesianmodeling00velerich_Page_156.jp2
E157 661765
bayesianmodeling00velerich_Page_157.jp2
E158 844746
bayesianmodeling00velerich_Page_158.jp2
E159 861955
bayesianmodeling00velerich_Page_159.jp2
E160 895114
bayesianmodeling00velerich_Page_160.jp2
E161
bayesianmodeling00velerich_Page_161.jp2
E162 214446
bayesianmodeling00velerich_Page_162.jp2
E163 856192
bayesianmodeling00velerich_Page_163.jp2
E164 775144
bayesianmodeling00velerich_Page_164.jp2
E165
bayesianmodeling00velerich_Page_165.jp2
E166 895132
bayesianmodeling00velerich_Page_166.jp2
E167
bayesianmodeling00velerich_Page_167.jp2
E168 894978
bayesianmodeling00velerich_Page_168.jp2
E169
bayesianmodeling00velerich_Page_169.jp2
E170 895140
bayesianmodeling00velerich_Page_170.jp2
E171 682642
bayesianmodeling00velerich_Page_171.jp2
E172 108437
bayesianmodeling00velerich_Page_172.jp2
E173 821579
bayesianmodeling00velerich_Page_173.jp2
E174 551174
bayesianmodeling00velerich_Page_174.jp2
E175 736825
bayesianmodeling00velerich_Page_175.jp2
E176 774517
bayesianmodeling00velerich_Page_176.jp2
E177
bayesianmodeling00velerich_Page_177.jp2
E178 709040
bayesianmodeling00velerich_Page_178.jp2
E179 705683
bayesianmodeling00velerich_Page_179.jp2
E180 572522
bayesianmodeling00velerich_Page_180.jp2
E181 861929
bayesianmodeling00velerich_Page_181.jp2
E182 558419
bayesianmodeling00velerich_Page_182.jp2
E183 755399
bayesianmodeling00velerich_Page_183.jp2
E184 112193
bayesianmodeling00velerich_Page_184.jp2
E185 799389
bayesianmodeling00velerich_Page_185.jp2
E186 547340
bayesianmodeling00velerich_Page_186.jp2
E187 690308
bayesianmodeling00velerich_Page_187.jp2
E188 587692
bayesianmodeling00velerich_Page_188.jp2
E189 698965
bayesianmodeling00velerich_Page_189.jp2
E190 708873
bayesianmodeling00velerich_Page_190.jp2
E191 601854
bayesianmodeling00velerich_Page_191.jp2
E192 566886
bayesianmodeling00velerich_Page_192.jp2
E193 644707
bayesianmodeling00velerich_Page_193.jp2
E194 780694
bayesianmodeling00velerich_Page_194.jp2
E195 483157
bayesianmodeling00velerich_Page_195.jp2
E196 235347
bayesianmodeling00velerich_Page_196.jp2
E197 118664
bayesianmodeling00velerich_Page_197.jp2
E198 687294
bayesianmodeling00velerich_Page_198.jp2
E199 426182
bayesianmodeling00velerich_Page_199.jp2
E200 515431
bayesianmodeling00velerich_Page_200.jp2
E201 403181
bayesianmodeling00velerich_Page_201.jp2
E202 367243
bayesianmodeling00velerich_Page_202.jp2
E203
bayesianmodeling00velerich_Page_203.jp2
E204 548017
bayesianmodeling00velerich_Page_204.jp2
E205 513868
bayesianmodeling00velerich_Page_205.jp2
E206 696665
bayesianmodeling00velerich_Page_206.jp2
E207 407082
bayesianmodeling00velerich_Page_207.jp2
E208 421096
bayesianmodeling00velerich_Page_208.jp2
E209 460115
bayesianmodeling00velerich_Page_209.jp2
E210 180632
bayesianmodeling00velerich_Page_210.jp2
E211
bayesianmodeling00velerich_Page_211.jp2
E212 913472
bayesianmodeling00velerich_Page_212.jp2
E213 861923
bayesianmodeling00velerich_Page_213.jp2
E214 913419
bayesianmodeling00velerich_Page_214.jp2
E215 861937
bayesianmodeling00velerich_Page_215.jp2
E216 913424
bayesianmodeling00velerich_Page_216.jp2
E217
bayesianmodeling00velerich_Page_217.jp2
E218 913452
bayesianmodeling00velerich_Page_218.jp2
E219 861909
bayesianmodeling00velerich_Page_219.jp2
E220 913464
bayesianmodeling00velerich_Page_220.jp2
E221 861935
bayesianmodeling00velerich_Page_221.jp2
E222 913435
bayesianmodeling00velerich_Page_222.jp2
E223
bayesianmodeling00velerich_Page_223.jp2
E224 913466
bayesianmodeling00velerich_Page_224.jp2
E225 440012
bayesianmodeling00velerich_Page_225.jp2
E226 844096
bayesianmodeling00velerich_Page_226.jp2
E227 547197
bayesianmodeling00velerich_Page_227.jp2
E228 752597
bayesianmodeling00velerich_Page_228.jp2
E229 421093
bayesianmodeling00velerich_Page_229.jp2
E230 174697
bayesianmodeling00velerich_Page_230.jp2
    </METS:fileGrp>
    <METS:fileGrp USE="archive">
      <METS:file GROUPID="G1" ID="F1" MIMETYPE="image/tiff 6.0" SIZE="22352890">
        <METS:FLocat LOCTYPE="OTHERLOCTYPE" OTHERLOCTYPE="SYSTEM" xlink:href="bayesianmodeling00velerich_Page_001.tif"/>
      </METS:file>
F2 21320720
bayesianmodeling00velerich_Page_002.tif
F3
bayesianmodeling00velerich_Page_003.tif
F4
bayesianmodeling00velerich_Page_004.tif
F5
bayesianmodeling00velerich_Page_005.tif
F6
bayesianmodeling00velerich_Page_006.tif
F7
bayesianmodeling00velerich_Page_007.tif
F8
bayesianmodeling00velerich_Page_008.tif
F9
bayesianmodeling00velerich_Page_009.tif
F10
bayesianmodeling00velerich_Page_010.tif
F11
bayesianmodeling00velerich_Page_011.tif
F12
bayesianmodeling00velerich_Page_012.tif
F13
bayesianmodeling00velerich_Page_013.tif
F14
bayesianmodeling00velerich_Page_014.tif
F15
bayesianmodeling00velerich_Page_015.tif
F16
bayesianmodeling00velerich_Page_016.tif
F17
bayesianmodeling00velerich_Page_017.tif
F18
bayesianmodeling00velerich_Page_018.tif
F19
bayesianmodeling00velerich_Page_019.tif
F20
bayesianmodeling00velerich_Page_020.tif
F21
bayesianmodeling00velerich_Page_021.tif
F22
bayesianmodeling00velerich_Page_022.tif
F23 20709676
bayesianmodeling00velerich_Page_023.tif
F24
bayesianmodeling00velerich_Page_024.tif
F25
bayesianmodeling00velerich_Page_025.tif
F26
bayesianmodeling00velerich_Page_026.tif
F27
bayesianmodeling00velerich_Page_027.tif
F28
bayesianmodeling00velerich_Page_028.tif
F29
bayesianmodeling00velerich_Page_029.tif
F30
bayesianmodeling00velerich_Page_030.tif
F31
bayesianmodeling00velerich_Page_031.tif
F32
bayesianmodeling00velerich_Page_032.tif
F33
bayesianmodeling00velerich_Page_033.tif
F34
bayesianmodeling00velerich_Page_034.tif
F35
bayesianmodeling00velerich_Page_035.tif
F36
bayesianmodeling00velerich_Page_036.tif
F37
bayesianmodeling00velerich_Page_037.tif
F38
bayesianmodeling00velerich_Page_038.tif
F39
bayesianmodeling00velerich_Page_039.tif
F40
bayesianmodeling00velerich_Page_040.tif
F41
bayesianmodeling00velerich_Page_041.tif
F42
bayesianmodeling00velerich_Page_042.tif
F43
bayesianmodeling00velerich_Page_043.tif
F44
bayesianmodeling00velerich_Page_044.tif
F45
bayesianmodeling00velerich_Page_045.tif
F46
bayesianmodeling00velerich_Page_046.tif
F47
bayesianmodeling00velerich_Page_047.tif
F48
bayesianmodeling00velerich_Page_048.tif
F49
bayesianmodeling00velerich_Page_049.tif
F50
bayesianmodeling00velerich_Page_050.tif
F51
bayesianmodeling00velerich_Page_051.tif
F52
bayesianmodeling00velerich_Page_052.tif
F53
bayesianmodeling00velerich_Page_053.tif
F54
bayesianmodeling00velerich_Page_054.tif
F55
bayesianmodeling00velerich_Page_055.tif
F56
bayesianmodeling00velerich_Page_056.tif
F57
bayesianmodeling00velerich_Page_057.tif
F58
bayesianmodeling00velerich_Page_058.tif
F59
bayesianmodeling00velerich_Page_059.tif
F60
bayesianmodeling00velerich_Page_060.tif
F61
bayesianmodeling00velerich_Page_061.tif
F62
bayesianmodeling00velerich_Page_062.tif
F63
bayesianmodeling00velerich_Page_063.tif
F64
bayesianmodeling00velerich_Page_064.tif
F65
bayesianmodeling00velerich_Page_065.tif
F66
bayesianmodeling00velerich_Page_066.tif
F67
bayesianmodeling00velerich_Page_067.tif
F68
bayesianmodeling00velerich_Page_068.tif
F69
bayesianmodeling00velerich_Page_069.tif
F70
bayesianmodeling00velerich_Page_070.tif
F71
bayesianmodeling00velerich_Page_071.tif
F72
bayesianmodeling00velerich_Page_072.tif
F73
bayesianmodeling00velerich_Page_073.tif
F74
bayesianmodeling00velerich_Page_074.tif
F75
bayesianmodeling00velerich_Page_075.tif
F76
bayesianmodeling00velerich_Page_076.tif
F77
bayesianmodeling00velerich_Page_077.tif
F78
bayesianmodeling00velerich_Page_078.tif
F79
bayesianmodeling00velerich_Page_079.tif
F80
bayesianmodeling00velerich_Page_080.tif
F81
bayesianmodeling00velerich_Page_081.tif
F82
bayesianmodeling00velerich_Page_082.tif
F83
bayesianmodeling00velerich_Page_083.tif
F84
bayesianmodeling00velerich_Page_084.tif
F85
bayesianmodeling00velerich_Page_085.tif
F86
bayesianmodeling00velerich_Page_086.tif
F87
bayesianmodeling00velerich_Page_087.tif
F88
bayesianmodeling00velerich_Page_088.tif
F89
bayesianmodeling00velerich_Page_089.tif
F90
bayesianmodeling00velerich_Page_090.tif
F91
bayesianmodeling00velerich_Page_091.tif
F92
bayesianmodeling00velerich_Page_092.tif
F93
bayesianmodeling00velerich_Page_093.tif
F94
bayesianmodeling00velerich_Page_094.tif
F95
bayesianmodeling00velerich_Page_095.tif
F96
bayesianmodeling00velerich_Page_096.tif
F97
bayesianmodeling00velerich_Page_097.tif
F98
bayesianmodeling00velerich_Page_098.tif
F99
bayesianmodeling00velerich_Page_099.tif
F100
bayesianmodeling00velerich_Page_100.tif
F101
bayesianmodeling00velerich_Page_101.tif
F102
bayesianmodeling00velerich_Page_102.tif
F103
bayesianmodeling00velerich_Page_103.tif
F104
bayesianmodeling00velerich_Page_104.tif
F105
bayesianmodeling00velerich_Page_105.tif
F106
bayesianmodeling00velerich_Page_106.tif
F107
bayesianmodeling00velerich_Page_107.tif
F108
bayesianmodeling00velerich_Page_108.tif
F109
bayesianmodeling00velerich_Page_109.tif
F110
bayesianmodeling00velerich_Page_110.tif
F111
bayesianmodeling00velerich_Page_111.tif
F112
bayesianmodeling00velerich_Page_112.tif
F113
bayesianmodeling00velerich_Page_113.tif
F114
bayesianmodeling00velerich_Page_114.tif
F115
bayesianmodeling00velerich_Page_115.tif
F116
bayesianmodeling00velerich_Page_116.tif
F117
bayesianmodeling00velerich_Page_117.tif
F118
bayesianmodeling00velerich_Page_118.tif
F119
bayesianmodeling00velerich_Page_119.tif
F120 21831628
bayesianmodeling00velerich_Page_120.tif
F121
bayesianmodeling00velerich_Page_121.tif
F122
bayesianmodeling00velerich_Page_122.tif
F123
bayesianmodeling00velerich_Page_123.tif
F124
bayesianmodeling00velerich_Page_124.tif
F125
bayesianmodeling00velerich_Page_125.tif
F126
bayesianmodeling00velerich_Page_126.tif
F127
bayesianmodeling00velerich_Page_127.tif
F128
bayesianmodeling00velerich_Page_128.tif
F129
bayesianmodeling00velerich_Page_129.tif
F130
bayesianmodeling00velerich_Page_130.tif
F131
bayesianmodeling00velerich_Page_131.tif
F132
bayesianmodeling00velerich_Page_132.tif
F133
bayesianmodeling00velerich_Page_133.tif
F134
bayesianmodeling00velerich_Page_134.tif
F135
bayesianmodeling00velerich_Page_135.tif
F136
bayesianmodeling00velerich_Page_136.tif
F137
bayesianmodeling00velerich_Page_137.tif
F138
bayesianmodeling00velerich_Page_138.tif
F139
bayesianmodeling00velerich_Page_139.tif
F140
bayesianmodeling00velerich_Page_140.tif
F141
bayesianmodeling00velerich_Page_141.tif
F142
bayesianmodeling00velerich_Page_142.tif
F143
bayesianmodeling00velerich_Page_143.tif
F144
bayesianmodeling00velerich_Page_144.tif
F145
bayesianmodeling00velerich_Page_145.tif
F146
bayesianmodeling00velerich_Page_146.tif
F147
bayesianmodeling00velerich_Page_147.tif
F148
bayesianmodeling00velerich_Page_148.tif
F149
bayesianmodeling00velerich_Page_149.tif
F150
bayesianmodeling00velerich_Page_150.tif
F151
bayesianmodeling00velerich_Page_151.tif
F152
bayesianmodeling00velerich_Page_152.tif
F153
bayesianmodeling00velerich_Page_153.tif
F154
bayesianmodeling00velerich_Page_154.tif
F155
bayesianmodeling00velerich_Page_155.tif
F156
bayesianmodeling00velerich_Page_156.tif
F157
bayesianmodeling00velerich_Page_157.tif
F158
bayesianmodeling00velerich_Page_158.tif
F159
bayesianmodeling00velerich_Page_159.tif
F160 21506452
bayesianmodeling00velerich_Page_160.tif
F161
bayesianmodeling00velerich_Page_161.tif
F162
bayesianmodeling00velerich_Page_162.tif
F163
bayesianmodeling00velerich_Page_163.tif
F164
bayesianmodeling00velerich_Page_164.tif
F165
bayesianmodeling00velerich_Page_165.tif
F166
bayesianmodeling00velerich_Page_166.tif
F167
bayesianmodeling00velerich_Page_167.tif
F168
bayesianmodeling00velerich_Page_168.tif
F169
bayesianmodeling00velerich_Page_169.tif
F170
bayesianmodeling00velerich_Page_170.tif
F171
bayesianmodeling00velerich_Page_171.tif
F172
bayesianmodeling00velerich_Page_172.tif
F173
bayesianmodeling00velerich_Page_173.tif
F174
bayesianmodeling00velerich_Page_174.tif
F175
bayesianmodeling00velerich_Page_175.tif
F176
bayesianmodeling00velerich_Page_176.tif
F177
bayesianmodeling00velerich_Page_177.tif
F178
bayesianmodeling00velerich_Page_178.tif
F179
bayesianmodeling00velerich_Page_179.tif
F180
bayesianmodeling00velerich_Page_180.tif
F181
bayesianmodeling00velerich_Page_181.tif
F182
bayesianmodeling00velerich_Page_182.tif
F183
bayesianmodeling00velerich_Page_183.tif
F184
bayesianmodeling00velerich_Page_184.tif
F185
bayesianmodeling00velerich_Page_185.tif
F186
bayesianmodeling00velerich_Page_186.tif
F187
bayesianmodeling00velerich_Page_187.tif
F188
bayesianmodeling00velerich_Page_188.tif
F189
bayesianmodeling00velerich_Page_189.tif
F190
bayesianmodeling00velerich_Page_190.tif
F191
bayesianmodeling00velerich_Page_191.tif
F192
bayesianmodeling00velerich_Page_192.tif
F193
bayesianmodeling00velerich_Page_193.tif
F194
bayesianmodeling00velerich_Page_194.tif
F195
bayesianmodeling00velerich_Page_195.tif
F196
bayesianmodeling00velerich_Page_196.tif
F197
bayesianmodeling00velerich_Page_197.tif
F198
bayesianmodeling00velerich_Page_198.tif
F199
bayesianmodeling00velerich_Page_199.tif
F200
bayesianmodeling00velerich_Page_200.tif
F201
bayesianmodeling00velerich_Page_201.tif
F202
bayesianmodeling00velerich_Page_202.tif
F203
bayesianmodeling00velerich_Page_203.tif
F204 21946396
bayesianmodeling00velerich_Page_204.tif
F205
bayesianmodeling00velerich_Page_205.tif
F206
bayesianmodeling00velerich_Page_206.tif
F207
bayesianmodeling00velerich_Page_207.tif
F208
bayesianmodeling00velerich_Page_208.tif
F209
bayesianmodeling00velerich_Page_209.tif
F210
bayesianmodeling00velerich_Page_210.tif
F211
bayesianmodeling00velerich_Page_211.tif
F212
bayesianmodeling00velerich_Page_212.tif
F213
bayesianmodeling00velerich_Page_213.tif
F214
bayesianmodeling00velerich_Page_214.tif
F215
bayesianmodeling00velerich_Page_215.tif
F216
bayesianmodeling00velerich_Page_216.tif
F217
bayesianmodeling00velerich_Page_217.tif
F218
bayesianmodeling00velerich_Page_218.tif
F219
bayesianmodeling00velerich_Page_219.tif
F220
bayesianmodeling00velerich_Page_220.tif
F221
bayesianmodeling00velerich_Page_221.tif
F222
bayesianmodeling00velerich_Page_222.tif
F223
bayesianmodeling00velerich_Page_223.tif
F224
bayesianmodeling00velerich_Page_224.tif
F225
bayesianmodeling00velerich_Page_225.tif
F226
bayesianmodeling00velerich_Page_226.tif
F227
bayesianmodeling00velerich_Page_227.tif
F228 21016332
bayesianmodeling00velerich_Page_228.tif
F229
bayesianmodeling00velerich_Page_229.tif
F230 22973994
bayesianmodeling00velerich_Page_230.tif
R1 textx-pro 9242
bayesianmodeling00velerich_Page_001.pro
R2 1400
bayesianmodeling00velerich_Page_002.pro
R3 6075
bayesianmodeling00velerich_Page_003.pro
R4 37975
bayesianmodeling00velerich_Page_004.pro
R5 33939
bayesianmodeling00velerich_Page_005.pro
R6 43149
bayesianmodeling00velerich_Page_006.pro
R7 44355
bayesianmodeling00velerich_Page_007.pro
R8 4619
bayesianmodeling00velerich_Page_008.pro
R9 8180
bayesianmodeling00velerich_Page_009.pro
R10 10597
bayesianmodeling00velerich_Page_010.pro
R11 37134
bayesianmodeling00velerich_Page_011.pro
R12 40809
bayesianmodeling00velerich_Page_012.pro
R13 9316
bayesianmodeling00velerich_Page_013.pro
R14 36651
bayesianmodeling00velerich_Page_014.pro
R15 45188
bayesianmodeling00velerich_Page_015.pro
R16 42478
bayesianmodeling00velerich_Page_016.pro
R17 42794
bayesianmodeling00velerich_Page_017.pro
R18 39205
bayesianmodeling00velerich_Page_018.pro
R19 39124
bayesianmodeling00velerich_Page_019.pro
R20 42635
bayesianmodeling00velerich_Page_020.pro
R21 14115
bayesianmodeling00velerich_Page_021.pro
R22 38599
bayesianmodeling00velerich_Page_022.pro
R23 42369
bayesianmodeling00velerich_Page_023.pro
R24 34193
bayesianmodeling00velerich_Page_024.pro
R25 42211
bayesianmodeling00velerich_Page_025.pro
R26 43094
bayesianmodeling00velerich_Page_026.pro
R27 43478
bayesianmodeling00velerich_Page_027.pro
R28 44339
bayesianmodeling00velerich_Page_028.pro
R29 40339
bayesianmodeling00velerich_Page_029.pro
R30 40540
bayesianmodeling00velerich_Page_030.pro
R31 38074
bayesianmodeling00velerich_Page_031.pro
R32 39616
bayesianmodeling00velerich_Page_032.pro
R33 39481
bayesianmodeling00velerich_Page_033.pro
R34 21308
bayesianmodeling00velerich_Page_034.pro
R35 37175
bayesianmodeling00velerich_Page_035.pro
R36 30495
bayesianmodeling00velerich_Page_036.pro
R37 35173
bayesianmodeling00velerich_Page_037.pro
R38 37677
bayesianmodeling00velerich_Page_038.pro
R39 31950
bayesianmodeling00velerich_Page_039.pro
R40 24391
bayesianmodeling00velerich_Page_040.pro
R41 37166
bayesianmodeling00velerich_Page_041.pro
R42 45217
bayesianmodeling00velerich_Page_042.pro
R43 40982
bayesianmodeling00velerich_Page_043.pro
R44 41625
bayesianmodeling00velerich_Page_044.pro
R45 41120
bayesianmodeling00velerich_Page_045.pro
R46 40154
bayesianmodeling00velerich_Page_046.pro
R47 34245
bayesianmodeling00velerich_Page_047.pro
R48 42271
bayesianmodeling00velerich_Page_048.pro
R49 42251
bayesianmodeling00velerich_Page_049.pro
R50 41499
bayesianmodeling00velerich_Page_050.pro
R51 41845
bayesianmodeling00velerich_Page_051.pro
R52 41503
bayesianmodeling00velerich_Page_052.pro
R53 32822
bayesianmodeling00velerich_Page_053.pro
R54 43420
bayesianmodeling00velerich_Page_054.pro
R55 44814
bayesianmodeling00velerich_Page_055.pro
R56 44977
bayesianmodeling00velerich_Page_056.pro
R57 41963
bayesianmodeling00velerich_Page_057.pro
R58 44194
bayesianmodeling00velerich_Page_058.pro
R59 42114
bayesianmodeling00velerich_Page_059.pro
R60 41907
bayesianmodeling00velerich_Page_060.pro
R61 41808
bayesianmodeling00velerich_Page_061.pro
R62 38066
bayesianmodeling00velerich_Page_062.pro
R63 23923
bayesianmodeling00velerich_Page_063.pro
R64 41015
bayesianmodeling00velerich_Page_064.pro
R65 43206
bayesianmodeling00velerich_Page_065.pro
R66 40023
bayesianmodeling00velerich_Page_066.pro
R67 40939
bayesianmodeling00velerich_Page_067.pro
R68 29166
bayesianmodeling00velerich_Page_068.pro
R69 43171
bayesianmodeling00velerich_Page_069.pro
R70 29476
bayesianmodeling00velerich_Page_070.pro
R71 40868
bayesianmodeling00velerich_Page_071.pro
R72 41319
bayesianmodeling00velerich_Page_072.pro
R73 23780
bayesianmodeling00velerich_Page_073.pro
R74 31745
bayesianmodeling00velerich_Page_074.pro
R75 47312
bayesianmodeling00velerich_Page_075.pro
R76 43001
bayesianmodeling00velerich_Page_076.pro
R77 45929
bayesianmodeling00velerich_Page_077.pro
R78 30941
bayesianmodeling00velerich_Page_078.pro
R79 31555
bayesianmodeling00velerich_Page_079.pro
R80 40344
bayesianmodeling00velerich_Page_080.pro
R81 21452
bayesianmodeling00velerich_Page_081.pro
R82 40660
bayesianmodeling00velerich_Page_082.pro
R83 26524
bayesianmodeling00velerich_Page_083.pro
R84 32723
bayesianmodeling00velerich_Page_084.pro
R85 26652
bayesianmodeling00velerich_Page_085.pro
R86 33165
bayesianmodeling00velerich_Page_086.pro
R87 35995
bayesianmodeling00velerich_Page_087.pro
R88 31485
bayesianmodeling00velerich_Page_088.pro
R89 27917
bayesianmodeling00velerich_Page_089.pro
R90 40676
bayesianmodeling00velerich_Page_090.pro
R91 38750
bayesianmodeling00velerich_Page_091.pro
R92 37630
bayesianmodeling00velerich_Page_092.pro
R93 38484
bayesianmodeling00velerich_Page_093.pro
R94 38741
bayesianmodeling00velerich_Page_094.pro
R95 38585
bayesianmodeling00velerich_Page_095.pro
R96 36924
bayesianmodeling00velerich_Page_096.pro
R97 46174
bayesianmodeling00velerich_Page_097.pro
R98 46414
bayesianmodeling00velerich_Page_098.pro
R99 37902
bayesianmodeling00velerich_Page_099.pro
R100 25874
bayesianmodeling00velerich_Page_100.pro
R101 28165
bayesianmodeling00velerich_Page_101.pro
R102 30617
bayesianmodeling00velerich_Page_102.pro
R103 36944
bayesianmodeling00velerich_Page_103.pro
R104 39087
bayesianmodeling00velerich_Page_104.pro
R105 42773
bayesianmodeling00velerich_Page_105.pro
R106 38247
bayesianmodeling00velerich_Page_106.pro
R107 34648
bayesianmodeling00velerich_Page_107.pro
R108 36022
bayesianmodeling00velerich_Page_108.pro
R109 34782
bayesianmodeling00velerich_Page_109.pro
R110 25113
bayesianmodeling00velerich_Page_110.pro
R111 31957
bayesianmodeling00velerich_Page_111.pro
R112 24004
bayesianmodeling00velerich_Page_112.pro
R113 21771
bayesianmodeling00velerich_Page_113.pro
R114 22885
bayesianmodeling00velerich_Page_114.pro
R115 22357
bayesianmodeling00velerich_Page_115.pro
R116 44370
bayesianmodeling00velerich_Page_116.pro
R117 40297
bayesianmodeling00velerich_Page_117.pro
R118 47302
bayesianmodeling00velerich_Page_118.pro
R119 42922
bayesianmodeling00velerich_Page_119.pro
R120 38716
bayesianmodeling00velerich_Page_120.pro
R121 42551
bayesianmodeling00velerich_Page_121.pro
R122 36961
bayesianmodeling00velerich_Page_122.pro
R123 44685
bayesianmodeling00velerich_Page_123.pro
R124
bayesianmodeling00velerich_Page_124.pro
R125 37081
bayesianmodeling00velerich_Page_125.pro
R126 38465
bayesianmodeling00velerich_Page_126.pro
R127 46537
bayesianmodeling00velerich_Page_127.pro
R128 45115
bayesianmodeling00velerich_Page_128.pro
R129 41835
bayesianmodeling00velerich_Page_129.pro
R130 37794
bayesianmodeling00velerich_Page_130.pro
R131 6771
bayesianmodeling00velerich_Page_131.pro
R132 39224
bayesianmodeling00velerich_Page_132.pro
R133 43309
bayesianmodeling00velerich_Page_133.pro
R134 43884
bayesianmodeling00velerich_Page_134.pro
R135 34473
bayesianmodeling00velerich_Page_135.pro
R136 41105
bayesianmodeling00velerich_Page_136.pro
R137 46340
bayesianmodeling00velerich_Page_137.pro
R138 33233
bayesianmodeling00velerich_Page_138.pro
R139 28723
bayesianmodeling00velerich_Page_139.pro
R140 35998
bayesianmodeling00velerich_Page_140.pro
R141 31308
bayesianmodeling00velerich_Page_141.pro
R142 32170
bayesianmodeling00velerich_Page_142.pro
R143 38191
bayesianmodeling00velerich_Page_143.pro
R144 34764
bayesianmodeling00velerich_Page_144.pro
R145 39802
bayesianmodeling00velerich_Page_145.pro
R146 31644
bayesianmodeling00velerich_Page_146.pro
R147 40752
bayesianmodeling00velerich_Page_147.pro
R148 27349
bayesianmodeling00velerich_Page_148.pro
R149 41122
bayesianmodeling00velerich_Page_149.pro
R150 41400
bayesianmodeling00velerich_Page_150.pro
R151 33846
bayesianmodeling00velerich_Page_151.pro
R152 38476
bayesianmodeling00velerich_Page_152.pro
R153 36987
bayesianmodeling00velerich_Page_153.pro
R154 40061
bayesianmodeling00velerich_Page_154.pro
R155 39946
bayesianmodeling00velerich_Page_155.pro
R156 43800
bayesianmodeling00velerich_Page_156.pro
R157 29650
bayesianmodeling00velerich_Page_157.pro
R158 37445
bayesianmodeling00velerich_Page_158.pro
R159 45073
bayesianmodeling00velerich_Page_159.pro
R160 49795
bayesianmodeling00velerich_Page_160.pro
R161 39992
bayesianmodeling00velerich_Page_161.pro
R162 6709
bayesianmodeling00velerich_Page_162.pro
R163 36269
bayesianmodeling00velerich_Page_163.pro
R164 34399
bayesianmodeling00velerich_Page_164.pro
R165 42362
bayesianmodeling00velerich_Page_165.pro
R166 43707
bayesianmodeling00velerich_Page_166.pro
R167 43499
bayesianmodeling00velerich_Page_167.pro
R168 39940
bayesianmodeling00velerich_Page_168.pro
R169 44477
bayesianmodeling00velerich_Page_169.pro
R170 45611
bayesianmodeling00velerich_Page_170.pro
R171 29086
bayesianmodeling00velerich_Page_171.pro
R172 1220
bayesianmodeling00velerich_Page_172.pro
R173 34311
bayesianmodeling00velerich_Page_173.pro
R174 24161
bayesianmodeling00velerich_Page_174.pro
R175 33337
bayesianmodeling00velerich_Page_175.pro
R176 33734
bayesianmodeling00velerich_Page_176.pro
R177 39507
bayesianmodeling00velerich_Page_177.pro
R178 31466
bayesianmodeling00velerich_Page_178.pro
R179 31608
bayesianmodeling00velerich_Page_179.pro
R180 26236
bayesianmodeling00velerich_Page_180.pro
R181 38086
bayesianmodeling00velerich_Page_181.pro
R182 23832
bayesianmodeling00velerich_Page_182.pro
R183 34097
bayesianmodeling00velerich_Page_183.pro
R184 1245
bayesianmodeling00velerich_Page_184.pro
R185 34950
bayesianmodeling00velerich_Page_185.pro
R186 25776
bayesianmodeling00velerich_Page_186.pro
R187 29424
bayesianmodeling00velerich_Page_187.pro
R188 24924
bayesianmodeling00velerich_Page_188.pro
R189 32713
bayesianmodeling00velerich_Page_189.pro
R190 30010
bayesianmodeling00velerich_Page_190.pro
R191 26617
bayesianmodeling00velerich_Page_191.pro
R192 25497
bayesianmodeling00velerich_Page_192.pro
R193 28886
bayesianmodeling00velerich_Page_193.pro
R194 34569
bayesianmodeling00velerich_Page_194.pro
R195 20090
bayesianmodeling00velerich_Page_195.pro
R196 7526
bayesianmodeling00velerich_Page_196.pro
R197
bayesianmodeling00velerich_Page_197.pro
R198 30235
bayesianmodeling00velerich_Page_198.pro
R199 14023
bayesianmodeling00velerich_Page_199.pro
R200 19252
bayesianmodeling00velerich_Page_200.pro
R201 12917
bayesianmodeling00velerich_Page_201.pro
R202 13007
bayesianmodeling00velerich_Page_202.pro
R203 36981
bayesianmodeling00velerich_Page_203.pro
R204 10473
bayesianmodeling00velerich_Page_204.pro
R205 20603
bayesianmodeling00velerich_Page_205.pro
R206 30529
bayesianmodeling00velerich_Page_206.pro
R207 3209
bayesianmodeling00velerich_Page_207.pro
R208 5475
bayesianmodeling00velerich_Page_208.pro
R209 10120
bayesianmodeling00velerich_Page_209.pro
R210 1856
bayesianmodeling00velerich_Page_210.pro
R211 40656
bayesianmodeling00velerich_Page_211.pro
R212 47926
bayesianmodeling00velerich_Page_212.pro
R213 49691
bayesianmodeling00velerich_Page_213.pro
R214 48717
bayesianmodeling00velerich_Page_214.pro
R215 45516
bayesianmodeling00velerich_Page_215.pro
R216 51638
bayesianmodeling00velerich_Page_216.pro
R217 48567
bayesianmodeling00velerich_Page_217.pro
R218 53657
bayesianmodeling00velerich_Page_218.pro
R219 49697
bayesianmodeling00velerich_Page_219.pro
R220 51576
bayesianmodeling00velerich_Page_220.pro
R221 49958
bayesianmodeling00velerich_Page_221.pro
R222 51678
bayesianmodeling00velerich_Page_222.pro
R223 53495
bayesianmodeling00velerich_Page_223.pro
R224 51905
bayesianmodeling00velerich_Page_224.pro
R225 17044
bayesianmodeling00velerich_Page_225.pro
R226 37802
bayesianmodeling00velerich_Page_226.pro
R227 21746
bayesianmodeling00velerich_Page_227.pro
R228 28971
bayesianmodeling00velerich_Page_228.pro
R229 15175
bayesianmodeling00velerich_Page_229.pro
T1 textplain 518
bayesianmodeling00velerich_Page_001.txt
T2 126
bayesianmodeling00velerich_Page_002.txt
T3 313
bayesianmodeling00velerich_Page_003.txt
T4 1650
bayesianmodeling00velerich_Page_004.txt
T5 1394
bayesianmodeling00velerich_Page_005.txt
T6 2081
bayesianmodeling00velerich_Page_006.txt
T7 2174
bayesianmodeling00velerich_Page_007.txt
T8 276
bayesianmodeling00velerich_Page_008.txt
T9 419
bayesianmodeling00velerich_Page_009.txt
T10 528
bayesianmodeling00velerich_Page_010.txt
T11 1743
bayesianmodeling00velerich_Page_011.txt
T12 1756
bayesianmodeling00velerich_Page_012.txt
T13 400
bayesianmodeling00velerich_Page_013.txt
T14 1589
bayesianmodeling00velerich_Page_014.txt
T15 1801
bayesianmodeling00velerich_Page_015.txt
T16 1723
bayesianmodeling00velerich_Page_016.txt
T17 1725
bayesianmodeling00velerich_Page_017.txt
T18 1657
bayesianmodeling00velerich_Page_018.txt
T19 1598
bayesianmodeling00velerich_Page_019.txt
T20 1773
bayesianmodeling00velerich_Page_020.txt
T21 659
bayesianmodeling00velerich_Page_021.txt
T22 1653
bayesianmodeling00velerich_Page_022.txt
T23 1754
bayesianmodeling00velerich_Page_023.txt
T24 1526
bayesianmodeling00velerich_Page_024.txt
T25 1705
bayesianmodeling00velerich_Page_025.txt
T26 1731
bayesianmodeling00velerich_Page_026.txt
T27 1748
bayesianmodeling00velerich_Page_027.txt
T28 1826
bayesianmodeling00velerich_Page_028.txt
T29 1687
bayesianmodeling00velerich_Page_029.txt
T30 1692
bayesianmodeling00velerich_Page_030.txt
T31 1605
bayesianmodeling00velerich_Page_031.txt
T32 1668
bayesianmodeling00velerich_Page_032.txt
T33 1730
bayesianmodeling00velerich_Page_033.txt
T34 1101
bayesianmodeling00velerich_Page_034.txt
T35 1628
bayesianmodeling00velerich_Page_035.txt
T36 1502
bayesianmodeling00velerich_Page_036.txt
T37 1476
bayesianmodeling00velerich_Page_037.txt
T38 1566
bayesianmodeling00velerich_Page_038.txt
T39 1401
bayesianmodeling00velerich_Page_039.txt
T40 1093
bayesianmodeling00velerich_Page_040.txt
T41 1567
bayesianmodeling00velerich_Page_041.txt
T42 1847
bayesianmodeling00velerich_Page_042.txt
T43 1712
bayesianmodeling00velerich_Page_043.txt
T44 1659
bayesianmodeling00velerich_Page_044.txt
T45
bayesianmodeling00velerich_Page_045.txt
T46 1629
bayesianmodeling00velerich_Page_046.txt
T47 1492
bayesianmodeling00velerich_Page_047.txt
T48 1700
bayesianmodeling00velerich_Page_048.txt
T49 1695
bayesianmodeling00velerich_Page_049.txt
T50 1745
bayesianmodeling00velerich_Page_050.txt
T51 1755
bayesianmodeling00velerich_Page_051.txt
T52 1812
bayesianmodeling00velerich_Page_052.txt
T53 1433
bayesianmodeling00velerich_Page_053.txt
T54 1758
bayesianmodeling00velerich_Page_054.txt
T55 1794
bayesianmodeling00velerich_Page_055.txt
T56 1789
bayesianmodeling00velerich_Page_056.txt
T57 1744
bayesianmodeling00velerich_Page_057.txt
T58 1764
bayesianmodeling00velerich_Page_058.txt
T59 1741
bayesianmodeling00velerich_Page_059.txt
T60 1742
bayesianmodeling00velerich_Page_060.txt
T61
bayesianmodeling00velerich_Page_061.txt
T62 1551
bayesianmodeling00velerich_Page_062.txt
T63 968
bayesianmodeling00velerich_Page_063.txt
T64 1677
bayesianmodeling00velerich_Page_064.txt
T65 1808
bayesianmodeling00velerich_Page_065.txt
T66 1606
bayesianmodeling00velerich_Page_066.txt
T67 1707
bayesianmodeling00velerich_Page_067.txt
T68 1331
bayesianmodeling00velerich_Page_068.txt
T69 1768
bayesianmodeling00velerich_Page_069.txt
T70 1293
bayesianmodeling00velerich_Page_070.txt
T71 1689
bayesianmodeling00velerich_Page_071.txt
T72 1710
bayesianmodeling00velerich_Page_072.txt
T73 1260
bayesianmodeling00velerich_Page_073.txt
T74 1410
bayesianmodeling00velerich_Page_074.txt
T75 1891
bayesianmodeling00velerich_Page_075.txt
T76 1727
bayesianmodeling00velerich_Page_076.txt
T77 1886
bayesianmodeling00velerich_Page_077.txt
T78 1320
bayesianmodeling00velerich_Page_078.txt
T79
bayesianmodeling00velerich_Page_079.txt
T80
bayesianmodeling00velerich_Page_080.txt
T81 1367
bayesianmodeling00velerich_Page_081.txt
T82 1709
bayesianmodeling00velerich_Page_082.txt
T83 1447
bayesianmodeling00velerich_Page_083.txt
T84 1515
bayesianmodeling00velerich_Page_084.txt
T85 1273
bayesianmodeling00velerich_Page_085.txt
T86 1444
bayesianmodeling00velerich_Page_086.txt
T87 1634
bayesianmodeling00velerich_Page_087.txt
T88 1374
bayesianmodeling00velerich_Page_088.txt
T89 1489
bayesianmodeling00velerich_Page_089.txt
T90 1787
bayesianmodeling00velerich_Page_090.txt
T91 1717
bayesianmodeling00velerich_Page_091.txt
T92 1875
bayesianmodeling00velerich_Page_092.txt
T93 1684
bayesianmodeling00velerich_Page_093.txt
T94 1632
bayesianmodeling00velerich_Page_094.txt
T95 1593
bayesianmodeling00velerich_Page_095.txt
T96 1680
bayesianmodeling00velerich_Page_096.txt
T97 1830
bayesianmodeling00velerich_Page_097.txt
T98
bayesianmodeling00velerich_Page_098.txt
T99 1870
bayesianmodeling00velerich_Page_099.txt
T100 1235
bayesianmodeling00velerich_Page_100.txt
T101 1540
bayesianmodeling00velerich_Page_101.txt
T102 1594
bayesianmodeling00velerich_Page_102.txt
T103 1834
bayesianmodeling00velerich_Page_103.txt
T104 1881
bayesianmodeling00velerich_Page_104.txt
T105 1917
bayesianmodeling00velerich_Page_105.txt
T106 1848
bayesianmodeling00velerich_Page_106.txt
T107 1529
bayesianmodeling00velerich_Page_107.txt
T108 1610
bayesianmodeling00velerich_Page_108.txt
T109 2175
bayesianmodeling00velerich_Page_109.txt
T110 1644
bayesianmodeling00velerich_Page_110.txt
T111 1749
bayesianmodeling00velerich_Page_111.txt
T112 1685
bayesianmodeling00velerich_Page_112.txt
T113 1069
bayesianmodeling00velerich_Page_113.txt
T114 1221
bayesianmodeling00velerich_Page_114.txt
T115 1196
bayesianmodeling00velerich_Page_115.txt
T116 1850
bayesianmodeling00velerich_Page_116.txt
T117
bayesianmodeling00velerich_Page_117.txt
T118 1961
bayesianmodeling00velerich_Page_118.txt
T119 1765
bayesianmodeling00velerich_Page_119.txt
T120 1716
bayesianmodeling00velerich_Page_120.txt
T121 1757
bayesianmodeling00velerich_Page_121.txt
T122 1663
bayesianmodeling00velerich_Page_122.txt
T123 1782
bayesianmodeling00velerich_Page_123.txt
T124 1803
bayesianmodeling00velerich_Page_124.txt
T125 1600
bayesianmodeling00velerich_Page_125.txt
T126
bayesianmodeling00velerich_Page_126.txt
T127 1884
bayesianmodeling00velerich_Page_127.txt
T128 1793
bayesianmodeling00velerich_Page_128.txt
T129 1674
bayesianmodeling00velerich_Page_129.txt
T130 1648
bayesianmodeling00velerich_Page_130.txt
T131 354
bayesianmodeling00velerich_Page_131.txt
T132
bayesianmodeling00velerich_Page_132.txt
T133 1738
bayesianmodeling00velerich_Page_133.txt
T134
bayesianmodeling00velerich_Page_134.txt
T135 1445
bayesianmodeling00velerich_Page_135.txt
T136 1686
bayesianmodeling00velerich_Page_136.txt
T137 1844
bayesianmodeling00velerich_Page_137.txt
T138 1403
bayesianmodeling00velerich_Page_138.txt
T139 1222
bayesianmodeling00velerich_Page_139.txt
T140 1471
bayesianmodeling00velerich_Page_140.txt
T141 1276
bayesianmodeling00velerich_Page_141.txt
T142 1334
bayesianmodeling00velerich_Page_142.txt
T143 1550
bayesianmodeling00velerich_Page_143.txt
T144 1507
bayesianmodeling00velerich_Page_144.txt
T145 1601
bayesianmodeling00velerich_Page_145.txt
T146 1537
bayesianmodeling00velerich_Page_146.txt
T147 1863
bayesianmodeling00velerich_Page_147.txt
T148
bayesianmodeling00velerich_Page_148.txt
T149 1704
bayesianmodeling00velerich_Page_149.txt
T150
bayesianmodeling00velerich_Page_150.txt
T151 1591
bayesianmodeling00velerich_Page_151.txt
T152
bayesianmodeling00velerich_Page_152.txt
T153 1562
bayesianmodeling00velerich_Page_153.txt
T154 1671
bayesianmodeling00velerich_Page_154.txt
T155 1626
bayesianmodeling00velerich_Page_155.txt
T156 1747
bayesianmodeling00velerich_Page_156.txt
T157 1399
bayesianmodeling00velerich_Page_157.txt
T158 1676
bayesianmodeling00velerich_Page_158.txt
T159
bayesianmodeling00velerich_Page_159.txt
T160 2044
bayesianmodeling00velerich_Page_160.txt
T161
bayesianmodeling00velerich_Page_161.txt
T162 350
bayesianmodeling00velerich_Page_162.txt
T163 1620
bayesianmodeling00velerich_Page_163.txt
T164 1452
bayesianmodeling00velerich_Page_164.txt
T165
bayesianmodeling00velerich_Page_165.txt
T166
bayesianmodeling00velerich_Page_166.txt
T167 1940
bayesianmodeling00velerich_Page_167.txt
T168 1691
bayesianmodeling00velerich_Page_168.txt
T169
bayesianmodeling00velerich_Page_169.txt
T170 1960
bayesianmodeling00velerich_Page_170.txt
T171 1176
bayesianmodeling00velerich_Page_171.txt
T172 127
bayesianmodeling00velerich_Page_172.txt
T173
bayesianmodeling00velerich_Page_173.txt
T174 1234
bayesianmodeling00velerich_Page_174.txt
T175
bayesianmodeling00velerich_Page_175.txt
T176 1396
bayesianmodeling00velerich_Page_176.txt
T177
bayesianmodeling00velerich_Page_177.txt
T178 1427
bayesianmodeling00velerich_Page_178.txt
T179
bayesianmodeling00velerich_Page_179.txt
T180 1809
bayesianmodeling00velerich_Page_180.txt
T181 1652
bayesianmodeling00velerich_Page_181.txt
T182 1217
bayesianmodeling00velerich_Page_182.txt
T183
bayesianmodeling00velerich_Page_183.txt
T184 124
bayesianmodeling00velerich_Page_184.txt
T185
bayesianmodeling00velerich_Page_185.txt
T186 1342
bayesianmodeling00velerich_Page_186.txt
T187 1642
bayesianmodeling00velerich_Page_187.txt
T188 1215
bayesianmodeling00velerich_Page_188.txt
T189 1419
bayesianmodeling00velerich_Page_189.txt
T190 1463
bayesianmodeling00velerich_Page_190.txt
T191 1291
bayesianmodeling00velerich_Page_191.txt
T192 1224
bayesianmodeling00velerich_Page_192.txt
T193 1366
bayesianmodeling00velerich_Page_193.txt
T194 1788
bayesianmodeling00velerich_Page_194.txt
T195 1070
bayesianmodeling00velerich_Page_195.txt
T196 451
bayesianmodeling00velerich_Page_196.txt
T197 123
bayesianmodeling00velerich_Page_197.txt
T198 1390
bayesianmodeling00velerich_Page_198.txt
T199 689
bayesianmodeling00velerich_Page_199.txt
T200 1080
bayesianmodeling00velerich_Page_200.txt
T201 638
bayesianmodeling00velerich_Page_201.txt
T202 695
bayesianmodeling00velerich_Page_202.txt
T203
bayesianmodeling00velerich_Page_203.txt
T204 810
bayesianmodeling00velerich_Page_204.txt
T205 1195
bayesianmodeling00velerich_Page_205.txt
T206 1230
bayesianmodeling00velerich_Page_206.txt
T207 197
bayesianmodeling00velerich_Page_207.txt
T208 330
bayesianmodeling00velerich_Page_208.txt
T209 1141
bayesianmodeling00velerich_Page_209.txt
T210 372
bayesianmodeling00velerich_Page_210.txt
T211 1898
bayesianmodeling00velerich_Page_211.txt
T212 2252
bayesianmodeling00velerich_Page_212.txt
T213 2248
bayesianmodeling00velerich_Page_213.txt
T214 2238
bayesianmodeling00velerich_Page_214.txt
T215 2117
bayesianmodeling00velerich_Page_215.txt
T216 2347
bayesianmodeling00velerich_Page_216.txt
T217 2216
bayesianmodeling00velerich_Page_217.txt
T218 2474
bayesianmodeling00velerich_Page_218.txt
T219 2239
bayesianmodeling00velerich_Page_219.txt
T220 2335
bayesianmodeling00velerich_Page_220.txt
T221
bayesianmodeling00velerich_Page_221.txt
T222 2319
bayesianmodeling00velerich_Page_222.txt
T223 2424
bayesianmodeling00velerich_Page_223.txt
T224 2355
bayesianmodeling00velerich_Page_224.txt
T225 875
bayesianmodeling00velerich_Page_225.txt
T226 1615
bayesianmodeling00velerich_Page_226.txt
T227 929
bayesianmodeling00velerich_Page_227.txt
T228
bayesianmodeling00velerich_Page_228.txt
T229 771
bayesianmodeling00velerich_Page_229.txt
UR1 2325
bayesianmodeling00velerich_Page_001thm.jpg
applicationpdf 9717828
bayesianmodeling00velerich.pdf
AR1 6650
bayesianmodeling00velerich_Page_001.QC.jpg
AR2 4021
bayesianmodeling00velerich_Page_002.QC.jpg
AR3 1603
bayesianmodeling00velerich_Page_002thm.jpg
AR4 5597
bayesianmodeling00velerich_Page_003.QC.jpg
AR5 1966
bayesianmodeling00velerich_Page_003thm.jpg
AR6 19180
bayesianmodeling00velerich_Page_004.QC.jpg
AR7 5442
bayesianmodeling00velerich_Page_004thm.jpg
AR8 17750
bayesianmodeling00velerich_Page_005.QC.jpg
AR9 5217
bayesianmodeling00velerich_Page_005thm.jpg
AR10 20156
bayesianmodeling00velerich_Page_006.QC.jpg
AR11 5194
bayesianmodeling00velerich_Page_006thm.jpg
AR12 18527
bayesianmodeling00velerich_Page_007.QC.jpg
AR13 5085
bayesianmodeling00velerich_Page_007thm.jpg
AR14 4905
bayesianmodeling00velerich_Page_008.QC.jpg
AR15 1778
bayesianmodeling00velerich_Page_008thm.jpg
AR16 6419
bayesianmodeling00velerich_Page_009.QC.jpg
AR17 2163
bayesianmodeling00velerich_Page_009thm.jpg
AR18 7369
bayesianmodeling00velerich_Page_010.QC.jpg
AR19 2376
bayesianmodeling00velerich_Page_010thm.jpg
AR20
bayesianmodeling00velerich_Page_011.QC.jpg
AR21 5638
bayesianmodeling00velerich_Page_011thm.jpg
AR22 22020
bayesianmodeling00velerich_Page_012.QC.jpg
AR23 5994
bayesianmodeling00velerich_Page_012thm.jpg
AR24 7730
bayesianmodeling00velerich_Page_013.QC.jpg
AR25 2464
bayesianmodeling00velerich_Page_013thm.jpg
AR26 19201
bayesianmodeling00velerich_Page_014.QC.jpg
AR27 5301
bayesianmodeling00velerich_Page_014thm.jpg
AR28 23724
bayesianmodeling00velerich_Page_015.QC.jpg
AR29 6285
bayesianmodeling00velerich_Page_015thm.jpg
AR30 20768
bayesianmodeling00velerich_Page_016.QC.jpg
AR31 5405
bayesianmodeling00velerich_Page_016thm.jpg
AR32 23493
bayesianmodeling00velerich_Page_017.QC.jpg
AR33 6480
bayesianmodeling00velerich_Page_017thm.jpg
AR34 19490
bayesianmodeling00velerich_Page_018.QC.jpg
AR35 5482
bayesianmodeling00velerich_Page_018thm.jpg
AR36 19085
bayesianmodeling00velerich_Page_019.QC.jpg
AR37 5425
bayesianmodeling00velerich_Page_019thm.jpg
AR38 21382
bayesianmodeling00velerich_Page_020.QC.jpg
AR39 5887
bayesianmodeling00velerich_Page_020thm.jpg
AR40 10317
bayesianmodeling00velerich_Page_021.QC.jpg
AR41 3096
bayesianmodeling00velerich_Page_021thm.jpg
AR42 19318
bayesianmodeling00velerich_Page_022.QC.jpg
AR43 5400
bayesianmodeling00velerich_Page_022thm.jpg
AR44 21725
bayesianmodeling00velerich_Page_023.QC.jpg
AR45 6084
bayesianmodeling00velerich_Page_023thm.jpg
AR46 19590
bayesianmodeling00velerich_Page_024.QC.jpg
AR47 5681
bayesianmodeling00velerich_Page_024thm.jpg
AR48 21127
bayesianmodeling00velerich_Page_025.QC.jpg
AR49 5745
bayesianmodeling00velerich_Page_025thm.jpg
AR50 20917
bayesianmodeling00velerich_Page_026.QC.jpg
AR51 5759
bayesianmodeling00velerich_Page_026thm.jpg
AR52 21517
bayesianmodeling00velerich_Page_027.QC.jpg
AR53 6032
bayesianmodeling00velerich_Page_027thm.jpg
AR54 22313
bayesianmodeling00velerich_Page_028.QC.jpg
AR55 5931
bayesianmodeling00velerich_Page_028thm.jpg
AR56 21801
bayesianmodeling00velerich_Page_029.QC.jpg
AR57 5936
bayesianmodeling00velerich_Page_029thm.jpg
AR58 20777
bayesianmodeling00velerich_Page_030.QC.jpg
AR59 5666
bayesianmodeling00velerich_Page_030thm.jpg
AR60 20266
bayesianmodeling00velerich_Page_031.QC.jpg
AR61 5792
bayesianmodeling00velerich_Page_031thm.jpg
AR62 21918
bayesianmodeling00velerich_Page_032.QC.jpg
AR63 5903
bayesianmodeling00velerich_Page_032thm.jpg
AR64 21612
bayesianmodeling00velerich_Page_033.QC.jpg
AR65 5866
bayesianmodeling00velerich_Page_033thm.jpg
AR66 14084
bayesianmodeling00velerich_Page_034.QC.jpg
AR67 4227
bayesianmodeling00velerich_Page_034thm.jpg
AR68 19277
bayesianmodeling00velerich_Page_035.QC.jpg
AR69 5391
bayesianmodeling00velerich_Page_035thm.jpg
AR70 16043
bayesianmodeling00velerich_Page_036.QC.jpg
AR71 4598
bayesianmodeling00velerich_Page_036thm.jpg
AR72 17956
bayesianmodeling00velerich_Page_037.QC.jpg
AR73 5185
bayesianmodeling00velerich_Page_037thm.jpg
AR74 18753
bayesianmodeling00velerich_Page_038.QC.jpg
AR75 5170
bayesianmodeling00velerich_Page_038thm.jpg
AR76 16724
bayesianmodeling00velerich_Page_039.QC.jpg
AR77 5037
bayesianmodeling00velerich_Page_039thm.jpg
AR78 13456
bayesianmodeling00velerich_Page_040.QC.jpg
AR79 3985
bayesianmodeling00velerich_Page_040thm.jpg
AR80 19947
bayesianmodeling00velerich_Page_041.QC.jpg
AR81 5369
bayesianmodeling00velerich_Page_041thm.jpg
AR82 21727
bayesianmodeling00velerich_Page_042.QC.jpg
AR83 5796
bayesianmodeling00velerich_Page_042thm.jpg
AR84 22035
bayesianmodeling00velerich_Page_043.QC.jpg
AR85 5708
bayesianmodeling00velerich_Page_043thm.jpg
AR86 20710
bayesianmodeling00velerich_Page_044.QC.jpg
AR87 5723
bayesianmodeling00velerich_Page_044thm.jpg
AR88 21637
bayesianmodeling00velerich_Page_045.QC.jpg
AR89
bayesianmodeling00velerich_Page_045thm.jpg
AR90 20139
bayesianmodeling00velerich_Page_046.QC.jpg
AR91 5459
bayesianmodeling00velerich_Page_046thm.jpg
AR92 18805
bayesianmodeling00velerich_Page_047.QC.jpg
AR93 5458
bayesianmodeling00velerich_Page_047thm.jpg
AR94 21202
bayesianmodeling00velerich_Page_048.QC.jpg
AR95 5819
bayesianmodeling00velerich_Page_048thm.jpg
AR96 21686
bayesianmodeling00velerich_Page_049.QC.jpg
AR97 6062
bayesianmodeling00velerich_Page_049thm.jpg
AR98 21171
bayesianmodeling00velerich_Page_050.QC.jpg
AR99
bayesianmodeling00velerich_Page_050thm.jpg
AR100 21593
bayesianmodeling00velerich_Page_051.QC.jpg
AR101 6161
bayesianmodeling00velerich_Page_051thm.jpg
AR102 21362
bayesianmodeling00velerich_Page_052.QC.jpg
AR103 5829
bayesianmodeling00velerich_Page_052thm.jpg
AR104 18370
bayesianmodeling00velerich_Page_053.QC.jpg
AR105 5350
bayesianmodeling00velerich_Page_053thm.jpg
AR106 21265
bayesianmodeling00velerich_Page_054.QC.jpg
AR107 5677
bayesianmodeling00velerich_Page_054thm.jpg
AR108 22772
bayesianmodeling00velerich_Page_055.QC.jpg
AR109 6208
bayesianmodeling00velerich_Page_055thm.jpg
AR110 23066
bayesianmodeling00velerich_Page_056.QC.jpg
AR111 6370
bayesianmodeling00velerich_Page_056thm.jpg
AR112 23722
bayesianmodeling00velerich_Page_057.QC.jpg
AR113 6612
bayesianmodeling00velerich_Page_057thm.jpg
AR114 21322
bayesianmodeling00velerich_Page_058.QC.jpg
AR115 5850
bayesianmodeling00velerich_Page_058thm.jpg
AR116 21908
bayesianmodeling00velerich_Page_059.QC.jpg
AR117 6132
bayesianmodeling00velerich_Page_059thm.jpg
AR118 21567
bayesianmodeling00velerich_Page_060.QC.jpg
AR119 5747
bayesianmodeling00velerich_Page_060thm.jpg
AR120 22556
bayesianmodeling00velerich_Page_061.QC.jpg
AR121 6310
bayesianmodeling00velerich_Page_061thm.jpg
AR122 20962
bayesianmodeling00velerich_Page_062.QC.jpg
AR123 5775
bayesianmodeling00velerich_Page_062thm.jpg
AR124 15165
bayesianmodeling00velerich_Page_063.QC.jpg
AR125 4222
bayesianmodeling00velerich_Page_063thm.jpg
AR126 20190
bayesianmodeling00velerich_Page_064.QC.jpg
AR127 5464
bayesianmodeling00velerich_Page_064thm.jpg
AR128 22778
bayesianmodeling00velerich_Page_065.QC.jpg
AR129 6225
bayesianmodeling00velerich_Page_065thm.jpg
AR130 21632
bayesianmodeling00velerich_Page_066.QC.jpg
AR131 5722
bayesianmodeling00velerich_Page_066thm.jpg
AR132 22103
bayesianmodeling00velerich_Page_067.QC.jpg
AR133 6217
bayesianmodeling00velerich_Page_067thm.jpg
AR134 15307
bayesianmodeling00velerich_Page_068.QC.jpg
AR135 4636
bayesianmodeling00velerich_Page_068thm.jpg
AR136 21664
bayesianmodeling00velerich_Page_069.QC.jpg
AR137 5813
bayesianmodeling00velerich_Page_069thm.jpg
AR138 15896
bayesianmodeling00velerich_Page_070.QC.jpg
AR139 4477
bayesianmodeling00velerich_Page_070thm.jpg
AR140 21604
bayesianmodeling00velerich_Page_071.QC.jpg
AR141 5889
bayesianmodeling00velerich_Page_071thm.jpg
AR142 20527
bayesianmodeling00velerich_Page_072.QC.jpg
AR143 5537
bayesianmodeling00velerich_Page_072thm.jpg
AR144 13360
bayesianmodeling00velerich_Page_073.QC.jpg
AR145 4069
bayesianmodeling00velerich_Page_073thm.jpg
AR146 17277
bayesianmodeling00velerich_Page_074.QC.jpg
AR147 4983
bayesianmodeling00velerich_Page_074thm.jpg
AR148 23327
bayesianmodeling00velerich_Page_075.QC.jpg
AR149 6605
bayesianmodeling00velerich_Page_075thm.jpg
AR150 23142
bayesianmodeling00velerich_Page_076.QC.jpg
AR151 6177
bayesianmodeling00velerich_Page_076thm.jpg
AR152 23928
bayesianmodeling00velerich_Page_077.QC.jpg
AR153 6306
bayesianmodeling00velerich_Page_077thm.jpg
AR154 17149
bayesianmodeling00velerich_Page_078.QC.jpg
AR155 4985
bayesianmodeling00velerich_Page_078thm.jpg
AR156 16872
bayesianmodeling00velerich_Page_079.QC.jpg
AR157 5134
bayesianmodeling00velerich_Page_079thm.jpg
AR158 20060
bayesianmodeling00velerich_Page_080.QC.jpg
AR159
bayesianmodeling00velerich_Page_080thm.jpg
AR160 12016
bayesianmodeling00velerich_Page_081.QC.jpg
AR161 4014
bayesianmodeling00velerich_Page_081thm.jpg
AR162 20006
bayesianmodeling00velerich_Page_082.QC.jpg
AR163 5490
bayesianmodeling00velerich_Page_082thm.jpg
AR164 15092
bayesianmodeling00velerich_Page_083.QC.jpg
AR165 4250
bayesianmodeling00velerich_Page_083thm.jpg
AR166 18495
bayesianmodeling00velerich_Page_084.QC.jpg
AR167 5160
bayesianmodeling00velerich_Page_084thm.jpg
AR168 15762
bayesianmodeling00velerich_Page_085.QC.jpg
AR169 4740
bayesianmodeling00velerich_Page_085thm.jpg
AR170 17311
bayesianmodeling00velerich_Page_086.QC.jpg
AR171 4911
bayesianmodeling00velerich_Page_086thm.jpg
AR172 18889
bayesianmodeling00velerich_Page_087.QC.jpg
AR173 5408
bayesianmodeling00velerich_Page_087thm.jpg
AR174 16388
bayesianmodeling00velerich_Page_088.QC.jpg
AR175 4871
bayesianmodeling00velerich_Page_088thm.jpg
AR176 13586
bayesianmodeling00velerich_Page_089.QC.jpg
AR177 4038
bayesianmodeling00velerich_Page_089thm.jpg
AR178 19744
bayesianmodeling00velerich_Page_090.QC.jpg
AR179 5444
bayesianmodeling00velerich_Page_090thm.jpg
AR180 19434
bayesianmodeling00velerich_Page_091.QC.jpg
AR181 5362
bayesianmodeling00velerich_Page_091thm.jpg
AR182 19312
bayesianmodeling00velerich_Page_092.QC.jpg
AR183 5283
bayesianmodeling00velerich_Page_092thm.jpg
AR184 20591
bayesianmodeling00velerich_Page_093.QC.jpg
AR185
bayesianmodeling00velerich_Page_093thm.jpg
AR186 19695
bayesianmodeling00velerich_Page_094.QC.jpg
AR187
bayesianmodeling00velerich_Page_094thm.jpg
AR188 19627
bayesianmodeling00velerich_Page_095.QC.jpg
AR189 5443
bayesianmodeling00velerich_Page_095thm.jpg
AR190 20858
bayesianmodeling00velerich_Page_096.QC.jpg
AR191 5805
bayesianmodeling00velerich_Page_096thm.jpg
AR192 25145
bayesianmodeling00velerich_Page_097.QC.jpg
AR193 6879
bayesianmodeling00velerich_Page_097thm.jpg
AR194 24229
bayesianmodeling00velerich_Page_098.QC.jpg
AR195 6622
bayesianmodeling00velerich_Page_098thm.jpg
AR196 21537
bayesianmodeling00velerich_Page_099.QC.jpg
AR197 5843
bayesianmodeling00velerich_Page_099thm.jpg
AR198 15684
bayesianmodeling00velerich_Page_100.QC.jpg
AR199 4615
bayesianmodeling00velerich_Page_100thm.jpg
AR200 16749
bayesianmodeling00velerich_Page_101.QC.jpg
AR201 4955
bayesianmodeling00velerich_Page_101thm.jpg
AR202 16307
bayesianmodeling00velerich_Page_102.QC.jpg
AR203 4847
bayesianmodeling00velerich_Page_102thm.jpg
AR204 21246
bayesianmodeling00velerich_Page_103.QC.jpg
AR205 5823
bayesianmodeling00velerich_Page_103thm.jpg
AR206 20115
bayesianmodeling00velerich_Page_104.QC.jpg
AR207 5817
bayesianmodeling00velerich_Page_104thm.jpg
AR208 23363
bayesianmodeling00velerich_Page_105.QC.jpg
AR209 6573
bayesianmodeling00velerich_Page_105thm.jpg
AR210 19284
bayesianmodeling00velerich_Page_106.QC.jpg
AR211 5548
bayesianmodeling00velerich_Page_106thm.jpg
AR212 19365
bayesianmodeling00velerich_Page_107.QC.jpg
AR213
bayesianmodeling00velerich_Page_107thm.jpg
AR214 19379
bayesianmodeling00velerich_Page_108.QC.jpg
AR215 5622
bayesianmodeling00velerich_Page_108thm.jpg
AR216 18923
bayesianmodeling00velerich_Page_109.QC.jpg
AR217 5356
bayesianmodeling00velerich_Page_109thm.jpg
AR218 14159
bayesianmodeling00velerich_Page_110.QC.jpg
AR219 4147
bayesianmodeling00velerich_Page_110thm.jpg
AR220 17622
bayesianmodeling00velerich_Page_111.QC.jpg
AR221 5117
bayesianmodeling00velerich_Page_111thm.jpg
AR222 12317
bayesianmodeling00velerich_Page_112.QC.jpg
AR223 4077
bayesianmodeling00velerich_Page_112thm.jpg
AR224 11920
bayesianmodeling00velerich_Page_113.QC.jpg
AR225 3870
bayesianmodeling00velerich_Page_113thm.jpg
AR226 12220
bayesianmodeling00velerich_Page_114.QC.jpg
AR227 3897
bayesianmodeling00velerich_Page_114thm.jpg
AR228 14119
bayesianmodeling00velerich_Page_115.QC.jpg
AR229 4384
bayesianmodeling00velerich_Page_115thm.jpg
AR230 24616
bayesianmodeling00velerich_Page_116.QC.jpg
AR231 6577
bayesianmodeling00velerich_Page_116thm.jpg
AR232 22072
bayesianmodeling00velerich_Page_117.QC.jpg
AR233 6242
bayesianmodeling00velerich_Page_117thm.jpg
AR234 24761
bayesianmodeling00velerich_Page_118.QC.jpg
AR235 6690
bayesianmodeling00velerich_Page_118thm.jpg
AR236 23718
bayesianmodeling00velerich_Page_119.QC.jpg
AR237 6372
bayesianmodeling00velerich_Page_119thm.jpg
AR238 20258
bayesianmodeling00velerich_Page_120.QC.jpg
AR239 5646
bayesianmodeling00velerich_Page_120thm.jpg
AR240 23704
bayesianmodeling00velerich_Page_121.QC.jpg
AR241 6220
bayesianmodeling00velerich_Page_121thm.jpg
AR242 18956
bayesianmodeling00velerich_Page_122.QC.jpg
AR243 5535
bayesianmodeling00velerich_Page_122thm.jpg
AR244 22775
bayesianmodeling00velerich_Page_123.QC.jpg
AR245 6184
bayesianmodeling00velerich_Page_123thm.jpg
AR246 21799
bayesianmodeling00velerich_Page_124.QC.jpg
AR247 6148
bayesianmodeling00velerich_Page_124thm.jpg
AR248 17223
bayesianmodeling00velerich_Page_125.QC.jpg
AR249 5089
bayesianmodeling00velerich_Page_125thm.jpg
AR250 20231
bayesianmodeling00velerich_Page_126.QC.jpg
AR251 5645
bayesianmodeling00velerich_Page_126thm.jpg
AR252 24407
bayesianmodeling00velerich_Page_127.QC.jpg
AR253 6613
bayesianmodeling00velerich_Page_127thm.jpg
AR254 22530
bayesianmodeling00velerich_Page_128.QC.jpg
AR255 6058
bayesianmodeling00velerich_Page_128thm.jpg
AR256 22600
bayesianmodeling00velerich_Page_129.QC.jpg
AR257 6090
bayesianmodeling00velerich_Page_129thm.jpg
AR258 20045
bayesianmodeling00velerich_Page_130.QC.jpg
AR259 5816
bayesianmodeling00velerich_Page_130thm.jpg
AR260 6543
bayesianmodeling00velerich_Page_131.QC.jpg
AR261 2237
bayesianmodeling00velerich_Page_131thm.jpg
AR262 20021
bayesianmodeling00velerich_Page_132.QC.jpg
AR263 5507
bayesianmodeling00velerich_Page_132thm.jpg
AR264 22211
bayesianmodeling00velerich_Page_133.QC.jpg
AR265 6323
bayesianmodeling00velerich_Page_133thm.jpg
AR266 22025
bayesianmodeling00velerich_Page_134.QC.jpg
AR267 6092
bayesianmodeling00velerich_Page_134thm.jpg
AR268 20291
bayesianmodeling00velerich_Page_135.QC.jpg
AR269 5947
bayesianmodeling00velerich_Page_135thm.jpg
AR270 20400
bayesianmodeling00velerich_Page_136.QC.jpg
AR271 5498
bayesianmodeling00velerich_Page_136thm.jpg
AR272 24112
bayesianmodeling00velerich_Page_137.QC.jpg
AR273 6703
bayesianmodeling00velerich_Page_137thm.jpg
AR274 17469
bayesianmodeling00velerich_Page_138.QC.jpg
AR275 4872
bayesianmodeling00velerich_Page_138thm.jpg
AR276 16356
bayesianmodeling00velerich_Page_139.QC.jpg
AR277 4874
bayesianmodeling00velerich_Page_139thm.jpg
AR278 18001
bayesianmodeling00velerich_Page_140.QC.jpg
AR279 5045
bayesianmodeling00velerich_Page_140thm.jpg
AR280 17635
bayesianmodeling00velerich_Page_141.QC.jpg
AR281 5204
bayesianmodeling00velerich_Page_141thm.jpg
AR282 16178
bayesianmodeling00velerich_Page_142.QC.jpg
AR283 4755
bayesianmodeling00velerich_Page_142thm.jpg
AR284 21186
bayesianmodeling00velerich_Page_143.QC.jpg
AR285 5670
bayesianmodeling00velerich_Page_143thm.jpg
AR286 18992
bayesianmodeling00velerich_Page_144.QC.jpg
AR287 5364
bayesianmodeling00velerich_Page_144thm.jpg
AR288 22882
bayesianmodeling00velerich_Page_145.QC.jpg
AR289 6385
bayesianmodeling00velerich_Page_145thm.jpg
AR290 16313
bayesianmodeling00velerich_Page_146.QC.jpg
AR291 4733
bayesianmodeling00velerich_Page_146thm.jpg
AR292 23133
bayesianmodeling00velerich_Page_147.QC.jpg
AR293 6108
bayesianmodeling00velerich_Page_147thm.jpg
AR294 15227
bayesianmodeling00velerich_Page_148.QC.jpg
AR295 4624
bayesianmodeling00velerich_Page_148thm.jpg
AR296
bayesianmodeling00velerich_Page_149.QC.jpg
AR297 6015
bayesianmodeling00velerich_Page_149thm.jpg
AR298 21531
bayesianmodeling00velerich_Page_150.QC.jpg
AR299 6053
bayesianmodeling00velerich_Page_150thm.jpg
AR300 20617
bayesianmodeling00velerich_Page_151.QC.jpg
AR301 5799
bayesianmodeling00velerich_Page_151thm.jpg
AR302 21317
bayesianmodeling00velerich_Page_152.QC.jpg
AR303
bayesianmodeling00velerich_Page_152thm.jpg
AR304 21073
bayesianmodeling00velerich_Page_153.QC.jpg
AR305 6134
bayesianmodeling00velerich_Page_153thm.jpg
AR306 20888
bayesianmodeling00velerich_Page_154.QC.jpg
AR307 5768
bayesianmodeling00velerich_Page_154thm.jpg
AR308 22552
bayesianmodeling00velerich_Page_155.QC.jpg
AR309 6199
bayesianmodeling00velerich_Page_155thm.jpg
AR310 21427
bayesianmodeling00velerich_Page_156.QC.jpg
AR311 5955
bayesianmodeling00velerich_Page_156thm.jpg
AR312 17451
bayesianmodeling00velerich_Page_157.QC.jpg
AR313 4904
bayesianmodeling00velerich_Page_157thm.jpg
AR314 19495
bayesianmodeling00velerich_Page_158.QC.jpg
AR315 5470
bayesianmodeling00velerich_Page_158thm.jpg
AR316 24213
bayesianmodeling00velerich_Page_159.QC.jpg
AR317 6579
bayesianmodeling00velerich_Page_159thm.jpg
AR318 23922
bayesianmodeling00velerich_Page_160.QC.jpg
AR319 6200
bayesianmodeling00velerich_Page_160thm.jpg
AR320
bayesianmodeling00velerich_Page_161.QC.jpg
AR321 6473
bayesianmodeling00velerich_Page_161thm.jpg
AR322 6432
bayesianmodeling00velerich_Page_162.QC.jpg
AR323 2188
bayesianmodeling00velerich_Page_162thm.jpg
AR324 21067
bayesianmodeling00velerich_Page_163.QC.jpg
AR325 5902
bayesianmodeling00velerich_Page_163thm.jpg
AR326 17892
bayesianmodeling00velerich_Page_164.QC.jpg
AR327
bayesianmodeling00velerich_Page_164thm.jpg
AR328 23769
bayesianmodeling00velerich_Page_165.QC.jpg
AR329 6304
bayesianmodeling00velerich_Page_165thm.jpg
AR330 21872
bayesianmodeling00velerich_Page_166.QC.jpg
AR331 5954
bayesianmodeling00velerich_Page_166thm.jpg
AR332 24137
bayesianmodeling00velerich_Page_167.QC.jpg
AR333 6435
bayesianmodeling00velerich_Page_167thm.jpg
AR334 20211
bayesianmodeling00velerich_Page_168.QC.jpg
AR335 5836
bayesianmodeling00velerich_Page_168thm.jpg
AR336 23538
bayesianmodeling00velerich_Page_169.QC.jpg
AR337 6707
bayesianmodeling00velerich_Page_169thm.jpg
AR338 24872
bayesianmodeling00velerich_Page_170.QC.jpg
AR339 6876
bayesianmodeling00velerich_Page_170thm.jpg
AR340 17526
bayesianmodeling00velerich_Page_171.QC.jpg
AR341 4949
bayesianmodeling00velerich_Page_171thm.jpg
AR342 3812
bayesianmodeling00velerich_Page_172.QC.jpg
AR343 1599
bayesianmodeling00velerich_Page_172thm.jpg
AR344 20207
bayesianmodeling00velerich_Page_173.QC.jpg
AR345 5661
bayesianmodeling00velerich_Page_173thm.jpg
AR346 13372
bayesianmodeling00velerich_Page_174.QC.jpg
AR347 3983
bayesianmodeling00velerich_Page_174thm.jpg
AR348 19276
bayesianmodeling00velerich_Page_175.QC.jpg
AR349 5335
bayesianmodeling00velerich_Page_175thm.jpg
AR350 19327
bayesianmodeling00velerich_Page_176.QC.jpg
AR351 5392
bayesianmodeling00velerich_Page_176thm.jpg
AR352 22230
bayesianmodeling00velerich_Page_177.QC.jpg
AR353 6202
bayesianmodeling00velerich_Page_177thm.jpg
AR354 17366
bayesianmodeling00velerich_Page_178.QC.jpg
AR355 5057
bayesianmodeling00velerich_Page_178thm.jpg
AR356 17666
bayesianmodeling00velerich_Page_179.QC.jpg
AR357 5053
bayesianmodeling00velerich_Page_179thm.jpg
AR358 13958
bayesianmodeling00velerich_Page_180.QC.jpg
AR359 4256
bayesianmodeling00velerich_Page_180thm.jpg
AR360 21643
bayesianmodeling00velerich_Page_181.QC.jpg
AR361 6185
bayesianmodeling00velerich_Page_181thm.jpg
AR362 14462
bayesianmodeling00velerich_Page_182.QC.jpg
AR363 4553
bayesianmodeling00velerich_Page_182thm.jpg
AR364 18885
bayesianmodeling00velerich_Page_183.QC.jpg
AR365 5320
bayesianmodeling00velerich_Page_183thm.jpg
AR366 3997
bayesianmodeling00velerich_Page_184.QC.jpg
AR367 1596
bayesianmodeling00velerich_Page_184thm.jpg
AR368 19682
bayesianmodeling00velerich_Page_185.QC.jpg
AR369 5605
bayesianmodeling00velerich_Page_185thm.jpg
AR370 13667
bayesianmodeling00velerich_Page_186.QC.jpg
AR371 4234
bayesianmodeling00velerich_Page_186thm.jpg
AR372 17108
bayesianmodeling00velerich_Page_187.QC.jpg
AR373 5191
bayesianmodeling00velerich_Page_187thm.jpg
AR374 15155
bayesianmodeling00velerich_Page_188.QC.jpg
AR375 4545
bayesianmodeling00velerich_Page_188thm.jpg
AR376 17184
bayesianmodeling00velerich_Page_189.QC.jpg
AR377 5224
bayesianmodeling00velerich_Page_189thm.jpg
AR378 16320
bayesianmodeling00velerich_Page_190.QC.jpg
AR379 4910
bayesianmodeling00velerich_Page_190thm.jpg
AR380 15420
bayesianmodeling00velerich_Page_191.QC.jpg
AR381 4494
bayesianmodeling00velerich_Page_191thm.jpg
AR382 14397
bayesianmodeling00velerich_Page_192.QC.jpg
AR383 4528
bayesianmodeling00velerich_Page_192thm.jpg
AR384 16191
bayesianmodeling00velerich_Page_193.QC.jpg
AR385 4843
bayesianmodeling00velerich_Page_193thm.jpg
AR386 19334
bayesianmodeling00velerich_Page_194.QC.jpg
AR387 5594
bayesianmodeling00velerich_Page_194thm.jpg
AR388 13196
bayesianmodeling00velerich_Page_195.QC.jpg
AR389 3956
bayesianmodeling00velerich_Page_195thm.jpg
AR390 6695
bayesianmodeling00velerich_Page_196.QC.jpg
AR391 2353
bayesianmodeling00velerich_Page_196thm.jpg
AR392 4039
bayesianmodeling00velerich_Page_197.QC.jpg
AR393 1651
bayesianmodeling00velerich_Page_197thm.jpg
AR394 16058
bayesianmodeling00velerich_Page_198.QC.jpg
AR395 4794
bayesianmodeling00velerich_Page_198thm.jpg
AR396 12297
bayesianmodeling00velerich_Page_199.QC.jpg
AR397 3837
bayesianmodeling00velerich_Page_199thm.jpg
AR398 13795
bayesianmodeling00velerich_Page_200.QC.jpg
AR399 4218
bayesianmodeling00velerich_Page_200thm.jpg
AR400 10785
bayesianmodeling00velerich_Page_201.QC.jpg
AR401 3556
bayesianmodeling00velerich_Page_201thm.jpg
AR402 10312
bayesianmodeling00velerich_Page_202.QC.jpg
AR403 3345
bayesianmodeling00velerich_Page_202thm.jpg
AR404 21749
bayesianmodeling00velerich_Page_203.QC.jpg
AR405 6142
bayesianmodeling00velerich_Page_203thm.jpg
AR406 12864
bayesianmodeling00velerich_Page_204.QC.jpg
AR407 3990
bayesianmodeling00velerich_Page_204thm.jpg
AR408 12725
bayesianmodeling00velerich_Page_205.QC.jpg
AR409 4240
bayesianmodeling00velerich_Page_205thm.jpg
AR410 16389
bayesianmodeling00velerich_Page_206.QC.jpg
AR411 4646
bayesianmodeling00velerich_Page_206thm.jpg
AR412 8672
bayesianmodeling00velerich_Page_207.QC.jpg
AR413 2702
bayesianmodeling00velerich_Page_207thm.jpg
AR414 8442
bayesianmodeling00velerich_Page_208.QC.jpg
AR415 2438
bayesianmodeling00velerich_Page_208thm.jpg
AR416 10171
bayesianmodeling00velerich_Page_209.QC.jpg
AR417 2915
bayesianmodeling00velerich_Page_209thm.jpg
AR418 4968
bayesianmodeling00velerich_Page_210.QC.jpg
AR419 1799
bayesianmodeling00velerich_Page_210thm.jpg
AR420 19288
bayesianmodeling00velerich_Page_211.QC.jpg
AR421 4946
bayesianmodeling00velerich_Page_211thm.jpg
AR422 21506
bayesianmodeling00velerich_Page_212.QC.jpg
AR423 5617
bayesianmodeling00velerich_Page_212thm.jpg
AR424 23238
bayesianmodeling00velerich_Page_213.QC.jpg
AR425 6028
bayesianmodeling00velerich_Page_213thm.jpg
AR426 20920
bayesianmodeling00velerich_Page_214.QC.jpg
AR427 5403
bayesianmodeling00velerich_Page_214thm.jpg
AR428 20903
bayesianmodeling00velerich_Page_215.QC.jpg
AR429 5613
bayesianmodeling00velerich_Page_215thm.jpg
AR430 21981
bayesianmodeling00velerich_Page_216.QC.jpg
AR431
bayesianmodeling00velerich_Page_216thm.jpg
AR432 24077
bayesianmodeling00velerich_Page_217.QC.jpg
AR433 6317
bayesianmodeling00velerich_Page_217thm.jpg
AR434 22156
bayesianmodeling00velerich_Page_218.QC.jpg
AR435 5647
bayesianmodeling00velerich_Page_218thm.jpg
AR436 22832
bayesianmodeling00velerich_Page_219.QC.jpg
AR437 6099
bayesianmodeling00velerich_Page_219thm.jpg
AR438 21547
bayesianmodeling00velerich_Page_220.QC.jpg
AR439 5685
bayesianmodeling00velerich_Page_220thm.jpg
AR440 23320
bayesianmodeling00velerich_Page_221.QC.jpg
AR441 6319
bayesianmodeling00velerich_Page_221thm.jpg
AR442 21500
bayesianmodeling00velerich_Page_222.QC.jpg
AR443 5438
bayesianmodeling00velerich_Page_222thm.jpg
AR444 23710
bayesianmodeling00velerich_Page_223.QC.jpg
AR445 5962
bayesianmodeling00velerich_Page_223thm.jpg
AR446 21487
bayesianmodeling00velerich_Page_224.QC.jpg
AR447 5243
bayesianmodeling00velerich_Page_224thm.jpg
AR448 9963
bayesianmodeling00velerich_Page_225.QC.jpg
AR449 2985
bayesianmodeling00velerich_Page_225thm.jpg
AR450 19441
bayesianmodeling00velerich_Page_226.QC.jpg
AR451 5261
bayesianmodeling00velerich_Page_226thm.jpg
AR452 13619
bayesianmodeling00velerich_Page_227.QC.jpg
AR453 4005
bayesianmodeling00velerich_Page_227thm.jpg
AR454 17589
bayesianmodeling00velerich_Page_228.QC.jpg
AR455 4631
bayesianmodeling00velerich_Page_228thm.jpg
AR456 9826
bayesianmodeling00velerich_Page_229.QC.jpg
AR457 3042
bayesianmodeling00velerich_Page_229thm.jpg
AR458 4469
bayesianmodeling00velerich_Page_230.QC.jpg
AR459 1752
bayesianmodeling00velerich_Page_230thm.jpg
AR460 286829
UF00098090_00001.mets
METS:structMap STRUCT1 mixed
METS:div DMDID ORDER 0 main
D1 1 Main
P1 Page i
METS:fptr
P2 ii 2
P3 iii 3
P4 iv 4
P5 v 5
P6 vi 6
P7 vii 7
P8 viii 8
P9 ix 9
P10 x 10
P11 xi 11
P12 xii 12
P13 xiii 13
P14 14
P15 15
P16 16
P17 17
P18 18
P19 19
P20 20
P21 21
P22 22
P23 23
P24 24
P25 25
P26 26
P27 27
P28 28
P29 29
P30 30
P31 31
P32 32
P33 33
P34 34
P35 35
P36 36
P37 37
P38 38
P39 39
P40 40
P41 41
P42 42
P43 43
P44 44
P45 45
P46 46
P47 47
P48 48
P49 49
P50 50
P51 51
P52 52
P53 53
P54 54
P55 55
P56 56
P57 57
P58 58
P59 59
P60 60
P61 61
P62 62
P63 63
P64 64
P65 65
P66 66
P67 67
P68 68
P69 69
P70 70
P71 71
P72 72
P73 73
P74 74
P75 75
P76 76
P77 77
P78 78
P79 79
P80 80
P81 81
P82 82
P83 83
P84 84
P85 85
P86 86
P87 87
P88 88
P89 89
P90 90
P91 91
P92 92
P93 93
P94 94
P95 95
P96 96
P97 97
P98 98
P99 99
P100 100
P101 101
P102 102
P103 103
P104 104
P105 105
P106 106
P107 107
P108 108
P109 109
P110 110
P111 111
P112 112
P113 113
P114 114
P115 115
P116 116
P117 117
P118 118
P119 119
P120 120
P121 121
P122 122
P123
P124
P125 125
P126
P127
P128 128
P129 129
P130 130
P131 131
P132 132
P133 133
P134 134
P135 135
P136 136
P137 137
P138 138
P139 139
P140 140
P141 141
P142 142
P143 143
P144 144
P145 145
P146 146
P147 147
P148 148
P149 149
P150 150
P151 151
P152 152
P153 153
P154 154
P155 155
P156 156
P157 157
P158 158
P159 159
P160 160
P161 161
P162 162
P163 163
P164 164
P165 165
P166 166
P167 167
P168 168
P169 169
P170 170
P171 171
P172 172
P173 173
P174 174
P175 175
P176 176
P177 177
P178 178
P179 179
P180 180
P181 181
P182 182
P183 183
P184 184
P185 185
P186 186
P187 187
P188 188
P189 189
P190 190
P191 191
P192 192
P193 193
P194 194
P195 195
P196 196
P197
P198 198
P199 199
P200 200
P201 201
P202 202
P203 203
P204 204
P205 205
P206 206
P207 207
P208 208
P209 209
P210 210
P211 211
P212 212
P213 213
P214 214
P215 215
P216 216
P217 217
P218 218
P219 219
P220 220
P221 221
P222 222
P223 223
P224 224
P225 225
P226 226
P227 227
P228 228
P229 229
P230 230
METS:behaviorSec VIEWS Options available to the user for viewing this item
METS:behavior VIEW1 STRUCTID Default View
METS:mechanism Viewer JPEGs Procedure xlink:type simple xlink:title JPEG_Viewer()
VIEW2 Alternate
zoomable JPEG2000s JP2_Viewer()
VIEW3
Related image viewer shows thumbnails each Related_Image_Viewer()
INTERFACES Banners or interfaces which resource can appear under
INT1 Interface
UFDC_Interface_Loader
PAGE 1
BAYESIAN MODELING OF NONSTATIONARITY IN NORMAL AND LOGNORMAL PROCESSES WITH APPLICATIONS IN CVP ANALYSIS AND LIFE TESTING MODELS By JORGE IVAN VELEZ-AROCHO A DISSERTATION PRESENTED TO THE GRADUATE COUNCIL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY UNIVERSITY OF FLORIDA 1978
PAGE 2
Copyright 1978 by Jorge Ivan Velez-Arocho
PAGE 3
This dissertation stands as a symbol of love to my wife, Angie, and to my daughter, Angeles Maria, without whose understanding, patience and willingness to accept sacrifice this investigation would have been quite impossible.
PAGE 4
ACKNOWLEDGMENTS I would like to acknowledge my full indebtedness to those people who gave their interest, time and effort to making this dissertation possible. To Dr. Christopher B. Barry, who has been my advisor and my friend, I wish to express my gratitude and deepest appreciation for the support he has given me throughout the development of this study. He criticized but tolerated my mistakes and encouraged my good performance. His intelligent guidance, extraordinary competence, and friendly attitude have been a source of inspiration and encouragement for me. I am especially grateful to Dr. Antal Majthay for his sincere advice and assistance during the supervision of my doctoral program and the preparation of this dissertation. I admire and am inspired by his unreserved dedication to excellence in education. He will always be remembered as one of the most valuable models of excellent teaching. The other members of my committee, Dr. Tom Hodgson and Dr. Zoran Pop-Stojanovic, have each in his own way contributed to the successful completion of this work. Appreciation is extended to each for his individual efforts and expressed concern for my progress. Although not on my committee, I would also like to express appreciation to Dr. Gary Koehler, whose support and encouragement came when they were badly needed. To Omar Ruiz, Dean of the School of Business Administration of the University of Puerto Rico at Mayaguez, I am particularly grateful iv
PAGE 5
for his understanding, confidence and cooperation during my leave of absence from that institution. Completion of this study was only possible because of the combined financial support of the University of Puerto Rico, the University of Florida and Peter Eckrich and Sons Co. Their continuous support is sincerely appreciated. I am indebted to Dr. Conrad Doenges, Chairman of the Department of Finance of the University of Texas at Austin, for his interest and help and to the many members of the Finance faculty for their interest during my period of research at the University of Texas. Special thanks go to Nettie Webb for her warm friendship and continuous secretarial assistance to my wife. It is difficult to adequately convey the support my family has provided. My parents, Jorge Velez and Elba Lucrecia Arocho, and my brothers and sisters provided understanding and moral assistance for which I will always be grateful. Their high expectations and constant encouragement have been a powerful factor in shaping my desire to pursue this degree. Most of all a gratitude which cannot be expressed in words goes to my loving wife, Angie, for her patience and persistence in typing this dissertation and for her wonderful attitude throughout the entire arduous process. v
PAGE 6
TABLE OF CONTENTS Page ACKNOWLEDGMENTS iv LIST OF APPENDIX TABLES ix LIST OF FIGURES x ABSTRACT xi Chapter ONE INTRODUCTION 1 1.1 Introduction 1 1.2 Summary of Results and Overview of Dissertation 4 TWO SURVEY OF PERTINENT LITERATURE 9 2.1 Cost-Volume-Profit (CVP) Analysis 9 2.2 Life Testing Models 17 2.2.1 Introduction 17 2.2.2 Some Common Life Distributions 22 2.2.3 Traditional Approach to Life Testing Inferences 29 2.2.4 Bayesian Techniques in Life Testing 33 2.3 Modeling of Nonstationary Processes 41 THREE NONSTATIONARITY IN NORMAL AND LOGNORMAL PROCESSES 51 3.1 Introduction 51 3.2 Bayesian Analysis of Normal and Lognormal Processes 54 3.3 Nonstationary Model for Normal and Lognormal Means 63 3.3.1 μ is Unknown and σ² is Known 65 3.3.2 μ and σ² Both Unknown 70 3.3.3 Stationary Versus Nonstationary Results 74 3.4 Conclusion 80 FOUR LIMITING RESULTS AND PREDICTION INTERVALS FOR NONSTATIONARY NORMAL AND LOGNORMAL PROCESSES 83 4.1 Introduction 83 vi
PAGE 7
Chapter Page 4.2 Special Properties and Limiting Results Under Nonstationarity 86 4.2.1 Limiting Behavior of m' and n' When μ is the Only Unknown Parameter 86 4.2.2 Limiting Behavior of m', n', v' and d' When Both Parameters μ and σ² are Unknown 95 4.3 Prediction Intervals for Normal, Student, Lognormal and LogStudent Distributions 103 4.4 Conclusion 117 FIVE NONSTATIONARITY IN CVP AND STATISTICAL LIFE ANALYSIS 119 5.1 Introduction 119 5.2 Nonstationarity in Cost-Volume-Profit Analysis 120 5.2.1 Existing Analysis 120 5.2.2 Nonstationary Bayesian CVP Model 122 5.2.3 Extensions to the Nonstationary Bayesian CVP Model 136 5.3 Nonstationarity in Statistical Life Analysis 140 5.3.1 Existing Analysis 140 5.3.2 A Life Testing Model Under Nonstationarity 141 5.4 Conclusion 148 SIX CONCLUSIONS, LIMITATIONS AND FURTHER STUDY 150 6.1 Summary 150 6.2 Limitations 152 6.3 Suggestions for Further Research 155 APPENDIXES I Bayesian Analysis of Normal and Lognormal Processes 160 II Nonstationary Models for the Exponential Distribution 172 III Algorithm to Determine Prediction Intervals for Lognormal and LogStudent Distributions 185 vii
PAGE 8
Page LIST OF REFERENCES 198 BIOGRAPHICAL SKETCH 213 viii
PAGE 9
Table LIST OF APPENDIX TABLES Page 1. Predictive Intervals for Some Lognormal Predictive Distributions 191 2. Predictive Intervals for Some LogStudent Predictive Distributions 192 ix
PAGE 10
LIST OF FIGURES Figure Page 1. Life Characteristics of Some Systems 21 AIII.1 Predictive Distribution 186 AIII.2 Predictive Distribution 187 AIII.3 Predictive Distribution 188 AIII.4 Predictive Distribution 189
PAGE 11
Abstract of Dissertation Presented to the Graduate Council of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy BAYESIAN MODELING OF NONSTATIONARITY IN NORMAL AND LOGNORMAL PROCESSES WITH APPLICATIONS IN CVP ANALYSIS AND LIFE TESTING MODELS By Jorge Ivan Velez-Arocho June 1978 Chairman: Christopher B. Barry Major Department: Management Probability models applied by decision makers in a wide variety of contexts must be able to provide inferences under conditions of change. A stochastic process whose probabilistic properties change through time can be described as a nonstationary process. In this dissertation a model involving normal and lognormal processes is developed for handling a particular form of nonstationarity within a Bayesian framework. Two uncertainty conditions are considered: in one the location parameter, μ, is assumed to be unknown and the spread parameter, σ, is assumed to be known; and in the other both parameters are assumed to be unknown. Comparing the nonstationary model with the stationary one it is shown that: 1. more uncertainty (of a particular definition) is present under nonstationarity than under stationarity; 2. since the variance of a lognormal distribution, V(x), is a function of μ and σ², nonstationarity in μ means that both mean and variance of the random variable, x, are nonstationary so that the lognormal xi
PAGE 12
case provides a generalization of the normal results; and 3. as additional observations are collected uncertainty about stochastically-varying parameters is never entirely eliminated. The asymptotic behavior of the model has important implications for the decision maker. An implication of the stationary Bayesian model for normal and lognormal processes is that as additional observations are collected, parameter uncertainty is reduced and (in the limit) eliminated altogether. In contrast, for the nonstationary model considered in this dissertation the following inferential results are obtained: 1. for the case of lognormal or normal model, a particular form of stochastic parameter variation implies a treatment of data involving the use of all observations in a differential weighting scheme; and 2. random parameter variation produces important differences in the limiting behavior of the prior and predictive distributions since under nonstationarity the limiting values of the parameters of the posterior and predictive distributions cannot be determined clearly. Practical implications of the results for the areas of Cost-Volume-Profit Analysis and life testing are discussed with emphasis on the predictive distribution for the outcome of a future observation from the data generating process. It is emphasized that a Cost-Volume-Profit (CVP) and life testing model ideally should include the changing character of the process by allowing for changes in the parametric description of the process through time. Failure to recognize nonstationarity when xii
it is present has a number of implications in the CVP and life testing contexts that are explored in the dissertation. For example, inferences are improperly obtained if the nonstationarity is ignored, and prediction interval coverage probabilities are overstated since uncertainty is greater (in a particular sense) when nonstationarity is present.
CHAPTER ONE
INTRODUCTION

1.1 Introduction

Uncertainty is an essential and intrinsic part of the human condition. The opinions we express, the conclusions we reach and the decisions we make are often based on beliefs concerning the probability of uncertain events such as the result of an experiment, the future value of an investment or the number of units to be sold next year. If management, for instance, were certain about what circumstances would exist at a given time, the preparation of a forecast would be a trivial matter. Virtually all situations faced by management involve uncertainty, however, and judgments must be made and information must be gathered to reduce this uncertainty and its effects.

One of the functions of applied mathematics is to provide information which may be used in making decisions or forming judgments about unknown quantities. Several early studies by econometricians and statisticians examined the problem of constructing a model whose output is as close as possible to the observed data from the real system and which reflects all the uncertainty that the decision maker has. Mathematical models for statistical problems, for instance, have some element of uncertainty incorporated in the form of a probability measure. The model usually involves the formulation of a probability distribution of the uncertain quantities. This element of uncertainty is carried through
the analysis to the inferences drawn. The equations that form the mathematical model are usually specified to within a number of parameters or coefficients which must be estimated. The unknown parameters are usually assumed to be constant, and the problem of model identification is reduced to one of constant parameter estimation.

There are several reasons for suspecting that the parameters of many models constructed by engineers and econometricians are not constant but in fact time-varying. For instance, it has become increasingly clear that to assume that behavioral and technological relationships are stable over time is, in many cases, completely untenable on the basis of economic theory. Several recent studies provide support for the claim that the parameters of distributions of stock-price-related variables may change over time [see Barry and Winkler (1976)]. In engineering, particularly in reliability theory, the origins of parameter variation are usually not very hard to pinpoint. Component wear, variation in inputs or component failure are some very common reasons for parameter variation. The major objective of construction of engineering models is control and regulation of the real system modeled. Therefore, much of the research in that area has concentrated on devising ways to make the output of the model insensitive to parameter variation. Similarly, in forecasting models for economic variables, researchers have had great concern with time-varying parameters of the distributions of interest. In this area the problem of varying parameters has received increased attention because there is increasing evidence that the common regression assumption of stable
parameters often appears invalid.

In this dissertation we plan to study a particular type of random parameter variation which is likely to be applicable when nonstationarity over time is present. The modeling of nonstationarity that we are going to present assumes that successive values in time of the unknown parameter are related in a stochastic manner; i.e., the parameter variation includes a component which is a realization of some random process. For purposes of estimation we are interested in specific realizations of the random process. When the process generating the unknown parameter is a nonstationary process over time, the decision maker should be concerned with a sequence of values of the parameter instead of a single value as in the usual stationary model; i.e., inferences and decisions concerning the parameter should reflect the fact that it is changing over time.

If the values of an unknown parameter over time are related in a stochastic manner, a formal analysis of the situation requires some assumptions about the stochastic relationship. For the model of nonstationarity that we develop in this dissertation, the specification of the stochastic relationship between values of the parameter is sufficient. Moreover, it is assumed that this relationship is stationary (usually referred to as second-order stationarity) in the sense that the stochastic relationship is the same for any pair of consecutive values of the unknown parameter.

We want to gain more precise information about the structure of the time-varying parameters and to obtain estimated relationships
that are suitable for forecasting. The model to be developed makes it possible to draw inferences about the structure of the relationship at every point in time. There are problems in accounting, life testing theory, finance and a variety of other areas that can benefit from nonstationary parameter estimation techniques.

1.2 Summary of Results and Overview of Dissertation

The goals of this dissertation are to develop a rigorous model for handling nonstationarity within a Bayesian framework, to compare inferences from stationary and nonstationary models, and to investigate inferential applications in the areas of Cost-Volume-Profit Analysis and life testing models involving nonstationarity. Probably the most important advantage of the new work to be presented in this dissertation is the increased versatility it adds to the nonstationary Bayesian model derived by Winkler and Barry (1973). The new results enlarge the range of real and important problems involving univariate and multivariate nonstationary normal and lognormal processes which can be handled. Another advantage is the simplicity of the updating methods for the efficient handling of the estimation of unknown parameters and the prediction of the outcome of a future sample.

A survey of the most relevant literature is provided in Chapter Two to set the stage for the new developments in the remainder of the dissertation. In this survey we present an overview of probabilistic Cost-Volume-Profit (CVP) Analysis and discuss the most important articles that deal with CVP under conditions of uncertainty. The review of the
literature includes a section on life testing models emphasizing the use of Bayesian techniques in life testing. It is emphasized that most of the research done in these two areas neglects the problem of nonstationarity. A special section is presented to discuss some important articles about modeling nonstationary processes.

As is mentioned in Chapter Two, most research concerned with the normal and lognormal distributions has considered only stationary situations. That is, the parameters and distributions used are assumed to remain the same in all periods. In Chapter Three we develop a Bayesian model of nonstationarity for normal and lognormal processes. In it we describe essential features of the Bayesian analysis of normal and lognormal processes under nonstationarity, like the prior, posterior and predictive distributions. Two uncertainty conditions are considered in this chapter; in one the location parameter, μ, is assumed to be unknown and the spread parameter, σ, is assumed to be known; and in the other, both parameters are assumed to be unknown. Comparing the nonstationary model with the stationary one it is shown that: 1. more uncertainty (of a particular definition) is present under nonstationarity than under stationarity; 2. since the variance of a lognormal distribution, V(x), is a function of μ and σ², nonstationarity in μ means that both mean and variance of the random variable, x, are nonstationary, so that the lognormal case provides a generalization of the normal results;
and 3. as additional observations are collected, uncertainty about stochastically-varying parameters is never entirely eliminated.

The results discussed in Chapter Three have to do with the period-to-period effects of random parameter variation upon the posterior and predictive distributions. However, the asymptotic behavior of the model has important implications for the decision maker. An implication of the stationary Bayesian model for normal and lognormal processes is that as additional observations are collected, parameter uncertainty is reduced and (in the limit) eliminated altogether. Such an implication is inconsistent with observed real world behavior, largely because the conditions under which inferences are made typically change across time. The common dictum [see Dickinson (1974)] has been to eliminate some observations in the case of changing parameters so that only the most recent observations are considered. In Chapter Four we show that: 1. for the case of a lognormal or normal model, a particular form of stochastic parameter variation implies a treatment of data involving the use of all observations in a differential weighting scheme; and 2. random parameter variation produces important differences in the limiting behavior of the prior and predictive distributions, since under nonstationarity the limiting values of some of the parameters of the posterior and predictive distributions cannot be determined clearly.
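Both points can be illustrated with a small sketch. The code below assumes a random-walk drift on the mean of a normal process with known variance; this is an illustration, not the dissertation's exact model. Under such drift, Bayesian updating discards no observation but weights recent observations more heavily.

```python
# Illustrative sketch only (assumed model): the mean mu of a normal process
# with known variance s2 drifts between periods, mu_{t+1} = mu_t + e_t with
# e_t ~ N(0, w2). Updating then uses ALL observations, with geometrically
# declining weight on older data.

def final_mean(xs, m0=0.0, v0=4.0, s2=1.0, w2=0.25):
    """Posterior mean of mu after observing xs in order."""
    m, v = m0, v0
    for x in xs:
        v += w2                 # parameter drift inflates uncertainty
        k = v / (v + s2)        # weight given to the newest observation
        m += k * (x - m)
        v *= (1 - k)
    return m

# Effective weight of each observation = sensitivity of the final mean to it,
# estimated by finite differences.
xs = [1.0] * 6
eps = 1e-6
base = final_mean(xs)
weights = [(final_mean(xs[:i] + [xs[i] + eps] + xs[i + 1:]) - base) / eps
           for i in range(len(xs))]
# Every weight is positive (no observation is thrown away), and the weights
# increase toward the most recent observation.
```

All numerical values here (prior moments, drift variance) are arbitrary choices made for the illustration.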
One objective of this dissertation is to develop Bayesian prediction intervals for future observations that come from normal and lognormal data generating processes. In Chapter Four we address the problem of constructing prediction intervals for normal, Student, lognormal and log-Student distributions. It is pointed out that it is easy to construct these intervals for the normal and Student distributions but that it is rather difficult for the lognormal and log-Student distributions. An algorithm is presented to compute the Bayesian prediction intervals for the lognormal and log-Student distributions. Bayesian prediction intervals under nonstationarity are compared with classical, certainty equivalent and Bayesian stationary intervals.

In Chapter Five we discuss the application of the results of Chapters Three and Four concerning nonstationarity to the areas of CVP analysis and life testing models. Practical implications of our results for these two areas are discussed with emphasis on the predictive distribution for the outcome of a future observation from the data generating process. It is emphasized that CVP and life testing models ideally should include the changing character of the process by allowing for changes in the parametric description of the process through time. It is shown that, for the case of normal and lognormal data generating processes under a particular form of stochastic parameter variation, the presence of nonstationarity produces greater uncertainty for the decision maker. Nonstationarity implies greater uncertainty, which is reflected by an increase in the predictive variance of profits for CVP models,
by an increase in the predictive variance of life length for life testing models, and by an increase in the width of intervals required to contain particular coverage probabilities.

Chapter Six provides conclusions, limitations and suggestions for further research. Since stationarity assumptions are often quite unrealistic, it is concluded in that chapter that the introduction of possible nonstationarity greatly increases the realism and applicability of statistical inference methods, in particular of Bayesian procedures.
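The widening of intervals can be sketched numerically. Under an assumed random-walk drift on the mean of a normal process (an illustration, not the dissertation's exact model), the posterior variance of the mean settles at a strictly positive steady state instead of vanishing, so the predictive interval for the next observation stays wider than its limiting stationary counterpart.

```python
import math

# Assumed illustration: normal process with known variance s2; the mean
# drifts between periods with variance w2. The posterior variance v of the
# mean then converges to a positive steady state, while stationary updating
# (w2 = 0) drives v toward zero.

def steady_state_v(s2, w2, v0=1.0, iters=1000):
    v = v0
    for _ in range(iters):
        v += w2                     # drift step
        v *= s2 / (v + s2)          # posterior variance after one observation
    return v

s2, w2 = 1.0, 0.25                  # arbitrary illustrative values
v_star = steady_state_v(s2, w2)     # positive limit under drift

# 95% predictive half-widths: predictive variance is v + s2.
half_stationary = 1.96 * math.sqrt(s2)              # limiting stationary width
half_nonstationary = 1.96 * math.sqrt(v_star + s2)  # strictly wider
```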
CHAPTER TWO
SURVEY OF PERTINENT LITERATURE

The primary purpose of the research in this dissertation is to present a Bayesian model of nonstationarity in normal and lognormal processes with applications in Cost-Volume-Profit analysis and life testing models. A survey of the most relevant literature is provided in this chapter and will serve to set the stage for the new developments in the remainder of the thesis. In this survey, three areas are covered. In Section 2.1 we present an overview of probabilistic Cost-Volume-Profit (CVP) analysis and discuss the most important articles that deal with CVP under conditions of uncertainty. In Section 2.2 we discuss life testing models with an emphasis on the exponential, gamma, Weibull and lognormal models. The review of the literature includes a special section on Bayesian techniques used in life testing. Finally, in Section 2.3 a survey is presented of some important articles about modeling nonstationary processes.

2.1 Cost-Volume-Profit (CVP) Analysis

Management requires realistic and accurate information to aid in decision making. Cost-Volume-Profit (CVP) analysis is a widely accepted generator of information useful in decision making processes. CVP analysis essentially consists of examining the relationship between changes in volume (output) and changes in profit. The fundamental assumption in all types of CVP decisions is that the firm, or a department, or other type of costing unit, possesses a fixed set
of resources that commits the firm to a certain level of fixed costs for at least a short-run period. The decision problem facing a manager is to determine the most efficient and productive use of this fixed set of resources relative to output levels and output mixes. The scope of CVP analysis ranges from determination of the optimal output level for a single-product department to the determination of the optimal output mix of a large multi-product firm. All these decisions rely on simple relationships between changes in revenues and costs and changes in output levels or mixes. All CVP analyses are characterized by their emphasis on cost and revenue behavior over various ranges of output levels and mixes.

The determination of the selling price of a product is a complex matter that is often affected by forces partially or entirely beyond the control of management. Nevertheless, management must formulate pricing policies within the bounds permitted by the market place. Accounting can play an important role in the development of policy by supplying management with special reports on the relative profitability of its various products, the probable effects of contemplated changes in selling price and other CVP relationships. The unit cost of producing a commodity is affected by such factors as the inherent nature of the product, the efficiency of operations, and the volume of production. An increase in the quantity produced is ordinarily accompanied by a decrease in unit cost, provided the volume attained remains within the limits of plant capacity.

Quantitative data relating to the effect on income of changes in
unit selling price, sales volume, production volume, production costs, and operating expenses help management to improve the relationships among these variables. If a change in selling price appears to be desirable or, because of competitive pressure, unavoidable, the possible effect of the change on sales volume and product cost needs to be considered.

A mathematical expression of the profit equation of CVP analysis is:

(2.1.1)  Z = Q(P - V) - F,

where
Z = total profits,
Q = sales volume in units,
P = unit selling price,
V = unit variable cost, and
F = total fixed costs.

This accounting model of analysis has traditionally been used by the management accountant in profit planning. This use, however, typically ignores the uncertainty associated with the firm's operation, thus severely limiting its applicability. During the past 12 years, accountants have attempted to resolve this problem by introducing stochastic aspects into the analysis. The applicability of probabilistic models for this analysis has been claimed because of the realism of such models, i.e., decisions are always accompanied by uncertainty. Thus, the ideal model is one that gives a probability distribution of the criterion variable, profit, and that fully recognizes the uncertainty faced by the firm.
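Equation (2.1.1) is straightforward to compute; the figures below are hypothetical and serve only to illustrate the profit and break-even calculations.

```python
def profit(q, p, v, f):
    """CVP profit equation (2.1.1): Z = Q(P - V) - F."""
    return q * (p - v) - f

def breakeven_volume(p, v, f):
    """Volume at which Z = 0: Q* = F / (P - V)."""
    return f / (p - v)

# Hypothetical figures: 1,000 units at $5.00, unit variable cost $3.00,
# fixed costs $1,500.
z = profit(1000, 5.00, 3.00, 1500.00)            # 500.0
q_star = breakeven_volume(5.00, 3.00, 1500.00)   # 750.0 units
```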
The realism of such a model is dependent on logical assumptions for the input variables and rigorous methodology in obtaining the output distribution. Further, we hope that the model can accommodate a wide range of uses. For example, the capability to handle dependence among input variables adds a highly useful dimension.

Jaedicke and Robichek (1964) first introduced risk into the model. They assumed the following relation among the means:

(2.1.2)  E(Z) = E(Q)[E(P) - E(V)] - E(F),

where E(·) denotes mathematical expectation. In addition they assumed that the key variables were all normally distributed and that the resulting profit is also normally distributed. Thus, by computing the mean value and standard deviation of the resulting profit function, various probabilistic measures of profit can be obtained. This model has been depicted as a limit analysis, since the assumptions of independent model parameters and normality of the resulting profit function are not true except in limiting cases. According to Ferrara, Hayya and Nachman (1972), the product of two normally and independently distributed variables will approximate normality if the sum of the two coefficients of variation is less than or equal to .12.

Others have confronted the same problem of how to identify the resulting profit distribution when it is not close to a normal distribution. They have noted that it is often difficult to obtain analytical expressions for the product of random variables. Because the appropriate distributional forms for the product of the variable
functions may not be known, Buzby (1974) suggests the application of Tchebycheff's theorem to stochastic Cost-Volume-Profit analysis. This theorem, however, permits the analyst to derive only some very crude bounds on the probabilities of interest, so its value as a decision-making tool is limited. Liao (1975) illustrated how model sampling (also called distribution sampling) coupled with a curve-fitting technique can be used to overcome the above problems associated with stochastic CVP analysis. In his paper, the illustration of the proposed approach to stochastic CVP analysis is first developed through a consideration of the Jaedicke-Robichek problem, wherein the model parameters are independent and normally distributed. After that, the illustration problem is modified to accommodate dependent and non-normal variates in the problem.

Hilliard and Leitch (1975) developed a model for CVP analysis assuming a more tractable distribution for the inputs of the equation. It allows for dependent relationships and permits a rigorous derivation of the distribution of profit. The problems of assuming price and quantity to be independent are pointed out. The authors also pointed out that assuming sales to be normally distributed implies a positive probability of negative sales. Probabilities and tolerance intervals for the Hilliard and Leitch model are obtained from tables of the normal distribution. The only assumptions required for the model are (1) quantity and contribution margin are lognormally distributed random variables and (2) fixed costs are deterministic. The assumption that sales
quantity and contribution margin are bivariate lognormally distributed eliminates the possibility of negative sales and of selling prices below variable costs, and it has the nice additional property that the product of two bivariate lognormal random variables is also lognormal. Thus, we can allow for uncertainty in price and quantity and still have a closed form expression for the probability distribution of gross profits. Hilliard and Leitch cannot assume that price and variable costs are marginally lognormally distributed and have contribution margin also be lognormally distributed. Similarly, if fixed costs are assumed to be lognormally distributed too, net profits will not be lognormally distributed.

Adar, Barnea and Lev (1977) presented a model for CVP analysis under uncertainty that combines the probability characteristics of the environment variables with the risk preferences of decision makers. The approach is based on recently suggested economic models of the firm's optimal output decision under uncertainty, which were modified within the mean-standard deviation framework to provide for a cost-volume-utility analysis allowing management to: (1) determine optimal output, (2) consider the desirability of alternative plans involving changes in fixed and variable costs, expected price and uncertainty of price and technology changes, and (3) determine the economic consequences of fixed cost variances.

Dickinson (1974) addresses the problem of CVP analysis under uncertainty by examining the reliability of using the usual methods of estimating the means and variances of the past distributions of
sales demand. He emphasized that, when the expectation and variance of profits are estimated from past data, it is important to differentiate between what, in fact, are estimates and what are true values of the parameters. In other words, he pointed out that the estimated expectation of profits reflects estimation risk and is not equal to E(π). Classical confidence intervals were used for the expected value of profits, E(π), for the variance of profits, Var(π), and for probabilities of various profit levels. However, Dickinson misinterpreted the classical confidence intervals that he obtains in his paper. When a classicist constructs a 90 percent confidence interval for μ, for example, he would state that in the long run, 90 percent of all such intervals will contain the true value of μ. The classical statement is based on long-run frequency considerations. The classicist is absolutely opposed to the interpretation that the 90 percent refers to the probability that the true universe mean lies within the specified interval. In the eyes of a classicist, a unique true value exists for the universe mean, and therefore the value of the universe mean cannot be treated as a random variable. Dickinson's paper also illustrates the difficulty of obtaining the probability statements of greatest interest to management in a classical approach. His analysis is only able to provide confidence intervals of probabilities of profit levels rather than the profit level probabilities themselves.

The problem of parameter uncertainty has been neglected by those who have studied CVP analysis under uncertainty. In the Bayesian approach, uncertainty about the parameters of models is reflected in prior and posterior probability statements regarding the parameters. Marginal distributions of variables which depend on those parameters may be obtained by integrating out the distribution of the parameters, thereby obtaining predictive distributions [see Roberts (1965) and Zellner (1971)] of the quantities of interest to the manager. These predictive distributions permit one to make valid probability statements regarding the important quantities, such as profits.

Nonstationarity is another important aspect related to CVP analysis that no one has considered. In a world that is continually changing, it is important to recognize that the parameters that describe a process at a particular point in time may not do so at a later point in time. In the case of the variable sales, for instance, experience shows that it is typically affected by a variety of economic and political events. Thus, a CVP model ideally should include the changing character of the process by allowing for changes in the parametric description of the process through time. Failure to recognize the nonstationary conditions may result in misleading inferences.

In this dissertation the problem of Cost-Volume-Profit analysis will be considered from a Bayesian viewpoint, and inferences under a special case of nonstationarity will be considered. Also, the Bayesian results under nonstationarity will be compared with those results that can be obtained under a stationary Bayesian model, and the Bayesian model will be compared with some alternative approaches.
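The lognormal closure property exploited by Hilliard and Leitch (1975), that a product of lognormal variables is again lognormal, can be checked by simulation. The sketch below uses hypothetical parameter values and assumes independent inputs for simplicity (their model allows bivariate dependence).

```python
import math
import random

random.seed(1)

# If log Q ~ N(mq, sq^2) and log M ~ N(mm, sm^2), with M the contribution
# margin, then log(QM) = log Q + log M is normal with mean mq + mm and
# variance sq^2 + sm^2, so gross profit QM is again lognormal.
# All parameter values below are hypothetical.
mq, sq = 6.0, 0.3        # log sales quantity
mm, sm = 0.7, 0.2        # log contribution margin
n = 100_000
log_gross = [random.gauss(mq, sq) + random.gauss(mm, sm) for _ in range(n)]

mean_log = sum(log_gross) / n                               # near mq + mm = 6.7
var_log = sum((x - mean_log) ** 2 for x in log_gross) / n   # near 0.09 + 0.04
```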
2.2 Life Testing Models

2.2.1 Introduction

The development of recent technology has given special importance to several problems concerning the improvement of the effectiveness of devices of various kinds. It is often important to impose extraordinarily high standards on the performance of these devices, since a failure in the performance could bring disastrous consequences. The quality of production plays an important role in today's life. An interruption in the operation of a regulating device can lead not only to deterioration in the quality of a manufactured product but also to damage of the industrial process. From a purely economic viewpoint high reliability is desirable to reduce costs. However, since it is costly to achieve high reliability, there is a tradeoff. The failure of a part or component results not only in the loss of the failed item but often results in the loss (at least temporarily) of some larger assembly or system of which it is part. There are numerous examples in which failures of components have caused losses of millions of dollars and personal losses. The space program is an excellent example where even the lives of some astronauts were lost due to failure in the system.

The following authors have considered the statistical theory of reliability and provide a good set of references on the subject: Mendenhall (1958), Buckland (1960), Birnbaum (1962), Govindarajulu (1964), Mann, Schaefer and Singpurwalla (1973), and Canfield and Borgman (1975).

Reliability theory is the discipline that deals with procedures
to ensure the maximum effectiveness of manufactured articles and that develops methods of evaluating the quality of systems from known qualities of their component parts. A large number of problems in reliability theory have a mathematical character and require the use of mathematical tools and the development of new ones for their solution. Areas like probability theory and mathematical statistics are necessary to solve some of the problems found in reliability theory. No matter how hard the company works to maintain constant conditions during a production process, fluctuations in the production factors lead to a significant variation in the properties of the finished products. In addition, articles are subjected to different conditions in the course of their use. To maintain and to increase the reliability of a system or of an article requires both material expenditures and scientific research.

Statistical theory and methodology have played an influential role in the development of reliability theory since the publication of the paper by Epstein and Sobel (1953). Four statistical concepts provide the basis for estimating relevant parameters and testing hypotheses about the life characteristic of the subject matter. These concepts are: (i) the distribution function of some variable which is a direct or indirect measure of the response (life time) to usage in a particular environment; (ii) the associated probability density (or frequency) function;
(iii) the survival probability function; and (iv) the conditional failure rate.

A failure distribution provides a mathematical description of the length of life of a device, structure or material. Consider a piece of equipment which has been in a given environment, e. The fatigue life of this piece of equipment is defined to be the length of time, T(e), this piece of equipment operates before it fails. Full information about e would fully determine T(e), so that given e, T(e) would not be random. One source of randomness in life is in uncertainty about the environment, i.e., T(e) is a random variable because e is random. Equipment has different survival characteristics depending on the conditions under which it is operated, and e provides a statement of what conditions are but does not determine T(e) fully. The reliability of an operating system is defined as the probability that the system will perform satisfactorily within specified conditions over a given future time period when the system starts operating at some time origin.

Different distributions can be distinguished according to their failure rate function, which is known in the literature of reliability as a hazard rate [see Barlow and Proschan (1965)]. The hazard rate (denoted by h), which is a function of time, gives the conditional density of failure at time t, with the hypothesis that the unit has been functioning without failure up to that point in time. The conditional failure rate is defined as:

(2.2.1)  h(t) = f(t)/[1 - F(t)] = f(t)/R(t),
where

(2.2.2)  F(t) = Prob(T ≤ t) = ∫₀ᵗ f(s) ds

is the probability that an observed value of T will be less than or equal to an assigned number t. The reliability function (also called the survival function) of the random variable T gives the probability that T will exceed t and is defined by

(2.2.3)  R(t) = 1 - F(t) = Prob(T > t).

The probability density function of the random variable T, f(t), 0 < t < ∞, is known as the failure density function of the device. It can be shown that the conditional failure rate and the distribution function of a random variable are related by

(2.2.4)  F(t) = 1 - exp[-∫₀ᵗ h(s) ds].

The causes of failure can be categorized into three basic types. It is recognized, however, that there may be more than one contributing cause to a particular failure and that, in some cases, there may be no completely clear-cut distinction between some of the causes. The three classes of failure are infant mortalities, or early failures, random failures, and wearout failures. The behavior of the hazard rate as a function of time is sometimes known as the hazard function or life characteristic of the system. For a typical system that may experience any of the three previously described types of failure, the life characteristic will appear as in Figure 1. The representation of the life characteristic has been classically referred to as the "bathtub curve", wherein the three segments of the curve represent the three time periods of initial, chance and wearout failure.
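Relations (2.2.1) through (2.2.4) can be verified numerically. The sketch below assumes an illustrative wearout-type hazard h(t) = 2t, chosen only because its integral is simple; it is not a model from the text.

```python
import math

# Assumed illustrative hazard h(t) = 2t. Its integral over (0, t) is t^2,
# so by (2.2.4) F(t) = 1 - exp(-t^2), with density f(t) = 2t exp(-t^2).
def f(t): return 2 * t * math.exp(-t ** 2)
def F(t): return 1 - math.exp(-t ** 2)
def R(t): return 1 - F(t)              # survival function (2.2.3)
def h(t): return f(t) / R(t)           # conditional failure rate (2.2.1)

# h recovered from f and R equals the assumed 2t; and numerically
# integrating h (midpoint rule) and applying (2.2.4) recovers F.
n = 10_000
H = sum(h((i + 0.5) / n) / n for i in range(n))   # approximates the integral of h on (0, 1)
F_from_hazard = 1 - math.exp(-H)                  # agrees with F(1)
```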
[Figure 1. Life characteristics of some systems: the "bathtub curve" of hazard rate plotted against time.]

The initial failure period is characterized by a high hazard rate shortly after time t = 0 and a gradual reduction during the initial period of operation. During the chance failure period, the hazard rate is constant and generally lower than during the initial period. The cause of this failure is attributed to unusual and unpredictable environmental conditions occurring during the operating time of the system or of the device. The hazard rate increases during the wearout period. This failure is associated with the gradual depletion of a material or an accumulation of shocks and so on.

In the following subsections we will consider the general
properties of some widely used life distributions, the assessment and use of those distributions, and the literature related to Bayesian methods in life testing.

2.2.2 Some Common Life Distributions

2.2.2.1 The Exponential Distribution

In the case of a constant failure rate the distribution of life is exponential. This case has received the most emphasis in the literature since, in spite of theoretical limitations, it presents attractive statistical properties and is highly tractable. Data arising from life tests under laboratory or service conditions are often found to conform to the exponential distribution. An acceptable justification for the assumption of an exponential distribution in life studies was initially presented by Davis (1952). More recently, Barlow and Proschan (1965) have advanced a mathematical argument to support the plausibility of the exponential distribution as the failure law of complex equipment.

The random variable T has an exponential distribution if it has a probability density function of the form

(2.2.5)  f_T(t) = σ⁻¹ exp[-(t - θ)/σ],  t > θ, σ > 0.

The mean and variance of T are (σ + θ) and σ², respectively. In most applications θ is taken as zero. For this distribution, the physical interpretation of a constant hazard function is that, irrespective of the time elapsed since the start of operation of a system, the probability that the system fails in the next time interval dt,
given that it has survived to time t, is independent of the elapsed time t and is constant.

2.2.2.2 The Gamma Distribution

An extremely useful distribution in fatigue and wearout studies is the gamma distribution. It also has a very important relationship to the exponential distribution, namely, that the sum of n independent and identically distributed (i.i.d.) exponential random variables with common parameters θ = 0 and σ is a random variable that has a gamma distribution with parameters n and σ. Hence, the exponential distribution is a special case of the gamma with n = 1. The random variable T has a gamma distribution if its probability density function is of the form

(2.2.6) f_T(t) = {(t-θ)^(n-1) exp[-(t-θ)/σ]} / [σ^n Γ(n)], t > θ; n > 0, σ > 0.

The standard form of the distribution is obtained by putting σ = 1 and θ = 0, giving

(2.2.7) f_T(t) = [t^(n-1) exp(-t)] / Γ(n), t > 0;

where the gamma function, denoted Γ, is a mapping of the interval (0,∞) into itself and is defined by

(2.2.8) Γ(n) = ∫_0^∞ t^(n-1) exp(-t) dt.

The probability distribution function of (2.2.7) is

(2.2.9) Prob[T ≤ t] = [Γ(n)]^(-1) ∫_0^t x^(n-1) exp(-x) dx.
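The reduction of the standard gamma density (2.2.7) to the exponential when n = 1 can be checked numerically. The sketch below (function names are illustrative, not from the thesis) evaluates (2.2.7) with the standard library's gamma function and confirms that shape n = 1 gives exp(-t):

```python
import math

def gamma_pdf(t, shape):
    """Standard gamma density (2.2.7): f(t) = t^(n-1) e^(-t) / Gamma(n), t > 0."""
    return t ** (shape - 1) * math.exp(-t) / math.gamma(shape)

# With shape n = 1 the gamma density reduces to the standard exponential
# density exp(-t), so the exponential is the special case n = 1.
for t in (0.5, 1.0, 2.0):
    assert abs(gamma_pdf(t, 1.0) - math.exp(-t)) < 1e-12
```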
Since a distribution of the form given in equation (2.2.6) can be obtained from the standardized distribution, as in equation (2.2.7), by the linear transformation t = (t'-θ)/σ, there is no difficulty in deriving formulas for moments, generating functions, etc., for equation (2.2.6) from those for equation (2.2.7). One of the most important properties of the distribution is the reproductive property: if T_1 and T_2 are independent random variables each having a distribution of the form (2.2.7), possibly with different values n', n'' of n but with common values of σ and θ, then (T_1 + T_2) also has a distribution of this form, with the same values of σ and θ, and with n = n' + n''.

2.2.2.3 The Weibull Distribution

The Weibull distribution was developed by W. Weibull (1951) of Sweden and used for problems involving fatigue lives of materials. Three parameters are required to uniquely define a particular Weibull distribution: the scale parameter σ, the shape parameter n and the location parameter θ. A random variable T has a Weibull distribution if there are values of the parameters n (>0), σ (>0) and θ such that

(2.2.10) Y = [(t-θ)/σ]^n

has the exponential distribution with probability density function

(2.2.11) f_Y(y) = exp(-y), y > 0.

The probability density function of T is given by
(2.2.12) f_T(t) = (n/σ)[(t-θ)/σ]^(n-1) exp{-[(t-θ)/σ]^n}, t > θ.

The standard Weibull distribution is obtained by putting σ = 1 and θ = 0. The value zero for θ is by far the most frequently used, especially in representing distributions of life times. The Weibull distribution has cumulative distribution function

(2.2.13) F_T(t) = 1 - exp{-[(t-θ)/σ]^n},

and its mean and variance are

(2.2.14) E(t) = σΓ(1 + [1/n])

and

(2.2.15) Var(t) = σ²{Γ(1 + [2/n]) - Γ²(1 + [1/n])},

respectively. For the two-parameter Weibull distribution we have that the reliability and hazard functions are

(2.2.16) R_T(t) = exp[-(t/σ)^n]

and

(2.2.17) h_T(t) = n t^(n-1) / σ^n.

When n = 1, the hazard function is a constant. Thus the exponential distribution is a special case of the Weibull distribution with n = 1.

2.2.2.4 The Lognormal Distribution

The lognormal distribution is also a very popular distribution for describing wearout failures. This model was developed as a physical or, more appropriately, biological model associated with the theory of proportionate effects (see Aitchison and Brown (1957) for a full description of the distribution, its properties, and its developments). Briefly, if a random variable is supposed to represent the magnitudes at successive points of time of, for example, a fatigue crack or the growth of biological organisms, and the change between any pair of successive steps or stages is a random proportion of the previous size, then asymptotically the distribution of the random variable is lognormal [see Kapteyn (1903)]. This theoretical result imparted some plausibility to the lognormal distribution for failure problems.

Let t_1 < t_2 < ... < t_n be a sequence of random variables that denote the sizes of a fatigue crack at successive stages of its growth. It is assumed that the crack growth at stage i, t_i - t_{i-1}, is randomly proportional to the size of the crack, t_{i-1}, and that the item fails when the crack reaches t_n. Let t_i - t_{i-1} = π_i t_{i-1}, i = 1, 2, ..., n, where π_i is a random variable. The π_i are assumed to be independently distributed random variables that need not have a common distribution for all i when n is large but that need to be lognormally distributed otherwise. Thus, π_i = (t_i - t_{i-1})/t_{i-1}, i = 1, 2, ..., n. Mann, Schafer and Singpurwalla (1973) show that ln t_n, the life length of the item, for large n, is asymptotically normally distributed, and hence t_n has a lognormal distribution. If there is a number γ such that

(2.2.18) Z = ln(t-γ)

is normally distributed, then the distribution of t is said to be lognormal. The distribution of t can be defined by the equation,

(2.2.19) U = ζ + δ ln(t-γ),
where U is a unit normal variable and ζ, δ and γ are parameters. The probability density function of T is defined by

(2.2.20) f_T(t) = δ[(t-γ)√(2π)]^(-1) exp[-{ζ + δ ln(t-γ)}²/2], t > γ.

An alternative, more fashionable notation replaces ζ and δ by the expected value μ and standard deviation σ of Z = ln(t-γ). The two sets of parameters are related by the equations,

(2.2.21) μ = -ζ/δ

and

(2.2.22) σ = δ^(-1),

so that the distribution of t can be defined by

(2.2.23) U = [ln(t-γ) - μ]/σ

and the probability density function of T by

(2.2.24) f_T(t) = [(t-γ)σ√(2π)]^(-1) exp[-{ln(t-γ) - μ}²/2σ²], t > γ.

In many applications, γ is known (or assumed) to be zero. This important case has been given the name two-parameter lognormal distribution. The mean and variance of the two-parameter distribution are given by

(2.2.25) E(t) = exp[μ + (σ²/2)],

and

(2.2.26) Var(t) = exp(2μ + σ²)[exp(σ²) - 1].
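The closed-form moments (2.2.25) and (2.2.26) of the two-parameter lognormal can be cross-checked against numerical integration of the density (2.2.24) with γ = 0. The following is a minimal sketch (function names and the parameter values μ = 0, σ = 0.5 are illustrative):

```python
import math

def lognormal_pdf(t, mu, sigma):
    """Two-parameter lognormal density (2.2.24) with gamma = 0."""
    return math.exp(-(math.log(t) - mu) ** 2 / (2 * sigma ** 2)) \
        / (t * sigma * math.sqrt(2 * math.pi))

def moments(mu, sigma):
    """Closed-form mean and variance, equations (2.2.25) and (2.2.26)."""
    mean = math.exp(mu + sigma ** 2 / 2)
    var = math.exp(2 * mu + sigma ** 2) * (math.exp(sigma ** 2) - 1)
    return mean, var

# Cross-check the closed-form mean against midpoint-rule integration
# of t * f(t) over (0, 40], which captures essentially all the mass here.
mu, sigma = 0.0, 0.5
mean, var = moments(mu, sigma)
dt = 0.001
num_mean = sum((i + 0.5) * dt * lognormal_pdf((i + 0.5) * dt, mu, sigma) * dt
               for i in range(40000))
assert abs(num_mean - mean) < 1e-3
```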
In addition, the value t such that F_T(t) = 0.5, the median of the two-parameter distribution, is exp(μ).
2.2.3 Traditional Approach to Life Testing Inferences

In life testing theory we find a large number of random quantities. In most cases we do not know the distributions and theoretical characteristics; our aim is to estimate some of these quantities. This is usually accomplished with the aid of observations on the random variables. According to the laws of large numbers, an "exact" determination of a probability, an expected value, etc., would require an "infinite" number of observations. Having samples of finite size, we can do no more than estimate the theoretical values in question. The sample characteristics, or statistics, serve the purpose of statistical estimation. For a good estimation of theoretical quantities, a fairly large sample is sometimes needed.

In many practical situations the following two types of estimation problems arise. A certain quantity, say θ, which is, from the statistical point of view, a theoretical quantity, has to be determined by means of measurement. Such a quantity may be, for example, the electrical resistance of a given device, the life of a given product, etc. The result T of the measuring procedure is a random variable whose distribution depends on θ and perhaps on additional quantities. That is, we have to estimate the parameter θ from a sample T_1, T_2, ..., T_n taken on T. In the other case, the quantity in question is a random variable itself, and in such cases we are interested in the (theoretical) average value, or the dispersion of T, etc. This means that we have to estimate the expected value E(T) or Var(T), and perhaps other (constant) quantities that can be expressed with the aid of the distribution
function of T, like the reliability function. More often for lifetime distributions, the quantity of interest is a distribution percentile, also known as the reliable life of the item to be tested, corresponding to some specified population survival proportion; or it is the population proportion surviving at least a specified time, say S.

For the classical statistician, the unknown parameter θ is considered to be a constant. In estimating a constant value there are various aspects to consider. If we wish to have an estimator whose value can be used instead of the unknown parameter in formulas [certainty equivalent (CE) approach], then the estimator should have one given value. In this case we speak of point estimation. But knowing that our estimator is subject to error, sometimes we would like to have some information on the average deviation from the value. In this case we have to construct an interval that contains the unknown parameter, at least with high probability, or give a measure of the variability of the estimator (such as the standard error of the estimate).

Most of the literature about the traditional approach to life testing inferences is focused on two areas: one relates to point and interval estimation procedures for lifetime distributions, and the other relates to methods of testing statistical hypotheses in reliability (known as "reliability demonstration tests").

The classical approach to point estimation in life testing inferences emphasizes that a good estimator should have properties like unbiasedness, efficiency, consistency and sufficiency [see
Dubey (1968), Bartlett (1937) and Weiss (1961)]. Two methods, the method of moments and the method of maximum likelihood, are frequently used to yield estimators with as many as possible of the previously mentioned properties. Under various sampling assumptions, the maximum likelihood estimators of the parameters were obtained for the following distributions: gamma [see Choi and Wette (1969) and Harter and Moore (1965)]; Weibull [see Bain (1972), Billman et al. (1971), Cohen (1965), Englehardt (1975), Haan and Beer (1967), Lemon (1975) and Rockette et al. (1973)]; exponential [see Deemer and Votaw (1955), El-Sayyad (1967) and Epstein (1957)]; and for the normal and lognormal [see Cohen (1951), Harter and Moore (1966), Lambert (1964) and Tallis and Young (1962)]. The traditional approach also includes linear estimation methods with properties like Best Linear Unbiased (BLU) and Best Linear Invariant (BLI). Interval estimation procedures have also been developed for the parameters of the life distributions. Examples include Bain and Englehardt (1973), Epstein (1961), Harter (1964) and Mann (1968). Point or interval estimators for functions of the life distributions, such as reliable life, reliability function, hazard rate, etc., were obtained by substituting for the unknown parameters the point or interval estimators obtained for them [see Johns and Lieberman (1966), Bartholomew (1963), Grubbs (1971), Harris and Singpurwalla (1968, 1969), Lawless (1971, 1972), Likes (1967), Mann (1969-a, 1969-b, 1970), Varde (1969) and Linhart (1965)].
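As a concrete instance of maximum likelihood estimation in life testing, the classical estimator of the exponential mean under Type II censoring (testing stopped at the r-th failure) is the total accumulated test time divided by the number of failures. The sketch below is illustrative; the function name and the sample data are not from the thesis:

```python
def exponential_mle(failures, n):
    """ML estimate of the exponential mean life under Type II censoring:
    `failures` holds the r observed failure times, `n` items were on test,
    and the n - r survivors are censored at the largest observed failure."""
    r = len(failures)
    times = sorted(failures)
    total_time = sum(times) + (n - r) * times[-1]  # accumulated test time
    return total_time / r

# Ten items on test, stopped at the fifth failure:
# accumulated life = 168 + 5 * 60 = 468, so the estimate is 468 / 5 = 93.6.
sigma_hat = exponential_mle([12.0, 20.0, 31.0, 45.0, 60.0], n=10)
assert abs(sigma_hat - 93.6) < 1e-9
```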
Testing reliability hypotheses is the second major area of research in the classical approach to life testing. By means of the methods referenced previously, a test statistic is selected, regions of acceptance and rejection are set up, and risks of incorrect decisions are calculated. In addition, it is emphasized that the risks of incorrect decisions are specified before the sample is obtained, and in this case n, the sample size, is generally to be determined. Some of the references in this area include Epstein (1960), Epstein and Sobel (1955), Kumar and Patel (1971), Lilliefors (1967, 1969), Sobel and Tischendorf (1959), Thoman et al. (1969, 1970) and Fercho and Ringer (1972).

A large part of the statistical problem in reliability involves the estimation of parameters in failure models. Each of the methods of obtaining point estimates previously referenced has certain statistical properties that make it desirable, at least from a theoretical viewpoint. Not surprisingly, point estimates are often made (particularly in reliability) because decisions are to be based on them. The consequences of the decisions based on the estimates often involve money or, more generally, some form of utility. Hence the decision maker is more interested in the practical consequences of the estimate than in its theoretical properties. In particular, he may be interested in making estimates that minimize the expected loss (cost), but this cannot be accomplished in general with classical methodology because the methodology does not admit probability distributions of the parameters.
2.2.4 Bayesian Techniques in Life Testing

The non-Bayesian (classical) approach to estimation considers an unknown parameter as fixed. This means that classical interval estimation and hypothesis testing must lean on inductive reasoning either through the likelihood function or the sampling distributions. In point estimation, the classical approach must depend on estimates the criteria for which often are not based on the practical consequences of the estimates. On the other hand, Bayes procedures assume a prior distribution over the parameter space, that is, consider the parameter as a random variable, and, hence, the posterior distribution is available. This creates the possibility of a whole new class of criteria for estimation, namely, minimization of expected loss, probability intervals and others.

In view of the difficulty of assessing utility or costs in complex reliability problems, in previous studies Bayesian methods have been used primarily to provide a means of combining previous data (expressed as the prior distribution) with observed data (expressed in the likelihood function) to obtain estimates of parameters by using the posterior density. However, it must be emphasized that Bayesian methods are perfectly general in providing whatever the reliability problem demands.

There is a loss function that is rather popular in Bayesian analysis and gives simple results. Suppose that θ̂ is an estimate of θ and that the loss function is

(2.2.28) L(θ,θ̂) = (θ̂-θ)².
This function states that the loss is equal to the square of the distance of θ̂ from θ. The Bayes approach is to select the estimate of θ that minimizes the expected loss with respect to the posterior distribution. The estimate that accomplishes this is the posterior mean, that is,

(2.2.29) θ̂ = E(θ | t_1, t_2, ..., t_n; P),

where P represents prior information. The above loss function is often called the quadratic loss function and the posterior mean is termed the Bayes estimate. If the loss function is of the form

(2.2.30) L(θ,θ̂) = |θ̂-θ|,

the estimate of θ that minimizes the expected loss is the median of the posterior distribution. Canfield (1970) developed a Bayesian estimate of reliability for the exponential case using this loss function. The resulting estimate is seen to be the MVUE of reliability when the prior is flat. A third and simple case is the asymmetric linear,

(2.2.31) L(θ,θ̂) = k_1(θ̂-θ) if θ̂ ≥ θ,
                  k_2(θ-θ̂) if θ̂ < θ,

for which the expected loss is minimized by the k_2/(k_1+k_2) fractile of the posterior distribution.
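The correspondence between loss functions and Bayes estimates can be verified by brute force over a discrete posterior. In the sketch below the posterior over θ is hypothetical (chosen only for illustration), and the grid search simply confirms that quadratic loss (2.2.28) is minimized by the posterior mean (2.2.29) and absolute loss (2.2.30) by the posterior median:

```python
# Hypothetical discrete posterior over theta, for illustration only.
support = [1.0, 2.0, 3.0, 4.0]
probs = [0.1, 0.2, 0.4, 0.3]

def expected_loss(est, loss):
    """Posterior expected loss of an estimate under a given loss function."""
    return sum(p * loss(theta, est) for theta, p in zip(support, probs))

candidates = [i / 100 for i in range(100, 401)]  # grid of estimates in [1, 4]

# Quadratic loss (2.2.28): minimized by the posterior mean (2.2.29) = 2.9.
posterior_mean = sum(p * th for th, p in zip(support, probs))
best = min(candidates, key=lambda e: expected_loss(e, lambda th, est: (est - th) ** 2))
assert abs(best - posterior_mean) < 0.01

# Absolute loss (2.2.30): minimized by the posterior median = 3.0.
best_abs = min(candidates, key=lambda e: expected_loss(e, lambda th, est: abs(est - th)))
assert abs(best_abs - 3.0) < 0.01
```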
The expected loss is generally a random variable a priori since it depends on the as yet unobserved sample data. The unconditional expectation (with respect to the sample) of the expected loss is called the "Bayes risk" and is minimized by the Bayes estimate.

Bayes methods have been used in a variety of areas of reliability. Most uses can be characterized as point or interval estimation of parameters of life distributions or of reliability functions. Examples include Breipohl et al. (1965), who studied the behavior of a family of Bayesian posterior distributions. In addition, the properties of the mean of the posterior distribution as a point estimate and a method for constructing confidence intervals were given. The problem of hypothesis testing was considered, among others, by MacFarland (1972), who provided a simple exposition of the rudiments of applying Bayes' equation to hypotheses concerning reliability.

The Bayesian approach has also been applied to parameter estimation and reliability estimation of some known distributions like the gamma, Poisson, lognormal and others. Lwin and Singh (1974) considered a Bayesian analysis of a two-parameter gamma model in the life testing context with special emphasis on estimation of the reliability function. The Poisson distribution has received the attention of Canavos (1972, 1973). In the first article a smooth empirical Bayes estimator is derived for the hazard rate. The reliability function is also estimated either by using the empirical Bayes estimate of the parameters, or by obtaining the expectation
of the reliability function. Results indicate a significant reduction in mean squared error of the empirical Bayes estimates over the maximum likelihood estimates. A similar result was also derived for the exponential distribution by Lemon (1972) and by Martz (1975). Next, Canavos developed Bayesian procedures for life testing with respect to a random intensity parameter. Bayes estimators were derived for the Poisson parameters and reliability function based on uniform and gamma prior distributions. Again, as expected, the Bayes estimators have mean squared errors (MSE) that are appreciably smaller than those of the minimum variance unbiased estimator (MVUE) and of the maximum likelihood estimator (MLE). Zellner (1971) has studied the Bayesian estimation of the parameters of the lognormal distribution. Employing a flat prior, Zellner found that the minimum MSE estimators of the parameters are the optimal Bayesian estimators when a relative squared error loss function is used.

The Weibull and exponential distributions have received most of the attention of authors who have studied life distributions from a Bayesian viewpoint. The Weibull process with unknown scale parameter is taken as a model by Soland (1968) for Bayesian decision theory. The family of natural conjugate prior distributions for the scale parameter is used in prior and posterior analysis; in addition, preposterior analysis is given for an acceptance sampling problem with utility linear in the unknown mean of the Weibull process. Soland (1969) extended the analysis by treating both the shape and scale
parameters as unknown, but as was previously known it is not possible to find a family of continuous joint distributions on the two parameters that is closed under sampling, so a family of prior distributions is used that places continuous distributions on the scale parameter and discrete distributions on the shape parameter. Prior and posterior analysis are examined and seen to be no more difficult than for the case in which only the scale parameter is treated as unknown, but posterior analysis and determination of optimal sampling plans are considerably more complicated in this case.

In Bury (1972), a two-parameter Weibull distribution is assumed to be an appropriate statistical life model. A Bayesian decision model is constructed around a conjugate probability density function for the Weibull hazard rate. Since a single sufficient statistic of fixed dimensionality does not exist for the Weibull model, Bury was able to consider only two sampling plans in his preposterior analysis: obtain one further observation or terminate testing. Bury points out that small sample Bayesian analysis tends to be more accurate than classical analysis because of the additional prior information utilized in the analysis. Bayes credible bounds for the scale parameter and for the reliability function are derived by Papadopoulos and Tsokos (1975).

Reliability data often include information that the failure event has not yet occurred for some items, while observations of complete lifetimes are available for other items. Cozzolino (1974) addressed this problem from a Bayesian point of view, considering
density functions that have failure rate functions consisting of a known function multiplied by an unknown scale factor. It is shown that a gamma family of priors is conjugate for the unknown scale parameter for both complete and incomplete experiments. A very flexible and convenient model results from the assumption of a piecewise constant failure rate function.

Life tests that are terminated at preassigned time points or after a preassigned number of failures are sometimes found in reliability theory. Bhattacharya (1967) provided a Bayesian analysis of the exponential model based on this kind of life test. He showed that the reliability estimate for a diffuse prior (which is uniform over the entire positive line) closely resembles the classical MVUE, and he considered the role of prior quasi-densities* when a life tester has no prior information. Bhattacharya points out that the use of a constant density over the positive real line has been suggested to express ignorance but that it causes problems. For example, it cannot be interpreted as a probability density since it assigns infinite measure to the parameter space. [See Box and Tiao (1972).]

A paper by Dunsmore (1974) stands out from among the other Bayesian papers in life testing and is particularly pertinent to the life testing application in this thesis. This article is an important exception because it carries the Bayesian approach to its natural conclusions by determining prediction intervals for future

* If g(θ) is any non-negative function defined on the parameter space Ω, not required to integrate to one over Ω, then g(θ) is called a prior quasi-density.
observations in life testing using the concept of the Bayesian predictive distribution. One objective of prediction is to provide some estimate, either point or interval, for future observations of an experiment F based on the results obtained from an informative experiment E. As we mentioned before, the classical approach to prediction involves the use of tolerance regions. [See Aitchison (1966), Folks and Browne (1975), Guenther et al. (1976) and Hewett and Moeschberger (1976).] In these we obtain a prediction interval only, and the measure of confidence refers to the repetitions of the whole experimental situation. The Bayesian approach, on the other hand, allows us to incorporate further information which might be available through a prior distribution and leads to a more natural interpretation.

Let t_1, ..., t_n be a random sample from a distribution with probability density function f(t|θ), (t∈T; θ∈Θ), and let y_1, y_2, ..., y_m be a second independent random sample of "future" observations from a distribution with probability density function f(y|θ), (y∈Y; θ∈Θ). Our aim is to make predictions about some function of y_1, y_2, ..., y_m. The Bayesian approach assumes that a prior density function P(θ), (θ∈Θ), is available that measures our uncertainty about the value of θ. If the information in E is summarized by a sufficient statistic t,* then a posterior distribution P(θ|t) is available. Suppose now that we wish to predict some statistic y defined on y_1, y_2, ..., y_m. Then

* Such a sufficient statistic will always exist since, for example, t could be taken to be the entire sample (t_1, ..., t_n).
the predictive density function for y is given by

(2.2.32) P(y|t) = ∫ P(y|θ) P(θ|t) dθ.*

A Bayesian prediction interval of cover β is then defined as an interval I such that

(2.2.33) P(I|t) = ∫_I P(y|t) dy = β.

[See, for example, Aitchison and Sculthorpe (1965), Aitchison (1966) and Guttman (1970).] It should be emphasized that in the Bayesian approach the complete inferential statement about y is given by the predictive density function P(y|t). Any prediction interval is only a summary of the full description P(y|t). In general there will be many intervals I that satisfy (2.2.33). Dunsmore considers most plausible Bayesian prediction intervals (commonly known as highest posterior density (HPD) intervals) of cover β, which have the form

(2.2.34) I = {y : P(y|t) ≥ λ},

where λ is determined by P(I|t) = β.

In conclusion we might say that the uses of Bayesian methods in life testing have been limited. However, in those cases where Bayes estimators have been found, they performed better, according to classical criteria, than the conventional ones. The use of loss functions has not been analyzed deeply for the reasons mentioned before; namely,

* It is implicitly assumed in (2.2.32) that, conditional on θ, y and t are independent.
that the loss function is usually complex and unknown, and that even when the loss function is known the Bayes estimate is sometimes difficult to find. Some of these problems will be solved with the development of mathematical theory and probably with the development of computer systems. Only the Dunsmore paper fully used the Bayesian methodology to obtain prediction intervals that consider all available information and fully recognize the remaining parameter uncertainty.

All of the papers discussed in the previous section considered a stationary situation. That is, the known parameters and the distributions used are assumed to remain the same across all time periods. It would be of value to study the nonstationary case, where the parameters are changing in time and possibly the distributions could also change in time. It is important to recognize, however, that the problems now faced with the stationarity assumption will probably be greater when that assumption is relaxed. Nevertheless, this dynamic system is well worth investigating.

2.3 Modeling of Nonstationary Processes

For many real world data generating processes the assumption of stationarity is questionable. Take for instance life testing models. When it is assumed that the life of certain commodities follows a lognormal distribution, for example, the stationarity assumption could be expected to hold over short periods of time; but in most cases it would be expected that for a lengthy period, stationarity would be a doubtful assumption. If the model represents the life of perishable products, like food for example, then it
would be expected that environmental factors like heat and humidity could change and affect the characteristics of the life distribution of the product or affect the input factors used in the manufacturing process. Furthermore, the wearout of the machines used in the manufacture of the products could cause changes in the quality of the products and hence in the parameters of the life distributions.

Random parameter variation is surely a reasonable assumption when we are concerned with economic variables, like those used in Cost-Volume-Profit analysis. A wide spectrum of circumstances could be mentioned where the economic environment is gradually affected. For example, the level of economic development changes gradually in a country and consequently brings gradual changes in related variables like income, consumption and price. Also, consumers' tastes and preferences evolve relatively slowly as social and economic conditions change and as new marketing channels or techniques are developed. The gradual increase in technology available to industry and to the government may produce changes that are not dramatic but that will have some influence in any particular period of time. In other words, it seems reasonable to assume that in at least some situations the distribution functions of variables, like sales, price or costs, could be gradually changing in time. It is important to emphasize that we are referring to gradual changes, the effects of which are not perfectly predictable in advance for a particular period.

If a data generating process characterized by some parameter θ is nonstationary, then it is not particularly realistic to make
inferences and decisions concerning θ as if θ took on only a single value. Instead we should be concerned with a sequence θ_1, θ_2, ..., of values of θ corresponding to different time periods, assuming the characteristics of the process vary across time but are relatively constant within a given period. Some researchers have studied this problem with particular stochastic processes.

Chernoff and Zacks (1964) studied what they called a "tracking" problem. Observations are taken on the successive positions of an object traveling on a path, and it is desired to estimate its current position. If the path is smooth, regression estimates seem appropriate. However, if the path is subjected to occasional changes in direction, regression will give misleading results. Their objective was to arrive at a simple formula which implicitly accounts for possible changes in direction and discounts observations taken before the latest change. Successive observations were assumed to be taken on n independently and normally distributed random variables with means μ_1, μ_2, ..., μ_n. Each mean is equal to the preceding mean except when an occasional change takes place. The object is to estimate the current mean μ_n. They studied the problem from a Bayesian point of view and made the following assumptions: the time points of change obey an arbitrary specified a priori probability distribution; the amounts of change in the means (when changes take place) are independently and normally distributed random variables with zero mean; and the current mean μ_n is a normally distributed random variable with zero mean. Using a quadratic loss function and a uniform prior distribution for μ_1 on
the whole real line, they derived a Bayes estimator of μ_n. In addition they derived the minimum variance linear unbiased (MVLU) estimator of μ_n. Comparing both estimators, they found that although the MVLU estimator is considerably simpler than the Bayes estimator, when the expected number of changes in the mean is neither zero nor n-1 the Bayes estimator is more efficient than the MVLU.

Chernoff and Zacks studied an alternative problem in which the prior distribution of time points of change is such that there is at most one change. This problem leads to a relatively simple Bayes estimator. However, difficulties may arise if this estimator is applied when there are actually two (or more) changes. The suggested technique starts at the end of a series, searches back for a change in mean and then estimates the mean value of the series forward from the point at which such a change is assumed to have occurred. They designed a procedure to test whether a change in mean has occurred and found a simpler test than the one used by Page (1954, 1955). Most of the results appearing in this paper were derived in a previous paper by Barnard (1959) in a somewhat different manner, but the general results are essentially the same.

The previous paper by Chernoff and Zacks motivated some research in the following years. Mustafi (1968) considered a situation in which a random variable is observed sequentially over time and the distribution of this random variable is subjected to a possible change at every point in the sequence. The study of this problem is centered about the model introduced by Chernoff and Zacks.
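The flavor of these tracking problems can be conveyed by a minimal sketch of sequential Bayesian updating of a drifting normal mean. This is an illustrative random-walk (Kalman-type) filter, not the exact estimator of Chernoff and Zacks or of any paper cited here: the mean is assumed to drift as μ_t = μ_{t-1} + w_t with Var(w_t) = q, observed as x_t = μ_t + v_t with Var(v_t) = r, and all names and parameter values are hypothetical:

```python
def track_mean(observations, q, r, mu0=0.0, p0=1e6):
    """Sequential Bayesian (Kalman-type) estimate of the current mean of a
    series whose mean may drift over time; q is the assumed drift variance
    and r the measurement-error variance."""
    mu, p = mu0, p0
    for x in observations:
        p = p + q                 # drift inflates prior uncertainty
        k = p / (p + r)           # weight given to the new observation
        mu = mu + k * (x - mu)    # posterior mean for the current period
        p = (1 - k) * p           # posterior variance
    return mu

# A step change mid-series pulls the estimate toward the new level,
# discounting observations taken before the change.
data = [0.1, -0.2, 0.0, 5.1, 4.9, 5.0]
estimate = track_mean(data, q=0.5, r=1.0)
assert estimate > 3.0
```

With q = 0 the filter collapses toward the ordinary sample mean, while larger q discounts older observations more heavily, which is the trade-off these tracking papers formalize.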
Three aspects of the problem were considered by Mustafi. First, he considered the problem of estimating the current value of the mean on the basis of a set of observations taken up to the present. Chernoff and Zacks assumed that certain parameters occurring in the model were known. Mustafi then derives a procedure for estimating the current value of the mean on the basis of a set of observations taken at successive time points when nothing is known about the other parameters occurring in the model. Second, Mustafi estimated the various points of change in the framework of an empirical Bayes procedure and used an idea similar to that of Taimiter (1966) to derive a sequence of tests to be applied at each stage. Third, he considers n independent observations of a random variable that belong to the one-parameter exponential family taken at successive time points. He examines the problem of testing the equality of these n parameters against the alternative that the parameter has changed r times at some unknown points, where r is some finite positive integer less than n. He developed a test procedure generalizing the techniques used by Kander and Zacks (1966) and Page (1955).

Hinich and Farley (1966) also studied the problem of estimation models for time series with nonstationary means. They assumed a model similar to the one developed by Chernoff and Zacks except that they assumed that the number of points of change per unit time is Poisson distributed with a known shift rate parameter. They found an estimator for the mean which is unbiased and efficient. It also turned out to be a linear combination of the vector of observations.
The Farley-Hinich technique attempts to estimate jointly the level of the mean at the beginning of a series as well as the size of the change (if any). Farley and Hinich in a later paper (1970) compared the method developed in their 1966 paper with the one presented by Chernoff and Zacks (1964) and later generalized by Mustafi (1968). Some ways were examined to systematically track time series which may contain small stochastic mean shifts as well as random measurement errors. A "small" shift is one which is small relative to measurement error. Three approaches were tested with artificial data, by means of Monte Carlo methods, using mean shifts which were rather small, that is, mean shifts which were half the magnitude of the random measurement error variance. Several false starts with actual marketing data showed that there was an identification problem to provide an adequate test of the procedures' performance, and artificial data of known configuration provided a more natural starting point. Two techniques (one developed by the authors and the other by Chernoff and Zacks) involved formal estimation under the assumption that there was at most one discrete jump in a data record of fixed length of the type often stored in an information system. Both techniques performed reasonably well when the rate of shift occurrence was known, but both techniques are very sensitive to prior specification of the rate at which shifts occur in terms of both classes of errors, that is, missing shifts which occur and identifying "shifts" which do not occur. Knowing the shift rate precisely and knowing that more than one shift in a record
is extremely unlikely are two very severe restrictions for many applications. A simpler filter technique was tested similarly with more promising results in terms of avoiding both classes of errors. The filter approach involved first smoothing the series and then implementing ad hoc decision rules based on consecutive occurrences of smoothed values falling outside a predetermined range around the moving average.

Harrison and Stevens have produced two important papers about Bayesian forecasting using nonstationary models. In the first of these papers (1971), they described a new approach to short-term forecasting based on Bayesian principles in conjunction with a multi-state data-generating process. The various states correspond to the occurrence of transient errors and step changes in trend and slope. The performance of conventional systems, like the growth models of Holt (1957), Brown (1963) and Box-Jenkins (1970), is often upset by the occurrence of changes in trend and slope or transients. In Harrison and Stevens' terms a forecasting method differs from a forecasting system: the former processes information in a purely mechanical way. The latter, however, includes people: the person responsible for the forecast and all the people concerned with using the forecasts and supplying information relevant to the resulting actions. It is necessary that people can communicate their information to the method and that the method clearly communicates the uncertain information in such a way that it is readily interpreted and accepted by decision makers.

The basic model, called by them "the dynamic linear model", is defined together with Kalman filter recurrence relations, and a number of model formulations are given based on their result. They first phrase the models in terms of their "natural" parameters and structure, and then translate them into the dynamic linear model form. Some of the models discussed by them are: a) regression models, b) the steady model, c) the linear growth model, d) the general polynomial models, e) seasonal models, f) autoregressive models, and g) moving average models. Multiprocess models introduce uncertainty as to the underlying model itself, and this approach is described in a more general fashion than in their 1971 paper. In the 1976 paper they present a Bayesian approach to forecasting which not only includes many conventional methods, as presented before, but possesses a remarkable range of additional facilities, not the least being its ability to respond effectively in the start-up situation where no prior data history (as distinct from information) is available. The essential foundations of the method are: (a) a parametric (or state space) model, as distinct from
a functional model; (b) probabilistic information on the parameters at any given time; (c) a sequential model definition which describes how the parameters change in time, both systematically and as a result of random shocks; and (d) uncertainty as to the underlying model itself, as between a number of discrete alternatives.

Kamat (1976) developed a smoothed Bayes control procedure for controlling the output of a production process when the quality characteristic is continuous with a linear shift in its basic level. The procedure uses Bayesian estimation with exponential smoothing for updating the necessary parameter estimates. The application of the procedure to real life data is illustrated with an example. Applications of the traditional x̄-chart and the cumulative sum control chart to the same data are also illustrated for comparison.

In Chapter Three of this dissertation we develop a Bayesian model of nonstationarity for normal and lognormal processes. We build our results directly on two papers, Winkler and Barry (1973) and Barry and Winkler (1976). In the first paper they developed a Bayesian model for nonstationary means in a multinormal data-generating process and demonstrated that the presence of nonstationary means can have an impact upon the uncertainty associated with a given random variable that has a normal distribution. Moreover, the nonstationary model considered by
them seems to have more realistic properties than the corresponding stationary model. For example, they found that in the nonstationary model the recent observations are given more weight than the distant ones in determining the mean of the distribution at any given time, and the uncertainty about the parameters of the process is never completely removed. Barry and Winkler (1976) were concerned with the effects of nonstationarity on portfolio decisions. The use of a Bayesian approach to statistical inference and decision provides a convenient framework for studying the problem of changing parameters, both in terms of forecasting security prices and in terms of portfolio decision making. In this thesis a number of extensions to their results are made, thereby removing some of the restrictiveness of their results, and applications are considered in the areas of CVP analysis and life testing.
CHAPTER THREE
NONSTATIONARITY IN NORMAL AND LOGNORMAL PROCESSES

3.1 Introduction

The normal distribution is considered by many persons an important distribution. The earliest workers regarded the distribution only as a convenient approximation to the binomial distribution. However, with the work of Laplace and Gauss its broader theoretical importance spread. The normal distribution became widely and uncritically accepted as the basis of much practical statistical work. More recently a more critical spirit has developed, with more attention being paid to systems of "skew (asymmetric) frequency curves". This critical spirit has persisted, but is offset by developments in both theory and practice. The normal distribution has a unique position in probability theory, and can be used as an approximation to many other distributions. In real world problems, "normal theory" can frequently be applied, with small risk of serious error, when substantially non-normal distributions correspond more closely to observed values. This allows us to take advantage of the elegant nature and extensive supporting numerical tables of normal theory. Most theoretical arguments for the use of the normal distribution are based on forms of central limit theorems. These theorems state conditions under which the distribution of standardized sums of random variables tends to a unit normal distribution as the number of variables in the sum increases, that is, with conditions sufficient to ensure an asymptotic unit normal distribution.
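The central limit effect described above is easy to demonstrate numerically. The following Python sketch (with hypothetical numbers chosen only for illustration, not taken from the dissertation) simulates standardized sums of uniform random variables, which are far from normal individually, and checks that the sums behave approximately like a unit normal variable:

```python
import random
import math

def standardized_sums(n, draws=20000, seed=0):
    """Simulate standardized sums of n i.i.d. Uniform(0,1) variables.

    Each Uniform(0,1) variable has mean 1/2 and variance 1/12, so
    Z = (sum - n/2) / sqrt(n/12) should be approximately N(0,1)
    for moderately large n, by the central limit theorem.
    """
    rng = random.Random(seed)
    zs = []
    for _ in range(draws):
        s = sum(rng.random() for _ in range(n))
        zs.append((s - n / 2) / math.sqrt(n / 12))
    return zs

# The sample mean and variance of Z should be near 0 and 1, and the
# fraction of draws within one standard deviation near 0.6827.
zs = standardized_sums(n=30)
mean = sum(zs) / len(zs)
var = sum((z - mean) ** 2 for z in zs) / (len(zs) - 1)
within_1sd = sum(abs(z) <= 1 for z in zs) / len(zs)
```

With n = 30 summands the simulated moments and the one-standard-deviation coverage already agree closely with unit normal values, which is the sense in which "normal theory" applies with small risk even when the underlying distribution is not normal.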
The normal distribution, for the reasons exposed before, has been widely used, and enumerating the fields of application would be lengthy and not really informative. However, we do emphasize that the normal distribution is almost always used as an approximation, either to a theoretical or an unknown distribution. The normal distribution is well suited to this because its theoretical analysis is fully worked out and often simple in form. Where these conditions are not fulfilled, substitutes for normal distributions should be sought. Even when normal distributions are not used, results corresponding to "normal theory" are often useful as standards of comparison.

The use of normal distributions when the coefficient of variation is large presents many difficulties in some applications. For instance, observed values more than twice the mean would then imply the existence of observations with negative values. Frequently this is a logical absurdity. The lognormal distribution, as defined in equation 2.2.20, is in at least one important respect a more realistic representation of distributions of characters that cannot assume negative values than is the normal distribution. A normal distribution assigns positive probability to such events, while the lognormal distribution does not. The use of the lognormal distribution has been investigated as a possible solution to this problem [see Cohen (1951), Galton (1879), Jenkins (1932) and Yuan (1933)]. In a review of the literature Gaddum (1945) found that the lognormal distribution could be used to describe several processes. In Chapter Two we presented a list of some of the applications of this distribution
to real life problems. Among those applications we emphasized its use in Cost-Volume-Profit analysis and in life testing models. Furthermore, by taking the spread parameter small enough, it is possible to construct a lognormal distribution closely resembling any normal distribution. Hence, even if a normal distribution is felt to be really appropriate, it might be replaced by a suitable lognormal distribution.

As was mentioned in Chapter Two, most research concerned with the normal and lognormal distributions has considered only stationary situations. That is, the parameters (known or assumed to be known) and distributions used are assumed to remain the same in the future. In this third chapter we intend to build a nonstationary model for normal and lognormal processes from a Bayesian point of view. Section 3.2 sets the stage for the development of the nonstationary model. In it, we describe essential features of the Bayesian analysis of normal and lognormal processes including prior, posterior and predictive distributions. Two uncertainty situations are considered in this section: in one the shift parameter, μ, is assumed to be unknown and the spread parameter, σ, is assumed to be known; and in the other, both parameters are assumed to be unknown. In Section 3.3, we develop a particular nonstationary model for the shift parameter of the lognormal distribution, again under the same two uncertainty situations, and provide a comparison of the results with a stationary model.
3.2 Bayesian Analysis of Normal and Lognormal Processes

Before the last decade, most of the Bayesian research dealing with problems of statistical inference and decisions concerning a parameter θ assumed that θ takes on a single value; those models are called stationary models. For example, θ may represent the proportion of defective items produced by a certain manufacturing process; the mean monthly profits of a given company; the mean life of a manufactured product; and so on. In each case θ is assumed to be fixed but not known.

A formal Bayesian statistical analysis articulates the evidence of a sample to be analyzed with evidence other than that of the sample; it is felt that there usually is prior evidence. The non-sample evidence is assessed judgmentally or subjectively and is expressed in probabilistic terms, by means of: (1) a data distribution that specifies the probability of any sample result conditional on certain parameters; and (2) a prior distribution that expresses our uncertainty about the parameters. When judgment in the form of the assessment of a likelihood function to apply to the data is combined with evidence of a sample, we have the likelihood function of the sample. The likelihood function of the sample is combined with the prior distribution via Bayes' theorem to produce a posterior distribution for the parameters of the data distribution, and this is the typical output of a formal Bayesian analysis. If we assume that the prior distribution for the parameters of the data distribution is continuous, then we may express Bayes' theorem as
(3.2.1) f(θ|x) = f(θ|τ) f(x|θ) / f(x|τ), f(x|τ) > 0,

where x denotes the vector of sample observations, θ represents all the unknown parameters, and τ represents the known parameters of the prior distribution of θ. We can interpret f(x|θ) in two ways: (1) for given θ, f(x|θ) gives the distribution of the random vector x; (2) for given x, f(x|θ) as a function of θ, together with all positive multiples, in the usual usage is the likelihood function of the sample. The prior probability of the sample, f(x|τ), is computed from

(3.2.2) f(x|τ) = ∫_Θ f(θ|τ) f(x|θ) dθ,

from which we see that f(x|τ) can be interpreted as the expected value of the likelihood in the light of the prior distribution. Alternatively, f(x|τ) can be interpreted as the marginal distribution of the random vector x with respect to the joint distribution

(3.2.3) f(x,θ|τ) = f(θ|τ) f(x|θ).

Since (3.2.2) can be computed in advance of the sample for any x, we shall frequently refer to the marginal distribution of x as the predictive distribution implied by the specified prior distribution and data distribution.
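For conjugate families the prior-to-posterior revision just described reduces to closed-form arithmetic on a few parameters. The following Python sketch illustrates this for one standard case, a normal data distribution with known variance σ² and a normal prior on the mean; the function names and all numerical values are hypothetical, chosen only for the example:

```python
def posterior_params(m_prior, n_prior, xbar, n, sigma2):
    """Normal data with known variance sigma2, and a conjugate normal
    prior on the mean written as N(m_prior, sigma2 / n_prior), where
    n_prior acts as a 'pseudo sample size'.  After observing a sample
    of size n with mean xbar, the posterior is N(m_post, sigma2/n_post):
    pseudo sample sizes add, and the means combine with weights
    proportional to those sizes."""
    n_post = n_prior + n
    m_post = (n_prior * m_prior + n * xbar) / n_post
    return m_post, n_post

def predictive_params(m, n_pseudo, sigma2, n_future=1):
    """Predictive distribution of a future sample mean of size n_future:
    normal, centered at the current estimate m, with variance equal to
    the remaining parameter uncertainty plus the sampling noise.  This
    is the closed form that the integral defining the predictive
    distribution takes in this conjugate family."""
    var = sigma2 / n_pseudo + sigma2 / n_future
    return m, var

# Hypothetical numbers: prior N(10, 4/2), then a sample of n = 8
# observations with sample mean 11 from a process with sigma2 = 4.
m_post, n_post = posterior_params(m_prior=10.0, n_prior=2.0,
                                  xbar=11.0, n=8, sigma2=4.0)
pred_mean, pred_var = predictive_params(m_post, n_post, sigma2=4.0)
```

Here the posterior mean (10.8) sits between the prior mean and the sample mean, weighted 2:8 by the pseudo and actual sample sizes, and the predictive variance is the sum of the two uncertainty sources. The same two-step pattern (revise, then integrate out the parameter) is what equations (3.2.2) and the predictive relation below express in general.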
If we have a posterior distribution f(θ|x) and if a future random vector x* is to come from f(x*|θ), which may or may not be the same data distribution as in (3.2.2), we may compute

(3.2.4) f(x*|x) = ∫_Θ f(θ|x) f(x*|θ) dθ.

We refer to the distribution so defined as the predictive distribution of a future sample implied by the posterior distribution. It must be understood that (3.2.2) and (3.2.4) are but two instances of the same relationship; sometimes it is worth distinguishing the practical problems arising when predictions refer to the present sample from those arising in connection with predictions about a future sample, that is, a "not-yet-observed" sample.

The revision of the prior distribution gives the statistician a method for drawing inferences about θ, the uncertain expression, quantity or parameter of interest, and for decisions related to θ. In general, then, we may say that the term Bayesian refers to any use or user of prior distributions on a parameter space (although there is some nonparametric Bayesian material also) with the associated application of Bayes' theorem in the analysis of an inferential or decision problem under uncertainty. Such an analysis rests on the belief that in most practical situations the statistician will possess some subjective a priori information concerning the probable values of the parameter. This information may often be reasonably summarized and formalized by the choice of a suitable prior distribution on the parameter space. The fact that the decision maker cannot specify every detail of his prior distribution by direct assessment means that there will often be considerable latitude in the choice of the family of distributions to be used, even though the selection of a particular member within the chosen family will usually be wholly determined by the decision maker's expressed beliefs or betting odds. Three characteristics are particularly desirable for a family of prior distributions: (i) analytical tractability in three aspects; namely
chosen member of the family is really in close agreement with the decision maker's prior judgments about θ and not a mere artifact agreeing with one or two quantitative summarizations of these judgments. A family of prior densities which gives rise to posteriors belonging to the same family is very useful inasmuch as one aspect of mathematical tractability is maintained, and this property has been termed "closure under sampling". For densities which admit sufficient statistics of fixed dimensionality, a concept to be explained later, Raiffa and Schlaifer (1961) have considered a method of generating prior densities on the parameter space that possess the "closure under sampling" property. A family of such densities has been called by them a "natural conjugate family".

To define the concepts of sufficient statistic and sufficient statistic of fixed dimensionality, consider a statistical problem in which a large amount of experimental data has been collected. The treatment of the data is often simplified if the statistician computes a few numerical values, or statistics, and considers these values as summaries of the relevant information in the data. In some problems, a statistical analysis that is based on these few summary values can be just as effective as any analysis that could be based on all observed values. If the summaries are fully informative they are known as sufficient statistics. Formally, suppose that θ is a parameter which takes a value in the space Θ. Also suppose that x is a random variable, or random vector, which takes values in the
sample space S. We shall let f(·|θ₀) denote the conditional probability density function (p.d.f.) of x when θ = θ₀ (θ₀ ∈ Θ). It is assumed that the observed value of x will be available for making inferences and decisions related to the parameter θ. Denote any function T of the observations x a statistic. Loosely speaking, a statistic T is called a sufficient statistic if, for any prior distribution of θ, its posterior distribution depends on the observed value of x only through T(x). More formally, for any prior p.d.f. g(θ) and any observed value x ∈ S, let g(·|x) denote the posterior p.d.f. of θ, assuming for simplicity that for every value of x ∈ S and every prior p.d.f. g, the posterior g(·|x) exists and is specified by Bayes' theorem. Then it is said that a statistic T is sufficient for the family of p.d.f.'s f(·|θ), θ ∈ Θ, if g(·|x₁) = g(·|x₂) for any prior p.d.f. g and any two points x₁ ∈ S and x₂ ∈ S such that T(x₁) = T(x₂).

Now, consider only data generating processes which generate independent and identically distributed random variables x₁, x₂, ..., such that, for any n and any (x₁, x₂, ..., xₙ), there exists a sufficient statistic. Sufficient statistics of fixed dimensionality are those statistics T such that T(x₁, x₂, ..., xₙ) = T = (T₁, T₂, ..., Tₛ), where a particular value Tᵢ is a real number and the dimensionality s of T does not depend on n. Independently of how many elements we sample, only s statistics are needed.

Raiffa and Schlaifer (1961) present the following method for developing the natural conjugate prior for a given likelihood function:
(i) Let the density function of θ be g, where g denotes either a prior or a posterior density, and let k be another function on Θ such that

(3.2.5) 0 < ∫_Θ k(θ) dθ < ∞ and g(θ) = k(θ) [∫_Θ k(θ) dθ]⁻¹.

Then we shall write

(3.2.6) g(θ) ∝ k(θ)

and say that k is a kernel of the density of θ.

(ii) Let the likelihood of x given θ be l(x|θ), and suppose that P and k are functions on x such that, for all x and θ ∈ Θ,

(3.2.7) l(x|θ) = k(x|θ) P(x).

Then we shall say that k(x|θ) is a kernel of the likelihood of x given θ and that P(x) is a residue of this likelihood.

(iii) Let the prior distribution of the random variable θ have a density g'. For any x such that

l*(x|g') = ∫_Θ l(x|θ) g'(θ) dθ > 0,

it follows from Bayes' theorem that the posterior distribution of θ has a density g'' whose value at θ for the given x is

(3.2.8) g''(θ|x) = g'(θ) l(x|θ) N(x), where N(x) = [∫_Θ g'(θ) l(x|θ) dθ]⁻¹.
(iv) Now let k' denote a kernel of the prior density of θ. It follows from the definitions of k and l and of the symbol ∝ that the Bayes formula can be written

(3.2.9) g''(θ|x) = g'(θ) l(x|θ) N(x) = k'(θ) [∫_Θ k'(θ) dθ]⁻¹ k(x|θ) P(x) N(x),

so that

g''(θ|x) ∝ k'(θ) k(x|θ),

where the value of the constant of proportionality for the given x,

(3.2.10) P(x) N(x) [∫_Θ k'(θ) dθ]⁻¹,

can always be determined by the condition

(3.2.11) ∫_Θ g''(θ|x) dθ = 1,

whenever the integral exists.

Before we begin our presentation of a basic Bayesian analysis of normal and lognormal processes we want to emphasize that caution should be exercised in the application of the method developed by Raiffa and Schlaifer, as is pointed out by Box and Tiao (1972). According to them it is often appropriate to analyze data from a scientific investigation on the assumption that the likelihood dominates the prior, for two reasons:

(i) A scientific investigation is not usually undertaken unless information supplied by the investigation is likely to be considerably more precise than information already available, that is, unless it is likely to increase knowledge by a substantial amount. Therefore analysis
with priors which are dominated by the likelihood often realistically represents the true inferential situation.

(ii) Even when a scientist holds strong prior beliefs about the value of a parameter θ, nevertheless, in reporting the results it would usually be appropriate and most convincing to his colleagues if he analyzed the data against a reference prior which is dominated by the likelihood. He could say that, irrespective of what he or anyone else believed to begin with, the posterior distribution represented what someone who a priori knew very little about θ should believe in the light of the data. Reference priors in general mean standard priors dominated by the likelihood. [See Dickey (1973) for a general discussion of Bayesian methods in scientific reporting.]

In general a prior which is dominated by the likelihood is one which does not change very much over the region in which the likelihood is appreciable and does not assume large values outside that range. We shall refer to a prior distribution which has these properties as a locally uniform prior. There are some difficulties, however, associated with locally uniform priors. The choice of a prior to characterize a situation where "nothing" (or, more realistically, little) is known a priori has long been, and still is, a matter of dispute. Bayes tentatively suggested that where such knowledge was lacking concerning the nature of the prior distribution, it might be regarded as uniform. There is an objection to Bayes' postulate. If the distribution of a continuous parameter θ were taken to be locally uniform, then the distribution of log θ, or some other transformation of θ (which might provide equally
sensible bases for parametrizing the problem) would not be locally uniform. Thus, application of Bayes' postulate to different transformations of θ would lead to posterior distributions from the same data which were inconsistent with the notion that nothing is known about θ or functions of θ. This argument is of course correct, but the arbitrariness of the choice of parametrization does not by itself mean that we should not employ Bayes' postulate in practice. Box and Tiao (1972) present an argument for choosing a particular metric in terms of which a locally uniform prior can be regarded as noninformative about the parameters. It is important to bear in mind that one can never be in a state of complete ignorance; further, the statement "knowing little a priori" can only have meaning relative to the information provided by the experiment. A prior distribution is supposed to represent knowledge about parameters before the outcome of a projected experiment is known. Thus, the main issue is how to select a prior which provides little information relative to what is expected to be provided by the intended experiment.

3.3 Nonstationary Model for Normal and Lognormal Means

It was emphasized in Section 2.3 that for many real world data generating processes the assumption of stationarity is questionable. Random parameter variation could be a reasonable assumption when we are concerned with life testing models or with economic variables. For example, in life testing models, when it is assumed that the life of certain parts follows a lognormal distribution, the stationarity
assumption could be expected to hold over short periods of time; but in most cases it would be expected that for a lengthy period, stationarity would be a doubtful assumption. Similarly, in other areas like Cost-Volume-Profit analysis it is doubtful that the stationarity assumption will hold over long periods of time. Variables like sales, costs, and contribution margin are affected by economic, political and environmental factors. In particular, it was pointed out that we are interested in gradual changes, the effects of which are not perfectly predictable in advance for a particular period. If a data generating process characterized by some parameter θ is nonstationary, then it is potentially misleading to make inferences and decisions concerning θ as if θ only took on a single value. Instead we should be concerned with a sequence θ₁, θ₂, ... of values of θ corresponding to different time periods, assuming the characteristics of the process may vary across time.

Several methods have been proposed to study stochastic parameter variation [see Chernoff and Zacks (1964) and Harrison and Stevens (1976)]. Some have claimed that a reasonable approach to the effects of gradual change might be to model the parameters of nonstationary distributions as if they undergo independent random shifts through time [see Barry (1976), Carter (1972), and Kamat (1976)]. Specifically, they suggest the use of a model that assumes that the mean of the distribution has a linear shift. In those papers, it is clearly demonstrated that when it is assumed that the process represented by the model is normal, this linear random shift model allows analytical comparisons to be drawn if it is assumed that the successive increments in the process mean are drawn independently
from a normal population with mean u and variance ρ. We intend to use the same approach in this dissertation. Two cases are considered: μ unknown and σ² known; and both μ and σ² unknown.

3.3.1 μ is Unknown and σ² is Known

For a process that has a normal density function with unknown parameter μ, Raiffa and Schlaifer (1961) show that the natural conjugate prior is normal with parameters m' and σ²/n'. (See Appendix I for the details of their exposition.) From the prior distribution on μ and with a sequence of n independent observations (x₁, x₂, ..., xₙ) from the normal process under consideration [N(μ,σ²)], the posterior distribution in period zero is obtained. If the sample yields sufficient statistics m and n, then the posterior distribution is normal with parameters n₀'' and m₀'' given by

(3.3.1) n₀'' = n₀' + n,

and

(3.3.2) m₀'' = (n₀' m₀' + n m)/(n₀' + n).

If the mean of the distribution does not change from period to period except by the effect of the sample information, then each posterior can be thought of as a prior with respect to the following sample. Thus, the posterior distribution on μ is the prior distribution on μ for the next period; i.e.,

(3.3.3) f'_N(μ | m₁', σ²/n₁') = f''_N(μ | m₀'', σ²/n₀''),
where

(3.3.4) m₁' = m₀'',

and

(3.3.5) n₁' = n₀''.

In general, if we assume that a fixed sample of size n is employed every time a sample is taken and if we assume that the mean is stationary except by the effect of the sample information, then in any given period t the posterior distribution is normal with parameters n_t'' and m_t'' given by

(3.3.6) n_t'' = n_t' + n,

and

(3.3.7) m_t'' = (n_t' m_t' + n m)/(n_t' + n).

This inferential model is called a stationary model since it assumes that neither the distribution nor the parameters change from period to period. In this case it assumes that μ takes on the same value in every period and that f'_t(μ) represents the information available about that value as of the start of the t-th period.

Suppose now that the process generating the observations undergoes a mean shift between successive periods. In particular, inferences about the mean of a normal process are considered when the parameter μ shifts from period to period, with the shifts governed by an independent normal process. Formally, consider a data generating process that generates n observations x_t1, x_t2, ..., x_tn during time
period t according to a normal process with parameters μ_t and σ². Assume that the parameter σ² is known and does not change over time, whereas μ is not known and may vary over time. In particular, values of the parameter for successive time periods are related as

(3.3.8) μ_{t+1} = μ_t + ε_{t+1}, t = 1, 2, ... ,

where ε_{t+1} is a normal "random shock" term independent of μ_t with known mean u and variance σ_ε². That is, μ_t behaves as a random walk. The mean in any period t is equal to the mean in the previous period plus an increment ε, which has a normal distribution with known mean and variance.

Before the sample is taken at time t, we assume that a prior density function could be assessed that represents judgment (based on past experience, past information, etc.) concerning the probabilities for the possible values of μ_t. If the prior distribution of μ_t at the beginning of time period t is represented by f'_t(μ_t), and a sample of size n during period t yields x_t = (x_t1, ..., x_tn), then the prior distribution of μ_t can be revised. Furthermore, at the end of time period t (the beginning of time period t+1) the data generating process is governed by a new mean μ_{t+1}, so it is necessary to use the posterior distribution of μ_t and the relation (3.3.8) to determine the prior distribution of μ_{t+1}. In order to determine the distribution of the parameter μ_{t+1}, a well known theorem could be used. It says that the convolution g(z) of two normal distributions with parameters (μ₁, σ₁²) and (μ₂, σ₂²)
gives a distribution which is normal with mean (μ₁ + μ₂) and variance (σ₁² + σ₂²), i.e.,

(3.3.9) g(z) = f_N(z | μ₁ + μ₂, σ₁² + σ₂²)

[see Mood et al. (1974)]. Thus the distribution of μ_{t+1} is normal, i.e.,

(3.3.10) f_N(μ_{t+1} | m_t'' + u, (σ²/n_t'') + σ_ε²), −∞ < μ_{t+1} < ∞, −∞ < m_t'' + u < ∞, (σ²/n_t'') + σ_ε² > 0.

We could find a simpler expression if we realize that, since σ² and σ_ε² are positive, there must exist n_s such that

(3.3.11) σ_ε² = σ²/n_s, or n_s = σ²/σ_ε².

In other words, the disturbance variance is a multiple of the process variance. The prior distribution of the mean after t periods then simplifies to

(3.3.12) f_N(μ_{t+1} | m_t'' + u, σ²(n_t'' + n_s)/(n_t'' n_s)),

or

(3.3.13) f'_N(μ_{t+1} | m_{t+1}', σ²/n_{t+1}'),

where

(3.3.14) m_{t+1}' = m_t'' + u,
and

(3.3.15) n_{t+1}' = [n_t'' n_s/(n_t'' + n_s)] < n_t''.

The inequality stated above can be interpreted as showing that the presence of nonstationarity produces greater uncertainty (variance) at the start of period t+1 than would be present under stationarity, because in the stationary case n_{t+1}' = n_t''. If we assume that a change in the mean occurs between every two consecutive periods, then we could repeat the previous procedure each time a change occurs to determine the new prior distribution.

For a process that has a lognormal density function as defined in (A1.14), it was shown in Appendix I that, when the unknown parameter is μ, the natural conjugate prior is normal. Thus, the revision of the prior distribution in any given period is identical to the revision in the normal case [see equations (3.3.6) and (3.3.7)] except that m is defined as the sample mean of the natural logarithms of the observed x values. Furthermore, the procedure presented before to represent changes in the mean, μ, of the normal distribution can be used to model changes in the shift parameter μ of the lognormal distribution. The normality of the natural conjugate prior, in this case, allows us to use the formulas (3.3.8)-(3.3.15) to study the behavior of the prior distribution of μ after t periods of time. Since the variance V(x) of the lognormal random variable x is a function of μ and σ² in the lognormal case, nonstationarity in μ means that both the mean and the variance of x are nonstationary, so
that the lognormal case provides a generalization of the normal results.

3.3.2 μ and σ² Both Unknown

The results of the previous section can be extended to the case of unknown mean and variance. The joint natural conjugate prior density function for μ and σ² is a normal-gamma-2 function, as was shown in Appendix I, given by

(3.3.16)  f'_{N-γ-2}(μ, σ² | m', v', n', d') = [√n'/(σ√(2π))] exp[−(n'/2σ²)(μ − m')²] · [(d'v'/2)^{d'/2}/Γ(d'/2)] (σ²)^{−(d'/2 + 1)} exp[−d'v'/2σ²].

Given a prior from this family, and assuming that information is available from a normal (or lognormal) process through a sample of observations x_1, x_2, ..., x_n, it is possible to obtain a posterior distribution of the two parameters μ and σ². It was shown in Appendix I that the posterior distribution is also normal-gamma-2, i.e., f''_{N-γ-2}(μ, σ² | m'', v'', n'', d''), where

(3.3.17)  m'' = (n'm' + nm)/(n' + n),

(3.3.18)  v'' = [d'v' + n'm'² + dv + nm² − n''m''²]/(d' + n),

(3.3.19)  n'' = n' + n,
and

(3.3.20)  d'' = d' + n.

It is clear from (3.3.16) that the joint distribution of μ and σ² is the product of the conditional density of μ given σ² and the marginal density of σ², i.e.,

(3.3.21)  f''_{N-γ-2}(μ, σ² | m'', v'', n'', d'') = f''_N(μ | σ², n'', m'') f''_{γ-2}(σ² | v'', d'').

The marginal density of σ² does not depend on μ.

Now consider the case of nonstationary μ as in the previous section. The independence of the marginal distribution of σ² from μ will be an important factor in our results below. At the end of period t (the beginning of period t+1) the posterior distribution of μ and σ² can be used in conjunction with the relation between μ and the random shock ε to get the joint prior distribution at the beginning of period t+1. As before, the random shock model to be considered is μ_{t+1} = μ_t + ε_{t+1}. We make the assumption that although σ² is unknown, it is known that ε's variance, σ_ε², is 1/n_s times the unknown process variance, σ². As before, assuming that μ_t has a posterior distribution with parameters (m''_t, σ²/n''_t) and that ε is distributed normally with parameters (u, σ²/n_s), it was shown in Appendix I that the convolution z (z = μ + ε) has a conditional density given by

(3.3.22)  g(z) = f''_N(z | m''_t + u, σ²[(1/n''_t) + (1/n_s)]).
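As a check on the bookkeeping, the one-period posterior update (3.3.17)-(3.3.20) can be written as a function. This is a sketch only; the names are illustrative.

```python
def ng2_posterior(m_pr, v_pr, n_pr, d_pr, m, v, n, d):
    """Normal-gamma-2 posterior parameters from the prior (m', v', n', d')
    and the sample statistics (m, v, n, d), eqs. (3.3.17)-(3.3.20)."""
    n_po = n_pr + n                                        # (3.3.19)
    d_po = d_pr + n                                        # (3.3.20)
    m_po = (n_pr * m_pr + n * m) / n_po                    # (3.3.17)
    v_po = (d_pr * v_pr + n_pr * m_pr ** 2 + d * v
            + n * m ** 2 - n_po * m_po ** 2) / d_po        # (3.3.18)
    return m_po, v_po, n_po, d_po

m_po, v_po, n_po, d_po = ng2_posterior(0.0, 1.0, 2.0, 2.0, 3.0, 2.0, 4.0, 4.0)
```

The posterior mean is the usual precision-weighted average of prior mean and sample mean, and v'' pools the prior and sample sums of squares.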
Note that this density is conditional on σ², as is the conjugate prior of μ. Thus the prior density of μ_{t+1}, at the beginning of period t+1 after the random shock has occurred, is given by

(3.3.23)  f'_N(μ_{t+1} | m''_t + u, σ²[(n_s + n''_t)/(n''_t n_s)]).

Since σ² is assumed constant, f'_{γ-2}(σ²) does not change, but equals the posterior distribution at the end of period t. Hence the joint distribution at the beginning of period t+1 is given by

(3.3.24)  f'_{N-γ-2}(μ_{t+1}, σ²) = f'_N(μ_{t+1} | m''_t + u, σ²[(n_s + n''_t)/(n''_t n_s)]) f''_{γ-2}(σ² | v''_t, d''_t).

If we let

(3.3.25)  m'_{t+1} = m''_t + u,

(3.3.26)  n'_{t+1} = n''_t n_s/(n''_t + n_s),

(3.3.27)  d'_{t+1} = d''_t,

and

(3.3.28)  v'_{t+1} = v''_t,

then the distribution of μ and σ² can be written as

(3.3.29)  f'_{N-γ-2}(μ_{t+1}, σ² | m'_{t+1}, v'_{t+1}, n'_{t+1}, d'_{t+1}).

The revision can be continued, since the prior distribution at the beginning of period t+1 is still a normal-gamma-2 distribution. At any time t, the process mean is not known with certainty, but the
information from the samples collected up to time t provides an indication of μ_t. Before the sample is taken at time t, we assume that one is capable of assessing a prior density function that represents our judgment (based on past experience, past information, etc.) concerning the probabilities for the possible values of μ and σ². In effect, one views (μ, σ²) as a pair of random variables to which we have assigned a probability density function; in this case a normal-gamma-2 with parameters m'_t, n'_t, v'_t and d'_t. The sample results at time t can be described in terms of the sufficient statistics m_t, n_t, v_t and d_t: the sample mean, sample size, sample variance and degrees of freedom needed to determine v_t, respectively. Using these sample results, a new posterior distribution can be obtained which is normal-gamma-2. The tractability of the model is maintained when a natural conjugate prior is used and a shift model of the form (3.3.8) is assumed for the changes of the parameter μ between two consecutive periods. Hence, after t periods of time the joint distribution of μ and σ² is normal-gamma-2; that is,

(3.3.30)  f'_{N-γ-2}(μ_{t+1}, σ²_{t+1} | m'_{t+1}, v'_{t+1}, n'_{t+1}, d'_{t+1}),

where

(3.3.31)  d'_{t+1} = d'_1 + tn,

(3.3.32)  n'_{t+1} = (n'_t + n)n_s/[(n'_t + n) + n_s],
(3.3.33)  v'_{t+1} = [d'_t v'_t + n'_t m'_t² + dv_t + nm_t² − n''_t m''_t²]/(d'_t + n),

and

(3.3.34)  m'_{t+1} = (n'_t m'_t + nm_t)/(n'_t + n).

In this manner, a sequence of prior and posterior distributions for successive μ_t may be obtained as successive values of the random vector x_t = (x_{1t}, ..., x_{nt}) are observed.

For a process that has a lognormal density function as defined in (A1.14), it was shown before that when both parameters are unknown the joint natural conjugate prior is normal-gamma-2. Thus the revision of the prior distribution in any given period is identical to the revision in the normal case. Furthermore, the procedure presented previously to represent changes in the mean, μ, of the normal distribution can be used to model changes in the shift parameter μ of the lognormal. The fact that both normal and lognormal distributions have a joint natural conjugate prior which is normal-gamma-2 allows us to use the formulas (3.3.30)-(3.3.34) to study the behavior of the prior distribution of μ and σ² after t periods.

3.3.3 Stationary Versus Nonstationary Results

Stationary conditions, in the context of our discussion, imply that there is no shift in the mean, μ, of the distribution; that is, ε_t = 0 and consequently u and σ_ε² are both zero. Successive values of μ_t are the same across time, i.e., μ_1 = μ_2 = ... = μ_t. For the case when
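Iterating the one-period posterior update and the shock update reproduces the recursions (3.3.30)-(3.3.34). A sketch, assuming zero drift (u = 0) and illustrative names:

```python
def evolve(m_pr, v_pr, n_pr, d_pr, samples, n, d, n_s):
    """Alternate sample updating and mean shifts over successive periods,
    returning the prior parameters for the next period (u = 0 assumed)."""
    for m, v in samples:
        # posterior update, eqs. (3.3.17)-(3.3.20)
        n_po = n_pr + n
        d_po = d_pr + n
        m_po = (n_pr * m_pr + n * m) / n_po
        v_po = (d_pr * v_pr + n_pr * m_pr ** 2 + d * v
                + n * m ** 2 - n_po * m_po ** 2) / d_po
        # shock step, eqs. (3.3.25)-(3.3.28): m, v, d carry over; n shrinks
        m_pr, v_pr, d_pr = m_po, v_po, d_po
        n_pr = n_po * n_s / (n_po + n_s)
    return m_pr, v_pr, n_pr, d_pr

m3, v3, n3, d3 = evolve(0.0, 1.0, 2.0, 2.0,
                        samples=[(0.0, 1.0)] * 3, n=5, d=4, n_s=10.0)
```

After t periods the degrees of freedom have grown by tn as in (3.3.31), while n' remains bounded by n_s, so the uncertainty about μ is never fully eliminated.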
only μ is unknown, this implies that equation (3.3.10) becomes

(3.3.35)  f'_N(μ_{t+1} | m''_t + 0, (σ²/n''_t) + 0),

or

(3.3.36)  f'_N(μ_{t+1} | m''_t, σ²/n''_t).

Under stationarity, then, the prior distribution of μ_{t+1} at the start of period t+1 is the same as the posterior distribution of μ at the end of period t. In the case of nonstationarity with no drift, u = 0; in other words, the distribution of ε is normal with mean 0 and variance σ_ε². For this case it is clear that, for a given posterior distribution of μ at time t, the only difference between the prior distribution of μ_{t+1} under stationarity (see equation 3.3.36) and the prior distribution of μ_{t+1} under nonstationarity (see equation 3.3.10) is the variance term. The prior variance of μ_{t+1} under stationarity is

(3.3.37)  Var_s(μ_{t+1}) = σ²/n'_{t+1} = σ²/n''_t;

whereas the prior variance of μ_{t+1} under nonstationarity is

(3.3.38)  Var_ns(μ_{t+1}) = σ²/n'_{t+1} = (σ²/n''_t) + (σ²/n_s) = σ²[(1/n''_t) + (1/n_s)].

As expected, the incorporation of the nonstationary condition has caused an increase in the variance of the prior distribution. The variance increased by an amount σ²/n_s; that is, by an amount equal
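Equations (3.3.37)-(3.3.38) amount to the following comparison (an illustrative sketch; names are ours):

```python
def prior_variance(sigma2, n_post, n_s=None):
    """Prior variance of mu_{t+1}: eq. (3.3.37) when stationary
    (n_s is None), eq. (3.3.38) under a nonstationary mean."""
    var = sigma2 / n_post
    if n_s is not None:
        var += sigma2 / n_s   # inflation by the shock variance sigma^2/n_s
    return var

v_stat = prior_variance(4.0, 8.0)              # sigma^2 / n''
v_nonstat = prior_variance(4.0, 8.0, n_s=2.0)  # inflated by sigma^2 / n_s
```

The difference between the two is exactly σ²/n_s, the variance of the increment process.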
to the variance of the distribution of successive increments in the process mean. For the stationary case

(3.3.39)  [n'_{t+1}]⁻¹ = [1/n''_t],

and for the nonstationary case

(3.3.40)  [n'_{t+1}]⁻¹ = [(1/n''_t) + (1/n_s)].

Thus, equivalently, we can say that for a given posterior distribution of μ at time t, the only difference between the prior distributions of μ_{t+1} is that the term n'_{t+1} is larger under the stationary condition. When u ≠ 0, m'_t is always changing and, therefore, there is a difference in mean as well as in variance.

Stationary conditions, in the case when both μ and σ² are unknown, imply that in any given period t+1 the joint prior density for μ and σ² is a normal-gamma-2 of the form given in equations (3.3.30)-(3.3.34). That is,

(3.3.41)  f'_{N-γ-2}(μ_{t+1}, σ² | m'_{t+1}, v'_{t+1}, n'_{t+1}, d'_{t+1}) = f'_N(μ_{t+1} | m'_{t+1}, σ²/n'_{t+1}) f'_{γ-2}(σ² | v'_{t+1}, d'_{t+1}),

where

(3.3.42)  m'_{t+1} = m''_t,

(3.3.43)  v'_{t+1} = v''_t,

(3.3.44)  n'_{t+1} = n''_t,
and

(3.3.45)  d'_{t+1} = d''_t.

Under stationarity, then, the joint prior distribution of μ and σ² at the start of period t+1 is the same as the posterior distribution of μ and σ² at the end of period t. Since the distribution of σ² does not depend on μ, only on the parameters d and v, we can model changes in μ. These changes in the mean only affect the factor f'_N(μ_{t+1} | m'_{t+1}, σ²/n'_{t+1}) in equation (3.3.41). In fact, the effect of the nonstationarity assumption on f'_N(μ_{t+1}) is identical to the effect of nonstationarity on the prior distribution in the case when only μ was the unknown parameter. In the case of nonstationarity with no drift, i.e., u = 0, for a given posterior distribution of μ and σ² at time t, the joint prior density function for μ and σ² is similar to its stationary counterpart, as given in equation (3.3.41), except for the fact that the variance of f'_N(μ_{t+1} | m'_{t+1}, σ²/n'_{t+1}) is larger in the nonstationary case than in the stationary case. In other words, σ²/n'_{t+1} in the stationary case is smaller than σ²/n'_{t+1} in the nonstationary case.

The nonstationarity assumption also affects the predictive distribution. For the case when μ is the unknown parameter and the data generating process is normal, assume that after t periods we have a posterior distribution f''(μ_t) which is normal with mean m''_t and variance σ²/n''_t. The predictive distribution at the end of period t was shown in equation (A1.32) to be normal with mean
(3.3.46)  E_t(x_t) = m''_t,

and variance

(3.3.47)  Var_t(x_t) = σ²[(1 + n''_t)/n''_t] = σ²[1 + (1/n''_t)].

If the process is stationary, then the predictive distribution of the random variable of interest at the beginning of period t+1 is the same as the distribution we had at the end of period t, i.e., N(m''_t, σ²[(1 + n''_t)/n''_t]). However, if we assume the nonstationary condition, the prior distribution of μ at the start of period t+1 has a different mean and a different variance. Consequently the predictive distribution changes in mean and variance between consecutive time periods. In other words, E_{t+1}(x_{t+1}) is always changing, depending on the stochastic change of the mean μ_{t+1}. In the case of nonstationarity with no drift, i.e., u = 0, for a given posterior distribution of μ at time t, the only difference between the predictive distribution of x_{t+1} under stationarity and the predictive distribution of x_{t+1} under nonstationarity is the variance term. The variance of x_{t+1} under stationarity, at the start of time period t+1, is

(3.3.48)  Var_{t+1}(x_{t+1}) = σ²[(1 + n'_{t+1})/n'_{t+1}] = σ²[1 + (1/n'_{t+1})].

It was stated previously that the parameter n'_{t+1} is smaller when μ is unknown and nonstationary than when μ is unknown but stationary. Hence, as expected, the variance of the predictive distribution, Var_{t+1}(x_{t+1}), is larger when μ is nonstationary. This has some implications for the determination of prediction intervals, which
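The dependence of the predictive variance (3.3.47)-(3.3.48) on the precision factor n can be sketched as follows (illustrative names):

```python
def predictive_variance(sigma2, n_param):
    """Normal predictive variance sigma^2 * [1 + 1/n], eqs. (3.3.47)-(3.3.48)."""
    return sigma2 * (1.0 + 1.0 / n_param)

# A smaller n (the nonstationary case) gives a larger predictive variance.
v_stat = predictive_variance(2.0, 4.0)
v_nonstat = predictive_variance(2.0, 2.0)
```

Since the nonstationary n'_{t+1} is always below its stationary counterpart, the predictive variance is always inflated under nonstationarity.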
we will discuss in detail in Chapter Four. Nonstationarity implies greater uncertainty, which is reflected by an increase in the measure of uncertainty, the variance.

For the case when both μ and σ² are the unknown parameters and the data generating process is normal, assume that after t periods we have a posterior distribution f''(μ_t, σ²_t) which is normal-gamma-2 with parameters m''_t, n''_t, v''_t and d''_t. The predictive distribution at the end of period t was shown in equation (A1.33) to be Student with mean

(3.3.49)  E_t(x_t) = m''_t,  d''_t > 1,

and variance

(3.3.50)  Var_t(x_t) = [v''_t(n''_t + 1)/n''_t][d''_t/(d''_t − 2)],  d''_t > 2.

Again, if the process is stationary, then the predictive distribution at the beginning of period t+1 is the same as the distribution that we had at the end of period t, i.e., ST(m''_t, [v''_t(n''_t + 1)/n''_t][d''_t/(d''_t − 2)]). When we assume the nonstationary condition, the joint prior distribution of μ and σ² at the start of period t+1 changes from its original form at the end of period t. The specific random model we are assuming causes the parameters m and n of the distribution of μ to change from the end of period t to the start of period t+1. Therefore the predictive distribution f'_{t+1}(x_{t+1}) has a different mean and variance than f''_t(x_t). In the case of nonstationarity with no drift, i.e., u = 0, for a given posterior distribution of μ and σ²
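The Student predictive moments (3.3.49)-(3.3.50) in code (a sketch; the existence conditions on d'' are enforced explicitly, and the names are ours):

```python
def student_predictive_moments(m, v, n, d):
    """Mean and variance of the Student predictive distribution,
    eqs. (3.3.49)-(3.3.50): the mean exists for d > 1, the variance for d > 2."""
    mean = m if d > 1 else None
    var = (v * (n + 1.0) / n) * (d / (d - 2.0)) if d > 2 else None
    return mean, var

mu_pred, var_pred = student_predictive_moments(m=1.0, v=2.0, n=4.0, d=6.0)
```

The factor d/(d − 2) shows how the estimated process variance v is inflated when the degrees of freedom are low.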
at time t, the only difference between the predictive distribution of x under stationarity vis-a-vis nonstationarity is the variance term. Observing equation (3.3.50) closely, we note that the effect of nonstationarity is the same as in all previous cases; that is, the parameter n'_{t+1} is smaller when μ is nonstationary and therefore the variance is larger. In this case, since both μ and σ² are unknown, at the end of period t our estimate of the variance is v''_t, which includes all the information that we have available at the time, including sample information.

A comparison of stationary versus nonstationary results when the data generating process is lognormal moves along the same lines as for the normal process. For the case where the unknown parameter is μ, the nonstationarity condition causes an increase in the variance and in the mean of the normal prior distribution, which causes an increase in the mean and variance of the lognormal predictive distribution. Similarly, for the case when both parameters are unknown, the condition causes an increase in mean and variance in the prior distribution of μ and a change in the joint prior distribution of μ and σ² which affects the logStudent predictive distribution. The logStudent predictive distribution has infinite mean and variance, which are not affected by the nonstationary condition.

3.4 Conclusion

In this chapter we modeled nonstationarity in the mean of normal and lognormal processes under two uncertainty assumptions.
The model is built upon the Bayesian analysis of normal processes of Raiffa and Schlaifer (1961) and upon the analysis of nonstationary means of normal processes, for unknown μ, of Barry (1973). We extended the nonstationary results of Barry (1973) to the lognormal distribution. The variance of the lognormal distribution is given by

(3.4.1)  Var(x) = w(w − 1) e^{2μ},  where w = exp(σ²).

Since V(x) is a function of μ and σ² in the lognormal case, nonstationarity in μ means that both mean and variance of x are nonstationary, so that the lognormal case provides a generalization of the normal results. Furthermore, we developed the nonstationary model for the mean of normal and lognormal processes for the case when both parameters, μ and σ², are unknown. For each group of assumptions we noted that, in every time period t, the uncertainty is never fully eliminated from the model.

In Chapter Two we emphasized that the exponential distribution is often used to represent life testing models. All the research in the area of life testing where this distribution has been used has assumed stationary conditions for the parameters of the model and for the model itself. Appendix II shows the Bayesian modeling of nonstationarity for the parameters of an exponential distribution using random shock models. Only under very trivial assumptions does the analysis yield tractable and consequently useful results. On the other hand, as was shown in this chapter, the normal
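Equation (3.4.1) agrees with the familiar closed form Var(x) = e^{2μ+σ²}(e^{σ²} − 1); a quick numerical check (illustrative sketch):

```python
import math

def lognormal_variance(mu, sigma2):
    """Var(x) = w(w - 1) exp(2*mu) with w = exp(sigma2), eq. (3.4.1)."""
    w = math.exp(sigma2)
    return w * (w - 1.0) * math.exp(2.0 * mu)
```

Because μ enters through e^{2μ}, any shift in μ moves the variance of x as well as its mean, which is the sense in which the lognormal case generalizes the normal one.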
and lognormal distributions provide results that are especially tractable. In any given period t, the prior, posterior and predictive distributions depend on the parameters m_t and n_t when only μ is unknown, and on the parameters m_t, n_t, v_t and d_t when both μ and σ² are unknown. Under the nonstationarity conditions, these parameters change from period to period not only because new information becomes available through the sample, but because of the additional uncertainty involving the shifts in the parameter μ. To make better use of these distributions, the decision maker must know how they are evolving through time. Management requires realistic and accurate information to aid in decision making. For instance, the decision maker may be interested in knowing how the variance of the distribution of the mean, μ, changes across time. Furthermore, since one of the objectives of the user of the distribution is to construct prediction intervals for the process variable, he may be interested in knowing how the variance of the predictive distribution behaves as the number of observed periods increases. We will address this problem in detail in Chapter Four through the study of the limiting behavior of the parameters m_t, n_t, v_t and d_t. In addition, attention will be focused on the methods of constructing prediction intervals for the normal, Student, lognormal and logStudent distributions under various uncertainty conditions.
CHAPTER FOUR
LIMITING RESULTS AND PREDICTION INTERVALS FOR NONSTATIONARY NORMAL AND LOGNORMAL PROCESSES

4.1 Introduction

In Chapter Three we emphasized that for many real world data generating processes the assumption of stationarity is questionable, and stochastic parameter variation seems to be a reasonable assumption. If a data generating process characterized by some parameter is nonstationary, then it is potentially misleading to make inferences and decisions concerning the parameter as if it only took on a single value. We should be concerned with a sequence of values of the parameter corresponding to different time periods. It was shown in Chapter Three that if we use a particular stochastic model we can model nonstationarity for the shift parameter of normal and lognormal processes from a Bayesian viewpoint, under two uncertainty conditions, and that we can obtain tractable results. In particular, values of the parameter for successive time periods are assumed to be related as

(4.1.1)  μ_{t+1} = μ_t + ε_{t+1},  t = 1, 2, ...,

where ε_{t+1} is a normal "random shock" term independent of μ_t with known mean u and variance σ_ε². The mean in any period t is equal to the mean in the previous period plus an increment ε, which has a normal distribution with known mean. Comparing the stationary with the nonstationary processes, we pointed out that when the data generating process is normal or log-
normal and the unknown parameter is μ, the nonstationary condition causes in any given period t an increase in the variance of the normal prior distribution. This causes an increase in the variance of the normal predictive distribution for normal processes, and an increase in the mean and variance of the lognormal predictive distribution for lognormal processes. When both parameters, μ and σ², are unknown, a similar result is found for the prior and predictive distributions of the normal and lognormal data generating processes.

The results discussed in Chapter Three have to do with the period to period effects of random parameter variation upon the prior and predictive distributions. However, the asymptotic behavior of the model has important implications for the decision maker. For instance, when only μ is the unknown parameter, under constant parameters the uncertainty about μ eventually is eliminated, since n'_t increases without bound and the sequence of prior variances (σ²/n'_t) converges to zero. Hence the distribution of μ eventually will be unaffected by further samples. On the other hand, shifting parameters can increase the uncertainty under which a decision must be made, since they reduce the information content that past samples offer for the actual situation. Increases in uncertainty caused by stochastic parameter variation have important implications for the decision maker, since his decisions depend upon the uncertainty under which they are made. Similarly, random parameter variation produces important differences in the limiting behavior of the prior and predictive distributions when μ and σ² are the unknown parameters.

In Section 4.2 we study the limiting behavior of the param-
eters m'_t, v'_t, n'_t and d'_t of the prior and predictive distributions for the normal and lognormal data generating processes. In addition, we discuss the implications of these limiting results for the inferences and decisions based on the posterior and predictive distributions.

In any period t, all the information contained in the initial prior distribution and in subsequent samples is fully reflected in the posterior and the predictive distributions. In some applications, partial summaries of the information are of special importance. One important way to partially summarize the information contained in the posterior distribution is to quote one or more intervals which contain a stated amount of probability. Often the problem itself will dictate certain limits which are of special interest. A rather different situation occurs when there are no limits of special interest, but an interval is needed to show a range over which "most of the probability lies". One objective of this thesis is to develop Bayesian prediction intervals for future observations that come from normal and lognormal data generating processes. In particular, we are interested in most plausible Bayesian prediction intervals of cover β, as were defined in Section 2.2. In Section 4.3 we discuss the problem of constructing prediction intervals for normal, Student, lognormal and logStudent distributions. It is pointed out that it is easy to construct these intervals for the normal and Student distributions, but that it is rather difficult for the lognormal and logStudent distributions. An algorithm is presented to compute the Bayesian prediction intervals for the lognormal and logStudent distributions. In addition, we discuss the relationship that
exists between Bayesian prediction intervals under nonstationarity and classical certainty equivalent and Bayesian stationary intervals.

4.2 Special Properties and Limiting Results Under Nonstationarity

4.2.1 Limiting Behavior of m'_t and n'_t When μ Is the Only Unknown Parameter

For a process that has a normal density function with unknown parameter μ, Raiffa and Schlaifer (1961) show that the natural conjugate prior distribution is normal with parameters m' and σ²/n'. In Section 3.3 we pointed out that if the mean, μ, of the data generating process does not change from period to period except by the effect of the sample information, then each posterior can be thought of as a prior with respect to a subsequent sample. In general, if we assume that a sample of size n_t is employed every time a sample is taken [which yields the statistic m_t = (Σ_{i=1}^{n_t} x_{it})/n_t], and if we assume that the mean μ is stationary, then in any given period t the posterior distribution of μ is normal with parameters n''_t and m''_t given by

(4.2.1)  n''_t = n'_t + n_t

and

(4.2.2)  m''_t = (n'_t m'_t + n_t m_t)/(n'_t + n_t).

In order to study the limiting values of n'_t and m'_t under stationary conditions, we have to characterize the posterior and predictive distributions after t periods of time have elapsed. Since the limiting results under nonstationary means will be based on a fixed sample size
each period, we will make the same assumption for the stationary limiting results, that is, n_t = n for all t. In period one, for a process that has a normal density function with unknown parameter μ, i.e., f(x | μ), the natural conjugate prior is normal with mean m'_1 and variance σ²/n'_1, i.e., f_N(μ_1 | m'_1, σ²/n'_1). If a sample of size n from a normal process yields the sufficient statistics m_1 and n, then the posterior and predictive distributions at the end of period one are given by

(4.2.3)  f''_N[μ_1 | (n'_1 m'_1 + nm_1)/(n'_1 + n), σ²/(n'_1 + n)] = f''_N(μ_1 | m''_1, σ²/n''_1), or f'_N(μ_2 | m'_2, σ²/n'_2),

and

(4.2.4)  f_N(x_1 | m''_1, σ²(1 + n''_1)/n''_1),

respectively. In period two, if a sample is taken from a normal process that yields the sufficient statistics m_2 and n, then the posterior and predictive distributions at the end of the period are given by

(4.2.5)  f''_N[μ_2 | [n'_1 m'_1 + n(m_1 + m_2)]/(n'_1 + 2n), σ²/(n'_1 + 2n)] = f''_N(μ_2 | m''_2, σ²/n''_2), or f'_N(μ_3 | m'_3, σ²/n'_3),

and

(4.2.6)  f_N(x_2 | m''_2, σ²(1 + n''_2)/n''_2),

respectively.
After t samples are taken, the posterior and predictive distributions are given by

(4.2.7)  f''_N(μ_t | m''_t, σ²/n''_t)

and

(4.2.8)  f_N(x_t | m''_t, σ²(1 + n''_t)/n''_t),

where

(4.2.9)  m''_t = (n'_1 m'_1 + n Σ_{i=1}^t m_i)/(n'_1 + tn)

and

(4.2.10)  n''_t = n'_1 + tn.

We pointed out in Chapter Three that if the data generating process is lognormal with unknown parameter μ, then the natural conjugate prior is normal and the predictive distribution is lognormal. For this case, after t samples are taken that yield sufficient statistics (m_1, n), (m_2, n), ..., (m_t, n), where m_t = [Σ_{i=1}^n ln x_{it}]/n, the posterior and predictive distributions are normal and lognormal, respectively, with parameters m''_t and n''_t as defined in (4.2.7)-(4.2.10). The mean and variance of the predictive distribution when the data generating process is normal are given by

(4.2.11)  E(x_t) = m''_t

and

(4.2.12)  V(x_t) = σ²(n''_t + 1)/n''_t.

On the other hand, the mean and variance of the predictive distribution for the lognormal process are given by
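Under stationarity, sequential period-by-period updating reproduces the closed forms (4.2.9)-(4.2.10); a small check (the names are illustrative):

```python
def stationary_posterior(m1, n1, sample_means, n):
    """Apply (4.2.1)-(4.2.2) once per period; the result matches
    m''_t and n''_t of eqs. (4.2.9)-(4.2.10)."""
    m, npar = m1, n1
    for mbar in sample_means:
        m = (npar * m + n * mbar) / (npar + n)
        npar = npar + n
    return m, npar

m_seq, n_seq = stationary_posterior(0.0, 2.0, [1.0, 2.0, 3.0], n=4)
# batch form (4.2.9): (n'_1 m'_1 + n * sum(m_i)) / (n'_1 + t n)
m_batch = (2.0 * 0.0 + 4 * (1.0 + 2.0 + 3.0)) / (2.0 + 3 * 4)
```

Every sample mean carries the same weight n in the final posterior mean, which is the defining feature of the stationary case.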
(4.2.13)  E(x_t) = exp[m''_t + σ²(1 + n''_t)/2n''_t]

and

(4.2.14)  Var(x_t) = exp(2m''_t) w(w − 1),  where w = exp[σ²(1 + n''_t)/n''_t].

The mean and variance of the posterior distribution of μ for the normal and lognormal cases are given by

(4.2.15)  E(μ_t) = m''_t

and

(4.2.16)  Var(μ_t) = σ²/n''_t.

Since n is a positive integer, n''_t (= n'_1 + tn) increases without bound as t increases, so that the variance of the posterior distribution of μ for the normal and lognormal cases approaches zero as t increases. Intuitively, in the stationary case, the distribution of the unknown parameter becomes tighter as more information is obtained. As expected, when the data generating process is normal, the variance of the predictive distribution approaches the process variance, σ², as t increases, i.e.,

(4.2.17)  lim_{t→∞} {σ²(1 + n''_t)/n''_t} = lim_{t→∞} {(σ²/n''_t) + σ²} = σ²,

since the uncertainty about μ is eliminated as t approaches infinity. In any given period t, m''_t is a weighted average of the prior mean at period one, m'_1, and of all past sample means, m_1, m_2, ..., m_t.
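The lognormal predictive moments (4.2.13)-(4.2.14) follow from the moments of a lognormal variable whose log-scale mean is m''_t and whose log-scale variance is σ²(1 + n''_t)/n''_t; a sketch with illustrative names:

```python
import math

def lognormal_predictive_moments(m_post, n_post, sigma2):
    """Predictive mean and variance for the lognormal process with known
    sigma^2, eqs. (4.2.13)-(4.2.14)."""
    s2 = sigma2 * (1.0 + n_post) / n_post   # log-scale predictive variance
    w = math.exp(s2)
    mean = math.exp(m_post + s2 / 2.0)
    var = math.exp(2.0 * m_post) * w * (w - 1.0)
    return mean, var
```

As n''_t grows, s2 falls toward σ², so w converges and the moments are eventually driven by m''_t alone.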
All sample means up to period t are given the same weight, n, in the determination of the posterior mean, m''_t; in other words, recent observations receive the same weight as not-so-recent ones. Moreover, in any period t, the prior information contained in the parameter m'_1 has a weight n'_1/(n'_1 + tn), which decreases as t increases.

The variance of the predictive distribution for the lognormal case depends on the parameters n''_t and m''_t [see (4.2.14)]. For t very large, the factor w(w − 1) approaches a constant, since w = exp[σ²(1 + n''_t)/n''_t] approaches exp(σ²). As t increases, the changes in the predictive variance are produced solely by changes in m''_t, since w(w − 1) is convergent. The mean of the predictive distribution for the lognormal case also depends on n''_t and m''_t [see (4.2.13)]. Since the variance σ²/n''_t approaches zero as t increases, the posterior mean m''_t approaches the unknown population mean μ of ln x. That is,

(4.2.18)  E(x_t) → exp[μ + (σ²/2)].

Suppose now we assume, as in Chapter Three, that the process generating the observations undergoes a mean shift between successive periods. In particular, values of the parameter for successive time periods are related as

(4.2.19)  μ_{t+1} = μ_t + ε_{t+1},  ε ~ N(u, [σ²/n_s]).

We pointed out in Section 3.3 that the prior distribution of μ in any given period t+1 is given by

(4.2.20)  f'_N(μ_{t+1} | m'_{t+1}, σ²/n'_{t+1}),
where

(4.2.21)  m'_{t+1} = m''_t + u

and

(4.2.22)  n'_{t+1} = [n''_t n_s/(n''_t + n_s)].
Even though the observations in the first period yield yet further information concerning μ_1, the random shock at the end of the period is strong enough to imply that there is less information about μ_2 at the beginning of the second period than there was about μ_1 at the beginning of the first period. On the other hand, if n'_1 is less than n_L, then the information obtained each period "overrides" the uncertainty caused by the random shock, in a sense, and there is more information about μ_2 at the beginning of the second period than there was about μ_1 at the beginning of the first period.

To investigate the behavior of the sequence (m'_t), assume as before that the sample size in each period is n. In addition, to obtain a simpler expression for comparisons with the stationary case, assume that 1) the mean of the distribution of the random shock is zero, i.e., u = 0, and 2) at the beginning of the first period the model is already in steady state form in the sense that n'_1 = n_L, so that the sequence of variances (σ²/n'_t) will be a constant sequence (once the process reaches the limit n_L it remains there). Based on these assumptions, from (4.2.21) and (4.2.2), m'_{t+1} can be expressed in the form

(4.2.24)  m'_{t+1} = qm'_t + (1 − q)m_t.¹

¹ The result can be motivated as follows: it is assumed that n_t = n and n'_1 = n_L, which implies n'_t = n_L; therefore it follows that the posterior mean can be expressed as m'_{t+1} = m''_t = (n'_t m'_t + n_t m_t)/(n'_t + n_t) = [n_L/(n_L + n)]m'_t + [n/(n_L + n)]m_t. Defining q as in (4.2.25), it follows that m'_{t+1} = qm'_t + (1 − q)m_t.
where

(4.2.25)  q = n_L/(n_L + n).

Note that 0 < q < 1. When we successively apply (4.2.24), m'_{t+1} becomes a function of m'_1 (the initial mean), of the m_i (the sample means) and of q. The prior mean of the unknown parameter μ after t periods of time have elapsed can be written as

(4.2.26)  m'_{t+1} = q^t m'_1 + (1 − q) Σ_{i=0}^{t−1} q^i m_{t−i}.

Unlike the stationary case, the sequence (m'_t) does not have a limit. The prior mean at the beginning of any period, under nonstationarity, can be expressed as the sum of the initial mean, m'_1, discounted by a factor q^t, and an exponentially weighted sum of the observed sample means. Since q is a constant less than one, q^t < ... < q² < q. Thus, as we move into the future, the initial prior mean has less weight in the determination of the prior mean m'_t. From the exponentially weighted sum of sample means we note that recent observations are weighted more heavily than not-so-recent ones. The impact of a particular sample mean on future values of the prior distribution of μ decreases as t increases.

Under the same assumptions that we used to present the limiting results for n'_t and m'_t, the mean and variance of the normal predictive distribution when the data generating process is normal are given by

(4.2.27)  E(x_t) = m''_t  [as defined in (4.2.26)],
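The recursion (4.2.24) and the closed form (4.2.26) describe the same exponentially weighted average; a numerical check (illustrative names):

```python
def prior_mean_recursive(m1, sample_means, q):
    """Eq. (4.2.24): m'_{t+1} = q m'_t + (1 - q) m_t, applied t times."""
    m = m1
    for mbar in sample_means:
        m = q * m + (1.0 - q) * mbar
    return m

def prior_mean_closed_form(m1, sample_means, q):
    """Eq. (4.2.26): q^t m'_1 + (1 - q) * sum_i q^i m_{t-i}."""
    t = len(sample_means)
    tail = sum(q ** i * sample_means[t - 1 - i] for i in range(t))
    return q ** t * m1 + (1.0 - q) * tail

a = prior_mean_recursive(1.0, [2.0, 3.0, 4.0], 0.6)
b = prior_mean_closed_form(1.0, [2.0, 3.0, 4.0], 0.6)
```

This is exponential smoothing: the most recent sample mean carries weight (1 − q), the one before it (1 − q)q, and so on.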
and

(4.2.28)  Var(x_t) = σ²(1 + n_L)/n_L,

respectively. Similarly, when the data generating process is lognormal, the mean and variance of the lognormal predictive distribution are given by

(4.2.29)  E(x_t) = exp[m''_t + σ²(1 + n_L)/2n_L]

and

(4.2.30)  Var(x_t) = exp(2m''_t) w(w − 1),  where w = exp[σ²(1 + n_L)/n_L].

The additional uncertainty involving the shifts in the parameter μ affects the predictive distribution of the random variable, depending on how the initial parameter n'_1 relates to the limiting value n_L. If n'_1 is larger than n_L, the variance of the predictive distribution for normal processes, σ²(1 + n''_t)/n''_t, increases as t increases. Again, there is initially a great amount of information concerning x. The information obtained each period from the sample is not strong enough to override the uncertainty caused by the random shock. There is not a similar effect in the variance of the predictive distribution for lognormal processes, since it depends on both parameters m''_t and n''_t. The expected value of the predictive distribution for the normal case does not have a bound; it is influenced heavily by the most recent sample means. The expected value of the predictive distribution for the lognormal case also depends on both parameters m''_t and n''_t.
4.2.2 Limiting Behavior of m'_t, n'_t, v'_t and d'_t When Both Parameters μ and σ² Are Unknown

The most involved of the normal or lognormal cases is, quite naturally, that in which neither μ nor σ² is known. It is clear that we shall have to assign (μ, σ²) a bivariate prior density function. The natural conjugate prior density function of (μ, σ²) is the normal-gamma-2 with parameters m', v', n', d'. If the mean and variance of the data generating process are stationary and sample information arrives each period, then each posterior can be thought of as a prior with respect to the following sample. In general, if we assume that a sample of size n_t is employed every time a sample is taken, and the sample yields sufficient statistics m_t, n_t, v_t and d_t, and if we assume that the parameters do not change, then in any given period t the bivariate distribution of (μ, σ²) is normal-gamma-2 with parameters m''_t, n''_t, v''_t and d''_t given by

(4.2.31)  m''_t = (n'_t m'_t + n_t m_t)/(n'_t + n_t),

(4.2.32)  n''_t = (n'_t + n_t),

(4.2.33)  v''_t = (d'_t v'_t + n'_t m'_t² + d_t v_t + n_t m_t² − n''_t m''_t²)/(d'_t + n_t),

and

(4.2.34)  d''_t = d'_t + n_t.

To study the limiting behavior of m'_t, v'_t, n'_t and d'_t under stationary and nonstationary conditions, we will make the assumptions that n_t = n for all t and that d_t = d for all t. After t samples are taken, the sufficient sta-
PAGE 109
96 tistics (m , V , n, d) , (m , v , n, d) ... (m , v^, n, d) are available. The characLerization and consequently the limiting behavior of m' and n' are identical to the ones presented for the case when p is the only unknown parameter, [see equations (4.2.9) and (4.2.10) for the stationary conditions and (4.2.23) and (4.2.26) for nonstationary conditions]. The characterization of the parameter d' is rather simple. Under stationarity and nonstationarity the parameter d' is equal to the parameter d". After t periods of time the following relation holds, (4.2.35) d" = d' + tn. t t The limiting value of the parameter d" approaches infinity as t approaches infinity. The characterization of the parameter v" is more involved. Before considering the characterization of v" a transformation of the original expression is to be presented. Expression (4.2.33) could be rewritten as (n'm' + nm )2 d'v' + dv n'm'2 + nm^(n'+ n) [ Â— "^ / , , Jy] (4.2. 3fa) ' 'I . ^t (n'2m'2+n^m2+2n'nm'm ) Â°^ 1 .-> _L. 2 r t t t t t t ' ., , , . n m^ + nm-^ I Â— ; Â— Â— Â— 1 d'v'+dv tt t n'+n ^ (4.2.37) v" = "^ "^ ^ t d" d" t t Combining terms and simplifying (4.2.37) becomes
(4.2.38) v″_t = (d′_t v′_t + d v_t)/d″_t + [n′_t n (m′_t² + m_t² − 2 m′_t m_t)/(n′_t + n)] / d″_t ,

or

(4.2.39) v″_t = [d′_t v′_t + d v_t + n′_t n (m′_t − m_t)²/(n′_t + n)] / (d′_t + n) .

It can be noted that, given σ²,

(4.2.40) V(m_t) = E[(m_t − m′_t)²] = [(n′_t + n)/(n′_t n)] σ² ,

so that

(4.2.41) E[n′_t n (m_t − m′_t)²/(n′_t + n)] = σ²

[see Raiffa and Schlaifer (1961)]. Thus, assuming that v′_t and v_t are obtained as unbiased estimators of σ², unbiasedness in v″_t is preserved by the inclusion of the third term in the numerator of v″_t. v′_t will only be unbiased if it was based on a noninformative prior at time t=0. Otherwise it is biased by prior information. Now consider the characterization of v″_t as defined in (4.2.39). In period one the posterior value of v is given by

(4.2.42) v″_1 = d′_1 v′_1/(d′_1 + n) + d v_1/(d′_1 + n) + n′_1 n (m_1 − m′_1)²/[(n′_1 + n)(d′_1 + n)] ;

in period two the posterior value of v is given by
(4.2.43) v″_2 = (d′_1 v′_1 + d v_1 + d v_2)/(d′_1 + 2n) + [n′_1 n (m_1 − m′_1)²/(n′_1 + n) + n′_2 n (m_2 − m′_2)²/(n′_2 + n)]/(d′_1 + 2n) ;

in period three the posterior value of v is given by

(4.2.44) v″_3 = (d′_1 v′_1 + d v_1 + d v_2 + d v_3)/(d′_1 + 3n) + [n′_1 n (m_1 − m′_1)²/(n′_1 + n) + n′_2 n (m_2 − m′_2)²/(n′_2 + n) + n′_3 n (m_3 − m′_3)²/(n′_3 + n)]/(d′_1 + 3n) .

In any given period t the value of v″_t depends on the stationarity condition. Let v″_t be the sum of two terms, (a) and (b), the first and second summands in (4.2.43) and (4.2.44). Term (a) does not include parameters that depend on the nonstationarity assumption, but term (b) does. It has been pointed out many times before that n′_t is affected by the nonstationarity assumption. To study the limiting behavior of v″_t under conditions of nonstationarity it is assumed, as in a previous example, that n′_1 = n_L and consequently n′_t = n_L ∀t. Define P_L = n_L n/(n_L + n). Based on these assumptions and the definition of P_L, expressions (4.2.42) and (4.2.43) will be written as
(4.2.45) v″_1 = (d′_1 v′_1 + d v_1)/(d′_1 + n) + P_L (m_1 − m′_1)²/(d′_1 + n)

and

(4.2.46) v″_2 = (d′_1 v′_1 + d v_1 + d v_2)/(d′_1 + 2n) + P_L [(m_1 − m′_1)² + (m_2 − m′_2)²]/(d′_1 + 2n) .
distribution of σ²:

(4.2.49) f_γ2(1/σ² | v′, d′) = e^(−d′v′/2σ²) (d′v′/2)^(d′/2) (1/σ²)^((d′/2)−1) / Γ(d′/2) .

(4.2.50)
Define

(4.2.56) d″_t = d′ + t(n − 1)

and

(4.2.57) v″_(t) = [Σ_{i=1}^{t} (n − 1)v_i + d′v′] / [d′ + t(n − 1)] .

The posterior distribution of σ² could be rewritten as

(4.2.58) f″(σ² | v″_(t), d″_t) = e^(−d″_t v″_(t)/2σ²) (d″_t v″_(t)/2)^(d″_t/2) (1/σ²)^((d″_t/2)−1) / Γ(d″_t/2) .

Now let us look at the limiting behavior of v″_(t):

lim_{t→∞} v″_(t) = lim_{t→∞} [Σ_{i=1}^{t} (n − 1)v_i + d′v′] / [d′ + t(n − 1)]
                 = lim_{t→∞} Σ_{i=1}^{t} (n − 1)v_i / [t(n − 1)] .

From sampling theory it is known that if the sample variance is defined to be v_t = Σ_{i=1}^{n} (x_{ti} − m_t)²/(n − 1), then E(v_t | μ_t, σ²) = σ² and V(v_t | μ_t, σ²) = 2σ⁴/(n − 1). Assuming that v_1, ..., v_t are i.i.d., then E(Σ_{i=1}^{t} v_i/t) = σ² and V(Σ_{i=1}^{t} v_i/t) = Var(v_i)/t = 2σ⁴/[t(n − 1)] → 0 as t → ∞. Therefore lim_{t→∞} v̄_t = σ² w.p.1.
lim_{t→∞} v″_(t) = lim_{t→∞} Σ_{i=1}^{t} v_i/t = σ² w.p.1.

1. We have shown that there is a sequence {v″_(t)}, as defined in (4.2.57), which converges to σ². Moreover, the Bayesian can observe {v″_(t)}; therefore, by observing {v″_(t)} he comes to know σ².

2. In the limit, since he knows σ², his limiting posterior distribution of σ² must be degenerate at lim v″_(t) = σ².

3. Raiffa and Schlaifer (1961) show that the mean of the gamma-2 posterior distribution of (1/σ²) is equal to the inverse of the posterior estimate of the variance as defined by (4.2.54): E″(1/σ²) = 1/v″_t.

4. Therefore, by (2) and (3), it must be true that

(4.2.60) lim v″_t = lim v″_(t) = σ² w.p.1.

Observe that the argument presented before applies to both stationary and nonstationary cases. Savage (1971) summarizes informally the argument we have presented.
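The convergence argument can be checked numerically. The sketch below (our function names; standard library only) iterates the one-period normal-gamma-2 update (4.2.31)-(4.2.34) on simulated stationary normal data; the posterior parameter v″_t settles near the true σ², as the argument predicts:

```python
import random

def update_normal_gamma2(m_p, n_p, v_p, d_p, m_s, n_s, v_s, d_s):
    """One-period conjugate update, eqs. (4.2.31)-(4.2.34).
    Primed arguments are prior parameters; the rest are the
    sample statistics (m_t, n_t, v_t, d_t)."""
    m_post = (n_p * m_p + n_s * m_s) / (n_p + n_s)                    # (4.2.31)
    n_post = n_p + n_s                                                # (4.2.32)
    v_post = (d_p * v_p + n_p * m_p ** 2 + d_s * v_s + n_s * m_s ** 2
              - n_post * m_post ** 2) / (d_p + n_s)                   # (4.2.33)
    d_post = d_p + n_s                                                # (4.2.34)
    return m_post, n_post, v_post, d_post

def simulate_posterior_path(mu, sigma2, n, periods, prior, seed=0):
    """Apply the update once per period to samples of size n drawn from
    N(mu, sigma2); returns the final (m'', n'', v'', d'')."""
    rng = random.Random(seed)
    m_p, n_p, v_p, d_p = prior
    for _ in range(periods):
        xs = [rng.gauss(mu, sigma2 ** 0.5) for _ in range(n)]
        m_s = sum(xs) / n
        v_s = sum((x - m_s) ** 2 for x in xs) / (n - 1)   # sample variance
        m_p, n_p, v_p, d_p = update_normal_gamma2(
            m_p, n_p, v_p, d_p, m_s, n, v_s, n - 1)
    return m_p, n_p, v_p, d_p
```

Running this with a long horizon shows m″_t near μ and v″_t near σ², while n″_t and d″_t grow by n each period, matching (4.2.35).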
4.3 Prediction Intervals for Normal, Student, Lognormal and LogStudent Distributions

Bayesian analysis is generally concerned with the past only insofar as it relates to the present and future; interest is with the current situation and how it relates to what might happen, rather than with what did happen. Above all, it is concerned with creating a meaningful view of the future in the minds of the people who make decisions. The Bayesian methods, moreover, include people explicitly: the person responsible for the analysis and all the people concerned with using the output information and supplying information relevant to the resulting actions. Apart from the fact that classical analyses often ignore external information, and apart from the fact that the statistical criterion is usually far from reflecting the decision loss function, the analysis often neglects the people who will communicate with each other and with the model. People have sources of information quite beyond the data; for example, they may know perfectly well that a competing product is being introduced, that a new technology has been developed, or that the President is planning to sign new legislation that will affect the marketing of their product. The effects of such events can often be well foreseen in a qualitative or subjective sense, but it may nevertheless be difficult to express them and to specify probability distributions describing the uncertainty surrounding them. It is necessary that people can communicate their information to the method, and that the method clearly communicates the uncertain information in such a way that it is readily interpreted and used by decision makers. The nonstationary model that we
developed in Chapter Three for normal and lognormal processes incorporates prior distributions on the unknown parameters to reflect the decision maker's information. In a Bayesian analysis, the information coming from the data is contained in the posterior distribution of the unknown parameter. One way to partially summarize the information contained in the posterior distribution is to quote one or more intervals which contain stated amounts of probability. For the classical statistician, the information coming from the data is contained in the sampling distribution. He can summarize the information in the sampling distribution by quoting intervals with confidence coefficient γ. Suppose that x_1, ..., x_n form a random sample from a distribution which involves a parameter θ whose value is unknown. Suppose also that two statistics T_1(x_1, ..., x_n) and T_2(x_1, ..., x_n) can be found such that, no matter what the value of θ may be,

(4.3.1) Pr[T_1(x_1, ..., x_n) < θ < T_2(x_1, ..., x_n) | θ] = γ ,

where γ is a fixed probability (0 < γ < 1). If the observed values of T_1(x_1, ..., x_n) and T_2(x_1, ..., x_n) are a and b, then it is said that the interval (a,b) is a confidence interval for θ with confidence coefficient γ or, in other words, that the interval (a,b) contains θ with confidence γ. The uncertainty pertains to the interval, and not to θ. It is not correct to state that θ lies in the interval (a,b) with probability γ. Before the values of the statistics T_1(x_1, ..., x_n) and T_2(x_1, ..., x_n) are observed, those statistics are random variables.
It follows therefore from (4.3.1) that θ will lie in the random interval having end points T_1(x_1, ..., x_n) and T_2(x_1, ..., x_n) with probability γ. After the specific values T_1(x_1, ..., x_n) = a and T_2(x_1, ..., x_n) = b have been observed, it is not possible to assign a probability to the event that θ lies in the specific interval (a,b) without regarding θ as a random variable which itself has a probability distribution. In other words, it is necessary first to assign a prior distribution to θ and then to use the resulting posterior distribution to calculate the probability that θ lies in the interval (a,b). Rather than assigning a prior distribution to the parameter θ, classical statisticians have preferred to state that there is confidence γ, rather than probability γ, that θ lies in the interval (a,b). To a classicist, any given confidence interval statement is either correct (in which case the probability that it is correct is 1.0) or incorrect (in which case the probability that it is correct is 0.0). That is, a confidence interval is one type of interval estimate that has the feature that, in repeated sampling, a known proportion (for instance, 95%) of the intervals computed by a given method will include the population parameter. This concept has a shortcoming since, although the particular sample values that are observed may give the experimenter additional information about whether or not the interval formed from these particular values actually does include θ, there is no way to adjust the confidence coefficient γ in the light of this new information. To differentiate between the two statements, usually the classical interval estimate is called a "confidence interval" and the Bayesian interval
is called a "credible interval". Users of classical intervals tend to interpret them in the subjective sense, as probability statements about a random variable θ, despite the classical statistician's emphasis on the frequency interpretation. Pratt (1965) has observed that people should not be blamed for this misinterpretation, since the correct interpretation ([a,b] is the interval which, before the observations are obtained, had probability γ of covering θ) is simply not relevant to people concerned solely with θ and not with the observations, whose only role is to furnish information about θ. The classical approach often requires that γ, the probability associated with the interval estimate, be chosen in advance of sampling. The Bayesian may wish to look at intervals for several different values of γ (not necessarily chosen in advance). A rather interesting situation arises when an interval is needed to show a range within which most of the distribution lies. In searching for ways to summarize the information in the posterior distribution P(θ|x), where θ is the unknown parameter and x is the vector of observations, it is to be noted that, although the interval over which the posterior density is nonzero may extend over infinite ranges in the parameter space, nevertheless over a substantial part of the parameter space the density may be negligible. Thus it may be possible to construct a relatively small interval which contains most of the probability, or to construct a number of intervals which contain various stated proportions of the total probability. There are an infinite number of ways in which these intervals can be constructed.
In some applications, two properties are desirable for such intervals: 1) the probability density of every point inside the interval is at least as large as that of any point outside it, and 2) for a given probability content the interval should be as short as possible. Intervals which have these properties have been called highest posterior density (H.P.D.) intervals. The normal, lognormal, Student and logStudent distributions all permit H.P.D. intervals. Moreover, for these distributions, as for any unimodal distribution, the H.P.D. interval of content γ is unique. Throughout the discussion in the previous paragraphs we assumed that there was only one unknown parameter. If we are referring to a vector of unknown parameters, i.e., θ = (θ_1, θ_2), all that can be known about θ is contained in the joint posterior bivariate distribution. Mathematically speaking, therefore, the problem of making inferences about θ is solved as soon as the posterior distribution is written. As soon as we consider more than one unknown parameter we refer to highest posterior density (H.P.D.) regions instead of H.P.D. intervals. As with H.P.D. intervals, the region should be such that the probability density of every point inside it is at least as large as that of any point outside it, or the region should be such that, for a given probability content, it occupies the smallest volume
in the parameter space. We emphasized in Chapter Three that one of the purposes of prediction is often to provide some estimate, either point or interval, for future observations of an experiment F, based on the results obtained from an informative experiment E. In other words, in addition to being interested in the posterior distribution of the unknown parameters, we are interested in the distribution of further samples or observations. For instance, it is sometimes of interest to obtain a value, arrived at by life testing, that with high probability will be less than the life length of a particular component that is to be used in a system. Or, on the basis of annual profits in previous years, a firm is interested in having an estimate, in interval form, of the profits for the coming year. These are examples of statistical inference problems called prediction intervals or β-expectation tolerance intervals. The problem can be stated more formally as follows. [See Aitchison and Sculthorpe (1965) and Fraser and Guttman (1956).] Suppose an informative experiment has been performed. A random sample x_1, x_2, ..., x_n is taken from a distribution that belongs to the class of density functions {p_E(·|θ): θ ∈ Θ}, say f(x|θ). Also assume that there is a future experiment F, which consists of taking a random sample Y, for which a prediction of some sort is required, and that the possible probabilistic descriptions of F form the class of density functions {p_F(·|θ): θ ∈ Θ}. The densities describing E and F are conditioned by the same parameter vector. It is through this connection between E and F that E provides information about F.
Although E and F are connected by θ, it is assumed that for given θ they are statistically independent. On the basis of the sample x_1, ..., x_n we wish to make a prediction about Y, usually in the form of an interval or region that we are confident will contain the outcome of Y. That is, if L and U are functions of x_1, ..., x_n, then

(4.3.2) Pr(L < Y < U) = β ,

or equivalently

(4.3.3) E{∫_L^U f(y|θ) dy} = β .

Aitchison and Sculthorpe (1965) classify the prediction problem in two categories: first, a prediction is required for only one performance of F; second, a series of replications of F is to be conducted and then the prediction region, R, is to be used for each replication. Although there could be more than one replication in a single time period, and one can still get prediction intervals for future replications from what we know about m′ and θ, we are restricting ourselves to single replications. In other words, each time that we find the predictive distribution we will be concerned with one future experiment F. Faced with case one, a Bayesian would proceed to obtain f(y|x), the posterior distribution of y given x. As we pointed out in Chapter Three, from a prior density f(θ) on Θ the posterior density f(θ|x) is obtained in the usual way, and this is converted into f(y|x) through the relation

(4.3.4) f(y|x) = ∫_Θ f_F(y|θ) f(θ|x) dθ .
f(y|x) is called the predictive distribution of y. Most of the literature on prediction intervals is concerned with solving for prediction intervals of a particular type, or solving for intervals for a particular distribution. For instance, Thatcher (1964) found prediction limits for binomial variables which do not depend on any assumptions about the unknown proportion in the population. Hahn (1969) considers prediction regions for k future observations when sampling from a normal distribution. Shah (1969) and Nelson (1970) present a method for obtaining prediction intervals for a Poisson variable and generate prediction limits for the number of failures in one time interval by observing the failures in another time interval, provided both observations are subject to the same Poisson law. Faulkenberry (1973) obtains a prediction interval for a random variable Y based on the conditional distribution of Y given a sufficient statistic for the conditioning parameter. Aitchison (1966) considers the construction of linear utility tolerance intervals, which do take into account how far inside or outside the interval a future observation y happens to fall. From a Bayesian viewpoint, it is found that expected-cover and linear utility intervals can be regarded as equivalent through a simple relation between the expected cover and the relative cost ratio. For the frequentist approach, it is first shown that linear-utility intervals can be simply constructed for the normal and gamma distributions. Comparison of these with expected-cover intervals shows that, while there is no complete identity, there is an equivalence in a "large sample" sense.
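Relation (4.3.4) can be checked numerically in the simplest case. For a normal process with known σ² and a normal posterior on μ with mean m″ and variance σ²/n″, the integral yields a normal predictive with mean m″ and variance σ²(1 + 1/n″). The quadrature sketch below (our notation; standard library only) evaluates the integral on a grid and can be compared against that closed form:

```python
from statistics import NormalDist

def predictive_pdf_quadrature(y, m_post, n_post, sigma2, steps=4000):
    """Approximate f(y|x) = integral of f_F(y|mu) f(mu|x) dmu by the
    trapezoid rule, where f_F(y|mu) = N(mu, sigma2) and
    f(mu|x) = N(m_post, sigma2/n_post) -- the setting of eq. (4.3.4)."""
    lik_sd = sigma2 ** 0.5
    post_sd = (sigma2 / n_post) ** 0.5
    # integrate over a range wide enough to capture both densities
    lo = m_post - 12.0 * (lik_sd + post_sd)
    hi = m_post + 12.0 * (lik_sd + post_sd)
    h = (hi - lo) / steps
    post = NormalDist(m_post, post_sd)
    total = 0.0
    for i in range(steps + 1):
        mu = lo + i * h
        f = NormalDist(mu, lik_sd).pdf(y) * post.pdf(mu)
        total += f if 0 < i < steps else f / 2.0   # trapezoid end weights
    return total * h
```

The same numerical mixing applies unchanged when the posterior of θ has no convenient conjugate form, which is the practical appeal of (4.3.4).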
Prediction intervals for future observations in life testing situations have also been derived by Hewitt (1968), Nelson (1968), and Lawless (1971, 1972) through the use of expected-cover tolerance regions. Dunsmore (1974) gives a Bayesian approach to such situations and uses the concept of the Bayesian predictive distribution. He considers both the exponential and the two-parameter exponential distributions. As was pointed out in Chapter Two, we intend to use the same approach for the construction of prediction intervals for the normal, lognormal, Student and logStudent distributions under conditions of nonstationary shift parameters. If the prior distribution is natural conjugate to the process, then the predictive distribution for normal processes is normal when μ is unknown and σ² is known, and is Student when μ and σ² are both unknown. The determination of prediction intervals in general, and H.P.D. intervals in particular, is easy due to the characteristics of both distributions. The normal and Student distributions are similar in the sense that they are unimodal, symmetric, bell shaped, and asymptotic, extending from minus infinity to plus infinity. Graphically, the standardized Student distribution is flatter than the normal distribution, with a larger portion of the area under the curve located in the tails of the distribution. This implies that one must proceed a greater distance along the number line away from the mean under a standardized Student distribution to include any given percentage of the area under the curve than would be the case for the standardized normal distribution. Since both distributions are symmetric, to construct H.P.D. intervals of content γ it suffices to take the area between the lower limit of the interval and the mean to be equal to the area between the mean and the upper limit of the interval. If we let a be the lower limit, b be the upper limit and c be the mean, the condition could be written as

(4.3.5) ∫_a^c f(x) dx = γ/2 = ∫_c^b f(x) dx .

Since the distributions are symmetric, the length of the interval between the lower limit and the mean is equal to the length between the mean and the upper limit. To obtain the probabilities needed to determine the limits of the interval we use a table of the probability integral of the normal or Student curve, depending on the assumptions of the problem. Prediction intervals of content γ take the form

(4.3.6) E(x) ± K_{1−γ} Std. Dev.(x) ,

where K_{1−γ} refers to the number of standard deviations one must proceed in one direction from the mean in order to encompass (γ/2) of the area under the curve. For the case when μ is the unknown parameter and the data generating process is normal, assume that after t periods we have a posterior distribution f″(μ_t) which is normal with mean m″_t and variance σ²/n″_t. The predictive distribution at the end of period t was shown in equation (AI.12) to be normal with mean, m″_t, and variance, σ²[1 + (1/n″_t)]. For this case the prediction intervals of content γ
take the form

(4.3.7) m″_t ± K_{1−γ} {σ²[1 + (1/n″_t)]}^(1/2) .

For the case when both μ and σ² are the unknown parameters and the data generating process is normal, assume that after t periods we have a posterior distribution f″(μ_t, σ²) which is normal-gamma-2 with parameters m″_t, n″_t, v″_t and d″_t. The predictive distribution at the end of period t was shown in equation (AI.33) to be Student with mean, m″_t, and variance, [v″_t(n″_t + 1)/n″_t][d″_t/(d″_t − 2)]. For this case the prediction intervals of content γ take the form

(4.3.8) m″_t ± K_{1−γ} {[v″_t(n″_t + 1)/n″_t][d″_t/(d″_t − 2)]}^(1/2) .

The predictive distribution for lognormal processes is lognormal when μ is unknown and σ² is known, and is logStudent when μ and σ² are both unknown. The construction of prediction intervals in general, and H.P.D. intervals in particular, becomes difficult for the lognormal and the logStudent predictive distributions since these distributions are asymmetric. In Appendix III we provide an algorithm to construct the H.P.D. intervals when the predictive distributions are asymmetric. In any given period t the user only has to provide the current values of the parameters of the predictive distribution, i.e., m″_t and σ²(1 + n″_t)/n″_t for the lognormal case and m″_t, v″_t, n″_t, d″_t for the logStudent case. It is shown in Appendix III that the algorithm finds the highest posterior density intervals in very few iterations. It took about 15 iterations
to find the intervals in the examples that were considered. In Chapter Three we pointed out that, under nonstationary conditions, if the data generating process is normal, then for the case when μ is the unknown parameter and for the case when μ and σ² are the unknown parameters, the predictive distribution changes in mean and variance between consecutive time periods. The expectation E_{t+1}(x_{t+1}) is always changing, depending on the stochastic change of the mean μ_t. Thus it is not possible to establish how the H.P.D. interval for this predictive distribution compares with the stationary H.P.D. interval under the same assumptions. However, in the case of nonstationarity with no drift, for a given posterior distribution of μ at time t, the only difference between the predictive distribution of x_{t+1} under stationarity and the predictive distribution of x_{t+1} under nonstationarity is the variance term. As expected, the variance of the predictive distribution is larger when μ is nonstationary. For normal and Student processes the H.P.D. interval will be wider for a given content γ when μ is nonstationary than when μ is stationary. A comparison of stationary versus nonstationary results when the data generating process is lognormal shows that, as in the normal case, the nonstationarity condition causes the prediction intervals to be larger under nonstationary conditions than under stationary conditions for both parameter uncertainty cases. A rather different approach to the prediction problem, termed the Certainty Equivalent (CE) approach, is considered by Holt et al. (1960) and Theil (1964), among others. Suppose, as in the classical school, that the parameter μ of a normal distribution is fixed rather than random,
but that the decision maker does not know this fixed value and estimates it by means of some statistical procedure; or consider the case where μ is random and its expectation is E(μ), but the decision maker does not know E(μ) and estimates it. In the CE approach the decision maker uses the estimates of the uncertain parameters in place of the relevant true values, i.e., they are treated as if they were the actual values of the parameters. According to the method, the point estimate constitutes a certainty equivalent for complete knowledge of the distribution function. That is, if the distribution of x is f(x|θ), where θ is an unknown parameter or vector of parameters, then an estimate θ̂ for the unknown parameter constitutes a certainty equivalent, and f(x|θ̂) is considered to represent full knowledge of the distribution f(x|θ). The decision maker then bases all his probability statements and decision choices on the distribution f(x|θ̂). Theil (1964), Brown (1976), Barry (1974) and Barry et al. (1977) show that the CE approach can lead to inappropriate decisions, since it does not reflect uncertainty in θ as is done in the use of predictive distributions. However, this approach allows the decision maker to make the probability statements of most direct interest to him without using confidence interval terms. Thus the CE approach would seem preferable in some respects to a classical confidence interval approach. Since there is the problem that the true parameters may deviate from the estimates, a problem that is variously referred to as estimation risk or parameter uncertainty, much effort has been devoted to the task of improving the estimate that is
made of the vector of parameters θ. The Bayesian, on the other hand, assesses a probability distribution over the range of possible values θ can assume. In general, no one point can fully capture the information contained in this distribution, except in the special case where it is concentrated about θ, so the Bayesian methodology provides an approach which uses as much information as possible. In the case where the prior beliefs of the decision maker with regard to the unknown parameters have convenient representations, i.e., mathematically tractable forms, the Bayesian approach has been shown to perform better than the CE approach. To the extent that a CE distribution misrepresents the decision maker's predictive distribution, the CE approach can lead to inappropriate decisions [see Brown (1976)]. In conclusion, since the CE approach does not include parameter uncertainty, it understates the uncertainty faced by the decision maker and could produce predictive distributions that are misleading. Since the CE approach does not consider parameter uncertainty, it yields prediction intervals that overstate the content probability or (equivalently) understate their risk. Thus the CE approach discards information, i.e., the distribution of θ, and then gives interval estimates that appear more informative than the Bayesian highest posterior density intervals. In Chapter Five we are going to show some examples of this condition when we present applications of the results from Chapters Three and Four to Cost-Volume-Profit Analysis and life testing models.
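The overstated content of CE intervals is easy to demonstrate numerically. In the sketch below (our function names; standard library only), the CE interval treats the point estimates (m″, v″) as the known mean and variance of a normal predictive, while the Bayesian predictive is Student with scale² = v″(n″ + 1)/n″ and d″ degrees of freedom; drawing from the Student predictive shows the CE interval covering less than its nominal γ:

```python
import random
from statistics import NormalDist

def ce_interval(m, v, gamma):
    """Certainty-equivalent interval: the point estimates (m, v) are
    treated as the known mean and variance of a normal predictive."""
    z = NormalDist().inv_cdf(0.5 + gamma / 2.0)
    half = z * v ** 0.5
    return m - half, m + half

def content_under_student(lo, hi, m, n_post, v, d_post,
                          draws=40_000, seed=3):
    """Fraction of draws from the Student predictive (scale^2 =
    v(n''+1)/n'', d'' degrees of freedom, d'' a positive integer)
    that fall inside the interval (lo, hi)."""
    rng = random.Random(seed)
    scale = (v * (n_post + 1.0) / n_post) ** 0.5
    inside = 0
    for _ in range(draws):
        z = rng.gauss(0.0, 1.0)
        chi2 = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(d_post))
        x = m + scale * z / (chi2 / d_post) ** 0.5   # Student variate
        if lo < x < hi:
            inside += 1
    return inside / draws
```

For example, with m″ = 0, n″ = 4, v″ = 1, d″ = 10 and γ = 0.95, the CE interval ±1.96 covers only about 89% of the Student predictive probability, well below the nominal 95%.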
4.4 Conclusion

In this chapter we discuss the limiting behavior of the parameters m′, v′, n′ and d′ of the prior and predictive distributions for the normal and lognormal data generating processes. In addition, we discuss the implications of these limiting results for the inferences and decisions based on the posterior and predictive distributions. The asymptotic behavior of the model has important implications for the decision maker. An implication of the stationary Bayesian model for normal and lognormal processes is that, as additional observations are collected, parameter uncertainty is reduced and (in the limit) eliminated altogether. In contrast, for the nonstationary model considered in this dissertation the following inferential results are obtained:

1. for the case of the lognormal or normal model, a particular form of stochastic parameter variation implies a treatment of data involving the use of all observations in a differential weighting scheme; and

2. random parameter variation produces important differences in the limiting behavior of the prior and predictive distributions, since under nonstationarity the limiting values of the parameters of the posterior and predictive distributions cannot be determined clearly.

The problem of constructing prediction intervals for normal, Student, lognormal and logStudent distributions is considered in this chapter. It is pointed out that it is easy to construct these intervals
for the normal and Student distributions, but that it is rather difficult for the lognormal and logStudent distributions. An algorithm is presented that efficiently computes Bayesian prediction intervals for lognormal and logStudent distributions.
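The Appendix III algorithm itself is not reproduced in this chapter. The sketch below shows one standard way to compute the H.P.D. interval of content γ for a lognormal predictive with log-mean a and log-standard-deviation s: a golden-section search over the lower tail probability (an assumption of ours, not necessarily the dissertation's procedure). For a unimodal density the H.P.D. interval is the shortest interval of content γ, and at its endpoints the density heights are equal:

```python
import math
from statistics import NormalDist

def lognormal_hpd(a, s, gamma, tol=1e-12):
    """Shortest (hence H.P.D.) interval of content gamma for the
    lognormal whose logarithm is N(a, s^2).  Minimizes
    width(p) = Q(p + gamma) - Q(p) over the lower tail probability p,
    where Q is the lognormal quantile function."""
    nd = NormalDist(a, s)
    q = lambda p: math.exp(nd.inv_cdf(p))       # lognormal quantile
    width = lambda p: q(p + gamma) - q(p)
    lo, hi = 1e-12, 1.0 - gamma - 1e-12
    g = (math.sqrt(5.0) - 1.0) / 2.0            # golden ratio factor
    x1, x2 = hi - g * (hi - lo), lo + g * (hi - lo)
    while hi - lo > tol:
        if width(x1) < width(x2):
            hi, x2 = x2, x1
            x1 = hi - g * (hi - lo)
        else:
            lo, x1 = x1, x2
            x2 = lo + g * (hi - lo)
    p = (lo + hi) / 2.0
    return q(p), q(p + gamma)
```

For a symmetric density this search reduces to the usual central interval; for the lognormal it shifts the interval toward the mode relative to the equal-tail interval, which is exactly the asymmetry the chapter discusses.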
CHAPTER FIVE
NONSTATIONARITY IN CVP AND STATISTICAL LIFE ANALYSIS

5.1 Introduction

In Chapter Four we pointed out that one objective of this dissertation is to develop Bayesian prediction intervals for future observations that come from normal and lognormal data generating processes under conditions of nonstationary means. In particular, we stressed the importance of highest posterior density intervals as a means to convey to the decision maker what he is entitled to believe about the predictive distribution of the variable of interest. This kind of analysis is particularly useful in the area of Cost-Volume-Profit (CVP) Analysis [see Dickinson (1974), Hilliard and Leitch (1975) and Kaplan (1977) among others] and in the area of Statistical Life Analysis [see Folk and Browne (1975), Jones (1971) and Dunsmore (1974) among others], since the application of the lognormal distribution is not only based on empirical observations, but in some cases is supported by theoretical arguments. The lognormal distribution has been found to be a serious competitor to the Weibull distribution in representing lifetime distributions for manufactured products. In Section 5.2 we discuss the application of the results of Chapters Three and Four concerning nonstationarity to the area of CVP analysis. The problem of CVP analysis will be considered from a Bayesian viewpoint, and inferences under the special case of nonstationarity developed in Chapter Three will be discussed. Also the Bayesian results
under nonstationarity will be compared with some alternative approaches suggested in the accounting literature. In Section 5.3 we incorporate our results into the theory of Statistical Life Analysis. Practical implications of our results for reliability problems are discussed, with emphasis on the predictive distribution of the random variable. In Section 5.4 we present the conclusions of the chapter.

5.2 Nonstationarity in Cost-Volume-Profit Analysis

5.2.1 Existing Analysis

The scope of CVP analysis ranges from determination of the optimal output level for a single-product department to determination of the optimal output mix for a large multi-product firm. All these decisions rely on simple relationships between changes in revenues and costs and changes in output levels or mixes. All CVP analyses are characterized by their emphasis on cost and revenue behavior over various ranges of output levels and mixes. The applicability of probabilistic models for this analysis has been claimed because of the realism of such models. That is, an inherent aspect of any management decision-making situation is the presence of uncertainty concerning one or more of the relevant factors; for example, the entire notion of forecasting the value of some variable in the future is based on the fact that there is uncertainty concerning that variable. The ideal model is one that gives a probability distribution of the criterion variable, like profit, that fully recognizes the uncertainty faced by the firm and incorporates all available information. The realism of such a model is dependent on assumptions about
the input variables and rigorous methodology in obtaining the output distribution.

In Chapter Two we surveyed some of the relevant literature related to the development of CVP analysis under uncertainty. As was pointed out in that survey, most of the papers reflect how the people that have studied CVP analysis have neglected one potentially important source of uncertainty to the manager, namely the problem of parameter uncertainty. Classical methods used in CVP analysis generate correct confidence interval estimates only on those occasions where the manager has no knowledge with respect to the variable he is attempting to estimate. Such a situation seldom, if ever, occurs. Bayesian methods explicitly treat judgmental information and take the position that any estimate generated should reflect all the information at the manager's disposal. This is reflected by the assignment of a prior distribution, which is used in conjunction with observed sample evidence to form a posterior distribution.

Dickinson (1974) addressed the problem of CVP analysis under uncertainty by examining the reliability of using sample means and the unbiased sample variance to estimate the means and variances of the past distributions of sales demand. As pointed out in Chapter Two, his paper illustrates the limitation of non-Bayesian CVP analysis of not being able to obtain the probability statements of most interest to the manager. The Bayesian approach provides a general procedure for describing and analyzing any such situation without the appeal to ad hoc procedures or ingenious tricks [see Lindley (1972)], especially through the use of the
predictive distribution. Barry, Velez and Welch (1977) have recently applied a predictive Bayesian model to CVP analysis, explicitly allowing for parameter uncertainty. An implication of such a model is that as additional observations are collected parameter uncertainty is reduced and (in the limit) eliminated altogether. Such an implication is inconsistent with observed real world behavior, largely because the conditions under which firms operate typically change across time. A CVP model ideally should include the changing character of the process by allowing for changes in the parametric description of the process through time. Failure to recognize the nonstationary condition may result in misleading inferences. The CVP literature has neglected to include this additional source of uncertainty that influences the decision maker's frame of reference for his decision process. In Chapter Three we showed that if the presence of nonstationarity is not fully recognized then we can be led to a serious misinterpretation of the conclusions drawn from a stationary model.

5.2.2 Nonstationary Bayesian CVP Model

Assume that a single product firm has a profit function defined by

(5.2.1)    Z = Q[P-V] - F,

where

Z = total profits,
Q = sales volume in units,
P = unit selling price,
V = unit variable cost and
F = total fixed cost.

Thus the firm produces the quantity Q at a fixed cost F, and variable cost VQ. Assume that the only random element in the system is the quantity variable Q. In addition assume that Q is normally distributed with mean μ and variance σ², i.e., f_N(Q|μ,σ²). Later we will consider the cases where other variables are random, and also we will modify the analysis to allow for lognormality in the distribution of the variables.

In general the values of the parameters of the CVP model will be unknown. Consider a manager with a prior distribution over the parameters of the probability model of Q, say f'(θ|r), where θ includes all the unknown parameters and r represents all information known to the manager. In particular assume that if μ is the only unknown parameter then the prior distribution is the normal natural conjugate with parameters m' and σ²/n', or that if μ and σ² are both unknown then the manager has a normal-gamma natural conjugate prior with parameters m', n', v' and d'. (See de Finetti (1962, 1965), Murphy and Winkler (1970), Savage (1971), Stael von Holstein (1970a, 1970b) and Winkler (1967a, 1967b, 1969, 1971) for a discussion of evaluation of probability assessors and assessments.)

A formal Bayesian analysis articulates the evidence of a sample, say Q_1, Q_2, ..., Q_n, with evidence other than that of the sample, in the form of a prior distribution of the parameters, to obtain a posterior distribution of the unknown parameters. In areas like CVP analysis it is
doubtful that the assumption of stationary parameters will hold over long periods of time, since variables like quantity sold (Q), costs and contribution margin (P-V) are affected by economic, political and environmental factors. Thus we are going to assume that the distribution of the random variable, sales, undergoes a gradual mean shift between successive periods of time of the form μ_{t+1} = μ_t + ε_{t+1}, as defined in (3.3.8). The Bayesian analysis provides a natural method to include the remaining parameter uncertainty in the computation of the predictive distribution.

The nonstationarity assumption affects the predictive distribution of the coming period's sales quantity Q. If the process is stationary then the predictive distribution of the random variable Q at the beginning of period t+1 is the same as the distribution that we had at the end of period t. However, if we assume the nonstationary condition and that the decision maker is aware of the nonstationarity, then the prior distribution of the parameter at the start of period t+1 has a different mean and variance. Consequently, the predictive distribution changes in mean and variance between consecutive time periods. In other words, the predictive mean E(x_{t+1}) is always changing, depending on the stochastic change of the shift parameter μ_{t+1}.

In the case of nonstationarity with no drift, i.e., u = 0, if the distribution of sales is normal then, assuming that they started from the same posteriors, the only difference between the predictive distribution of x_{t+1} under stationarity and the predictive distribution of x_{t+1} under nonstationarity is the variance term. The parameter n'_{t+1} is smaller when μ is unknown and nonstationary than when μ is unknown but stationary.
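As a concrete sketch of this variance effect (our illustration, not part of the original derivation; the function names are ours, and the numerical values anticipate the example developed later in this section), the shrinkage of n'_{t+1} and the resulting inflation of the predictive variance can be computed directly:

```python
def shocked_n(n_post: float, n_shock: float) -> float:
    """Effective sample size n'_{t+1} after a zero-mean random shock with
    variance sigma^2/n_shock is added to a posterior N(m'', sigma^2/n'')."""
    # Variances add: sigma^2/n' = sigma^2/n'' + sigma^2/n_s, so n'_{t+1} is
    # the "parallel" combination of n'' and n_s and is always below n''.
    return n_post * n_shock / (n_post + n_shock)

def predictive_var(sigma2: float, n: float) -> float:
    """Variance of the normal predictive density, sigma^2 * (1 + 1/n)."""
    return sigma2 * (1.0 + 1.0 / n)

sigma2 = 100.0   # known process variance
n_post = 16.0    # posterior n'' at the end of period t
n_s = 2.0        # shock parameter (shock variance sigma^2/n_s = 50)

n_next = shocked_n(n_post, n_s)                      # ~1.78, far below 16
var_stationary = predictive_var(sigma2, n_post)      # 106.25
var_nonstationary = predictive_var(sigma2, n_next)   # 156.25
```

The shock leaves the predictive mean alone but always lowers the effective sample size, so the nonstationary predictive variance exceeds the stationary one.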
Hence, as expected, the variance of the predictive distribution is larger. The predictive distribution under nonstationarity may be used to make probability statements about sales quantity or, if desired, profits.

To illustrate, suppose that the monthly sales Q_1, Q_2, ... of a firm are independent and identically distributed random variables with common density function f_N(Q|μ,σ²) and that the population variance σ² is known to be 100. Suppose also that, at the beginning of a given period t, the manager has assessed the prior distribution function over the parameter μ to be

(5.2.2)    f'_N(μ_t|m', σ²/n') = f'_N(μ_t|500, 25).

Since σ² = 100 and σ²/n' = 25, n' = 4. If the manager has available a sample of, say, 12 monthly sales with sample mean m = 480 then he may compute a posterior distribution of the unknown parameter μ which will reflect this new information that he has available. Since the normal prior is natural conjugate for sampling from an independent normal process, the posterior distribution of the unknown parameter μ will be

(5.2.3)    f''_N(μ_t|m'', σ²/n'') = f''_N(μ_t|485, 6.25).

The predictive distribution of Q given the available information (and uncertainty) about μ can be obtained using the posterior distribution of μ. The predictive distribution of sales at period t is
(5.2.4)    f_N(Q_t|m''_t, σ²[1 + 1/n''_t]) = f_N(Q_t|485, 106.25).

If now we assume that the random shock distribution is

(5.2.5)    f_N(ε_t|u, σ²/n_s) = f_N(ε_t|0, 50)

then the prior distribution of the unknown parameter μ at the beginning of period t+1 may be obtained using equations (3.3.10) and (3.3.11). This new prior distribution is

(5.2.6)    f'_N(μ_{t+1}|m''_t, σ²(n''_t + n_s)/(n''_t n_s)) = f'_N(μ_{t+1}|485, 56.25).

The predictive distribution under nonstationarity at the beginning of period t+1 is

(5.2.7)    f_N(Q_{t+1}|m'_{t+1}, σ²[1 + 1/n'_{t+1}]) = f_N(Q_{t+1}|485, 156.25).

It has a higher variance than under stationarity, as was pointed out in previous paragraphs.

The manager may determine, in any given period t, the predictive distribution of profits from equation (5.2.1). Since the predictive distribution of sales is as defined in (5.2.7), the predictive distribution of profits is

(5.2.8)    f_N(π_{t+1}|m'_{t+1}(P-V) - F, σ²[1 + 1/n'_{t+1}](P-V)²).

That is, if we suppose that the contribution margin (P-V) is, say, 8, and that the fixed costs (F) are, say, 1,000, then the predictive distribution of next period's profits is
(5.2.9)    f_N(π_{t+1}|2,880, 10,000).

Probability statements are easily obtained using standard normal distribution theory. Observe how this analysis provides the probability statements that the manager needs without the necessity of cumbersome phrasing in terms of classical confidence intervals.

Let's look now at the same problem but assuming this time that the decision maker knows neither the population variance (σ²) nor the population mean (μ). This is the most involved of the univariate normal cases since it requires the assignment of a bivariate prior distribution function to (μ,σ²). To illustrate, suppose that the manager has expressed his judgments about (μ,σ²) by a normal-gamma distribution of the form

f'_{N-γ}(μ_t,σ²|m', v', n', d') = f'_{N-γ}(μ_t,σ²|500, 25, 10, 7).

Assume that the manager takes a random sample of 12 monthly sales, which are assumed to come from a process with unknown mean and variance, and that the sample yields a sample mean (m) of 480 and a sample variance (v) of 80. He may compute a posterior distribution of the unknown parameters μ_t and σ² using equations (3.3.17) and (3.3.18). Since the normal-gamma prior is natural conjugate for sampling from an independent normal process with unknown parameters, the posterior distribution of μ and σ² will be

(5.2.10)    f''_{N-γ}(μ_t,σ²|m''_t, v''_t, n''_t, d''_t) = f''_{N-γ}(μ_t,σ²|485, 4784.4, 22, 19).
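The known-variance updating chain (5.2.2)-(5.2.9) can be reproduced step by step; the sketch below is ours, with every number taken from the example above:

```python
sigma2 = 100.0                       # known variance of monthly sales
m_prior, n_prior = 500.0, 4.0        # prior (5.2.2): sigma^2/n' = 25
n, m_sample = 12, 480.0              # 12 monthly observations, mean 480

# Posterior (5.2.3): precision-weighted average of prior and sample means
n_post = n_prior + n                                   # 16
m_post = (n_prior * m_prior + n * m_sample) / n_post   # 485
var_post = sigma2 / n_post                             # 6.25

# Predictive variance at period t (5.2.4)
var_pred_t = sigma2 * (1 + 1 / n_post)                 # 106.25

# Random shock (5.2.5) with variance 50, i.e. n_s = 2; new prior (5.2.6)
n_s = 2.0
n_next = n_post * n_s / (n_post + n_s)
var_prior_next = sigma2 / n_next                       # 56.25

# Predictive distribution at t+1 under nonstationarity (5.2.7)
var_pred_next = sigma2 * (1 + 1 / n_next)              # 156.25

# Predictive distribution of profits (5.2.8)-(5.2.9), with P-V = 8, F = 1000
cm, F = 8.0, 1000.0
mean_profit = m_post * cm - F                          # 2,880
var_profit = var_pred_next * cm ** 2                   # 10,000
```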
Under stationarity the prior distribution at the start of period t+1 is equal to the posterior distribution at the end of period t. The predictive distribution of Q given the available information (and uncertainty) about μ and σ² can be obtained using the posterior distribution of μ and σ². From equations (3.3.49) and (3.3.50) we know that the predictive distribution of sales at the beginning of period t is

(5.2.11)    f_ST(Q_t|m''_t, v''_t(n''_t + 1)d''_t/(n''_t[d''_t - 2])) = f_ST(Q_t|485, 5590.32).

If we now assume that the random shock distribution is

(5.2.12)    f_N(ε_t|u, σ²/n_s) = f_N(ε_t|0, σ²/2),

then the manager may obtain the new prior distribution of μ and σ² at the beginning of period t+1 using equations (3.3.25)-(3.3.28). This new distribution is

(5.2.13)    f'_{N-γ}(μ_{t+1},σ²|m''_t, v''_t, n''_t n_s/(n_s + n''_t), d''_t) = f'_{N-γ}(μ_{t+1},σ²|485, 4784.4, 1.83, 19).

The predictive distribution of sales for the coming period under nonstationary conditions is Student with mean 485 and variance 8,269.28. As expected, the additional uncertainty introduced in the model by the shifting means has caused an increase in the variance of the predictive distribution. If the manager does not recognize in his predictive model the existence of a nonstationary condition he may draw inferences from the model that are misleading.

In any given period t, the manager may determine the predictive distribution of profits from equation (5.2.1). The predictive distribution of profits is

(5.2.14)    f_ST(π_{t+1}|m'_{t+1}(P-V) - F, [v'_{t+1}(n'_{t+1} + 1)d'_{t+1}/(n'_{t+1}[d'_{t+1} - 2])][P-V]²).

That is, if we assume as before that the contribution margin (P-V) is 8, and that the fixed costs (F) are 1,000, then the predictive distribution of next period's profits is f_ST(π_{t+1}|2,880, 357,786.25) under stationarity and

(5.2.15)    f_ST(π_{t+1}|2,880, 529,233.92)

under nonstationarity. Probability statements are easily obtained using the standard Student distribution tables available in many books.

The use of normal distributions in applications where the coefficient of variation is large can present many difficulties. The lognormal distribution is in at least one important respect a more realistic representation of distributions of variables that cannot assume negative values (such as sales) than is the normal distribution. A normal distribution assigns probability to such events, while the lognormal distribution does not. Furthermore, even though
the lognormal distribution is skewed, by taking the spread parameter small enough it is possible to construct a lognormal distribution closely resembling any normal distribution (except those with high probabilities of negative values).

Hilliard and Leitch (1975) pointed out the problem of assuming price and quantity to be independent. However, if we assume that sales quantity and contribution margin are jointly lognormally distributed then we can allow for statistical dependence between the two variables, as we will show later. When it is assumed that sales quantity and contribution margin are both lognormally distributed, there is a closed form expression for the probability distribution of gross profits, since the product of two lognormal random variables is also lognormally distributed.

The nonstationary Bayesian CVP analysis is easily extended to the case of a lognormal distribution of Q or to a case where sales quantity and contribution margin are both lognormally distributed. The extension is easy because if x is lognormal then ln x is normal. Suppose that the distribution of sales is lognormal, i.e.,

(5.2.16)    f_LN(Q|μ,σ²) = [Qσ√(2π)]⁻¹ exp[-(ln Q - μ)²/2σ²],

with unknown parameter μ and known σ². Note that if we consider ln Q to be the random variable instead of Q the lognormal distribution is easily transformed into a normal distribution and vice versa, i.e.,

(5.2.17)    f_LN(Q|μ,σ²) = Q⁻¹ f_N(ln Q|μ,σ²).
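For the unknown-mean-and-variance case above, the Student predictive variances quoted in (5.2.11)-(5.2.15) all come from the same formula, v(n + 1)d/(n(d - 2)). The sketch below is ours; it reproduces the text's figures, using the text's rounded value n'_{t+1} = 1.83:

```python
def student_pred_var(v: float, n: float, d: float) -> float:
    """Variance of the Student predictive density, v (n + 1) d / (n (d - 2))."""
    return v * (n + 1.0) * d / (n * (d - 2.0))

v_post, n_post, d_post = 4784.4, 22.0, 19.0   # posterior parameters (5.2.10)

# Stationary predictive variance (5.2.11): ~5,590.32
var_stat = student_pred_var(v_post, n_post, d_post)

# After the shock (n_s = 2), n'_{t+1} = 22*2/24, rounded to 1.83 in the text
n_next = 1.83
var_nonstat = student_pred_var(v_post, n_next, d_post)   # ~8,269.28

# Profit variance under nonstationarity (5.2.15): (P-V)^2 times var_nonstat
var_profit_nonstat = var_nonstat * 8.0 ** 2              # ~529,233.92
```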
Thus the predictive CVP model for normal processes presented before can be extended to the lognormal case. Suppose that sales are lognormally distributed with unknown μ and known σ² = 1, and that the manager's prior distribution over μ is f'_N(μ|m', σ²/n') = f'_N(μ|4, .1). For a sample of 12 months with mean 6.2 the posterior distribution will be f''_N(μ|m'', σ²/n'') = f''_N(μ|5.2, .0455). The predictive distribution of ln Q at period t is

(5.2.18)    f_N(ln Q_t|m''_t, σ²[1 + 1/n''_t]) = f_N(ln Q_t|5.2, 1.0455),

or the predictive distribution of Q at period t is

(5.2.19)    f_LN(Q_t|m''_t, σ²[1 + 1/n''_t]) = f_LN(Q_t|5.2, 1.0455).

By the properties of lognormal random variates it follows that Q_t has predictive mean E(Q_t) = exp[5.2 + 1.0455/2] = 305.74 and predictive variance Var(Q_t) = [exp(10.4)]w(w-1), where w = exp[1.0455]; that is, Var(Q_t) = 172,002.72. To obtain probability statements regarding Q_t it is necessary to translate the probability statement regarding ln Q_t using the antilogarithmic transformation. For instance, as before let the contribution margin be 8 and let fixed costs be 1,000. The probability of making more than $3,000 in profits is equal to the probability
of selling more than 500 units, [Q_t > (π + F)/(P-V)], or ln Q_t ≥ 6.2146. Since the distribution of ln Q_t is as in (5.2.18), the probability of profits in excess of $3,000 can be obtained from standard normal distribution theory.

The model can be extended to include the case in which the decision maker knows neither the population variance (σ²) nor the population mean (μ). The predictive distribution of Q_t is logStudent when both parameters of the lognormal distribution are unknown. Assuming the prior is of the natural conjugate form, a simple operation transforms the logStudent distribution into a Student distribution; i.e.,

(5.2.20)    f_LS(Q|m,s²) = Q⁻¹ f_ST(ln Q|m,s²).

Therefore, by working with ln Q instead of Q, the analysis of the normal process can be applied to obtain a Student predictive distribution for ln Q unconditioned by the unknown parameters μ and σ². To obtain probability statements for Q and π one needs to obtain probability statements for ln Q. For instance, suppose that the monthly sales are distributed lognormally with unknown mean and variance and that the predictive distribution of ln Q is Student with mean equal to 485 and variance equal to 5590.32, i.e., Q_t ~ LS(485, 5,590.32). Under the assumptions of the previous example, the probability of making more than $3,000 is equivalent to the probability of selling more than 500 units. This probability can be obtained from the standard
Student distribution.

An important feature of the model that we have developed is its unequal weighting of past observations, a characteristic that clearly demonstrates the problem faced by users that apply stationary inferences when the variables really are nonstationary. It was pointed out before that the posterior value of the prior parameter m' during any given period t is m''_t = (m'_t n'_t + m_t n)/(n'_t + n). Under stationarity, successively applying this equation gives m'_{t+1} as a function of m'_1, the initial prior mean, of n, the sample size, and of the past sample means. All past observations are weighted equally and m'_{t+1} can be expressed in the form

(5.2.21)    m'_{t+1} = (n'_1 m'_1 + n Σ_{i=1}^{t} m_i)/(n'_1 + tn)

or

(5.2.22)    m'_{t+1} = (n'_1 m'_1 + Σ_{i=1}^{t} Q_i.)/(n'_1 + tn),

where Q_i. = Σ_{k=1}^{n} Q_ik.

Under nonstationarity, n'_{t+1} = n''_t n_s/(n''_t + n_s), and n'_{t+1} < n''_t. If we assume nonstationarity with no drift, i.e., u = 0, and define q_t = n'_t/(n'_t + n), then the posterior value of the prior mean parameter in period t+1 is

(5.2.23)    m'_{t+1} = q_t m'_t + (1 - q_t)m_t.

Successively applying (5.2.23) gives m'_{t+1} as a function of m'_1, the initial mean, and m_i and q_i for i = 1, 2, ..., t. It was shown in
Appendix I and in Chapter Four that the weight assigned to any observation, say Q_i, in determining a prior distribution for μ_{t+1}, is a strictly decreasing function of i. That is, the importance of any observed value, say Q_i, for making inferences about a future value of the mean, say μ_{t+1}, decreases as i increases. For the special case in which n' = n we showed that the prior mean at the beginning of any period under nonstationarity can be expressed as the sum of the initial mean m'_1, discounted by a factor q, and an exponentially weighted sum of the observed sample means; i.e., m'_{t+1} = q^t m'_1 + (1-q) Σ_{i=0}^{t-1} q^i m_{t-i}.

To illustrate, suppose that the monthly sales Q_1, Q_2, ... of a firm are independent and identically distributed random variables with common density function f_N(Q|μ, σ² = 100). Assume that the random shock distribution is f_N(ε|0, 50). Suppose also that, at the beginning of period 1, the manager has assessed the prior distribution function to be f'(μ|500, 57.28). To obtain a simpler expression for comparisons with the stationary case, we are assuming that at the beginning of the first period the model is already in steady state form in the sense that n'_1 = n' = 1.74596 and q_1 = q = ... = .127016. If the manager has available a sample of, say, 12 monthly sales with sample mean m_1 = 480 then the mean of the posterior distribution of μ is 482.5403. Since we are assuming that there is nonstationarity with no drift, the mean of the prior distribution of μ under stationarity and nonstationarity is 482.5403. If the manager has available, during
period 2, a new sample of 12 monthly sales with a sample mean m_2 = 505 then he may compute a new posterior distribution of μ_2 which reflects this new information that he has available. Under stationarity the mean of the posterior distribution of μ_2 will be

(5.2.24)    m'' = [1.74596(500) + 12(480 + 505)]/[1.74596 + 2(12)] = 493.0155.

Under nonstationarity with n'_1 = n' = 1.74596 the mean of the posterior distribution of μ_2 will be

(5.2.25)    m'' = (.127016)²(500) + (1 - .127016)[505 + (.127016)(480)] = 502.1473.

During period 3, if a sample of 12 observations is available that yields a sample mean m_3 = 520 then the mean of the posterior distribution of μ_3 will be m'' = 501.5895 under stationarity. Under nonstationarity and the steady state condition the mean of the posterior distribution of μ_3 will be

(5.2.26)    m'' = (.127016)³(500) + (1 - .127016)[520 + (.127016)(505) + (.127016)²(480)] = 517.7324.

As we move into the future the initial prior mean has less weight in the determination of the prior mean m'. From the exponentially weighted
sum of sample means we note that recent sample means are weighted more heavily than not so recent ones. The impact of a particular sample mean on future values of the prior distribution of μ decreases as t increases.

5.2.3 Extensions to the Nonstationary Bayesian CVP Model

It is possible to significantly extend the model presented in the previous section by assuming that sales, Q, and contribution margin, (P-V), are normally or lognormally distributed. For instance, suppose that Q and (P-V) are both normally distributed with unknown means μ_Q and μ_{(P-V)}. The predictive distribution of Q and the predictive distribution of (P-V) are normally distributed. It is well known [see Ferrara, Hayya and Nachman (1972)] that the distribution of the product of two normally distributed random variables is not normally distributed. However, if we denote Q* = e^Q and (P-V)* = e^{(P-V)} to be the new random variables then Q* and (P-V)* are lognormally distributed. If a conjugate prior is assigned to μ, then the predictive distribution of a lognormally distributed variable when μ is unknown is also lognormal; hence both Q* and (P-V)* have lognormal predictive distributions. Moreover, Patel, Kapadia and Owen (1976) point out that if x_1 and x_2 are independent lognormal random variables with probability density functions f(x_1|θ_1,θ_2) and f(x_2|α_1,α_2), respectively, then the random variable Y = x_1 x_2 also has a lognormal distribution with probability density function f(Y|θ_1 + α_1, θ_2 + α_2). Suppose then that Q and (P-V) are both lognormally distributed with unknown parameters μ_Q and
μ_{(P-V)}, respectively. Both have lognormal predictive distributions and hence Q[P-V] is lognormally distributed. To illustrate, suppose that at any given period t the predictive distribution of sales (Q_t) is given by f_LN[Q_t|m''_{Qt}, σ²_Q(1 + 1/n''_{Qt})] and that the predictive distribution of the contribution margin (P-V)_t is given by f_LN[(P-V)_t|m''_{(P-V)t}, σ²_{(P-V)}(1 + 1/n''_{(P-V)t})]. Then the predictive distribution of Q[P-V] is given by

(5.2.27)    f_LN(Q_t[P-V]_t|m''_{Qt} + m''_{(P-V)t}, σ²_Q[1 + 1/n''_{Qt}] + σ²_{(P-V)}[1 + 1/n''_{(P-V)t}]).

Once we find the predictive distribution as defined in (5.2.27) we can find the distribution of profits as was explained before.

We cannot extend our analysis to the cases where Q and (P-V) are both normally or lognormally distributed with unknown means μ_Q and μ_{(P-V)} and unknown variances σ²_Q and σ²_{(P-V)}. For the case in which both Q and (P-V) are normally distributed it was shown in Chapter Four that the predictive distributions are Student. The distribution of the product of two Student distributions does not have a tractable closed form except when the parameters of the two distributions are the same; in this case the distribution of the product is an F distribution. If conjugate priors are assigned to the unknown parameters of the distributions of Q and (P-V) then, for the case in which Q and (P-V) are lognormally distributed, the predictive distributions are logStudent. We cannot extend our analysis to this case either, because the distribution of
the product of two logStudent distributions does not have a tractable closed form in the general case.

We can address the previous problem from a different viewpoint. Suppose that Q and (P-V) have a joint lognormal distribution with parameters μ and Σ, where

(5.2.28)    μ = [μ_Q, μ_{(P-V)}]'

and

(5.2.29)    Σ = [σ_11  σ_12]
                [σ_21  σ_22].

Given that μ is unknown and Σ is known, the decision maker can assess a joint prior distribution on the vector of unknown parameters. A joint predictive distribution for Q and (P-V) can be obtained from the posterior distribution of μ. This approach works if Σ is known; otherwise there is not a tractable closed form in the general case.

The nonstationary Bayesian CVP model presented in the previous section can be extended to the multiproduct case. In any given period t, the random variables of interest are vectors Q_t of quantities sold for products 1, 2, ..., P; i.e., Q_t = [Q_t1, Q_t2, ..., Q_tP]'. Suppose that Q_t is multivariate normally distributed with mean vector μ_t and covariance matrix Σ. A Bayesian analysis involves the assessment of a prior distribution for μ_t if only the vector of means is unknown, or a joint prior on (μ_t, Σ) if both parameters are unknown. After a vector Q_t is observed, the posterior distribution of the unknown parameters is
available. Next, assume that values of the mean vector for successive time periods are related as μ_{t+1} = μ_t + ε_t, where ε_t is a multinormal "random shock" term independent of μ_t with known mean vector u and covariance matrix Ω. For the case in which μ is the unknown vector of parameters, Winkler and Barry (197J) discuss the methodology to obtain the posterior distribution of μ and the predictive distribution of the vector of quantities sold, Q_t. They pointed out that the updating procedure for the model is relatively straightforward but that difficulties are encountered in attempting to investigate limiting properties of the model. Simplifying assumptions which produce limiting results are:

1. the prior information at the beginning of period one can be thought of as equivalent to the information obtained from a sample of size n' from the process, and therefore the covariance matrix of the initial distribution, say S', can be thought of as a constant multiple of Σ; i.e., S' = (n')⁻¹ Σ;

2. the random shocks that change the mean vector from period to period are such that they do not change the underlying relationship among the elements of the mean vector, and therefore the covariance matrix Ω can be thought of as a constant multiple of Σ; say Ω = wΣ.

If we make the same simplifying assumptions as in Winkler and Barry and in addition assume that from period to period the unknown covariance matrix Σ does not change, then we can extend the methodology from the univariate to the multivariate case for the case in which μ and Σ are both unknown. Under these assumptions, during the time period t we can revise the joint distribution on (μ_t, Σ) and at the end of time period t (the beginning of time period t+1) determine the new mean vector μ_{t+1}, which reflects the effects of the random shock. From the prior distribution of (μ_{t+1}, Σ) the decision maker can determine the predictive distribution of quantities sold and the predictive distribution of profits.

5.3 Nonstationarity in Statistical Life Analysis

5.3.1 Existing Analysis

Reliability theory is the discipline that deals, among other things, with procedures to ensure the maximum effectiveness of manufactured articles. In general, life length is random, and so we are led to a study of life distributions. For instance, Farewell and Prentice (1977) emphasize the applicability of lognormal models to recent data sets from the industrial and medical literature. Reliability theory emphasizes the prediction, estimation and optimization of the probability of survival, the mean life, or, more generally, the life distribution of components or systems.

In the traditional approach to life testing inference, point or interval estimators for functions of the life distributions were obtained by substituting for the unknown parameters the point estimators obtained for them. Most uses of Bayesian methods can be characterized
as point or interval estimation of parameters of life distributions or of reliability functions. All of the papers discussed in Chapter Two that have considered life testing problems have assumed a stationary situation. However, no matter how hard the company works to maintain constant conditions during a production process, fluctuations in the production factors can lead to a significant variation in the properties of the finished products. Variations in inputs, in some cases, tend to be purely random and could gradually change the characteristics of the life distributions of the products. Moreover, the wearout of the machines used in the manufacture of the products could cause changes in the quality of the products and hence in the parameters of the life distributions. Again we want to stress that we are referring to gradual changes, the effects of which are not perfectly predictable in advance for a particular period; i.e., the characteristics of the process vary across time but are relatively constant within a given period. In our opinion the model developed in Chapters Three and Four provides a convenient framework to study the effects of nonstationarity on the inferences drawn from life testing statistical models.

5.3.2 A Life Testing Model Under Nonstationarity

A natural framework for studying the problem of changing parameters in terms of forecasting the life of a manufactured product is provided by the Bayesian approach to statistical inference. Having a product, let us consider the random interval beginning with the
moment the product starts to work and ending at the moment of its failure. This positive random variable is called the life time of the product, or time to failure. Suppose that the model for the life of a product is the lognormal distribution with parameters μ and σ²; i.e., the lives of products coming from a given process, L_1, L_2, ..., are independent and identically distributed random variables with common density function f_LN(L|μ,σ²). Suppose also that a posterior distribution over the unknown parameters is available and that the distribution of the random variable, life, undergoes a gradual parameter shift between successive periods of time of the form μ_{t+1} = μ_t + ε_{t+1}, as defined in (3.3.8).

From a formal Bayesian analysis, during a given period t, two distributions are available, namely the posterior distribution and the predictive distribution of a future observation which comes from the same data generating process. If μ_t is the only unknown parameter and the prior distribution is natural conjugate to the process, then the posterior distribution is f''(μ_t|m''_t, σ²/n''_t) and the predictive distribution is f_LN(L_t|m''_t, σ²[(1 + n''_t)/n''_t]), as defined in (3.3.6) and (3.3.7). If μ_t and σ² are both unknown and the prior distribution is natural conjugate to the lognormal distribution, then the posterior distribution is normal-gamma, f''_{N-γ}(μ_t, σ²|m''_t, v''_t, n''_t, d''_t), and the predictive distribution is logStudent with infinite mean and variance, as defined in (3.3.17)-(3.3.20).
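To make the reliability use concrete, consider the known-σ² case. The sketch below is ours, with hypothetical numbers (not from the text): it shows how the lognormal predictive density yields a survival probability for a new unit, and how the random shock widens the predictive spread:

```python
from math import erf, log, sqrt

def survival_prob(t: float, m: float, var: float) -> float:
    """P(L > t) when ln L has a normal predictive density N(m, var)."""
    z = (log(t) - m) / sqrt(var)
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

# Hypothetical values: posterior mean m'' = 7.0 for ln-life (in hours),
# known sigma^2 = 0.25, posterior n'' = 16, shock parameter n_s = 4.
sigma2, m_post, n_post, n_s = 0.25, 7.0, 16.0, 4.0

var_stat = sigma2 * (1 + 1 / n_post)       # predictive variance of ln L at t
n_next = n_post * n_s / (n_post + n_s)     # 3.2, well below n'' = 16
var_nonstat = sigma2 * (1 + 1 / n_next)    # larger predictive variance

p_stat = survival_prob(1000.0, m_post, var_stat)
p_nonstat = survival_prob(1000.0, m_post, var_nonstat)
# Since ln(1000) < 7.0 both probabilities exceed one half, but the extra
# predictive spread under nonstationarity pulls the probability toward 0.5.
```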
Under both uncertainty situations the posterior distribution (or the prior if no sample evidence was included) then reflects whatever is known concerning the parameters of interest, and it also fully reflects the remaining uncertainty the manager has concerning the parameters.

A large part of the statistical problem in reliability involves the estimation (point or interval) of parameters in failure distributions. Each of the non-Bayesian methods of obtaining point estimates given in Chapter Two has certain statistical properties that make it desirable from a theoretical viewpoint. From the Bayesian standpoint the posterior distribution should be used to derive the point or interval estimators of the unknown parameters, except under nonstationarity, in which case the new prior should be used. With respect to inferences, the manager considers the entire posterior distribution (or any probability determined from this distribution) as an inferential statement, and he may not be interested in a single point estimate. For instance, some potential estimators of μ based on the normal posterior distribution, for the case when only μ is unknown, are the posterior mean, the posterior median, the posterior mode, and so on. Since the normal is unimodal and symmetric, the posterior mean, m''_t, is equal to the posterior median and to the posterior mode.

On the other hand, if an interval of values for μ_t rather than a single value is desired, then from the normal posterior distribution the probability of any interval of values of μ_t can be determined. It was shown in Chapter Three that the presence of nonstationarity produces
greater uncertainty (variance), at the start of period t+1, with respect to the unknown parameter than would be present under stationarity, because in the stationary case n′_{t+1} = n″_t. Thus we would expect to have wider intervals for a given γ probability content; and after several periods the intervals will also be shifted in location, since they will differ in means. For the case in which both parameters (μ and σ²) of the lognormal distribution are unknown, some potential point or interval estimators are based on the marginal distributions obtained from the joint posterior distribution function. In any given period t, if the joint posterior distribution of the unknown parameters of the lognormal life density function is normal-gamma, as defined in Section 3.3.2, then the marginal distribution of σ is inverted-gamma-2, as defined by

(5.3.1)  f_{iγ2}(σ|v_t″, d_t″) = 2 exp[−d_t″v_t″/2σ²] [d_t″v_t″/2σ²]^((d_t″+1)/2) / {Γ(d_t″/2) [d_t″v_t″/2]^(1/2)} ,

with mean

(5.3.2)  E(σ|v_t″, d_t″) = (d_t″v_t″/2)^(1/2) Γ[(d_t″−1)/2] / Γ(d_t″/2) .

(The marginal distribution of 1/σ² is gamma-2.)
and variance

(5.3.3)  V(σ|v_t″, d_t″) = [d_t″v_t″/(d_t″ − 2)] − [E(σ)]² .

The cumulative density function of the inverted-gamma-2 is related to the cumulative density function of the gamma-2 variable by

(5.3.4)  G_{iγ2}(σ|v_t″, d_t″) = F_{γ2}(1/σ²|v_t″, d_t″) ,

where G(a) = ∫_a^∞ f(x) dx and F(a) = ∫_{−∞}^a f(x) dx [see Raiffa and Schlaifer (1961)]. The marginal distribution of μ is Student, as defined by f_S(μ_t|m_t″, n_t″/v_t″, d_t″). Point or interval estimators may be obtained from (5.3.1) or from the Student marginal distribution of μ. Sometimes the people working with life testing models are interested in the distribution of the median and of the mean of the lognormally distributed variables. The median and the mean of lognormally distributed random variables are given by ζ = exp(μ) and θ = exp(μ + σ²/2). For a given period t, the conditional posterior probability density function of μ given σ is normal with mean m″ and variance σ²/n″; hence ζ, given σ, has a lognormal posterior probability density function. The marginal posterior probability density function for μ is Student; thus ζ has a posterior density function which is logStudent [Zellner (1971)]. Similarly, given σ, the conditional posterior probability density function for θ is lognormal. Again these distributions incorporate all the available prior and sample information and can be employed to obtain point estimates, to make probability statements
about parameters' values, to perform Bayesian tests of hypotheses, and to derive predictive probability distributions for future observations coming from the lognormal life testing model. As discussed in Chapter Four, a prediction interval is different from a confidence interval for an unknown population parameter (such as the population mean) or from a tolerance interval to contain a specified proportion of the population. It is sometimes of interest to obtain a value, arrived at by life testing, that with high probability will be less than the life length of a particular component that is to be used in a one-trial system. In many practical problems in industry, it is desired to use the results of a previous sample to predict the results of a future sample. For example, data on warranty values on engines over the past three years might be used for planning purposes to obtain limits that will contain the warranty in the coming year with a high probability. Such problems can be handled by Bayesian prediction intervals. Prediction intervals are also of special interest to engineers who are concerned with setting limits on the performance of a small number of units of a product. Such limits would be required, for example, in setting specifications to contain with a high probability a critical performance characteristic for all units in an order of three heavy transformers when the only available information is the data on five previous transformers of the same type. By using the limits of a prediction interval as specification limits, one can state that with a specified probability all three transformers will meet specifications [Hahn and Nelson (1973)].
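For the known-σ² lognormal case, such a Bayesian prediction interval comes directly from the predictive distribution: ln L is normal with mean m″ and variance σ²(n″+1)/n″. The sketch below is illustrative only (the posterior values are hypothetical, not from the text), using the standard library's normal quantile function.

```python
from statistics import NormalDist
import math

def lognormal_prediction_interval(m_post, n_post, sigma2, prob=0.95):
    """Central prediction interval for a future lifetime L, where
    ln L | data ~ N(m_post, sigma2 * (n_post + 1) / n_post)."""
    pred_sd = math.sqrt(sigma2 * (n_post + 1) / n_post)
    z = NormalDist().inv_cdf(0.5 + prob / 2)          # e.g. 1.96 for prob = 0.95
    return (math.exp(m_post - z * pred_sd), math.exp(m_post + z * pred_sd))

# Hypothetical posterior after period t: m'' = 6.0 (log-hours), n'' = 20.
lo, hi = lognormal_prediction_interval(6.0, 20, sigma2=0.25)
print(lo, hi)   # limits that contain the next unit's life with probability 0.95
```

The interval is symmetric on the log scale about m″, so it is skewed to the right in the original lifetime metric, as one would expect for lognormal lives.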
Prediction intervals are also required by the typical customer who purchases one or a small number of units of a given product and who must set limits on the performance values of the particular units he will purchase. The prior mean at the beginning of any period, m′_t, under nonstationarity can be expressed as the sum of the initial mean, m′, discounted by a factor that decreases with time, and an exponentially weighted sum of the observed sample observations. This relationship provides probably the most interesting aspect of the nonstationarity Bayesian model, particularly for the life testing problem. Since most of the point and interval estimates discussed in previous paragraphs are functions of n′_t, they are also unequally weighted functions of past data. This gives a Bayesian interpretation and justification for the old production management idea of exponential smoothing.* A strong argument is made that since the most recent observations contain the most information about what will happen in the future, they should be given relatively more weight than the older observations. A limitation in exponential smoothing techniques is that there is no good

*The exponentially weighted moving average forecast arises from the following model of expectations adapting to changing conditions. Let y_t represent that part of a time series which cannot be explained by trend, seasonal, or any other systematic factors, and let ŷ_t represent the forecast, or expectation, of y_t on the basis of information available through the (t−1)st period. It is assumed that the forecast is changed from one period to the next by an amount proportional to the last observed error. That is,

ŷ_t = ŷ_{t−1} + β(y_{t−1} − ŷ_{t−1}) ,  0 < β < 1 .

The solution of the above difference equation gives the formula for the exponentially weighted forecast:

ŷ_t = β Σ_{i=1}^∞ (1−β)^{i−1} y_{t−i} .
rule for determining the appropriate value of the weights to be assigned to each observation. The nonstationarity Bayesian model provides a rule to determine the set of weights to be assigned to the observations.

5.4 Conclusion

In this chapter, Bayesian models for Cost-Volume-Profit analysis and for life testing models under nonstationarity have been presented. This is reflected by the assignment of a prior distribution to the unknown parameters, which recognizes all uncertainty the decision maker has concerning the parameters. The input to the forecasting model is not only the past history of sales of the item, in the case of CVP analysis; direct information concerning the market, the industry, the economy, sales of competing and complementary products, price changes, advertising campaigns, and so on is also used. A similar range of information is incorporated in life testing models. The model also emphasizes that such a model ideally should include the changing character of the parameters of economic and life distributions by allowing for changes in the parametric description of the process through time. For the case of normal and lognormal data generating processes, under a particular form of stochastic parameter variation, it is shown that the presence of nonstationarity produces greater uncertainty for the manager, which is reflected in these particular cases by an increase in a particular measure of uncertainty, variance. Bayesian methods are used to derive predictive distributions for CVP analysis and life testing
models that allow the decision maker to make probability statements about future values of sales and future life lengths of items. Estimates obtained from the posterior and predictive distributions are unequally weighted functions of past data.
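The unequal weighting just noted mirrors the exponential smoothing recursion in the footnote above. A minimal sketch (values illustrative, not from the dissertation) showing that the adaptive-expectations recursion and the explicit exponentially weighted sum produce the same forecast:

```python
def ewma_forecast_recursive(y, beta, y0):
    """Adaptive-expectations recursion: yhat_t = yhat_{t-1} + beta*(y_{t-1} - yhat_{t-1})."""
    yhat = y0
    for obs in y:
        yhat = yhat + beta * (obs - yhat)
    return yhat

def ewma_forecast_explicit(y, beta, y0):
    """Equivalent closed form: beta * sum_i (1-beta)^(i-1) * y_{t-i},
    plus the starting forecast discounted by (1-beta)^t."""
    t = len(y)
    s = sum(beta * (1 - beta) ** (i - 1) * y[t - i] for i in range(1, t + 1))
    return s + (1 - beta) ** t * y0

y = [10.0, 12.0, 9.0, 11.0, 13.0]          # hypothetical observed series
a = ewma_forecast_recursive(y, beta=0.3, y0=10.0)
b = ewma_forecast_explicit(y, beta=0.3, y0=10.0)
print(a, b)   # identical: the recursion and the weighted sum agree
```

In the nonstationary Bayesian model, the analogue of β emerges from the ratio of the shock variance to the process variance rather than being chosen ad hoc, which is exactly the "rule for the weights" referred to above.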
CHAPTER SIX
CONCLUSIONS, LIMITATIONS, AND FURTHER STUDY

6.1 Summary

Great effort has been expended by engineers, econometricians and statisticians over the last two decades on the problem of model identification. This problem is concerned with construction of a model whose output is close, in some sense, to the observed data from the real system. The equations which describe the model are often specified to within a number of parameters which must be estimated. The unknown parameters are usually assumed a priori to be constant. In this case the problem of model identification is reduced to one of constant parameter estimation. The problem of time-varying parameters has received more attention during recent years because of an increased body of evidence that the usual assumption of stable parameters often lacks realism. The stochastic parameter variation problem arises when parameter variation includes a component which is a realization of some random process, in addition to whatever component is related to observable variables. Ideally, a model would be so well specified that no stochastic parameter variation would be present, but the world is less than ideal. In this dissertation we extend and generalize an earlier model developed by Winkler and Barry (1973) by

1. explicitly accounting for uncertainty with respect to both parameters of the Bayesian normal model,
and

2. modeling nonstationarity in mean and variance for the lognormal case, since the mean and variance of the lognormal distribution are both functions of both μ and σ².

Some of the objectives of this kind of research are to gain more precise information about the structure of economic relationships and/or to obtain estimated relationships that are suitable for forecasting, in particular in the areas of CVP analysis and life testing models. The model developed in the previous chapters seems particularly appropriate to both of these objectives, because it provides a framework for drawing inferences about the structure of the relationship at every point in time. Comparing the nonstationary model with the stationary one, it is shown that:

1. more uncertainty is present under nonstationarity than under stationarity;

2. past observations provide relatively less information about the current value of μ under nonstationarity than under stationarity, because the particular form of stochastic parameter variation used implies a treatment of data involving the use of all observations in a differential weighting scheme; and

3. under nonstationarity the limiting values of some of the parameters of the posterior and predictive distributions cannot be determined clearly.
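Points 1 and 2 can be illustrated numerically. Under stationarity the equivalent prior sample size n′ grows without bound, while under the random-shock model it stabilizes at a modest value, so old data are effectively discounted. The sketch below is an illustration under assumed values, not a result taken from the dissertation.

```python
def run_periods(periods, n_obs, sigma2, sigma2_eps, n0):
    """Track the equivalent sample size n' across periods.
    sigma2_eps = 0 reproduces the stationary case."""
    n = n0
    for _ in range(periods):
        n = n + n_obs                    # posterior after the period's sample
        if sigma2_eps > 0:               # a random shock dilutes precision
            n = sigma2 / (sigma2 / n + sigma2_eps)
    return n

stationary = run_periods(50, n_obs=5, sigma2=4.0, sigma2_eps=0.0, n0=1.0)
shifting   = run_periods(50, n_obs=5, sigma2=4.0, sigma2_eps=0.5, n0=1.0)
print(stationary, shifting)   # 251.0 versus a small, stable value near 4.3
```

The bounded n′ in the shifting case is another way of seeing the differential weighting scheme: the effective memory of the process is finite, so distant observations contribute almost nothing to current inferences.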
The model developed in this dissertation is simple, and some of the results are obtained under very restrictive assumptions. Probably the most important advantage of the new work is the increased versatility it lends to the nonstationary Bayesian model derived by Winkler and Barry (1973), i.e., the enlarged range of real and important problems involving univariate or multivariate nonstationary normal and lognormal processes with which it can cope. Another advantage is that it keeps the simplicity of the updating methods for the efficient handling of the estimation of unknown parameters and the prediction of the outcome of a future sample.

6.2 Limitations

The results obtained from the Bayesian modeling of nonstationarity rely on some general and simplifying assumptions that we have pointed out throughout the dissertation. Some of these assumptions limit the results obtained from the model. Some of these assumptions are part of the more general Bayesian statistical inference model, and others are related directly to the nonstationary condition. The decisions we make, the conclusions we reach and the explanations we offer are usually based on beliefs concerning the probability of uncertain events such as the result of an experiment, the outcome of a sporting event or the future value of an investment. In general, we do not have objectively given models according to which the probability of such events could be computed. As a consequence, the assessment of uncertainty is often based on the intuitive judgments of human beings. One important assumption of the model that we developed is that the manager can express
his judgments about the unknown parameters in terms of a natural conjugate prior distribution for the process. The manager has to decide which parameters are unknown, and then he must express his information about these random variables in probabilistic terms and according to the natural conjugate family of prior distributions. The prior probabilities should reflect the decision maker's prior information about the uncertain quantity in question, i.e., sample results if available; if there is little or no sample information, then they should be based on any other relevant information available. Several techniques are available for the quantification of judgment; some of these were referenced in Chapters Two and Five. For many problems, a joint distribution for the unknown parameters is needed. If the uncertain parameters are dependent, the assessment process becomes difficult, especially if we are dealing with continuous random variables. The applicability of conjugate prior distributions depends in part on the applicability of a particular statistical model, because the conjugate family of distributions, as shown in Chapter Three, depends on assumptions concerning a statistical model. Although the model is originally developed for normal data generating processes, several references are given for the applicability of lognormal models to economic and life testing problems. There are cases in which, even if a certain model is applicable to the data generating process and the corresponding conjugate family is known, it may be that no member of the family adequately represents the assessor's prior judgments.
For some of the results, we assumed that for each period a sample of equal size, n, is available. This is a crucial assumption for the limiting results discussed in Chapter Four. If a sample of equal size is not available each time a sample is taken, then the limiting value of n′_t cannot be obtained without some further restrictions on the nature of the sampling procedure actually used. The imposition of the transition relation μ_{t+1} = μ_t + ε_{t+1} is critical to the determination of the prior distribution of the time-varying coefficient. We assumed that the distributions of μ_t and ε_{t+1} were normal, and therefore we were able to find the convolution. It is shown in Appendix II that other assumptions, like gamma or exponential random shocks, non-additive nonstationary models, i.e., μ_{t+1} = μ_t ε_{t+1}, and exponential data generating processes, can lead to distributions that are not tractable and consequently not useful for the Bayesian modeling of time-varying parameters. It is also assumed in this model that no seasonal or trend effects are present. Insofar as the model is used for short-term forecasting, this assumption does not seem unrealistic. Further research including these additional sources of variation could lead to a more versatile model, although problems like those discussed in Appendix II are likely to reduce the possibilities of obtaining a model in closed form. [See Harrison and Stevens (1971) for some results with such a model.] The assumption that the variance of the normal process is known seems particularly unrealistic when we are assuming that the mean is unknown. Thus, we assumed that both parameters are unknown. However, a restrictive
assumption has to be imposed in order to permit the determination of the new prior distribution after a random shock has occurred, i.e., that the ratio (n_s) between the unknown population variance and the random shock variance is known.
We have assumed that a change in the process mean takes place during each period and that the magnitude of that change, ε, has a distribution N(0, σ_ε²). Although this has been a convenient assumption, it perhaps lacks realism. A more realistic assumption would seem to be that an assignable cause (and hence a change in the process mean) occurs according to a Poisson process. Carter (1972) approached this problem assuming that σ², the population variance, was known. The methodology described in this dissertation for the case in which both parameters are unknown could be used, incorporating this new assumption into the problem. The problem of nonstationarity could also be approached from a different angle. Suppose that the time-varying parameters μ₁, μ₂, ... are independent and identically distributed, conditional upon some second-order parameter(s), instead of being related in a stochastic manner. In a problem like this the decision maker is making inferences about the distribution of μ_t, which is sometimes called the distribution of nonstationarity. For instance, if μ_t is the mean for period t of a normal data generating process for sales of a given company, then the distribution of nonstationarity might represent the different values of μ_t over time. In general, the distribution of nonstationarity will have a parameter (or a vector of parameters), often denoted by φ, so that the distribution of nonstationarity can be represented by f(μ_t|φ) for all t. A Bayesian approach to this problem requires the specification of a probability distribution f(φ) in order to express the decision maker's uncertainty about φ. This problem can be studied
under various uncertainty and distributional assumptions concerning the distribution of nonstationarity and concerning the distribution of the second-order parameters, i.e., f(φ). This problem is related to the problem studied by a class of theorists known as empirical Bayesians [see Maritz (1970)]. Another application of the model developed in this dissertation relates to the calibration of instruments. Suppose that a product is being weighed. During period t a sample is taken to estimate the average weight of the products, μ_t. As the average appears to be high or low, a dial can be set to increase or reduce the average weight of the products by an amount e_t. If we assume that the dial is poorly calibrated, i.e., that the adjustment becomes a random variable, then when we change the dial we do not get μ_t + e_t but rather μ_t + ε_t, where E(ε_t) = e_t. Since the setting varies, ε_t will vary, and hence the expected mean weight of the products for the next period of time, E(μ_{t+1}), will vary. The expected value of ε_t, e_t, might be subject to control, so that a decision problem arises. Each period of time a setting must be selected that minimizes the variance of the average weight, or that minimizes the predictive variance of the weight for a future product that is sampled, or that satisfies a probabilistic constraint on the next weights of items produced by the process. Perhaps the most important area for further work has to do with identification of the nonstationarity. We have stressed throughout the dissertation that it is important for the decision maker to recognize the presence of nonstationarity if it exists. However, most of the time it is very difficult to get information about the general form of nonstationarity. Analyzing data for evidence of changes in parameter
conditions is a problem central to the development of an inferential system that the decision maker can use. The decision maker has available the sequence of sample means m₁, m₂, ..., m_t, ... . More research is needed to find out how those sample means could be helpful in determining what form of nonstationarity is present and what its variability is. In the previous section we pointed out that, when the parameters μ and σ² are unknown, our model depended on the assumption that the ratio (n_s) between the unknown population variance and the random shock variance is known. In most cases the decision maker does not know this value and needs to estimate it. Additional research is required to find out how to use the sample means and the sample variances to estimate n_s. In conclusion, since assumptions of stationarity are often quite unrealistic, the introduction of possible nonstationarity greatly increases the realism and the applicability of statistical inference methods, in particular of Bayesian procedures. More work, of both an empirical and analytical nature, appears to be promising.
APPENDIX I APPENDIX TO CHAPTER THREE
Bayesian Analysis of Normal and Lognormal Processes

The general Bayesian theory presented in subsection 3.2.1 provides the foundation for the analysis of normal and lognormal processes to be considered in this appendix. Most of this work appears in Raiffa and Schlaifer (1961) and in De Groot (1970). It sets the stage for our analysis of normal and lognormal processes under nonstationarity in Section 3.3. Two uncertainty conditions are to be studied in detail: in one case the shift parameter, μ, is unknown and the spread parameter, σ², is assumed to be known, and in the other case both parameters are assumed to be unknown. Prior, posterior and predictive distributions will be determined for both cases. In every case sufficient statistics will be found for the unknown parameters.

I.1 Normal and Lognormal Processes with Known Spread Parameters

The purpose of an experiment is to obtain information about μ or σ², depending upon which (if either) is known beforehand. Consider experiments consisting of n independent and identically distributed observations x₁, x₂, ..., x_n obtained from a normal process; that is, a process generating random variables x₁, x₂, ..., x_n with identical densities

(AI.1)  f_N(x|μ,σ²) = (√(2π) σ)⁻¹ exp[−(x−μ)²/2σ²] ,  −∞ < x < ∞, −∞ < μ < ∞, σ > 0.
The likelihood that an independent normal process will generate n successive values x₁, x₂, ..., x_n is the product of their individual likelihoods as given by (AI.1), if the stopping process is noninformative. (See La Valle (1970) for a general discussion of stopping rules.) In other words, it is the product of their individual likelihoods if the kernel of the likelihood function for the parameter depends only on the data generating process and not on the stopping process. We will assume that the stopping process is noninformative. Therefore the likelihood can be written as

(AI.2)  l(x|μ,σ²) = ∏_{i=1}^n (√(2π) σ)⁻¹ exp[−(x_i−μ)²/2σ²]

or

(AI.3)  = (√(2π) σ)⁻ⁿ exp{−[Σ_{i=1}^n (x_i−μ)²]/2σ²} .

If we assume that μ is unknown, then we can compute the statistic m defined as

(AI.4)  m = (Σ_{i=1}^n x_i)/n .

The likelihood can be written as

(AI.5)  l(x|μ) = (√(2π) σ)⁻ⁿ (exp{−[Σ_{i=1}^n (x_i−m)²]/2σ²}) exp[−n(m−μ)²/2σ²]

(AI.6)  ∝ exp[−n(m−μ)²/2σ²] .
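Equation (AI.6) says the likelihood for μ depends on the data only through (m, n). A quick numerical check (the sample values are illustrative): two samples with the same mean and size but different spreads yield identical kernels at every value of μ.

```python
import math

def mu_kernel(xs, mu, sigma2):
    """Likelihood kernel for mu with known sigma2, up to a factor free of mu:
    exp(-n * (m - mu)^2 / (2 * sigma2)), as in (AI.6)."""
    n = len(xs)
    m = sum(xs) / n
    return math.exp(-n * (m - mu) ** 2 / (2 * sigma2))

# Two samples sharing m = 2 and n = 3 but with different spreads:
a = [1.0, 2.0, 3.0]
b = [0.0, 2.0, 4.0]
for mu in (1.5, 2.0, 2.5):
    print(mu_kernel(a, mu, 4.0) == mu_kernel(b, mu, 4.0))   # True each time
```

The differing spreads only affect the factor exp{−[Σ(x_i−m)²]/2σ²}, which does not involve μ and therefore cancels in Bayes' formula.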
Thus all the information in the sample is conveyed by the statistics m, the sample mean, and n, the sample size. Since the data enter Bayes' formula only through the likelihood, it follows that all other aspects of the data, with the exception of m, are irrelevant in determining the posterior distribution of μ and hence in making inferences about μ. Raiffa and Schlaifer (1961) show that when the variance, σ², of an independent normal process is known but the mean is treated as a random variable, the most convenient distribution of μ, the natural conjugate prior, is the normal distribution defined by

(AI.7)  f_N(μ|m,σ′²) = {exp[−(μ−m)²/2σ′²]}/σ′√(2π) ,  −∞ < μ < ∞, σ′² > 0.

In the particular case of an unknown mean, the likelihood of μ is a normal curve completely known a priori except for location, which is determined by m. That is, the likelihood is data-translated in the original metric μ, and therefore a noninformative prior is locally uniform in μ itself. To simplify our results, let σ′² = σ²/n′; that is, we define the parameter n′ by

(AI.8)  n′ = σ²/σ′²

and say that the information, (m,σ′²), contained in the prior distribution of μ is equivalent to n′ observations on the process. In other words, let the prior distribution be
(AI.9)  f_N(μ|m,σ²,n′) = {exp[−n′(μ−m)²/2σ²]}/(σ√(2π/n′)) .

If a normal distribution with parameters n′ and m′ is assigned to μ, and if a sample then yields a sufficient statistic (m,n), then the posterior distribution of μ will be a normal distribution with parameters

(AI.10)  m″ = (n′m′ + nm)/(n′ + n)

and

(AI.11)  n″ = n′ + n .

It can be seen in (AI.10) that m″ is the weighted average of the prior and sample means. Therefore, we may conveniently regard the mean of the posterior distribution as a weighted average of an estimate of μ formed from the sample and an estimate of μ formed from the prior distribution. The weights of m and m′ in this weighted average are proportional to n and n′. If n′ > n, the prior mean is given more weight, and the posterior mean m″ is closer to m′ than to m. If n′ < n, the sample mean is given more weight, and n″
increases by a constant amount with each observation that is taken, regardless of the observed values. Therefore, as the number of observations increases, the distribution of μ becomes more concentrated around its mean. Moreover, the concentration must increase in a fixed, predetermined way, while the value of the posterior mean will depend on the observed values. If the n random variables x₁, ..., x_n represent a random sample of size n from a normally distributed population with mean μ and variance σ², then the sample mean m is normally distributed with conditional mean E(m|μ,σ²) = μ and conditional variance V(m|μ,σ²) = σ²/n. Since the variance of the prior distribution is equal to σ²/n′ and the variance of the sample mean is equal to σ²/n, we notice that, in the posterior distribution, the prior information receives more weight than the sample information if the prior variance is less than the variance of m (i.e., n′ > n).
predictive distribution function is a normal distribution with mean m″ and variance

(AI.13)  σ²(n″ + 1)/n″ = σ² + (σ²/n″) .

Thus the predictive variance reflects both the process variance σ² and the uncertainty about μ measured by σ²/n″. We are also interested in studying experiments consisting of n independent and identically distributed observations x₁, x₂, ..., x_n obtained from a lognormal process, that is, a process generating random variables x₁, x₂, ..., x_n with identical densities

(AI.14)  f_LN(x|μ,σ²) = {exp[−(ln x − μ)²/2σ²]}/(xσ√(2π)) ,  x > 0, −∞ < μ < ∞, σ > 0.

It was stated in Chapter Two that a random variable x is said to be lognormal if and only if ln x is normal. That is, suppose that ln x is normal with unknown mean μ_L and known variance σ_L². Denoting by f_N(ln x|μ_L,σ_L²) the value of the normal density at ln x and by f_LN(x|μ_L,σ_L²) the value of the lognormal density at x, it follows that

(AI.15)  f_LN(x|μ_L,σ_L²) = f_N(ln x|μ_L,σ_L²)/x .

Thus, working in terms of the variable ln x, the preceding analysis of the normal process can be applied to obtain results that apply to the lognormal distribution. When it is assumed, in a lognormal distribution, that σ is
known and that μ is unknown, it follows that the sufficient statistics are

(AI.16)  m = (Σ_{i=1}^n ln x_i)/n

and n. The natural conjugate prior for the unknown parameter μ is normal with parameters m′ and n′, as in the normal case. The revision of the prior distribution is also similar to the normal case. If a normal distribution with parameters m′ and n′ is assigned to μ, and if a sample then yields a sufficient statistic (m,n), then the posterior distribution of μ will be normal with parameters

(AI.17)  m″ = (n′m′ + nm)/(n′ + n)

and

(AI.18)  n″ = n′ + n .

The predictive distribution will be lognormal with parameters m″ and σ²(n″ + 1)/n″.

I.2 Normal and Lognormal Processes with Both Parameters Unknown

We shall now consider the important problem of sampling from a normal distribution for which both the mean and the variance are unknown. A conjugate family for this problem must be a family of bivariate distributions. Suppose that x₁, x₂, ..., x_n is a random sample from a normal distribution with an unknown value of the mean, μ, and an unknown value of the variance, σ². The likelihood that an independent normal process will generate such a sample is given in (AI.3), if the
stopping process is noninformative. Now, if we define the statistics

(AI.19)  m = (Σ_{i=1}^n x_i)/n

and

(AI.20)  v = (Σ_{i=1}^n (x_i−m)²)/(n−1) ,  (defined to be 0 if n = 1),

the likelihood (AI.3) can be rewritten as

(AI.21)  l(x|μ,σ²) = (2π)^(−n/2) (exp[−{(n−1)v/2σ²} − {n(m−μ)²/2σ²}]) σ⁻ⁿ .

All the information in the sample is conveyed by the statistics m, v, and n; i.e., (m,v,n) is sufficient. The kernel of the likelihood is

(AI.22)  {exp[−{(n−1)v/2σ²} − {n(m−μ)²/2σ²}]} σ⁻ⁿ .

Raiffa and Schlaifer (1961) show that under these assumptions the natural conjugate family of prior distributions for the two random variables, μ and σ², is a normal-gamma-2 distribution defined by

(AI.23)  f_{N-γ-2}(μ,σ²|m,v,n) = f_N(μ|σ²,m,n) f_{γ-2}(σ²|v,n) ;

that is,

(AI.24)  f_{N-γ-2}(μ,σ²|m,v,n) = {√(n/2πσ²) exp[−n(μ−m)²/2σ²]} {exp[−(n−1)v/2σ²] [(n−1)v/2σ²]^(((n−1)/2)−1) [(n−1)v/2] / [Γ((n−1)/2) σ⁴]} ,  −∞ < μ < ∞, n, v > 0.
They also recommend that, in order to improve the richness of the prior joint distribution, we could define a new parameter, d, which could be called the number of degrees of freedom in the statistic v. It does not have to be equal to n−1, since f_N(μ|σ²,m,n) and f_{γ-2}(σ²|v,d) are distinct. The prior joint distribution can then be defined as

(AI.25)  f_{N-γ-2}(μ,σ²|m,v,n,d) = f_N(μ|σ²,m,n) f_{γ-2}(σ²|v,d) .

We want to point out that, if we are concerned with a noninformative prior, then in order to find this tractable prior distribution the metric log σ, and not σ, should be used. In other words, the metric (transformation) log σ permits us to have a prior distribution of μ and σ² that is locally uniform (noninformative) with respect to the likelihood. However, there is no such restriction when we are working with informative priors. Next we present the marginal distributions of σ² and μ, since we will make use of them in Section 3.3, where we develop a model for nonstationarity in normal and lognormal processes. If the joint distribution of the random variables (μ,σ²) is normal-gamma-2 as defined before, Box and Tiao (1972) show that the marginal distribution of σ² is gamma-2 with parameters v and d; that is,

(AI.26)  f_{γ-2}(σ²|v,d) = {exp[−dv/2σ²]} [dv/2σ²]^((d/2)−1) [dv/2] / [Γ(d/2) σ⁴] ,  σ² > 0, v, d > 0.

They also show that the marginal distribution of μ is the Student distribution with parameters (m,v,d,n); that is,
(AI.27)  f_S(μ|m,n/v,d) = d^(d/2) [d + {n(μ−m)²/v}]^(−(d+1)/2) √(n/v) / B(1/2, d/2) ,  −∞ < μ < ∞, d, (n/v) > 0,

where B(p,q) is the complete beta function. If a sample yields a sufficient statistic (m,v,n,d) and a normal-gamma-2 prior with parameters (m′,v′,n′,d′) is assigned to μ and σ², then the posterior distribution will be normal-gamma-2 with parameters m″, n″, d″, v″ given by

(AI.28)  m″ = (n′m′ + nm)/(n′ + n) ,

(AI.29)  n″ = n′ + n ,

(AI.30)  d″ = d′ + n ,

and

(AI.31)  v″ = (d′v′ + n′m′² + dv + nm² − n″m″²)/(d′ + n) .

To find the predictive distribution of the random variable x, we have to evaluate the expression

(AI.32)  f(x) = ∫_{−∞}^∞ ∫_0^∞ f_N(x|μ,σ²) f_{N-γ-2}(μ,σ²|m″,n″,d″,v″) dσ² dμ .

Substituting the corresponding functions into the expression and integrating out μ and σ², Raiffa and Schlaifer (1961) show that the predictive distribution is a Student distribution, defined as
(AI.33)  f_S(x|m″, n″/[v″(n″+1)], d″) = (d″)^(d″/2) [d″ + {n″(x−m″)²/v″(n″+1)}]^(−(d″+1)/2) √(n″/[v″(n″+1)]) / B(1/2, d″/2) ,  −∞ < x < ∞, d″, n″/[v″(n″+1)] > 0.

The prior-posterior analysis of the lognormal distribution under the assumption that both parameters are unknown is, as we mentioned before, very similar to its normal counterpart. The sufficient statistics are

m = (Σ_{i=1}^n ln x_i)/n ,

v = (Σ_{i=1}^n (ln x_i − m)²)/(n − 1) ,

and n. The natural conjugate prior distribution for both unknown variables is the normal-gamma-2, as defined in (AI.25), and the marginal distributions are gamma-2 and Student for the parameters σ² and μ respectively. A posterior analysis will lead us to a normal-gamma-2 posterior distribution with parameters revised as in (AI.28)-(AI.31). The predictive distribution of ln x is Student, and hence the predictive distribution of x is logStudent. Ohlson (1977) shows that if the logarithms of the values of a random variable follow a t-model, then the expected value and the variance are infinite. Thus the predictive distribution of x, in our case where both parameters are unknown, has infinite mean and variance. In Chapter Four we will discuss the implications of these properties for our statistical inferential model.
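The updating rules (AI.28)-(AI.31) are simple arithmetic and can be sketched directly. The prior values and the sample below are hypothetical, chosen only to illustrate the mechanics; for lognormal data the xs would be log-lifetimes.

```python
def normal_gamma2_update(m1, n1, v1, d1, xs):
    """Posterior parameters (AI.28)-(AI.31) for a normal process with
    both mu and sigma^2 unknown, given a sample xs and a normal-gamma-2
    prior (m1, n1, v1, d1)."""
    n = len(xs)
    m = sum(xs) / n
    v = sum((x - m) ** 2 for x in xs) / (n - 1) if n > 1 else 0.0
    d = n - 1
    n2 = n1 + n                                   # (AI.29)
    m2 = (n1 * m1 + n * m) / n2                   # (AI.28)
    d2 = d1 + n                                   # (AI.30)
    v2 = (d1 * v1 + n1 * m1 ** 2 + d * v + n * m ** 2 - n2 * m2 ** 2) / d2  # (AI.31)
    return m2, n2, v2, d2

# Hypothetical prior and a sample of four log-lifetimes:
post = normal_gamma2_update(1.0, 4.0, 0.5, 3.0, [0.8, 1.1, 1.4, 0.9])
print(post)   # (m'', n'', v'', d'')
```

The same four numbers then parameterize the Student predictive density (AI.33) for the next observation, with d″ degrees of freedom and scale v″(n″+1)/n″.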
APPENDIX II APPENDIX TO CHAPTER THREE
Nonstationary Models for the Exponential Distribution

In Chapter Two we pointed out that the exponential distribution is frequently used to represent life testing models. All the research in the area of life testing where the distribution has been used has assumed stationary conditions for the parameters of the model. We wanted to model nonstationarity for this distribution using two different noise models, but the effort proved fruitless: only under very trivial assumptions did the analysis yield tractable results. For the more interesting and realistic assumptions, we show in this appendix that useful results cannot be developed. In particular, these two noise models were considered. One assumes that the value of the parameter of interest, say λ, at time period t+1 is equal to the value at time t plus a random term, i.e.,

(AII.1)  \lambda_{t+1} = \lambda_t + \epsilon_{t+1}, \quad t = 1, 2, \ldots;

the other noise model assumes that the value of the parameter λ at time period t+1 is equal to the value at time t times a random term, i.e.,

(AII.2)  \lambda_{t+1} = \lambda_t \epsilon_{t+1}, \quad t = 1, 2, \ldots.

Consider experiments consisting of n independent and identically distributed observations x_1, x_2, ..., x_n obtained from an exponential process; that is, a process generating random variables x_1, x_2, ..., x_n with identical densities
(AII.3)  f(x \mid \lambda) = \lambda e^{-\lambda x}, \quad x > 0, \; \lambda > 0.

When the stopping process is noninformative, the natural conjugate prior distribution of the unknown parameter λ is the gamma distribution with parameters a and b, i.e.,

(AII.4)  f(\lambda \mid a, b) = \lambda^{a-1} e^{-\lambda/b} / \Gamma(a) b^a, \quad 0 < \lambda < \infty, \; a > 0, \; b > 0.

In any given period t, with the prior on λ_t and with the sufficient statistics from the sample, we can find the posterior distribution on λ_t, which will be a gamma with parameters a'' and b''. At the end of period t, if the parameters are nonstationary, we use the posterior distribution on λ_t and the relation between λ_{t+1} and ε_{t+1} to get the prior distribution of the unknown parameter at the start of the next period. Assume that a gamma random shock is imposed on the unknown mean λ of the exponential data generating process; that is,

(AII.5)  f_\epsilon(\epsilon \mid \alpha, \beta) = \epsilon^{\alpha-1} e^{-\epsilon/\beta} / \Gamma(\alpha) \beta^\alpha, \quad 0 < \epsilon < \infty, \; \alpha > 0, \; \beta > 0.

Furthermore assume that equation (AII.1) describes the nonstationary random shock. Two cases are worth examining under this scenario. In the first case we additionally assume that the scale parameters are equal, i.e., β = b; in the other case we make no restrictions whatsoever on the parameters. Clearly case one is a special case of case two. Mood, Graybill and Boes (1974) state that if T_1 and T_2 are independent continuous random variables and z = T_1 + T_2, then the convolution has a density function given by

(AII.6)  f(z) = \int_{-\infty}^{\infty} f_1(z - \tau)\, f_2(\tau)\, d\tau.

Since λ and ε are necessarily positive, their convolution will have density

(AII.7)  g(z) = \int_0^z f_\epsilon(z - \lambda \mid \alpha, \beta)\, f_\lambda(\lambda \mid a, b)\, d\lambda.

But in case one the scale parameters are assumed equal to a constant, say c. Thus equation (AII.7) becomes

(AII.8)  g(z) = \int_0^z (z - \lambda)^{\alpha-1} e^{-(z-\lambda)/c}\, e^{-\lambda/c} \lambda^{a-1} / \Gamma(\alpha) c^\alpha \Gamma(a) c^a \, d\lambda.

Since z is fixed and λ cannot be greater than z, we can define a new variable

(AII.9)  \lambda = uz, \quad 0 < \lambda < z,

or

(AII.10)  u = \lambda/z, \quad 0 < u < 1.

Substituting (AII.10) into equation (AII.8) and simplifying, g(z) becomes

(AII.11)  g(z) = z^{a+\alpha-1} e^{-z/c} / \Gamma(a + \alpha)\, c^{a+\alpha}.

Thus the prior distribution of the mean at the beginning of time period t+1 is gamma again, with parameters (a' = a'' + α; c). However, the assumption that the scale parameters are equal makes this result not very useful. It is much more reasonable to think that the distributions of λ and of ε have not only different shape parameters a and α but also different scale parameters b and β. The convolution z of the random variable λ, given by equation (AII.4), and the random variable ε, given by equation (AII.5), when all the parameters are different can be written as

(AII.12)  g(z) = \int_0^z (z - \lambda)^{\alpha-1} e^{-(z-\lambda)/\beta}\, \lambda^{a-1} e^{-\lambda/b} / \Gamma(\alpha)\beta^\alpha \Gamma(a) b^a \, d\lambda,

or

(AII.13)  g(z) = \frac{e^{-z/\beta}}{\Gamma(\alpha)\beta^\alpha \Gamma(a) b^a} \int_0^z (z - \lambda)^{\alpha-1} \lambda^{a-1} \exp\{\lambda[(1/\beta) - (1/b)]\} \, d\lambda.

Gradshteyn and Ryzhik (1965) show that

(AII.14)  \int_0^u x^{\nu-1} (u - x)^{\mu-1} e^{\beta x} \, dx = B(\mu, \nu)\, u^{\mu+\nu-1}\, {}_1F_1(\nu; \mu + \nu; \beta u),

where B(μ, ν) is the beta function and {}_1F_1(\nu; \mu + \nu; \beta u) is a degenerate (confluent) hypergeometric function which does not have a closed form. Substituting (AII.14) into equation (AII.13) yields

(AII.15)  g(z) = g(\lambda_{t+1}) = \frac{\exp[-\lambda_{t+1}/\beta]\, \lambda_{t+1}^{a+\alpha-1}\, {}_1F_1(a; a + \alpha; [(1/\beta) - (1/b)]\lambda_{t+1})}{\beta^\alpha b^a \Gamma(a + \alpha)}.

It is clear from expression (AII.15) that we cannot obtain a tractable expression to work with in future periods. Furthermore, if we assume that at the beginning of period t+1 the random variable λ has a density function of the form given by (AII.15), and if in addition we assume that new information becomes available from an exponential process, then the posterior distribution cannot be shown to be of the form (AII.15). The previous analysis assumed that the random shock model was of the form (AII.1), that is, \lambda_{t+1} = \lambda_t + \epsilon_{t+1}. If we assume instead that equation (AII.2) describes the nonstationarity condition on the mean of the data generating process, i.e., \lambda_{t+1} = \lambda_t \epsilon_{t+1}, then we can show that even in the simple case where both scale parameters have a value of one we cannot find tractable results. In any given period t,
assume that the posterior distribution of λ_t is given by (AII.4) and that the distribution of ε is given by (AII.5). Mood, Graybill and Boes (1974) state that for two independent continuous random variables x and y, the distribution of their product z, i.e., z = xy, is given by

(AII.16)  f(z) = \int_{-\infty}^{\infty} \{ f_{x,y}(x, z/x) / |x| \} \, dx.

Hence, since λ is positive, the distribution of the product of the posterior of λ_t and the nonstationary random shock ε is given by

(AII.17)  f(z) = \int_0^\infty \{ \lambda^{a-1} e^{-\lambda}\, (z/\lambda)^{\alpha-1} e^{-z/\lambda} / |\lambda| \Gamma(a)\Gamma(\alpha) \} \, d\lambda,

or

(AII.18)  f(z) = \frac{z^{\alpha-1}}{\Gamma(a)\Gamma(\alpha)} \int_0^\infty \lambda^{a-\alpha-1} \exp[-\lambda - (z/\lambda)] \, d\lambda.

Gradshteyn and Ryzhik (1965) state that

(AII.19)  \int_0^\infty x^{\nu-1} \exp[-(\beta/x) - \gamma x] \, dx = 2 (\beta/\gamma)^{\nu/2} K_\nu(2\sqrt{\beta\gamma}),

where K_\nu is a Bessel function of imaginary argument. Thus, using the relation (AII.19) in (AII.18), f(z) becomes

(AII.20)  f(z) = 2\, z^{(a+\alpha)/2 - 1}\, K_{a-\alpha}(2\sqrt{z}) / \Gamma(a)\Gamma(\alpha).

This shows that even for the simple case where β = b = 1 the results are not tractable. Additional problems of interest, like those studied in the previous section, present additional complications. Instead of assuming a gamma random shock we could assume an
exponential random shock to model nonstationary means in the data generating process. Consider samples of n independent and identically distributed observations x_1, x_2, ..., x_n from an exponential process as defined in (AII.3). Assume a gamma prior distribution for the unknown parameter λ as defined in (AII.4), and suppose an exponential random shock is imposed on the unknown mean λ, i.e.,

(AII.21)  f_\epsilon(\epsilon \mid \alpha) = \alpha e^{-\alpha\epsilon}, \quad 0 < \epsilon < \infty, \; \alpha > 0.

If the equation that describes the nonstationary condition of the mean is (AII.1), then two cases are relevant for analysis: in one we assume that α = 1/b, and in the other we make no assumptions about the parameters. When we assume that α and 1/b are equal to a constant, say w, then the convolution z of the random variables λ and ε has a density function given by

(AII.22)  f(z) = \int_0^z \{ w e^{-w(z-\lambda)}\, \lambda^{a-1} e^{-\lambda w}\, w^a / \Gamma(a) \} \, d\lambda,

or, integrating,

(AII.23)  f(z) = w^{a+1} e^{-wz} z^a / \Gamma(a)\, a.

If we define

(AII.24)  w = 1/d,

and

(AII.25)  c = a + 1,

and substitute them into (AII.23), the density of z becomes

(AII.26)  f(z) = e^{-z/d} z^{c-1} / d^c\, \Gamma(c),

which is easily recognized as a gamma distribution with parameters c and d. If we make no assumptions about the parameters, the convolution z has density

(AII.27)  f(z) = \int_0^z \{ \alpha e^{-\alpha(z-\lambda)}\, \lambda^{a-1} e^{-\lambda/b} \} / \Gamma(a) b^a \, d\lambda,

or, integrating,

(AII.28)  f(z) = \frac{\alpha e^{-\alpha z}}{\Gamma(a) b^a} \int_0^z e^{\lambda[\alpha - (1/b)]}\, \lambda^{a-1} \, d\lambda.

Gradshteyn and Ryzhik (1965) state that

(AII.29)  \int_0^u x^{\nu-1} e^{-\mu x} \, dx = \mu^{-\nu}\, \gamma(\nu, \mu u),

where γ(a, x) is the incomplete gamma function. Hence, if we use (AII.29), the density of z can be rewritten as

(AII.30)  f(z) = \frac{\alpha e^{-\alpha z}}{\Gamma(a) b^a}\, [-\{\alpha - (1/b)\}]^{-a}\, \gamma(a, -[\alpha - (1/b)]z).

In any given period t, with a posterior distribution on λ_t which is gamma and an exponential random shock, we cannot get closed forms for the convolution of the variables. Furthermore, the "closure under sampling" property of the prior is lost with a prior of the form
(AII.30). For instance, suppose that the prior distribution of the mean of an exponential process in a given period t is of the form (AII.30). Consider now the case in which new information comes from a sample of n observations on the exponential data generating process. The posterior distribution of λ_t, determined by means of Bayes' theorem, is given by

(AII.31)  f''(\lambda_t \mid x) = \frac{[-\{\alpha - (1/b)\}]^{-a}\, \gamma(a, -[\alpha - (1/b)]\lambda_t)\, \lambda_t^{\,n} \exp[-\lambda_t(\alpha + \Sigma x_i)]}{\int_0^\infty [-\{\alpha - (1/b)\}]^{-a}\, \gamma(a, -[\alpha - (1/b)]\lambda_t)\, \lambda_t^{\,n} \exp[-\lambda_t(\alpha + \Sigma x_i)] \, d\lambda_t},

or

(AII.32)  f''(\lambda_t \mid x) = \frac{\gamma(a, -\{\alpha - (1/b)\}\lambda_t)\, \lambda_t^{\,n} \exp[-\lambda_t(\alpha + \Sigma x_i)]}{\int_0^\infty \gamma(a, -\{\alpha - (1/b)\}\lambda_t)\, \lambda_t^{\,n} \exp[-\lambda_t(\alpha + \Sigma x_i)] \, d\lambda_t}.

Gradshteyn and Ryzhik (1965) state that

(AII.33)  \int_0^\infty x^{\mu-1} e^{-\beta x}\, \gamma(\nu, \alpha x) \, dx = \frac{\alpha^\nu \Gamma(\mu + \nu)}{\nu (\alpha + \beta)^{\mu+\nu}}\, {}_2F_1\{1, \mu + \nu; \nu + 1; \alpha/(\alpha + \beta)\},

where {}_2F_1(\cdot) is a Gauss hypergeometric function which in most cases cannot be evaluated in closed form. Hence the denominator of (AII.32) cannot be determined in closed form. Therefore, we cannot find a posterior distribution of the form of the prior distribution.
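The one case above that does stay tractable, α = 1/b = w, is easy to confirm numerically: per (AII.23)-(AII.26), the convolution of a gamma posterior and an exponential shock with the matching rate is again gamma, with c = a + 1 and d = 1/w. A Monte Carlo sketch; the parameter values are illustrative, not from the text:

```python
import random

random.seed(1)
a, d = 3.0, 2.0            # gamma shape a, common scale d = 1/w (our choice)
n = 200_000

# z = lambda_t + epsilon_{t+1}: a gamma(a, d) value plus an exponential
# shock with rate w = 1/d -- the alpha = 1/b special case of (AII.22)
zs = [random.gammavariate(a, d) + random.expovariate(1.0 / d)
      for _ in range(n)]

c = a + 1.0                # (AII.25)
mean = sum(zs) / n
var = sum((z - mean) ** 2 for z in zs) / (n - 1)
# per (AII.26) the convolution is gamma(c, d): mean c*d, variance c*d**2
```

The sample mean and variance should settle near c·d = 8 and c·d² = 16, matching the gamma(c, d) form of (AII.26).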
Finally, consider the case where samples come from an exponential process [as defined in (AII.3)], the prior distribution for the unknown parameter is gamma [as defined in (AII.4)], an exponential random shock is imposed on the unknown parameter [as defined in (AII.21)], and the equation that describes the nonstationary condition of the mean is \lambda_{t+1} = \lambda_t \epsilon_{t+1}. We will show that even for the simplest case, where the scale parameter of the gamma distribution has a value of one, we cannot get tractable results. In any given period t, assume that the posterior distribution of λ_t is given by (AII.4) and that the distribution of ε is given by (AII.21). If we assume that the scale parameter has a value of one, the distribution of the product of the posterior distribution of λ_t and the nonstationary random shock ε_{t+1}, i.e., z = \lambda_t \epsilon_{t+1}, is given by

(AII.34)  f(z) = \int_0^\infty \{ \lambda^{a-1} \alpha \exp[-\lambda - (\alpha z/\lambda)] / \lambda \Gamma(a) \} \, d\lambda,

or

(AII.35)  f(z) = \frac{\alpha}{\Gamma(a)} \int_0^\infty \lambda^{a-2} \exp[-\lambda - (\alpha z/\lambda)] \, d\lambda.

We can simplify (AII.35) by using the equality (AII.19) to rewrite the integral in the equation. Hence the prior distribution of the mean at the beginning of period t+1 has density function

(AII.36)  f(\lambda_{t+1}) = 2\, \alpha^{(a+1)/2}\, \lambda_{t+1}^{(a-1)/2}\, K_{a-1}(2\sqrt{\alpha \lambda_{t+1}}) / \Gamma(a),

where, as before, K_\nu(\cdot) is a Bessel function of imaginary argument; that is,
(AII.37)  K_\nu(2\sqrt{z}) = \int_0^\infty \exp[-2\sqrt{z} \cosh t]\, \cosh(\nu t) \, dt.

For the same nonstationary model, if we assume that the scale parameter of the gamma distribution and the parameter of the exponential random shock are equal, say to c, then the distribution of the product of λ and ε is given by

(AII.38)  f(z) = \int_0^\infty \{ \lambda^{a-1} \exp[-c\lambda - (cz/\lambda)]\, c^{a+1} / \lambda \Gamma(a) \} \, d\lambda,

or

(AII.39)  f(z) = \frac{c^{a+1}}{\Gamma(a)} \int_0^\infty \lambda^{a-2} \exp[-c\lambda - (cz/\lambda)] \, d\lambda,

or

(AII.40)  f(z) = 2\, c^{a+1}\, z^{(a-1)/2}\, K_{a-1}(2c\sqrt{z}) / \Gamma(a).

In the case where the parameters are unrestricted, the distribution of the product of the random variables has density

(AII.41)  f(z) = \int_0^\infty \{ \lambda^{a-1} \exp[-\lambda/b]\, \alpha \exp[-\alpha z/\lambda] / \lambda \Gamma(a) b^a \} \, d\lambda,

or

(AII.42)  f(z) = \frac{\alpha}{\Gamma(a) b^a} \int_0^\infty \lambda^{a-2} \exp[-(\lambda/b) - (\alpha z/\lambda)] \, d\lambda,
or

(AII.43)  f(z) = 2\alpha\, (\alpha z b)^{(a-1)/2}\, K_{a-1}(2\sqrt{\alpha z / b}) / \Gamma(a) b^a.

In all three cases discussed above it is clear that the procedure does not yield tractable results. We cannot use f(z), i.e., f(\lambda_{t+1}), as the prior distribution of the unknown mean at the beginning of time period t+1.
APPENDIX III APPENDIX TO CHAPTER FOUR
Algorithm to Determine Prediction Intervals for Lognormal and LogStudent Distributions

A Bayesian prediction interval of cover γ is defined as an interval A such that

(AIII.1)  F(A \mid y) = \int_A P(x \mid y) \, dx = \gamma.

In general such a prediction interval is not unique. One particular interval which we shall consider is defined as follows. A most plausible Bayesian prediction interval of cover γ (also called a highest posterior density [H.P.D.] interval) has the form

(AIII.2)  A = \{ x : P(x \mid y) \ge k \},

where k is determined by P(A \mid y) = \gamma. If the prior distributions are natural conjugate to the process, then the predictive distribution for lognormal processes is lognormal when μ is unknown and σ² is known, and logStudent when μ and σ² are both unknown. The construction of H.P.D. intervals is difficult for these distributions since they are asymmetric. In this appendix we develop an algorithm to compute the prediction intervals for these distributions. If the predictive distribution is lognormal with mean m and variance σ²/n, then the H.P.D. interval of cover γ is of the form (a, b), where a and b are the solutions of
(AIII.3)  \int_a^b f_\Lambda(x \mid m, \sigma^2/n) \, dx = \gamma,

and among all the solutions they have the H.P.D. property. To determine the values of a and b we developed a search procedure. If the predictive distribution is logStudent, then the H.P.D. interval of cover γ is of the form (a, b), where a and b are the solutions of

(AIII.4)  \int_a^b f_{\log S}(x \mid m, v, n, d) \, dx = \gamma,

such that the H.P.D. property holds. Suppose that the predictive distribution can be represented as in Figure AIII.1.

[Figure AIII.1: Predictive Distribution]
The search procedure works as follows:

(i) In the first iteration, label the value of the density function at the mode A, i.e., A = f(Mode); label the value of the density function at the origin C, i.e., C = f(0). Select an arbitrary initial point a_2 (greater than the mode) and find another point a_1 with equal density. (See Figure AIII.2.) The value of the density function for this initial pair will be between points A and C; label it B, i.e., B = f(a_1) = f(a_2).

[Figure AIII.2: Predictive Distribution]
(ii) Compute \int_{a_1}^{a_2} f(x) \, dx and compare it with γ.

a) If \int_{a_1}^{a_2} f(x) \, dx < \gamma, then the value of the density function for the next point in the search will be between points B and C. Relabel those points as A = B and C = C; then select the next point in the search. (See Figure AIII.3.)

[Figure AIII.3: Predictive Distribution]

b) If \int_{a_1}^{a_2} f(x) \, dx > \gamma, then the value of the density function corresponding to the next point in the search will be between points A and B. Relabel those points as A = A and
C = B; then select the next point in the search. (See Figure AIII.4.)

[Figure AIII.4: Predictive Distribution]

(iii) To select the new points a_1 and a_2, in either case ii-a or ii-b, take a_1 to be the solution of the equation

(AIII.5)  f(a_1) = C + .618(A - C)

(see Luenberger (1973) for a discussion of the use of the golden section method, which uses the constant .618),
and then find a_2 with density equal to that of a_1, where f(\cdot) is the predictive density function.

(iv) Once a_1 and a_2 (with equal density) are found, return to step (ii) and repeat the procedure.

The algorithm stops if it does not find the desired γ-content interval within a specified number of iterations, or if it finds the interval to a specified precision, that is, if the absolute value of the difference between the computed γ content and the required γ content does not exceed a specified precision. A computer program was written to determine the H.P.D. intervals for lognormal and logStudent distributions using the previous algorithm. The computational work requires some numerical algorithms; we used three routines from the International Mathematical and Statistical Libraries, Inc., Volume 2. To determine the mode of the lognormal and logStudent distributions we used the subroutine ZXMIN, a quasi-Newton algorithm for finding the minimum of a function of N variables. To integrate the functions from a_1 to a_2 we used DCADRE, which integrates a function f(x) from a to b using cautious adaptive Romberg extrapolation. To determine the new values of a_1 and a_2, say a_1* and a_2*, we used the subroutine ZREAL1, which finds real zeros of a real function f(x) when the initial guesses may not be good. In Tables 1 and 2 we present some intervals computed for some lognormal and logStudent distributions.
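For the lognormal case the search can even be short-circuited. Requiring equal density at the endpoints is, in t = ln x, requiring equal values of the convex function (t - m)^2/(2s^2) + t, whose excess over its minimum at t* = m - s^2 (the log of the mode) is exactly (t - t*)^2/(2s^2); the endpoints are therefore symmetric about t* in log scale, and a single bisection on the lower endpoint suffices. A sketch under that observation (this is our shortcut, not the dissertation's ZXMIN/DCADRE/ZREAL1 program; s^2 denotes the predictive log-variance):

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def lognormal_pdf(x, mu, s2):
    return math.exp(-(math.log(x) - mu) ** 2 / (2.0 * s2)) / (
        x * math.sqrt(2.0 * math.pi * s2))

def hpd_lognormal(mu, s2, gamma, tol=1e-12):
    # H.P.D. interval of cover gamma for a lognormal predictive density
    # with log-mean mu and log-variance s2.  Equal endpoint densities
    # force ln b = 2*t_star - ln a with t_star = mu - s2, so only the
    # lower endpoint needs to be searched.
    s = math.sqrt(s2)
    t_star = mu - s2

    def cover(u):          # coverage of (e**u, e**(2*t_star - u))
        return norm_cdf((2.0 * t_star - u - mu) / s) - norm_cdf((u - mu) / s)

    lo, hi = t_star - 20.0 * s, t_star   # cover(lo) ~ 1 > gamma > cover(hi) = 0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if cover(mid) > gamma else (lo, mid)
    u = 0.5 * (lo + hi)
    return math.exp(u), math.exp(2.0 * t_star - u)

a, b = hpd_lognormal(1.0, 0.5, 0.90)
```

With μ = 1 and s² = .5 this reproduces the worked lognormal example reported with the tables: an interval of approximately (.3988, 6.8162) with equal density at the two limits.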
TABLE 2

PREDICTIVE INTERVALS FOR SOME LOGSTUDENT PREDICTIVE DISTRIBUTIONS

  Parameters              Lower      Upper      Computed    Density     Number of
  m   v    n   d    γ     limit a1   limit a2   interval γ  at limits   iterations
  2   .4   15  10   .90    .2989     22.2919     .8999       .0083         14
  2   .5   14  10   .90    .4940     20.7564     .8999       .0094         13
  2   .5   15   9   .90    .4637     20.5035     .8999       .0097         12
  2   .5   15  10   .80    .9738     14.4370     .7999       .0237         12
  2   .5   15  10   .90    .5481     19.7881     .9001       .0107         10
  2   .5   15  10   .95    .2237     29.3595     .9499       .0034         16
  2   .5   15  11   .90    .5619     20.2267     .8999       .0101         11
  2   .5   16  10   .90    .5334     20.3135     .8999       .0099         14
  2   .6   15  10   .90    .8561     18.4218     .8999       .0121         14
  3   .5   15  10   .90   1.4081     55.3166     .8999       .0036         13
The algorithm is used here to determine highest posterior density intervals, but it can be used to determine any type of interval desired; minor changes in the computer program yield one-sided prediction intervals or any other interval needed. To obtain any of the intervals shown in Tables 1 and 2, the user needs to submit only the parameters of the predictive distribution, the desired γ content of the interval, and the value of the complete beta function B\{1/2, (d/2)\}. The computer program then gives as output all the information that appears in Tables 1 and 2. For instance, when the location parameter, μ, and the spread parameter, σ², of a lognormal predictive distribution are 1 and .5 respectively and a .90 content interval is desired, the algorithm finds a .8999 content interval with limits .3988 and 6.8162. Similarly, when the parameters of a logStudent predictive distribution are (m = 2, v = .4, n = 15, d = 10) and a .90 content interval is required, the algorithm finds a .8999 content interval with limits .2989 and 22.2919. Following is a computer printout of the program to find predictive intervals for the logStudent distribution.
[Computer program listing, original pages 194-197: not legible in this reproduction.]
LIST OF REFERENCES

1. Adams, J.D. (1962). "Failure Time Distribution Estimation". Semiconductor Reliability, Vol. 2, pp. 41-52.

2. Adar, Z., A. Barnea and B. Lev (1977). "A Comprehensive Cost-Volume-Profit Analysis Under Uncertainty". The Accounting Review, Vol. LII, pp. 137-149.

3. Aitchison, J. (1966). "Expected-Cover and Linear Utility Tolerance Intervals". Journal of the Royal Statistical Society, Series B, Vol. 28, pp. 57-63.

4. Aitchison, J. and J.A.C. Brown (1957). The Lognormal Distribution. London: Cambridge University Press.

5. Aitchison, J. and D. Sculthorpe (1965). "Some Problems of Statistical Prediction". Biometrika, Vol. 52, pp. 469-483.

6. Ansley, W.G. (1967). "Device Failures During Equipment Life for Lognormal Distributions". IEEE Transactions on Reliability, Vol. R-16, pp. 139-140.

7. Bain, L.J. (1972). "Inferences Based on Censored Sampling From the Weibull or Extreme-Value Distribution". Technometrics, Vol. 14, pp. 693-702.

8. Bain, L.J. and M. Englehardt (1973). "Interval Estimation for the Two-Parameter Double Exponential Distribution". Technometrics, Vol. 15, pp. 875-887.

9. Barlow, R.E. and F. Proschan (1965). Mathematical Theory of Reliability. New York: Wiley.

10. Barnard, G.A. (1959). "Control Charts and Stochastic Processes". Journal of the Royal Statistical Society, Series B, Vol. 21, pp. 239-271.

11. Barry, C.B., J.I. Velez-Arocho and P. Welch (1977). "A Bayesian Predictive Approach to CVP Analysis Under Parameter Uncertainty". Working Paper 78-6, Bureau of Business Research, University of Texas at Austin.
12. Barry, C.B. and R.L. Winkler (1976). "Nonstationarity and Portfolio Choice". Journal of Financial and Quantitative Analysis, Vol. 11, pp. 217-235.

13. Bartholomew, D.J. (1963). "The Sampling Distribution of an Estimate Arising in Life Testing". Technometrics, Vol. 5, pp. 361-374.

14. Bartlett, M.S. (1937). "Properties of Sufficiency and Statistical Tests". Proceedings of the Royal Society A, Vol. 160, pp. 268-282.

15. Bhattacharya, S.K. (1967). "Bayesian Approach to Life Testing and Reliability Estimation". Journal of the American Statistical Association, Vol. 62, pp. 48-62.

16. Billman, B.R., C.L. Antle and L.J. Bain (1971). "Statistical Inference From Censored Weibull Samples". Technometrics, Vol. 14, pp. 831-840.

17. Birnbaum, A. (1962). "On the Foundations of Statistical Inference". Journal of the American Statistical Association, Vol. 57, pp. 269-307.

18. Box, G.E.P. and G.M. Jenkins (1970). Time Series Analysis, Forecasting and Control. San Francisco: Holden-Day.

19. Box, G.E.P. and G.C. Tiao (1972). Bayesian Inference in Statistical Analysis. Massachusetts: Addison-Wesley.

20. Breipohl, A.M., R.R. Prairie and W.J. Zimmer (1965). "A Consideration of the Bayesian Approach in Reliability Evaluation". IEEE Transactions on Reliability, Vol. R-14, pp. 107-113.

21. Brown, R.G. (1963). Smoothing, Forecasting and Prediction. Englewood Cliffs: Prentice-Hall.

22. Brown, S.J. (1976). "Optimal Portfolio Choice Under Uncertainty: A Bayesian Approach". Unpublished Ph.D. dissertation, University of Chicago.

23. Buckland, W.R. (1960). Life Testing, a Bibliographic Guide. Griffin's Monograph Series No. Thirteen. London: Charles Griffin and Co. Ltd.

24. Bury, K.V. (1972). "Bayesian Decision Analysis of the Hazard Rate for a Two-Parameter Weibull Process". IEEE Transactions on Reliability, Vol. R-21, pp. 159-169.
25. Buzby, S. (1974). "Extending the Applicability of Probabilistic Management Planning and Control Models". The Accounting Review, Vol. XLIX, pp. 42-49.

26. Canavos, G.C. (1972). "A Bayesian Approach to Parameter and Reliability in the Poisson Distribution". IEEE Transactions on Reliability, Vol. R-21, pp. 52-56.

27. Canavos, G.C. (1973). "An Empirical Bayes Approach for the Poisson Life Distribution". IEEE Transactions on Reliability, Vol. R-22, pp. 91-96.

28. Canfield, R.V. (1970). "A Bayesian Approach to Reliability Estimation Using a Loss Function". IEEE Transactions on Reliability, Vol. R-19, pp. 13-16.

29. Canfield, R.V. and L.E. Borgman (1975). "Some Distributions of Time to Failure for Reliability Applications". Technometrics, Vol. 17, pp. 263-268.

30. Carter, P.P. (1972). "A Bayesian Approach to Quality Control". Management Science, Vol. 18, pp. 647-655.

31. Chernoff, H. and S. Zacks (1964). "Estimating the Current Mean of a Normal Distribution Which is Subjected to Changes in Time". Annals of Mathematical Statistics, Vol. 35, pp. 999-1018.

32. Choi, S.C. and R. Wette (1969). "Maximum Likelihood Estimation of the Parameters of the Gamma Distribution and Their Bias". Technometrics, Vol. 11, pp. 683-690.

33. Cohen, A.C. (1951). "Estimating Parameters of Logarithmic-Normal Distributions by Maximum Likelihood". Journal of the American Statistical Association, Vol. 46, pp. 206-212.

34. Cohen, A.C. (1965). "Maximum Likelihood Estimation in the Weibull Distribution Based on Complete and on Censored Samples". Technometrics, Vol. 7, pp. 579-588.

35. Cozzolino, J.M. (1974). "Conjugate Distributions for Incomplete Observations". Journal of the American Statistical Association, Vol. 69, pp. 264-266.

36. Davis, D.J. (1952). "An Analysis of Some Failure Data". Journal of the American Statistical Association, Vol. 47, pp. 113-150.
37. Deemer, W.L. and D.F. Votaw (1955). "Estimation of Parameters of Truncated or Censored Exponential Distributions". Annals of Mathematical Statistics, Vol. 26, pp. 498-504.

38. de Finetti, B. (1962). "Does it Make Sense to Speak of 'Good Probability Appraisers'?" in I.J. Good (ed.), The Scientist Speculates: An Anthology of Partly-Baked Ideas. New York: Basic Books.

39. de Finetti, B. (1965). "Methods for Discriminating Levels of Partial Knowledge Concerning a Test Item". British Journal of Mathematical and Statistical Psychology, Vol. 18, pp. 87-123.

40. De Groot, M.H. (1970). Optimal Statistical Decisions. New York: McGraw-Hill.

41. Dickey, J. (1973). "Scientific Reporting and Personal Probabilities: Student's Hypothesis". Journal of the Royal Statistical Society, Vol. 35, pp. 285-305.

42. Dickinson, J.P. (1974). "Cost-Volume-Profit Analysis Under Uncertainty". Journal of Accounting Research, Vol. 12, pp. 182-187.

43. Dixon, A. (1937). "Soil Protozoa: Their Growth on Various Media". Annals of Applied Biology, Vol. 24, pp. 442-456.

44. Dubey, S.D. (1968). "Hyper-Efficient Estimator of the Location Parameter of the Weibull Laws". Naval Research Logistics Quarterly, Vol. 13, pp. 253-263.

45. Dunsmore, I.R. (1974). "The Bayesian Predictive Distribution in Life Testing Models". Technometrics, Vol. 16, pp. 455-460.

46. El-Sayyad, G.M. (1967). "Estimation of the Parameter of an Exponential Distribution". Journal of the Royal Statistical Society, Series B, Vol. 29, pp. 525-532.

47. Englehardt, M. and L.J. Bain (1973). "Some Complete and Censored Sampling Results for the Weibull or Extreme-Value Distribution". Technometrics, Vol. 15, pp. 541-549.

48. Englehardt, M. and L.J. Bain (1975). "Tests of Two-Parameter Exponential Against Three-Parameter Weibull Alternatives". Technometrics, Vol. 17, pp. 353-356.
49. Englehardt, M. (1975). "On Simple Estimation of the Parameters of the Weibull or Extreme-Value Distribution". Technometrics, Vol. 17, pp. 369-374.

50. Epstein, B. (1947). "The Mathematical Description of Certain Breakage Mechanisms Leading to the Logarithmic Normal Distributions". Journal of the Franklin Institute, Vol. 244, pp. 471-477.

51. Epstein, B. (1948). "Statistical Aspects of Fracture Problems". Journal of Applied Physics, Vol. 19, pp. 140-147.

52. Epstein, B. (1957). "Simple Estimators of the Parameters of Exponential Distributions When Samples Are Censored". Annals of the Institute of Statistical Mathematics, Vol. 8, pp. 15-26.

53. Epstein, B. (1960). "Statistical Life Test Acceptance Procedures". Technometrics, Vol. 2, pp. 435-446.

54. Epstein, B. (1961). "Estimates of Bounded Relative Error for the Mean Life of an Exponential Distribution". Technometrics, Vol. 3, pp. 107-109.

55. Epstein, B. and M. Sobel (1953). "Life Testing". Journal of the American Statistical Association, Vol. 48, pp. 486-502.

56. Epstein, B. and M. Sobel (1955). "Sequential Life Tests in the Exponential Case". Annals of Mathematical Statistics, Vol. 26, pp. 82-93.

57. Farewell, V.T. and R.L. Prentice (1977). "A Study of Distributional Shape in Life Testing". Technometrics, Vol. 19, pp. 69-75.

58. Farley, J.U. and M. Hinich (1970). "Detecting 'Small' Mean Shifts in Time Series". Management Science, Vol. 17, pp. 188-199.

59. Faulkenberry, G.D. (1973). "A Method of Obtaining Prediction Intervals". Journal of the American Statistical Association, Vol. 68, pp. 433-435.

60. Fercho, W.W. and L.J. Ringer (1972). "Small Sample Power of Some Tests of the Constant Failure Rate". Technometrics, Vol. 14, pp. 713-727.
61. Ferrara, W.L., J.C. Hayya and D.A. Nachman (1972). "Normalcy of Profit in the Jaedicke-Robichek Model". The Accounting Review, Vol. XLVII, pp. 299-307.

62. Ferrell, E.B. (1958). "Control Charts for Lognormal Universes". Industrial Quality Control, Vol. 15, pp. 4-6.

63. Finney, D.J. (1941). "On the Distribution of a Variate Whose Logarithm is Normally Distributed". Journal of the Royal Statistical Society Supplement, Vol. 7, pp. 144-161.

64. Folks, J.L. and R.H. Browne (1975). "On the Interpretation of the Observed Confidence in Certain Reliability Assessments". Technometrics, Vol. 17, pp. 287-290.

65. Fraser, D.A.S. and I. Guttman (1956). "Tolerance Regions". Annals of Mathematical Statistics, Vol. 27, pp. 162-179.

66. Gaddum, J.H. (1945). "Lognormal Distributions". Nature, Vol. 156, p. 463.

67. Galton, F. (1879). "The Geometric Mean in Vital and Social Statistics". Proceedings of the Royal Society, Vol. 29, pp. 365-367.

68. Goldthwaite, L.R. (1961). "Failure Rate Study for the Lognormal Lifetime Model". Proceedings of the 7th National Symposium on Reliability and Quality Control in Electronics, pp. 208-215.

69. Govindarajulu, Z. (1964). "A Supplement to Mendenhall's Bibliography on Life Testing and Related Topics". Journal of the American Statistical Association, Vol. 59, pp. 1231-1291.

70. Govindarajulu, Z. (1977). "A Class of Distributions Useful in Life Testing and Reliability". IEEE Transactions on Reliability, Vol. R-26, pp. 70-75.

71. Gradshteyn, I.S. and I.M. Ryzhik (1965). Tables of Integrals, Series and Products. London: Academic Press.

72. Grubbs, F.E. (1971). "Fiducial Bounds on Reliability for the Two-Parameter Negative Exponential Distribution". Technometrics, Vol. 13, pp. 873-876.

73. Guenther, W.C., S.A. Patil and V.R.R. Uppuluri (1976). "One-Sided β-Content Tolerance Factors for the Two Parameter Exponential Distribution". Technometrics, Vol. 18, pp. 333-340.
74. Gupta, S.S. (1962). "Life Test Sampling Plans for Normal and Lognormal Distributions". Technometrics, Vol. 4, pp. 151-175.

75. Gupta, S.S., G.G. McDonald and D.I. Galarneau (1974). "Moments, Product Moments and Percentage Points of the Order Statistics From the Lognormal Distribution for Samples of Size Twenty and Less". Sankhya, Series B, Vol. 36, pp. 230-260.

76. Guttman, I. (1970). Statistical Tolerance Regions: Classical and Bayesian. London: Griffin.

77. Haan, C.T. and C.E. Beer (1967). "Determination of Maximum Likelihood Estimators for the Three Parameter Weibull Distribution". Iowa State Journal of Science, Vol. 42, pp. 37-42.

78. Hahn, G.J. (1969). "Factors for Calculating Two-Sided Prediction Intervals for Samples From a Normal Distribution". Journal of the American Statistical Association, Vol. 64, pp. 878-888.

79. Hahn, G.J. and W. Nelson (1973). "A Survey of Prediction Intervals and Their Applications". Journal of Quality Technology, Vol. 5, pp. 178-187.

80. Hald, A. (1952). Statistical Theory with Engineering Applications. New York: Wiley.

81. Harris, C.M. and N.D. Singpurwalla (1968). "Life Distributions Derived From Stochastic Hazard Rates". IEEE Transactions on Reliability, Vol. R-17, pp. 70-79.

82. Harris, C.M. and N.D. Singpurwalla (1969). "On Estimation in Weibull Distribution with Random Scale Parameters". Naval Research Logistics Quarterly, Vol. 16, pp. 405-410.

83. Harrison, P.J. and C.F. Stevens (1971). "A Bayesian Approach to Short-Term Forecasting". Operational Research Quarterly, Vol. 22, pp. 341-361.

84. Harrison, P.J. and C.F. Stevens (1976). "Bayesian Forecasting". Journal of the Royal Statistical Society, Series B, Vol. 38, pp. 205-247.

85. Harter, H.L. (1964). "Exact Confidence Bounds Based on One Order Statistic, for the Parameter of an Exponential Population". Technometrics, Vol. 6, pp. 301-317.
86. Harter, H.L. and A.H. Moore (1965). "Maximum Likelihood Estimation of the Parameters of the Gamma and Weibull Populations From Complete and From Censored Samples". Technometrics, Vol. 7, pp. 639-643.

87. Harter, H.L. and A.H. Moore (1966). "Local Maximum Likelihood Estimation of the Parameters of Three-Parameter Lognormal Populations From Complete and Censored Samples". Journal of the American Statistical Association, Vol. 61, pp. 842-851.

88. Hewett, J.E. (1968). "A Note on Prediction Intervals Based on Partial Observations in Certain Life Test Experiments". Technometrics, Vol. 10, pp. 850-853.

89. Hewett, J.E. and M.L. Moeschberger (1976). "Some Approximate Simultaneous Prediction Intervals for Reliability Analysis". Technometrics, Vol. 18, pp. 227-229.

90. Hill, B.M. (1963). "The Three-Parameter Lognormal Distribution and Bayesian Analysis of a Point-Source Epidemic". Journal of the American Statistical Association, Vol. 58, pp. 72-84.

91. Hilliard, J.E. and R.A. Leitch (1975). "Cost-Volume-Profit Analysis Under Uncertainty: A Lognormal Approach". The Accounting Review, Vol. L, pp. 68-80; also, "A Reply". The Accounting Review, Vol. LI, pp. 168-171.

92. Hinich, M. and J. Farley (1966). "Theory and Application of an Estimation Model for Time Series with Nonstationary Means". Management Science, Vol. 12, pp. 648-658.

93. Holt, C.C. (1957). "Forecasting Seasonals and Trends by Exponentially Weighted Moving Averages". Carnegie Institute of Technology, Pittsburgh, Pennsylvania.

94. Holt, C.C., F. Modigliani, J.F. Muth and H.A. Simon (1960). Planning Production, Inventories and Work Force. New Jersey: Prentice-Hall.

95. Jaedicke, R.K. and A.A. Robichek (1964). "Cost-Volume-Profit Analysis Under Conditions of Uncertainty". The Accounting Review, Vol. XXXIX, pp. 917-926.

96. Jenkins, T.N. (1932). "A Short Method and Tables for the Calculation of the Average and Standard Deviation of Logarithmic Distributions". Annals of Mathematical Statistics, Vol. 3, pp. 45-55.
97. Johns, M.V., Jr., and G.J. Lieberman (1966). "An Exact Asymptotically Efficient Confidence Bound for Reliability in the Case of the Weibull Distribution". Technometrics, Vol. 8, pp. 135-175.
98. Johnson, N.L. and S. Kotz (1970). Continuous Univariate Distributions-1. Boston: Houghton Mifflin.
99. Johnson, C.L. and S.S. Simik (1971). "Multiproduct CVP-Analysis Under Uncertainty". Journal of Accounting Research, Vol. 9, pp. 278-286.
100. Jones, C.F. (1971). "A Confidence Interval for the Lognormal Hazard". Technometrics, Vol. 13, pp. 885-888.
101. Kamat, S.J. (1976). "A Smoothed Bayes Control Procedure for the Control of a Variable Quality Characteristic with Linear Shift". Journal of Quality Technology, Vol. 8, pp. 98-104.
102. Kander, Z. and S. Zacks (1966). "Test Procedures for Possible Changes in Parameters Occurring at Unknown Time Points". Annals of Mathematical Statistics, Vol. 37, pp. 1196-1210.
103. Kaplan, R.S. (1977). "Application of Quantitative Models in Managerial Accounting: A State of the Art Survey". Unpublished manuscript, GSIA, Carnegie-Mellon University.
104. Kapteyn, J.C. (1903). Skew Frequency Curves in Biology and Statistics. Astronomical Laboratory, Groningen: Noordhoff.
105. Kumar, S. and H.I. Patel (1971). "A Test for the Comparison of Two Exponential Distributions". Technometrics, Vol. 13, pp. 183-189.
106. Lambert, J.A. (1964). "Estimation of Parameters in the Three-Parameter Log-Normal Distribution". Australian Journal of Statistics, Vol. 6, pp. 29-32.
107. Larsen, R.I. (1969). "A New Mathematical Model of Air Pollutant Concentration Averaging Time and Frequency". Journal of the Air Pollution Control Association, Vol. 19, pp. 24-30.
108. Lau, A.H.-L. and H.-S. Lau (1976). "CVP Analysis Under Uncertainty - A Log-Normal Approach: A Comment". The Accounting Review, Vol. 51, pp. 163-167.
109. La Valle, I.H. (1970). An Introduction to Probability, Decision and Inference. New York: Holt, Rinehart and Winston.
110. Lawless, J.F. (1971). "A Prediction Problem Concerning Samples From the Exponential Distribution with Application in Life Testing". Technometrics, Vol. 13, pp. 725-730.
111. ________ (1972). "On Prediction Intervals for Samples From the Exponential Distribution and Prediction Limits for System Survival". Sankhya, Series B, Vol. 34, pp. 1-14.
112. Lemon, G.H. (1972). "An Empirical Bayes Approach to Reliability". IEEE Transactions on Reliability, Vol. R-21, pp. 155-158.
113. ________ (1975). "Maximum Likelihood Estimation for the Three Parameter Weibull Distribution Based on Censored Samples". Technometrics, Vol. 17, pp. 247-254.
114. Liao, M. (1975). "Model Sampling: A Stochastic Cost-Volume-Profit Analysis". The Accounting Review, Vol. L, pp. 780-790.
115. Likes, J. (1967). "Distributions of Some Statistics in Samples From Exponential and Power-Function Populations". Journal of the American Statistical Association, Vol. 62, pp. 259-271.
116. Lilliefors, H.W. (1967). "On the Kolmogorov-Smirnov Test for Normality with Mean and Variance Unknown". Journal of the American Statistical Association, Vol. 62, pp. 399-402.
117. ________ (1969). "On the Kolmogorov-Smirnov Test for the Exponential Distribution with Mean Unknown". Journal of the American Statistical Association, Vol. 64, pp. 387-389.
118. Lindley, D.V. (1972). Bayesian Statistics: A Review. Philadelphia: SIAM.
119. Linhart, H. (1965). "Approximate Confidence Limits for the Coefficient of Variation of Gamma Distributions". Biometrics, Vol. 21, pp. 733-738.
120. Luenberger, D.G. (1973). Introduction to Linear and Nonlinear Programming. Massachusetts: Addison-Wesley.
121. Lwin, T. and N. Singh (1974). "Bayesian Analysis of the Gamma Distribution Model in Reliability Estimation". IEEE Transactions on Reliability, Vol. R-23, pp. 314-318.
122. Mann, N.R. (1968). "Point and Interval Estimation Procedures for the Two-Parameter Weibull and Extreme Value Distributions". Technometrics, Vol. 10, pp. 231-256.
123. ________ (1969-a). "Exact Three-Order Statistic Confidence Bounds on Reliable Life for a Weibull Model with Progressive Censoring". Journal of the American Statistical Association, Vol. 64, pp. 306-315.
124. Mann, N.R. (1970). "Warranty Periods Based on Three Ordered Sample Observations From a Weibull Population". IEEE Transactions on Reliability, Vol. 19, pp. 167-171.
125. Mann, N.R. and S.C. Saunders (1969-b). "On Evaluation of Warranty Assurance When Life Has a Weibull Distribution". Biometrika, Vol. 56, pp. 615-625.
126. Mann, N.R., R.E. Schafer and N.D. Singpurwalla (1973). Methods of Statistical Analysis for Reliability and Life Data. New York: Wiley.
127. Maritz, J.S. (1970). Empirical Bayes Methods. London: Methuen.
128. Martz, H.D. (1975). "Pooling Life-Test Data by Means of the Empirical Bayes Method". IEEE Transactions on Reliability, Vol. R-24, pp. 27-30.
129. Mendenhall, W.A. (1958). "A Bibliography on Life Testing and Related Topics". Biometrika, Vol. 45, pp. 521-543.
130. Mood, A.M., F.A. Graybill and D.C. Boes (1974). Introduction to the Theory of Statistics. New York: McGraw-Hill.
131. Morrison, J. (1958). "The Lognormal Distribution in Quality Control". Applied Statistics, Vol. 7, pp. 160-172.
132. Murphy, A.H. and R.L. Winkler (1970). "Scoring Rules in Probability Assessment and Evaluation". Acta Psychologica, Vol. 34, pp. 273-286.
133. Mustafi, C.K. (1968). "Inference Problems About Parameters Which Are Subjected to Changes Over Time". The Annals of Mathematical Statistics, Vol. 39, pp. 840-854.
134. McFarland, W.J. (1972). "Bayes Estimation, Reliability and Multiple Hypothesis Testing". IEEE Transactions on Reliability, Vol. R-21, pp. 136-147.
135. Nelson, W.B. (1968). "Two-Sample Prediction". General Electric Company TIS Report 68-C-404, Schenectady, New York.
136. ________ (1970). "Statistical Methods for Accelerated Life Test Data - The Inverse Power Law Model". General Electric Corporate Research and Development TIS Report 71-C-011, Schenectady, New York.
137. Nowick, A.S. and B.S. Berry (1961). "Lognormal Distribution Function for Describing Anelastic and Other Relaxation Processes". IBM Journal of Research and Development, Vol. 5, pp. 297-311; ibid., pp. 312-320.
138. Ohlson, J.A. (1977). "Quadratic Approximations of the Portfolio Selection Problem". Management Science, Vol. 23, pp. 576-584.
139. Oldham, P.D. (1965). "On Estimating the Arithmetic Means of Lognormally-Distributed Populations". Biometrics, Vol. 21, pp. 235-239.
140. Page, E.S. (1954). "Continuous Inspection Schemes". Biometrika, Vol. 41, pp. 100-116.
141. ________ (1955). "A Test for a Change in a Parameter Occurring at an Unknown Point". Biometrika, Vol. 42, pp. 523-526.
142. Papadopoulos, A.M. and C.P. Tsokos (1975). "Bayesian Confidence Bounds for the Weibull Failure Model". IEEE Transactions on Reliability, Vol. R-24, pp. 21-26.
143. Pratt, J.W. (1965). "Bayesian Interpretation of Standard Inference Statements". Journal of the Royal Statistical Society, Series B, Vol. 27, pp. 169-203.
144. Raiffa, H. and R. Schlaifer (1961). Applied Statistical Decision Theory. Massachusetts: The M.I.T. Press.
145. Roberts, H.V. (1965). "Probabilistic Prediction". Journal of the American Statistical Association, Vol. 60, pp. 50-62.
146. Rockette, H., C. Antle and L.A. Klimko (1973). "Maximum Likelihood Estimation with the Weibull Model". Journal of the American Statistical Association, Vol. 69, pp. 246-249.
147. Rohn, W.B. (1959). "Reliability Prediction for Complex Systems". Proceedings of the 5th National Symposium on Reliability and Quality Control in Electronics, pp. 381-388.
148. Sarris, A.H. (1973). "A Bayesian Approach to Estimation of Time-Varying Regression Coefficients". Annals of Economic and Social Measurement, Vol. 2, pp. 501-524.
149. Savage, L.J. (1971). "The Elicitation of Personal Probabilities and Expectations". Journal of the American Statistical Association, Vol. 66, pp. 783-801.
150. Shah, B.V. (1969). "On Predicting Failures in a Future Time Period From Known Observations". IEEE Transactions on Reliability, Vol. 18, pp. 203-204.
151. Singpurwalla, N.D. (1971). "Statistical Fatigue Models: A Survey". IEEE Transactions on Reliability, Vol. R-20, pp. 185-189.
152. ________ (1972). "Extreme Values from a Lognormal Law with Applications to Air Pollution Problems". Technometrics, Vol. 14, pp. 703-711.
153. Sobel, M. and J.A. Tischendorf (1959). "Acceptance Sampling with New Life Test Objectives". Proceedings of the 5th National Symposium on Reliability and Quality Control, pp. 108-118.
154. Soland, R.M. (1968). "Bayesian Analysis of the Weibull Process with Unknown Scale Parameter and Its Application to Acceptance Sampling". IEEE Transactions on Reliability, Vol. R-17, pp. 84-90.
155. ________ (1969). "Bayesian Analysis of the Weibull Process with Unknown Scale and Shape Parameters". IEEE Transactions on Reliability, Vol. R-18, pp. 181-184.
156. Stael von Holstein, C.-A.S. (1970-a). Assessment and Evaluation of Subjective Probability Distributions. Stockholm: Economic Research Institute, Stockholm School of Economics.
157. ________ (1970-b). "Measurement of Subjective Probability". Acta Psychologica, Vol. 34, pp. 146-159.
158. Taimiter, M. (1966). "Sequential Hypothesis Tests for r-Dependent Marginally Stationary Processes". Annals of Mathematical Statistics, Vol. 37, pp. 90-97.
159. Tallis, G.M. and S.S.Y. Young (1962). "Maximum Likelihood Estimation of Parameters of the Normal, the Lognormal, Truncated Normal and Bivariate Normal Distributions From Grouped Data". Australian Journal of Statistics, Vol. 4, pp. 49-54.
160. Thatcher, A.R. (1964). "Relationships Between Bayesian and Confidence Limits for Prediction". Journal of the Royal Statistical Society, Series B, Vol. 26, pp. 176-192.
161. Theil, H. (1964). Optimal Decision Rules for Government and Industry. Amsterdam: North-Holland.
162. Thoman, D.R., L.J. Bain and C.E. Antle (1969). "Inferences on the Parameters of the Weibull Distribution". Technometrics, Vol. 11, pp. 445-460.
163. ________ (1970). "Reliability and Tolerance Limits in the Weibull Distribution". Technometrics, Vol. 12, pp. 363-371.
164. Tomlinson, R.C. (1957). "A Simple Sequential Procedure to Test Whether Average Conditions Achieve a Certain Standard". Applied Statistics, Vol. 60, pp. 168-207.
165. Varde, S.D. (1969). "Life Testing and Reliability Estimation for the Two Parameter Exponential Distribution". Journal of the American Statistical Association, Vol. 64, pp. 621-631.
166. Weibull, W. (1951). "A Statistical Distribution Function of Wide Applicability". Journal of Applied Mechanics, Vol. 18, pp. 293-297.
167. Weiss, L. (1961). "On the Estimation of Scale Parameters". Naval Research Logistics Quarterly, Vol. 8, pp. 245-256.
168. Wilson, E.B. and J. Worcester (1945). "The Normal Logarithmic Transformation". Review of Economics and Statistics, Vol. 27, pp. 17-22.
169. Winkler, R.L. (1967-a). "The Assessment of Prior Distributions in Bayesian Analysis". Journal of the American Statistical Association, Vol. 62, pp. 776-800.
170. ________ (1967-b). "The Quantification of Judgment: Some Methodological Suggestions". Journal of the American Statistical Association, Vol. 62, pp. 1105-1120.
171. ________ (1969). "Scoring Rules and the Evaluation of Probability Assessors". Journal of the American Statistical Association, Vol. 64, pp. 1073-1078.
172. ________ (1971). "Probabilistic Prediction: Some Experimental Results". Journal of the American Statistical Association, Vol. 66, pp. 675-685.
173. Winkler, R.L. and C.B. Barry (1973). "Nonstationary Means in a Multinormal Process". Research Report RR-73-9, Laxenburg, Austria: International Institute for Applied Systems Analysis.
174. Yuan, P. (1933). "On the Logarithmic Frequency Distribution and the Semi-Logarithmic Correlation Surface". Annals of Mathematical Statistics, Vol. 4, pp. 30-74.
175. Zellner, A. (1971). An Introduction to Bayesian Inference in Econometrics. New York: Wiley.
176. ________ (1971). "Bayesian and Non-Bayesian Analysis of the Log-Normal Distribution and Log-Normal Regression". Journal of the American Statistical Association, Vol. 66, pp. 327-330.
BIOGRAPHICAL SKETCH

Jorge Ivan Velez-Arocho was born in Lares, Puerto Rico, on October 25, 1947, son of Jorge Velez Velez and Elba Lucrecia Arocho Velez. He attended elementary and secondary school in the Lares, Puerto Rico public school system. He graduated from Domingo Aponte Collazo High School, Lares, Puerto Rico, in 1966. From 1966 through 1969 he attended the University of Puerto Rico (U.P.R.) at Rio Piedras. His university attendance was interrupted by active service in the United States National Guard from June 1969 to May 1970. On May 25, 1970, he graduated as Medical Corpsman from the Medical Training Center of the U.S. Army at Fort Sam Houston, Texas. Upon his return to Puerto Rico he joined the Puerto Rico National Guard, where he served for four years as Senior Medical Aidman. He returned to the U.P.R., where he received his B.B.A. degree in December 1970 with a major in Quantitative Methods. He received his M.B.A. degree from the Graduate School of Business Administration of the U.P.R. in June 1973 with a major in Quantitative Methods. While at the Graduate School he held the positions of computer laboratory assistant and graduate teaching assistant. In July 1972 he joined the faculty of the School of Business Administration of the U.P.R. at Mayaguez, where he held the rank of instructor and taught courses on quantitative methods for management and introductory statistics. He received a leave of absence from the U.P.R.
in 1974 and entered the Graduate School of the University of Florida to pursue the degree of Doctor of Philosophy. In that year he joined the 301st Field Hospital of the U.S. Army Reserve, where he served as Senior Clinical Specialist. He was honorably discharged from the Armed Forces of the United States in May 1975. From September 1974 to July 1977 he was employed as a Research Assistant in the Department of Management of the University of Florida. During the Summer of 1976 he taught a course on quantitative methods for managers at the Department of Management of that University. Since September 1977 he has been employed as a Research Associate in the Department of Finance of the University of Texas at Austin. He was married to Digna de los Angeles Hernandez on May 26, 1973, and they are the proud parents of a daughter, Angeles Maria.
I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

Christopher B. Barry
Associate Professor of Management

I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

Thomas Hodgson
Associate Professor of Industrial Engineering

I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

Antal Majthay
Associate Professor of Management

I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

Zoran Pop-Stojanovic
Professor of Mathematics
I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

Gary Koehler
Associate Professor of Management

This dissertation was submitted to the Graduate Faculty of the Department of Management in the College of Business Administration and to the Graduate Council, and was accepted as partial fulfillment of the requirements for the degree of Doctor of Philosophy.

June 1978

Dean, Graduate School