Citation
Quantal response assays by inverse regression

Material Information

Title:
Quantal response assays by inverse regression
Alternate title:
Inverse regression, Quantal response assays by
Alternate title:
Regression, Quantal response assays by inverse
Creator:
Dietrich, Frank H
Publication Date:
1975
Language:
English
Physical Description:
viii, 90 leaves ; 28 cm.

Subjects

Subjects / Keywords:
Statistics thesis, Ph.D.
Dissertations, Academic -- Statistics -- UF
Genre:
Academic theses. ( lcgft )

Notes

Thesis:
Thesis--University of Florida.
Bibliography:
Includes bibliographical references (leaves 88-89).
General Note:
Typescript.
General Note:
Vita.
Statement of Responsibility:
by Frank Hain Dietrich II.

Record Information

Source Institution:
University of Florida
Holding Location:
University of Florida
Rights Management:
The University of Florida George A. Smathers Libraries respect the intellectual property rights of others and do not claim any copyright interest in this item. This item may be protected by copyright but is made available here under a claim of fair use (17 U.S.C. §107) for non-profit research and educational purposes. Users of this work have responsibility for determining copyright status prior to reusing, publishing or reproducing this item for purposes other than what is allowed by fair use or other copyright exemptions. Any reuse of this item in excess of fair use or other copyright exemptions requires permission of the copyright holder. The Smathers Libraries would like to learn more about this item and invite individuals or organizations to contact the RDS coordinator (ufdissertations@uflib.ufl.edu) with any additional information they can provide.
Resource Identifier:
025344667 ( ALEPH )
02832918 ( OCLC )

Full Text
QUANTAL RESPONSE ASSAYS BY
INVERSE REGRESSION
By
FRANK HAIN DIETRICH II
A DISSERTATION PRESENTED TO THE GRADUATE COUNCIL OF
THE UNIVERSITY OF FLORIDA IN PARTIAL
FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY
UNIVERSITY OF FLORIDA
1975




To my Mother and Father
for their love and faithful support




ACKNOWLEDGMENTS
I wish to express my deepest thanks to Dr. J. J. Shuster for his expert and helpful guidance in this effort.
I also wish to thank Dr. J. T. McClave for many helpful discussions and comments.
Finally, I wish to thank Mrs. Nancy McDavid for the outstanding job of transforming the rough draft I gave her into this typing masterpiece.




TABLE OF CONTENTS
Page
ACKNOWLEDGMENTS                                              ii
LIST OF TABLES                                               vi
ABSTRACT                                                    vii
CHAPTER
  I   STATEMENT OF THE PROBLEM                                1
      1.0 Preamble                                            1
      1.1 Introduction                                        1
      1.2 History--Previous Methods of Analyzing
          Quantal Response Curves
      1.3 History--Inverse Regression                         8
      1.4 Summary of Results                                 13
  II  INVERSE REGRESSION OF QUANTAL RESPONSE
      ASSAYS: ASYMPTOTIC THEORY                              15
      2.0 Preamble                                           15
      2.1 Introduction                                       15
      2.2 Parametric Model and Estimators                    16
      2.3 Asymptotic Theory                                  18
      2.4 Summary                                            37
  III APPLICATION TO THE ANGLE TRANSFORMATION                38
      3.0 Preamble                                           38
      3.1 Introduction                                       38
      3.2 Estimation of LD(50)                               47
      3.3 Estimation of Relative Potency                     72
      3.4 Test for Parallelism                               76
      3.5 Summary                                            80
  IV  NUMERICAL APPLICATIONS                                 82
      4.0 Preamble
      4.1 Exact Coverage Probability (95%
          Nominal Confidence Interval)                       82




TABLE OF CONTENTS (continued)
CHAPTER Page
  IV (cont.)
      4.2 Estimation of Relative Potency by
          Various Linear Techniques                          85
      4.3 Summary                                            87
BIBLIOGRAPHY                                                 88
BIOGRAPHICAL SKETCH                                          90




LIST OF TABLES
Table Page
3.1  Notation Chart                                          73
4.1  Exact Coverage Probability                              84
4.2  Estimation of Relative Potency                          86




Abstract of Dissertation Presented to the Graduate Council
of the University of Florida in Partial Fulfillment
of the Requirements for the Degree of Doctor of Philosophy
QUANTAL RESPONSE ASSAYS BY INVERSE REGRESSION
By
Frank Hain Dietrich II
August, 1975
Chairman: Dr. Jonathan J. Shuster
Major Department: Statistics
Numerous methods are available to analyze quantal response assays. Some of the more popular methods of analysis are discussed. The general aspects of inverse regression are also discussed.
A general inverse regression procedure for
estimating dose response curves in quantal response assays is presented. Asymptotic distributional properties are developed. Procedures to form (1-α)100% nominal confidence intervals for quantities of interest are given. Methods of testing hypotheses of interest are also developed.
The particular method of applying inverse regression to quantal response assays by use of the angle transformation is presented. The special case is given where, after application of the transformation, the dose response curve is linear. The inverse method has a decided advantage over the more classical methods in




this case, both in flexibility and in ease of application. The procedure will be shown to be fully efficient in the asymptotic sense.
Numerical examples are presented to demonstrate
the applicability of inverse regression to quantal response assays. The numerical examples deal with linear response curves since classical methods of analysis are only applicable in this case. Inverse regression may also be used when the response curves are non-linear.




CHAPTER I
STATEMENT OF THE PROBLEM
1.0 Preamble
In Chapter I the general problems of quantal
response assay and inverse regression are presented. In Section 1.1 we will introduce the classical quantal response problem along with different objectives of a quantal response assay. Section 1.2 gives a history of frequently used methods of analyzing quantal response curves. In Section 1.3 we will discuss the general topic of inverse regression. Inherent in this discussion is a comparison with classical regression. In Section 1.4 we will give a summary of Chapter I along with the results obtained in the remaining chapters.
1.1 Introduction
In the classical quantal response problem subjects (plants, insects, patients, etc.) are subjected to a stimulus (fungicide, insecticide, physical therapy, etc.), and an all or nothing response is recorded. Although it would usually be desirable to measure the response quantitatively, it is often only possible to measure a




response as occurring or not occurring. It is this type of response we will be interested in analyzing.
The stimulus is often referred to as a dose, and the dose is administered at different levels. Generally, we independently sample n_i subjects at dose d_i, i = 1, 2, ..., k. For each dose d_i we are interested in the true fraction of positive responses, p_i. Thus, for each dose level the number of positive responses observed in a sample of n_i subjects is a binomial random variable with probability of success equal to p_i. For each dose d_i we calculate the observed fraction of positive responses, \hat{p}_i, the maximum likelihood estimator of p_i. A quantal response curve is then fit. The response curve is basically found by fitting the fraction of positive responses observed against dose. Usually both the fraction of responses and the doses are transformed before the curve is actually fit. This type of analysis is often used to assess the potency of drugs of all types when it is either impractical or impossible to determine the potency by chemical analysis.
The actual objective of a quantal response assay
may be the solution of one of a number of related problems. An objective of many assays is to estimate LD(100p), the true dose at which 100p% of the subjects have a positive response. In particular, LD(50), called the median lethal dose, is often of prime interest. One reason for this is that it is used in an attempt to classify drugs as to their effectiveness. At one time it was attempted to classify drugs by a minimal lethal dose or a maximal lethal dose. The minimal lethal dose would be the smallest dose at which a positive response is attained for at least one subject. The maximal lethal dose would be the smallest dose at which all the subjects would exhibit a positive response. Needless to say, it would be very difficult to estimate these quantities. For a fixed number of subjects LD(50) can be estimated more accurately than a minimal lethal dose or a maximal lethal dose. Thus, LD(50) is now often used to attempt to measure the effectiveness, or potency, of a drug. There are, however, instances, such as toxicological problems, where doses producing 100% response are of more interest than LD(50).
If two or more drugs are to be compared, it is often done in terms of the relative potency, the ratio of equally effective doses. Even if a new drug is to be compared to a standard, the tolerance of the population may change, and both drugs must be experimented with over the same period of time. Thus, an estimate of relative potency is obtained rather than measuring the performance of the new drug singly and measuring its effectiveness in relation to the standard as an absolute effect. Relative potency is a valuable measure only if it is found that the quantal response curves are parallel, so that the ratio is the same at all equally effective doses. The relative potency is therefore usually measured as the ratio of median effective doses.
Whenever two or more drugs are under consideration in a particular problem, it is desired to know if a mixture of the drugs might be more effective than applying the drugs individually. In general, the joint action of a mixture of drugs can be classified in three categories. The three categories as given by Bliss [1] are independent joint action, similar joint action, and synergistic action.

If drugs have independent joint action, they act independently and have different modes of action. The drugs may or may not be correlated in terms of the susceptibility of one component as compared to another. The potency of the mixture can be predicted from the fitted curve for each drug alone and the correlation in susceptibility to the drugs.

Drugs are classified as having similar joint action if they produce similar effects, so that one component can be substituted at a constant proportion for the other. Variations in individual susceptibility to the drugs are completely correlated, or parallel. The potency of a mixture is predictable from the relative proportions of the individual components.

The last classification is synergistic action. The potency of the mixture cannot be assessed from a knowledge of the individual potencies. It must be based upon a study of their combined potency when used in different proportions. If the potency of the mixture is greater than that expected by studying the drugs singly, the drugs are said to synergize. One drug antagonizes another if the mixture has a smaller potency than expected.
We have now stated the basic problems of interest in a quantal response assay. The next section will deal with a history of methods for analyzing quantal response curves.
1.2 History--Previous Methods of Analyzing Quantal Response Curves

Although numerous methods have been proposed for analyzing quantal response curves, the most frequently used method is probit analysis. A thorough discussion of probit analysis is given by Finney [2].

In the classical quantal response problem we independently sample n_i subjects at dose d_i and obtain \hat{p}_i, the fraction of positive responses, i = 1, 2, ..., k.




In order to use probit analysis in analyzing quantal response curves, the probit of \hat{p}_i, Z_i, is found by the following transformation:

\hat{p}_i = \int_{-\infty}^{Z_i} \frac{1}{\sqrt{2\pi}} e^{-x^2/2}\, dx, \quad i = 1, 2, \ldots, k.   (1.2.1)

Once the probits have been determined, a linear response curve is fit against log dose by iterative weighted least squares.

A procedure similar to probit analysis was suggested by Knudsen and Curtis [5]. Rather than use the probit transformation given in equation (1.2.1), Knudsen and Curtis suggest the use of the angle transformation

Z_i = \arcsin\left(\hat{p}_i^{1/2}\right), \quad i = 1, 2, \ldots, k,   (1.2.2)

where Z_i is recorded in degrees. Once the angle transformation has been performed, a linear response curve is fit against log dose by ordinary least squares if the sample sizes are approximately equal, and by weighted least squares otherwise. For all practical purposes the angle transformation is a linear function of the probit transformation.
Moore and Zeigler [ 7 ] discuss the use of nonlinear regression methods for analyzing quantal response




curves. They demonstrate that any method based on maximum likelihood estimation of appropriate parameters may be formulated as a non-linear regression problem. It should be noted that both probit analysis and the angle transformation are based on maximum likelihood principles, and thus fall in this category. Moore and Zeigler conclude that a reasonably general least squares computer program could replace several specialized quantal response analysis programs.

It has also been pointed out by Nelder [8] that there is an important class of estimation problems which leads to a form of solution which is closely analogous to linear rather than non-linear regression. Basically, the condition which must be satisfied to be in this class of estimation problems is that the first derivative of the likelihood can be put in a form where p, the true fraction of positive responses, is a linear function of the unknown parameters. Again, probit analysis and use of the angle transformation fall into this class of problems. Thus a well-constructed linear regression program could be adapted to cope with this type of problem. Although quantal response assays usually involve discrete distributions, Nelder also shows that the same iterative linear regression procedure can be used on a class of non-linear models which involve continuous rather than discrete distributions.




1.3 History--Inverse Regression

Krutchkoff [6] discusses the general problem of inverse regression, and in particular as it applies to the problem of calibrating an instrument. He uses the example of calibrating a pressure gauge. To calibrate the gauge, one subjects it to two or more controlled pressures and notes the gauge markings. From these data the calibration parameters are estimated, and the gauge is calibrated. Unknown pressures are then estimated by reading the calibrated markings.
If x represents the controlled variable, and y represents the measured variable, then the relationship between x and y can be expressed by the usual linear model,

y = \alpha + \beta x + \epsilon.   (1.3.1)

The classical approach to calibration using model (1.3.1), with k values of x and independent identically distributed errors with zero mean, uses the usual least squares estimates of \alpha and \beta. These estimates are found by

\hat{\beta} = \frac{\sum_{i=1}^{k} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{k} (x_i - \bar{x})^2}   (1.3.2)

and

\hat{\alpha} = \bar{y} - \hat{\beta}\bar{x},   (1.3.3)

where

\bar{x} = \frac{1}{k}\sum_{i=1}^{k} x_i \quad \text{and} \quad \bar{y} = \frac{1}{k}\sum_{i=1}^{k} y_i.   (1.3.4)

The least squares line is then represented by

\hat{y} = \hat{\alpha} + \hat{\beta} x,   (1.3.5)

and the calibration equation is

x = \frac{y - \hat{\alpha}}{\hat{\beta}}.   (1.3.6)

Thus, from a gauge reading of Y, the classical estimate, \hat{X}_c, of the pressure is

\hat{X}_c = \frac{Y - \hat{\alpha}}{\hat{\beta}}.   (1.3.7)

In the inverse approach, model (1.3.1) is rewritten as

x = \gamma + \delta y + \epsilon',   (1.3.8)

where \gamma = -\alpha/\beta, \delta = 1/\beta, and \epsilon' = -\epsilon/\beta. Again, the usual least squares estimates of \gamma and \delta are found by

\hat{\delta} = \frac{\sum_{i=1}^{k} (y_i - \bar{y})(x_i - \bar{x})}{\sum_{i=1}^{k} (y_i - \bar{y})^2}   (1.3.9)

and

\hat{\gamma} = \bar{x} - \hat{\delta}\bar{y}.   (1.3.10)

The least squares line is now one and the same as the calibration equation and is expressed by

\hat{x} = \hat{\gamma} + \hat{\delta} y.   (1.3.11)

Thus, using inverse regression, for a reading Y of the gauge the inverse regression estimate, \hat{X}_I, of the pressure is

\hat{X}_I = \hat{\gamma} + \hat{\delta} Y.   (1.3.12)
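To make the two calibration estimates concrete, here is a small numerical sketch (the calibration data and the gauge reading are invented for illustration): the classical estimate inverts the fitted line of y on x, while the inverse estimate regresses x directly on y.

```python
import numpy as np

# Controlled pressures x and gauge readings y (hypothetical calibration data).
x = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
y = np.array([10.8, 19.5, 31.2, 39.4, 51.1])

x_bar, y_bar = x.mean(), y.mean()

# Classical approach (1.3.2)-(1.3.7): fit y = alpha + beta*x, then invert.
beta_hat = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
alpha_hat = y_bar - beta_hat * x_bar

# Inverse approach (1.3.9)-(1.3.12): fit x = gamma + delta*y directly.
delta_hat = np.sum((y - y_bar) * (x - x_bar)) / np.sum((y - y_bar) ** 2)
gamma_hat = x_bar - delta_hat * y_bar

Y = 25.0  # a new gauge reading whose true pressure is unknown
X_classical = (Y - alpha_hat) / beta_hat     # classical estimate (1.3.7)
X_inverse = gamma_hat + delta_hat * Y        # inverse estimate (1.3.12)

print(f"classical estimate: {X_classical:.3f}")
print(f"inverse estimate:   {X_inverse:.3f}")
```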
The estimates given by equations (1.3.7) and (1.3.12) are not generally the same. It is therefore of interest to judge which estimate is better by the use of certain criteria. Krutchkoff uses the criterion of mean square error to judge the relative effectiveness of the estimates.




Krutchkoff concludes on the basis of a Monte Carlo study (in which values of \hat{\beta} smaller than .001 in absolute value were replaced by ±.001 as appropriate) that the mean square error of the inverse estimate is uniformly less than that of the classical estimate. The Monte Carlo study involved different parameter values, different variances, different designs, and normal as well as non-normal error distributions. Thus, on the basis of mean square error, it appears that the inverse estimate is more desirable than the classical estimate.
Williams [11] points out that under the assumption of normally distributed errors the classical estimate has undefined expectation and infinite variance, and hence infinite mean square error. Under the same assumption the inverse estimate has finite mean square error. Thus, Williams concludes that the inverse estimate is better
than the classical estimate from the mean square error point of view.
Williams goes on to point out, however, that this
conclusion is not very satisfying. He reaches this conclusion because all that was shown is that the mean square error of the inverse estimate is less than infinity. He questions using mean square error at all as a criterion for comparing the two estimators.




It is also of interest to note that Williams shows that there is no unbiased estimator with finite variance.

In somewhat the same spirit as Williams, Halperin [4] notes that a random drawing from any distribution with finite variance would provide a better estimate than the classical estimate in the mean square error sense.
Rather than dwelling on the mean square error argument, Halperin considers the criterion of relative "closeness" of two estimators, \hat{X}_1 and \hat{X}_2, to X. Here "closeness" is in the Pitman sense. That is, \hat{X}_1 is a closer estimate of X than \hat{X}_2 if, for all X,
P[|\hat{X}_1 - X| < |\hat{X}_2 - X|] > 1/2.

Halperin shows that the inverse estimate is a closer estimator than the classical estimate for all values in a closed interval of X. This interval depends on quantities such as the slope, the error variance, \bar{x}, and the sample size. It turns out that if the slope is large relative to the error standard deviation, if Y is well determined, or if the values of the independent variable are widely dispersed, the estimates are indistinguishable.
Saw [10] shows that for any distribution on the errors, the slope of the inverse regression line is always of the same sign, but greater modulus, than the slope of the classical line. Thus, at X = \bar{x}, the inverse estimate is closer to X than the classical estimate with probability one.




Saw goes on to point out that any line through (\bar{x}, \bar{y}) with slope of the same sign, but greater modulus, than the classical regression line will perform better (as an estimate of X) than will the classical estimate within some neighborhood of \bar{x}.

A similar statement can be made in reference to the inverse regression line. Thus, there exists no best way to estimate X uniformly over an interval of X. This being the case, Saw concludes the specific use of inverse calibration is unappealing.
1.4 Summary of Results
In Chapter I we have presented the general problem of quantal response assay. We have also discussed the general method of inverse regression. We have chosen to apply inverse regression to quantal response assays for a number of reasons.
In quantal response assays we usually seek solutions of

F(\text{dose}) = p   (1.4.1)

and relationships among such solutions, for two or more drugs. The following criticisms apply to the classical approach: 1) the least squares process minimizes the residual sum of squares in the transformed probability scale (vertical), while estimates of the solutions of (1.4.1) have errors measured in the log-dose scale (horizontal).




2) Serious problems occur in estimating solutions to (1.4.1) when we model

F(\text{probability of response}) = \sum_{r=0}^{m} \beta_r (\log \text{dose})^r.   (1.4.2)

The main problem is that the solutions to (1.4.1) may not exist, or may not be unique when they do exist. Thus, the classical approach is pretty well limited to situations where a linear relationship exists (that is, to situations where m = 1 in (1.4.2)).
In Chapter II we will develop the general theory necessary to apply inverse regression to find solutions to (1.4.2) for general m. The solutions will minimize the residual sum of squares in the log-dose scale.
In Chapter III we will use the angle transformation with inverse regression to develop a particular method of analysis.
Chapter IV will give some numerical applications of the methods developed in the preceding chapters. By actual application of our results it is seen that inverse regression offers the most elementary computations as compared to other methods.




CHAPTER II
INVERSE REGRESSION OF QUANTAL RESPONSE
ASSAYS: ASYMPTOTIC THEORY
2.0 Preamble
In Section 2.1 we will introduce the basic reasons for studying the asymptotic theory. Section 2.2 will deal with developing a parametric model along with estimation of population parameters. In Section 2.3 we will develop the asymptotic distribution of the estimators. We will also develop methods of forming confidence intervals and testing hypotheses of interest. Section 2.4 will be a summary of the results.
2.1 Introduction
As previously stated, the classical quantal response assay consists of independently sampling n_i subjects at dose d_i and obtaining the fraction of positive responses, \hat{p}_i, i = 1, 2, ..., k. If only one drug is of interest in the assay, it is often of interest to estimate LD(100p), the true dose at which 100p percent of the subjects exhibit a positive response. In particular, LD(50) is a quantity often estimated.




If more than one drug is involved in the assay, other aspects of the analysis may be of interest. It is often desirable to compare LD(50) values in terms of relative potency, the ratio of the true LD(50) values. If the assay involves drug mixtures, it is of interest to know if one drug synergizes or antagonizes the other.
In order to use inverse regression to analyze a
quantal response assay, a model for the problem is necessary. We will develop a parametric model for the classical quantal response assay. Once the model has been formulated, estimators of population parameters will be developed. Since confidence intervals for, or test hypotheses about, population quantities are of interest, the asymptotic distribution of the estimators will be studied. The results of the asymptotic theory will be stated in terms of linear combinations of the estimators. From this, confidence intervals and tests of hypotheses of interest will follow.
2.2 Parametric Model and Estimators
We will now develop a parametric model to express the relationship between the observed fraction of positive responses at different doses and the corresponding true fraction. In order to do this certain matrices and their relationships will be defined.




Let M be a k × r matrix with r ≤ k. M will be of rank r and will usually consist of two different types of elements. M will contain elements which are functions of the true fractions of positive response. M may also contain dummy variables. A k × r matrix Y_n will be of a form similar to M. If M contains a dummy variable in position m_{ij}, Y_n will contain the same element in position y_{ij}. The remaining elements of Y_n will be the maximum likelihood estimates of the corresponding elements of M. Thus, rather than containing functions of the true fraction of positive responses, as M does, Y_n will contain the corresponding functions of the observed fraction of positive responses. For a k × 1 (transformed) dose vector, x, we hypothesize the following relationship:

x = M\beta,   (2.2.1)

where \beta is an r × 1 vector of parameters. Let E_n be a k × r matrix such that \{e_n^{(i)}\}, the rows of E_n, are independent random vectors. If we let n be a linear function of the n_i, i = 1, 2, \ldots, k, we will assume that

n^{1/2} e_n^{(i)} \xrightarrow{L} N_r(0, V_i) \quad \text{as } n \to \infty,   (2.2.2)

where N_r represents an r-variate normal random variable, and V_i is a continuous matrix function of M.




With all matrices defined as above, we propose the following model for the relationship between the observed fraction of positive responses and the true fraction of positive responses:

Y_n = M + E_n.   (2.2.3)

Multiplying equation (2.2.3) on the right by \beta,

Y_n \beta = M\beta + E_n \beta.   (2.2.4)

Using the relationship given in (2.2.1) we see that (2.2.4) can be rewritten as

Y_n \beta = x + E_n \beta,   (2.2.5)

or equivalently as

x = Y_n \beta - E_n \beta.   (2.2.6)

Thus, using the unweighted least squares estimator of \beta we obtain

\hat{\beta}_n = (Y_n' Y_n)^{-1} Y_n' x.   (2.2.7)
2.3 Asymptotic Theory

Now that a parametric model has been developed, along with the estimators of its parameters, we will obtain the limiting distribution of the quantity




T_n = n^{1/2} (\ell' \hat{\beta}_n - \ell' \beta),   (2.3.1)

where \ell is an r × 1 specified vector.
Before we actually find the asymptotic distribution of Tn, we will first introduce some lemmas needed in later proofs. The first three lemmas may be found in Rao [ 9].
Lemma 2.3.1
Let \{X_n, Y_n\}, n = 1, 2, \ldots, be a sequence of pairs of random variables. Then

|X_n - Y_n| \xrightarrow{P} 0, \; Y_n \xrightarrow{L} Y \;\Rightarrow\; X_n \xrightarrow{L} Y,   (2.3.2)

that is, the limiting distribution of X_n exists and is the same as that of Y.

Lemma 2.3.2

Let \{X_n, Y_n\}, n = 1, 2, \ldots, be a sequence of pairs of random variables. Then:

(a) X_n \xrightarrow{L} X, \; Y_n \xrightarrow{P} 0 \;\Rightarrow\; X_n Y_n \xrightarrow{P} 0.   (2.3.3)

(b) X_n \xrightarrow{L} X, \; Y_n \xrightarrow{P} c \;\Rightarrow\; X_n + Y_n \xrightarrow{L} X + c,   (2.3.4)

\Rightarrow\; X_n Y_n \xrightarrow{L} cX,   (2.3.5)

\Rightarrow\; X_n / Y_n \xrightarrow{L} X/c, \quad \text{if } c \neq 0.   (2.3.6)




Lemma 2.3.3
Let g be a continuous function. Then:

(a) X_n \xrightarrow{L} X \;\Rightarrow\; g(X_n) \xrightarrow{L} g(X).   (2.3.7)

(b) X_n \xrightarrow{P} X \;\Rightarrow\; g(X_n) \xrightarrow{P} g(X).   (2.3.8)

(c) X_n - Y_n \xrightarrow{P} 0, \; Y_n \xrightarrow{L} Y \;\Rightarrow\; g(X_n) - g(Y_n) \xrightarrow{P} 0.   (2.3.9)

Lemma 2.3.4

Let g be a continuous matrix-valued function of Y_n, a matrix. Then

Y_n \xrightarrow{P} M \;\Rightarrow\; g(Y_n) \xrightarrow{P} g(M).   (2.3.10)

Proof

Since g is a continuous matrix-valued function of Y_n, we can let \epsilon > 0 be arbitrary and let \delta > 0 be such that

\|Y_n - M\| < \delta \;\Rightarrow\; \|g(Y_n) - g(M)\| < \epsilon.   (2.3.11)

Then

P[\|g(Y_n) - g(M)\| < \epsilon] \geq P[\|Y_n - M\| < \delta] \to 1 \quad \text{as } n \to \infty,   (2.3.12)

since Y_n \xrightarrow{P} M.




Since \epsilon is arbitrary, g(Y_n) \xrightarrow{P} g(M). This completes the proof.
Lemma 2.3.5
Let

E = \begin{pmatrix} e_{11} & e_{12} & \cdots & e_{1r} \\ e_{21} & e_{22} & \cdots & e_{2r} \\ \vdots & & & \vdots \\ e_{k1} & e_{k2} & \cdots & e_{kr} \end{pmatrix}   (2.3.13)

be a k × r matrix of random variables such that the asymptotic distribution of n^{1/2} e^{(i)}, the i-th row vector of n^{1/2} E, has variance-covariance matrix V_i, i = 1, 2, \ldots, k. Assume that ACov(n^{1/2} e_{ij}, n^{1/2} e_{i'j'}) = 0 for i \neq i', where ACov is the covariance of the asymptotic distribution. Let

a' = (a_1, a_2, \ldots, a_k),   (2.3.14)

b' = (b_1, b_2, \ldots, b_r),   (2.3.15)

c' = (c_1, c_2, \ldots, c_k),   (2.3.16)

and

d' = (d_1, d_2, \ldots, d_r)   (2.3.17)

be vectors of constants. Then

AVar[n^{1/2} a' E b - n^{1/2} c' E d] = \sigma^2,   (2.3.18)

where

\sigma^2 = \sum_{i=1}^{k} \left[a_i^2\, b' V_i b + c_i^2\, d' V_i d - 2 a_i c_i\, b' V_i d\right]   (2.3.19)

and AVar refers to the variance of the asymptotic distribution.

Proof

n^{1/2} a' E b = n^{1/2} \sum_{i=1}^{k} \sum_{j=1}^{r} a_i b_j e_{ij}   (2.3.20)

and

n^{1/2} c' E d = n^{1/2} \sum_{i=1}^{k} \sum_{j=1}^{r} c_i d_j e_{ij}.   (2.3.21)

Thus,

\sigma^2 = AVar\left[\sum_{i=1}^{k} \sum_{j=1}^{r} (a_i b_j - c_i d_j)\, n^{1/2} e_{ij}\right]
= \sum_{i=1}^{k} \sum_{j=1}^{r} (a_i b_j - c_i d_j)^2\, AVar(n^{1/2} e_{ij})
+ \sum_{H} (a_i b_j - c_i d_j)(a_{i'} b_{j'} - c_{i'} d_{j'})\, ACov(n^{1/2} e_{ij},\, n^{1/2} e_{i'j'}),   (2.3.22)

where

H = \{(i, j, i', j'):\; (i, j) \neq (i', j'),\; i, i' = 1, \ldots, k;\; j, j' = 1, \ldots, r\}.

Since ACov(n^{1/2} e_{ij}, n^{1/2} e_{i'j'}) = 0 for i \neq i', equation (2.3.22) can be written as

\sigma^2 = \sum_{i=1}^{k} \sum_{j=1}^{r} (a_i b_j - c_i d_j)^2\, AVar(n^{1/2} e_{ij})
+ \sum_{H_1} (a_i b_j - c_i d_j)(a_i b_{j'} - c_i d_{j'})\, ACov(n^{1/2} e_{ij},\, n^{1/2} e_{ij'}),

where

H_1 = \{(i, j, j'):\; j \neq j',\; i = 1, \ldots, k;\; j, j' = 1, \ldots, r\}.

Collecting terms gives

\sigma^2 = \sum_{i=1}^{k} \left[a_i^2\, b' V_i b + c_i^2\, d' V_i d - 2 a_i c_i\, b' V_i d\right].

This completes the proof.
We are now ready to derive the asymptotic distribution of T_n as given in (2.3.1).

Theorem 2.3.1

Under the conditions specified in Section 2.2,

T_n \xrightarrow{L} N(0, \sigma^2) \quad \text{as } n \to \infty,   (2.3.23)




where

\sigma^2 = \sum_{i=1}^{k} \left[a_i^2\, b' V_i b + c_i^2\, d' V_i d - 2 a_i c_i\, b' V_i d\right],   (2.3.24)

with

a' = (a_1, \ldots, a_k) = x'\{I - M(M'M)^{-1} M'\},   (2.3.25)

b = (M'M)^{-1} \ell,   (2.3.26)

c' = (c_1, \ldots, c_k) = \ell'(M'M)^{-1} M',   (2.3.27)

and

d = (M'M)^{-1} M' x.   (2.3.28)

Proof
From equation (2.2.1) we see that

\beta = (M'M)^{-1} M' x.   (2.3.29)

Using equations (2.2.7) and (2.3.29) we obtain

\ell'(\hat{\beta}_n - \beta) = \ell'\{(Y_n' Y_n)^{-1} Y_n' x - (M'M)^{-1} M' x\}.   (2.3.30)




From equation (2.2.3) we observe that

Y_n' = M' + E_n'   (2.3.31)

and

Y_n' Y_n = M'M + M'E_n + E_n'M + E_n'E_n.   (2.3.32)

Substituting (2.3.31) and (2.3.32) into (2.3.30) yields

\ell'(\hat{\beta}_n - \beta) = \ell'(M'M + M'E_n + E_n'M + E_n'E_n)^{-1}(M'x + E_n'x) - \ell'(M'M)^{-1} M'x.   (2.3.33)

By making use of the identity

(U + V)^{-1} = (I + U^{-1}V)^{-1} U^{-1},   (2.3.34)

and letting

U = M'M, \quad V = M'E_n + E_n'M + E_n'E_n,   (2.3.35)

(2.3.33) can be written as

\ell'(\hat{\beta}_n - \beta) = \ell'\left[I + (M'M)^{-1}(M'E_n + E_n'M + E_n'E_n)\right]^{-1}\left[(M'M)^{-1}M'x + (M'M)^{-1}E_n'x\right] - \ell'(M'M)^{-1}M'x.   (2.3.36)




The identity

(I + V)(I - V) = I - V^2   (2.3.37)

implies that

(I + V)^{-1} = (I - V) + (I + V)^{-1} V^2.   (2.3.38)

Recalling that n^{1/2} e_n^{(i)} \xrightarrow{L} N_r(0, V_i) as n \to \infty, and letting V be as defined in (2.3.35), we observe that

n^{1/2 - \delta}\, V \xrightarrow{P} 0, \quad \forall\, \delta > 0.   (2.3.39)

Thus, from Lemma 2.3.3,

n^{1 - \delta}\, V^2 \xrightarrow{P} 0, \quad \forall\, \delta > 0.   (2.3.40)

Combining the results of (2.3.38), (2.3.39), and (2.3.40) and applying Lemma 2.3.2, we obtain

(I - V) = (I + V)^{-1} + O_P(n^{-1}),   (2.3.41)

or

(I + V)^{-1} = (I - V) + O_P(n^{-1}).   (2.3.42)

Using the relationship given in (2.3.42) and applying Lemma 2.3.2, equation (2.3.36) can be written as




\ell'(\hat{\beta}_n - \beta)
= \ell'(M'M)^{-1} M'x - \ell'(M'M)^{-1} M'x
+ \ell'(M'M)^{-1} E_n' x
- \ell'(M'M)^{-1}(M'E_n + E_n'M + E_n'E_n)(M'M)^{-1} M'x
- \ell'(M'M)^{-1}(M'E_n + E_n'M + E_n'E_n)(M'M)^{-1} E_n'x + O_P(n^{-1}).   (2.3.43)

Since any terms involving E_n'E_n are of order n^{-1}, application of Lemma 2.3.2 reduces equation (2.3.43) to

\ell'(\hat{\beta}_n - \beta)
= -\ell'(M'M)^{-1} M'E_n(M'M)^{-1} M'x - \ell'(M'M)^{-1} E_n'M(M'M)^{-1} M'x + \ell'(M'M)^{-1} E_n'x + O_P(n^{-1})
= \ell'(M'M)^{-1} E_n'\left[I - M(M'M)^{-1}M'\right]x - \ell'(M'M)^{-1} M'E_n(M'M)^{-1} M'x + O_P(n^{-1})
= x'\left[I - M(M'M)^{-1}M'\right]E_n(M'M)^{-1}\ell - \ell'(M'M)^{-1} M'E_n(M'M)^{-1} M'x + O_P(n^{-1})
= a'E_n b - c'E_n d + O_P(n^{-1}),   (2.3.44)

where a', b, c', and d are given in (2.3.25), (2.3.26), (2.3.27), and (2.3.28), respectively.

Thus,

n^{1/2}\, \ell'(\hat{\beta}_n - \beta) = n^{1/2} a'E_n b - n^{1/2} c'E_n d + O_P(n^{-1/2}).   (2.3.45)

From Lemma 2.3.1, we observe that both sides of equation (2.3.45) have the same limiting distribution. Thus, from the asymptotic properties of E_n and the application of Lemma 2.3.3 and Lemma 2.3.5, we observe that

T_n \xrightarrow{L} N(0, \sigma^2),   (2.3.46)

where \sigma^2 is as given in (2.3.24). This completes the proof.
In Theorem 2.3.1 the asymptotic variance, \sigma^2, was given in terms of M. Elements of M involve the true fractions of positive responses. Since the true fractions of positive responses are unknown in practical situations, we will wish to estimate them and obtain a consistent estimator of \sigma^2.




Corollary 2.3.1

By substituting Y_n for M in equation (2.3.24), including the V_i terms, we obtain \hat{\sigma}_n^2, a consistent estimator of \sigma^2, and hence

\hat{\sigma}_n^{-1}\, T_n \xrightarrow{L} N(0, 1).   (2.3.47)
Proof
Chebyshev's Inequality states that for any random variable X with mean \mu and variance \sigma^2,

P(|X - \mu| \geq \lambda\sigma) \leq \frac{1}{\lambda^2}, \quad \lambda > 0.   (2.3.48)

For a binomial fraction \hat{p}_i, the maximum likelihood estimator of p_i, the mean is p_i and the variance is p_i q_i / n_i. Thus, using Chebyshev's Inequality we observe that

P\left(|\hat{p}_i - p_i| \geq \lambda\, (p_i q_i / n_i)^{1/2}\right) \leq \frac{1}{\lambda^2}.   (2.3.49)

Therefore, for any \epsilon > 0,

\lim_{n_i \to \infty} P(|\hat{p}_i - p_i| \geq \epsilon) = 0.   (2.3.50)

Thus, \hat{p}_i converges to p_i in probability.




Recalling that Y_n is identical to M except that where M contains functions of p_i, Y_n contains the same functions of \hat{p}_i, from Lemma 2.3.3 we observe that each element of Y_n converges in probability to the corresponding element of M. That is,

Y_n \xrightarrow{P} M.   (2.3.51)

By application of Lemma 2.3.4, we observe that

\hat{\sigma}_n^2 \xrightarrow{P} \sigma^2,   (2.3.52)

where \hat{\sigma}_n^2 is found by substituting Y_n for M in equation (2.3.24). Since

T_n \xrightarrow{L} N(0, \sigma^2),   (2.3.53)

Lemma 2.3.2 justifies that

\hat{\sigma}_n^{-1}\, T_n \xrightarrow{L} N(0, 1).   (2.3.54)

This completes the proof.
Now that the asymptotic distribution of the estimators has been developed, we will give a nominal (1-\alpha)100\% confidence interval for \ell'\beta. We will give a confidence interval of this form because many of the




estimation problems of interest can be phrased in terms of linear combinations of the parameters.

Corollary 2.3.2

\ell'\hat{\beta}_n \pm z_{\alpha/2}\, \hat{\sigma}_n\, n^{-1/2}   (2.3.55)

forms a nominal (1-\alpha)100\% confidence interval for \ell'\beta, where z_{\alpha/2} is such that

P[Z > z_{\alpha/2}] = \alpha/2   (2.3.56)

when Z is the standard normal random variable.

Proof

Corollary 2.3.1 implies that

n^{1/2}\, \hat{\sigma}_n^{-1}\left[\ell'\hat{\beta}_n - \ell'\beta\right] \xrightarrow{L} N(0, 1).   (2.3.57)

Thus, as n \to \infty,

P\left[-z_{\alpha/2} \leq n^{1/2}\, \hat{\sigma}_n^{-1}(\ell'\hat{\beta}_n - \ell'\beta) \leq z_{\alpha/2}\right] \to 1 - \alpha.   (2.3.58)

Therefore, in the asymptotic sense,

\ell'\hat{\beta}_n \pm z_{\alpha/2}\, \hat{\sigma}_n\, n^{-1/2}   (2.3.59)

forms a nominal (1-\alpha)100\% confidence interval for \ell'\beta. This completes the proof.
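For a specified vector \ell, the interval (2.3.55) is immediate once \ell'\hat{\beta}_n and \hat{\sigma}_n are available. A minimal sketch follows; the numerical values are placeholders standing in for quantities computed from an actual assay.

```python
from statistics import NormalDist

# Placeholder values; in practice these come from the fitted assay.
ell_beta_hat = 0.42   # ell' * beta_hat_n
sigma_hat_n = 8.5     # consistent estimate from Corollary 2.3.1
n = 20                # n = (sum of n_i) / k
alpha = 0.05

z = NormalDist().inv_cdf(1 - alpha / 2)          # z_{alpha/2}
half_width = z * sigma_hat_n * n ** -0.5
print(f"nominal {100 * (1 - alpha):.0f}% CI: "
      f"({ell_beta_hat - half_width:.3f}, {ell_beta_hat + half_width:.3f})")
```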
Although estimation is often of prime importance, it may also be of interest to test hypotheses of the general form

H_0:\; A\beta = 0,   (2.3.60)

where A is a q × r matrix with rank q (1 \leq q \leq r).

We will now develop a test statistic appropriate for this general hypothesis. In order to achieve this end, we will consider the asymptotic distribution of \hat{\beta}_n in terms of a multivariate normal framework. We will first give a definition and two lemmas from Rao [9].
Definition 2.3.1
A p-dimensional random variable U, that is, a random variable U taking values in E_p (Euclidean space of dimension p), is said to have a p-variate normal distribution, N_p, if and only if every linear function of U has a univariate normal distribution.
Lemma 2.3.6
If U has a p-variate normal distribution, then the joint distribution of q linear functions of U is N_q. Let U have mean vector \mu and dispersion matrix \Sigma. If Y = CU, where C is (q × p), represents the q linear functions, then Y has mean vector C\mu and dispersion matrix C\Sigma C'.
Lemma 2.3.7
Let U be p-variate normal with mean vector \mu and dispersion matrix \Sigma. Then the necessary and sufficient condition that

Q = (U - \mu)' R (U - \mu)   (2.3.61)

has a chi-squared distribution with k degrees of freedom is

\Sigma (R \Sigma R - R) \Sigma = 0,   (2.3.62)

in which case

k = \text{trace}(R\Sigma).   (2.3.63)
Theorem 2.3.2
n^{1/2} (\hat{\beta}_n - \beta) \xrightarrow{L} N_r(0, \Omega),   (2.3.64)

where

\Omega_{ij} = \tfrac{1}{2}\,(\text{coefficient of } \ell_i \ell_j), \quad i \neq j,   (2.3.65)

in (2.3.24), and

\Omega_{ii} = \text{coefficient of } \ell_i^2   (2.3.66)

in (2.3.24). \hat{\Omega}_n is similarly obtained by using \hat{\sigma}_n^2 as defined in Corollary 2.3.1. \hat{\Omega}_n is a consistent estimator of \Omega.

Proof

In Theorem 2.3.1 we proved that every linear combination of n^{1/2}(\hat{\beta}_n - \beta) is asymptotically univariate normal. Thus, since n^{1/2}(\hat{\beta}_n - \beta) is an r-dimensional random variable, by Definition 2.3.1 in the asymptotic sense n^{1/2}(\hat{\beta}_n - \beta) has an r-variate normal distribution.

The mean vector and dispersion matrix follow directly from Theorem 2.3.1. It should be noted that the elements of \Omega are defined as they are because, in (2.3.24),

\text{coefficient of } \ell_i \ell_j = 2\, ACov\left(n^{1/2}\hat{\beta}_{ni},\, n^{1/2}\hat{\beta}_{nj}\right),   (2.3.67)

where \hat{\beta}_{ni} is the i-th element of the vector \hat{\beta}_n, and

\text{coefficient of } \ell_i^2 = AVar\left(n^{1/2}\hat{\beta}_{ni}\right).   (2.3.68)




It again follows from Lemma 2.3.3 that each element of \hat{\Omega}_n converges in probability to the corresponding element of \Omega. Thus,

\hat{\Omega}_n \xrightarrow{P} \Omega,   (2.3.69)

and \hat{\Omega}_n is thus a consistent estimator. This completes the proof.

Corollary 2.3.3

To test the general hypothesis

H_0:\; A\beta = 0,   (2.3.70)

the test statistic is

n (A\hat{\beta}_n)' (A \hat{\Omega}_n A')^{-1} (A\hat{\beta}_n) \xrightarrow{L} \chi^2_q.   (2.3.71)

Proof

From Lemma 2.3.6 and Theorem 2.3.2,

n^{1/2} A \hat{\beta}_n \xrightarrow{L} N_q(0, A \Omega A')

if A\beta = 0. Thus, from Lemma 2.3.7, if A\beta = 0,

n (A\hat{\beta}_n)' (A \Omega A')^{-1} (A\hat{\beta}_n) \xrightarrow{L} \chi^2_q,

since in the notation used in Lemma 2.3.7

\Sigma = A \Omega A'   (2.3.72)

and

R = \Sigma^{-1} = (A \Omega A')^{-1}.   (2.3.73)

Thus,

\Sigma (R \Sigma R - R) \Sigma = \Sigma(\Sigma^{-1} - \Sigma^{-1})\Sigma = 0,   (2.3.74)

and

\text{trace}(R\Sigma) = \text{trace}(\Sigma^{-1}\Sigma) = \text{trace}(I_q) = q.   (2.3.75)

Application of Lemma 2.3.2 implies that if A\beta = 0,

n (A\hat{\beta}_n)' (A \hat{\Omega}_n A')^{-1} (A\hat{\beta}_n) \xrightarrow{L} \chi^2_q.   (2.3.76)

This completes the proof.
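The quadratic form in (2.3.71) is a Wald-type statistic. The sketch below assumes \hat{\beta}_n and \hat{\Omega}_n have already been computed (the numbers shown are placeholders) and, assuming SciPy is available, converts the statistic to a p-value against the \chi^2_q reference distribution.

```python
import numpy as np
from scipy.stats import chi2

# Placeholder estimates; in practice beta_hat and Omega_hat come from the assay.
beta_hat = np.array([0.2, 0.011, 0.25, 0.013])
Omega_hat = np.diag([4.0, 0.001, 5.0, 0.001])
n = 25                                   # n = (sum of n_i) / k

# Hypothesis A beta = 0 with A of full row rank q.
A = np.array([[1.0, 0.0, -1.0, 0.0],
              [0.0, 1.0, 0.0, -1.0]])
q = A.shape[0]

Ab = A @ beta_hat
middle = np.linalg.inv(A @ Omega_hat @ A.T)
stat = n * Ab @ middle @ Ab              # n (A b)' (A Omega A')^{-1} (A b)
p_value = chi2.sf(stat, df=q)
print(f"chi-square statistic = {stat:.3f} on {q} df, p = {p_value:.4f}")
```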




2.4 Summary

We have now developed the general asymptotic theory necessary to solve problems of interest in classical quantal response assays. In Chapter III we will apply these general results to the particular use of the angle transformation. In particular we will discuss estimation of LD(100p) and relative potency. We will also discuss testing the hypothesis of parallelism.




CHAPTER III
APPLICATION TO THE ANGLE TRANSFORMATION

3.0 Preamble

In Section 3.1 we will discuss the rationale behind choosing the angle transformation to apply the method of inverse regression. We will also justify the proposed model. In Section 3.2 we will discuss the estimation of LD(50). We will include results for the case in which the relationship between the log-dose and the transformed response is linear. In Section 3.3 we will discuss the estimation of relative potency, and in Section 3.4 we will give a test for parallelism. Section 3.5 will be a summary of the chapter.

3.1 Introduction

The deterministic model we propose relates the log-dose to a polynomial in the angle transformation of the true response probability,

\log d_i = \sum_{j=1}^{r} \beta_j \left[\sin^{-1}\left(p_i^{1/2}\right)\right]^{j-1}, \quad i = 1, 2, \ldots, k.   (3.1.1)

We will employ weighted least squares to




fit the model. Appealing to the notation developed in Section 2.2, the probabilistic parametric model we will thus be using is

Y_n = M + E_n,   (3.1.2)

where the elements of Y_n and M are defined by

Y_{nij} = w_i^{1/2} \left[\sin^{-1}\left(\hat{p}_i^{1/2}\right)\right]^{j-1}, \quad 1 \leq i \leq k, \; 1 \leq j \leq r, \; r < k,   (3.1.3)

and

M_{ij} = w_i^{1/2} \left[\sin^{-1}\left(p_i^{1/2}\right)\right]^{j-1}, \quad 1 \leq i \leq k, \; 1 \leq j \leq r,   (3.1.4)

with

w_i = n_i / n, \quad n = \sum_{i=1}^{k} n_i / k.   (3.1.5)

The form of E_n will follow shortly.




Justification of the model proposed in (3.1.1) will now be given. A brief discussion of classical methods will first be given.

Classical methods of analyzing quantal response assays are applicable when there is a linear relationship between the log-dose and the transformed fraction of positive responses. Probit analysis, for example, produces such a linear relationship when tolerances (measured in the log-dose scale) have a normal distribution. The tolerance of a subject is the dose level at which that subject would exhibit a positive response. Needless to say, not all quantal response assays have a normal distribution of tolerances. For assays such as these, probit analysis is not appropriate.
Although for a non-normal distribution of tolerances another transformation might produce a linear relationship, it would be desirable to find one method of analysis which would be appropriate for a wide class of tolerance distributions.
In many quantal response problems, the fraction of positive responses is monotonically increasing with respect to dose in the region of experimentation of interest. This, of course, implies that dose is monotonically increasing with respect to the fraction of positive responses. Both log-dose and \sin^{-1}(p^{1/2}) are monotone functions. Thus, p_1 < p_2 and d_1 < d_2 if and only if \log d_1 < \log d_2 and \sin^{-1}(p_1^{1/2}) < \sin^{-1}(p_2^{1/2}). Thus, dose is a monotonically increasing




function of the fraction of positive responses if and only if log-dose is a monotonically increasing function of \sin^{-1}(p^{1/2}). In this case, it would be appropriate to model the relationship between log-dose and \sin^{-1}(p^{1/2}) by a polynomial, which is the model given in (3.1.1).
In conclusion, the model given in (3.1.1) is appropriate regardless of the distribution of tolerances, as long as the fraction of positive responses is an increasing function of the dose. Thus, the inverse regression approach is applicable to a much wider class of quantal response assays than classical methods.
Theorem 3.1.1 (Mean Value Theorem)
If f is continuous on [a, b], where a < b, and differentiable on (a, b), then there exists a point c \in (a, b) such that

f(b) - f(a) = (b - a) f'(c).   (3.1.6)

We will now use Theorem 3.1.1 to prove other useful results.
Theorem 3.1.2
Let a random sample of size m be taken from a binomial population with parameter p. Then

m^{1/2}\left[\sin^{-1}\left(\hat{p}^{1/2}\right) - \sin^{-1}\left(p^{1/2}\right)\right] \xrightarrow{L} N(0, 1/4),   (3.1.7)

where \sin^{-1}(\hat{p}^{1/2}) and \sin^{-1}(p^{1/2}) are measured in radians.




Proof

Let

f(x) = \sin^{-1}\left(x^{1/2}\right).   (3.1.8)

Thus,

f'(x) = \tfrac{1}{2}\left[x(1 - x)\right]^{-1/2}.   (3.1.9)

From Theorem 3.1.1, there exists a c such that |p - c| \leq |p - \hat{p}| and

m^{1/2}\left[\sin^{-1}\left(\hat{p}^{1/2}\right) - \sin^{-1}\left(p^{1/2}\right)\right] = m^{1/2}(\hat{p} - p)\,\tfrac{1}{2}\left[c(1 - c)\right]^{-1/2}.   (3.1.10)

Since |c - p| \leq |\hat{p} - p| and

|\hat{p} - p| = O_P\left(m^{-1/2}\right),   (3.1.11)

then

|c - p| = O_P\left(m^{-1/2}\right).   (3.1.12)

Thus,

m^{1/2}\left[\sin^{-1}\left(\hat{p}^{1/2}\right) - \sin^{-1}\left(p^{1/2}\right)\right] = m^{1/2}(\hat{p} - p)\,\tfrac{1}{2}\left[p(1 - p)\right]^{-1/2} + O_P\left(m^{-1/2}\right).   (3.1.13)

Since

\frac{m^{1/2}(\hat{p} - p)}{2\left[p(1 - p)\right]^{1/2}} \xrightarrow{L} N(0, 1/4),   (3.1.14)

by Lemma 2.3.2,

m^{1/2}\left[\sin^{-1}\left(\hat{p}^{1/2}\right) - \sin^{-1}\left(p^{1/2}\right)\right] \xrightarrow{L} N(0, 1/4).   (3.1.16)

This completes the proof.
Theorem 3.1.3
Let a random sample of size m be taken from a binomial population with parameter p. Then for any given constants \{C_j\},

m^{1/2} \sum_{j=1}^{r} C_j \left\{\left[\sin^{-1}\left(\hat{p}^{1/2}\right)\right]^{j-1} - \left[\sin^{-1}\left(p^{1/2}\right)\right]^{j-1}\right\}
\xrightarrow{L} N\!\left(0, \; \tfrac{1}{4} \sum_{j=1}^{r} \sum_{k=1}^{r} (j-1)(k-1)\, C_j C_k \left[\sin^{-1}\left(p^{1/2}\right)\right]^{j+k-4}\right).   (3.1.17)
Proof

Let

f(x) = \sum_{j=1}^{r} C_j\, x^{j-1}.   (3.1.18)

Thus,

f'(x) = \sum_{j=1}^{r} (j-1) C_j\, x^{j-2}.   (3.1.19)

From Theorem 3.1.1, there exists a u such that

\left|u - \sin^{-1}\left(p^{1/2}\right)\right| \leq \left|\sin^{-1}\left(\hat{p}^{1/2}\right) - \sin^{-1}\left(p^{1/2}\right)\right|   (3.1.20)

and

\sum_{j=1}^{r} C_j \left\{\left[\sin^{-1}\left(\hat{p}^{1/2}\right)\right]^{j-1} - \left[\sin^{-1}\left(p^{1/2}\right)\right]^{j-1}\right\}
= \left[\sin^{-1}\left(\hat{p}^{1/2}\right) - \sin^{-1}\left(p^{1/2}\right)\right] \sum_{j=1}^{r} (j-1) C_j\, u^{j-2}.   (3.1.21)

Since |\hat{p} - p| = O_P\left(m^{-1/2}\right), by the Mean Value Theorem

\left|\sin^{-1}\left(\hat{p}^{1/2}\right) - \sin^{-1}\left(p^{1/2}\right)\right| = O_P\left(m^{-1/2}\right),   (3.1.22)

and since

\left|u - \sin^{-1}\left(p^{1/2}\right)\right| \leq \left|\sin^{-1}\left(\hat{p}^{1/2}\right) - \sin^{-1}\left(p^{1/2}\right)\right|,   (3.1.23)

we have

u = \sin^{-1}\left(p^{1/2}\right) + O_P\left(m^{-1/2}\right).   (3.1.24)

Thus,

m^{1/2} \sum_{j=1}^{r} C_j \left\{\left[\sin^{-1}\left(\hat{p}^{1/2}\right)\right]^{j-1} - \left[\sin^{-1}\left(p^{1/2}\right)\right]^{j-1}\right\}
= m^{1/2}\left[\sin^{-1}\left(\hat{p}^{1/2}\right) - \sin^{-1}\left(p^{1/2}\right)\right] \sum_{j=1}^{r} (j-1) C_j \left[\sin^{-1}\left(p^{1/2}\right)\right]^{j-2} + O_P\left(m^{-1/2}\right).   (3.1.25)

From the result of Theorem 3.1.2 we can thus conclude that

m^{1/2} \sum_{j=1}^{r} C_j \left\{\left[\sin^{-1}\left(\hat{p}^{1/2}\right)\right]^{j-1} - \left[\sin^{-1}\left(p^{1/2}\right)\right]^{j-1}\right\}
\xrightarrow{L} N\!\left(0, \; \tfrac{1}{4} \sum_{j=1}^{r} \sum_{k=1}^{r} (j-1)(k-1)\, C_j C_k \left[\sin^{-1}\left(p^{1/2}\right)\right]^{j+k-4}\right).   (3.1.26)

This completes the proof.
In the model given in (3.1.2), E_n is a random error matrix to explain the asymptotic variability of Y_n as compared to M. With Y_n and M as defined by (3.1.3) and (3.1.4), respectively, and using the results of Theorem 3.1.3, we are now ready to justify the form of E_n.

The rows of Y_n are independent and we have shown in Theorem 3.1.3 that

n_i^{1/2} \sum_{j=1}^{r} C_j \left\{\left[\sin^{-1}\left(\hat{p}_i^{1/2}\right)\right]^{j-1} - \left[\sin^{-1}\left(p_i^{1/2}\right)\right]^{j-1}\right\}
\xrightarrow{L} N\!\left(0, \; \tfrac{1}{4} \sum_{j=1}^{r} \sum_{k=1}^{r} (j-1)(k-1)\, C_j C_k \left[\sin^{-1}\left(p_i^{1/2}\right)\right]^{j+k-4}\right).   (3.1.27)

With n = \sum_{i=1}^{k} n_i / k, we assume that as n \to \infty, n_i/n \to w_i, i = 1, 2, \ldots, k. Also, k is fixed. Thus, as n \to \infty, each n_i \to \infty. Thus every linear combination of the elements of n^{1/2}(Y_n^{(i)} - M^{(i)}) is asymptotically normal as n \to \infty, where Y_n^{(i)} and M^{(i)} are the i-th rows of the matrices given in (3.1.3) and (3.1.4), respectively. It then follows that the i-th row of E_n is a random vector such that

n^{1/2} e_n^{(i)} \xrightarrow{L} N_r(0, V_i), \quad \text{as } n \to \infty,   (3.1.28)

where V_i has entries

V_{ist} = \tfrac{1}{4}(s-1)(t-1) \left[\sin^{-1}\left(p_i^{1/2}\right)\right]^{s+t-4}, \quad 1 \leq s, t \leq r.   (3.1.29)

The element V_{ist} is found directly from the variance component given in (3.1.17).

Thus, we see that by using the angle transformation we can apply weighted least squares and form a model which complies with that proposed in Section 2.2. We are now ready to use the general results of Chapter II for this particular case.
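The weighted response matrix in (3.1.3) and the covariance matrices of (3.2.8) (the degree version of (3.1.29), with 820.7 ≈ (180/π)²/4) are straightforward to assemble. The sketch below uses a hypothetical assay, works in degrees, and replaces the unknown true p_i by its estimate, in the spirit of Corollary 2.3.1.

```python
import numpy as np

# Hypothetical assay data.
doses = np.array([1.0, 2.0, 4.0, 8.0])
n_i = np.array([8, 10, 12, 10])
p_hat = np.array([0.125, 0.30, 0.583, 0.90])

k, r = len(doses), 2
n_bar = n_i.sum() / k                     # n = sum(n_i) / k
w = n_i / n_bar                           # weights w_i = n_i / n

theta = np.degrees(np.arcsin(np.sqrt(p_hat)))   # angle transform, in degrees

# Y_n from (3.1.3): Y_nij = sqrt(w_i) * theta_i^(j-1);  x from (3.2.5b).
Y_n = np.sqrt(w)[:, None] * theta[:, None] ** np.arange(r)
x = np.sqrt(w) * np.log(doses)

# V_i from (3.2.8), with the unknown p_i replaced by p_hat_i:
# V_i[s,t] = 820.7 (s-1)(t-1) theta_i^(s+t-4), indices s, t = 1..r.
def V_matrix(theta_i, r=2):
    s = np.arange(1, r + 1)
    return 820.7 * np.outer(s - 1, s - 1) * theta_i ** (np.add.outer(s, s) - 4)

print("Y_n =\n", Y_n)
print("V_1 =\n", V_matrix(theta[0]))
```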
3.2 Estimation of LD(50)
We will now discuss the estimation of LD(50) by inverse regression with the specific use of the angle transformation. Recall that LD(50) is the median lethal




dose. Since LD(50) is often of prime interest in quantal response assays, the main emphasis of this section will be the discussion of estimating LD(50). Since estimation of LD(100p) may also be of interest (when p \neq .5), we will give results concerning this also.

We will first recall the notation and define (or redefine) the matrices appropriate for this problem.

The classical quantal response assay consists of independently sampling n_i subjects at dose d_i, i = 1, 2, \ldots, k. The observed response frequency, \hat{p}_i, is calculated for each dose level d_i. The true response probability for dose d_i is represented by p_i. Let

n = \sum_{i=1}^{k} n_i / k.   (3.2.1)

The general deterministic model is given by

\log d_i = \sum_{j=1}^{r} \beta_j \left[\sin^{-1}\left(p_i^{1/2}\right)\right]^{j-1}, \quad 1 \leq i \leq k.   (3.2.2)

Let w_i = n_i/n, 1 \leq i \leq k, and let r < k. We will employ weighted least squares and write the probabilistic model as

w_i^{1/2} \log d_i = \sum_{j=1}^{r} \beta_j\, w_i^{1/2} \left[\sin^{-1}\left(\hat{p}_i^{1/2}\right)\right]^{j-1} + \epsilon_i, \quad 1 \leq i \leq k.   (3.2.3)



We will define Y_n, M, and E_n as follows:

Y_{nij} = w_i^{1/2} \left[\sin^{-1}\left(\hat{p}_i^{1/2}\right)\right]^{j-1}, \quad 1 \leq i \leq k, \; 1 \leq j \leq r,   (3.2.4a)

M_{ij} = w_i^{1/2} \left[\sin^{-1}\left(p_i^{1/2}\right)\right]^{j-1}, \quad 1 \leq i \leq k, \; 1 \leq j \leq r.   (3.2.4b)

Also, we assume that

x = M\beta,   (3.2.5a)

where x is a k × 1 transformed dose vector with i-th element

x_i = w_i^{1/2} \log d_i,   (3.2.5b)

and

\beta = (\beta_1, \beta_2, \ldots, \beta_r)'.   (3.2.5c)

As before, we write Y_n = M + E_n,




where E_n is a matrix of independent random vectors. In Section 3.1 we justified that the i-th row of E_n is such that

n^{1/2} e_n^{(i)} \xrightarrow{L} N_r(0, V_i), \quad \text{as } n \to \infty.   (3.2.6)

In Section 3.1 we gave the form of V_i when \sin^{-1}(\hat{p}_i^{1/2}) and \sin^{-1}(p_i^{1/2}) are measured in radians. Since

1 \text{ radian} = \frac{180^\circ}{\pi} \approx 57.2958^\circ,   (3.2.7)

it follows from (3.1.26) that when \sin^{-1}(\hat{p}_i^{1/2}) and \sin^{-1}(p_i^{1/2}) are measured in degrees, the st-th element of V_i is given by

V_{ist} = 820.7\,(s-1)(t-1) \left[\sin^{-1}\left(p_i^{1/2}\right)\right]^{s+t-4}, \quad 1 \leq s, t \leq r.   (3.2.8)




In order to predict the transformed dose, x_i, at which a fraction p_i of the subjects respond, we would use the weighted least squares prediction equation

\hat{x}_i = \sum_{j=1}^{r} \hat{\beta}_{nj}\, w_i^{1/2} \left[\sin^{-1}\left(p_i^{1/2}\right)\right]^{j-1}.   (3.2.10)

Rather than estimate the transformed dose, x_i, it is preferred to estimate the dose in the log-dose scale. Thus, the estimate can be given by

\widehat{\log d_i} = \sum_{j=1}^{r} \hat{\beta}_{nj} \left[\sin^{-1}\left(p_i^{1/2}\right)\right]^{j-1}.   (3.2.11)

To form a (1-\alpha)100\% nominal confidence interval for \log d_i we may apply Corollary 2.3.2. The confidence interval is given by

\ell'\hat{\beta}_n \pm z_{\alpha/2}\, \hat{\sigma}_n\, n^{-1/2},   (3.2.12)

where

\ell' = \left(1, \; \sin^{-1}\left(p_i^{1/2}\right), \; \ldots, \; \left[\sin^{-1}\left(p_i^{1/2}\right)\right]^{r-1}\right)   (3.2.13)

and \hat{\sigma}_n is given by Corollary 2.3.1.

If it is desired to estimate LD(50), p_i = .5. Thus, since 0 \leq \sin^{-1}(p_i^{1/2}) \leq \pi/2, measuring the angle in degrees gives

\sin^{-1}\left(.5^{1/2}\right) = 45,   (3.2.14)

and

\ell' = (1, \; 45, \; 45^2, \; \ldots, \; 45^{r-1}).   (3.2.15)
We will now discuss in detail the estimation of LD(50) when it is assumed that the relationship between \log d_i and \sin^{-1}(p_i^{1/2}) is linear. We will use the following notation:

y_i = \sin^{-1}\left(\hat{p}_i^{1/2}\right) \text{ (measured in degrees)}, \quad x_i = \log d_i, \quad w_i = n_i/n.   (3.2.16)

Since we are assuming a linear relationship, we are using the models given in (3.2.2) with r = 2. In this case, the weighted least squares estimate of \beta can be expressed as

\hat{\beta}_n = (\hat{\beta}_1, \hat{\beta}_2)',   (3.2.17)

where

\hat{\beta}_2 = S_{xy} / S_{yy}   (3.2.18)

and

\hat{\beta}_1 = \bar{x} - \hat{\beta}_2 \bar{y},   (3.2.19)

with

S_{xy} = \sum_{i=1}^{k} w_i (x_i - \bar{x})(y_i - \bar{y}),   (3.2.20)

\bar{x} = \frac{1}{k} \sum_{i=1}^{k} w_i x_i, \quad \bar{y} = \frac{1}{k} \sum_{i=1}^{k} w_i y_i,   (3.2.21)

and

S_{xx} = \sum_{i=1}^{k} w_i (x_i - \bar{x})^2, \quad S_{yy} = \sum_{i=1}^{k} w_i (y_i - \bar{y})^2.   (3.2.22)

From (3.2.6) we observe that




n^{1/2} e_n^{(i)} \xrightarrow{L} N_2(0, V_i).   (3.2.23)

From (3.2.8) we see that

V_i = \begin{pmatrix} 0 & 0 \\ 0 & 820.7 \end{pmatrix}.   (3.2.24)
In order to predict LD(50) in the log-dose scale, we use (3.2.11) and (3.2.14) to obtain

\widehat{LD}(50) = \hat{\beta}_1 + 45\,\hat{\beta}_2 = \bar{x} + \hat{\beta}_2 (45 - \bar{y}).   (3.2.25)

The asymptotic variance, \sigma^2, of n^{1/2}[\widehat{LD}(50) - LD(50)] can be obtained from (2.3.24). By applying Corollary 2.3.1, a consistent estimator of \sigma^2 may be obtained. When a linear relationship is assumed, a simplified expression for \hat{\sigma}_n^2 can be obtained. This will be shown in the following theorem.
Theorem 3.2.1
A consistent estimator of \sigma^2, the asymptotic variance of n^{1/2}[\widehat{LD}(50) - LD(50)], is given by

\hat{\sigma}_n^2 = 820.7\left[(45 - \bar{y})^2\, S_{xx}\, S_{yy}^{-2} + \hat{\beta}_2^2\, k^{-1}\right].   (3.2.26)
Proof

In (2.3.24), \sigma^2 was given in terms of M. To find a consistent estimator of \sigma^2, M is replaced by Y_n. To find \hat{\sigma}_n^2, we will define the following vectors:

a' = x'\{I - Y_n (Y_n' Y_n)^{-1} Y_n'\},   (3.2.27)

b = (Y_n' Y_n)^{-1} \ell,   (3.2.28)

c' = \ell'(Y_n' Y_n)^{-1} Y_n',   (3.2.29)

d = (Y_n' Y_n)^{-1} Y_n' x = \hat{\beta}_n,   (3.2.30)

where x, Y_n, and \hat{\beta}_n are defined in (3.2.5b), (3.2.4a), and (3.2.17), respectively, and

\ell' = (1, 45).   (3.2.31)

Since a', b, c', and d are the same as a', b, c', and d of Theorem 2.3.1 with M replaced by Y_n, by Corollary 2.3.1

\hat{\sigma}_n^2 = \sum_{i=1}^{k} \left[a_i^2\, b' V_i b + c_i^2\, d' V_i d - 2 a_i c_i\, b' V_i d\right],   (3.2.32)

where V_i is given in (3.2.24), is a consistent estimator of \sigma^2.

Direct evaluation of the vectors gives

a_i = w_i^{1/2}\left[(x_i - \bar{x}) - \hat{\beta}_2 (y_i - \bar{y})\right], \quad
b = \left(\frac{1}{k} + \frac{\bar{y}(\bar{y} - 45)}{S_{yy}}, \;\; \frac{45 - \bar{y}}{S_{yy}}\right)', \quad
c_i = w_i^{1/2}\left[\frac{1}{k} + \frac{(45 - \bar{y})(y_i - \bar{y})}{S_{yy}}\right].   (3.2.33)

Since 820.7 is the only nonzero entry of V_i,

b' V_i b = 820.7\, \frac{(45 - \bar{y})^2}{S_{yy}^2}, \quad d' V_i d = 820.7\, \hat{\beta}_2^2, \quad b' V_i d = 820.7\, \frac{(45 - \bar{y})\,\hat{\beta}_2}{S_{yy}}.

Performing the summations, and using S_{xy} = \hat{\beta}_2 S_{yy} from (3.2.18),

\sum_{i=1}^{k} a_i^2 = S_{xx} - \frac{S_{xy}^2}{S_{yy}}, \quad
\sum_{i=1}^{k} c_i^2 = \frac{1}{k} + \frac{(45 - \bar{y})^2}{S_{yy}}, \quad
\sum_{i=1}^{k} a_i c_i = 0.

Substituting these quantities in (3.2.32), we obtain

\hat{\sigma}_n^2 = 820.7\left[\frac{(45 - \bar{y})^2}{S_{yy}^2}\left(S_{xx} - \frac{S_{xy}^2}{S_{yy}}\right) + \hat{\beta}_2^2\left(\frac{1}{k} + \frac{(45 - \bar{y})^2}{S_{yy}}\right)\right]
= 820.7\left[(45 - \bar{y})^2\, S_{xx}\, S_{yy}^{-2} + \hat{\beta}_2^2\, k^{-1}\right].

This completes the proof.
Corollary 3.2.1
\widehat{LD}(50) \pm z_{\alpha/2}\, \hat{\sigma}_n\, n^{-1/2}

forms a nominal (1-\alpha)100\% confidence interval for LD(50), where \widehat{LD}(50) is given in (3.2.25) and \hat{\sigma}_n is given in (3.2.26).

Proof

Corollary 3.2.1 follows directly from Corollary 2.3.2.
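The linear-case formulas (3.2.25), (3.2.26), and Corollary 3.2.1 are easy to compute. The following is a minimal sketch with entirely hypothetical assay data; it uses weights as defined in (3.2.16) and a nominal 95% interval.

```python
import numpy as np
from statistics import NormalDist

# Hypothetical assay data (log-doses equally spaced).
log_d = np.array([-1.5, -0.5, 0.5, 1.5])
n_i = np.array([10, 10, 10, 10])
p_hat = np.array([0.10, 0.30, 0.60, 0.90])

k = len(log_d)
n_bar = n_i.sum() / k                    # n = sum(n_i) / k
w = n_i / n_bar                          # weights w_i

y = np.degrees(np.arcsin(np.sqrt(p_hat)))   # y_i, in degrees
x = log_d                                    # x_i = log d_i

x_bar = np.sum(w * x) / k
y_bar = np.sum(w * y) / k
S_xx = np.sum(w * (x - x_bar) ** 2)
S_yy = np.sum(w * (y - y_bar) ** 2)
S_xy = np.sum(w * (x - x_bar) * (y - y_bar))

# Inverse regression estimates (3.2.18), (3.2.19), (3.2.25).
beta2 = S_xy / S_yy
LD50 = x_bar + beta2 * (45.0 - y_bar)        # LD(50) estimate in log-dose scale

# Consistent variance estimate (3.2.26) and nominal 95% interval (Corollary 3.2.1).
sigma2 = 820.7 * ((45.0 - y_bar) ** 2 * S_xx / S_yy ** 2 + beta2 ** 2 / k)
z = NormalDist().inv_cdf(0.975)
half = z * np.sqrt(sigma2 / n_bar)
print(f"LD(50) estimate (log scale): {LD50:.3f}")
print(f"nominal 95% CI: ({LD50 - half:.3f}, {LD50 + half:.3f})")
```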
We will now derive some results for the Knudsen-Curtis [5] method for analyzing quantal response data. We can thus compare the inverse regression approach with their method.

Knudsen and Curtis use classical weighted least squares to fit the model

y_i = b_1 + b_2 x_i + \epsilon_i,   (3.2.44)

where




y_i = \sin^{-1}\left(\hat{p}_i^{1/2}\right) \text{ (measured in degrees)},   (3.2.45)

x_i = \log d_i,   (3.2.46)

and from an argument similar to that used in Section 3.1 we can assume

n_i^{1/2}\, \epsilon_i \xrightarrow{L} N(0,\, 820.7), \quad \text{as } n_i \to \infty,   (3.2.47)

and \epsilon_i is independent of \epsilon_{i'} for i \neq i'.
Since weighted least squares will be employed, the model to be fit could be expressed as

w_i^{1/2} y_i = b_1\, w_i^{1/2} + b_2\, w_i^{1/2} x_i + \epsilon_i^*.   (3.2.48)

Letting

\epsilon_i^* = w_i^{1/2}\, \epsilon_i,   (3.2.49)

we see from (3.2.47) that

n^{1/2}\, \epsilon_i^* \xrightarrow{L} N(0,\, 820.7),   (3.2.50)

and \epsilon_i^* is independent of \epsilon_{i'}^* for i \neq i'.




Using the notation of this section, the weighted least squares estimates are

\hat{b}_2 = S_{xy} / S_{xx}   (3.2.51)

and

\hat{b}_1 = \bar{y} - \hat{b}_2 \bar{x}.   (3.2.52)

Thus, to estimate LD(50) in the log-dose scale, Knudsen and Curtis would use

\widehat{LD}{}^*(50) = \frac{45 - \hat{b}_1}{\hat{b}_2} = \bar{x} + (45 - \bar{y})/\hat{b}_2.   (3.2.53)

Theorem 3.2.2

n^{1/2} (\sigma_n^*)^{-1}\left[\widehat{LD}{}^*(50) - LD(50)\right] \xrightarrow{L} N(0, 1),   (3.2.54)

where

(\sigma_n^*)^2 = 820.7\left[(45 - \bar{y})^2\, \hat{b}_2^{-4}\, S_{xx}^{-1} + k^{-1}\, \hat{b}_2^{-2}\right].   (3.2.55)



Proof

From (3.2.48),

\hat{b}_2 = \frac{\sum_{i=1}^{k} w_i (x_i - \bar{x})\, y_i}{S_{xx}}.   (3.2.56)

Since the y_i are independent, every linear combination of the y_i is asymptotically normal, and noting that AVar(n^{1/2} y_i) = 820.7 / w_i, it follows that

n^{1/2} (\hat{b}_2 - b_2) \xrightarrow{L} N\left(0,\; 820.7\, S_{xx}^{-1}\right).   (3.2.58)

In a like manner, it can be shown that

n^{1/2} (\hat{b}_1 - b_1) \xrightarrow{L} N(0,\; \sigma_1^2)   (3.2.59)

for an appropriate \sigma_1^2, and that any linear function of n^{1/2}(\hat{b}_1 - b_1) and n^{1/2}(\hat{b}_2 - b_2) is asymptotically normal.

From (3.2.44) it can be seen that LD(50) = (45 - b_1)/b_2, so that

\widehat{LD}{}^*(50) - LD(50) = \frac{45 - \hat{b}_1}{\hat{b}_2} - \frac{45 - b_1}{b_2}.   (3.2.60)

Writing \hat{b}_2^{-1} = b_2^{-1}\left[1 + (\hat{b}_2 - b_2)/b_2\right]^{-1} and expanding,

\widehat{LD}{}^*(50) - LD(50) = -\frac{\hat{b}_1 - b_1}{b_2} - \frac{(45 - b_1)(\hat{b}_2 - b_2)}{b_2^2} + O_P(n^{-1}).   (3.2.63)

Thus, from (3.2.58) and (3.2.59),

n^{1/2}\left[\widehat{LD}{}^*(50) - LD(50)\right] \xrightarrow{L} N(0, \sigma^{*2}).   (3.2.64)

It still remains to find \sigma^{*2}. In essence we need the asymptotic variance of n^{1/2}\, \widehat{LD}{}^*(50). We will first find the asymptotic variances of some other variables:

AVar\left(n^{1/2} \hat{b}_2\right) = \frac{\sum_{i=1}^{k} (x_i - \bar{x})^2 w_i^2\, AVar\left(n^{1/2} y_i\right)}{S_{xx}^2} = \frac{820.7 \sum_{i=1}^{k} w_i (x_i - \bar{x})^2}{S_{xx}^2} = \frac{820.7}{S_{xx}},   (3.2.65)

AVar\left(n^{1/2} \bar{y}\right) = \frac{1}{k^2} \sum_{i=1}^{k} w_i^2\, AVar\left(n^{1/2} y_i\right) = \frac{820.7}{k}.   (3.2.66)

Writing \widehat{LD}{}^*(50) = \bar{x} + (45 - \bar{y})/\hat{b}_2, noting that \bar{y} and \hat{b}_2 are asymptotically independent, and that terms involving (\hat{b}_2 - b_2)^j for j \geq 2 are of order n^{-1}, we obtain

AVar\left[n^{1/2}\, \widehat{LD}{}^*(50)\right] = b_2^{-2}\, AVar\left(n^{1/2} \bar{y}\right) + (45 - \mu)^2\, b_2^{-4}\, AVar\left(n^{1/2} \hat{b}_2\right),   (3.2.69)

where \mu is such that n^{1/2}(\bar{y} - \mu) is asymptotically normal with mean zero. Thus,

\sigma^{*2} = AVar\left\{n^{1/2}\left[\widehat{LD}{}^*(50) - LD(50)\right]\right\} = 820.7\left[(45 - \mu)^2\, b_2^{-4}\, S_{xx}^{-1} + k^{-1}\, b_2^{-2}\right].   (3.2.70)

By Lemma 2.3.3, (\sigma_n^*)^2 is a consistent estimator of \sigma^{*2}. As in Corollary 2.3.1,

n^{1/2} (\sigma_n^*)^{-1}\left[\widehat{LD}{}^*(50) - LD(50)\right] \xrightarrow{L} N(0, 1).   (3.2.71)

This completes the proof.
Corollary 3.2.2
Assuming a linear model for \sin^{-1}(p^{1/2}) against log-dose,

\hat{\sigma}_n / \sigma_n^* \xrightarrow{P} 1,   (3.2.72)

so that the asymptotic relative efficiency of the inverse method relative to the Knudsen-Curtis method is unity.
Proof

From (3.2.26) and (3.2.55) we obtain

\frac{\hat{\sigma}_n^2}{(\sigma_n^*)^2}
= \frac{(45 - \bar{y})^2\, S_{xx}\, S_{yy}^{-2} + S_{xy}^2\, S_{yy}^{-2}\, k^{-1}}
       {(45 - \bar{y})^2\, S_{xx}^3\, S_{xy}^{-4} + S_{xx}^2\, S_{xy}^{-2}\, k^{-1}}
= \frac{S_{xy}^4}{S_{xx}^2\, S_{yy}^2} = r^4,   (3.2.73)

where

r = \frac{S_{xy}}{S_{xx}^{1/2}\, S_{yy}^{1/2}}   (3.2.74)

is the sample coefficient of correlation, so that

\hat{\sigma}_n^2 = (\sigma_n^*)^2\, r^4.   (3.2.75)

Since a linear model is assumed,

r \xrightarrow{P} 1,   (3.2.76)

and thus

\hat{\sigma}_n / \sigma_n^* \xrightarrow{P} 1.   (3.2.77)

This completes the proof.
It should be noted that the Knudsen-Curtis method is itself asymptotically efficient. Thus, the method of inverse regression is asymptotically efficient.

Confidence intervals with nominal (1-\alpha)100\% coverage are obtained by either

\widehat{LD}(50) \pm z_{\alpha/2}\, \hat{\sigma}_n\, n^{-1/2}   (3.2.78)

or

\widehat{LD}{}^*(50) \pm z_{\alpha/2}\, \sigma_n^*\, n^{-1/2}.   (3.2.79)

Since the choice of method cannot be made on the basis of asymptotic efficiency, we shall examine the two methods on the basis of robustness.
Consider the set of points in R^2,

S_i = \left[\log d_i, \; \sin^{-1}\left(p_i^{1/2}\right)\right], \quad i = 1, 2, \ldots, k.   (3.2.80)

The inverse regression method consistently estimates a weighted least squares line which minimizes the horizontal deviations for the deterministic S_i. Assuming that n_i/(kn) converges to a positive limit as n \to \infty, the point S_i carries weight proportional to w_i. Similarly, the Knudsen-Curtis approach consistently estimates a weighted least squares line which minimizes the vertical deviations for the deterministic S_i.

Since the error statements about LD(50), relative potency, etc., are made in the horizontal scale, the inverse method, when linearity is false, should tend to have a smaller asymptotic bias than the Knudsen-Curtis method.
Corollary 3.2.3
AVar\left[\widehat{LD}(50)\right] \leq AVar\left[\widehat{LD}{}^*(50)\right].   (3.2.81)

Proof

From (3.2.75),

\hat{\sigma}_n^2 = (\sigma_n^*)^2\, r^4,   (3.2.82)

and thus, in the limit,

\sigma^2 = \sigma^{*2}\, \rho^4,   (3.2.83)

where \rho is the population coefficient of correlation for a bivariate random variable with mass function

P[X = S_i] = w_i / k, \quad i = 1, 2, \ldots, k.   (3.2.84)

\rho^2 = 1 if and only if there is truly a linear relationship; otherwise -1 < \rho < 1. Thus,

AVar\left[\widehat{LD}(50)\right] \leq AVar\left[\widehat{LD}{}^*(50)\right],   (3.2.85)

with equality holding when the linear model is correct. This completes the proof.

Thus we can conclude that when estimating LD(50) in the log-dose scale, inverse regression seems to yield a more reasonable estimate than the Knudsen-Curtis method as far as robustness is concerned. As a further bonus, \widehat{LD}(50) will tend to be a better estimate than \widehat{LD}{}^*(50) in terms of the variances of the asymptotic distributions.
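Continuing the hypothetical data used in the LD(50) sketch earlier in this section, the Knudsen-Curtis estimate (3.2.53) and the identity \hat{\sigma}_n^2/(\sigma_n^*)^2 = r^4 from (3.2.73) can be checked numerically; this is only an illustration, not data from the dissertation.

```python
import numpy as np

# Same hypothetical data as in the LD(50) sketch above.
log_d = np.array([-1.5, -0.5, 0.5, 1.5])
n_i = np.array([10, 10, 10, 10])
p_hat = np.array([0.10, 0.30, 0.60, 0.90])

k = len(log_d)
w = n_i / (n_i.sum() / k)
y = np.degrees(np.arcsin(np.sqrt(p_hat)))
x = log_d
x_bar, y_bar = np.sum(w * x) / k, np.sum(w * y) / k
S_xx = np.sum(w * (x - x_bar) ** 2)
S_yy = np.sum(w * (y - y_bar) ** 2)
S_xy = np.sum(w * (x - x_bar) * (y - y_bar))

# Inverse regression (3.2.25)-(3.2.26) and Knudsen-Curtis (3.2.53), (3.2.55).
beta2 = S_xy / S_yy
b2 = S_xy / S_xx
LD50_inv = x_bar + beta2 * (45.0 - y_bar)
LD50_kc = x_bar + (45.0 - y_bar) / b2
var_inv = 820.7 * ((45.0 - y_bar) ** 2 * S_xx / S_yy ** 2 + beta2 ** 2 / k)
var_kc = 820.7 * ((45.0 - y_bar) ** 2 / (b2 ** 4 * S_xx) + 1.0 / (k * b2 ** 2))

r2 = S_xy ** 2 / (S_xx * S_yy)            # squared sample correlation
print(f"LD(50): inverse {LD50_inv:.3f}, Knudsen-Curtis {LD50_kc:.3f}")
print(f"variance ratio {var_inv / var_kc:.4f}  vs  r^4 = {r2 ** 2:.4f}")
```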
3.3 Estimation of Relative Potency

If two drugs are involved in a quantal response assay, it is often of interest to estimate the relative potency of the two drugs. The relative potency is the ratio of equally effective doses. It should be recalled that relative potency is a valuable measure only if the quantal response curves are parallel. Thus, throughout this section we will assume the response curves are parallel.

In order to estimate the relative potency, we will use the results of Chapter II and Section 3.2. To achieve



this end, we will first give the notation used in this section in Table 3.1.

Table 3.1
Notation Chart

                              Drug 1                            Drug 2
Dose levels                   d_1, ..., d_{k_1}                 d_{k_1+1}, ..., d_k
Sample sizes                  n_1, ..., n_{k_1}                 n_{k_1+1}, ..., n_k
Response probability          p_1, ..., p_{k_1}                 p_{k_1+1}, ..., p_k
Observed response frequency   \hat{p}_1, ..., \hat{p}_{k_1}     \hat{p}_{k_1+1}, ..., \hat{p}_k
Weight, n_i/n                 w_1, ..., w_{k_1}                 w_{k_1+1}, ..., w_k

Recall that

n = \sum_{i=1}^{k} n_i / k.   (3.3.1)

Since the curves are assumed to be parallel, the deterministic model can be expressed as

\log d_i = \sum_{j=1}^{r} \beta_j \left[\sin^{-1}\left(p_i^{1/2}\right)\right]^{j-1}, \quad 1 \leq i \leq k_1,   (3.3.2)

\log d_i = \sum_{j=1}^{r} \beta_j \left[\sin^{-1}\left(p_i^{1/2}\right)\right]^{j-1} + \beta_{r+1}, \quad k_1 < i \leq k.   (3.3.3)

In this case, r + 1 < k. It can thus be seen that

\text{relative potency} = e^{\beta_{r+1}}.   (3.3.4)

The probabilistic model will again be of the form

Y_n = M + E_n.   (3.3.5)

The response matrix Y_n has entries

Y_{nij} = w_i^{1/2} \left[\sin^{-1}\left(\hat{p}_i^{1/2}\right)\right]^{j-1}, \quad 1 \leq i \leq k, \; 1 \leq j \leq r,   (3.3.6)

Y_{ni,r+1} = 0, \quad 1 \leq i \leq k_1,   (3.3.7)

and

Y_{ni,r+1} = w_i^{1/2}, \quad k_1 < i \leq k.   (3.3.8)

The matrix M is of the same form as Y_n with \sin^{-1}(\hat{p}_i^{1/2}) replaced by \sin^{-1}(p_i^{1/2}). Also,

x = M\beta,   (3.3.9)



where the transformed dose vector, x, has i-th component

x_i = w_i^{1/2} \log d_i, \quad 1 \leq i \leq k,   (3.3.10)

and

\beta = (\beta_1, \beta_2, \ldots, \beta_{r+1})'.   (3.3.11)

In the same manner as was employed in Section 3.1, we can assume that E_n is a matrix of independent random vectors with

n^{1/2} e_n^{(i)} \xrightarrow{L} N_{r+1}(0, V_i).   (3.3.12)

It again follows that V_i has the form given in (3.2.8). That is, V_i has entries

V_{ist} = 820.7\,(s-1)(t-1) \left[\sin^{-1}\left(p_i^{1/2}\right)\right]^{s+t-4}, \quad 1 \leq s, t \leq r,   (3.3.13)

V_{ist} = 0 \quad \text{otherwise},   (3.3.14)

and

\hat{\beta}_n = (Y_n' Y_n)^{-1} Y_n' x   (3.3.15)

is the weighted least squares estimate of \beta.

We can now employ the results of Chapter II to estimate the relative potency. We will perform this estimation in the log-dose scale. Thus, we will estimate \beta_{r+1} by means of a (1-\alpha)100\% confidence interval. If we let \ell' = (0, \ldots, 0, 1), then from Corollary 2.3.2

\ell'\hat{\beta}_n \pm z_{\alpha/2}\, \hat{\sigma}_n\, n^{-1/2}   (3.3.16)

forms a nominal (1-\alpha)100\% confidence interval for \beta_{r+1}, and thus for the relative potency in the log-dose scale. \hat{\sigma}_n is obtained by applying Corollary 2.3.1.

We have thus estimated the relative potency (in the log-dose scale) of two drugs. It was assumed that the quantal response curves are parallel. In the next section we will give a test for parallelism.
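A minimal sketch of the parallel-curves fit of this section, with entirely hypothetical data for two drugs: the design matrix carries the angle-polynomial columns plus the drug-2 indicator column of (3.3.7)-(3.3.8), and the coefficient of that column estimates the log relative potency.

```python
import numpy as np

# Hypothetical two-drug assay (drug 1 = first 4 doses, drug 2 = last 4).
log_d = np.array([-1.5, -0.5, 0.5, 1.5, -1.0, 0.0, 1.0, 2.0])
n_i = np.full(8, 10)
p_hat = np.array([0.10, 0.30, 0.60, 0.90, 0.10, 0.30, 0.60, 0.90])
drug2 = np.array([0, 0, 0, 0, 1, 1, 1, 1])

k, k1, r = 8, 4, 2
w = n_i / (n_i.sum() / k)
theta = np.degrees(np.arcsin(np.sqrt(p_hat)))

# Y_n per (3.3.6)-(3.3.8): r polynomial columns plus the drug-2 dummy column.
Y_n = np.column_stack([np.sqrt(w) * theta ** j for j in range(r)]
                      + [np.sqrt(w) * drug2])
x = np.sqrt(w) * log_d

beta_hat, *_ = np.linalg.lstsq(Y_n, x, rcond=None)
log_rel_potency = beta_hat[r]             # beta_{r+1}
print(f"estimated log relative potency: {log_rel_potency:.3f}")
print(f"estimated relative potency:     {np.exp(log_rel_potency):.3f}")
```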
3.4 Test for Parallelism
If two drugs are involved in a quantal response assay, it is of interest to test whether the quantal response curves are parallel.



We will use the same notation as that given in Table 3.1. The deterministic model is now given by

\log d_i = \sum_{j=1}^{r} \beta_j \left[\sin^{-1}\left(p_i^{1/2}\right)\right]^{j-1}, \quad i \leq k_1,   (3.4.1)

\log d_i = \sum_{j=r+1}^{2r} \beta_j \left[\sin^{-1}\left(p_i^{1/2}\right)\right]^{j-r-1}, \quad i > k_1,   (3.4.2)

where r \leq \min(k_1,\, k - k_1).

The probabilistic model is again given by

Y_n = M + E_n.   (3.4.3)

The response matrix Y_n has entries

Y_{nij} = w_i^{1/2} \left[\sin^{-1}\left(\hat{p}_i^{1/2}\right)\right]^{j-1}, \quad 1 \leq i \leq k_1, \; 1 \leq j \leq r,   (3.4.4)

Y_{nij} = w_i^{1/2} \left[\sin^{-1}\left(\hat{p}_i^{1/2}\right)\right]^{j-r-1}, \quad k_1 < i \leq k, \; r+1 \leq j \leq 2r,   (3.4.5)

Y_{nij} = 0 \quad \text{elsewhere}.   (3.4.6)

M is defined in a manner similar to Y_n with \sin^{-1}(\hat{p}_i^{1/2})
replaced by \sin^{-1}(p_i^{1/2}). Again we have

x = M\beta,   (3.4.7)

where the transformed dose vector, x, has i-th component

x_i = w_i^{1/2} \log d_i, \quad 1 \leq i \leq k,   (3.4.8)

and

\beta = (\beta_1, \beta_2, \ldots, \beta_{2r})'.   (3.4.9)

E_n is again a matrix of random vectors. E_n can now be represented by

E_n = \begin{pmatrix} E_n^{(1)} & 0 \\ 0 & E_n^{(2)} \end{pmatrix},   (3.4.10)

where E_n^{(1)} and E_n^{(2)} are k_1 × r and (k - k_1) × r, respectively. E_n^{(1)} and E_n^{(2)} are thus comprised of independent random vectors, and we again have

n^{1/2} e_n^{(i)} \xrightarrow{L} N_{2r}(0, V_i).   (3.4.11)

From (3.2.8) we see that V_i has entries

V_{ist} = 820.7\,(s-1)(t-1) \left[\sin^{-1}\left(p_i^{1/2}\right)\right]^{s+t-4}, \quad 1 \leq s, t \leq r, \; i \leq k_1,   (3.4.12)

V_{ist} = 820.7\,(s-r-1)(t-r-1) \left[\sin^{-1}\left(p_i^{1/2}\right)\right]^{s+t-2r-4}, \quad r+1 \leq s, t \leq 2r, \; i > k_1,   (3.4.13)

V_{ist} = 0 \quad \text{elsewhere},   (3.4.14)

and

\hat{\beta}_n = (Y_n' Y_n)^{-1} Y_n' x   (3.4.15)

is the weighted least squares estimate of \beta.
Since we desire to test for parallelism, the test of interest can be expressed by the hypothesis

H_0:\; \beta_j = \beta_{r+j}, \quad j = 1, 2, \ldots, r.   (3.4.16)

This hypothesis is thus of the form

H_0:\; A\beta = 0,   (3.4.17)

where the entries of A, an r × 2r matrix, are

a_{11} = a_{22} = \cdots = a_{rr} = 1,   (3.4.18)

a_{1,r+1} = a_{2,r+2} = \cdots = a_{r,2r} = -1,   (3.4.19)

a_{ij} = 0 \quad \text{elsewhere}.   (3.4.20)

Thus, we can apply Corollary 2.3.3 and test H_0 by the test statistic

n (A\hat{\beta}_n)' (A \hat{\Omega}_n A')^{-1} (A\hat{\beta}_n) \xrightarrow{L} \chi^2_r,   (3.4.21)

where A, \hat{\beta}_n, and \hat{\Omega}_n are defined in [(3.4.18), (3.4.19), (3.4.20)], (3.4.15), and Theorem 2.3.2, respectively. If the statistic is sufficiently large we reject H_0: A\beta = 0, and we can conclude that the response curves are not parallel.
3.5 Summary
We have now discussed the application of the angle transformation to inverse regression for quantal response assays. We have shown that inverse regression will give better asymptotic results than the Knudsen-Curtis method when the relationship between \log d_i and \sin^{-1}(p_i^{1/2}) is



not truly linear. Inverse regression may be used to fit models other than the linear model, whereas the Knudsen-Curtis method is not appropriate. We have given methods of estimating LD(100p) and relative potency, and a test for parallelism.



CHAPTER IV
NUMERICAL APPLICATIONS
4.0 Preamble
In this chapter we will apply the results obtained in Chapter III. Several numerical applications will be given. Section 4.1 will give the exact probabilities that 95% nominal confidence intervals cover LD(50). Eight probability schemes will be considered which satisfy a linear relationship between log-dose and \sin^{-1}(p^{1/2}). In Section 4.2 we will compare the use of inverse regression to other methods of analyzing quantal response assays. Section 4.3 will be a summary of the chapter.
4.1 Exact Coverage Probability (95% Nominal Confidence Interval)

In this section we will investigate small sample results for eight probability schemes satisfying the linear model

x_i = α + β sin^{-1}(p_i^{1/2}),    i = 1, 2, 3, 4.    (4.1.1)

The log-doses, x_i, i = 1, 2, 3, 4, were fixed at four equally spaced values. Equal sample sizes of five, ten, and fifteen were considered. We ran all possible assays for the model given in (4.1.1) with the conditions described. For all realizations, we then computed nominal 95% confidence limits for LD(50). These limits were found by use of the results given in Corollary 3.2.1. For the eight different curves, we then computed the exact probability that the true LD(50) lies in the confidence interval.
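To fix ideas, the sketch below reads the log LD(50) off a line fitted under model (4.1.1): since p = 1/2 corresponds to an angle of 45 degrees, the estimate is simply the fitted value of the line at 45. Ordinary least squares is used here purely for illustration, in place of the estimators of Chapter III and of the interval of Corollary 3.2.1 used for the results of this section; all names are illustrative.

    import numpy as np

    def log_ld50_from_linear_fit(log_doses, p_hat):
        # Under model (4.1.1), x = alpha + beta * sin^{-1}(p^{1/2}) with the angle
        # in degrees, p = 1/2 corresponds to an angle of 45 degrees, so the fitted
        # log LD(50) is alpha + 45 * beta.
        z = np.degrees(np.arcsin(np.sqrt(np.asarray(p_hat, dtype=float))))
        beta, alpha = np.polyfit(z, np.asarray(log_doses, dtype=float), 1)
        return alpha + 45.0 * beta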
Based on a pilot study we replaced p̂_i by

(a) p̂_i + (2n)^{-1} if p̂_i < 1/2, and    (4.1.2)
(b) p̂_i - (2n)^{-1} if p̂_i > 1/2.    (4.1.3)

We recommend this continuity correction whenever the sample sizes are relatively small. This continuity correction has no effect on the asymptotic distribution.
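A small sketch of the continuity correction of (4.1.2)-(4.1.3) combined with the angular transformation follows. The transformation is taken in degrees, so that its large-sample variance is approximately 820.7/n; the function name is illustrative.

    import numpy as np

    def corrected_angle(successes, n):
        # Observed fraction with the continuity correction of (4.1.2)-(4.1.3):
        # p_hat is moved 1/(2n) toward 1/2 before transforming.
        p_hat = successes / n
        if p_hat < 0.5:
            p_hat += 1.0 / (2.0 * n)
        elif p_hat > 0.5:
            p_hat -= 1.0 / (2.0 * n)
        # Angular transformation in degrees; large-sample variance about 820.7/n.
        return np.degrees(np.arcsin(np.sqrt(p_hat)))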
The following table summarizes the results obtained.
Table 4.1
Exact Coverage Probability
(95% Nominal Coverage)

Run Number            1     2     3     4     5     6     7     8
p1 (x1 = -1.5)      .039  .029  .152  .087  .230  .319  .230  .152
p2 (x2 =  -.5)      .319  .206  .415  .230  .415  .415  .319  .230
p3 (x3 =   .5)      .708  .485  .708  .415  .614  .515  .415  .319
p4 (x4 =  1.5)      .971  .770  .928  .614  .794  .614  .515  .415
log LD(50)          -.04   .55  -.21   .93  -.07   .35  1.35  2.35

Sample Size                          Coverage
all n_i =  5        .999  .980  .997  .880  .999  .882  .660  .557
all n_i = 10        .994  .954  .986  .912  .995  .962  .74   .63
all n_i = 15        .979  .954  .94   .922  .987  .99   .779  .71
The first four lines of Table 4.1 give the probability of response, p_i, at log-dose, x_i, for the eight runs considered. The fifth line gives the true LD(50). The last three lines give the exact coverage probability for the various equal sample sizes. We were limited to relatively small samples in this investigation since there are, for example, more than 65,000 possible realizations (each of varying probability) associated with n_i = 15.

As was previously stated, the above examples are all linear in terms of log-dose against sin^{-1}(p^{1/2}). Four different slopes were used. Run 1 had the smallest slope, runs 2 and 3 the next smallest, runs 4 and 5 the second largest, and runs 6, 7, and 8 the largest. Runs 1 through 6 provide excellent small sample approximations, while runs 7 and 8 do not. For runs 7 and 8, there is substantial probability that all p̂_i's are less than .5, and hence LD(50) must often be estimated by extrapolation. We conjecture that convergence is slow whenever extrapolation is highly probable.
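The enumeration described above can be sketched as follows. The confidence interval is passed in as a placeholder function standing in for Corollary 3.2.1, and all names are illustrative; with four doses and n_i = 15 there are 16^4 = 65,536 possible outcomes, in agreement with the count quoted above.

    from itertools import product
    from math import comb

    def exact_coverage(p, n, true_log_ld50, interval_fn):
        # p: true response probabilities at the k doses; n: common per-dose sample size.
        # interval_fn maps a vector of observed counts to a (lower, upper) pair and
        # stands in here for the nominal 95% interval of Corollary 3.2.1.
        coverage = 0.0
        for counts in product(range(n + 1), repeat=len(p)):
            prob = 1.0
            for r_i, p_i in zip(counts, p):
                prob *= comb(n, r_i) * p_i**r_i * (1.0 - p_i)**(n - r_i)
            lower, upper = interval_fn(counts)
            if lower <= true_log_ld50 <= upper:
                coverage += prob
        return coverage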
4.2 Estimation of Relative Potency
by Various Linear Techniques
In this section we will compare the estimation of relative potency by various methods of analyzing quantal response assays. The data we will analyze are an example presented by Finney [3]. The data are the results of an assay of insulin. Mice were injected with varying doses of a standard preparation or with a test preparation, and the numbers showing the symptoms of collapse or convulsions were recorded. For the data of Finney [3], page 477, we obtained 95% nominal confidence intervals for the relative potency of the insulin as compared to the test preparation. Excellent linear fit was obtained for all methods. A summary of the analyses is given in Table 4.2.
Table 4.2
Estimation of Relative Potency

Method          R      LCL    UCL    χ²(P)
Probit        13.4    11.11  16.12   .28
Logit         13.38   11.04  16.16   .35
Angle (MLE)   13.50   11.31  16.05   .17
K-C           13.70   11.58  16.20   .35
I-R-L         13.72   11.66  16.18   .19

Legend: R     = estimated relative potency
        LCL   = lower 95% confidence limit
        UCL   = upper 95% confidence limit
        χ²(P) = chi-square statistic for parallelism, one degree of freedom
        MLE   = maximum likelihood estimation
        K-C   = Knudsen-Curtis method
        I-R-L = inverse regression: linear (Chapter III)
While the five methods give virtually identical results, the Knudsen-Curtis and inverse regression methods require the more elementary computations and are easier to explain to nonquantitative scientists. In situations where parallelism is reasonable but linearity is not, we can use inverse regression, whereas the other methods are not appropriate.




4.3 Summary
By use of numerical examples we have shown the applicability of inverse regression in analyzing quantal response assays. Since other methods of analysis are restricted to linear models, we have compared inverse regression to some other methods when a linear fit is excellent. As has been stated before, inverse regression can also be applied to quantal response assays when linearity is doubtful.
Finally, another reason to use inverse regression to analyze quantal response assays is the computational simplicity and ease of explaining the results to nonquantitative scientists.




BIBLIOGRAPHY
[1]  Bliss, C. I. (1939). The toxicity of poisons applied jointly. Ann. Appl. Biol. 26, 585-615.

[2]  Finney, D. J. (1971). Probit Analysis. 3rd Ed. Cambridge: University Press.

[3]  Finney, D. J. (1964). Statistical Method in Biological Assay. 2nd Ed. London: Griffin and Co.

[4]  Halperin, M. (1970). On inverse estimation in linear regression. Technometrics 12, 727-36.

[5]  Knudsen, L. F. and Curtis, J. M. (1947). The use of the angular transformation in biological assays. J. Amer. Statist. Assoc. 42, 889-902.

[6]  Krutchkoff, R. G. (1967). Classical and inverse regression methods of calibration. Technometrics 9, 425-39.

[7]  Moore, R. H. and Zeigler, R. K. (1967). The use of non-linear regression methods for analyzing sensitivity and quantal response data. Biometrics 23, 565-66.

[8]  Nelder, J. A. (1968). Weighted regression, quantal response data, and inverse polynomials. Biometrics 24, 979-85.

[9]  Rao, C. R. (1965). Linear Statistical Inference and Its Applications. New York: John Wiley and Sons, Inc.

[10] Saw, J. G. (1970). Letter to the editor. Technometrics 12, 937.

[11] Williams, E. J. (1969). A note on regression methods in calibration. Technometrics 11, 189-92.




Additional References

Berkson, J. (1944). Application of the logistic function to bio-assay. J. Amer. Statist. Assoc. 39, 357-65.

Krutchkoff, R. G. (1969). Classical and inverse regression methods of calibration in extrapolation. Technometrics 11, 605-8.

Litchfield, J. T. and Wilcoxon, F. (1949). A simplified method of evaluating dose response experiments. J. Pharmacol. Exp. Therapeutics 96, 99-113.

Patel, K. M. and Hoel, D. G. (1973). A generalized Jonckheere k-sample test against ordered alternatives when observations are subject to arbitrary right censorship. Comm. Statist. 2, 373-80.

Steel, R. G. and Torrie, J. H. (1960). Principles and Procedures in Statistics. New York: McGraw-Hill Book Co.




BIOGRAPHICAL SKETCH

Frank Hain Dietrich II was born on August 9, 1945. He was graduated from high school in June, 1963. In September of that year he enrolled in Wilkes College, receiving the degree of Bachelor of Arts with a major in mathematics in June, 1967. In September of that year he enrolled in Bucknell University, receiving the degree of Master of Arts with a major in mathematics in January, 1970.

The writer also taught high school for the school year 1968-1969 at Selinsgrove Area High School. He entered the University of Florida Graduate School in September, 1970. Mr. Dietrich has worked as a teaching assistant for the Department of Statistics since that time, simultaneously pursuing his work towards the degree of Doctor of Philosophy.




I certify that I have read this study and that in my
opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.
Jonathan J. Shuster, Chairman Associate Professor of Statistics
I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.
I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.
I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.
Pejaver V. Rao
Professor of Statistics




I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

Stratton H. Kerr
Professor of Entomology
This dissertation was submitted to the Graduate Faculty of the Department of Statistics in the College of Arts and Sciences and to the Graduate Council, and was accepted as partial fulfillment of the requirements for the degree of Doctor of Philosophy.
August, 1975
Dean, Graduate School




Full Text
xml version 1.0 encoding UTF-8 standalone no
fcla fda yes
!-- Quantal response assays by inverse regression ( Mixed Material ) --
METS:mets OBJID AA00062827_00001
xmlns:METS http:www.loc.govMETS
xmlns:xlink http:www.w3.org1999xlink
xmlns:xsi http:www.w3.org2001XMLSchema-instance
xmlns:daitss http:www.fcla.edudlsmddaitss
xmlns:mods http:www.loc.govmodsv3
xmlns:sobekcm http:digital.uflib.ufl.edumetadatasobekcm
xmlns:lom http:digital.uflib.ufl.edumetadatasobekcm_lom
xsi:schemaLocation
http:www.loc.govstandardsmetsmets.xsd
http:www.fcla.edudlsmddaitssdaitss.xsd
http:www.loc.govmodsv3mods-3-4.xsd
http:digital.uflib.ufl.edumetadatasobekcmsobekcm.xsd
METS:metsHdr CREATEDATE 2019-02-15T12:38:20Z ID LASTMODDATE 2019-02-15T11:36:33Z RECORDSTATUS COMPLETE
METS:agent ROLE CREATOR TYPE ORGANIZATION
METS:name UF,University of Florida
OTHERTYPE SOFTWARE OTHER
Go UFDC FDA Preparation Tool
INDIVIDUAL
UFAD\renner
METS:dmdSec DMD1
METS:mdWrap MDTYPE MODS MIMETYPE textxml LABEL Metadata
METS:xmlData
mods:mods
mods:accessCondition The University of Florida George A. Smathers Libraries respect the intellectual property rights of others and do not claim any copyright interest in this item. This item may be protected by copyright but is made available here under a claim of fair use (17 U.S.C. §107) for non-profit research and educational purposes. Users of this work have responsibility for determining copyright status prior to reusing, publishing or reproducing this item for purposes other than what is allowed by fair use or other copyright exemptions. Any reuse of this item in excess of fair use or other copyright exemptions requires permission of the copyright holder. The Smathers Libraries would like to learn more about this item and invite individuals or organizations to contact the RDS coordinator (ufdissertations@uflib.ufl.edu) with any additional information they can provide.
mods:identifier type ALEPH 025344667
OCLC 02832918
mods:language
mods:languageTerm text English
code authority iso639-2b eng
mods:location
mods:physicalLocation University of Florida
UF
mods:url access object in context http://ufdc.ufl.edu/AA00062827/00001
mods:name personal
mods:namePart Dietrich, Frank H
given Frank H
family Dietrich
mods:role
mods:roleTerm Main Entity
mods:note thesis Thesis--University of Florida.
bibliography Bibliography: leaves 88-89.
Typescript.
Vita.
statement of responsibility by Frank Hain Dietrich II.
mods:originInfo
mods:place
mods:placeTerm marccountry flu
mods:dateIssued 1975
marc 1975
point start 1975
mods:recordInfo
mods:recordIdentifier source sobekcm AA00062827_00001
mods:recordCreationDate 770325
mods:recordOrigin Imported from (ALEPH)025344667
mods:recordContentSource University of Florida
marcorg fug
FUG
mods:languageOfCataloging
English
eng
mods:relatedItem original
mods:physicalDescription
mods:extent viii, 90 leaves : ; 28cm.
mods:subject SUBJ690_1
mods:topic Statistics thesis Ph. D
SUBJ690_2
Dissertations, Academic
Statistics
mods:geographic UF
mods:titleInfo
mods:title Quantal response assays by inverse regression
alternative displayLabel Added title page
Inverse regression, Quantal response assays by
Regression, Quantal response assays by inverse
mods:typeOfResource mixed material
DMD2
OTHERMDTYPE SOBEKCM SobekCM Custom
sobekcm:procParam
sobekcm:Aggregation ALL
UFIR
UFETD
IUF
GRADWORKS
sobekcm:MainThumbnail 00001thm.jpg
sobekcm:Wordmark UFIR
sobekcm:Tickler RTDO.18.07
RTDS.18.04
sobekcm:bibDesc
sobekcm:BibID AA00062827
sobekcm:VID 00001
sobekcm:EncodingLevel I
sobekcm:Source
sobekcm:statement UF University of Florida
sobekcm:SortDate 720988
METS:amdSec
METS:digiprovMD DIGIPROV1
DAITSS Archiving Information
daitss:daitss
daitss:AGREEMENT_INFO ACCOUNT PROJECT UFDC
METS:techMD TECH1
File Technical Details
sobekcm:FileInfo
sobekcm:File fileid JP21 width 2216 height 3184
JPEG1 630 905
JPEG2
JP22
JPEG3
JP23
JPEG4
JP24
JPEG5
JP25
JPEG6
JP26
JPEG7 911
JP27 2188 3164
JPEG8 908
JP28 2202 3174
JPEG9 913
JP29 2180 3158
JPEG10
JP210
JPEG11 912
JP211 2182 3160
JPEG12 920
JP212 2146 3134
JPEG13
JP213 2186
JPEG14
JP214
JPEG15
JP215
JPEG16
JP216 2184 3162
JPEG17 898
JP217 2251 3209
JPEG18 909
JP218 2196 3170
JPEG19
JP219 2206 3178
JPEG20 915
JP220 2170 3152
JPEG21 914
JP221 2176 3156
JPEG22
JP222
JPEG23
JP223
JPEG24
JP224
JPEG25
JP225 2174 3154
JPEG26
JP226 2168 3150
JPEG27
JP227
JPEG28
JP228
JPEG29
JP229
JPEG30
JP230
JPEG31
JP231
JPEG32
JP232
JPEG33
JP233
JPEG34
JP234
JPEG35
JP235
JPEG36 907
JP236 2210 3180
JPEG37
JP237
JPEG38
JP238
JPEG39
JP239
JPEG40 910
JP240 2194 3168
JPEG41
JP241
JPEG42
JP242
JPEG43
JP243
JPEG44
JP244
JPEG45
JP245 2190 3166
JPEG46
JP246
JPEG47
JP247
JPEG48
JP248
JPEG49 906
JP249 2212 3182
JPEG50
JP250
JPEG51
JP251 2208
JPEG52
JP252
JPEG53
JP253
JPEG54
JP254
JPEG55
JP255
JPEG56 897
JP256 2254 3210
JPEG57
JP257
JPEG58
JP258
JPEG59 899
JP259 2247 3206
JPEG60
JP260
JPEG61
JP261
JPEG62 893
JP262 2273 3223
JPEG63
JP263 2248
JPEG64
JP264
JPEG65
JP265
JPEG66
JP266
JPEG67
JP267
JPEG68
JP268
JPEG69 894
JP269 2268 3220
JPEG70 890
JP270 2290 3235
JPEG71
JP271 2172
JPEG72
JP272
JPEG73
JP273
JPEG74
JP274
JPEG75
JP275
JPEG76
JP276
JPEG77
JP277
JPEG78
JP278
JPEG79
JP279
JPEG80
JP280
JPEG81 917
JP281 2160 3144
JPEG82
JP282 2276 3225
JPEG83
JP283 2277 3226
JPEG84
JP284
JPEG85 888
JP285 2300 3242
JPEG86
JP286 2256 3212
JPEG87
JP287 2291 3236
JPEG88 916
JP288 2166 3148
JPEG89 891
JP289 2287 3233
JPEG90
JP290 2192
JPEG91
JP291
JPEG92
JP292
JPEG93
JP293
JPEG94
JP294 2178
JPEG95
JP295
JPEG96
JP296
JPEG97
JP297
JPEG98
JP298
JPEG99
JP299 2200 3172
JPEG100
JP2100
METS:fileSec
METS:fileGrp USE reference
METS:file GROUPID G1 imagejp2 CHECKSUM 8e8b974150809e9236933b877fed6929 CHECKSUMTYPE MD5 SIZE 882040
METS:FLocat LOCTYPE OTHERLOCTYPE SYSTEM xlink:href 00001.jp2
G2 3271f2c510d475904be90670140ec6a5 882061
00002.jp2
G3 24e4bd04650aefe14b5c66c5e7a3bbaf
00003.jp2
G4 0999e7ebefbab6327bdfbd4e982a8d73 882076
00004.jp2
G5 e83fcfdd57ae705b5440d7d55c1307a3 882064
00005.jp2
G6 cf497f1b2fa30897a83ea69ef0c08c47 882018
00006.jp2
G7 fb2ff83a826c16376e1fa6ef8117a1ec 865009
00007.jp2
G8 25d55e806437a5b3f6dacc17d1c95417 873751
00008.jp2
G9 6820469bdd2748123721e267997bef70 860608
00009.jp2
G10 5af74a97ecb9a36780f602ca769a6135 865463
00010.jp2
G11 adb506f47a8641541213c7d59f372c83 861971
00011.jp2
G12 f7a5da8e4782bffc62168d19324771fb 840566
00012.jp2
G13 5ebecadd35a2da8d5881852fdbdbc978 864672
00013.jp2
G14 22bb9d06bd90229600ffb111645002fc 861989
00014.jp2
G15 5221cb1ae0eaf327b51d1ec5492c0687 861817
00015.jp2
G16 90154a5ea2bef3ba0c840c2cee792c2d 863324
00016.jp2
G17 ee324817d3fd495945398864cd69bdf7 903021
00017.jp2
G18 a017d8e59bc3226cf95bd2dcb74f7f5a 870273
00018.jp2
G19 dce2fdb342b0f9b17f09e29ccc833bc8 876190
00019.jp2
G20 f15aadf810197b6cb5c8ae095bda3696 855090
00020.jp2
G21 8bc4c7add989da7303b921143084fce5 858509
00021.jp2
G22 dedf8ac8077a24bfec8a1c6af689e343 863121
00022.jp2
G23 1e0f1637e442aa126246b737c863b91d 861997
00023.jp2
G24 37e27ddda206acaa7cd583c547cff1b7 858540
00024.jp2
G25 bcae0e04d1e695693d3411b1bfd8ed56 857188
00025.jp2
G26 b478a5e90a2a6cbcdb61162dcf4254de 853741
00026.jp2
G27 d40bb933d9726770c61d278d3a42e87b 882049
00027.jp2
G28 ad2388f88e434724f8b3edd50e8eabae 882075
00028.jp2
G29 c91498dfcda49763b157a339eb81b3fe 882059
00029.jp2
G30 f0588fa34dc2f4a713d8157526636c6e 882078
00030.jp2
G31 47ba0a25e8e45feeb43b6e4c2320a959 882031
00031.jp2
G32 ce5928099f6fbdd93036d4d8d53cd7eb 882079
00032.jp2
G33 53ee7554cd77a59812d33948bf0955dd 882077
00033.jp2
G34 b3f9511a948af386d7bcd6e5b4f2df21 882065
00034.jp2
G35 6c63bb04b46733db0436f3b8ae4bb414
00035.jp2
G36 b0e6deccc7ea88296422eecb684c773c 878567
00036.jp2
G37 06a96eed6bcec3b9e0f4335d59574968 882058
00037.jp2
G38 17dbe62f9f5601f80255471038efe5ec 881959
00038.jp2
G39 b3dffe38d50ba418e9422adf9c5b2b65 882063
00039.jp2
G40 64c365ad377879837c4a545ab5cd048d 868935
00040.jp2
G41 5cd87e43405c22c8f2cf7a115c1e4ac0 882056
00041.jp2
G42 dd6caeb51b8a289f1114653556a968ff 878584
00042.jp2
G43 613fc2d903b8c7d10d79f6e006cf0baa 882071
00043.jp2
G44 62c836502b256d6e00f9833f90b48100
00044.jp2
G45 99cfbaf07c66558899a19cb6b0fb8741 866790
00045.jp2
G46 39b4a71d3b2f0efe8e3d42a0173b3cdb 881819
00046.jp2
G47 e2f57d20e6660de1d76148fa1082bfb4 882046
00047.jp2
G48 3d60e9429564f66985ebd4042a332a05 878315
00048.jp2
G49 117b95b804c89e184e7b96816327b086 879779
00049.jp2
G50 0f4bdbad061605539efb2fd96a9e4a9f
00050.jp2
G51 9bec391ed04160be953c3248a567ede3 877120
00051.jp2
G52 8b248ef9d1666b51f947fa0ec86bad9e 881948
00052.jp2
G53 9e54c3bf63e8236e1e480388898bd6ca
00053.jp2
G54 174b043ff9316099cb3794eca13fcad9
00054.jp2
G55 6ed043e0dae786ef8c2f57bbe2c7a07d 866757
00055.jp2
G56 d06a092f10d8b0b0c2495d94074fa24b 904494
00056.jp2
G57 257f1686c36f12e2ab6ad87fbbeda905 882066
00057.jp2
G58 96fd38be489d0f1329c1b0c3a057fdc3 882074
00058.jp2
G59 71141dfe116aba19ffe05009028a6eb1 900595
00059.jp2
G60 980494100f767619f89332a36c1a96d3 882037
00060.jp2
G61 a90c4c6525d433ff8b88cea608296117 881988
00061.jp2
G62 b50b0f410d4cd3cc5e573598a04be967 915830
00062.jp2
G63 f099b20bbd7656e223268fad728635f2 900991
00063.jp2
G64 ad88f7c74970fcab100650ac3284f23d
00064.jp2
G65 8649c8713718134ec6fecb253aadfda5 882057
00065.jp2
G66 5deda3ed72bbbdf407cbce0cfcc65a69 882073
00066.jp2
G67 9668b31169ef8b01c7ac2669d1825987 882062
00067.jp2
G68 83e410ff77d365dfbcb7f9546b202bda 866728
00068.jp2
G69 78f89e47118481803ce0b71fa2a709dd 912932
00069.jp2
G70 a17e3b681129a16eac213ebb4c80a440 926109
00070.jp2
G71 ddc85fc5b5f46ca7e1da1465581f787f 855853
00071.jp2
G72 ff361867a4bf00ae6dce935c2f2d0015 857184
00072.jp2
G73 fa9bb875957894e235caffe81f61f0ba
00073.jp2
G74 96d5e3e4fec44ebeef19f03405c2ed26 882051
00074.jp2
G75 ca264077bd48eed2c9923189a7153b97 912956
00075.jp2
G76 d77e7258a0f1a3c0297358943ac3a5db 882070
00076.jp2
G77 ad977c6d35ec9fe4883bd2ada19e8acc 857161
00077.jp2
G78 b50ea478d3c4614132f51c976bc0117a 873731
00078.jp2
G79 34069da26f35d57bf03b9211cb282aea 855056
00079.jp2
G80 13731b14b3e4c8758d7c3029b124cab2 858534
00080.jp2
G81 f9c04003d4581b12392efa36ea431370 848986
00081.jp2
G82 6e68d676f36602bbbe9a7781f461acd4 917613
00082.jp2
G83 7ffed1d3a592ee0882b2767bce8afc44 918284
00083.jp2
G84 a09f2585e0077d8eede620203f581dcc 857154
00084.jp2
G85 dcfb1132799383ddd052727616052cb2 932137
00085.jp2
G86 a221205a41d4d89b889c3478706d8e6b 905886
00086.jp2
G87 c9dd799977de7b3711c0616dbd038c7e 926820
00087.jp2
G88 ae4fe1096f84cc7263002d51f5b57be8 852430
00088.jp2
G89 5f2777c192d794107654734a5976b542 924324
00089.jp2
G90 c614f9721a1c402d864c5bf610187190 868122
00090.jp2
G91 472ea6d1625be2b16edb50362ffbe3c5 866791
00091.jp2
G92 6946e0dc3d1b6d93c52c2848ec895164 855076
00092.jp2
G93 17160f138890d7e0dfab4819c2109008 876075
00093.jp2
G94 9c2cc0576f4d8b98dcb6d3ee75ec525c 859755
00094.jp2
G95 27bf8af0674c24d39758d864ffe26eac 870200
00095.jp2
G96 89f945c947fb33318f1f3d53176ad4da 865465
00096.jp2
G97 7907b17c357a8ebe58a55a50b24dadb2 860610
00097.jp2
G98 bcabdb7b00f855cb200477b0a165ea32 878370
00098.jp2
G99 7af8bd3b297c9c6ea48efd7beedc6880 872391
00099.jp2
G100 b7780867be66256024f3db7b7c88cf40 852722
00100.jp2
imagejpeg d432cb6caf14394395a93e69bbe89acc 171629
00001.jpg
JPEG1.2 a4fb6d085c6fbb592106e2ee5f5fe168 30326
00001.QC.jpg
cce0fa18a2e1c1bbef210a7be3f90bc4 157568
00002.jpg
JPEG2.2 e1a5cf2a9fe1519446b862d1cc7678c3 25750
00002.QC.jpg
d91f3838e5c8691a4ab8a31ee5659c4f 176181
00003.jpg
JPEG3.2 a53195faed7a2d08d16399faa7201dd9 32200
00003.QC.jpg
01ca4b19bb4d0a86285a3ee90299f863 209292
00004.jpg
JPEG4.2 7ec1776149ee156d5fe5e7c0c47de1c2 45556
00004.QC.jpg
7e4de503e6b4fc0f7f12cf9a369773ea 167396
00005.jpg
JPEG5.2 6f574fcab5ca3d579129303db85eeea9 29912
00005.QC.jpg
884af040afdd625641e2352e86e4b36b 163985
00006.jpg
JPEG6.2 5d5b0ebdc6fb706f37ea17e047e58c68 28340
00006.QC.jpg
b388e72422ec0ae077100fae8577fc0a 224951
00007.jpg
JPEG7.2 d2e5fa4a2044f166efb84d94d32cfb9c 48970
00007.QC.jpg
28f194a48bf5904d2143aa340b71e32e 185476
00008.jpg
JPEG8.2 b98a621d55a1641786e6cbae8c7d7d16 35161
00008.QC.jpg
264b99883ea4fb7f8e9fdd521eb64386 217395
00009.jpg
JPEG9.2 6f120f6832b6c39d2355a9160832250d 46098
00009.QC.jpg
d3062262ff272015a4a1b5a172ada65c 234936
00010.jpg
JPEG10.2 76136a1617ad3f2840e44d75851869e2 53604
00010.QC.jpg
8eea19c1ba79687b933f7bac12f8d784 237798
00011.jpg
JPEG11.2 3a475d154cb2c58320a02cae2bad8342 54063
00011.QC.jpg
54b842242ad3147dca109ea1a8a3248b 240000
00012.jpg
JPEG12.2 0f89997ec9ec386f1d833b20b8043b05 55314
00012.QC.jpg
937e32590a949aadef57f5ec63eb4077 224922
00013.jpg
JPEG13.2 fbd5e2b1d21c21c2be6147a76b440b3b 48712
00013.QC.jpg
be68bc0adc50229a1732811a6f5f7206 218465
00014.jpg
JPEG14.2 86aec8f591d816edd8c98c8331664d61 47109
00014.QC.jpg
d4c97b3dfc5c97f89312b247111829c3 237467
00015.jpg
JPEG15.2 e0b2a2be9aff5c61b4cc2e8201d611d9 54193
00015.QC.jpg
b82b4a2282fdf6e24bd8cbc641507466 215657
00016.jpg
JPEG16.2 e233e53dd28224fc5b517d3cab66b21b 44918
00016.QC.jpg
e25b1d126eaf6855d30aea9693f3e8ef 200030
00017.jpg
JPEG17.2 4ddb075bac0f4c310ff25e42f5985d5f 52727
00017.QC.jpg
e6cf1e041083f8060054cd80a9eaeecb 193290
00018.jpg
JPEG18.2 d4a7edd9f992607f518fe4b30c34b4c6 37839
00018.QC.jpg
0fb19e42a454ff1b42f487f471adec5e 226701
00019.jpg
JPEG19.2 92561b8c9460cb9c0d003d864bd3b9a3 49855
00019.QC.jpg
8d16f9c43e493ba1096f09fdaf54275f 231534
00020.jpg
JPEG20.2 872b32a0e81faf06591c6b75b0fc875c 53679
00020.QC.jpg
5e2f027520113705a191e989b3e045a9 228137
00021.jpg
JPEG21.2 e141b87416d67fbfbb8717ef3b0cc39c 50842
00021.QC.jpg
9417ec3ead16a9bbe799447cbb3572ad 214037
00022.jpg
JPEG22.2 e88edd68410793a89aab894e36748c76 46234
00022.QC.jpg
9154b9518015a5e3aa245bd4f910d9aa 215315
00023.jpg
JPEG23.2 11f0c602426204d78b657e8856ed45b7 45321
00023.QC.jpg
9cbef0b9c919ad5fd44bf6601381c940 231061
00024.jpg
JPEG24.2 b4815b1066bfd320fb4ee4ccd430042f 50934
00024.QC.jpg
a20fa146e060072cac5d728424db026f 224264
00025.jpg
JPEG25.2 308c0cfc7afea13554ab75f966bb928e 50388
00025.QC.jpg
b446c933fe0c3ecd44aa50d96737ebab 202773
00026.jpg
JPEG26.2 25f7f1a4cc85e54d0d8d4928f88626e8 40843
00026.QC.jpg
0557a814a1f53c28f17d2df03f040754 192858
00027.jpg
JPEG27.2 73ac9ceb1825a897eb7dc222413f7f68 39636
00027.QC.jpg
fdbcea5d4677dd33e0d9ff535533f320 185326
00028.jpg
JPEG28.2 b66b19f5c33c46b3cdf49569afa00da4 36347
00028.QC.jpg
5831f28174a3223016238c0fde3440b7 190409
00029.jpg
JPEG29.2 9e1429152d23d41c5031f66384ebc8ac 39135
00029.QC.jpg
11acebe2f664c3c9f85915a286547dfb 178575
00030.jpg
JPEG30.2 f5869345af55de6ed4db597cd2d77c4c 33988
00030.QC.jpg
886c967ef8e8bcaed4c70d8a3d18273b 187559
00031.jpg
JPEG31.2 9e8bb599b937c6a01333f4087f6c562f 36486
00031.QC.jpg
b1440de6baabfc5567eefb50a2980d5d 179197
00032.jpg
JPEG32.2 7b5db3c3fabe782e856c327f99eb3df0 34571
00032.QC.jpg
cc645ca86573ebfda0622e8f8ff3c5e0 182086
00033.jpg
JPEG33.2 c3e78b2f006367dcb6acbc02fb750416 35026
00033.QC.jpg
82adf7e716e9f0955e2f8f3ec1ce1166 186318
00034.jpg
JPEG34.2 803ef1c20651dff540153caf21d4aa23 37526
00034.QC.jpg
d006ed89898bf52ad094ce2d15ea5ad9 185467
00035.jpg
JPEG35.2 84529a25e42e32a2e288f3dce7f8eef9 36200
00035.QC.jpg
b94f588936ba6cd146b45df728bdbad3 202995
00036.jpg
JPEG36.2 e62f4c438f34a1bedbcee390d2a69869 42546
00036.QC.jpg
f951b1c9550ea982b6edf2a585a0ba51 188541
00037.jpg
JPEG37.2 c18721bc8623bf804f9daa145dc659a6 36849
00037.QC.jpg
2c1e1736e8f5b1395690c35e5d053e43 196399
00038.jpg
JPEG38.2 46477564ae7570055ac3f31a59282534 40790
00038.QC.jpg
6e14933da6cb3e1c9df77dc01815db69 188397
00039.jpg
JPEG39.2 bbc620d409eb8b789257650162316c84 37255
00039.QC.jpg
dfeeffcba01feeab8428eaefc252567d 215226
00040.jpg
JPEG40.2 88a02707fba0f0e262d61fd6ba51fa3a 44980
00040.QC.jpg
1bb7284106043850f8ca246e53960436 182899
00041.jpg
JPEG41.2 fc242c45abfcb426e16bcb9d14a59714 34588
00041.QC.jpg
421b3c9e720cf09bb027492f18c7de5e 202798
00042.jpg
JPEG42.2 19d0c09cf47d5b0f84664240d0d7b513 41281
00042.QC.jpg
debc1845aec1405d5c963a388ed9b5da 181346
00043.jpg
JPEG43.2 ea72fa2dd4767d4ac80fcfab37497b2f 35080
00043.QC.jpg
a0ba3151ec09d3f3b1c86adb9009a129 193684
00044.jpg
JPEG44.2 6beb5448567e175b247ed83344c2fdca 51505
00044.QC.jpg
bdca0313313b40b8e91f677067d4abe5 179712
00045.jpg
JPEG45.2 948ce325582fa15f73e519623898ab09 32818
00045.QC.jpg
d88797c480f4f7e2e72c81fe051b0cd4 203220
00046.jpg
JPEG46.2 9ff8da013269ce097322efe8adb7d8f9 43455
00046.QC.jpg
eb3756320ac9605b64e97dffc0ddb3e1 178717
00047.jpg
JPEG47.2 c082decf96291c43add9c4637aeff1b7 34670
00047.QC.jpg
a76b9723950fa6812d63195d84d54b06 235077
00048.jpg
JPEG48.2 dee6f06fb9a0dfa84058c338bc8db8c2 54161
00048.QC.jpg
fd18f59c31578dd10c1ea71a266238c1 217184
00049.jpg
JPEG49.2 d94904add05b4542fd6fa6afd1fc02d7 47329
00049.QC.jpg
ece59d10aeeadf9713d967d69d988ce4 178204
00050.jpg
JPEG50.2 e161c77a1b8e1a676bc1994fe9ed2b84 33858
00050.QC.jpg
839d2f3e62ff136d97d065a83323ede9 190874
00051.jpg
JPEG51.2 81975b139a59b1004b2261a0840d57d5 36038
00051.QC.jpg
73d49fe582d5057e0cfd7682fbaf830b 176185
00052.jpg
JPEG52.2 2a15d25052fea69235a9bfdbf5671ee2 32876
00052.QC.jpg
980e448eb2881d91d6845425ea8eaa82 178232
00053.jpg
JPEG53.2 75dbfb896bde3ae8d780ec15a9ee9b59 33734
00053.QC.jpg
c7377ab99e8b4d058f243d545b4867a3 195875
00054.jpg
JPEG54.2 4bfacb5048ea507e52652b796c7d5073 39778
00054.QC.jpg
cb8d9134760a1d3238e7f3624dcace16 209840
00055.jpg
JPEG55.2 70f06e92ca2d25b1456318d6bce95f42 44057
00055.QC.jpg
dfc1a8464c79be75c171e0b5b0b69c64 220631
00056.jpg
JPEG56.2 feea9db98c02224fdeba632115d73fcb 61289
00056.QC.jpg
62f1109f75eb003702befcc8cbc55cab 174146
00057.jpg
JPEG57.2 95be0d60e60947aedded1506ab60531d 32562
00057.QC.jpg
3716a763b3c616bc0fdfbdc6081a86c9 186721
00058.jpg
JPEG58.2 a233018aa4b1191c6cba3be6ac753e6d 36911
00058.QC.jpg
5113d5ead954d5a10972347a979d7de9 208462
00059.jpg
JPEG59.2 7dc69b0e96b9b762053a0edd4c1f687c 57777
00059.QC.jpg
18df144290bb5cc226283417af3c8efb 187864
00060.jpg
JPEG60.2 08bd84a3a4c5f85b6f413e07b111b5a0 38377
00060.QC.jpg
74e2602d7fe4fd44c55b84c90a111bf5 167894
00061.jpg
JPEG61.2 ea03dd027b3186a39517059c76152c18 30110
00061.QC.jpg
fa0c4d7863f52fb673808d3048e54604 204916
00062.jpg
JPEG62.2 a4b3aa93e9181cfa985bc7718bb1eb41 55925
00062.QC.jpg
186812e50ee5a049617b4bec56397d78 196686
00063.jpg
JPEG63.2 c550f88887196cbc46293f4656712e37 52574
00063.QC.jpg
7202ae88e3293c023791c7943752d08e 182713
00064.jpg
JPEG64.2 794384cc2f23c6e5c53e4d926306dbdf 34795
00064.QC.jpg
360bcc38344fe613e476ed19ac87473b 184299
00065.jpg
JPEG65.2 a5a19ab221fc3e288149bb2a851e579e 35760
00065.QC.jpg
95d6d193bdaa4dbe4034fe61060b738f 181167
00066.jpg
JPEG66.2 b0572ff405ba6a6e55a147f4dd5b1968 34361
00066.QC.jpg
c6b3f6ab06081d7de96bd9e21cdd8c2d 178414
00067.jpg
JPEG67.2 77b59965c1d1e028a357e9b361213ba2 33056
00067.QC.jpg
5cf403b06c5eb32016dad2d194e91f74 192883
00068.jpg
JPEG68.2 6cd303cdd24cb0b6dacc576ad651775f 37630
00068.QC.jpg
3b83a3cc68b98937d9e3d14f62a50eff 195453
00069.jpg
JPEG69.2 971687571f1bb250457adc55aefbd327 53181
00069.QC.jpg
27fb50e5a92d08de3fbb92ff74a861a9 190004
00070.jpg
JPEG70.2 9ade860a219caeaafa121e9b9b0461da 50602
00070.QC.jpg
be810bc0116a1e62e671d297da4e4225 179063
00071.jpg
JPEG71.2 be3ad2f04a9a658d2bc57db42656bb7c 32620
00071.QC.jpg
51029ac9ac351d393d90453dc72aed3e 183030
00072.jpg
JPEG72.2 abb6207d7c73d046bb2b072f1e64400e 34340
00072.QC.jpg
2687fee3e1c034d262fec3317d2d61ee 180369
00073.jpg
JPEG73.2 940df1356e3f21985f632991d0416380 34353
00073.QC.jpg
1b672a93f7a4ef0a84d17aa34448c461 170832
00074.jpg
JPEG74.2 f1900d454928a2175fe6fc5acb0fee51 30853
00074.QC.jpg
b562c90283ac8f031111317c57ca1440 195285
00075.jpg
JPEG75.2 82f3b7e577d87371d34228b7ea4dfd93 52110
00075.QC.jpg
c4a5e94f55a6ad967dff1afb8c7baa8c 182489
00076.jpg
JPEG76.2 929e28a91bc7af94a9093080b8197b87 35774
00076.QC.jpg
5f0eb681a5d3f02427c6e3bdf61ffcb3
00077.jpg
JPEG77.2 014ba151298c5b6c21bbfbe6838a6552 35206
00077.QC.jpg
8d272cdaca03cda564078bb64e6ec5bb 202729
00078.jpg
JPEG78.2 283fa5766e26f30b506e83e2d967e6e8 42467
00078.QC.jpg
f32829ce49f27e4321b6c7b3bc61ecb4 199576
00079.jpg
JPEG79.2 0537431af4b5ce433712de962b4c9fcf 40316
00079.QC.jpg
a4dd461c90385e4c94dc8bbe60f89676 220844
00080.jpg
JPEG80.2 3fedc50f618eac4a6ef0fe130206d164
00080.QC.jpg
c83e445735b4e282a3bd025d6ecf74e9 192544
00081.jpg
JPEG81.2 ff0b05aee1611e7688fe865662086d9c 38014
00081.QC.jpg
0accbe463237baea242bccb0c4dd6d3f 192451
00082.jpg
JPEG82.2 0c389c7337c508c8e85dfebf9ecc4e14 52366
00082.QC.jpg
d52e05dc4fa9b62f149f6331efe45624 195473
00083.jpg
JPEG83.2 3bbaa6b12d8d0efc55b0f21bfb9da939 52634
00083.QC.jpg
c829e4426ab3d608fda273598cdf3470 215064
00084.jpg
JPEG84.2 e295b0d65f4604966ba802a72ea5bb02 46977
00084.QC.jpg
c1a373e59886b4b9be982ed4e4ad4fe3 194513
00085.jpg
JPEG85.2 3aff0c4bc4c87ced929b062642f16009 53820
00085.QC.jpg
079c92ff36c0054d6daea337098f645a 189432
00086.jpg
JPEG86.2 5b3f43d52550f80b5c74f0beb4f70cf5 49995
00086.QC.jpg
0344fa4c9f57ee5db144ffe74331f3da 199681
00087.jpg
JPEG87.2 ead1e4dabc4b147262f239840328079c 54471
00087.QC.jpg
7a74eb5424fa6c43430cc5ad14798820 206685
00088.jpg
JPEG88.2 4dc4f0c4be9e8a5d8532e7f05d4f2c4f 41236
00088.QC.jpg
3c5b4234e69a46a3fc93861d5eac779f 188980
00089.jpg
JPEG89.2 a8e3e4a2e107f3f625f403b20a0994b7 50456
00089.QC.jpg
852ca578963082b07371676d472e2fef 210842
00090.jpg
JPEG90.2 fb142f4a7e7e5dc6377c5222a73e7904 43692
00090.QC.jpg
bd87a431c2269e9fe43165828b23de21 205029
00091.jpg
JPEG91.2 d0c50b6d23f02bc5456b7a6bf922e4b2 41670
00091.QC.jpg
db4b084250cbfbcd1715bc04c8d3a91f 215395
00092.jpg
JPEG92.2 ae6f6242384c28ebf6db7d045d444b9c 46139
00092.QC.jpg
d75d668609b540be0ceb1779191d7b1c 235147
00093.jpg
JPEG93.2 b0b6ad30c48d7a0cf827d74d2dbc115e 53830
00093.QC.jpg
7c6a0252fb6d2115434424c80113b9aa 217524
00094.jpg
JPEG94.2 bffabea80b775686bed4903041a9aae5 45630
00094.QC.jpg
cb3e47be15cf20d5067e9c39100c59bf 195765
00095.jpg
JPEG95.2 8b1c8929423403ba4a98f85e10e56f1f 38207
00095.QC.jpg
a2c1eeb4755dc3731418a93936864fe2 226757
00096.jpg
JPEG96.2 3e939ebab03fd6571709f9186326f934 47849
00096.QC.jpg
7878f718b04d1845f2918a4137612cb6 202489
00097.jpg
JPEG97.2 46c3e661dd5f846463906558686e0a4d 38159
00097.QC.jpg
739cf9de3dc5be3595fe4c541fba4728 204911
00098.jpg
JPEG98.2 35a8b7d03b9766a9d56b76d6c44e128a 41655
00098.QC.jpg
8640d8f41a93d73318b912376a917e69 229257
00099.jpg
JPEG99.2 70a4d8418603120f6a4e7b37cb785765 48187
00099.QC.jpg
2401decb36f0664cfffd0d78dbd06994 188383
00100.jpg
JPEG100.2 9bfe66bcc1e3411dcdba625481311d9b 35167
00100.QC.jpg
archive
TIF1 imagetiff b9d1cf04ed17e43b89f7688154f330d2 21193072
00001.tif
TIF2 423b13c758ca5927f6b733ddad7b5d39
00002.tif
TIF3 2bbcae9ac644faea7bc9bc0d5154c7ff
00003.tif
TIF4 8041aa507ddc061a5d4fd54f2aa95e21
00004.tif
TIF5 8c8dc3f52e29de8ba12250a556f6ca80
00005.tif
TIF6 8b57a76a36532fce7a37052ef0b052b5
00006.tif
TIF7 69d23d0a91cd7c41b71d6e3d906c31c9 20794176
00007.tif
TIF8 28e2b7bd6788dff44cae486761459672 20993204
00008.tif
TIF9 b6e39594623ddabd89c73273cbaea7e5 20678952
00009.tif
TIF10 9a65df74cca83421b87a3dc7cb08f002
00010.tif
TIF11 2a2dbb038474ae2feec6b33c2b286db0 20711008
00011.tif
TIF12 84d49fb04462c3f3024e7d71ad54b541 20202132
00012.tif
TIF13 e41b77e5a317b14310f2aad3a4e1a89b 20775192
00013.tif
TIF14 f913cd3b0d455de2e17e0b604c9a6f70
00014.tif
TIF15 b6c37e29b2a356d73b6ea3bf71065e4a
00015.tif
TIF16 2099d5029eff60d1f7a37f586da1b789 20743088
00016.tif
TIF17 df53e32e0dc1cfecd7879c4275aabcb7 21689860
00017.tif
TIF18 888b6601c57067a020d0a45daf1b0f2b 20909688
00018.tif
TIF19 d992191800ff4194cfcbff02b405cb1b 21057796
00019.tif
TIF20 4480cfb39a5cf49d6f6f91ab5d2d58c2 20545104
00020.tif
TIF21 47dc3788a1849e5567c58d9d5b8e07b7 20627984
00021.tif
TIF22 7717c71dd771e113e1cb06638e642611
00022.tif
TIF23 b504db0421df72488efd5d87d1cdf39e
00023.tif
TIF24 7690c3cca77ff6422b7a054c6f9f0ec2
00024.tif
TIF25 80f554465406fd6dcae777d313997779 20595988
00025.tif
TIF26 177a7cd97f6882f0c8d13515b780c47e 20513168
00026.tif
TIF27 8d62971273336e66874b3c735a93e607
00027.tif
TIF28 be683388484950547ddaf67698c13fa1
00028.tif
TIF29 b67566bc34853303ec34c709edb96f2b
00029.tif
TIF30 ca5cc22d790efbd79cf5ecf9159d87ca
00030.tif
TIF31 46c6576e98f8837cc1d6477009132245
00031.tif
TIF32 ff65fe03d05e3c1d7879fa40def773ff
00032.tif
TIF33 fbab64d07b037dd6f768d0254ec9f2ac
00033.tif
TIF34 623635119d0ec08712dcf6f011ee3542
00034.tif
TIF35 772d10440b501759f426f7054b1e1e30
00035.tif
TIF36 f0dd96a5cca21e54706bbd50479cb9df 21109208
00036.tif
TIF37 7b30e5cf6e60e8bf4f894344445094e2
00037.tif
TIF38 ebd820e9a216ee42f8b44114200964ec
00038.tif
TIF39 ffd184c1f5a51ced3ad04dd5ee793c86
00039.tif
TIF40 50dfafa119b705b3501d8c464b5dd070 20877488
00040.tif
TIF41 a68d768fb353ca152c64f9eac80b8615
00041.tif
TIF42 2e5d34ad003c001b8a080331c9f9c4d9
00042.tif
TIF43 d7f94cd284b1c9a34bc6da944b5e1139
00043.tif
TIF44 b98148521ff7d7252c28e9f188c00921 21186408
00044.tif
TIF45 03c5bd11f301b835a787b2f59c7a4632 20826316
00045.tif
TIF46 8c61afda4c994db46dc565697c9cb3ba
00046.tif
TIF47 6087eaaffbabbfa1e8e9d7d23a68e575
00047.tif
TIF48 aa50c342ac28e1757937b33dafe0c166
00048.tif
TIF49 8f3cb47d2f10b4dfc4ab56b175003868 21141576
00049.tif
TIF50 ad9e1de8cb0bf06a94c2d02bb2212eac
00050.tif
TIF51 87bea58be68a426c07e6f59e3f0d2675 21076864
00051.tif
TIF52 8ed2cac4d9e68953e2d0cec678e78cff
00052.tif
TIF53 a37bd40c3ca98f0b81a5d40bd1ade0e7
00053.tif
TIF54 21487bdcbb7582530aba7736dcd3caf6
00054.tif
TIF55 b81a7c5be26f2596d702753a3275ea01
00055.tif
TIF56 ce7db6fc12f55ee87b8f8594f7babe47 21726024
00056.tif
TIF57 cf1e927310a5df385e5a4c9dffb3ae0f
00057.tif
TIF58 605d00a74834fed0982dc63bed438900
00058.tif
TIF59 d2cb51791679b40c9d1d3ac1602acebe 21631388
00059.tif
TIF60 9c3607c33321fc7e260e6761c911413a
00060.tif
TIF61 33c592d01fa20d0f69b1ca39c087f8f4
00061.tif
TIF62 adfc992e2a00293d591f814d032d4167 21997396
00062.tif
TIF63 28065c856ffd5df8c1dbb4ee3de75c6a 21640628
00063.tif
TIF64 06962e50e7600de1c713f4b58090b013
00064.tif
TIF65 ec0b7ee34ecba38936d82a04e8f205bf
00065.tif
TIF66 c7abec9cc3e93f45ad1ecba9e5d0f4a1
00066.tif
TIF67 2440c1eb48c216c5c7dd59be29bd6ef4
00067.tif
TIF68 b5471329088d80d4940fa2943521f223
00068.tif
TIF69 276eeba79eef03b95709466674608067 21928880
00069.tif
TIF70 432177449415247b1b55d6f1b6f035d8 22243696
00070.tif
TIF71 55e253ed0640f8e74375d34de4c9271d 20564016
00071.tif
TIF72 56970684e18b57a4a0adb31dfb75e33d
00072.tif
TIF73 dea60ad536b6c3424bf3f1ca4de0ffdf
00073.tif
TIF74 86b21f942a2e61eb64658e12b038ff15
00074.tif
TIF75 e299940b54bd3bad6488c03694a97ea1 21928404
00075.tif
TIF76 3bb368ae630901305130f740a8d81d3d
00076.tif
TIF77 8c89bfd8c85d2180d242f8cbf50f9f74
00077.tif
TIF78 fdb05651ec9eb33a0b6c00f4a7186d8e
00078.tif
TIF79 ee49d68972e7ba3f4ab7ffe0e6bc5350
00079.tif
TIF80 482257fee8067b8fe9f5afdb67c217b6
00080.tif
TIF81 1a386c7ea3bf9ec8bf8827e7b93882bc 20398640
00081.tif
TIF82 bd1f29ca60bfe226e842daf326c58f1d 22039744
00082.tif
TIF83 f1a32f1a43ee495c5783be0c22c7c06d 22056284
00083.tif
TIF84 3f704288ad879a3031af6d8bf827ed1e
00084.tif
TIF85 6fc60f2bb2f4ec4e80ca4e199a16d3ed 22389436
00085.tif
TIF86 03a55a36fbfbda46dcae07ca9926fd13 21758012
00086.tif
TIF87 b4a25c2128a5a5594b8c653b4aa22cc5 22260800
00087.tif
TIF88 a60bd0567e54bd710eae40a8e894893e 20481256
00088.tif
TIF89 59f3db0a05137b721a7f574a763f88d5 22200648
00089.tif
TIF90 ce9e9eb8a4cbb368790a591eff3a4f5d 20858480
00090.tif
TIF91 8727c187acb28b87b01ee29acfc65318
00091.tif
TIF92 bd8ce9f1db16427384828e93ae273eea
00092.tif
TIF93 cc4df561489a997a8265dc02bad48f80
00093.tif
TIF94 54ba2dcf05351f6d963fb6e9350e5ddc 20660004
00094.tif
TIF95 7d2c090fa9231e5a36f62de9dfc31624
00095.tif
TIF96 d571564e0378e1a5bcf457f653fe6b38
00096.tif
TIF97 548d97fc4ad3a571e49a68bbd3bb050e
00097.tif
TIF98 b44eb9ae99a498b26dcb07c5e523b03d
00098.tif
TIF99 bcc07469f6c07283921a2c6ab620c3a0 20960944
00099.tif
TIF100 6ed7dffef17dffc0d603dd010ac9bea1 20494268
00100.tif
THUMB1 imagejpeg-thumbnails 34672ac6e972dfc8874f2755e8961a00 6409
00001thm.jpg
THUMB2 4fa6ce7632bd009684b7187f01f006b7 5492
00002thm.jpg
THUMB3 150c610f3cdb3d43a0eb5f1344969f60 7177
00003thm.jpg
THUMB4 d022e9545a2e943ba1c12cdca1f8dd0f 9603
00004thm.jpg
THUMB5 4f45e1d56dfd1f4ac814ee3c3d5439ea 6407
00005thm.jpg
THUMB6 b25e711ae041424ab697efcb07c044c3 5875
00006thm.jpg
THUMB7 5b064ce1f82d8a438d75a111d34e2c51 10886
00007thm.jpg
THUMB8 7387c3856b2991447d7eaddcb2762069 7440
00008thm.jpg
THUMB9 811929e584d249d220f8b3859893bd73 10343
00009thm.jpg
THUMB10 ea4a9c6019cc28321729eb73ec7d3253 12006
00010thm.jpg
THUMB11 e03317eb17e02b2dde8aa67e384f804b 12172
00011thm.jpg
THUMB12 acac03be838d728f82f085995972b55a 12268
00012thm.jpg
THUMB13 064e079ffc9812038d23113336f69ff9 11055
00013thm.jpg
THUMB14 9aa3f7bcb69a6cd7c158fac19dd8644c 11031
00014thm.jpg
THUMB15 6b2c67aa3ce740ea1e78afc8cd4a47bd 12115
00015thm.jpg
THUMB16 5e6b9696be7e90223786d346cdf7c841 10201
00016thm.jpg
THUMB17 0380dfa78e8743f24019eafb97afb6ac 26932
00017thm.jpg
THUMB18 b9c36f9b509218eab9d1bdae1dfc6c79 8794
00018thm.jpg
THUMB19 35309fa6de3889cbc5f58af5ef62bfd7 10994
00019thm.jpg
THUMB20 a6320c10f36b19f52f855507700b2c3f 11894
00020thm.jpg
THUMB21 0befe9082a3cb93325773de1d5a16857 11467
00021thm.jpg
THUMB22 e7f811737b33b08a8b71f9e5bd368b9e 10048
00022thm.jpg
THUMB23 da9b2c1dcd9f7e3c25833d416dd052da 10039
00023thm.jpg
THUMB24 8f9b48655dd5d58c1790cb6aeaa96bf5 11376
00024thm.jpg
THUMB25 454ababbd18e03d1f0b6d8eeb6db9f7f 11412
00025thm.jpg
THUMB26 3324f579a3e66a63a4aa46c47af5af4b 9537
00026thm.jpg
THUMB27 89ee7ec7ea595d120a4afa6cfc17c0df 9215
00027thm.jpg
THUMB28 4621dc493cf35100d35a6e3bf316db82 8537
00028thm.jpg
THUMB29 75e9f72f0a124d1a3fa5e3f0334fbf45 8715
00029thm.jpg
THUMB30 6fb6ae7c29f1b9109306011660ecc78b 7737
00030thm.jpg
THUMB31 8e63efc67fa68941f28ec133687583d9 8336
00031thm.jpg
THUMB32 ff3cbc67a25190d3617214d64b3fb4ec 7889
00032thm.jpg
THUMB33 c47398f6d0de463a3b3097160a78a9ba 8191
00033thm.jpg
THUMB34 fb80d8285079872b671f9d11f27834cf 8418
00034thm.jpg
THUMB35 64b0bfbf5f0240da2bf4ca1bf7f1baf4 8495
00035thm.jpg
THUMB36 5d4c68f3406a9eb15ac09995c639a2eb 9557
00036thm.jpg
THUMB37 876b91241996cc3e17a146ec312c1ec2 8533
00037thm.jpg
THUMB38 d97de9e02597377b06a0ba1611320636 9078
00038thm.jpg
THUMB39 56618cba66828417dd0ecf586da3a296 8486
00039thm.jpg
THUMB40 41939fd4cc81691a417acff18ed6cd2e 10510
00040thm.jpg
THUMB41 d88208d63eec7dab5ac2a83f970f09d2 7876
00041thm.jpg
THUMB42 9296bff8066e2b73fe6452612b50b368 9327
00042thm.jpg
THUMB43 0bc0961fd3c461c51a573cccb926cffe 7957
00043thm.jpg
THUMB44 f21d498cc22e3f86a10ede1a1f93059a 26086
00044thm.jpg
THUMB45 553ccc51b40f5de5e68e03a3b5e7975d 7167
00045thm.jpg
THUMB46 75116e7ddbda8a19bbd18d26dc25de3f 9548
00046thm.jpg
THUMB47 fe7997ce4c82000fcf4db354e3c78e46 7708
00047thm.jpg
THUMB48 c174eb37753f329e65df051e50dcc47d 12031
00048thm.jpg
THUMB49 9c96b8a584ab7b035dd79b37f18e3aef 10644
00049thm.jpg
THUMB50 69b65d5c1a6b2438ceb6856f802b4650 7576
00050thm.jpg
THUMB51 482871f8dcccdd83756bfe09920e2e47 8015
00051thm.jpg
THUMB52 cfd9d86aab2980bbd2d70290b74cfb14 7519
00052thm.jpg
THUMB53 a88ae9046e3f1a4da7551ecab5f1ddc8 7529
00053thm.jpg
THUMB54 1dd6911a677f0ec0719b10dc5ced0ce7 9192
00054thm.jpg
THUMB55 8f66b271743a5a175aac06645f4cb4cf 10344
00055thm.jpg
THUMB56 d262e57a6624fce1ed80efddf288a20a 28832
00056thm.jpg
THUMB57 9800a1ae358d39ceb58ee2f1990d0863 7336
00057thm.jpg
THUMB58 40f4bbfd6e3390a29602d1d5ed593161 8520
00058thm.jpg
THUMB59 0651a96912daa176a4f83b19c6be4b5f 27718
00059thm.jpg
THUMB60 e111f2b0cadfd2d9b5d153d4f567606f 8499
00060thm.jpg
THUMB61 080c8650e3f3538ebfb175e77db8821c 6768
00061thm.jpg
THUMB62 6c6b555cf76f67c8976d2e1944bfe45d 27685
00062thm.jpg
THUMB63 88ad3f94e25f1a2e0ee061fbbb0a364f 26616
00063thm.jpg
THUMB64 f62ff8205214d38a9dd590e2e1ed05c9 8023
00064thm.jpg
THUMB65 bfd76a9b02affccbe9711ac0ec2f51e2 8139
00065thm.jpg
THUMB66 0352465d223f66f3bd44b9194f5ea5b7 7835
00066thm.jpg
THUMB67 1f63e05fd7daca0e0db22a73545b2934 7570
00067thm.jpg
THUMB68 32a047a82bb12134a0cf70fe11dd8bb0 8876
00068thm.jpg
THUMB69 25e46b0f700357520dcca9c33369d511 27149
00069thm.jpg
THUMB70 d785feb462881fde99335078330619a0 25976
00070thm.jpg
THUMB71 b869d3bd0363990d00167a5d9f6ac8e4 7311
00071thm.jpg
THUMB72 ed5f89ecff36b80d784d5c3b3f4f024a 7636
00072thm.jpg
THUMB73 6d5c1b41aa8a1d7cd185892a4b0e8f15 7635
00073thm.jpg
THUMB74 3e82410b34515530b042c27d4fa2f437 6892
00074thm.jpg
THUMB75 74aaac30215230da7278124dccdc922b 26578
00075thm.jpg
THUMB76 bb77c340f5377677cdac567f04fd72ff 8119
00076thm.jpg
THUMB77 8f85d6ebac9f27eac012becb22c82cad 7879
00077thm.jpg
THUMB78 6333a21da47823d1bece124c111f3bd2 9396
00078thm.jpg
THUMB79 85593a7e096565707f67cc3c6dc68381 9043
00079thm.jpg
THUMB80 6cdead1fb156794f2fef5bf22c0d2f1d 11081
00080thm.jpg
THUMB81 2e551a0ba518804748d33c53312d1a4e 8864
00081thm.jpg
THUMB82 9a7840211bcf6d545f02237880327236 26626
00082thm.jpg
THUMB83 569675074acf9b92a67eaac92b46909d 26648
00083thm.jpg
THUMB84 45d8d61e051db2d3cb0060c2aad88d28 10412
00084thm.jpg
THUMB85 80f058a36ff9f01e82f466983b350dc1 27115
00085thm.jpg
THUMB86 b0eb777c6c08458439773557c46941ff 26028
00086thm.jpg
THUMB87 d294cd2e4ca7fc47f9ed47a0d923e69d 27485
00087thm.jpg
THUMB88 ab30a39f1bfdd2e915816ac7bd5817a9 10032
00088thm.jpg
THUMB89 b780d6234409a22cc95247d5f07fd14b 25434
00089thm.jpg
THUMB90 581d89749546d825e9d0192b5fb44b81 10137
00090thm.jpg
THUMB91 05a7597f73643356c84ce09a88046d4b 9601
00091thm.jpg
THUMB92 c45b628a50a0ad61df947fc809c094fc 11050
00092thm.jpg
THUMB93 b21f84a2007a89a7cf270c2cbbd1b1c3 12179
00093thm.jpg
THUMB94 48ed6ad439e12ab7a488cd37b642c66b 10404
00094thm.jpg
THUMB95 b2e03c3ed939bb9197872462410ebcae 8404
00095thm.jpg
THUMB96 1d1bc3719df0b83c3fd36d6d97bc0491 10515
00096thm.jpg
THUMB97 0685be648b235c5b30430b55b96c531d 8140
00097thm.jpg
THUMB98 f2d432e74a9eaf6415d9a41e0d36744c 9367
00098thm.jpg
THUMB99 704dc15263891d40f24f3a57dcda8a40 10753
00099thm.jpg
THUMB100 327a1dec0df4e395b1d5a728f242304b 7431
00100thm.jpg
PRO1 textx-pro 8a2075d231b9876e962455f7a670d9d4 6797
00001.pro
PRO2 ac99c49d1eb77991431c3b08459e3d1a 1705
00002.pro
PRO3 7c2d5123a38194f8d988cf7d8b00dd5c 10031
00003.pro
PRO4 7493c2aa90dc12b1b84e0aacebcac69c 37647
00004.pro
PRO5 42a4f771c332c515bdcb88e3466924df 8813
00005.pro
PRO6 4b18607e9854d0cac7aef6f747709027 5571
00006.pro
PRO7 3213d69c9f8a8ee331c389fe5994ffae 33275
00007.pro
PRO8 d6b926aeeff30137cc3dc0d248705b69 11653
00008.pro
PRO9 7344f5aa9c0e789a33829eeec21ebfb9 24422
00009.pro
PRO10 93c9e032da1dd56b481d5705e1d54433 38924
00010.pro
PRO11 5a306b3a9e205842f4379a48a9d510ce 40330
00011.pro
PRO12 3d8c88e9109ba3b2e0a9cf98a4776fe7 38552
00012.pro
PRO13 08b2d1202ba099abcdefa28259e5a6ff 34199
00013.pro
PRO14 70aa44144ae3a674a1745a6ae4e6c3cc 30058
00014.pro
PRO15 96b0cdc3c0efda9e1df8b1b2651aabe8 39638
00015.pro
PRO16 ad7bace4aa4ac0e504892c87090f5ac3 28083
00016.pro
PRO17 55570544b99dcb0d1c620c6580b62257 18849
00017.pro
PRO18 40e1db4c70a6fb64200414f72c5c965b 18221
00018.pro
PRO19 3f31dfe528beb07b640810efa17ef387 30291
00019.pro
PRO20 1475637807156d7e8087b281d241dcef 36411
00020.pro
PRO21 6c6aec9bf54a1aa2eba965c0bb22d810 34692
00021.pro
PRO22 dc2430dae0f8c4e84fbd88f645b4bf2f 28022
00022.pro
PRO23 2879b40d67659d476d81d21dc265fdfe 25496
00023.pro
PRO24 ffcfdddc4fbda7792d75cdf74e7f0f09 35423
00024.pro
PRO25 0f9338eb57cd067e2c66a0bd3bfb258a 32971
00025.pro
PRO26 b56ab69aae783fec1357def20b4e7923 20333
00026.pro
PRO27 10704ec79ce7e25e7f6cfddcddd230c0 20805
00027.pro
PRO28 66dc2c1b615b75464141d0e443c447c4 16522
00028.pro
PRO29 427c79fc70018d7f5b933fd560f5246f 20326
00029.pro
PRO30 e6528b3833acec415d732db7f3615ec1 15216
00030.pro
PRO31 33ebc6251175eaf487e604669b098691 16412
00031.pro
PRO32 50412a5991af33dbb1d2e3ef9533d704 13249
00032.pro
PRO33 0c0db6b1b3b0072e71dbe2e686a0c6d1 17081
00033.pro
PRO34 fa25347c5246cc2342b376ea19102436 17336
00034.pro
PRO35 2c6fe78ec3fe9456b48e554650b2c8cb 15054
00035.pro
PRO36 57034da00dc3478b4bfbfa1f0eee07d2 23117
00036.pro
PRO37 55497489e1303c658354ab5609cc9413 19949
00037.pro
PRO38 010ae4e495721fadea5460f096b121df 22737
00038.pro
PRO39 d90dd0a4f33d9b0d4f71fa8a0177131e 18506
00039.pro
PRO40 74d94765255c42e600654010e769728f 28234
00040.pro
PRO41 6cb98402a1829203518bf354740d60a6 15234
00041.pro
PRO42 2e8df34289b82322f9545ac724ac1507 25658
00042.pro
PRO43 8d9ba1a2b4460b1e5f20d7ffb75b3025 15732
00043.pro
PRO44 083687642f3e7c1f3a8d9a8d6929bc5c 12294
00044.pro
PRO45 1648ebf8ee6046f9495c745e8f769ca3 13545
00045.pro
PRO46 0c73bb0aa2f266a44439cde2d30cf8a8 11860
00046.pro
PRO47 1b05ee17cf7b18f30f3fce7412358803 11621
00047.pro
PRO48 d34857f45e0f440f063de27d997cc1ec 40711
00048.pro
PRO49 d88cba2fbf73231fd385756c37ad1dc0 31906
00049.pro
PRO50 a4eba7b826a227b948faa64f62ff6b0e 15043
00050.pro
PRO51 468fdb97ec885f35b6a4be23d692ca42 14433
00051.pro
PRO52 dbaf1a908147750a271bf30fe3c0eea9 13719
00052.pro
PRO53 a7e035128421e550e55403f8529aeea0 12331
00053.pro
PRO54 4126b226747995c45f1dfb581e275573 22109
00054.pro
PRO55 14238aee14798c96519dc1baf925f9de 25896
00055.pro
PRO56 2aadaef6fa9d0e8704692625af7a68d7 30061
00056.pro
PRO57 7999fbd11ce86d97e74ad4ddf2ecb135 13691
00057.pro
PRO58 f1ad5af9ed7c42d63074ef1acc77ac7a 17022
00058.pro
PRO59 192d84bd9256261e5f5f3da8f5bbc17c 18454
00059.pro
PRO60 80f6b68976567d39fea8bc047c889653 18920
00060.pro
PRO61 29ad54b3fab2705c7ffe6ff7d7b1954d 7681
00061.pro
PRO62 8a7961ca1917a00cf85af655e22b4e90 20078
00062.pro
PRO63 d3fd95fb5b336485cd8534536fcb15d9 15292
00063.pro
PRO64 be337a8af98248d2736eae4fa7696b86 12191
00064.pro
PRO65 e8d2d2c50ceb47d1e3b61417644a04a2 14655
00065.pro
PRO66 e7c2416e1d002dce1390302fb85ab285 19496
00066.pro
PRO67 0fadf01365424ae29fd94e1949c9e99f 16408
00067.pro
PRO68 59180aa4418e68b68f3d3b73e2afaff5 18672
00068.pro
PRO69 d72920b816e83bc2b75b7d37dbb085ec 15393
00069.pro
PRO70 bab5ebe3bcde71b2fce7fe9118724ce2 14168
00070.pro
PRO71 fcdb9686a3b01e7bad172eedaf53d066 7413
00071.pro
PRO72 46f7308a0cff3a1d3dba26e9d76d4a4a 11690
00072.pro
PRO73 31f3247896a098b9481f624a8aae634e 13932
00073.pro
PRO74 604ef38908ab92489bbf83886b1fdfa0 11483
00074.pro
PRO75 78d4f9795e7ddad484bc475d06985d61 11146
00075.pro
PRO76 224f679efc33c53667d97012ac344795 14538
00076.pro
PRO77 80b3e54e982191da791d63fa1b1518f3 13607
00077.pro
PRO78 edf06d8819df658fc0d02dca9c49eac4 25454
00078.pro
PRO79 0b30f50c351da4fcc52b2d14427ae4aa 18895
00079.pro
PRO80 14a58bf96d5e91e8a4224874ecc4d6c1 31155
00080.pro
PRO81 b5e315273bebb79661bbe9eb953bf877 14939
00081.pro
PRO82 cfc10c137d1a1f0d52eb7cb1c00881bd 14069
00082.pro
PRO83 cd4f03c24293fa02b987f83a1adb962d 14522
00083.pro
PRO84 34cf9c1827edeb97ad7147f928c4901c 28346
00084.pro
PRO85 f6f6970db264cbd1c344ddd0e51ea965 17754
00085.pro
PRO86 1a8d04cc6ccc58b3f8239624807d8259 10018
00086.pro
PRO87 2a41cfec78b476eec181cc2ce9ac4a9b 19606
00087.pro
PRO88 34aead5238868a56b46903093aad230d 24907
00088.pro
PRO89 2cdb34bd6176425bf8045daa53215bf5 12061
00089.pro
PRO90 0955facdfdeb8effe7e041b27615ec8e 25551
00090.pro
PRO91 e9f08489e3de0a1c105529a0c170228f 24444
00091.pro
PRO92 317401110a06d67deec6a3d896c1835f 26617
00092.pro
PRO93 9b475378040f1923a217d542d5b32c04 37408
00093.pro
PRO94 e23c8500b2422214eb9be0462228c0b5 32006
00094.pro
PRO95 29c91a0a9c3f29517f75ba8476864504 23831
00095.pro
PRO96 ebe815e8d1f0e55bb2d5331967f4b803 35989
00096.pro
PRO97 12d1d07b419a58d775bc4f4dbbd1d867 85243
00097.pro
PRO98 91910a715c1e1ddf5cf0bbe4fbc57a79 21276
00098.pro
PRO99 a513b865a6711781eeb21822f3a0178c 29124
00099.pro
PRO100 c7d5d3d5903aef6ef9b040fdb11a3246 15799
00100.pro
TXT1 textplain caa0d96b3da0b7605ac77564abd31e20 272
00001.txt
TXT2 8773e879a4e0a5bb99294bf37f322f91 66
00002.txt
TXT3 a48ae624775da4a95f7782cb1cc25948 356
00003.txt
TXT4 5291444dda5d8ab06c48576ec8733769 1458
00004.txt
TXT5 2ce8f055905b5916f8b4dd79dc215d6b 433
00005.txt
TXT6 5568465d83e50d15173e7a8af5ffe4aa
00006.txt
TXT7 af564ea5f5614c372f1a377da40b5ca6 1174
00007.txt
TXT8 692f9b6b024c52b18ff5ef205d7f25a1 469
00008.txt
TXT9 5d6de1efea3202c07c3e4e9eb5f47b2d 978
00009.txt
TXT10 f54f9b5fae9919e095eedcc02674b80e 1372
00010.txt
TXT11 2a5e6fa8d978c51e7c058874012c1d07 1425
00011.txt
TXT12 83f99d4ff1f638af5c48ae21eb13d348 1365
00012.txt
TXT13 a58a7743621198c2e6091646f062e1d1 1229
00013.txt
TXT14 1df7f6be63a9cac433011aa1467a5805 1083
00014.txt
TXT15 524cf98216c5666ea2061bec583acece 1399
00015.txt
TXT16 e1a447fb0c3aa6513df5b947ed8b2449 1053
00016.txt
TXT17 aa12fbd3bd35c4d69ce525f49b98c060 747
00017.txt
TXT18 63ff5e03125ef1a63a3dd9a24d89955f 704
00018.txt
TXT19 6324f042284c962511c28cf25f355783 1228
00019.txt
TXT20 e36a6e9fafc12873ad885d826fe7a96f 1310
00020.txt
TXT21 c188d3220808646787d57ebc63b23701 1248
00021.txt
TXT22 2bdd6c3aa6a523510510c2c31b68fe28 1013
00022.txt
TXT23 9d741c50c9fe166337d907ae09a8e36c 958
00023.txt
TXT24 cedf0a4f8247a3929d8716d5889dbf55 1245
00024.txt
TXT25 50932eb2a5e60dce3ed2623f97a14217 1213
00025.txt
TXT26 1c56b3712e416f14c294cdc695b9e4fc 765
00026.txt
TXT27 02fd50163067eb94ebb9a488bf188240 804
00027.txt
TXT28 e5ef14c5a1a2c4200067a2225b1dd9e7 650
00028.txt
TXT29 fe5589089d7e8c12a8d8ba4a5bfe9af3 809
00029.txt
TXT30 743b16bd17c2203a57e315fec71c4b50 633
00030.txt
TXT31 5b877093b5b9d54fc30039204b3f4c97 606
00031.txt
TXT32 905ba3a9d04bb38bc9bbec226f949c7e 487
00032.txt
TXT33 c0bd5c76e53448d9a7704e46d334ff3d 726
00033.txt
TXT34 3c1633cbd2efaf810d3caa64087312c7 689
00034.txt
TXT35 5286c7361c12feba5ee126234b4f1810 565
00035.txt
TXT36 3c77eb0a11b889133bb5e675fe9227d1 866
00036.txt
TXT37 9a8b61be4a192ba3ee829b9fe2260474 805
00037.txt
TXT38 f4dcf1763fc66ebc0f2734ae7d71f785 870
00038.txt
TXT39 fd894d2b8389c3157add5745e8246951 739
00039.txt
TXT40 3146f215b0b73a44f27dc05b4011943f
00040.txt
TXT41 821e4542059afd4521671bf1a0e80a30 618
00041.txt
TXT42 e3eff1da7153f92fdd2662d84d8fee6b 1018
00042.txt
TXT43 0e939f0ce84cd00d464f164b01a6fc68 623
00043.txt
TXT44 d5e0a44d45ddefc495ef53d7b5f7acbe 509
00044.txt
TXT45 81127827f5b4c29f67cb4706869ba7ea 533
00045.txt
TXT46 be402d2dabb91ca5edfc3893c5077314 475
00046.txt
TXT47 de463b4a92e66a4de616154f6b5ca107 505
00047.txt
TXT48 b9d5fe7a263ef6c8cbb27b17a8cb5f81 1430
00048.txt
TXT49 a22155ff60e9c453ba907595d67a69b0 1187
00049.txt
TXT50 95b16d88efa02d954c59605ae5f8203a 557
00050.txt
TXT51 89f168275dad2cd2c9a9a3b0e2489430 627
00051.txt
TXT52 8cfa2f94e547a162006283167d71e1cc 490
00052.txt
TXT53 361d3a6d3855d3d06e74680f1e3f5967 504
00053.txt
TXT54 b6ec2d002380add683584179ac2024f9 930
00054.txt
TXT55 af101025b538fbf8ffb429b400baaac4 962
00055.txt
TXT56 9dddad214997d19942a3d7adc160a782 1222
00056.txt
TXT57 44065f8a090bf33be1ec7f2b9169d172
00057.txt
TXT58 a9cd6fbc837e880b42050bfb323345f4 672
00058.txt
TXT59 374a9f75de316d9cdf583b9bb476a902 736
00059.txt
TXT60 52ef501f045b7e1cd83245b952f09041
00060.txt
TXT61 f6f345ecaafe5d93ca3194784a7351a3 284
00061.txt
TXT62 e0908ce49677ca163062246eb7fa9de3 785
00062.txt
TXT63 392ba0a35f3d6a5866c81661e74dda46 637
00063.txt
TXT64 449e1d13865b5479a0684448ee8574c4 443
00064.txt
TXT65 c4ad497d4fab600b7fee3d34c12a5491 545
00065.txt
TXT66 46c528c944a7ae7916ea4f5c24d375c2 749
00066.txt
TXT67 37bca162658495d193faf874a788360a 629
00067.txt
TXT68 55c44f16d5a93b836b58c23992e5e496 745
00068.txt
TXT69 b17fccb25b78096d6c6eed9cceac215d 658
00069.txt
TXT70 cd48c5726e930ca76e907da9480076fe 624
00070.txt
TXT71 bc452c6e0fb9c08cf7cce7e41c7e398e 298
00071.txt
TXT72 ea1fff34ee52be52d5ef1c004a50a80b 441
00072.txt
TXT73 38e28d62c7172cbbc4e543b057d06dfb 536
00073.txt
TXT74 001d374e46385aedff2507be23624d73 463
00074.txt
TXT75 61758bed89d6caa774f968a2a930dec3 465
00075.txt
TXT76 75e43861f446f1ffd5404bea95743309 552
00076.txt
TXT77 0ae4a131166cdedd0008a560175bc22a 613
00077.txt
TXT78 705e3930efd30d9885416873bfdb95ed 969
00078.txt
TXT79 5c22873337b825f44e9f014a026f3974 701
00079.txt
TXT80 083a1fe4b8863c3f9a3bc5817d09ada5 1124
00080.txt
TXT81 acccaf9c52eb3a6b011fbd10ff49596c
00081.txt
TXT82 48ce82cce739d110389bc13c3013f6f7 544
00082.txt
TXT83 9f980408b8f50242b2b9edc49bf5b2a8 588
00083.txt
TXT84 de120f6a601f2b42bcbf08a4973b5379 1040
00084.txt
TXT85 50da10bbd1c883bd743272b142bd81b5 735
00085.txt
TXT86 e8d0a1dec3a3aa38d220b6c3c63e369c 395
00086.txt
TXT87 449e5b38ce30c945c6f861849a8952ae 812
00087.txt
TXT88 83107d5bc08bcd625334b701881eb5b2 998
00088.txt
TXT89 ecfb6ad5be12473e2c135d6133f91118
00089.txt
TXT90 2f9ba8a27ac111811839270a827fbbe2 927
00090.txt
TXT91 01c3c2a8d1e34908ec2f6b602a174bff
00091.txt
TXT92 bf8375de8dced88748ed4a2ceb00c026 1043
00092.txt
TXT93 a20de2da0316bfe4df56517864537f84 1352
00093.txt
TXT94 05dabe328961951556a5dc917534305a 1282
00094.txt
TXT95 1a2cf7fc20d3e9d207082b89333186ad
00095.txt
TXT96 6b830f4341cc193a3a69a006a2bc196e 1335
00096.txt
TXT97 ceaae755fedfbd704950c60172d4a203 3405
00097.txt
TXT98 4c2496e11e82652ff1a086b3b6087c18 788
00098.txt
TXT99 58e2125bf444f8684faed4c10edb6bba 1042
00099.txt
TXT100 932bcc24bc4ec930b5e7e9cddd110a88 566
00100.txt
PDF1 applicationpdf 4da225c78d2a203bcad27f6c776c6b11 52332726
AA00062827_00001.pdf
METS1 unknownx-mets 4b82f2ec853d68b029d100b5a4c6c70e 101886
AA00062827_00001.mets
METS:structMap STRUCT1 physical
METS:div DMDID ADMID ORDER 0 main
PDIV1 1 Main
PAGE1 Page i
METS:fptr FILEID
PAGE2 ii 2
PAGE3 iii 3
PAGE4 iv 4
PAGE5 v 5
PAGE6 vi 6
PAGE7 vii 7
PAGE8 viii 8
PAGE9 9
PAGE10 10
PAGE11 11
PAGE12 12
PAGE13 13
PAGE14 14
PAGE15 15
PAGE16 16
PAGE17 17
PAGE18 18
PAGE19 19
PAGE20 20
PAGE21 21
PAGE22 22
PAGE23 23
PAGE24 24
PAGE25 25
PAGE26 26
PAGE27 27
PAGE28 28
PAGE29 29
PAGE30 30
PAGE31 31
PAGE32 32
PAGE33 33
PAGE34 34
PAGE35 35
PAGE36 36
PAGE37 37
PAGE38 38
PAGE39 39
PAGE40 40
PAGE41 41
PAGE42 42
PAGE43 43
PAGE44 44
PAGE45 45
PAGE46 46
PAGE47 47
PAGE48 48
PAGE49 49
PAGE50 50
PAGE51 51
PAGE52 52
PAGE53 53
PAGE54 54
PAGE55 55
PAGE56 56
PAGE57 57
PAGE58 58
PAGE59 59
PAGE60 60
PAGE61 61
PAGE62 62
PAGE63 63
PAGE64 64
PAGE65 65
PAGE66
PAGE67 67
PAGE68 68
PAGE69 69
PAGE70 70
PAGE71 71
PAGE72 72
PAGE73 73
PAGE74 74
PAGE75 75
PAGE76 76
PAGE77 77
PAGE78 78
PAGE79 79
PAGE80 80
PAGE81 81
PAGE82 82
PAGE83 83
PAGE84 84
PAGE85 85
PAGE86 86
PAGE87 87
PAGE88 88
PAGE89 89
PAGE90 90
PAGE91 91
PAGE92 92
PAGE93 93
PAGE94 94
PAGE95 95
PAGE96 96
PAGE97 97
PAGE98 98
PAGE99 99
PAGE100 100
STRUCT2 other
ODIV1
FILES1



PAGE 5

TABLE OF CONTENTS (continued)

CHAPTER IV (cont.)
    4.2 Estimation of Relative Potency by Various Linear Techniques ........ 85
    4.3 Summary ............................................................ 87

BIBLIOGRAPHY ............................................................... 88

BIOGRAPHICAL SKETCH ........................................................ 90

PAGE 6

LIST OF TABLES

Table                                                    Page
3.1  Notation Chart ....................................... 73
4.1  Exact Coverage Probability ........................... 84
4.2  Estimation of Relative Potency ....................... 86

PAGE 7

Abstract of Dissertation Presented to the Graduate Council of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy

QUANTAL RESPONSE ASSAYS BY INVERSE REGRESSION

By

Frank Hain Dietrich II

August, 1975

Chairman: Dr. Jonathan J. Shuster
Major Department: Statistics

Numerous methods are available to analyze quantal response assays. Some of the more popular methods of analysis are discussed. The general aspects of inverse regression are also discussed. A general inverse regression procedure for estimating dose response curves in quantal response assays is presented. Asymptotic distributional properties are developed. Procedures to form (1-α)100% nominal confidence intervals for quantities of interest are given. Methods of testing hypotheses of interest are also developed.

The particular method of applying inverse regression to quantal response assays by use of the angle transformation is presented. The special case is given where, after application of the transformation, the dose response curve is linear. The inverse method has a decided advantage over the more classical methods in

PAGE 8

this case, both in flexibility and in ease of application. The procedure will be shown to be fully efficient in the asymptotic sense.

Numerical examples are presented to demonstrate the applicability of inverse regression to quantal response assays. The numerical examples deal with linear response curves since classical methods of analysis are only applicable in this case. Inverse regression may also be used when the response curves are non-linear.

PAGE 9

CHAPTER I
STATEMENT OF THE PROBLEM

1.0 Preamble

In Chapter I the general problems of quantal response assay and inverse regression are presented. In Section 1.1 we will introduce the classical quantal response problem along with different objectives of a quantal response assay. Section 1.2 gives a history of frequently used methods of analyzing quantal response curves. In Section 1.3 we will discuss the general topic of inverse regression. Inherent in this discussion is a comparison with classical regression. In Section 1.4 we will give a summary of Chapter I along with the results obtained in the remaining chapters.

1.1 Introduction

In the classical quantal response problem subjects (plants, insects, patients, etc.) are subjected to a stimulus (fungicide, insecticide, physical therapy, etc.), and an all or nothing response is recorded. Although it would usually be desirable to measure the response quantitatively, it is often only possible to measure a

PAGE 10

response as occurring or not occurring. It is this type of response we will be interested in analyzing.

The stimulus is often referred to as a dose, and the dose is administered at different levels. Generally, we independently sample n_i subjects at dose d_i, i = 1, 2, ..., k. For each dose, d_i, we are interested in the true fraction of positive responses, p_i. Thus, for each dose level the number of positive responses observed in a sample of n_i subjects is a binomial random variable with probability of success equal to p_i. For each dose, d_i, we calculate the observed fraction of positive responses, p̂_i, the maximum likelihood estimator of p_i. A quantal response curve is then fit. The response curve is basically found by fitting the fraction of positive responses observed against dose. Usually both the fraction of responses and the doses are transformed before the curve is actually fit.

This type of analysis is often used to assess the potency of drugs of all types when it is either impractical or impossible to determine the potency by chemical analysis. The actual objective of a quantal response assay may be the solution of one of a number of related problems. An objective of many assays is to estimate LD(100p), the true dose at which 100p% of the subjects have a positive response. In particular, LD(50), called the median lethal

PAGE 11

dose, is often of prime interest. One reason for this is that it is used in an attempt to classify drugs as to their effectiveness. At one time it was attempted to classify drugs by a minimal lethal dose or a maximal lethal dose. The minimal lethal dose would be the smallest dose at which a positive response is attained for at least one subject. The maximal lethal dose would be the smallest dose at which all the subjects would exhibit a positive response. Needless to say, it would be very difficult to estimate these quantities. For a fixed number of subjects LD(50) can be estimated more accurately than a minimal lethal dose or a maximal lethal dose. Thus, LD(50) is now often used to attempt to measure the effectiveness, or potency, of a drug. There are, however, instances, such as toxicological problems, where doses producing 100% response are of more interest than LD(50).

If two or more drugs are to be compared, it is often done in terms of the relative potency, the ratio of equally effective doses. Even if a new drug is to be compared to a standard, the tolerance of the population may change, and both drugs must be experimented with at the same period of time. Thus, an estimate of relative potency is obtained rather than measuring the performance of the new drug singly and expressing its effectiveness in relation to the standard as an absolute effect. Relative potency is a

PAGE 12

valuable measure only if it is found that the quantal response curves are parallel, Thus, the ratio woul d be the s~me at all equally effective doses; The relative potency is therefore usually measured as the ratio of median effective doses. 4 Whenever two or more drugs are under consi~eration in a articular problem, it is desired to know if a mixture of the drugs might be more effective than applying the drugs individually. In general, the joint action of a mixture of drugs can be classified in thr~e categories. The three categories as given by Bliss [l] are independent joint action, similar joint action, and synergistic action. If drugs have independent joint action, they act indepindently and have different modes of action The drugs may or may not be crirrelated in terms of the sus ceptibility of one component as compared to another. The potency of the mixture can be predicted from the fitted curve for each drug alone and the correlation in suscepti bility to the drugs. The potency of the mixture can be computed on this basis whatever the relative proportions of the individual constituents Drugs are classified as having similar joint action ff they produce similar effects so that one component can be substituted at a constant proportion for the other. Variations in individual susceptibility to the drugs are

PAGE 13

5 completely correlated or parallel. The potency of a mixture is predictable from the relative proportions of the individual components. The last classification is synergistic action. The potency of the mixture cannot be assessed from a knowledge ~the individual potencies. It must be based upon a study of their combined potency when used in different propor tions. If the potency of the mixture is greater than that expected by studying the mixtures singly, the drugs are said to synergize. One drug antagonizes another if the ~ixture has a smaller potency than expected. We have now stated the basic problems of interest in a quantal response assay. The next section will deal with a history of methods for analyzing quantal response curves. 1.2 History--Pr~vious Methods of Analyzing Quantal Response Curves Although numerous methods have been proposed for analyzing quantal response curves, the most frequently used method is probit analysis. A thorough discussion of probit analysis is given by Finney [2]. In the classical quantal response problem, we inde~ pendently sample ni subjects at dose, di' and obtain ~i' the fraction of positive responses, i = 1, 2, ... k.

PAGE 14

In order to use probit analysis in analyzing quantal response curves, the probit of p̂_i, Z_i, is found by the following transformation,

    \hat{p}_i = \int_{-\infty}^{Z_i} (2\pi)^{-1/2} e^{-x^2/2} \, dx ,   i = 1, 2, ..., k.   (1.2.1)

Once the probits have been determined, a linear response curve is fit against log dose by iterative weighted least squares.

A procedure similar to probit analysis was suggested by Knudsen and Curtis [5]. Rather than use the probit transformation given in equation (1.2.1), Knudsen and Curtis suggest the use of the angle transformation

    Z_i = \sin^{-1}(\hat{p}_i^{1/2}) ,   i = 1, 2, ..., k,   (1.2.2)

where Z_i is recorded in degrees. Once the angle transformation has been performed, a linear response curve is fit against log dose by ordinary least squares if the sample sizes are approximately equal, and by weighted least squares otherwise. For all practical purposes the angle transformation is a linear function of the probit transformation.

Moore and Zeigler [7] discuss the use of non-linear regression methods for analyzing quantal response

PAGE 15

curves. They demonstrate that any methods based on maximum likelihood estimation of appropriate parameters may be formulated as non-linear regression problems. It should be noted that both probit analysis and the angle transformation are based on maximum likelihood principles, and thus fall in this category. Moore and Zeigler conclude that a reasonably general least squares computer program could replace several specialized quantal response analysis programs.

It has also been pointed out by Nelder [8] that there is an important class of estimation problems which leads to a form of solution which is closely analogous to linear rather than non-linear regression. Basically, the condition which must be satisfied to be in this class of estimation problems is that the first derivative of the likelihood can be put in a form where p, the true fraction of positive responses, is a linear function of the unknown parameters. Again, probit analysis and use of the angle transformation fall into this class of problems. Thus, a well-constructed linear regression program could be adapted to cope with this type of problem. Although quantal response assays usually involve discrete distributions, Nelder also shows that the same iterative linear regression procedure can be used on a class of non-linear models which involve continuous rather than discrete distributions.
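Because the angle-transformation fit described above is so elementary, it is easy to sketch. The fragment below is a minimal illustration and not part of the original analysis: the doses and response counts are hypothetical, and the weights proportional to n_i are one natural choice for the weighted fit; with equal sample sizes it reduces to ordinary least squares, as noted above.

import numpy as np

# Hypothetical assay data (illustrative only).
dose   = np.array([1.0, 2.0, 4.0, 8.0])        # administered dose levels
n      = np.array([20, 20, 20, 20])            # subjects per dose
events = np.array([3, 8, 14, 19])              # positive responses per dose

p_hat = events / n
z = np.degrees(np.arcsin(np.sqrt(p_hat)))      # angle transform of (1.2.2), in degrees
x = np.log10(dose)                             # log-dose

# Weighted least squares of z on x, the classical direction used by Knudsen-Curtis,
# with weights proportional to the sample sizes.
w = n / n.mean()
X = np.column_stack([np.ones_like(x), x])
WX = X * w[:, None]
b = np.linalg.solve(WX.T @ X, WX.T @ z)        # b[0] = intercept, b[1] = slope
print("fitted line: z =", b[0], "+", b[1], "* log10(dose)")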

PAGE 16

1.3 History--Inverse Regression

Krutchkoff [6] discusses the general problem of inverse regression, and in particular as it applies to the problem of calibrating an instrument. He uses the example of calibrating a pressure gauge. To calibrate the gauge, one subjects it to two or more controlled pressures, and notes the gauge markings. From these data, the calibration parameters are estimated, and the gauge is calibrated. Unknown pressures are then estimated by reading the calibrated markings.

If x represents the controlled variable, and y represents the measured variable, then the relationship between x and y can be expressed by the usual linear model

    y = \alpha + \beta x + \epsilon .   (1.3.1)

The classical approach to calibration using model (1.3.1) with k values of x, and independent identically distributed errors with zero mean, uses the usual least squares estimates of α and β. These estimates are found by

    \hat{\beta} = \sum_{i=1}^{k} (x_i - \bar{x})(y_i - \bar{y}) \Big/ \sum_{i=1}^{k} (x_i - \bar{x})^2   (1.3.2)

and

PAGE 17

    \hat{\alpha} = \bar{y} - \hat{\beta}\bar{x} ,   (1.3.3)

where

    \bar{x} = \sum_{i=1}^{k} x_i / k   and   \bar{y} = \sum_{i=1}^{k} y_i / k .   (1.3.4)

The least squares line is then represented by

    \hat{y} = \hat{\alpha} + \hat{\beta} x ,   (1.3.5)

and the calibration equation is

    \hat{X} = (y - \hat{\alpha}) / \hat{\beta} .   (1.3.6)

Thus, from a gauge reading of Y, the classical estimate, X_C, of the pressure is (Y - α̂)/β̂.

Using the inverse regression approach, model (1.3.1) is rewritten as

    x = \gamma + \delta y + \epsilon' ,   (1.3.7)

where

    \gamma = -\alpha/\beta ,   \delta = 1/\beta ,   and   \epsilon' = -\epsilon/\beta .   (1.3.8)

Again, the usual least squares estimates of γ and δ are

PAGE 18

found by

    \hat{\delta} = \sum_{i=1}^{k} (x_i - \bar{x})(y_i - \bar{y}) \Big/ \sum_{i=1}^{k} (y_i - \bar{y})^2   (1.3.9)

and

    \hat{\gamma} = \bar{x} - \hat{\delta}\bar{y} .   (1.3.10)

The least squares line is now one and the same as the calibration equation and is expressed by

    \hat{X} = \hat{\gamma} + \hat{\delta} y .   (1.3.11)

Thus, using inverse regression, for a reading of Y of the gauge, the inverse regression estimate, X_I, of the pressure is

    X_I = \hat{\gamma} + \hat{\delta} Y .   (1.3.12)

The estimates given by equations (1.3.6) and (1.3.11) are not generally the same. It is therefore of interest to judge which estimate is better by the use of certain criteria. Krutchkoff uses the criterion of mean square error to judge the relative effectiveness of the estimates.
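For concreteness, the two competing estimates can be computed side by side. The sketch below is illustrative only; the controlled pressures, gauge readings, and new reading are made-up numbers, not data from any of the cited studies.

import numpy as np

x = np.array([10.0, 20.0, 30.0, 40.0, 50.0])   # controlled pressures
y = np.array([11.2, 19.4, 31.1, 39.0, 51.3])   # gauge readings

xbar, ybar = x.mean(), y.mean()
beta_hat  = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)   # (1.3.2)
alpha_hat = ybar - beta_hat * xbar                                      # (1.3.3)
delta_hat = np.sum((x - xbar) * (y - ybar)) / np.sum((y - ybar) ** 2)   # (1.3.9)
gamma_hat = xbar - delta_hat * ybar                                     # (1.3.10)

Y_new = 25.0                                    # a new gauge reading
x_classical = (Y_new - alpha_hat) / beta_hat    # classical estimate, (1.3.6)
x_inverse   = gamma_hat + delta_hat * Y_new     # inverse estimate, (1.3.12)
print(x_classical, x_inverse)

When the fit is tight the two estimates nearly coincide; they diverge as the correlation between x and y weakens, which is where the mean square error comparison discussed next becomes interesting.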

PAGE 19

Krutchkoff concludes on the basis of a Monte Carlo study (in which values of |β̂| < .001 were replaced by ±.001 as appropriate) that the mean square error of the inverse estimate is uniformly less than that of the classical estimate. The Monte Carlo study involved different values of α and β, different variances, different designs, and normal as well as non-normal error distributions. Thus, on the basis of mean square error, it appears that the inverse estimate is more desirable than the classical estimate.

Williams [11] points out that under the assumption of normally distributed errors the classical estimate has undefined expectation and infinite variance, and hence infinite mean square error. Under the same assumption the inverse estimate has finite mean square error. Thus, Williams concludes that the inverse estimate is better than the classical estimate from the mean square error point of view.

Williams goes on to point out, however, that this conclusion is not very satisfying. He reaches this conclusion because all that was shown is that the mean square error of the inverse estimate is less than infinity. He questions using mean square error at all as a criterion for comparing the two estimators.

PAGE 20

It is also of interest to note that Williams shows that there is no unbiased estimator with finite variance.

In somewhat the same spirit as Williams, Halperin [4] notes that a random drawing from any distribution with finite variance would provide a better estimate than the classical estimate in the mean square error sense. Rather than dwelling on the mean square error argument, Halperin considers the criterion of relative "closeness" of two estimators X̂_1 and X̂_2 to X. Here "closeness" is in the Pitman sense. That is, X̂_1 is a closer estimate of X than X̂_2 if, for all X,

    P[ |X̂_1 - X| < |X̂_2 - X| ] ≥ 1/2 .   (1.3.13)

Halperin shows that the inverse estimate is a closer estimator than the classical estimate for all values in a closed interval of X. This interval depends on quantities such as β, σ, x̄, and the sample size. It turns out that if |ρ| is large, where ρ = β/σ, Y is well determined, or the values of the independent variable are widely dispersed, the estimates are indistinguishable.

Saw [10] shows that for any distribution on the errors, the slope of the inverse regression line is always of the same sign, but greater modulus, than the slope of the classical line. Thus, at X = x̄, the inverse estimate is closer to X than the classical estimate with probability one.

PAGE 21

Saw goes on to point out that any line through (ȳ, x̄) with slope of the same sign, but greater modulus, than the classical regression line will perform better (as an estimate of X) than will the classical estimate within some neighborhood of x̄. A similar statement can be made in reference to the inverse regression line. Thus, there exists no best way to estimate X uniformly over an interval of X. This being the case, Saw concludes the specific use of inverse calibration is unappealing.

1.4 Summary of Results

In Chapter I we have presented the general problem of quantal response assay. We have also discussed the general method of inverse regression. We have chosen to apply inverse regression to quantal response assays for a number of reasons. In quantal response assays we usually seek solutions of

    F(\text{dose}) = p   (1.4.1)

and relationships among such solutions, for two or more drugs. The following criticisms are levelled against the classical approach:

1) The least squares process minimizes the residual sum of squares in the transformed probability scale (vertical), while estimates of the solutions of (1.4.1) have errors measured in the log-dose scale (horizontal).

PAGE 22

2) Serious problems occur in estimating solutions to (1.4.1) when we model

    F^{-1}(\text{Probability of Response}) = \sum_{r=0}^{m} \beta_r (\log\text{-dose})^r   (1.4.2)

for m ≠ 1. The main problem is that the solutions to (1.4.1) may not exist or may not be unique when they do exist. Thus, the classical approach is pretty well limited to situations where a linear relationship exists (that is, to situations when m = 1 in (1.4.2)).

In Chapter II we will develop the general theory necessary to apply inverse regression to find solutions to (1.4.1) when m ≠ 1 in (1.4.2). The solutions will minimize the residual sum of squares in the log-dose scale. In Chapter III we will use the angle transformation with inverse regression to develop a particular method of analysis. Chapter IV will give some numerical applications of the methods developed in the preceding chapters. By actual application of our results it is seen that inverse regression offers the most elementary computations as compared to other methods.
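To make criticism 2) concrete: under the inverse parameterization developed in the later chapters, the dose solving (1.4.1) is read off the fitted polynomial directly. The sketch below assumes a quadratic inverse fit (m = 2); the coefficients are hypothetical stand-ins, not estimates appearing anywhere in the text.

import numpy as np

beta = np.array([0.10, 0.015, 0.0002])      # hypothetical beta_0, beta_1, beta_2

def log_ld(p, beta):
    """Log-dose at which a fraction p responds, under the inverse polynomial model."""
    z = np.degrees(np.arcsin(np.sqrt(p)))   # transformed response, in degrees
    return sum(b * z ** j for j, b in enumerate(beta))

print(log_ld(0.50, beta))                   # log LD(50): direct evaluation, no root-finding
print(log_ld(0.90, beta))                   # log LD(90)

Had the polynomial been fitted in the classical direction, obtaining LD(100p) would instead require solving a polynomial equation that may have no root, or several, in the dose range of interest.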

PAGE 23

CHAPTER II INVERSE REGRESSION OF QUANTAL RESPONSE ASSAYS: ASYMPTOTIC THEORY 2.0 Preamble In Section 2.1 we wil 1 introduce the basic reasons for studying the asymptotic theory. Section 2.2 will deal with developing a parametric model along with estima tion of population parameters. In Section 2.3 we Will develop the asymptotic distribution of the estimators. We will also develop methods of forming confidence inter vals and testing hypotheses of interest. Section 2.4 will be a summary of the results. 2. 1 Introduction As previously stated, the classical quantal response assay consists of independently sampling n. subjects at l dose, d., and ~~taining the fraction of positive responses, l A p., l = 1, 2, l k. If only one drug is of interest in the assay, it is often of interest to estimate LD(lOOp), the true dose of which lOOp percent of the subjects exhibit a positive response. In particular, LD(50) is a quantity often estimated. 1 5

PAGE 24

If more than one drug is involved in the assay, other aspects of the arialysis may be of interest. It is often desirable to compare Lo(50) values in terms of relative potency, the ratio of the true LD(50) values. If the assay involves drug mixtures, it is of interest to know if one drug synergizes or antagonizes the other. 1 6 In order to use inverse regression to analyze a quantal response assay, a model for the problem is neces sary. We will develop a parametric model for the classical quantal resporise assay. Once the model has been fdrmulate~, estimators of population parameters will be developed. Since confid~nce intervals for, or test hypotheses about, population quantities are of interest, the asymptotic distribution of the estimators will be studied. The re sults of the asymptotic theory will be stated in terms of ltnear combinations of the estimators. From thi~, confi dence intervals and tests of hyp0theses of interest will follow. 2.2 Parametric Model and Estimator~ We will now develop a parametric model to express the relationship between the observed fraction of positive responses at different doses and the corresponding true fraction~ In order to do this certain matrices and their relationships will be defined.

PAGE 25

Let M be a k x r matrix with r < k. M will be of rank r and will usually consist of two different types of elements. M will contain elements which are functions of the true fraction of positive responses. M may also contain dummy variables. A k x r matrix Y_n will be of a form similar to M. If M contains a dummy variable in position m_ij, Y_n will contain the same element in position y_nij. The remaining elements of Y_n will be the maximum likelihood estimates of the corresponding elements of M. Thus, rather than containing functions of the true fraction of positive responses, as M does, Y_n will contain the corresponding functions of the observed fraction of positive responses.

For a k x 1 (transformed) dose vector, x, we hypothesize the following relationship:

    x = M\beta ,   (2.2.1)

where β is an r x 1 vector of parameters. Let E_n be a k x r matrix such that {e_n^{(i)}}, the rows of E_n, are independent random vectors. If we let n be a linear function of the n_i, i = 1, 2, ..., k, we will assume that

    n^{1/2} e_n^{(i)} \to_L N_r(0, V_i)   as n \to \infty ,   (2.2.2)

where N_r represents an r-variate normal random variable, and V_i is a continuous matrix function of M.

PAGE 26

With all matrices defined as above, we propose the following model for the relationship between the observed fraction of positive responses and the true fraction of positive responses:

    Y_n = M + E_n .   (2.2.3)

Multiplying equation (2.2.3) on the right by β yields

    Y_n \beta = M\beta + E_n \beta .   (2.2.4)

Using the relationship given in (2.2.1) we see that (2.2.4) can be rewritten as

    Y_n \beta = x + E_n \beta ,   (2.2.5)

or equivalently as

    x = Y_n \beta - E_n \beta .   (2.2.6)

Thus, using the unweighted least squares estimator of β, we obtain

    \hat{\beta}_n = (Y_n' Y_n)^{-1} Y_n' x .   (2.2.7)

2.3 Asymptotic Theory

Now that a parametric model has been developed with the estimators of these parameters, we will obtain the limiting distribution of the quantity

PAGE 27

    T_n = n^{1/2} \ell' (\hat{\beta}_n - \beta) ,   (2.3.1)

where ℓ is an r x 1 specified vector.

Before we actually find the asymptotic distribution of T_n we will first introduce some lemmas needed in later proofs. The first three lemmas may be found in Rao [9].

Lemma 2.3.1

Let {X_n, Y_n}, n = 1, 2, ..., be a sequence of pairs of random variables. Then

    X_n - Y_n \to_P 0 ,  Y_n \to_L Y   \Longrightarrow   X_n \to_L Y ,   (2.3.2)

that is, the limiting distribution of X_n exists and is the same as that of Y.

PAGE 28

20 Lemma 2.3.3 Let g be a continuous function. Then; ( a ) xn L -> X L g ( X) (2.3.7) g(X ) -> n ( b) xn p X g(Xn) p g ( X) (2.3.8) > -> p -> L 0, Y n -> Y p -> 0.(2.3.9) Lemma 2.3.4 Let g be a continuous matrix valued function of Y n a matrix. Then p -> g ( M) (2.3.10) Proof Since g is a continuous matrix valued function of Y we can let >0 be arbitrary and let o >0 be such that n t IIY Mil< 0 =? lg(Y) g(M) I<. since p y -> M. n (2.3.11) (2.3.12)

PAGE 29

21 p Since Eis arbitrary, g(Yn) -> g(M). This completes the proof. Lemma 2 3.5 Let E = (2.3.13) beak x r matrix of random variables such that the asymptotic distribution of. n 112 e., the i~th row vector of -1 l/ 2 E h .. t V l 2 k n as variance covariance ma r,x ., 1 = 1 Assume that ACov(n 112 e .. n 112 e ) = 0, 'u'H=i, where ACov 1 J 1 J _. is the covariance of the asymptotic distribution. Let a I = (al a2, b = (bl b2' c = (cl c2' and d = ( d l d2' be vectors of constants Then, AVar[n 112 ~ E b 1/2 n C E ~_] = a k) b r) ck) dr) 2 (5 (2.3.14) (2.3.15) (2.3.16) (2.3.17) (2.3.18)

PAGE 30

where k [ 2 b 1 V b + c 2 .d V.d r. a ,,, ,1 = l 2a c.b 1 V d] l 11(2 3.19) 22 and AVar refers to the vartance of the asymptotic distri. bution. Proof k r = n 112 E E a.b.e .. i=l j=l l J lJ (2.3.20) and 1/2 1/2 k r n ~'Ed= n E E c.d.e .. i=l j=l l J lJ (?..3.21) Thus, 2 [k a = AVar E i = l r E (a.b j = l l J 1/2 c.d.)n e .. l J l J k r = E E (a b ~ i=l j=l 1 J 2 1/2 c.d } AVar(n e .. ) l J l J 1/2 1/2 + E ( a b c d ) ( a b. 1 c ,d ) A C o v ( n e j n e ) H l J l J l J l J l l J

PAGE 31

23 where H = { ( i j i j ) : ( i j ) t ( i 1 j ) ; i i" = 1 2 ... k ; j j = 1 2 ... r} .( 2 3 2 2 ) Since ACov(n 112 e .. n 112 e., ., } = 0, Vifi', equation l J l J (2.3.22) can be written as a 2 = (a.b. ~ c.d.) 2 AVar(n 112 e .. ) i=l j=l l J l J lJ 1 /2 1 /2 + L (a.b.-c.d.)(a.b., .. c.d.,)ACov(n e .. ,n e .. ,) W l J l J l J l J lJ lJ k = L [a~b'V.b+c 2 1 -d'V.d-2a.c.b'V.d], 111l 111 = l where H'={(i,j,j'): (i,j)t(i,j'), i=l,2, ... ,k; j,j'=l,2, ... ,r}. (2 3.22) This ccimpletes the proof. We are now ready to derive the asymptotic distribu tion of Tn, as given in (2.3. 1). Theorem 2.3. 1 Under the conditions specified in section 2.2, T _L_> N(O 2) n a a s n + 00 (2.3.23)

PAGE 32

where 2 (J = with k 2 2 E {a.bLv.b+c.d~V.d~2a.c~~ ~ V ~ d} 1 1111l 111 = b = (M'M)'"" 1 l c' = (c 1 ... ck)= l' (M' M).,. 1 M 1 and d = (M' M)-l M' x Proof From equation (2.2 1) we see that 1 f = (M'M) w ~Using equations (2.2.7) and (2.3.29) we obtain l ( t -~) = l { ( y y ) .,. 1 y, X .,. 01'-' M ) .,. 1 M l X } -., n n n24 (2.3.24) (2 3 25) (2.3.26) (2.3.27) (2.3.28) (2.3.29) = l' {(Y'Y )'" 1 .. (M'Mr 1 }Y 'x + l (M'M)'"" 1 (Y'-M')x. (2.3.30) n n nn

PAGE 33

25 From equation (2.2.3) we observe that Y' = M' + E' n n (2.3.31) and Y'Y = M'M + M'E h +E L M+ E ~ E n n n n n n (2.3.32) Substituting (2.3.31) and (2.3.32) into (2.3.30) yields ,e_'(B -8)=,t'{(M'M+M E +E'M+E' E f 1 ~(M'M)"" 1 }(M'+E )x+l'(M'M)-lE' x.(2.3.33) -'---11 n n n n n n By making use of the identity (2.3.34) and letting U = M'M V = M'E + E'M + E'E n n n n (2.3.35) (2.3.33) can be written as + l' (M'Mf 1E'x. n(2.3.36)

PAGE 34

(I+ V) (I .. V) = I v 2 implies that ( I V ) = ( I + V ) l ( I + V ) "" l V 2 Recalling that n 112 e~i) L > N(0, V.) as 1 26 (2.3.37) (2.3.38) n+oo, and letting V be as defined in (2.3.35), we observe that p -> 0, 'uo>0. Thus, from Lemma 2.3.3 l 2 o V 2 P 0 n -> 'tfo > o. (2.3.39) (2.3.40) Combining the results of (2.3.30), (2.3.39), and (2.3.40) and applying Lemma 2.3.2 we obtain (I V) = (I + V)-l + O(n-l) (2.3.41) or (I + V)-l = (I V) + 0(n1 ). (2.3.42) Using the re~ationship given in (2.3.42) and apply ing Lemma 2.3.2, equation (2.3.36) can be written as

PAGE 35

I + ,e ( M I M ) l E I x l' { ,M I M ) l E I x n n~(WM)1 (M'E +E 1 M+E 1 E )(M'M)1 M 1 x n n n n l'(M'M)1 (M'E +E 1 M+E 1 E )(M'M)-lE'x n n n n n27 (2.3.43) Since any matrices involving E~En are of order n1 application of Lemma 2.3.2 reduces equation {2.3.43) to l' ( B -s) = -l 1 (M'M)1 WE (M'M)1 M 1 x -n n = x'[I-M(M'M)-lM']E (M'M)-l,e_ n

PAGE 36

where~, Q, ~, and d are given in (2.3.25), (2.3.26), (2.3.27), and (2.3.28) respectively. Thus, 28 n 1I2 ,e. (8 -B) = -n (2.3.45) From Lemma 2.3.l, we observe that both sides of equation (2.3 45) have the same limiting distribuiion. Thus, from the asymptotic properties of En a nd the appli cation of Lemma 2.3.3 and Lemma 2.3.5, we observe that L 2 Tn -> N(O,o ), (2.3.46) where o 2 is as given in (2.3.24). This completes the proof. In Theorem 2.3. l the asymptotic variance component, o 2 was given in terms of M. El~ments of M involve the true fraction of positive responses. Since the true frac tion of positive responses is unknown in a practical situati9n, we will wish to estimate them and obtain a con~istent estimator of o 2

PAGE 37

Corollary 2.3.l By substituting Yn for Min equation (2.3.24), I" 2 including V terms, we obtain an, a consistent estimator 2 of' a a n d h e n c e 29 A-1 L an Tn -> N(0, 1). (2.3.47) Proof Chebyshev's Inequality states that for any random variable X with mean,, and variance, a 2 P(IX-j~>.a) < l 2' >.> 0. (2.3'.48) A For a binomial random variable, p, the maximum likelihood estimator, of p, has meanJ p, and variance, Q_ E) < 0. n+oo (2.3.50) A Thus, p converges top in probability.

PAGE 38

Recalling that Yn is identical to M except that where M contains functions of p, Yn contains the same A function~ of p. Thus, from lemma 2.3.3 we observe that each element of Yn converges in probability to the cor responding element of M. That is, 30 p Yn > M. (2.3.51) By application of Lemma 2.3 4, we observe that "' 2 P 0 2, a -> n (2.3.52) where cr~ is found by substituting Yn for Min equation (2.3.24). Since (2.3.53) Lemma 2.3.2 justifies that A -1 L an Tn -~N(O, 1). (2.3.54) This completes the proof. Now that the asymptotic distribution of th e estimators has been developed, we will give a nominal (l-a)l00% confidence interval for f' ~We will give a confidence interval of this form because many of the

PAGE 39

estimation problems of interest can be phrased in terms of linear combinations of the~ parameters. Corollary 2.3.2 o ,.,B +_ z ,., n-1/2 .:h -n a/2 n (2.3.55) forms a nominal (l-a.)100% confidence interval for .f. 1 B, where za/ 2 is such that 31 (2.3.56) w h e n Z i s t he s ta n d a rd n o r m a l r a n d om v a r i a b l e Proof Corollary 2.3. 1 implies that T h u s a s n + 00 Therefore in the asymptotic sense A l' B -n (2.3.57) (2.3.59)

PAGE 40

forms a nominal (l-a.)100% confidence interval for l' This completes the proof. Although estimation is often of prime importance, it may also be of interest to test hypotheses of the gen era 1 form 32 H: Ai=O, 0 (2.3.60) where A is a q x r matrix with rank q (1 q r). We will now develop a test statistic appropriate for this general hypothesis. In order to achieve this end, we will consider the asymptotic distributiort of & in -n terms of a multivariate normal framework. We will first give a definition and two lemmas from Rao [ 9 ]. Definition 2.3. 1 A p-dimensional random variable U, that is, a random vari able u taking values in EP (.Euclidean space of p-dimension) is said to have a p-variate normal distribu tion~ NP, if ancl only if every linear function of Uhas a univariate normal distribution. Lemma 2.3.6 If Uhas a p-variate normal distribution, then the joint distribution of q linear functiorts of U is Nq. Let U have mean vector,~, and dispersion matrix, L. If

PAGE 41

Y = CU, where C is (q x p), represents the q linear func, tions, then Y has mean vector, C~, and dispersion matrix, ...... CLC I Lemma 2.3.7 33 Let Ube p-variate n0rmal with mean vector,~' and dispersion matrix, L. Then the necessary and sufficient condition that Q = ( u .E,) I R ( u (2 3.61) has a chi-squared distribution with k degrees of freedom is L ( R L R' R) L = 0 (2 3.62) in which case k = trace (RL). (2.3.63) Theorem 2.3.2 nl/2 (~ ~) _L_> N (O, a) n r (2. 3. 64) where

PAGE 42

34 a .. = l/2(coefficient of 1.1.) l J l J (2.3.65) in (2.3.24), and a .. = (coefficient of 1~) (2.3.66) l l l A A2 in (2.3.24). an is similarly obtained by using an as defined in Corollary 2.3.1. 6n is a consistent estimator of a. Proof In Theorem 2.3.l we proved that every linear combination of normal. Thus, n 1 / 2 (s ~) is asymptotically univariate -n since n 112 (s ~) is an r-dimensional -n random variable, by Definition 2.3.1 in the asymptotic sense n 112 (Bn : ~) has an r-variate normal distribution. T h e m e a n v e c t or a n d d i s p e r s i o n ma t r i x f o 1 1 ow d i re c t 1 y from Theorem 2.3.1. It should be noted that the elements of a are defined as they are because in (2.3.24), coefficient t.1. = 2ACov( 0 (i) g(f)) 1 J 1=..n -n (2.3.67) where B(i) is the i-th element of the vector S and -n -n coefficient 1~ = AVar(S(i)). l -n ( 2. 3 6 8)

PAGE 43

It again follows from Lemma 2.3.3 that each element of Ω̂_n converges in probability to the corresponding element of Ω. Thus,

    \hat{\Omega}_n \to_P \Omega ,   (2.3.69)

and Ω̂_n is thus a consistent estimator. This completes the proof.

Corollary 2.3.3

To test the general hypothesis H_0: Aβ = 0, the test statistic is

    n \, \hat{\beta}_n' A' (A \hat{\Omega}_n A')^{-1} A \hat{\beta}_n \to_L \chi^2_q   if Aβ = 0.   (2.3.70)

Proof

From Lemma 2.3.6 and Theorem 2.3.2,

    n^{1/2} A \hat{\beta}_n \to_L N_q(0, A \Omega A')   if Aβ = 0.   (2.3.71)

Thus, from Lemma 2.3.7, if Aβ = 0,

PAGE 44

    [n^{1/2} A \hat{\beta}_n]' (A \Omega A')^{-1} [n^{1/2} A \hat{\beta}_n] \to_L \chi^2_q ,   (2.3.72)

since, in the notation used in Lemma 2.3.7,

    \Sigma = A \Omega A'   (2.3.73)

and

    R = \Sigma^{-1} .   (2.3.74)

Thus, the condition of Lemma 2.3.7 is satisfied, and

    trace(R\Sigma) = trace(\Sigma^{-1}\Sigma) = trace(I) = q .   (2.3.75)

Application of Lemma 2.3.2 implies that if Aβ = 0,

    n \, \hat{\beta}_n' A' (A \hat{\Omega}_n A')^{-1} A \hat{\beta}_n \to_L \chi^2_q .   (2.3.76)

This completes the proof.
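A minimal sketch of how the statistic of Corollary 2.3.3 would be evaluated in practice. The parameter vector, covariance estimate, contrast matrix, and sample size below are hypothetical placeholders, not quantities produced anywhere in the dissertation.

import numpy as np

n = 120                                            # the quantity n (average sample size)
beta_hat  = np.array([0.80, 0.020, 0.75, 0.021])   # fitted beta for two drugs, r = 2 each
Omega_hat = np.diag([0.50, 0.001, 0.50, 0.001])    # consistent estimate of Omega
A = np.array([[0.0, 1.0, 0.0, -1.0]])              # q x (2r) contrast: equal slopes, q = 1

Ab = A @ beta_hat
chi_sq = float(n * Ab @ np.linalg.solve(A @ Omega_hat @ A.T, Ab))
print("chi-square =", chi_sq, "reject at the 5% level if >", 3.841)   # chi-square(1) critical value

Here A picks out the difference of the two slope parameters, so q = 1; the same construction underlies the parallelism test of Section 3.4.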

PAGE 45

2.4 Summary

We have now developed the general asymptotic theory necessary to solve problems of interest in classical quantal response assays. In Chapter III we will apply these general results to the particular use of the angle transformation. In particular we will discuss estimation of LD(100p) and relative potency. We will also discuss testing the hypothesis of parallelism.

PAGE 46

CHAPTER III
APPLICATION TO THE ANGLE TRANSFORMATION

3.0 Preamble

In Section 3.1 we will discuss the rationale behind choosing the angle transformation to apply the method of inverse regression. We will also justify the proposed model. In Section 3.2 we will discuss the estimation of LD(50). We will include results for the case when the relationship between log-dose and sin^{-1}(p_i^{1/2}) is linear as well as the case of non-linearity. Sections 3.3 and 3.4 will deal with estimation of relative potency and a test for parallelism, respectively. Section 3.5 will be a summary of the results.

3.1 Introduction

In this chapter we will discuss using inverse regression to fit the general model (given here as deterministic)

    \log d_i = \sum_{j=1}^{r} \beta_j \left[ \sin^{-1}(p_i^{1/2}) \right]^{j-1} ,   (3.1.1)

where r < k. We will employ weighted least squares to

PAGE 47

fit the model. Appealing to the notation developed in Section 2.2, the probabilistic parametric model we will thus be using is

    Y_n = M + E_n ,   (3.1.2)

where the elements of Y_n and M are defined by

    Y_{nij} = (n_i/\bar{n})^{1/2} \left[ \sin^{-1}(\hat{p}_i^{1/2}) \right]^{j-1} ,   1 \le i \le k,  1 \le j \le r,  r < k,   (3.1.3)

and

    M_{ij} = (n_i/\bar{n})^{1/2} \left[ \sin^{-1}(p_i^{1/2}) \right]^{j-1} ,   1 \le i \le k,  1 \le j \le r,  r < k,   (3.1.4)

with

    \bar{n} = \sum_{i=1}^{k} n_i / k .   (3.1.5)

The form of E_n will follow shortly.
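A minimal sketch of the construction just defined and of the least squares estimate (2.2.7), under hypothetical doses and counts. The choice of base-10 logarithms is an arbitrary assumption; the text only specifies a log-dose scale.

import numpy as np

dose   = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
n_i    = np.array([20, 20, 25, 20, 20])
events = np.array([2, 7, 13, 17, 19])
r = 3                                              # polynomial of degree r-1 in the angle

n_bar = n_i.mean()                                 # (3.1.5)
w = n_i / n_bar
p_hat = events / n_i
z = np.degrees(np.arcsin(np.sqrt(p_hat)))          # sin^{-1}(p_hat^{1/2}), in degrees

# Y_n has (i, j) entry w_i^{1/2} z_i^{j-1}; x has i-th entry w_i^{1/2} log d_i.
Y = np.sqrt(w)[:, None] * z[:, None] ** np.arange(r)
x = np.sqrt(w) * np.log10(dose)

beta_hat = np.linalg.solve(Y.T @ Y, Y.T @ x)       # (2.2.7): least squares in the scaled model
log_ld50 = float(np.sum(beta_hat * 45.0 ** np.arange(r)))   # evaluate the polynomial at 45 degrees
print(beta_hat, log_ld50)

Evaluating the fitted polynomial at 45 degrees gives the log LD(50) estimate discussed in Section 3.2.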

PAGE 48

40 Justification of the model proposed in (3.1.1) will now be given. A brief discussion of classical methods will first be given Classical methods of analyzing quantal response assays are applicable when there is a linear relationship between the log-dose and the transformed fraction of positive responses. Probit analysis, for example, produces such a linear relationship when tolerances (measured in the log dose scale) have a normal distribution. The tolerance of a subject is the dose level at which that subject would exhibit a positive response. Needless to say, not all quan tal response assays have a normal distribution of tolerances. For assays such as these, probit analysis is not appropriate Although for a non-normal distribution of tolerances another transformation might produce a linear relationship, it would be desirable to find one method of analysis which would be appropriate for a wide class of tolerance distri butions. In many quantal response problems, the fraction of positive response is montomical1y increasing with respect to dose in the area of experimentation of interest. This, of course, implies that dose is monotonically increasing witA respect to the fraction of positive ~esponses. dose and sin-l (p 112 ) are monotone functions. Both 1 ogThus, pl < p 2 .... 1( 1/2) and d 1 < d 2 if and only if log d 1 < log d 2 and sin p 1 < sin ~ 1 (p~ 12 ) Thus, dose is a monotonically increasing

PAGE 49

41 function of the fraction of p6sitive responses if and only if log.-dose is a monotonically increasing function of .. l( l/ 2 ) I h 1d b sin p n t 1s case, tt wou e appropr i ate to model the relationship between log"dose and sin .. 1 (p 112 ) by a polynom i al, which is the model given in (3.l.l). In conclusion, the model given in (3.l.l) is appro .. priate regardless of the distribution of tolerances, as long a~ the fraction of positive responses is an increa~ing function of the dose. Thus, the inverse regression approach is applicable to a much wider class of qwantal response assays t~an classical methods. Theorem 3 1.l (Mean Value Theorem) If f is continuous on [a, b] where a < b and differ entiable on (a, b}, then there exists a point c s(a, b) such that f(b} f (aj = (b a) f' (~). ( 3. l 6 ) We wi 11 now use Theorem 3. 1.1 to prove other useful results. Theorem 3. 1.2 Let a random sample of size m be taken from a binomial population with parameter, p. Then m ,12r:. _, ["112 1 J .... [ 112n _b_> t '(a 1) n p ... s 1 n p JJ 4 (3.1 .7) where sin .. 1 (p 112 ) and sin .. 1 (p 112 ) are measured in radians.

PAGE 50

Proof Let f(x) ,..,1/ 112) = sin \.x (3.1.8) Thus, l[ ) '"'l/2 f 1 ( X) = 2 X ( l ,-,X ] (3.1.9) From Theorem 3. l, l, there exists a c such that Ip-cl s_ Ip-pl and 1/2~. -1 [ "' 1121 m s 1 n p .-1( 1/2]11 s, n p LJ __ 1/2( (\ )lL (l )]"'l/2 m ~-p 2 C ... c Since le-pl 2. lp~pl and then IP-Pl = O(m 112 ), le-pl = O(m11 2 ), Since (3,1.10) (3.l.11) (3.1.12) (3.1.13) 42

PAGE 51

43 1 / 2 A m (p"p) L t 1 (0 1) 1/2 -> l [p(l-p)] (3.1.14) 1 / 2 A m (p-p) 2 [ p ( 1 p ) J 1 / 2 L -> N(O, 1/4). Thus, by Lemma 2.3.2 l lr;) l :.) l/2ir. -1 ["1/2) m L, n p -1 [ 1/2]~ sin p I I L -> N(O, 1/4). (3.1. 16) This completes the pro~f. Theoreni 3. 1.3 Let a random sample of size m be taken from a binomial population with parameter, p. Then for any given {C.}, J _l_> Nfo, 1/4 I (j-l)(k-l)C.ckfsin1 r p 1I2 'j l ( 3 .1. l 7) r uj+k-4 l j=l k=l J L l J

PAGE 52

Proof Let f ( X) f 1 ( X) r = m 112 IC. j = l J X j l r = m 112 I (j-l)C. j=l J 44 (3.l.18) j-2 X (3.1.19) From Theorem 3.1.l~ th~re exists au such that and ~ -1 r"l/2] -1 ( 1/2]~ 1/2 ( l )C j-2 = sin p sin p m L., J. u \. j = l J (3.l.21) Since !~-Pl = O(m112 ), by the Mean Value Theorem,

PAGE 53

~ ilk ,.., [ "J 12 )11 sin p I, .. l ( l / 2}~ k = 0 ( l / 2 ) c,n p m (3, 1.22) where l < k j "" l. Since (3.1.24) Thus, (3.1.25) From the result of Theorem 3.1.2 we can thus conclude that 45

PAGE 54

46 (3.1.26) This completes the pr6of. In the model given in (3. 1.2), En is a random error matrix to explain the asymptotic variability of Yn as compared to M. ~vith Yn and Mas defined ~Y (3.1.3) and (3.1.4), respectively, and using the results of Theorem 3. 1.3, we are now ready to justify the form of The rows of Yn are independent and we have shown in Theorem 3. 1.3 that n /2 C { i n 1 [P I 2 ]~ j l l j=l J 1 r _L_ > N(O, 1/4 I: j =l k n. With n I: n./k, we assume that as n +oo 1 1 1 n l = + >... l (3.1.27)

PAGE 55

47 k i = 1, 2, .. k. Also, L. \i = k, Thus, as n+ 00 each l = 1 n 1 +co. Thus every linear comb1nat1on of n I2 (Y .. ,.. M .. ) nlJ lJ i s a s y mp t o t i c a 1 1 y n o rm a 1 a s n + QO, w h e r e Y n1J and M are 1 J given in (3.1.3) and (3.1.4), respectively. It then follows that the i-th row of En is a random vector such that n 1 / 2 i ) _L_ N r ( Q_, V i ) a s n + oo, (3. 1. 28) where 1 1 ( 1 2) I :7S+t ... A Vist = 4 (s-1)\t-l) sin [P / JJ 1 ~s, t
PAGE 56

48 dose. Since LD(50) is often of prime interest in quantal response assays, the main emphasis of this section will be the discussion of estimating LD(50). Since estimation of LD(lOOp) may also be of interest ~hen p I .5), we will give results concerning this also. We will first recall the notation and define (or redefine) the matrices appropriate for this problem. The classical quantal response assay consists of independently sampling ni subjects at dose, di' i = 1, 2, ... k. The observed response frequency, p i s ca 1 c u 1 ate d f.o r ea c h dos e 1 eve 1 d The tr u e l l response probability for dose, di' is represented by pi. Let 1 k n = k En 1 l l = (3.2.1) The general deterministic model is given by (3.2.2) where r < k. We will employ weighted least squares and write the probabilistic model as Y = M + E n n (3.2.3)

PAGE 58

where E_n is a matrix of independent random vectors. In Section 3.1 we justified that the i-th row of E_n is such that

    n^{1/2} e_n^{(i)} \to_L N_r(0, V_i)   as n \to \infty .   (3.2.6)

In Section 3.1, we gave the form of V_i when sin^{-1}(\hat{p}_i^{1/2}) and sin^{-1}(p_i^{1/2}) are measured in radians. Since

    1 \text{ radian} = 180/\pi \approx 57.2958 \text{ degrees},   (3.2.7)

it follows from (3.1.26) that when sin^{-1}(\hat{p}_i^{1/2}) and sin^{-1}(p_i^{1/2}) are measured in degrees the st-th element of V_i is given by

    V_{ist} = 820.7\,(s-1)(t-1) \left[ \sin^{-1}(p_i^{1/2}) \right]^{s+t-4} ,   1 \le s, t \le r .   (3.2.8)

As in (2.2.7), the weighted least squares estimator is

    \hat{\beta}_n = (Y_n' Y_n)^{-1} Y_n' x .   (3.2.9)

PAGE 59

In order to predict the transformed dose, x_i, at which p_i of the subjects respond, we would use the weighted least squares prediction equation

    \hat{x}_i = w_i^{1/2} \sum_{j=1}^{r} \hat{\beta}_j \left[ \sin^{-1}(p_i^{1/2}) \right]^{j-1} .   (3.2.10)

Rather than estimate the transformed dose, x_i, it is preferred to estimate the dose in the log-dose scale. Thus, the estimate can be given by

    \widehat{\log d_i} = \sum_{j=1}^{r} \hat{\beta}_j \left[ \sin^{-1}(p_i^{1/2}) \right]^{j-1} .   (3.2.11)

To form a (1-α)100% nominal confidence interval for log d_i, we may apply Corollary 2.3.2. The confidence interval is given by

    \ell' \hat{\beta}_n \pm z_{\alpha/2} \, \hat{\sigma}_n n^{-1/2} ,   (3.2.12)

where

    \ell' = \left( 1, \; \sin^{-1}(p_i^{1/2}), \; \ldots, \; \left[ \sin^{-1}(p_i^{1/2}) \right]^{r-1} \right)   (3.2.13)

PAGE 60

"2 and a is given by Corollaiy 2,3.l. n 52 If it is desired to estimate LD(5b), pi = .5. Thus with O sin(p~ 12 ) TI/2. sin-l [(.5) l/~ = 45 ,f_ = l 45 (45)r-l (3.2.14) (3.2.15) We will now discuss in detail the estimation of LD(50) when it is assu~ed that the relationship between log ct. and sin1 (p~ 12 ) is linear. We will use the follow, l ing notation: w. = 1 n l n (3.2.16) Since we are assuming a linear relationship, we are using the model given in (3.2.2) with r = 2. In this case, the weighted least squares estimate off can be expressed as ~ l = ~lJ -n S 2 (3.2.17)

PAGE 61

where

    \hat{\beta}_2 = S_{xy} / S_{yy}   (3.2.18)

and

    \hat{\beta}_1 = \bar{x} - \hat{\beta}_2 \bar{y} ,   (3.2.19)

with

    S_{xy} = \sum_{i=1}^{k} w_i (x_i - \bar{x})(y_i - \bar{y}) ,   (3.2.20)

    \bar{x} = \sum_{i=1}^{k} w_i x_i / k ,   (3.2.21)

and

    \bar{y} = \sum_{i=1}^{k} w_i y_i / k ,   (3.2.22)

where x_i = log d_i, y_i = sin^{-1}(p̂_i^{1/2}), and S_{xx} and S_{yy} are defined analogously to (3.2.20). From (3.2.6) we observe that

PAGE 62

    n^{1/2} e_n^{(i)} \to_L N_2(0, V_i) ,   (3.2.23)

where, from (3.2.8),

    V_i = \begin{pmatrix} 0 & 0 \\ 0 & 820.7 \end{pmatrix} .   (3.2.24)

In order to predict LD(50) in the log-dose scale, we use (3.2.11) and (3.2.14) to obtain

    \widehat{LD}(50) = \hat{\beta}_1 + 45\,\hat{\beta}_2 = \bar{x} + \hat{\beta}_2 (45 - \bar{y}) .   (3.2.25)

The asymptotic variance, σ², of n^{1/2}[\widehat{LD}(50) - LD(50)] can be obtained from (2.3.24). By applying Corollary 2.3.1, a consistent estimator of σ² may be obtained. When a linear relationship is assumed a simplified expression for σ̂²_n can be obtained. This will be shown in the following theorem.

Theorem 3.2.1

A consistent estimate of σ², the asymptotic variance of n^{1/2}[\widehat{LD}(50) - LD(50)], is given by

PAGE 63

    \hat{\sigma}^2_n = 820.7 \left[ (45 - \bar{y})^2 S_{xx} S_{yy}^{-2} + k^{-1} \hat{\beta}_2^2 \right] .   (3.2.26)

Proof

In (2.3.24) σ² was given in terms of M. To find a consistent estimator of σ², M is replaced by Y_n. To find σ̂²_n we will define the following vectors:

    \hat{a}' = x' \{ I - Y_n (Y_n' Y_n)^{-1} Y_n' \} ,   (3.2.27)

    \hat{b} = (Y_n' Y_n)^{-1} \ell ,   (3.2.28)

    \hat{c}' = \ell' (Y_n' Y_n)^{-1} Y_n' ,   (3.2.29)

and

    \hat{d} = (Y_n' Y_n)^{-1} Y_n' x = \hat{\beta}_n ,   (3.2.30)

where x, Y_n, and β̂_n are defined in (3.2.5b), (3.2.4a), and (3.2.17), respectively.

PAGE 64

Since ~t, b, ('\ and d are the same as ~, E_, ~, and~ of Theorem 2.3. 1 with M replaced by Yn, by Corollary 2.3. 1 56 A 2 k A 2 A A A 2 A A ~A~ A a = E (a. b'V b + t. d'V.d 2~.c.b V.d), n l 1l 1l 111 =l (3.2.32) where Vi is defined in (3.2.24) is a consistent estimate 2 of cr A 1/2 l/2A 1/2 a. = w. xw. Bw. Y B 2 l l l l l l l 1/2 1/2~ 1/2~ A 1/2 A = w. X w x + w. y B2 w. Y-B2 l l l l l l Since = w~ 12 [(x.-x) s2(y.-y)]. l l l k 2 E w.y. i = 1 l l (Y 1 Y )-l = n n 1 ks yy -ky y ~12 820 7. -ky k (3.2.33) (3.2.34) (3.2.35)

PAGE 65

w ~/ 2 r k J c. =~I E w.y~ ... 45ky + 45ky. ,. kyy 1 1 kS i=l l l l yy w~/2 k 1 ~ l 2 ~ = ~ 45(y.~y) + -k E w.y. p y.y. S l ._ 1 1 1 1 j yy ,(3.2.36) d I Vi d = ]~ 8 2 0. 7. (3.2.37) A A A A A A ~:y~"-2a .c bV .d -2a.c. B 2 820.7, l 111 1 .) yy (3. 2. 38) where;. and 2 ~ are given in (3.2.33) and (3:2.36) 1 1 respectively. T h u s by s u b s t i t u t i n g ( 3 2 3 3 ) ( 3 2 3 5 ) ( 3 2 3 6 ) (3.2.37), and (3.2.38) in (3.2.32), we obtain &~ = 820. 7 A 2 2 2 2 ~ 2 .. 8 2 Gk A 2 2 ~ 2 + (45) s 2 (y.-y) + 1 E w.y. + B 2 y.y 1 < i=l 1 1 1 57

PAGE 66

58 k 90 ~ 2 2 A 2( P + -k ~ 2(y.~y) E. w.y ... 90 B2 y. ~ y}y.y 1 i=l 1 l l l 2 8 2 k + 90 s 2 2 (y 1 .-Y) 2 (45-y) + -k 2 (y.-y)(45-y) E w.y~ 1 1 1 1 1 = 2 s 2 2 {y.-y)(45-y)y.~. 1 1 j (3.2.39) Performing the summation we obtain 6 2 Gk ~ 2 -2 B 2(45-y) 2 s y + (45} 2 @ 2 2s y + -k 2 I w.y~ .. X y i=l 1 1 k .. 90 s 2 s,s 2 s 2 2 s, 2 E (JJ.y~ 2 yy i=l 1 1

PAGE 67

59 (3.2.40) Since from (3.2.18) (3.2.41) A 2 = 820.7~ ( 45 .. ~)2s + 8 2( 45 .., ~ )2s 0 n "' 2 Y xx f-'2 Y YY s yy r ~ 2 k = 820.71 (45-y)2S + _g_ L w.y~S S 2 L xx k i = l 1 1 yy yy

PAGE 68

    = 820.7 \left[ (45 - \bar{y})^2 S_{xx} S_{yy}^{-2} + k^{-1} \hat{\beta}_2^2 \right] .   (3.2.42)

This completes the proof.

Corollary 3.2.1

    \widehat{LD}(50) \pm z_{\alpha/2} \, \hat{\sigma}_n n^{-1/2}   (3.2.43)

forms a nominal (1-α)100% confidence interval for LD(50), where \widehat{LD}(50) is given in (3.2.25), and σ̂²_n is given in (3.2.26).

Proof

Corollary 3.2.1 follows directly from Corollary 2.3.2.

We will now derive some results for the Knudsen-Curtis [5] method for analyzing quantal response data. We can thus compare the inverse regression approach to their method. Knudsen-Curtis use classical weighted least squares to fit the model

    y_i = b_1 + b_2 x_i + \epsilon_i ,   (3.2.44)

where

PAGE 69

. ~1( (\ 1/2) Yi = sin Pi (3 2.45) (3.2 46) and from an argument similar to that used in Section 3.1 we can assume 1/2 n. E:. l l L -> N(0, 820 7), as n. + 00 l ands. is independent of s for i t j. l J (3.2 47) 61 Since weighted least squa~es will be employed, the model to be fit could be expressed as Letting (3.2.49) we see from (3 : 2.'47) that l /2 _L_> n E: l N(0, 820.7). (3.2.50)

PAGE 70

Using the notation of this section, the weighted least squares estimates are

    \hat{b}_2 = S_{xy} / S_{xx}   (3.2.51)

and

    \hat{b}_1 = \bar{y} - \hat{b}_2 \bar{x} .   (3.2.52)

Thus, to estimate LD(50) in the log-dose scale, Knudsen-Curtis would use

    LD^*(50) = (45 - \hat{b}_1) / \hat{b}_2 .   (3.2.53)

Theorem 3.2.2

    n^{1/2} (\hat{\sigma}^*_n)^{-1} [LD^*(50) - LD(50)] \to_L N(0, 1) ,   (3.2.54)

where

    (\hat{\sigma}^*_n)^2 = 820.7 \left[ \hat{b}_2^{-4} (45 - \bar{y})^2 S_{xx}^{-1} + \hat{b}_2^{-2} k^{-1} \right] .   (3.2.55)

PAGE 71

Proof Froni (3.2.48), k I ( X. -X ) ( y y ) w A i=l l l l 62 = = 5 xx k I (x.-x)y.w. i=l l l l "' s xx Sub~tituting yi from (3.2.44) yields k (x.-i)(b 1 +b 2 x.+E.)w. i=l l l l k I (x.-X)E.W i=l l l l = b2 + 5 xx (3.2.56) (3.2.57) Thus, using the properties of E given in (3.2.50) l implies that (3.2 .58) 63

PAGE 72

In a like manner, it can be shown that L -> 2 N(O, u ), (3.2.59) and any linear function of n 112 (6 1 -b 1 ) and n 112 (S 2 -b 2 ) is asymptotically normal. Now, From (3.2.44) it can be seen that LD(50) = Thus, 45-B [LD*(50) LD(50)] = l = l = 1G 6 2 (45-b 1 ) + (b 1 -6 1 ) b2 + 62 b2 (45-bl) + (b,-b,) l b2-b2 L b2 (3.2.60) (3.2.61) 64

PAGE 73

(3.2.62) Thus, [LD*(50) LD(50)] = (3.2.63) Thus, from (3.2.58) and (3.2.59) L -> 2 N(O. o ). (3.2.64) It still remains to find o 2 In essence we need the asymptotic variance of n 112 LD*(50). We will first find the asymptotic variance of some other variables. 65

PAGE 74

k ,.. 1/2 (x. n X)w.n y. l 1 1 l 1 = = AVar l k 2 ( 1/2 1 = ~ 2 E (x .-x) w.AVar n. Y j l 1 1 1 1 sxx 1= l k 2 = :--2 E (x.~x) w; 820.7 S l 1 = xx ,= 820.7 5 xx k 1/2 AVar(n 112 y) E w.n y. l 1 1 1 = = AVar k l k [ 1/2 1. = 2 E w.AVar n. Y;j k i=l 1 l 1 l k = -2 I: w. 820;7 k l = 1 = 820.7 k 66 (3.2.65) (3.2.66)

PAGE 75

then If we let be such that (y L -> 2 N(O,y ), (45-) + {-y) b2.-B2 b2 = b (45-)+(-y) l+ E l~ ~c 00 [b ... 6?]J 2 j =l 2 67 (3,2.67) (3.2.68) and 8 are asymptotically independent, and 2 terms where j > 2, are of order n"" 1 we obtain that AVar[n 112 LD*(50)] l ( )2 ( l/2r.) l AV ( 1/2~) = 4 45AVar n b 2 + 2 ar n y b2 b2

PAGE 76

= b~ 4 (45~) 2 820 7 + ~" 2 820 7 k~l 2 xx 2 (3.2.69) Thus, 0 2 = AVar{n 112 [LD*(50} LD(50)]} (3.2.70) By Lemma 2.3.3, (a*) 2 is a consistent estimator of 0 2 As in Corollary 2.3. l, This completes the proof. Corollary 3.2.2 L > N(O, l). (3.2.71) Ass u m i n g a l i near model for s i n 1 ( p 1 / 2 ) a g a i n st log-dos e, /0* _P_> l, n .. (3.2.72) and thus, the asymptotic relative efficiency of the inverse method to the Knudsen-Curtis method is unity. 68

PAGE 77

Proof From (3.2.26} and (3.2 ~ 55) we obtain that (45. "' ) 2 s s + y xx yy ( 0)2 "' 3 ~,.. 4 45 ..., ., sxx.:)xy + = r4 where, ~ s r = X is the sample coefficient of correlatfon Thus, lo* n 2 = r Since a linear m6del is assumed, 2 _P_ > l r and thus 69 (3.2.73) (3.2.74) (3.2.75) (3.2.76)

PAGE 78

d /a* _P _> l. n Ihis completes the proof. (3.2.77) It should be noted that the Knudsen~Curtis method is itself asymptotically efficient Thus, the method of inverse regression is asymptotically efficient. Confid~nce int~rvals wit~ nominal (l-a)lOO % coverage are obtained by ei ther A ,., l / 2 LD(5O) I', z a/2 on n (3.2.78) or LD*(5O) za/2 q ~l/2 n (3.2.79) Since the choice of method cannot be made on the basis of asymptotic efficiency, we shall examine the two methods on the basis of robustness. Consider the set of p6ints in R 2 : 70 The inverse regression method consistently esti mates a weight~d least squares line which minimizes the horizontal deviations for the deterministic S .. Assuming 1 n 1 /kn --> 0 1 as n + 00 then the point s 1 carries weight

PAGE 79

0i. Similarly, the Knudsen-Curtis approach consistently estimates a weighted least squares line which minimizes the vertical deviations for the deterministic S .. 1 71 Since the error statements about LD(50), relative potency, etc., are made in the horizontal scale, the inverse met hod, when linearity is false, should tend to have a smaller asymptotic bias than the Knudsen-Curtis method. Corollary 3.2.3 AV a r '[ to ( 5 0 ) ] < AV a r [ L D ( 5 0 )] (3.2.81) Proof From (3.2.75), 4 = r (3.2.82) Thus, (3.2.83) where pis the population coefficient of correlation for a bivariate random variable with mass function P[X = S.] = 0., i 1 1 = l, 2, ... k. (3.2.84)

PAGE 80

ρ = ±1 if and only if there is truly a linear relationship. Otherwise -1 < ρ < 1. Thus, from (3.2.70) we see that

    AVar[\widehat{LD}(50)] \le AVar[LD^*(50)] ,   (3.2.85)

with equality holding when the linear model is correct. This completes the proof.

Thus we can conclude that when estimating LD(50) in the log-dose scale inverse regression seems to yield a more reasonable estimate than the Knudsen-Curtis method as far as robustness is concerned. As a further bonus, \widehat{LD}(50) will tend to be a better estimate than LD*(50) in terms of variances of the asymptotic distributions.

3.3 Estimation of Relative Potency

If two drugs are involved in a quantal response assay, it is often of interest to estimate the relative potency of the two drugs. The relative potency is the ratio of equally effective doses. It should be recalled that relative potency is a valuable measure only if the quantal response curves are parallel. Thus, throughout this section we will assume the response curves are parallel.

In order to estimate the relative potency, we will use the results of Chapter II and Section 3.2. To achieve

PAGE 81

this end we will first give the notation and model for this section.

Table 3.1
Notation Chart

                                 Drug 1                      Drug 2
Dose levels                      d_1, ..., d_{k_1}           d_{k_1+1}, ..., d_k
Sample sizes                     n_1, ..., n_{k_1}           n_{k_1+1}, ..., n_k
Response probability             p_1, ..., p_{k_1}           p_{k_1+1}, ..., p_k
Observed response frequency      p̂_1, ..., p̂_{k_1}           p̂_{k_1+1}, ..., p̂_k
Weight, n_i/n̄                    w_1, ..., w_{k_1}           w_{k_1+1}, ..., w_k

Recall that

    \bar{n} = \sum_{i=1}^{k} n_i / k .   (3.3.1)

Since the curves are assumed to be parallel, the deterministic model can be expressed as

PAGE 82

74 ... 1 [ 1 / 2 1 r r j .l E B.is1n p. I i?k 1 j=lJL ,, (3,3,3) In this case, r+l < k. It can thus be seen that relative potency (3.3.4) The probabilistic model will again be of the form yn = M + En. (3.3.5) The response matrix Yn has ( 3 3 6 ) and y. l ,r+ = w~ 12 i < k 1 l (3.3.7) = 0 (3.3.8) Y "th l ("'l/2) The matrix Mis of the same form as n w, sin Pi replaced by sin~ 1 (p~ 12 ). Also, (3.3.9)

PAGE 83

where the transformed does vector,~' has i-th component and l/2 x = w. log d. l < i < k, ( 3 3 l O) l l l (3 = r s, (3.3.11) In the same manner as was employed in Section 3.1, we can assume that En is a matrix of independent random vectors with l/2 (i) L n ~n --> N r + l ( Q V i ) (3.3.12) It again follows that Vi has the form given in (3.2.8). That is, Vi has entries = 0 otherwise, (3.3.14) and 75

PAGE 84

    \hat{\beta}_n = (Y_n' Y_n)^{-1} Y_n' x   (3.3.15)

is the weighted least squares estimate of β. We can now employ the results of Chapter II to estimate the relative potency. We will perform this estimation in the log-dose scale. Thus, we will estimate β_{r+1} by means of a (1-α)100% confidence interval. If we let ℓ' = (0, ..., 0, 1), then from Corollary 2.3.2

    \ell' \hat{\beta}_n \pm z_{\alpha/2} \, \hat{\sigma}_n n^{-1/2}   (3.3.16)

forms a nominal (1-α)100% confidence interval for β_{r+1}, and thus the relative potency. σ̂²_n is obtained by applying Corollary 2.3.1.

We have thus estimated the relative potency (in the log-dose scale) of two drugs. It was assumed that the quantal response curves are parallel. In the next section we will give a test for parallelism.

3.4 Test for Parallelism

If two drugs are involved in a quantal response assay, it may be of interest to test for parallelism of the response curves. If it is found that the curves are not parallel, it would be inappropriate to attempt to use the relative potency of the two drugs in any way.
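Before turning to the parallelism test, here is a sketch of the parallel-curves fit of Section 3.3 under hypothetical data. The dummy column follows the construction of (3.3.6)-(3.3.8); the confidence interval of (3.3.16) would additionally require σ̂_n from Corollary 2.3.1, which is omitted here.

import numpy as np

dose   = np.array([1.0, 2.0, 4.0, 8.0,   1.5, 3.0, 6.0, 12.0])
n_i    = np.array([20, 20, 20, 20,       20, 20, 20, 20])
events = np.array([3, 8, 14, 18,         4, 9, 15, 19])
drug2  = np.array([0, 0, 0, 0,           1, 1, 1, 1])   # indicator of the second drug
r = 2                                                    # parallel straight lines

w = n_i / n_i.mean()
z = np.degrees(np.arcsin(np.sqrt(events / n_i)))
x = np.sqrt(w) * np.log10(dose)

# Columns 1..r carry w_i^{1/2} z_i^{j-1}; the last column is w_i^{1/2} times the drug-2 dummy,
# so its coefficient is the relative potency in the log-dose scale.
Y = np.column_stack([np.sqrt(w)[:, None] * z[:, None] ** np.arange(r),
                     np.sqrt(w) * drug2])
beta_hat = np.linalg.solve(Y.T @ Y, Y.T @ x)             # (3.3.15)
print("relative potency estimate (log-dose scale):", beta_hat[r])

The parallelism test of the next section can reuse the chi-square construction sketched after Corollary 2.3.3, with A contrasting the two drugs' polynomial coefficients.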

PAGE 85

We will use the same notation as that given in Table 3. 1. The deterministic model is now given by (3.4.1) = 2 { s.lsin1 [p~ 12 ]~j-r-l i>k 1 ,(3.4.2) J=r+l JL lJ where r .:.. min-( k 1 k-k 1 ). The probabilistic model is again given by yn = M + En. (3.4.3) The response matrix Yn has entries I 2 r l[ "1 / 2 )~ j 1 Y n i j = wi L i n p i 1.:.. i ~_kl 1.:._j ~r ( 3 4 4 ) r+l~j_~_2r (3.4.5) = 0 elsewhere. (3.4.6) -l("l/2) Mis defined in a manner similar to Yn with s,n P; 77

PAGE 86

replaced by sin1 (p~ 12 ). Again we have X = M (3.4.7) where the transformed dose vector,~' has i-th component xi = w~/ 2 log di' 1 < i < k, (3.4.8) and = (3.4.9} En is again a matrix 6f random vectors. En can now be represented by 0 (3.4.10) where E(l) and E( 2 ) are k 1 xr and (k-k 1 )xr respectively. 78

PAGE 87

E(l) and E( 2 ) are thus comprised of independent random vectors, and we again have 1/2 (i) L n ~n > Nr (0, Vi)' lk 1 (3.4.13) = Q elsewhere = (Y'Y )-l Y' x n n n is the weighted least squares estimate of~(3.4.14) (3.4.15) Since we desire to test for parallelism, the test of interest can be expressed by the hypothesis H : O This hypothesis is thus of the form (3.4.16) 79

PAGE 88

    H_0 \colon A\beta = 0 ,   (3.4.17)

where the entries of A, an r x 2r matrix, are

    a_{11} = \cdots = a_{rr} = 1 ,   (3.4.18)

    a_{1,r+1} = \cdots = a_{r,2r} = -1 ,   (3.4.19)

and

    a_{ij} = 0   elsewhere.   (3.4.20)

Thus, we can apply Corollary 2.3.3 and test H_0 by the test statistic

    n \, \hat{\beta}_n' A' (A \hat{\Omega}_n A')^{-1} A \hat{\beta}_n \to_L \chi^2_r ,   (3.4.21)

where A, β̂_n, and Ω̂_n are defined in [(3.4.18), (3.4.19), (3.4.20)], (3.4.15), and Theorem 2.3.2, respectively. If this statistic is sufficiently large to reject H_0: Aβ = 0, we can conclude that the response curves are not parallel.

3.5 Summary

We have now discussed the application of the angle transformation to inverse regression of quantal response assays. We have shown that inverse regression will give better asymptotic results than the Knudsen-Curtis method when the relationship between log d_i and sin^{-1}(p_i^{1/2}) is

PAGE 89

81 not truly linear. Inverse regression may be used to fit models other than the linear model whereas the Knudsen Curtis meth~d is not appropriate. We have given methods of forming confidence intervals for LD(lOOp) as well as relative potency. We have also given a test for parallel ism. In Chapter IV we will give examples of numerical application of the results developed in this chapter

PAGE 90

CHAPTER IV
NUMERICAL APPLICATIONS

4.0 Preamble

In this chapter we will apply the results obtained in Chapter III. Several numerical applications will be given. Section 4.1 will give the exact probabilities that 95% nominal confidence intervals cover LD(50). Eight probability schemes will be considered which satisfy a linear relationship between log-dose and sin^{-1}(p_i^{1/2}). In Section 4.2 we will compare the use of inverse regression to other methods of analyzing quantal response assays. Section 4.3 will be a summary of the chapter.

4.1 Exact Coverage Probability (95% Nominal Confidence Interval)

In this section we will investigate small sample results for eight probability schemes satisfying the linear model

    \log d_i = \beta_1 + \beta_2 \sin^{-1}(p_i^{1/2}) .   (4.1.1)

The log-doses, x_i, i = 1, 2, 3, 4, were fixed at four equally spaced values. Equal sample sizes of five, ten,

PAGE 91

and fifteen were considered. We ran all possible assays for the model given in (4.1.1) with the conditions described. For all realizations, we then computed nominal 95% confidence limits for LD(50). These limits were found by use of the results given in Corollary 3.2.1. For the eight different curves, we then computed the exact probability that the true LD(50) lies in the confidence interval.

Based on a pilot study we replaced p̂_i by

    (a)  \hat{p}_i + (2n)^{-1}   if \hat{p}_i < 1/2   (4.1.2)

and

    (b)  \hat{p}_i - (2n)^{-1}   if \hat{p}_i > 1/2   (4.1.3)

prior to taking sin^{-1}(\hat{p}_i^{1/2}). We recommend this continuity correction whenever the sample sizes are relatively small. This continuity correction has no effect on the asymptotic distribution.
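The computations behind each realization in this study are simple to reproduce. The sketch below uses hypothetical counts; the continuity correction is applied with the per-dose sample size n_i, which is what (4.1.2)-(4.1.3) amount to when all sample sizes are equal. It computes the continuity-corrected angle transform, the linear inverse-regression fit, LD-hat(50) from (3.2.25), σ̂²_n from (3.2.26), and the nominal 95% interval of Corollary 3.2.1.

import numpy as np

log_dose = np.array([-1.5, -0.5, 0.5, 1.5])
n_i      = np.array([10, 10, 10, 10])
events   = np.array([1, 4, 7, 10])

n_bar = n_i.mean()
w = n_i / n_bar
p = events / n_i
p = np.where(p < 0.5, p + 1.0 / (2 * n_i),                  # continuity correction (4.1.2)
             np.where(p > 0.5, p - 1.0 / (2 * n_i), p))     # and (4.1.3)
y = np.degrees(np.arcsin(np.sqrt(p)))

k = len(w)
xbar = np.sum(w * log_dose) / k                             # weighted means, (3.2.21)-(3.2.22)
ybar = np.sum(w * y) / k
Sxx = np.sum(w * (log_dose - xbar) ** 2)
Syy = np.sum(w * (y - ybar) ** 2)
Sxy = np.sum(w * (log_dose - xbar) * (y - ybar))

b2 = Sxy / Syy                                              # inverse-regression slope, (3.2.18)
ld50 = xbar + b2 * (45.0 - ybar)                            # (3.2.25)
var_n = 820.7 * ((45.0 - ybar) ** 2 * Sxx / Syy ** 2 + b2 ** 2 / k)   # (3.2.26)
half_width = 1.96 * np.sqrt(var_n / n_bar)                  # z_{.025} * sigma_hat_n * n^{-1/2}
print(ld50, ld50 - half_width, ld50 + half_width)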


The following table summarizes the results obtained.

                                   Table 4.1
                Exact Coverage Probability (95% Nominal Coverage)

Run Number           1      2      3      4      5      6      7      8

p1 (x1 = -1.5)     .039   .029   .152   .087   .230   .319   .230   .152
p2 (x2 =  -.5)     .319   .206   .415   .230   .415   .415   .319   .230
p3 (x3 =   .5)     .708   .485   .708   .415   .614   .515   .415   .319
p4 (x4 =  1.5)     .971   .770   .928   .614   .794   .614   .515   .415
log LD(50)         -.04    .55   -.21    .93   -.07    .35   1.35   2.35

Sample Size                            Coverage
all n_i =  5       .999   .980   .997   .880   .999   .882   .660   .557
all n_i = 10       .994   .954   .986   .912   .995   .962   .749   .633
all n_i = 15       .979   .954   .974   .922   .987   .959   .779   .701

The first four lines of Table 4.1 give the probability of response, p_i, at log-dose, x_i, for the eight linear models considered. The fifth line gives the true value of log LD(50). The last three lines give the exact coverage probability for the various equal sample sizes.


We were limited to relatively small samples in this investigation since there are, for example, more than 65,000 possible realizations (each of varying probability) associated with n_i = 15.

As was previously stated, the above examples are all linear in terms of log-dose against sin⁻¹(p_i^{1/2}). Four different slopes were used. Run 1 had the smallest slope, runs 2 and 3 the next smallest, runs 4 and 5 the second largest, and runs 6, 7, and 8 the largest. Runs 1 through 6 provide excellent small sample approximation, while runs 7 and 8 do not. For runs 7 and 8, there is substantial probability that all the p̂_i's are less than .5, and hence LD(50) must often be estimated by extrapolation. We conjecture that convergence is slow whenever extrapolation is highly probable.

4.2 Estimation of Relative Potency by Various Linear Techniques

In this section we will compare the estimation of relative potency by various methods of analyzing quantal response assays. The data we will analyze are an example presented by Finney [3]. The data are the result of an assay of insulin. Mice were injected with varying doses of insulin or with a test preparation, and the numbers of mice showing the symptoms of collapse or convulsions were recorded. For the data of Finney [3], page 477, we obtained 95% nominal confidence intervals for the relative potency of the insulin as compared to the test preparation. Excellent linear fit was obtained for all methods.


A summary of the analyses is given in Table 4.2.

                              Table 4.2
                   Estimation of Relative Potency

Method            R       LCL      UCL      χ²₁(P)
Probit          13.41    11.11    16.12      .28
Logit           13.38    11.04    16.16      .35
Angle (MLE)     13.50    11.31    16.05      .17
K-C             13.70    11.58    16.20      .35
I-R-L           13.72    11.66    16.18      .19

Legend:
  R      = estimated relative potency
  LCL    = lower 95% confidence limit
  UCL    = upper 95% confidence limit
  χ²₁(P) = chi-square statistic for parallelism, one degree of freedom
  MLE    = maximum likelihood estimation
  K-C    = Knudsen-Curtis method
  I-R-L  = inverse regression: linear (p = 2 in Chapter III)

While the five methods give virtually the same results, the Knudsen-Curtis and inverse regression methods require the more elementary computations and are easier to explain to nonquantitative scientists. In situations where parallelism is reasonable, but linearity is not, we can use inverse regression with r > 2, whereas the other methods are inappropriate.
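For readers who want to see what the I-R-L computation amounts to in the simplest (linear, parallel) case, the following sketch (Python with numpy) fits a common-slope weighted inverse regression to the standard and test data and converts the horizontal displacement of the two fitted lines into a point estimate of relative potency. This is only an illustration under our own simplifying assumptions; the estimator and confidence limits actually used for Table 4.2 are those of Section 3.3, and the sign convention for the displacement should be matched to which preparation is taken as the standard.

```python
import numpy as np

def relative_potency_linear(p_hat_std, d_std, p_hat_test, d_test, w_std, w_test):
    """Point estimate of relative potency from a parallel-line (common slope)
    weighted inverse regression of log-dose on the angle transform.

    Returns exp(log-dose displacement) of the two fitted lines.  The sign
    convention (which preparation appears in the numerator) should be
    checked against the application at hand."""
    g = np.concatenate([np.arcsin(np.sqrt(p_hat_std)),
                        np.arcsin(np.sqrt(p_hat_test))])
    x = np.concatenate([np.log(d_std), np.log(d_test)])
    w = np.sqrt(np.concatenate([w_std, w_test]))
    is_std = np.concatenate([np.ones(len(d_std)), np.zeros(len(d_test))])

    # Columns: standard intercept, test intercept, common slope in the angle scale.
    X = np.column_stack([is_std, 1.0 - is_std, g]) * w[:, None]
    coef, *_ = np.linalg.lstsq(X, w * x, rcond=None)
    a_std, a_test, _slope = coef
    log_R = a_std - a_test          # horizontal displacement of the two lines
    return float(np.exp(log_R))
```

The point estimate alone does not reproduce the LCL and UCL columns of Table 4.2; those intervals require the sampling distribution developed in Section 3.3.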


4.3 Summary

By use of numerical examples we have shown the applicability of inverse regression in analyzing quantal response assays. Since other methods of analysis are restricted to linear models, we have compared inverse regression to some other methods when a linear fit is excellent. As has been stated before, inverse regression can also be applied to quantal response assays when linearity is doubtful. Finally, another reason to use inverse regression to analyze quantal response assays is the computational simplicity and the ease of explaining the results to nonquantitative scientists.


BIBLIOGRAPHY

[1]  Bliss, C. I. (1939). The toxicity of poisons applied jointly. Ann. Appl. Biol. 26, 585-615.

[2]  Finney, D. J. (1971). Probit Analysis. 3rd Ed. Cambridge: University Press.

[3]  Finney, D. J. (1964). Statistical Method in Biological Assay. 2nd Ed. London: Griffin and Co.

[4]  Halperin, M. (1970). On inverse estimation in linear regression. Technometrics 12, 727-36.

[5]  Knudsen, L. F. and Curtis, J. M. (1947). The use of the angular transformation in biological assays. J. Amer. Statist. Assoc. 42, 889-902.

[6]  Krutchkoff, R. G. (1967). Classical and inverse regression methods of calibration. Technometrics 9, 425-39.

[7]  Moore, R. H. and Zeigler, R. K. (1967). The use of non-linear regression methods for analysing sensitivity and quantal response data. Biometrics 23, 565-66.

[8]  Nelder, J. A. (1968). Weighted regression, quantal response data, and inverse polynomials. Biometrics 24, 979-85.

[9]  Rao, C. R. (1965). Linear Statistical Inference and Its Applications. New York: John Wiley and Sons, Inc.

[10] Saw, J. G. (1970). Letter to the editor. Technometrics 12, 937.

[11] Williams, E. J. (1969). A note on regression methods in calibration. Technometrics 11, 189-92.




BIOGRAPHICAL SKETCH

Frank Hain Dietrich II was born on August 9, 1945, in Lewisburg, Pennsylvania. He was graduated from Lewisburg Joint High School in June, 1963. In September of that year he enrolled in Wilkes College, receiving the degree of Bachelor of Arts with a major in mathematics in June, 1967. In September of that year he enrolled in Bucknell University, receiving the degree of Master of Arts with a major in mathematics in January, 1970. The writer also taught high school for the school year 1968-1969 at Selinsgrove Area High School. He entered the University of Florida Graduate School in September, 1970. Mr. Dietrich has worked as a teaching assistant for the Department of Statistics since that time, simultaneously pursuing his work towards the degree of Doctor of Philosophy.


I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

    Jonathan J. Shuster, Chairman
    Associate Professor of Statistics

I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

    J. T. McClave
    Professor of Statistics

I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

    of Statistics

I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

    Pejaver V. Rao
    Professor of Statistics


I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

    Stratton H. Kerr
    Professor of Entomology

This dissertation was submitted to the Graduate Faculty of the Department of Statistics in the College of Arts and Sciences and to the Graduate Council, and was accepted as partial fulfillment of the requirements for the degree of Doctor of Philosophy.

August, 1975

    Dean, Graduate School