Citation
Nonlinear extensions to the minimum average correlation energy filter

Material Information

Title:
Nonlinear extensions to the minimum average correlation energy filter
Creator:
Fisher, John W., 1965-
Publication Date:
Language:
English
Physical Description:
x, 173 leaves : ill. ; 29 cm.

Subjects

Subjects / Keywords:
Associative memory ( jstor )
Discriminants ( jstor )
Entropy ( jstor )
Feature extraction ( jstor )
Image filters ( jstor )
Linear filters ( jstor )
Mathematical vectors ( jstor )
Matrices ( jstor )
Signals ( jstor )
White noise ( jstor )
Dissertations, Academic -- Electrical and Computer Engineering -- UF ( lcsh )
Electrical and Computer Engineering thesis, Ph. D ( lcsh )
Genre:
bibliography ( marcgt )
non-fiction ( marcgt )

Notes

Thesis:
Thesis (Ph. D.)--University of Florida, 1997.
Bibliography:
Includes bibliographical references (leaves 168-172).
General Note:
Typescript.
General Note:
Vita.
Statement of Responsibility:
by John W. Fisher III.

Record Information

Source Institution:
University of Florida
Holding Location:
University of Florida
Rights Management:
Copyright John W. Fisher III. Permission granted to the University of Florida to digitize, archive and distribute this item for non-profit research and educational purposes. Any reuse of this item in excess of fair use or other copyright exemptions requires permission of the copyright holder.
Resource Identifier:
027596855 ( ALEPH )
37163201 ( OCLC )










NONLINEAR EXTENSIONS TO THE
MINIMUM AVERAGE CORRELATION ENERGY FILTER















By

JOHN W. FISHER III


A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF
THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF
THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF
PHILOSOPHY

UNIVERSITY OF FLORIDA


1997







ACKNOWLEDGEMENTS


There are many people I would like to acknowledge for their help in the genesis of this
manuscript. I would begin with my family for their constant encouragement and support.

I am grateful to the Electronic Communications Laboratory and the Army Research
Laboratory for their support of the research at the ECL. I was fortunate to work with very
talented people, Marion Bartlett, Jim Bevington, and Jim Kurtz, in the areas of ATR and
coherent radar systems. In particular, I cannot overstate the influence that Marion Bartlett
has had on my perspective of engineering problems. I would also like to thank Jeff Sichina
of the Army Research Laboratory for providing many interesting problems, perhaps too
interesting, in the field of radar and ATR. A large part of who I am technically has been
shaped by these people.

I would, of course, like to acknowledge my advisor, Dr. Jose Principe, for providing me
with an invaluable environment for the study of nonlinear systems and excellent guidance
throughout the development of this thesis. His influence will leave a lasting impression on
me. I would also like to thank DARPA; funding from this agency enabled a great deal of
the research that went into this thesis. I would also like to thank Drs. David Casasent and
Paul Viola for taking an interest in my work and offering helpful advice.

I would also like to thank the students, past and present, of the Computational
NeuroEngineering Laboratory. The list includes, but is not limited to, Chuan Wang for useful
discussions on information theory; Neil Euliano for providing much needed recreational
opportunities and intramural championship t-shirts; and Andy Mitchell for being a good friend
to go to lunch with, for suffering through long inane technical discussions, and for now
being a better climber than I am. There are certainly others, and I am grateful to all.

Finally I would like to thank my wife, Anita, for enduring a seemingly endless ordeal,
for allowing me to use every ounce of her patience, and for sacrificing some of her best
years so that I could finish this Ph. D. I hope it has been worth it.








TABLE OF CONTENTS


                                                                          Page
ACKNOWLEDGEMENTS                                                            ii
LIST OF FIGURES                                                              v
LIST OF TABLES                                                            viii
ABSTRACT                                                                    ix

CHAPTERS

1 INTRODUCTION                                                               1
  1.1 Motivation                                                             1
2 BACKGROUND                                                                 6
  2.1 Discussion of Distortion Invariant Filters                             6
    2.1.1 Synthetic Discriminant Function                                   12
    2.1.2 Minimum Variance Synthetic Discriminant Function                  15
    2.1.3 Minimum Average Correlation Energy Filter                         18
    2.1.4 Optimal Trade-off Synthetic Discriminant Function                 20
  2.2 Pre-processor/SDF Decomposition                                       24
3 THE MACE FILTER AS AN ASSOCIATIVE MEMORY                                  27
  3.1 Linear Systems as Classifiers                                         27
  3.2 MSE Criterion as a Proxy for Classification Performance               29
    3.2.1 Unrestricted Functional Mappings                                  30
    3.2.2 Parameterized Functional Mappings                                 32
    3.2.3 Finite Data Sets                                                  34
  3.3 Derivation of the MACE Filter                                         35
    3.3.1 Pre-processor/SDF Decomposition                                   38
  3.4 Associative Memory Perspective                                        39
  3.5 Comments                                                              49
4 STOCHASTIC APPROACH TO TRAINING NONLINEAR SYNTHETIC
  DISCRIMINANT FUNCTIONS                                                    52
  4.1 Nonlinear Iterative Approach                                          52
  4.2 A Proposed Nonlinear Architecture                                     53
    4.2.1 Shift Invariance of the Proposed Nonlinear Architecture           55
  4.3 Classifier Performance and Measures of Generalization                 57
  4.4 Statistical Characterization of the Rejection Class                   67
    4.4.1 The Linear Solution as a Special Case                             69
    4.4.2 Nonlinear Mappings                                                70
  4.5 Efficient Representation of the Rejection Class                       72
  4.6 Experimental Results                                                  74
    4.6.1 Experiment I: noise training                                      75
    4.6.2 Experiment II: noise training with an orthogonalization
          constraint                                                        81
    4.6.3 Experiment III: subspace noise training                           84
    4.6.4 Experiment IV: convex hull approach                               89
5 INFORMATION-THEORETIC FEATURE EXTRACTION                                  96
  5.1 Introduction                                                          96
  5.2 Motivation for Feature Extraction                                     97
  5.3 Information Theoretic Background                                     101
    5.3.1 Mutual Information as a Self-Organizing Principle                101
    5.3.2 Mutual Information as a Criterion for Feature Extraction         104
    5.3.3 Prior Work in Information Theoretic Neural Processing            106
    5.3.4 Nonparametric PDF Estimation                                     108
  5.4 Derivation of the Learning Algorithm                                 110
  5.5 Gaussian Kernels                                                     115
  5.6 Maximum Entropy/PCA: An Empirical Comparison                         118
  5.7 Maximum Entropy: ISAR Experiment                                     124
    5.7.1 Maximum Entropy: Single Vehicle Class                            125
    5.7.2 Maximum Entropy: Two Vehicle Classes                             127
  5.8 Computational Simplification of the Algorithm                        127
  5.9 Conversion of Implicit Error Direction to an Explicit Error          136
    5.9.1 Entropy Minimization as Attraction to a Point                    136
    5.9.2 Entropy Maximization as Diffusion                                139
    5.9.3 Stopping Criterion                                               141
  5.10 Observations                                                        143
  5.11 Mutual Information Applied to the Nonlinear MACE Filters            144
6 CONCLUSIONS                                                              151

APPENDIX

A DERIVATIONS                                                              155
REFERENCES                                                                 168
BIOGRAPHICAL SKETCH                                                        173








LIST OF FIGURES


Figure                                                                    Page
1  ISAR images of two vehicle types                                          9
2  MSF peak output response of training vehicle 1a over all aspect angles   10
3  MSF peak output response of testing vehicles 1b and 2a over all aspect
   angles                                                                   11
4  MSF output image plane response                                          12
5  SDF peak output response of training vehicle 1a over all aspect angles   15
6  SDF peak output response of testing vehicles 1b and 2a over all aspect
   angles                                                                   16
7  SDF output image plane response                                          17
8  MACE filter output image plane response                                  20
9  MACE peak output response of vehicles 1a, 1b and 2a over all aspect
   angles                                                                   21
10 Example of a typical OTSDF performance plot                              23
11 OTSDF filter output image plane response                                 24
12 OTSDF peak output response of vehicle 1a over all aspect angles          25
13 OTSDF peak output response of vehicles 1b and 2a over all aspect angles  26
14 Decomposition of distortion invariant filter in space domain             26
15 Adaline architecture                                                     28
16 Decomposition of MACE filter as a preprocessor (i.e. a pre-whitening
   filter over the average power spectrum of the exemplars) followed by a
   synthetic discriminant function                                          39
17 Decomposition of MACE filter as a preprocessor (i.e. a pre-whitening
   filter over the average power spectrum of the exemplars) followed by a
   linear associative memory                                                43
18 Peak output response over all aspects of vehicle 1a when the data
   matrix is not full rank                                                  47
19 Output correlation surface for LMS computed filter from non-full-rank
   data                                                                     48
20 Learning curve for LMS approach                                          49
21 NMSE between closed form solution and iterative solution                 50
22 Decomposition of optimized correlator as a pre-processor followed by
   SDF/LAM (top). Nonlinear variation shown with MLP replacing SDF in
   signal flow (middle); detail of the MLP (bottom). The linear
   transformation represents the space domain equivalent of the spectral
   pre-processor                                                            54
23 ISAR images of two vehicle types shown at aspect angles of 5, 45, and
   85 degrees respectively                                                  59
24 Generalization as measured by the minimum peak response                  62
25 Generalization as measured by the peak response mean square error        63
26 Comparison of ROC curves                                                 64
27 ROC performance measures versus …                                        66
28 Peak output response of linear and nonlinear filters over the training
   set                                                                      77
29 Output response of linear filter (top) and nonlinear filter (bottom)     78
30 ROC curves for linear filter (solid line) versus nonlinear filter
   (dashed line)                                                            79
31 Experiment I: resulting feature space from simple noise training         80
32 Experiment II: resulting feature space when orthogonality is imposed
   on the input layer of the MLP                                            83
33 Experiment II: resulting ROC curve with orthogonality constraint         84
34 Experiment II: output response to an image from the recognition class
   training set                                                             85
35 Experiment III: resulting feature space when subspace noise is used
   for training                                                             88
36 Experiment III: resulting ROC curve for subspace noise training          89
37 Experiment III: output response to an image from the recognition class
   training set                                                             90
38 Learning curves for three methods                                        90
39 Experiment IV: resulting feature space from convex hull training         94
40 Experiment IV: resulting ROC curve with convex hull approach             95
41 Classical pattern classification decomposition                          100
42 Decomposition of NL-MACE as a cascade of feature extraction followed
   by discrimination                                                       100
43 Mutual information approach to feature extraction                       106
44 Mapping as feature extraction. Information content is measured in the
   low dimensional space of the observed output                            108
45 A signal flow diagram of the learning algorithm                         114
46 Gradient of two-dimensional gaussian kernel. The kernels act as
   attractors to low points in the observed PDF of the data when entropy
   maximization is desired                                                 117
47 Mixture of gaussians example                                            118
48 Mixture of gaussians example, entropy minimization and maximization     119
49 PCA vs. entropy, gaussian case                                          120
50 PCA vs. entropy, non-gaussian case                                      122
51 PCA vs. entropy, non-gaussian case                                      123
52 Example ISAR images from two vehicles used for experiments              124
53 Single vehicle experiment, 100 iterations                               125
54 Single vehicle experiment, 200 iterations                               126
55 Single vehicle experiment, 300 iterations                               126
56 Two vehicle experiment                                                  128
57 Two dimensional attractor functions                                     133
58 Two dimensional regulating function                                     134
59 Magnitude of the regulating function                                    134
60 Approximation of the regulating function                                135
61 Feedback functions for implicit error term                              138
62 Entropy minimization as local attraction                                140
63 Entropy maximization as diffusion                                       142
64 Stopping criterion                                                      143
65 Mutual information feature space                                        146
66 ROC curves for mutual information feature extraction (dotted line)
   versus linear MACE filter (solid line)                                  148
67 Mutual information feature space resulting from convex hull exemplars   149
68 ROC curves for mutual information feature extraction (dotted line)
   versus linear MACE filter (solid line)                                  150








LIST OF TABLES


Table                                                                     Page

1 Classifier performance measures when the filter is determined by either of
  the common measures of generalization, as compared to best classifier
  performance, for two values of …                                          61
2 Correlation of generalization measures to classifier performance. In both
  cases (… equal to 0.5 or 0.95) the classifier performance, as measured by
  the area of the ROC curve or Pfa at Pd equal to 0.8, has the opposite
  correlation to what would be expected of a useful measure for predicting
  performance                                                               64
3 Comparison of ROC classifier performance for two values of Pd. Results
  are shown for the linear filter versus four different types of nonlinear
  training. N: white noise training; G-S: Gram-Schmidt orthogonalization;
  subN: PCA subspace noise; C-H: convex hull rejection class                81
4 Comparison of ROC classifier performance for two values of Pd. Results
  are shown for the linear filter versus experiments III and IV from
  section 4.6 and mutual information feature extraction. The symbols
  indicate the type of rejection class exemplars used. N: white noise
  training; G-S: Gram-Schmidt orthogonalization; subN: PCA subspace noise;
  C-H: convex hull rejection class                                         145












Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy


NONLINEAR EXTENSIONS TO THE MINIMUM AVERAGE
CORRELATION ENERGY FILTER


By

John W. Fisher III

May 1997

Chairman: Dr. Jose C. Principe
Major Department: Electrical and Computer Engineering


The major goal of this research is to develop efficient methods by which the family of
distortion invariant filters, specifically the minimum average correlation energy (MACE)
filter, can be extended to a general nonlinear signal processing framework. The primary
application of MACE filters has been to pattern classification of images. Two desirable
qualities of MACE-type correlators are ease of implementation via correlation and analytic
computation of the filter coefficients.

Our motivation for exploring nonlinear extensions to these filters stems from the
well-known limitations of the linear systems approach to classification. Among these
limitations is the attempt to solve the classification problem in a signal representation
space, whereas the classification problem is more properly solved in a decision or
probability space. An additional limitation of the MACE filter is that it can only realize
a linear decision surface, regardless of the means by which it is computed. These
limitations lead to suboptimal classification and discrimination performance.

Extension to nonlinear signal processing is not without cost. Solutions must in general
be computed iteratively. Our approach was motivated by the early proof that the MACE
filter is equivalent to the linear associative memory (LAM). The associative memory
perspective is more properly associated with the classification problem and has been
developed extensively in an iterative framework.

In this thesis we demonstrate a method emphasizing a statistical perspective of the
MACE filter optimization criterion. Through the statistical perspective, efficient methods
of representing the rejection and recognition classes are derived. This, in turn, enables
a machine learning approach and the synthesis of more powerful nonlinear discriminant
functions which maintain the desirable properties of the linear MACE filter, namely,
localized detection and shift invariance.

We also present a new information theoretic approach to training in a self-organized or
supervised manner. Information theoretic signal processing looks beyond the second-order
statistical characterization inherent in the linear systems approach. The information
theoretic framework probes the probability space of the signal under analysis. This
technique has wide application beyond nonlinear MACE filter techniques and represents a
powerful new advance in the area of information theoretic signal processing.

Empirical results, comparing the classical linear methodology to the nonlinear
extensions, are presented using inverse synthetic aperture radar (ISAR) imagery. The
results demonstrate the superior classification performance of the nonlinear MACE filter.













CHAPTER 1

INTRODUCTION

1.1 Motivation

Automatic target detection and recognition (ATD/R) is a field of pattern recognition.
The goal of an ATD/R system is to quickly and automatically detect and classify objects
which may be present within large amounts of data (typically imagery) with a minimum of
human intervention. In an ATD/R system, it is not only desirable to recognize various
targets, but to locate them with some degree of accuracy. The minimum average correlation
energy (MACE) filter [Mahalanobis et al., 1987] is of interest to the ATD/R problem due
to its localization and discrimination properties. The MACE filter is a member of a family
of correlation filters derived from the synthetic discriminant function (SDF) [Hester and
Casasent, 1980]. The SDF and its variants have been widely applied to the ATD/R problem.
We will describe synthetic discriminant functions in more detail in chapter 2. Other
generalizations of the SDF include the minimum variance synthetic discriminant function
(MVSDF) [Kumar, 1986], the MACE filter, and more recently the gaussian minimum average
correlation energy (G-MACE) [Casasent et al., 1991] and the minimum noise and correlation
energy (MINACE) [Ravichandran and Casasent, 1992] filters.

This area of filter design is commonly referred to as distortion-invariant filtering. It
is a generalization of matched spatial filtering for the detection of a single object to
the detection of a class of objects, usually in the image domain. Typically the object
class is represented by a set of exemplars. The exemplar images represent the image class
through a range of "distortions," such as variation in the viewing aspect of a single
object. The goal is to design a single filter which will recognize an object class through
the entire range of distortion. Under the design criterion the filter is equally matched
to the entire range of distortion, as opposed to a single viewpoint as in a matched
filter. Hence the nomenclature distortion-invariant filtering [Kumar, 1992].

The bulk of the research using these types of filters has focused on optical and
infra-red (IR) imagery and on overcoming recognition problems in the presence of
distortions associated with 3-D to 2-D mappings, e.g. scale and rotation (in-plane and
out-of-plane). Recently, however, this technique has been applied to radar imagery [Novak
et al., 1994; Fisher and Principe, 1995a; Chiang et al., 1995]. In contrast to optical or
infra-red imagery, the scale of each pixel within a radar image is usually constant and
known. Consequently, radar imagery does not suffer from scale distortions of objects.

In the family of distortion invariant filters, the MACE filter has been shown to possess
superior discrimination properties [Mahalanobis et al., 1987; Casasent and Ravichandran,
1992]. It is for this reason that this work emphasizes nonlinear extensions to the MACE
filter. The MACE filter and its variants are designed to produce a narrow,
constrained-amplitude peak response when the filter mask is centered on a target in the
recognition class while minimizing the energy in the rest of the output plane. This
property provides desirable localization for detection. Another property of the MACE
filter is that it is less susceptible to out-of-class false alarms [Mahalanobis et al.,
1987]. While the focus of this work will be on the MACE filter criterion, it should be
stated that all of the results presented here are equally applicable to any of the
distortion invariant filters mentioned above, with appropriate changes to the respective
optimization criteria.
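For reference, the MACE design criterion just described can be stated compactly. Following Mahalanobis et al. [1987] (the notation below is the standard frequency-domain formulation from that literature, not taken from this chapter), let $X$ be the matrix whose columns are the DFTs of the training exemplars, $D$ the diagonal matrix containing their average power spectrum, and $d$ the vector of constrained peak values. The criterion is

$$
\min_{h}\; h^\dagger D h \quad \text{subject to} \quad X^\dagger h = d ,
$$

with closed-form solution

$$
h = D^{-1} X \left( X^\dagger D^{-1} X \right)^{-1} d .
$$

Minimizing $h^\dagger D h$ minimizes the average correlation plane energy, while the constraint fixes the peak response at the origin for each training exemplar.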







Although the MACE filter does have superior false alarm properties, it also has some
fundamental limitations. Since it is a linear filter, it can only be used to realize
linear decision surfaces. It has also been shown to be limited in its ability to
generalize to exemplars that are in the recognition class (but not in the training set)
while simultaneously rejecting out-of-class inputs [Casasent and Ravichandran, 1992;
Casasent et al., 1991]. The number of design exemplars can be increased in order to
overcome generalization problems; however, the calculation of the filter coefficients
becomes computationally prohibitive and numerically unstable as the number of design
exemplars is increased [Kumar, 1992]. The MINACE and G-MACE variations have improved
generalization properties with a slight degradation in the average output plane variance
[Ravichandran and Casasent, 1992] and sharpness of the central peak [Casasent et al.,
1991], respectively.

This research presents a basis by which the MACE filter, and by extension all linear
distortion invariant filters, can be extended to a more general nonlinear signal
processing framework. In the development it is shown that the performance of the linear
MACE filter can be improved upon in terms of generalization while maintaining its
desirable properties, i.e. a sharp, constrained peak at the center of the output plane.

A more detailed description of the developmental progression of distortion invariant
filtering is given in chapter 2. In that chapter a qualitative comparison of the various
distortion invariant filters is presented using inverse synthetic aperture radar (ISAR)
imagery. The application of pattern recognition techniques to high-resolution radar
imagery has recently become a topic of great interest with the advent of widely available
instrumentation-grade imaging radars. High-resolution radar imagery poses a special
challenge to distortion invariant filtering in that sources of distortion, such as
rotation in aspect of an object, do not manifest themselves as rotations within the radar
image (as opposed to optical imagery). In this case the distortion is not purely
geometric, but more abstract.

Chapter 3 presents a derivation of the MACE filter as a special case of Kohonen's linear
associative memory [1988]. This relationship is important in that the associative memory
perspective is the starting point for developing nonlinear extensions to the MACE filter.

In chapter 4 the basis upon which the MACE filter can be extended to nonlinear adaptive
systems is developed. In this chapter a nonlinear architecture is proposed for the
extension of the MACE filter. A statistical perspective of the MACE filter is discussed,
which leads naturally into a class representational viewpoint of the optimization
criterion of distortion invariant filters. Commonly used measures of generalization for
distortion invariant filtering are also discussed. The results of the experiments
presented show that the measures are not appropriate for the task of classification. It is
interesting to note that the analysis indicates the appropriateness of the measures is
independent of whether the mapping is linear or nonlinear. The analysis also discusses the
merit of the MACE filter optimization criterion in the context of classification and with
regard to measures of generalization. The chapter concludes with a series of experiments
further refining the techniques by which nonlinear MACE filters are computed.

Chapter 5 presents a new information theoretic method for feature extraction. An
information theoretic approach is motivated by the observation that the optimization
criterion of the MACE filter only considers the second-order statistics of the rejection
class. The information theoretic approach, however, operates in probability space,
exploiting properties of the underlying probability density function. The method enables
the extraction of statistically independent features. The method has wide application
beyond nonlinear extensions to MACE filters and as such represents a powerful new
technique for information theoretic signal processing. A review of information theoretic
approaches to signal processing is presented in this chapter. This is followed by the
derivation of the new technique, as well as some general experimental results which are
not specifically related to nonlinear MACE filters but which serve to illustrate the
potential of the method. Finally, the logical placement of this method within nonlinear
MACE filters is presented along with experimental results.

In chapter 6 we review the significant results and contributions of this dissertation. We
also discuss possible lines of research resulting from the base established here.













CHAPTER 2

BACKGROUND

2.1 Discussion of Distortion Invariant Filters

As stated, distortion invariant filtering is a generalization of matched spatial
filtering. It is well known that the matched filter maximizes the
peak-signal-to-average-noise power ratio, measured at the filter output at a specific
sample location, when the input signal is corrupted by additive white noise.

In the discrete signal case the design of a matched filter is equivalent to the following
vector optimization problem [Kumar, 1986]:

$$
\min_{h}\; h^\dagger h \quad \text{s.t.} \quad x^\dagger h = d ,
\qquad h, x \in \mathbb{C}^{N \times 1} ,
$$

where the column vector $x$ contains the $N$ coefficients of the signal we wish to detect,
$h$ contains the coefficients of the filter ($\dagger$ indicates the hermitian transpose
operator), and $d$ is a positive scalar. This notation is also suitable for N-dimensional
signal processing as long as the signal and filter have finite support and are re-ordered
in the same lexicographic manner (e.g. by row or column in the two-dimensional case) into
column vectors. The optimal solution to this problem is

$$
h = x \left( x^\dagger x \right)^{-1} d .
$$

Given this solution we can calculate the peak output signal power as

$$
\left( x^\dagger h \right)^2
= \left( x^\dagger x \left( x^\dagger x \right)^{-1} d \right)^2
= d^2 ,
$$

and the average output noise power due to an additive white noise input as

$$
\sigma_o^2 = E\{ h^\dagger n n^\dagger h \}
= h^\dagger \Sigma_n h
= \sigma_n^2 h^\dagger h
= \sigma_n^2 d^2 \left( x^\dagger x \right)^{-1} ,
$$

where $\sigma_n^2$ is the input noise variance. This results in a
peak-signal-to-average-noise output power ratio of

$$
\frac{d^2}{\sigma_n^2 d^2 \left( x^\dagger x \right)^{-1}}
= \frac{x^\dagger x}{\sigma_n^2} .
$$

As we can see, the result is independent of the choice of the scalar $d$. If $d$ is set to
unity, the result is a normalized matched spatial filter [Vander Lugt, 1964].
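As a concrete sketch (illustrative only; the array sizes and values are hypothetical, not taken from the experiments in this thesis), the closed-form solution above can be checked numerically for a real-valued signal, including the lexicographic re-ordering of a 2-D template into a column vector:

```python
import numpy as np

# Hypothetical 8x8 "image" template standing in for a signal to detect.
rng = np.random.default_rng(0)
template = rng.standard_normal((8, 8))

# Lexicographic (row-major) re-ordering into a column vector.
x = template.flatten(order="C")

d = 2.0                  # constrained peak output value
h = x * d / (x @ x)      # h = x (x^t x)^{-1} d for a real-valued signal

# The constraint x^t h = d is met exactly.
assert np.isclose(x @ h, d)

# The peak-signal-to-average-noise power ratio for white noise of
# variance sigma2 is (x^t x)/sigma2, independent of the choice of d.
sigma2 = 0.25
snr = (x @ h) ** 2 / (sigma2 * (h @ h))
assert np.isclose(snr, (x @ x) / sigma2)
```

Changing $d$ rescales $h$ but leaves the output SNR unchanged, which is why setting $d = 1$ yields the normalized matched spatial filter.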

In order to further motivate the concept of distortion invariant filtering, a typical ATR
example problem will be used for illustration. This experiment will also help to
illustrate the genesis of the various types of distortion invariant filtering approaches,
beginning with the matched spatial filter (MSF).

Inverse synthetic aperture radar (ISAR) imagery will be used for all of the experiments
presented herein. Distortion invariant filtering, however, is not limited to ISAR imagery
and in fact can be extended to much more abstract data types. ISAR images are shown in
figure 1. In the figure, three vehicles are displayed, each at three different radar
viewing aspect angles (5, 45, and 85 degrees), where the aspect angle is the direction of
the front of the vehicle relative to the radar antenna. The image dimensions are 64 x 64
pixels. Radar systems measure a quantity called radar cross section (RCS). When a radar
transmits an electromagnetic pulse, some of the energy incident on an object is reflected
back to the radar. RCS is a measure of the reflected energy detected by the radar's
receiving antenna. ISAR imagery is the result of a radar signal processing technique which
uses multiple detected radar returns measured over a range of relative object aspect
angles. Each pixel in an ISAR image is a measure of the aggregate radar cross section at
regularly sampled points in space.

Two types of vehicles are shown. Vehicle type 1 will represent a recognition class, while
vehicle type 2 will represent a confusion class. The goal is to compute a filter which
will recognize vehicle type 1 without being confused by vehicle 2. Images of vehicle 1a
will be used to compute the filter coefficients. Vehicles 1b and 2a represent an
independent testing class.

ISAR images of all three vehicles were formed in the aspect range of 5 to 85 degrees at
1 degree increments. As the MSF is derived from a single vehicle image, an image of
vehicle 1a at 45 degrees (the midpoint of the aspect range) is used.

The peak output response to an image represents maximum of the cross correlation

function of the image with the MSF template. The peak output response over the entire

aspect range of vehicle la is shown in figure 2. As can be seen in the figure, the filter

matches at 45 degrees very well; however, as the aspect moves away from 45 degrees, the




















vehicle la (training)


vehicle lb (testing)


vehicle 2a (testing)
Figure 1. ISAR images of two vehicle types. Vehicles are shown at aspect angles
of 5, 45, and 85 degrees respectively. Two different vehicles of type 1 (a
and b) are shown, while one vehicle of type 2 (a) is shown. Vehicle 1a
is used as a training vehicle, while vehicle 1b is used as the testing
vehicle for the recognition class. Vehicle 2a represents a confusion
vehicle.

peak output response begins to degrade. Depending on the type of imagery as well as the


vehicle, this degradation can become very severe.









Figure 2. MSF peak output response of training vehicle 1a over all aspect angles.
Peak response degrades as aspect difference increases.

The peak output responses of both vehicles in the testing set are shown in figure 3

overlain on the training image response. In one sense the filter exhibits good generaliza-

tion, that is, the peak response to vehicle 1b as a function of aspect is much the same as the

peak response to vehicle 1a. However, the filter also "generalizes" equally well to vehi-

cle 2a, which is undesirable. As a vehicle discrimination test (vehicle 1 from vehicle 2) the

MSF fails.









Figure 3. MSF peak output response of testing vehicles 1b and 2a over all aspect
angles. Responses are overlaid on the training vehicle response. Filter
responses to vehicles 1b (dashed line) and 2a (dashed-dot) do not differ
significantly.



The output image plane response to a single image of vehicle la is shown in figure 4.

Refinements to the distortion invariant filter approach, namely the MACE filter, will show

that the localization of this output response, as measured by the sharpness of the peak, can

be improved significantly.

Figure 4. MSF output image plane response.


2.1.1 Synthetic Discriminant Function

The degradation evidenced in figures 2 and 3 was the primary motivation for the syn-

thetic discriminant function (SDF)[Hester and Casasent, 1980]. A shortcoming of the

MSF, from the standpoint of distortion invariant filtering, is that it is only optimum for a

single image. One approach would be to design a bank of MSFs operating in parallel

which were matched to the distortion range. The typical ATR system, however, must rec-

ognize and discriminate multiple vehicle types, so from an implementation standpoint

alone a parallel bank of MSFs is an impractical choice. Hester and Casasent set out to design a sin-






gle filter which could be matched to multiple images using the idea of superposition. This

approach was possible due to the large number of coefficients (degrees of freedom) that

typically constitute 2-D image templates. For historical reasons, specifically that the filters

in question were synthesized optically using holographic techniques [Vander Lugt, 1964],

it was hypothesized that such a filter could be synthesized from linear combinations of a

set of exemplar images.

The filter synthesis procedure consists of projecting the exemplar images onto an

ortho-normal basis (originally Gram-Schmidt orthogonalization was used to generate the

basis). The next step is to determine the coefficients with which to linearly combine the

basis vectors such that a desired response for each original image exemplar is obtained

[Hester and Casasent, 1980].

The proposed synthesis procedure is a bit convoluted. It turns out that the choice of

ortho-normal basis is irrelevant. As long as the basis spans the space of the original exem-

plar images the result is always the same. The development of Kumar [1986] is more use-

ful for depicting the SDF as a generalization of the matched filter (for the white noise

case) to multiple signals. The SDF can be cast as the solution to the following optimiza-

tion problem


min h^t h   s.t.   X^t h = d,   {h ∈ C^(N x 1), X ∈ C^(N x Nt), d ∈ C^(Nt x 1)}

where X is now a matrix whose Nt columns comprise a set of training images¹ we wish

to detect, d is a column vector of desired outputs (one for each of the training exemplars)


1. Since these filters have been applied primarily to 2-D images, signals will be referred to
as images or exemplars from this point on. In the vector notation, all N1 x N2 images are
re-ordered (by row or column) into N x 1 column vectors, where N = N1N2.







and is typically set to all unity values for the recognition class. The images of the data

matrix X comprise the range of distortion that the implemented filter is expected to

encounter. It is assumed that Nt < N, and so the problem formulation is a quadratic optimi-

zation subject to an under-determined system of linear constraints. The optimal solution is


h = X (X^t X)^{-1} d.

When there is only one training exemplar (Nt = 1) and d is unity the SDF defaults to

the normalized matched filter. Similar to the matched filter (white noise case), the SDF is

the linear filter which minimizes the white noise response while satisfying the set of linear

constraints over the training exemplars.
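Computationally, the SDF solution amounts to a few lines of linear algebra. The following sketch uses NumPy with random stand-in data (the matrix X here is hypothetical, not the ISAR exemplars; in practice its columns would hold the re-ordered training images):

```python
import numpy as np

rng = np.random.default_rng(0)

N, Nt = 64, 5                      # pixels per re-ordered image, number of exemplars
X = rng.standard_normal((N, Nt))   # columns: training images as N x 1 vectors
d = np.ones(Nt)                    # desired output for each training exemplar

# SDF: minimize h'h subject to X'h = d  =>  h = X (X'X)^{-1} d
h = X @ np.linalg.solve(X.T @ X, d)

# every training exemplar now produces exactly its desired output
assert np.allclose(X.T @ h, d)
```

Since h lies in the column space of X, the white-noise response h^t h is the minimum achievable subject to the constraints.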

By way of example, the SDF technique is tested against the ISAR data as in the MSF

case. Exemplar images from vehicle 1a were selected every 4 degrees of aspect from 5 to

85 degrees for a total of 21 exemplar images (i.e. Nt = 21). Figure 5 shows the peak out-

put response over all aspects of the training vehicle (1a). As seen in the figure, the degra-

dation as the aspect changes is removed. The MSF response has been overlaid to highlight

the differences.

The peak output response over all exemplars in the testing set is shown in figure 6.

From the perspective of peak response, the filter generalizes fairly well. However, as in the

MSF, the usefulness of the filter as a discriminant between vehicles 1 and 2 is clearly lim-

ited.

Figure 7 shows the resulting output plane response when the SDF filter is correlated

with a single image of vehicle 1a. The localization of the peak is similar to the MSF case.









Figure 5. SDF peak output response of training vehicle 1a over all aspect angles.
The MSF response is also shown (dashed line). The degradation in the
peak response has been corrected.

2.1.2 Minimum Variance Synthetic Discriminant Function

The SDF approach seemingly solved the problem of generalizing a matched filter to

multiple images. However, the SDF has no built-in noise tolerance by design (except for

the white noise case). Furthermore, in practice, it would turn out that occasionally the

noise response would be higher than the peak object response depending on the type of

imagery. As a result, detection by means of searching for correlation peaks was shown to

be unreliable for some types of imagery, specifically imagery which contains recognition

class images embedded in non-white noise [Kumar, 1992]. Kumar [1986] proposed a

method by which noise tolerance could be built in to the filter design. This technique was

termed the minimum variance synthetic discriminant function (MVSDF). The MVSDF is








Figure 6. SDF peak output response of testing vehicles 1b and 2a over all aspect
angles. The dashed line is vehicle 1b while the dashed-dot line is
vehicle 2a.

the correlation filter which minimizes the output variance due to zero-mean input noise

while satisfying the same linear constraints as the SDF. The output noise variance can be

shown to be h^t Σn h, where h is the vector of filter coefficients and Σn is the covariance

matrix of the noise [Kumar, 1986].

Mathematically the problem formulation is


min h^t Σn h   s.t.   X^t h = d

{h ∈ C^(N x 1), X ∈ C^(N x Nt), Σn ∈ C^(N x N), d ∈ C^(Nt x 1)}




























Figure 7. SDF output image plane response.

with the optimal solution


h = Σn^{-1} X (X^t Σn^{-1} X)^{-1} d.

In the case of white noise, the MVSDF is equivalent to the SDF. This technique has a

significant numerical complexity issue: the solution requires the inversion of

an N x N matrix (Σn), which for moderate image sizes (N = N1N2) can be quite large

and computationally prohibitive, unless simplifying assumptions can be made about its

form (e.g. a diagonal or Toeplitz matrix).
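As a sketch of why the simplifying assumptions matter, the MVSDF under a diagonal noise covariance never forms or inverts the full N x N matrix; only element-wise divisions and an Nt x Nt solve are needed (random stand-in data, NumPy; the dimensions are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
N, Nt = 64, 5
X = rng.standard_normal((N, Nt))          # columns: training exemplars as vectors
d = np.ones(Nt)

# Assumed diagonal noise covariance: inversion reduces to element-wise division.
sigma2 = rng.uniform(0.5, 2.0, N)         # diagonal of Sigma_n
Sn_inv_X = X / sigma2[:, None]            # Sigma_n^{-1} X without forming Sigma_n

# MVSDF: h = Sigma_n^{-1} X (X' Sigma_n^{-1} X)^{-1} d
h = Sn_inv_X @ np.linalg.solve(X.T @ Sn_inv_X, d)

assert np.allclose(X.T @ h, d)            # linear constraints are satisfied
```

With sigma2 set to all ones (white noise) the expression reduces to the SDF solution h = X (X^t X)^{-1} d.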

The MVSDF can be seen as a more general extension of the matched filter to multiple

vector detection as most signal processing definitions of the matched filter incorporate a

noise power spectrum and do not assume the white noise case only. It is mentioned here

because it is the first distortion invariant filtering technique to recognize the need to char-

acterize a rejection class.






2.1.3 Minimum Average Correlation Energy Filter

The MVSDF (and the SDF) control the output of the filter at a single point in the out-

put plane of the filter. In practice large sidelobes may be exhibited in the output plane

making detection difficult. These difficulties led Mahalanobis et al. [1987] to propose the

minimum average correlation energy (MACE) filter. This development in distortion invari-

ant filtering attempts as its design goal to control not only the output point when the image

is centered on the filter, but the response of the entire output plane as well. Specifically it

minimizes the average correlation energy of the output over the training exemplars subject

to the same linear constraints as the MVSDF and SDF filters.

The problem is formulated in the frequency domain using Parseval relationships. In

the frequency domain, the formulation is


min H^t D H   s.t.   X^t H = d
{H ∈ C^(N x 1), X ∈ C^(N x Nt), D ∈ C^(N x N), d ∈ C^(Nt x 1)}

where D is a diagonal matrix whose diagonal elements are the coefficients of the average

2-D power spectrum of the training exemplars. The form of the quadratic criterion is

derived using Parseval's relationship. A derivation is given in section A.1 of the appendix.

The other terms, H and X, contain the 2-D DFT coefficients of the filter and training

exemplars, respectively. The vector d is the same as in the MVSDF and SDF cases. The

optimal solution, in the frequency domain, is



H = D^{-1} X (X^t D^{-1} X)^{-1} d.   (1)

As in the MVSDF, the solution requires the inversion of an N x N matrix, but in this

case the matrix D is diagonal and so its inversion is trivial. When the noise covariance






matrix is estimated from observations of noise sequences (assuming wide-sense stationar-

ity and ergodicity), the MVSDF can also be formulated in the frequency domain and the

complex matrix inversion is avoided. A derivation is given in appendix A; examination of

equations (95), (96), and (97) shows that when the noise class is modeled as a stationary,

ergodic random process, the MVSDF solution can be found in the spectral domain using

the estimated power spectrum of the noise process and equation (1).
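A minimal frequency-domain sketch of equation (1), using NumPy and random stand-in images (the dimensions are illustrative, and a unitary 2-D DFT is assumed, matching the derivation in the appendix):

```python
import numpy as np

rng = np.random.default_rng(2)
N1 = N2 = 8
Nt = 4
imgs = rng.standard_normal((Nt, N1, N2))       # stand-ins for training exemplars

# Columns of X: unitary 2-D DFT of each exemplar, re-ordered into a vector
X = np.stack([np.fft.fft2(im, norm="ortho").ravel() for im in imgs], axis=1)
d = np.ones(Nt)

# Diagonal of D: average power spectrum of the training exemplars
Dvec = np.mean(np.abs(X) ** 2, axis=1)

# MACE (eq. 1): H = D^{-1} X (X^t D^{-1} X)^{-1} d, with ^t the conjugate transpose
Dinv_X = X / Dvec[:, None]
H = Dinv_X @ np.linalg.solve(X.conj().T @ Dinv_X, d)

# The correlation-origin constraints hold for every training exemplar
assert np.allclose(X.conj().T @ H, d)
```

Because D is diagonal, its "inversion" is the element-wise division in the second-to-last line.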

In practice, the MACE filter performs better than the MVSDF with respect to rejecting

out-of-class input images. The MACE filter, however, has been shown to have poor gener-

alization properties, that is, images in the recognition class but not in the training exemplar

set are not recognized.

A MACE filter was computed using the same exemplar images as in the SDF example.

Figure 8 shows the resulting output image plane response for one image. As can be seen in

the figure, the peak in the center is now highly localized. In fact it can be shown [Mahal-

anobis et al., 1987] that over the training exemplars (those used to compute the filter) the

output peak will always be at the constraint location.

Generalization to between aspect images, as mentioned, is a problem for the MACE

filter. Figure 9 shows the peak output response over all aspect angles. As can be seen in the

figure, the peak response degrades severely for aspects between the exemplars used to

compute the filter. Furthermore, from a peak output response viewpoint, generalization to

vehicle 1b is also worse. However, unlike the previous techniques, we now begin to see

some separation between the two vehicle types as represented by their peak response.





























Figure 8. MACE filter output image plane response.

2.1.4 Optimal Trade-off Synthetic Discriminant Function

The final distortion invariant filtering technique which will be discussed here is the

method proposed by Réfrégier and Figue [1991], known as the optimal trade-off syn-

thetic discriminant function (OTSDF). Suppose that the designer wishes to optimize over

multiple quadratic optimization criteria (e.g. average correlation energy and output noise

variance) subject to the same set of equality constraints as in the previous distortion invari-

ant filters. We can represent the individual optimization criterion by


J_i = h^t Q_i h,

where Q_i is an N x N symmetric, positive-definite matrix (e.g. Q_i = Σn for the MVSDF

optimization criterion).

The OTSDF is a method by which a set of quadratic optimization criteria may be

optimally traded off against one another; that is, one criterion can be minimized with mini-









Figure 9. MACE peak output response of vehicles 1a, 1b and 2a over all aspect
angles. Degradation to between-aspect exemplars is evident.
Generalization to the testing vehicles as measured by peak output
response is also poorer. Vehicle 1a is the solid line, 1b is the dashed line
and 2a is the dashed-dot line.

mum penalty to the rest. The solution to all such filters can be characterized by the equa-

tion



h = Q^{-1} X (X^t Q^{-1} X)^{-1} d,   (2)

where, assuming M different criteria,


Q = Σ_{i=1}^{M} λ_i Q_i,   Σ_{i=1}^{M} λ_i = 1,   0 ≤ λ_i ≤ 1.








The possible solutions, parameterized by λ_i, define a performance bound which can-

not be exceeded by any linear system with respect to the optimization criteria and the

equality constraints. All such linear filters which optimally trade-off a set of quadratic cri-

teria are referred to as optimal trade-off synthetic discriminant functions.

We may, for example, wish to trade-off the MACE filter criterion versus the MVSDF

filter criterion. This presents the added difficulty that one criterion is specified in the space

domain and the other in the spectral domain. If the noise is represented as zero-mean, sta-

tionary, and ergodic (if the covariance is to be estimated from samples) we can, as men-

tioned, transform the MVSDF criterion to the spectral domain. In this case the optimal

filter has the frequency domain solution,


H = [λDx + (1 - λ)Dn]^{-1} X [X^t [λDx + (1 - λ)Dn]^{-1} X]^{-1} d
  = Dλ^{-1} X [X^t Dλ^{-1} X]^{-1} d,

where Dλ = λDx + (1 - λ)Dn, 0 ≤ λ ≤ 1, and Dn, Dx are diagonal matrices whose

diagonal elements contain the estimated power spectrum coefficients of the noise class and

the recognition class, respectively. The performance bound of such a filter would resemble

figure 10, where all linear filters would fall in the darkened region and all optimal trade-off

filters would lie somewhere on the boundary.
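The trade-off family can be sketched as follows, with random DFT-domain stand-in data. The convention assumed here is that λ weights the average-power-spectrum (ACE) term, so λ near 1 emphasizes the MACE criterion, consistent with the λ = 0.95 example in the text; both the data and the dimensions are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
N, Nt = 64, 4
# stand-ins for DFT-domain training exemplars (columns)
X = rng.standard_normal((N, Nt)) + 1j * rng.standard_normal((N, Nt))
d = np.ones(Nt)

Dx = np.mean(np.abs(X) ** 2, axis=1)   # average power spectrum, recognition class
Dn = np.ones(N)                        # flat spectrum: zero-mean white noise class

def otsdf(lam):
    # Diagonal trade-off matrix kept as a vector: D_lam = lam*Dx + (1 - lam)*Dn
    Dlam = lam * Dx + (1 - lam) * Dn
    DinvX = X / Dlam[:, None]
    return DinvX @ np.linalg.solve(X.conj().T @ DinvX, d)

for lam in (0.0, 0.95, 1.0):           # SDF-like, mostly-MACE, pure MACE
    H = otsdf(lam)
    assert np.allclose(X.conj().T @ H, d)   # constraints hold for every lam
```

Sweeping lam from 0 to 1 traces out the performance bound between the two extreme filters.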

By way of example we again use the data from the MACE and SDF examples. In this

case we will construct an OTSDF which trades off the MACE filter criterion for the SDF

criterion. In order to transform the SDF to the spectral domain, we will assume that the

noise class is zero-mean, stationary, white noise. The power spectrum is therefore flat. One

of the issues for constructing an OTSDF is how to set the value of λ, which represents the












Figure 10. Example of a typical OTSDF performance plot. This plot shows the
trade-off, hypothetically, between the ACE criterion and a noise
variance criterion. The curved arrow on the performance bound indicates
the direction of increasing λ for the two criterion case. The curve is
bounded by the MACE and MVSDF results.

degree by which one criterion is emphasized over another. We will not address that issue

here, but simply set the value to λ = 0.95, indicating more emphasis on the MACE filter

criterion.

The output plane response of the OTSDF is shown in figure 11. As compared to the

MACE filter response, the output peak is not nearly as sharp, but still more localized than

the SDF case.

The peak output response over the training vehicle for the OTSDF is compared to the

MACE filter in figure 12. The degradation to between aspect exemplars is less severe than

the MACE filter. The peak output response of vehicles lb and 2a are shown in figure 13.































Figure 11. OTSDF filter output image plane response.

As compared to the MACE filter the peak response is improved over the testing set. Sepa-

ration between the two vehicle types appears to be maintained.


2.2 Pre-processor/SDF Decomposition

In the sample domain, the SDF family of correlation filters is equivalent to a cascade

of a linear pre-processor followed by a linear correlator [Mahalanobis et al., 1987; Kumar,

1992]. This is illustrated in figure 14 with vector operations. The pre-processor, in the case

of the MACE filter, is a pre-whitening filter computed on the basis of the average power

spectrum of the recognition class training exemplars. In the case of the MVSDF the pre-

processor is a pre-whitening filter computed on the basis of the covariance matrix of the

noise. The net result is that after pre-processing, the second processor is an SDF computed

over the pre-processed exemplars.
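This decomposition can be verified numerically for any symmetric positive-definite quadratic criterion h^t Q h: taking the pre-processor A = Q^{-1/2} (the pre-whitener role of figure 14) and an SDF over the pre-processed exemplars reproduces the direct constrained solution. A sketch with random stand-in data (all dimensions illustrative):

```python
import numpy as np

rng = np.random.default_rng(8)
N, Nt = 12, 3
X = rng.standard_normal((N, Nt))
d = np.ones(Nt)

# A symmetric positive-definite quadratic criterion h'Qh (e.g. a noise covariance)
M = rng.standard_normal((N, N))
Q = M @ M.T + N * np.eye(N)

# Direct constrained solution: h = Q^{-1} X (X' Q^{-1} X)^{-1} d
QinvX = np.linalg.solve(Q, X)
h_direct = QinvX @ np.linalg.solve(X.T @ QinvX, d)

# Cascade: pre-whitener A = Q^{-1/2}, then an SDF over the whitened exemplars
w, V = np.linalg.eigh(Q)
A = V @ np.diag(w ** -0.5) @ V.T           # symmetric inverse square root of Q
Y = A @ X
h_sdf = Y @ np.linalg.solve(Y.T @ Y, d)    # SDF computed from Y
h_cascade = A.T @ h_sdf                    # equivalent single-stage filter

assert np.allclose(h_direct, h_cascade)
```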








Figure 12. OTSDF peak output response of vehicle 1a over all aspect angles.
Degradation to between-aspect exemplars is less than in the MACE
filter (shown in dashed line).

The primary contribution of this research will be to extend the ideas of MACE filtering

to a general nonlinear signal processing architecture and accompanying classification

framework. These extensions will focus on processing structures which improve the gen-

eralization and discrimination properties while maintaining the shift-invariance and local-

ization detection properties of the linear MACE filter.




































Figure 13. OTSDF peak output response of vehicles 1b and 2a over all aspect
angles. Generalization is better than in the MACE filter. Vehicle 1b is
shown in dashed line, vehicle 2a is shown in dashed-dot line.


y = Ax                h = y (y^t y)^{-1} d

input image, x   pre-processor   SDF   scalar output


Filter Decomposition

Figure 14. Decomposition of distortion invariant filter in the space domain. The
notation used assumes that the image and filter coefficients have been
re-ordered into vectors. The input image vector, x, is pre-processed
by the linear transformation, y = Ax. The resulting vector is
processed by a synthetic discriminant function, y_out = y^t h.















CHAPTER 3

THE MACE FILTER AS AN ASSOCIATIVE MEMORY

3.1 Linear Systems as Classifiers

In this chapter we present the MACE filter from the perspective of associative memo-

ries. This perspective is important because it leads to a machine-learning and classification

framework and consequently a means by which to determine the parameters of a nonlinear

mapping via gradient search techniques. We shall refer, herein, to the machine learning/

gradient search methods as an iterative framework. The techniques are iterative in the

sense that adaptations to the mapping parameters are computed sequentially and repeatedly

over a set of exemplars. We shall show that the iterative and classification framework com-

bined with a nonlinear system architecture have distinct advantages over the linear frame-

work of distortion invariant filters.

As we have stated, distortion invariant filters can only realize linear discriminant func-

tions. We begin, therefore, by considering linear systems used as classifiers. The adaline

architecture [Widrow and Hoff, 1960], depicted in figure 15, is an example of a linear sys-

tem used for pattern classification. A pattern, represented by the coefficients xi, is applied

to a linear combiner, represented by the weight coefficients wi, the resulting output y is








then applied to a hard limiter which assigns a class to the input pattern. Mathematically

this can be represented by


c = sgn(y - ρ)
  = sgn(w^T x - ρ),

where sgn(·) is the signum function, ρ is a threshold, and w, x ∈ ℜ^(N x 1) are column

vectors containing the coefficients of the pattern and combiner weights, respectively. In

the context of classification, this architecture is trained iteratively using the least mean

square (LMS) algorithm [Widrow and Hoff, 1960]. For a two class problem the desired

output, d in the figure, is set to ±1 depending on the class of the input pattern; the LMS

algorithm then minimizes the mean square error (MSE) between the classification output

c and the desired output. Since the error function, ε_c, can only take on the three values ±2

and 0, minimization of the MSE is equivalent to minimizing the average number of actual

errors.
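The adaline/LMS training loop can be sketched as follows. The data (two well-separated Gaussian pattern classes), the step size, and the epoch count are all illustrative choices, not taken from the original reference:

```python
import numpy as np

rng = np.random.default_rng(4)

# Two linearly separable pattern classes in the plane, desired outputs +/-1
X = np.vstack([rng.standard_normal((100, 2)) + [2.0, 2.0],
               rng.standard_normal((100, 2)) - [2.0, 2.0]])
d = np.hstack([np.ones(100), -np.ones(100)])

w = np.zeros(2)     # combiner weights
rho = 0.0           # threshold, adapted like a bias weight
mu = 0.01           # LMS step size

for epoch in range(20):
    for x, target in zip(X, d):
        y = w @ x - rho
        err = target - y        # error at the combiner output, not at sgn(y)
        w += mu * err * x       # Widrow-Hoff (LMS) update
        rho -= mu * err

acc = np.mean(np.sign(X @ w - rho) == d)   # fraction correctly classified
```

Note that the hard limiter enters only at classification time; the update itself is driven by the linear-combiner error.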







There are several observations to be made about the adaline/LMS approach to classifi-

cation. One observation is that the adaptation process described uses the error, ε, as mea-

sured at the output of the linear combiner to drive the adaptation process, and not the actual

classification error, ε_c. Another observation is that this approach presupposes that the pat-

tern classes can be linearly separated. A final point, on which we will have more to say, is

that the method uses the MSE criterion as a proxy for classification.


3.2 MSE Criterion as a Proxy for Classification Performance

As we have pointed out, the adaline/LMS approach to classification uses the MSE cri-

terion to drive the adaptation process. It is the probability of misclassification (also called

the Bayes criterion), however, with which we are truly concerned. We now discuss the

consequence of using the MSE criterion as a proxy for classification performance.

It is well known that the discriminant function that minimizes misclassification is

monotonically related to the posterior probability distribution of the class, c, given the

observation x [Fukunaga, 1990]. That is, for the two class problem, if the discriminant

function is



f(x) = p(C2|x) = P2 p(x|C2) / p_x(x),   (3)

where P2 is the prior probability of class 2, p(x|C2) is the conditional distribution of x

given class 2, and p_x(x) = P1 p(x|C1) + P2 p(x|C2) is the unconditional distribution of x;

then the probability of misclassification will be minimized if the following decision rule is used:


f(x) < 0.5   choose class 1   (4)
f(x) > 0.5   choose class 2








For the case of f(x) = 0.5, both classes are equally likely, so a guess must be made.


3.2.1 Unrestricted Functional Mappings

With regards to the adaline/LMS approach we now ask, what is the consequence of

using the MSE criterion for computing discriminant functions? In the two class case, the

source distributions are p(x|C1) or p(x|C2) depending on whether the observation, x, is

drawn from class 1 or class 2, respectively. If we assign a desired output of zero to class 1

and unity to class 2, then the MSE criterion is equivalent to the following



J(f) = (P1/2) E{f(x)^2 | C1} + (P2/2) E{(1 - f(x))^2 | C2},   (5)


where the 1/2 scale factors are for convenience, E{ } is the expectation operator, and

Ci indicates class i.

For now we will place no constraints on the functional form of f(x). In so doing, we

can solve for the optimal solution using the calculus of variations approach. In this case,

we would like to find a stationary point of the criterion J(f) due to small perturbations in

the function f(x) indicated by



δJ = J(f + δf) - J(f)
   = 0.   (6)







The first term of equation (6) can be computed as


J(f + δf) = (P1/2) E{(f + δf)^2 | C1} + (P2/2) E{(1 - f - δf)^2 | C2}
          = (P1/2) E{(f^2 + 2fδf) | C1}
            + (P2/2) E{((1 - f)^2 - 2(1 - f)δf) | C2} + O(δf^2)    (7)
          = J(f) + P1 E{fδf | C1} - P2 E{(1 - f)δf | C2} + O(δf^2),
which can be substituted into equation (6) to yield


δJ = P1 E{fδf | C1} - P2 E{(1 - f)δf | C2}
   = P1 ∫ f(x)δf p(x|C1) dx - P2 ∫ (1 - f(x))δf p(x|C2) dx
   = ∫ [f(x)(P1 p(x|C1) + P2 p(x|C2)) - P2 p(x|C2)] δf dx    (8)
   = ∫ [f(x)p_x(x) - P2 p(x|C2)] δf dx,

where p_x(x) = P1 p(x|C1) + P2 p(x|C2) is the unconditional probability distribution of

the random variable X. In order for f(x) to be a stationary point of J(f), equation (8)

must be zero over all x for any arbitrary perturbation δf(x). Consequently


f(x)p_x(x) - P2 p(x|C2) = 0   (9)








or


f(x) = P2 p(x|C2) / p_x(x)
     = P2 p(x|C2) / (P1 p(x|C1) + P2 p(x|C2))   (10)
     = p(C2|x),

which is the likelihood that the observation is drawn from class 2. If we had reversed the
desired outputs, the result would have been the likelihood that the observation was drawn
from class 1. This result, predicated by our choice of desired outputs, shows that for arbi-

trary f(x), the MSE criterion is equivalent to the probability-of-misclassification crite-
rion. In fact, it has been shown by Richard and Lippmann [1991] (using other means) for

the multi-class case that if the desired outputs are encoded as vectors, e_i ∈ ℜ^(N x 1), where

the ith element is unity and the others are zero, then for an N-class problem the MSE criterion
is equivalent to optimizing the Bayes criterion for classification.

3.2.2 Parameterized Functional Mappings

Suppose, however, that the function is not arbitrary, but is also a function of a parameter

set, α, as in f(x, α). The MSE criterion of equation (5) can be rewritten


J(f) = (P1/2) E{f(x, α)^2 | C1} + (P2/2) E{(1 - f(x, α))^2 | C2}.   (11)

The gradient of the criterion with respect to the parameters becomes



∂J/∂α = P1 E{f(x, α) (∂f(x, α)/∂α) | C1} - P2 E{(1 - f(x, α)) (∂f(x, α)/∂α) | C2}   (12)







and consequently



∂J/∂α = P1 ∫ f(x, α) (∂f(x, α)/∂α) p(x|C1) dx
        - P2 ∫ (1 - f(x, α)) (∂f(x, α)/∂α) p(x|C2) dx
      = ∫ (f(x, α)(P1 p(x|C1) + P2 p(x|C2)) - P2 p(x|C2)) (∂f(x, α)/∂α) dx   (13)
      = ∫ (f(x, α)p_x(x) - P2 p(x|C2)) (∂f(x, α)/∂α) dx.

Examination of equation (13) allows for two possibilities for a stationary point of the crite-
rion. The first, as before, is that


f(x, α) = P2 p(x|C2) / p_x(x)
        = p(C2|x),   (14)

while the second is if we are near a local minimum with respect to α. In other words, if the
parameterized function can realize the Bayes discriminant function via an appropriate
choice of its parameters, then this function represents a global minimum, but this does not
discount the fact that there may be local minima. Furthermore, if the parameterized func-
tion is not capable of representing the Bayes discriminant function there is no guarantee
that the global (or local) minima will result in robust classification.








3.2.3 Finite Data Sets

The previous development does not take into account that in an iterative framework we

are working with observations of a random variable. Therefore, we rewrite the criterion of

equation (5) as finite summations. That is, the criterion becomes



J(f(x, α)) = (P1'/2) Σ_{x_i ∈ C1} f(x_i, α)^2 + (P2'/2) Σ_{x_i ∈ C2} (1 - f(x_i, α))^2,   (15)

where x_i ∈ C_i denotes the set of observations taken from class C_i. Taking the derivative

of this criterion with respect to the parameters, α, yields



∂J/∂α = P1' Σ_{x_i ∈ C1} f(x_i, α) (∂f(x_i, α)/∂α) - P2' Σ_{x_i ∈ C2} (1 - f(x_i, α)) (∂f(x_i, α)/∂α).   (16)

It is assumed that the set of observations from class C1 (x_i ∈ C1) are independent and

identically distributed (i.i.d.), as are the set of observations from class C2 (x_i ∈ C2),

although with a different distribution than class C1. Since the summation terms are bro-

ken up by class, we can assume that the arguments of the summations (functions of dis-

tinct i.i.d. random variables) are themselves i.i.d. random variables [Papoulis, 1991]. If we

set P1'N1 = P1 and P2'N2 = P2, where P1 and P2 are the prior probabilities of classes

C1 and C2, respectively, and N1 and N2 are the number of samples drawn from







each of the classes, we can use the law of large numbers to say that the summations of

equation (16) approach their expected values. In other words, in the limit as N1, N2 → ∞,




∂J/∂α = P1 E{f(x, α) (∂f(x, α)/∂α) | C1} - P2 E{(1 - f(x, α)) (∂f(x, α)/∂α) | C2},   (17)


which is identical to equation (12) and so yields the same solution for the mapping as


f(x, α) = P2 p(x|C2) / p_x(x).   (18)

The conclusion is that if we have a sufficient number of observations to characterize

the underlying distributions then the MSE criterion is again equivalent to the Bayes crite-

rion.
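This limiting argument can be checked numerically: with many samples, the bin-wise average of the desired outputs (the empirical, unrestricted MSE-optimal mapping) approaches the true posterior p(C2|x). A sketch with two 1-D unit-variance Gaussian classes and equal priors (all choices illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

n = 200_000
labels = rng.random(n) < 0.5                       # True -> class C2, prior 1/2
x = np.where(labels, rng.normal(1.0, 1.0, n), rng.normal(-1.0, 1.0, n))
d = labels.astype(float)                           # desired output: 0 for C1, 1 for C2

# Unrestricted MSE-optimal mapping is E{d | x}; estimate it by binning x
bins = np.linspace(-2.0, 2.0, 21)
centers = 0.5 * (bins[:-1] + bins[1:])
idx = np.digitize(x, bins)
f_hat = np.array([d[idx == k].mean() for k in range(1, len(bins))])

def gpdf(t, m):                                    # unit-variance Gaussian density
    return np.exp(-0.5 * (t - m) ** 2) / np.sqrt(2.0 * np.pi)

# True posterior p(C2|x) from Bayes' rule (the equal priors cancel)
post = gpdf(centers, 1.0) / (gpdf(centers, -1.0) + gpdf(centers, 1.0))

assert np.max(np.abs(f_hat - post)) < 0.05         # binned estimate tracks posterior
```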


3.3 Derivation of the MACE Filter

We have already introduced the MACE filter in a previous section. We present a deri-

vation of the MACE filter here. The development is similar to the derivations given in

Mahalanobis [1987] and Kumar [1992]. Our purpose in this presentation of the derivation

is that it serves to illustrate the associative memory perspective of optimized correlators; a

perspective which will be used to motivate the development of the nonlinear extensions

presented in later sections.







In the original development, SDF type filters were formulated using correlation opera-

tions, a convention which will be maintained here. The output, g(n1, n2), of a correlation

filter is determined by


g(n1, n2) = Σ_{m1=0}^{N1-1} Σ_{m2=0}^{N2-1} x*(n1 + m1, n2 + m2) h(m1, m2)
          = x*(n1, n2) ** h(n1, n2),


where x*(n1, n2) is the complex conjugate of an input image with N1 x N2 region of sup-

port, h(n1, n2) represents the filter coefficients, and ** represents the two-dimensional

circular convolution operation [Oppenheim and Schafer, 1989].
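The correlation output above can be checked against an FFT-based implementation; this is a standard DFT identity rather than anything specific to the MACE development (real-valued random stand-in arrays, so the conjugate on x is dropped):

```python
import numpy as np

rng = np.random.default_rng(6)
N1 = N2 = 8
x = rng.standard_normal((N1, N2))
h = rng.standard_normal((N1, N2))

# Direct evaluation: g(n1, n2) = sum_m x(n1+m1, n2+m2) h(m1, m2), indices mod N
g_direct = np.zeros((N1, N2))
for n1 in range(N1):
    for n2 in range(N2):
        for m1 in range(N1):
            for m2 in range(N2):
                g_direct[n1, n2] += x[(n1 + m1) % N1, (n2 + m2) % N2] * h[m1, m2]

# Same result via the DFT: circular correlation is a product of spectra
g_fft = np.fft.ifft2(np.fft.fft2(x) * np.conj(np.fft.fft2(h))).real

assert np.allclose(g_direct, g_fft)
```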

The MACE filter formulation is as follows [Mahalanobis et al., 1987]. Given a set of

image exemplars, {x_i ∈ ℜ^(N1 x N2); i = 1...Nt}, we wish to find filter coefficients,


h ∈ ℜ^(N1 x N2), such that the average correlation energy at the output of the filter, defined as



E_av = (1/Nt) Σ_{i=1}^{Nt} Σ_{n1=0}^{N1-1} Σ_{n2=0}^{N2-1} |g_i(n1, n2)|^2,   (19)

is minimized subject to the constraints


g_i(0, 0) = Σ_{m1=0}^{N1-1} Σ_{m2=0}^{N2-1} x_i*(m1, m2) h(m1, m2) = d_i;   i = 1...Nt.   (20)

Mahalanobis [1987] reformulates this as a vector optimization in the spectral domain

using Parseval's theorem. In the spectral domain we wish to find the elements of

H ∈ C^(N1N2 x 1), a column vector whose elements are the 2-D DFT coefficients of the space








domain filter h reordered lexicographically. Let the columns of the data matrix


X ∈ C^(N1N2 x Nt) contain the 2-D DFT coefficients of the exemplars {x_1, ..., x_Nt}, also


reordered into column vectors. The diagonal matrix D_i ∈ ℜ^(N1N2 x N1N2) contains the mag-


nitude squared of the 2-D DFT coefficients of the ith exemplar. These matrices are aver-

aged to form the diagonal matrix D as



D = (1/Nt) Σ_{i=1}^{Nt} D_i,   (21)


which then contains the average power spectrum of the training exemplars. Minimizing

equation (19) subject to the constraints of equation (20) is equivalent to minimizing


H^t D H,   (22)

subject to the linear constraints


X^t H = d,   (23)

where the elements of d ∈ C^(Nt x 1) are the desired outputs corresponding to the exemplars.

The solution to this optimization problem can be found using the method of Lagrange

multipliers. In the spectral domain, the filter that satisfies the constraints of equation (20)

and minimizes the criterion of equation (19) [Mahalanobis et al., 1987;Kumar, 1992] is



H = D-'X(XtD-tX)-'d, (24)


where H ∈ C^(N1N2 x 1) contains the 2-D DFT coefficients of the filter, assuming a unitary 2-
D DFT.¹








3.3.1 Pre-processor/SDF Decomposition

As observed by Mahalanobis [1987], the MACE filter can be decomposed as a syn-

thetic discriminant function preceded by a pre-whitening filter. Let the matrix

B = D^{-1/2}, where B is diagonal with diagonal elements equal to the inverse of the

square root of the diagonal elements of D. We implicitly assume that the diagonal ele-

ments of D are non-zero; consequently B^t B = D^{-1} and B^t = B. Equation (24) can

then be rewritten as



H = B(BX)((BX)^t (BX))^{-1} d.   (25)

Substituting Y = BX, representing the original exemplars preprocessed in the spec-

tral domain by the matrix B, equation (25) can be written



H = BY(Y^t Y)^{-1} d.   (26)

The term H' = Y(Y^t Y)^{-1} d is recognized as the SDF computed from the preprocessed

exemplars Y. The MACE filter solution can therefore be written as a cascade of a pre-

whitener (over the average power spectrum of the exemplars) followed by a synthetic dis-

criminant function, depicted in figure 16, as



H = BH'. (27)


1. If the DFT were as defined in [Oppenheim and Shafer, 1989] then a scale factor of
NIN2 would be necessary.
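The equivalence of equations (25)-(27) with the direct solution (24) can be checked numerically. The sketch below (an illustration added here, with random data standing in for the exemplars) builds both forms and confirms they coincide:

```python
import numpy as np

rng = np.random.default_rng(1)
# Spectral data matrix X for three 6x6 exemplars (unitary 2-D DFT).
X = np.stack([np.fft.fft2(rng.standard_normal((6, 6)), norm="ortho").ravel()
              for _ in range(3)], axis=1)
d = np.ones(3)
Dinv = 1.0 / np.mean(np.abs(X) ** 2, axis=1)   # diagonal of D^-1
B = np.sqrt(Dinv)                              # diagonal of B = D^(-1/2)

# Direct solution, equation (24): H = D^-1 X (X^+ D^-1 X)^-1 d.
H_direct = (Dinv[:, None] * X) @ np.linalg.solve(
    X.conj().T @ (Dinv[:, None] * X), d)

# Cascade: pre-whiten (Y = BX), SDF on the whitened data
# (H' = Y (Y^+ Y)^-1 d), then apply B once more, equation (27): H = B H'.
Y = B[:, None] * X
H_prime = Y @ np.linalg.solve(Y.conj().T @ Y, d)
H_cascade = B * H_prime

print(np.allclose(H_direct, H_cascade))
```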













Figure 16. Decomposition of MACE filter as a preprocessor (i.e. a pre-whitening
filter over the average power spectrum of the exemplars) followed by a
synthetic discriminant function.

3.4 Associative Memory Perspective

Having presented the derivation of the MACE filter and the pre-processor/SDF decom-

position, we now show that with a modification (addition of a linear pre-processor), the

MACE filter is a special case of Kohonen's linear associative memory [1988].

Associative memories [Kohonen, 1988] are general structures by which pattern vec-

tors can be related to one another, typically in an input/output pair-wise fashion. An input

stimulus vector is presented to the associative memory structure resulting in an output

response vector. The input/output pairs establish the desired response to a given input. In

the case of an auto-associative memory, the desired response is the stimulus vector,

whereas, in a hetero-associative memory the desired response is arbitrary. From a signal

processing perspective, associative memories are viewed as projections [Kung, 1992], lin-

ear and nonlinear. The input patterns exist in a vector space and the associative memory

projects them onto a new space. The linear associative memory of Kohonen [1988] is for-

mulated exactly in this way.

A simple form of the linear hetero-associative memory maps vectors to scalars. It is

formulated as follows. Given the set of input/output vector/scalar pairs
$\{x_i \in \Re^{N \times 1},\; d_i \in \Re,\; i = 1 \ldots N_t\}$, which are placed into an input data matrix,
$x = [x_1 \ldots x_{N_t}]$, and desired output vector, $d = [d_1 \ldots d_{N_t}]^T$, find the vector
$h \in \Re^{N \times 1}$ such that

$$x^T h = d. \qquad (28)$$

If the system of equations described by (28) is under-determined, the inner product

$$h^T h \qquad (29)$$

is minimized using (28) as a constraint. If the system of equations is over-determined,

$$(x^T h - d)^T (x^T h - d)$$

is minimized.

Here, we are interested in the under-determined case. The optimal solution for the
under-determined case, using the pseudo-inverse of $x$, is [Kohonen, 1988]

$$h = x(x^T x)^{-1} d. \qquad (30)$$
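As a concrete illustration (added here; the dimensions are arbitrary), the minimum-norm solution of equation (30) can be computed directly and checked against NumPy's pseudo-inverse:

```python
import numpy as np

rng = np.random.default_rng(2)
N, Nt = 16, 4                     # more unknowns than constraints: under-determined
x = rng.standard_normal((N, Nt))  # columns are the stimulus vectors x_i
d = rng.standard_normal(Nt)       # desired scalar responses d_i

# Minimum-norm solution of x^T h = d, equation (30): h = x (x^T x)^-1 d.
h = x @ np.linalg.solve(x.T @ x, d)

print(np.allclose(x.T @ h, d))                  # constraints are met exactly
print(np.allclose(h, np.linalg.pinv(x.T) @ d))  # agrees with the pseudo-inverse
```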

As was shown in [Fisher and Principe, 1994], we can modify the linear associative
memory model slightly by adding a pre-processing linear transformation matrix, $A$, and
find $h$ such that the under-determined system of equations

$$(Ax)^T h = d \qquad (31)$$

is satisfied while $h^T h$ is minimized. As in the MACE filter, this optimization can be
solved using the method of Lagrange multipliers. We adjoin the system of constraints to
the optimization criterion as

$$J = h^T h + \lambda^T\left((Ax)^T h - d\right), \qquad (32)$$

where $\lambda \in \Re^{N_t \times 1}$ is a column vector of Lagrange multipliers, one for each constraint
(desired response). Taking the gradient of equation (32) with respect to the vector $h$ yields

$$\frac{\partial J}{\partial h} = 2h + Ax\lambda. \qquad (33)$$

Setting the gradient to zero and solving for the vector $h$ yields

$$h = -\frac{1}{2} Ax\lambda. \qquad (34)$$

Substituting this result into the constraint equations of (31) and solving for the Lagrange
multipliers yields

$$\lambda = -2\left((Ax)^T Ax\right)^{-1} d. \qquad (35)$$

Substituting this result back into equation (34) yields the final solution to the optimization
as

$$h = Ax(x^T A^T A x)^{-1} d. \qquad (36)$$

If the pre-processing transformation, A, is the space-domain equivalent of the MACE

filter's spectral pre-whitener and the columns of the data matrix x contain the re-ordered

elements of the images from the MACE filter problem then equation (36) combined with








the pre-processing transformation yields exactly the space domain coefficients of the

MACE filter. This can be shown using a unitary discrete Fourier transformation (DFT)

matrix.


If $U \in \mathbb{C}^{N_1 \times N_2}$ is the DFT of the image $u \in \Re^{N_1 \times N_2}$, we can reorder both $U$ and $u$
into column vectors, $U \in \mathbb{C}^{N_1 N_2 \times 1}$ and $u \in \Re^{N_1 N_2 \times 1}$ respectively. We can then
implement the 2-D DFT as a unitary transformation matrix, $\Phi$, such that

$$U = \Phi u, \qquad u = \Phi^\dagger U, \qquad \Phi\Phi^\dagger = I.$$
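A unitary 2-D DFT matrix $\Phi$ of this kind can be built explicitly for small images; with row-major (lexicographic) reordering it is the Kronecker product of two 1-D unitary DFT matrices. The sketch below is illustrative only:

```python
import numpy as np

def unitary_dft_matrix(N):
    # 1-D unitary DFT matrix: W[k, n] = exp(-2*pi*j*k*n/N) / sqrt(N).
    n = np.arange(N)
    return np.exp(-2j * np.pi * np.outer(n, n) / N) / np.sqrt(N)

N1, N2 = 4, 4
# With row-major reordering of the image, the 2-D DFT matrix is a
# Kronecker product of the two 1-D DFT matrices.
Phi = np.kron(unitary_dft_matrix(N1), unitary_dft_matrix(N2))

# Phi is unitary: Phi Phi^+ = I.
print(np.allclose(Phi @ Phi.conj().T, np.eye(N1 * N2)))

# Phi applied to a reordered image matches the (unitary) 2-D FFT.
u = np.random.default_rng(3).standard_normal((N1, N2))
print(np.allclose(Phi @ u.ravel(), np.fft.fft2(u, norm="ortho").ravel()))
```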

In order for the transformation $A$ to be the space domain equivalent of the spectral
pre-whitener of the MACE filter, the relationship

$$Ax = \Phi^\dagger Y = \Phi^\dagger B X = \Phi^\dagger B \Phi x,$$

where $B$ is the same matrix as in equation 27, must be true, which, by inspection, means
that

$$A = \Phi^\dagger B \Phi. \qquad (37)$$

Substituting equation (37) into equation (36) and using the property $B^\dagger B = BB = D^{-1}$
yields

$$\begin{aligned}
h &= Ax\,(x^\dagger A^\dagger A x)^{-1} d \\
  &= \Phi^\dagger B \Phi x\,(x^\dagger \Phi^\dagger B \Phi \Phi^\dagger B \Phi x)^{-1} d \\
  &= \Phi^\dagger B \Phi x\,(x^\dagger \Phi^\dagger D^{-1} \Phi x)^{-1} d \\
  &= \Phi^\dagger B X\,(X^\dagger D^{-1} X)^{-1} d.
\end{aligned} \qquad (38)$$








Combining this solution for $h$ with the pre-processor in equation (31), the equivalent
linear system $h_{sys}$ is

$$\begin{aligned}
h_{sys} &= A h \\
  &= A \Phi^\dagger B X (X^\dagger D^{-1} X)^{-1} d \\
  &= \Phi^\dagger B \Phi \Phi^\dagger B X (X^\dagger D^{-1} X)^{-1} d \\
  &= \Phi^\dagger D^{-1} X (X^\dagger D^{-1} X)^{-1} d.
\end{aligned}$$

Substituting the MACE filter solution, equation (24), gives the result

$$h_{sys} = \Phi^\dagger H_{MACE}, \qquad (39)$$

and so $h_{sys}$ is the inverse DFT pair of the spectral domain MACE filter. This result
establishes the relationship between the MACE filter and the linear associative memory. The
decomposition of the MACE filter of figure 16 can also be considered as a cascade of a
linear pre-processor followed by a linear associative memory (LAM) as in figure 17.



Figure 17. Decomposition of MACE filter as a preprocessor (i.e. a pre-whitening
filter over the average power spectrum of the exemplars)
followed by a linear associative memory.


Since the two are equivalent, why make the distinction between the two perspectives?
There are several reasons. The development of distortion invariant filtering and associative
memories has proceeded in parallel. Distortion invariant filtering has been






concerned with finding projections which will essentially detect a set of images. Towards

this goal the techniques have emphasized analytic solutions resulting in linear discrimi-

nant functions. Advances have been concerned with better descriptions of the second order

statistics of the causes of false detections. The approach, however, is still a data driven

approach. The desired recognition class is represented through exemplars. In the distortion

invariant filtering approach, the task has been confined to fitting a hyper-plane to the
recognition exemplars subject to various quadratic optimization criteria.

The development of associative memories has proceeded along a different track. It is

also data driven, but the emphasis has been on iterative machine learning methods. Many

of the methods are biologically motivated, including the perceptron learning rule
[Rosenblatt, 1958] and Hebbian learning [Hebb, 1949]. Other methods, including the
least-mean-square (LMS) algorithm [Widrow and Hoff, 1960] (which we have described) and the

backpropagation algorithm [Rumelhart et al., 1986; Werbos 1974], are gradient descent

based methods.

From the classification standpoint, of which the ATR problem is a subset, iterative

methods have certain advantages. This can be illustrated with a simple example. Suppose

the data matrix


$$x = [x_1, x_2, \ldots, x_{N_t}] \in \Re^{N_1 N_2 \times N_t}$$

were not full rank. In other words, the exemplars representing the recognition class could
be represented without error in a subspace of dimension less than $N_t$. From an ATR
perspective this would be a desirable property. The implicit assumption in any data driven

method is that information about the recognition class is transmitted through exemplars.

This is as true for distortion invariant filters, which have analytic solutions, as it is for iter-








ative methods. The smaller the dimension of the subspace in which the recognition class

lies, the better we can discriminate images considered to be out of the class. One limitation

of the analytic solutions of distortion invariant filters is that they require the inverse of a

matrix of the form


$$x^T Q x, \qquad (40)$$

where Q is a positive definite matrix representing a quadratic optimization criterion. If the

matrix, x, is not full column rank there is no inverse for the matrix of (40) and conse-

quently no analytic solution for any of the distortion invariant filters. The LMS algorithm,

however, will still find a best fit to the design goal, which is to minimize the criterion while

satisfying the linear constraints.

We can illustrate this by modifying the data from the experiments in section 2.1. It is

well known that the data matrix x can be decomposed using the singular value decompo-

sition (SVD) as


$$x = U \Lambda V^T,$$

where the columns of $U \in \Re^{N_1 N_2 \times N_t}$ form an ortho-normal basis (the principal
components of the vectors $x_i$, in fact), the diagonal matrix $\Lambda \in \Re^{N_t \times N_t}$ contains
the singular values of the data matrix, and $V \in \Re^{N_t \times N_t}$ is unitary. The columns of the
data matrix can be projected onto a subspace by setting one of the diagonal elements of
$\Lambda$ to zero. The importance of any of the basis vectors in $U$ is directly proportional to the
singular value. In this case $N_t = 21$, so we can choose one of the smaller singular values
to set to zero without








changing the basic structure of the data. For this example we choose the twelfth largest
singular value. A data matrix $x_{sub}$ is generated by



$$x_{sub} = U \begin{bmatrix} \Lambda_{1\text{--}11} & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & \Lambda_{13\text{--}21} \end{bmatrix} V^T,$$

where $\Lambda_{i\text{--}j}$ is a diagonal matrix containing the $i$ through $j$ singular values of the
original data matrix $x$.
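The construction of $x_{sub}$ can be sketched directly with NumPy's SVD routine (random data stands in for the ISAR exemplars here):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.standard_normal((64, 21))   # stand-in data matrix with N_t = 21 exemplars

# Thin SVD: x = U diag(s) V^T.
U, s, Vt = np.linalg.svd(x, full_matrices=False)

# Zero the twelfth-largest singular value to confine the columns to a
# 20-dimensional subspace without changing the basic structure of the data.
s_sub = s.copy()
s_sub[11] = 0.0
x_sub = U @ np.diag(s_sub) @ Vt

print(np.linalg.matrix_rank(x), np.linalg.matrix_rank(x_sub))
```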

This data matrix is not full rank, so there is no analytic solution for the MACE filter;
however, we can use the LMS approach and derive a linear associative memory. The
columns of $x_{sub}$ are pre-processed with a pre-whitening filter computed over the average
power spectrum. The LMS algorithm can then be used to iteratively compute the
transformation that best fits

$$x_{sub}^T h = d$$

in a least squares sense; that is, we can find the $h$ that minimizes

$$(x_{sub}^T h - d)^T (x_{sub}^T h - d),$$

where $d$ is a column vector of desired responses (set to all unity in this case).
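A minimal LMS loop for this least squares fit might look as follows; this is an added sketch, with random stand-in data, and the step size and epoch count are illustrative choices rather than values from the text (stability requires the step size be small relative to the exemplar energies):

```python
import numpy as np

rng = np.random.default_rng(5)
N, Nt = 32, 6
x = rng.standard_normal((N, Nt))   # pre-processed exemplars (random stand-ins)
d = np.ones(Nt)                    # desired peak responses, all unity

h = np.zeros(N)
eta = 0.01                         # step size; must be small for LMS stability
for epoch in range(5000):          # one epoch = one pass through the exemplars
    for i in range(Nt):
        e = d[i] - x[:, i] @ h     # instantaneous (sample-by-sample) error
        h += eta * e * x[:, i]     # LMS weight update

# Full-rank, under-determined system: the constraints are met to high accuracy.
print(np.max(np.abs(x.T @ h - d)) < 1e-5)
```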

The peak output response for this filter was computed over all of the aspect views of

vehicle 1a and is shown in figure 18. The exemplars used to compute the filter are plotted

with diamond symbols. The desired response cannot be met exactly so a least squares fit is

achieved. Figure 19 shows the correlation output surface for one of the training exemplars.









Figure 18. Peak output response over all aspects of vehicle 1a when the data
matrix is not full rank. The LMS algorithm was used to compute
the filter coefficients.

As can be seen in the image, the qualities of low variance and localized peak are still

maintained using the iterative method.

The learning curve, which measures the normalized mean square error (NMSE)

between the filter output and the desired output, is shown as a function of the learning

epoch (an epoch is one pass through the data) in figure 20. When the data matrix is full

rank, as shown with a solid line, there is an exact solution and the error
approaches zero. When $x_{sub}$ is used, the NMSE approaches a limit because there is no
exact solution, and so a least squares solution is found.





























Figure 19. Output correlation surface for LMS-computed filter from non-full-rank
data. The filter output is not substantially different from the analytic
solution with full rank data.

Since the system of constraint equations is generally under-determined, there are
infinitely many filters which will satisfy the constraints. There is only one, however, that
minimizes the norm of the filter (the optimization criterion after pre-processing) [Kohonen, 1988].
Figure 21 shows the NMSE between the analytic solution for the filter coefficients and
the iterative¹ method. When the data matrix is full rank, the iterative method
approaches the optimal analytic solution, as shown by the solid line in the figure. When
the data matrix is not full rank, as shown by the dashed line in the figure, the error in the
iterative solution approaches a limit.

These qualities of iterative learning methods are important from the ATR perspective.

We see from the example that when the data possesses a quality that would seemingly be

1. In this case "iterative" refers to the LMS algorithm; within this text it generally refers to
a gradient search algorithm.









Figure 20. Learning curve for LMS approach. The learning curve for the LMS
algorithm when the data matrix is full rank is shown with a solid line;
the non-full-rank case is shown with a dashed line.

useful to the ATR problem, namely that the class can be described by a sub-space, the ana-

lytic solution fails when the number of exemplars exceeds the dimensionality of the sub-

space. The iterative method, however, finds a reasonable solution. Furthermore, if the data

matrix is full rank, the iterative method approaches the optimal analytic solution.


3.5 Comments

There are further motivations for the associative memory perspective and by extension

the use of iterative methods. It is well known that non-linear associative memory struc-

tures can outperform their linear counterparts on the basis of generalization and dynamic

range [Kohonen, 1988;Hinton and Anderson, 1981]. In general, they are more difficult to

design as their parameters cannot be computed analytically. The parameters for a large









Figure 21. NMSE between closed form solution and iterative solution. The
learning curve for the LMS algorithm when the data matrix is full rank
is shown with a solid line; the non-full-rank case is shown with a
dashed line.

class of nonlinear associative memories can, however, be determined by gradient search

techniques. The methods of distortion invariant filters are limited to linear or piece-wise

linear discriminant functions. It is unlikely that these solutions are optimal for the ATR

problem.

In this chapter we have made the connection between distortion invariant filtering and

linear associative memories. Furthermore we have motivated an iterative approach. Recall

figure 15, which shows the adaline architecture. In this architecture we can use the linear

error term in order to train our system as a classifier. This is a consequence of the
assumption that a linear discriminant function is desirable. If a linear discriminant function is
suboptimal, which will almost always be the case for any high-dimensional classification

problem, then we must work directly with the classification error.

We have also shown that the MSE criterion is a sufficient proxy for classification error
(with certain restrictions); however, it requires that we work with the true output error of

the mapping as well as a mapping with sufficient flexibility (i.e. can closely approximate a

wide range of functions which are not necessarily linear). The linear systems approach,

however, does not allow for either of these requirements. Consequently, we must adopt a

nonlinear systems approach if we hope to achieve improved performance. The next chap-

ter will show that the MACE filter can be extended to nonlinear systems such that the

desirable properties of shift invariance and localized detection peak are maintained while

achieving superior classification performance.














CHAPTER 4

STOCHASTIC APPROACH TO TRAINING NONLINEAR
SYNTHETIC DISCRIMINANT FUNCTIONS

4.1 Nonlinear Iterative Approach

The MACE filter is the best linear system that minimizes the energy in the output cor-

relation plane subject to a peak constraint at the origin. An advantage of linear systems is

that we have the mathematical tools to use them in optimal operating conditions from the

standpoint of second order statistics. Such optimality conditions, however, should not be

confused with the best possible classification performance.

Our goal is to extend the optimality condition of MACE filters to adaptive nonlinear

systems and classification performance. The optimality condition of the MACE filter con-

siders the entire output plane, not just the response when the image is centered. With

regards to general nonlinear filter architectures which can be trained iteratively, a brute

force approach would be to train a neural network with a desired output of unity for the

centered images and zero for all shifted images. This would indeed emulate the optimality
of the MACE filter; however, the result is a training algorithm of order $N_1 N_2 N_t$ for $N_t$
training images of size $N_1 \times N_2$ pixels. This is clearly impractical.

In this section we propose a nonlinear architecture for extending the MACE filter. We

discuss some of its properties. Appropriate measures of generalization are discussed. We also

present a statistical viewpoint of distortion invariant filters from which such nonlinear

extensions fit naturally into an iterative framework. From this iterative framework we







present experimental results which exhibit improved discrimination and generalization

performance with respect to the MACE filter while maintaining the properties of localized

detection peak and low variance in the output plane.


4.2 A Proposed Nonlinear Architecture

As we have stated, the MACE filter can be decomposed as a pre-whitening filter fol-

lowed by a synthetic discriminant function (SDF), which can also be viewed as a special

case of Kohonen's linear associative memory (LAM) [Hester and Casasent, 1980; Fisher

and Principe, 1994]. This decomposition is shown at the top of figure 22. The nonlinear

filter architecture we are proposing is shown in the middle of figure 22. In this
architecture we replace the LAM with a nonlinear associative memory, specifically a
feed-forward multi-layer perceptron (MLP), shown in more detail at the bottom of figure 22.

We will refer to this structure as the nonlinear MACE filter (NL-MACE) for brevity.

Another reason for choosing the multi-layer perceptron (MLP) is that it is capable of

achieving a much wider range of discriminant functions. It is well known that an MLP

with a single hidden layer can approximate any discriminant function to any arbitrary

degree of precision [Funahashi, 1989]. One of the shortcomings of distortion invariant

approaches such as the MACE filter is that it attempts to fit a hyper-plane to our training

exemplars as the discriminant function. Using an MLP in place of the LAM relaxes this

constraint. MLPs do not, in general, allow for analytic solutions. We can, however, deter-

mine their parameters iteratively using gradient search.












Figure 22. Decomposition of optimized correlator as a pre-processor followed by
SDF/LAM (top). Nonlinear variation shown with MLP replacing SDF
in signal flow (middle); detail of the MLP (bottom). The linear
transformation $A$ represents the space domain equivalent of the
spectral pre-processor $(\alpha P_x + (1-\alpha)P_\omega)^{-1/2}$.







4.2.1 Shift Invariance of the Proposed Nonlinear Architecture

One of the properties of the MACE filter is shift invariance. We wish to maintain that

property in our nonlinear extensions. A transformation, $T[\,\cdot\,]$, of a two-dimensional
function is shift invariant if it can be shown that

$$g(n_1, n_2) = T[y(n_1, n_2)] \;\Rightarrow\; g(n_1 + n_1', n_2 + n_2') = T[y(n_1 + n_1', n_2 + n_2')],$$

where $n_1, n_1', n_2, n_2'$ are integers. In other words, a shift of the input signal is reflected as
a corresponding shift of the output signal [Oppenheim and Schafer, 1989].

We show here that this property is maintained for our proposed nonlinear architecture.

The pre-processor of the nonlinear architecture at the bottom of figure 22 is the same as

the pre-processor of the linear filter shown at the top. The pre-processor is implemented as

a linear shift invariant (LSI) filter. Cascading shift invariant operations maintains shift

invariance of the entire system [Oppenheim and Schafer, 1989]. In order to show that the

system as a whole is shift invariant, it is sufficient to show that the MLP is shift invariant.

The mapping function of the MLP in figure 22 can be written

$$g(\omega, y) = \sigma\!\left(W_3\, \sigma\!\left(W_2\, \sigma(W_1 y) + \varphi\right)\right),
\qquad \omega = \{W_1, W_2, W_3\}. \qquad (41)$$


In the nonlinear architecture, the matrix $W_i$ represents the connectivities from the
processing elements (PEs) of layer $(i-1)$ to the input of the PEs of layer $i$; that is, the matrix
$W_i$ is applied as a linear transformation to the vector output of layer $(i-1)$. When $i = 1$
the transformation is applied to the input vector, $y$. The number of PEs in layer $i$ is
denoted by $N_i$. In equation 41, $\varphi$ is a constant bias vector added to each element of the
vector $W_2\sigma(W_1 y)$. It is also assumed that if the argument to the nonlinear
function $\sigma(\,\cdot\,)$ is a matrix or vector then the nonlinearity is applied to each element of the
matrix or vector.
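Equation 41 is straightforward to state in code. The sketch below is an added illustration; it assumes a tanh nonlinearity and arbitrary layer widths, neither of which is specified by the text at this point:

```python
import numpy as np

def sigma(v):
    return np.tanh(v)   # an assumed sigmoidal PE nonlinearity, applied element-wise

def g(omega, y):
    """Equation 41: g(omega, y) = sigma(W3 sigma(W2 sigma(W1 y) + phi))."""
    W1, W2, W3, phi = omega
    return sigma(W3 @ sigma(W2 @ sigma(W1 @ y) + phi))

rng = np.random.default_rng(6)
N_in, Nh1, Nh2 = 16, 8, 4                   # input size and layer widths (arbitrary)
omega = (rng.standard_normal((Nh1, N_in)),  # W1: input vector -> layer 1
         rng.standard_normal((Nh2, Nh1)),   # W2: layer 1 -> layer 2
         rng.standard_normal((1, Nh2)),     # W3: layer 2 -> scalar output
         rng.standard_normal(Nh2))          # phi: bias added to W2 sigma(W1 y)

out = g(omega, rng.standard_normal(N_in))
print(out.shape)
```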

The input to the MLP is denoted as a vector, $y \in \Re^{N_1 N_2 \times 1}$. The elements of the vector
are samples of a two-dimensional pre-whitened input signal, $y(n_1, n_2)$. We can write the
$i$th element of the vector as a function of the two-dimensional signal as follows:

$$y_i(n_1, n_2) = y\!\left(n_1 + \langle i, N_1 \rangle,\; n_2 + \lfloor i, N_1 \rfloor\right),
\qquad i = 0, \ldots, N_1 N_2 - 1,$$

where $\langle i, N_1 \rangle$ indicates a modulo operation (the remainder of $i$ divided by $N_1$) and
$\lfloor i, N_1 \rfloor$ indicates integer division of $i$ by $N_1$. Written this way, the elements of the vector
$y$ sample a rectangular region of support of size $N_1 \times N_2$ beginning at sample $(n_1, n_2)$ in
the pre-whitened signal, $y(n_1, n_2)$. The vector argument of equation 41 and the resulting
output signal can now be written as an explicit function of the beginning sample point of
the template within the pre-whitened image:

$$g_\omega(n_1, n_2) = g(\omega, y(n_1, n_2)) =
\sigma\!\left(W_3\, \sigma\!\left(W_2\, \sigma(W_1 y(n_1, n_2)) + \varphi\right)\right). \qquad (42)$$

The output of the mapping as written in equation 42 is now an explicit function of
$(n_1, n_2)$ and the constant parameter set, $\omega$ (which does not vary with $(n_1, n_2)$). We can also
write the output response as a function of the shifted version of the image, $y(n_1, n_2)$, as

$$g_\omega(n_1 + n_1', n_2 + n_2') = g(\omega, y(n_1 + n_1', n_2 + n_2')). \qquad (43)$$









Since the parameters, co, are constant, equations 42 and 43 are sufficient to show the

mapping of the MLP is shift invariant and consequently, the system as a whole (including

the shift invariant pre-processor) is also shift invariant.
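The shift-invariance argument can also be verified numerically: slide a fixed-parameter MLP over an image and over a shifted copy of the image, and compare the output maps. This is an added sketch; the window size and weights below are arbitrary:

```python
import numpy as np

def sigma(v):
    return np.tanh(v)

def sliding_output(img, omega, K1, K2):
    """Apply a fixed-parameter MLP to every K1 x K2 window of img."""
    W1, W2, W3, phi = omega
    H, W = img.shape
    out = np.empty((H - K1 + 1, W - K2 + 1))
    for n1 in range(out.shape[0]):
        for n2 in range(out.shape[1]):
            y = img[n1:n1 + K1, n2:n2 + K2].ravel()   # lexicographic reordering
            out[n1, n2] = sigma(W3 @ sigma(W2 @ sigma(W1 @ y) + phi))[0]
    return out

rng = np.random.default_rng(7)
K1 = K2 = 4
omega = (rng.standard_normal((6, K1 * K2)), rng.standard_normal((3, 6)),
         rng.standard_normal((1, 3)), rng.standard_normal(3))
img = rng.standard_normal((12, 12))

g_full = sliding_output(img, omega, K1, K2)
g_shifted = sliding_output(img[2:, 1:], omega, K1, K2)   # input shifted by (2, 1)
# A shift of the input yields the identical shift of the output map.
print(np.allclose(g_full[2:, 1:], g_shifted))
```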


4.3 Classifier Performance and Measures of Generalization

One of the issues for any iterative method which relies on exemplars is the number of

training exemplars to use in the computation of the discriminant function. In addition, for

iterative methods, there is the issue of when to stop the adaptation process. In the case of

distortion invariant filters, such as the MACE filter, some common heuristics are used to

determine the number of training exemplars. Typically samples are drawn from the train-

ing set and used to compute the filter from equation 23 until the minimum peak response

over the remaining samples exceeds some threshold [Casasent and Ravichandran, 1992].

A similar heuristic is to continue to draw samples from the training set until the mean

square error of the peak response over the remaining samples drops below some preset

threshold. These measures are then used as indicators of how well the filter generalizes to
between-aspect exemplars from the training set which have not been used for the
computation of the filter coefficients.

The ultimate goal, however, is classification. Generalization in the context of classifi-

cation must be related to the ability to classify a previously unseen input [Bishop, 1995].

We show by example that the measures of generalization mentioned above may be mis-

leading as predictors of classifier performance for even the linear filters. In fact the result

of the experiments will show that the way in which the data is pre-processed is more indic-

ative of classifier performance than these other indirect measures.









We illustrate this point with an example using ISAR image data. A data set, larger than

in the previous experiments, will be used. Two more vehicles, one from each vehicle type

will be used for the testing set, and all vehicles will be sampled at higher aspect resolution.

Figure 23 shows ISAR images of size 64 x 64 taken from five different vehicles and two

different vehicle types. The images are all taken with the same radar. Data taken from

vehicles in the same class vary in the vehicle configuration and radar depression angle (15

or 20 degrees depression). Images have been formed from each vehicle at aspect varia-

tions of 0.125 degrees from 5 to 85 degrees aspect for a total of 641 images for each vehi-

cle. Figure 23 shows each of the vehicles at 5, 45, and 85 degrees aspect.

We will use vehicle type 1 as the recognition class and vehicle type 2 as a confusion

vehicle. Images of vehicle 1a will be used as the set from which to draw training
exemplars. Classification performance will then be measured as the ability to recognize
vehicles 1b and 1c while rejecting vehicles 2a and 2b. The filter we will use is a form of the
OTSDF [Réfrégier and Figue, 1991] which is computed in the spectral domain as

$$H = \left[\alpha P_x + (1-\alpha)P_\omega\right]^{-1} X
\left[ X^\dagger \left[\alpha P_x + (1-\alpha)P_\omega\right]^{-1} X \right]^{-1} d, \qquad (44)$$


where the columns of the data matrix $X \in \mathbb{C}^{N_1 N_2 \times N_t}$ are the Fourier coefficients of $N_t$
exemplar images of dimension $N_1 \times N_2$ of vehicle 1a reordered into column vectors. The
diagonal matrix $P_x \in \Re^{N_1 N_2 \times N_1 N_2}$ contains the coefficients of the average power
spectrum measured over the $N_t$ exemplars of vehicle 1a, while $P_\omega \in \Re^{N_1 N_2 \times N_1 N_2}$ is the
identity matrix scaled by the average of the diagonal terms of $P_x$. Finally, $d \in \Re^{N_t \times 1}$ is a
column vector of desired outputs, one for each exemplar. The elements of $d$ are typically









Figure 23. ISAR images of two vehicle types shown at aspect angles of 5, 45, and
85 degrees respectively. Three different vehicles of type 1 (a, b, and c)
are shown, while two different vehicles of type 2 (a and b) are shown.
Vehicle 1a is used as a training vehicle, while vehicles 1b and 1c are
used as the testing vehicles for the recognition class. Vehicles 2a and
2b are used as confusion vehicles.

set to unity. When $\alpha$ is set to unity, equation 44 yields exactly the MACE filter; when it is
set to zero, the result is the SDF. The filter we are using therefore trades off the MACE
filter criterion against the SDF criterion. The SDF criterion can also be viewed as the
MVSDF [Kumar, 1986] criterion when the noise class is represented by a white noise
random process. This filter can also be decomposed as in figure 22.
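Equation 44 can be sketched in the same style as the MACE solution; since $P_x$ and $P_\omega$ are both diagonal, they are stored as vectors. This is an added illustration with random stand-ins for the ISAR exemplars:

```python
import numpy as np

def otsdf(exemplars, d, alpha):
    """Sketch of equation 44 with diagonal P_x (average power spectrum)
    and P_w (identity scaled to the mean of P_x), stored as vectors."""
    X = np.stack([np.fft.fft2(x, norm="ortho").ravel() for x in exemplars], axis=1)
    Px = np.mean(np.abs(X) ** 2, axis=1)
    Pw = np.full_like(Px, Px.mean())
    Pinv = 1.0 / (alpha * Px + (1.0 - alpha) * Pw)
    PinvX = Pinv[:, None] * X
    H = PinvX @ np.linalg.solve(X.conj().T @ PinvX, d)
    return X, H

rng = np.random.default_rng(8)
exemplars = [rng.standard_normal((8, 8)) for _ in range(4)]
d = np.ones(4)
# alpha = 1 recovers the MACE filter; alpha = 0 the white-noise (SDF) criterion.
for alpha in (0.0, 0.5, 0.95, 1.0):
    X, H = otsdf(exemplars, d, alpha)
    print(alpha, np.allclose(X.conj().T @ H, d))
```

For every $\alpha$ the peak constraints are satisfied exactly; only the off-peak behavior, governed by the weighted spectrum, changes.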







These experiments examine the relationship between the two commonly used mea-

sures of generalization and two measures of classification performance. We can draw con-

clusions from the results about the appropriateness of the generalization measures with

regards to classification. The first generalization measure is the minimum peak response,

denoted Ymin, taken over the aspect range of the images of the training vehicle (excluding

the aspects used for computing the filter). The second generalization measure is the mean

square error, denoted yme, between the desired output of unity and the peak response over

the aspect range of the images of the training vehicle (excluding the aspects used for com-

puting the filter). The classification measures are taken from the receiver operating char-

acteristic (ROC) curve measuring the probability of detection, $P_d$, of a testing vehicle in the
recognition class (vehicles 1b and 1c) versus the probability of false alarm, $P_{fa}$, on a
testing vehicle in the confusion class (vehicles 2a and 2b) based on peak detection. The
specific measures are the area under the ROC curve, a general measure of the test being used,

while the second measure is the probability of false alarm when the probability of detec-

tion equals 80%, which measures a single point of interest on the ROC curve.
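The two classification measures can be computed from peak-response scores as follows. This is an added, illustrative sketch: the ROC area is computed via the equivalent Mann-Whitney statistic, and the score distributions are synthetic stand-ins, not the ISAR results:

```python
import numpy as np

def roc_metrics(rec_scores, con_scores, pd_target=0.8):
    """ROC area and Pfa at a given Pd, from peak-detection scores."""
    # ROC area equals the probability that a recognition-class score exceeds
    # a confusion-class score (Mann-Whitney statistic; ties count half).
    diff = rec_scores[:, None] - con_scores[None, :]
    area = (diff > 0).mean() + 0.5 * (diff == 0).mean()
    # Lower the threshold until Pd first reaches the target; report Pfa there.
    for t in np.sort(rec_scores)[::-1]:
        if (rec_scores >= t).mean() >= pd_target:
            return area, (con_scores >= t).mean()
    return area, 1.0

rng = np.random.default_rng(9)
rec = rng.normal(1.0, 0.3, 500)   # synthetic peaks, recognition class
con = rng.normal(0.4, 0.3, 500)   # synthetic peaks, confusion class
area, pfa = roc_metrics(rec, con)
print(f"ROC area = {area:.2f}, Pfa@Pd=0.8 = {pfa:.2f}")
```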

Two filters are used, one with $\alpha = 0.5$ and the other with $\alpha = 0.95$; that is, one in which
both criteria are weighted equally and one which is close to the MACE filter criterion.

The number of exemplars drawn from the training vehicle (1a) is varied from 21 to 81,
sampled uniformly in aspect (1 to 4 degrees aspect separation between exemplars).

Examination of figures 24 and 25 shows that for both cases ($\alpha$ equal to 0.5 and 0.95)
no clear relationship emerges in which the generalization measures are indicators of good

classification performance. Table 1 compares the classifier performance when the general-








ization measures as described above are used to choose the filter versus the best ROC per-

formance achieved throughout the range of aspect separation. In one regard, the

generalization measures were consistent in that the same aspect separation was predicted

by both measures for both settings of $\alpha$. In figure 26 we compare the ROC curves for two
cases, first where the filter is chosen using the generalization measures and second the best
achieved ROC curve, for both settings of $\alpha$. We would expect that for each $\alpha$ the filter
chosen using the generalization measure would be near the best ROC performance. As can be
seen in the figure, this is not the case.

Table 1. Classifier performance measures when the filter is determined by either of the
common measures of generalization as compared to best classifier performance for two
values of $\alpha$.

                                    Generalization Measure
                                $y_{min}$    $y_{mse}$    Best
$\alpha = 0.50$   Pfa@Pd=0.8      0.24        0.24        0.16
                  ROC area        0.83        0.83        0.90

$\alpha = 0.95$   Pfa@Pd=0.8      0.16        0.16        0.07
                  ROC area        0.94        0.94        0.95

It is obvious from figures 24 and 25 that the generalization measures are not signifi-

cantly correlated with the ROC performance. In fact, as summarized in table 2, the gener-

alization measures are negatively, albeit weakly, correlated with ROC performance. One

feature of figures 24 and 25 is that although ROC performance varies independently of










Figure 24. Generalization as measured by the minimum peak response. The plot
compares $y_{min}$ versus classification performance measures (ROC area
and Pfa@Pd=0.8).










Figure 25. Generalization as measured by the peak response mean square error.
The plot compares $y_{mse}$ versus classification performance measures
(ROC area and Pfa@Pd=0.8).









Figure 26. Comparison of ROC curves. The ROC curves for the number of
training exemplars yielding the best generalization measure versus the
number yielding the best ROC performance for values of $\alpha$ equal to
0.5 and 0.95 are shown.

either the minimum peak response or the MSE, there does appear to be a dependency on α. This leads to a second experiment.

Table 2. Correlation of generalization measures to classifier performance. In both cases (α equal to 0.5 or 0.95) the classifier performance, as measured by the area of the ROC curve or Pfa at Pd equal to 0.8, has an opposite correlation to what would be expected of a useful measure for predicting performance.

                                     α = 0.50                     α = 0.95
                              ROC area  Pfa(@Pd=0.8)      ROC area  Pfa(@Pd=0.8)
  Generalization   γmin        -0.39        0.21           -0.40        0.41
  Measures         γmse         0.32       -0.11            0.31       -0.35


In the second experiment we examine the relationship between the parameter α and the ROC performance. The aspect separation between training exemplars is set to 2, 4, and 8 degrees. The value of α, the emphasis on the MACE criterion, is varied in the range zero to unity. Figure 27 shows the relationship between ROC performance and the value of α. It is clear from the plots that there is a positive relationship between the emphasis on the MACE criterion and the ROC performance. However, the peak in ROC performance is not achieved at α equal to unity. In all three cases, the ROC performance peaks just prior to unity, with the performance drop-off at α equal to unity increasing with aspect separation.

The difference between the SDF and MACE filter is the pre-processor. What is shown by this analysis is that, in general, the pre-processor from the MACE filter criterion leads to better classification, but too much emphasis on the MACE filter criterion, as measured by α equal to unity, leads to a filter which is too specific to the training samples. The problems described above are well known. Alterations to the MACE criterion have been the subject of many researchers [Casasent et al., 1991; Casasent and Ravichandran, 1992; Ravichandran and Casasent, 1992; Mahalanobis et al., 1994a]. There is still, as yet, no principled method found in the literature by which to set the parameter α.

There are two conclusions from this analysis that are pertinent to the nonlinear extension we are using. First, the results show that pre-whitening over the recognition class leads to better classification performance. For this reason we choose to use the pre-processor of the MACE filter in our nonlinear filter architecture. The issue of extending the MACE filter to nonlinear systems can in this way be formulated as a search for a more robust nonlinear discriminant function in the pre-whitened image space.
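The pre-whitening stage referred to here can be sketched numerically. The following is a minimal numpy illustration, not the dissertation's implementation: it flattens the average power spectrum of a stack of (synthetic, stand-in) training images, which is the MACE-style pre-processing described above; the function name and the small regularization constant are assumptions.

```python
import numpy as np

def mace_prewhiten(images):
    """Whiten a stack of images by the average power spectrum
    of the recognition class (the MACE-style pre-processor)."""
    # images: (K, N1, N2) stack of training images
    spectra = np.fft.fft2(images, axes=(-2, -1))
    avg_power = np.mean(np.abs(spectra) ** 2, axis=0)
    # Divide each spectrum by sqrt(average power) so the average
    # power spectrum of the whitened stack is flat.
    white = spectra / np.sqrt(avg_power + 1e-12)
    return np.real(np.fft.ifft2(white, axes=(-2, -1)))

rng = np.random.default_rng(0)
imgs = rng.normal(size=(8, 16, 16))
w = mace_prewhiten(imgs)
# After whitening, the average power spectrum is (approximately) flat.
pw = np.mean(np.abs(np.fft.fft2(w, axes=(-2, -1))) ** 2, axis=0)
print(np.allclose(pw, 1.0))
```

After this stage any white sequence has the same second-order statistics as the (whitened) rejection class, which is the property exploited later in the chapter.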

The second conclusion is that comparisons of the nonlinear filter to its linear counterpart must be made in terms of classification performance only. There are simple nonlinear systems, such as a soft threshold at the output of a linear system, that will outperform the MACE filter or its variations in terms of maximizing the minimum peak response over the training vehicle or reducing the variance in the output image plane.

Figure 27. ROC performance measures versus α. Results are shown for training aspect separations of 2, 4, and 8 degrees. These plots indicate that ROC performance is positively related to α.



These measures are not, however, sufficient to describe classification performance. We

have also used these measures in the past but feel that they are not the most appropriate for

classification [Fisher and Principe, 1995b].


4.4 Statistical Characterization of the Rejection Class

We now present a statistical viewpoint of distortion invariant filters from which such nonlinear extensions fit naturally into an iterative framework. This treatment results in an efficient way to capture the optimality condition of the MACE filter using a training algorithm which is approximately of order Nt, and which leads to better classification performance than the linear MACE.

A possible approach to designing a nonlinear extension to the MACE filter and improving on the generalization properties is to simply substitute the linear processing elements of the LAM with nonlinear elements. Since such a system can be trained with error backpropagation [Rumelhart et al., 1986], the issue would simply be to report on performance comparisons with the MACE. Such a methodology does not, however, lead to an understanding of the role of the nonlinearity, and does not elucidate the trade-offs in the design and in training.

Here we approach the problem from a different perspective. We seek to extend the optimality condition of the MACE to a nonlinear system, i.e. the energy in the output space is minimized while maintaining the peak constraint at the origin. Hence we will impose these constraints directly in the formulation, even knowing a priori that an analytical solution is very difficult or impossible to obtain. We reformulate the MACE filter from a statistical viewpoint and generalize it to arbitrary mapping functions, linear and nonlinear.

Consider images of dimension N1 × N2 re-ordered by column or row into vectors. Let the rejection class be characterized by the random vector X1 ∈ ℜ^{N1N2×1}. We know the second-order statistics of this class as represented by the average power spectrum (or equivalently the autocorrelation function). Let the recognition class be characterized by the columns of a data matrix x2 ∈ ℜ^{N1N2×Nt}, which are observations of the random vector X2 ∈ ℜ^{N1N2×1}, similarly re-ordered. We wish to find the parameters, ω, of a mapping g(ω, X): ℜ^{N1N2×1} → ℜ^1 such that we may discriminate the recognition class from the rejection class. Here it is the mapping function, g, which defines the discriminator topology.

Towards this goal, we wish to minimize the objective function

J = E(g(ω, X1)^2)

over the mapping parameters, ω, subject to the system of constraints

g(ω, x2) = d^T,   (45)

where d ∈ ℜ^{Nt×1} is a column vector of desired outputs. It is assumed that the mapping function is applied to each column of x2, and E(·) is the expected value operator.





Using the method of Lagrange multipliers, we can augment the objective function as

J = E(g(ω, X1)^2) + (g(ω, x2) − d^T)λ,   (46)

where λ ∈ ℜ^{Nt×1} is a vector whose elements are the Lagrange multipliers, one for each constraint. Computing the gradient with respect to the mapping parameters yields

∂J/∂ω = 2E( g(ω, X1) ∂g(ω, X1)/∂ω ) + (∂g(ω, x2)/∂ω)λ.   (47)

Equation 47 along with the constraints of equation 45 can be used to solve for the optimal parameters, ω, assuming our constraints form a consistent set of equations. This is, of course, dependent on the mapping topology.


4.4.1 The Linear Solution as a Special Case

It is interesting to verify that this formulation yields the MACE filter as a special case. If, for example, we choose the mapping to be a linear projection of the input image, that is

g(ω, x) = ω^T x,   ω = [h1 ... h_{N1N2}]^T ∈ ℜ^{N1N2×1},

equation 46 becomes, after simplification,

J = ω^T E(X1X1^T)ω + (ω^T x2 − d^T)λ.   (48)

In order to solve for the mapping parameters, ω, we are still left with the task of computing the term E(X1X1^T) which, in general, we can only estimate from observations of the random vector, X1, or assume to have a specific form. Assuming that we have a suitable estimator, the well known solution to the minimum of equation 48 over the mapping parameters subject to the constraints of equation 45 is

ω = R̂_{X1}^{-1} x2 [x2^T R̂_{X1}^{-1} x2]^{-1} d,   (49)

where

R̂_{X1} = estimate{E(X1X1^T)}.   (50)


Depending on the characterization of X1, equation 49 describes various SDF-type filters (i.e. MACE, MVSDF, etc.). In the case of the MACE filter, the rejection class is characterized by all 2-D circular shifts of target class images away from the origin. Solving for the MACE filter coefficients is therefore equivalent to using the average circular autocorrelation sequence (or equivalently the average power spectrum in the frequency domain) over images in the target class as estimators of the elements of the matrix E(X1X1^T). Sudharsanan et al. [1991] suggest a very similar methodology for improving the performance of the MACE filter. In that case the average linear autocorrelation sequence is estimated over the target class, and this estimator of E(X1X1^T) is used to solve for linear projection coefficients in the space domain. The resulting filter is referred to as the SMACE (space-domain MACE) filter.
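Equation 49 can be checked numerically. The sketch below, with synthetic stand-in data, estimates E(X1X1^T) from rejection-class observations (equation 50) and verifies that the resulting linear mapping satisfies the constraints of equation 45 with equality; all array names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
N, Nt = 64, 5                      # image-vector length, number of exemplars
X1 = rng.normal(size=(N, 1000))    # observations of the rejection class
x2 = rng.normal(size=(N, Nt))      # recognition-class data matrix
d = np.ones((Nt, 1))               # constrained peak values

R = X1 @ X1.T / X1.shape[1]        # estimate of E{X1 X1^T}  (eq. 50)
Rinv = np.linalg.inv(R)
# omega = R^-1 x2 [x2^T R^-1 x2]^-1 d   (eq. 49)
omega = Rinv @ x2 @ np.linalg.solve(x2.T @ Rinv @ x2, d)

print(np.allclose(omega.T @ x2, d.T))   # constraints met with equality
```

Substituting the circular-autocorrelation estimate of R̂ recovers the MACE filter, and the linear-autocorrelation estimate recovers the SMACE filter discussed above.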


4.4.2 Nonlinear Mappings

For arbitrary nonlinear mappings it will, in general, be very difficult to solve for globally optimal parameters analytically. Our purpose is instead to develop iterative training algorithms which are practical and yield improved performance over the linear mappings. It is through the implicit description of the rejection class by its second-order statistics that we have developed an efficient method for extending the MACE filter and other related correlators to nonlinear topologies such as neural networks.

As stated, our goal is to find mappings, defined by a topology and a parameter set, which improve upon the performance of the MACE filter in terms of generalization while maintaining a sharp constrained peak in the center of the output plane for images in the recognition class. One approach, which leads to an iterative algorithm, is to approximate the original objective function of equation 46 with the modified objective function

J = (1 − β)E(g(ω, X1)^2) + β[g(ω, x2) − d^T][g(ω, x2) − d^T]^T.   (51)

The principal advantage gained by using equation 51 over equation 46 is that we can solve iteratively for the parameters of the mapping function (assuming it is differentiable) using gradient search. The constraint equations, however, are no longer satisfied with equality over the training set. It has been recognized that the choice of constraint values has a direct impact on the performance of optimized linear correlators. Sudharsanan et al. [1990] have explored techniques for optimally assigning these values within the constraints of a linear topology. Other methods have been suggested [Mahalanobis et al., 1994a, 1994b; Kumar and Mahalanobis, 1995] to improve the performance of distortion invariant filters by relaxing the equality constraints. Mahalanobis [1994a] extends this idea to unconstrained linear correlation filters. The OTSDF objective function of Réfrégier [1991] appears similar to the modified objective function and indeed, for a linear topology, this can be solved analytically as an optimal trade-off problem.










Our primary purpose for modifying the objective function is to allow for an iterative method within the NL-MACE architecture. We have already shown in the previous chapter that this choice of criterion is suitable for classification. We will show that the primary qualities of the MACE filter are still maintained when we relax the equality constraints in our formulation. Varying β in the range [0, 1] controls the degree to which the average response to the rejection class is emphasized versus the variance about the desired output over the recognition class.
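For a linear mapping, the modified objective of equation 51 can be minimized by plain gradient descent, which illustrates the iterative route taken here. A hedged numpy sketch with synthetic data follows; the step size, sample counts, β value, and the per-exemplar normalization of the constraint term are arbitrary choices for illustration, not the dissertation's settings.

```python
import numpy as np

rng = np.random.default_rng(2)
N, Nt, beta = 32, 4, 0.95
X1 = rng.normal(size=(N, 500))    # white sequences standing in for the rejection class
x2 = rng.normal(size=(N, Nt))     # stand-in recognition-class exemplars
d = np.ones(Nt)                   # desired peak values
w = rng.normal(size=N) * 0.01     # linear mapping parameters

lr = 0.02
for _ in range(5000):
    # gradient of J = (1-beta) E[(w.x1)^2] + beta ||x2^T w - d||^2 / Nt
    g_rej = (1 - beta) * 2.0 * (X1 @ (X1.T @ w)) / X1.shape[1]
    g_rec = beta * 2.0 * (x2 @ (x2.T @ w - d)) / Nt
    w -= lr * (g_rej + g_rec)

print(np.max(np.abs(x2.T @ w - d)) < 0.05)
```

With β close to unity the peak constraints are nearly, but not exactly, satisfied, mirroring the relaxed-equality behavior described in the text.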

As in the linear case, we can only estimate the expected variance of the output due to the random vector input and its associated gradient. If, as in the MACE (or SMACE) filter formulation, X1 is characterized by all 2-D circular (or linear) shifts of the recognition class away from the origin, then this term can be estimated with a sampled average over the exemplars, x2, for all such shifts. From an iterative standpoint this still leads to the impractical approach of training exhaustively over the entire output plane. It is desirable, then, to find other equivalent characterizations of the rejection class which may alleviate the computational load without significantly impacting performance.


4.5 Efficient Representation of the Rejection Class

Training becomes an issue once the associative memory structure takes a nonlinear form. The output variance of the linear MACE filter is minimized for the entire output plane over the training exemplars. Even when the coefficients of the MACE filter are computed iteratively, we need only consider the output point at the designated peak location (constraint) for each pre-whitened training exemplar [Fisher and Principe, 1994]. This is due to the fact that, for the under-determined case, the linear projection which satisfies the system of constraints with equality and has minimum norm is also the linear projection which minimizes the response to images with a flat power spectrum. This solution is arrived at naturally via a gradient search which only considers the response at the constraint location.

This is no longer the case when the mapping is nonlinear. Adapting the parameters via

gradient search (such as error backpropagation) on recognition class exemplars only at the

constraint location will not, in general, minimize the variance over the entire output image

plane. In order to minimize the variance over the entire output plane we must consider the

response of the filter to each location in the input image, not just the constraint location.

The MACE filter optimization criterion minimizes, in the average, the response to all

images with the same second order statistics as the rejection class. At the output of the pre-

whitener (prior to the MLP) any white sequence will have the same second order statistics

as the rejection class. This condition can be exploited to make the training of the MLP

more efficient.

From an implementation standpoint, the pre-whitening stage and the input layer weights can be combined into a single equivalent linear transformation; however, pre-whitening separately allows the rejection class to be represented by white sequences at the input to the MLP during the training phase.

This result is due to the statistical formulation of the optimization criterion. Minimizing the response to white sequences, in the average, minimizes the response to shifts of the exemplar images since they have the same second-order statistics (after pre-whitening). Consequently, we do not have to train over the entire output plane exhaustively, thereby reducing training times in proportion to the input image size, N1N2. Instead, we use a small number of randomly generated white sequences to efficiently represent the rejection class. The result is an algorithm which is of order Nt + Ns (where Ns is the number of white noise rejection class exemplars) as compared to exhaustive training.
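Assembling one training epoch under this representation is then straightforward. The sketch below uses stand-in data and illustrates only the bookkeeping that makes an epoch cost on the order of Nt + Ns presentations; the target values of one and zero are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(4)
N, Nt, Ns = 64, 41, 41            # image-vector size, exemplars, noise samples

x2 = rng.normal(size=(N, Nt))     # stand-in for pre-whitened recognition exemplars
noise = rng.normal(size=(N, Ns))  # white sequences representing the rejection class

inputs = np.hstack([x2, noise])              # one epoch: Nt + Ns presentations
targets = np.hstack([np.ones(Nt), np.zeros(Ns)])
print(inputs.shape[1] == Nt + Ns)            # per-epoch cost ~ O(Nt + Ns)
```

Compare this with exhaustive training, which would require presenting every shift of every exemplar, roughly Nt·N1N2 patterns per epoch.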


4.6 Experimental Results

We now present experimental results which illustrate the technique and potential pitfalls. There are four significant outcomes in the experiments presented in this section. The first is that, when using white sequences to characterize the rejection class, the linear solution is a strong attractor. The second outcome is that imposing orthogonality on the input layer of the MLP tends to lead to a nonlinear solution with improved performance. The third result, in which we restrict the rejection class to a subspace, yields a significant decrease in the convergence time. The fourth result, in which we borrow from the idea of using the interior of the convex hull to represent the rejection class [Kumar et al., 1994], yields significantly better classification performance.

In these experiments we use the data depicted in figure 23. As in the previous experiments, images from vehicle 1a will be used as the training set. Vehicles 1b and 1c will be used as the recognition class, while vehicles 2a and 2b will be used as a rejection/confusion class for testing purposes. In each case comparisons will be made to a baseline linear filter. Specifically, in all cases the value of α for the linear filter is set to 0.99. The aspect separation between training images is 2.0 degrees. This results in 41 training exemplars from vehicle 1a. These settings of α and aspect separation were found to give the best classifier performance for the linear filter with this data set. We continue to refer to this as a MACE filter since the MACE criterion is so heavily emphasized. Technically it is an OTSDF filter, but such nomenclature does not convey the type of pre-processing that is being performed. We choose the value of α so as to compare to the best possible MACE filter for this data set.

The nonlinear filter will use the same pre-processor as the linear filter (i.e. α = 0.99). The MLP structure is shown at the bottom of figure 22. It accepts an N1N2-element input vector (a preprocessed image reordered into a column vector), followed by two hidden layers (with two and three hidden PE nodes, respectively), and a single output node. The parameters of the MLP,

W1 ∈ ℜ^{N1N2×2},   W2 ∈ ℜ^{2×3},   W3 ∈ ℜ^{3×1},

are to be determined through gradient search. The gradient search technique used in all cases will be the error backpropagation algorithm.
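A forward pass through the N1N2-2-3-1 topology described above can be sketched as follows. The tanh nonlinearity follows the text, while the omission of bias terms, the weight scales, and the small image size are simplifying assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
N1, N2 = 8, 8
W1 = rng.normal(size=(N1 * N2, 2)) * 0.1   # input layer
W2 = rng.normal(size=(2, 3)) * 0.5         # hidden layer
W3 = rng.normal(size=(3, 1)) * 0.5         # output layer

def forward(x):
    """Forward pass of the N1N2-2-3-1 MLP with tanh PEs."""
    u = W1.T @ x                   # 2-d internal feature space (u1, u2)
    h = np.tanh(W2.T @ np.tanh(u))
    return np.tanh(W3.T @ h)[0], u

y, u = forward(rng.normal(size=N1 * N2))
print(u.shape, -1.0 < y < 1.0)
```

The two-dimensional vector u is the internal feature space that is examined at the points u1 and u2 in the experiments below.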


4.6.1 Experiment I: noise training

As stated, using the statistical approach, the rejection class is characterized by white noise sequences at the input to the MLP. The recognition class is characterized by the exemplars. It is from these white noise sequences that the MLP, through the backpropagation learning algorithm, captures information about the rejection class. So it would seem a simple matter, during the training stage, to present random white noise sequences as the rejection class exemplars. This is exactly the training method used for this experiment. From our empirical observations, with this method of training the linear solution is a strong attractor. The results of the first experiment demonstrate this behavior.








Figure 28 shows the peak output response taken over all images of vehicle 1a for both the linear (top) and nonlinear (bottom) filters. In the figure we see that for the linear filter the peak constraint (unity) is met exactly for the training exemplars, with degradation for the between-aspect exemplars. As mentioned previously, if the pure MACE filter criterion were used (α equal to unity), the peak in the output plane is guaranteed to be at the constraint location [Mahalanobis et al., 1987]. It turns out that for this data set the peak output also occurs at the constraint location for the training images; however, with α = 0.99 this was not guaranteed. Examination of the peak output response for the NL-MACE filter shows that the constraints are met very closely (but not exactly) for the training exemplars, also with degradation in the peak output response at between-aspect locations. The degradation for the nonlinear filter is noticeably less than in the linear case, and so in this regard it has outperformed the linear filter.

Figure 29 shows the output plane response for a single image of vehicle 1a (not one used for computing the filter coefficients) for the linear filter (top) and the nonlinear filter (bottom). Again in this figure we see that both filters produce a noticeable peak when the image is centered on the filter and a reduced response when the image is shifted. The reduction in response to the shifted image is again noticeably better for the nonlinear filter than for the linear filter. The same was found to be true for all images of vehicle 1a, and so in this regard we can again say that the nonlinear filter has outperformed the linear filter.

However, as we have already illustrated for the linear case, these measures alone are not sufficient to predict classifier performance and are certainly not sufficient to compare linear systems to nonlinear systems. This point is made clear in table 3, which summarizes the classifier performance at two probabilities of detection for all of the experiments










Figure 28. Peak output response of linear and nonlinear filters over the training set. The nonlinear filter clearly outperforms the linear filter by this metric alone.

reported here, when vehicles 1b and 1c are used as the recognition class and vehicles 2a and 2b are used for the rejection class. At this point we are only interested in the results pertaining to the linear filter (our baseline) and the nonlinear filter results for experiment I.













































Figure 29. Output response of linear filter (top) and nonlinear filter (bottom).
The response is for a single image from the training set, but not one
used to compute the filter.

This table shows that the classifier performance for the linear and nonlinear filters is nominally the same, despite what may be perceived as better performance of the nonlinear filter with regard to peak response over the training vehicle and reduced output plane response to shifts of the image. Furthermore, if we examine figure 30, which shows the ROC curve for both filters, we see that they overlay each other. From a classification standpoint the two filters are equivalent.

Figure 30. ROC curves for linear filter (solid line) versus nonlinear filter (dashed line). Despite improved performance of the nonlinear filter as measured by peak output response and reduced variance over the training set, the filters are equivalent with regard to classification over the testing set.


This result is best explained by figure 31. Recall the points u1 and u2 labeled in figure 22. We can view these outputs as a feature space; that is, the MLP discriminant function can be superimposed on the projection of the input image onto this space. In this case the feature space is a representation of the input vector internal to the MLP structure. The designation of these points as features is due to the fact that they represent some abstract quality of the data, and the decision surface can be computed as a function of the features.

Figure 31. Experiment I: Resulting feature space from simple noise training. Note that all points are projected onto a single curve in the feature space. In the top figure, squares are the recognition class training exemplars, triangles are white noise rejection class exemplars, and plus signs are the images of vehicle 1a not used for training. In the bottom figure, squares are the peak responses from vehicles 1b and 1c, triangles are the peak responses from vehicles 2a and 2b.

Mathematically this can be written

u = W1^T x,   y_o = σ(W3^T σ(W2^T σ(u) + φ)).   (52)

Recall that the matrix Wi represents the connectivities from the output of layer (i − 1) to the inputs of the PEs of layer i, φ is a constant bias term, and σ(·) is a sigmoidal nonlinearity (the hyperbolic tangent function in this case).

Figure 31 shows this projection for the training set (top) and the testing set (bottom). What is significant in the figure is that, although the discriminant as a function of the vector u is nonlinear, the projections of the images lie on a single curve in this feature space. Topologically this filter can be put into one-to-one correspondence with a linear projection. This is not to say that the linear solution is undesirable, but under the optimization criterion it can be computed in closed form. Furthermore, in a space as rich as the ISAR image space it is unlikely that the linear solution will give the best classification performance.

Table 3. Comparison of ROC classifier performance for two values of Pd. Results are shown for the linear filter versus four different types of nonlinear training. N: white noise training, G-S: Gram-Schmidt orthogonalization, subN: PCA subspace noise, C-H: convex hull rejection class.

                          Pfa (%)
  Pd (%)   linear              nonlinear filter, experiments I-IV
           filter    I (N)   II (N, G-S)   III (subN, G-S)   IV (subN, G-S, C-H)
  80        4.37     4.37       3.74            2.81               2.45
  99       42.43    41.87      27.15           26.52              15.33


4.6.2 Experiment II: noise training with an orthogonalization constraint

As a means of avoiding the linear solution, a modification was made to the training algorithm. The modification was to impose orthogonality on the columns of W1 through a Gram-Schmidt process. The motivation for doing this stems from the fact that we are working in a pre-whitened image space. In a pre-whitened image space this condition is sufficient to assure that the outputs in the feature space, as measured at u1 and u2, will be uncorrelated over the rejection class. Mathematically this can be shown as


E{uu^T} = E{W1^T X1 X1^T W1} = W1^T E{X1X1^T} W1 = σ^2 W1^T W1
        = σ^2 [ ||w1||^2      0
                   0      ||w2||^2 ],

where w1, w2 ∈ ℜ^{N1N2×1} are the columns of W1; since the rejection class is white after pre-whitening, E{X1X1^T} = σ^2 I, and the off-diagonal terms vanish because the columns are orthogonal. This result holds for any number of nodes in the first layer of the MLP.
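The orthogonalization step can be sketched as a classical Gram-Schmidt pass over the columns of W1, after which W1^T W1 is diagonal and, for white inputs, the features u1 and u2 are uncorrelated, as derived above. The helper name and dimensions are illustrative.

```python
import numpy as np

def gram_schmidt_columns(W):
    """Orthogonalize the columns of W (classical Gram-Schmidt)."""
    W = W.copy()
    for i in range(W.shape[1]):
        for j in range(i):
            # Remove the component of column i along column j.
            W[:, i] -= (W[:, j] @ W[:, i]) * W[:, j] / (W[:, j] @ W[:, j])
    return W

rng = np.random.default_rng(5)
W1 = gram_schmidt_columns(rng.normal(size=(64, 2)))

# For white (identity-covariance) X1, E{u u^T} = sigma^2 W1^T W1,
# so orthogonal columns give uncorrelated features u1, u2.
C = W1.T @ W1
print(abs(C[0, 1]) < 1e-10)
```

In training, a pass like this would be applied to W1 after each weight update so the orthogonality constraint is maintained throughout gradient search.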

The results of training with this modification are shown in figure 32, which depicts the resulting feature space as measured at u1 and u2. From this figure we can see that the discriminant function, represented by the contour lines, is a nonlinear function of u1 and u2. Furthermore, because the projections of the vehicles into the feature space do not lie on a single curve (as in the previous experiment), the features represent different discrimination information with regard to both the rejection and recognition classes. The bottom of the figure, showing the projection of a random sampling of the test vehicles (all 1282 would be too dense for plotting), shows that both features are useful for separating vehicle 1 from vehicle 2. Examination of table 3 (column II in the nonlinear results) shows that improved false alarm performance has been obtained at the two detection probabilities of interest. Figure 33 shows the ROC curve for the resulting filter. It is evident that the nonlinear filter is a uniformly better test for classification.


Figure 32. Experiment II: Resulting feature space when orthogonality is imposed on the input layer of the MLP. In the top figure, squares indicate the recognition class training exemplars, triangles indicate white noise rejection class exemplars, and plus signs are the images of vehicle 1a not used for training. In the bottom figure, squares are the peak responses from vehicles 1b and 1c, triangles are the peak responses from vehicles 2a and 2b.









Figure 33. Experiment II: Resulting ROC curve with orthogonality constraint.

Convinced that the filter represents a better test for classification than the linear filter, we now examine the results for the other features of interest. Figure 34 shows the output response of this filter for one of the images. As seen in the figure, a noticeable peak at the center of the output plane has been achieved. This shows that the filter maintains the localization properties of the linear filter. In this way the characterization of the rejection class by its second order statistics, the addition of the orthogonality constraint at the input layer of the MLP, and the use of a nonlinear topology have resulted in a superior classification test.


4.6.3 Experiment III: subspace noise training

The next experiment describes an additional modification to this technique. One of the issues in training nonlinear systems is the convergence time. Training methods which require overly long training times are not of much practical use. We have already shown



























Figure 34. Experiment II: Output response to an image from the recognition class
training set.

how to reduce the training complexity by recognizing that we can sufficiently describe the rejection class with white noise sequences. We now show a more compact description of the rejection class which leads to shorter convergence times, as demonstrated empirically. This description relies on the well known singular value decomposition (SVD). We view the random white sequences as stochastic probes of the performance surface in the whitened image space. The classifier discriminant function is, of course, not determined by the rejection class alone; it is also affected by the recognition class. We have shown previously that the white noise sequences enable us to probe the input space more efficiently than examining all shifts of the recognition exemplars. However, we are still searching a space of dimension equal to the image size, N1N2.

One of the underlying premises of a data driven approach is that the information about a class is conveyed through exemplars. In this case the recognition class is represented by Nt < N1N2 exemplars placed in the data matrix x2 ∈ ℜ^{N1N2×Nt}. It is well known that x2, if it is full rank, can be decomposed with the SVD as

x2 = UΛV^T,   (53)

where the columns of U ∈ ℜ^{N1N2×Nt} are an ortho-normal basis that spans the column space of the data matrix, Λ contains the singular values, and V is an orthogonal matrix. This decomposition has many well known properties, including compactness of representation for the columns of the data matrix [Gerbrands, 1981]. Indeed, as has been noted by Gheen [1990], the SDF can be written as a function of the SVD of the data matrix:

h_SDF = UΛ^{-1}V^T d.   (54)

We will use this recognition class representation to further refine our description of the rejection class for training. As we stated, the underlying assumption in a data driven method is that the data matrix x2 conveys information about the recognition class; any information about the recognition class outside the space of the data matrix is not attainable from this perspective. The information certainly exists, but there is no mechanism by which to include it in the determination of the discriminant function within this framework. This does, however, lead to a more efficient description of the rejection class. We can modify our optimization criterion to reduce the response to white sequences as they are projected into the Nt-dimensional subspace of the data matrix. Effectively this reduces the search for a discriminant function from an N1N2-dimensional space to an Nt-dimensional subspace.






The adaptation scheme of backpropagation allows a simple mechanism to implement this constraint. The adaptation of matrix W1 at iteration k can be written as

W1(k + 1) = W1(k) + x_i(k)ε1^T(k),   (55)


where ε1 is a column vector derived from the backpropagated error and x_i(k) is the current input exemplar from either class presented to the network which, by design, lies in the subspace spanned by the columns of U. From equation 55, if the rejection class noise exemplars are restricted to lie in the data space of x2, which can be achieved by projecting random vectors of size Nt onto the matrix U above, and W1 is initialized to be a random projection from this space, we will be assured that the columns of W1 only extract information from the data space of x2. This is because the columns of W1 will only be constructed from vectors which lie in the column space of U and so will be orthogonal to any vector component that lies in the null space of U. The search for a discriminant function is now reduced from an N1N2-dimensional space to a search within an Nt-dimensional subspace. Due to the dimensionality reduction achieved, we would expect the convergence time to be reduced.

This is the method that was used for the third experiment. Rejection class noise exemplars were generated by projecting a random vector, n ∈ ℜ^(N_T × 1), onto the basis U by x_rej = U n. In figure 35 the resulting discriminant function is shown as in the previous







experiments, and the result is similar to experiment II. The classifier performance as measured in table 3 and the ROC curve of figure 36 are also nominally the same.
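Generating a subspace rejection exemplar x_rej = U n takes one matrix-vector product; a small sketch (random data in place of the ISAR data matrix):

```python
import numpy as np

rng = np.random.default_rng(2)
N, Nt = 64, 5
x2 = rng.standard_normal((N, Nt))                # stand-in data matrix
U = np.linalg.svd(x2, full_matrices=False)[0]    # basis of the data subspace

# Subspace rejection-class exemplar: project an Nt-dimensional random
# vector into the image space through U (x_rej = U n).
n = rng.standard_normal(Nt)
x_rej = U @ n

# x_rej lies entirely in the Nt-dimensional data subspace of x2.
assert np.allclose(U @ (U.T @ x_rej), x_rej)
```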


Figure 35. Experiment III: Resulting feature space when the subspace noise is
used for training. Symbols represent the same data as in the previous
case.


There are, however, two notable differences. Examination of figure 37 shows that the output response to shifted images is even lower, allowing for better localization. This condition was found to be the case throughout the data set. Of more significance is the result shown in figure 38, in which we compare the learning curves of all of the experiments presented here. In this figure the dashed and dashed-dot lines are the learning curves for experiments II and III, respectively. In this case the convergence rate was increased nominally by a factor of three, from 100 epochs to approximately 30 epochs. Here an epoch represents one pass through all of the training data.

Figure 36. Experiment III: Resulting ROC curve for subspace noise training.


4.6.4 Experiment IV convex hull approach

In this experiment we present a technique which borrows from the ideas of Kumar et al. [1994]. That approach designed an SDF which rejects images which are away from the




























Figure 37. Experiment III: Output response to an image from the recognition
class training set




Figure 38. Learning curves for three methods. Experiment II: White noise
training (dashed line). Experiment III: subspace noise (dashed-dot
line). Experiment IV: subspace noise plus convex hull exemplars
(solid line).




Our primary purpose for modifying the objective function is to allow for an iterative method within the NL-MACE architecture. We have already shown in the previous chapter that this choice of criterion is suitable for classification. We will show that the primary qualities of the MACE filter are still maintained when we relax the equality constraints in our formulation. Varying β in the range [0, 1] controls the degree to which the average response to the rejection class is emphasized versus the variance about the desired output over the recognition class.

As in the linear case, we can only estimate the expected variance of the output due to the random vector input and its associated gradient. If, as in the MACE (or SMACE) filter formulation, the rejection class is characterized by all 2-D circular (or linear) shifts of the recognition class away from the origin, then this term can be estimated with a sampled average over the exemplars, x_2, for all such shifts. From an iterative standpoint this still leads to the impractical approach of training exhaustively over the entire output plane. It is desirable, then, to find other equivalent characterizations of the rejection class which may alleviate the computational load without significantly impacting performance.
4.5 Efficient Representation of the Rejection Class
Training becomes an issue once the associative memory structure takes a nonlinear form. The output variance of the linear MACE filter is minimized for the entire output plane over the training exemplars. Even when the coefficients of the MACE filter are computed iteratively, we need only consider the output point at the designated peak location (constraint) for each pre-whitened training exemplar [Fisher and Principe, 1994]. This is due to the fact that for the under-determined case, the linear projection which satisfies


Figure 9. MACE peak output response of vehicle 1a, 1b and 2a over all aspect angles. Degradation to between-aspect exemplars is evident. Generalization to the testing vehicles as measured by peak output response is also poorer. Vehicle 1a is the solid line, 1b is the dashed line, and 2a is the dashed-dot line.
mum penalty to the rest. The solution to all such filters can be characterized by the equation

h = Q^(-1) x (x^† Q^(-1) x)^(-1) d,    (2)

where, assuming M different criteria,

Q = Σ_{i=1}^{M} λ_i Q_i,    Σ_{i=1}^{M} λ_i = 1,    λ_i ≥ 0.
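Equation (2) can be exercised numerically. In this sketch the criterion matrices Q_i are arbitrary symmetric positive definite matrices and the data are random; both are illustrative assumptions, not the dissertation's actual criteria:

```python
import numpy as np

rng = np.random.default_rng(3)
N, Nt, M = 32, 4, 2
X = rng.standard_normal((N, Nt))   # data matrix
d = np.ones(Nt)                    # constraint values

# M quadratic criteria Q_i (symmetric positive definite), traded off
# by convex weights lambda_i.
Qs = []
for _ in range(M):
    A = rng.standard_normal((N, N))
    Qs.append(A @ A.T + N * np.eye(N))
lam = np.array([0.3, 0.7])         # lambda_i >= 0, sum to 1
Q = sum(l * Qi for l, Qi in zip(lam, Qs))

# Optimal trade-off filter of equation (2): h = Q^{-1} X (X^T Q^{-1} X)^{-1} d
Qinv_X = np.linalg.solve(Q, X)
h = Qinv_X @ np.linalg.solve(X.T @ Qinv_X, d)

# Any such filter satisfies the equality constraints X^T h = d.
assert np.allclose(X.T @ h, d)
```

Sweeping the weights λ_i traces out the performance boundary discussed later in the chapter.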

dimensional space, uniformly spaced in a hypercube, the distance between nearest points approaches

(V / N)^(1/d),    (90)

where V is the volume of the hypercube and N is the number of points. The upper bound of the kernel size can be set proportionally to this value. During training, the kernel size can be set so as to ensure local interaction subject to the upper bound.
Figure 63 shows an example result when entropy maximization is modeled as diffusion. The upper bound on the kernel size was set to 1/3 of equation 90. Subject to the upper bound, the kernel size was adaptively set to 1/2 the maximum nearest neighbor distance. One interesting observation is that near the center of the figure, the data (diamond symbols) have arranged themselves in a hexagonal configuration, which is well known to be the most efficient sampling scheme in two dimensions.
5.9.3 Stopping Criterion
Figure 63 brings up one final subject in the local interaction viewpoint. The original optimization criterion was the integrated squared error between the observed distribution and the desired uniform distribution. Since the PDF estimation was bypassed, we no longer have access to the criterion while training. Consequently, we need a proxy for the criterion in order to determine when to stop the training. We propose the following measure as a substitute

( max(Δ_NN) - min(Δ_NN) ) / max(Δ).    (91)
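Both the kernel-size bound of equation (90) and the proxy measure of equation (91) are cheap to compute from pairwise distances. A sketch on random points in the unit square (the point count and dimension are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)
pts = rng.uniform(0.0, 1.0, size=(50, 2))   # N points in a 2-D unit hypercube

# Kernel-size upper bound from equation (90): spacing of N uniformly
# spaced points in a hypercube of volume V approaches (V / N)^(1/d).
V = 1.0
N, d = pts.shape
upper_bound = (V / N) ** (1.0 / d)

# Pairwise distances and the proxy stopping measure of equation (91).
diff = pts[:, None, :] - pts[None, :, :]
D = np.sqrt((diff ** 2).sum(-1))
np.fill_diagonal(D, np.inf)
nn = D.min(axis=1)                  # nearest-neighbor distance per point
np.fill_diagonal(D, 0.0)
measure = (nn.max() - nn.min()) / D.max()

# The measure lies in [0, 1] and approaches zero as spacing equalizes.
assert 0.0 <= measure <= 1.0
assert upper_bound > 0.0
```

As the diffusion spreads the points toward uniform spacing, the numerator shrinks while the denominator stays near the diameter of the occupied region, driving the measure toward zero.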


changing basic structure of the data. For this example we choose the twelfth largest singular value. A data matrix x_sub is generated by

x_sub = U diag( Λ_{1,11}, 0, Λ_{i,j} ) V^T,

where Λ_{i,j} is a diagonal matrix containing the i through j singular values of the original data matrix x.
This data matrix is not full rank, so there is no analytical solution for the MACE filter; however, we can use the LMS approach and derive a linear associative memory. The columns of x_sub are pre-processed with a pre-whitening filter computed over the average power spectrum. The LMS algorithm can then be used to iteratively compute the transformation that best fits

x_sub^T h = d

in a least squares sense; that is, we can find the h that minimizes

(x_sub^T h - d)^T (x_sub^T h - d),

where d is a column vector of desired responses (set to all unity in this case).

The peak output response for this filter was computed over all of the aspect views of vehicle 1a and is shown in figure 18. The exemplars used to compute the filter are plotted with diamond symbols. The desired response cannot be met exactly so a least squares fit is achieved. Figure 19 shows the correlation output surface for one of the training exemplars.
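The rank-deficient least-squares problem above can be illustrated directly; here `lstsq` stands in for the fixed point that the LMS iteration approaches (the sizes, the rank, and the random stand-in for x_sub are assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)
N, Nt, r = 64, 10, 6
# Rank-deficient data matrix (rank r < Nt), standing in for x_sub.
x_sub = rng.standard_normal((N, r)) @ rng.standard_normal((r, Nt))
d = np.ones(Nt)   # desired responses, all unity

# No analytical MACE solution exists, but min_h || x_sub^T h - d ||^2
# is still well posed; lstsq returns the minimum-norm minimizer.
h, _, rank, _ = np.linalg.lstsq(x_sub.T, d, rcond=None)
assert rank == r

# The desired response cannot be met exactly; a least-squares fit results.
fit_error = np.linalg.norm(x_sub.T @ h - d)
assert fit_error > 1e-6
```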


in figure 1. In the figure, three vehicles are displayed, each at three different radar viewing aspect angles (5, 45, and 85 degrees), where the aspect angle is the direction of the front of the vehicle relative to the radar antenna. The image dimensions are 64 x 64 pixels. Radar systems measure a quantity called radar cross section (RCS). When a radar transmits an electromagnetic pulse, some of the incident energy on an object is reflected back to the radar. RCS is a measure of the reflected energy detected by the radar's receiving antenna. ISAR imagery is the result of a radar signal processing technique which uses multiple detected radar returns measured over a range of relative object aspect angles. Each pixel in an ISAR image is a measure of the aggregate radar cross section at regularly sampled points in space.
Two types of vehicles are shown. Vehicle type 1 will represent a recognition class, while vehicle type 2 will represent a confusion class. The goal is to compute a filter which will recognize vehicle type 1 without being confused by vehicle 2. Images of vehicle 1a will be used to compute the filter coefficients. Vehicles 1b and 2a represent an independent testing class.

ISAR images of all three vehicles were formed in the aspect range of 5 to 85 degrees at 1 degree increments. As the MSF is derived from a single vehicle image, an image of vehicle 1a at 45 degrees (the midpoint of the aspect range) is used.

The peak output response to an image represents the maximum of the cross correlation function of the image with the MSF template. The peak output response over the entire aspect range of vehicle 1a is shown in figure 2. As can be seen in the figure, the filter matches at 45 degrees very well; however, as the aspect moves away from 45 degrees, the


Figure 33. Experiment II: Resulting ROC curve with orthogonality constraint.
Convinced that the filter represents a better test for classification than the linear filter, we now examine the result for the other features of interest. Figure 34 shows the output response of this filter for one of the images. As seen in the figure, a noticeable peak at the center of the output plane has been achieved. This shows that the filter maintains the localization properties of the linear filter.

In this way the characterization of the rejection class by its second order statistics, the addition of the orthogonality constraint at the input layer to the MLP, and the use of a nonlinear topology has resulted in a superior classification test.

4.6.3 Experiment III subspace noise training

The next experiment describes an additional modification to this technique. One of the issues of training nonlinear systems is the convergence time. Training methods which require overly long training times are not of much practical use. We have already shown

Given this solution we can calculate the peak output signal power as

s = (x^† h)^2 = (x^† x (x^† x)^(-1) d)^2 = d^2,

and the average output noise power due to an additive white noise input as

σ_o^2 = E{h^† n n^† h} = σ^2 h^† h = σ^2 d^2 (x^† x)^(-1),

where σ^2 is the input noise variance. This results in a peak-signal-to-average-noise output power ratio of

s / σ_o^2 = (x^† x) / σ^2.

As we can see, the result is independent of the choice of scalar, d. If d is set to unity, the result is a normalized matched spatial filter [Vander Lugt, 1964].
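The independence from d is easy to confirm numerically for a single real exemplar (the exemplar, noise variance, and the test values of d are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.standard_normal(256)   # single training exemplar (vectorized image)
sigma2 = 0.5                   # input white-noise variance

def msf_snr(d):
    # Single-exemplar solution h = x (x^T x)^{-1} d.
    h = x * d / (x @ x)
    s = (x @ h) ** 2           # peak output signal power = d^2
    noise = sigma2 * h @ h     # average output noise power
    return s / noise

# The ratio x^T x / sigma^2 does not depend on the choice of d.
assert np.isclose(msf_snr(1.0), msf_snr(7.3))
assert np.isclose(msf_snr(1.0), (x @ x) / sigma2)
```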
In order to further motivate the concept of distortion invariant filtering, a typical ATR example problem will be used for illustration. This experiment will also help to illustrate the genesis of the various types of distortion invariant filtering approaches, beginning with the matched spatial filter (MSF).

Inverse synthetic aperture radar (ISAR) imagery will be used for all of the experiments presented herein. Distortion invariant filtering, however, is not limited to ISAR imagery and in fact can be extended to much more abstract data types. ISAR images are shown


Figure 63. Entropy maximization as diffusion. The data points are plotted as
diamonds in the figure above. PDF estimation locations are shown as
plus signs. We see that near the center of the distribution that the points
have arranged themselves in a hexagonal configuration, known to be
the most efficient sampling scheme in two dimensions.
where max(Δ_NN) is the maximum nearest neighbor distance (which we are already keeping track of), min(Δ_NN) is the minimum nearest neighbor distance, and max(Δ) is the maximum distance between any two points. The numerator term measures how equally spaced the points are and is expected to approach zero as the distribution becomes more uniform. The denominator term is a penalty term for not filling the entire space. This measure is shown for the previous diffusion example along with the integrated squared error measure and entropy in figure 64. Both the integrated squared error and entropy measures


optimal, which will almost always be the case for any high-dimensional classification problem, then we must work directly with the classification error.

We have also shown that the MSE criterion is a sufficient proxy for classification error (with certain restrictions); however, it requires that we work with the true output error of the mapping as well as a mapping with sufficient flexibility (i.e. one that can closely approximate a wide range of functions which are not necessarily linear). The linear systems approach, however, does not allow for either of these requirements. Consequently, we must adopt a nonlinear systems approach if we hope to achieve improved performance. The next chapter will show that the MACE filter can be extended to nonlinear systems such that the desirable properties of shift invariance and localized detection peak are maintained while achieving superior classification performance.


small number of randomly generated white sequences to efficiently represent the rejection class. The result is an algorithm which is of order N_T + N_ns (where N_ns is the number of white noise rejection class exemplars) as compared to exhaustive training.

4.6 Experimental Results

We now present experimental results which illustrate the technique and potential pitfalls. There are four significant outcomes in the experiments presented in this section. The first is that when using white sequences to characterize the rejection class, the linear solution is a strong attractor. The second outcome is that imposing orthogonality on the input layer of the MLP tends to lead to a nonlinear solution with improved performance. The third result, in which we restrict the rejection class to a subspace, yields a significant decrease in the convergence time. The fourth result, in which we borrow from the idea of using the interior of the convex hull to represent the rejection class [Kumar et al., 1994], yields significantly better classification performance.

In these experiments we use the data depicted in figure 23. As in the previous experiments, images from vehicle 1a will be used as the training set. Vehicles 1b and 1c will be used as the recognition class while vehicles 2a and 2b will be used as a rejection/confusion class for testing purposes. In each case comparisons will be made to a baseline linear filter. Specifically, in all cases the value of α for the linear filter is set to 0.99. The aspect separation between training images is 2.0 degrees. This results in 41 training exemplars from vehicle 1a. These settings of α and aspect separation were found to give the best classifier performance for the linear filter with this data set. We continue to refer to this as a MACE filter since the MACE criterion is so heavily emphasized. Technically it is an


Schmidt, W., and J. Davis (1993); Pattern recognition properties of various feature spaces for higher order neural networks, IEEE Transactions on Pattern Analysis and Machine Intelligence, 15 (8): 795-801.

Shannon, C. E. (1948); A mathematical theory of communications, Bell Systems Technical Journal, 27: 379-423.

Sudharsanan, S. I., A. Mahalanobis, and M. K. Sundareshan (1990); Selection of optimum output correlation values in synthetic discriminant function design, J. Opt. Soc. Am. A, 7 (4): 611-616.

Sudharsanan, S. I., A. Mahalanobis, and M. K. Sundareshan (1991); A unified framework for the synthesis of synthetic discriminant functions with reduced noise variance and sharp correlation structure, Appl. Opt., 30 (35): 5176-5181.

Vander Lugt, A. (1964); Signal detection by complex matched spatial filtering, IEEE Trans. Inf. Theory, 10 (2): 139.

Viola, P., N. Schraudolph, and T. Sejnowski (1996); Empirical entropy manipulation for real-world problems, Neural Information Processing Systems 8, to appear in published proceedings.

Werbos, P. (1974); Beyond regression: new tools for prediction and analysis in the behavioral sciences, Ph. D. Thesis, Harvard University, Cambridge, MA.

Widrow, B., and M. Hoff (1960); Adaptive switching circuits, IRE Wescon Convention Record, 96-104.

Wilkinson, T., and J. Goodman (1991); Synthetic discriminants and eigenvector decompositions, Appl. Opt., 30 (23): 3278-3280.

Wong, B., and I. Blake (1994); Detection in multivariate non-gaussian noise, IEEE Transactions on Communications, 42.


52 Example ISAR images from two vehicles used for experiments 124
53 Single vehicle experiment, 100 iterations 125
54 Single vehicle experiment, 200 iterations 126
55 Single vehicle experiment, 300 iterations 126
56 Two vehicle experiment 128
57 Two dimensional attractor functions 133
58 Two dimensional regulating function 134
59 Magnitude of the regulating function 134
60 Approximation of the regulating function 135
61 Feedback functions for implicit error term 138
62 Entropy minimization as local attraction 140
63 Entropy maximization as diffusion 142
64 Stopping criterion 143
65 Mutual information feature space 146
66 ROC curves for mutual information feature extraction (dotted line) versus linear MACE filter (solid line) 148
67 Mutual information feature space resulting from convex hull exemplars 149
68 ROC curves for mutual information feature extraction (dotted line) versus linear MACE filter (solid line) 150


Figure 40. Experiment IV: Resulting ROC curve with convex hull approach.


a statistical viewpoint and generalize it to arbitrary mapping functions, linear and nonlinear.

Consider images of dimension N_1 x N_2 re-ordered by column or row into vectors. Let the rejection class be characterized by the random vector X_1 ∈ ℜ^(N_1 N_2 × 1). We know the second-order statistics of this class as represented by the average power spectrum (or equivalently the autocorrelation function). Let the recognition class be characterized by the columns of a data matrix x_2 ∈ ℜ^(N_1 N_2 × N_T), which are observations of the random vector X_2 ∈ ℜ^(N_1 N_2 × 1), similarly re-ordered. We wish to find the parameters, ω, of a mapping g(ω, X): ℜ^(N_1 N_2) → ℜ such that we may discriminate the recognition class from the rejection class. Here, it is the mapping function, g, which defines the discriminator topology.

Towards this goal, we wish to minimize the objective function

J = E( g(ω, X_1)^2 )

over the mapping parameters, ω, subject to the system of constraints

g(ω, x_2) = d^T,    (45)

where d ∈ ℜ^(N_T × 1) is a column vector of desired outputs. It is assumed that the mapping function is applied to each column of x_2, and E(·) is the expected value function.
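The objective and constraints above can be estimated by sampling; a minimal sketch in which the mapping g is a simple linear projection and the rejection-class random vector is white noise (both illustrative assumptions for this formulation):

```python
import numpy as np

rng = np.random.default_rng(7)
N, Nt = 16, 4
x2 = rng.standard_normal((N, Nt))   # recognition-class data matrix
d = np.ones(Nt)                     # desired outputs

def g(w, X):
    # Illustrative mapping (linear projection); any differentiable
    # g(w, .): R^N -> R fits the formulation, applied column-by-column.
    return w @ X

w = rng.standard_normal(N)

# Sample estimate of the objective J = E[ g(w, X1)^2 ] over the
# rejection-class random vector X1 (white noise here).
X1 = rng.standard_normal((N, 10000))
J = np.mean(g(w, X1) ** 2)

# How far this w is from satisfying the constraints g(w, x2) = d^T.
constraint_violation = np.linalg.norm(g(w, x2) - d)
assert J > 0
```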


5.3 Information Theoretic Background

At this point we provide some background for the technique we are using. As this material is more specific to information theoretic processing, we feel that it is more appropriately presented at this time. The method we describe here combines mutual information maximization with Parzen window probability density function estimation. These concepts are reviewed.

5.3.1 Mutual Information as a Self-Organizing Principle

Entropy based information theoretic methods have been applied to a host of problems (e.g. blind separation [Bell and Sejnowski, 1995], parameter estimation [Kapur and Kesavan, 1992], and, of course, coding theory [Shannon, 1948]). Linsker [1988] has proposed mutual information (derived from entropy) as a self-organizing principle for neural systems. The premise is that a mapping of a signal through a neural network should be accomplished so as to preserve the maximum amount of mutual information. Linsker demonstrates this principle of maximum information preservation for several problems, including a deterministic signal corrupted by gaussian noise.

The appeal of mutual information as a criterion for feature extraction is threefold. First, mutual information exploits the structure of the underlying probability density function. Second, adaptation, as we will show, can be used to remove as much uncertainty as possible about the input class using observations of the output, y = g(x, α). Third, this is accomplished within the constraints of the mapping topology, g(·, α).
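The Parzen window estimator referred to above is a kernel average over the observed samples. A one-dimensional sketch with a Gaussian kernel (the data, kernel size, and grid are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(8)
samples = rng.normal(0.0, 1.0, size=200)   # observations of an output y

def parzen_pdf(y, data, h):
    # Parzen window estimate: average of Gaussian kernels of size h
    # centered on the observed samples.
    u = (y - data[:, None]) / h
    return np.mean(np.exp(-0.5 * u ** 2), axis=0) / (h * np.sqrt(2 * np.pi))

ys = np.linspace(-4.0, 4.0, 81)
p = parzen_pdf(ys, samples, h=0.3)

# The estimate is a valid density up to discretization/truncation error.
assert np.all(p >= 0)
assert abs(p.sum() * (ys[1] - ys[0]) - 1.0) < 0.05
```

The kernel size h plays the same role as the adaptively chosen kernel size discussed in the diffusion experiments.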


Computing the gradient of (99) with respect to h yields

∂J/∂h = 2βh + 2(1-β) x x^† h - 2(1-β) x d + λ 1.    (100)

Setting (100) to zero and solving for h yields

h = (βI + (1-β) x x^†)^(-1) ( (1-β) x d - (λ/2) 1 ).    (101)

Substituting (101) into the zero-mean constraint equation yields the condition

[1...1] (βI + (1-β) x x^†)^(-1) ( (1-β) x d - (λ/2) 1 ) = 0.    (102)

For the special case of d = [1...1]^T we can further simplify (102), yielding

[1...1] (βI + (1-β) x x^†)^(-1) ( (1-β) x [1...1]^T - (λ/2) 1 ) = 0.    (103)

Letting x [1/N_T ... 1/N_T]^T = x̄ and solving for the scalar λ yields

λ/2 = (1-β) N_T ( [1...1] (βI + (1-β) x x^†)^(-1) x̄ ) / ( [1...1] (βI + (1-β) x x^†)^(-1) 1 ).    (104)
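The core of this derivation, before the zero-mean term enters, is a ridge-regularized least-squares problem. A sketch verifying that the λ = 0 special case of (101) zeroes the gradient (100) (sizes and data are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(10)
N, Nt = 16, 4
x = rng.standard_normal((N, Nt))
d = np.ones(Nt)
beta = 0.3

# With lambda = 0, (101) reduces to the minimizer of
#   J = beta h^T h + (1 - beta) || x^T h - d ||^2.
A = beta * np.eye(N) + (1 - beta) * x @ x.T
h = np.linalg.solve(A, (1 - beta) * x @ d)

# Check against the gradient (100) with lambda = 0.
grad = 2 * beta * h + 2 * (1 - beta) * x @ (x.T @ h) - 2 * (1 - beta) * x @ d
assert np.allclose(grad, 0)
```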


We note that while these constraints led to improved classifier performance, the subspaces used were more closely related to signal representation rather than classification. It should be noted, however, that for an N-class classification problem the optimal Bayes classifier can be derived from an (N - 1)-dimensional feature space, where the features are the posterior probabilities of each class given the observation of the data (i.e. P(C|X = x)). This point underlies a key difference between feature extraction for classification versus feature extraction for signal representation [Fukunaga, 1990]. In classification it is the number of classes, N, which determines the minimum feature-space dimension.

The feature extraction approach to image classification methods is often decomposed, as in figure 41, into two stages: feature extraction followed by discrimination. In some cases the decomposition is explicit, while in others it is a matter of interpretation. Often the features are determined in ad hoc fashion based on an intuitive understanding of the data, but not explicitly with respect to classification. As an example, we can interpret the linear distortion invariant filtering methods as a decomposition of a pre-whitening filter (feature extraction) followed by an SDF (synthetic discriminant function). Similarly, the NL-MACE architecture that we are working with can be decomposed in this way, as shown in the figure. In fact, as the results have been reported, this is exactly the decomposition that has been used for feature space analysis to this point. However, such a decomposition is arbitrary, as the features extracted were driven by a single optimization criterion derived from the output space and which is coupled to the training of the discriminator.


range of distortions, such as a variation in viewing aspect of a single object. The goal is to design a single filter which will recognize an object class through the entire range of distortion. Under the design criterion the filter is equally matched to the entire range of distortion, as opposed to a single viewpoint as in a matched filter. Hence the nomenclature distortion-invariant filtering [Kumar, 1992].

The bulk of the research using these types of filters has focused on optical and infra-red (IR) imagery and overcoming recognition problems in the presence of distortions associated with 3-D to 2-D mappings, e.g. scale and rotation (in-plane and out-of-plane). Recently, however, this technique has been applied to radar imagery [Novak et al., 1994; Fisher and Principe, 1995a; Chiang et al., 1995]. In contrast to optical or infra-red imagery, the scale of each pixel within a radar image is usually constant and known. Consequently, radar imagery does not suffer from scale distortions of objects.

In the family of distortion invariant filters, the MACE filter has been shown to possess superior discrimination properties [Mahalanobis et al., 1987; Casasent and Ravichandran, 1992]. It is for this reason that this work emphasizes nonlinear extensions to the MACE filter. The MACE filter and its variants are designed to produce a narrow, constrained-amplitude peak response when the filter mask is centered on a target in the recognition class while minimizing the energy in the rest of the output plane. This property provides desirable localization for detection. Another property of the MACE filter is that it is less susceptible to out-of-class false alarms [Mahalanobis et al., 1987]. While the focus of this work will be on the MACE filter criterion, it should be stated that all of the results presented here are equally applicable to any of the distortion invariant filters mentioned above with appropriate changes to the respective optimization criteria.


The possible solutions, parameterized by λ_i, define a performance bound which cannot be exceeded by any linear system with respect to the optimization criteria and the equality constraints. All such linear filters which optimally trade off a set of quadratic criteria are referred to as optimal trade-off synthetic discriminant functions.

We may, for example, wish to trade off the MACE filter criterion versus the MVSDF filter criterion. This presents the added difficulty that one criterion is specified in the space domain and the other in the spectral domain. If the noise is represented as zero-mean, stationary, and ergodic (if the covariance is to be estimated from samples) we can, as mentioned, transform the MVSDF criterion to the spectral domain. In this case the optimal filter has the frequency domain solution

H = [λ D_n + (1-λ) D_x]^(-1) X ( X^† [λ D_n + (1-λ) D_x]^(-1) X )^(-1) d
  = D_λ^(-1) X ( X^† D_λ^(-1) X )^(-1) d,

where D_λ = λ D_n + (1-λ) D_x, 0 ≤ λ ≤ 1, and D_n, D_x are diagonal matrices whose diagonal elements contain the estimated power spectrum coefficients of the noise class and the recognition class, respectively. The performance bound of such a filter would resemble figure 10, where all linear filters would fall in the darkened region and all optimal trade-off filters would lie somewhere on the boundary.
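Because D_λ is diagonal, the frequency-domain solution costs little more than the plain SDF. A sketch with complex spectra (the random spectra and the choice λ = 0.5 are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(9)
N, Nt = 32, 3
# Spectral-domain data matrix (columns are exemplar spectra).
X = rng.standard_normal((N, Nt)) + 1j * rng.standard_normal((N, Nt))
d = np.ones(Nt)
Dn = np.ones(N)                           # white-noise power spectrum (flat)
Dx = np.abs(X).mean(axis=1) ** 2 + 1e-6   # recognition-class power spectrum

lam = 0.5                                 # trade-off parameter, 0 <= lam <= 1
Dlam = lam * Dn + (1 - lam) * Dx          # diagonal of D_lambda

# H = D_lambda^{-1} X (X^dagger D_lambda^{-1} X)^{-1} d
DinvX = X / Dlam[:, None]
H = DinvX @ np.linalg.solve(X.conj().T @ DinvX, d)

# The spectral-domain constraints X^dagger H = d are satisfied.
assert np.allclose(X.conj().T @ H, d)
```

Setting λ = 1 recovers the MVSDF-like solution and λ = 0 the MACE-like solution, with the trade-off curve traced out in between.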
By way of example we again use the data from the MACE and SDF examples. In this case we will construct an OTSDF which trades off the MACE filter criterion for the SDF criterion. In order to transform the SDF to the spectral domain, we will assume that the noise class is zero-mean, stationary, white noise. The power spectrum is therefore flat. One of the issues for constructing an OTSDF is how to set the value of λ, which represents the


TABLE OF CONTENTS
ACKNOWLEDGEMENTS ii
LIST OF FIGURES v
LIST OF TABLES viii
ABSTRACT ix
CHAPTERS
1 INTRODUCTION 1
1.1 Motivation 1
2 BACKGROUND 6
2.1 Discussion of Distortion Invariant Filters 6
2.1.1 Synthetic Discriminant Function 12
2.1.2 Minimum Variance Synthetic Discriminant Function 15
2.1.3 Minimum Average Correlation Energy Filter 18
2.1.4 Optimal Trade-off Synthetic Discriminant Function 20
2.2 Pre-processor/SDF Decomposition 24
3 THE MACE FILTER AS AN ASSOCIATIVE MEMORY 27
3.1 Linear Systems as Classifiers 27
3.2 MSE Criterion as a Proxy for Classification Performance 29
3.2.1 Unrestricted Functional Mappings 30
3.2.2 Parameterized Functional Mappings 32
3.2.3 Finite Data Sets 34
3.3 Derivation of the MACE Filter 35
3.3.1 Pre-processor/SDF Decomposition 38
3.4 Associative Memory Perspective 39
3.5 Comments 49
4 STOCHASTIC APPROACH TO TRAINING NONLINEAR SYNTHETIC DISCRIMINANT FUNCTIONS 52
4.1 Nonlinear Iterative Approach 52
4.2 A Proposed Nonlinear Architecture 53
4.2.1 Shift Invariance of the Proposed Nonlinear Architecture 55
4.3 Classifier Performance and Measures of Generalization 57
4.4 Statistical Characterization of the Rejection Class 67
4.4.1 The Linear Solution as a Special Case 69
4.4.2 Nonlinear Mappings 70


Although the MACE filter does have superior false alarm properties, it also has some fundamental limitations. Since it is a linear filter, it can only be used to realize linear decision surfaces. It has also been shown to be limited in its ability to generalize to exemplars that are in the recognition class (but not in the training set) while simultaneously rejecting out-of-class inputs [Casasent and Ravichandran, 1992; Casasent et al., 1991]. The number of design exemplars can be increased in order to overcome generalization problems; however, the calculation of the filter coefficients becomes computationally prohibitive and numerically unstable as the number of design exemplars is increased [Kumar, 1992]. The MINACE and G-MACE variations have improved generalization properties with a slight degradation in the average output plane variance [Ravichandran and Casasent, 1992] and sharpness of the central peak [Casasent et al., 1991], respectively.

This research presents a basis by which the MACE filter, and by extension all linear distortion invariant filters, can be extended to a more general nonlinear signal processing framework. In the development it is shown that the performance of the linear MACE filter can be improved upon in terms of generalization while maintaining its desirable properties, i.e. a sharp, constrained peak at the center of the output plane.

A more detailed description of the developmental progression of distortion invariant filtering is given in chapter 2. In that chapter a qualitative comparison of the various distortion invariant filters is presented using inverse synthetic aperture radar (ISAR) imagery. The application of pattern recognition techniques to high-resolution radar imagery has become a topic of great interest recently with the advent of widely available instrumentation grade imaging radars. High resolution radar imagery poses a special challenge to distortion invariant filtering in that the source of distortions such as rotation in aspect of an


ity of the data and the decision surface can be computed as a function of the features. Mathematically this can be written

y = φ( W_2^T φ( W_1^T u + θ_1 ) + θ_2 ).    (52)

Recall that the matrix W_i represents the connectivities from the output of layer (i - 1) to the inputs of the PEs of layer i, and φ(·) is the nonlinearity (hyperbolic tangent function in this case).
Figure 31 shows this projection for the training set (top) and the testing set (bottom). What is significant in the figure is that although the discriminant as a function of the vector u is nonlinear, the projections of the images lie on a single curve in this feature space. Topologically this filter can be put into one-to-one correspondence with a linear projection. This is not to say that the linear solution is undesirable, but under the optimization criterion it can be computed in closed form. Furthermore, in a space as rich as the ISAR image space it is unlikely that the linear solution will give the best classification performance.
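The two-layer discriminant of equation (52) can be sketched directly; the weight shapes, bias names, and random initialization below are illustrative assumptions rather than the trained network of the experiments:

```python
import numpy as np

rng = np.random.default_rng(11)
N, H = 64, 3   # input dimension, hidden units
W1 = rng.standard_normal((N, H))
W2 = rng.standard_normal(H)
b1 = rng.standard_normal(H)
b2 = rng.standard_normal()

def discriminant(u):
    # Two-layer MLP in the spirit of equation (52):
    # y = phi(W2^T phi(W1^T u + theta_1) + theta_2), phi = tanh.
    hidden = np.tanh(W1.T @ u + b1)
    return np.tanh(W2 @ hidden + b2)

u = rng.standard_normal(N)
y = discriminant(u)
assert -1.0 < y < 1.0
```

When the hidden-layer projections of all inputs fall on a single curve, as in figure 31, the map behaves like a (possibly warped) linear projection, which is the collapse the orthogonality constraint is meant to avoid.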
Table 3. Comparison of ROC classifier performance for two values of Pd. Results are shown for the linear filter versus four different types of nonlinear training. N: white noise training, G-S: Gram-Schmidt orthogonalization, subN: PCA subspace noise, C-H: convex hull rejection class.

Pd (%) | Pfa (%), linear filter | Pfa (%), nonlinear filter, experiments I-IV
       |                        | I (N)  | II (N, G-S) | III (subN, G-S) | IV (subN, G-S, C-H)
80     | 4.37                   | 4.37   | 3.74        | 2.81            | 2.45
99     | 42.43                  | 41.87  | 27.15       | 26.52           | 15.33
4.6.2 Experiment II noise training with an orthogonalization constraint

As a means of avoiding the linear solution, a modification was made to the training algorithm. The modification was to impose orthogonality on the columns of W_1 through a

obtained. Figure 33 shows the ROC curve for the resulting filter. It is evident that the nonlinear filter is a uniformly better test for classification.

Figure 32. Experiment II: Resulting feature space when orthogonality is imposed on the input layer of the MLP. In the top figure squares indicate the recognition class training exemplars, triangles indicate white noise rejection class exemplars, and plus signs are the images of vehicle 1a not used for training. In the bottom figure, squares are the peak responses from vehicles 1b and 1c, triangles are the peak responses from vehicles 2a and 2b.


Since the parameters, 0), are constant, equations 42 and 43 are sufficient to show the
mapping of the MLP is shift invariant and consequently, the system as a whole (including
the shift invariant pre-processor) is also shift invariant.
4.3 Classifier Performance and Measures of Generalization
One of the issues for any iterative method which relies on exemplars is the number of
training exemplars to use in the computation of the discriminant function. In addition, for
iterative methods, there is the issue of when to stop the adaptation process. In the case of
distortion invariant filters, such as the MACE filter, some common heuristics are used to
determine the number of training exemplars. Typically, samples are drawn from the training set and used to compute the filter from equation 23 until the minimum peak response
over the remaining samples exceeds some threshold [Casasent and Ravichandran, 1992].
A similar heuristic is to continue to draw samples from the training set until the mean
square error of the peak response over the remaining samples drops below some preset
threshold. These measures are then used as indicators of how well the filter generalizes to
between-aspect exemplars from the training set which have not been used for the computation of the filter coefficients.
The ultimate goal, however, is classification. Generalization in the context of classifi
cation must be related to the ability to classify a previously unseen input [Bishop, 1995].
We show by example that the measures of generalization mentioned above may be misleading as predictors of classifier performance, even for the linear filters. In fact, the results
of the experiments will show that the way in which the data is pre-processed is more indicative of classifier performance than these other, indirect measures.


Figure 46. Gradient of two-dimensional gaussian kernel. The kernels act as attractors to low
points in the observed PDF of the data when entropy maximization is desired.
a gaussian). The kernel gradient is convolved with the difference between the desired and
observed distributions to determine the error direction. The resulting error direction is
shown in figure 48. The difference between the cases is the sign of the error direction. As
we would expect, when we are maximizing entropy (top figure) the error direction points away from the modes of the observed distribution, while when we are minimizing entropy (bottom figure) the error direction points to the center of the modes. This
repulsion/attraction behavior extends to the multi-dimensional case as well. The bottom of
the figure illustrates another point with regard to feature extraction. As we can see, when
we are minimizing entropy, the trend is to make the observations more compact, a property which would be useful for identifying a class.
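The attraction/repulsion role of the kernel gradient can be made concrete: the gradient of a gaussian kernel points back toward the kernel center, and flipping its sign (as entropy maximization does) makes it point away. A minimal sketch, with the kernel width and evaluation point chosen arbitrarily for illustration:

```python
import numpy as np

def gauss2d(x, sigma=1.0):
    """2-D Gaussian kernel evaluated at a point x (shape (2,))."""
    x = np.asarray(x, dtype=float)
    return np.exp(-np.sum(x**2) / (2 * sigma**2)) / (2 * np.pi * sigma**2)

def gauss2d_grad(x, sigma=1.0):
    """Gradient of the 2-D Gaussian kernel; points toward the kernel center."""
    x = np.asarray(x, dtype=float)
    return -x / sigma**2 * gauss2d(x, sigma)

pt = np.array([1.5, -0.5])
g = gauss2d_grad(pt)

# the gradient at pt points back toward the origin (attraction) ...
assert g @ pt < 0
# ... so flipping its sign (entropy maximization) points away (repulsion)
assert (-g) @ pt > 0
# and the gradient vanishes at the kernel center
assert np.allclose(gauss2d_grad(np.zeros(2)), 0)
```

The sign flip between the minimization and maximization cases is exactly the repulsion/attraction behavior described above.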


bution and compute a mapping using an MLP and the entropy maximizing criterion
described in previous sections. The architecture of the MLP is 2-4-1, indicating 2 input
nodes, 4 hidden nodes, and 1 output node. The nonlinearity used is the hyperbolic tangent
function. We are, therefore, nonlinearly mapping the two-dimensional input space onto a
one-dimensional output space. The right-hand plot of figure 49 shows the image of the
maximum entropy mapping onto the input space. From the contours of this mapping we
see that the maximum entropy mapping lies essentially in the same direction as the first
principal component.
Figure 49. PCA vs. entropy, gaussian case. Left: image of PCA features shown
as contours (panel "PCA mapping"). Right: entropy mapping shown as contours
(panel "entropy mapping").
This result is expected. It illustrates that when the gaussian assumption is supported by
the data, maximum entropy and PCA are equivalent from the standpoint of direction. This
result has been recognized by many researchers. In fact, the gaussian assumption is often
used as a limiting case for maximum entropy approaches [Plumbley and Fallside, 1988].
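For a unit-norm linear projection y = w^T x of gaussian data, h(y) = (1/2)log(2*pi*e*w^T Sigma w), so maximizing entropy reduces to maximizing the projected variance, i.e. finding the first principal direction. A small sketch of this equivalence (the covariance values are illustrative, not the dissertation's data):

```python
import numpy as np

# anisotropic gaussian: variance 4 along the first axis, 0.25 along the second
Sigma = np.array([[4.0, 0.0], [0.0, 0.25]])

def proj_entropy(w, Sigma):
    """Differential entropy of y = w.T x for x ~ N(0, Sigma), unit-norm w."""
    w = w / np.linalg.norm(w)
    return 0.5 * np.log(2 * np.pi * np.e * (w @ Sigma @ w))

# scan unit directions; the entropy-maximizing direction is the top eigenvector
angles = np.linspace(0, np.pi, 181)
dirs = np.stack([np.cos(angles), np.sin(angles)], axis=1)
ent = np.array([proj_entropy(w, Sigma) for w in dirs])
best = dirs[np.argmax(ent)]

top_eigvec = np.array([1.0, 0.0])   # principal axis of Sigma
assert abs(abs(best @ top_eigvec) - 1.0) < 1e-6
```

This is the sense in which maximum entropy and PCA agree in direction under the gaussian assumption; the agreement rests entirely on second-order statistics.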
These techniques, however, only examine the covariance of the data in the output space.


We illustrate this point with an example using ISAR image data. A data set, larger than
in the previous experiments, will be used. Two more vehicles, one from each vehicle type,
will be used for the testing set, and all vehicles will be sampled at higher aspect resolution.
Figure 23 shows ISAR images of size 64 x 64 taken from five different vehicles and two
different vehicle types. The images are all taken with the same radar. Data taken from
vehicles in the same class vary in the vehicle configuration and radar depression angle (15
or 20 degrees depression). Images have been formed from each vehicle at aspect variations of 0.125 degrees from 5 to 85 degrees aspect, for a total of 641 images for each vehicle. Figure 23 shows each of the vehicles at 5, 45, and 85 degrees aspect.
We will use vehicle type 1 as the recognition class and vehicle type 2 as a confusion
vehicle. Images of vehicle la will be used as the set from which to draw training exem
plars. Classification performance will then be measured as the ability to recognize vehi
cles lb and lc while rejecting vehicles 2a and 2b. The filter we will use is a form of the
OTSDF [Réfrégier and Figue, 1991] which is computed in the spectral domain as

H = [\alpha P_x + (1-\alpha)\bar{P}_x]^{-1} X (X^{\dagger} [\alpha P_x + (1-\alpha)\bar{P}_x]^{-1} X)^{-1} d,   (44)

where the columns of X contain the exemplar images of dimension N_1 x N_2 of vehicle 1a reordered into column vectors. The
diagonal matrix P_x \in \Re^{N_1 N_2 \times N_1 N_2} contains the coefficients of the average power spectrum measured over the N_t exemplars of vehicle 1a, while \bar{P}_x \in \Re^{N_1 N_2 \times N_1 N_2} is the identity matrix scaled by the average of the diagonal terms of P_x. Finally, d \in \Re^{N_t \times 1} is a
column vector of desired outputs, one for each exemplar. The elements of d are typically


Figure 66. ROC curves for mutual information feature extraction (dotted line)
versus linear MACE filter (solid line).


It is interesting to compare this result to supervised training using error backpropagation. When training in a supervised manner an explicit desired output d_i is assigned to
each input x_i, and MSE minimization results in the following adaptation of the mapping
parameters

\Delta\omega = \eta\,\varepsilon_i \frac{\partial y_i}{\partial \omega},   (75)

where y_i is the observed response to the input x_i and \varepsilon_i = d_i - y_i is the observed output error. In
contrast, maximizing or minimizing entropy in the manner described results in the following adaptation of the mapping parameters

\Delta\omega = \pm\eta\,\varepsilon_i \frac{\partial y_i}{\partial \omega},

which, neglecting the sign term, is the same as equation 75 with one significant
difference. The sign term depends on whether we are minimizing or maximizing entropy.
5.5 Gaussian Kernels
Examination of the gaussian kernel and its differential in two dimensions illustrates
some of the practical issues of implementing this method of feature extraction, as well as
providing an intuitive understanding of what is happening during the adaptation process.


CHAPTER 3
THE MACE FILTER AS AN ASSOCIATIVE MEMORY
3.1 Linear Systems as Classifiers
In this chapter we present the MACE filter from the perspective of associative memo
ries. This perspective is important because it leads to a machine-learning and classification
framework and consequently a means by which to determine the parameters of a nonlinear
mapping via gradient search techniques. We shall refer, herein, to the machine learning/
gradient search methods as an iterative framework. The techniques are iterative in the
sense that adaptations to the mapping parameters are computed sequentially and repeatedly
over a set of exemplars. We shall show that the iterative and classification framework combined with a nonlinear system architecture have distinct advantages over the linear framework of distortion invariant filters.
As we have stated, distortion invariant filters can only realize linear discriminant functions. We begin, therefore, by considering linear systems used as classifiers. The adaline
architecture [Widrow and Hoff, 1960], depicted in figure 15, is an example of a linear system used for pattern classification. A pattern, represented by the coefficients x_i, is applied
to a linear combiner, represented by the weight coefficients w_i; the resulting output y is


each of the classes, we can use the law of large numbers to say that the summations of
equation 16 approach their expected values. In other words, in the limit as N_1, N_2 \to \infty

\frac{\partial J}{\partial \alpha} = P_1 E\left\{ f(x,\alpha)\frac{\partial f(x,\alpha)}{\partial \alpha} \,\middle|\, C_1 \right\} - P_2 E\left\{ (1 - f(x,\alpha))\frac{\partial f(x,\alpha)}{\partial \alpha} \,\middle|\, C_2 \right\},   (17)

which is identical to equation 12 and so yields the same solution for the mapping as

f(x,\alpha) = \frac{P_2\, p(x|C_2)}{p(x)}.   (18)

The conclusion is that if we have a sufficient number of observations to characterize
the underlying distributions then the MSE criterion is again equivalent to the Bayes criterion.
3.3 Derivation of the MACE Filter
We have already introduced the MACE filter in a previous section. We present a deri
vation of the MACE filter here. The development is similar to the derivations given in
Mahalanobis [1987] and Kumar [1992]. Our purpose in this presentation of the derivation
is that it serves to illustrate the associative memory perspective of optimized correlators, a
perspective which will be used to motivate the development of the nonlinear extensions
presented in later sections.


Figure 29. Output response of linear filter (top) and nonlinear filter (bottom).
The response is for a single image from the training set, but not one
used to compute the filter.
This table shows that the classifier performance for the linear and nonlinear filters
is nominally the same, despite what may be perceived to be better performance of the
nonlinear filter with regard to peak response over the training vehicle and reduced output
plane response to shifts of the image. Furthermore, if we examine figure 30, which shows


Using the method of Lagrange multipliers, we can augment the objective function as

J = E\{g(\omega, X_1)^2\} + (g(\omega, x_2) - d^T)\lambda,   (46)

where \lambda \in \Re^{N_t \times 1} is a vector whose elements are the Lagrange multipliers, one for each
constraint. Computing the gradient with respect to the mapping parameters yields

\frac{\partial J}{\partial \omega} = 2E\left\{ g(\omega, X_1)\frac{\partial g(\omega, X_1)}{\partial \omega} \right\} + \frac{\partial g(\omega, x_2)}{\partial \omega}\lambda.   (47)

Equation 47, along with the constraints of equation 45, can be used to solve for the optimal parameters, \omega, assuming our constraints form a consistent set of equations. This is,
of course, dependent on the mapping topology.
4.4.1 The Linear Solution as a Special Case
It is interesting to verify that this formulation yields the MACE filter as a special case.
If, for example, we choose the mapping to be a linear projection of the input image, that is
g(\omega, x) = \omega^T x with \omega = [h_1 \ldots h_{N_1 N_2}]^T \in \Re^{N_1 N_2 \times 1}, equation 46 becomes, after simplification,

J = \omega^T E\{X_1 X_1^T\}\omega + (\omega^T x_2 - d^T)\lambda.   (48)

In order to solve for the mapping parameters, \omega, we are still left with the task of computing the term E\{X_1 X_1^T\}, which, in general, we can only estimate from observations of the
random vector, X_1, or assume a specific form. Assuming that we have a suitable estima-


Figure 36. Experiment III: Resulting ROC curve for subspace noise training.
dition was found to be the case throughout the data set. Of more significance is the result
shown in figure 38, in which we compare the learning curves of all of the experiments presented here. In this figure the dashed and dashed-dot lines are the learning curves for
experiments II and III, respectively. In this case the convergence rate was increased nominally by a factor of three, from 100 epochs to approximately 30 epochs. Here an epoch
represents one pass through all of the training data.
4.6.4 Experiment IV: convex hull approach
In this experiment we present a technique which borrows from the ideas of Kumar et
al. [1994]. This approach designed an SDF which rejects images which are away from the


The goal is to maximize mutual information conditioned on the recognition class, or

I(C, y) = h(y) - h(y|C)
I(C, g(x,\alpha)) = h(g(x,\alpha)) - h(g(x,\alpha)|C),   (92)

where x is the pre-processed training exemplar and y is the extracted feature vector.
The feature extraction architecture is an N_1 N_2-4-2 MLP (4 hidden nodes, 2 output
nodes). The resulting feature space mapping is shown in figure 65. In contrast to the
results of section 4.6, the feature space is the output of a nonlinear mapping, and so it is
difficult to make other than qualitative comments about it. We can, however, say much
about the criterion from which it was derived (and we have). In this case we are left with a
performance comparison to the previous experiments. A summary of the results of this
section with those of section 4.6 are given in table 4, where we can see that the performance is comparable to (slightly better than) that of experiment III from the previous chapter,
which used the same noise class exemplars with the orthogonality constraint.
Table 4. Comparison of ROC classifier performance for two values of Pd. Results are
shown for the linear filter versus experiments III and IV from section 4.6 and mutual
information feature extraction. The symbols indicate the type of rejection class
exemplars used. N: white noise training, G-S: Gram-Schmidt orthogonalization,
subN: PCA subspace noise, C-H: convex hull rejection class.

                    Pfa (%)
                    linear    section 4.6                        mutual information
  Pd (%)            filter    (subN, G-S)   (subN, G-S, C-H)     (subN)    (subN, C-H)
  80                4.37      2.81          2.45                 2.65      1.36
  99                42.43     26.52         15.33                23.09     11.07
The resulting ROC curve is shown in figure 66, compared to the linear MACE filter.
It is not surprising that the performance did not exceed the performance of experiment III
when we consider how the rejection class was generated: as a random projection of gaus-


Gram-Schmidt process. The motivation for doing this stems from the fact that we are
working in a pre-whitened image space. In a pre-whitened image space this condition is
sufficient to assure that the outputs in the feature space, as measured at u_1 and u_2, will be
uncorrelated over the rejection class. Mathematically this can be shown as

E\{uu^T\} = E\{W_1^T X_1 X_1^T W_1\} = W_1^T E\{X_1 X_1^T\} W_1
         = \begin{bmatrix} w_1^T E\{X_1 X_1^T\} w_1 & w_1^T E\{X_1 X_1^T\} w_2 \\ w_2^T E\{X_1 X_1^T\} w_1 & w_2^T E\{X_1 X_1^T\} w_2 \end{bmatrix}
         = \begin{bmatrix} w_1^T \sigma^2 I w_1 & w_1^T \sigma^2 I w_2 \\ w_2^T \sigma^2 I w_1 & w_2^T \sigma^2 I w_2 \end{bmatrix}
         = \sigma^2 \begin{bmatrix} \|w_1\|^2 & 0 \\ 0 & \|w_2\|^2 \end{bmatrix},

where w_1, w_2 \in \Re^{N_1 N_2 \times 1} are the columns of W_1. This result is true for any number of
nodes in the first layer of the MLP.
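A quick numerical check of this decorrelation property, using a QR factorization as the Gram-Schmidt step and the analytic covariance of a white input (the dimensions and variance below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)
N, M = 64, 2                        # input dimension, nodes in the first layer
sigma2 = 3.0                        # white (pre-whitened) input covariance s^2 I

W = rng.standard_normal((N, M))
Q, _ = np.linalg.qr(W)              # Gram-Schmidt orthogonalization of columns

# analytic covariance of u = Q.T x when E{x x.T} = sigma2 * I
cov_u = Q.T @ (sigma2 * np.eye(N)) @ Q

# orthonormal columns => cov_u is diagonal: the feature-space outputs
# u1, u2 are uncorrelated over the white rejection class
assert np.allclose(cov_u, sigma2 * np.eye(M), atol=1e-10)
```

With QR the columns come out orthonormal, so the diagonal entries are all sigma^2; merely orthogonal columns would give the diagonal of squared column norms, as in the derivation above.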
The results of the training with this modification are shown in figure 32, which is the
resulting feature space as measured at u_1 and u_2. From this figure we can see that the discriminant function, represented by the contour lines, is a nonlinear function of u_1 and u_2.
Furthermore, because the projections of the vehicles into the feature space do not lie on a
single curve (as in the previous experiment), the features represent different discrimination information with regard to both the rejection and recognition classes. The bottom of
the figure, showing the projection of a random sampling of the test vehicles (all 1282
would be too dense for plotting), shows that both features are useful for separating vehicle 1
from vehicle 2. Examination of table 3 (column II in the nonlinear results) shows that at
the two detection probabilities of interest improved false alarm performance has been


Figure 28 shows the peak output response taken over all images of vehicle 1a for both
the linear (top) and nonlinear (bottom) filters. In the figure we see that for the linear filter
the peak constraint (unity) is met exactly for the training exemplars, with degradation for
the between-aspect exemplars. As mentioned previously, if the pure MACE filter criterion
were used (\alpha equal to unity), the peak in the output plane is guaranteed to be at the constraint location [Mahalanobis et al., 1987]. It turns out that for this data set the peak output
also occurs at the constraint location for the training images; however, with \alpha = 0.99 this was
not guaranteed. Examination of the peak output response for the NL-MACE filter shows
that the constraints are met very closely (but not exactly) for the training exemplars, also
with degradation in the peak output response at between-aspect locations. The degradation
for the nonlinear filter is noticeably less than in the linear case, and so in this regard it has
outperformed the linear filter.
Figure 29 shows the output plane response for a single image of vehicle 1a (not one
used for computing the filter coefficients) for the linear filter (top) and the nonlinear filter
(bottom). Again in this figure we see that both filters result in a noticeable peak when the
image is centered on the filter and a reduced response when the image is shifted. The
reduction in response to the shifted image is again noticeably better in the nonlinear filter
than in the linear filter. The same would be found to be true for all images of vehicle 1a, and so
in this regard we can again say that the nonlinear filter has outperformed the linear filter.
However, as we have already illustrated for the linear case, these measures alone are not sufficient to predict classifier performance and are certainly not sufficient to compare
linear systems to nonlinear systems. This point is made clear in table 3, which summarizes
the classifier performance at two probabilities of detection for all of the experiments


Three equivalent formulations of mutual information are

I(x, y) = h(x) + h(y) - h(x, y),   (56)
I(x, y) = h(y) - h(y|x), and   (57)
I(x, y) = h(x) - h(x|y),   (58)

where I(x, y) is the mutual information of the RVs X and Y. In equations 56 through 58,
h(x) is the differential entropy measure (which we will refer to as simply entropy)
[Papoulis, 1991] of the RV X, h(x|y) is the entropy of the RV X conditioned on the RV
Y, and h(x, y) is the joint entropy of the RVs X and Y. Entropy is used to quantify our
uncertainty about a given random variable or vector. Mutual information quantifies the
relative uncertainty of one random variable/vector with respect to another; it measures the
information that one random variable/vector conveys about another. We note that manipulation of mutual information is dependent upon the ability to manipulate entropy. In fact
we can manipulate the entropy-related terms of mutual information independently.
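For gaussian RVs the entropies in these formulations are available in closed form (h = (1/2)log((2*pi*e)^N det Sigma)), so the equivalence of equations 56 and 57 can be verified directly. A small sketch for a bivariate gaussian with correlation rho (the numbers are illustrative):

```python
import numpy as np

def gauss_entropy(Sigma):
    """Differential entropy of N(0, Sigma) in nats: 0.5*log((2 pi e)^N det Sigma)."""
    Sigma = np.atleast_2d(Sigma)
    N = Sigma.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** N * np.linalg.det(Sigma))

rho = 0.8
joint = np.array([[1.0, rho], [rho, 1.0]])   # unit-variance pair (x, y)

# equation 56: I(x, y) = h(x) + h(y) - h(x, y)
I_56 = gauss_entropy(1.0) + gauss_entropy(1.0) - gauss_entropy(joint)

# equation 57: I(x, y) = h(y) - h(y|x); for gaussians the conditional
# variance of y given x is 1 - rho^2
I_57 = gauss_entropy(1.0) - gauss_entropy(1.0 - rho**2)

assert np.isclose(I_56, I_57)
assert np.isclose(I_56, -0.5 * np.log(1 - rho**2))
```

Both routes reduce to I = -(1/2)log(1 - rho^2), illustrating that the mutual information can be manipulated through either of its entropy terms.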
Following the notation of Papoulis, the entropy of a continuous random variable or
vector (RV), X \in \Re^N, is defined as

h(x) = -\int \log(f_X(x)) f_X(x)\,dx,   (59)

where f_X(x) is the probability density function of the RV, the base of the logarithm is arbitrary, and the integral is N-fold. The conditional and joint forms of entropy used in 56
through 58 substitute the conditional and joint probability density functions, respectively, into equa-


sian noise onto an ortho-normal basis. As we have shown in previous experiments, under
the gaussian condition (equal covariances), orthogonality and entropy are equivalent.
These results then give support to this technique, since orthogonality was not enforced on
the feature extractor.
Figure 65. Mutual information feature space. Rejection class is represented with
sub-space noise images. The top figure shows the training exemplars
(plus signs are recognition and triangles are rejection), while the bottom
shows the testing set.


where K_1(m, \sigma) is the one-dimensional Gaussian kernel of width \sigma. The vector result of
the convolution is then written componentwise as

f_r(u)_i = \frac{1}{2}\left[\mathrm{erf}\!\left(\frac{u_i + a/2}{\sqrt{2}\sigma}\right) - \mathrm{erf}\!\left(\frac{u_i - a/2}{\sqrt{2}\sigma}\right)\right].   (120)
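This expression rests on the standard result that convolving a Gaussian kernel with the indicator of an interval of width a yields a difference of error functions. A hedged numeric check of that one-dimensional result (the width, kernel scale, and test points below are arbitrary):

```python
import math
import numpy as np

a, sigma = 2.0, 0.5

def gauss(t):
    """One-dimensional Gaussian kernel of width sigma."""
    return np.exp(-t**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

def erf_form(u):
    """Closed form for (rect * gauss)(u), rect the indicator of [-a/2, a/2]."""
    s = math.sqrt(2) * sigma
    return 0.5 * (math.erf((u + a / 2) / s) - math.erf((u - a / 2) / s))

# brute-force convolution on a fine grid (trapezoid rule)
t = np.linspace(-10, 10, 20001)
dt = t[1] - t[0]
rect = (np.abs(t) <= a / 2).astype(float)

def numeric(u):
    y = rect * gauss(u - t)
    return float(np.sum(0.5 * (y[1:] + y[:-1])) * dt)

max_err = max(abs(numeric(u) - erf_form(u)) for u in (-1.5, 0.0, 0.7, 2.0))
assert max_err < 1e-3
```

The small residual error comes from the trapezoid rule at the discontinuous edges of the rectangle, not from the closed form.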


mapping is nonlinear, in which case the second term of equation 61 is a function of the
random variable, it is possible to change the relative information of two random variables.
From the perspective of classification this is an important point. If the mapping is
topological (in which case it has a unique inverse), there is no increase, theoretically, in the
ability to separate classes. That is, we can always reflect a discriminant function in the
transformed space as a warping of another discriminant function in the original space.
This is not true, however, for a mapping onto a subspace. Our implicit assumption here is
that we are unable to reliably determine a discriminant function in the full input space. As
a consequence we seek a subspace mapping that is by some measure optimal for classification. We cannot avoid the loss of information (and hence some ability to discriminate
classes) when using a subspace mapping. However, if the criterion used for adapting the
mapping is information (entropy) based, we can perhaps minimize this loss. It should be
mentioned that in all classification problems there is an implicit assumption that the
classes to be discriminated do indeed lie in a subspace.
5.3.2 Mutual Information as a Criterion for Feature Extraction
It is our intent to use mutual information as a criterion for feature extraction (prior to
classification). The use of mutual information in this way can be motivated simply by
Fano's inequality [1961], which gives a lower bound for the probability of error (or conversely an upper bound on the probability of correct classification) when estimating a discrete RV from another RV, as a function of the conditional entropy (and ultimately the
mutual information). Fano's inequality is stated as follows: given the discretely distributed


[Figure 27 plot: ROC area versus \alpha, with legend entries for aspect separations of 2.00, 4.00, and 8.00 degrees.]
Figure 27. ROC performance measures versus \alpha. Results are shown for training
aspect separations of 2, 4, and 8 degrees. These plots indicate that
ROC performance is positively related to \alpha.
perform the MACE filter or its variations in terms of maximizing the minimum peak
response over the training vehicle or reducing the variance in the output image plane.


Figure 6. SDF peak output response of testing vehicles 1b and 2a over all aspect
angles. The dashed line is vehicle 1b while the dashed-dot line is
vehicle 2a.
the correlation filter which minimizes the output variance due to zero-mean input noise
while satisfying the same linear constraints as the SDF. The output noise variance can be
shown to be h^{\dagger}\Sigma_n h, where h is the vector of filter coefficients and \Sigma_n is the covariance
matrix of the noise [Kumar, 1986].
Mathematically the problem formulation is

\min_h h^{\dagger}\Sigma_n h \quad \text{s.t.}\quad X^{\dagger}h = d, \qquad h \in C^{N \times 1},\; X \in C^{N \times N_t},\; \Sigma_n \in C^{N \times N},\; d \in C^{N_t \times 1}.


For the case of f(x) = 0.5, both classes are equally likely, so a guess must be made.
3.2.1 Unrestricted Functional Mappings
With regard to the adaline/LMS approach we now ask: what is the consequence of
using the MSE criterion for computing discriminant functions? In the two-class case, the
source distributions are p(x|C_1) or p(x|C_2), depending on whether the observation, x, is
drawn from class 1 or class 2, respectively. If we assign a desired output of zero to class 1
and unity to class 2, then the MSE criterion is equivalent to the following

J(f) = \frac{P_1}{2} E\{f(x)^2 | C_1\} + \frac{P_2}{2} E\{(1 - f(x))^2 | C_2\},   (5)

where the 1/2 scale factors are for convenience, E\{\cdot\} is the expectation operator, and
C_i indicates class i.
For now we will place no constraints on the functional form of f(x). In so doing, we
can solve for the optimal solution using the calculus of variations approach. In this case,
we would like to find a stationary point of the criterion J(f) due to small perturbations in
the function f(x), indicated by

\delta J = J(f + \delta f) - J(f) = 0.   (6)


Insertion of the unitary DFT transformation matrices into equation (96), using the definitions of (94) and the identity of (95), yields the frequency domain relationship

\sigma_o^2 = h^T \Sigma_n h = \hat{h}^{\dagger} \hat{\Sigma}_n \hat{h}.   (97)

A.2 Optimal Trade-off of Noise Response with Error Variance Subject to Zero-Mean Error Constraint
Suppose we wish to relax the equality constraints with regards to the desired outputs.
That is, we no longer require that

X^{\dagger}h = d   (98)

and instead allow the output response at the locations of the former equality constraints to
vary with zero mean. We can trade off the noise response of the filter with respect to the
error variance as follows

\min_h \; \beta h^{\dagger}\Sigma_n h + (1 - \beta)(X^{\dagger}h - d)^{\dagger}(X^{\dagger}h - d)

subject to the constraint

[1 \ldots 1](X^{\dagger}h - d) = 0.

Adjoining the zero-mean equality constraint to the optimization criterion yields

J = \beta h^{\dagger}\Sigma_n h + (1 - \beta)(X^{\dagger}h - d)^{\dagger}(X^{\dagger}h - d) + \lambda[1 \ldots 1](X^{\dagger}h - d)
  = \beta h^{\dagger}\Sigma_n h + (1 - \beta)(h^{\dagger}XX^{\dagger}h - (Xd)^{\dagger}h - h^{\dagger}Xd + d^{\dagger}d) + \lambda[1 \ldots 1](X^{\dagger}h - d).   (99)


gle filter which could be matched to multiple images using the idea of superposition. This
approach was possible due to the large number of coefficients (degrees of freedom) that
typically constitute 2-D image templates. For historical reasons, specifically that the filters
in question were synthesized optically using holographic techniques [Vander Lugt, 1964],
it was hypothesized that such a filter could be synthesized from linear combinations of a
set of exemplar images.
The filter synthesis procedure consists of projecting the exemplar images onto an
ortho-normal basis (originally Gram-Schmidt orthogonalization was used to generate the
basis). The next step is to determine the coefficients with which to linearly combine the
basis vectors such that a desired response for each original image exemplar is obtained
[Hester and Casasent, 1980].
The proposed synthesis procedure is a bit convoluted. It turns out that the choice of
ortho-normal basis is irrelevant; as long as the basis spans the space of the original exemplar images the result is always the same. The development of Kumar [1986] is more useful for depicting the SDF as a generalization of the matched filter (for the white noise
case) to multiple signals. The SDF can be cast as the solution to the following optimization problem

\min_h h^{\dagger}h \quad \text{s.t.}\quad X^{\dagger}h = d, \qquad \{h \in C^{N \times 1},\; X \in C^{N \times N_t},\; d \in C^{N_t \times 1}\},

where X is now a matrix whose N_t columns comprise a set of training images(1) we wish
to detect, and d is a column vector of desired outputs (one for each of the training exemplars).
1. Since these filters have been applied primarily to 2-D images, signals will be referred to
as images or exemplars from this point on. In the vector notation, all N_1 x N_2 images are
re-ordered (by row or column) into N x 1 column vectors, where N = N_1 N_2.
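The minimum-norm solution of this problem is h = X(X^+ X)^{-1} d, and the irrelevance of the ortho-normal basis noted above can be checked directly. A small numpy sketch with synthetic real-valued exemplars (all sizes hypothetical):

```python
import numpy as np

rng = np.random.default_rng(5)
N, Nt = 64, 5

X = rng.standard_normal((N, Nt))    # exemplar images as columns
d = np.ones(Nt)                     # desired output for each exemplar

# SDF: minimum-norm solution of X.T h = d; it lies in span(X)
h = X @ np.linalg.solve(X.T @ X, d)
assert np.allclose(X.T @ h, d)

# the choice of ortho-normal basis is irrelevant: a basis spanning the
# exemplars (here from a QR factorization, i.e. Gram-Schmidt) yields
# the same filter
Q, _ = np.linalg.qr(X)                       # ortho-normal basis for span(X)
h_qr = Q @ np.linalg.solve(X.T @ Q, d)       # coefficients c so X.T (Q c) = d
assert np.allclose(h_qr, h)
```

Both constructions satisfy the constraints with a filter confined to the span of the exemplars, and within that span the constrained solution is unique, which is why the basis choice does not matter.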


Figure 58. Two-dimensional regulating function. The x_1-component is shown at
top while the x_2-component is shown at the bottom.
mapping topology, the approximation can be used in order to save significant computation.
Figure 59. Magnitude of the regulating function. The magnitude of this function is
zero except near the boundary of the desired output distribution.


ACKNOWLEDGEMENTS
There are many people I would like to acknowledge for their help in the genesis of this
manuscript. I would begin with my family for their constant encouragement and support.
I am grateful to the Electronic Communications Laboratory and the Army Research
Laboratory for their support of the research at the ECL. I was fortunate to work with very
talented people, Marion Bartlett, Jim Bevington, and Jim Kurtz, in the areas of ATR and
coherent radar systems. In particular, I cannot overstate the influence that Marion Bartlett
has had on my perspective of engineering problems. I would also like to thank Jeff Sichina
of the Army Research Laboratory for providing many interesting problems, perhaps too
interesting, in the field of radar and ATR. A large part of who I am technically has been
shaped by these people.
I would, of course, like to acknowledge my advisor, Dr. José Principe, for providing me
with an invaluable environment for the study of nonlinear systems and excellent guidance
throughout the development of this thesis. His influence will leave a lasting impression on
me. I would also like to thank DARPA; funding by this institution enabled a great deal of
the research that went into this thesis. I would also like to thank Drs. David Casasent and
Paul Viola for taking an interest in my work and offering helpful advice.
I would also like to thank the students, past and present, of the Computational NeuroEngineering Laboratory. The list includes, but is not limited to, Chuan Wang for useful
discussions on information theory, Neil Euliano for providing much-needed recreational
opportunities and intramural championship t-shirts, and Andy Mitchell for being a good friend
to go to lunch with, who suffered long inane technical discussions, and who now is a
better climber than me. There are certainly others and I am grateful to all.
Finally I would like to thank my wife, Anita, for enduring a seemingly endless ordeal,
for allowing me to use every ounce of her patience, and for sacrificing some of her best
years so that I could finish this Ph. D. I hope it has been worth it.


denoted by N_w. In equation 41, \varphi is a constant bias vector added to each element of the
vector W_2\sigma(W_1 y) \in \Re^{N_w \times 1}. It is also assumed that if the argument to the nonlinear
function \sigma(\cdot) is a matrix or vector, then the nonlinearity is applied to each element of the
matrix or vector.
The input to the MLP is denoted as a vector, y \in \Re^{N_1 N_2 \times 1}. The elements of the vector
are samples of a two-dimensional pre-whitened input signal, y(n_1, n_2). We can write the
i th element of the vector as a function of the two-dimensional signal as follows

y_i(n_1, n_2) = y(n_1 + (i \bmod N_1),\; n_2 + \lfloor i/N_1 \rfloor),

where \lfloor i/N_1 \rfloor indicates integer division of i by N_1. Written this way, the elements of the vector
y sample a rectangular region of support of size N_1 \times N_2 beginning at sample (n_1, n_2) in
the pre-whitened signal, y(n_1, n_2). The vector argument of equation 41 and the resulting
output signal can now be written as an explicit function of the beginning sample point of
the template within the pre-whitened image

g_o(n_1, n_2) = g(\omega, y(n_1, n_2)).   (42)

The output of the mapping as written in equation 42 is now an explicit function of
(n_1, n_2) and the constant parameter set, \omega (which does not vary with (n_1, n_2)). We can also
write the output response as a function of the shifted version of the image, y(n_1, n_2), as

g_o(n_1 + n_1', n_2 + n_2') = g(\omega, y(n_1 + n_1', n_2 + n_2')).   (43)


is satisfied while h^{\dagger}h is minimized. As in the MACE filter, this optimization can be
solved using the method of Lagrange multipliers. We adjoin the system of constraints to
the optimization criterion as

J = h^{\dagger}h + \lambda^T((Ax)^{\dagger}h - d),   (32)

where \lambda \in \Re^{N_t \times 1} is a column vector of Lagrange multipliers, one for each constraint
(desired response). Taking the gradient of equation (32) with respect to the vector h yields

\frac{\partial J}{\partial h} = 2h + Ax\lambda.   (33)

Setting the gradient to zero and solving for the vector h yields

h = -\frac{1}{2}Ax\lambda.   (34)

Substituting this result into the constraint equations of (31) and solving for the Lagrange
multipliers yields

\lambda = -2((Ax)^{\dagger}Ax)^{-1}d.   (35)

Substituting this result back into equation (34) yields the final solution to the optimization
as

h = Ax((Ax)^{\dagger}Ax)^{-1}d.   (36)

If the pre-processing transformation, A, is the space-domain equivalent of the MACE
filter's spectral pre-whitener and the columns of the data matrix x contain the re-ordered
elements of the images from the MACE filter problem, then equation (36) combined with


Figure 23. ISAR images of two vehicle types shown at aspect angles of 5, 45, and
85 degrees, respectively. Three different vehicles of type 1 (a, b, and c)
are shown, while two different vehicles of type 2 (a and b) are shown.
Vehicle 1a is used as a training vehicle, while vehicles 1b and 1c are
used as the testing vehicles for the recognition class. Vehicles 2a and
2b are used as confusion vehicles.
set to unity. When \alpha is set to unity, equation 44 yields exactly the MACE filter; when it is
set to zero, the result is the SDF. The filter we are using is therefore trading off the MACE
filter criterion with the SDF criterion. The SDF criterion can also be viewed as the
MVSDF [Kumar, 1986] criterion when the noise class is represented by a white noise random process. This filter can also be decomposed as in figure 22.
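Since the power-spectrum matrices in equation 44 are diagonal, the trade-off filter can be computed with elementwise operations on the exemplar spectra. A minimal numpy sketch under assumed dimensions (random complex spectra stand in for the exemplar DFTs; all sizes are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(6)
N, Nt = 256, 6                      # N1*N2 spectral coefficients, Nt exemplars
alpha = 0.99

# columns of X hold the (reordered) 2-D DFTs of the training exemplars
X = rng.standard_normal((N, Nt)) + 1j * rng.standard_normal((N, Nt))
d = np.ones(Nt)                     # desired correlation peak per exemplar

Px = np.mean(np.abs(X)**2, axis=1)      # average power spectrum (diagonal)
Px_bar = np.full(N, Px.mean())          # identity scaled by mean diagonal term

T = alpha * Px + (1 - alpha) * Px_bar   # trade-off "whitening" spectrum
H = (X / T[:, None]) @ np.linalg.solve(X.conj().T @ (X / T[:, None]), d)

# the filter meets the correlation-peak constraints: X^+ H = d
assert np.allclose(X.conj().T @ H, d)
```

Setting alpha to 1 recovers the MACE-style pre-whitening by the average power spectrum, while alpha = 0 reduces T to a scaled identity and gives an SDF-like solution, matching the limits described above.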


statistics are sufficient (e.g. gaussian) to describe such structure, but in general they are
not.
5.7 Maximum Entropy: ISAR Experiment
We now present some experimental results using a maximum entropy feature extractor
for ISAR data. The mapping structure we use in our experiment is a multi-layer perceptron
with a single hidden layer (4096 input nodes, 4 hidden nodes, 2 output nodes). The method
is used to extract features from two vehicle types with ISAR images from 180 degrees of
aspect. Examples of the imagery are shown in figure 52. In these experiments the optimization criterion is always to maximize entropy conditioned on the input class. The input
class may be represented by a single vehicle type or both vehicle types depending on the
experiment. This is not how the technique would be applied to the NL-MACE structure
(recall that mutual information has both an entropy minimizing and an entropy maximizing term), but
the results are interesting and further illustrate the potential of the information theoretic
approach.
Figure 52. Example ISAR images from two vehicles used for experiments. The
vehicles were rotated through an aspect range of 0 to 180 degrees. The
top and bottom rows are from different vehicles.


magnitude of the correlator output). It is the nonlinear iterative process which determines
how to separate the recognition class exemplars from the images derived from the convex
hull. The result is significantly improved classification.
In this experiment we continue to use random noise projected onto the basis defined
by the columns of the matrix V as in experiment III. In addition, convex hull exemplars are
generated by projecting a random vector a ∈ ℝ^{N_t × 1} onto the data matrix x_2. The basis
for this approach is that elements of the convex hull that are distant from the extremal
points (the training exemplars) do not convey information about the recognition class, and
so in keeping with this idea we imposed a further restriction on the coefficients a_i: an
upper bound that keeps every coefficient well below unity.
This restriction assures that none of the generated convex hull exemplars lies too close
to one of the recognition class training exemplars. Rejection class exemplars from within
the convex hull are randomly generated throughout the training from x_rej = x_2 a. Another
property of these rejection class exemplars is that they also lie in the subspace of the data
matrix x_2.
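The sampling procedure described above can be sketched in a few lines (an illustration with random stand-in data, not the dissertation's code; `cap` is a hypothetical bound standing in for the restriction on the coefficients):

```python
import numpy as np

rng = np.random.default_rng(0)

def convex_hull_exemplar(X2, cap, rng):
    """Draw one rejection exemplar from the convex hull of the columns of X2.

    Coefficients are non-negative and sum to one; `cap` (a hypothetical bound,
    not the dissertation's exact restriction) keeps every coefficient small so
    the sample stays away from the extremal points (the training exemplars).
    """
    Nt = X2.shape[1]
    while True:
        a = rng.random(Nt)
        a /= a.sum()                  # a point in the simplex
        if a.max() <= cap:            # reject draws too close to a vertex
            return X2 @ a             # x_rej = x_2 a, in the subspace of x_2

X2 = rng.standard_normal((64, 21))    # stand-in for 21 training exemplars
x_rej = convex_hull_exemplar(X2, cap=0.2, rng=rng)
print(x_rej.shape)                    # (64,)
```

By construction each sample lies in the column space of x_2, the property noted in the text.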
Examination of table 3 and the ROC curve of figure 40 show that this method yields
significantly improved classification performance. The discriminant function shown in
figure 39 is quite different and much more nonlinear than in the previous cases. In the fig
ure the convex hull exemplars are clustered between the subspace noise exemplars and the
recognition class exemplars. If this is a general property of the type of data we are using
then it may be a powerful method by which to describe the rejection class within the non-


The j-th element of the vector integration can be written

[κ_b(u)]_j = (1/a^N) ∫_{Ω_U} κ'_j(u − x) dx
           = (1/a^N) [κ_1(u_j + a/2) − κ_1(u_j − a/2)]
             × Π_{i≠j} (1/2) [erf((u_i + a/2)/(√2 σ)) − erf((u_i − a/2)/(√2 σ))],   (119)

where κ_1(t) = (2πσ²)^{-1/2} exp(−t²/(2σ²)) is the one-dimensional gaussian kernel.


Figure 19. Output correlation surface for LMS computed filter from non full rank
data. The filter output is not substantially different from the analytic
solution with full rank data.
Since the system of constraint equations is generally under-determined, there are infinitely
many filters which will satisfy the constraints. There is only one, however, that minimizes
the norm of the filter (the optimization criterion after pre-processing) [Kohonen, 1988].
Figure 21 shows the NMSE between the analytic solution for the filter coefficients and
the iterative¹ method. When the data matrix is full rank the iterative method
approaches the optimal analytic solution, as shown by the solid line in the figure. When
the data matrix is not full rank, as shown by the dashed line in the figure, the error in the
iterative solution approaches a limit.
These qualities of iterative learning methods are important from the ATR perspective.
We see from the example that when the data possesses a quality that would seemingly be
1. In this case iterative refers to the LMS algorithm; within this text it generally refers to
a gradient search algorithm.


Figure 56. Two vehicle experiment. Projection of training (left) and testing (right)
images onto feature space after 150 (top) and 300 (bottom) iterations
for two vehicle class training. Vehicle 1 is indicated by diamond
symbols, while vehicle 2 is indicated by triangles. Each class is
connected in order of aspect angle. It appears in these figures that the
mapping has maintained aspect dependence for each vehicle. At the 300
iteration point some separation of the vehicles is in evidence. In the
bottom left plot, the connecting lines have been removed in order to
better show the class separation which has taken place.


matrix is estimated from observations of noise sequences (assuming wide-sense stationarity
and ergodicity) the MVSDF can also be formulated in the frequency domain, and the
complex matrix inversion is avoided. A derivation of this is given in appendix A;
examination of equations (95), (96), and (97) shows that under the assumption that the
noise class can be modeled as a stationary, ergodic random noise process the solution of
the MVSDF can be found in the spectral domain using the estimated power spectrum of
the noise process and equation (1).
In practice, the MACE filter performs better than the MVSDF with respect to rejecting
out-of-class input images. The MACE filter, however, has been shown to have poor generalization
properties; that is, images in the recognition class but not in the training exemplar
set are not recognized.
A MACE filter was computed using the same exemplar images as in the SDF example.
Figure 8 shows the resulting output image plane response for one image. As can be seen in
the figure, the peak in the center is now highly localized. In fact it can be shown [Mahalanobis
et al., 1987] that over the training exemplars (those used to compute the filter) the
output peak will always be at the constraint location.
Generalization to between-aspect images, as mentioned, is a problem for the MACE
filter. Figure 9 shows the peak output response over all aspect angles. As can be seen in the
figure, the peak response degrades severely for aspects between the exemplars used to
compute the filter. Furthermore, from a peak output response viewpoint, generalization to
vehicle 1b is also worse. However, unlike the previous techniques, we now begin to see
some separation between the two vehicle types as represented by their peak response.


Figure 25. Generalization as measured by the peak response mean square error.
The plot compares the peak response MSE versus classification
performance measures (ROC area and Pfa@Pd=0.8).




Figure 3. MSF peak output response of testing vehicles lb and 2a over all aspect
angles. Responses are overlaid on training vehicle response. Filter
responses to vehicles lb (dashed line) and 2a (dashed-dot) do not differ
significantly.


in which case equations (106) and (107) become

κ(u) = (1/((2π)^{N/2} σ^N)) exp(−u^T u/(2σ²)),   (108)

and

κ'(u) = −(1/((2π)^{N/2} σ^{N+2})) u exp(−u^T u/(2σ²)).   (109)

The convolution of these terms, κ_a(u), is computed as follows

κ_a(u) = κ(u) * κ'(u)
       = ∫ κ(u − x) κ'(x) dx
       = −(1/((2π)^N σ^{2N+2})) ∫ exp(−((u − x)^T(u − x) + x^T x)/(2σ²)) x dx.   (110)

Examination of equation 110 reveals that it contains a vector term. The convolution of the
scalar kernel expression with the vector gradient expression is carried out with each element
of the gradient vector. Substituting the elements of the vectors u and x into equation 110,
where

u = [u_1, ..., u_N]^T

and

x = [x_1, ..., x_N]^T,


The output image plane response to a single image of vehicle 1a is shown in figure 4.
Refinements to the distortion invariant filter approach, namely the MACE filter, will show
that the localization of this output response, as measured by the sharpness of the peak, can
be improved significantly.
2.1.1 Synthetic Discriminant Function
The degradation evidenced in figures 2 and 3 was the primary motivation for the synthetic
discriminant function (SDF) [Hester and Casasent, 1980]. A shortcoming of the
MSF, from the standpoint of distortion invariant filtering, is that it is only optimum for a
single image. One approach would be to design a bank of MSFs operating in parallel,
matched to the distortion range. The typical ATR system, however, must recognize/discriminate
multiple vehicle types, and so from an implementation standpoint
alone a bank of parallel MSFs is an impractical choice. Hester and Casasent set out to design a sin-


ative methods. The smaller the dimension of the subspace in which the recognition class
lies, the better we can discriminate images considered to be out of the class. One limitation
of the analytic solutions of distortion invariant filters is that they require the inverse of a
matrix of the form

x^T Q x,   (40)

where Q is a positive definite matrix representing a quadratic optimization criterion. If the
matrix x is not full column rank there is no inverse for the matrix of (40) and consequently
no analytic solution for any of the distortion invariant filters. The LMS algorithm,
however, will still find a best fit to the design goal, which is to minimize the criterion while
satisfying the linear constraints.
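The behavior described here can be sketched numerically (an illustration with random stand-in data, not the dissertation's experiment): even when the data matrix is made rank deficient, a simple LMS iteration on the constraint error still converges to a filter that satisfies the (consistent) constraints.

```python
import numpy as np

rng = np.random.default_rng(3)
N, Nt = 20, 6
X = rng.standard_normal((N, Nt))
c = rng.random(Nt - 1)
c /= c.sum()
X[:, -1] = X[:, :-1] @ c          # make the data matrix rank deficient
d = np.ones(Nt)                   # consistent constraints (since c sums to one)

# LMS iteration on the constraint error d - X^T h: a sketch of the iterative
# method discussed in the text, not the dissertation's exact setup.
h = np.zeros(N)
mu = 0.02
for _ in range(20000):
    k = rng.integers(Nt)          # present one constraint at random
    e = d[k] - X[:, k] @ h        # instantaneous error at that constraint
    h += mu * e * X[:, k]         # LMS update

print(np.max(np.abs(X.T @ h - d)))   # near zero despite the rank deficiency
```

Because every update is a linear combination of the data columns, the zero-initialized iterate stays in the span of the data, which is why it approaches the minimum-norm solution in the full-rank case.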
We can illustrate this by modifying the data from the experiments in section 2.1. It is
well known that the data matrix x can be decomposed using the singular value decomposition
(SVD) as

x = U Λ V^T,

where the columns of U ∈ ℝ^{N × N} form an ortho-normal basis (the principal components
of the vectors x_i in fact), the diagonal matrix Λ ∈ ℝ^{N × N_t} contains the singular values of
the data matrix, and V ∈ ℝ^{N_t × N_t} is unitary. The columns of the data matrix can be projected
onto a subspace by setting one of the diagonal elements of Λ to zero. The importance
of any of the basis vectors in U is directly proportional to the singular value. In this
case N_t = 21, so we can choose one of the smaller singular values to set to zero without


The magnitude of the gradient is

|∂κ_a(u)/∂u| = (1/(2^{N+1} π^{N/2} σ^{N+2})) exp(−u^T u/(4σ²)) |I − (1/(2σ²)) u u^T|.   (114)

Evaluation of the magnitude at |u| = 0 gives

|∂κ_a(u)/∂u| = 1/(2^{N+1} π^{N/2} σ^{N+2}).   (115)

A.4 Convolution of the Uniform Distribution Function with the Gradient of the Gaussian Kernel

The uniform distribution function has the following form

f_U(u) = Π_i (b_i − a_i)^{-1} ; a_i ≤ u_i ≤ b_i for all i
       = 0 ; otherwise.   (116)

The convolution of the uniform distribution function with the gradient of the gaussian kernel
can be written as

κ_b(u) = ∫_{Ω_U} f_U(u − x) κ'(x) dx,

which is an N-fold integral over the region of support, Ω_U, of the uniform distribution
function. We are interested in the result of this vector integration when the kernel gradient
term has the same form as in the previous section and the uniform distribution function is
such that

b_i = −a_i = a/2.


the system of constraints with equality and has minimum norm is also the linear projection
which minimizes the response to images with a flat power spectrum. This solution is
arrived at naturally via a gradient search which only considers the response at the
constraint location.
This is no longer the case when the mapping is nonlinear. Adapting the parameters via
gradient search (such as error backpropagation) on recognition class exemplars only at the
constraint location will not, in general, minimize the variance over the entire output image
plane. In order to minimize the variance over the entire output plane we must consider the
response of the filter to each location in the input image, not just the constraint location.
The MACE filter optimization criterion minimizes, in the average, the response to all
images with the same second order statistics as the rejection class. At the output of the pre-whitener
(prior to the MLP) any white sequence will have the same second order statistics
as the rejection class. This condition can be exploited to make the training of the MLP
more efficient.
From an implementation standpoint, the pre-whitening stage and the input layer
weights can be combined into a single equivalent linear transformation; however, pre-whitening
separately allows the rejection class to be represented by white sequences at the
input to the MLP during the training phase.
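The training shortcut described here can be outlined as follows (an illustrative sketch, not the dissertation's code: the tiny one-hidden-layer MLP, its sizes, and the shifted stand-in exemplars are all hypothetical). Each gradient step mixes a recognition exemplar (desired output 1 at the constraint location) with a freshly drawn white sequence standing in for the rejection class (desired output 0):

```python
import numpy as np

rng = np.random.default_rng(4)
D = 64                                   # pre-whitened image dimension (stand-in)

# Hypothetical one-hidden-layer MLP with sigmoidal nonlinearities
W1 = 0.1 * rng.standard_normal((8, D))
w2 = 0.1 * rng.standard_normal(8)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))
forward = lambda x: sig(w2 @ sig(W1 @ x))

def train_step(x, target, lr=0.1):
    """One backpropagation step on squared error for a single exemplar."""
    global W1, w2
    h = sig(W1 @ x)
    y = sig(w2 @ h)
    e = (y - target) * y * (1 - y)       # output-layer delta
    grad_w2 = e * h
    grad_W1 = np.outer(e * w2 * h * (1 - h), x)
    w2 -= lr * grad_w2
    W1 -= lr * grad_W1

recog = [rng.standard_normal(D) + 3.0 for _ in range(5)]   # stand-in exemplars
for _ in range(2000):
    train_step(recog[rng.integers(5)], 1.0)   # recognition class, output 1
    train_step(rng.standard_normal(D), 0.0)   # fresh white noise, output 0
```

The point of the construction is that no stored rejection exemplars (and no exhaustive sweep over output-plane shifts) are needed; white sequences are drawn on the fly.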
This result is due to the statistical formulation of the optimization criterion. Minimizing
the response to white sequences, in the average, minimizes the response to shifts of the
exemplar images since they have the same second-order statistics (after pre-whitening).
Consequently, we do not have to train over the entire output plane exhaustively, thereby
reducing training time in proportion to the input image size, N_1 N_2. Instead, we use a



Figure 39. Experiment IV: resulting feature space from convex hull training. In
the figure, symbols are as before. In the top figure, one difference to
note is that the convex hull exemplars (indicated by the arrow) are closer
to the discriminant boundary and play a greater role in determining the
shape of the function.


Figure 61. Feedback functions (plotted against x/σ) for the implicit error term:
undershoot condition (top), slope normalized (middle), overshoot (bottom).


Page
24 Generalization as measured by the minimum peak response 62
25 Generalization as measured by the peak response mean square error 63
26 Comparison of ROC curves 64
27 ROC performance measures versus α 66
28 Peak output response of linear and nonlinear filters over the training set 77
29 Output response of linear filter (top) and nonlinear filter (bottom) 78
30 ROC curves for linear filter (solid line) versus nonlinear filter (dashed line) 79
31 Experiment I: Resulting feature space from simple noise training 80
32 Experiment II: Resulting feature space when orthogonality is imposed on the input layer of the MLP 83
33 Experiment II: Resulting ROC curve with orthogonality constraint 84
34 Experiment II: Output response to an image from the recognition class training set 85
35 Experiment III: Resulting feature space when the subspace noise is used for training 88
36 Experiment III: Resulting ROC curve for subspace noise training 89
37 Experiment III: Output response to an image from the recognition class training set 90
38 Learning curves for three methods 90
39 Experiment IV: resulting feature space from convex hull training 94
40 Experiment IV: Resulting ROC curve with convex hull approach 95
41 Classical pattern classification decomposition 100
42 Decomposition of NL-MACE as a cascade of feature extraction followed by discrimination 100
43 Mutual information approach to feature extraction 106
44 Mapping as feature extraction. Information content is measured in the low dimensional space of the observed output 108
45 A signal flow diagram of the learning algorithm 114
46 Gradient of two-dimensional gaussian kernel. The kernels act as attractors to low points in the observed PDF on the data when entropy maximization is desired 117
47 Mixture of gaussians example 118
48 Mixture of gaussians example, entropy minimization and maximization 119
49 PCA vs. Entropy gaussian case 120
50 PCA vs. Entropy non-gaussian case 122
51 PCA vs. Entropy non-gaussian case 123


linear framework. More analysis is needed, however, before this conclusion can be made.
We do conclude that in this case this method is an effective means by which to characterize
the rejection class. The advantage of this technique over the linear method of Kumar
et al. [1994] is that the training learns to separate the recognition class exemplars
from the convex hull exemplars automatically, as opposed to assigning a priori a complex desired
output for each exemplar.
There were, however, some difficulties with this technique which are worth mentioning.
Recall that the motivation for using orthogonalization in the input layer was to
increase the likelihood that a nonlinear discriminant function was found. When using convex
hull exemplars in the rejection class, this may seem unnecessary. In practice, however,
it was found that when the orthogonalization was removed, training times became
extremely long. Even with orthogonalization we can see from the learning curve (solid
line) in figure 38 that convergence took over an order of magnitude longer than in experiment
III.
There were also stability issues with this type of training. The training became
unstable nearly as often as it converged. However, when the training did converge, as in
the results shown, the classification performance was always superior. Convergence, or its
lack, can be directly measured from the MSE. When convergence was not reached in a
suitable number of iterations (typically 1000 epochs), the algorithm was restarted with a
new random parameter initialization. Due to the improved classification results, we
believe that this method bears further study.


tion 59. Inspection of equation 59 shows that entropy can be seen as the negative of the
expected value of the log of the probability density function, or

h(x) = −E{log(f_X(x))}.   (60)
Several properties of the entropy measure are of interest.
1. If the RV is restricted to a finite range in ℝ^N, entropy is maximized for the uniform
distribution.
2. If the diagonal elements of the covariance matrix are held constant, entropy is maximized
for the normal distribution with diagonal covariance matrix.
3. If the RV is transformed by a mapping g: ℝ^N → ℝ^N, then the entropy of the new RV,
y = g(x), satisfies the inequality

h(y) ≤ h(x) + E{log|J_XY|},   (61)

with equality if and only if the mapping has a unique inverse, where J_XY is the Jacobian
of the mapping from X to Y.
Regarding the first two properties we note that in both instances each element of the
RV is statistically independent of the other elements. We will make use of the first
property in the method presented here.
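Property 3 can be checked numerically for an invertible linear map, where the Jacobian is constant and the bound holds with equality: h(y) = h(x) + log|det A|. The sketch below (an illustration, not from the dissertation) compares Monte Carlo entropy estimates of a gaussian RV before and after such a transformation:

```python
import numpy as np

rng = np.random.default_rng(5)
N = 2
A = np.array([[2.0, 0.5], [0.0, 1.5]])     # invertible linear map, det A = 3
x = rng.standard_normal((100000, N))       # samples of X ~ N(0, I)
y = x @ A.T                                # samples of Y = A X

def gauss_entropy_mc(s, C):
    """Monte Carlo estimate of h = -E[log f(s)] for a zero-mean gaussian, covariance C."""
    Ci = np.linalg.inv(C)
    q = np.einsum('ij,jk,ik->i', s, Ci, s)
    logf = -0.5 * (N * np.log(2 * np.pi) + np.log(np.linalg.det(C)) + q)
    return -logf.mean()

hx = gauss_entropy_mc(x, np.eye(N))
hy = gauss_entropy_mc(y, A @ A.T)
print(hy - hx, np.log(abs(np.linalg.det(A))))   # both equal log 3 (about 1.0986)
```

The entropy difference equals log|det A| sample-for-sample here, which is the continuous-RV effect the surrounding text describes: transforming a RV can change (here, increase) its differential entropy.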
Equation 61 implies that by transforming a RV we can increase the amount of information
that it conveys; that is, the RV Y derived from the RV X can have more information
than X. This is a consequence of working with continuous RVs. In general the continuous
entropy measure is used to compare the relative entropies of several RVs. We can see from
equation 61 that if two RVs are mapped by the same invertible linear transformation their
relative entropies (as measured by the difference) remain unchanged. However, if the


and consequently

= P_1 ∫ f(x, α) (∂/∂α) f(x, α) p(x|C_1) dx
  − P_2 ∫ (1 − f(x, α)) (∂/∂α) f(x, α) p(x|C_2) dx
= ∫ (f(x, α)(P_1 p(x|C_1) + P_2 p(x|C_2)) − P_2 p(x|C_2)) (∂/∂α) f(x, α) dx
= ∫ (f(x, α) p_X(x) − P_2 p(x|C_2)) (∂/∂α) f(x, α) dx.   (13)

Examination of equation 13 allows for two possibilities for a stationary point of the criterion.
The first, as before, is that

f(x, α) = P_2 p(x|C_2) / p_X(x)
        = P(C_2|x),   (14)
while the second is if we are near a local minimum with respect to α. In other words, if the
parameterized function can realize the Bayes discriminant function via an appropriate
choice of its parameters, then this function represents a global minimum, but this does not
discount the fact that there may be local minima. Furthermore, if the parameterized function
is not capable of representing the Bayes discriminant function there is no guarantee
that the global (or local) minimum will result in robust classification.
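The stationary point in (14) — minimizing the squared-error criterion drives f toward the posterior P(C_2|x) — can be illustrated with a one-dimensional sketch. Everything here is a stand-in: equal-prior gaussian class densities N(±1, 1) and a table-lookup "parameterization" (one free value per histogram bin) flexible enough to reach the Bayes solution:

```python
import numpy as np

rng = np.random.default_rng(6)
P1, P2 = 0.5, 0.5
n = 200000
labels = (rng.random(n) < P2).astype(float)          # 1.0 -> class C2
x = np.where(labels > 0, rng.normal(1.0, 1.0, n), rng.normal(-1.0, 1.0, n))

# Flexible "parameterization" of f: one free value per histogram bin
edges = np.linspace(-4.0, 4.0, 41)
bins = np.clip(np.digitize(x, edges) - 1, 0, 39)
f = np.full(40, 0.5)
lr = 0.01
for i in range(n):                                   # SGD on the squared error
    b = bins[i]
    f[b] -= lr * (f[b] - labels[i])

# Analytic Bayes posterior P(C2|x) at the bin centers (equal-variance gaussians)
c = 0.5 * (edges[:-1] + edges[1:])
g = lambda m: np.exp(-0.5 * (c - m) ** 2)
post = P2 * g(1.0) / (P1 * g(-1.0) + P2 * g(1.0))

mask = (c > -2) & (c < 2)                            # well-sampled bins
print(np.max(np.abs(f - post)[mask]))                # small: f approaches P(C2|x)
```

The learned table converges toward the posterior in every well-sampled bin, exactly the global-minimum behavior (14) predicts when the parameterization is rich enough.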


ulated explicitly. Their approach is similar to the approach in communications theory,
wherein the communications channel (or mapping) is assumed to be fixed. Mutual information
is then used to estimate the source of the observations. A significant difference in
the method presented here is that the mapping (or communications channel) is not
assumed to be fixed; rather, it is parameterized and we are free to choose the parameters in
order to manipulate entropy.
5.3.4 Nonparametric PDF Estimation
One obstacle to using mutual information as the figure of merit is that it is an integral
function of the PDF of a continuous random variable. Since we cannot work with the PDF
directly (unless assumptions are made about its form), we rely on nonparametric estimates.
Nonparametric density estimation in a high-dimensional space is an ill-posed problem.
The approach described here, however, relies on such estimates in the output space,
as depicted in figure 44, where the dimensionality is under the control of the designer.
Figure 44. Mapping as feature extraction. Information content is measured in the low
dimensional space of the observed output and used to adapt the
parameters of the mapping.
The Parzen window method [Parzen, 1962], which we will use, is a nonparametric
kernel-based method for estimating probability density functions. The Parzen window
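The Parzen window estimate referenced here can be sketched in a few lines (an illustration, not the dissertation's implementation): the estimated density is the average of gaussian kernels of width σ centered at the observed samples.

```python
import numpy as np

def parzen_pdf(u, samples, sigma):
    """Parzen window estimate of a PDF at points u from observed samples,
    using an isotropic gaussian kernel of width sigma."""
    u = np.atleast_2d(u)           # (M, N) evaluation points
    s = np.atleast_2d(samples)     # (K, N) observations
    N = s.shape[1]
    d2 = ((u[:, None, :] - s[None, :, :]) ** 2).sum(-1)   # squared distances
    k = np.exp(-d2 / (2 * sigma**2)) / ((2 * np.pi) ** (N / 2) * sigma**N)
    return k.mean(axis=1)          # average kernel over the samples

rng = np.random.default_rng(7)
samples = rng.standard_normal((5000, 1))        # draws from N(0, 1)
est = parzen_pdf(np.array([[0.0]]), samples, sigma=0.25)
print(est)   # near 0.39 (a smoothed estimate of 1/sqrt(2*pi) = 0.399)
```

Because the estimate is differentiable in both the evaluation point and the samples, it is well suited to the gradient-based entropy manipulation developed in the following sections.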


Fisher J., and Principe, J. C. (1995b); A nonlinear extension of the MACE filter,
Neural Networks: Special Issue on Neural Networks for Automatic Target Recognition, 8
(7): 1131-1141.
Fisher J., and Principe, J. C. (1995c); Unsupervised learning for nonlinear synthetic
discriminant functions, Proceedings of SPIE, 2752: 1-13.
Funahashi, K. (1989); On the approximate realization of continuous mappings by
neural networks, Neural Networks 2 (3): 183-192.
Fukunaga, K. (1990); Statistical Pattern Recognition, 2nd ed., Harcourt Brace Jovanovich,
Cambridge, Massachusetts.
Gerbrands, J. (1981); On the relationships between SVD, KLT, and PCA, Pattern
Recognition, 14: 375-381.
Gheen, G. (1990); Design considerations for low-clutter, distortion-invariant correlation
filters, Optical Engineering, 29 (9): 1029-1032.
Hardle, W. (1990); Applied Nonparametric Regression, Cambridge University Press,
New York.
Haykin, Simon (1994); Neural Networks: A Comprehensive Foundation, IEEE Press,
Macmillan, New York.
Hebb, D. (1949); The Organization of Behavior: A Neuropsychological Theory, Wiley,
New York.
Hester, C. F., and D. Casasent (1980); Multivariant technique for multiclass pattern
recognition, Appl. Opt. 19: 1758-1761.
Hinton, G. E., and J. A. Anderson, Ed. (1981); Parallel Models of Associative Memory,
Lawrence Erlbaum Associates, New Jersey.
Hobson, A. (1969); A new theorem of information theory, J. Stat. Phys., 3: 383-391.
Kapur, J. N., and H. K. Kesavan (1992); Entropy Optimization Principles with Applications,
Academic Press, Boston, Massachusetts.
Khinchin, I. A. (1957); Mathematical Foundations of Information Theory, Dover Publications,
New York.
Kohonen, T. (1988); Self-Organization and Associative Memory (1st ed.), Springer
Series in Information Sciences, 8; Springer-Verlag.
Kohonen, T. (1995); Self-Organizing Maps, Springer Series in Information Sciences,
30; Springer-Verlag, New York.


we can rewrite the j-th element of the vector integral as

[κ_a(u)]_j = −(1/((2π)^N σ^{2N+2})) exp(−u^T u/(4σ²))
             × Π_{i≠j} [ ∫ exp(−(x_i − u_i/2)²/σ²) dx_i ] × ∫ exp(−(x_j − u_j/2)²/σ²) x_j dx_j
           = −(1/((2π)^N σ^{2N+2})) exp(−u^T u/(4σ²)) (σ√π)^{N−1} (σ√π)(u_j/2)
           = −(u_j/(2^{N+1} π^{N/2} σ^{N+2})) exp(−u^T u/(4σ²)),   (111)

which, as our final result, can be converted back to vector form as

κ_a(u) = κ(u) * κ'(u)
       = −(1/(2^{N+1} π^{N/2} σ^{N+2})) exp(−u^T u/(4σ²)) u
       = −(1/(2^{3N/4+1} π^{N/4} σ^{N/2+2})) κ(u)^{1/2} u.   (112)

Also of interest is the gradient (specifically the magnitude) of equation 112, which has
the form

∂κ_a(u)/∂u = −(1/(2^{N+1} π^{N/2} σ^{N+2})) exp(−u^T u/(4σ²)) (I − (1/(2σ²)) u u^T).   (113)
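The closed form in (112) can be spot-checked numerically in one dimension (N = 1), where κ_a is simply the convolution of the gaussian kernel with its derivative (a verification sketch, not part of the dissertation):

```python
import numpy as np

sigma = 0.7
N = 1
t = np.linspace(-8, 8, 4001)
dt = t[1] - t[0]

kappa = np.exp(-t**2 / (2 * sigma**2)) / ((2 * np.pi) ** (N / 2) * sigma**N)
kappa_p = -t * kappa / sigma**2                 # gradient of the kernel

# Numerical convolution kappa * kappa' on the grid
conv = np.convolve(kappa, kappa_p, mode='same') * dt

# Closed form of equation (112) for N = 1
closed = -t * np.exp(-t**2 / (4 * sigma**2)) / (2**(N + 1) * np.pi**(N / 2) * sigma**(N + 2))

print(np.max(np.abs(conv - closed)))   # small discretization error
```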


points and the constraints of the desired output distribution. Furthermore, if the mapping
topology is matched to the desired output distribution, the evaluation of the topology
regulating term can be further simplified. The final algorithm complexity has been reduced
substantially as
(86)
5.9 Conversion of Implicit Error Direction to an Explicit Error
In the previous section we derived a method which greatly simplified the complexity
of the error direction computation. In the process, manipulation of a global property,
entropy, was seen to be a process of local attraction/repulsion of the individual observations
in the output space. This idea of maximizing and minimizing entropy, and ultimately
mutual information, through local interactions can be further extended such that the computed
error direction can be converted into an implicit desired signal. That is, we can go
from an unsupervised learning algorithm to one which is supervised, in a step-wise fashion.
The resulting simplification to the algorithm is that we no longer need to estimate the
error direction for every gradient step.
5.9.1 Entropy Minimization as Attraction to a Point
We begin with entropy minimization, which is modeled as local attraction between the
data points. In figure 48, the bottom plot indicates that the points are attracted to the center
of the observed distribution modes, with the degree of attraction being stronger for the
larger mode. As we have stated, however, the influence function is in reality a direction. If
a proper scale factor can be found then the error direction can be equated to an actual error
and a desired signal.


As compared to the MACE filter, the peak response is improved over the testing set. Separation
between the two vehicle types appears to be maintained.
2.2 Pre-processor/SDF Decomposition
In the sample domain, the SDF family of correlation filters is equivalent to a cascade
of a linear pre-processor followed by a linear correlator [Mahalanobis et al., 1987; Kumar,
1992]. This is illustrated in figure 14 with vector operations. The pre-processor, in the case
of the MACE filter, is a pre-whitening filter computed on the basis of the average power
spectrum of the recognition class training exemplars. In the case of the MVSDF the pre-processor
is a pre-whitening filter computed on the basis of the covariance matrix of the
noise. The net result is that after pre-processing, the second processor is an SDF computed
over the pre-processed exemplars.


of α. It is clear from the plots that there is a positive relationship between the emphasis on
the MACE criterion and the ROC performance. However, the peak in ROC performance is
not achieved at α equal to unity. In all three cases, the ROC performance peaks just prior
to unity, with the performance drop-off at α equal to unity increasing with aspect
separation.
The difference between the SDF and MACE filter is the pre-processor. What is shown
by this analysis is that, in general, the pre-processor from the MACE filter criterion leads
to better classification, but too much emphasis on the MACE filter criterion, as measured
by α equal to unity, leads to a filter which is too specific to the training samples. The
problems described above are well known. Alterations to the MACE criterion have been
the subject of many researchers [Casasent et al., 1991; Casasent and Ravichandran, 1992;
Ravichandran and Casasent, 1992; Mahalanobis et al., 1994a]. There is still, as yet, no
principled method found in the literature by which to set the parameter α.
There are two conclusions from this analysis that are pertinent to the nonlinear extension
we are using. First, the results show that pre-whitening over the recognition class
leads to better classification performance. For this reason we choose to use the pre-processor
of the MACE filter in our nonlinear filter architecture. The issue of extending the
MACE filter to nonlinear systems can in this way be formulated as a search for a more
robust nonlinear discriminant function in the pre-whitened image space.
The second conclusion is that comparisons of the nonlinear filter to its linear counterpart
must be made in terms of classification performance only. There are simple nonlinear
systems, such as a soft threshold at the output of a linear system for example, that will out-


suitable here. The computational complexity of the estimator increases with dimension;
however, as we will be estimating the PDF in the output space of our mapping, the dimensionality
can be controlled.
5.4 Derivation Of The Learning Algorithm
Our goal is to find features that convey maximum information about the input class.
How do we adapt the parameters α of a mapping such that this is the case? We now show
how the Parzen window density estimator coupled with a property of entropy can be used
to accomplish this goal.
Consider the mapping g: ℝ^N → ℝ^M described by the following equation

Y = g(X, α).   (68)

If the mapping is nonlinear we can exploit the following property of entropy. If a random
variable has finite region of support, entropy is maximized for the uniform distribution.
The Parzen window estimator, coupled with a mapping with finite region of support
at the output (e.g. an MLP with sigmoidal nonlinearities), can be used to minimize or maximize
the distance between the observed distribution and the desired distribution. Furthermore,
if the region of support is a hypercube, as is the case for the MLP using
sigmoidal nonlinearities, the features are statistically independent when entropy is maximized.
Considering equation 68, the method of Viola et al. estimates the value of the input
parameters, X, rather than the parameters of the mapping, α. The goals are very different.


Figure 31. Experiment I: Resulting feature space from simple noise training.
Note that all points are projected onto a single curve in the feature
space. In the top figure squares are the recognition class training
exemplars, triangles are white noise rejection class exemplars, and
plus signs are the images of vehicle 1a not used for training. In the
bottom figure, squares are the peak responses from vehicles 1b and 1c,
triangles are the peak responses from vehicles 2a and 2b.


3.3.1 Pre-processor/SDF Decomposition
As observed by Mahalanobis [1987], the MACE filter can be decomposed as a synthetic
discriminant function preceded by a pre-whitening filter. Let the matrix
B = D^{-1/2}, where B is diagonal with diagonal elements equal to the inverse of the
square root of the diagonal elements of D. We implicitly assume that the diagonal elements
of D are non-zero; consequently B†B = D^{-1} and B† = B. Equation (24) can
then be rewritten as

H = B(BX)((BX)†(BX))^{-1}d.   (25)

Substituting Y = BX, representing the original exemplars preprocessed in the spectral
domain by the matrix B, equation (25) can be written

H = BY(Y†Y)^{-1}d.   (26)

The term H' = Y(Y†Y)^{-1}d is recognized as the SDF computed from the preprocessed
exemplars Y. The MACE filter solution can therefore be written as a cascade of a pre-whitener
(over the average power spectrum of the exemplars) followed by a synthetic discriminant
function, depicted in figure 16, as

H = BH'.   (27)

1. If the DFT were as defined in [Oppenheim and Schafer, 1989] then a scale factor of
N_1 N_2 would be necessary.
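Equations (25)–(27) translate directly into a few lines of numerical code. The sketch below builds a toy spectral-domain MACE filter both directly and via the pre-whitener/SDF cascade and checks that they agree (random stand-in data, not the dissertation's experiments):

```python
import numpy as np

rng = np.random.default_rng(8)
P, Nt = 64, 5                                   # DFT points and exemplar count
X = rng.standard_normal((P, Nt)) + 1j * rng.standard_normal((P, Nt))
d = np.ones(Nt)

# Diagonal of D: average magnitude-squared DFT coefficients of the exemplars
Ddiag = (np.abs(X) ** 2).mean(axis=1)

# Direct MACE solution: H = D^{-1} X (X^† D^{-1} X)^{-1} d
Xi = X / Ddiag[:, None]
H_direct = Xi @ np.linalg.solve(X.conj().T @ Xi, d)

# Cascade of equations (25)-(27): pre-whiten with B = D^{-1/2},
# build an SDF on Y = BX, then H = B H'
B = 1.0 / np.sqrt(Ddiag)
Y = B[:, None] * X
H_sdf = Y @ np.linalg.solve(Y.conj().T @ Y, d)   # H' = Y (Y^† Y)^{-1} d
H_cascade = B * H_sdf

print(np.allclose(H_direct, H_cascade))          # True
```

The agreement follows from B†B = D^{-1}; the cascade is exactly the factorization the text describes, which is what allows the SDF stage to be replaced by a nonlinear processor later in the dissertation.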


LIST OF FIGURES
1 ISAR images of two vehicle types 9
2 MSF peak output response of training vehicle 1a over all aspect angles 10
3 MSF peak output response of testing vehicles 1b and 2a over all aspect angles 11
4 MSF output image plane response 12
5 SDF peak output response of training vehicle 1a over all aspect angles 15
6 SDF peak output response of testing vehicles 1b and 2a over all aspect angles 16
7 SDF output image plane response 17
8 MACE filter output image plane response 20
9 MACE peak output response of vehicles 1a, 1b and 2a over all aspect angles 21
10 Example of a typical OTSDF performance plot 23
11 OTSDF filter output image plane response 24
12 OTSDF peak output response of vehicle 1a over all aspect angles 25
13 OTSDF peak output response of vehicles 1b and 2a over all aspect angles 26
14 Decomposition of distortion invariant filter in space domain 26
15 Adaline architecture 28
16 Decomposition of MACE filter as a preprocessor (i.e. a pre-whitening filter over the average power spectrum of the exemplars) followed by a synthetic discriminant function 39
17 Decomposition of MACE filter as a preprocessor (i.e. a pre-whitening filter over the average power spectrum of the exemplars) followed by a linear associative memory 43
18 Peak output response over all aspects of vehicle 1a when the data matrix is not full rank 47
19 Output correlation surface for LMS computed filter from non full rank data 48
20 Learning curve for LMS approach 49
21 NMSE between closed form solution and iterative solution 50
22 Decomposition of optimized correlator as a pre-processor followed by SDF/LAM (top). Nonlinear variation shown with MLP replacing SDF in signal flow (middle), detail of the MLP (bottom). The linear transformation represents the space domain equivalent of the spectral pre-processor 54
23 ISAR images of two vehicle types shown at aspect angles of 5, 45, and 85 degrees respectively 59
v


157
where {x, X, φ} ∈ C^{N×1} are complex column vectors. The DFT relationship between
the estimated autocorrelation sequence R̂_x(m) and the periodogram of the observed
sequence, P_x(k) = |X(k)|², is written

    [R̂_x(0), R̂_x(1), …, R̂_x(N−1)]^T = (1/√N) Φ^† [P_x(0), P_x(1), …, P_x(N−1)]^T.
The covariance matrix of a zero-mean, complex, wide-sense stationary process x(n) can
be estimated from N observations of the process as

    Σ_x = E(xx^†)

        = [ E(x*(n)x(n))        E(x*(n+1)x(n))       …  E(x*(n+N−1)x(n))
            E(x*(n)x(n+1))      E(x*(n+1)x(n+1))     …  E(x*(n+N−1)x(n+1))
            ⋮                   ⋮                        ⋮
            E(x*(n)x(n+N−1))    E(x*(n+1)x(n+N−1))   …  E(x*(n+N−1)x(n+N−1)) ]

        = [ R_x(0)     R_x(−1)    …  R_x(−N+1)
            R_x(1)     R_x(0)     …  R_x(−N+2)
            ⋮          ⋮              ⋮
            R_x(N−1)   R_x(N−2)   …  R_x(0)   ].
Replacing the elements of the covariance matrix with the autocorrelation sequence
estimates and applying the unitary DFT matrices yields

    Φ Σ̂_x Φ^† = Φ [ R̂_x(0)     R̂_x(−1)    …  R̂_x(−N+1)
                     R̂_x(1)     R̂_x(0)     …  R̂_x(−N+2)
                     ⋮           ⋮               ⋮
                     R̂_x(N−1)   R̂_x(N−2)   …  R̂_x(0)   ] Φ^†.
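The autocorrelation/periodogram relationship can be checked numerically. The sketch below uses numpy's standard (non-unitary) FFT, for which the inverse DFT of the periodogram returns N times the circular autocorrelation estimate; the bookkeeping differs from the unitary convention used in the text by factors of √N:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 16
x = rng.standard_normal(N) + 1j * rng.standard_normal(N)

# Circular autocorrelation estimate: R(m) = (1/N) sum_n x*(n) x((n+m) mod N)
R = np.array([np.sum(x.conj() * np.roll(x, -m)) for m in range(N)]) / N

# Periodogram from numpy's standard (non-unitary) DFT
P = np.abs(np.fft.fft(x))**2

# The inverse DFT of the periodogram recovers N times the estimate
assert np.allclose(np.fft.ifft(P) / N, R)
```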


37
domain filter h reordered lexicographically. Let the columns of the data matrix
X ∈ C^{N_1N_2 × N_t} contain the 2-D DFT coefficients of the exemplars {x_1, …, x_{N_t}}, also
reordered into column vectors. The diagonal matrix D_i ∈ ℜ^{N_1N_2 × N_1N_2} contains the
magnitude squared of the 2-D DFT coefficients of the ith exemplar. These matrices are
averaged to form the diagonal matrix D as

    D = (1/N_t) Σ_{i=1}^{N_t} D_i,    (21)

which then contains the average power spectrum of the training exemplars. Minimizing
equation (19) subject to the constraints of equation (20) is equivalent to minimizing

    H^†DH,    (22)

subject to the linear constraints

    X^†H = d,    (23)

where the elements of d ∈ ℜ^{N_t × 1} are the desired outputs corresponding to the exemplars.
The solution to this optimization problem can be found using the method of Lagrange
multipliers. In the spectral domain, the filter that satisfies the constraints of equation (20)
and minimizes the criterion of equation (19) [Mahalanobis et al., 1987; Kumar, 1992] is

    H = D^{-1}X(X^†D^{-1}X)^{-1}d,    (24)

where H ∈ C^{N_1N_2 × 1} contains the 2-D DFT coefficients of the filter, assuming a unitary 2-
D DFT.1
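As a computational sketch (random arrays standing in for exemplar images; sizes hypothetical), the diagonal structure of D lets equation (24) be evaluated without ever forming an N_1N_2 × N_1N_2 matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
N1, N2, Nt = 8, 8, 3               # hypothetical image size and exemplar count

# Columns of X hold the unitary 2-D DFT coefficients of each (random) exemplar
imgs = rng.standard_normal((Nt, N1, N2))
X = np.stack([np.fft.fft2(im, norm='ortho').flatten() for im in imgs], axis=1)

# Diagonal of D: the average power spectrum, equation (21)
Ddiag = np.mean(np.abs(X)**2, axis=1)

# Equation (24), exploiting the diagonal structure of D (no large matrix formed)
d = np.ones(Nt)
DinvX = X / Ddiag[:, None]
H = DinvX @ np.linalg.solve(X.conj().T @ DinvX, d)

# The linear constraints of equation (23) are met exactly
assert np.allclose(X.conj().T @ H, d)
```

Only an N_t × N_t system is solved; the rest is element-wise work on the spectra.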


107
1. the form of entropy used is discrete whereas we are working with continuous entropy,
and
2. the mapping from the input to the output is constrained to be linear, whereas the method
presented here may be used with arbitrary nonlinear maps (so long as they are
differentiable).
Deco and Obradovic [1996] have also presented extensive results on information
theoretic approaches to neural processing. The techniques described differ from Linsker's
method in that they work with the continuous form of entropy and use nonlinear
mappings. The constraint, however, is that the mapping be symplectic (volume preserving) and
bijective. These constraints restrict the method to a subset of the mappings which are
ℜ^N → ℜ^N. As we have stated, from a theoretical point of view, such mappings in no way
increase our ability to discriminate classes. Furthermore, it is our implicit assumption that
dimensionality reduction is one of the motivating factors for feature extraction prior to
classification. Deco and Obradovic also show that if the mapping function is chosen to be
linear in its parameters, very little can be done to manipulate the information at the output
of the mapping without prior knowledge of the input PDF.
Bell and Sejnowski [1995] present yet another approach to information theoretic
mappings. Their technique is applicable to subspace projections. It is limited in that it
manipulates entropy only if the underlying distribution in the input space is uni-modal.
Furthermore, it is restricted to nonlinear MLP architectures of a single layer. The method
we present here has neither of these restrictions.
Viola et al. [1996] have taken a similar approach to the method presented here for
entropy manipulation. The work of Viola et al. differs in that it does not address arbitrary
nonlinear mappings directly, the gradient is estimated stochastically, and entropy is manip-


86
N_t < N_1N_2 exemplars placed in the data matrix X ∈ ℜ^{N_1N_2 × N_t}. It is well known that
the data matrix, if it is full rank, can be decomposed with the SVD as

    X = UΛV^T,    (53)

where the columns of U form an orthonormal basis for the column space
of the data matrix, Λ are the singular values, and V is an orthogonal matrix. This
decomposition has many well known properties, including compactness of representation for the
columns of the data matrix [Gerbrands, 1981]. Indeed, as has been noted by Gheen [1990],
the SDF can be written as a function of the SVD of the data matrix:

    h_SDF = UΛ^{-1}V^T d.    (54)

We will use this recognition class representation to further refine our description of the
rejection class for training. As we stated, the underlying assumption in a data driven
method is that the data matrix conveys information about the recognition class; any
information about the recognition class outside the space of the data matrix is not attainable
from this perspective. The information certainly exists, but there is no mechanism by
which to include it in the determination of the discriminant function within this
framework. This does, however, lead to a more efficient description of the rejection class. We can
modify our optimization criterion to reduce the response to white sequences as they are
projected into the N_t-dimensional subspace of the data matrix. Effectively this reduces the
search for a discriminant function in an N_1N_2-dimensional space to an N_t-dimensional
subspace.
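Equation (54) is easy to confirm numerically; the sketch below (random full-rank data, hypothetical sizes) checks that the SVD form reproduces the conventional SDF solution:

```python
import numpy as np

rng = np.random.default_rng(3)
N, Nt = 100, 6                     # hypothetical: N pixels, Nt exemplars (full rank)
X = rng.standard_normal((N, Nt))
d = np.ones(Nt)

# Conventional SDF: h = X (X^T X)^{-1} d
h_sdf = X @ np.linalg.solve(X.T @ X, d)

# Equation (54): h = U L^{-1} V^T d from the economy SVD X = U L V^T
U, s, Vt = np.linalg.svd(X, full_matrices=False)
h_svd = U @ ((Vt @ d) / s)

assert np.allclose(h_sdf, h_svd)
```

The identity follows from substituting X = UΛV^T into X(X^TX)^{-1}d and cancelling the orthogonal factors.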


130
where N_d is the dimension of the output space and N_u is with respect to a one-dimensional
output space. Furthermore, in the equation we set N_u ≈ 3N_y in order to get an accurate
estimate of the implicit error term; that is, the sampling grid is on the order of three times
as dense as the data observations (assuming gaussian kernels [Hardle, 1990]). Using this
rule of thumb, the order of the computational complexity as a function of the output
dimension and the number of observations becomes

    3^{N_d + 1} N_y^{N_d + 2}.    (81)

Fortunately, the dimensionality of the output space is controlled by the designer;
however, equation 81 poses a fundamental computational limitation to the dimensionality of
the subspace mapping. This limitation, however, is only valid if the implicit error term is
computed in the straightforward manner that the equations imply. Further examination of
the Parzen window estimator shows how this complexity can be greatly simplified. The
final result reveals that the implicit error term can be computed purely as a function of the
local interaction between the observations in the output space.

The Parzen window estimator is the convolution of the kernel with the data; therefore
we can rewrite equation 79 as

    ε_i = E_Y(u, {y}) * κ'(u)|_{u=y_i}
        = (f_Y(u) − f_y(u, {y})) * κ'(u)|_{u=y_i}    (82)
        = (f_Y(u) − f_y(u) * κ(u)) * κ'(u)|_{u=y_i}.
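The simplification can be checked in one dimension. The sketch below (all parameter values illustrative) assumes a gaussian kernel and a uniform desired distribution, and compares the implicit error computed by brute-force grid convolution against the closed-form boundary term minus pairwise kernel interactions, using the fact that κ*κ' for a gaussian is the derivative of a gaussian with variance 2σ²:

```python
import numpy as np

rng = np.random.default_rng(4)
sigma, a = 0.3, 2.0                 # kernel width; uniform support is [-a/2, a/2]
y = rng.uniform(-1, 1, 20)          # hypothetical output-space observations

def kappa(u, s=sigma):              # one-dimensional gaussian kernel
    return np.exp(-u**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))

def kappa_p(u, s=sigma):            # its derivative
    return -u / s**2 * kappa(u, s)

# Brute-force route: build E_Y = f_Y - f_hat on a dense grid, convolve with kappa'
du = 1e-3
u = np.arange(-5, 5, du)
f_des = np.where(np.abs(u) <= a / 2, 1.0 / a, 0.0)      # desired uniform density
f_hat = kappa(u[:, None] - y[None, :]).mean(axis=1)     # Parzen estimate
E = f_des - f_hat
eps_direct = np.array([np.sum(E * kappa_p(yi - u)) * du for yi in y])

# Local-interaction route: kappa_a = kappa * kappa' is the derivative of a
# gaussian with variance 2 sigma^2, so the error reduces to an analytic
# boundary term minus pairwise interactions between the observations.
def kappa_a(t, s=sigma):
    return -t / (2 * s**2) * np.exp(-t**2 / (4 * s**2)) / (s * np.sqrt(4 * np.pi))

boundary = (kappa(y + a / 2) - kappa(y - a / 2)) / a      # f_Y * kappa' at each y_i
pairwise = kappa_a(y[:, None] - y[None, :]).mean(axis=1)  # f_hat * kappa' at each y_i
eps_local = boundary - pairwise

# The two routes agree to grid-resolution accuracy
assert np.max(np.abs(eps_direct - eps_local)) < 1e-2
```

The local route touches only the N_y observations pairwise, which is the source of the complexity reduction discussed above.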


132
In section A.3 of the appendix, the analytic form of κ_a is derived for the N-dimensional
gaussian kernel with covariance matrix of the form

    Σ = σ²I.

The result from equation 112 in the appendix is

    κ_a(u) = κ(u) * κ'(u)
           = −(2^{N+1} π^{N/2} σ^{N+2})^{-1} u exp(−u^T u / (4σ²))    (84)
           = −(2^{3N/4+1} π^{N/4} σ^{N/2+2})^{-1} u κ(u)^{1/2},

where N is the dimension of the kernel.

In section A.4 of the appendix, the analytic form of f_r is also derived. The result from
equation 120 is, component-wise,

    f_r(u)_i = (2^{N−1} a^N)^{-1} [κ_1(u_i + a/2, σ) − κ_1(u_i − a/2, σ)]
               × Π_{j≠i} [erf((u_j + a/2)/(√2σ)) − erf((u_j − a/2)/(√2σ))],    (85)

where the desired distribution is uniform in the hypercube centered at zero with sides of
length a, N is the dimension of the kernel (and the output space), and κ_1(m, σ) is the one-
dimensional gaussian kernel with mean m and standard deviation σ. These functions are
shown for the two dimensional case in figures 57 and 58.


64
Figure 26. Comparison of ROC curves. The ROC curves for the number of
training exemplars yielding the best generalization measure versus the
number yielding the best ROC performance for values of α equal to
0.5 and 0.95 are shown.
either the minimum peak response or the MSE, there does appear to be a dependency on α.
This leads to a second experiment.
Table 2. Correlation of generalization measures to classifier performance. In both cases (α equal to 0.5
or 0.95) the classifier performance, as measured by the area of the ROC curve or P_fa at P_d equal to 0.8,
has the opposite correlation to what would be expected of a useful measure for predicting performance.

                                        α = 0.50                    α = 0.95
    Performance Measures         ROC area   P_fa(@P_d=0.8)   ROC area   P_fa(@P_d=0.8)
    Generalization   γ_min        −0.39          0.21         −0.40          0.41
    Measures         γ_mse         0.32         −0.11          0.31         −0.35

In the second experiment we examine the relationship between the parameter α and
the ROC performance. The aspect separation between training exemplars is set to 2, 4, and
8 degrees. The value of α, the emphasis on the MACE criterion, is varied in the range
zero to unity. Figure 27 shows the relationship between ROC performance and the value


BIOGRAPHICAL SKETCH
Mr. Fisher was born April 13, 1965. He earned his bachelor's degree in electrical
engineering from the University of Florida in 1987. He was a graduate research assistant in the
Electronic Communications Laboratory at the University of Florida from 1987 until 1990,
during which time he earned his Master of Engineering degree from the University of
Florida. He has continued his affiliation with the ECL, as both a faculty member and
graduate research assistant, since 1990, during which time he has conducted research in
the areas of ultra-wideband radar for ground penetration and foliage penetration
applications, radar signal processing, and automatic target recognition algorithms. He has also
performed duties as a graduate research assistant and Ph. D. candidate in the
Computational NeuroEngineering Laboratory, during which time he has conducted research (Ph.D.
topic) on nonlinear extensions to synthetic discriminant functions with application to
classification of mm-wave SAR imagery.
173


171
Novak, L. M., M. C. Burl, and W. W. Irving (1993); Optimal polarimetric processing
for enhanced target detection, IEEE Transactions on Aerospace and Electronic Systems,
29 (1): 234-243.
Novak, L. M., G. Owirka, C. Netishen (1994); Radar target identification using
spatial matched filters, Pattern Recognition 27 (4): 607-617.
Oppenheim, A. V., and R. W. Schafer (1989); Discrete-Time Signal Processing,
Prentice-Hall, New Jersey.
Papoulis, A. (1991); Probability, Random Variables, and Stochastic Processes (3rd
ed.), McGraw-Hill, New York.
Parzen, E. (1962); On the estimation of a probability density function and the mode,
Ann. Math. Stat. 33: 1065-1076.
Plumbley, M., and F. Fallside (1988); An information-theoretic approach to
unsupervised connectionist models, Proceedings of the 1988 Connectionist Models Summer
School, D. Touretzky, G. Hinton, and T. Sejnowski, eds., Morgan Kaufmann, San Mateo,
CA, 239-245.
Ravichandran, G., and D. Casasent (1992); Minimum noise and correlation energy
filters, Appl. Opt. 31 (11): 1823-1833.
Réfrégier, Ph. (1991); Filter design for optical pattern recognition: multicriteria
optimization approach, Opt. Lett. 15 (15): 854-856.
Réfrégier, Ph., and J. Figue (1991); Optimal trade-off filters for pattern recognition
and their comparison with the Wiener approach, Opt. Comp. Proc. 1: 3-10.
Richard, M., and R. Lippman (1991); Neural network classifiers estimate Bayesian
a posteriori probabilities, Neural Computation 3: 461-483.
Rosen, J. (1993); Learning in correlators based on projection onto constraint sets,
Optics Letters, 18 (14): 1183-1185.
Rosenblatt, F. (1958); The perceptron: a probabilistic model for information storage
and organization in the brain, Psychological Review, 65: 386-408.
Rumelhart, D., G. Hinton, R. Williams (1986); Learning internal representations by
error backpropagation, Parallel Distributed Processing: Explorations in the Microstructure
of Cognition (D. Rumelhart and J. McClelland, eds.), 1: 322-328, MIT Press,
Massachusetts.
Scharf, Louis L. (1991); Statistical Signal Processing: Detection, Estimation, and
Time Series Analysis, Addison-Wesley Publishing Company, New York.


111
By our choice of topology (MLP) and distance metric we are able to work with entropy
indirectly and fit the approach naturally into a back-propagation learning paradigm.
As our criterion we use the integrated squared error between our estimate of the output
distribution, f_y(u, {y}), at a point u over a set of observations {y}, and the desired output
distribution, f_Y(u), which we approximate with a summation:

    J = Σ_j (f_Y(u_j) − f_y(u_j, {y}))² Δu.    (69)

In equation 69, the sum indicates the nonzero region (a hypercube for the uniform
distribution) over which the M-fold integration is evaluated. Assuming the output space is
sampled adequately, we can approximate this integral with a summation in which u_j ∈ ℜ^M
are samples in M-space and Δu represents a volume element.
We use the Parzen window method [Parzen, 1962] as our estimator of the output
distribution. The Parzen window estimate of a PDF is written

    f_y(u, {y}) = (1/N_Y) Σ_{i=1}^{N_Y} κ(u − y_i),    (70)

where κ(·) is the kernel function, {y} = {y_1, …, y_{N_Y}} is the set of observations at the
output of the mapping, and u is the location at which the output estimate is being com-


133
Figure 57. Two dimensional attractor functions. The x_1-component is shown at
top while the x_2-component is shown at bottom. The function
represents the influence of each data point on its locale in the output
space. As in the analysis of the kernel gradients, we see that this
function
The magnitude of the regulating function is shown in figure 59. It is evident from the
figure that the regulating function only has influence at the boundaries of the desired
output distribution. Furthermore, examination of equation 85 shows that the topology
regulating function contains an erf([·]) function evaluation when the output space is greater
than one dimension. From a computational standpoint this function evaluation can be
costly. This term is essentially unity except at the vertices of the hypercube. Figure 60
shows an approximation of equation 85 minus the erf([·]) evaluation. As can be seen in
the figure, within the region of support of the desired distribution, the function is
essentially unchanged. If we match the region of support of the desired output distribution to the


APPENDIX A
DERIVATIONS
A.1 Frequency Domain Relationships
The following derivations show frequency domain relationships for unitary discrete
Fourier transformations. The results are shown for one-dimensional vectors, but can be
easily extended to multiple dimensions. The autocorrelation sequence of a complex, wide-
sense stationary process x(n), defined as

    R_x(m) = E(x*(n)x(n + m)),

can be estimated from N observations of the process (assuming the sequence is ergodic)
as

    R̂_x(m) = (1/N) Σ_n x*(n)x(n + m).    (93)

A relationship can be derived between the estimated autocorrelation sequence and the
DFT of the observed sequence using the unitary DFT

    X(k) = (1/√N) Σ_n x(n) e^{−j2πkn/N},    x(n) = (1/√N) Σ_k X(k) e^{j2πkn/N}.

155


4
object do not manifest themselves as rotations within the radar image (as opposed to
optical imagery). In this case the distortion is not purely geometric, but more abstract.
Chapter 3 presents a derivation of the MACE filter as a special case of Kohonen's
linear associative memory [1988]. This relationship is important in that the associative
memory perspective is the starting point for developing nonlinear extensions to the MACE
filter.
In chapter 4 the basis upon which the MACE filter can be extended to nonlinear adaptive
systems is developed. In this chapter a nonlinear architecture is proposed for the extension
of the MACE filter. A statistical perspective of the MACE filter is discussed which leads
naturally into a class representational viewpoint of the optimization criterion of distortion
invariant filters. Commonly used measures of generalization for distortion invariant
filtering are also discussed. The results of the experiments presented show that the measures
are not appropriate for the task of classification. It is interesting to note that the analysis
indicates the appropriateness of the measures is independent of whether the mapping is
linear or nonlinear. The analysis also discusses the merit of the MACE filter optimization
criterion in the context of classification and with regard to measures of generalization.
The chapter concludes with a series of experiments further refining the techniques by
which nonlinear MACE filters are computed.
Chapter 5 presents a new information theoretic method for feature extraction. An
information theoretic approach is motivated by the observation that the optimization
criterion of the MACE filter only considers the second-order statistics of the rejection class.
The information theoretic approach, however, operates in probability space, exploiting
properties of the underlying probability density function. The method enables the extrac-


29
There are several observations to be made about the adaline/LMS approach to
classification. One observation is that the adaptation process described uses the error, ε, as
measured at the output of the linear combiner to drive the adaptation process, and not the actual
classification error, e_c. Another observation is that this approach presupposes that the
pattern classes can be linearly separated. A final point, on which we will have more to say, is
that the method uses the MSE criterion as a proxy for classification.
3.2 MSE Criterion as a Proxy for Classification Performance
As we have pointed out, the adaline/LMS approach to classification uses the MSE
criterion to drive the adaptation process. It is the probability of misclassification (also called
the Bayes criterion), however, with which we are truly concerned. We now discuss the
consequence of using the MSE criterion as a proxy for classification performance.
It is well known that the discriminant function that minimizes misclassification is
monotonically related to the posterior probability distribution of the class, c, given the
observation x [Fukunaga, 1990]. That is, for the two class problem, if the discriminant
function is

    f(x) = P_2 p(x|C_2)/p_x(x),    (3)

where P_2 is the prior probability of class 2, and p(x|C_2) is the conditional probability
distribution of x given class 2, then the probability of misclassification will be minimized if
the following decision rule is used:

    f(x) < 0.5  choose class 1
    f(x) > 0.5  choose class 2.    (4)


44
concerned with finding projections which will essentially detect a set of images. Towards
this goal the techniques have emphasized analytic solutions resulting in linear
discriminant functions. Advances have been concerned with better descriptions of the second order
statistics of the causes of false detections. The approach, however, is still a data driven
approach. The desired recognition class is represented through exemplars. In the distortion
invariant filtering approach, the task has been confined to fitting a hyper-plane to the
recognition exemplars subject to various quadratic optimization criteria.
The development of associative memories has proceeded along a different track. It is
also data driven, but the emphasis has been on iterative machine learning methods. Many
of the methods are biologically motivated, including the perceptron learning rule
[Rosenblatt, 1958] and Hebbian learning [Hebb, 1949]. Other methods, including the least-mean-
square (LMS) algorithm [Widrow and Hoff, 1960] (which we have described) and the
backpropagation algorithm [Rumelhart et al., 1986; Werbos 1974], are gradient descent
based methods.
From the classification standpoint, of which the ATR problem is a subset, iterative
methods have certain advantages. This can be illustrated with a simple example. Suppose
the data matrix

    X = [x_1, x_2, …, x_{N_t}] ∈ ℜ^{N_1N_2 × N_t}

were not full rank. In other words the exemplars representing the recognition class could
be represented without error in a subspace of dimension less than N_t. From an ATR
perspective this would be a desirable property. The implicit assumption in any data driven
method is that information about the recognition class is transmitted through exemplars.
This is as true for distortion invariant filters, which have analytic solutions, as it is for iter-


77
Figure 28. Peak output response of linear and nonlinear filters over the training
set. The nonlinear filter clearly outperforms the linear filter by this
metric alone.
reported here when vehicles 1b and 1c are used as the recognition class and vehicles 2a
and 2b are used for the rejection class. At this point we are only interested in the results
pertaining to the linear filter (our baseline) and nonlinear filter results for experiment I.


I certify that I have read this study and that in my opinion it conforms to acceptable standards of
scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of
Doctor of Philosophy.
I certify that I have read this study and that in my opinion it conforms to acceptable standards of
scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of
Doctor of Philosophy.
Thomas E. Bullock
Professor of Electrical and Computer Engineering
I certify that I have read this study and that in my opinion it conforms to acceptable standards of
scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of
Doctor of Philosophy.
/John M. M. Anderson
Assistant Professor of Electrical and Computer Engineering
I certify that I have read this study and that in my opinion it conforms to acceptable standards of
scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of
Doctor of Philosophy.
Assistant Professor of Electrical and Computer Engineering
I certify that I have read this study and that in my opinion it conforms to acceptable standards of
scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of
Doctor of Philosophy.
Frank J. Bova
Professor of Nuclear and Radiological Engineering


18
2.1.3 Minimum Average Correlation Energy Filter
The MVSDF (and the SDF) control the output of the filter at a single point in the
output plane of the filter. In practice large sidelobes may be exhibited in the output plane,
making detection difficult. These difficulties led Mahalanobis et al. [1987] to propose the
minimum average correlation energy (MACE) filter. This development in distortion
invariant filtering attempts as its design goal to control not only the output point when the image
is centered on the filter, but the response of the entire output plane as well. Specifically it
minimizes the average correlation energy of the output over the training exemplars subject
to the same linear constraints as the MVSDF and SDF filters.
The problem is formulated in the frequency domain using Parseval relationships. In
the frequency domain, the formulation is

    min H^†DH   s.t.   X^†H = d
    {H ∈ C^{N×1}, X ∈ C^{N×N_t}, D ∈ C^{N×N}, d ∈ C^{N_t×1}},

where D is a diagonal matrix whose diagonal elements are the coefficients of the average
2-D power spectrum of the training exemplars. The form of the quadratic criterion is
derived using Parseval's relationship. A derivation is given in section A.1 of the appendix.
The other terms, H and X, contain the 2-D DFT coefficients of the filter and training
exemplars, respectively. The vector d is the same as in the MVSDF and SDF cases. The
optimal solution, in the frequency domain, is

    H = D^{-1}X(X^†D^{-1}X)^{-1}d.    (1)

As in the MVSDF, the solution requires the inversion of an N × N matrix, but in this
case the matrix D is diagonal and so its inversion is trivial. When the noise covariance


127
lattice. The relationship of this maximum entropy mapping approach to the SOFM of
Kohonen is a topic that will be left for later research.
5.7.2 Maximum Entropy: Two Vehicle Classes
It is commonly assumed in the blind source separation problem that the sources are
statistically independent [Bell and Sejnowski, 1995]. Maximum entropy has been used in
approaches to this problem. As an example of blind source separation we repeat the
previous experiment on both vehicles, which can be modeled as statistically independent
sources, from figure 52. The projection of the training images (and between-aspect testing
images) is shown in figure 56 (where adjacent-aspect training images are connected). As
can be seen, significant class separation is exhibited (without a priori labeling of the classes).
In the early stages of learning the method appears to maximize information with regard to
the underlying distortion common to both classes (rotation through aspect). As the
mapping is refined, the information begins to focus on the differences between the classes.
5.8 Computational Simplification of the Algorithm
So far we have only presented results using the method to maximize entropy. Our
interest with regard to classification is mutual information, specifically as described by
equation 57, where the mutual information is a function of the observed output entropies.
However, before we discuss extensions to mutual information, we present some significant
computational aspects of our method. Other techniques for entropy manipulation
of continuous random variables have been proposed. However, these methods either
oversimplify (assume Gaussianity or unimodal pdfs [Bell and Sejnowski, 1995; Plumbley and
Fallside, 1988]) or are overly complex (Edgeworth expansions [Wong and Blake, 1994]).


118
Figure 47. Mixture of gaussians example. The estimated distribution is a mixture
of gaussians, while the desired distribution is uniform between -1 to 1.
The kernel gradient, which will be convolved with the difference
between the two distributions is shown in dotted line.
5.6 Maximum Entropy/PCA: An Empirical Comparison
We present some experimental results designed to illustrate the properties of information
theoretic feature extraction as compared to a signal representational approach. In these
experiments we will compare a simple entropy-maximizing feature extractor to the well
known principal components analysis (PCA) approach to feature extraction. The source
distributions are simple by design, but as we shall see they are sufficient to show the
differences in the two methods.
We will begin with the simple case of a two dimensional gaussian distribution. The
distribution we will use is zero mean with a covariance matrix of

    Σ = [ 1   0
          0   0.1 ].


31
The first term of equation 6, J(f + δf), can be computed as

    J(f + δf) = (P_1/2)E{(f + δf)²|C_1} + (P_2/2)E{(1 − f − δf)²|C_2}
              = (P_1/2)E{(f² + 2fδf)|C_1}
                + (P_2/2)E{((1 − f)² − 2(1 − f)δf)|C_2} + O(δf²)    (7)
              = J(f) + P_1E{fδf|C_1} − P_2E{(1 − f)δf|C_2},

which can be substituted into equation 6 to yield

    δJ = P_1E{fδf|C_1} − P_2E{(1 − f)δf|C_2}
       = P_1 ∫ f(x)δf p(x|C_1)dx − P_2 ∫ (1 − f(x))δf p(x|C_2)dx
       = ∫ [f(x)(P_1p(x|C_1) + P_2p(x|C_2)) − P_2p(x|C_2)]δf dx    (8)
       = ∫ [f(x)p_x(x) − P_2p(x|C_2)]δf dx,

where p_x(x) = P_1p(x|C_1) + P_2p(x|C_2) is the unconditional probability distribution of
the random variable X. In order for f(x) to be a stationary point of J(f), equation 8
must be zero over all x for any arbitrary perturbation δf(x). Consequently

    f(x)p_x(x) − P_2p(x|C_2) = 0.    (9)
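The stationary point can be checked numerically. The sketch below (two assumed gaussian class likelihoods and illustrative priors) approximates J(f) on a grid, using the criterion J(f) = (P_1/2)E{f²|C_1} + (P_2/2)E{(1 − f)²|C_2}, and verifies that perturbing f(x) = P_2 p(x|C_2)/p_x(x) in any direction increases the criterion:

```python
import numpy as np

# Grid approximation of J(f) = (P1/2)E{f^2|C1} + (P2/2)E{(1-f)^2|C2}
dx = 1e-3
x = np.arange(-6, 6, dx)
P1, P2 = 0.6, 0.4                                   # hypothetical priors
p1 = np.exp(-(x + 1)**2 / 2) / np.sqrt(2 * np.pi)   # p(x|C1), assumed gaussian
p2 = np.exp(-(x - 1)**2 / 2) / np.sqrt(2 * np.pi)   # p(x|C2), assumed gaussian
px = P1 * p1 + P2 * p2                              # unconditional density

def J(f):
    return 0.5 * P1 * np.sum(f**2 * p1) * dx + 0.5 * P2 * np.sum((1 - f)**2 * px / px * p2 / p2 * (1 - 0) if False else (1 - f)**2 * p2) * dx

f_star = P2 * p2 / px               # the stationary point of equation (9)

# Perturbing f_star in any direction increases the criterion
for df in (0.05 * np.sin(x), -0.02 * np.cos(3 * x), 0.1 * np.exp(-x**2)):
    assert J(f_star + df) > J(f_star)
```

Since the first-order variation vanishes at f_star, the increase comes entirely from the positive second-order term (1/2)∫δf² p_x dx.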


50
filter error
Figure 21. NMSE between closed form solution and iterative solution. The
learning curve for the LMS algorithm with the full rank data matrix is
shown with a solid line; the non full rank case is shown with a dashed
line.
class of nonlinear associative memories can, however, be determined by gradient search
techniques. The methods of distortion invariant filters are limited to linear or piece-wise
linear discriminant functions. It is unlikely that these solutions are optimal for the ATR
problem.
In this chapter we have made the connection between distortion invariant filtering and
linear associative memories. Furthermore we have motivated an iterative approach. Recall
figure 15, which shows the adaline architecture. In this architecture we can use the linear
error term in order to train our system as a classifier. This is a consequence of the
assumption that a linear discriminant function is desirable. If a linear discriminant function is sub-


47
Figure 18. Peak output response over all aspects of vehicle 1a when the data
matrix is not full rank. The LMS algorithm was used to compute
the filter coefficients.
As can be seen in the image, the qualities of low variance and a localized peak are still
maintained using the iterative method.
The learning curve, which measures the normalized mean square error (NMSE)
between the filter output and the desired output, is shown as a function of the learning
epoch (an epoch is one pass through the data) in figure 20. When the data matrix is full
rank, as shown with a solid line, we see that there is an exact solution and the error
approaches zero. When x_sub is used, the NMSE approaches a limit because there is no
exact solution and so a least squares solution is found.
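A minimal LMS loop of the kind used for these experiments might look like the following sketch (step size, sizes, and epoch count are illustrative, not the values used in the text):

```python
import numpy as np

rng = np.random.default_rng(5)
N, Nt = 32, 4                      # hypothetical: N-point exemplars, Nt of them
X = rng.standard_normal((N, Nt))   # full-rank data matrix
d = np.ones(Nt)                    # desired output for each exemplar

h = np.zeros(N)                    # filter coefficients
mu = 0.01                          # illustrative LMS step size
nmse = []
for epoch in range(200):           # an epoch is one pass through the data
    for i in range(Nt):
        e = d[i] - h @ X[:, i]     # linear error at the filter output
        h = h + mu * e * X[:, i]   # LMS update
    nmse.append(np.mean((d - h @ X)**2) / np.mean(d**2))

# With a full-rank data matrix an exact solution exists, so the error vanishes
assert nmse[-1] < 1e-3 and nmse[-1] < nmse[0]
```

With a rank-deficient X the same loop instead settles at the nonzero NMSE floor of the least squares solution, as described above.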


10
Figure 2. MSF peak output response of training vehicle 1a over all aspect angles.
Peak response degrades as aspect difference increases.
The peak output responses of both vehicles in the testing set are shown in figure 3
overlain on the training image response. In one sense the filter exhibits good generalization;
that is, the peak response to vehicle 1b is much the same as a function of aspect as the
peak response to vehicle 1a. However, the filter also generalizes equally as well to
vehicle 2b, which is undesirable. As a vehicle discrimination test (vehicle 1 from vehicle 2) the
MSF fails.


112
puted. Since the output observations are functional mappings of the input data, we can
rewrite 70 as

    f_y(u, {y}) = (1/N_Y) Σ_{i=1}^{N_Y} κ(u − y_i)
                = (1/N_Y) Σ_{i=1}^{N_Y} κ(u − g(α, x_i)).    (71)

The gradient of the criterion function with respect to the mapping parameters is
determined via the chain rule as

    ∂J/∂α = (2Δu/N_Y) Σ_j Σ_i E_Y(u_j, {y}) κ'(u_j − y_i) (∂g(α, x_i)/∂α),    (72)

where E_Y(u_j, {y}) is the observed distribution error over all observations {y}. The last
term in 72, ∂g/∂α, is recognized as the sensitivity of our mapping to the parameters α.
Since our mapping is a feed-forward MLP (α represents the weights and bias terms of the


119
Figure 48. Mixture of gaussians example, entropy minimization and
maximization. The plots above show the resulting influence function
when the kernel gradient is convolved with the observed distribution
error. The sign depends on whether we are minimizing (bottom) or
maximizing (top) entropy.
The contours of this distribution are shown in figure 49 along with the image of the
first principal component features. We see from the figure that the first principal
component lies along the x_1-axis. We draw a set of observations (50 in this case) from this distri-


140
Figure 62. Entropy minimization as local attraction. The figures above show three
iterations of the local attraction algorithm. The two groups of points are
seen to be attracted to their local means as well as to each other.
towards the later stages of learning we would like the interaction to decrease to a
negligible level as the distribution approaches uniformity.
If we make the assumption that the observed distribution (which we no longer
compute in the local interaction framework) will eventually approach uniformity, we have a
basis for setting an upper bound on the size of the kernel. Given N_Y points in an N_d-


the pre-processing transformation yields exactly the space domain coefficients of the
MACE filter. This can be shown using a unitary discrete Fourier transform (DFT)
matrix.

If U \in C^{N_1 \times N_2} is the DFT of the image w \in R^{N_1 \times N_2}, we can reorder both U and w
into column vectors, U \in C^{N_1 N_2 \times 1} and w \in C^{N_1 N_2 \times 1} respectively. We can then implement the 2-D DFT as a unitary transformation matrix, \Phi, such that

    U = \Phi w, \qquad w = \Phi^{\dagger} U, \qquad \Phi^{\dagger}\Phi = \Phi\Phi^{\dagger} = I.

In order for the transformation A to be the space domain equivalent of the spectral pre-whitener of the MACE filter, the relationship

    A x = \Phi^{\dagger} B \Phi x,

where B is the same matrix as in equation 27, must be true which, by inspection, means
that

    A = \Phi^{\dagger} B \Phi.    (37)

Substituting equation (37) into equation (36) and using the property B^{\dagger}B = BB^{\dagger} = D^{-1}
yields

    h = A x (x^{\dagger} A^{\dagger} A x)^{-1} d
      = \Phi^{\dagger} B \Phi x \big(x^{\dagger} (\Phi^{\dagger} B \Phi)^{\dagger} \Phi^{\dagger} B \Phi x\big)^{-1} d
      = \Phi^{\dagger} B \Phi x \big(x^{\dagger} \Phi^{\dagger} B^{\dagger} B \Phi x\big)^{-1} d
      = \Phi^{\dagger} B X \big(X^{\dagger} D^{-1} X\big)^{-1} d, \qquad X = \Phi x.
    (38)
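In practice the cascade in (38) is evaluated in the frequency domain, where (with B such that B†B = D⁻¹) it collapses to the familiar MACE solution H = D⁻¹X(XᴴD⁻¹X)⁻¹d. The following is a minimal numpy sketch of that computation; the function name, the random stand-in "images", and the use of a unitary (`norm="ortho"`) FFT are illustrative assumptions, not the dissertation's ISAR pipeline.

```python
import numpy as np

def mace_filter(images, d=None):
    """Sketch of the MACE filter computed in the frequency domain.

    images : (Nt, N1, N2) real-valued training exemplars.
    Returns the space-domain filter h of shape (N1, N2).
    """
    Nt, N1, N2 = images.shape
    if d is None:
        d = np.ones(Nt)                       # unit peak constraints
    # Columns of X: unitary 2-D DFTs of the exemplars, reordered as vectors.
    X = np.fft.fft2(images, norm="ortho").reshape(Nt, -1).T   # (N1*N2, Nt)
    # D: diagonal of the average power spectrum over the training set.
    Ddiag = np.mean(np.abs(X) ** 2, axis=1)
    DinvX = X / Ddiag[:, None]                # D^{-1} X
    # Frequency-domain solution H = D^{-1} X (X^H D^{-1} X)^{-1} d.
    H = DinvX @ np.linalg.solve(X.conj().T @ DinvX, d.astype(complex))
    # Back to the space domain (h = Phi^H H); imaginary part is round-off.
    return np.fft.ifft2(H.reshape(N1, N2), norm="ortho").real

rng = np.random.default_rng(0)
imgs = rng.standard_normal((3, 8, 8))
h = mace_filter(imgs)
# Zero-lag correlation of h with each exemplar meets its peak constraint.
```

The `norm="ortho"` option makes the DFT unitary, playing the role of Φ above, so inner products are preserved between domains.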


Figure 12. OTSDF peak output response of vehicle 1a over all aspect angles.
Degradation between aspect exemplars is less than in the MACE
filter, shown in dashed line.
The primary contribution of this research will be to extend the ideas of MACE filtering
to a general nonlinear signal processing architecture and an accompanying classification
framework. These extensions will focus on processing structures which improve the generalization and discrimination properties while maintaining the shift-invariance and localization detection properties of the linear MACE filter.


Figure 67. Mutual information feature space resulting from convex hull
exemplars. The training exemplars are shown in the top figure (squares:
recognition, triangles: rejection). The bottom figure shows the testing
exemplars.


CHAPTER 6
CONCLUSIONS
We have discussed a methodology by which linear distortion invariant filtering can be
extended to nonlinear systems. The extension to nonlinear systems was initiated by first
establishing the link between distortion invariant filters and the linear associative memory
in chapter 3. The linear associative memory perspective is important in that it more closely
aligns distortion invariant filtering with classification. Advances in distortion invariant filtering, as described in chapter 2, have occurred within a linear systems framework despite
the primary application being classification. The result is a classification approach which
considers only second order statistics. In contrast, the development of associative memories has occurred within a probabilistic framework emphasizing a classification approach
which considers the underlying probability density function. This perspective led naturally
to nonlinear signal processing. The consequence of using the MSE criterion was also discussed in chapter 3. The result, which has been shown by other researchers as well, was
that the MSE criterion combined with a universal approximator and 1-of-N coding (for an
N-class problem the desired output is an N-vector with the ith element set to unity and all
others to zero) is suitable for estimating posterior
class probabilities.
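As a toy illustration of this result (our own construction, not an experiment from the dissertation): for a discrete input, indicator features make a least-squares fit trivially "universal", and minimizing MSE against 1-of-N coded targets then recovers exactly the empirical posterior class probabilities.

```python
import numpy as np

rng = np.random.default_rng(7)
n, K, M = 5000, 3, 4                    # samples, classes, distinct input values
probs = rng.dirichlet(np.ones(K), size=M)      # true P(class | x) table
x = rng.integers(0, M, size=n)                 # discrete input
c = np.array([rng.choice(K, p=probs[xi]) for xi in x])   # class drawn per input

F = np.eye(M)[x]                        # indicator features of the input
D = np.eye(K)[c]                        # 1-of-N coded desired outputs
W, *_ = np.linalg.lstsq(F, D, rcond=None)
# Row i of W is the fitted output for x = i: the empirical posterior P(c | x=i),
# which converges to probs[i] as n grows.
```

With indicator features the normal equations reduce to per-group means of the 1-of-N targets, which are precisely the empirical class frequencies given each input value.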
Some of the major contributions of this dissertation were presented in chapter 4. We
began with an analysis of commonly used measures of generalization for distortion invariant filters. Our analysis showed that these measures were actually counter to good classifi-


Page
4.5 Efficient Representation of the Rejection Class 72
4.6 Experimental Results 74
4.6.1 Experiment I: noise training 75
4.6.2 Experiment II: noise training with an orthogonalization constraint 81
4.6.3 Experiment III: subspace noise training 84
4.6.4 Experiment IV: convex hull approach 89
5 INFORMATION-THEORETIC FEATURE EXTRACTION 96
5.1 Introduction 96
5.2 Motivation for Feature Extraction 97
5.3 Information Theoretic Background 101
5.3.1 Mutual Information as a Self-Organizing Principle 101
5.3.2 Mutual Information as a Criterion for Feature Extraction 104
5.3.3 Prior Work in Information Theoretic Neural Processing 106
5.3.4 Nonparametric PDF Estimation 108
5.4 Derivation Of The Learning Algorithm 110
5.5 Gaussian Kernels 115
5.6 Maximum Entropy/PCA: An Empirical Comparison 118
5.7 Maximum Entropy: ISAR Experiment 124
5.7.1 Maximum Entropy: Single Vehicle Class 125
5.7.2 Maximum Entropy: Two Vehicle Classes 127
5.8 Computational Simplification of the Algorithm 127
5.9 Conversion of Implicit Error Direction to an Explicit Error 136
5.9.1 Entropy Minimization as Attraction to a Point 136
5.9.2 Entropy Maximization as Diffusion 139
5.9.3 Stopping Criterion 141
5.10 Observations 143
5.11 Mutual Information Applied to the Nonlinear MACE Filters 144
6 CONCLUSIONS 151
APPENDIX
A DERIVATIONS 155
REFERENCES 168
BIOGRAPHICAL SKETCH 173


Figure 37. Experiment III: Output response to an image from the recognition
class training set.
Figure 38. Learning curves for three methods. Experiment II: white noise
training (dashed line). Experiment III: subspace noise (dash-dot
line). Experiment IV: subspace noise plus convex hull exemplars
(solid line).


CHAPTER 2
BACKGROUND
2.1 Discussion of Distortion Invariant Filters
As stated, distortion invariant filtering is a generalization of matched spatial filtering.
It is well known that the matched filter maximizes the peak-signal-to-average-noise power
ratio as measured at the filter output at a specific sample location when the input signal is
corrupted by additive white noise.
In the discrete signal case the design of a matched filter is equivalent to the following
vector optimization problem [Kumar, 1986]:

    \min_h h^{\dagger} h \quad \text{s.t.} \quad x^{\dagger} h = d, \qquad \{h, x\} \in C^{N \times 1},

where the column vector x contains the N coefficients of the signal we wish to detect, h
contains the coefficients of the filter (\dagger indicates the Hermitian transpose operator), and d
is a positive scalar. This notation is also suitable for N-dimensional signal processing as
long as the signal and filter have finite support and are re-ordered in the same lexicographic manner (e.g. by row or column in the two-dimensional case) into column vectors.
The optimal solution to this problem is

    h = x (x^{\dagger} x)^{-1} d.
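This solution is easy to verify numerically. A small sketch (real-valued case, synthetic signal, names illustrative) checking both the peak constraint and the minimum-norm property:

```python
import numpy as np

rng = np.random.default_rng(1)
N, d = 16, 1.0
x = rng.standard_normal(N)                # signal to detect (real-valued case)
h = x * d / (x @ x)                       # h = x (x^T x)^{-1} d

assert np.isclose(x @ h, d)               # peak constraint x^T h = d holds
# Any other filter meeting the constraint has a larger norm; perturb h in
# the null space of x^T and compare:
v = np.eye(N)[0] - x * x[0] / (x @ x)     # satisfies x^T v = 0
h2 = h + 0.1 * v
assert np.isclose(x @ h2, d) and h2 @ h2 > h @ h
```

Any feasible filter decomposes as h plus a component orthogonal to x, and that component can only increase the norm, which is why the closed form above is optimal.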


In contrast, the method here is fairly straightforward and, as we will show, computationally
simple. The results of this section greatly reduce the computational complexity of our
approach and yield a surprisingly simple and intuitive perspective of mutual information.
Again we consider equation 74, where we have already observed that the implicit error
direction is the convolution of the observed distribution error with the kernel gradient. We
illustrate this by rewriting the implicit error direction term associated with each observation y_i (excluding the term related to mapping sensitivities and neglecting the sign for the
moment) as

    e_i = \sum_j \epsilon_Y(u_j, \{y\}) \, \kappa'(u_j - y_i),    (79)

where \epsilon_Y(u_j, \{y\}) indicates the observed distribution error at point u_j estimated over the
set of observations \{y\}.
At first glance it would seem that the method is computationally expensive. Computation of the Parzen window estimate is itself of order N_y N_u, the number of observations
multiplied by the number of locations in the output space at which the estimator is computed. Reasonable estimates of the density using a discrete approximation require N_u to
increase exponentially with the dimension of the output space. Furthermore, from equation 79, in order to compute the implicit error term we multiply the complexity of the computation by N_u, yet again, to yield an overall complexity of

    O(N_y N_u^2).    (80)


LIST OF TABLES
Table
Page
1 Classifier performance measures when the filter is determined by either of the
common measures of generalization as compared to best classifier performance for
two values of α 61
2 Correlation of generalization measures to classifier performance. In both cases (α
equal to 0.5 or 0.95) the classifier performance, as measured by the area of the ROC
curve or P_fa at P_d equal to 0.8, has an opposite correlation to what would be
expected of a useful measure for predicting performance 64
3 Comparison of ROC classifier performance for two values of P_d. Results are shown
for the linear filter versus four different types of nonlinear training. N: white noise
training, G-S: Gram-Schmidt orthogonalization, subN: PCA subspace noise, C-H:
convex hull rejection class 81
4 Comparison of ROC classifier performance for two values of P_d. Results are shown
for the linear filter versus experiments III and IV from section 4.6 and mutual
information feature extraction. The symbols indicate the type of rejection class
exemplars used. N: white noise training, G-S: Gram-Schmidt orthogonalization,
subN: PCA subspace noise, C-H: convex hull rejection class 145


where the term

    y(u) = \sum_{i=1}^{N_y} \delta(u - y_i)

represents the data set as observed in the output space, f_Y(u) is the desired output distribution (uniform), and \hat f_Y(u \mid \{y\}) is the observed output distribution estimated over the set
\{y\}. Continuing from the last step of 82,

    e_i = \big(f_Y(u) - y(u) * \kappa(u)\big) * \kappa'(u) \big|_{u = y_i}
        = f_r(u) - y(u) * \kappa_a(u) \big|_{u = y_i}    (83)
        = f_r(y_i) - \sum_{j \ne i} \kappa_a(y_i - y_j).

The terms \kappa_a(u) and f_r(u) will be termed the attractor kernel and the topology regulating term, respectively, for reasons which will become clear. The significance of equation 83 is that it breaks the fundamental limitation imposed by equation 81. Both the
attractor kernel and topology regulating terms can be computed analytically. The implicit
error term is therefore the negative of the convolution of the attractor kernel with the data
(as it is projected into the output space). More importantly, the computational complexity
of equation 82 is only of order N_y for each y_i. The total computational complexity is
therefore significantly reduced from that of equation 78.
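The local-interaction form of equation 83 is simple to compute. Below is a one-dimensional sketch of it with Gaussian kernels; the analytic convolution of a Gaussian kernel with its own gradient is the derivative of a Gaussian with doubled variance. The topology regulating term f_r is left as a user-supplied function (zero by default, which isolates the pairwise interaction); function names and the sign convention for minimization versus maximization are ours.

```python
import numpy as np

def attractor_kernel(u, sigma):
    """kappa_a(u) = (kappa * kappa')(u) for a 1-D Gaussian kernel: the
    derivative of a Gaussian with doubled variance 2*sigma**2."""
    s2 = 2.0 * sigma ** 2
    return -(u / s2) * np.exp(-u ** 2 / (2 * s2)) / np.sqrt(2 * np.pi * s2)

def implicit_errors(y, sigma, f_r=None):
    """e_i = f_r(y_i) - sum_{j != i} kappa_a(y_i - y_j)."""
    fr_vals = np.zeros_like(y) if f_r is None else f_r(y)
    K = attractor_kernel(y[:, None] - y[None, :], sigma)  # pairwise terms
    np.fill_diagonal(K, 0.0)                              # exclude j == i
    return fr_vals - K.sum(axis=1)

# With f_r omitted, only the pairwise interaction between points remains:
y = np.array([-0.2, 0.0, 0.3])
e = implicit_errors(y, sigma=0.25)
```

Because the attractor kernel is odd, the pairwise terms cancel in aggregate, and the cost is O(N_y) per observation, as stated in the text.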


the ROC curve for both filters we see that they overlay each other. From a classification
standpoint the two filters are equivalent.
Figure 30. ROC curves for the linear filter (solid line) versus the nonlinear filter (dashed
line). Despite improved performance of the nonlinear filter as
measured by peak output response and reduced variance over the
training set, the filters are equivalent with regard to classification
over the testing set.
This result is best explained by figure 31. Recall the points u_1 and
u_2 labeled in figure 22.
We can view these outputs as a feature space; that is, the MLP discriminant function
can be superimposed on the projection of the input image onto this space. In this case the
feature space is a representation of the input vector internal to the MLP structure. The designation of these points as features is due to the fact that they represent some abstract qual-


3.2.3 Finite Data Sets
The previous development does not take into account that in an iterative framework we
are working with observations of a random variable. Therefore, we rewrite the criterion of
equation 5 as finite summations. That is, the criterion becomes

    J(f(x, \alpha)) = \beta_1 \sum_{x_i \in C_1} (f(x_i, \alpha) - 1)^2 + \beta_2 \sum_{x_i \in C_2} (f(x_i, \alpha))^2,    (15)

where x_i \in C_j denotes the set of observations taken from class C_j. Taking the derivative
of this criterion with respect to the parameters, \alpha, yields

    \frac{\partial J}{\partial \alpha} = \beta_1 \sum_{x_i \in C_1} (f(x_i, \alpha) - 1) \nabla_\alpha f(x_i, \alpha) + \beta_2 \sum_{x_i \in C_2} f(x_i, \alpha) \nabla_\alpha f(x_i, \alpha).    (16)

It is assumed that the set of observations from class C_1 (x_i \in C_1) are independent and
identically distributed (i.i.d.), as are the set of observations from class C_2 (x_i \in C_2),
although with a different distribution than class C_1. Since the summation terms are broken up by class, we can assume that the arguments of the summations (functions of distinct i.i.d. random variables) are themselves i.i.d. random variables [Papoulis, 1991]. If we
set N_1 \beta_1 = P_1 and N_2 \beta_2 = P_2, where P_1 and P_2 are the prior probabilities of classes
C_1 and C_2, respectively, and N_1 and N_2 are the number of samples drawn from


Figure 60. Approximation of the regulating function. The figure shows the
regulating function when the erf(\cdot) term is ignored. The change is not
significant within the region of support of the desired distribution.
The result of this analysis is that the manipulation of entropy can be modeled as local
interactions of the observed data in the output space. The function of the attractor kernel,
\kappa_a(\cdot), is to model the interaction of the data points with each other, while the function
of the topology regulating term, f_r(\cdot), is to model the interaction between the data




cation performance. It is our opinion that the generalization measures discussed are more
properly suited to a signal representation framework and not classification. The analysis
also revealed that emphasis on the MACE filter optimization criterion in the construction
of the OTSDF led to superior classification performance.
The results of the analysis of generalization measures were significant in that they highlighted the fact that commonly used measures of generalization should not be the sole
basis upon which to compare nonlinear systems to their linear counterparts, since these
measures are only weakly coupled to classification performance.
The probabilistic viewpoint of the MACE filter optimization criterion was presented
in chapter 4 as well. Within this framework, nonlinear mappings such as the multi-layer
perceptron were included, allowing for a nonlinear extension of the MACE filter. The lack
of closed form analytical solutions for general nonlinear mappings necessitated an iterative approach. Consequently, the feed-forward multi-layer perceptron was an obvious candidate for the nonlinear mapping due to its property as a universal function approximator
coupled with computationally efficient iterative algorithms. This choice also preserved the
shift invariance property of the original linear filter.
Several developments resulted from the nonlinear approach. An efficient training
algorithm resulted from the recognition that the optimization criterion was equivalent to
characterizing the rejection class by white-noise images in the pre-whitened image space.
The results of experiment I in section 4.6.1 emphasized the need for suitable performance
measures by which to compare nonlinear and linear classifiers. This motivated a feature
space viewpoint of the internal mappings of the multi-layer perceptron. Examination of
the feature space led to several modifications and subsequent performance improvements


I certify that I have read this study and that in my opinion it conforms to acceptable standards of
scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of
Doctor of Philosophy.
Andrew F. Laine
Associate Professor of Computer and Information
Science and Engineering
This dissertation was submitted to the Graduate Faculty of the College of Engineering and to the
Graduate School and was accepted as partial fulfillment of the requirements for the degree of
Doctor of Philosophy.
May 1997
Winfred M. Phillips
Dean, College of Engineering
Karen A. Holbrook
Dean, Graduate School


and is typically set to all unity values for the recognition class. The images of the data
matrix X comprise the range of distortion that the implemented filter is expected to
encounter. It is assumed that N_t < N, and so the problem formulation is a quadratic optimization subject to an under-determined system of linear constraints. The optimal solution is

    h = X (X^{\dagger} X)^{-1} d.

When there is only one training exemplar (N_t = 1) and d is unity, the SDF defaults to
the normalized matched filter. Similar to the matched filter (white noise case), the SDF is
the linear filter which minimizes the white noise response while satisfying the set of linear
constraints over the training exemplars.
By way of example, the SDF technique is tested against the ISAR data as in the MSF
case. Exemplar images from vehicle 1a were selected every 4 degrees of aspect from 5 to
85 degrees for a total of 21 exemplar images (i.e. N_t = 21). Figure 5 shows the peak output response over all aspects of the training vehicle (1a). As seen in the figure, the degradation as the aspect changes is removed. The MSF response has been overlaid to highlight
the differences.
The peak output response over all exemplars in the testing set is shown in figure 6.
From the perspective of peak response, the filter generalizes fairly well. However, as in the
MSF, the usefulness of the filter as a discriminant between vehicles 1 and 2 is clearly limited.
Figure 7 shows the resulting output plane response when the SDF filter is correlated
with a single image of vehicle 1a. The localization of the peak is similar to the MSF case.
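The SDF solution h = X(X†X)⁻¹d can be checked numerically. A minimal sketch with synthetic random "images" standing in for the ISAR exemplars (an assumption for illustration only): every training exemplar produces exactly its constrained output.

```python
import numpy as np

rng = np.random.default_rng(2)
N, Nt = 64, 5                         # pixels per image, training exemplars
X = rng.standard_normal((N, Nt))      # columns: reordered training images
d = np.ones(Nt)                       # unit constraints (recognition class)

# SDF solution h = X (X^T X)^{-1} d (real-valued case).
h = X @ np.linalg.solve(X.T @ X, d)

assert np.allclose(X.T @ h, d)        # all peak constraints satisfied exactly
```

Because h lies in the span of the training exemplars, it is the minimum-norm (minimum white-noise response) filter among all those satisfying the constraints.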


OTSDF filter, but such nomenclature does not convey the type of pre-processing that is
being performed. We choose the value of \alpha so as to compare to the best possible MACE
filter for this data set.
The nonlinear filter will use the same pre-processor as the linear filter (i.e. \alpha = 0.99).
The MLP structure is shown at the bottom of figure 22. It accepts an N_1 N_2 input vector (a
preprocessed image reordered into a column vector), followed by two hidden layers (with
two and three hidden PE nodes, respectively), and a single output node. The parameters of
the MLP

    W_1 \in R^{N_1 N_2 \times 2}, \quad W_2 \in R^{2 \times 3}, \quad W_3 \in R^{3 \times 1}, \quad \varphi \in R^{3 \times 1}

are to be determined through gradient search. The gradient search technique used in all
cases will be the error backpropagation algorithm.
4.6.1 Experiment I: noise training
As stated, using the statistical approach, the rejection class is characterized by white
noise sequences at the input to the MLP. The recognition class is characterized by the
exemplars. It is from these white noise sequences that the MLP, through the backpropagation learning algorithm, captures information about the rejection class. So it would seem a
simple matter, during the training stage, to present random white noise sequences as the
rejection class exemplars. This is exactly the training method used for this experiment.
From our empirical observations we found that with this method of training the linear
solution is a strong attractor. The results of the first experiment demonstrate this behavior.
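A toy sketch of this architecture and training scheme follows. The dimensions are hypothetical (a 16-sample input stands in for the N1N2 image vector), the recognition exemplars are random stand-ins, and tanh nonlinearities with plain batch backpropagation are our assumptions; fresh white-noise draws serve as rejection exemplars each step, as described above.

```python
import numpy as np

rng = np.random.default_rng(3)
N_in, H1, H2 = 16, 2, 3          # toy input size; hidden layers of 2 and 3 PEs

W1, b1 = 0.1 * rng.standard_normal((N_in, H1)), np.zeros(H1)
W2, b2 = 0.1 * rng.standard_normal((H1, H2)), np.zeros(H2)
W3, b3 = 0.1 * rng.standard_normal((H2, 1)), np.zeros(1)

def forward(x):
    a1 = np.tanh(x @ W1 + b1)
    a2 = np.tanh(a1 @ W2 + b2)
    return np.tanh(a2 @ W3 + b3), (a1, a2)

recog = rng.standard_normal((4, N_in))   # stand-ins for recognition exemplars
lr = 0.02
for step in range(1000):
    noise = rng.standard_normal((4, N_in))   # fresh white-noise rejection draws
    Xb = np.vstack([recog, noise])
    d = np.concatenate([np.ones(4), np.zeros(4)])[:, None]
    y, (a1, a2) = forward(Xb)
    # Backpropagation of the squared error through the tanh layers.
    e3 = (y - d) * (1 - y ** 2)
    e2 = (e3 @ W3.T) * (1 - a2 ** 2)
    e1 = (e2 @ W2.T) * (1 - a1 ** 2)
    W3 -= lr * a2.T @ e3; b3 -= lr * e3.sum(0)
    W2 -= lr * a1.T @ e2; b2 -= lr * e2.sum(0)
    W1 -= lr * Xb.T @ e1; b1 -= lr * e1.sum(0)
```

After training, the network's response to the recognition exemplars should exceed its average response to fresh white noise, which is the behavior experiment I examines.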


With these conditions, equation 117 can be rewritten

    f_r(u) = \int f_U(x) \, \kappa'(u - x) \, dx
           = \frac{1}{a^N} \int_{-a/2}^{a/2} \cdots \int_{-a/2}^{a/2} \frac{-(u - x)}{(2\pi)^{N/2} \sigma^{N+2}} \exp\!\left(-\frac{(u - x)^T (u - x)}{2\sigma^2}\right) dx
           = \frac{-1}{a^N (2\pi)^{N/2} \sigma^{N+2}} \int_{-a/2}^{a/2} \cdots \int_{-a/2}^{a/2} (u - x) \exp\!\left(-\frac{(u - x)^T (u - x)}{2\sigma^2}\right) dx.
    (118)


ization measures as described above are used to choose the filter versus the best ROC performance achieved throughout the range of aspect separation. In one regard, the
generalization measures were consistent in that the same aspect separation was predicted
by both measures for both settings of \alpha. In figure 26 we compare the ROC curves for two
cases, first where the filter is chosen using the generalization measures and second the best
achieved ROC curve, for both settings of \alpha. We would expect that for each \alpha the filter
chosen using the generalization measure would be near the best ROC performance. As can be
seen in the figure this is not the case.
Table 1. Classifier performance measures when the filter is determined by either of the
common measures of generalization as compared to best classifier performance for two
values of \alpha.

                                Generalization Measure
                                E_min        E_mse        Best
    \alpha = 0.50
      P_fa @ P_d = 0.8          0.24         0.24         0.16
      ROC area                  0.83         0.83         0.90
    \alpha = 0.95
      P_fa @ P_d = 0.8          0.16         0.16         0.07
      ROC area                  0.94         0.94         0.95
It is obvious from figures 24 and 25 that the generalization measures are not significantly correlated with the ROC performance. In fact, as summarized in table 2, the generalization measures are negatively, albeit weakly, correlated with ROC performance. One
feature of figures 24 and 25 is that although ROC performance varies independent of


present experimental results which exhibit improved discrimination and generalization
performance with respect to the MACE filter while maintaining the properties of a localized
detection peak and low variance in the output plane.
4.2 A Proposed Nonlinear Architecture
As we have stated, the MACE filter can be decomposed as a pre-whitening filter followed by a synthetic discriminant function (SDF), which can also be viewed as a special
case of Kohonen's linear associative memory (LAM) [Hester and Casasent, 1980; Fisher
and Principe, 1994]. This decomposition is shown at the top of figure 22. The nonlinear
filter architecture that we are proposing is shown in the middle of figure 22. In this
architecture we replace the LAM with a nonlinear associative memory, specifically a feed-forward multi-layer perceptron (MLP), shown in more detail at the bottom of figure 22.
We will refer to this structure as the nonlinear MACE filter (NL-MACE) for brevity.
Another reason for choosing the multi-layer perceptron (MLP) is that it is capable of
achieving a much wider range of discriminant functions. It is well known that an MLP
with a single hidden layer can approximate any discriminant function to an arbitrary
degree of precision [Funahashi, 1989]. One of the shortcomings of distortion invariant
approaches such as the MACE filter is that they attempt to fit a hyper-plane to the training
exemplars as the discriminant function. Using an MLP in place of the LAM relaxes this
constraint. MLPs do not, in general, allow for analytic solutions. We can, however, determine their parameters iteratively using gradient search.


Figure 13. OTSDF peak output response of vehicles 1b and 2a over all aspect
angles. Generalization is better than in the MACE filter. Vehicle 1b is
shown in dashed line, vehicle 2a in dash-dot line.
Figure 14. Decomposition of a distortion invariant filter in the space domain. The
notation used assumes that the image and filter coefficients have been
re-ordered into vectors. The input image vector, x, is pre-processed
by the linear transformation, y = Ax. The resulting vector is
processed by a synthetic discriminant function, y_out = y^{\dagger} h.


Extension to nonlinear signal processing is not without cost. Solutions must in general
be computed iteratively. Our approach was motivated by the early proof that the MACE
filter is equivalent to the linear associative memory (LAM). The associative memory perspective is more properly associated with the classification problem and has been developed extensively in an iterative framework.
In this thesis we demonstrate a method emphasizing a statistical perspective of the
MACE filter optimization criterion. Through the statistical perspective, efficient methods
of representing the rejection and recognition classes are derived. This, in turn, enables a
machine learning approach and the synthesis of more powerful nonlinear discriminant
functions which maintain the desirable properties of the linear MACE filter, namely, localized
detection and shift invariance.
We also present a new information theoretic approach to training in a self-organized or
supervised manner. Information theoretic signal processing looks beyond the second order
statistical characterization inherent in the linear systems approach. The information theoretic framework probes the probability space of the signal under analysis. This technique
has wide application beyond nonlinear MACE filter techniques and represents a powerful
new advance in the area of information theoretic signal processing.
Empirical results, comparing the classical linear methodology to the nonlinear extensions, are presented using inverse synthetic aperture radar (ISAR) imagery. The results
demonstrate the superior classification performance of the nonlinear MACE filter.


boundary of the convex hull of the training set. The convex hull of a set \{x_1, x_2, \ldots, x_{N_t}\}
is defined as all points which can be represented as

    x = \sum_{i=1}^{N_t} a_i x_i,

where the a_i are constrained to satisfy

    a_i \ge 0, \qquad \sum_{i=1}^{N_t} a_i = 1.
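Points satisfying these constraints are easy to sample by drawing random weight vectors, for example from a Dirichlet distribution (the use of Dirichlet sampling here is our illustrative choice, not the dissertation's procedure):

```python
import numpy as np

def convex_hull_samples(X, n, rng=None):
    """Draw n points from the convex hull of the rows of X by sampling
    weights a_i >= 0 with sum a_i = 1 from a Dirichlet distribution."""
    if rng is None:
        rng = np.random.default_rng()
    A = rng.dirichlet(np.ones(len(X)), size=n)   # (n, Nt) valid weight vectors
    return A @ X

rng = np.random.default_rng(4)
X = rng.standard_normal((5, 3))                  # 5 exemplars in R^3
pts = convex_hull_samples(X, 100, rng)
```

Each sampled point is a convex combination of the exemplars, so every coordinate stays within the range spanned by the exemplars.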
It was pointed out by Kumar et al. that when the peak constraints for the SDF (or
any of the linear distortion invariant filters) are all set to unity, points in the interior of the
convex hull over the training exemplars are recognized as well as those near the extremal points. This would include, for example, an image which is the mean of the training
exemplars. Examination of imagery derived from points that are closer to the interior of
the convex hull, rather than near the boundary, would indicate that they are not representative of the recognition class.
It was suggested that a way to mitigate this property was to set the desired outputs over
the training set to be complex, unity magnitude and zero mean. The magnitude of the output was then used as the response. In this way only points near the boundary of the convex
hull are recognized.
The approach taken here is similar in that exemplars from the interior of the convex
hull are used as representative of the rejection class. The difference is that this description
is included in the learning process without a priori determining the decision surface (e.g.


then applied to a hard limiter which assigns a class to the input pattern. Mathematically
this can be represented by

    c = \mathrm{sgn}(y - \varphi) = \mathrm{sgn}(w^T x - \varphi),

where \mathrm{sgn}(\cdot) is the signum function, \varphi is a threshold, and w, x \in R^{N \times 1} are column
vectors containing the coefficients of the combiner weights and the pattern, respectively. In
the context of classification, this architecture is trained iteratively using the least mean
square (LMS) algorithm [Widrow and Hoff, 1960]. For a two class problem the desired
output, d in the figure, is set to \pm 1 depending on the class of the input pattern; the LMS
algorithm then minimizes the mean square error (MSE) between the classification output
c and the desired output. Since the error function, e_c, can only take on three values, \pm 2
and 0, minimization of the MSE is equivalent to minimizing the average number of actual
errors.
Figure 15. Adaline architecture
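A minimal sketch of this arrangement follows. In the classical Adaline, LMS adapts on the linear combiner output y before the hard limiter, which is how it is written here; the toy data, step size, and variable names are our assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
N, mu = 2, 0.05                            # pattern size, LMS step size
w, phi = np.zeros(N), 0.0

# Linearly separable toy data: class +1 when x0 + x1 > 0, else -1.
X = rng.standard_normal((500, N))
d = np.sign(X.sum(axis=1))

for x, target in zip(X, d):
    y = w @ x - phi                        # linear combiner output
    err = target - y                       # LMS error, taken before the limiter
    w += mu * err * x                      # Widrow-Hoff weight update
    phi -= mu * err                        # threshold update

c = np.sign(X @ w - phi)                   # hard-limited class assignments
```

After a single pass, the weights approximate the Wiener solution for these targets and the limited output classifies the training patterns almost perfectly.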


to the training algorithm and classification performance. Specifically, an orthogonality
constraint on the input layer of the multi-layer perceptron was sufficient to guarantee
uncorrelated features over the rejection class. Projection of the white noise rejection class
exemplars onto the space of the recognition class data matrix effectively reduced the
dimensionality of the problem from N_1 N_2 (the image size) to N_t (the number of recognition class exemplars). The result of this modification was a significantly faster convergence rate. The last result borrowed the idea of using the interior of the convex hull (over
the recognition class exemplars) as representative of the rejection class. This is a further
refinement of the concept of reducing the intrinsic dimensionality of the problem. There were
two observations concerning the convex hull approach. Convergence times were considerably longer and the stability of the iterative procedure became an issue. However, when
the training did converge, the classification performance was superior to the previous
cases. We feel that the results of the convex hull approach merit further investigation.
In chapter 5 we presented a new information theoretic feature extraction method. We
provided a clear motivation (Fano's inequality) for using mutual information as the criterion for feature extraction in a classification framework. It is our opinion that this new
method represents a significant advance to the state of the art for self-organizing systems
and information theoretic signal processing in several regards. It utilizes the continuous
form of entropy and mutual information as opposed to the discrete form; consequently it
can be used for continuous mappings. In contrast to previous entropy based approaches it
poses no limitation on the number of hidden layers of the network mapping. Also it does
not require the underlying pdf to be unimodal, again in contrast to previous approaches.
These qualities make it a very powerful method for information theoretic signal process-


Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy
NONLINEAR EXTENSIONS TO THE MINIMUM AVERAGE
CORRELATION ENERGY FILTER
By
John W. Fisher III
May 1997
Chairman: Dr. José C. Principe
Major Department: Electrical and Computer Engineering
The major goal of this research is to develop efficient methods by which the family of
distortion invariant filters, specifically the minimum average correlation energy (MACE)
filter, can be extended to a general nonlinear signal processing framework. The primary
application of MACE filters has been to pattern classification of images. Two desirable
qualities of MACE-type correlators are ease of implementation via correlation and analytic computation of the filter coefficients.
Our motivation for exploring nonlinear extensions to these filters is due to the well-known limitations of the linear systems approach to classification. Among these limitations is the attempt to solve the classification problem in a signal representation space,
whereas the classification problem is more properly solved in a decision or probability
space. An additional limitation of the MACE filter is that it can only be used to realize a
linear decision surface regardless of the means by which it is computed. These limitations
lead to suboptimal classification and discrimination performance.


CHAPTER 5
INFORMATION-THEORETIC FEATURE EXTRACTION
5.1 Introduction
The material presented in this section is motivated by the analysis of the previous
chapter. Recall that beginning with section 4.6.1 our analysis of the nonlinear system was
aided by reference to a feature space within the MLP architecture. The designation of the
feature space led to useful modifications to the iterative training algorithm. In the
previous analysis, however, the generation of the features was a function of the training algorithm with regard to the desired system response. The analysis did show, however, that
the representation of the data in the feature space was critical to the classification performance. This section examines the de-coupling of the feature extraction stage from the
training of the discriminant function in the overall system architecture.
Of course, when the feature extraction is de-coupled it is important to use a criterion
which is related to the overall goal of classification. The approach described here uses an
information theoretic measure, namely mutual information, as a criterion for adaptation.
We will show that, although the feature extraction is de-coupled from the classifier training, the resulting features are, in fact, specific to classification. This method represents a
new advance in the area of information theoretic signal processing and as such has wide
application beyond nonlinear extensions to MACE filters and classification.
We have recently presented a maximum entropy based technique for feature extraction
[Fisher and Principe, 1995c] which we now extend to mutual information. This method


Substituting (104) into (101) yields the solution

    (105)

A.3 Convolution of Gaussian Kernel with its Gradient
The N-dimensional Gaussian kernel takes the form

    \kappa(u) = \frac{1}{(2\pi)^{N/2} |\Sigma|^{1/2}} \exp\!\left(-\frac{1}{2} u^T \Sigma^{-1} u\right),    (106)

where u \in R^{N \times 1} and the covariance term \Sigma \in R^{N \times N}. The gradient of the kernel with respect to u has the form [Fukunaga, 1990]

    \frac{\partial \kappa}{\partial u} = -\kappa(u) \, \Sigma^{-1} u.    (107)

Here we are interested in these functions when the covariance term has the simplified
form

    \Sigma = \sigma^2 I = \begin{bmatrix} \sigma^2 & 0 & \cdots & 0 \\ 0 & \sigma^2 & & \vdots \\ \vdots & & \ddots & 0 \\ 0 & \cdots & 0 & \sigma^2 \end{bmatrix}.


estimate of the probability distribution, f_Y(\cdot), of a random vector Y at a point u is defined as

\hat{f}_Y(u) = \frac{1}{N_o} \sum_{i=1}^{N_o} \kappa(u - y_i)    (66)

where N_o is the number of observations. The vectors y_i \in \Re^N are observations of the random vector and \kappa(\cdot) is a kernel function which itself satisfies the properties of PDFs (i.e. \kappa(u) \geq 0 and \int \kappa(u)\,du = 1). The Parzen window estimate can be viewed as a convolution of the estimator kernel with the observations. Since we wish to make a local estimate of the PDF, the kernel function should also be localized (i.e. uni-modal, decaying to zero). In the method we describe we will also require that \kappa(\cdot) be differentiable everywhere.
There are several properties of the Parzen density estimate of note. If the estimator kernel function satisfies the properties above, the estimate will satisfy the properties of a PDF. In the limit of infinite observations the estimator approaches the true underlying distribution convolved with the kernel function, that is

\lim_{N_o \to \infty} \hat{f}_Y(u) = f_Y(u) * \kappa(u),    (67)

consequently, the Parzen window estimator is a biased estimator. The bias can be made arbitrarily small by reducing the extent of the kernel, at the cost of raising the variance [Hardle, 1990].
In the multidimensional case the form of the kernel is typically gaussian or hypercube.
As a result of the differentiability requirement of our method, the gaussian kernel is most


RV X and a related RV Y, the probability of incorrectly estimating X based on an estimate derived from observations of Y is lower bounded by

P(X \neq \hat{X}) \geq \frac{h(X|Y) - 1}{\log(N)}    (62)

where N is the number of discrete events or classes represented by the RV X and \hat{X} is the estimate of X. Using equation 58 we can rewrite Fano's inequality as a function of the mutual information of X and Y as follows

P(X \neq \hat{X}) \geq \frac{h(X) - I(X,Y) - 1}{\log(N)}    (63)
In this form of Fano's inequality we see that the lower bound on the error probability is minimized when the mutual information between X and Y is maximized.
It can be shown that the upper bound on the probability of error is

P(X \neq \hat{X}) \leq (1 - \max_i\{P_i\}) \leq (1 - 1/N),    (64)

where P_i is the prior probability of the ith class of X and \max_i\{P_i\} is the maximum over the set of P_i's. Equation 64 is itself upper bounded by (1 - 1/N), the case in which all classes are equally likely. This upper bound is met with equality when the mutual information between X and Y is zero and the optimal class estimator reverts to choosing the class with the greatest prior probability.
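The behavior of the lower bound of equation 63 can be sketched numerically (illustrative code; entropies are taken in nats, and the hypothetical four-class, equal-prior problem is our own choice):

```python
import numpy as np

def fano_lower_bound(H_X, I_XY, N):
    """Lower bound on P(X != X_hat) from equation 63:
    (H(X) - I(X,Y) - 1) / log(N), entropies in nats."""
    return (H_X - I_XY - 1.0) / np.log(N)

N = 4
H_X = np.log(N)  # equally likely classes: H(X) = log N
for I_XY in (0.0, 0.5, 1.0):
    # The bound shrinks as the mutual information I(X,Y) grows.
    print(I_XY, fano_lower_bound(H_X, I_XY, N))
```

Raising I(X,Y) lowers the bound, which is the motivation for maximizing mutual information during feature extraction; the upper bound of equation 64 for this example is 1 - 1/4 = 0.75.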
This approach is depicted in figure 43 with regard to a Bayesian framework. In the figure, C is a discrete RV which represents the class. The function

P(C = c_i) = P_i, \quad i = 1, \ldots, N    (65)


These measures are not, however, sufficient to describe classification performance. We
have also used these measures in the past but feel that they are not the most appropriate for
classification [Fisher and Principe, 1995b].
4.4 Statistical Characterization of the Rejection Class
We now present a statistical viewpoint of distortion invariant filters from which such nonlinear extensions fit naturally into an iterative framework. This treatment results in an efficient way to capture the optimality condition of the MACE filter using a training algorithm which is approximately of order N, and which leads to better classification performance than the linear MACE.
A possible approach to design a nonlinear extension to the MACE filter and improve on the generalization properties is to simply substitute the linear processing elements of the LAM with nonlinear elements. Since such a system can be trained with error back-propagation [Rumelhart et al., 1986], the issue would be simply to report on performance comparisons with the MACE. Such a methodology does not, however, lead to understanding of the role of the nonlinearity, and does not elucidate the trade-offs in the design and in training.
Here we approach the problem from a different perspective. We seek to extend the optimality condition of the MACE to a nonlinear system, i.e. the energy in the output space is minimized while maintaining the peak constraint at the origin. Hence we will impose these constraints directly in the formulation, even knowing a priori that an analytical solution is very difficult or impossible to obtain. We reformulate the MACE filter from


5.7.1 Maximum Entropy: Single Vehicle Class
In the first experiment we trained the feature extractor on a single vehicle (upper vehicle in figure 52) over 180 degrees of aspect with 3.6 degrees of aspect between each exemplar. We show the mapping of the input images onto the two-dimensional output feature space in figures 53, 54, and 55 after 100, 200 and 300 iterations, respectively. The mappings of the images into the feature space are connected in order of increasing aspect. In the latter two plots it is clear that the extracted features have begun to fill the output feature space, but have also maintained aspect dependence on the images. We believe that this is evidence that while the method increases the statistical independence of the two output features, it is still tuned to the underlying distortion of the input vehicle class as represented by rotation through aspect.
Figure 53. Single vehicle experiment, 100 iterations. Projection of training (top
left) and testing (top right) images onto feature space.
We believe that this is evidence that the mapping has maintained topological neighborhoods in a similar fashion to the Kohonen self-organizing feature map (SOFM) [Kohonen,


Consequently, the principal components are the eigenvectors of the matrix

\begin{bmatrix} 0.62 & 1 \\ 1 & 0.62 \end{bmatrix}

with the major principal component at 45 degrees above the x_0-axis. As in the previous case we compare the principal component mapping to the maximum entropy mapping. The results are shown in figure 51. Again, as in the previous case, it is evident that the maximum entropy mapping is better related to the underlying structure of the distribution, as it has found the separate directions of the individual modes, whereas the PCA projection has essentially averaged the directions.
Figure 51. PCA vs. Entropy non-gaussian case. Left: image of PCA features
shown as contours. Right: Entropy mapping shown as contours.
These experiments help to illustrate the differences between PCA (a signal representation method) and entropy (an information-theoretic method). PCA is primarily concerned with direction finding and only considers the second order statistics of the underlying data, while entropy explores the structure of the data class. In a few limited cases, second order


vehicle 1a (training)
vehicle 1b (testing)
vehicle 2a (testing)
Figure 1. ISAR images of two vehicle types. Vehicles are shown at aspect angles of 5, 45, and 85 degrees respectively. Two different vehicles of type 1 (a and b) are shown, while one vehicle of type 2 (a) is shown. Vehicle 1a is used as a training vehicle, while vehicle 1b is used as the testing vehicle for the recognition class. Vehicle 2a represents a confusion vehicle.
peak output response begins to degrade. Depending on the type of imagery as well as the
vehicle, this degradation can become very severe.


tion of statistically independent features. The method has wide application beyond nonlinear extensions to MACE filters and as such represents a powerful new technique for information theoretic signal processing. A review of information theoretic approaches to signal processing is presented in this chapter. This is followed by the derivation of the new technique as well as some general experimental results which are not specifically related to nonlinear MACE filters, but which serve to illustrate the potential of this method. Finally the logical placement of this method within nonlinear MACE filters is presented along with experimental results.
In chapter 6 we review the significant results and contributions of this dissertation. We
also discuss possible lines of research resulting from the base established here.


Kullback, S. (1968); Information Theory and Statistics, Dover Publications, New York.
Kullback, S., and R. Leibler (1951); On information and sufficiency, Ann. Math. Stat. 22: 79-86.
Kumar, B. (1986); Minimum variance synthetic discriminant functions, J. Opt. Soc. Am. A 3 (10): 1579-1584.
Kumar, B. (1992); Tutorial survey of composite filter designs for optical correlators, Appl. Opt. 31 (23): 4773-4801.
Kumar, B., Z. Bahri, and A. Mahalanobis (1988); Constraint phase optimization in minimum variance synthetic discriminant functions, Appl. Opt. 27 (2): 409-413.
Kumar, B., A. Mahalanobis, S. Song, S. Sims, and J. Epperson (1992); Minimum squared error synthetic discriminant functions, Opt. Eng. 31 (5): 915-922.
Kumar, B., J. Brasher, C. Hester, G. Srinivasan, and S. Bollapragada (1994); Synthetic discriminant functions for recognition of images on the boundary of the convex hull of the training set, Pattern Recognition 27 (4): 543-548.
Kumar, B. V. K., and A. Mahalanobis (1995); Recent advances in distortion-invariant correlation filter design, Proceedings of SPIE, 2490: 2-13.
Kung, S. Y. (1992); Digital Neural Networks, Prentice-Hall, New Jersey.
Linsker, R. (1988); Self-organization in a perceptual system, Computer, 21: 105-117.
Linsker, R. (1990); How to generate ordered maps by maximizing the mutual information between input and output signals, Neural Computation, 1: 402-411.
Mahalanobis, A., B.V.K. Vijaya Kumar, and D. Casasent (1987); Minimum average correlation energy filters, Appl. Opt. 26 (17): 3633-3640.
Mahalanobis, A., and H. Singh (1994); Application of correlation filters for texture recognition, Appl. Opt. 33 (11): 2173-2179.
Mahalanobis, A., B.V.K. Vijaya Kumar, Sewoong Song, S.R.F. Sims, and J.F. Epperson (1994); Unconstrained correlation filters, Appl. Opt. 33 (33): 3751-3759.
Mahalanobis, A., A. V. Forman, N. Day, M. Bower, and R. Cherry (1994); Multi-class SAR ATR using shift-invariant correlation filters, Pattern Recognition 27 (4): 619-626.


were computed from a sampled estimate of the observed PDF. The estimation locations are represented by the cross symbols in figure 63.
As we can see, both the integrated squared error measure and the measure of equation 91 are adequate estimates of the entropy. Equation 91, however, is much less computationally expensive than the other two.
Figure 64. Stopping criterion. Comparison of entropy, integrated squared error,
and distance derived stopping criterion. Integrated squared error and
the distance derived criterion are reasonable approximations to the
criterion of interest, entropy.
5.10 Observations
We have described a nonparametric approach to information theoretic feature extraction. We believe that this method can be used to improve classification performance by directly choosing relevant features for classification via maximization of mutual information. A critical capability of the information theoretic approach is the ability to adapt the entropy of the output space of the nonlinear projection. We have shown that


is the probability density function of the class, where P_i is the prior probability of the ith class and N is the number of classes. The probability density function of X is conditioned on the class. The feature vector, y = g(x, \alpha), is derived from the observation of X and is itself a random vector prior to observation. It is from the feature vector, y, that we wish to estimate the class. Our goal is to choose the parameters, \alpha, of the mapping g(\cdot, \alpha) such that the mutual information between Y and C is maximized. We are still left with the task of determining the estimator, \hat{C}; however, from Fano's inequality we know that if I(C, Y) is maximized, the lower bound on the classification error will be minimized.
Figure 43. Mutual information approach to feature extraction. An observation of the random variable X is generated by the probability density function f(x | C), which is conditioned on the discrete random variable C, which is characterized by the discrete probability density function P(C). The features y, derived from the observation of X, are used to estimate C.
5.3.3 Prior Work in Information Theoretic Neural Processing
The concept of using information theoretic measures in neural processing is not new. One application, related to feature extraction, was for the purpose of generating ordered maps [Linsker, 1990]. In that work, a modification of Kohonen's self-organizing feature map (SOFM) [Kohonen, 1988, 1995], entropy is used as a competitive measure for adaptation. Specifically, input exemplars are mapped onto a discrete lattice and entropy is used as a measure for determining which lattice point to adapt. The method differs from the presentation here in two ways


The N-dimensional gaussian kernel evaluated at some u is (simplified for two dimensions)

\kappa(u) = \frac{1}{(2\pi)^{N/2}|\Sigma|^{1/2}} \exp\left(-\frac{u^T \Sigma^{-1} u}{2}\right)
          = \frac{1}{2\pi\sigma^2} \exp\left(-\frac{u^T u}{2\sigma^2}\right), \quad \Sigma = \sigma^2 I, \; N = 2    (77)

The partial derivative of the kernel (also simplified for the two-dimensional case) is

\frac{\partial \kappa}{\partial u} = -\kappa(u)\Sigma^{-1}u
          = -\frac{u}{2\pi\sigma^4} \exp\left(-\frac{u^T u}{2\sigma^2}\right), \quad \Sigma = \sigma^2 I, \; N = 2    (78)
These functions are shown in figure 46 for the two-dimensional case. Recall that the summed distribution-error term in equation 74 replaces the standard supervised error direction term in the backpropagation algorithm. From the figure we see that when we are maximizing entropy, the distribution error through the kernel functions acts as a local attractor when the computed PDF error is positive and as a local repellor when the PDF error is negative. When we are minimizing entropy the behavior is opposite. In this way the adaptation procedure operates in the feature space locally from a globally derived measure of the output space (PDF estimate).
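A minimal sketch of equations 77 and 78 (illustrative NumPy code; the function names, the choice \sigma = 1, and the test point are our own), confirming that the kernel gradient points back toward the kernel center, the source of the attractor/repellor behavior just described:

```python
import numpy as np

def gauss_kernel(u, sigma):
    """Two-dimensional Gaussian kernel of equation 77 with Sigma = sigma^2 I."""
    return np.exp(-(u @ u) / (2 * sigma**2)) / (2 * np.pi * sigma**2)

def gauss_kernel_grad(u, sigma):
    """Gradient of the kernel (equation 78): -kappa(u) * u / sigma^2."""
    return -gauss_kernel(u, sigma) * u / sigma**2

u = np.array([0.5, -0.3])
g = gauss_kernel_grad(u, sigma=1.0)
# The gradient points from u back toward the origin (the kernel centre);
# flipping its sign gives the repellor direction used when maximizing entropy.
print(g)
```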
The one-dimensional example of figure 47 further illustrates the entropy minimizing/maximizing behavior of the algorithm. The figure shows a bi-modal distribution (that has presumably been estimated from observations) overlain on a desired distribution that is uniform from -1 to 1. Also shown in the figure is the gradient of the estimator kernel (the kernel is


experiments and the result is similar to experiment II. The classifier performance as measured in table 3 and the ROC curve of figure 36 are also nominally the same.
Figure 35. Experiment III: Resulting feature space when the subspace noise is used for training. Symbols represent the same data as in the previous case.
There are, however, two notable differences. Examination of figure 37 shows that the
output response to shifted images is even lower allowing for better localization. This con-


These experiments examine the relationship between the two commonly used measures of generalization and two measures of classification performance. We can draw conclusions from the results about the appropriateness of the generalization measures with regard to classification. The first generalization measure is the minimum peak response, denoted y_min, taken over the aspect range of the images of the training vehicle (excluding the aspects used for computing the filter). The second generalization measure is the mean square error, denoted y_mse, between the desired output of unity and the peak response over the aspect range of the images of the training vehicle (excluding the aspects used for computing the filter). The classification measures are taken from the receiver operating characteristic (ROC) curve measuring the probability of detection, P_d, of a testing vehicle in the recognition class (vehicles 1b and 1c) versus the probability of false alarm, P_fa, on a testing vehicle in the confusion class (vehicles 2a and 2b) based on peak detection. The specific measures are the area under the ROC curve, a general measure of the test being used, and the probability of false alarm when the probability of detection equals 80%, which measures a single point of interest on the ROC curve.
Two filters are used, one with \lambda = 0.5 and the other with \lambda = 0.95; that is, one in which both criteria are weighted equally and one which is close to the MACE filter criterion. The number of exemplars drawn from the training vehicle (1a) is varied from 21 to 81, sampled uniformly in aspect (1 to 4 degrees of aspect separation between exemplars).
Examination of figures 24 and 25 shows that for both cases (\lambda equal to 0.5 and 0.95) no clear relationship emerges in which the generalization measures are indicators of good classification performance. Table 1 compares the classifier performance when the general-


It is through the implicit description of the rejection class by its second-order statistics that we have developed an efficient method extending the MACE filter and other related correlators to nonlinear topologies such as neural networks.
As stated, our goal is to find mappings, defined by a topology and a parameter set, which improve upon the performance of the MACE filter in terms of generalization while maintaining a sharp constrained peak in the center of the output plane for images in the recognition class. One approach, which leads to an iterative algorithm, is to approximate the original objective function of equation 46 with the modified objective function

J = (1 - \lambda)E\{g(\omega, X_1)^2\} + \lambda[g(\omega, x_2) - d]^T[g(\omega, x_2) - d]    (51)

The principal advantage gained by using equation 51 over equation 46 is that we can solve iteratively for the parameters of the mapping function (assuming it is differentiable) using gradient search. The constraint equations, however, are no longer satisfied with
equality over the training set. It has been recognized that the choice of constraint values has direct impact on the performance of optimized linear correlators. Sudharsanan et al. [1990] have explored techniques for optimally assigning these values within the constraints of a linear topology. Other methods have been suggested [Mahalanobis et al., 1994a, 1994b; Kumar and Mahalanobis, 1995] to improve the performance of distortion invariant filters by relaxing the equality constraints. Mahalanobis [1994a] extends this idea to unconstrained linear correlation filters. The OTSDF objective function of Réfrégier [1991] appears similar to the modified objective function and indeed, for a linear topology this can be solved analytically as an optimal trade-off problem.
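For a linear mapping g(\omega, x) = \omega^T x, the gradient search on the modified objective can be sketched as follows (illustrative NumPy code; the trade-off value, learning rate, iteration count, and synthetic data are our own choices, and the closed-form comparison holds only in this linear special case):

```python
import numpy as np

rng = np.random.default_rng(5)
N, K, lam = 10, 3, 0.95
X2 = rng.standard_normal((N, K))            # recognition-class exemplars x_2
d = np.ones(K)                              # desired peak values
R1 = np.cov(rng.standard_normal((N, 500)))  # rejection-class covariance estimate

# Gradient descent on J = (1-lam) w^T R1 w + lam ||X2^T w - d||^2:
# grad J = 2 (1-lam) R1 w + 2 lam X2 (X2^T w - d).
w = np.zeros(N)
eta = 0.01
for _ in range(30_000):
    grad = 2 * (1 - lam) * (R1 @ w) + 2 * lam * (X2 @ (X2.T @ w - d))
    w -= eta * grad

print(X2.T @ w)  # approximately, but no longer exactly, the constraint values d
```

As the text notes, the peak constraints are only approximately met; the weighting \lambda controls how strongly they are enforced relative to the rejection-class energy term.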


The goal of feature extraction is always to improve the overall system classification
performance. In the technique we are presenting now, the decomposition is explicit. The
feature extraction is decoupled from the determination of the discriminant function.
[Block diagrams: feature extraction g(x_i, \alpha) producing features y_i, followed by a discriminator d(y_i, \omega).]
Figure 41. Classical pattern classification decomposition.
Figure 42. Decomposition of NL-MACE as a cascade of feature extraction
followed by discrimination.


\{x_i \in \Re^{N \times 1}, d_i \in \Re, i = 1, \ldots, N_T\}, which are placed into an input data matrix, x = [x_1 \ldots x_{N_T}], and desired output vector, d = [d_1 \ldots d_{N_T}]^T; find the vector h \in \Re^{N \times 1} such that

x^T h = d    (28)

If the system of equations described by (28) is under-determined, the inner product

h^T h    (29)

is minimized using (28) as a constraint. If the system of equations is over-determined,

(x^T h - d)^T (x^T h - d)

is minimized.
Here, we are interested in the under-determined case. The optimal solution for the under-determined case, using the pseudo-inverse of x, is [Kohonen, 1988]

h = x(x^T x)^{-1} d.    (30)
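Equation 30 can be sketched numerically (illustrative NumPy code; the dimensions, random data, and unit desired outputs are our own choices):

```python
import numpy as np

# Under-determined case: N = 8 dimensional inputs, N_T = 3 exemplars.
rng = np.random.default_rng(1)
x = rng.standard_normal((8, 3))   # data matrix; columns are the exemplars x_i
d = np.array([1.0, 1.0, 1.0])     # desired outputs d_i

# Minimum-norm solution of equation 30: h = x (x^T x)^{-1} d.
h = x @ np.linalg.solve(x.T @ x, d)

print(x.T @ h)  # the constraints x^T h = d are satisfied
```

Of the infinitely many h satisfying x^T h = d, this one has the smallest norm h^T h, which is exactly the inner-product criterion of equation 29.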
As was shown in [Fisher and Principe, 1994], we can modify the linear associative memory model slightly by adding a pre-processing linear transformation matrix, A, and find h such that the under-determined system of equations

(Ax)^T h = d    (31)


Figure 10. Example of a typical OTSDF performance plot. This plot shows the trade-off, hypothetically, between the ACE criterion versus a noise variance criterion. The curved arrow on the performance bound indicates the direction of increasing \lambda for the two-criterion case. The curve is bounded by the MACE and MVSDF results.
degree by which one criterion is emphasized over another. We will not address that issue here, but simply set the value to \lambda = 0.95, indicating more emphasis on the MACE filter criterion.
The output plane response of the OTSDF is shown in figure 11. As compared to the MACE filter response, the output peak is not nearly as sharp, but it is still more localized than in the SDF case.
The peak output response over the training vehicle for the OTSDF is compared to the MACE filter in figure 12. The degradation for between-aspect exemplars is less severe than for the MACE filter. The peak output responses of vehicles 1b and 2a are shown in figure 13.


Using the DFT relationship of (94) and the DFT shift property x(n - m) \leftrightarrow \varphi_m(k) X(k), the matrix of average circular correlations can be expanded term by term and diagonalized by the DFT basis vectors \varphi_k, yielding

R = \Phi^{\dagger} P_x \Phi    (95)
where P is a diagonal matrix whose diagonal elements contain the periodogram of the
observed sequence x(n).
The output variance of an FIR filter h \in \Re^{N \times 1} due to a zero-mean, complex, wide-sense-stationary random noise sequence input n \in \Re^{N \times 1} is

\sigma_n^2 = E[(h^T n)^2]
           = E[(h^T n)(n^T h)]
           = h^T E[n n^T] h
           = h^T \Sigma_n h
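The final identity can be checked numerically (illustrative NumPy code; a real-valued noise process with an arbitrary covariance is assumed for simplicity):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 4
h = rng.standard_normal(N)

# Covariance of a (real, for simplicity) zero-mean noise input.
A = rng.standard_normal((N, N))
Sigma = A @ A.T / N                 # E[n n^T], symmetric positive definite

# Analytic output variance h^T Sigma h ...
var_analytic = h @ Sigma @ h

# ... versus a sample estimate over many noise draws n ~ N(0, Sigma).
L = np.linalg.cholesky(Sigma)
n = L @ rng.standard_normal((N, 200_000))
var_sample = np.var(h @ n)

print(var_analytic, var_sample)
```

The two values agree to within sampling error, which is the quadratic-form identity used throughout the MVSDF and MACE derivations.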


or

\frac{P_2\, p(x|C_2)}{P_1\, p(x|C_1) + P_2\, p(x|C_2)} = P(C_2|x)    (10)

which is the likelihood that the observation is drawn from class 2. If we had reversed the desired outputs, the result would have been the likelihood that the observation was drawn from class 1. This result, predicated by our choice of desired outputs, shows that for arbitrary f(x), the MSE criterion is equivalent to a probability of misclassification error criterion. In fact, it has been shown by Richard and Lippman [1991] (using other means) for the multi-class case that if the desired outputs are encoded as vectors, e_i \in \Re^{N \times 1}, where the ith element is unity and the others are zero, for an N-class problem the MSE criterion is equivalent to optimizing the Bayes criterion for classification.
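The equivalence can be illustrated numerically. In the sketch below (our own construction, not from the text: two unit-variance Gaussian class-conditional densities with equal priors), the MSE-optimal unrestricted output in each histogram bin, which is simply the bin mean of the 0/1 desired outputs, approaches the Bayes posterior P(C_2|x) of equation 10:

```python
import numpy as np

rng = np.random.default_rng(3)
# Two classes, equal priors, Gaussian class-conditional densities.
n = 100_000
x1 = rng.normal(-1.0, 1.0, n)                  # class 1
x2 = rng.normal(+1.0, 1.0, n)                  # class 2
x = np.concatenate([x1, x2])
d = np.concatenate([np.zeros(n), np.ones(n)])  # desired output: 1 for class 2

# The MSE-optimal unrestricted f(x) on a binned domain is the per-bin
# mean of d, which should approach the posterior P(C2 | x).
bins = np.linspace(-3, 3, 61)
idx = np.digitize(x, bins)
f = np.array([d[idx == k].mean() for k in range(1, len(bins))])

# Bayes posterior at the bin centres for comparison (normalizers cancel).
c = 0.5 * (bins[:-1] + bins[1:])
p2 = np.exp(-0.5 * (c - 1) ** 2)
p1 = np.exp(-0.5 * (c + 1) ** 2)
posterior = p2 / (p1 + p2)

print(np.max(np.abs(f - posterior)))
```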
3.2.2 Parameterized Functional Mappings
Suppose, however, that the function is not arbitrary, but is also a function of a parameter set, \alpha, as in f(x, \alpha). The MSE criterion of 5 can be rewritten

J(f) = E[(d - f(x, \alpha))^2]    (11)

The gradient of the criterion with respect to the parameters becomes


REFERENCES
Amit, D. J. (1989); Modeling Brain Function: The World of Attractor Neural Networks, Cambridge University Press, New York.
Bishop, C. (1995); Neural Networks for Pattern Recognition, Clarendon Press, Oxford.
Bell, A., and T. Sejnowski (1995); An information-maximization approach to blind separation and blind deconvolution, Neural Computation 7: 1129-1159.
Brasher, J., and J. Kinser (1994); Fractional-power synthetic discriminant functions, Pattern Recognition 27 (4): 577-585.
Casasent, D., G. Ravichandran, and S. Bollapragada (1991); Gaussian minimum average correlation energy filters, Appl. Opt. 30 (35): 5176-5181.
Casasent, D., and G. Ravichandran (1992); Advanced distortion-invariant minimum average correlation energy (MACE) filters, Appl. Opt. 31 (8): 1109-1116.
Chiang, H-C., R. Moses, S. Ahalt, and L. Potter (1995); Statistical properties of linear correlators for image pattern classification with application to SAR imagery, Proceedings of SPIE, 2490: 266-277.
Deco, G., and D. Obradovic (1996); An Information-Theoretic Approach to Neural Computing, Springer-Verlag, New York.
Fano, R. M. (1961); Transmission of Information: A Statistical Theory of Communication, Wiley, New York.
Figue, J., and P. Réfrégier (1993); Optimality of trade-off filters, Appl. Opt. 32 (11): 1933-1935.
Fisher, J., and J. C. Principe (1994); Formulation of the MACE filter as a linear associative memory, Proceedings of the IEEE International Conference on Neural Networks, 5: 2934-2938.
Fisher, J., and J. C. Principe (1995a); Experimental results using a nonlinear extension of the minimum average correlation energy (MACE) filter, Proceedings of SPIE, 2490: 41-52.


Figure 24. Generalization as measured by the minimum peak response. The plot compares y_min versus classification performance measures (ROC area and P_fa@P_d=0.8).


Figure 54. Single vehicle experiment, 200 iterations. Projection of training (top left) and testing (top right) images onto feature space. Adjacent aspect angles are connected by a line.
Figure 55. Single vehicle experiment, 300 iterations. Projection of training (top left) and testing (top right) images onto feature space.
1995]. The difference between this approach and the SOFM approach is that in this case the mapping is continuous, whereas in the SOFM the samples are mapped onto a discrete


Figure 20. Learning curve for LMS approach. The learning curve for the LMS algorithm when the data matrix is full rank is shown with a solid line; the non-full-rank case is shown with a dashed line.
useful to the ATR problem, namely that the class can be described by a sub-space, the analytic solution fails when the number of exemplars exceeds the dimensionality of the subspace. The iterative method, however, finds a reasonable solution. Furthermore, if the data matrix is full rank, the iterative method approaches the optimal analytic solution.
3.5 Comments
There are further motivations for the associative memory perspective and, by extension, the use of iterative methods. It is well known that non-linear associative memory structures can outperform their linear counterparts on the basis of generalization and dynamic range [Kohonen, 1988; Hinton and Anderson, 1981]. In general, they are more difficult to design as their parameters cannot be computed analytically. The parameters for a large


Figure 68. ROC curves for mutual information feature extraction (dotted line) versus linear MACE filter (solid line).


The extent of the attraction field between points is directly proportional to the kernel size as represented by \sigma in equation 84. In the equation we also see that the degree of attraction is inversely proportional to \sigma^{N+2}, where N is the dimension of the kernel. So as the kernel size decreases the degree of attraction increases dramatically.
Again, referring to figure 48, attraction to a point makes sense from an intuitive standpoint with regard to minimizing entropy. We also recognize that the influence of all of the points is additive. So in order to ensure that the net attraction is to a point, we simply set the gradient at the center of the attractor kernel to unity. The scale factor as a function of the kernel size and dimension is solved for in section A.3 of the appendix, with the result
Figure 61 illustrates three cases of scaling the attractor kernel for one dimension. We can see in the figure that when the attractor is scaled such that the slope is less than or equal to unity we will get stable attraction to a point. As a result, when minimizing entropy we are able to compute an explicit desired output as a function of the current configuration of the observations in the output space. This allows us to train a multi-layer perceptron in a supervised fashion. When the MSE of the error is reduced satisfactorily we can compute a new desired signal based on the new configuration of the observations in the output space.
One question which remains is how to set the size of the kernel. Towards that goal we note that figure 61 has been normalized by the kernel size, \sigma, and by virtue of our scale factor this plot can be extended to the multidimensional case as well. The field of influ-


tor, the well known solution to the minimum of equation 48 over the mapping parameters subject to the constraints of equation 45 is

\omega = R_{x_1}^{-1} x_2 [x_2^T R_{x_1}^{-1} x_2]^{-1} d    (49)

where

R_{x_1} = \text{estimate}\{E(X_1 X_1^T)\}.    (50)

Depending on the characterization of X_1, equation 49 describes various SDF-type filters (i.e. MACE, MVSDF, etc.). In the case of the MACE filter, the rejection class is characterized by all 2D circular shifts of target class images away from the origin. Solving for the MACE filter coefficients is therefore equivalent to using the average circular autocorrelation sequence (or equivalently the average power spectrum in the frequency domain) over images in the target class as estimators of the elements of the matrix E(X_1 X_1^T). Sudharsanan et al. [1991] suggest a very similar methodology for improving the performance of the MACE filter. In that case the average linear autocorrelation sequence is estimated over the target class and this estimator of E(X_1 X_1^T) is used to solve for linear projection coefficients in the space domain. The resulting filter is referred to as the SMACE (space-domain MACE) filter.
4.4.2 Nonlinear Mappings
For arbitrary nonlinear mappings it will, in general, be very difficult to solve for globally optimal parameters analytically. Our purpose is instead to develop iterative training algorithms which are practical and yield improved performance over the linear mappings.


h = \Sigma^{-1} X (X^{\dagger} \Sigma^{-1} X)^{-1} d.

In the case of white noise, the MVSDF is equivalent to the SDF. This technique has a significant numerical complexity issue, which is that the solution requires the inversion of an N \times N matrix (\Sigma) which for moderate image sizes (N = N_1 N_2) can be quite large and computationally prohibitive, unless simplifying assumptions can be made about its form (e.g. a diagonal matrix, Toeplitz, etc.).
The MVSDF can be seen as a more general extension of the matched filter to multiple
vector detection as most signal processing definitions of the matched filter incorporate a
noise power spectrum and do not assume the white noise case only. It is mentioned here
because it is the first distortion invariant filtering technique to recognize the need to characterize a rejection class.
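The MVSDF solution can be sketched numerically (illustrative NumPy code; real-valued data are assumed, so X^T replaces X^{\dagger}, and a diagonal noise covariance is used, which is one of the simplifying forms mentioned above):

```python
import numpy as np

rng = np.random.default_rng(4)
N, K = 16, 3                      # image dimension, number of exemplars
X = rng.standard_normal((N, K))   # columns are training exemplars
d = np.ones(K)                    # constrained peak values

# Noise covariance; a diagonal form keeps the N x N inversion cheap.
Sigma = np.diag(rng.uniform(0.5, 2.0, N))

# MVSDF: h = Sigma^{-1} X (X^T Sigma^{-1} X)^{-1} d.
Si = np.linalg.inv(Sigma)
h = Si @ X @ np.linalg.solve(X.T @ Si @ X, d)

print(X.T @ h)   # the equality constraints X^T h = d are met
```

Among all filters meeting the constraints, this h has the smallest output noise variance h^T \Sigma h; setting \Sigma = I reduces the expression to the SDF solution, matching the white-noise remark above.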


differs from previous methods in that it is not limited to linear topologies [Linsker, 1988] nor uni-modal probability density functions (PDFs) [Bell and Sejnowski, 1995]. The method is directly applicable to any nonlinear mapping which is differentiable in its parameters. In particular, we demonstrate that the technique can be applied to a feed-forward multi-layer perceptron (MLP) with an arbitrary number of hidden layers. It is also shown that the resulting iterative training algorithm fits naturally into the back-propagation method for training multi-layer perceptrons.
In this section we present some background information on feature extraction and
information theoretic approaches to signal processing. This is followed by the derivation
of the feature extraction method. Experimental results will be presented which illustrate
the usefulness of this approach. We will conclude with the logical placement of this
method within nonlinear MACE filters as well as experimental results which can be
directly compared to the results of section 4.6.
5.2 Motivation for Feature Extraction
We have shown in section 3.2 that, theoretically at least, the MSE criterion can be used to iteratively train a universal approximator (such as the multi-layer perceptron) to classify raw input variables directly. That is, the MSE criterion coupled with a universal approximator estimates posterior class probabilities. An issue for any estimator is the variance.
There are two ways to reduce the variance of the estimate of the posterior probabilities in this case: we can either supply more data (which may not be possible or practical) or we can somehow impose constraints on the system. Feature extraction is a means by which constraints can be imposed on the system [Fukanaga, 1990; Bishop, 1995]. This does not contradict the results of the previous chapter. The previous chapter illustrated that moving



PAGE 1

121/,1($5 (;7(16,216 72 7+( 0,1,080 $9(5$*( &255(/$7,21 (1(5*< ),/7(5 %\ -2+1 : ),6+(5 ,,, $ ',66(57$7,21 35(6(17(' 72 7+( *5$'8$7( 6&+22/ 2) 7+( 81,9(56,7< 2) )/25,'$ ,1 3$57,$/ )8/),//0(17 2) 7+( 5(48,5(0(176 )25 7+( '(*5(( 2) '2&725 2) 3+,/2623+< 81,9(56,7< 2) )/25,'$

PAGE 2

ACKNOWLEDGEMENTS

There are many people I would like to acknowledge for their help in the genesis of this manuscript. I would begin with my family for their constant encouragement and support. I am grateful to the Electronic Communications Laboratory and the Army Research Laboratory for their support of the research at the ECL. I was fortunate to work with very talented people, Marion Bartlett, Jim Bevington, and Jim Kurtz, in the areas of ATR and coherent radar systems. In particular, I cannot overstate the influence that Marion Bartlett has had on my perspective of engineering problems. I would also like to thank Jeff Sichina of the Army Research Laboratory for providing many interesting problems (perhaps too interesting) in the field of radar and ATR. A large part of who I am technically has been shaped by these people.

I would, of course, like to acknowledge my advisor, Dr. José Principe, for providing me with an invaluable environment for the study of nonlinear systems and excellent guidance throughout the development of this thesis. His influence will leave a lasting impression on me. I would also like to thank DARPA; funding by this institution enabled a great deal of the research that went into this thesis. I would also like to thank Drs. David Casasent and Paul Viola for taking an interest in my work and offering helpful advice.

I would also like to thank the students, past and present, of the Computational NeuroEngineering Laboratory. The list includes, but is not limited to: Chuan Wang, for useful discussions on information theory; Neil Euliano, for providing much needed recreational opportunities and intramural championship t-shirts; and Andy Mitchell, for being a good friend to go to lunch with, who suffered long, inane technical discussions, and who is now a better climber than me. There are certainly others, and I am grateful to all.

Finally, I would like to thank my wife, Anita, for enduring a seemingly endless ordeal, for allowing me to use every ounce of her patience, and for sacrificing some of her best years so that I could finish this Ph.D. I hope it has been worth it.


TABLE OF CONTENTS

ACKNOWLEDGEMENTS
LIST OF FIGURES
LIST OF TABLES
ABSTRACT

CHAPTERS

1 INTRODUCTION
   Motivation

2 BACKGROUND
   Discussion of Distortion Invariant Filters
   Synthetic Discriminant Function
   Minimum Variance Synthetic Discriminant Function
   Minimum Average Correlation Energy Filter
   Optimal Trade-off Synthetic Discriminant Function
   Preprocessor/SDF Decomposition

3 THE MACE FILTER AS AN ASSOCIATIVE MEMORY
   Linear Systems as Classifiers
   MSE Criterion as a Proxy for Classification Performance
   Unrestricted Functional Mappings
   Parameterized Functional Mappings
   Finite Data Sets
   Derivation of the MACE Filter
   Preprocessor/SDF Decomposition
   Associative Memory Perspective
   Comments

4 STOCHASTIC APPROACH TO TRAINING NONLINEAR SYNTHETIC DISCRIMINANT FUNCTIONS
   Nonlinear Iterative Approach
   A Proposed Nonlinear Architecture
   Shift Invariance of the Proposed Nonlinear Architecture
   Classifier Performance and Measures of Generalization
   Statistical Characterization of the Rejection Class
   The Linear Solution as a Special Case
   Nonlinear Mappings


   Efficient Representation of the Rejection Class
   Experimental Results
   Experiment I: Noise Training
   Experiment II: Noise Training with an Orthogonalization Constraint
   Experiment III: Subspace Noise Training
   Experiment IV: Convex Hull Approach

5 INFORMATION-THEORETIC FEATURE EXTRACTION
   Introduction
   Motivation for Feature Extraction
   Information Theoretic Background
   Mutual Information as a Self-Organizing Principle
   Mutual Information as a Criterion for Feature Extraction
   Prior Work in Information Theoretic Neural Processing
   Nonparametric PDF Estimation
   Derivation of the Learning Algorithm
   Gaussian Kernels
   Maximum Entropy/PCA: An Empirical Comparison
   Maximum Entropy ISAR Experiment
   Maximum Entropy: Single Vehicle Class
   Maximum Entropy: Two Vehicle Classes
   Computational Simplification of the Algorithm
   Conversion of Implicit Error Direction to an Explicit Error
   Entropy Minimization as Attraction to a Point
   Entropy Maximization as Diffusion
   Stopping Criterion
   Observations
   Mutual Information Applied to the Nonlinear MACE Filters

6 CONCLUSIONS

APPENDIX A: DERIVATIONS
REFERENCES
BIOGRAPHICAL SKETCH


LIST OF FIGURES

ISAR images of two vehicle types
MSF peak output response of training vehicle 1a over all aspect angles
MSF peak output response of testing vehicles 1b and 2a over all aspect angles
MSF output image plane response
SDF peak output response of training vehicle 1a over all aspect angles
SDF peak output response of testing vehicles 1b and 2a over all aspect angles
SDF output image plane response
MACE filter output image plane response
MACE peak output response of vehicles 1a, 1b, and 2a over all aspect angles
Example of a typical OTSDF performance plot
OTSDF filter output image plane response
OTSDF peak output response of vehicle 1a over all aspect angles
OTSDF peak output response of vehicles 1b and 2a over all aspect angles
Decomposition of distortion invariant filter in space domain
Adaline architecture
Decomposition of MACE filter as a preprocessor (i.e. a pre-whitening filter over the average power spectrum of the exemplars) followed by a synthetic discriminant function
Decomposition of MACE filter as a preprocessor (i.e. a pre-whitening filter over the average power spectrum of the exemplars) followed by a linear associative memory
Peak output response over all aspects of vehicle 1a when the data matrix is not full rank
Output correlation surface for LMS computed filter from non full rank data
Learning curve for LMS approach
NMSE between closed form solution and iterative solution
Decomposition of optimized correlator as a preprocessor followed by SDF/LAM (top); nonlinear variation shown with MLP replacing SDF in signal flow (middle); detail of the MLP (bottom). The linear transformation represents the space domain equivalent of the spectral preprocessor
ISAR images of two vehicle types, shown at three aspect angles each


Generalization as measured by the minimum peak response
Generalization as measured by the peak response mean square error
Comparison of ROC curves
ROC performance measures
Peak output response of linear and nonlinear filters over the training set
Output response of linear filter (top) and nonlinear filter (bottom)
ROC curves for linear filter (solid line) versus nonlinear filter (dashed line)
Experiment I: Resulting feature space from simple noise training
Experiment II: Resulting feature space when orthogonality is imposed on the input layer of the MLP
Experiment II: Resulting ROC curve with orthogonality constraint
Experiment II: Output response to an image from the recognition class training set
Experiment III: Resulting feature space when the subspace noise is used for training
Experiment III: Resulting ROC curve for subspace noise training
Experiment III: Output response to an image from the recognition class training set
Learning curves for three methods
Experiment IV: Resulting feature space from convex hull training
Experiment IV: Resulting ROC curve with convex hull approach
Classical pattern classification decomposition
Decomposition of NL-MACE as a cascade of feature extraction followed by discrimination
Mutual information approach to feature extraction
Mapping as feature extraction. Information content is measured in the low dimensional space of the observed output
A signal flow diagram of the learning algorithm
Gradient of two-dimensional gaussian kernel. The kernels act as attractors to low points in the observed PDF on the data when entropy maximization is desired
Mixture of gaussians example
Mixture of gaussians example, entropy minimization and maximization
PCA vs. entropy, gaussian case
PCA vs. entropy, non-gaussian case
PCA vs. entropy, non-gaussian case


Example ISAR images from two vehicles used for experiments
Single vehicle experiment, training iterations
Single vehicle experiment, training iterations (continued)
Single vehicle experiment, training iterations (continued)
Two vehicle experiment
Two dimensional attractor functions
Two dimensional regulating function
Magnitude of the regulating function
Approximation of the regulating function
Feedback functions for implicit error term
Entropy minimization as local attraction
Entropy maximization as diffusion
Stopping criterion
Mutual information feature space
ROC curves for mutual information feature extraction (dotted line) versus linear MACE filter (solid line)
Mutual information feature space resulting from convex hull exemplars
ROC curves for mutual information feature extraction (dotted line) versus linear MACE filter (solid line)


LIST OF TABLES

Classifier performance measures when the filter is determined by either of the common measures of generalization, as compared to best classifier performance, for two parameter values
Correlation of generalization measures to classifier performance. In both cases the classifier performance, as measured by the area of the ROC curve or Pfa at fixed Pd, has an opposite correlation to what would be expected of a useful measure for predicting performance
Comparison of ROC classifier performance over a range of Pd values. Results are shown for the linear filter versus four different types of nonlinear training: N = white noise training, GS = Gram-Schmidt orthogonalization, subN = PCA subspace noise, CH = convex hull rejection class
Comparison of ROC classifier performance over a range of Pd values. Results are shown for the linear filter versus experiments III and IV and mutual information feature extraction. The symbols indicate the type of rejection class exemplars used: N = white noise training, GS = Gram-Schmidt orthogonalization, subN = PCA subspace noise, CH = convex hull rejection class


Abstract of Dissertation Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy

NONLINEAR EXTENSIONS TO THE MINIMUM AVERAGE CORRELATION ENERGY FILTER

By

John W. Fisher III

May 1997

Chairman: Dr. José C. Principe
Major Department: Electrical and Computer Engineering

The major goal of this research is to develop efficient methods by which the family of distortion invariant filters, specifically the minimum average correlation energy (MACE) filter, can be extended to a general nonlinear signal processing framework. The primary application of MACE filters has been to pattern classification of images. Two desirable qualities of MACE-type correlators are ease of implementation via correlation and analytic computation of the filter coefficients.

Our motivation for exploring nonlinear extensions to these filters is the well-known limitations of the linear systems approach to classification. Among these limitations is the attempt to solve the classification problem in a signal representation space, whereas the classification problem is more properly solved in a decision or probability space. An additional limitation of the MACE filter is that it can only be used to realize a linear decision surface, regardless of the means by which it is computed. These limitations lead to suboptimal classification and discrimination performance.


Extension to nonlinear signal processing is not without cost: solutions must, in general, be computed iteratively. Our approach was motivated by the early proof that the MACE filter is equivalent to the linear associative memory (LAM). The associative memory perspective is more properly associated with the classification problem and has been developed extensively in an iterative framework.

In this thesis we demonstrate a method emphasizing a statistical perspective of the MACE filter optimization criterion. Through the statistical perspective, efficient methods of representing the rejection and recognition classes are derived. This, in turn, enables a machine learning approach and the synthesis of more powerful nonlinear discriminant functions which maintain the desirable properties of the linear MACE filter, namely localized detection and shift invariance.

We also present a new information theoretic approach to training in a self-organized or supervised manner. Information theoretic signal processing looks beyond the second order statistical characterization inherent in the linear systems approach. The information theoretic framework probes the probability space of the signal under analysis. This technique has wide application beyond nonlinear MACE filter techniques and represents a powerful new advance in the area of information theoretic signal processing.

Empirical results comparing the classical linear methodology to the nonlinear extensions are presented using inverse synthetic aperture radar (ISAR) imagery. The results demonstrate the superior classification performance of the nonlinear MACE filter.


CHAPTER 1
INTRODUCTION

Motivation

Automatic target detection and recognition (ATD/R) is a field of pattern recognition. The goal of an ATD/R system is to quickly and automatically detect and classify objects which may be present within large amounts of data (typically imagery) with a minimum of human intervention. In an ATD/R system it is not only desirable to recognize various targets, but to locate them with some degree of accuracy. The minimum average correlation energy (MACE) filter [Mahalanobis et al.] is of interest to the ATD/R problem due to its localization and discrimination properties. The MACE filter is a member of a family of correlation filters derived from the synthetic discriminant function (SDF) [Hester and Casasent]. The SDF and its variants have been widely applied to the ATD/R problem. We will describe synthetic discriminant functions in more detail in chapter 2. Other generalizations of the SDF include the minimum variance synthetic discriminant function (MVSDF) [Kumar], the MACE filter, and, more recently, the gaussian minimum average correlation energy (G-MACE) [Casasent et al.] and the minimum noise and correlation energy (MINACE) [Ravichandran and Casasent] filters.

This area of filter design is commonly referred to as distortion-invariant filtering. It is a generalization of matched spatial filtering from the detection of a single object to the detection of a class of objects, usually in the image domain. Typically the object class is represented by a set of exemplars. The exemplar images represent the image class through a


range of "distortions," such as a variation in viewing aspect of a single object. The goal is to design a single filter which will recognize an object class through the entire range of distortion. Under the design criterion, the filter is equally matched to the entire range of distortion, as opposed to a single viewpoint as in a matched filter; hence the nomenclature distortion-invariant filtering [Kumar].

The bulk of the research using these types of filters has focused on optical and infrared (IR) imagery and on overcoming recognition problems in the presence of distortions associated with 3-D to 2-D mappings (e.g. scale and rotation, in-plane and out-of-plane). Recently, however, this technique has been applied to radar imagery [Novak et al.; Fisher and Principe; Chiang et al.]. In contrast to optical or infrared imagery, the scale of each pixel within a radar image is usually constant and known. Consequently, radar imagery does not suffer from scale distortions of objects.

In the family of distortion invariant filters, the MACE filter has been shown to possess superior discrimination properties [Mahalanobis et al.; Casasent and Ravichandran]. It is for this reason that this work emphasizes nonlinear extensions to the MACE filter. The MACE filter and its variants are designed to produce a narrow, constrained-amplitude peak response when the filter mask is centered on a target in the recognition class, while minimizing the energy in the rest of the output plane. This property provides desirable localization for detection. Another property of the MACE filter is that it is less susceptible to out-of-class false alarms [Mahalanobis et al.]. While the focus of this work will be on the MACE filter criterion, it should be stated that all of the results presented here are equally applicable to any of the distortion invariant filters mentioned above, with appropriate changes to the respective optimization criteria.


Although the MACE filter does have superior false alarm properties, it also has some fundamental limitations. Since it is a linear filter, it can only be used to realize linear decision surfaces. It has also been shown to be limited in its ability to generalize to exemplars that are in the recognition class (but not in the training set) while simultaneously rejecting out-of-class inputs [Casasent and Ravichandran; Casasent et al.]. The number of design exemplars can be increased in order to overcome generalization problems; however, the calculation of the filter coefficients becomes computationally prohibitive and numerically unstable as the number of design exemplars is increased [Kumar]. The MINACE and G-MACE variations have improved generalization properties, with a slight degradation in the average output plane variance [Ravichandran and Casasent] and sharpness of the central peak [Casasent et al.], respectively.

This research presents a basis by which the MACE filter, and by extension all linear distortion invariant filters, can be extended to a more general nonlinear signal processing framework. In the development it is shown that the performance of the linear MACE filter can be improved upon in terms of generalization while maintaining its desirable properties, i.e. a sharp, constrained peak at the center of the output plane.

A more detailed description of the developmental progression of distortion invariant filtering is given in chapter 2. In that chapter a qualitative comparison of the various distortion invariant filters is presented using inverse synthetic aperture radar (ISAR) imagery. The application of pattern recognition techniques to high-resolution radar imagery has become a topic of great interest recently with the advent of widely available instrumentation grade imaging radars. High resolution radar imagery poses a special challenge to distortion invariant filtering in that the sources of distortion, such as rotation in aspect of an


object, do not manifest themselves as rotations within the radar image (as opposed to optical imagery). In this case the distortion is not purely geometric, but more abstract.

Chapter 3 presents a derivation of the MACE filter as a special case of Kohonen's linear associative memory [Kohonen]. This relationship is important in that the associative memory perspective is the starting point for developing nonlinear extensions to the MACE filter.

In chapter 4, the basis upon which the MACE filter can be extended to nonlinear adaptive systems is developed. In this chapter a nonlinear architecture is proposed for the extension of the MACE filter. A statistical perspective of the MACE filter is discussed, which leads naturally into a class representational viewpoint of the optimization criterion of distortion invariant filters. Commonly used measures of generalization for distortion invariant filtering are also discussed. The results of the experiments presented show that these measures are not appropriate for the task of classification. It is interesting to note that the analysis indicates that the appropriateness of the measures is independent of whether the mapping is linear or nonlinear. The analysis also discusses the merit of the MACE filter optimization criterion in the context of classification and with regard to measures of generalization. The chapter concludes with a series of experiments further refining the techniques by which nonlinear MACE filters are computed.

Chapter 5 presents a new information theoretic method for feature extraction. An information theoretic approach is motivated by the observation that the optimization criterion of the MACE filter considers only the second-order statistics of the rejection class. The information theoretic approach, by contrast, operates in probability space, exploiting properties of the underlying probability density function. The method enables the extraction of statistically independent features. The method has wide application beyond nonlinear extensions to MACE filters and, as such, represents a powerful new technique for information theoretic signal processing. A review of information theoretic approaches to signal processing is presented in this chapter. This is followed by the derivation of the new technique, as well as some general experimental results which are not specifically related to nonlinear MACE filters but which serve to illustrate the potential of this method. Finally, the logical placement of this method within nonlinear MACE filters is presented, along with experimental results.

In chapter 6 we review the significant results and contributions of this dissertation. We also discuss possible lines of research resulting from the base established here.


CHAPTER 2
BACKGROUND

Discussion of Distortion Invariant Filters

As stated, distortion invariant filtering is a generalization of matched spatial filtering. It is well known that the matched filter maximizes the peak-signal-to-average-noise power ratio, as measured at the filter output at a specific sample location, when the input signal is corrupted by additive white noise. In the discrete signal case, the design of a matched filter is equivalent to the following vector optimization problem [Kumar]:

    min_h h^H h   subject to   x^H h = d,    h, x ∈ C^(N×1),

where the column vector x contains the N coefficients of the signal we wish to detect, h contains the coefficients of the filter (the superscript H indicates the Hermitian transpose operator), and d is a positive scalar. This notation is also suitable for N-dimensional signal processing as long as the signal and filter have finite support and are reordered in the same lexicographic manner (e.g. by row or column in the two-dimensional case) into column vectors. The optimal solution to this problem is

    h = x (x^H x)^(-1) d.
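As a quick numerical illustration (a sketch, not part of the original text; NumPy, the vector length, and the random data are assumptions made here), the closed-form solution and the peak/noise-power properties discussed next can be checked directly:

```python
import numpy as np

rng = np.random.default_rng(0)

# Signal to detect; in practice this would be a lexicographically
# reordered image template.
N = 256
x = rng.standard_normal(N)

# Matched filter for the white-noise case: h = x (x^H x)^{-1} d.
d = 1.0                            # desired output at the constraint point
h = x * d / (x @ x)

# Peak output when the filter is centered on the signal: equals d.
peak = x @ h

# Average output noise power for additive white noise of variance sigma2:
# sigma^2 h^H h = sigma^2 d^2 (x^H x)^{-1}.
sigma2 = 0.5
noise_power = sigma2 * (h @ h)

# Peak-signal-to-average-noise power ratio: (x^H x) / sigma^2,
# independent of the choice of d.
snr = peak**2 / noise_power
```

Rescaling d rescales both the peak power and the noise power by the same factor, which is why the ratio is independent of d.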


Given this solution, we can calculate the peak output signal power as

    s = |x^H h|^2 = d^2,

and the average output noise power due to an additive white noise input as

    σ_o^2 = E{h^H n n^H h} = σ^2 h^H h = σ^2 d^2 (x^H x)^(-1),

where n is the noise vector and σ^2 is the input noise variance, resulting in a peak-signal-to-average-noise output power ratio of

    s / σ_o^2 = (x^H x) / σ^2.

As we can see, the result is independent of the choice of the scalar d. If d is set to unity, the result is a normalized matched spatial filter [Vander Lugt].

In order to further motivate the concept of distortion invariant filtering, a typical ATR example problem will be used for illustration. This experiment will also help to illustrate the genesis of the various types of distortion invariant filtering approaches, beginning with the matched spatial filter (MSF).

Inverse synthetic aperture radar (ISAR) imagery will be used for all of the experiments presented herein. Distortion invariant filtering, however, is not limited to ISAR imagery and in fact can be extended to much more abstract data types. ISAR images are shown


in the following figure. In the figure, three vehicles are displayed, each at three different radar viewing aspect angles, where the aspect angle is the direction of the front of the vehicle relative to the radar antenna. All of the images have the same pixel dimensions.

Radar systems measure a quantity called radar cross section (RCS). When a radar transmits an electromagnetic pulse, some of the incident energy on an object is reflected back to the radar. RCS is a measure of the reflected energy detected by the radar's receiving antenna. ISAR imagery is the result of a radar signal processing technique which uses multiple detected radar returns measured over a range of relative object aspect angles. Each pixel in an ISAR image is a measure of the aggregate radar cross section at regularly sampled points in space.

Two types of vehicles are shown. Vehicle type 1 will represent a recognition class, while vehicle type 2 will represent a confusion class. The goal is to compute a filter which will recognize vehicle type 1 without being confused by vehicle 2. Images of vehicle 1a will be used to compute the filter coefficients. Vehicles 1b and 2a represent an independent testing class.

ISAR images of all three vehicles were formed over a common range of aspect angles at regular increments. As the MSF is derived from a single vehicle image, an image of vehicle 1a at the midpoint of the aspect range is used. The peak output response to an image represents the maximum of the cross correlation function of the image with the MSF template. The peak output response over the entire aspect range of vehicle 1a is shown in a subsequent figure. As can be seen in that figure, the filter matches very well at the midpoint aspect; however, as the aspect moves away from the midpoint, the


[Figure: ISAR images of two vehicle types, showing vehicle 1a (training), vehicle 1b (testing), and vehicle 2a (testing), each at three aspect angles. Two different vehicles of type 1 (a and b) are shown, while one vehicle of type 2 (a) is shown. Vehicle 1a is used as the training vehicle, while vehicle 1b is used as the testing vehicle for the recognition class. Vehicle 2a represents a confusion vehicle.]

peak output response begins to degrade. Depending on the type of imagery, as well as the vehicle, this degradation can become very severe.


[Figure: MSF peak output response of training vehicle 1a over all aspect angles. Peak response degrades as the aspect difference increases.]

The peak output responses of both vehicles in the testing set are shown in the next figure, overlaid on the training image response. In one sense the filter exhibits good generalization; that is, the peak response to vehicle 1b is much the same, as a function of aspect, as the peak response to vehicle 1a. However, the filter also "generalizes" equally well to vehicle 2a, which is undesirable. As a vehicle discrimination test (vehicle 1 from vehicle 2), the MSF fails.


[Figure: MSF peak output response of testing vehicles 1b and 2a over all aspect angles. Responses are overlaid on the training vehicle response. Filter responses to vehicles 1b (dashed line) and 2a (dashed-dot line) do not differ significantly.]


The output image plane response to a single image of vehicle 1a is shown in the figure below. Refinements to the distortion invariant filter approach, namely the MACE filter, will show that the localization of this output response, as measured by the sharpness of the peak, can be improved significantly.

Synthetic Discriminant Function

The degradation evidenced in the preceding figures was the primary motivation for the synthetic discriminant function (SDF) [Hester and Casasent]. A shortcoming of the MSF, from the standpoint of distortion invariant filtering, is that it is only optimum for a single image. One approach would be to design a bank of MSFs operating in parallel which were matched to the distortion range. The typical ATR system, however, must recognize/discriminate multiple vehicle types, and so from an implementation standpoint alone parallel MSFs are an impractical choice. Hester and Casasent set out to design a single filter which could be matched to multiple images using the idea of superposition. This approach was possible due to the large number of coefficients (degrees of freedom) that typically constitute image templates. For historical reasons, specifically that the filters in question were synthesized optically using holographic techniques [Vander Lugt], it was hypothesized that such a filter could be synthesized from linear combinations of a set of exemplar images. The filter synthesis procedure consists of projecting the exemplar images onto an orthonormal basis (originally Gram-Schmidt orthogonalization was used to generate the basis). The next step is to determine the coefficients with which to linearly combine the basis vectors such that a desired response for each original image exemplar is obtained [Hester and Casasent].

The proposed synthesis procedure is a bit convoluted. It turns out that the choice of orthonormal basis is irrelevant: as long as the basis spans the space of the original exemplar images, the result is always the same. The development of Kumar is more useful for depicting the SDF as a generalization of the matched filter (for the white noise case) to multiple signals. The SDF can be cast as the solution to the following optimization problem:

    min_h h^H h   subject to   X^H h = d,    h ∈ C^(N×1), X ∈ C^(N×N_t), d ∈ C^(N_t×1),

where X is now a matrix whose N_t columns comprise a set of training images we wish to detect, and d is a column vector of desired outputs (one for each of the training exemplars).

(Footnote: Since these filters have been applied primarily to images, signals will be referred to as images or exemplars from this point on. In the vector notation, all N_1 × N_2 images are reordered (by row or column) into N × 1 column vectors, where N = N_1 N_2.)


The vector d is typically set to all unity values for the recognition class. The images of the data matrix X comprise the range of distortion that the implemented filter is expected to encounter. It is assumed that N_t < N, and so the problem formulation is a quadratic optimization subject to an underdetermined system of linear constraints. The optimal solution is

    h = X (X^H X)^(-1) d.

When there is only one training exemplar (N_t = 1) and d is unity, the SDF defaults to the normalized matched filter. Similar to the matched filter (white noise case), the SDF is the linear filter which minimizes the white noise response while satisfying the set of linear constraints over the training exemplars.

By way of example, the SDF technique is tested against the ISAR data as in the MSF case. Exemplar images from vehicle 1a were selected at regular aspect increments spanning the full aspect range, for a total of N_t exemplar images. A subsequent figure shows the peak output response over all aspects of the training vehicle (1a). As seen in that figure, the degradation as the aspect changes is removed. The MSF response has been overlaid to highlight the differences.

The peak output response over all exemplars in the testing set is also shown below. From the perspective of peak response, the filter generalizes fairly well. However, as with the MSF, the usefulness of the filter as a discriminant between vehicles 1 and 2 is clearly limited. The resulting output plane response when the SDF filter is correlated with a single image of vehicle 1a shows that the localization of the peak is similar to the MSF case.
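The SDF solution above can be sketched in a few lines of NumPy (an illustrative sketch with random placeholder data, not the dissertation's code; the sizes N and N_t are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Columns of X are N_t reordered training exemplars (N-dimensional,
# with N_t < N, so the constraint system is underdetermined).
N, Nt = 256, 12
X = rng.standard_normal((N, Nt))
d = np.ones(Nt)                     # unit constraints for the recognition class

# SDF: minimum-norm filter satisfying X^H h = d.
h = X @ np.linalg.solve(X.T @ X, d)

# Each training exemplar produces exactly the constrained output value.
outputs = X.T @ h
```

Because the solution lies in the span of the training exemplars, the constraints are met exactly while the filter norm (and hence the white-noise response) is minimized.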


[Figure: SDF peak output response of training vehicle 1a over all aspect angles. The MSF response is also shown (dashed line). The degradation in the peak response has been corrected.]

Minimum Variance Synthetic Discriminant Function

The SDF approach seemingly solved the problem of generalizing a matched filter to multiple images. However, the SDF has no built-in noise tolerance by design (except for the white noise case). Furthermore, in practice it would turn out that occasionally the noise response would be higher than the peak object response, depending on the type of imagery. As a result, detection by means of searching for correlation peaks was shown to be unreliable for some types of imagery, specifically imagery which contains recognition class images embedded in non-white noise [Kumar]. Kumar proposed a method by which noise tolerance could be built into the filter design. This technique was termed the minimum variance synthetic discriminant function (MVSDF). The MVSDF is


[Figure: SDF peak output response of testing vehicles 1b and 2a over all aspect angles. The dashed line is vehicle 1b, while the dashed-dot line is vehicle 2a.]

the correlation filter which minimizes the output variance due to zero-mean input noise while satisfying the same linear constraints as the SDF. The output noise variance can be shown to be h^H Σ_n h, where h is the vector of filter coefficients and Σ_n is the covariance matrix of the noise [Kumar]. Mathematically, the problem formulation is

    min_h h^H Σ_n h   subject to   X^H h = d,    h ∈ C^(N×1), X ∈ C^(N×N_t), Σ_n ∈ C^(N×N), d ∈ C^(N_t×1).


The optimal solution is

    h = Σ_n^(-1) X (X^H Σ_n^(-1) X)^(-1) d.

In the case of white noise, the MVSDF is equivalent to the SDF. This technique has a significant numerical complexity issue, which is that the solution requires the inversion of an N × N matrix (Σ_n), which for moderate image sizes (N = N_1 N_2) can be quite large and computationally prohibitive unless simplifying assumptions can be made about its form (e.g. a diagonal or Toeplitz matrix).

The MVSDF can be seen as a more general extension of the matched filter to multiple vector detection, as most signal processing definitions of the matched filter incorporate a noise power spectrum and do not assume only the white noise case. It is mentioned here because it is the first distortion invariant filtering technique to recognize the need to characterize a rejection class.
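The MVSDF solution can likewise be sketched numerically (again an illustrative sketch under assumed sizes and random data; a diagonal toy covariance stands in for a real noise model so the N × N inversion stays cheap):

```python
import numpy as np

rng = np.random.default_rng(2)
N, Nt = 128, 8
X = rng.standard_normal((N, Nt))
d = np.ones(Nt)

# Toy noise covariance: diagonal, i.e. uncorrelated but non-white noise.
Sn = np.diag(rng.uniform(0.5, 2.0, N))

# MVSDF: h = Sn^{-1} X (X^H Sn^{-1} X)^{-1} d.
Sn_inv_X = np.linalg.solve(Sn, X)
h = Sn_inv_X @ np.linalg.solve(X.T @ Sn_inv_X, d)

out = X.T @ h        # the linear constraints are met exactly
var = h @ Sn @ h     # the (minimized) output noise variance h^H Sn h
```

With Sn proportional to the identity (white noise), the same formula collapses algebraically to the SDF solution X (X^H X)^(-1) d.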


Minimum Average Correlation Energy Filter

The MVSDF (and the SDF) control the output of the filter at a single point in the output plane of the filter. In practice, large sidelobes may be exhibited in the output plane, making detection difficult. These difficulties led Mahalanobis et al. to propose the minimum average correlation energy (MACE) filter. This development in distortion invariant filtering attempts, as its design goal, to control not only the output point when the image is centered on the filter, but the response of the entire output plane as well. Specifically, it minimizes the average correlation energy of the output over the training exemplars, subject to the same linear constraints as the MVSDF and SDF filters. The problem is formulated in the frequency domain using Parseval relationships. In the frequency domain, the formulation is

    min_H H^H D H   subject to   X^H H = d,    H ∈ C^(N×1), X ∈ C^(N×N_t), D ∈ C^(N×N), d ∈ C^(N_t×1),

where D is a diagonal matrix whose diagonal elements are the coefficients of the average power spectrum of the training exemplars. The form of the quadratic criterion is derived using Parseval's relationship; a derivation is given in section A.1 of the appendix. The other terms, H and X, contain the DFT coefficients of the filter and training exemplars, respectively. The vector d is the same as in the MVSDF and SDF cases. The optimal solution in the frequency domain is

    H = D^(-1) X (X^H D^(-1) X)^(-1) d.

As in the MVSDF, the solution requires the inversion of an N × N matrix, but in this case the matrix is diagonal, and so its inversion is trivial. When the noise covariance


matrix is estimated from observations of noise sequences (assuming wide-sense stationarity and ergodicity), the MVSDF can also be formulated in the frequency domain, and the complex matrix inversion is avoided; a derivation of this is given in the appendix. An examination of the preceding equations shows that, under the assumption that the noise class can be modeled as a stationary ergodic random noise process, the solution of the MVSDF can be found in the spectral domain using the estimated power spectrum of the noise process.

In practice the MACE filter performs better than the MVSDF with respect to rejecting out-of-class input images. The MACE filter, however, has been shown to have poor generalization properties; that is, images in the recognition class but not in the training exemplar set are not recognized.

A MACE filter was computed using the same exemplar images as in the SDF example. The figure shows the resulting output image plane response for one image. As can be seen in the figure, the peak in the center is now highly localized. In fact, it can be shown [Mahalanobis et al.] that over the training exemplars (those used to compute the filter) the output peak will always be at the constraint location. Generalization to between-aspect images, as mentioned, is a problem for the MACE filter. The next figure shows the peak output response over all aspect angles. As can be seen there, the peak response degrades severely for aspects between the exemplars used to compute the filter. Furthermore, from a peak output response viewpoint, generalization to vehicle 1b is also worse. However, unlike the previous techniques, we now begin to see some separation between the two vehicle types as represented by their peak response.
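As a brief numerical sketch (an illustration added here, not part of the original text), the closed-form solution H = D^{-1} X (X^† D^{-1} X)^{-1} d can be checked on synthetic data; the dimensions and random stand-in "DFT coefficients" below are assumptions for illustration only:

```python
import numpy as np

# Hypothetical stand-in data: N frequency bins, Nt training exemplars.
rng = np.random.default_rng(0)
N, Nt = 64, 4
X = rng.standard_normal((N, Nt)) + 1j * rng.standard_normal((N, Nt))
d = np.ones(Nt)                                # desired outputs at the constraint point

# Diagonal of D: average power spectrum of the exemplars (inverted elementwise).
Dinv = 1.0 / np.mean(np.abs(X) ** 2, axis=1)

# H = D^{-1} X (X^† D^{-1} X)^{-1} d -- only an Nt x Nt system must be solved.
A = X.conj().T @ (Dinv[:, None] * X)           # X^† D^{-1} X
H = (Dinv[:, None] * X) @ np.linalg.solve(A, d)

constraints_met = np.allclose(X.conj().T @ H, d)
print(constraints_met)
```

Because D is diagonal, its inversion is elementwise; the only dense solve involves the small N_t × N_t Gram matrix, which mirrors the computational point made above.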


Figure: MACE filter output image plane response.

Optimal Tradeoff Synthetic Discriminant Function

The final distortion invariant filtering technique which will be discussed here is the method proposed by Réfrégier and Figue, known as the optimal tradeoff synthetic discriminant function (OTSDF). Suppose that the designer wishes to optimize over multiple quadratic optimization criteria (e.g. average correlation energy and output noise variance) subject to the same set of equality constraints as in the previous distortion invariant filters. We can represent the individual optimization criteria by h^† Q_i h, where Q_i is an N × N symmetric positive-definite matrix (e.g. Q_i = Σ_n for the MVSDF optimization criterion). The OTSDF is a method by which a set of quadratic optimization criteria may be optimally traded off against each other; that is, one criterion can be minimized with minimum penalty to the rest.

Figure: MACE peak output response of vehicles 1a, 1b and 2a over all aspect angles. Degradation to between-aspect exemplars is evident. Generalization to the testing vehicles, as measured by peak output response, is also poorer. Vehicle 1a is the solid line, 1b is the dashed line, and 2a is the dashed-dot line.

The solution to all such filters can be characterized by the equation

h = Q^{-1} x (x^† Q^{-1} x)^{-1} d,

where, assuming M different criteria,

Q = Σ_{i=1}^{M} λ_i Q_i.


The possible solutions, parameterized by λ_i, define a performance bound which cannot be exceeded by any linear system with respect to the optimization criteria and the equality constraints. All such linear filters which optimally trade off a set of quadratic criteria are referred to as optimal tradeoff synthetic discriminant functions.

We may, for example, wish to trade off the MACE filter criterion versus the MVSDF filter criterion. This presents the added difficulty that one criterion is specified in the space domain and the other in the spectral domain. If the noise is represented as zero-mean, stationary, and ergodic (if the covariance is to be estimated from samples), we can, as mentioned, transform the MVSDF criterion to the spectral domain. In this case the optimal filter has the frequency domain solution

H = [λ D_n + (1 − λ) D_x]^{-1} X (X^† [λ D_n + (1 − λ) D_x]^{-1} X)^{-1} d = D_λ^{-1} X (X^† D_λ^{-1} X)^{-1} d,

where D_λ = λ D_n + (1 − λ) D_x, and D_n and D_x are diagonal matrices whose diagonal elements contain the estimated power spectrum coefficients of the noise class and the recognition class, respectively. The performance bound of such a filter would resemble the figure below, where all linear filters would fall in the darkened region and all optimal tradeoff filters would lie somewhere on the boundary.

By way of example, we again use the data from the MACE and SDF examples. In this case we will construct an OTSDF which trades off the MACE filter criterion for the SDF criterion. In order to transform the SDF to the spectral domain we will assume that the noise class is zero-mean stationary white noise; the power spectrum is therefore flat. One of the issues for constructing an OTSDF is how to set the value of λ, which represents the


Figure: Example of a typical OTSDF performance plot. This plot shows the tradeoff, hypothetically, between the ACE criterion and a noise variance criterion. The curved arrow on the performance bound indicates the direction of increasing λ for the two-criterion case. The curve is bounded by the MACE and MVSDF results.

degree by which one criterion is emphasized over another. We will not address that issue here, but simply set the value of λ to place more emphasis on the MACE filter criterion. The output plane response of the OTSDF is shown in the figure. As compared to the MACE filter response, the output peak is not nearly as sharp, but it is still more localized than in the SDF case. The peak output response over the training vehicle for the OTSDF is compared to the MACE filter in the following figure. The degradation to between-aspect exemplars is less severe than for the MACE filter. The peak output responses of vehicles 1b and 2a are also shown.
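As an added sketch (not from the original text), the optimal tradeoff solution with the blended diagonal criterion D_λ = λ D_n + (1 − λ) D_x can be formed for any λ; the flat D_n below encodes the white-noise assumption, and all data and sizes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
N, Nt = 64, 4
X = rng.standard_normal((N, Nt)) + 1j * rng.standard_normal((N, Nt))
d = np.ones(Nt)
Dx = np.mean(np.abs(X) ** 2, axis=1)   # MACE criterion: average exemplar power spectrum
Dn = np.ones(N)                        # white-noise MVSDF criterion: flat spectrum

def otsdf(lam):
    """OTSDF filter for the blended criterion D_lam = lam*Dn + (1-lam)*Dx."""
    Dl_inv = 1.0 / (lam * Dn + (1 - lam) * Dx)
    A = X.conj().T @ (Dl_inv[:, None] * X)
    return (Dl_inv[:, None] * X) @ np.linalg.solve(A, d)

# The equality constraints X^† H = d hold for every lambda on the tradeoff curve.
all_met = all(np.allclose(X.conj().T @ otsdf(l), d) for l in (0.0, 0.25, 0.5, 0.75, 1.0))
print(all_met)
```

At λ = 0 the filter reduces to the MACE solution and at λ = 1 to the white-noise (SDF) solution; intermediate values trace the performance bound.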


As compared to the MACE filter, the peak response is improved over the testing set. Separation between the two vehicle types appears to be maintained.

Preprocessor/SDF Decomposition

In the sample domain, the SDF family of correlation filters is equivalent to a cascade of a linear preprocessor followed by a linear correlator [Mahalanobis et al.; Kumar]. This is illustrated in the figure with vector operations. The preprocessor in the case of the MACE filter is a prewhitening filter computed on the basis of the average power spectrum of the recognition class training exemplars. In the case of the MVSDF, the preprocessor is a prewhitening filter computed on the basis of the covariance matrix of the noise. The net result is that after preprocessing, the second processor is an SDF computed over the preprocessed exemplars.


Figure: OTSDF peak output response of vehicle 1a over all aspect angles. Degradation to between-aspect exemplars is less than in the MACE filter (shown in dashed line).

The primary contribution of this research will be to extend the ideas of MACE filtering to a general nonlinear signal processing architecture and accompanying classification framework. These extensions will focus on processing structures which improve the generalization and discrimination properties while maintaining the shift-invariance and localized-detection properties of the linear MACE filter.


Figure: OTSDF peak output response of vehicles 1b and 2a over all aspect angles. Generalization is better than in the MACE filter. Vehicle 1b is shown in dashed line; vehicle 2a is shown in dashed-dot line.

Figure: Decomposition of a distortion invariant filter in the space domain. The notation used assumes that the image and filter coefficients have been reordered into vectors. The input image vector x is preprocessed by the linear transformation y = Ax. The resulting vector is processed by a synthetic discriminant function, y_out = h^† y.


CHAPTER
THE MACE FILTER AS AN ASSOCIATIVE MEMORY

Linear Systems as Classifiers

In this chapter we present the MACE filter from the perspective of associative memories. This perspective is important because it leads to a machine-learning and classification framework, and consequently a means by which to determine the parameters of a nonlinear mapping via gradient search techniques. We shall refer herein to the machine learning gradient search methods as an iterative framework. The techniques are iterative in the sense that adaptations to the mapping parameters are computed sequentially and repeatedly over a set of exemplars. We shall show that the iterative and classification framework, combined with a nonlinear system architecture, has distinct advantages over the linear framework of distortion invariant filters.

As we have stated, distortion invariant filters can only realize linear discriminant functions. We begin, therefore, by considering linear systems used as classifiers. The adaline architecture [Widrow and Hoff], depicted in the figure, is an example of a linear system used for pattern classification. A pattern, represented by the coefficients x_i, is applied to a linear combiner represented by the weight coefficients w_i; the resulting output y is


then applied to a hard limiter which assigns a class to the input pattern. Mathematically this can be represented by

c = sgn(y − φ) = sgn(w^T x − φ),

where sgn(·) is the signum function, φ is a threshold, and x, w ∈ R^{N×1} are column vectors containing the coefficients of the pattern and the combiner weights, respectively. In the context of classification, this architecture is trained iteratively using the least mean square (LMS) algorithm [Widrow and Hoff]. For a two-class problem, the desired output d in the figure is set to ±1 depending on the class of the input pattern; the LMS algorithm then minimizes the mean square error (MSE) between the classification output c and the desired output. Since the error function e_c can only take on three values, 0 and ±2, minimization of the MSE is equivalent to minimizing the average number of actual errors.

Figure: Adaline architecture.
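A minimal sketch of the adaline/LMS scheme just described (an added illustration with synthetic two-class data; the learning rate and epoch count are assumptions): the LMS update is driven by the linear-combiner error, and the hard limiter only assigns the final class.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
X = np.vstack([rng.standard_normal((n, 2)) + 2.0,    # class +1 cluster
               rng.standard_normal((n, 2)) - 2.0])   # class -1 cluster
d = np.hstack([np.ones(n), -np.ones(n)])             # desired outputs +/-1

w, phi, mu = np.zeros(2), 0.0, 0.01
for _ in range(20):                                  # epochs over the exemplar set
    for xi, ti in zip(X, d):
        e = ti - (w @ xi - phi)                      # error BEFORE the hard limiter
        w += mu * e * xi                             # LMS weight update
        phi -= mu * e                                # threshold update

c = np.sign(X @ w - phi)                             # hard limiter assigns the class
accuracy = (c == d).mean()
print(accuracy)
```

Note that, exactly as observed in the text, the adaptation uses the linear error e and not the classification error c − d.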


There are several observations to be made about the adaline/LMS approach to classification. One observation is that the adaptation process described uses the error ε, as measured at the output of the linear combiner, to drive the adaptation, and not the actual classification error e_c. Another observation is that this approach presupposes that the pattern classes can be linearly separated. A final point, on which we will have more to say, is that the method uses the MSE criterion as a proxy for classification.

MSE Criterion as a Proxy for Classification Performance

As we have pointed out, the adaline/LMS approach to classification uses the MSE criterion to drive the adaptation process. It is the probability of misclassification (also called the Bayes criterion), however, with which we are truly concerned. We now discuss the consequence of using the MSE criterion as a proxy for classification performance. It is well known that the discriminant function that minimizes misclassification is monotonically related to the posterior probability distribution of the class C given the observation x [Fukunaga]. That is, for the two-class problem, if the discriminant function is

f(x) = P(C_1 | x),

the posterior probability of class 1 given the observation x, then the probability of misclassification will be minimized if the following decision rule is used: if f(x) > 1/2, choose class 1; if f(x) < 1/2, choose class 2.


For the case of f(x) = 1/2, both classes are equally likely, so a guess must be made.

Unrestricted Functional Mappings

With regard to the adaline/LMS approach, we now ask: what is the consequence of using the MSE criterion for computing discriminant functions? In the two-class case the source distributions are p(x|C_1) or p(x|C_2), depending on whether the observation x is drawn from class 1 or class 2, respectively. If we assign a desired output of zero to class 1 and unity to class 2, then the MSE criterion is equivalent to the following:

J(f) = (1/2) P_1 E{ f(x)^2 | C_1 } + (1/2) P_2 E{ (1 − f(x))^2 | C_2 },

where the scale factors are for convenience, E{·} is the expectation operator, and C_i indicates class i. For now we will place no constraints on the functional form of f(x). In so doing, we can solve for the optimal solution using the calculus of variations approach. In this case we would like to find a stationary point of the criterion J(f) due to small perturbations in the function f(x), indicated by f(x) → f(x) + η(x).


The first term of the perturbed criterion can be computed as

E{ (f + η)^2 | C_1 } = E{ f^2 | C_1 } + 2 E{ f η | C_1 } + E{ η^2 | C_1 },

and similarly for the second term with (1 − f − η). Retaining the terms that are first order in η and substituting yields the first variation

δJ = P_1 ∫ f(x) η(x) p(x|C_1) dx − P_2 ∫ (1 − f(x)) η(x) p(x|C_2) dx
   = ∫ [ f(x) (P_1 p(x|C_1) + P_2 p(x|C_2)) − P_2 p(x|C_2) ] η(x) dx
   = ∫ [ f(x) p_x(x) − P_2 p(x|C_2) ] η(x) dx,

where p_x(x) = P_1 p(x|C_1) + P_2 p(x|C_2) is the unconditional probability distribution of the random variable X. In order for f(x) to be a stationary point of the criterion, this expression must be zero over all x for any arbitrary perturbation η(x). Consequently,

f(x) p_x(x) = P_2 p(x|C_2),


or

f(x) = P_2 p(x|C_2) / ( P_1 p(x|C_1) + P_2 p(x|C_2) ) = P(C_2 | x),

which is the likelihood that the observation is drawn from class 2. If we had reversed the desired outputs, the result would have been the likelihood that the observation was drawn from class 1. This result, predicated by our choice of desired outputs, shows that for arbitrary f(·) the MSE criterion is equivalent to the probability of misclassification error criterion. In fact, it has been shown by Richard and Lippmann (using other means) for the multiclass case that if the desired outputs are encoded as vectors e_i, where the i'th element is unity and the others are zero, for an N-class problem the MSE criterion is equivalent to optimizing the Bayes criterion for classification.

Parameterized Functional Mappings

Suppose, however, that the function is not arbitrary, but is also a function of a parameter set α, as in f(x, α). The MSE criterion can be rewritten

J(f) = (1/2) P_1 E{ f(x, α)^2 | C_1 } + (1/2) P_2 E{ (1 − f(x, α))^2 | C_2 }.

The gradient of the criterion with respect to the parameters becomes


∂J/∂α = P_1 ∫ f(x, α) (∂f/∂α)(x, α) p(x|C_1) dx − P_2 ∫ (1 − f(x, α)) (∂f/∂α)(x, α) p(x|C_2) dx
      = ∫ [ f(x, α) (P_1 p(x|C_1) + P_2 p(x|C_2)) − P_2 p(x|C_2) ] (∂f/∂α)(x, α) dx
      = ∫ [ f(x, α) p_x(x) − P_2 p(x|C_2) ] (∂f/∂α)(x, α) dx.

Examination of this equation allows for two possibilities for a stationary point of the criterion. The first, as before, is that

f(x, α) = P_2 p(x|C_2) / p_x(x) = P(C_2 | x),

while the second is if we are near a local minimum with respect to α. In other words, if the parameterized function can realize the Bayes discriminant function via an appropriate choice of its parameters, then this function represents a global minimum; but this does not discount the fact that there may be local minima. Furthermore, if the parameterized function is not capable of representing the Bayes discriminant function, there is no guarantee that the global (or local) minima will result in robust classification.
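The stationary-point result f(x) = P(C_2|x) can be checked empirically. The sketch below is an added illustration under assumed conditions (two 1-D Gaussian classes with equal priors); fitting the 0/1 targets by least squares within each histogram bin, a crude but flexible mapping, approaches the analytic posterior:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
x = np.hstack([rng.normal(-1.0, 1.0, n),   # class 1, desired output 0
               rng.normal(+1.0, 1.0, n)])  # class 2, desired output 1
t = np.hstack([np.zeros(n), np.ones(n)])

bins = np.linspace(-3, 3, 61)
idx = np.digitize(x, bins)
# Per-bin mean of the 0/1 targets = the least-squares fit within each bin.
f_hat = np.array([t[idx == k].mean() for k in range(1, len(bins))])

centers = 0.5 * (bins[:-1] + bins[1:])
p1 = np.exp(-0.5 * (centers + 1.0) ** 2)   # shape of p(x|C1)
p2 = np.exp(-0.5 * (centers - 1.0) ** 2)   # shape of p(x|C2)
posterior = p2 / (p1 + p2)                 # P(C2|x) for equal priors

max_err = np.max(np.abs(f_hat - posterior))
print(max_err < 0.05)
```

With enough samples the MSE-fitted mapping tracks P(C_2|x), which is exactly the equivalence to the Bayes criterion argued above.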


Finite Data Sets

The previous development does not take into account that in an iterative framework we are working with observations of a random variable. Therefore we rewrite the criterion as finite summations. That is, the criterion becomes

J(f) = (1/2) Σ_{x_i ∈ C_1} f(x_i, α)^2 + (1/2) Σ_{x_i ∈ C_2} (1 − f(x_i, α))^2,

where x_i ∈ C_i denotes the set of observations taken from class C_i. Taking the derivative of this criterion with respect to the parameters α yields

∂J/∂α = Σ_{x_i ∈ C_1} f(x_i, α) (∂f/∂α)(x_i, α) − Σ_{x_i ∈ C_2} (1 − f(x_i, α)) (∂f/∂α)(x_i, α).

It is assumed that the set of observations from class C_1 (x ∈ C_1) is independent and identically distributed (iid), as is the set of observations from class C_2 (x ∈ C_2), although with a different distribution than class C_1. Since the summation terms are broken up by class, we can assume that the arguments of the summations (functions of distinct iid random variables) are themselves iid random variables [Papoulis]. If we set P_1 = N_1/N and P_2 = N_2/N, where P_1 and P_2 are the prior probabilities of classes C_1 and C_2, respectively, and N_1 and N_2 are the number of samples drawn from


each of the classes, we can use the law of large numbers to say that the summations approach their expected values. In other words, in the limit as N_1, N_2 → ∞, the finite-sample gradient approaches

∂J/∂α = ∫ [ f(x, α) p_x(x) − P_2 p(x|C_2) ] (∂f/∂α)(x, α) dx,

which is identical to the gradient of the previous section, and so yields the same solution for the mapping:

f(x) = P_2 p(x|C_2) / p_x(x).

The conclusion is that if we have a sufficient number of observations to characterize the underlying distributions, then the MSE criterion is again equivalent to the Bayes criterion.

Derivation of the MACE Filter

We have already introduced the MACE filter in a previous section; we present a derivation of it here. The development is similar to the derivations given in Mahalanobis and Kumar. Our purpose in this presentation of the derivation is that it serves to illustrate the associative memory perspective of optimized correlators, a perspective which will be used to motivate the development of the nonlinear extensions presented in later sections.


In the original development, SDF-type filters were formulated using correlation operations, a convention which will be maintained here. The output g(n_1, n_2) of a correlation filter is determined by

g(n_1, n_2) = Σ_{m_1} Σ_{m_2} x*(n_1 + m_1, n_2 + m_2) h(m_1, m_2) = x*(n_1, n_2) ** h(n_1, n_2),

where x*(n_1, n_2) is the complex conjugate of an input image with N_1 × N_2 region of support, h(n_1, n_2) represents the filter coefficients, and ** represents the two-dimensional circular convolution operation [Oppenheim and Schafer]. The MACE filter formulation is as follows [Mahalanobis et al.]. Given a set of image exemplars {x_i ; i = 1, …, N_t}, we wish to find filter coefficients h such that the average correlation energy at the output of the filter, defined as

E_avg = (1/N_t) Σ_{i=1}^{N_t} Σ_{n_1} Σ_{n_2} |g_i(n_1, n_2)|^2,

is minimized subject to the constraints

g_i(0, 0) = Σ_{m_1} Σ_{m_2} x_i*(m_1, m_2) h(m_1, m_2) = d_i,   i = 1, …, N_t.

Mahalanobis reformulates this as a vector optimization in the spectral domain using Parseval's theorem. In the spectral domain, we wish to find the elements of H ∈ C^{N_1 N_2 × 1}, a column vector whose elements are the DFT coefficients of the space


domain filter h, reordered lexicographically. Let the columns of the data matrix X ∈ C^{N_1 N_2 × N_t} contain the DFT coefficients of the exemplars {x_1, …, x_{N_t}}, also reordered into column vectors. The diagonal matrix D_i contains the magnitude squared of the DFT coefficients of the i-th exemplar. These matrices are averaged to form the diagonal matrix D as

D = (1/N_t) Σ_{i=1}^{N_t} D_i,

which then contains the average power spectrum of the training exemplars. Minimizing the average correlation energy subject to the space-domain constraints is equivalent to minimizing

H^† D H

subject to the linear constraints

X^† H = d,

where the elements of d ∈ C^{N_t × 1} are the desired outputs corresponding to the exemplars. The solution to this optimization problem can be found using the method of Lagrange multipliers. In the spectral domain, the filter that satisfies the constraints and minimizes the criterion [Mahalanobis et al.; Kumar] is

H = D^{-1} X (X^† D^{-1} X)^{-1} d,

where H ∈ C^{N_1 N_2 × 1} contains the 2-D DFT coefficients of the filter, assuming a unitary DFT.
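The circular correlation that defines the output plane is exactly what the spectral formulation manipulates. As an added sketch (small random arrays with assumed sizes, not data from the text), the direct double sum and the FFT route agree:

```python
import numpy as np

rng = np.random.default_rng(7)
N1, N2 = 8, 8
x = rng.standard_normal((N1, N2))
h = rng.standard_normal((N1, N2))

# Direct circular cross-correlation: g(n) = sum_m x(n+m) h(m), indices mod N.
g_direct = np.zeros((N1, N2))
for n1 in range(N1):
    for n2 in range(N2):
        for m1 in range(N1):
            for m2 in range(N2):
                g_direct[n1, n2] += x[(n1 + m1) % N1, (n2 + m2) % N2] * h[m1, m2]

# Spectral route: correlation is multiplication by the conjugated filter spectrum.
g_fft = np.real(np.fft.ifft2(np.fft.fft2(x) * np.conj(np.fft.fft2(h))))

match = np.allclose(g_direct, g_fft)
print(match)
```

This is why the output-plane energy can be expressed, via Parseval, as the quadratic form H^† D H in the frequency domain.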


Preprocessor/SDF Decomposition

As observed by Mahalanobis, the MACE filter can be decomposed as a synthetic discriminant function preceded by a prewhitening filter. Let B be the diagonal matrix whose diagonal elements are equal to the inverse of the square root of the diagonal elements of D; we implicitly assume that the diagonal elements of D are nonzero. Consequently B^† B = B B^† = D^{-1} and B^† = B. The MACE solution can then be rewritten as

H = B (B X) ((B X)^† (B X))^{-1} d.

Substituting Y = B X, representing the original exemplars preprocessed in the spectral domain by the matrix B, this can be written

H = B Y (Y^† Y)^{-1} d.

(Footnote: If the DFT is not unitary [Oppenheim and Schafer], then a scale factor of N_1 N_2 would be necessary.)


Figure: Decomposition of the MACE filter as a preprocessor (i.e. a prewhitening filter over the average power spectrum of the exemplars) followed by a synthetic discriminant function.

Associative Memory Perspective

Having presented the derivation of the MACE filter and the preprocessor/SDF decomposition, we now show that with a modification (the addition of a linear preprocessor) the MACE filter is a special case of Kohonen's linear associative memory.

Associative memories [Kohonen] are general structures by which pattern vectors can be related to one another, typically in an input/output pairwise fashion. An input stimulus vector is presented to the associative memory structure, resulting in an output response vector. The input/output pairs establish the desired response to a given input. In the case of an auto-associative memory, the desired response is the stimulus vector, whereas in a hetero-associative memory the desired response is arbitrary. From a signal processing perspective, associative memories are viewed as projections [Kung], linear and nonlinear: the input patterns exist in a vector space and the associative memory projects them onto a new space. The linear associative memory of Kohonen is formulated exactly in this way. A simple form of the linear hetero-associative memory maps vectors to scalars. It is formulated as follows. Given the set of input/output vector/scalar pairs


{x_i ∈ R^{N×1}, d_i ∈ R ; i = 1, …, N_t},

which are placed into an input data matrix

x = [x_1 x_2 … x_{N_t}]

and a desired output vector d = [d_1 … d_{N_t}]^T, find the vector h ∈ R^{N×1} such that

x^† h = d.

If the system of equations described by this constraint is underdetermined, the inner product h^† h is minimized using the constraint. If the system of equations is overdetermined, (x^† h − d)^† (x^† h − d) is minimized. Here we are interested in the underdetermined case. The optimal solution for the underdetermined case, using the pseudo-inverse of x, is [Kohonen]

h = x (x^† x)^{-1} d.

As was shown in [Fisher and Principe], we can modify the linear associative memory model slightly by adding a preprocessing linear transformation matrix A, and find h such that the underdetermined system of equations

(A x)^† h = d


is satisfied while h^† h is minimized. As in the MACE filter, this optimization can be solved using the method of Lagrange multipliers. We adjoin the system of constraints to the optimization criterion as

J = h^† h + λ^T ((A x)^† h − d),

where λ is a column vector of Lagrange multipliers, one for each constraint (desired response). Taking the gradient with respect to the vector h yields

∂J/∂h = 2 h + A x λ.

Setting the gradient to zero and solving for the vector h yields

h = −(1/2) A x λ.

Substituting this result into the constraint equations and solving for the Lagrange multipliers yields

λ = −2 ((A x)^† (A x))^{-1} d.

Substituting this result back yields the final solution to the optimization as

h = A x ((A x)^† (A x))^{-1} d.

If the preprocessing transformation A is the space-domain equivalent of the MACE filter's spectral prewhitener, and the columns of the data matrix x contain the reordered elements of the images from the MACE filter problem, then this solution combined with


the preprocessing transformation yields exactly the space domain coefficients of the MACE filter. This can be shown using a unitary discrete Fourier transform (DFT) matrix. If U ∈ C^{N_1 × N_2} is the DFT of the image u ∈ R^{N_1 × N_2}, we can reorder both U and u into column vectors U ∈ C^{N_1 N_2 × 1} and u ∈ R^{N_1 N_2 × 1}, respectively. We can then implement the DFT as a unitary transformation matrix Φ such that

U = Φ u,   u = Φ^† U,   Φ^† Φ = Φ Φ^† = I.

In order for the transformation A to be the space domain equivalent of the spectral prewhitener of the MACE filter, the relationship

A x = Φ^† B Φ x,

where B is the same matrix as in the decomposition above, must be true, which by inspection means that

A = Φ^† B Φ.

Substituting this into the solution for h, and using the property B^† B = B B^† = D^{-1}, yields

h = A x ((A x)^† (A x))^{-1} d
  = Φ^† B Φ x (x^† Φ^† B^† B Φ x)^{-1} d
  = Φ^† B X (X^† D^{-1} X)^{-1} d;
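The algebra above can be verified numerically. This added sketch uses a 1-D signal (assumed small size) so that the unitary DFT matrix Φ can be written out explicitly, and checks that the preprocessor-plus-LAM cascade reproduces the inverse DFT of the spectral MACE solution:

```python
import numpy as np

rng = np.random.default_rng(4)
N, Nt = 16, 3
x = rng.standard_normal((N, Nt))                 # space-domain exemplars as columns
F = np.fft.fft(np.eye(N), axis=0) / np.sqrt(N)   # unitary DFT matrix (Phi)
X = F @ x                                        # spectral exemplars
d = np.ones(Nt)

Dinv = 1.0 / np.mean(np.abs(X) ** 2, axis=1)     # diagonal of D^{-1}
H = (Dinv[:, None] * X) @ np.linalg.solve(X.conj().T @ (Dinv[:, None] * X), d)

B = np.sqrt(Dinv)                                # B = D^{-1/2}
A = F.conj().T @ (B[:, None] * F)                # space-domain prewhitener Phi^† B Phi
y = A @ x                                        # preprocessed exemplars
h = y @ np.linalg.solve(y.conj().T @ y, d)       # minimum-norm LAM solution
h_sys = A @ h                                    # cascade: preprocessor then LAM

equivalent = np.allclose(h_sys, F.conj().T @ H)
print(equivalent)
```

The cascade h_sys matches Φ^† H to numerical precision, which is the equivalence the derivation establishes.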


combining this solution for h with the preprocessor A gives the equivalent linear system h_sys:

h_sys = A h = Φ^† B Φ Φ^† B X (X^† D^{-1} X)^{-1} d = Φ^† D^{-1} X (X^† D^{-1} X)^{-1} d.

Substituting the MACE filter solution gives the result

h_sys = Φ^† H_MACE,

and so h_sys is the inverse DFT pair of the spectral domain MACE filter. This result establishes the relationship between the MACE filter and the linear associative memory. The decomposition of the MACE filter can also be considered as a cascade of a linear preprocessor followed by a linear associative memory (LAM), as in the figure.

Figure: Decomposition of the MACE filter as a preprocessor (i.e. a prewhitening filter over the average power spectrum of the exemplars) followed by a linear associative memory.

Since the two are equivalent, then why make the distinction between the two perspectives? There are several reasons. The development of distortion invariant filtering and associative memories has proceeded in parallel. Distortion invariant filtering has been


concerned with finding projections which will essentially detect a set of images. Toward this goal, the techniques have emphasized analytic solutions resulting in linear discriminant functions. Advances have been concerned with better descriptions of the second order statistics of the causes of false detections. The approach, however, is still a data driven approach: the desired recognition class is represented through exemplars. In the distortion invariant filtering approach, the task has been confined to fitting a hyperplane to the recognition exemplars subject to various quadratic optimization criteria.

The development of associative memories has proceeded along a different track. It is also data driven, but the emphasis has been on iterative machine learning methods. Many of the methods are biologically motivated, including the perceptron learning rule [Rosenblatt] and Hebbian learning [Hebb]. Other methods, including the least-mean-square (LMS) algorithm [Widrow and Hoff] (which we have described) and the backpropagation algorithm [Rumelhart et al.; Werbos], are gradient descent based methods.

From the classification standpoint, of which the ATR problem is a subset, iterative methods have certain advantages. This can be illustrated with a simple example. Suppose the data matrix

x = [x_1, x_2, …, x_{N_t}] ∈ R^{N_1 N_2 × N_t}

were not full rank. In other words, the exemplars representing the recognition class could be represented without error in a subspace of dimension less than N_t. From an ATR perspective this would be a desirable property. The implicit assumption in any data driven method is that information about the recognition class is transmitted through exemplars. This is as true for distortion invariant filters, which have analytic solutions, as it is for iterative methods. The smaller the dimension of the subspace in which the recognition class lies, the better we can discriminate images considered to be out of the class.

One limitation of the analytic solutions of distortion invariant filters is that they require the inverse of a matrix of the form

x^† Q x,

where Q is a positive definite matrix representing a quadratic optimization criterion. If the matrix x is not full column rank, there is no inverse for this matrix, and consequently no analytic solution for any of the distortion invariant filters. The LMS algorithm, however, will still find a best fit to the design goal, which is to minimize the criterion while satisfying the linear constraints.

We can illustrate this by modifying the data from the earlier experiments. It is well known that the data matrix x can be decomposed using the singular value decomposition (SVD) as

x = U Λ V^T,

where the columns of U form an orthonormal basis (the principal components of the vectors x_i, in fact), the diagonal matrix Λ ∈ R^{N_t × N_t} contains the singular values of the data matrix, and V ∈ R^{N_t × N_t} is unitary. The columns of the data matrix can be projected onto a subspace by setting one of the diagonal elements of Λ to zero. The importance of any of the basis vectors in U is directly proportional to the corresponding singular value, so we can choose one of the smaller singular values to set to zero without


changing the basic structure of the data. For this example we choose the twelfth largest singular value. A data matrix x_sub is generated by

x_sub = U Λ_sub V^T,

where Λ_sub is the diagonal matrix of singular values of the original data matrix x with the chosen singular value set to zero. This data matrix is not full rank, so there is no analytical solution for the MACE filter; however, we can use the LMS approach and derive a linear associative memory. The columns of x_sub are preprocessed with a prewhitening filter computed over the average power spectrum. The LMS algorithm can then be used to iteratively compute the transformation that best fits x_sub^† h = d in a least squares sense; that is, we can find the h that minimizes

(x_sub^† h − d)^T (x_sub^† h − d),

where d is a column vector of desired responses (set to all unity in this case). The peak output response for this filter was computed over all of the aspect views of vehicle 1a and is shown in the figure. The exemplars used to compute the filter are plotted with diamond symbols. The desired response cannot be met exactly, so a least squares fit is achieved. The next figure shows the correlation output surface for one of the training exemplars.
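The rank-deficiency experiment can be sketched with synthetic data (the sizes and random matrix are assumptions for illustration): zeroing one singular value makes the constraint system inconsistent, so only a least-squares fit, the solution batch LMS tends toward, is available, while the full-rank matrix admits an exact solution.

```python
import numpy as np

rng = np.random.default_rng(6)
N, Nt = 100, 12
x = rng.standard_normal((N, Nt))
d = np.ones(Nt)                                 # desired responses, all unity

U, s, Vt = np.linalg.svd(x, full_matrices=False)
s_sub = s.copy()
s_sub[-1] = 0.0                                 # drop the smallest (12th) singular value
x_sub = U @ np.diag(s_sub) @ Vt                 # rank-deficient data matrix

# Full rank: x^T h = d has an exact solution; rank-deficient: only least squares.
h_full, *_ = np.linalg.lstsq(x.T, d, rcond=None)
h_sub, *_ = np.linalg.lstsq(x_sub.T, d, rcond=None)
res_full = np.linalg.norm(x.T @ h_full - d)
res_sub = np.linalg.norm(x_sub.T @ h_sub - d)
print(res_full < 1e-8, res_sub > 1e-6)
```

The nonzero residual in the rank-deficient case is the limit that the NMSE learning curve approaches in the experiment described above.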


Figure: Peak output response over all aspects of vehicle 1a when the data matrix is not full rank. The LMS algorithm was used to compute the filter coefficients.

As can be seen in the image, the qualities of low variance and a localized peak are still maintained using the iterative method. The learning curve, which measures the normalized mean square error (NMSE) between the filter output and the desired output, is shown as a function of the learning epoch (an epoch is one pass through the data) in the figure. When the data matrix is full rank, as shown with a solid line, we see that there is an exact solution and the error approaches zero. When x_sub is used, the NMSE approaches a limit because there is no exact solution, and so a least squares solution is found.


Figure: Output correlation surface for the LMS-computed filter from non-full-rank data. The filter output is not substantially different from the analytic solution with full rank data.

Since the system of constraint equations is generally underdetermined, there are infinitely many filters which will satisfy the constraints. There is only one, however, that minimizes the norm of the filter (the optimization criterion after preprocessing) [Kohonen]. The figure shows the NMSE between the analytic solution for the filter coefficients and the iterative method. When the data matrix is full rank, the iterative method approaches the optimal analytic solution, as shown by the solid line in the figure. When the data matrix is not full rank, as shown by the dashed line in the figure, the error in the iterative solution approaches a limit.

These qualities of iterative learning methods are important from the ATR perspective. We see from the example that when the data possesses a quality that would seemingly be

(Footnote: In this case "iterative" refers to the LMS algorithm; within this text it generally refers to a gradient search algorithm.)
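The minimum-norm property cited from Kohonen can be illustrated with an added sketch (sizes and random associations are assumptions): the pseudo-inverse solution recalls every pair exactly, and any other exact solution, obtained by adding a null-space component, has a strictly larger norm.

```python
import numpy as np

rng = np.random.default_rng(5)
N, Nt = 50, 4                                  # Nt < N: underdetermined constraints
x = rng.standard_normal((N, Nt))
d = rng.standard_normal(Nt)

h = x @ np.linalg.solve(x.T @ x, d)            # h = x (x^T x)^{-1} d
recalled = np.allclose(x.T @ h, d)             # every association met exactly

v = rng.standard_normal(N)                     # project a random vector onto the
v -= x @ np.linalg.solve(x.T @ x, x.T @ v)     # null space of x^T
still_exact = np.allclose(x.T @ (h + v), d)    # h + v also satisfies the constraints
larger = np.linalg.norm(h + v) > np.linalg.norm(h)
print(recalled and still_exact and larger)
```

Because h lies in the range of x, it is orthogonal to every null-space perturbation v, so ||h + v||² = ||h||² + ||v||², which is the geometric content of the minimum-norm claim.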


Figure: Learning curve for the LMS approach. The learning curve for the LMS algorithm with the full rank data matrix is shown with a solid line; the non-full-rank case is shown with a dashed line.

useful to the ATR problem, namely that the class can be described by a subspace, the analytic solution fails when the number of exemplars exceeds the dimensionality of the subspace. The iterative method, however, finds a reasonable solution. Furthermore, if the data matrix is full rank, the iterative method approaches the optimal analytic solution.

Comments

There are further motivations for the associative memory perspective and, by extension, the use of iterative methods. It is well known that nonlinear associative memory structures can outperform their linear counterparts on the basis of generalization and dynamic range [Kohonen; Hinton and Anderson]. In general they are more difficult to design, as their parameters cannot be computed analytically. The parameters for a large


ILOWHU HUURU )LJXUH 106( EHWZHHQ FORVHG IRUP VROXWLRQ DQG LWHUDWLYH VROXWLRQ 7KH OHDUQLQJ FXUYH IRU WKH /06 DOJRULWKP ZKHQ WKH IXOO UDQN GDWD PDWUL[ LV VKRZQ ZLWK D VROLG OLQH WKH QRQ IXOO UDQN FDVH LV VKRZQ ZLWK D GDVKHG OLQH FODVV RI QRQOLQHDU DVVRFLDWLYH PHPRULHV FDQ KRZHYHU EH GHWHUPLQHG E\ JUDGLHQW VHDUFK WHFKQLTXHV 7KH PHWKRGV RI GLVWRUWLRQ LQYDULDQW ILOWHUV DUH OLPLWHG WR OLQHDU RU SLHFHZLVH OLQHDU GLVFULPLQDQW IXQFWLRQV ,W LV XQOLNHO\ WKDW WKHVH VROXWLRQV DUH RSWLPDO IRU WKH $75 SUREOHP ,Q WKLV FKDSWHU ZH KDYH PDGH WKH FRQQHFWLRQ EHWZHHQ GLVWRUWLRQ LQYDULDQW ILOWHULQJ DQG OLQHDU DVVRFLDWLYH PHPRULHV )XUWKHUPRUH ZH KDYH PRWLYDWHG DQ LWHUDWLYH DSSURDFK 5HFDOO ILJXUH ZKLFK VKRZV WKH DGDOLQH DUFKLWHFWXUH ,Q WKLV DUFKLWHFWXUH ZH FDQ XVH WKH OLQHDU HUURU WHUP LQ RUGHU WR WUDLQ RXU V\VWHP DV D FODVVLILHU 7KLV LV FRQVHTXHQFH RI WKH DVVXPSn WLRQ WKDW D OLQHDU GLVFULPLQDQW IXQFWLRQ LV GHVLUDEOH ,I D OLQHDU GLVFULPLQDQW IXQFWLRQ LV VXE


which will almost always be the case for any high-dimensional classification problem, then we must work directly with the classification error. We have also shown that the MSE criterion is a sufficient proxy for classification error (with certain restrictions); however, it requires that we work with the true output error of the mapping, as well as a mapping with sufficient flexibility (i.e. one that can closely approximate a wide range of functions which are not necessarily linear). The linear systems approach, however, does not allow for either of these requirements. Consequently, we must adopt a nonlinear systems approach if we hope to achieve improved performance. The next chapter will show that the MACE filter can be extended to nonlinear systems such that the desirable properties of shift invariance and localized detection peak are maintained while achieving superior classification performance.


CHAPTER
STOCHASTIC APPROACH TO TRAINING NONLINEAR SYNTHETIC DISCRIMINANT FUNCTIONS

The MACE filter is the best linear system that minimizes the energy in the output correlation plane subject to a peak constraint at the origin. An advantage of linear systems is that we have the mathematical tools to use them in optimal operating conditions from the standpoint of second-order statistics. Such optimality conditions, however, should not be confused with the best possible classification performance.

Our goal is to extend the optimality condition of MACE filters to adaptive nonlinear systems and to improved classification performance. The optimality condition of the MACE filter considers the entire output plane, not just the response when the image is centered. With regard to general nonlinear filter architectures which can be trained iteratively, a brute force approach would be to train a neural network with a desired output of unity for the centered images and zero for all shifted images. This would indeed emulate the optimality of the MACE filter; however, the result is a training algorithm of order N_t N_1 N_2 for N_t training images of size N_1 x N_2 pixels. This is clearly impractical.

In this section we propose a nonlinear architecture for extending the MACE filter. We discuss some of its properties. Appropriate measures of generalization are discussed. We also present a statistical viewpoint of distortion invariant filters from which such nonlinear extensions fit naturally into an iterative framework. From this iterative framework we


present experimental results which exhibit improved discrimination and generalization performance with respect to the MACE filter, while maintaining the properties of localized detection peak and low variance in the output plane.

A Proposed Nonlinear Architecture

As we have stated, the MACE filter can be decomposed as a pre-whitening filter followed by a synthetic discriminant function (SDF), which can also be viewed as a special case of Kohonen's linear associative memory (LAM) [Hester and Casasent; Fisher and Principe]. This decomposition is shown at the top of the figure. The nonlinear filter architecture which we are proposing is shown in the middle of the figure. In this architecture we replace the LAM with a nonlinear associative memory, specifically a feed-forward multilayer perceptron (MLP), shown in more detail at the bottom of the figure. We will refer to this structure as the nonlinear MACE filter (NL-MACE) for brevity.

Another reason for choosing the multilayer perceptron (MLP) is that it is capable of achieving a much wider range of discriminant functions. It is well known that an MLP with a single hidden layer can approximate any discriminant function to arbitrary precision [Funahashi]. One of the shortcomings of distortion invariant approaches such as the MACE filter is that they attempt to fit a hyperplane to the training exemplars as the discriminant function. Using an MLP in place of the LAM relaxes this constraint. MLPs do not, in general, allow for analytic solutions. We can, however, determine their parameters iteratively using gradient search.


Figure: Decomposition of the optimized correlator as a preprocessor followed by SDF/LAM (top). Nonlinear variation shown with the MLP replacing the SDF in the signal flow (middle); detail of the MLP (bottom). The linear transformation A represents the space-domain equivalent of the spectral preprocessor.


Shift Invariance of the Proposed Nonlinear Architecture

One of the properties of the MACE filter is shift invariance. We wish to maintain that property in our nonlinear extensions. A transformation T[.] of a two-dimensional function is shift invariant if it can be shown that

g(n1, n2) = T[y(n1, n2)]  implies  g(n1 - m1, n2 - m2) = T[y(n1 - m1, n2 - m2)],

where m1, m2 are integers. In other words, a shift of the input signal is reflected as a corresponding shift of the output signal [Oppenheim and Schafer]. We show here that this property is maintained for our proposed nonlinear architecture.

The preprocessor of the nonlinear architecture at the bottom of the figure is the same as the preprocessor of the linear filter shown at the top. The preprocessor is implemented as a linear shift invariant (LSI) filter. Cascading shift invariant operations maintains shift invariance of the entire system [Oppenheim and Schafer]. In order to show that the system as a whole is shift invariant, it is sufficient to show that the MLP is shift invariant. The mapping function of the MLP in the figure can be written

g_w(y) = sigma(W3 sigma(W2 sigma(W1 y + phi1) + phi2) + phi3),

where the weight matrices W_i and the bias vectors phi_i together make up the parameter set w. In the nonlinear architecture the matrix W_i represents the connectivities from the processing elements (PEs) of layer (i - 1) to the input of the PEs of layer i; that is, the matrix W_i is applied as a linear transformation to the vector output of layer (i - 1). When i = 1 the transformation is applied to the input vector y. The number of PEs in layer i is


denoted N_i. In the equation above, phi_i is a constant bias vector added to each element of its argument. It is also assumed that if the argument to the nonlinear function sigma(.) is a matrix or vector, then the nonlinearity is applied to each element of the matrix or vector.

The input to the MLP is denoted as a vector y in R^(N1 N2 x 1). The elements of the vector are samples of a two-dimensional pre-whitened input signal y(n1, n2). We can write the i-th element of the vector as a function of the two-dimensional signal as follows:

y_i(n1, n2) = y(n1 + (i mod N1), n2 + floor(i / N1)),  i = 0, ..., N1 N2 - 1,

where (i mod N1) indicates a modulo operation (the remainder of i divided by N1) and floor(i / N1) indicates integer division of i by N1. Written this way, the elements of the vector y sample a rectangular region of support of size N1 x N2 beginning at sample (n1, n2) in the pre-whitened signal y(n1, n2). The vector argument and the resulting output signal can now be written as an explicit function of the beginning sample point of the template within the pre-whitened image:

g(n1, n2) = g_w(y(n1, n2)).

The output of the mapping as written is now an explicit function of (n1, n2) and the constant parameter set w, which does not vary with (n1, n2). We can also write the output response as a function of the shifted version of the image as

g(n1 - m1, n2 - m2) = g_w(y(n1 - m1, n2 - m2)).
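The mapping and its shift property can be checked numerically. The sketch below uses illustrative sizes and random weights, assumes tanh for the nonlinearity sigma, and verifies both the column-major index map above and that a circular shift of the input yields the same circular shift of the output plane.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = np.tanh                      # elementwise nonlinearity (assumed tanh)

N1, N2 = 4, 4                        # template size (illustrative)
# Layer sizes follow the text: N1*N2 inputs -> 2 -> 3 -> 1 output.
W1, p1 = rng.standard_normal((2, N1 * N2)), rng.standard_normal((2, 1))
W2, p2 = rng.standard_normal((3, 2)), rng.standard_normal((3, 1))
W3, p3 = rng.standard_normal((1, 3)), rng.standard_normal((1, 1))

def g(y):
    """MLP mapping g_w(y) = sigma(W3 sigma(W2 sigma(W1 y + p1) + p2) + p3)."""
    return sigma(W3 @ sigma(W2 @ sigma(W1 @ y + p1) + p2) + p3)[0, 0]

def output_plane(img):
    """Apply the MLP to every (circularly shifted) N1 x N2 template position."""
    M1, M2 = img.shape
    out = np.zeros((M1, M2))
    for n1 in range(M1):
        for n2 in range(M2):
            patch = np.roll(img, (-n1, -n2), axis=(0, 1))[:N1, :N2]
            out[n1, n2] = g(patch.reshape(-1, 1, order='F'))  # column ordering
    return out

img = rng.standard_normal((8, 8))          # stand-in pre-whitened image

# Column ordering: element i of the template vector is Y[i % N1, i // N1].
Y = img[:N1, :N2]
assert np.array_equal(Y.reshape(-1, order='F'),
                      np.array([Y[i % N1, i // N1] for i in range(N1 * N2)]))

# A circular shift of the input produces the same circular shift of the output.
shifted = np.roll(img, (2, 3), axis=(0, 1))
assert np.allclose(np.roll(output_plane(img), (2, 3), axis=(0, 1)),
                   output_plane(shifted))
```

The constant parameters never enter the indexing, which is the numerical counterpart of the argument that the mapping commutes with shifts.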


Since the parameters w are constant, the two equations above are sufficient to show that the mapping of the MLP is shift invariant, and consequently the system as a whole (including the shift invariant preprocessor) is also shift invariant.

Classifier Performance and Measures of Generalization

One of the issues for any iterative method which relies on exemplars is the number of training exemplars to use in the computation of the discriminant function. In addition, for iterative methods there is the issue of when to stop the adaptation process. In the case of distortion invariant filters such as the MACE filter, some common heuristics are used to determine the number of training exemplars. Typically, samples are drawn from the training set and used to compute the filter until the minimum peak response over the remaining samples exceeds some threshold [Casasent and Ravichandran]. A similar heuristic is to continue to draw samples from the training set until the mean square error of the peak response over the remaining samples drops below some preset threshold. These measures are then used as indicators of how well the filter generalizes to between-aspect exemplars from the training set which have not been used for the computation of the filter coefficients.

The ultimate goal, however, is classification. Generalization in the context of classification must be related to the ability to classify a previously unseen input [Bishop]. We show by example that the measures of generalization mentioned above may be misleading as predictors of classifier performance for even the linear filters. In fact, the results of the experiments will show that the way in which the data is preprocessed is more indicative of classifier performance than these other indirect measures.


We illustrate this point with an example using ISAR image data. A data set larger than in the previous experiments will be used. Two more vehicles, one from each vehicle type, will be used for the testing set, and all vehicles will be sampled at higher aspect resolution. The figure shows ISAR images taken from five different vehicles and two different vehicle types. The images are all taken with the same radar. Data taken from vehicles in the same class vary in the vehicle configuration and radar depression angle. Images have been formed from each vehicle at uniform aspect increments over the aspect range, yielding the same number of images for each vehicle. The figure shows each of the vehicles at several aspect angles.

We will use vehicle type 1 as the recognition class and vehicle type 2 as a confusion vehicle. Images of vehicle 1a will be used as the set from which to draw training exemplars. Classification performance will then be measured as the ability to recognize vehicles 1b and 1c while rejecting vehicles 2a and 2b. The filter we will use is a form of the OTSDF [Refregier and Figue], which is computed in the spectral domain as

H = [alpha P_x + (1 - alpha) Pbar_x]^(-1) X ( X^H [alpha P_x + (1 - alpha) Pbar_x]^(-1) X )^(-1) d,

where the columns of X are exemplar images of dimension N1 x N2 of vehicle 1a reordered into column vectors. The diagonal matrix P_x in R^(N1N2 x N1N2) contains the coefficients of the average power spectrum measured over the N exemplars of vehicle 1a, while Pbar_x in R^(N1N2 x N1N2) is the identity matrix scaled by the average of the diagonal terms of P_x. Finally, d is a column vector of desired outputs, one for each exemplar. The elements of d are typically


Figure: ISAR images of two vehicle types shown at several aspect angles. Three different vehicles of type 1 (1a, 1b, and 1c) are shown, while two different vehicles of type 2 (2a and 2b) are shown. Vehicle 1a is used as a training vehicle, while vehicles 1b and 1c are used as the testing vehicles for the recognition class. Vehicles 2a and 2b are used as confusion vehicles.

set to unity. When alpha is set to unity the equation yields exactly the MACE filter; when it is set to zero the result is the SDF. The filter we are using is therefore trading off the MACE filter criterion against the SDF criterion. The SDF criterion can also be viewed as the MVSDF [Kumar] criterion when the noise class is represented by a white noise random process. This filter can also be decomposed as in the figure.
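The OTSDF construction just described can be sketched numerically. The following uses toy sizes and random stand-ins for the ISAR exemplars, omits correlation-plane scaling constants, and simply checks that the constraints are met and that the diagonal trade-off matrix behaves as described.

```python
import numpy as np

rng = np.random.default_rng(3)
N1, N2, K = 8, 8, 5                      # image size and exemplar count (toy)
imgs = rng.standard_normal((K, N1, N2))  # stand-ins for the training exemplars

# Columns of X are the 2-D DFTs of the exemplars reordered into vectors.
X = np.stack([np.fft.fft2(im).reshape(-1) for im in imgs], axis=1)

Px = np.mean(np.abs(X) ** 2, axis=1)     # diagonal of P_x: average power spectrum
Pbar = np.full_like(Px, Px.mean())       # identity scaled by the mean diagonal term

def otsdf(alpha, d):
    T = alpha * Px + (1.0 - alpha) * Pbar          # diagonal trade-off matrix
    Y = X / T[:, None]                             # T^{-1} X
    return Y @ np.linalg.solve(X.conj().T @ Y, d)  # T^{-1}X (X^H T^{-1}X)^{-1} d

d = np.ones(K)
H = otsdf(0.9, d)          # close to the MACE criterion
H_mace = otsdf(1.0, d)     # alpha = 1 gives exactly the MACE filter

# For any alpha the peak constraints X^H H = d are satisfied with equality.
assert np.allclose(X.conj().T @ H, d)
assert np.allclose(X.conj().T @ H_mace, d)
```

Because T is diagonal, its inverse is an elementwise division, which is what makes the spectral-domain form of these filters computationally convenient.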


These experiments examine the relationship between the two commonly used measures of generalization and two measures of classification performance. We can draw conclusions from the results about the appropriateness of the generalization measures with regard to classification. The first generalization measure is the minimum peak response, denoted y_min, taken over the aspect range of the images of the training vehicle (excluding the aspects used for computing the filter). The second generalization measure is the mean square error, denoted y_mse, between the desired output of unity and the peak response over the aspect range of the images of the training vehicle (excluding the aspects used for computing the filter). The classification measures are taken from the receiver operating characteristic (ROC) curve, measuring the probability of detection P_d of a testing vehicle in the recognition class (vehicles 1b and 1c) versus the probability of false alarm P_fa on a testing vehicle in the confusion class (vehicles 2a and 2b), based on peak detection. The specific measures are the area under the ROC curve, a general measure of the test being used, and the probability of false alarm when the probability of detection equals a specified value, which measures a single point of interest on the ROC curve.

Two filters are used: one in which both criteria are weighted equally, and one in which alpha is close to unity, i.e. close to the MACE filter criterion. The number of exemplars drawn from the training vehicle (1a) is varied, sampled uniformly in aspect (equivalently, the aspect separation between exemplars is varied). Examination of the figures shows that for both settings of alpha no clear relationship emerges in which the generalization measures are indicators of good classification performance. The table compares the classifier performance when the generalization


measures as described above are used to choose the filter, versus the best ROC performance achieved throughout the range of aspect separation. In one regard the generalization measures were consistent, in that the same aspect separation was predicted by both measures for both settings of alpha. In the figure we compare the ROC curves for two cases: first, where the filter is chosen using the generalization measures, and second, the best achieved ROC curve, for both settings of alpha. We would expect that for each alpha the filter chosen using the generalization measure would be near the best ROC performance. As can be seen in the figure, this is not the case.

Table: Classifier performance measures when the filter is determined by either of the common measures of generalization, as compared to best classifier performance, for two values of alpha. (Rows: ROC area and P_fa at the target P_d for each value of alpha; columns: generalization measures y_min and y_mse, and best achieved; the numeric entries are not recoverable here.)

It is obvious from the figures that the generalization measures are not significantly correlated with the ROC performance. In fact, as summarized in the table, the generalization measures are negatively, albeit weakly, correlated with ROC performance. One feature of the figures is that although ROC performance varies independent of


Figure: Generalization as measured by the minimum peak response. The plot compares y_min versus the classification performance measures (ROC area and P_fa at the target P_d).
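The ROC-derived measures used in these comparisons (area under the curve, and P_fa at a target P_d) can be computed from peak-response scores as in the sketch below. The score distributions and the target P_d of 0.8 are illustrative assumptions, not values from the text.

```python
import numpy as np

def roc_measures(rec_scores, conf_scores, pd_target=0.8):
    """Area under the ROC curve and P_fa at a target P_d, from peak responses.

    rec_scores / conf_scores are detection-peak values on recognition-class
    and confusion-class test images.  The 0.8 target is illustrative only.
    """
    thresh = np.sort(np.concatenate([rec_scores, conf_scores]))[::-1]
    pd = np.array([(rec_scores >= t).mean() for t in thresh])
    pfa = np.array([(conf_scores >= t).mean() for t in thresh])
    area = np.sum(np.diff(pfa) * (pd[1:] + pd[:-1]) / 2)   # trapezoid rule
    pfa_at_pd = pfa[np.argmax(pd >= pd_target)]            # first point reaching target
    return area, pfa_at_pd

rng = np.random.default_rng(4)
rec = rng.normal(1.0, 0.2, 200)    # recognition-class peaks near unity
conf = rng.normal(0.4, 0.2, 200)   # confusion-class peaks lower
area, pfa = roc_measures(rec, conf)
```

Sweeping the threshold over all observed scores traces the full curve, so both the global measure (area) and the single operating point (P_fa at the target P_d) come from one pass over the data.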


Figure: Generalization as measured by the peak response mean square error. The plot compares y_mse versus the classification performance measures (ROC area and P_fa at the target P_d).


Figure: Comparison of ROC curves. The ROC curves for the number of training exemplars yielding the best generalization measure versus the number yielding the best ROC performance are shown for both values of alpha.

either the minimum peak response or the MSE, there does appear to be a dependency on alpha. This leads to a second experiment.

Table: Correlation of generalization measures with classifier performance. In both cases (both values of alpha) the classifier performance, as measured by the area under the ROC curve or by P_fa at the target P_d, has the opposite correlation to what would be expected of a useful measure for predicting performance. (Rows: generalization measures y_min and y_mse; columns: ROC area and P_fa at the target P_d for each value of alpha; the numeric entries are not recoverable here.)

In the second experiment we examine the relationship between the parameter alpha and the ROC performance. The aspect separation between training exemplars is set to three values (in degrees). The value of alpha, the emphasis on the MACE criterion, is varied in the range zero to unity. The figure shows the relationship between ROC performance and the value


of alpha. It is clear from the plots that there is a positive relationship between the emphasis on the MACE criterion and the ROC performance. However, the peak in ROC performance is not achieved at alpha equal to unity. In all three cases the ROC performance peaks just prior to unity, with the performance dropoff at alpha equal to unity increasing with aspect separation.

The difference between the SDF and the MACE filter is the preprocessor. What is shown by this analysis is that, in general, the preprocessor from the MACE filter criterion leads to better classification, but too much emphasis on the MACE filter criterion (alpha equal to unity) leads to a filter which is too specific to the training samples. The problems described above are well known. Alterations to the MACE criterion have been the subject of many researchers [Casasent et al.; Casasent and Ravichandran; Ravichandran and Casasent; Mahalanobis et al.]. There is still as yet no principled method found in the literature by which to set the parameter alpha.

There are two conclusions from this analysis that are pertinent to the nonlinear extension we are using. First, the results show that pre-whitening over the recognition class leads to better classification performance. For this reason we choose to use the preprocessor of the MACE filter in our nonlinear filter architecture. The issue of extending the MACE filter to nonlinear systems can in this way be formulated as a search for a more robust nonlinear discriminant function in the pre-whitened image space.

The second conclusion is that comparisons of the nonlinear filter to its linear counterpart must be made in terms of classification performance only. There are simple nonlinear systems, such as a soft threshold at the output of a linear system, that will outperform


Figure: ROC performance measures versus alpha. Results are shown for three training aspect separations (in degrees). These plots indicate that ROC performance is positively related to alpha. (Panel title: ROC area vs. alpha.)

the MACE filter or its variations in terms of maximizing the minimum peak response over the training vehicle or reducing the variance in the output image plane.


These measures are not, however, sufficient to describe classification performance. We have also used these measures in the past, but feel that they are not the most appropriate for classification [Fisher and Principe].

Statistical Characterization of the Rejection Class

We now present a statistical viewpoint of distortion invariant filters from which such nonlinear extensions fit naturally into an iterative framework. This treatment results in an efficient way to capture the optimality condition of the MACE filter using a training algorithm of much lower order, and which leads to better classification performance than the linear MACE.

A possible approach to design a nonlinear extension to the MACE filter and improve on the generalization properties is to simply substitute the linear processing elements of the LAM with nonlinear elements. Since such a system can be trained with error back-propagation [Rumelhart et al.], the issue would be simply to report on performance comparisons with the MACE. Such a methodology does not, however, lead to understanding of the role of the nonlinearity and does not elucidate the tradeoffs in the design and in training.

Here we approach the problem from a different perspective. We seek to extend the optimality condition of the MACE to a nonlinear system, i.e. the energy in the output space is minimized while maintaining the peak constraint at the origin. Hence we will impose these constraints directly in the formulation, even knowing a priori that an analytical solution is very difficult or impossible to obtain. We reformulate the MACE filter from


a statistical viewpoint and generalize it to arbitrary mapping functions, linear and nonlinear.

Consider images of dimension N1 x N2 reordered by column (or row) into vectors. Let the rejection class be characterized by the random vector X1 in R^(N1N2 x 1). We know the second-order statistics of this class, as represented by the average power spectrum (or, equivalently, the autocorrelation function). Let the recognition class be characterized by the columns of a data matrix x in R^(N1N2 x N), which are observations of the random vector X2 in R^(N1N2 x 1), similarly reordered. We wish to find the parameters w of a mapping g_w: R^(N1N2) -> R such that we may discriminate the recognition class from the rejection class. Here it is the mapping function g which defines the discriminator topology. Towards this goal we wish to minimize the objective function

J = E( g_w(X1)^2 )

over the mapping parameters w, subject to the system of constraints

g_w(x) = d^T,

where d in R^(N x 1) is a column vector of desired outputs. It is assumed that the mapping function is applied to each column of x, and E(.) is the expected value function.


Using the method of Lagrange multipliers we can augment the objective function as

J = E( g_w(X1)^2 ) + ( g_w(x) - d^T ) lambda,

where lambda in R^(N x 1) is a vector whose elements are the Lagrange multipliers, one for each constraint. Computing the gradient with respect to the mapping parameters yields

(d/dw) E( g_w(X1)^2 ) + ( (d/dw) g_w(x) ) lambda = 0.

This equation, along with the constraints above, can be used to solve for the optimal parameters w*, assuming our constraints form a consistent set of equations. This is, of course, dependent on the mapping topology.

The Linear Solution as a Special Case

It is interesting to verify that this formulation yields the MACE filter as a special case. If, for example, we choose the mapping to be a linear projection of the input image, that is g_w(x) = w^T x with w = [h_1 ... h_(N1N2)]^T in R^(N1N2 x 1), the augmented objective function becomes, after simplification,

w^T E( X1 X1^T ) w + ( w^T x - d^T ) lambda.

In order to solve for the mapping parameters w we are still left with the task of computing the term E( X1 X1^T ), which in general we can only estimate from observations of the random vector X1 or assume a specific form. Assuming that we have a suitable estima-


tor, the well known solution to the minimum of the objective over the mapping parameters, subject to the constraints, is

w = R_xx^(-1) x [ x^T R_xx^(-1) x ]^(-1) d,  where  R_xx = estimate{ E( X1 X1^T ) }.

Depending on the characterization of X1, this equation describes various SDF-type filters (i.e. MACE, MVSDF, etc.). In the case of the MACE filter the rejection class is characterized by all circular shifts of target class images away from the origin. Solving for the MACE filter coefficients is therefore equivalent to using the average circular autocorrelation sequence (or, equivalently, the average power spectrum in the frequency domain) over images in the target class as estimators of the elements of the matrix E( X1 X1^T ). Sudharsanan et al. suggest a very similar methodology for improving the performance of the MACE filter. In that case the average linear autocorrelation sequence is estimated over the target class, and this estimator of E( X1 X1^T ) is used to solve for linear projection coefficients in the space domain. The resulting filter is referred to as the SMACE (space-domain MACE) filter.

Nonlinear Mappings

For arbitrary nonlinear mappings it will in general be very difficult to solve for globally optimal parameters analytically. Our purpose is instead to develop iterative training algorithms which are practical and yield improved performance over the linear mappings.
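The closed-form linear solution and its optimality can be verified numerically on toy data. In the sketch below the correlation matrix is an arbitrary positive definite stand-in for the estimate of E(X1 X1^T), and all dimensions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
N, K = 12, 4                              # vector length, exemplar count (toy)
x = rng.standard_normal((N, K))           # recognition-class data matrix

# Toy positive definite stand-in for the estimate of E[X1 X1^T].
A = rng.standard_normal((N, N))
Rxx = A @ A.T + np.eye(N)

d = np.ones(K)
Ri_x = np.linalg.solve(Rxx, x)
w = Ri_x @ np.linalg.solve(x.T @ Ri_x, d)   # w = R^{-1} x (x^T R^{-1} x)^{-1} d

# Constraints satisfied with equality ...
assert np.allclose(x.T @ w, d)

# ... and w minimizes w^T Rxx w: perturbing along any feasible direction v
# (one with x^T v = 0) cannot lower the objective.
v = rng.standard_normal(N)
v -= x @ np.linalg.lstsq(x, v, rcond=None)[0]   # project v onto null(x^T)
assert w @ Rxx @ w <= (w + v) @ Rxx @ (w + v) + 1e-9
```

The second check works because R_xx w lies in the span of the exemplars, so the cross term with any feasible perturbation vanishes, leaving only the non-negative quadratic term.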


It is through the implicit description of the rejection class by its second-order statistics that we have developed an efficient method extending the MACE filter and other related correlators to nonlinear topologies such as neural networks.

As stated, our goal is to find mappings, defined by a topology and a parameter set, which improve upon the performance of the MACE filter in terms of generalization while maintaining a sharp constrained peak in the center of the output plane for images in the recognition class. One approach which leads to an iterative algorithm is to approximate the original objective function with the modified objective function

J' = (1 - beta) E( g_w(X1)^2 ) + beta [ g_w(x) - d^T ] [ g_w(x) - d^T ]^T,

where beta is a trade-off parameter. The principal advantage gained by using the modified objective function is that we can solve iteratively for the parameters of the mapping function (assuming it is differentiable) using gradient search. The constraint equations, however, are no longer satisfied with equality over the training set. It has been recognized that the choice of constraint values has a direct impact on the performance of optimized linear correlators. Sudharsanan et al. have explored techniques for optimally assigning these values within the constraints of a linear topology. Other methods have been suggested [Mahalanobis et al.; Kumar and Mahalanobis] to improve the performance of distortion invariant filters by relaxing the equality constraints. Mahalanobis extends this idea to unconstrained linear correlation filters. The OTSDF objective function of Refregier appears similar to the modified objective function, and indeed for a linear topology this can be solved analytically as an optimal trade-off problem.


Our primary purpose for modifying the objective function is to allow for an iterative method within the NL-MACE architecture. We have already shown in the previous chapter that this choice of criterion is suitable for classification. We will show that the primary qualities of the MACE filter are still maintained when we relax the equality constraints in our formulation. Varying the trade-off parameter in the range [0, 1] controls the degree to which the average response to the rejection class is emphasized versus the variance about the desired output over the recognition class.

As in the linear case, we can only estimate the expected variance of the output due to the random vector input, and its associated gradient. If, as in the MACE (or SMACE) filter formulation, the rejection class is characterized by all circular (or linear) shifts of the recognition class away from the origin, then this term can be estimated with a sampled average over the exemplars x for all such shifts. From an iterative standpoint this still leads to the impractical approach of training exhaustively over the entire output plane. It is desirable, then, to find other equivalent characterizations of the rejection class which may alleviate the computational load without significantly impacting performance.

Efficient Representation of the Rejection Class

Training becomes an issue once the associative memory structure takes a nonlinear form. The output variance of the linear MACE filter is minimized for the entire output plane over the training exemplars. Even when the coefficients of the MACE filter are computed iteratively, we need only consider the output point at the designated peak location (constraint) for each pre-whitened training exemplar [Fisher and Principe]. This is due to the fact that, for the underdetermined case, the linear projection which satisfies


the system of constraints with equality and has minimum norm is also the linear projection which minimizes the response to images with a flat power spectrum. This solution is arrived at naturally via a gradient search which only considers the response at the constraint location. This is no longer the case when the mapping is nonlinear. Adapting the parameters via gradient search (such as error back-propagation) on recognition class exemplars only at the constraint location will not, in general, minimize the variance over the entire output image plane. In order to minimize the variance over the entire output plane, we must consider the response of the filter to each location in the input image, not just the constraint location.

The MACE filter optimization criterion minimizes, on average, the response to all images with the same second-order statistics as the rejection class. At the output of the pre-whitener (prior to the MLP), any white sequence will have the same second-order statistics as the rejection class. This condition can be exploited to make the training of the MLP more efficient. From an implementation standpoint, the pre-whitening stage and the input layer weights can be combined into a single equivalent linear transformation; however, pre-whitening separately allows the rejection class to be represented by white sequences at the input to the MLP during the training phase.

This result is due to the statistical formulation of the optimization criterion. Minimizing the response to white sequences, on average, minimizes the response to shifts of the exemplar images (since they have the same second-order statistics after pre-whitening). Consequently, we do not have to train over the entire output plane exhaustively, thereby reducing training times proportionally by the input image size N1 N2. Instead we use a


small number of randomly generated white sequences to efficiently represent the rejection class. The result is an algorithm whose cost per pass is proportional to the number of exemplars plus N_ns (where N_ns is the number of white noise rejection class exemplars), as compared to exhaustive training over the output plane.

Experimental Results

We now present experimental results which illustrate the technique and potential pitfalls. There are four significant outcomes in the experiments presented in this section. The first is that when using the white sequences to characterize the rejection class, the linear solution is a strong attractor. The second outcome is that imposing orthogonality on the input layer of the MLP tends to lead to a nonlinear solution with improved performance. The third result, in which we restrict the rejection class to a subspace, yields a significant decrease in the convergence time. The fourth result, in which we borrow from the idea of using the interior of the convex hull to represent the rejection class [Kumar et al.], yields significantly better classification performance.

In these experiments we use the data depicted in the figure. As in the previous experiments, images from vehicle 1a will be used as the training set. Vehicles 1b and 1c will be used as the recognition class, while vehicles 2a and 2b will be used as a rejection/confusion class for testing purposes. In each case comparisons will be made to a baseline linear filter. Specifically, in all cases the value of alpha for the linear filter is set close to unity, and the aspect separation between training images is fixed, which determines the number of training exemplars from vehicle 1a. These settings of alpha and aspect separation were found to give the best classifier performance for the linear filter with this data set. We continue to refer to this as a MACE filter, since the MACE criterion is so heavily emphasized. Technically it is an


OTSDF filter, but such nomenclature does not convey the type of preprocessing that is being performed. We choose the value of alpha so as to compare to the best possible MACE filter for this data set. The nonlinear filter will use the same preprocessor as the linear filter (i.e. the same alpha). The MLP structure is shown at the bottom of the figure. It accepts an N1 N2 input vector (a preprocessed image reordered into a column vector), followed by two hidden layers (with two and three hidden PE nodes, respectively) and a single output node. The parameters of the MLP (the weight matrices W_i and bias vectors phi_i) are to be determined through gradient search. The gradient search technique used in all cases will be the error back-propagation algorithm.

Experiment I: noise training

As stated, using the statistical approach, the rejection class is characterized by white noise sequences at the input to the MLP. The recognition class is characterized by the exemplars. It is from these white noise sequences that the MLP, through the back-propagation learning algorithm, captures information about the rejection class. So it would seem a simple matter during the training stage to present random white noise sequences as the rejection class exemplars. This is exactly the training method used for this experiment. From our empirical observations, with this method of training the linear solution is a strong attractor. The results of the first experiment demonstrate this behavior.
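The noise-training procedure just described can be sketched in miniature. This is an illustrative stand-in, not the dissertation's experiment: a tiny one-hidden-layer network with a linear output PE replaces the actual MLP, random vectors stand in for the pre-whitened exemplars, and the trade-off weight, step size, and all dimensions are assumed values.

```python
import numpy as np

rng = np.random.default_rng(6)
N, K, Nns = 16, 5, 8          # input size, recognition exemplars, noise exemplars
x = rng.standard_normal((N, K))   # stand-ins for pre-whitened recognition exemplars
d = np.ones(K)                    # desired peak outputs

# Tiny differentiable stand-in for the mapping: one tanh layer, linear output PE.
W1 = 0.1 * rng.standard_normal((3, N))
W2 = 0.1 * rng.standard_normal((1, 3))

def forward(Y):
    H = np.tanh(W1 @ Y)
    return W2 @ H, H              # outputs (row vector) and hidden activations

beta, mu = 0.5, 0.05              # trade-off and step size (assumed values)
for it in range(3000):
    noise = rng.standard_normal((N, Nns))   # fresh white rejection-class exemplars
    gx, Hx = forward(x)
    gn, Hn = forward(noise)
    # Modified objective: (1-beta)*mean(g(noise)^2) + beta*mean((g(x)-d)^2).
    go_x = beta * 2 * (gx - d) / K          # dJ/dg on recognition exemplars
    go_n = (1 - beta) * 2 * gn / Nns        # dJ/dg on noise exemplars
    dW2 = go_x @ Hx.T + go_n @ Hn.T         # back-propagate to the output weights
    dW1 = ((W2.T @ go_x) * (1 - Hx**2)) @ x.T \
        + ((W2.T @ go_n) * (1 - Hn**2)) @ noise.T
    W2 -= mu * dW2
    W1 -= mu * dW1

gx, _ = forward(x)
gn, _ = forward(rng.standard_normal((N, 500)))
# After training: peaks near the desired outputs, small average response to
# white sequences that were never seen during training.
```

Drawing fresh white sequences on every pass is what lets a handful of noise exemplars stand in for the full set of shifted images, since only their common second-order statistics matter to the criterion.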


The figure shows the peak output response taken over all images of vehicle 1a for both the linear (top) and nonlinear (bottom) filters. In the figure we see that for the linear filter the peak constraint (unity) is met exactly for the training exemplars, with degradation for the between-aspect exemplars. As mentioned previously, if the pure MACE filter criterion were used (alpha equal to unity), the peak in the output plane would be guaranteed to be at the constraint location [Mahalanobis et al.]. It turns out that for this data set the peak output also occurs at the constraint location for the training images; however, with alpha less than unity this was not guaranteed. Examination of the peak output response for the NL-MACE filter shows that the constraints are met very closely (but not exactly) for the training exemplars, also with degradation in the peak output response at between-aspect locations. The degradation for the nonlinear filter is noticeably less than in the linear case, and so in this regard it has outperformed the linear filter.

The next figure shows the output plane response for a single image of vehicle 1a (not one used for computing the filter coefficients) for the linear filter (top) and the nonlinear filter (bottom). Again in this figure we see that both filters result in a noticeable peak when the image is centered on the filter and a reduced response when the image is shifted. The reduction in response to the shifted image is again noticeably better in the nonlinear filter than in the linear filter. Such would be found to be true for all images of vehicle 1a, and so in this regard we can again say that the nonlinear filter outperformed the linear filter.

However, as we have already illustrated for the linear case, these measures are not sufficient to predict classifier performance alone, and are certainly not sufficient to compare linear systems to nonlinear systems. This point is made clear in the table, which summarizes the classifier performance at two probabilities of detection for all of the experiments


Figure: Peak output response of the linear (top) and nonlinear (bottom) filters over the training set, plotted against aspect angle (degrees). The nonlinear filter clearly outperforms the linear filter by this metric alone.

reported here, when vehicles 1b and 1c are used as the recognition class and vehicles 2a and 2b are used for the rejection class. At this point we are only interested in the results pertaining to the linear filter (our baseline) and the nonlinear filter results for experiment I.
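The comparisons in the table are made at fixed (Pd, Pf) operating points read off the ROC curve. A minimal sketch of how such a curve is traced from peak-output scores follows; the Gaussian scores below are synthetic stand-ins for illustration, not the dissertation's ISAR data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical detector scores: recognition class centered above the rejection class.
scores_rec = rng.normal(1.0, 0.5, 1000)   # recognition-class peak outputs
scores_rej = rng.normal(0.0, 0.5, 1000)   # rejection-class peak outputs

def roc_point(threshold):
    """Return (Pd, Pf) for one decision threshold on the peak output."""
    pd = np.mean(scores_rec >= threshold)  # probability of detection
    pf = np.mean(scores_rej >= threshold)  # probability of false alarm
    return pd, pf

# Sweep thresholds to trace the ROC curve; both Pd and Pf fall as the
# threshold is raised.
thresholds = np.linspace(-2, 3, 200)
curve = np.array([roc_point(t) for t in thresholds])
print(curve[0], curve[-1])
```

Comparing classifiers then amounts to reading Pf from each curve at the same Pd, as done in the table.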


Figure: Output response of the linear filter (top) and nonlinear filter (bottom). The response is for a single image from the training set, but not one used to compute the filter.

This table shows that the classifier performance for the linear and nonlinear filters is nominally the same, despite what may be perceived to be better performance of the nonlinear filter with regard to peak response over the training vehicle and reduced output-plane response to shifts of the image. Furthermore, if we examine the figure, which shows


the ROC curve for both filters, we see that they overlay each other. From a classification standpoint the two filters are equivalent.

Figure: ROC curves for the linear filter (solid line) versus the nonlinear filter (dashed line). Despite improved performance of the nonlinear filter as measured by peak output response and reduced variance over the training set, the filters are equivalent with regard to classification over the testing set.

This result is best explained by the feature-space figure. Recall the points u1 and u2 labeled in the network diagram. We can view these outputs as a feature space; that is, the MLP discriminant function can be superimposed on the projection of the input image onto this space. In this case the feature space is a representation of the input vector internal to the MLP structure. The designation of these points as features is due to the fact that they represent some abstract qual-


Figure: Experiment I. Resulting feature space from simple noise training. Note that all points are projected onto a single curve in the feature space. In the top figure, squares are the recognition class training exemplars, triangles are white noise rejection class exemplars, and plus signs are the images of vehicle 1a not used for training. In the bottom figure, squares are the peak responses from vehicles 1b and 1c, and triangles are the peak responses from vehicles 2a and 2b.


ity of the data, and the decision surface can be computed as a function of the features. Mathematically this can be written

    y = σ(W2 u + θ2),    u = σ(W1 x + θ1),

where u is the feature vector measured at the output of the first layer. Recall that the matrix Wi represents the connectivities from the outputs of layer (i-1) to the inputs of the PEs of layer i, θi is a constant bias term, and σ(·) is a sigmoidal nonlinearity (a hyperbolic tangent function in this case). The figure shows this projection for the training set (top) and the testing set (bottom). What is significant in the figure is that although the discriminant, as a function of the vector u, is nonlinear, the projections of the images lie on a single curve in this feature space. Topologically, this filter can be put into one-to-one correspondence with a linear projection. This is not to say that the linear solution is undesirable; after all, under the optimization criterion it can be computed in closed form. Furthermore, in a space as rich as the ISAR image space, it is unlikely that the linear solution will give the best classification performance.

Table: Comparison of ROC classifier performance for two values of Pd. Results are shown for the linear filter versus four different types of nonlinear training: N = white noise training, GS = Gram-Schmidt orthogonalization, subN = PCA subspace noise, CH = convex hull rejection class. Columns: Pd (%), Pf (%), linear filter, and nonlinear filter experiments I-IV: I (N), II (N, GS), III (subN, GS), IV (subN, GS, CH).

Experiment II: noise training with an orthogonalization constraint

As a means of avoiding the linear solution, a modification was made to the training algorithm. The modification was to impose orthogonality on the columns of W1 through a


Gram-Schmidt process. The motivation for doing this stems from the fact that we are working in a prewhitened image space. In a prewhitened image space this condition is sufficient to assure that the outputs in the feature space, as measured at u1 and u2, will be uncorrelated over the rejection class. Mathematically this can be shown as

    E{u u^T} = E{W1^T x x^T W1} = W1^T E{x x^T} W1 = σ² W1^T W1 = σ² I,

where the rejection class satisfies E{x x^T} = σ² I and w1, w2, … ∈ R^(N1N2 × 1) are the orthonormal columns of W1. This result is true for any number of nodes in the first layer of the MLP.

The results of training with this modification are shown in the figure, which plots the resulting feature space as measured at u1 and u2. From this figure we can see that the discriminant function, represented by the contour lines, is a nonlinear function of u1 and u2. Furthermore, because the projections of the vehicles into the feature space do not lie on a single curve (as in the previous experiment), the features represent different discrimination information with regard to both the rejection and recognition classes. The bottom of the figure, showing the projection of a random sampling of the test vehicles (all would be too dense for plotting), shows that both features are useful for separating vehicle 1 from vehicle 2. Examination of the table (column II of the nonlinear results) shows that at the two detection probabilities of interest, improved false alarm performance has been


obtained. The figure shows the ROC curve for the resulting filter. It is evident that the nonlinear filter is a uniformly better test for classification.

Figure: Experiment II. Resulting feature space when orthogonality is imposed on the input layer of the MLP. In the top figure, squares indicate the recognition class training exemplars, triangles indicate white noise rejection class exemplars, and plus signs are the images of vehicle 1a not used for training. In the bottom figure, squares are the peak responses from vehicles 1b and 1c, and triangles are the peak responses from vehicles 2a and 2b.
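The uncorrelatedness argument above can be checked numerically. The sketch below uses illustrative dimensions and a QR factorization in place of an explicit Gram-Schmidt sweep (both produce an orthonormal column set):

```python
import numpy as np

rng = np.random.default_rng(1)
d, m = 64, 2                      # input dimension and first-layer width (illustrative)

# Orthonormalize a random weight matrix; QR plays the role of Gram-Schmidt.
W1, _ = np.linalg.qr(rng.standard_normal((d, m)))

# White-noise rejection-class exemplars with E{x x^T} = sigma^2 I.
sigma = 1.0
X = sigma * rng.standard_normal((d, 200_000))

U = W1.T @ X                      # linear part of the first-layer outputs
C = (U @ U.T) / X.shape[1]        # sample estimate of E{u u^T}

# E{u u^T} = W1^T E{x x^T} W1 = sigma^2 W1^T W1 = sigma^2 I, so the
# off-diagonal (cross-correlation) entries should be near zero.
print(np.round(C, 2))
```

The sample correlation matrix comes out close to σ²I, matching the derivation.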


Figure: Experiment II. Resulting ROC curve with the orthogonality constraint.

Convinced that this filter represents a better test for classification than the linear filter, we now examine the results for the other features of interest. The figure shows the output response of this filter for one of the images. As seen in the figure, a noticeable peak at the center of the output plane has been achieved. This shows that the filter maintains the localization properties of the linear filter. In this way the characterization of the rejection class by its second-order statistics, the addition of the orthogonality constraint at the input layer of the MLP, and the use of a nonlinear topology have resulted in a superior classification test.

Experiment III: subspace noise training

The next experiment describes an additional modification to this technique. One of the issues in training nonlinear systems is the convergence time. Training methods which require overly long training times are not of much practical use. We have already shown


Figure: Experiment II. Output response to an image from the recognition class training set.

how to reduce the training complexity by recognizing that we can sufficiently describe the rejection class with white noise sequences. We now show a more compact description of the rejection class which leads to shorter convergence times, as demonstrated empirically. This description relies on the well-known singular value decomposition (SVD).

We view the random white sequences as stochastic probes of the performance surface in the whitened image space. The classifier discriminant function is, of course, not determined by the rejection class alone; it is also affected by the recognition class. We have shown previously that the white noise sequences enable us to probe the input space more efficiently than examining all shifts of the recognition exemplars. However, we are still searching a space of dimension equal to the image size, N1N2.

One of the underlying premises of a data-driven approach is that the information about a class is conveyed through exemplars. In this case the recognition class is represented by


Nt exemplars placed in the data matrix X ∈ R^(N1N2 × Nt). It is well known that X, if it is full rank, can be decomposed with the SVD as X = U Λ V^T, where the columns of U form an orthonormal basis for the column space of the data matrix, Λ contains the singular values, and V is an orthogonal matrix. This decomposition has many well-known properties, including compactness of representation for the columns of the data matrix [Gerbrands]. Indeed, as has been noted by Gheen, the SDF can be written as a function of the SVD of the data matrix:

    h_SDF = U Λ^-1 V^T d.

We will use this recognition class representation to further refine our description of the rejection class for training. As we stated, the underlying assumption in a data-driven method is that the data matrix X conveys information about the recognition class; any information about the recognition class outside the space of the data matrix is not attainable from this perspective. The information certainly exists, but there is no mechanism by which to include it in the determination of the discriminant function within this framework. This does, however, lead to a more efficient description of the rejection class. We can modify our optimization criterion to reduce the response to white sequences as they are projected into the Nt-dimensional subspace of the data matrix. Effectively this reduces the search for a discriminant function from an N1N2-dimensional space to an Nt-dimensional subspace.
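The SVD identities above are easy to verify numerically. A minimal sketch with toy dimensions (256 pixels, 5 exemplars — not the experiment's actual sizes) checks both the SDF-via-SVD expression and the subspace containment of projected noise:

```python
import numpy as np

rng = np.random.default_rng(2)
N, Nt = 256, 5                    # pixels (N1*N2) and exemplar count, toy sizes
X = rng.standard_normal((N, Nt))  # data matrix, one recognition exemplar per column
d = np.ones(Nt)                   # unity peak constraints

# Thin SVD of the (full-rank) data matrix: X = U diag(s) V^T.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# SDF expressed through the SVD: h = U diag(1/s) V^T d.
h = U @ ((Vt @ d) / s)

# The filter meets the linear constraints X^T h = d (up to round-off).
print(np.round(X.T @ h, 6))

# Subspace rejection-class noise: an Nt-dimensional white vector pushed
# through the basis U lies entirely in the column space of the data matrix,
# so projecting and reconstructing leaves a zero residual.
n = rng.standard_normal(Nt)
x_noise = U @ n
residual = x_noise - U @ (U.T @ x_noise)
print(float(np.linalg.norm(residual)))
```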


The adaptation scheme of backpropagation provides a simple mechanism to implement this constraint. The adaptation of matrix W1 at iteration k can be written as

    W1(k+1) = W1(k) + η x(k) e(k)^T,

where e(k) is a column vector derived from the backpropagated error, η is the step size, and x(k) is the current input exemplar (from either class) presented to the network, which by design lies in the subspace spanned by the columns of U. From this equation, if the rejection class noise exemplars are restricted to lie in the data space of X (which can be achieved by projecting random vectors of size Nt onto the basis U above) and W1 is initialized to be a random projection from this space, we will be assured that the columns of W1 only extract information from the data space of X. This is because the columns of W1 will only be constructed from vectors which lie in the column space of U, and so will be orthogonal to any vector component that lies in the null space of U.

The search for a discriminant function is now reduced from within an N1×N2-dimensional space to a search from within an Nt-dimensional space. Due to the dimensionality reduction achieved, we would expect the convergence time to be reduced.

This is the method that was used for the third experiment. Rejection class noise exemplars were generated by projecting a random vector n ∈ R^(Nt × 1) onto the basis U by x = U n. In the figure the resulting discriminant function is shown as in the previous


experiments, and the result is similar to experiment II. The classifier performance as measured in the table, and the ROC curve in the figure, are also nominally the same.

Figure: Experiment III. Resulting feature space when the subspace noise is used for training. Symbols represent the same data as in the previous case.

There are, however, two notable differences. Examination of the figure shows that the output response to shifted images is even lower, allowing for better localization. This con-


Figure: Experiment III. Resulting ROC curve for subspace noise training.

dition was found to be the case throughout the data set. Of more significance is the result shown in the figure comparing the learning curves of all of the experiments presented here. In this figure the dashed and dashed-dot lines are the learning curves for experiments II and III, respectively. In this case the convergence rate was increased nominally by a factor of three. Here an epoch represents one pass through all of the training data.

Experiment IV: convex hull approach

In this experiment we present a technique which borrows from the ideas of Kumar et al. That approach designed an SDF which rejects images which are away from the


Figure: Experiment III. Output response to an image from the recognition class training set.

Figure: Learning curves for three methods. Experiment II: white noise training (dashed line). Experiment III: subspace noise (dashed-dot line). Experiment IV: subspace noise plus convex hull exemplars (solid line).


boundary of the convex hull of the training set. The convex hull of a set {x1, x2, …, xN} is defined as all points which can be represented as

    x = Σ_{i=1}^{N} a_i x_i,

where the a_i's are constrained to satisfy

    Σ_{i=1}^{N} a_i = 1,    a_i ≥ 0.

It was pointed out by Kumar et al. that when the peak constraints for the SDF (or any of the linear distortion-invariant filters) are all set to unity, points in the interior of the convex hull of the training exemplars are recognized as well as those near the extremal points. This would include, for example, an image which is the mean of the training exemplars. Examination of imagery derived from points that are closer to the interior of the convex hull, rather than near the boundary, would indicate that such points are not representative of the recognition class.

It was suggested that a way to mitigate this property was to set the desired outputs over the training set to be complex, with unity magnitude and zero mean. The magnitude of the output was then used as the response. In this way only points near the boundary of the convex hull are recognized.

The approach taken here is similar in that exemplars from the interior of the convex hull are used as representative of the rejection class. The difference is that this description is included in the learning process without a priori determining the decision surface (e.g.


magnitude of the correlator output). It is the nonlinear iterative process which determines how to separate the recognition class exemplars from the images derived from the convex hull. The result is significantly improved classification.

In this experiment we continue to use random noise projected onto the basis defined by the columns of the matrix U, as in experiment III. In addition, convex hull exemplars are generated by projecting a random vector a ∈ R^(Nt × 1) onto the data matrix X. The basis for this approach is that elements of the convex hull that are distant from the extremal points (the training exemplars) do not convey information about the recognition class, and so in keeping with this idea we imposed a further restriction on the coefficients a_i, namely an upper bound on each individual a_i. This restriction assures that none of the generated convex hull exemplars lies too close to one of the recognition class training exemplars. Rejection class exemplars from within the convex hull are randomly generated throughout the training from x_rej = X a. Another property of these rejection class exemplars is that they also lie in the subspace of the data matrix X.

Examination of the table and the ROC curve in the figure shows that this method yields significantly improved classification performance. The discriminant function shown in the figure is quite different, and much more nonlinear, than in the previous cases. In the figure the convex hull exemplars are clustered between the subspace noise exemplars and the recognition class exemplars. If this is a general property of the type of data we are using, then it may be a powerful method by which to describe the rejection class within the non-


linear framework. More analysis is needed, however, before this conclusion can be made. We do conclude that in this case the method is an effective means by which to characterize the rejection class. The advantage of this technique versus the linear method of Kumar et al. is that the training learns automatically to separate the recognition class exemplars from the convex hull exemplars, as opposed to a priori assigning a complex desired output for each exemplar.

There were, however, some difficulties with this technique which are worth mentioning. Recall that the motivation for using orthogonalization in the input layer was to increase the likelihood that a nonlinear discriminant function was found. When using convex hull exemplars in the rejection class this may seem unnecessary. In practice, however, it was found that when the orthogonalization was removed, training times became extremely long. Even with orthogonalization, we can see from the learning curve (solid line) in the figure that convergence took over an order of magnitude longer than in experiment III. There were also stability issues with this type of training. The training became unstable nearly as often as it converged. However, when the training did converge, as in the results shown, the classification performance was always superior. Convergence, or its lack, can be measured directly from the MSE. When convergence was not reached in a suitable number of iterations (typically a fixed number of epochs), the algorithm was restarted with a new random parameter initialization. Due to the improved classification results, we believe that this method bears further study.
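The convex-hull exemplar generation described above can be sketched as follows. The cap on the coefficients is an illustrative stand-in for the restriction on the a_i; the exact bound used in the experiments is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)
N, Nt = 256, 5
X = rng.standard_normal((N, Nt))        # recognition-class exemplars as columns

def convex_hull_exemplar(cap=0.5):
    """Draw x_rej = X a with a_i >= 0 and sum(a) = 1, rejecting draws whose
    largest coefficient exceeds `cap` so the sample stays away from any
    single training exemplar. The cap value is an illustrative choice."""
    while True:
        a = rng.random(Nt)
        a /= a.sum()                    # nonnegative, summing to one
        if a.max() <= cap:              # keep interior points only
            return X @ a, a

x_rej, a = convex_hull_exemplar()
print(a.min() >= 0.0, abs(a.sum() - 1.0) < 1e-9, a.max() <= 0.5)
```

Because x_rej is a combination of the columns of X, it automatically lies in the same Nt-dimensional subspace as the subspace noise exemplars.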


Figure: Experiment IV. Resulting feature space from convex hull training. In the figure, symbols are as before. In the top figure, one difference to note is that the convex hull exemplars (indicated by the arrow) are closer to the discriminant boundary and play a greater role in determining the shape of the function.


Figure: Experiment IV. Resulting ROC curve with the convex hull approach.


CHAPTER
INFORMATION-THEORETIC FEATURE EXTRACTION

Introduction

The material presented in this section is motivated by the analysis of the previous chapter. Recall that our analysis of the nonlinear system was aided by reference to a feature space within the MLP architecture. The designation of this feature space led to useful modifications to the iterative training algorithm. In the previous analysis, however, the generation of the features was a function of the training algorithm with regard to the desired system response. The analysis did show that the representation of the data in the feature space was critical to the classification performance.

This section examines the decoupling of the feature extraction stage from the training of the discriminant function in the overall system architecture. Of course, when the feature extraction is decoupled, it is important to use a criterion which is related to the overall goal, classification. The approach described here uses an information-theoretic measure, namely mutual information, as a criterion for adaptation.

We will show that although the feature extraction is decoupled from the classifier training, the resulting features are in fact specific to classification. This method represents a new advance in the area of information-theoretic signal processing and, as such, has wide application beyond nonlinear extensions to MACE filters and classification. We have recently presented a maximum-entropy-based technique for feature extraction [Fisher and Principe], which we now extend to mutual information. This method


differs from previous methods in that it is not limited to linear topologies [Linsker] nor to unimodal probability density functions (PDFs) [Bell and Sejnowski]. The method is directly applicable to any nonlinear mapping which is differentiable in its parameters. In particular, we demonstrate that the technique can be applied to a feedforward multilayer perceptron (MLP) with an arbitrary number of hidden layers. It is also shown that the resulting iterative training algorithm fits naturally into the backpropagation method for training multilayer perceptrons.

In this section we present some background information on feature extraction and information-theoretic approaches to signal processing. This is followed by the derivation of the feature extraction method. Experimental results are then presented which illustrate the usefulness of this approach. We conclude with the logical placement of this method within nonlinear MACE filters, as well as experimental results which can be compared directly to the results of the previous chapter.

Motivation for Feature Extraction

We have shown that, theoretically at least, the MSE criterion can be used to iteratively train a universal approximator (such as the multilayer perceptron) to classify raw input variables directly. That is, the MSE criterion coupled with a universal approximator estimates the posterior class probabilities. An issue for any estimator is its variance. There are two ways to reduce the variance of the estimate of the posterior probabilities in this case: we can either supply more data (which may not be possible or practical), or we can somehow impose constraints on the system. Feature extraction is a means by which constraints can be imposed on the system [Fukunaga; Bishop]. This does not contradict the results of the previous chapter. The previous chapter illustrated that moving


to a probabilistic framework coupled with a nonlinear topology led to improved classification performance. Furthermore, the previous chapter addressed the lack of training data through efficient descriptions of the rejection class by its second-order statistics and of the recognition class by a subspace. The discussion in this chapter seeks to extend this approach beyond second-order descriptions by exploiting the underlying structure of both classes.

It is imperative in any feature extraction algorithm that the criterion by which the features are selected is somehow related to the overall system objective. Suitable criteria are not always easily employed as a means for classification (e.g., likelihood ratios, which require prior knowledge of the underlying probability density function). Consequently, suboptimal feature sets are used or, even more commonly, user-defined ad hoc features based on intuitive assumptions but without any rigorous relationship to classification.

It is often the case that projecting high-dimensional data onto a smaller subspace results in improved performance of a nonparametric classifier. This statement is counterintuitive, as we cannot in general project onto a subspace without the loss of some information. The results of the previous chapter, however, confirm this assertion. In the final experiments, in our construction of rejection class exemplars, we implicitly restricted the search space to two distinct subspaces. In the first case we confined the rejection class to the PCA subspace of the recognition class, while in the second case we restricted some of the rejection class exemplars to the convex hull of the recognition class. In doing so, no information concerning the recognition class was lost, but the intrinsic dimensionality of the data was reduced to the number of recognition class exemplars, Nt.


We note that while these constraints led to improved classifier performance, the subspaces used were more closely related to signal representation than to classification. It should be noted, however, that for an N-class classification problem the optimal Bayes classifier can be derived from an (N-1)-dimensional feature space, where the features are the posterior probabilities of each class given the observation of the data, i.e. P(Ci | X = x). This point underlies a key difference between feature extraction for classification and feature extraction for signal representation [Fukunaga]. In classification it is the number of classes, N, which determines the minimum feature-space dimension.

The feature extraction approach to image classification is often decomposed, as in the figure, into two stages: feature extraction followed by discrimination. In some cases the decomposition is explicit, while in others it is a matter of interpretation. Often the features are determined in an ad hoc fashion based on an intuitive understanding of the data, but not explicitly with respect to classification. As an example, we can interpret the linear distortion-invariant filtering methods as a decomposition of a prewhitening filter (feature extraction) followed by an SDF (synthetic discriminant function). Similarly, the NLMACE architecture that we are working with can be decomposed in this way, as shown in the figure. In fact, as the results have been reported, this is exactly the decomposition that has been used for feature space analysis to this point. However, such a decomposition is arbitrary, as the features extracted were driven by a single optimization criterion derived from the output space and coupled to the training of the discriminator.
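The claim above, that the optimal Bayes classifier can be derived from an (N-1)-dimensional feature space of posterior probabilities, can be illustrated with a two-class toy problem. The Gaussian class-conditional densities below are assumptions made for the sketch, not data from this work.

```python
import numpy as np

# Two-class toy problem: equal priors, unit-variance Gaussian
# class-conditional densities centered at +1 and -1 (illustrative only).
def likelihood(x, mu):
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2 * np.pi)

def posterior_c1(x):
    """The single scalar feature P(C1 | X = x): for a 2-class problem the
    Bayes rule needs only this (N-1 = 1)-dimensional feature."""
    l1, l2 = likelihood(x, +1.0), likelihood(x, -1.0)
    return l1 / (l1 + l2)

xs = np.linspace(-3, 3, 601)
bayes_from_feature = posterior_c1(xs) > 0.5          # threshold the posterior feature
bayes_from_likelihoods = likelihood(xs, 1.0) > likelihood(xs, -1.0)

# Thresholding the posterior feature reproduces the likelihood-ratio rule.
print(np.array_equal(bayes_from_feature, bayes_from_likelihoods))
```

No information relevant to the class decision is lost in this one-dimensional feature, even though the mapping from x to the posterior is not invertible.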


The goal of feature extraction is always to improve the overall system classification performance. In the technique we are presenting now, the decomposition is explicit. The feature extraction is decoupled from the determination of the discriminant function.

Figure: Classical pattern classification decomposition into a feature extraction stage followed by a discriminator.

Figure: Decomposition of NLMACE as a cascade of feature extraction followed by discrimination.


Information Theoretic Background

At this point we provide some background for the technique we are using. As this material is more specific to information-theoretic processing, we feel that it is more appropriately presented at this time. The method we describe here combines mutual information maximization with Parzen window probability density function estimation. These concepts are reviewed below.

Mutual Information as a Self-Organizing Principle

Entropy-based information-theoretic methods have been applied to a host of problems, e.g., blind separation [Bell and Sejnowski], parameter estimation [Kapur and Kesavan], and of course coding theory [Shannon]. Linsker has proposed mutual information (derived from entropy) as a self-organizing principle for neural systems. The premise is that the mapping of a signal through a neural network should be accomplished so as to preserve the maximum amount of mutual information. Linsker demonstrates this principle of maximum information preservation for several problems, including a deterministic signal corrupted by Gaussian noise.

The appeal of mutual information as a criterion for feature extraction is threefold. First, mutual information exploits the structure of the underlying probability density function. Second, adaptation, as we will show, can be used to remove as much uncertainty as possible about the input class using observations of the output y = g(x, α). Third, this is accomplished within the constraints of the mapping topology g[·, α].
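The equivalent entropy-based formulations of mutual information used in what follows can be checked on a small discrete example. The joint pmf below is an arbitrary illustrative choice; the nontrivial part of the check is computing the conditional entropy h(y|x) directly from the conditionals.

```python
import numpy as np

# A small joint pmf for discrete RVs X (3 values, rows) and Y (2 values, columns).
p_xy = np.array([[0.30, 0.10],
                 [0.05, 0.25],
                 [0.10, 0.20]])

def H(p):
    """Shannon entropy (nats) of a pmf, ignoring zero cells."""
    p = np.asarray(p).ravel()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)   # marginals

# Formulation 1: I(x, y) = h(x) + h(y) - h(x, y).
I1 = H(p_x) + H(p_y) - H(p_xy)

# Formulation 2: I(x, y) = h(y) - h(y|x), with the conditional entropy
# computed directly as an average over the conditionals p(y | x = i).
h_y_given_x = sum(p_x[i] * H(p_xy[i] / p_x[i]) for i in range(len(p_x)))
I2 = H(p_y) - h_y_given_x

print(np.isclose(I1, I2), I1 > 0)
```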


Three equivalent formulations of mutual information are

    I(x, y) = h(x) + h(y) - h(x, y),
    I(x, y) = h(y) - h(y|x),  and
    I(x, y) = h(x) - h(x|y),

where I(x, y) is the mutual information of the RVs X and Y. In these equations h(x) is the differential entropy measure (which we will refer to as simply entropy) [Papoulis] of the RV X, h(x|y) is the entropy of the RV X conditioned on the RV Y, and h(x, y) is the joint entropy of the RVs X and Y. Entropy is used to quantify our uncertainty about a given random variable or vector. Mutual information quantifies the relative uncertainty of one random variable/vector with respect to another; it measures the information that one random variable/vector conveys about another. We note that manipulation of mutual information depends upon the ability to manipulate entropy. In fact, we can manipulate the entropy-related terms of mutual information independently.

Following the notation of Papoulis, the entropy of a continuous random variable or vector (RV) X ∈ R^N is defined as

    h(x) = -∫ log(f_X(x)) f_X(x) dx,

where f_X(x) is the probability density function of the RV, the base of the logarithm is arbitrary, and the integral is N-fold. The conditional and joint forms of entropy substitute the conditional and joint probability density functions, respectively, into this equa-


tion. Inspection of the equation shows that entropy can be seen as the negative of the expected value of the log of the probability density function, or

    h(x) = -E{log f_X(x)}.

Several properties of the entropy measure are of interest:

1. If the RV is restricted to a finite range in R^N, entropy is maximized for the uniform distribution.

2. If the diagonal elements of the covariance matrix are held constant, entropy is maximized for the normal distribution with diagonal covariance matrix.

3. If the RV is transformed by a mapping g: R^N -> R^N, then the entropy of the new RV y = g(x) satisfies the inequality

    h(y) <= h(x) + E{log |J_XY|},

with equality if and only if the mapping has a unique inverse, where J_XY is the Jacobian of the mapping from X to Y.

Regarding the first two properties, we note that in both instances each element of the RV is statistically independent of the other elements. We will make use of the first property in the method presented here.

The third property implies that by transforming an RV we can increase the amount of information that it conveys; that is, the RV Y derived from the RV X can have more information than X. This is a consequence of working with continuous RVs. In general, the continuous entropy measure is used to compare the relative entropies of several RVs. We can see from the inequality that if two RVs are mapped by the same invertible linear transformation, their relative entropy (as measured by the difference) remains unchanged. However, if the


mapping is nonlinear, in which case the second term of the inequality is a function of the random variable, it is possible to change the relative information of two random variables.

From the perspective of classification this is an important point. If the mapping is topological (in which case it has a unique inverse), there is no theoretical increase in the ability to separate classes. That is, we can always reflect a discriminant function in the transformed space as a warping of another discriminant function in the original space. This is not true, however, for a mapping onto a subspace. Our implicit assumption here is that we are unable to reliably determine a discriminant function in the full input space. As a consequence, we seek a subspace mapping that is by some measure optimal for classification. We cannot avoid the loss of information (and hence some ability to discriminate classes) when using a subspace mapping. However, if the criterion used for adapting the mapping is information (entropy) based, we can perhaps minimize this loss. It should be mentioned that in all classification problems there is an implicit assumption that the classes to be discriminated do indeed lie in a subspace.

Mutual Information as a Criterion for Feature Extraction

It is our intent to use mutual information as a criterion for feature extraction (prior to classification). The use of mutual information in this way can be motivated simply by Fano's inequality, which gives a lower bound on the probability of error (or, conversely, an upper bound on the probability of correct classification) when estimating a discrete RV from another RV, as a function of the conditional entropy (and ultimately the mutual information). Fano's inequality is stated as follows: given the discretely distributed


RV X and a related RV Y, the probability of incorrectly estimating X based on an estimate derived from observations of Y is lower bounded by

    P(X̂ ≠ X) ≥ (h(x|y) - 1) / log(N),

where N is the number of discrete events, or classes, represented by the RV X, and X̂ is the estimate of X. Using the relation I(x, y) = h(x) - h(x|y), we can rewrite Fano's inequality as a function of the mutual information of X and Y as follows:

    P(X̂ ≠ X) ≥ (h(x) - I(x, y) - 1) / log(N).

In this form of Fano's inequality we see that the lower bound on the error probability is minimized when the mutual information between X and Y is maximized. It can be shown that the upper bound on the probability of error is

    P(X̂ ≠ X) ≤ (1 - max{P_i}) (1 - I(x, y)/h(x)),

where P_i is the prior probability of the ith class of X and max{P_i} is the maximum over the set of P_i's. This bound is itself upper bounded by (1 - 1/N), the case in which all classes are equally likely. The upper bound is met with equality when the mutual information between X and Y is zero, in which case the optimal class estimator reverts to choosing the class with the greatest prior probability.

This approach is depicted in the figure with regard to a Bayesian framework. In the figure, C is a discrete RV which represents the class. The function F(C) = P_i, i = 1, …, N,

PAGE 116

is the probability density function of the class, where P_i is the prior probability of the ith class and N is the number of classes. The probability density function of X is conditioned on the class. The feature vector y = g(x, α) is derived from the observation of X and is itself a random vector prior to observation. It is from the feature vector y that we wish to estimate the class. Our goal is to choose the parameters α of the mapping g(·, α) such that the mutual information between Y and C is maximized. We are still left with the task of determining the estimator Ĉ; however, from Fano's inequality we know that if I(C;Y) is maximized, the lower bound on the classification error will be minimized.

Figure: Mutual information approach to feature extraction. An observation of the random variable X is generated by the probability density function f(x|C), which is conditioned on the discrete random variable C; C is characterized by the discrete probability density function P(C). The features y derived from the observation of X are used to estimate C.

Prior Work in Information Theoretic Neural Processing

The concept of using information theoretic measures in neural processing is not new. One application related to feature extraction was for the purpose of generating ordered maps [Linsker]. In this work, which modifies Kohonen's self-organizing feature map (SOFM) [Kohonen], entropy is used as a competitive measure for adaptation. Specifically, input exemplars are mapped onto a discrete lattice and entropy is used as a measure for determining which lattice point to adapt. The method differs from the presentation here in two ways:
(1) the form of entropy used is discrete, whereas we are working with continuous entropy; and (2) the mapping from the input to the output is constrained to be linear, whereas the method presented here may be used with arbitrary nonlinear maps (so long as they are differentiable).

Deco and Obradovic have also presented extensive results on information theoretic approaches to neural processing. The techniques described differ from Linsker's method in that they work with the continuous form of entropy and use nonlinear mappings. The constraint, however, is that the mapping be symplectic (volume preserving) and bijective. These constraints restrict the method to a subset of the mappings from R^N to R^N. As we have stated, from a theoretical point of view such mappings in no way increase our ability to discriminate classes. Furthermore, it is our implicit assumption that dimensionality reduction is one of the motivating factors for feature extraction prior to classification. Deco and Obradovic also show that if the mapping function is chosen to be linear in its parameters, very little can be done to manipulate the information at the output of the mapping without prior knowledge of the input PDF.

Bell and Sejnowski present yet another approach to information theoretic mappings. Their technique is applicable to subspace projections. It is limited in that it manipulates entropy only if the underlying distribution in the input space is unimodal. Furthermore, it is restricted to nonlinear MLP architectures of a single layer. The method we present here has neither of these restrictions.

Viola et al. have taken a similar approach to the method presented here for entropy manipulation. The work of Viola et al. differs in that it does not address arbitrary nonlinear mappings directly, the gradient is estimated stochastically, and entropy is manipulated
explicitly. Their approach is similar to the approach in communications theory, wherein the communications channel (or mapping) is assumed to be fixed. Mutual information is then used to estimate the source of the observations. A significant difference in the method presented here is that the mapping (or communications channel) is not assumed to be fixed; rather, it is parameterized, and we are free to choose the parameters in order to manipulate entropy.

Nonparametric PDF Estimation

One obstacle to using mutual information as the figure of merit is that it is an integral function of the PDF of a continuous random variable. Since we cannot work with the PDF directly (unless assumptions are made about its form), we rely on nonparametric estimates. Nonparametric density estimation in a high-dimensional space is an ill-posed problem. The approach described here, however, relies on such estimates in the output space, as depicted in the figure below, where the dimensionality is under the control of the designer.

Figure: Mapping as feature extraction. Information content is measured in the low dimensional space of the observed output and used to adapt the parameters of the mapping.

The Parzen window method [Parzen], which we will use, is a nonparametric kernel-based method for estimating probability density functions. The Parzen window
estimate of the probability distribution f_Y(u) of a random vector Y ∈ R^M at a point u is defined as

f̂_Y(u) = (1/N_y) Σ_{i=1..N_y} κ(u − y_i)

The vectors y_i ∈ R^M are observations of the random vector, and κ(·) is a kernel function which itself satisfies the properties of PDFs, i.e.,

κ(u) ≥ 0 and ∫ κ(u) du = 1

The Parzen window estimate can be viewed as a convolution of the estimator kernel with the observations. Since we wish to make a local estimate of the PDF, the kernel function should also be localized (i.e., unimodal, decaying to zero). In the method we describe, we will also require that κ(·) be differentiable everywhere.

There are several properties of the Parzen density estimate of note. If the estimator kernel function satisfies the properties above, the estimate will satisfy the properties of a PDF. In the limit of many observations, the estimator approaches the true underlying distribution convolved with the kernel function; that is,

lim_{N_y→∞} f̂_Y(u) = f_Y(u) * κ(u)

Consequently the Parzen window estimator is a biased estimator. The bias can be made arbitrarily small by reducing the extent of the kernel, at the cost of raising the variance [Hardle]. In the multidimensional case the form of the kernel is typically gaussian or hypercube. As a result of the differentiability requirement of our method, the gaussian kernel is most
suitable here. The computational complexity of the estimator increases with dimension; however, as we will be estimating the PDF in the output space of our mapping, the dimensionality can be controlled.

Derivation of the Learning Algorithm

Our goal is to find features that convey maximum information about the input class. How do we adapt the parameters α of a mapping such that this is the case? We now show how the Parzen window density estimator, coupled with a property of entropy, can be used to accomplish this goal. Consider the mapping g: R^N → R^M (M < N) of a random vector X ∈ R^N, described by the following equation:

Y = g(X, α)

If the mapping is nonlinear, we can exploit the following property of entropy: if a random variable has a finite region of support, entropy is maximized for the uniform distribution. The Parzen window estimator, coupled with a mapping with a finite region of support at the output (e.g., an MLP with sigmoidal nonlinearities), can be used to minimize or maximize the "distance" between the observed distribution and the desired distribution. Furthermore, if the region of support is a hypercube, as is the case for the MLP using sigmoidal nonlinearities, the features are statistically independent when entropy is maximized.

Considering the equation above, the method of Viola et al. estimates the value of the input parameters X rather than the parameters of the mapping α. The goals are very different.

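The entropy property invoked above (over a finite region of support, the uniform distribution maximizes entropy) is easy to check numerically with the discrete analogue of entropy. The helper name below is illustrative:

```python
import numpy as np

def discrete_entropy(p):
    """Shannon entropy (in nats) of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # 0 * log(0) is taken as 0
    return float(-(p * np.log(p)).sum())

# Over a fixed support of 8 outcomes, the uniform distribution maximizes entropy:
uniform = np.full(8, 1 / 8)
peaked = np.array([0.65, 0.05, 0.05, 0.05, 0.05, 0.05, 0.05, 0.05])
print(discrete_entropy(uniform))      # log(8), the maximum for 8 outcomes
print(discrete_entropy(peaked))       # strictly smaller
```

Any redistribution of mass away from uniform lowers the value, which is why driving the output distribution toward uniform over the hypercube maximizes the output entropy.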
By our choice of topology (MLP) and distance metric we are able to work with entropy indirectly and fit the approach naturally into a backpropagation learning paradigm.

As our criterion we use the integrated squared error between our estimate of the output distribution, f̂_Y(u, y), at a point u over a set of observations y, and the desired output distribution f_Y(u), which we approximate with a summation:

J = ∫_Ω (f_Y(u) − f̂_Y(u, y))² du ≈ Σ_j (f_Y(u_j) − f̂_Y(u_j, y))² Δu

Here Ω indicates the nonzero region (a hypercube for the uniform distribution) over which the M-fold integration is evaluated. Assuming the output space is sampled adequately, we can approximate this integral with a summation in which the u_j ∈ R^M are samples in the M-dimensional space and Δu represents a volume.

We use the Parzen window method [Parzen] as our estimator of the output distribution. The Parzen window estimate of a PDF is written

f̂_Y(u, y) = (1/N_y) Σ_{i=1..N_y} κ(u − y_i)

where κ(·) is the kernel function, y = {y_1, ..., y_{N_y}} is the set of observations at the output of the mapping, and u is the location at which the output estimate is being computed.
Since the output observations are functional mappings of the input data, we can rewrite the estimate as

f̂_Y(u, y) = (1/N_y) Σ_i κ(u − g(x_i, α))

The gradient of the criterion function with respect to the mapping parameters is determined via the chain rule, in which the sensitivity ∂g/∂α of the mapping to its parameters appears as a factor; for an adaptive mapping (e.g., a
neural network) this term can be computed efficiently using standard backpropagation. The remaining partial derivative, ∂f̂/∂g, is

∂f̂_Y(u, y)/∂y_i = −(1/N_y) κ'(u − y_i)

where κ'(·) is the derivative of the kernel function with respect to its argument. Substituting into the criterion gradient yields

∂J/∂α = (2Δu/N_y) Σ_i [ Σ_j (f_Y(u_j) − f̂_Y(u_j, y)) κ'(y_i − u_j) ] ∂g(x_i, α)/∂α

The terms in this expression, excluding the mapping sensitivities, become the new error direction term in the backpropagation algorithm. It is important to distinguish error direction from error. If the term were an error, this would imply a desired output (d = y + e), which is the case for a supervised training algorithm using the mean square error criterion. However, in general, the partial derivative only implies the direction in which we would like to perturb the observation. (Later we will show how to interpret the error direction as an actual error, resulting in a much simplified algorithm.)

By reversing the order of summations above, we see that the error direction term associated with each observation is a convolution of the estimated error in
the observed output distribution, ε(u, y) = f_Y(u) − f̂_Y(u, y), with the gradient of the kernel function.

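The convolution form of the error direction can be sketched in one dimension. The function names, grid, and the uniform-on-[−1, 1] desired density are illustrative assumptions; the sign convention shown (ε = desired − observed) corresponds to entropy maximization:

```python
import numpy as np

def kernel_grad_1d(u, sigma):
    """Derivative of the 1-D gaussian kernel with respect to its argument."""
    g = np.exp(-u**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)
    return -(u / sigma**2) * g

def error_direction(y, u, f_desired, sigma=0.2):
    """e_i = sum_j eps(u_j) * kappa'(y_i - u_j) * du: the observed
    distribution error convolved with the kernel gradient at each y_i."""
    du = u[1] - u[0]
    # Parzen estimate of the output density on the grid u
    f_hat = np.exp(-(u[:, None] - y[None, :])**2 / (2 * sigma**2)).mean(axis=1)
    f_hat /= np.sqrt(2 * np.pi) * sigma
    eps = f_desired - f_hat                    # observed distribution error
    return np.array([(eps * kernel_grad_1d(yi - u, sigma)).sum() * du
                     for yi in y])

# Observations clustered at the center of a desired uniform [-1, 1] density:
y = np.array([-0.1, 0.0, 0.1])
u = np.linspace(-1.0, 1.0, 201)
e = error_direction(y, u, f_desired=0.5)
print(e[0] < 0 < e[2])   # True: the outer points are pushed apart (repulsion)
```

Flipping the sign of ε gives the entropy-minimization case, in which the same cluster is pulled together instead.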
It is interesting to compare this result to supervised training using error backpropagation. When training in a supervised manner, an explicit desired output d_i is assigned to each input x_i. MSE minimization results in the following adaptation of the mapping parameters:

Δα = η Σ_i (d_i − y_i) ∂g(x_i, α)/∂α

where y_i is the observed response to the input x_i and (d_i − y_i) is the observed output error. In contrast, maximizing or minimizing entropy in the manner described results in the following adaptation of the mapping parameters:

Δα = ±η Σ_i e_i ∂g(x_i, α)/∂α

which, neglecting the sign term, is the same as the supervised update, with one significant difference: the sign term depends on whether we are minimizing or maximizing entropy.

Gaussian Kernels

Examination of the gaussian kernel and its differential in two dimensions illustrates some of the practical issues of implementing this method of feature extraction, as well as providing an intuitive understanding of what is happening during the adaptation process.
The N-dimensional gaussian kernel evaluated at some u is (simplified for two dimensions, with covariance σ²I):

κ(u) = (2π)^{−N/2} |Σ|^{−1/2} exp(−u^T Σ^{−1} u / 2) = (1/(2πσ²)) exp(−u^T u / (2σ²))

The partial derivative of the kernel (also simplified for the two-dimensional case) is

κ'(u) = ∂κ/∂u = −Σ^{−1} u κ(u) = −(u/(2πσ⁴)) exp(−u^T u / (2σ²))

These functions are shown in the figure below for the two-dimensional case. Recall the error direction term Σ_j ε(u_j, y) κ'(y_i − u_j) derived above.
Figure: Gradient of the two-dimensional gaussian kernel. The kernels act as attractors toward low points in the observed PDF of the data when entropy maximization is desired.

The kernel gradient is convolved with the difference between the desired and observed distributions to determine the error direction. The resulting error direction is shown in the accompanying figure; the difference between the cases is the sign of the error direction. As we can see, and would expect, when we are maximizing entropy (top) the error direction points away from the modes of the observed distribution, while when we are minimizing entropy (bottom) the error direction points toward the center of the modes. This repulsion/attraction behavior extends to the multidimensional case as well. The bottom of the figure illustrates another point with regards to feature extraction: when we are minimizing entropy, the trend is to make the observations more compact, a property which would be useful for identifying a class.

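The kernel and its gradient discussed above can be written compactly for the general N-dimensional case with covariance σ²I; the helper names are illustrative:

```python
import numpy as np

def gaussian_kernel(u, sigma=1.0):
    """N-dimensional gaussian kernel with covariance sigma^2 * I;
    u has shape (..., N)."""
    u = np.asarray(u, dtype=float)
    n = u.shape[-1]
    norm = (2 * np.pi * sigma**2) ** (n / 2)
    return np.exp(-np.sum(u**2, axis=-1) / (2 * sigma**2)) / norm

def gaussian_kernel_grad(u, sigma=1.0):
    """kappa'(u) = -(u / sigma^2) * kappa(u): zero at the origin and pointing
    back toward it everywhere else, which is the source of the
    attraction/repulsion behavior described in the text."""
    u = np.asarray(u, dtype=float)
    return -(u / sigma**2) * gaussian_kernel(u, sigma)[..., None]

print(gaussian_kernel(np.zeros((1, 2)))[0])        # 1/(2*pi) at the origin (2-D)
print(gaussian_kernel_grad(np.zeros((1, 2)))[0])   # zero vector at the origin
```

For any nonzero u the gradient has a negative projection on u, so a point sitting on the slope of the kernel is pushed toward (or, with the opposite sign, away from) the kernel center.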
Figure: Mixture of gaussians example. The estimated distribution is a mixture of gaussians while the desired distribution is uniform. The kernel gradient, which will be convolved with the difference between the two distributions, is shown as a dotted line.

Maximum Entropy/PCA: An Empirical Comparison

We present some experimental results designed to illustrate the properties of information theoretic feature extraction as compared to a signal representation approach. In these experiments we compare a simple entropy-maximizing feature extractor to the well known principal components analysis (PCA) approach to feature extraction. The source distributions are simple by design but, as we shall see, they are sufficient to show the differences between the two methods.

We will begin with the simple case of a two dimensional gaussian distribution. The distribution we will use is zero mean with a fixed covariance matrix.
Figure: Mixture of gaussians example, entropy minimization and maximization. The plots show the resulting influence function when the kernel gradient is convolved with the observed distribution error. The sign depends on whether we are minimizing (bottom) or maximizing (top) entropy.

The contours of this distribution are shown in the figure below along with the image of the first principal component features. We see from the figure that the first principal component lies along the x₁ axis. We draw a set of observations from this distribution
and compute a mapping using an MLP and the entropy maximizing criterion described in previous sections. The MLP has two input nodes, a single layer of hidden nodes, and one output node. The nonlinearity used is the hyperbolic tangent function. We are therefore nonlinearly mapping the two-dimensional input space onto a one-dimensional output space. The right-hand plot of the figure shows the image of the maximum entropy mapping on the input space. From the contours of this mapping we see that the maximum entropy mapping lies essentially in the same direction as the first principal component.

Figure: PCA vs. entropy, gaussian case. Left: image of PCA features shown as contours. Right: entropy mapping shown as contours.

This result is expected. It illustrates that when the gaussian assumption is supported by the data, maximum entropy and PCA are equivalent from the standpoint of direction. This result has been recognized by many researchers; in fact, the gaussian assumption is often used as a limiting case for maximum entropy approaches [Plumbley and Fallside]. These techniques, however, only examine the covariance of the data in the output space.

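Since PCA considers only second-order statistics, its directions for a known density follow directly from the correlation matrix; for a zero-mean mixture this is R = Σ_i P_i (Σ_i + m_i m_i^T). The sketch below uses hypothetical mode parameters, not the dissertation's exact values:

```python
import numpy as np

# Hypothetical two-mode mixture: equiprobable modes, symmetric about the origin.
m1 = np.array([2.0, 0.0]); m2 = -m1
s1 = np.diag([0.5, 0.1]);  s2 = np.diag([0.5, 0.1])

# Correlation matrix of the (zero-mean) mixture; its eigenvectors are the
# principal component directions.
R = 0.5 * (s1 + np.outer(m1, m1)) + 0.5 * (s2 + np.outer(m2, m2))
vals, vecs = np.linalg.eigh(R)
principal = vecs[:, np.argmax(vals)]
print(principal)   # parallel to the x1 axis for these parameters
```

The mode separation enters only through the rank-one terms m_i m_i^T, so PCA blends the mode directions into a single linear projection rather than following each mode, which is the behavior seen in the experiments that follow.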
We are more interested in the result when the gaussian assumption is not correct. In this case we would not expect the PCA and entropy mappings to be equivalent. We conduct a second experiment to illustrate this point, in which we draw observations from a random source whose underlying distribution is not gaussian. Specifically, the PDF is a mixture of gaussian modes of the form

p(x) = (1/2) N(x; m₁, Σ₁) + (1/2) N(x; m₂, Σ₂)

where N(x; m, Σ) is a gaussian distribution with mean m and covariance Σ. In this case the two mode means are symmetric about the origin and the covariances are equal. It can be shown that the principal components of this distribution are the eigenvectors of the matrix

R = (1/2) Σ_{i=1,2} (Σ_i + m_i m_i^T)

with the principal component vector parallel to the x₁ axis. This distribution is shown in the figure below along with its first principal component feature mapping. The right side of the figure shows the image of the maximum entropy mapping. As we can see, there are two distinct differences between this mapping and the PCA result. The first observation is that the mapping is nonlinear. The second observation is that the
maximum entropy mapping is more tuned to the structure of the data in the input space. It is interesting to note that the maximum entropy mapping weights the tails of the modes equally, as evidenced by the greater spreading of the contours for the mode with the larger eigenvalue, while the PCA mapping does not. We can say from observing the results that the maximum entropy mapping is superior in describing the underlying structure of the data when compared to the PCA mapping.

Figure: PCA vs. entropy, non-gaussian case. Left: image of PCA features shown as contours. Right: entropy mapping shown as contours.

We consider one more bimodal distribution. The setup is the same as the previous case (a bimodal distribution), with modified mode means and covariances.

Consequently the principal components are the eigenvectors of the corresponding correlation matrix, with the major principal component at an angle above the x₁-axis. As in the previous case, we compare the principal component mapping to the maximum entropy mapping. The results are shown in the figure below. Again, as in the previous case, it is evident that the maximum entropy mapping is better related to the underlying structure of the distribution, as it has found the separate directions of the individual modes, whereas the PCA projection has essentially averaged the directions.

Figure: PCA vs. entropy, non-gaussian case. Left: image of PCA features shown as contours. Right: entropy mapping shown as contours.

These experiments help to illustrate the differences between PCA (a signal representation method) and entropy (an information-theoretic method). PCA is primarily concerned with direction finding and only considers the second order statistics of the underlying data, while entropy explores the structure of the data class. In a few limited cases second order
statistics are sufficient (e.g., gaussian) to describe such structure, but in general they are not.

Maximum Entropy ISAR Experiment

We now present some experimental results using the maximum entropy feature extractor on ISAR data. The mapping structure we use in our experiment is a multilayer perceptron with a single hidden layer. The method is used to extract features from two vehicle types, with ISAR images spanning a range of aspect angles. Examples of the imagery are shown in the figure below. In these experiments the optimization criterion is always to maximize entropy conditioned on the input class. The input class may be represented by a single vehicle type or both vehicle types, depending on the experiment. This is not how the technique would be applied to the NL-MACE structure (recall that mutual information has both an entropy minimizing and an entropy maximizing term), but the results are interesting and further illustrate the potential of the information theoretic approach.

Figure: Example ISAR images from the two vehicles used for the experiments. The vehicles were rotated through a range of aspect angles. The top and bottom rows are from different vehicles.

Maximum Entropy: Single Vehicle Class

In the first experiment we trained the feature extractor on a single vehicle (the upper vehicle in the ISAR figure) over the full range of aspect, with a fixed aspect increment between exemplars. We show the mapping of the input images onto the two dimensional output feature space at successive stages of training in the figures below. The mappings of the images into the feature space are connected in order of increasing aspect. In the latter two plots it is clear that the extracted features have begun to fill the output feature space but have also maintained aspect dependence on the images. We believe this is evidence that while the method increases the statistical independence of the two output features, it is still tuned to the underlying distortion of the input vehicle class, as represented by rotation through aspect.

Figure: Single vehicle experiment, early in training. Projection of training (top left) and testing (top right) images onto the feature space.

We believe this is also evidence that the mapping has maintained topological neighborhoods in a similar fashion to the Kohonen self-organizing feature map (SOFM) [Kohonen].

Figure: Single vehicle experiment, after further training. Projection of training (top left) and testing (top right) images onto the feature space. Adjacent aspect angles are connected by a line.

Figure: Single vehicle experiment, late in training. Projection of training (top left) and testing (top right) images onto the feature space.

The difference between this approach and the SOFM approach is that in this case the mapping is continuous, whereas in the SOFM the samples are mapped onto a discrete
lattice. The relationship of this maximum entropy mapping approach to the SOFM of Kohonen is a topic that will be left for later research.

Maximum Entropy: Two Vehicle Classes

It is commonly assumed in the blind source separation problem that the sources are statistically independent [Bell and Sejnowski]. Maximum entropy has been used in approaches to this problem. As an example of blind source separation, we repeat the previous experiment on both vehicles (which can be modeled as statistically independent sources). The projection of the training images (and of the between-aspect testing images) is shown in the figure below, where adjacent-aspect training images are connected. As can be seen, some significant class separation is exhibited (without a priori labeling of the classes). In the early stages of learning, the method appears to maximize information with regards to the underlying distortion common to both classes (rotation through aspect). As the mapping is refined, the information begins to focus on the differences between the classes.

Computational Simplification of the Algorithm

So far we have only presented results using the method to maximize entropy. Our interest with regards to classification is mutual information, specifically as described earlier, where the mutual information is a function of the observed output entropies. However, before we discuss extensions to mutual information, we present some significant computational aspects of our method. Other techniques for entropy manipulation of continuous random variables have been proposed. However, those methods either oversimplify (assuming gaussianity or unimodal PDFs [Bell and Sejnowski; Plumbley and Fallside]) or are overly complex (Edgeworth expansions [Wong and Blake]).
Figure: Two vehicle experiment. Projection of training (left) and testing (right) images onto the feature space at two stages of training (top and bottom) for two-vehicle-class training. Vehicle 1 is indicated by diamond symbols, vehicle 2 by triangles. Each class is connected in order of aspect angle. It appears in these figures that the mapping has maintained aspect dependence for each vehicle. At the later iteration point, some separation of the vehicles is in evidence. In the bottom left plot the connecting lines have been removed in order to better show the class separation which has taken place.

In contrast, the method here is fairly straightforward and, as we will show, computationally simple. The results of this section greatly reduce the computational complexity of our approach and yield a surprisingly simple and intuitive perspective on mutual information.

Again we consider the criterion gradient, where we have already observed that the implicit error direction is the convolution of the observed distribution error with the kernel gradient. We illustrate this by rewriting the implicit error direction term associated with each observation y_i (excluding the term related to the mapping sensitivities and neglecting the sign for the moment) as

e_i = Σ_j ε(u_j, {y}) κ'(y_i − u_j)

where ε(u_j, {y}) indicates the observed distribution error at the point u_j, estimated over the set of observations {y}.

At first glance it would seem that the method is computationally expensive. Computation of the Parzen window estimate is itself of order N_y N_u, the number of observations multiplied by the number of locations in the output space at which the estimator is computed. Reasonable estimates of the density using a discrete approximation require the number of grid locations to increase exponentially with the dimension of the output space. Furthermore, in order to compute the implicit error term we multiply the complexity of the computation by the grid size yet again, yielding an overall complexity on the order of

N_y N_u^{N_d}
where N_d is the dimension of the output space and N_u is with respect to a one-dimensional output space. Furthermore, we set N_u ≈ 3N_y in order to get an accurate estimate of the implicit error term; that is, the sampling grid is on the order of three times as dense as the data observations (assuming gaussian kernels [Hardle]). Using this rule of thumb, the order of the computational complexity as a function of the output dimension and the number of observations becomes

(3 N_y)^{N_d} N_y

Fortunately, the dimensionality of the output space is controlled by the designer; however, this poses a fundamental computational limitation on the dimensionality of the subspace mapping. This limitation, however, is only valid if the implicit error term is computed in the straightforward manner that the equations imply. Further examination of the Parzen window estimator shows how this complexity can be greatly simplified. The final result reveals that the implicit error term can be computed purely as a function of the local interaction between the observations in the output space.

The Parzen window estimator is the convolution of the kernel with the data; therefore we can rewrite the error direction as

e_i = (f_Y(u) − f̂_Y(u, {y})) * κ'(u) |_{u = y_i}
    = f_Y(u) * κ'(u) |_{u = y_i} − (δ_Y(u) * κ(u)) * κ'(u) |_{u = y_i}
where the term

δ_Y(u) = (1/N_y) Σ_i δ(u − y_i)

represents the data set as observed in the output space.
In the appendix the analytic form of the attractor kernel κ_a is derived for the N-dimensional gaussian kernel with covariance matrix of the form σ²I. The result is

κ_a(u) = κ(u) * κ'(u) = −(u/(2σ²)) (4πσ²)^{−N/2} exp(−u^T u / (4σ²))

that is, the gradient of a gaussian with covariance 2σ²I, where N is the dimension of the kernel. In the appendix the analytic form of the topology regulating term f_r is also derived; for a uniform desired distribution over a hypercube it consists of gaussian terms at the boundaries together with erf(·) factors.
Figure: Two dimensional attractor functions. The x₁ component is shown at top, the x₂ component at bottom. The function represents the influence of each data point on its locale in the output space.

As in the analysis of the kernel gradients, we see that this function is localized about each data point. The magnitude of the regulating function is shown in the accompanying figure. It is evident from the figure that the regulating function only has influence at the boundaries of the desired output distribution. Furthermore, examination of the equation shows that the topology regulating function contains an erf(·) function evaluation when the output space is greater than one dimension. From a computational standpoint this function evaluation can be costly. The term is essentially unity except at the vertices of the hypercube. A figure below shows an approximation of the regulating function with the erf(·) evaluation omitted. As can be seen there, within the region of support of the desired distribution the function is essentially unchanged. If we match the region of support of the desired output distribution to the
Figure: Two dimensional regulating function. The x₁ component is shown at top, the x₂ component at bottom.

mapping topology, the approximation can be used in order to save significant computation.

Figure: Magnitude of the regulating function. The magnitude of this function is zero except near the boundary of the desired output distribution.

Figure: Approximation of the regulating function. The figure shows the regulating function when the erf(·) term is ignored. The change is not significant within the region of support of the desired distribution.

The result of this analysis is that the manipulation of entropy can be modeled as local interactions of the observed data in the output space. The function of the attractor kernel κ_a(·) is to model the interaction of the data points with each other, while the function of the topology regulating term f_r(·) is to model the interaction between the data
points and the constraints of the desired output distribution. Furthermore, if the mapping topology is matched to the desired output distribution, the evaluation of the topology regulating term can be further simplified. The final algorithm complexity has been reduced substantially, to the order of N_y² interactions.

Conversion of Implicit Error Direction to an Explicit Error

In the previous section we derived a method which greatly simplified the complexity of the error direction computation. In the process, manipulation of a global property, entropy, was seen to be a process of local attraction/repulsion of the individual observations in the output space. This idea of maximizing and minimizing entropy (and ultimately mutual information) through local interactions can be further extended such that the computed error direction can be converted into an implicit desired signal. That is, we can go from an unsupervised learning algorithm to one which is supervised in a stepwise fashion. The resulting simplification to the algorithm is that we no longer need to estimate the error direction for every gradient step.

Entropy Minimization as Attraction to a Point

We begin with entropy minimization, which is modeled as local attraction between the data points. In the mixture-of-gaussians figure, the bottom plot indicates that the points are attracted to the centers of the observed distribution modes, with the degree of attraction being stronger for the larger mode. As we have stated, however, the influence function is in reality a direction. If a proper scale factor can be found, then the error direction can be equated to an actual error and a desired signal.

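A minimal sketch of the stepwise-supervised idea: each observation's desired output is formed from a local, kernel-weighted pull toward its neighbors, and the mapping is then trained toward those targets with ordinary MSE backpropagation. The gaussian-weighted pull below stands in for the exact attractor kernel derived in the appendix, and all names are illustrative:

```python
import numpy as np

def desired_signal(y, sigma):
    """Explicit targets for entropy minimization: each point is pulled toward
    neighbors within roughly 2*sigma (attraction to a point). The pull is
    averaged over the points so its magnitude stays bounded."""
    d = y[:, None, :] - y[None, :, :]                     # pairwise differences
    w = np.exp(-np.sum(d**2, axis=-1) / (2 * sigma**2))   # local weights
    pull = -(w[..., None] * d).sum(axis=1) / y.shape[0]   # net pull per point
    return y + pull                                        # d_i = y_i + e_i

pts = np.array([[0.0, 0.0], [0.2, 0.0], [1.0, 1.0]])
targets = desired_signal(pts, sigma=0.15)
# The two nearby points are pulled toward each other; the distant point,
# outside the attraction field, is left essentially unchanged.
```

Once the MSE toward `targets` is reduced satisfactorily, a new set of targets is computed from the new output configuration, giving the stepwise supervision described in the text.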
The extent of the attraction field between points is directly proportional to the kernel size, as represented by σ in the attractor kernel. In that expression we also see that the degree of attraction is inversely proportional to a power of σ that grows with the kernel dimension N; so as the kernel size decreases, the degree of attraction increases dramatically.

Again referring to the figure, attraction to a point makes sense from an intuitive standpoint with regards to minimizing entropy. We also recognize that the influence of all of the points is additive. So, in order to ensure that the net attraction is to a point, we simply set the gradient at the center of the attractor kernel to unity. The scale factor, as a function of the kernel size and dimension, is solved for in the appendix.

The figure below illustrates three cases of scaling the attractor kernel in one dimension. We can see in the figure that when the attractor is scaled such that the slope is less than or equal to unity, we get stable attraction to a point. As a result, when minimizing entropy we are able to compute an explicit desired output as a function of the current configuration of the observations in the output space. This allows us to train a multilayer perceptron in a supervised fashion. When the MSE of the error is reduced satisfactorily, we can compute a new desired signal based on the new configuration of the observations in the output space.

One question which remains is how to set the size of the kernel. Towards that goal we note that the feedback function has been normalized by the kernel size σ and, by virtue of our scale factor, this plot can be extended to the multidimensional case as well. The field of influence
Figure: Feedback functions for the implicit error term. Undershoot condition (top), slope normalized (middle), overshoot (bottom).

is essentially zero when the distance from the center of the kernel is greater than 2σ, or

|y_i − y_j| > 2σ

The process relies on local interaction, and so from an attraction viewpoint we can use the two most distant nearest neighbors to set the kernel size (and adapt it during the learning process). Stated mathematically,

2σ = max_i min_{j≠i} |y_i − y_j|

From a practical standpoint, the mutual distances between each pair of points must be computed in the course of evaluating the kernels anyway, and so this rule does not represent a significant additional burden.

The figure below shows an example of using local attraction to minimize entropy. In the figures there are two clusters of points. By choosing the kernel according to the maximum nearest neighbor distance, the points within the local clusters are attracted to a point. We also see that the clusters are converging to a single point by the third iteration.

Entropy Maximization as Diffusion

If instead the goal is maximum entropy, then the local interaction becomes repulsion, and the feedback terms point in the opposite direction. We can use the idea of uniform diffusion in the output space in order to set the kernel size for entropy maximization. In the early stages of learning we would like the relative kernel size to be large; in this way dense groupings of points will maximally interact (and repel). However,
LWHUDWLRQ )LJXUH (QWURS\ PLQLPL]DWLRQ DV ORFDO DWWUDFWLRQ 7KH ILJXUHV DERYH VKRZ WKUHH LWHUDWLRQV RI WKH ORFDO DWWUDFWLRQ DOJRULWKP 7KH WZR JURXSV RI SRLQWV DUH VHHQ WR EH DWWUDFWHG WR WKHLU ORFDO PHDQV DV ZHOO DV WR HDFK RWKHU WRZDUGV WKH ODWHU VWDJHV RI OHDUQLQJ ZH ZRXOG OLNH WKH LQWHUDFWLRQ WR GHFUHDVH WR D QHJOLJLn EOH OHYHO DV WKH GLVWULEXWLRQ DSSURDFKHV D XQLIRUPLW\ ,I ZH PDNH WKH DVVXPSWLRQ WKDW WKH REVHUYHG GLVWULEXWLRQ ZKLFK ZH QR ORQJHU FRPn SXWH LQ WKH ORFDO LQWHUDFWLRQ IUDPHZRUNf ZLOO HYHQWXDOO\ DSSURDFK XQLIRUPLW\ ZH KDYH D EDVLV IRU VHWWLQJ DQ XSSHU ERXQG RQ WKH VL]H RI WKH NHUQHO *LYHQ 1< SRLQWV LQ DQ 1G

PAGE 151

GLPHQVLRQDO VSDFH XQLIRUPO\ VSDFHG LQ D K\SHUFXEH WKH GLVWDQFH EHWZHHQ QHDUHVW SRLQWV DSSURDFKHV O ZKHUH 9 LV WKH YROXPH RI WKH K\SHUFXEH 7KH XSSHU ERXQG RI WKH NHUQHO VL]H FDQ EH VHW SURSRUWLRQDOO\ WR WKLV YDOXH 'XULQJ WUDLQLQJ WKH NHUQHO VL]H FDQ EH VHW VR DV WR HQVXUH ORFDO LQWHUDFWLRQ VXEMHFW WR WKH XSSHU ERXQG )LJXUH VKRZV DQ H[DPSOH UHVXOW ZKHQ HQWURS\ PD[LPL]DWLRQ LV PRGHOHG DV GLIIXn VLRQ 7KH XSSHU ERXQG RQ WKH NHUQHO VL]H ZDV VHW WR RI HTXDWLRQ 6XEMHFW WR WKH XSSHU ERXQG WKH NHUQHO VL]H ZDV DGDSWLYHO\ VHW WR WKH PD[LPXP QHDUHVW QHLJKERU GLVn WDQFH 2QH LQWHUHVWLQJ REVHUYDWLRQ LV WKDW QHDU WKH FHQWHU RI WKH ILJXUH WKH GDWD GLDPRQG V\PEROVf KDYH DUUDQJHG WKHPVHOYHV LQ D KH[DJRQDO FRQILJXUDWLRQ ZKLFK LV ZHOO NQRZQ WR EH WKH PRVW HIILFLHQW VDPSOLQJ VFKHPH LQ WZR GLPHQVLRQV 6WRSSLQJ &ULWHULRQ )LJXUH EULQJV XS RQH ILQDO VXEMHFW LQ WKH ORFDO LQWHUDFWLRQ YLHZSRLQW 7KH RULJLQDO RSWLPL]DWLRQ FULWHULRQ ZDV WKH LQWHJUDWHG VTXDUHG HUURU EHWZHHQ WKH REVHUYHG GLVWULEXWLRQ DQG WKH GHVLUHG XQLIRUP GLVWULEXWLRQ 6LQFH WKH 3') HVWLPDWLRQ ZDV E\SDVVHG ZH QR ORQJHU KDYH DFFHVV WR WKH FULWHULRQ ZKLOH WUDLQLQJ &RQVHTXHQWO\ ZH QHHG D SUR[\ IRU WKH FULWHULRQ LQ RUGHU WR GHWHUPLQH ZKHQ WR VWRS WKH WUDLQLQJ :H SURSRVH WKH IROORZLQJ PHDn VXUH DV D VXEVWLWXWH PD[$11f PLQ$ZZf PD[ $f f
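The kernel-size adaptation, the diffusion upper bound, and the distance-based stopping measure described above can be sketched numerically. The following is a minimal NumPy illustration; the function names and demonstration values are ours, not the dissertation's.

```python
import numpy as np

def pairwise_distances(y):
    # y: (num_points, dim) array of output-space observations.
    diff = y[:, None, :] - y[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

def kernel_size(y):
    # Adapt the kernel width to the maximum nearest-neighbor distance,
    # so that every point interacts with at least one neighbor.
    d = pairwise_distances(y)
    np.fill_diagonal(d, np.inf)        # exclude self-distances
    return d.min(axis=1).max()

def diffusion_upper_bound(volume, n_points, dim):
    # Nearest-point spacing of n_points spread uniformly through a
    # hypercube of the given volume; bounds the kernel size from above.
    return (volume / n_points) ** (1.0 / dim)

def stopping_measure(y):
    # (max NN distance - min NN distance) / max pairwise distance.
    # The numerator approaches zero as the points become equally spaced;
    # the denominator penalizes configurations that do not fill the space.
    d = pairwise_distances(y)
    max_all = d.max()
    np.fill_diagonal(d, np.inf)
    nn = d.min(axis=1)
    return (nn.max() - nn.min()) / max_all
```

For a perfectly uniform one-dimensional grid the measure is exactly zero, while a strongly clustered configuration yields a larger value, consistent with its use as a proxy for uniformity.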


Figure: Entropy maximization as diffusion. The data points are plotted as diamonds; PDF estimation locations are shown as plus signs. Near the center of the distribution the points have arranged themselves in a hexagonal configuration, known to be the most efficient sampling scheme in two dimensions.

In the proposed measure, max Δ_NN is the maximum nearest-neighbor distance (which we are already keeping track of), min Δ_NN is the minimum nearest-neighbor distance, and max Δ is the maximum distance between any two points. The numerator measures how equally spaced the points are and is expected to approach zero as the distribution becomes more uniform. The denominator is a penalty term for not filling the entire space. This measure is shown for the previous diffusion example, along with the integrated squared error measure and the entropy, in the figure below. Both the integrated squared error and entropy measures were computed from a sampled estimate of the observed PDF; the estimation locations are represented by the cross symbols in the figure. As we can see, both the integrated squared error measure and the distance-derived measure are adequate estimates of the entropy. The distance-derived measure, however, is much less computationally expensive than the other two.

Figure: Stopping criterion. Comparison of entropy, integrated squared error, and the distance-derived stopping criterion. Integrated squared error and the distance-derived criterion are reasonable approximations to the criterion of interest, entropy.

Observations

We have described a nonparametric approach to information theoretic feature extraction. We believe that this method can be used to improve classification performance by directly choosing relevant features for classification via maximization of mutual information. A critical capability of the information theoretic approach is the ability to adapt the entropy of the output space of the nonlinear projection. We have shown, through the use of a simple differentiable estimator, namely Parzen windows, that the adaptation of entropy can fit logically into the error backpropagation model. This method differs from other entropy-based approaches, such as using the Kullback-Leibler norm for supervised learning. We have also presented experiments that illustrate the usefulness of this technique. Comparisons to the well known PCA method show that the information theoretic approach is more sensitive to the underlying data structure beyond simple second-order statistics. The data types used for the experiments were simple by design; they served to illustrate the usefulness of the method even for seemingly simple problems. We have also shown how the approach can be modeled as local interaction of the data in the output space. This viewpoint led to a significant computational savings as well as a clearer intuitive understanding of the algorithm.

Mutual Information Applied to the Nonlinear MACE Filter

At this point we present experimental results which illustrate application of this technique to the nonlinear MACE filter. This is accomplished by repeating experiments III and IV from the previous chapter.

In experiment III we trained the classifier (after preprocessing the imagery) with subspace projected noise exemplars for the rejection class and Gram-Schmidt orthogonalization on the input layer. The orthogonality constraint ensured that the features would be uncorrelated over the rejection class. In this experiment we remove the orthogonality constraint and decouple the feature extraction from the discriminant function explicitly. The images are still preprocessed and the same exemplars are used to train the system; however, the classifier architecture will be trained on the output of the feature extractor.


The goal is to maximize the mutual information conditioned on the recognition class,

    I(C, y) = h(y) − h(y|C)
    I(C, g(x, α)) = h(g(x, α)) − h(g(x, α)|C)

where x is the preprocessed training exemplar, y is the extracted feature vector, and g(·, α) is the feature extraction mapping, implemented as a multilayer perceptron. The resulting feature space mapping is shown in the figure below. In contrast to the results of the previous chapter, the feature space is the output of a nonlinear mapping, so it is difficult to make other than qualitative comments about it (we can, however, say much about the criterion from which it was derived, and we have). In this case we are left with a performance comparison to the previous experiments. A summary of the results of this section alongside those of the previous chapter is given in the table, where we can see that the performance is comparable to (slightly better than) experiment III from the previous chapter, which used the same noise class exemplars with the orthogonality constraint.

Table: Comparison of ROC classifier performance, Pfa (%) at several values of Pd (%). Results are shown for the linear filter versus experiments III and IV from the previous chapter and mutual information feature extraction. The symbols indicate the type of rejection class exemplars used: N, white noise training; GS, Gram-Schmidt orthogonalization; subN, PCA subspace noise; CH, convex hull rejection class.

The resulting ROC curve is shown in the figure below as compared to the linear MACE filter. It is not surprising that the performance did not exceed that of experiment III when we consider how the rejection class was generated: as a random projection of gaussian noise onto an orthonormal basis. As we have shown in previous experiments, under the gaussian condition (equal covariances) orthogonality and entropy are equivalent. These results then give support to this technique, since orthogonality was not enforced on the feature extractor.

Figure: Mutual information feature space. The rejection class is represented with subspace noise images. The top figure shows the training exemplars (plus signs are recognition, triangles are rejection) while the bottom shows the testing set.
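The conditional mutual information criterion above can be sketched with a Parzen-window plug-in estimator: estimate h(y) over all feature vectors and h(y|C) as the class-prior-weighted average of per-class entropies. The sketch below is our own illustration of that criterion (names, the kernel width, and the resubstitution estimator are assumptions), not the training code used in the dissertation.

```python
import numpy as np

def parzen_density(samples, points, sigma):
    # Parzen-window (isotropic Gaussian kernel) density estimate at `points`.
    diff = points[:, None, :] - samples[None, :, :]
    d2 = (diff ** 2).sum(axis=-1)
    dim = samples.shape[1]
    norm = (2.0 * np.pi * sigma ** 2) ** (dim / 2.0)
    return np.exp(-d2 / (2.0 * sigma ** 2)).mean(axis=1) / norm

def entropy_estimate(samples, sigma):
    # Resubstitution (plug-in) estimate of differential entropy:
    # h(y) ~ -mean(log p_hat(y_i)).
    p = parzen_density(samples, samples, sigma)
    return -np.log(p).mean()

def mutual_information(y, labels, sigma):
    # I(C, y) = h(y) - h(y|C), with h(y|C) the prior-weighted average of
    # per-class entropy estimates.
    h_y = entropy_estimate(y, sigma)
    h_y_given_c = 0.0
    for c in np.unique(labels):
        yc = y[labels == c]
        h_y_given_c += (len(yc) / len(y)) * entropy_estimate(yc, sigma)
    return h_y - h_y_given_c
```

With two well-separated feature clusters the estimate approaches log 2 (one bit of class information), while identically distributed classes give a value near zero; the kernel width sigma plays the same role as in the local-interaction discussion above.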


In the second experiment we repeat the conditions of experiment IV. In this case we represent the rejection class with both subspace noise and convex hull exemplars. The gaussian assumption is no longer correct due to the inclusion of the convex hull exemplars. In this case we would expect the results to improve, on the assumption that there is information to be extracted from the convex hull exemplars with regard to classification. As we have already demonstrated that the convex hull approach yielded improved classification in the previous chapter, we will hold this assumption to be true. The feature space for this case is shown in the figure below. We observe that the feature space, as in the previous chapter, is quite different. Intuitively the result makes sense in the context of mutual information. Recall that the convex hull exemplars lie in the interior of the recognition class in the input space (due to their construction), and our goal via mutual information is to make the recognition class compact and the rejection class diffuse. This goal and the property of the convex hull are seemingly at odds, and so a trade-off results: the recognition class is compact on an ellipsoid with the convex hull exemplars on the interior, but expanded.

We see from the table that the classification results are substantially better than the previous results. The ROC curve for this experiment is shown below, compared to the linear system. We would hypothesize that mutual information performed better at extracting discriminating information from the exemplars. We also note that both results did not rely on orthogonality in the input layer, for which we can only make second-order statistical justifications, and yet were able to achieve the same or better performance.

Figure: ROC curves for mutual information feature extraction (dotted line) versus the linear MACE filter (solid line).

Figure: Mutual information feature space resulting from convex hull exemplars. The training exemplars are shown in the top figure (squares are recognition, triangles rejection); the bottom figure shows the testing exemplars.

Figure: ROC curves for mutual information feature extraction (dotted line) versus the linear MACE filter (solid line).


CHAPTER
CONCLUSIONS

We have discussed a methodology by which linear distortion invariant filtering can be extended to nonlinear systems. The extension to nonlinear systems was initiated by first establishing the link between distortion invariant filters and the linear associative memory. The linear associative memory perspective is important in that it more closely aligns distortion invariant filtering with classification. Advances in distortion invariant filtering have occurred within a linear systems framework despite the primary application being classification; the result is a classification approach which considers only second order statistics. In contrast, the development of associative memories has occurred within a probabilistic framework emphasizing a classification approach which considers the underlying probability density function. This perspective led naturally to nonlinear signal processing. The consequence of using the MSE criterion was also discussed: the result, which has been shown by other researchers as well, is that the MSE criterion combined with a universal approximator and 1-of-N coding (the desired output is an N-vector with the ith element set to unity and all others set to zero for an N-class classification problem) is suitable for estimating posterior class probabilities.

Some of the major contributions of this dissertation began with an analysis of commonly used measures of generalization for distortion invariant filters. Our analysis showed that these measures were actually counter to good classification performance. It is our opinion that the generalization measures discussed are more properly suited to a signal representation framework, not classification. The analysis also revealed that emphasis on the MACE filter optimization criterion in the construction of the OTSDF led to superior classification performance. The results of the analysis of generalization measures were significant in that they highlighted the fact that commonly used measures of generalization should not be the sole basis upon which to compare nonlinear systems to their linear counterparts, since these measures are only weakly coupled to classification performance.

The probabilistic viewpoint of the MACE filter optimization criterion was presented as well. Within this framework, nonlinear mappings such as the multilayer perceptron were included, allowing for a nonlinear extension of the MACE filter. The lack of closed form analytical solutions for general nonlinear mappings necessitated an iterative approach. Consequently, the feed-forward multilayer perceptron was an obvious candidate for the nonlinear mapping due to its property as a universal function approximator coupled with computationally efficient iterative algorithms. This choice also preserved the shift invariance property of the original linear filter.

Several developments resulted from the nonlinear approach. An efficient training algorithm resulted from the recognition that the optimization criterion was equivalent to characterizing the rejection class by white-noise images in the pre-whitened image space. The experiments emphasized the need for suitable performance measures by which to compare nonlinear and linear classifiers. This motivated a feature space viewpoint of the internal mappings of the multilayer perceptron. Examination of the feature space led to several modifications and subsequent improvements to the training algorithm and classification performance. Specifically, an orthogonality constraint on the input layer of the multilayer perceptron was sufficient to guarantee uncorrelated features over the rejection class. Projection of the white noise rejection class exemplars onto the space of the recognition class data matrix effectively reduced the dimensionality of the problem from the image size to the number of recognition class exemplars; the result of this modification was a significantly faster convergence rate. The last result borrowed the idea of using the interior of the convex hull (over the recognition class exemplars) as representative of the rejection class. This is a further refinement of the concept of reducing the intrinsic dimensionality of the problem. There were two observations concerning the convex hull approach: convergence times were considerably longer, and the stability of the iterative procedure became an issue. However, when the training did converge, the classification performance was superior to the previous cases. We feel that the results of the convex hull approach merit further investigation.

We also presented a new information theoretic feature extraction method, providing a clear motivation (Fano's inequality) for using mutual information as the criterion for feature extraction in a classification framework. It is our opinion that this new method represents a significant advance to the state of the art for self-organizing systems and information theoretic signal processing in several regards. It utilizes the continuous form of entropy and mutual information, as opposed to the discrete form; consequently it can be used for continuous mappings. In contrast to previous entropy-based approaches, it poses no limitation on the number of hidden layers of the network mapping. Also, it does not require the underlying pdf to be unimodal, again in contrast to previous approaches. These qualities make it a very powerful method for information theoretic signal processing. As such, this method has wide potential application beyond nonlinear extensions to the MACE filter.

A significant result was the demonstration that a global property of a mapping, namely information, could be modeled very simply by local interaction of the data in the output space, significantly reducing the computational complexity of the algorithm. We also demonstrated how this method could be applied to the MACE filter such that statistically independent features, rather than merely uncorrelated features, could be extracted over the rejection class.

In the course of the discussion we presented results with respect to ISAR data. The data chosen represent, in our opinion, a fairly difficult classification problem, in the sense that the range of distortions for the ISAR data includes not only rotation in aspect but also modifications in the vehicle configuration and differences in the radar depression angle. In spite of these obstacles the nonlinear system generalizes quite well.

It is our opinion that the results of this research represent a contribution to the state of the art in the areas of automatic target recognition and, by extension, pattern recognition, as well as information theoretic signal processing. We also feel that this research has established a basis for a continued line of research. In particular, the information theoretic feature extraction method is of interest to a wider audience than the automatic target recognition community. Signal processing problems such as blind source separation, independent component analysis, and parameter estimation represent potential applications of the technique. Another topic we have mentioned is the relationship of this approach to the self-organizing feature map (SOFM) of Kohonen. These areas of application will be pursued in the future.


APPENDIX A
DERIVATIONS

A.1 Frequency Domain Relationships

The following derivations show frequency domain relationships for unitary discrete Fourier transformations. The results are shown for one-dimensional vectors but can easily be extended to multiple dimensions.

The autocorrelation sequence of a complex wide-sense stationary process x(n) is defined as Rx(m) = E(x*(n) x(n+m)) and can be estimated from N observations of the process (assuming the sequence is ergodic, and treating the shift circularly) as

    R̂x(m) = (1/N) Σ_{n=0}^{N−1} x*(n) x(n+m)                                  (93)

A relationship can be derived between the estimated autocorrelation sequence and the DFT of the observed sequence using the unitary DFT pair

    X(k) = (1/√N) Σ_{n=0}^{N−1} x(n) e^{−j2πkn/N},    x(n) = (1/√N) Σ_{k=0}^{N−1} X(k) e^{j2πkn/N}

Substituting x(n) with its DFT in (93) yields

    R̂x(m) = (1/N²) Σ_n Σ_k Σ_l X*(k) X(l) e^{−j2πkn/N} e^{j2πl(n+m)/N}
           = (1/N) Σ_k |X(k)|² e^{j2πkm/N}

since the sum over n vanishes unless l = k. This is the DFT of the periodogram of the observed sequence scaled by a factor 1/√N.

The unitary DFT can also be represented by matrix operations as

    X = Φx,    x = Φ†X,    ΦΦ† = I,
    Φ = [φ_0 φ_1 ... φ_{N−1}]†,    φ_k(n) = exp(−j2πkn/N)/√N                  (94)

where {x, X, φ_k} ∈ C^{N×1} are complex column vectors. The DFT relationship between the estimated autocorrelation sequence R̂x(m) and the periodogram of the observed sequence Px(k) = |X(k)|² is then written

    [R̂x(0) ... R̂x(N−1)]^T = (1/√N) Φ† [Px(0) ... Px(N−1)]^T

The covariance matrix of a zero-mean complex wide-sense stationary process x(n) can be estimated from N observations of the process as the Toeplitz matrix whose (n, m) element is the autocorrelation estimate R̂x(m−n), with first row [R̂x(0), R̂x(1), ..., R̂x(N−1)]. Replacing the elements of the covariance matrix with the autocorrelation sequence estimates, applying the unitary DFT matrices, and using the DFT relationship of (94) together with the standard DFT shift properties yields

    Φ Σ̂x Φ† = Px

where Px is a diagonal matrix whose diagonal elements contain the periodogram of the observed sequence x(n).

The output variance of an FIR filter h due to a zero-mean complex wide-sense stationary random noise sequence input n is

    σn² = E[(h†n)(n†h)] = h† E[nn†] h = h† Σn h
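The autocorrelation–periodogram relationship derived above can be checked numerically. The sketch below is illustrative rather than part of the derivation; it uses NumPy's norm="ortho" FFT as the unitary DFT and a circular autocorrelation estimate.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64
x = rng.normal(size=N) + 1j * rng.normal(size=N)   # one realization of a complex process

# Circular autocorrelation estimate: R_hat[m] = (1/N) sum_n x*(n) x((n+m) mod N).
R_hat = np.fft.ifft(np.abs(np.fft.fft(x)) ** 2) / N

# Periodogram built from the unitary DFT: P[k] = |X(k)|^2 with X = fft(x)/sqrt(N).
X = np.fft.fft(x, norm="ortho")
P = np.abs(X) ** 2

# The unitary DFT of the autocorrelation estimate is the periodogram scaled by 1/sqrt(N).
assert np.allclose(np.fft.fft(R_hat, norm="ortho"), P / np.sqrt(N))

# The unitary DFT matrix diagonalizes the (circulant) estimated covariance
# matrix, with the periodogram values appearing on the diagonal.
Phi = np.fft.fft(np.eye(N), norm="ortho")          # rows are the unitary DFT vectors
Sigma = np.array([[R_hat[(m - n) % N] for m in range(N)] for n in range(N)])
D = Phi @ Sigma @ Phi.conj().T
assert np.allclose(D, np.diag(np.diag(D)), atol=1e-10)
assert np.allclose(np.sort(np.diag(D).real), np.sort(P))
```

The circular (rather than linear) autocorrelation makes the covariance matrix exactly circulant, which is why the diagonalization holds exactly rather than asymptotically here.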


Insertion of the unitary DFT transformation matrices into the variance expression, using the definitions of (94) and the diagonalization above, yields the frequency domain relationship

    σn² = h† Σn h = (Φh)† Pn (Φh) = H† Pn H

A.2 Optimal Tradeoff of Noise Response with Error Variance Subject to Zero-Mean Error Constraint

Suppose we wish to relax the equality constraints with regard to the desired outputs. That is, we no longer require that

    X†h = d

and instead allow the output response at the locations of the former equality constraints to vary with zero mean. We can trade off the noise response of the filter against the error variance as follows:

    min_h  λ h†h + (1−λ)(X†h − d)†(X†h − d)

subject to the constraint 1^T(X†h − d) = 0, where 1 = [1 ... 1]^T. Adjoining the zero-mean equality constraint to the optimization criterion with a multiplier μ yields

    J(h) = λ h†h + (1−λ)(X†h − d)†(X†h − d) + μ 1^T(X†h − d)

Computing the gradient of J with respect to h yields

    ∇_h J = 2λh + 2(1−λ) X(X†h − d) + μ X1

Setting the gradient to zero and solving for h yields

    h = (λI + (1−λ) XX†)^{−1} X ((1−λ)d − (μ/2)1)

Substituting this expression into the zero-mean constraint equation yields the condition

    1^T X† (λI + (1−λ)XX†)^{−1} X ((1−λ)d − (μ/2)1) = 1^T d

from which the scalar μ can be solved. For the special case d = [1 ... 1]^T the condition simplifies further; letting x̄ = (1/N_t) Σ_i x_i denote the mean training exemplar and factoring out the common scalar term gives μ in closed form.
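The closed-form solution of the tradeoff problem above can be sanity-checked numerically. The sketch below is our own (dimensions, data, and the tradeoff value λ are arbitrary placeholders): it solves the zero-mean condition for the multiplier μ, forms h from the stationarity condition, and verifies both the gradient condition and the constraint.

```python
import numpy as np

rng = np.random.default_rng(0)
n, nt, lam = 16, 4, 0.3                 # filter length, exemplar count, tradeoff (assumed)
X = rng.normal(size=(n, nt))            # columns are training exemplars
d = rng.normal(size=nt)
one = np.ones(nt)

A = lam * np.eye(n) + (1 - lam) * X @ X.T
Ainv = np.linalg.inv(A)
B = X.T @ Ainv @ X

# Solve the zero-mean condition 1^T (X^T h - d) = 0 for the multiplier mu.
mu = 2.0 * ((1 - lam) * one @ B @ d - one @ d) / (one @ B @ one)

# Closed-form filter from setting the criterion gradient to zero.
h = Ainv @ X @ ((1 - lam) * d - 0.5 * mu * one)

# Stationarity of the adjoined criterion and satisfaction of the constraint.
grad = 2 * lam * h + 2 * (1 - lam) * X @ (X.T @ h - d) + mu * X @ one
assert np.allclose(grad, 0.0, atol=1e-10)
assert abs(one @ (X.T @ h - d)) < 1e-10
```

The check uses real-valued data for simplicity; the complex case replaces transposes with conjugate transposes as in the derivation.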


Substituting the resulting value of μ back into the expression for h yields the solution to the constrained tradeoff problem.

A.3 Convolution of Gaussian Kernel with its Gradient

The N-dimensional Gaussian kernel takes the form

    κ(u) = (2π)^{−N/2} |Σ|^{−1/2} exp(−(1/2) u^T Σ^{−1} u)

where u ∈ R^N and the covariance term Σ ∈ R^{N×N} is a positive definite matrix. The gradient of the kernel with respect to u has the form [Fukanaga, 1990]

    κ′(u) = ∂κ(u)/∂u = −(2π)^{−N/2} |Σ|^{−1/2} exp(−(1/2) u^T Σ^{−1} u) Σ^{−1} u

Here we are interested in these functions when the covariance term has the simplified form Σ = σ²I, in which case the two expressions become

    κ(u) = (2πσ²)^{−N/2} exp(−u^T u / (2σ²))

and

    κ′(u) = −(2πσ²)^{−N/2} (u/σ²) exp(−u^T u / (2σ²))

The convolution of these terms, κ_a(u), is computed as

    κ_a(u) = κ(u) * κ′(u) = ∫ κ(u − x) κ′(x) dx

The integrand contains a vector term, so the convolution of the scalar kernel expression with the vector gradient expression is carried out element by element. Writing u = [u_1 ... u_N]^T and x = [x_1 ... x_N]^T and carrying out the separable Gaussian integrals for the jth element, the final result can be converted back to vector form as

    κ_a(u) = κ(u) * κ′(u) = −(4πσ²)^{−N/2} (u/(2σ²)) exp(−u^T u / (4σ²))

that is, the gradient of a Gaussian kernel of variance 2σ².

Also of interest is the gradient, specifically the magnitude, of κ_a(u). Near the center, κ_a(u) ≈ −(4πσ²)^{−N/2} u/(2σ²), so the magnitude of the slope at the center of the kernel is (4πσ²)^{−N/2}/(2σ²); its reciprocal gives the scale factor, as a function of kernel size and dimension, used to normalize the attractor kernel so that the slope at its center is unity.

A.4 Convolution of the Uniform Distribution Function with the Gradient of the Gaussian Kernel

The uniform distribution function has the form

    f(u) = Π_{i=1}^{N} 1/(b_i − a_i)  for a_i ≤ u_i ≤ b_i,  and 0 otherwise

The convolution of the uniform distribution function with the gradient of the gaussian kernel can be written as

    ∫ f(x) κ′(u − x) dx

which is an N-fold integral over the region of support of the uniform distribution function. We are interested in the result of this vector integration when the kernel gradient term has the same form as in the previous section and the uniform distribution is symmetric, with b_i = a and a_i = −a.
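The key identity of section A.3 — that convolving a Gaussian kernel of variance σ² with its own gradient yields the gradient of a Gaussian of variance 2σ² — can be verified by discrete convolution in one dimension. This is an illustrative check of ours, with arbitrary values of σ and grid spacing.

```python
import numpy as np

sigma = 0.7
u = np.linspace(-8.0, 8.0, 8001)       # odd length so the grid is centered at zero
dx = u[1] - u[0]

def kernel(u, s):
    # One-dimensional Gaussian kernel of width s.
    return np.exp(-u ** 2 / (2 * s ** 2)) / np.sqrt(2 * np.pi * s ** 2)

k = kernel(u, sigma)
kp = -u / sigma ** 2 * k               # gradient of the kernel

# Discrete approximation of (k * k')(u); dx supplies the integral measure.
conv = np.convolve(k, kp, mode="same") * dx

# Analytic result: gradient of a Gaussian with variance 2 sigma^2.
s2 = np.sqrt(2.0) * sigma
analytic = -u / s2 ** 2 * kernel(u, s2)

assert np.max(np.abs(conv - analytic)) < 1e-6
```

The same reasoning (convolution of two Gaussians gives a Gaussian of summed variances, and convolution commutes with differentiation) extends the check to the N-dimensional case by separability.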


With these conditions the integral separates across dimensions. For the jth element of the vector result, the integration over x_j involves the one-dimensional kernel derivative and collapses to a difference of kernel values at the edges of the support, while each remaining dimension contributes an error-function factor:

    [∫ f(x) κ′(u − x) dx]_j = (2a)^{−N} [κ_1(u_j + a; σ) − κ_1(u_j − a; σ)] Π_{i≠j} (1/2) [erf((u_i + a)/(√2 σ)) − erf((u_i − a)/(√2 σ))]

where κ_1(·; σ) is the one-dimensional Gaussian kernel of width σ. The vector result of the convolution is then written by collecting these elements for j = 1, ..., N.
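The one-dimensional instance of this result — the integral over the uniform support collapsing to kernel values at the edges of the box — can be verified against direct numerical integration. The sketch below is ours; the values of a and σ are arbitrary.

```python
import numpy as np

a, sigma = 1.0, 0.4

def k1(u, s):
    # One-dimensional Gaussian kernel of width s.
    return np.exp(-u ** 2 / (2 * s ** 2)) / np.sqrt(2 * np.pi * s ** 2)

def k1p(u, s):
    # Derivative of the one-dimensional kernel.
    return -u / s ** 2 * k1(u, s)

def analytic(u):
    # 1D instance of the boxed result: kernel values at the box edges.
    return (k1(u + a, sigma) - k1(u - a, sigma)) / (2 * a)

def numeric(u):
    # Trapezoidal approximation of the convolution integral over the
    # support [-a, a] of the uniform density 1/(2a).
    x = np.linspace(-a, a, 20001)
    y = k1p(u - x, sigma) / (2 * a)
    return (((y[:-1] + y[1:]) / 2) * np.diff(x)).sum()

for u0 in (-1.3, -0.2, 0.0, 0.7, 2.1):
    assert abs(numeric(u0) - analytic(u0)) < 1e-6
```

By symmetry the result vanishes at u = 0 and points back toward the box for |u| > a, which is the repulsive/attractive feedback behavior exploited in the chapter.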


REFERENCES

Amit, D. (1989), Modeling Brain Function: The World of Attractor Neural Networks, Cambridge University Press, New York.

Fisher, J. and J. Principe, "A nonlinear extension of the MACE filter," Neural Networks, Special Issue on Neural Networks for Automatic Target Recognition.

Fisher, J. and J. Principe, "Unsupervised learning for nonlinear synthetic discriminant functions," Proceedings of SPIE.

Funahashi, K. (1989), "On the approximate realization of continuous mappings by neural networks," Neural Networks.

Fukanaga, K. (1990), Statistical Pattern Recognition, 2nd ed., Harcourt Brace Jovanovich, Cambridge, Massachusetts.

Gerbrands, J., "On the relationships between SVD, KLT and PCA," Pattern Recognition.

Gheen, G., "Design considerations for low-clutter, distortion-invariant correlation filters," Optical Engineering.

Hardle, W., Applied Nonparametric Regression, Cambridge University Press, New York.

Kullback, S., Information Theory and Statistics, Dover Publications, New York.

Novak, L. M., M. C. Burl, and W. W. Irving, "Optimal polarimetric processing for enhanced target detection," IEEE Transactions on Aerospace and Electronic Systems.

Novak, L. M., G. Owirka, and C. Netishen, "Radar target identification using spatial matched filters," Pattern Recognition.

Oppenheim, A. V. and R. W. Shafer (1989), Discrete-Time Signal Processing, Prentice-Hall, New Jersey.

Papoulis, A. (1991), Probability, Random Variables, and Stochastic Processes, 3rd ed., McGraw-Hill, New York.

Schmidt, W. and J. Davis, "Pattern recognition properties of various feature spaces for higher order neural networks," IEEE Transactions on Pattern Analysis and Machine Intelligence.

Shannon, C. E. (1948), "A mathematical theory of communication," Bell Systems Technical Journal.

Sudharsanan, S., A. Mahalanobis, and M. Sundareshan, "Selection of optimum output correlation values in synthetic discriminant function design," J. Opt. Soc. Am. A.

Sudharsanan, S., A. Mahalanobis, and M. Sundareshan, "A unified framework for the synthesis of synthetic discriminant functions with reduced noise variance and sharp correlation structure," Appl. Opt.

Vander Lugt, A. (1964), "Signal detection by complex matched spatial filtering," IEEE Trans. Inf. Theory.

Viola, P., N. Schraudolph, and T. Sejnowski, "Empirical entropy manipulation for real-world problems," Neural Information Processing Systems (to appear in published proceedings).

Werbos, P. (1974), "Beyond regression: new tools for prediction and analysis in the behavioral sciences," Ph.D. thesis, Harvard University, Cambridge, MA.

Widrow, B. and M. Hoff (1960), "Adaptive switching circuits," IRE Wescon Convention Record.

Wilkinson, T. and J. Goodman, "Synthetic discriminants and eigenvector decompositions," Appl. Opt.

Wong, B. and J. Blake, "Detection in multivariate non-gaussian noise," IEEE Transactions on Communications.


BIOGRAPHICAL SKETCH

Mr. Fisher was born in April 1965. He earned his bachelor's degree in electrical engineering from the University of Florida. He was a graduate research assistant in the Electronic Communications Laboratory at the University of Florida, during which time he earned his Master of Engineering degree from the University of Florida. He has continued his affiliation with the ECL as both a faculty member and graduate research assistant, during which time he has conducted research in the areas of ultra-wideband radar for ground penetration and foliage penetration applications, radar signal processing, and automatic target recognition algorithms. He has also performed duties as a graduate research assistant and Ph.D. candidate in the Computational NeuroEngineering Laboratory, during which time he has conducted research (his Ph.D. topic) on nonlinear extensions to synthetic discriminant functions with application to classification of mm-wave SAR imagery.


I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

Thomas E. Bullock
Professor of Electrical and Computer Engineering

I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

John M. M. Anderson
Assistant Professor of Electrical and Computer Engineering

I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

Assistant Professor of Electrical and Computer Engineering

I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

Frank Bova
Professor of Nuclear and Radiological Engineering

I certify that I have read this study and that in my opinion it conforms to acceptable standards of scholarly presentation and is fully adequate, in scope and quality, as a dissertation for the degree of Doctor of Philosophy.

Andrew F. Laine
Associate Professor of Computer and Information Science and Engineering

This dissertation was submitted to the Graduate Faculty of the College of Engineering and to the Graduate School and was accepted as partial fulfillment of the requirements for the degree of Doctor of Philosophy.

May 1997

Winfred M. Phillips
Dean, College of Engineering

Karen A. Holbrook
Dean, Graduate School






4.7.1 Shift Invariance of the Proposed Nonlinear Architecture
One of the properties of the MACE filter is shift invariance. We wish to maintain that
property in our nonlinear extensions. A transformation, T[·], of a two-dimensional function is shift invariant if it can be shown that

    g(n1, n2) = T[y(n1, n2)]
    g(n1 + n1′, n2 + n2′) = T[y(n1 + n1′, n2 + n2′)]

where n1′ and n2′ are integers. In other words, a shift of the input signal is reflected as a corresponding shift of the output signal [Oppenheim and Shafer, 1989].
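The definition above can be exercised numerically: a transformation built by sliding a fixed nonlinear map over every (circular) window of the input commutes with shifts. The sketch below is ours (one-dimensional for brevity, with arbitrary weights), not the dissertation's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
N, L = 8, 32                      # window (input) size, signal length
W1 = rng.normal(size=(6, N))      # arbitrary layer weights for the demo
W2 = rng.normal(size=(4, 6))
W3 = rng.normal(size=(1, 4))

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def mlp(y):
    # A three-layer map of the general form sigma(W3 sigma(W2 sigma(W1 y))).
    return sigmoid(W3 @ sigmoid(W2 @ sigmoid(W1 @ y)))[0]

def sliding_output(x):
    # Apply the same MLP to every circular window of the signal: the
    # nonlinear analogue of correlating the signal with a filter template.
    return np.array([mlp(np.roll(x, -n)[:N]) for n in range(len(x))])

x = rng.normal(size=L)
s = 5
out = sliding_output(x)
out_shifted = sliding_output(np.roll(x, s))

# A shift of the input produces the same shift of the output sequence.
assert np.allclose(np.roll(out, s), out_shifted)
```

The invariance follows entirely from applying one fixed map to every window; no property of the map itself is used, which mirrors the cascading argument in the text.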
We show here that this property is maintained for our proposed nonlinear architecture.
The pre-processor of the nonlinear architecture at the bottom of figure 22 is the same as
the pre-processor of the linear filter shown at the top. The pre-processor is implemented as
a linear shift invariant (LSI) filter. Cascading shift invariant operations maintains shift
invariance of the entire system [Oppenheim and Shafer, 1989]. In order to show that the
system as a whole is shift invariant, it is sufficient to show that the MLP is shift invariant.
The mapping function of the MLP in figure 22 can be written
    g(w, y) = σ(W3 σ(W2 σ(W1 y + φ1) + φ2) + φ3)                              (41)

where w denotes the set of network weights, Wi ∈ R^{Ni×Ni−1} are the layer weight matrices, and φi ∈ R^{Ni} are the bias vectors.
In the nonlinear architecture, the matrix Wi represents the connectivities from the processing elements (PEs) of layer (i − 1) to the inputs of the PEs of layer i; that is, the matrix Wi is applied as a linear transformation to the vector output of layer (i − 1). When i = 1 the transformation is applied to the input vector, y. The number of PEs in layer i is


In the second experiment we repeat the conditions of experiment IV. In this case we represent the rejection class with both subspace noise and convex hull exemplars. The gaussian assumption is now no longer correct due to the inclusion of the convex hull exemplars. In this case we would expect the results to improve, on the assumption that there is information to be extracted from the convex hull exemplars with regard to classification. As we have already demonstrated that the convex hull approach yielded improved classification in the previous chapter, we will hold this assumption to be true. The feature space for this case is shown in figure 67. We observe that the feature space, as in the previous chapter, is quite different. Intuitively the result makes sense in the context of mutual information. Recall that the convex hull exemplars lie in the interior of the recognition class in the input space (due to their construction), and our goal via mutual information is to make the recognition class compact and the rejection class diffuse. This goal and the property of the convex hull are seemingly at odds, and as a result a trade-off emerges. The recognition class is compact on an ellipsoid with the convex hull exemplars on the interior, but expanded.

We see from table 4 that the classification results are substantially better than the previous results. The ROC curve for this experiment is shown in figure 68, compared to the linear system. We would hypothesize that mutual information performed better at extracting discriminating information from the exemplars. We also note that both results did not rely on orthogonality in the input layer, for which we can only make second-order statistical justifications, and yet were able to achieve the same or better performance.


Figure 22. Decomposition of optimized correlator as a pre-processor followed by SDF/LAM (top). Nonlinear variation shown with MLP replacing SDF in signal flow (middle); detail of the MLP (bottom). The linear transformation A represents the space-domain equivalent of the spectral pre-processor (\alpha P_x + (1-\alpha)P_n)^{-1/2}.


the observed output distribution, \epsilon_Y(u, y), with the gradient of the kernel, \kappa'(\cdot). It is through the gradient of the estimator kernel that the observed distribution error influences the direction of each data observation in the output space and thereby (through backpropagation) the parameters of the mapping. This point will be further illustrated in the next section for the case of gaussian kernels.

The adaptation scheme is depicted in figure 45. As can be seen, this approach fits readily into the backpropagation framework. The point set x = \{x_1, \ldots, x_N\} is mapped to a point set y = \{y_1, \ldots, y_N\}. The criterion then estimates from the set an error between the observed output distribution and the baseline output distribution (uniform in this case). From this distribution error, computed over the range of the output space, an error direction (whose sign depends on whether we wish to minimize or maximize entropy) is associated with each data point in the set y. This error direction is then backpropagated through the MLP in order to modify the parameters of the mapping.
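A one-dimensional sketch of this adaptation step is given below (not from the original text; the gaussian kernel, kernel width, sample grid, and uniform baseline are all assumptions chosen for illustration). It computes the Parzen estimate of the output distribution, the distribution error against the uniform baseline, and the per-observation error direction as the convolution of that error with the kernel gradient:

```python
import numpy as np

rng = np.random.default_rng(2)
y = rng.uniform(-0.5, 0.5, 20)            # mapped outputs (1-D output space)
sigma = 0.15                               # Parzen kernel width (assumed)
u = np.linspace(-1.0, 1.0, 101)            # sample points over the output range
du = u[1] - u[0]

def kappa(t):
    # Gaussian Parzen kernel
    return np.exp(-t**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

def kappa_prime(t):
    # Derivative of the kernel with respect to its argument
    return -t / sigma**2 * kappa(t)

f_hat = kappa(u[:, None] - y[None, :]).mean(axis=1)   # observed output PDF (Parzen)
f_base = np.full_like(u, 0.5)                         # uniform baseline on [-1, 1]
eps = f_base - f_hat                                  # distribution error over the grid

# Error direction per observation: distribution error convolved with the
# kernel gradient, evaluated at each output point y_i; its sign convention
# follows the choice of minimizing or maximizing entropy
e_dir = (eps[None, :] * kappa_prime(y[:, None] - u[None, :])).sum(axis=1) * du
assert e_dir.shape == y.shape
```

In a full system this `e_dir` vector would be injected into backpropagation in place of the usual supervised error term.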
Figure 45. A signal flow diagram of the learning algorithm. The criterion block computes, as
a function of the observed outputs, the error direction for the mapping network.


neural network), this term can be computed efficiently using standard backpropagation.
The remaining partial derivative, \partial f / \partial g, evaluated at each mapped sample y_i, is

\frac{\partial f}{\partial y_i} = \frac{1}{N_y}\sum_{j}\epsilon_Y(u_j, y)\,\kappa'(y_i - u_j),   (73)

where \kappa'(\cdot) is the derivative of the kernel function with respect to its argument. Substituting 73 into 72 yields

\frac{\partial f}{\partial \alpha} = \sum_{i=1}^{N}\frac{\partial y_i}{\partial \alpha}\left[\frac{1}{N_y}\sum_{j}\epsilon_Y(u_j, y)\,\kappa'(y_i - u_j)\right].   (74)
The terms in 74, excluding the mapping sensitivities, become the new error direction
term in the backpropagation algorithm. It is important to distinguish error direction from
error. If the term were an error this would imply a desired output (d = y + e), which is the
case for a supervised training algorithm using the mean square error criterion. However, in
general, the partial derivative only implies the direction we would like to perturb the
observation. Later we will show how to interpret the error direction as an actual error
(resulting in a much simplified algorithm).
By reversing the order of summations in the second step of 74 we see that the error
direction term associated with each observation is a convolution of the estimated error in


combining this solution for h with the pre-processor in equation (31) for the equivalent linear system, h_{sys}, yields

h_{sys} = Ah
        = A\,\Phi^{\dagger}D^{-1/2}X(X^{\dagger}D^{-1}X)^{-1}d
        = \Phi^{\dagger}D^{-1/2}\Phi\,\Phi^{\dagger}D^{-1/2}X(X^{\dagger}D^{-1}X)^{-1}d
        = \Phi^{\dagger}D^{-1}X(X^{\dagger}D^{-1}X)^{-1}d.

Substituting the MACE filter solution, equation (24), gives the result

h_{sys} = \Phi^{\dagger}H_{MACE}   (39)

and so h_{sys} is the inverse DFT pair of the spectral domain MACE filter. This result establishes the relationship between the MACE filter and the linear associative memory. The decomposition of the MACE filter of figure 16 can also be considered as a cascade of a linear pre-processor followed by a linear associative memory (LAM) as in figure 17.
Figure 17. Decomposition of MACE filter as a pre-processor (i.e. a pre-whitening filter over the average power spectrum of the exemplars) followed by a linear associative memory.
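The equivalence between the spectral MACE solution and the pre-processor/LAM cascade can be verified numerically; the sketch below (not from the original text) uses a 1-D analogue with hypothetical exemplars, the unitary DFT, and a diagonal D built from the average power spectrum:

```python
import numpy as np

rng = np.random.default_rng(3)
N, Nt = 16, 3                              # signal length, number of exemplars
Xs = rng.standard_normal((N, Nt))          # space-domain exemplars as columns
d = np.ones(Nt)                            # peak constraints

idx = np.arange(N)
Phi = np.exp(-2j * np.pi * np.outer(idx, idx) / N) / np.sqrt(N)   # unitary DFT
Xf = Phi @ Xs                              # spectral-domain exemplars
D = np.diag((np.abs(Xf) ** 2).mean(axis=1))  # average power spectrum on diagonal

# Spectral MACE solution: H = D^-1 X (X^H D^-1 X)^-1 d
Dinv = np.linalg.inv(D)
H = Dinv @ Xf @ np.linalg.inv(Xf.conj().T @ Dinv @ Xf) @ d

# Pre-processor A = Phi^H D^(-1/2) Phi, then LAM on whitened data
A = Phi.conj().T @ np.diag(np.diag(D) ** -0.5) @ Phi
Y = A @ Xs
h_lam = Y @ np.linalg.inv(Y.conj().T @ Y) @ d
h_sys = A @ h_lam

# h_sys is the inverse DFT of the spectral MACE filter
assert np.allclose(h_sys, Phi.conj().T @ H)
```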
Since the two are equivalent, why make the distinction between the two perspectives? There are several reasons. The development of distortion invariant filtering and associative memories has proceeded in parallel. Distortion invariant filtering has been


We are more interested in the result when the gaussian assumption is not correct. In this case we would not expect the PCA and entropy mappings to be equivalent. We conduct a second experiment to illustrate this point, where we draw observations from a random source whose underlying distribution is not gaussian. Specifically, the PDF is a mixture of gaussian modes with the following form

p(x) = \frac{1}{2}\left(N(x, m_1, \Sigma_1) + N(x, m_2, \Sigma_2)\right),

where N(x, m, \Sigma) is a gaussian distribution with mean m and covariance \Sigma. In this case

m_1 = \begin{bmatrix} -0.9 \\ 0.0 \end{bmatrix}, \quad \Sigma_1 = \begin{bmatrix} 0.05 & 0 \\ 0 & 1.2 \end{bmatrix}, \quad m_2 = \begin{bmatrix} 0.9 \\ 0.0 \end{bmatrix}, \quad \Sigma_2 = \begin{bmatrix} 0.05 & 0 \\ 0 & 0.8 \end{bmatrix}.

It can be shown that the principal components of this distribution are the eigenvectors of the matrix

R = \frac{1}{2}\left(\Sigma_1 + m_1 m_1^{T} + \Sigma_2 + m_2 m_2^{T}\right) = \begin{bmatrix} 0.86 & 0 \\ 0 & 1 \end{bmatrix},

with the principal component vector parallel to the x_2-axis.
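The matrix R and its principal component can be checked directly (an illustrative computation, not from the original text):

```python
import numpy as np

m1 = np.array([-0.9, 0.0]); S1 = np.diag([0.05, 1.2])
m2 = np.array([ 0.9, 0.0]); S2 = np.diag([0.05, 0.8])

# Correlation matrix of the equal-weight mixture
R = 0.5 * (S1 + np.outer(m1, m1) + S2 + np.outer(m2, m2))
assert np.allclose(R, np.diag([0.86, 1.0]))

# Principal component = eigenvector of the largest eigenvalue (here the x2-axis)
vals, vecs = np.linalg.eigh(R)
pc = vecs[:, np.argmax(vals)]
assert np.isclose(abs(pc[1]), 1.0)
```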
This distribution is shown in figure 50 along with its first principal component feature
mapping. The right side of figure 50 shows the image of the maximum entropy mapping.
As we can see there are two distinct differences between this mapping and the PCA result.
The first observation is that the mapping is nonlinear. The second observation is that the


Figure 16. Decomposition of MACE filter as a pre-processor (i.e. a pre-whitening filter over the average power spectrum of the exemplars) followed by a synthetic discriminant function.
3.4 Associative Memory Perspective
Having presented the derivation of the MACE filter and the pre-processor/SDF decomposition, we now show that with a modification (addition of a linear pre-processor), the MACE filter is a special case of Kohonen's linear associative memory [1988].

Associative memories [Kohonen, 1988] are general structures by which pattern vectors can be related to one another, typically in an input/output pair-wise fashion. An input stimulus vector is presented to the associative memory structure, resulting in an output response vector. The input/output pairs establish the desired response to a given input. In the case of an auto-associative memory, the desired response is the stimulus vector, whereas in a hetero-associative memory the desired response is arbitrary. From a signal processing perspective, associative memories are viewed as projections [Kung, 1992], linear and nonlinear. The input patterns exist in a vector space and the associative memory projects them onto a new space. The linear associative memory of Kohonen [1988] is formulated exactly in this way.
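The vector-to-scalar hetero-associative form described next can be sketched numerically (hypothetical data; the least-squares pseudo-inverse solution is assumed):

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.standard_normal((10, 3))        # three stimulus vectors (columns), dim 10
d = np.array([1.0, -1.0, 0.5])          # desired scalar responses

# Least-squares linear associative memory: h = X (X^T X)^-1 d
h = X @ np.linalg.solve(X.T @ X, d)

# Each stored stimulus is recalled exactly (exemplars linearly independent)
assert np.allclose(X.T @ h, d)
```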
A simple form of the linear hetero-associative memory maps vectors to scalars. It is
formulated as follows. Given the set of input/output vector/scalar pairs


Figure 8. MACE filter output image plane response.
2.1.4 Optimal Trade-off Synthetic Discriminant Function
The final distortion invariant filtering technique which will be discussed here is the method proposed by Réfrégier and Figue [1991], known as the optimal trade-off synthetic discriminant function (OTSDF). Suppose that the designer wishes to optimize over multiple quadratic optimization criteria (e.g. average correlation energy and output noise variance) subject to the same set of equality constraints as in the previous distortion invariant filters. We can represent the individual optimization criteria by

J_i = h^{\dagger}Q_i h,

where Q_i is an N \times N symmetric, positive-definite matrix (e.g. Q_i equal to the noise covariance matrix for the MVSDF optimization criterion).

The OTSDF is a method by which a set of quadratic optimization criteria may be optimally traded off against each other; that is, one criterion can be minimized with mini-


The adaptation scheme of backpropagation allows a simple mechanism to implement this constraint. The adaptation of matrix W_1 at iteration k can be written as

W_1(k+1) = W_1(k) + \eta\,e_k'\,x_i(k)^{T},   (55)

where e_k' is a column vector derived from the backpropagated error and x_i(k) is the current input exemplar, from either class, presented to the network which, by design, lies in the subspace spanned by the columns of U. From equation (55), if the rejection class noise exemplars are restricted to lie in the data space of x_2, which can be achieved by projecting random vectors of size N_t onto the matrix U above, and W_1 is initialized to be a random projection from this space, we will be assured that the columns of W_1 only extract information from the data space of x_2. This is because the columns of W_1 will only be constructed from vectors which lie in the column space of U and so will be orthogonal to any vector component that lies in the null space of U.

The search for a discriminant function is now reduced from within an N_1 N_2-dimensional space to a search from within an N_t-dimensional space. Due to the dimensionality reduction achieved, we would expect the convergence time to be reduced.
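The subspace construction can be sketched as follows (illustrative only; the basis U is taken here from the SVD of hypothetical flattened exemplars):

```python
import numpy as np

rng = np.random.default_rng(5)
N_pix, Nt = 64, 5                          # flattened image size, exemplar count
X = rng.standard_normal((N_pix, Nt))       # pre-processed recognition exemplars

# Orthonormal basis U for the recognition-class subspace via the SVD
U, s, Vt = np.linalg.svd(X, full_matrices=False)   # U: N_pix x Nt

# Rejection-class noise exemplar: project a random Nt-vector, x = U n
n = rng.standard_normal(Nt)
x_noise = U @ n

# The exemplar lies entirely in the column space of X (zero residual)
resid = x_noise - U @ (U.T @ x_noise)
assert np.allclose(resid, 0.0)
```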
This is the method that was used for the third experiment. Rejection class noise exemplars were generated by projecting a random vector, n \in \Re^{N_t \times 1}, onto the basis U by x = Un. In figure 35 the resulting discriminant function is shown as in the previous


ing. As such, this method has wide potential application beyond nonlinear extensions to the MACE filter.

A significant result of chapter 5 was the demonstration that a global property of a mapping, namely information, could be modeled very simply by local interaction of the data in the output space, significantly reducing the computational complexity of the algorithm.
We also demonstrated how this method could be applied to the MACE filter such that
statistically independent features rather than uncorrelated features could be extracted over
the rejection class.
In the course of the discussion we presented results with respect to ISAR data. The
data chosen represents, in our opinion, a fairly difficult classification problem in the sense
that the range of distortions for the ISAR data not only includes rotation in aspect but
modifications in the vehicle configuration and differences in the radar depression angle. In
spite of these obstacles the nonlinear system generalizes quite well.
It is our opinion that the results of this research represent a contribution to the state of the art in the areas of automatic target recognition, and by extension pattern recognition, as well as information theoretic signal processing. We also feel that this research has established a basis for a continued line of research. In particular, the discussion of chapter 5 is of interest to a wider audience than the automatic target recognition community. Signal processing problems such as blind source separation, independent component analysis, and parameter estimation represent potential applications of the technique. Another topic we have mentioned is the relationship of this approach to the self-organizing feature map (SOFM) of Kohonen. These areas of application will be pursued in the future.


maximum entropy mapping is more tuned to the structure of the data in the input space. It is interesting to note that the maximum entropy mapping weights the tails of the modes equally, as evidenced by the greater spreading of the contours for the mode with the larger eigenvalue, while the PCA mapping does not. We can say from observing the results that the maximum entropy mapping is superior in describing the underlying structure of the data when compared to the PCA mapping.
Figure 50. PCA vs. entropy, non-gaussian case. Left: image of PCA features shown as contours. Right: entropy mapping shown as contours.
We consider one more bi-modal distribution. The setup is the same as the previous case (a bi-modal distribution) with the modifications

m_1 = \begin{bmatrix} -1 \\ -1 \end{bmatrix}, \quad \Sigma_1 = \begin{bmatrix} 1 & 0 \\ 0 & 0.1 \end{bmatrix}, \quad m_2 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \quad \Sigma_2 = \begin{bmatrix} 0.1 & 0 \\ 0 & 1 \end{bmatrix}.

In the original development, SDF type filters were formulated using correlation operations, a convention which will be maintained here. The output, g(n_1, n_2), of a correlation filter is determined by

g(n_1, n_2) = \sum_{m_1=0}^{N_1-1}\sum_{m_2=0}^{N_2-1} x^*(n_1+m_1, n_2+m_2)\,h(m_1, m_2) = x^*(n_1, n_2) ** h(n_1, n_2),

where x^*(n_1, n_2) is the complex conjugate of an input image with N_1 \times N_2 region of support, h(n_1, n_2) represents the filter coefficients, and ** represents the two-dimensional circular convolution operation [Oppenheim and Schafer, 1989].
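This output can be evaluated either by the direct double sum or in the spectral domain; a sketch with hypothetical real-valued arrays (not from the original text) shows the two agree:

```python
import numpy as np

rng = np.random.default_rng(6)
N1, N2 = 6, 6
x = rng.standard_normal((N1, N2))   # input image (real-valued here)
h = rng.standard_normal((N1, N2))   # filter coefficients

# Direct evaluation: g(n1,n2) = sum_m x*(n1+m1, n2+m2) h(m1,m2), indices mod N
g = np.zeros((N1, N2))
for n1 in range(N1):
    for n2 in range(N2):
        g[n1, n2] = np.sum(np.roll(x, (-n1, -n2), axis=(0, 1)) * h)

# Same result in the spectral domain; for real images G = X .* conj(H)
g_fft = np.real(np.fft.ifft2(np.fft.fft2(x) * np.conj(np.fft.fft2(h))))
assert np.allclose(g, g_fft)
```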
The MACE filter formulation is as follows [Mahalanobis et al., 1987]. Given a set of image exemplars, \{x_i \in \Re^{N_1 \times N_2};\; i = 1 \ldots N_t\}, we wish to find filter coefficients, h \in \Re^{N_1 \times N_2}, such that the average correlation energy at the output of the filter, defined as

E_{avg} = \frac{1}{N_t}\sum_{i=1}^{N_t}\sum_{n_1=0}^{N_1-1}\sum_{n_2=0}^{N_2-1}\left|g_i(n_1, n_2)\right|^2,   (19)

is minimized subject to the constraints

g_i(0,0) = \sum_{m_1=0}^{N_1-1}\sum_{m_2=0}^{N_2-1} x_i^*(m_1, m_2)\,h(m_1, m_2) = d_i, \qquad i = 1 \ldots N_t.   (20)

Mahalanobis [1987] reformulates this as a vector optimization in the spectral domain using Parseval's theorem. In the spectral domain we wish to find the elements of H \in C^{N_1 N_2 \times 1}, a column vector whose elements are the 2-D DFT coefficients of the space


ence is essentially zero when the distance from the center of the kernel is greater than 3\sigma, or

|y - y_i| > 3\sigma.   (88)

The process relies on local interaction, and so from an attraction viewpoint we can use the two most distant nearest neighbors to set the kernel size (and adapt it during the learning process). Stated mathematically,

\sigma = \frac{1}{3}\max_i \min_{j \ne i} |y_i - y_j|.   (89)

From a practical standpoint, the mutual distances between each point must be computed in the course of evaluating the kernels, and so equation 89 does not represent a significant additional burden.
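This kernel-size rule can be sketched as follows (illustrative only; the 1/3 factor is an assumption tied to the ~3\sigma support of the gaussian kernel discussed above):

```python
import numpy as np

rng = np.random.default_rng(7)
y = rng.standard_normal((30, 2))          # current output-space point set

# Pairwise distances (needed anyway when evaluating the kernels)
diff = y[:, None, :] - y[None, :, :]
dist = np.sqrt((diff ** 2).sum(axis=-1))
np.fill_diagonal(dist, np.inf)

# Nearest-neighbor distance for every point; the kernel size is set from the
# most distant nearest neighbor so every point interacts with at least one other
nn = dist.min(axis=1)
sigma = nn.max() / 3.0

assert sigma > 0
assert np.all(nn <= 3.0 * sigma + 1e-12)   # each point has a neighbor within 3 sigma
```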
Figure 62 shows an example of using local attraction to minimize entropy. In the figures there are two clusters of points. By choosing the kernel according to the maximum nearest neighbor distance, the points within the local clusters are attracted to a point. We also see that the clusters are converging to a single point in the third iteration.
5.9.2 Entropy Maximization as Diffusion
If instead the goal was maximum entropy, then the local interaction becomes repulsion and the feedback terms of figure 61 point in the opposite direction. We can use the idea of uniform diffusion in the output space in order to set the kernel size for entropy maximization. In the early stages of learning we would like the relative kernel size to be large. In this way, dense groupings of points will maximally interact (and repel), however,


Figure 34. Experiment II: Output response to an image from the recognition class
training set.
how to reduce the training complexity by recognizing that we can sufficiently describe the rejection class with white noise sequences. We now show a more compact description of the rejection class which leads to shorter convergence times, as demonstrated empirically. This description relies on the well known singular value decomposition (SVD).

We view the random white sequences as stochastic probes of the performance surface in the whitened image space. The classifier discriminant function is, of course, not determined by the rejection class alone. It is also affected by the recognition class. We have shown previously that the white noise sequences enable us to probe the input space more efficiently than examining all shifts of the recognition exemplars. However, we are still searching a space of dimension equal to the image size, N_1 N_2.

One of the underlying premises of a data driven approach is that the information about a class is conveyed through exemplars. In this case the recognition class is represented by


Figure 5. SDF peak output response of training vehicle 1a over all aspect angles. The MSF response is also shown (dashed line). The degradation in the peak response has been corrected.
2.1.2 Minimum Variance Synthetic Discriminant Function
The SDF approach seemingly solved the problem of generalizing a matched filter to multiple images. However, the SDF has no built-in noise tolerance by design (except for the white noise case). Furthermore, in practice, it would turn out that occasionally the noise response would be higher than the peak object response, depending on the type of imagery. As a result, detection by means of searching for correlation peaks was shown to be unreliable for some types of imagery, specifically imagery which contains recognition class images embedded in non-white noise [Kumar, 1992]. Kumar [1986] proposed a method by which noise tolerance could be built into the filter design. This technique was termed the minimum variance synthetic discriminant function (MVSDF). The MVSDF is


to a probabilistic framework coupled with a nonlinear topology led to improved classification performance. Furthermore, the previous chapter addressed the lack of training data through efficient descriptions of the rejection class by its second-order statistics and the recognition class by a subspace. The discussion in this chapter seeks to extend this approach beyond second-order descriptions by exploiting the underlying structure of both classes.

It is imperative in any feature extraction algorithm that the criterion by which the features are selected is somehow related to the overall system objective. Suitable criteria in classification are not always easily employed as a means for classification (e.g. likelihood ratios, which require prior knowledge of the underlying probability density function). Consequently, sub-optimal feature sets are used, or, even more commonly, user-defined ad hoc features based on intuitive assumptions but without any rigorous relationship to classification.

It is often the case that projecting high-dimensional data onto a smaller subspace results in improved performance of a nonparametric classifier. This statement is counterintuitive, as we cannot, in general, project onto a subspace without the loss of some information. The results of the previous chapter, however, confirm this assertion. In the final experiments, in our construction of rejection class exemplars, we implicitly restricted the search space to two distinct subspaces. In the first case, we confined the rejection class to the PCA subspace of the recognition class, while in the second case we restricted some of the rejection class exemplars to the convex hull of the recognition class. In doing so, no information concerning the recognition class was lost, but the intrinsic dimensionality of the data was reduced to the number of recognition class exemplars, N_t.


CHAPTER 1
INTRODUCTION
1.1 Motivation
Automatic target detection and recognition (ATD/R) is a field of pattern recognition. The goal of an ATD/R system is to quickly and automatically detect and classify objects which may be present within large amounts of data (typically imagery) with a minimum of human intervention. In an ATD/R system, it is not only desirable to recognize various targets, but to locate them with some degree of accuracy. The minimum average correlation energy (MACE) filter [Mahalanobis et al., 1987] is of interest to the ATD/R problem due to its localization and discrimination properties. The MACE filter is a member of a family of correlation filters derived from the synthetic discriminant function (SDF) [Hester and Casasent, 1980]. The SDF and its variants have been widely applied to the ATD/R problem. We will describe synthetic discriminant functions in more detail in chapter 2. Other generalizations of the SDF include the minimum variance synthetic discriminant function (MVSDF) [Kumar, 1986], the MACE filter, and more recently the gaussian minimum average correlation energy (G-MACE) [Casasent et al., 1991] and the minimum noise and correlation energy (MINACE) [Ravichandran and Casasent, 1992] filters.

This area of filter design is commonly referred to as distortion-invariant filtering. It is a generalization of matched spatial filtering for the detection of a single object to the detection of a class of objects, usually in the image domain. Typically the object class is represented by a set of exemplars. The exemplar images represent the image class through a


CHAPTER 4
STOCHASTIC APPROACH TO TRAINING NONLINEAR
SYNTHETIC DISCRIMINANT FUNCTIONS
The MACE filter is the best linear system that minimizes the energy in the output correlation plane subject to a peak constraint at the origin. An advantage of linear systems is that we have the mathematical tools to use them in optimal operating conditions from the standpoint of second order statistics. Such optimality conditions, however, should not be confused with the best possible classification performance.

Our goal is to extend the optimality condition of MACE filters to adaptive nonlinear systems and classification performance. The optimality condition of the MACE filter considers the entire output plane, not just the response when the image is centered. With regards to general nonlinear filter architectures which can be trained iteratively, a brute force approach would be to train a neural network with a desired output of unity for the centered images and zero for all shifted images. This would indeed emulate the optimality of the MACE filter; however, the result is a training algorithm of order N_1 N_2 N_t for N_t training images of size N_1 \times N_2 pixels. This is clearly impractical.

In this section we propose a nonlinear architecture for extending the MACE filter. We discuss some of its properties. Appropriate measures of generalization are discussed. We also present a statistical viewpoint of distortion invariant filters from which such nonlinear extensions fit naturally into an iterative framework. From this iterative framework we


through the use of a simple differentiable estimator, namely Parzen windows, that the
adaptation of entropy can fit logically into the error backpropagation model. This method
differs from other entropy based approaches such as using the Kullback-Leibler norm for
supervised learning.
We have also presented experiments that illustrate the usefulness of this technique.
Comparisons to the well known PCA method show that the information theoretic
approach is more sensitive to the underlying data structure beyond simple second-order
statistics. The data types used for the experiments were simple by design. They served to
illustrate the usefulness of the method even for seemingly simple problems.
We have also shown how the approach can be modeled as local interaction of the data
in the output space. This viewpoint led to a significant computational savings as well as a
clearer intuitive understanding of the algorithm.
5.11 Mutual Information Applied to the Nonlinear MACE Filter
At this point we present experimental results which illustrate application of this technique to the nonlinear MACE filter. This is accomplished by repeating experiments III (section 4.6.3) and IV (section 4.6.4) from the previous chapter.

In experiment III we trained the classifier (after pre-processing the imagery) with subspace projected noise exemplars for the rejection class and Gram-Schmidt orthogonalization on the input layer. The orthogonality constraint ensured that the features would be uncorrelated over the rejection class. In this experiment we remove the orthogonality constraint and decouple the feature extraction from the discriminant function explicitly. The images are still pre-processed and the same exemplars are used to train the system; however, the classifier architecture will be trained on the output of the feature extractor.