
COMPUTER SIMULATION: GROWTH THROUGH EXTENSION


Paul A. Fishwick

Computer & Information Sciences Dept.
University of Florida, CSE 301
Gainesville, FL 32611, USA
E-mail: fishwick@cis.ufl.edu


ABSTRACT

Computer simulation is a fundamental discipline for
studying complex systems. Like any other discipline,
simulation must grow and be fine-tuned so that it
maintains its position as the base methodology for
doing computational science and constructing digital
worlds. We discuss ten areas outside of simulation
and demonstrate growth by identifying relationships
between simulation and each of the areas. We outline
each field by describing it briefly and then specifying
outstanding issues which remain to be resolved. We
have found that we are better able to characterize
basic simulation methodology by integrating and ex-
tending simulation within the context of other fields.


INTRODUCTION

The field of computer simulation is approximately
forty years old, and is still vibrant and growing. As
technology develops faster hardware, old forms of
simulation are made faster, and new varieties of simu-
lation emerge through an extension process. Extend-
ing the core simulation knowledge base involves tak-
ing existing simulation concepts and blending these
concepts with those outside of the simulation disci-
pline. An example of extension is taking two con-
cepts, a system model and an abstract programming
object (from object oriented design), and seeing how
both of these relate to one another. We can extend
system models by designing model components as ob-
jects. This extension seems simple enough; however,
controversies arise. Should all physical systems
be modeled with objects, or are some better mod-
eled with equations, for instance? How do equational
models mesh with object based models? Sometimes,
the interface between the simulation concept and the
extensional concept is straightforward, but in most
instances, there are many issues to be addressed.
The ultimate goal within the simulation community


is to walk the narrow line between a mathematically
defined, system theory-based foundation on one hand
and, on the other, the world outside of simulation that
encourages extension and possible revision of the basic
approaches.

Extension, when used with simulation, means
that we consider an arbitrary topic and then study
how this topic can be integrated with simulation. The
morphological box concept (Zwicky 1966) provides a
formal way of studying the interaction between dif-
ferent topics and the field of computer simulation.
This box forms a new relation by considering two
orthogonal sets and taking the cross product. The
cross product forces one to study interactions in an
organized manner. One possible morphological box is
shown in Table 1. This kind of focused approach pro-
motes the discovery of new extensions and concepts
for the simulation field. It is possible that some cells
will be empty, representing that a clear relationship
does not exist for that particular row-column combi-
nation. This box breaks simulation down into three
sub-fields: model design, model execution and execu-
tion analysis. Model design reflects how we should
design and engineer models from concepts to some-
thing that can be executed on a computer. Model
execution includes serial and parallel algorithms for
simulating the model once it has been designed. Ex-
ecution analysis uses statistical procedures to collect
data, verify and validate models. These three sub-
fields are listed as columns. For each row, we list
several fields outside of simulation. As we will see,
these external fields serve as vehicles for extension.
Let's consider the entry in Table 1 identified by the
area artificial life (AL). The relationship between AL
and simulation model design is that most models for
AL are discrete and spatial in character. That is,
the models use types such as cellular automata and
L-systems. To consider the next column, model ex-
ecution, we will simulate AL models by employing
simulated evolution with genotypes and operations










Table 1: Morphological box for simulation.

                                             Computer Simulation
Outside Areas             Design                        Execution                    Analysis
-----------------------   ---------------------------   --------------------------   ----------------------
Abstraction               Homomorphism, Multimodeling   Multi-Level Coordination     Hybrid Methods
Artificial Intelligence   Quality, Autonomy                                          Logic
Object Orientation        Object Design                 Object Methods
Neural Networks           Behavioral Model              Training                     Error Analysis
Fuzzy Logic               Behavioral Model              Fuzzy Arithmetic
Artificial Life           Spatial                       Adaptation                   Emergence, Order
Parallelism               Message Passing               Distributed Architectures    Deadlock, Load Balance
Computer Graphics         Animation as Modeling         Rendering                    Validation
Virtual Reality           Geometry Based                Physical Equations           Tracking, Feedback
Information Access        Simulation as Information     Hypermedia
                          Gathering


such as mutate and crossover.
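
To make the cross-product idea concrete, the short Python sketch below enumerates a morphological box programmatically. It is only an illustration: the cell entries shown are drawn from Table 1, and the variable and function names are assumptions, not part of any existing simulation package.

# Minimal sketch: a morphological box as the cross product of outside areas
# and simulation sub-fields. Empty cells mark combinations for which no clear
# relationship has been identified.
from itertools import product

areas = ["Abstraction", "Artificial Intelligence", "Object Orientation",
         "Neural Networks", "Fuzzy Logic", "Artificial Life", "Parallelism",
         "Computer Graphics", "Virtual Reality", "Information Access"]
subfields = ["Design", "Execution", "Analysis"]

# A few example cells (area, sub-field) -> relationship, taken from Table 1.
box = {
    ("Artificial Life", "Design"): "spatial models (cellular automata, L-systems)",
    ("Artificial Life", "Execution"): "adaptation via simulated evolution",
    ("Artificial Life", "Analysis"): "emergence, order",
}

# The cross product forces every combination to be considered explicitly.
for area, sub in product(areas, subfields):
    relation = box.get((area, sub), "(no clear relationship identified)")
    print(f"{area:23s} x {sub:9s} -> {relation}")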

Many fields within the purview of computer science
and computer engineering serve as candidates for the
extension process. That is, we take our simulation
knowledge base and create extensions by linking to
these fields. The forums for the exchange of infor-
mation, and suggestions for extension, are normally
found in simulation conferences, but they can also
be found in workshops held within the extension disci-
plines. An example of the latter is the series of AI
and Simulation Workshops held at the National Con-
ference on Artificial Intelligence (AAAI) during the
years 1986-1990. Another example was the focus on
object oriented simulation during the 1993 conference
on object oriented programming and systems (OOP-
SLA '93). For this paper, we have chosen ten fields
that have served as bases for extension to computer
simulation. Most of these extensions reflect active re-
search agendas found in most simulation conference
technical programs.

We begin the discussion by touching on the role of
simulation using concepts and quotes. This discus-
sion is meant to be general and introductory. Then,
we will describe ten extension areas. Within each
area, the following items will be addressed:


* Definition of the extension area and relevance to
simulation.

* Issues, controversies and concerns associated
with the extension.

* Literature references on simulation researchers
working in the extension area.

* Future approaches and forecasts.


CONCEPTIONS AND MISCONCEPTIONS

There are many questions that others, as well as simu-
lationists, ask of the simulation field. These questions
suggest that simulation is a growing and vibrant field,
trying to achieve a cohesive organization of technical
knowledge.


"What is Simulation?"

It is difficult to form a cohesive discipline when there
is no widely accepted taxonomy, even though sig-
nificant efforts have been made (Zeigler 1976; Oren
1987a; Oren 1987b). Let's create a definition for sim-
ulation:









[Figure 1: A taxonomy for simulation. Model Design divides into
conceptual models, declarative models, functional models, constraint
models, spatial models and multimodels. Model Execution divides into
serial algorithms and parallel algorithms. Execution Analysis divides
into input-output analysis, experimental design, surface response
techniques, verification, validation and visualization of data.]


Computer simulation is the discipline of de-
signing a model of an actual or theoretical
physical system, executing the model on a
digital computer, and analyzing the execu-
tion output.

From this basic definition, we derive the three previ-
ously defined divisions: 1) model design, 2) model ex-
ecution and 3) execution analysis and subdefine them
as shown in Fig. 1. A more detailed discussion of this
taxonomy is in (Fishwick 1994).


"Simulation is a tool"

A tool is something that a researcher uses because it
is handy and useful. Simulation researchers should
rejoice in this sentiment because, at the very least,
simulation models, algorithms and software are actu-
ally being used in the real world. No greater compli-
ment could be offered. To the extent that the word
"tool" implies that simulation is not a research area,
one should realize that when one calls an area X a
"tool," this means only that one is not doing research
in area X but, instead, needs X as a resource. One
person's research serves as another person's tool.


"Simulation of what?"

Simulation, like most disciplines, can be generally di-
vided into methodology and applications. Sometimes,
methodology is termed theory. As our field matures
and builds upon the solid structure of systems theory,
we are developing a sound methodology. The impor-
tance of methodology, and not just applications,
cannot be overemphasized. The simulation discipline
has a core of knowledge which is independent of ap-
plications. We divide simulation methodology into
the sub-fields of model design, model execution and
execution analysis. Methodology can apply itself to
all sorts of practical real-world applications, but it is
a substantial field by itself.


"Simulation is the method of last resort"

Methods of analytic (non-time-dependent) modeling
have frequently been used as methods of first resort
because of the expense inherent in the simulation en-
terprise. At one time, two decades ago, electronic
calculators were definitely the computational tool of
last resort. After all, they were bulky, expensive and
hard to find. This is no longer the case, since calcula-
tors can now fit on a person's wrist with ease. More-
over, calculators have become easier to use and far
less costly. Simulation is in a similar situation.
As equipment becomes cheaper and our methods of
programming simulations become more efficient (with
code reuse and object orientation), simulation will be-
come the method of first resort, and frequently, the
only method.


"Simulations are created with a specific pur-
pose in mind"

When one builds a simulation model, one builds it
with the idea of answering a certain set or class of
questions about the physical system being modeled.
This is the traditional way that we build models and
run simulations, but it is in need of an overhaul. The
reason is that, as simulationists, we should be in the
market for designing digital worlds containing digital
objects. A digital object is one that incorporates all
known knowledge about that object so that the ob-
ject appears and reacts to sensory feedback exactly
as the real object would (see note 1). Moreover, a digital object
can be asked questions whose answers would have to
be at varying levels of abstraction. The idea that
we build models to achieve a singular purpose does
support analysis by a single user, but we should be
building models that are robust and can respond to
a wide variety of real-world sensory interactions and
queries from many users. A simulationist's responsi-
bility should be to construct the object methods that
define how various pieces of geometry, comprising the
digital world, react to user intervention. This kind of
environment will be based on distributed simulation
(within the Internet) which will foster code and model
reuse which, in turn, will serve as a basis for digital
world construction.


MODEL ABSTRACTION: MULTIMODELS

Discussion

Modeling complex systems requires a "model of mod-
els," or a multimodel. A multimodel is a network

Note 1: Digital object behaviors may be different than real time
(faster or slower).









or hierarchy of models where each model represents
the physical system at a given level of abstraction
or granularity. Homomorphic relations formally link
each level together, allowing one to traverse levels of
abstraction. The need for this kind of model was first
brought out in combined simulation efforts. Com-
bined simulation focused specifically on blending dis-
crete event methods with continuous methods, and
multimodeling provides an object-oriented method-
ology to extend this integration so that many addi-
tional model types can be integrated. A multimodel
is capable of answering a wide variety of queries and
responding to a number of sensory cues (or inputs).
In this sense, the multimodel provides the technology
for making a digital object, which contains all geo-
metric and dynamic modeling information associated
with that object. Also, the multimodel is computa-
tionally more attractive than the single level model
because the analyst may weave through the abstrac-
tion network while focusing the computation of dy-
namics only in those areas that require additional
computation. An example of this focusing can be
seen in cockpit simulators that use a large projection
screen on which the pilot focuses during a simulated
flight. The periphery of the screen, away from the
pilot's foveal vision, uses coarse computer graph-
ics rendering techniques since the peripheral vision is
less acute and in need of less graphical detail. For the
same reason that we modify rendering complexity, we
can also manipulate the complexity of the dynamic
model used in peripheral vision, thereby reducing the
complexity of the simulation.
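
As a rough illustration of the multimodel idea, the Python sketch below keeps a coarse two-state model and a fine differential-equation model of the same component side by side, with a homomorphic mapping from the fine state space onto the coarse one. It is a sketch under assumed, hypothetical component names and parameters, not the formalism of the cited references.

class CoarseBoiler:
    """High abstraction: lumped states 'cold' / 'hot' driven by heating time."""
    def __init__(self):
        self.state, self.heated = "cold", 0.0
    def step(self, dt, heater_on):
        if heater_on:
            self.heated += dt
        if self.heated > 5.0:            # hypothetical delay before the coarse state flips
            self.state = "hot"
        return self.state

class FineBoiler:
    """Low abstraction: temperature follows a first-order ODE (hypothetical constants)."""
    def __init__(self, temp=20.0):
        self.temp = temp
    def step(self, dt, heater_on):
        dTdt = (8.0 if heater_on else 0.0) - 0.1 * (self.temp - 20.0)
        self.temp += dTdt * dt           # explicit Euler integration
        return self.temp

def homomorphism(temp, threshold=60.0):
    """Map a fine-level temperature onto the coarse state space."""
    return "hot" if temp >= threshold else "cold"

coarse, fine = CoarseBoiler(), FineBoiler()
for _ in range(100):                     # ten simulated seconds at dt = 0.1
    coarse.step(0.1, heater_on=True)
    fine.step(0.1, heater_on=True)
# The two levels should agree once the fine state is mapped upward.
print(coarse.state, homomorphism(fine.temp))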


Issues

* Is there a need to simulate levels independently?
In most cases, given a hierarchy of models, it will
be sufficient to execute the lowest level model
while allowing reporting (model output) at all
levels. A model, in the hierarchy, can be "cut
out" of the multimodel but then we must deal
with internal events that serve as input to the
model. Normally, these inputs come from the
next lower abstraction level.


* If we have the lower level model, why do we need
the higher level models? Since models represent
a human language for exchanging information
about dynamical systems, removing the higher
level models also removes this more abstract sys-
tem knowledge. The abstract knowledge serves
as an important repository when we wish to rea-
son about system behavior.


Future

To make models more robust, we need to have models
containing more than one level of abstraction. Such
a model may be more complex, but it can answer a
larger class of questions than a single-layer model. We
do not always want to see the lowest level of detail for
all parts of a process. Moreover, we want to be able
to tell the simulation what parts of our interest to us
by tuning in on those parts of the multimodel. The
original term combined simulation should be replaced
by the more general multimodel concept which fosters
an integration of basic model types shown in Fig. 1.


References

Cellier, F. E. 1979. Combined Continuous Sys-
tem Simulation by Use of Digital Computers:
Techniques and Tools. Ph.D. Thesis, Swiss Fed-
eral Institute of Technology Zurich.

Fishwick, P. A. 1994. Simulation Model Design
and Execution: Building Digital Worlds. Pren-
tice Hall.

Fishwick, P. A. and Zeigler, B. P. 1992. A Mul-
timodel Methodology for Qualitative Model En-
gineering. ACM Transactions on Modeling and
Computer Simulation, 2(1):52-81.

Fishwick, P. A. 1993. A Simulation Environment
for Multimodeling. Discrete Event Dynamic Sys-
tems: Theory and Applications, 3:151-171.

Mattsson, S. E. and Andersson, M. 1993.
Omola-An Object-Oriented Modeling Lan-
guage. In Jamshidi, M. and Herget, C. J., ed-
itors, Recent Advances in Computer-Aided Con-
trol Systems Engineering, volume 9 of Studies in
Automation and Control, pages 291-310. Else-
vier Science Publishers.

Praehofer, H. 1991. Systems Theoretic For-
malisms for Combined Discrete-Continuous Sys-
tem Simulation. International Journal of Gen-
eral Systems, 19(3):219-240.

Praehofer, H., Auernig, F. and Reisinger, G.
1993. An Environment for DEVS-Based Multi-
formalism Simulation in Common Lisp/CLOSS.
Discrete Event Dynamic Systems: Theory and
Applications, 3:119-149.









ARTIFICIAL INTELLIGENCE (AI)

Discussion

There are two aspects of artificial intelligence which
are particularly important to simulation research:
1) the use of natural language and qualitative knowl-
edge, and 2) the encoding of the decision making
process. Humans speak and write in natural lan-
guage; however, there must be a translation pro-
cess if this knowledge is to be useful to simulation.
Most simulations of natural or artificial systems are
based on quantitative methods. In many instances,
especially in areas such as social science or medicine,
model components (parameters, state variables, input
and output) are defined in natural language or in an-
other qualitative representation. There needs to be a
way to map quality to quantity. Fuzzy set theory is
a well-formed discipline for mapping quality to quan-
tity.
While AI methods can be used within a particular
model design, they are even more useful in modeling
the decision making process that envelopes the sim-
ulation process. Simulations are often used in deci-
sion making, and expert systems can be used to guide
which simulations are to be executed, and what pa-
rameters are to be chosen. The major lesson that
we can learn from AI is that all knowledge about a
system should be encoded, not only that particu-
lar knowledge which is quantitative or amenable to
analysis. Expert systems provide a good illustration
of codifying meta-knowledge about a system, and not
only low-level aspects of the system.
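
As a toy illustration of using rules to encode the decision-making knowledge that surrounds a simulation study, a few condition-action pairs can recommend a model type and parameters. The rules, facts and function name below are hypothetical, not drawn from the paper or from any expert-system shell.

# Minimal sketch: a rule base encoding meta-knowledge about which kind of
# model to run, given qualitative facts about the study.
def choose_model(facts):
    """Each rule: (condition over facts) -> recommendation."""
    rules = [
        (lambda f: f.get("data_available") == "scarce",
         "use a qualitative/fuzzy behavioral model"),
        (lambda f: f.get("state_knowledge") == "complete" and f.get("time") == "continuous",
         "use a differential-equation model"),
        (lambda f: f.get("entities") == "discrete" and f.get("queues"),
         "use a discrete-event queuing model"),
    ]
    fired = [advice for cond, advice in rules if cond(facts)]
    return fired or ["no rule fired; consult an analyst"]

print(choose_model({"data_available": "scarce", "entities": "discrete", "queues": True}))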


Issues

* Psychology or engineering? The first thing that
must be done is to decide whether the goal of
the knowledge-based simulation is to validate
common sense human thinking about a system,
or to validate a physical system whose model
was created via a more compiled knowledge ap-
proach to human thinking. These are two dis-
tinct goals. It is straightforward to create a
human-like model of a four stroke gasoline en-
gine which is physically inferior to the quantita-
tive model even though the model may represent
how a particular human thinks about the engine.
Only a sturdy experimental method (routinely
performed in psychology) can validate a model
of human thinking. On the other hand, if the model
is to model a physical system, then it must be
compared with and contrasted against existing
physical system models for a domain.


* Rules or mathematical models? When apply-
ing AI technology to simulation, at what level
is AI most appropriate? Although AI methods
can be used to create system models, their primary
contribution is at the higher level of organizing
the knowledge that makes up the assumptions
in modeling, and encoding the decision making
process (often using rules) used to control simu-
lation runs. For every system, we must question
whether it is appropriate or relevant to use rules,
equations or graph-based models. An arbitrary
choice of modeling technique can be problem-
atic. Often, the answer to the question of level
is that we should strive to manufacture multi-
models, thereby achieving the benefits of each
level with its own granularity and definitions for
mapping.

Future

We must connect common sense knowledge to the
more compiled, detailed knowledge available for dy-
namical systems. Having common sense models of
the world that are disconnected from existing, more
detailed models is counter-productive. We need to de-
velop ways of building qualitative models which are
demonstrated to be valid based on what we know
about a system. Ambiguous data should be repre-
sented in the greatest degree of detail possible. Good
first steps to defining a variable's value are to use
fuzzy sets and probability distributions (if a sufficient
sample is readily available). Expert systems should
be built to guide simulationists as to what kind of
model to use in a particular circumstance. Right
now, we have very few guiding tools in engineering
our models.

References

Cellier, F. E. 1991. Continuous System Model-
ing. Springer Verlag.

Elzas, M. S., Oren, T. I. and Zeigler, B. P.
1989. Modelling and Simulation Methodology:
Knowledge Systems' Paradigms. North-Holland.

Fishwick, P. A. and Modjeski, R. B., editors.
1991. Knowledge Based Simulation: Methodol-
ogy and Application. Springer Verlag.

Fishwick, P. A. and Luker, P. A., editors. 1991.
Qualitative Simulation Modeling and Analysis.
Springer Verlag.

Fishwick, P. A. and Zeigler, B. P. 1991. Quali-
tative Physics: Towards the Automation of Sys-
tems Problem Solving. Journal of Theoretical









and Experimental Artificial Intelligence, 3: 219-
246.

Widman, L. E., Loparo, K. A., and Nielsen,
N. R. 1989. Artificial Intelligence, Simulation
and Modeling. John Wiley.

OBJECT ORIENTED (OO) SIMULATION

Discussion

On one hand, we have the real world which is full of
objects and interactions, and on the other, we have a
computer program. A central goal in computer simu-
lation is to map one to the other. The most straight-
forward way of doing this is to create abstract objects
in the programming language, where these objects
map directly to real world objects. This approach
was first developed in the Simula language and has
gained much greater momentum over the past five
years. One reason for the lag in OO-based design is
that no good visual analog existed for representing
class hierarchies, objects and object interaction. The
past five years have produced good visual OO tech-
niques, mostly from the software engineering commu-
nity. Also, these OO visual representations have only
recently become exploitable using window-oriented
user interface construction kits.


Issues


* Processes or objects? Should we really be focus-
ing on objects, or should we think in terms of
processes and activities? The two concepts are
not mutually exclusive; it is natural to create
declarative state transition models as an object's
behavior; however, the object-based way of looking
at the world seems most natural. Declarative
model components may be defined by refinement
into functional models, or vice versa. When look-
ing out of a window, we see tree objects with
branch sub-objects swaying in the breeze. We
usually do not first see the lumped state called
"swaying branches." Instead, the concept of
swaying is located within an object as one of its
methods.

* Simulation in software engineering. Many of the
examples given in recent OO-based software en-
gineering texts appear to be simulations. There-
fore, there is an intense cross-fertilization occur-
ring in this extension area. Programmers are
finding that it is easier if we create real world
metaphors for programming tasks, and then con-
struct programs using these metaphors. For ex-
ample, instead of writing a program to sort n


numbers in an abstract manner, let each number
represent a physical file folder in a filing cabi-
net. Then create a simulation that allows the
programmer to create cabinet, drawer and folder
objects while specifying the sorting procedure as
a method available within the drawer object. As
a result, the task of programming becomes less
abstract and more attuned to real-world objects.
Since simulation is founded on the study of real
world objects undergoing change, a natural con-
fluence now exists between OO-based design and
simulation model design.
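
A minimal Python sketch of the filing-cabinet metaphor just described follows; the class and method names are illustrative assumptions. The point is simply that the sorting procedure becomes a method of the drawer that holds the folder objects, rather than an abstract routine over numbers.

class Folder:
    def __init__(self, label):
        self.label = label

class Drawer:
    def __init__(self):
        self.folders = []
    def file(self, folder):
        self.folders.append(folder)
    def sort(self):
        # The "abstract" sort now lives where a clerk would expect it: in the drawer.
        self.folders.sort(key=lambda f: f.label)

class Cabinet:
    def __init__(self, n_drawers=1):
        self.drawers = [Drawer() for _ in range(n_drawers)]

cabinet = Cabinet()
for n in [42, 7, 19, 3]:
    cabinet.drawers[0].file(Folder(n))
cabinet.drawers[0].sort()
print([f.label for f in cabinet.drawers[0].folders])   # [3, 7, 19, 42]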


Future

We code simulations on a computer using program-
ming languages of some sort. It is natural to want
our programming devices to map clearly to the phys-
ical world devices, and so object orientation has many
advantages. With the multimodel extension to OO,
objects can have several abstraction levels. Concepts
from distributed simulation provide us with digital
objects, which are located where their counterpart
physical objects are located. New worlds are created
by picking the objects that are needed from wherever
they are located on the Internet. Objects are then
glued together using network messages.


References

Birtwistle, G. M. 1979. Discrete Event Modelling
on SIMULA. Macmillan.

Booch, G. 1991. Object Oriented Design. Ben-
jamin Cummings.

Ege, R. K., editor. 1991. Object-Oriented Sim-
ulation 1991. Society for Computer Simulation.
Simulation Series, Volume 23, Number 3.

Fishwick, P. A. 1994. Simulation Model Design
and Execution: Building Digital Worlds. Pren-
tice Hall.

Rumbaugh, J., Blaha, M., Premerlani, W., Fred-
erick, E., and Lorenson, W. 1991. Object-
Oriented Modeling and Design. Prentice Hall.

Zeigler, B. P. 1990. Object Oriented Simulation
with Hierarchical, Modular Models: Intelligent
Agents and Endomorphic Systems. Academic
Press.









NEURAL NETWORKS (NN)

Discussion

Two approaches have been used for applying neural
network research to simulation: 1) the use of a neu-
ral network as a behavioral model to map a system's
input to its output regardless of the nature of the sys-
tem, or 2) the use of the network as a model of brain
activity and human behavior. The first approach in-
volves neural networks as repositories of behavior for
any system whereas the second approach presupposes
that the system under question is an actual brain
whose model is to be validated against empirical data
that is obtained through experiments with a human
subject.
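
The sketch below illustrates the first approach under simple assumptions: a small one-hidden-layer network is trained to store the input-output behavior of a known system, here y = sin(x). Layer size, learning rate and iteration count are arbitrary choices for illustration, not recommendations.

# Minimal sketch: a neural network as a repository of input-output behavior.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-np.pi, np.pi, size=(200, 1))
y = np.sin(x)                                  # behavior pairs from the "real" system

W1 = rng.normal(0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05

for epoch in range(2000):
    h = np.tanh(x @ W1 + b1)                   # hidden layer
    yhat = h @ W2 + b2                         # linear output
    err = yhat - y
    # Backpropagation of the mean-squared error.
    dW2 = h.T @ err / len(x);  db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)
    dW1 = x.T @ dh / len(x);   db1 = dh.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2; W1 -= lr * dW1; b1 -= lr * db1

print("MSE after training:",
      float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2)))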

Issues

* If we know the model, do we need the NN? Con-
sider an equational model of a system, such as
the heat or wave equation. We could also cap-
ture the essence of the wave equation, without
keeping the equational model, by training a neu-
ral network to store input-output (i.e., behavior)
pairs, but do we want to do that if the model
already does this more economically? First, the
issue of complexity must be addressed: which
modeling method is faster? (See note 2.) More importantly,
neural networks may be useful to control systems
whose internal state-based model is difficult to
discern or obtain. Therefore, if we have an in-
complete level of knowledge about a system, the
NN approach becomes more appealing.

* What can humans understand from an NN? A
chief criticism of NNs, which really applies to
all behavioral models, is that humans do not
gain insight into the way in which NNs perform
their internal function. That is, because of a
lack of states and events (which are understand-
able to humans through potential state/event
mappings to natural language), humans are left
in the dark. Adequate system control may be
achieved but to what end? Can (or should)
we create systems that we cannot understand?
Some recent work attempts to aggregate sym-
bolic knowledge from NNs so that humans can
gain insights into NN operation. A reasonable
solution is to use NNs as first-cut models before
a state-space model has been formulated.

* What about other behavioral models? The pro-
cess of using a generic model formulation and
(Note 2: The issue of speed applies both to the time taken to design
the model and the time taken to execute the model.)


fitting values for parameters is known as system
identification. We should also ask, then, whether
we could use nonlinear or linear regression to
store input-output system behavior? Also, to
what extent do the NN parameters (such as hid-
den layer makeup, biases and starting weights)
need to be tuned to make the NN work while
minimizing the error?


Future

Neural networks are good first cut models for systems
for which we are lacking information, especially in the
relationship among state variables. More work should
be done to link neural network behavioral models to
state space models as they are developed. The cur-
rent problem is that NN models for systems are not
related to other existing models for the same systems.
We need to tie them together.


References

Cellier, F. E. 1991. Continuous System Model-
ing. Springer Verlag.

Fishwick, P. A. 1989. Neural Network Models
in Simulation: A Comparison with Traditional
Modeling Approaches. In 1989 Winter Simu-
lation Conference, Washington, DC, pp. 702-
710.

Kosko, B. 1992. Neural Networks and Fuzzy Sys-
tems. Prentice Hall.


FUZZY LOGIC AND ARITHMETIC

Discussion

Fuzzy logic is similar to neural networks in that one
can create behavioral systems with both methodolo-
gies. A good example is the use of fuzzy logic for au-
tomatic control: a set of rules or a table is constructed
that specifies how an effect is to be achieved provided
an input and the current system state. The idea of
fuzzy logic is to approximate human decision making
using natural language terms instead of quantitative
terms. While fuzzy logic creates a behavioral simu-
lation model, fuzzy arithmetic can be blended with
classical state-based models. Using fuzzy arithmetic,
one uses a model and makes a subset of the system
components fuzzy so that fuzzy arithmetic must be
used when executing the model.
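
A minimal sketch of this fuzzy-arithmetic idea follows; the TriFuzzy class and the queuing example are hypothetical. A model parameter becomes a triangular fuzzy number (low, peak, high), and arithmetic during model execution propagates the fuzziness using the standard rules for addition and positive scaling of triangular numbers.

class TriFuzzy:
    """A triangular fuzzy number given by (low, peak, high)."""
    def __init__(self, low, peak, high):
        self.low, self.peak, self.high = low, peak, high
    def __add__(self, other):
        return TriFuzzy(self.low + other.low, self.peak + other.peak,
                        self.high + other.high)
    def scale(self, k):                  # k assumed positive
        return TriFuzzy(k * self.low, k * self.peak, k * self.high)
    def __repr__(self):
        return f"({self.low}, {self.peak}, {self.high})"

# Fuzzy service time plus fuzzy queue delay in a simple queuing component.
service = TriFuzzy(2.0, 3.0, 4.5)
delay = TriFuzzy(0.5, 1.0, 2.0)
print("fuzzy time in system:", service + delay)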









Issues

* When should we use fuzzy sets? Fuzzy logic and
sets have been at the center of many a debate.
Usually, the debate rages about the question of
whether probability theory can replace fuzzy set
theory. For simulation, we should be concerned
about whether there exists any statistical data
for a given variable. If data exist in sufficient
quantity, there is less of a need for using fuzzy
sets, but fuzzy sets may be useful for assigning
qualitative values to variables that are less well
defined.

* Does industrial implementation breed research
acceptance? Fuzzy logic controllers are appear-
ing everywhere from cameras to washing ma-
chines. Is this the true test of fuzzy sets, that
they have proven themselves in the form of a con-
sumer product, and therefore have a solid foun-
dation? If, by using NNs, a product can be made
more efficient, then, yes, industrial implementa-
tion does breed acceptance of fuzzy controllers
(or NN controllers for that matter).

Future

Fuzzy sets will be used for the same reason as NNs:
as behavioral models of a system which are easy to
create, without having to perform a more complicated
system identification procedure.


References


Fishwick, P. A. 1991a. Extracting Rules from
Fuzzy Simulation. Expert Systems with Applica-
tions, 3(3):317-327.

Fishwick, P. A. 1991b. Fuzzy Simulation: Speci-
fying and Identifying Qualitative Models. Inter-
national Journal of General Systems, 19(3):295-
316.

Klir, G. J. and Folger, T. A. 1988. Fuzzy Sets,
Uncertainty and Information. Prentice Hall.

Kosko, B. 1992. Neural Networks and Fuzzy Sys-
tems. Prentice Hall.

COMPLEX SYSTEMS AND ARTIFICIAL
LIFE (AL)

Discussion

Our first topic of complexity is chaos, or the study
of nonlinear dynamics. We learn from this discipline
that models with simple structure can often lead to


chaotic behavior when simulated. Simulation is the
only real way of studying these systems. Some ana-
lytic approaches may be used to obtain rough quali-
tative features of the chaotic attractor and its compo-
nent basins and separatrices; however, by simulating
these models, we are able to, with precision, numer-
ically determine the basins of attraction and other
qualitative features of interest. Analysis breaks down
and simulation is the only viable tool remaining.
Our second topic relates to systems composed of
very large numbers of homogeneous particles, bodies
or cells. Sometimes the cells can be different, but
most often they have a similar structure. Examples
of these kinds of systems are: 1) cellular automata,
2) Ising models, 3) Boolean networks, 4) percolation
lattices and 5) N Body models. Again, with these sys-
tems, simulation is the only viable method for study-
ing these systems. We can achieve qualitative insight
through iterative quantitative means. The field of ar-
tificial life has sprung up as a branch of theoretical
biology. This field represents a bottom-up investi-
gation of complexity using the computer as a kind
of scientific laboratory. This bottom-up approach to
understanding nature is not only found in artificial
life, but also in the other complex system model types
such as lattice gases for fluid dynamics modeling. For
computational fluid dynamics (CFD), we can use the
top-down approach of starting with the Navier-Stokes
equations, or we can approach the problem by start-
ing with elementary conservation laws expressed in
cellular automaton rules.
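
As a concrete example of the bottom-up, cell-based model types just mentioned, the sketch below runs an elementary one-dimensional cellular automaton; Rule 30 is chosen only to show simple local rules producing complex global behavior, and the grid size and seed are arbitrary.

def step(cells, rule=30):
    """Advance an elementary CA one generation on a ring of cells."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        idx = (left << 2) | (center << 1) | right      # neighborhood as 0..7
        out.append((rule >> idx) & 1)                  # look up bit in rule number
    return out

cells = [0] * 31
cells[15] = 1                                          # single seed cell
for _ in range(15):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)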

Issues

* Is Artificial Life a science? The AL area could
be criticized because of its bottom-up approach
to understanding the nature of life, as opposed to
the more traditional scientific approach of run-
ning experiments on real life forms. On the other
hand, AL uses simulation as "computational sci-
ence" by using the computer as a laboratory tool.
We cannot understand complex systems without
simulating them, just as we cannot understand
nature without operating upon it. The models have
a life of their own and are no less complex be-
cause they were created artificially. One way of
viewing the work in AL is to make a compari-
son with physics. The relationship between AL
simulation and the science of biology is like the
relationship between theoretical physics and ex-
perimental physics. Theoretical and experimen-
tal work complement each other.

* Simulation as the ultimate laboratory tool. The
falling prices of hardware and the increasing









costs of performing experiments with real hu-
mans and objects cause many experimental
methods to become prohibitive. We must en-
sure that we are not deviating too much from
reality and traditional experiments, but we must
also embrace a new way of doing science through
simulation.


Future

Theoretical studies of complexity, including AL, will
continue to grow since simulation has become more
effective due to technological advances in fast com-
puter architectures. Theory should be carefully bal-
anced with experiment. We must be wary of simula-
tions, though, that demonstrate an artificial system
for which valid simulation models exist. For exam-
ple, we could build a spatial model of generic insects
with insect behaviors without doing a validation. But
what does this demonstrate? If the insect model has
not been shown to be physically valid, it should be
shown to have demonstrated an important theoreti-
cal contribution, such as replication or the generation
of qualitatively distinct spatial patterns, for instance.
The purely theoretical AL systems can still be useful
but we need to temper our enthusiasm with attempts
at some sort of validation where possible.

References

Langton, C. G., Taylor, C., Farmer, J. D., and
Rasmussen, S., editors. 1992. Artificial Life II.
Addison Wesley. (See note 3.)

Serra, R. and Zanarini, G. 1990. Complex Sys-
tems and Cognitive Processes. Springer Verlag.

Weisbuch, G. 1991. Complex Systems Dynamics.
Addison Wesley.

Wolfram, S. 1986. Theory and Applications of
Cellular Automata. World Scientific Publishing.


PARALLEL AND DISTRIBUTED COMPUTING

Discussion

Simulation usually places a substantial load on the
computer on which the model is executed. To speed
up simulation runs, we can parallelize the model. The
vast majority of models to be parallelized are spatial:
1) lattice oriented automata or 2) partial differential

(Note 3: There have been three conferences on Artificial Life to
date.)

equations. This is because it is relatively straightfor-
ward to parallelize over the domain (i.e., the 2D or 3D
space). Functional models can also be parallelized by
using conservative or optimistic approaches. Conser-
vative methods ensure that causality relations among
logical processes (functions) are not violated, whereas
the optimistic approach assumes that messages arriv-
ing first have lower time stamps. In the event that
causality is violated, the logical process states must
be rewound or rolled back.
Aside from the speedup advantage of applying par-
allel computing technology to simulation, simulation
models can also be distributed over a wide area net-
work. One consequence of distributed models is
speedup as for the parallel case; however, distributed
models are usually associated with real-time interac-
tive simulations such as those used for training pur-
poses and combat simulation. Distributed Interac-
tive Simulation (DIS) is a substantial research project
sponsored by the US Department of Defense to allow
heterogeneous simulators to communicate on a vir-
tual battlefield.
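
The sketch below is a much-simplified illustration of the optimistic idea only; anti-messages, re-execution of rolled-back events and global virtual time are all omitted, and the class name is hypothetical. A logical process saves state checkpoints as it advances and rolls back when a straggler message arrives with a lower time stamp.

class LogicalProcess:
    def __init__(self):
        self.clock, self.state = 0.0, 0
        self.saved = [(0.0, 0)]              # (time, state) checkpoints

    def handle(self, timestamp, value):
        if timestamp < self.clock:           # causality violated: roll back
            while self.saved and self.saved[-1][0] > timestamp:
                self.saved.pop()             # discard checkpoints past the straggler
            self.clock, self.state = self.saved[-1]
            print(f"rollback to t={self.clock}")
        # Process (or re-process) optimistically; a full Time Warp would also
        # re-execute the events that were undone by the rollback.
        self.clock = timestamp
        self.state += value
        self.saved.append((self.clock, self.state))

lp = LogicalProcess()
for t, v in [(1.0, 5), (3.0, 2), (2.0, 7)]:   # the last message is a straggler
    lp.handle(t, v)
print("clock:", lp.clock, "state:", lp.state)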


Issues

* Network performance in DIS. There are several
problems with running simulations over a wide
area network. The key problem is to reduce the
network load given a fixed bandwidth. The en-
tity state packet is the element that contributes
the most to the load problem, so dead reckoning
is used to minimize the number of entity state
protocol data units (PDUs) that must be issued
whenever an entity moves or changes its orienta-
tion (a brief sketch of dead reckoning follows this
list).

* Where is everything stored? In DIS research, it
is not clear where to store all of the simulation
information such as the terrain and vehicles. An
entity has its own state information, but with
dead-reckoning, the entity also has the state in-
formation (at some level of detail) of a selec-
tion of other entities within some radius. Should
every entity have a complete map of the ter-
rain? Should there be central simulation servers
or should a strict distributed approach be man-
dated?

* Extending DIS for all simulation. The work in
DIS suggests that we build a standard for com-
munication among all distributed simulations,
and not only those used for combat simulation.
The development of new public domain DIS tools
will be a thriving research area for the future.
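
A minimal sketch of dead reckoning, as mentioned in the first issue above: a remote entity is extrapolated from its last reported state, and a new entity-state PDU is warranted only when the true position drifts too far from the extrapolation. The threshold value and the straight-line motion model are illustrative assumptions, not part of the DIS standard as given here.

def extrapolate(last_pos, last_vel, dt):
    """Straight-line dead-reckoning guess from the last reported state."""
    return tuple(p + v * dt for p, v in zip(last_pos, last_vel))

def needs_update(true_pos, last_pos, last_vel, dt, threshold=1.0):
    """Issue a new entity-state PDU only when the extrapolation error is too large."""
    guess = extrapolate(last_pos, last_vel, dt)
    error = sum((a - b) ** 2 for a, b in zip(true_pos, guess)) ** 0.5
    return error > threshold

# A vehicle reported at (0, 0) moving at (10, 0) m/s; two seconds later it has
# actually turned slightly, so the extrapolation error exceeds the threshold.
print(needs_update(true_pos=(19.0, 3.0), last_pos=(0.0, 0.0),
                   last_vel=(10.0, 0.0), dt=2.0))      # True -> send an update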









Future

Distributed Simulation will certainly speed up our
simulation runs; however, it is also likely to change
the way we think of simulation models, and the com-
position of such models. We need to start thinking
of ourselves as digital world builders instead of work-
ers building isolated models useful to a small number
of people. Many parts of the digital world (methods
and geometry) will be reused using distributed simu-
lation on the Internet. Object geometry and methods
will be physically located where their physical object
counterparts are located. For instance, if I want to
build a digital world that includes an automobile traf-
fic network, I will use automobile objects which are
located online within the automobile manufacturer's
object database. It makes little sense to reinvent ob-
ject geometries and methods for every simulation. If
your digital world contains lathes, then that link in
your distributed simulation will point to digital lathe
objects stored in the database of the company that
makes lathes.


References

Fox, G., Johnson, M., Lyzenga G., Otto, S.,
Salmon, J., and Walker, D. 1988. Solving Prob-
lems on Concurrent Processors: Volume 1, Gen-
eral Techniques and Regular Problems. Prentice
Hall.

Fujimoto, R. M. 1990. Parallel Discrete Event
Simulation. Communications of the ACM,
33(10):31-53.

Loper, M. and Seidensticker, S. 1993. The DIS
Vision: A Map to the Future of Distributed Sim-
ulation. Technical report, Institute for Simula-
tion and Training, Orlando, FL.

McDonald, B. 1992. Distributed Interactive
Simulation: Operational Concept. Technical re-
port, Institute for Simulation and Training, Or-
lando, FL.

Nicol, D. and Fujimoto, R. 1993. Parallel Sim-
ulation Today. Technical report, College of
William and Mary. (to be published in Annals
of Operation Research).


COMPUTER GRAPHICS

Discussion

Simulationists regard validation to be of critical con-
cern for trusting a mathematical model of a system.


This concern is well founded since some model struc-
tures or animations may "look good," while not be-
ing true to the physical phenomenon being modeled.
Still, the use of graphics and immersive interfaces
is of major importance to simulation since it brings
more people into the field. Techniques and tools
in computer graphics such as new rendering meth-
ods endow simulations with the ability to commu-
nicate complex behavior in terms that are easy for
humans to understand (visual communication). The
area of physically-based modeling is of particular in-
terest to simulationists. Simulationists have always
used physically-based models; however, graphics re-
searchers needing to perform animations have often
relied upon the multi-track, keyframe approach which
is simpler and faster. With the onset of faster ma-
chines, graphics researchers are moving to the phys-
ically based approach since it yields more realistic
animations. The goals of computer graphics and sim-
ulation have traditionally been different: the simula-
tionist is after valid models and the graphics person is
after entertaining animations. However, the lines are
not so clearly drawn any more. Since simulations are
employing graphical rendering methods, and anima-
tors are using physically-based modeling, there is a
convergence of interest. The movement of physically-
based modeling can be viewed, in the larger sense, as
a movement to do system modeling. Keyframe ani-
mations are also models, albeit simpler discrete state
or event oriented ones.
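
To make the contrast concrete, the sketch below places a keyframed trajectory (interpolation between stored poses) next to a physically based one (numerical integration of an equation of motion). The falling-ball example, its keyframes and constants are illustrative only.

def keyframed_height(t, keyframes):
    """Linear interpolation between (time, height) keyframes."""
    for (t0, h0), (t1, h1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            return h0 + (h1 - h0) * (t - t0) / (t1 - t0)
    return keyframes[-1][1]

def physical_height(t, h0=10.0, g=9.81, dt=0.01):
    """Explicit Euler integration of free fall until time t (or the ground)."""
    h, v, clock = h0, 0.0, 0.0
    while clock < t and h > 0.0:
        v -= g * dt
        h += v * dt
        clock += dt
    return max(h, 0.0)

keys = [(0.0, 10.0), (1.0, 5.0), (1.43, 0.0)]     # an animator's guess at the motion
print("keyframe:", round(keyframed_height(0.7, keys), 2),
      "physics:", round(physical_height(0.7), 2))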


Issues

* Can simulationists use animation techniques?
Computer graphics is moving in the direction of
using more system oriented models, but can sim-
ulationists use keyframing methods (see note 4)?
At first, this may seem contrary to the aim of
simulation: validation. Keyframe models can be
valid; however, they are system models defined at
a high level of abstraction. Looked at from this
perspective, spline-based keyframing techniques
can be used within multimodel simulations in
those instances where speed is more important
than visual or statistical accuracy. Validation
and accuracy need not be system wide as the
analyst may be focusing on only a small part of
the multimodel.

* Icons or Rendered Scenes? Many simulations
will use iconic displays instead of fully rendered
3D scenes. There are two reasons for this: 1) 3D
rendered scenes are computationally expensive,

(Note 4: A keyframe is defined as an event in systems terminology.)









and 2) icons may express the necessary informa-
tion content that is not contained in the scenery.
The ideal situation is one where the computer
doing the simulation is powerful enough to fully
render a 3D geometric model of the system, while
also having the capability to display other sorts
of information (numeric, iconic) within the 3D
context.

Future

Even though many simulation outputs are currently
iconic, all future simulations will be based on 3D ge-
ometry and advanced graphics. You will start with
the geometry and assign methods and outputs to the
geometrical objects.


References


Badler, N. I., Barsky, B. A., and Zeltzer, D. ed-
itors, 1991. Making Them Move: Mechanics,
Control and Animation of Articulated Figures.
Morgan Kaufmann.

Badler, N. I., Phillips, Cary B. and Webber,
Bonnie L. 1993. Simulating Humans: Computer
Graphics, Animation and Control. Oxford Uni-
versity Press.

Barzel, R. 1992. Physically-Based Modeling for
Computer Graphics. Academic Press.

Thalmann, D. editor, 1990. Scientific Visualiza-
tion and Graphics Simulation. John Wiley.

VIRTUAL REALITY (VR)

Discussion

In the same way that computer graphics is reducing
the man-machine communication bottleneck for our
model designs and their executions, immersive inter-
face technology will impact the way that we phys-
ically interact with the computer for model design
and execution analysis. Whereas computer graphics
focuses on a particular aspect of man-machine com-
munication (i.e., visual feedback), virtual reality ex-
plores the way in which man and machine can be more
harmoniously coupled so that users of the computer
feel as if they are immersed in the digital environment
rather than being separate from it. We need to inves-
tigate how our discipline will change when modeling,
analysis and execution are performed with immersive
interfaces. Let us take each of these three simulation
topics in turn. Object-oriented models, since they
map to physical phenomena, will mesh nicely with


the geometrical objects present in the system. There-
fore, the modeler will build the model to blend with
the geometry. Analysis will not involve simply a ta-
ble of statistics. Instead, the analyst will "touch" an
object and be presented with several ways of obtain-
ing analytical results. By pointing to or touching a
server object, with an interest in throughput, the
object's color or transparency may change. If numer-
ical results are desired, these statistics can appear
as being physically attached to the server. Finally,
model execution will involve the analyst being part
of a dynamically changing digital world. The analyst
can observe the world from a distance or become part
of it, becoming one of the objects that is undergo-
ing transition.


Issues


* The model behind the interface. VR has received
substantial coverage in the popular press and in
new research-oriented publications. VR is of-
ten viewed as consisting of the hardware man-
machine interface issue. This definition is far too
limiting, however, and does not reflect the break-
down of research areas required to make VR
work. Simulation is needed to drive and respond
to the interface. The technology of VR and the
science of man-machine communication will re-
quire more complex simulation models that re-
spond to a variety of sensory cues.


* Price/performance and resolution. Unfortu-
nately, all but the most expensive immersive in-
terfaces lack suitable performance. Even though
equipment cost is not a technical issue, it affects
how much effort simulationists can expend in
linking models to humans through more effective
interfaces. The resolution of a device is critical
if it is to be linked to a simulation. Two major
technical hurdles are lengthy tracking delays and
coarse resolution in helmet-mounted displays.


Future

VR represents the future of simulation; however, VR
still has too much of a buzzword status. To be suc-
cessful in VR, we will need improvements in several
basic research disciplines including simulation, man-
machine communication and computer graphics. We
must be careful not to let VR become associated solely
with man-machine interaction. It represents a much
larger movement.









References

Bass, L., Gornostaev, J. and Unger, C. edi-
tors, 1993. Human-Computer Interaction, Lec-
ture Notes in Computer Science 753, Springer
Verlag.

CHI '92. 1992. Human Factors in Computing
Systems, ACM/SIGCHI Proceedings, May 3-7.

Grechenig, T. and Tscheligi, M. editors, 1993.
Human Computer Interaction, Lecture Notes in
Computer Science 733, Springer Verlag.

INFORMATION ACCESS

Discussion

Hypermedia is the marriage of hypertext and mul-
timedia, where multimedia includes documents that
contain text, images, audio and video. Simulation
will play a major role in hypermedia research be-
cause many of the existing links that reference video
and audio files can be more generally defined as pro-
grams which produce audio and video as output given
user input. Consider opening up a document that
describes a new automobile. After reading some tex-
tual material on the automobile, the user clicks on
a picture of an automobile and is presented with a
menu with the following choices: 1) view automo-
bile; 2) drive automobile; 3) see performance statis-
tics. Item 1 would result in a video of the automobile
on the outside and inside depending on where the
user directed the viewing using a data glove or other
input device. Item 2, however, would result in a sim-
ulation which would immerse the human in a realistic
driving experience. The key point is that simulation
is a natural part of information access and not just the
generator of information. To weave simulation pro-
cedures into multimedia documents will require the
ability of hypermedia products to accept input in a
form-like manner (or via a VR-type interface) and
execute arbitrary programs or scripts to engage the
simulation. Distributed simulation will play an im-
portant role in hypermedia document retrieval since
it is most likely that large complex, but partition-
able, models will be distributed, requiring the model
execution to be distributed as well. If there are a suf-
ficient number of readers of the documentation, real
time distributed interactive simulation will also be
possible while browsing or searching for information.
The World Wide Web (WWW) is a network of hy-
permedia documents that are located on the Internet.
Therefore, WWW sits on top of the Internet, provid-
ing a hypermedia information infrastructure. Client
programs such as Mosaic allow the user to view the


documents containing text, audio and video. With
Mosaic's introduction of forms, users can use WWW
as an interactive testbed and not just a means for
obtaining static information. For simulation, the in-
teraction is critical. The tools are in place today for
embedding simulation models and their outputs into
the WWW, and it is quite possible that WWW-based
simulation will become the predominant mecha-
nism for running any simulation. After all, a user will
generally begin the man-machine interaction process
by obtaining bits and pieces of information. Many
queries will result in static data transfers, but a grow-
ing number of queries will necessitate simulation and
the use of active data constructs. Database-centered
simulation approaches are also important to simu-
lation since the models will have to be stored and
retrieved in a logical manner. Object oriented lan-
guages, for the most part, do not incorporate the idea
of object persistence, and so do not support object-
based structures in an independent fashion. Object-
oriented databases will achieve this purpose.
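
As a rough sketch of the kind of interaction described above, written with today's Python standard library rather than the Mosaic/MIME tooling of the paper, a small HTTP server can run a simulation whose inputs arrive as form-style query parameters and return the result as the document. The URL, parameter names and toy model are hypothetical.

from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

def simulate(rate, steps):
    """Toy exponential-growth 'model' standing in for a real simulation."""
    x = 1.0
    for _ in range(steps):
        x *= (1.0 + rate)
    return x

class SimHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        rate = float(query.get("rate", ["0.05"])[0])
        steps = int(query.get("steps", ["10"])[0])
        body = f"result after {steps} steps: {simulate(rate, steps):.3f}\n".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # e.g. http://localhost:8000/?rate=0.1&steps=20
    HTTPServer(("localhost", 8000), SimHandler).serve_forever()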

Issues

* Simulation protocol. How is simulation infor-
mation to be transmitted over the WWW?
The existing standard for distributed interactive
simulation (DIS) provides some good ideas for
packets and their constituents. The informa-
tion exchange methods used by client programs
such as Mosaic and hypermedia-based electronic
mail programs which incorporate MIME (Multi-
purpose Internet Mail Extensions) can be used
as a basis for future hypermedia communication.
One approach is to extract concepts in DIS that
are generic enough for any type of simulation and
then use the MIME format and WWW as a foun-
dation for further development.

Future

Simulation is an integral, fundamental part of infor-
mation access. For too long, information has been
seen as being static in the form of text and images.
With the addition of video and audio, we now see that
information can be time-dependent as well.
A natural step in this direction is to have underlying
processes producing the video and audio sequences.
This production is achieved through simulation.

References

Barrett, E. 1989. The Society of Text: Hyper-
text, Hypermedia and the Social Construction of
Information. MIT Press.









* Nielsen, J. 1990. Hypertext and Hypermedia.
Academic Press.


CONCLUSIONS


By taking computer simulation and combining it with
other disciplines, we obtain extensions that are used
to better solidify the current foundation for simula-
tion methodology. We have presented ten fields and
their relationship to simulation, along with some cur-
rent research issues and citations to the literature. It
is important that we relate our work to other fields
on a continual basis. Without these relations, we can
move off in the wrong direction or miss a vital con-
vergence that is occurring in other related fields. As
it happens, simulation is repeatedly seen as a founda-
tion for many other fields such as those we presented.
There is still quite a bit of work to be done in better
organizing the simulation field into the three afore-
mentioned sub-fields of design, execution and analy-
sis. The plethora of modeling methods must be con-
tained and categorized so that we can restore some
logic to simulation methodology. A constant push-
pull process between extension and integration is nec-
essary and will move simulation into the forefront as
a core discipline for creating digital world represen-
tations.


ACKNOWLEDGMENTS

I would like to thank Jin Joo Lee and Victor J. Cook
at the University of Florida for making suggestions
for improving an early version of this manuscript.


REFERENCES

Badler, N. I., Barsky, B. A., and Zeltzer, D., editors
1991. Making Them Move: Mechanics, Control
and Animation of Articulated Figures. Morgan
Kaufmann.
Badler, N. I., Phillips, C. B., and Webber, B. L. 1993.
Simulating Humans: Computer Graphics, Ani-
mation and Control. Oxford University Press.
Barrett, E. 1989. The Society of Text: Hypertext,
Hypermedia and the Social Construction of In-
formation. MIT Press.
Barzel, R. 1992. Physically-Based Modeling for Com-
puter Graphics. Academic Press.
Bass, L., Gornostaev, J., and Unger, C., editors 1993.
Human-Computer Interaction. Springer Verlag.
Lecture Notes in Computer Science 753.


Birtwistle, G. M. 1979. Discrete Event Modelling on
SIMULA. Macmillan.
Booch, G. 1991. Object Oriented Design. Benjamin
Cummings.
Cellier, F. E. 1979. Combined Continuous System
Simulation by Use of Digital Computers: Tech-
niques and Tools. PhD thesis, Swiss Federal In-
stitute of Technology Zurich.
Cellier, F. E. 1991. Continuous System Modeling.
Springer Verlag.
CHI '92. 1992. Human Factors in Computing Systems.
ACM/SIGCHI Proceedings.
Ege, R. K., editor 1991. Object-Oriented Simulation
1991. Society for Computer Simulation. Simu-
lation Series, Volume 23, Number 3.
Elzas, M. S., Oren, T. I., and Zeigler, B. P. 1989.
Modelling and Simulation Methodology: Knowl-
edge Systems' Paradigms. North Holland.
Fishwick, P. A. 1989. Neural network models in sim-
ulation: A comparison with traditional modeling
approaches. In 1989 Winter Simulation Confer-
ence, pages 702-710, Washington, DC.
Fishwick, P. A. 1991a. Extracting Rules from Fuzzy
Simulation. Expert Systems with Applications,
3(3):317-327.
Fishwick, P. A. 1991b. Fuzzy Simulation: Specify-
ing and Identifying Qualitative Models. Inter-
national Journal of General Systems, 19(3):295-
316.
Fishwick, P. A. 1993. A simulation environment for
multimodeling. Discrete Event Dynamic Sys-
tems: Theory and Applications, 3:151-171.
Fishwick, P. A. 1994. Simulation Model Design and
Execution: Building Digital Worlds. Prentice
Hall.
Fishwick, P. A. and Luker, P. A., editors 1991.
Qualitative Simulation Modeling and Analysis.
Springer Verlag.
Fishwick, P. A. and Modjeski, R. B., editors 1991.
Knowledge Based Simulation: Methodology and
Application. Springer Verlag.
Fishwick, P. A. and Zeigler, B. P. 1991. Qualita-
tive Physics: Towards the Automation of Sys-
tems Problem Solving. Journal of Theoretical
and Experimental Artificial Intelligence, 3:219-246.
Fishwick, P. A. and Zeigler, B. P. 1992. A Multimodel
Methodology for Qualitative Model Engineering.
ACM Transactions on Modeling and Computer
Simulation, 2(1):52-81.









Fox, G., Johnson, M., Lyzenga, G., Otto, S., Salmon,
J., and Walker, D. 1988. Solving Problems
on Concurrent Processors: Volume 1, General
Techniques and Regular Problems. Prentice Hall.
Fujimoto, R. M. 1990. Parallel Discrete Event Simu-
lation. Communications of the ACM, 33(10):31-
53.
Grechenig, T. and Tscheligi, M., editors 1993. Hu-
man Computer Interaction. Springer Verlag.
Lecture Notes in Computer Science 733.
Klir, G. J. and Folger, T. A. 1988. Fuzzy Sets, Un-
certainty and Information. Prentice Hall.
Kosko, B. 1992. Neural Networks and Fuzzy Systems.
Prentice Hall.
Langton, C. G., Taylor, C., Farmer, J. D., and Ras-
mussen, S., editors 1992. Artificial Life II. Ad-
dison Wesley.
Loper, M. and Seidensticker, S. 1993. The DIS Vi-
sion: A Map to the Future of Distributed Simu-
lation. Technical report, Institute for Simulation
and Training, Orlando, FL.
Mattsson, S. E. and Andersson, M. 1993. Omola -
An Object-Oriented Modeling Language. In
Jamshidi, M. and Herget, C. J., editors, Re-
cent Advances in Computer-Aided Control Sys-
tems Engineering, volume 9 of Studies in Au-
tomation and Control, pages 291-310. Elsevier
Science Publishers.
McDonald, B. 1992. Distributed Interactive Simu-
lation: Operational Concept. Technical report,
UCF Institute for Simulation and Training, Or-
lando, FL.
Nicol, D. and Fujimoto, R. 1993. Parallel Simulation
Today. Technical report, College of William and
Mary. to be published in Annals of Operation
Research.
Nielsen, J. 1990. Hypertext and Hypermedia. Aca-
demic Press.
Oren, T. I. 1987a. Simulation Model Symbolic Pro-
cessing: Taxonomy. In Singh, M. G., editor,
Systems and Control Encyclopedia, pages 4377-
4381. Pergamon Press.
Oren, T. I. 1987b. Simulation: Taxonomy. In Singh,
M. G., editor, Systems and Control Encyclope-
dia, pages 4411-4414. Pergamon Press.
Praehofer, H. 1991. Systems Theoretic Formalisms
for Combined Discrete-Continuous System Sim-
ulation. International Journal of General Sys-
tems, 19(3):219-240.


Praehofer, H., Auernig, F., and Reisinger, G. 1993.
An Environment for DEVS-Based Multiformal-
ism Simulation in Common Lisp/CLOSS. Dis-
crete Event Dynamic Systems: Theory and Ap-
plications, 3:119-149.
Prusinkiewicz, P. and Lindenmayer, A. 1990. The
Algorithmic Beauty of Plants. Springer-Verlag.
Rumbaugh, J., Blaha, M., Premerlani, W., Freder-
ick, E., and Lorenson, W. 1991. Object-Oriented
Modeling and Design. Prentice Hall.
Serra, R. and Zanarini, G. 1990. Complex Systems
and Cognitive Processes. Springer Verlag.
Thalmann, D., editor 1990. Scientific Visualization
and Graphics Simulation. Springer Verlag.
Weisbuch, G. 1991. Complex Systems Dynamics. Ad-
dison Wesley.
Widman, L. E., Loparo, K. A., and Nielsen, N. R.
1989. Artificial Intelligence, Simulation and
Modeling. John Wiley.
Wolfram, S. 1986. Theory and Applications of Cellu-
lar Automata. World Scientific Publishing, Sin-
gapore. (Includes selected papers from 1983-
1986.)
Zeigler, B. P. 1976. Theory of Modelling and Simula-
tion. John Wiley and Sons.
Zeigler, B. P. 1990. Object Oriented Simulation with
Hierarchical, Modular Models: Intelligent Agents
and Endomorphic Systems. Academic Press.
Zwicky, F. 1966. Discovery, Invention, Research
Through the Morphological Approach. Macmil-
lan.

BIOGRAPHY

Paul A. Fishwick is an associate professor in the
Department of Computer and Information Sciences
at the University of Florida. He received the BS
in Mathematics from the Pennsylvania State Uni-
versity, MS in Applied Science from the College of
William and Mary, and PhD in Computer and Infor-
mation Science from the University of Pennsylvania in
1986. He also has six years of industrial/government
production and research experience working at New-
port News Shipbuilding and Dry Dock Co. (doing
CAD/CAM parts definition research) and at NASA
Langley Research Center (studying engineering data
base models for structural engineering). His research
interests are in computer simulation modeling and
analysis methods for complex systems. He is a senior
member of the IEEE and the Society for Computer
Simulation. He is also a member of the IEEE Society









for Systems, Man and Cybernetics, ACM and AAAI.
Dr. Fishwick founded the comp.simulation Inter-
net news group (Simulation Digest) in 1987, which
now serves over 15,000 subscribers. He was chair-
man of the IEEE Computer Society technical com-
mittee on simulation (TCSIM) for two years (1988-
1990) and he is on the editorial boards of several
journals including the ACM Transactions on Mod-
eling and Computer Simulation, IEEE Transactions
on Systems, Man and Cybernetics, The Transactions
of the Society for Computer Simulation, International
Journal of Computer Simulation, and the Journal of
Systems Engineering.



